openshift jenkins-sync-plugin is generating too much noise #3266
Comments
Is this the sync plugin from DevStudio? Or something else?
FYI: upstream issue and related repo: fabric8io/jenkins-sync-plugin#19
Updated the title.
Thanks. Is this something we fix, or does OpenShift need to fix it?
@piyush1594 I had a quick proper look at this; my Java is pretty rusty, but I dug into the upstream code and found this change, which seems to be the one we want: openshift/jenkins-sync-plugin@ec93d26. Adapting it to our fork of a fork (long story for the reader who is not aware of the context), this seems to be the only thing needed:

diff --git a/src/main/java/io/fabric8/jenkins/openshiftsync/PipelineJobListener.java b/src/main/java/io/fabric8/jenkins/openshiftsync/PipelineJobListener.java
index a9fbf40..4757652 100644
--- a/src/main/java/io/fabric8/jenkins/openshiftsync/PipelineJobListener.java
+++ b/src/main/java/io/fabric8/jenkins/openshiftsync/PipelineJobListener.java
@@ -285,7 +285,7 @@ public class PipelineJobListener extends ItemListener {
return;
}
}
- updateBuildConfigFromJob(job, jobBuildConfig);
+ boolean bcupdated = updateBuildConfigFromJob(job, jobBuildConfig);
if (!hasEmbeddedPipelineOrValidSource(jobBuildConfig)) {
// this pipeline has not yet been populated with the git source or an embedded pipeline so lets not create/update a BC yet
@@ -296,7 +296,7 @@ public class PipelineJobListener extends ItemListener {
if (create) {
OpenShiftUtils.addAnnotation(jobBuildConfig, Annotations.JENKINS_JOB_PATH, JenkinsUtils.getFullJobName(job));
}
-
+
if (create) {
try {
BuildConfig bc = getOpenShiftClient().buildConfigs().inNamespace(jobBuildConfig.getMetadata().getNamespace()).create(jobBuildConfig);
@@ -306,10 +306,12 @@ public class PipelineJobListener extends ItemListener {
logger.log(Level.WARNING, "Failed to create BuildConfig: " + NamespaceName.create(jobBuildConfig) + ". " + e, e);
}
} else {
- try {
- getOpenShiftClient().buildConfigs().inNamespace(jobBuildConfig.getMetadata().getNamespace()).withName(jobBuildConfig.getMetadata().getName()).cascading(false).replace(jobBuildConfig);
- } catch (Exception e) {
- logger.log(Level.WARNING, "Failed to update BuildConfig: " + NamespaceName.create(jobBuildConfig) + ". " + e, e);
+ if (bcupdated) {
+ try {
+ getOpenShiftClient().buildConfigs().inNamespace(jobBuildConfig.getMetadata().getNamespace()).withName(jobBuildConfig.getMetadata().getName()).cascading(false).replace(jobBuildConfig);
+ } catch (Exception e) {
+ logger.log(Level.WARNING, "Failed to update BuildConfig: " + NamespaceName.create(jobBuildConfig) + ". " + e, e);
+ }
}
}
}

Maybe I am completely wrong though, cc @gabemontero
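To make the intent of that diff concrete, here is a minimal, hypothetical sketch (not the plugin's actual code) of an updateBuildConfigFromJob that reports whether anything changed, so the caller can skip the replace() call when nothing did. The helper class name and the field comparison below are illustrative assumptions; the real method takes the Jenkins Job rather than two BuildConfig objects.

import io.fabric8.openshift.api.model.BuildConfig;
import java.util.Objects;

class BuildConfigSyncSketch {
    // Hypothetical helper: copy the job-derived state onto the existing BuildConfig
    // and return true only when something actually differs, so the caller can skip
    // the OpenShift replace() call when nothing changed.
    static boolean updateBuildConfigFromJob(BuildConfig desired, BuildConfig existing) {
        boolean changed = !Objects.equals(desired.getSpec(), existing.getSpec())
                || !Objects.equals(desired.getMetadata().getAnnotations(),
                                   existing.getMetadata().getAnnotations());
        if (changed) {
            existing.setSpec(desired.getSpec());
            existing.getMetadata().setAnnotations(desired.getMetadata().getAnnotations());
        }
        return changed;
    }
}

With a boolean like that in hand, the if (bcupdated) guard in the diff above means an unchanged job no longer triggers a BuildConfig replace on every sync pass.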
Maybe this one, openshift/jenkins-sync-plugin@53ce3f4#diff-b9ed119c66ed02d31e9cb3377f727207, can also resolve our problem.
Is this waiting on other tasks before being tackled?
@piyush1594 can you please update this issue with what has been done? Pull requests, dependencies, etc.? We need to keep this P0 updated every day.
The PR on the sync plugin is merged (fabric8io/jenkins-sync-plugin#24), and the PR on openshift-jenkins-s2i-config is open (fabric8io/openshift-jenkins-s2i-config#159).
Excellent!
UPDATE: This issue has been fixed in the Jenkins sync plugin by the PR mentioned above. This is blocked on the tenant-wise update; the related issue is #3393. Thanks.
As #3393 is closed and, according to #3422 (comment), it seems this has been done. Closing this.
This is a follow-up to issue #3240.
The OpenShift sync plugin is generating a humongous amount of data in a very wasteful way. From what was observed, it tries to synchronise Jenkins jobs with OpenShift by simply writing them again and again, without checking whether anything has actually changed.
This stresses the system pretty badly, and the load it generates grows rapidly with the number of users we have on the system.
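As an illustration only (a hypothetical sketch, not the plugin's code), the pattern being described is roughly a Jenkins ItemListener that pushes to OpenShift on every update event without any dirty check, so every job save becomes another API write:

import hudson.model.Item;
import hudson.model.listeners.ItemListener;

public class NoisySyncListenerSketch extends ItemListener {
    @Override
    public void onUpdated(Item item) {
        // No comparison against what OpenShift already has: every Jenkins
        // config save triggers another create/replace call, so the write
        // volume scales with the number of jobs and active users.
        pushBuildConfigToOpenShift(item);
    }

    private void pushBuildConfigToOpenShift(Item item) {
        // hypothetical helper: unconditionally replaces the BuildConfig
    }
}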
Project issue: fabric8io/jenkins-sync-plugin#19
Raising this as P0 since it is getting worse and worse and needs to be fixed ASAP.