Describe the bug
Hi Team,
I am trying to deploy Argo CD with the redis-ha HAProxy. The argo-cd chart pulls in https://github.com/DandyDeveloper/charts/tree/master/charts/redis-ha as a dependency subchart for redis-ha. When I deploy the argo-cd Helm chart on a Kubernetes cluster whose nodes run the latest Linux image, the haproxy pods go into a CrashLoopBackOff state due to OOM.
After investigating, I found that the fix is to set the maxconn parameter to 4096 in the haproxy.cfg file. The subchart has a customConfig parameter where this maxconn setting can be defined, but it is not working: https://github.com/DandyDeveloper/charts/blob/redis-ha-4.27.6/charts/redis-ha/values.yaml#L203.
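For reference, the change I am after in haproxy.cfg is essentially just this (a minimal sketch; only the maxconn line is the actual change):

```
global
  # cap concurrent connections explicitly so HAProxy does not derive a huge
  # default from the container's file-descriptor limit and allocate too much memory
  maxconn 4096
```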
Can someone help me figure out how to make this work in the argo-cd Helm chart?
Ref
Regards,
krimesh
Related helm chart
argo-cd
Helm chart version
7.7.1
To Reproduce
I tried to configure it under the haproxy block in values.yaml, as shown below, but it doesn't work.
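Roughly like this (a minimal sketch; the exact `redis-ha.haproxy.customConfig` path is my assumption about how the argo-cd chart passes values down to the subchart, and per the subchart's values.yaml, customConfig replaces the generated haproxy.cfg rather than appending to it):

```yaml
# values.yaml passed to the argo-cd chart
redis-ha:
  enabled: true
  haproxy:
    enabled: true
    # customConfig replaces the whole generated haproxy.cfg, so the full
    # config would have to be provided; only the maxconn change is shown.
    customConfig: |
      global
        maxconn 4096
      # ... rest of the haproxy.cfg ...
```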
Expected behavior
redis-ha-haproxy pod should be in running state.
Screenshots
No response
Additional context
No response