
[Don't Merge] Trigger codegen change to run CI #1423

Closed
wants to merge 7 commits

Conversation

WardBrian
Member

#1422 is failing for a reason I suspect is unrelated to the changes there, so I wanted to run CI independently.

@WardBrian
Member Author

Hey @serban-nicusor-toptal - could you tell if anything is different between these runs and https://jenkins.flatironinstitute.org/blue/organizations/jenkins/Stan%2FStanc3/detail/PR-1396/1/pipeline/311/?

As far as I can tell, nothing in the code has changed that would cause the end-to-end tests at O1 to fail.

@serban-nicusor-toptal
Contributor

serban-nicusor-toptal commented May 8, 2024

I think the difference is that the successful one ran on jenkins2 and the failed one on the jenkins agent.
This can be verified by changing the label here from linux to linux && mesa (which targets jenkins2):
https://github.com/stan-dev/stanc3/blob/master/Jenkinsfile#L508
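
(A rough sketch of the kind of agent-label change being described - the stage name and steps below are illustrative, not quoted from the actual Jenkinsfile:)

```groovy
// Illustrative pipeline fragment: tightening the agent label so the stage
// only runs on nodes matching "linux && mesa" (i.e. jenkins2).
// Stage name and steps are placeholders, not the real stanc3 Jenkinsfile contents.
stage('End-to-end tests (O1)') {
    agent { label 'linux && mesa' }   // was: agent { label 'linux' }
    steps {
        sh 'echo "running on a mesa-capable linux node"'
    }
}
```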

@WardBrian
Member Author

@serban-nicusor-toptal it failed even on jenkins2 - any other leads?

@serban-nicusor-toptal
Contributor

I ran it once on jenkins to be sure.
Well, we know that the dependencies don't change, because it runs inside the docker image stanorg/ci:gpu.
And it does pull the correct commit b85dab8 and then stashes it right away.
I was looking at this diff, but it isn't helpful: 771f372...b85dab8
Could this be the result of a hardware change? Do we know what could make it fail from a machine/infra perspective?

@WardBrian
Member Author

I agree the diff is not very useful, but I also diffed the generated code from the current master against the code generated by the 2.34.0 binary, and there are no differences in how the optimizations generate code.
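
(Roughly the kind of comparison described - a sketch with placeholder binary paths and a placeholder model file, regenerating the C++ with both compilers and diffing it:)

```groovy
// Sketch of the codegen comparison described above, written as a Jenkins shell step.
// "stanc-master", "stanc-2.34.0", and example.stan are placeholders.
node('linux') {
    sh '''
        ./stanc-master --O1 --o=master.hpp  example.stan
        ./stanc-2.34.0 --O1 --o=release.hpp example.stan
        diff master.hpp release.hpp   # an empty diff means identical generated C++
    '''
}
```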

@serban-nicusor-toptal
Contributor

serban-nicusor-toptal commented May 9, 2024

Looking at the docker image, I see that gpu is a bit older than gpu-cpp17

https://hub.docker.com/layers/stanorg/ci/gpu-cpp17/images/sha256-f5f87c58cf7809f76c851e94b0e7919b95236f327fe402e75ffcf175a0f9f6e9?context=explore
vs
https://hub.docker.com/layers/stanorg/ci/gpu/images/sha256-1760e2bea62fc914f0d4ee667e8be0544a27d0d2264104b2b60b3b030c256f91?context=explore

Though it looks like gpu is the one used in the successful pipeline too.
Any idea what else I can look into to try and track this down?

@WardBrian
Member Author

I'm guessing it might have been an underlying hardware change. Unless something changed in Math and we missed it? @andrjohns has there been any movement in the exp or fma functions, or the gamma, weibull, bernoulli, or normal distributions? I feel like I would have seen that.

@WardBrian
Member Author

Hey @dylex - has the jenkins hardware changed appreciably since ~Jan 31? We're seeing some different numerical behavior compared to then, even when we try older versions of our code.

@dylex
Contributor

dylex commented May 10, 2024

I believe the last hardware change to jenkins was in November last year, when we upgraded the jenkins control node.

@WardBrian
Member Author

Seems like this resolved itself?

@WardBrian closed this May 24, 2024
@serban-nicusor-toptal
Contributor

That is for sure weird but I'm glad it's working now.

@WardBrian deleted the TESTING-codegen-ci branch June 20, 2024 23:37