
Ddr training improvements #173

Merged
merged 3 commits into from
Oct 20, 2023
Conversation

jlaitine
Copy link

Summary

This is a collection of patches for the DDR training verify step, created during DDR initialization studies.

  • Stricter DQ/DQS eye sanity check: check not only the eye width, but also its centering
  • Check only the TXDLY value that will actually be used
  • Improve the CA training verify step

Impact

This may result in more errors from the DDR training verify step, but those errors should then be valid ones that can be mitigated by better initialization values.

Testing

Tested on saluki-v2; testing is still ongoing.

After read training, also check that the eye is centered properly. Sometimes
after training the eye width is sufficient, but the eye is not centered.

Signed-off-by: Jukka Laitinen <[email protected]>
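A minimal sketch of the kind of check this commit describes. The function name, thresholds, and tap units are illustrative assumptions, not the actual driver code: the point is that a trained delay only passes if the eye is both wide enough and the delay sits near the middle of the eye.

```c
#include <stdbool.h>
#include <stdlib.h>

/* Assumed thresholds, in delay taps (hypothetical values) */
#define MIN_EYE_WIDTH   8   /* minimum passing eye width */
#define MAX_CENTER_SKEW 2   /* max distance from the eye midpoint */

/* Pass only if the eye is wide enough AND the trained delay is
 * close to the center of the eye, not merely somewhere inside it. */
static bool eye_ok(int left_edge, int right_edge, int trained_delay)
{
    int width  = right_edge - left_edge;
    int center = left_edge + width / 2;

    if (width < MIN_EYE_WIDTH)
        return false;                       /* eye too narrow */

    return abs(trained_delay - center) <= MAX_CENTER_SKEW;
}
```

With only a width check, a delay trained right next to an eye edge would pass; the centering condition rejects that case.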
Just verify that the delay for the selected clock != 0

Signed-off-by: Jukka Laitinen <[email protected]>
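A sketch of the commit's intent, with an assumed register layout (the array of per-clock TXDLY values and the clock count are hypothetical): only the TXDLY value of the clock actually selected for use is verified to be non-zero, instead of failing training because some unused clock's value is zero.

```c
#include <stdbool.h>
#include <stdint.h>

#define NUM_CLKS 4  /* assumed number of clock candidates */

/* Verify only the delay that will actually be used: the TXDLY
 * entry for the selected clock must be non-zero. Entries for
 * unused clocks are ignored. */
static bool txdly_ok(const uint8_t txdly[NUM_CLKS], unsigned selected_clk)
{
    if (selected_clk >= NUM_CLKS)
        return false;               /* invalid selection */

    return txdly[selected_clk] != 0;
}
```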
Corrections to the CA training verify step. The original code, copied from HSS, didn't
make sense in all aspects:
- The check is not per lane, so it should be outside the "for (lane_sel" loop.
- The check itself wasn't correct. The expected outcome is simply a vector of
  increasing numbers with sufficient separation.

Signed-off-by: Jukka Laitinen <[email protected]>
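The corrected expectation can be sketched as follows. The function name and the minimum step are assumptions for illustration: the check runs once over the whole result vector (outside any per-lane loop) and passes only if the values increase with sufficient separation between consecutive entries.

```c
#include <stdbool.h>

#define MIN_CA_STEP 2  /* assumed minimum separation between entries */

/* The CA training result is expected to be a vector of increasing
 * numbers, each at least MIN_CA_STEP above the previous one. */
static bool ca_vector_ok(const int *v, int n)
{
    for (int i = 1; i < n; i++) {
        if (v[i] - v[i - 1] < MIN_CA_STEP)
            return false;   /* not increasing, or too close together */
    }
    return true;
}
```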
@jlaitine jlaitine requested review from pussuw and eenurkka October 13, 2023 12:52
@jlaitine jlaitine merged commit c266572 into master Oct 20, 2023
8 checks passed
@pussuw pussuw deleted the ddr_training_improvements branch October 20, 2023 19:31