vulkan : do not use tensor->extra #9407

Merged: ggerganov merged 2 commits into ggerganov:master on Oct 2, 2024

Conversation

rgerganov (Collaborator)

This patch allows using the Vulkan backend with the RPC backend, since tensor->extra is no longer used.

Ref: #8536
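For context: the RPC backend serializes tensor metadata and replays it on a remote server, so a backend-private pointer stored in tensor->extra cannot survive the round trip; after this change the Vulkan backend derives the device buffer and offset from the tensor's buffer context and data pointer instead. The following self-contained C++ sketch only illustrates that general pattern; the names (device_buffer, buffer_context, get_device_location) are invented for the example and are not the actual ggml-vulkan code.

```cpp
// Schematic illustration only: per-tensor backend state is derived from the
// tensor's buffer instead of being stashed in a tensor->extra pointer.
#include <cstddef>
#include <cstdint>
#include <cstdio>

struct device_buffer  { uint64_t handle; char * base; };  // stand-in for a device buffer handle
struct buffer_context { device_buffer dev; };              // stored once per backend buffer
struct backend_buffer { buffer_context * context; };

struct tensor {
    backend_buffer * buffer;   // which backend buffer the tensor lives in
    void           * data;     // address inside that buffer
    void           * extra;    // unused by the backend after this change
};

// Recompute the device buffer and byte offset on demand from the buffer context.
static void get_device_location(const tensor * t, device_buffer & buf, size_t & offset) {
    buffer_context * ctx = t->buffer->context;
    buf    = ctx->dev;
    offset = (size_t) ((const char *) t->data - (const char *) ctx->dev.base);
}

int main() {
    char storage[256];
    buffer_context ctx { { 42, storage } };
    backend_buffer  bb { &ctx };
    tensor          t  { &bb, storage + 64, nullptr };

    device_buffer buf;
    size_t        off = 0;
    get_device_location(&t, buf, off);
    std::printf("buffer %llu, offset %zu\n", (unsigned long long) buf.handle, off);  // buffer 42, offset 64
    return 0;
}
```

The point is that everything needed to locate the tensor on the device can be recomputed from tensor->buffer and tensor->data, so nothing backend-specific has to ride along with the tensor itself.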

The github-actions bot added the Vulkan (Issues specific to the Vulkan backend) and ggml (changes relating to the ggml tensor library for machine learning) labels on Sep 10, 2024
rgerganov (Collaborator, Author)

I have tested this by verifying that llama-cli generates the same output with a fixed seed on master and this branch. Backend ops tests also pass.

If you are OK with the proposed refactoring, I will also update the code guarded by GGML_VULKAN_CHECK_RESULTS.

0cc4m (Collaborator) commented on Sep 23, 2024

@ggerganov @rgerganov Apologies for the delay, I'm in between jobs and moving to a new city, so my time is rather limited. I'll try to find time to take care of the PRs waiting for my review this week.

ggerganov (Owner)

Thanks, no worries.

0cc4m (Collaborator) left a comment

This is nice, thank you. It cleans up some of the leftover code from previous iterations. It worked in my tests.

The only issue is that it breaks the GGML_VULKAN_CHECK_RESULTS mode. Can you take a look at that or would you like me to?
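For readers unfamiliar with the mode: GGML_VULKAN_CHECK_RESULTS is a compile-time debugging option that re-runs operations on a reference backend and compares the outputs with the Vulkan results. The sketch below shows only the comparison idea in isolation; the tolerances and function names are made up for the example and are not taken from ggml-vulkan.cpp.

```cpp
// Generic element-wise comparison of a backend's output against a reference,
// within absolute and relative tolerances (illustrative values only).
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

static bool results_match(const std::vector<float> & out,
                          const std::vector<float> & ref,
                          float abs_tol = 1e-4f, float rel_tol = 1e-3f) {
    if (out.size() != ref.size()) {
        return false;
    }
    for (size_t i = 0; i < out.size(); ++i) {
        const float diff = std::fabs(out[i] - ref[i]);
        if (diff > abs_tol && diff > rel_tol * std::fabs(ref[i])) {
            std::fprintf(stderr, "mismatch at %zu: %f vs %f\n", i, out[i], ref[i]);
            return false;
        }
    }
    return true;
}

int main() {
    const std::vector<float> vulkan_out = { 1.0f, 2.0f,    3.0f };
    const std::vector<float> cpu_ref    = { 1.0f, 2.0001f, 3.0f };
    std::printf("match: %s\n", results_match(vulkan_out, cpu_ref) ? "yes" : "no");
    return 0;
}
```

With tensor->extra gone, this checking code needed the same treatment, which is what the follow-up in rgerganov#2 ("Adapt GGML_VULKAN_CHECK_RESULTS to extra removal") provides.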

0cc4m (Collaborator) commented on Sep 29, 2024

@rgerganov I fixed the issue in rgerganov#2
Please take a look.

rgerganov (Collaborator, Author)

> @rgerganov I fixed the issue in rgerganov#2 Please take a look.

Sorry for the delay, somehow I missed the notifications for this PR. Changes look good and merged.

rgerganov (Collaborator, Author)

this should be good to go

ggerganov merged commit 00b7317 into ggerganov:master on Oct 2, 2024
53 checks passed
dsx1986 pushed a commit to dsx1986/llama.cpp that referenced this pull request Oct 29, 2024
* vulkan : do not use tensor->extra

This patch allows using the Vulkan backend with the RPC backend as
tensor->extra is no longer used.

Ref: ggerganov#8536

* Adapt GGML_VULKAN_CHECK_RESULTS to extra removal (Mobile-Artificial-Intelligence#2)

---------

Co-authored-by: 0cc4m <[email protected]>
arthw pushed a commit to arthw/llama.cpp that referenced this pull request Nov 15, 2024
arthw pushed a commit to arthw/llama.cpp that referenced this pull request Nov 18, 2024
Labels
ggml (changes relating to the ggml tensor library for machine learning), Vulkan (Issues specific to the Vulkan backend)
3 participants