Fix GLB parser index buffer initialization #7042
base: main
Conversation
This breaks the GLB examples.
@mvaligursky Hmm, why does the Vercel preview deployment have this WebGPU index buffer? Edit: never mind, those are different between WebGPU and WebGL.
OK, the issue is that when IndexBuffer sets its data, it compares the input's byte length against the expected byte count to validate the size. It expects an ArrayBuffer, but it receives a typed array and acts as if that is what was intended: engine/src/platform/graphics/index-buffer.js, lines 160 to 171 in c66a67f
The ArrayBuffer in this case has a byteLength of 4288, while the typed array view has 72, at an offset. So the numbers will match only if a typed array is passed, giving 72 instead of 4288. @mvaligursky please advise.
I suppose one option could be to change the docs so the index buffer takes a typed array instead of an ArrayBuffer, and change the procedural code so that it passes a typed array as well. I'm not sure about the rest of the code base, but it seems a typed array is used everywhere, or we would have noticed this earlier.
Fixes #5869
Fixes a bug where the GLB parser passes a typed array instead of an ArrayBuffer to the IndexBuffer constructor.
I confirm I have read the contributing guidelines and signed the Contributor License Agreement.