Add new object and tokens to WEBGL_video_texture #2511
Conversation
tharkum commented on Sep 19, 2017
<contributor>Andrei Volykhin, LG Electronics</contributor>
<contributor>Mark Callow, HI Corporation</contributor>
Please change to Edgewise Consulting. I left HI a couple of years ago.
I used information from the old extension; I will change it.
<overview>
<p>This extension defines a new object, the <code>HTMLElementTexture</code>,
that can be used to efficiently transfer a sequence of image frames from
a producer which out of control the WebGL into a WebGL texture.
"which out of control the WebGL" -> "which is outside the control of WebGL".
Overall very nice! Please fix the various textual issues I've noted and add an issue about returning the samples in linear space or the space of the original image.
For example, if the original image is in the sRGB color space then the RGB value
returned by the sampler will also be sRGB, and if the original image is stored
in ITU-R Rec. 601 YV12 then the RGB value returned by the sampler will be
an RGB value in the ITU-R Rec. 601 colorspace)</p>
As I wrote in #2508, this decision was made before sRGB support in OpenGL ES. If we continue with this approach a query for either the colorspace name or the actual transfer function will be needed so apps can convert to linear data for blending and can convert properly to the canvas color space.
We should consider having this extension return samples in linear space instead. It should only be a few more instructions in the injected shader code.
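For concreteness, here is a minimal sketch of the extra GLSL the injected sampling code could use to linearize an sRGB-encoded sample; the helper name and the idea of appending it to the injected code are assumptions for illustration, not text from this proposal:

// Hypothetical GLSL helper the implementation could inject after sampling:
// the sRGB EOTF (IEC 61966-2-1), applied component-wise.
const srgbToLinearGLSL = `
  vec3 srgbToLinear(vec3 c) {
    vec3 lo = c / 12.92;
    vec3 hi = pow((c + 0.055) / 1.055, vec3(2.4));
    return mix(lo, hi, step(0.04045, c));
  }
`;

A YUV-coded source would additionally need the relevant YCbCr-to-RGB conversion before this step, which is why a query for the source's transfer function or color space (or a linear-output guarantee) matters.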
OK, I will remove this part from the proposal and instead add an issue for linear vs. non-linear color space.
<p>This extension does not cover the details of how a <code>HTMLELementTexture</code>
object could be created from a producer element.
Different kinds of producers works differently and described in additional externsion
"works" -> "work".
"and described" -> "so object creation is described"
<samplecode xml:space="preserve">
<p> This a fragment shader that samples a element texture.</p>
"a element" -> "an element"
<function name="updateTexImage" type="void">
Update the texture image to the most recent frame from the image stream
and release the previously-held frame.
Change the order. Put release first. E.g., "release the previously-held frame, if any, and update the texture image to the most recent frame."
Unfortunately I forgot to add information about how bindTexture + updateTexImage() should be used.
I think updateTexImage() shouldn't latch the most recent frame from the image stream if our linked WebGL texture isn't currently bound to the new texture target in the active texture unit, or we need to use the SurfaceTexture approach (where updateTexImage implicitly binds its texture to the TEXTURE_VIDEO_WEBGL texture target).
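A sketch of the per-frame pattern this rule implies, assuming updateTexImage() is exposed on the HTMLElementTexture object and reusing the illustrative names (ext, customTexture, elementTexture) from the sample code below:

function drawElementFrame() {
  // The linked WebGL texture must be bound to the new target in the currently
  // active texture unit; otherwise, per the rule above, updateTexImage() would
  // not latch a new frame.
  gl.activeTexture(gl.TEXTURE0);
  gl.bindTexture(ext.TEXTURE_ELEMENT_WEBGL, customTexture);
  // Release the previously-held frame, if any, and latch the most recent one.
  elementTexture.updateTexImage();
  gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
  requestAnimationFrame(drawElementFrame);
}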
}
</pre>
<p>This shows application that renders HTML*Element using proposed extension.</p>
"renders HTMLElement" to "renders an HTMLElement"
if (ext === null)
    return;
elementTexture = ext.create*Texture(customElement, customTexture);
Do we really need other extensions? What would create*Texture need to do differently for different types of HTMLElement? If we really need them, this example should show calling getExtension for one of them and using a create function from the extension.
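As a sketch of that suggestion, using WEBGL_video_texture as the concrete per-element-type extension; the createVideoTexture name is hypothetical and merely stands in for one instance of the create*Texture pattern above:

function initVideoTexture(gl, videoElement) {
  // Look up the element-type-specific extension and use its create function.
  const videoExt = gl.getExtension('WEBGL_video_texture');
  if (videoExt === null)
    return null;
  const videoTexture = gl.createTexture();
  return videoExt.createVideoTexture(videoElement, videoTexture);  // hypothetical name
}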
.....
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(ext.TEXTURE_ELEMENT_WEBGL, customTexture);
It seems unnecessary to keep binding and unbinding this customTexture. Unless the underlying implementation requires it, we shouldn't do it.
I added this code to show how we will use this texture with the new target in the rendering stage.
There is no point in binding and unbinding this texture all the time; I will remove these lines.
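With the per-frame rebinding removed, the render path reduces to something like this sketch (same illustrative names as in the sample above): bind once during setup, then only latch new frames in the loop.

// Setup: bind the linked texture to the new target once.
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(ext.TEXTURE_ELEMENT_WEBGL, customTexture);

function render() {
  elementTexture.updateTexImage();  // latch the most recent frame
  gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
  requestAnimationFrame(render);
}
requestAnimationFrame(render);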
Attempting to abstract between handling videos, and any other kind of rendered HTML element, is in my opinion the wrong direction to go with these extensions. Videos are treated, and rendered, very differently than other elements. The hardware-accelerated video decoding paths tend to produce YUV format textures. The rendering of arbitrary other DOM elements does not; that rendering process produces RGB textures.

Further, DOM layout and rendering are expensive operations, and an API designed to latch in a rendered texture from the DOM should be structured differently – more asynchronously – than one designed to work with videos.

Additionally, if a user's single shader is intended to abstract over sampling a YUV or RGB texture, that's going to introduce either significant dynamic branching in the shader – costing run-time performance – or require dynamic shader recompilation by the browser depending on the type of texture being sampled, which will add significant complexity to WebGL implementations.

I suggest that you make the improvements to the precision of the WEBGL_video_texture extension proposal directly on that extension proposal, and abandon this one. I don't think we should rewrite WEBGL_video_texture on top of this proposal.
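To make the branching concern concrete, a single shader abstracting over YUV and RGB sources would need a per-fragment choice along these lines (a sketch only; the uniform, texture layout, and BT.601 constants are illustrative and not taken from either proposal):

const combinedSamplingGLSL = `
  uniform int u_sourceIsYuv;  // illustrative switch between source formats
  vec3 sampleSource(sampler2D yTex, sampler2D cbcrTex, sampler2D rgbTex, vec2 uv) {
    if (u_sourceIsYuv == 1) {
      float y = texture2D(yTex, uv).r;
      vec2 cbcr = texture2D(cbcrTex, uv).ra - 0.5;
      // Approximate BT.601 YCbCr-to-RGB conversion.
      return vec3(y + 1.402 * cbcr.y,
                  y - 0.344 * cbcr.x - 0.714 * cbcr.y,
                  y + 1.772 * cbcr.x);
    }
    return texture2D(rgbTex, uv).rgb;
  }
`;

The alternative mentioned above is for the browser to recompile a format-specific variant of the shader whenever the source format changes, which avoids the branch but complicates the implementation.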
@kenrussell Thanks for the comments. We will discuss it internally and update soon.
@kenrussell lists 2 issues:

1. a single shader that abstracts over sampling YUV and RGB textures leads to significant dynamic branching or dynamic shader recompilation;
2. an API designed to latch in rendered DOM content should be structured more asynchronously than one designed for videos.

The obvious solution to no. 1 is to use a regular sampler for non-video elements, so we should move the special texture target and sampler to the video texture extension. Note the dynamic branching or recompilation issue can arise with videos of different formats. Therefore it is important for the spec. to explain this, as in the proposed element_texture spec. We might even consider a stronger recommendation to use a separate shader per video.

On no. 2, the only possible synchronous part of this spec. is updateTexImage. With all the browsers seemingly moving to a multi-process architecture, isn't there a good chance that 2d canvases and iframes might be rendered asynchronously to the WebGL canvas? In that case you'll need the same buffering and possible wait-for-completion as video, and an updateTexImage call.

I still think it is possible to find a useful common base extension.
If we do go ahead with a video-only solution, I suggest replacing the content of WEBGL_video_texture with this proposal, with the generic HTMLElement parts removed. This is a far more complete proposal than WEBGL_video_texture.
Although a common extension for arbitrary HTML elements would allow a common interface for them, it would make everything much more complex. I will upload a new WEBGL_video_texture based on this proposal, and we can focus on the async and color space issues. If I force-push to the pull request, will that be OK?
* Add new HTMLElementTexture object and new tokens for TEXTURE_VIDEO_WEBGL texture target.