We've developed a 2D/3D asset management and inspection tool that uses server-side rendering. Rendered frames are encoded into a video stream and sent back to client devices for display, which keeps dense assets interactive even on lower-end mobile devices. In addition to the standard interaction models (touch/mouse/keyboard controls), we also have an AR viewing mode. You can see the AR functionality used to visualize a one-billion-triangle mesh in our SIGGRAPH Real-Time Live! demo at the 6:30 mark: https://youtu.be/BQM9WyrXie4?t=389
This runs inside Chrome Canary on Android. At the 6:56 mark you'll see that the mesh, while workable, lags slightly behind the real-world environment. The lag is inherent to this approach (and essentially to all server-side rendering approaches): the client must send its view transform to the server, which renders a frame, encodes it into a video stream, and sends it back to the client, which decodes it and overlays it on the AR camera display. While this round trip can be made fairly fast (a few tens of milliseconds), the 3D content will always trail the camera video slightly.
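For reference, the per-frame round trip above can be sketched as follows. All the transport and codec functions here are hypothetical stand-ins injected by the caller, not real WebXR or browser APIs; this is just the shape of the loop, not our actual implementation:

```typescript
// One iteration of the server-side-rendering round trip. The three
// function parameters are hypothetical stand-ins for the app's actual
// networking and video-decode plumbing.

interface Pose {
  viewMatrix: number[]; // 4x4 view transform, row-major
  timestampMs: number;  // when the pose was sampled on the client
}

async function renderRoundTrip<Frame>(
  pose: Pose,
  sendPoseAndRender: (p: Pose) => Promise<Uint8Array>, // upload transform; server renders + encodes
  decodeVideoFrame: (bits: Uint8Array) => Promise<Frame>, // decode the video packet on the client
  compositeOverCamera: (f: Frame) => void,                // overlay on the AR camera feed
): Promise<void> {
  const encoded = await sendPoseAndRender(pose); // network + render + encode latency
  const frame = await decodeVideoFrame(encoded); // download + decode latency
  compositeOverCamera(frame);                    // by now the camera feed has moved on
}
```

Every `await` in that loop adds to the gap between the camera feed and the rendered overlay, which is why the 3D content trails the video.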
If we put aside HMDs and focus on ARKit/ARCore: if we had some means of controlling when camera frames are displayed on the user's screen, we could introduce a slight delay and synchronize the display of each camera frame with the server-rendered result produced for it. Is there anything planned within WebXR to allow for this? Thanks.
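To make the request concrete, if such a hook existed, the client could hold camera frames in a short queue and composite each one with the server frame rendered from the pose captured alongside it. A minimal sketch of that pairing logic (all names here are hypothetical; WebXR exposes no such control today, which is exactly what's being asked about):

```typescript
// Hypothetical pairing of delayed camera frames with server-rendered frames.
// Frames are matched by the timestamp of the pose the server rendered from.

interface TimedFrame {
  timestampMs: number; // capture time of the camera frame / pose sent to the server
  data: string;        // stand-in for pixel data
}

class FrameSynchronizer {
  private cameraQueue: TimedFrame[] = [];

  // Buffer each camera frame instead of displaying it immediately.
  pushCameraFrame(frame: TimedFrame): void {
    this.cameraQueue.push(frame);
  }

  // When a server frame arrives (tagged with the timestamp of the pose it
  // was rendered from), drop stale camera frames and return the matching one.
  matchServerFrame(rendered: TimedFrame): TimedFrame | undefined {
    while (
      this.cameraQueue.length > 0 &&
      this.cameraQueue[0].timestampMs < rendered.timestampMs
    ) {
      this.cameraQueue.shift(); // stale camera frame, never displayed
    }
    const head = this.cameraQueue[0];
    if (head !== undefined && head.timestampMs === rendered.timestampMs) {
      return this.cameraQueue.shift(); // composite this camera/render pair
    }
    return undefined; // no buffered camera frame matches this render
  }
}
```

The cost of this scheme is that the camera feed itself is delayed by roughly the server round trip, but the overlay and the real world would stay in lockstep.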