Controllers are not exposed by the gamepads module outside of VR or AR mode. This is a good anti-profiling feature. However, it does make it tricky to identify which control modes are possible, and thus to instruct the user, before the user enters VR or AR mode.
Example: In Elfland Glider (https://elfland-beta.surge.sh), control on desktop is via the keyboard and, on mobile, via tilting the device. In VR, control is via the head (if no positional controllers are available) or via the controllers. Desktop and mobile can be distinguished when the application first launches, so the appropriate instructions are displayed on the overlay. However, the WebXR API does not make it readily possible to determine whether positional controllers are available (or how many), so the overlay must include instructions for both head control and controller control. (Switching between head and controller control as controllers come on- and off-line is possible, and Elfland Glider implements it; a sketch follows below.)
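A minimal sketch of that switching logic, assuming the session is already active; the setControlMode callback is a hypothetical application hook, not part of the WebXR API or of Elfland Glider's actual code:

```typescript
// Switch between head- and controller-based control as tracked controllers
// come on- and off-line during an immersive session.
function watchInputSources(
  session: XRSession,
  setControlMode: (mode: "head" | "controller") => void
) {
  const update = () => {
    // A "tracked-pointer" input source is a positionally tracked controller.
    const hasController = Array.from(session.inputSources).some(
      (source) => source.targetRayMode === "tracked-pointer"
    );
    setControlMode(hasController ? "controller" : "head");
  };
  // Fires whenever input sources are added or removed mid-session.
  session.addEventListener("inputsourceschange", update);
  update(); // choose an initial mode from whatever is already connected
}
```

Note that this only works once the session exists, which is exactly the limitation described above.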
Example: an exercise application might need to know if leg motion is trackable, to constrain what exercise regimes are offered.
Example: a sign language training application needs to know if hand tracking is available.
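For context, the only pre-session capability check available today is whether a given session mode is supported at all; it reveals nothing about which input sources (controllers, hands, body tracking) would be present once inside:

```typescript
// Today's pre-session check: answers "is immersive VR supported?" but says
// nothing about the input sources the session would expose.
async function canOfferImmersiveVR(): Promise<boolean> {
  if (!navigator.xr) return false;
  return navigator.xr.isSessionSupported("immersive-vr");
}
```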
So, one of two things is needed:
1. A requiredFeature / optionalFeature of the XRSessionMode for InputSources (see the sketch after this list).
2. A Best Practice for informing the user how to do things with the available InputSources.
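A sketch of how option 1 could slot into the existing requiredFeatures / optionalFeatures mechanism. 'hand-tracking' is a real feature descriptor today; 'tracked-controllers' is a hypothetical descriptor used here only to illustrate the proposal:

```typescript
// Request input-source-related capabilities alongside other features.
// 'tracked-controllers' is hypothetical; it is not part of any spec.
async function startSession(): Promise<XRSession> {
  return navigator.xr!.requestSession("immersive-vr", {
    requiredFeatures: ["local-floor"],
    optionalFeatures: ["hand-tracking", "tracked-controllers"],
  });
}
```

If the request resolves, the application could then consult session.enabledFeatures (where implemented) or the live input sources to decide which instructions to show.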