
XR docs improvements (#661)
* Update XR docs for Apple Vision Pro

* Use arrow functions

* Title tweak

* More arrow functions

* Edit pass for XR optimization page

* Improve VR spaces page

* Remove newline

* Edit Types of VR page
willeastcott authored Aug 1, 2024
1 parent 0e28c46 commit 854b19e
Showing 9 changed files with 69 additions and 65 deletions.
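Several of the commit messages above ("Use arrow functions", "More arrow functions") cover a single mechanical change that runs through the hunks below: swapping `function` expressions for arrow functions in callbacks. Beyond brevity, arrow functions inherit `this` from the enclosing scope, which matters whenever a callback touches instance state. A minimal, engine-free sketch (the `makeEmitter` helper is hypothetical, standing in for PlayCanvas' event API):

```javascript
// An arrow-function callback inherits `this` from the enclosing scope,
// so instance state is reachable without `.bind(this)` or `var self = this`.
class Probe {
    constructor() {
        this.hits = 0;
    }
    listen(emitter) {
        // arrow function: `this` is still the Probe instance here
        emitter.on('result', () => {
            this.hits++;
        });
    }
}

// minimal stand-in for an event emitter (for illustration only)
const makeEmitter = () => {
    const handlers = [];
    return {
        on: (name, fn) => handlers.push(fn),
        emit: () => handlers.forEach((fn) => fn())
    };
};

const probe = new Probe();
const emitter = makeEmitter();
probe.listen(emitter);
emitter.emit();
emitter.emit();
console.log(probe.hits); // 2
```

Had `listen` used a `function` expression instead, `this` inside the callback would have been determined by the emitter's call site, and `this.hits` would not refer to the `Probe` instance.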
14 changes: 7 additions & 7 deletions docs/user-manual/xr/ar/hit-testing.md
@@ -31,10 +31,10 @@ The most basic way is to start probing straight from the viewer forward vector:
 // start a hit test
 app.xr.hitTest.start({
     spaceType: pc.XRSPACE_VIEWER, // from a viewer space
-    callback: function (err, hitTestSource) {
+    callback: (err, hitTestSource) => {
         if (err) return;
         // subscribe to hit test results
-        hitTestSource.on('result', function (position, rotation) {
+        hitTestSource.on('result', (position, rotation) => {
             // position and rotation of hit test result
             // based on a ray facing forward from the viewer reference space
         });
@@ -55,9 +55,9 @@ When an XR session is started on a monoscopic device (such as a mobile phone wit
 ```javascript
 app.xr.hitTest.start({
     profile: 'generic-touchscreen', // touch screen input sources
-    callback: function (err, hitTestSource) {
+    callback: (err, hitTestSource) => {
         if (err) return;
-        hitTestSource.on('result', function (position, rotation, inputSource) {
+        hitTestSource.on('result', (position, rotation, inputSource) => {
             // position and rotation of hit test result
             // that will be created from touch on mobile devices
         });
@@ -73,9 +73,9 @@ The most common way to start hit testing is from a ray of an input source (e.g.
 
 ```javascript
 inputSource.hitTestStart({
-    callback: function (err, hitTestSource) {
+    callback: (err, hitTestSource) => {
         if (err) return;
-        hitTestSource.on('result', function (position, rotation) {
+        hitTestSource.on('result', (position, rotation) => {
             // position and rotation of a hit test result
             // based on a ray of an input source
         });
@@ -96,7 +96,7 @@ ray.direction.set(0, -1, 0); // point downwards
 app.xr.hitTest.start({
     spaceType: pc.XRSPACE_LOCALFLOOR,
     offsetRay: ray,
-    callback: function (err, hitTestSource) {
+    callback: (err, hitTestSource) => {
         // hit test source that will sample real world geometry
        // from the position where AR session started
    }
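Conceptually, the downward-ray hunk above samples real-world geometry by intersecting a ray with surfaces the device has detected. The core math can be sketched engine-free against a single horizontal plane (`intersectRayPlaneY` is an invented name for illustration; a real hit test queries the device's world understanding, not a fixed plane):

```javascript
// Intersect a ray with the horizontal plane y = planeY.
// Returns the hit point, or null if the plane is unreachable.
const intersectRayPlaneY = (origin, direction, planeY) => {
    if (direction.y === 0) return null; // ray runs parallel to the plane
    const t = (planeY - origin.y) / direction.y;
    if (t < 0) return null; // plane is behind the ray origin
    return {
        x: origin.x + direction.x * t,
        y: planeY,
        z: origin.z + direction.z * t
    };
};

// a ray starting 1.6m up (roughly head height), pointing straight down
const hit = intersectRayPlaneY({ x: 0, y: 1.6, z: -2 }, { x: 0, y: -1, z: 0 }, 0);
console.log(hit); // { x: 0, y: 0, z: -2 }
```

This is why the docs pair `pc.XRSPACE_LOCALFLOOR` with a downward `offsetRay`: the floor-aligned reference space makes "straight down" meaningful for placing objects on detected surfaces.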
2 changes: 1 addition & 1 deletion docs/user-manual/xr/ar/index.md
@@ -24,7 +24,7 @@ When using PlayCanvas’ built-in AR support, the scene’s primary camera’s c
 To start an AR session, device support and availability should also be checked first. Then, on user interaction, such as a button click or other input, an AR session can be started:
 
 ```javascript
-button.element.on('click', function () {
+button.element.on('click', () => {
     // check if XR is supported and AR is available
     if (app.xr.supported && app.xr.isAvailable(pc.XRTYPE_AR)) {
         // start AR using a camera component
2 changes: 1 addition & 1 deletion docs/user-manual/xr/capabilities.md
@@ -39,7 +39,7 @@ Here is an example of enabling the experimental API for [WebXR Layers][3]:
 ```javascript
 app.xr.start(cameraComponent, pc.XRTYPE_VR, pc.XRSPACE_LOCAL, {
     optionalFeatures: [ 'layers' ],
-    callback: function(err) {
+    callback: (err) => {
         if (err) {
             console.log(err);
             return;
31 changes: 15 additions & 16 deletions docs/user-manual/xr/index.md
@@ -5,47 +5,46 @@ sidebar_position: 20
 
 ![VR View](/img/user-manual/xr/vr-view.png)
 
-PlayCanvas lets you create [AR][6] (Augmented Reality) and [VR][7] (Virtual Reality) applications for a variety of devices based on the new WebXR API, as well as through external integrations.
+PlayCanvas lets you create [AR](/user-manual/xr/ar/) (Augmented Reality) and [VR](/user-manual/xr/vr/) (Virtual Reality) applications for a variety of devices based on the new WebXR API, as well as through external integrations.
 
 ## Capabilities
 
 Through extensions, WebXR is ever growing and various platforms are constantly implementing new and existing WebXR Modules. The PlayCanvas Engine provides access to these modules in the form of integrations, so they are easier to work with and work nicely with PlayCanvas' systems.
 
-You can check a [list of currently supported modules][5].
+You can check a [list of currently supported modules](/user-manual/xr/capabilities/).
 
 ## Platforms
 
-WebXR is a new API and it is being rolled out gradually to all major platforms. Up-to-date support can be checked on [caniuse.com][3].
+WebXR is a new API and it is being rolled out gradually to all major platforms. Up-to-date support can be checked on [caniuse.com](https://caniuse.com/webxr).
 
-Additionally, support can be achieved with the [WebXR Polyfill][4].
+Additionally, support can be achieved with the [WebXR Polyfill](https://github.com/immersive-web/webxr-polyfill).
 
 On **mobile**, WebXR works on Android with VR and AR session types.
 
 On **HMDs**, such as Meta Quest, WebXR is well-supported for VR and AR session types. Apple Vision Pro currently supports VR session types when enabled in Safari settings.
 
 On **desktop**, WebXR currently works in Chrome and Edge, and devices are linked through various native APIs, such as SteamVR, OpenXR, and others. This covers the majority of desktop-based VR devices and allows devices such as Meta Quest to be used via Steam Link.
 
-## Testing WebXR without XR device
+## Testing WebXR without an XR Device
 
-To start developing with WebXR today, a Chrome [extension][1] can be used which emulates the WebXR API. This allows developers to interact with various head-mounted displays and controllers.
+You do not need to have XR hardware to start developing with WebXR. You can install the [Immersive Web Emulator](https://chromewebstore.google.com/detail/immersive-web-emulator/cgffilbpcibhmcfbgggfhfolhkfbhmik) Chrome extension which emulates the WebXR API. It allows you to simulate various head-mounted displays and controllers via the browser's Dev Tools.
 
-## Getting started with WebXR
+:::danger
 
-To start an XR session, support and availability should be checked first. Then, on user interaction XR, a session can be started:
+Do not use the [WebXR API Emulator](https://chromewebstore.google.com/detail/webxr-api-emulator/mjddjgeghkdijejnciaefnkjmkafnnje) Chrome extension. It is not compatible with PlayCanvas. PlayCanvas applications will throw an exception if it is active.
+
+:::
+
+## Getting Started with WebXR
+
+To start an XR session, support and availability should be checked first. Then, on a user interaction, a session can be started:
 
 ```javascript
-button.element.on('click', function () {
+button.element.on('click', () => {
     // check if XR is supported and VR is available
     if (app.xr.supported && app.xr.isAvailable(pc.XRTYPE_VR)) {
         // start VR session providing a camera component
         app.xr.start(entity.camera, pc.XRTYPE_VR, pc.XRSPACE_LOCALFLOOR);
     }
 });
 ```
-
-[1]: https://chromewebstore.google.com/detail/immersive-web-emulator/cgffilbpcibhmcfbgggfhfolhkfbhmik
-[3]: https://caniuse.com/#feat=webxr
-[4]: https://github.com/immersive-web/webxr-polyfill
-[5]: /user-manual/xr/capabilities/
-[6]: /user-manual/xr/ar/
-[7]: /user-manual/xr/vr/
8 changes: 4 additions & 4 deletions docs/user-manual/xr/input-sources.md
@@ -29,10 +29,10 @@ Some input sources are **transient** and have a short lifespan during their prim
 It is best to subscribe to `add` and `remove` events and then create their visual representation if needed:
 
 ```javascript
-app.xr.input.on('add', function (inputSource) {
+app.xr.input.on('add', (inputSource) => {
     // input source been added
 
-    inputSource.once('remove', function () {
+    inputSource.once('remove', () => {
         // know when input source has been removed
     });
 });
@@ -43,15 +43,15 @@ app.xr.input.on('add', (inputSource) => {
 
 Each input source can have a primary action `select`. For controllers, it is a primary button/trigger. For the touch-screen, it is a tap. For hands, it is a pinch of thumb and index fingers. There are also `selectstart` and `selectend` events which you can subscribe to as follows:
 
 ```javascript
-inputSource.on('select', function () {
+inputSource.on('select', () => {
     // primary action
 });
 ```
 
 Or through the input manager:
 
 ```javascript
-app.xr.input.on('select', function (inputSource) {
+app.xr.input.on('select', (inputSource) => {
     // primary action
 });
 ```
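The `add`/`remove` lifecycle in the hunks above — create a visual representation on `add`, tear it down via `once('remove')` — can be sketched engine-free. The tiny emitter below is a hypothetical stand-in; PlayCanvas' own event system exposes a similar `on`/`once`/`off`/`fire` surface:

```javascript
// minimal once-capable emitter, a stand-in for illustration only
const makeEmitter = () => {
    const handlers = {};
    return {
        on(name, fn) { (handlers[name] ||= []).push(fn); },
        once(name, fn) {
            const wrapper = (...args) => { this.off(name, wrapper); fn(...args); };
            this.on(name, wrapper);
        },
        off(name, fn) { handlers[name] = (handlers[name] || []).filter((h) => h !== fn); },
        fire(name, ...args) { (handlers[name] || []).slice().forEach((fn) => fn(...args)); }
    };
};

const visuals = new Set();

const input = makeEmitter();
input.on('add', (inputSource) => {
    visuals.add(inputSource);        // create the visual representation
    inputSource.once('remove', () => {
        visuals.delete(inputSource); // tear it down exactly once
    });
});

const source = makeEmitter();
input.fire('add', source);
source.fire('remove');
source.fire('remove'); // second fire is a no-op thanks to once()
console.log(visuals.size); // 0
```

Using `once` for the teardown matters for transient sources (touch taps), which are added and removed in quick succession: the handler cleans up and unsubscribes itself in one step.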
35 changes: 15 additions & 20 deletions docs/user-manual/xr/optimizing-webxr.md
@@ -5,9 +5,9 @@ sidebar_position: 30
 
 ## Introduction
 
-A high and consistent framerate is critical for making an enjoyable XR experience. When creating VR/AR content, it is more important than ever to test and optimize early and maintain the target framerate throughout development.
+A high and consistent frame rate is critical for making an enjoyable XR experience. When creating VR/AR content, it is more important than ever to test and optimize early and maintain the target frame rate throughout development.
 
-For AR experiences, framerates must be managed carefully as world tracking sometimes incurs significant performance costs. This is in addition to the typically performance-constrained mobile hardware most users have access to.
+For AR experiences, frame rates must be managed carefully as world tracking sometimes incurs significant performance costs. This is in addition to the typically performance-constrained mobile hardware most users have access to.
 
 For VR experiences, rendering is especially expensive due to the fact that the scene must be rendered once for each view (eye). While PlayCanvas is highly optimized to ensure VR rendering doesn't fully duplicate effort, stereo rendering remains more expensive than mono rendering.
 
@@ -19,41 +19,36 @@ PlayCanvas, however, includes several features specifically designed to let your
 
 ### Draw Calls and Batching
 
-Draw Calls are operations when the engine provides necessary information to the GPU for rendering an object. The more objects you have in the scene, the more draw calls it will require to render each frame. To reduce the number of draw calls it is recommended to minimize the number of objects in the frame by culling, [static batching][5] and [instancing][6].
+Draw Calls are operations when the engine provides the necessary information to the GPU for rendering an object. The more objects you have in the scene, the more draw calls it will require to render each frame. To reduce the number of draw calls, it is recommended to minimize the number of objects in the frame by culling, [static batching](/user-manual/graphics/advanced-rendering/batching/) and [instancing](/user-manual/graphics/advanced-rendering/hardware-instancing/).
 
 ### Runtime lightmap generation
 
-Each dynamic light has a per-frame runtime cost. The more lights have you the higher the costs and the slower your scene will render. By baking lights into lightmaps you can hugely reduce the cost of static lights to that of simply rendering a texture. Lightmaps can be generated offline using your favorite 3D modeling tool or you can use PlayCanvas's built-in Runtime Lightmap Generation.
-
-Read more about using [runtime lightmap generation][1].
+Each dynamic light has a per-frame runtime cost. The more lights you have, the higher the cost and the slower your scene will render. By baking lights into lightmaps you can hugely reduce the cost of static lights to that of simply rendering a texture. Lightmaps can be generated offline using your favorite 3D modeling tool or you can use PlayCanvas' built-in [Runtime Lightmapper](/user-manual/graphics/lighting/runtime-lightmaps/).
 
 ### Cautious use of real-time shadows
 
-For similar reasons to dynamic lights, dynamic shadows also have a per-frame runtime cost. Omni lights, in particular, have to render the scene 6 times to generate shadow maps. You should avoid having many lights casting dynamic shadows.
+For similar reasons to dynamic lights, dynamic shadows also have a per-frame runtime cost. Omni lights, in particular, have to render the scene 6 times to generate shadow maps. You should avoid having too many lights casting dynamic shadows.
 
 ### Watch your fill rate and overdraw
 
-The fill rate refers to the number of shader operations that are applied to each pixel on the screen. If you have expensive fragment shader calculations (e.g. lots of lights and complicated materials) and a high resolution (e.g. a mobile phone with a high device pixel ratio) then your application will spend too much time rendering the scene to maintain a high frame rate.
+Fill rate refers to the number of pixels that can be filled by the GPU over time (normally per second). If you have expensive fragment shaders (e.g. lots of lights and complex materials) and a high resolution (e.g. a mobile phone with a high device pixel ratio) then your application will spend too much time rendering the scene to maintain a high frame rate.
 
-Overdraw refers to how many pixels are overwritten by drawing geometry that is obscured by other geometry closer to the camera. Too much overdraw shows that you are wasting GPU processing trying to draw pixels that are not visible. This is usually caused by transparency on materials and non-opaque blending.
+Overdraw refers to the rendering inefficiency that occurs when multiple layers of pixels are processed for the same screen area. This can happen for valid reasons (multiple layers of blending and/or transparency) or redundant reasons (more distant pixels being overwritten by nearer opaque pixels). For the latter case, you are wasting GPU processing trying to draw pixels that are not visible.
 
-Using an extension like [WebGL Insight][2] can help you visualize overdraw
+:::tip
+
+Using an extension like [WebGL Insight](https://github.com/3Dparallax/insight) can help you visualize overdraw.
+
+:::
 
 ### Garbage collection stalls
 
-Web browsers feature automatic garbage collection of unused Javascript objects. The PlayCanvas engine is designed to minimize runtime allocations and you should try to do the same in your code. Pre-allocate vectors and other objects and re-use them so that there are not lots of objects created and discarded every frame.
+Web browsers feature automatic garbage collection of unused JavaScript objects. The PlayCanvas engine is designed to minimize runtime allocations and you should try to do the same in your code. Pre-allocate vectors and other objects and re-use them so that there are not lots of objects created and discarded every frame.
 
 ### Profiling Tools
 
-PlayCanvas comes with a built-in profiler tool. In the Editor use the Launch Profiler button to run your application with profiling active. [Read more about the profiler][3]
+PlayCanvas comes with a built-in [profiler tool](/user-manual/optimization/profiler/). In the Editor, enable the Profiler option in the Launch menu to run your application with profiling active.
 
 ### General optimization tips
 
-[Many more optimization guidelines][4] are available.
-
-[1]: /user-manual/graphics/lighting/runtime-lightmaps/
-[2]: https://github.com/3Dparallax/insight
-[3]: /user-manual/optimization/profiler/
-[4]: /user-manual/optimization/guidelines/
-[5]: /user-manual/graphics/advanced-rendering/batching/
-[6]: /user-manual/graphics/advanced-rendering/hardware-instancing/
+[Many more optimization guidelines](/user-manual/optimization/guidelines/) are available.
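The garbage-collection advice in the hunks above — pre-allocate vectors and re-use them rather than allocating per frame — looks like this in practice. This is a hedged sketch with a hand-rolled `Vec3`; in a real project you would re-use `pc.Vec3` instances the same way:

```javascript
class Vec3 {
    constructor(x = 0, y = 0, z = 0) { this.x = x; this.y = y; this.z = z; }
    add2(a, b) { // write a + b into this vector, allocating nothing
        this.x = a.x + b.x;
        this.y = a.y + b.y;
        this.z = a.z + b.z;
        return this;
    }
}

// allocated once, outside the per-frame code path
const tmp = new Vec3();

const position = new Vec3(1, 2, 3);
const velocity = new Vec3(0, -1, 0);

// per-frame work: re-uses `tmp` instead of calling `new Vec3()` on
// every update, so the garbage collector has nothing to reclaim
const update = () => tmp.add2(position, velocity);

update();
console.log(tmp.x, tmp.y, tmp.z); // 1 1 3
```

At 72 or 90 frames per second on an HMD, even small per-frame allocations add up quickly, and a collection pause of a few milliseconds is enough to drop frames and cause visible judder.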
14 changes: 7 additions & 7 deletions docs/user-manual/xr/using-webxr.md
@@ -13,9 +13,9 @@ if (app.xr.supported) {
 }
 ```
 
-## Starting
+## Starting an XR Session
 
-To start XR session, you can use method on the Camera Component or [XrManager][2] on the Application. To start an XR session you need to provide CameraComponent and provide the type of XR session, reference space, and optional object with additional arguments:
+To start XR session, you can use method on the Camera Component or [XrManager][2] on the Application. To start an XR session, you need to provide CameraComponent and provide the type of XR session, reference space, and optional object with additional arguments:
 
 ```javascript
 app.xr.start(entity.camera, pc.XRTYPE_VR, pc.XRSPACE_LOCALFLOOR);
@@ -32,7 +32,7 @@ button.on('click', () => {
 To know when a session is started, you can subscribe to the `start` event:
 
 ```javascript
-app.xr.on('start', function () {
+app.xr.on('start', () => {
     // XR session has started
 });
 ```
@@ -41,7 +41,7 @@ Session type or reference space might not be available on a particular platform,
 
 ```javascript
 entity.camera.startXr(pc.XRTYPE_VR, pc.XRSPACE_UNBOUNDED, {
-    callback: function(err) {
+    callback: (err) => {
         if (err) {
             // failed to start session
         }
@@ -60,7 +60,7 @@ app.xr.end();
 Also, the user might exit XR via some external process like the back button in the browser. [XrManager][2] will fire events associated with the session `end`:
 
 ```javascript
-app.xr.on('end', function () {
+app.xr.on('end', () => {
     // XR session has ended
 });
 ```
@@ -83,12 +83,12 @@ if (app.xr.isAvailable(pc.XRTYPE_VR)) {
 You can subscribe to availability change events too:
 
 ```javascript
-app.xr.on('available', function (type, available) {
+app.xr.on('available', (type, available) => {
     console.log('XR session', type, 'type is now', available ? 'available' : 'unavailable');
 });
 
 // or specific session type
-app.xr.on('available:' + pc.XRTYPE_VR, function (available) {
+app.xr.on('available:' + pc.XRTYPE_VR, (available) => {
     console.log('XR session VR type is now', available ? 'available' : 'unavailable');
 });
 ```
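The `'available:' + pc.XRTYPE_VR` pattern in the hunk above is string-namespaced event names: the manager fires a broad `available` event and a type-scoped variant. An engine-free sketch of the idea — the emitter is a stand-in for illustration, and the `XRTYPE_VR` value here assumes it matches the WebXR `'immersive-vr'` session-type string:

```javascript
const XRTYPE_VR = 'immersive-vr'; // assumed WebXR session type string

const makeEmitter = () => {
    const handlers = {};
    return {
        on: (name, fn) => (handlers[name] ||= []).push(fn),
        fire: (name, ...args) => (handlers[name] || []).forEach((fn) => fn(...args))
    };
};

const xr = makeEmitter();
const log = [];

// broad subscription: every availability change, type passed as an argument
xr.on('available', (type, available) => log.push(`${type}:${available}`));

// scoped subscription: only VR availability changes
xr.on(`available:${XRTYPE_VR}`, (available) => log.push(`vr-only:${available}`));

// a manager would fire both forms for each change
xr.fire('available', XRTYPE_VR, true);
xr.fire(`available:${XRTYPE_VR}`, true);

console.log(log); // [ 'immersive-vr:true', 'vr-only:true' ]
```

The scoped form saves the handler a type check when your application only cares about one session type, which is the common case for a VR-only or AR-only experience.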
14 changes: 10 additions & 4 deletions docs/user-manual/xr/vr/index.md
@@ -9,14 +9,20 @@ PlayCanvas also lets you create Virtual Reality (VR) applications.
 
 ## Platforms
 
-VR capabilities are available across various platforms: desktop (Chrome, Edge), mobile (Chrome, Samsung) and HMDs (Oculus, Magic Leap, Pico).
+VR capabilities are available across various platforms: desktop (Chrome, Edge), mobile (Chrome, Samsung) and HMDs (Apple Vision Pro, Meta, Magic Leap, Pico).
+
+:::warning
+
+Due to an issue in WebKit on Apple Vision Pro, you must currently disable `Anti-Alias` in the Scene Settings of your project.
+
+:::
 
 ## Getting started with WebXR VR
 
-To start a VR session, device support and availability should be checked first. Then, on user interaction such as a button click or other input, a VR session can be started:
+To start a VR session, device support and availability should be checked first. Then, on a user interaction such as a button click or other input, a VR session can be started:
 
 ```javascript
-button.element.on('click', function () {
+button.element.on('click', () => {
     // check if XR is supported and VR is available
     if (app.xr.supported && app.xr.isAvailable(pc.XRTYPE_VR)) {
         // start AR using a camera component
@@ -33,4 +39,4 @@ app.xr.end();
 
 ## Starter Kits
 
-PlayCanvas provides a ‘VR Kit’ project to help you and your VR experience get up and running faster. When creating a new project, simply select ‘VR Kit’ from the dialog.
+PlayCanvas provides a ‘VR Kit’ project to help you and your VR experience get up and running faster. When creating a new project, simply select ‘VR Kit’ from the New Project dialog.
