Adding isaac basic usage
abhihjoshi committed Dec 28, 2024
Commit c72a53d (parent 0305f3d)
docs/modules/renderers.md: 28 additions, 0 deletions
Ray tracing can be performed in real time. We are currently working on enhancing …

Path tracing typically offers higher quality and is ideal for offline learning. If you have the time to collect data and plan to train algorithms using offline data, we recommend using path tracing for its photorealistic results.

### Basic usage

Once all dependencies for Isaac rendering have been installed, users can run the `robosuite/scripts/render_dataset_with_omniverse.py` script to render previously collected demonstrations using either ray tracing or path tracing. Below we highlight the arguments that can be passed into the script.

- **dataset**: Path to hdf5 dataset with the demonstrations to render.
- **ds_format**: Dataset format (options include `robosuite` and `robomimic` depending on if the dataset was collected using robosuite or robomimic, respectively).
- **episode**: Episode/demonstration to render. If no episode is provided, all demonstrations will be rendered.
- **output_directory**: Directory to store outputs from Isaac rendering and USD generation.
- **cameras**: List of cameras to render images from. Cameras must be defined in robosuite.
- **width**: Width of the rendered output.
- **height**: Height of the rendered output.
- **renderer**: Renderer mode to use (options include `RayTracedLighting` or `PathTracing`).
- **save_video**: Whether to save the output renderings as a video.
- **online**: Enables online rendering and will not save the USD for future rendering offline.
- **skip_frames**: Renders every nth frame.
- **hide_sites**: Hides all sites in the scene.
- **reload_model**: Reloads the model from the Mujoco XML file.
- **keep_models**: List of names of models to keep from the original Mujoco XML file.
- **rgb**: Render with the RGB modality. If no other modality is selected, we default to rendering with RGB.
- **normals**: Render with normals.
- **semantic_segmentation**: Render with semantic segmentation.

Here is an example command to render a video of a demonstration using ray tracing with the RGB and normals modalities.

```bash
$ python robosuite/scripts/render_dataset_with_omniverse.py --dataset /home/abhishek/Documents/research/rpl/robosuite/robosuite/models/assets/demonstrations_private/1734107564_9898326/demo.hdf5 --ds_format robosuite --episode 1 --camera agentview frontview --width 1920 --height 1080 --renderer RayTracedLighting --save_video --hide_sites --rgb --normals
```
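For batch jobs it can be convenient to assemble this command line programmatically rather than by hand. Below is a minimal sketch of such a helper; the flag names are taken from the argument list above, but the helper itself (`build_render_cmd`) is a hypothetical convenience, not part of robosuite.

```python
import subprocess

def build_render_cmd(dataset, cameras, renderer="RayTracedLighting",
                     width=1920, height=1080, episode=None,
                     modalities=("rgb",)):
    """Assemble an invocation of render_dataset_with_omniverse.py.

    Hypothetical helper: flag names mirror the documented arguments,
    but this function is not part of the robosuite codebase.
    """
    cmd = [
        "python", "robosuite/scripts/render_dataset_with_omniverse.py",
        "--dataset", dataset,
        "--ds_format", "robosuite",
        "--camera", *cameras,
        "--width", str(width),
        "--height", str(height),
        "--renderer", renderer,
    ]
    if episode is not None:
        cmd += ["--episode", str(episode)]
    # Modality switches such as --rgb or --normals are plain flags.
    for m in modalities:
        cmd.append(f"--{m}")
    return cmd

# One could then launch each episode's render with, e.g.:
# subprocess.run(build_render_cmd("demo.hdf5", ["agentview"], episode=1))
```

Building the argument list once and varying only `episode` makes it easy to sweep over all demonstrations in a dataset.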

### Rendering Speed

Below, we present a table showing the estimated frames per second when using these renderers. Note that the exact speed of rendering might depend on your machine and scene size. Larger scenes may take longer to render. Additionally, changing renderer inputs such as samples per pixel (spp) or max bounces might affect rendering speeds. The values below are estimates using the `Lift` task with an NVIDIA GeForce RTX 4090. We use an spp of 64 when rendering with path tracing.
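If you want to reproduce a frames-per-second estimate on your own machine, the measurement itself is straightforward: time a fixed number of frames and divide. The sketch below is library-agnostic; `render_frame` is a placeholder for whatever per-frame render call your setup exposes, not a robosuite API.

```python
import time

def estimate_fps(render_frame, n_frames=100):
    """Rough throughput estimate for a per-frame render callable.

    `render_frame` is a placeholder: pass in whatever zero-argument
    function triggers one rendered frame in your pipeline.
    """
    start = time.perf_counter()
    for _ in range(n_frames):
        render_frame()
    elapsed = time.perf_counter() - start
    return n_frames / elapsed
```

Averaging over many frames smooths out per-frame jitter; remember that scene size and renderer settings such as samples per pixel will shift the result.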