RTX Sensors#
RTX sensors in Isaac Sim use the Omniverse RTX Renderer’s RTX Sensor SDK to sense the environment, enabling interaction with materials in visual and non-visual spectra. This means an RTX-based Lidar can model returns from light interaction with transparent or reflective surfaces, and an RTX-based Radar can model returns accounting for material emissivity and reflectivity in the radio spectrum.
Isaac Sim organizes utilities supporting RTX sensors into the isaacsim.sensors.rtx extension.
Getting Started#
To get started with RTX sensors:
Add a sensor to your scene: Use Create > Isaac > Sensors > RTX Lidar or RTX Radar from the menu, or use the Python APIs described in the sensor-specific pages below.
Collect data: Attach annotators to the sensor to extract point cloud data, scan buffers, or raw GenericModelOutput data.
Visualize output: Use the Debug Draw Extension to visualize point clouds, or configure viewport debug views.
Integrate with ROS2: Follow the RTX Lidar ROS2 Tutorial to publish sensor data as PointCloud2 or LaserScan messages.
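The steps above can be sketched in a standalone Python workflow. This is a minimal sketch that must run inside Isaac Sim's Python environment; the prim path, the built-in Example_Rotary config, and the annotator name follow common RTX Lidar examples but are assumptions to verify against your Isaac Sim version:

```python
# Illustrative sketch: create an RTX Lidar and read its point cloud via an annotator.
# Runs only inside Isaac Sim's Python environment; "/World/Lidar" and
# "Example_Rotary" are example values.
from isaacsim import SimulationApp

simulation_app = SimulationApp({"headless": True})

import omni.kit.commands
import omni.replicator.core as rep
import omni.timeline

# Step 1: create the RTX Lidar prim from a built-in sensor config.
_, sensor = omni.kit.commands.execute(
    "IsaacSensorCreateRtxLidar",
    path="/World/Lidar",
    parent=None,
    config="Example_Rotary",
)

# Step 2: a render product drives the sensor; attach a scan-buffer annotator to it.
render_product = rep.create.render_product(sensor.GetPath(), [1, 1])
annotator = rep.AnnotatorRegistry.get_annotator(
    "RtxSensorCpuIsaacCreateRTXLidarScanBuffer"
)
annotator.attach([render_product])

# The timeline must be playing for annotators to collect data.
omni.timeline.get_timeline_interface().play()
for _ in range(10):
    simulation_app.update()

points = annotator.get_data()["data"]  # hit positions from the scan buffer
simulation_app.close()
```

From here, the point cloud can be passed to the Debug Draw Extension for visualization or published over ROS2 as described in the tutorial linked above.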
Sensor Types#
Data Collection and Materials#
Advanced Topics#
Extension Architecture#
RTX sensors are built on the omni.sensors extension suite. To learn more about how RTX sensors are modeled, and how to build your own, review the following documentation:
Important Settings#
The following settings affect RTX sensor behavior and performance:
| Setting | Default | Description |
|---|---|---|
| `--/app/sensors/nv/lidar/outputBufferOnGPU` | `true` | Output Lidar data on GPU. Must be `true` for Lidar annotators to return data. |
| `--/app/sensors/nv/radar/outputBufferOnGPU` | `true` | Output Radar data on GPU. Must be `true` for Radar annotators to return data. |
| `--/app/sensors/nv/lidar/publishNormals` | `false` | Enable hit normal output. Increases VRAM usage. |
| `--/rtx-transient/stableIds/enabled` | `false` | Enable stable 128-bit object IDs for semantic segmentation. |
| `--/renderer/raytracingMotion/enabled` | `false` | Enable Motion BVH for motion compensation and Doppler effects. |
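These settings can also be read and changed at runtime through the carb.settings interface. This is a sketch that only works inside an Isaac Sim / Omniverse Kit process; which settings take effect after sensor creation may vary by version:

```python
# Illustrative sketch: adjust RTX sensor settings at runtime via carb.settings.
# Only meaningful inside an Isaac Sim / Omniverse Kit process.
import carb.settings

settings = carb.settings.get_settings()

# Ensure Lidar output buffers land on the GPU so annotators receive data.
settings.set("/app/sensors/nv/lidar/outputBufferOnGPU", True)

# Opt into hit-normal output, accepting the extra VRAM cost.
settings.set("/app/sensors/nv/lidar/publishNormals", True)

# Settings can be read back the same way.
on_gpu = settings.get("/app/sensors/nv/lidar/outputBufferOnGPU")
```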
Motion BVH#
RTX sensors use Motion BVH to improve accuracy when modeling motion-related sensor effects, for example, the motion of objects during sensor exposure, or the motion of the sensor itself as it collects data.
By default, Motion BVH is disabled in Isaac Sim to improve performance. The following RTX Sensor features are affected by Motion BVH:
RTX Lidar
Motion BVH must be enabled for RTX Lidar motion compensation to work correctly.
RTX Radar
Motion BVH must be enabled for the Doppler effect, and therefore for RTX Radar as a whole, to be modeled correctly.
How to Enable Motion BVH#
Note
Enabling Motion BVH significantly increases VRAM usage and rendering time for all sensors, so leave it disabled when not needed.
There are two ways to enable Motion BVH:
In standalone Python workflows, you can enable Motion BVH by passing enable_motion_bvh as True in the SimulationApp constructor:

```python
from isaacsim import SimulationApp

simulation_app = SimulationApp({"enable_motion_bvh": True})
simulation_app.close()
```
In all workflows, you can enable Motion BVH by specifying the following settings on the command line:
```shell
--/renderer/raytracingMotion/enabled=true \
--/renderer/raytracingMotion/enableHydraEngineMasking=true \
--/renderer/raytracingMotion/enabledForHydraEngines='0,1,2,3,4'
```
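When launching standalone scripts from another tool, the flags above can be composed into the launcher invocation programmatically. A small sketch; `standalone_example.py` is a placeholder script name:

```python
# Compose an Isaac Sim launch command that enables Motion BVH.
# "standalone_example.py" is a placeholder name for illustration.
MOTION_BVH_FLAGS = [
    "--/renderer/raytracingMotion/enabled=true",
    "--/renderer/raytracingMotion/enableHydraEngineMasking=true",
    "--/renderer/raytracingMotion/enabledForHydraEngines='0,1,2,3,4'",
]

def launch_command(script: str) -> list:
    """Return the argv list for ./python.sh with Motion BVH enabled."""
    return ["./python.sh", script] + MOTION_BVH_FLAGS

cmd = launch_command("standalone_example.py")
# e.g. pass cmd to subprocess.run(cmd)
```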
Troubleshooting and Known Issues#
Common Issues#
- Annotators return empty data

  Ensure the simulation timeline is playing; RTX Sensor annotators rely on the timeline to collect data. Also verify that --/app/sensors/nv/lidar/outputBufferOnGPU or --/app/sensors/nv/radar/outputBufferOnGPU is set to true.

- Point cloud appears to "drag" behind moving objects

  If the Lidar rotation rate is slower than the frame rate, accumulated scan data may contain returns from multiple frames. This is expected behavior for rotating Lidars. Consider using per-frame output instead of accumulated scans.

- Lidar scans are incomplete in standalone Python workflows

  Consider setting --/app/player/useFixedTimeStepping=true to force frames to a fixed time step, ensuring the Lidar model does not discard points when a frame covers slightly more simulated time than the Lidar scan period. This setting is true by default in the full Isaac Sim app, but false by default in standalone Python workflows.

- Radar simulation does not show Doppler effects

  Motion BVH must be enabled for Doppler effects to be modeled correctly. See How to Enable Motion BVH.

- Timestamps are discontinuous after pause/resume

  The GenericModelOutput AOV timestamp is independent of the animation timeline and continues to increase even when paused. This is expected behavior.
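The "drag" and incomplete-scan issues above are both timing mismatches between the render frame rate and the Lidar scan period. A small arithmetic sketch, with rates chosen purely for illustration:

```python
# Illustrative timing check: how a rotating Lidar's scan period relates to the
# render frame rate. The rates below are example values, not Isaac Sim defaults.

def frames_per_scan(rotation_hz, frame_hz):
    """Number of render frames spanned by one full Lidar rotation."""
    return frame_hz / rotation_hz

def scan_fraction_per_frame(rotation_hz, frame_dt):
    """Fraction of a full scan completed during one frame of simulated time."""
    return frame_dt * rotation_hz

# A 10 Hz Lidar rendered at 60 FPS accumulates returns across 6 frames, so a
# moving object appears at 6 slightly different poses within one scan ("drag").
print(frames_per_scan(10.0, 60.0))  # 6.0

# With fixed time stepping (dt = 1/60 s), each frame covers exactly 1/6 of the
# scan. A variable dt that overshoots the scan period can cause dropped points,
# which is why useFixedTimeStepping helps in standalone workflows.
print(scan_fraction_per_frame(10.0, 1.0 / 60.0))
```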
Performance Considerations#
VRAM Usage: Each RTX sensor requires GPU memory. Multiple sensors or high-resolution configurations increase VRAM usage.
Motion BVH: Enabling Motion BVH significantly increases VRAM usage and rendering time.
Normal Output: Enabling --/app/sensors/nv/lidar/publishNormals=true increases VRAM usage.

Stable IDs: Enabling --/rtx-transient/stableIds/enabled=true has minimal performance impact but requires additional processing for object ID resolution.
Hardware Requirements#
RTX sensors require an NVIDIA RTX GPU with ray tracing support. Performance scales with GPU capabilities, particularly:
VRAM capacity (affects number of sensors and resolution)
Ray tracing cores (affects simulation speed)