Graph-Based Motion Planner Tutorial#

This tutorial demonstrates how to use the GraphBasedMotionPlanner class in the cuMotion integration to find collision-free paths using sampling-based algorithms (RRT variants).

By the end of this tutorial, you’ll understand:

  • How to set up the graph-based motion planner with world state management

  • How to plan to C-space, task-space, and translation-only targets

  • How to execute planned paths using trajectories

  • How to update world state for accurate planning

Prerequisites#

To follow along with the tutorial, you can search for and enable the cuMotion Examples extension within your running Isaac Sim instance. Within the isaacsim.robot_motion.cumotion.examples extension, there is a fully functional example of graph-based motion planning being used to plan collision-free paths.

Key Concepts#

The cuMotion graph-based motion planner provides:

  • Sampling-Based Planning: Uses RRT variants to find collision-free paths

  • Multiple Target Types: Supports C-space, task-space pose, and translation-only targets

  • Shared World State: Uses the same CumotionWorldInterface as other algorithms (often managed via WorldBinding)

  • Path Interface: Returns Path objects from the Motion Generation API, which are directly convertible to Trajectory objects (see the Trajectory Planning and Execution tutorial)
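
The sampling-based idea behind RRT variants can be illustrated with a minimal, self-contained sketch of the core "extend" step. This is purely illustrative and is not cuMotion's implementation; all names below are assumptions for the sketch:

```python
import numpy as np

# Illustrative RRT "extend" step: sample a random configuration, find the
# nearest node in the tree, and step toward the sample by at most `step_size`.
# Collision checking (which cuMotion performs against the shared world state)
# is omitted here for brevity.
def rrt_extend(tree, q_rand, step_size):
    tree = np.asarray(tree, dtype=float)
    q_rand = np.asarray(q_rand, dtype=float)
    # Nearest neighbor by Euclidean distance in configuration space
    dists = np.linalg.norm(tree - q_rand, axis=1)
    q_near = tree[np.argmin(dists)]
    direction = q_rand - q_near
    dist = np.linalg.norm(direction)
    if dist <= step_size:
        return q_rand
    # Step from the nearest node toward the sample
    return q_near + direction * (step_size / dist)
```

Repeating this step, and rewiring or biasing samples toward the goal, yields the family of RRT algorithms the planner draws on.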

Initializing the Planner#

First, set up a CumotionWorldInterface as described in the World Interface tutorial. For convenience, we initialize the CumotionWorldInterface with a WorldBinding instance. Once you have a CumotionWorldInterface, you can create the planner:

    # Create graph-based motion planner
    planner = GraphBasedMotionPlanner(
        cumotion_robot=cumotion_robot,
        cumotion_world_interface=world_binding.get_world_interface(),
    )

Configuring Planner Parameters#

Access the underlying cuMotion planner configuration to modify parameters:

    # Get the underlying cuMotion planner config
    planner_config = planner.get_graph_planner_config()

    # Modify parameters directly per cuMotion documentation
    # (See cuMotion API documentation for available parameters)

Updating World State#

The world binding must be updated before planning if obstacles or the robot base have moved:

    # Update robot base transform
    world_binding.get_world_interface().update_world_to_robot_root_transforms(articulation.get_world_poses())

    # Synchronize transforms (updates all obstacles and world state)
    world_binding.synchronize_transforms()

Planning to C-Space Targets#

The simplest planning target is a configuration space (joint space) position:

    # Define initial and target joint configurations
    q_initial = np.array([0.0, -0.5, 0.0, -2.0, 0.0, 1.5, 0.75])
    q_target = np.array([0.5, -0.3, 0.5, -1.5, 0.5, 1.2, 0.5])

    # Plan path
    path = planner.plan_to_cspace_target(q_initial, q_target)

    if path is None:
        print("Planning failed!")
    else:
        print(f"Path found with {path.get_waypoints_count()} waypoints")

The planner returns a Path object containing waypoints that form a collision-free path.
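
Before executing a returned path, it can be useful to sanity-check its waypoints, for example against the robot's joint limits. The following is a minimal sketch; the (N, dof) waypoint layout and the limit arrays are assumptions for illustration, not part of the planner's API:

```python
import numpy as np

# Check that every waypoint in an (N, dof) array lies within joint limits.
# `lower` and `upper` are per-joint limit arrays of length dof.
def within_joint_limits(waypoints, lower, upper):
    w = np.asarray(waypoints, dtype=float)
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    return bool(np.all((w >= lower) & (w <= upper)))
```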

Planning to Task-Space Targets#

The planner can also plan to task-space pose targets. The plan_to_pose_target method takes world-frame coordinates directly:

    # Define target pose in world frame
    target_position_world = np.array([0.6, 0.1, 0.5])
    target_orientation_world = np.array([1.0, 0.0, 0.0, 0.0])  # quaternion wxyz

    # Plan to task-space target (takes world-frame coordinates)
    q_initial = np.array([0.0, -0.5, 0.0, -2.0, 0.0, 1.5, 0.75])
    path = planner.plan_to_pose_target(q_initial, target_position_world, target_orientation_world)
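
Note that the orientation target uses wxyz quaternion ordering, and orientation quaternions should be unit-normalized. A small helper sketch (illustrative only, not part of the cuMotion API):

```python
import numpy as np

# Normalize a wxyz quaternion before passing it as an orientation target.
def normalize_quaternion(q_wxyz):
    q = np.asarray(q_wxyz, dtype=float)
    n = np.linalg.norm(q)
    if n == 0.0:
        raise ValueError("Cannot normalize a zero quaternion")
    return q / n
```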

Planning to Translation-Only Targets#

For cases where only position matters (not orientation), use translation-only planning. The plan_to_translation_target method takes world-frame coordinates directly:

    # Define target position in world frame
    target_position_world = np.array([0.6, 0.1, 0.5])

    # Plan to translation-only target (takes world-frame coordinates)
    q_initial = np.array([0.0, -0.5, 0.0, -2.0, 0.0, 1.5, 0.75])
    path = planner.plan_to_translation_target(q_initial, target_position_world)
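
Planning will fail for targets the robot cannot reach, so a cheap pre-check can save planning time. The sketch below approximates the reachable workspace as a sphere; `base_position` and `max_reach` are assumptions about your robot, not cuMotion API:

```python
import numpy as np

# Rough pre-check (illustrative): reject translation targets outside a
# spherical approximation of the robot's reachable workspace.
def roughly_reachable(target_position, base_position, max_reach):
    offset = np.asarray(target_position, dtype=float) - np.asarray(base_position, dtype=float)
    return bool(np.linalg.norm(offset) <= max_reach)
```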

Executing Planned Paths#

Planned paths are returned as Path objects from the Motion Generation API. Paths cannot be executed directly; they must first be converted to Trajectory objects. The Path class provides a method to convert paths to minimal-time trajectories that respect velocity and acceleration limits.

First, convert the path to a trajectory:

    # Convert Path to minimal-time trajectory using Motion Generation API
    # Use reasonable velocity and acceleration limits (adjust based on robot capabilities)
    max_velocities = np.array([2.0, 2.0, 2.0, 2.0, 2.5, 2.5, 2.5])  # rad/s
    max_accelerations = np.array([2.0, 2.0, 2.0, 2.0, 2.0, 2.0, 2.0])  # rad/s²

    trajectory = path.to_minimal_time_joint_trajectory(
        max_velocities=max_velocities,
        max_accelerations=max_accelerations,
        robot_joint_space=robot_joint_space,
        active_joints=controlled_joint_names,
    )
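
To build intuition for what "minimal-time" means under these limits, consider a single joint moving a distance d with a trapezoidal velocity profile. This is the standard textbook bound, shown here only as an illustration; it is not cuMotion's time-parameterization algorithm:

```python
import math

# Minimum time for one joint to traverse distance d under velocity limit
# v_max and acceleration limit a_max, using a trapezoidal velocity profile.
def min_traverse_time(d, v_max, a_max):
    d = abs(d)
    # Distance consumed by the acceleration and deceleration ramps combined
    d_ramps = v_max ** 2 / a_max
    if d <= d_ramps:
        # Triangular profile: the joint never reaches v_max
        return 2.0 * math.sqrt(d / a_max)
    # Trapezoidal profile: ramp up, cruise at v_max, ramp down
    return 2.0 * v_max / a_max + (d - d_ramps) / v_max
```

The trajectory returned by to_minimal_time_joint_trajectory must satisfy analogous constraints simultaneously across all joints.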

Note

These trajectory objects are not statefully tied to the Articulation class in Isaac Sim. Therefore, if a trajectory is intended for immediate execution, the user must ensure that it starts at the robot's current joint configuration.
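
A simple guard implementing the check described in the note above can be sketched as follows (the tolerance value is an assumption; tune it for your robot):

```python
import numpy as np

# Verify that the first trajectory sample matches the robot's current joint
# positions before streaming position targets to the articulation.
def safe_to_execute(trajectory_start_positions, current_positions, tol=1e-3):
    return bool(np.allclose(trajectory_start_positions, current_positions, atol=tol))
```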

Once you have a trajectory, you can execute it in two ways:

Direct execution: Sample the trajectory at each time step and apply the joint states directly to the articulation:

    # Sample the trajectory at the current time
    target_state = trajectory.get_target_state(t)
    if target_state is not None and target_state.joints.positions is not None:
        # Apply directly to the articulation
        articulation.set_dof_position_targets(
            positions=target_state.joints.positions,
            dof_indices=target_state.joints.position_indices,
        )

Using TrajectoryFollower: Use the TrajectoryFollower controller from the Motion Generation API to integrate trajectory execution into a larger control system:

    # Create TrajectoryFollower controller
    follower = TrajectoryFollower()

    # Set the trajectory
    follower.set_trajectory(trajectory)

    # Get current estimated state
    joint_state = JointState.from_name(
        robot_joint_space=robot_joint_space,
        positions=(robot_joint_space, articulation.get_dof_positions()),
        velocities=(robot_joint_space, articulation.get_dof_velocities()),
    )
    estimated_state = RobotState(joints=joint_state)

    # Reset the follower (sets start time)
    follower.reset(estimated_state, None, t)

    # In your control loop, call forward to get desired state
    desired_state = follower.forward(estimated_state, None, t)
    while desired_state is not None:
        if desired_state.joints.positions is not None:
            # Apply desired state to robot
            articulation.set_dof_position_targets(
                positions=desired_state.joints.positions,
                dof_indices=desired_state.joints.position_indices,
            )
        # Update simulation
        simulation_app.update()
        # Update estimated state and time for next iteration
        estimated_state = RobotState(
            joints=JointState.from_name(
                robot_joint_space=robot_joint_space,
                positions=(robot_joint_space, articulation.get_dof_positions()),
                velocities=(robot_joint_space, articulation.get_dof_velocities()),
            )
        )
        t += step  # Advance time
        desired_state = follower.forward(estimated_state, None, t)

The TrajectoryFollower follows the standard controller interface (reset and forward) and can be composed with other controllers in the Motion Generation API. For more details on trajectory execution, see the Trajectory Planning and Execution tutorial.

Note

The scenario.py example uses Articulation.set_dof_positions() to directly set physics state, demonstrating the planned trajectory exactly. Real robots require controllers to follow joint targets instead, and should use one of the aforementioned methods to execute the trajectory.

Example Usage#

Note

To experiment with this tutorial interactively, see the scenario.py file in the isaacsim.robot_motion.cumotion.examples extension at isaacsim/robot_motion/cumotion/examples/graph_planner/scenario.py.

The following videos demonstrate graph-based motion planning as shown in the isaacsim.robot_motion.cumotion.examples extension.

The first video shows graph-based motion planning controlling the robot to reach joint-space and task-space targets while avoiding obstacles in the scene.

../_images/isim_6.0_full_tut_viewport_graph_planner_normal_usage.webp

Graph-based motion planning tracking joint-space and task-space targets while avoiding obstacles#

The second video demonstrates adding a new obstacle to the scene, resetting the world state, and running the graph planner again. Objects are added in the same way as described in the World Interface tutorial. Inside the scenario.py file, a new CumotionWorldInterface is created for every plan. This is not generally necessary if objects are not being added or removed from the scene.

../_images/isim_6.0_full_tut_viewport_graph_planner_adding_capsule.webp

Adding an obstacle, resetting the cumotion world, and running the graph planner again#

Summary#

This tutorial demonstrated:

  1. Planner Initialization: Creating the GraphBasedMotionPlanner with a CumotionWorldInterface for world state management

  2. Planner Parameters: Configuring the planner parameters for advanced behavior

  3. World Updates: Keeping world state synchronized for accurate planning

  4. C-Space Planning: Planning to joint configuration targets

  5. Task-Space Planning: Planning to pose and translation-only targets

  6. Trajectory Execution: Converting planned paths to trajectories and executing them directly or using TrajectoryFollower

The cuMotion graph-based motion planner provides powerful collision-free path planning capabilities while maintaining consistency with the centralized world state architecture.

Next Steps#