Trajectory Generator Tutorial#

This tutorial shows how to use the TrajectoryGenerator in the cuMotion integration to create smooth, time-optimal trajectories from waypoints or path specifications.

By the end of this tutorial, you’ll understand:

  • How to generate trajectories from C-space waypoints

  • How to create and use path specifications

  • How to convert between world coordinates and robot base frame coordinates

  • How to execute trajectories using the Trajectory interface

Prerequisites#

To follow along with this tutorial, search for and enable the cuMotion Examples extension within your running Isaac Sim instance. The isaacsim.robot_motion.cumotion.examples extension includes an example of the TrajectoryGenerator generating trajectories from both C-space waypoints and path specifications.

Key Concepts#

The TrajectoryGenerator provides:

  • Unified Interface: Single TrajectoryGenerator class handles both C-space and task-space trajectory generation

  • Path Specifications: Uses cuMotion’s native path specification API directly

  • Time-Optimal: Generates trajectories that are time-optimal while respecting joint velocity and acceleration limits

  • Direct API Access: Full access to cuMotion’s trajectory generator for parameter modification

  • Motion Generation API Integration: Generated trajectories implement the Trajectory interface

Initializing the Generator#

Create a TrajectoryGenerator with a robot configuration:

    # Load robot configuration
    cumotion_robot = load_cumotion_supported_robot("ur10")
    articulation = Articulation(robot_prim_path)

    # Create trajectory generator
    generator = TrajectoryGenerator(
        cumotion_robot=cumotion_robot,
        robot_joint_space=articulation.dof_names,
    )
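
Passing articulation.dof_names as the joint space tells the generator which joint ordering the articulation reports its state in, so values can be mapped correctly between the two representations. Conceptually, this is a reordering between two named joint lists. The helper below is a minimal illustrative sketch of that idea, not part of the cuMotion API:

```python
def reorder_joint_values(values, from_names, to_names):
    """Reorder joint values expressed in from_names order into to_names order.

    Illustrative sketch: cuMotion and the articulation may enumerate the same
    joints in different orders, so values must be permuted by joint name.
    """
    index = {name: i for i, name in enumerate(from_names)}
    return [values[index[name]] for name in to_names]


# Example: the articulation reports joints as a, b, c but the planner
# expects c, a, b.
articulation_values = [0.1, 0.2, 0.3]
planner_order = reorder_joint_values(
    articulation_values,
    from_names=["a", "b", "c"],
    to_names=["c", "a", "b"],
)
```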

Configuring Trajectory Parameters#

Access the underlying cuMotion trajectory generator to modify parameters. See the cuMotion Python API documentation for available parameters and methods:

    # Get the underlying cuMotion generator
    cspace_gen = generator.get_cspace_trajectory_generator()

    # Modify parameters using the cuMotion Python API
    # See the cuMotion Python API documentation for available parameters and methods

Generating from C-Space Waypoints#

The simplest approach is to generate trajectories directly from C-space waypoints:

    # Define waypoints
    waypoints = [
        np.array([-0.41, 0.5, -2.36, -1.28, 5.13, -4.71]),
        np.array([-1.43, 1.0, -2.58, -1.53, 6.0, -4.74]),
        np.array([-2.83, 0.34, -2.11, -1.38, 1.26, -4.71]),
    ]

    # Generate trajectory
    trajectory = generator.generate_trajectory_from_cspace_waypoints(waypoints)

    if trajectory is None:
        print("Failed to generate trajectory")
    else:
        print(f"Trajectory duration: {trajectory.duration} seconds")
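"Time-optimal" here means the trajectory traverses the waypoints as quickly as the joint velocity and acceleration limits allow. cuMotion's algorithm is far more sophisticated, but the underlying idea can be illustrated with a toy one-dimensional rest-to-rest profile. The function below is an illustrative assumption, not cuMotion code:

```python
import math


def trapezoidal_profile(distance, v_max, a_max):
    """Return (total_time, position_fn) for a 1-DOF rest-to-rest move.

    Accelerate at a_max, cruise at v_max if the move is long enough,
    then decelerate symmetrically. Short moves become triangular
    profiles that never reach v_max.
    """
    d = abs(distance)
    sign = 1.0 if distance >= 0 else -1.0
    t_acc = v_max / a_max               # time to reach cruise velocity
    d_acc = 0.5 * a_max * t_acc ** 2    # distance covered while accelerating
    if 2 * d_acc >= d:                  # triangular: v_max never reached
        t_acc = math.sqrt(d / a_max)
        t_cruise = 0.0
    else:
        t_cruise = (d - 2 * d_acc) / v_max
    total = 2 * t_acc + t_cruise

    def position(t):
        t = min(max(t, 0.0), total)
        if t < t_acc:                       # acceleration phase
            p = 0.5 * a_max * t ** 2
        elif t < t_acc + t_cruise:          # cruise phase
            p = 0.5 * a_max * t_acc ** 2 + a_max * t_acc * (t - t_acc)
        else:                               # deceleration phase (by symmetry)
            t_rem = total - t
            p = d - 0.5 * a_max * t_rem ** 2
        return sign * p

    return total, position
```

With distance 1, v_max 1, and a_max 1 the profile is exactly triangular and takes 2 seconds; tightening either limit lengthens the minimal time, which is the trade-off the real generator solves jointly across all joints.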

Generating from Path Specifications#

For more complex paths, use cuMotion’s path specification API. See the cuMotion Python API documentation for available path specification methods:

    # Create C-space path spec
    initial_position = np.array([-0.41, 0.5, -2.36, -1.28, 5.13, -4.71])
    path_spec = cumotion.create_cspace_path_spec(initial_position)
    path_spec.add_cspace_waypoint(np.array([-1.43, 1.0, -2.58, -1.53, 6.0, -4.74]))
    path_spec.add_cspace_waypoint(np.array([-2.83, 0.34, -2.11, -1.38, 1.26, -4.71]))

    # Convert C-space path spec to linear path before generating trajectory
    linear_path = cumotion.create_linear_cspace_path(path_spec)

    # Generate trajectory from linear path
    trajectory = generator.generate_trajectory_from_path_specification(linear_path)
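
A linear C-space path connects consecutive waypoints with straight segments in joint space, parameterized by normalized arc length. The sketch below illustrates what such a path represents; it is a toy stand-in, not cuMotion's implementation:

```python
import numpy as np


def sample_linear_cspace_path(waypoints, s):
    """Sample a piecewise-linear joint-space path at parameter s in [0, 1].

    s is proportional to arc length in C-space, so equal increments of s
    cover equal joint-space distance along the path.
    """
    pts = [np.asarray(w, dtype=float) for w in waypoints]
    seg_lengths = [np.linalg.norm(b - a) for a, b in zip(pts, pts[1:])]
    target = s * sum(seg_lengths)
    for (a, b), length in zip(zip(pts, pts[1:]), seg_lengths):
        if target <= length:
            return a + (target / length) * (b - a)
        target -= length
    return pts[-1]


# Example: halfway (s=0.5) along a two-segment path of equal segment lengths
# lands exactly on the middle waypoint.
midpoint = sample_linear_cspace_path([[0, 0], [1, 0], [1, 1]], 0.5)
```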

Task-Space Path Specifications#

Task-space paths require coordinate conversion from the Isaac Sim world frame to the robot base frame. Because cuMotion works entirely in the base frame, the cuMotion integration includes helper functions that convert an Isaac Sim world-frame representation (a position, quaternion tuple) to a cumotion.Pose3 in the robot base frame:

    # Get robot base transform directly from articulation
    robot_base_positions, robot_base_orientations = articulation.get_world_poses()

    # Create task-space path spec
    task_space_position_targets = np.array(
        [[0.3, -0.3, 0.1], [0.3, 0.3, 0.1], [0.3, 0.3, 0.5], [0.3, -0.3, 0.5], [0.3, -0.3, 0.1]]
    )
    task_space_orientation_targets = np.tile(np.array([0, 1, 0, 0]), (5, 1))  # quaternion wxyz

    initial_pose = isaac_sim_to_cumotion_pose(
        position_world_to_target=task_space_position_targets[0],
        orientation_world_to_target=task_space_orientation_targets[0],
        position_world_to_base=robot_base_positions,
        orientation_world_to_base=robot_base_orientations,
    )
    path_spec = cumotion.create_task_space_path_spec(initial_pose)

    # Add linear path segments for remaining waypoints
    for i in range(1, len(task_space_position_targets)):
        target_pose = isaac_sim_to_cumotion_pose(
            position_world_to_target=task_space_position_targets[i],
            orientation_world_to_target=task_space_orientation_targets[i],
            position_world_to_base=robot_base_positions,
            orientation_world_to_base=robot_base_orientations,
        )
        path_spec.add_linear_path(target_pose)

    # Generate trajectory (automatically converts to C-space)
    trajectory = generator.generate_trajectory_from_path_specification(
        path_spec,
        tool_frame_name="ee_link",
    )
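
Conceptually, expressing a world-frame target pose in the base frame composes the inverse of the base transform with the target transform: the position becomes the target offset rotated into base coordinates, and the orientation becomes the relative rotation. The sketch below shows that math with wxyz quaternions; it is an illustration of the idea, not the actual isaac_sim_to_cumotion_pose helper (which returns a cumotion.Pose3):

```python
import numpy as np


def quat_mul(q1, q2):
    """Hamilton product of two wxyz quaternions."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])


def quat_conj(q):
    """Conjugate (= inverse for unit quaternions), wxyz convention."""
    return np.array([q[0], -q[1], -q[2], -q[3]])


def rotate(q, v):
    """Rotate vector v by unit quaternion q (q * v * q^-1)."""
    qv = np.concatenate([[0.0], v])
    return quat_mul(quat_mul(q, qv), quat_conj(q))[1:]


def world_to_base(p_world_target, q_world_target, p_world_base, q_world_base):
    """Express a world-frame target pose in the robot base frame."""
    q_base_world = quat_conj(q_world_base)  # invert the base orientation
    p = rotate(q_base_world, np.asarray(p_world_target) - np.asarray(p_world_base))
    q = quat_mul(q_base_world, q_world_target)
    return p, q
```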

Executing Trajectories#

Once you have a trajectory, you can execute it in two ways:

Note

These trajectory objects are not statefully tied to the Articulation class in Isaac Sim. If you intend to execute a trajectory immediately, you must therefore ensure that it starts at the robot's current joint configuration.

Direct execution: Sample the trajectory at each time step and apply the joint states directly to the articulation:

    # Sample the trajectory at the current time
    target_state = trajectory.get_target_state(t)
    if target_state is not None and target_state.joints.positions is not None:
        # Apply directly to the articulation
        articulation.set_dof_position_targets(
            positions=target_state.joints.positions,
            dof_indices=target_state.joints.position_indices,
        )

Using TrajectoryFollower: Use the TrajectoryFollower controller from the Motion Generation API to integrate trajectory execution into a larger control system:

    # Create TrajectoryFollower controller
    follower = TrajectoryFollower()

    # Set the trajectory
    follower.set_trajectory(trajectory)

    # Get current estimated state
    joint_state = JointState.from_name(
        robot_joint_space=robot_joint_space,
        positions=(robot_joint_space, articulation.get_dof_positions()),
        velocities=(robot_joint_space, articulation.get_dof_velocities()),
    )
    estimated_state = RobotState(joints=joint_state)

    # Reset the follower (sets start time)
    follower.reset(estimated_state, None, t)

    # In your control loop, call forward to get desired state
    desired_state = follower.forward(estimated_state, None, t)
    while desired_state is not None:
        if desired_state.joints.positions is not None:
            # Apply desired state to robot
            articulation.set_dof_position_targets(
                positions=desired_state.joints.positions,
                dof_indices=desired_state.joints.position_indices,
            )
        # Update simulation
        simulation_app.update()
        # Update estimated state and time for next iteration
        estimated_state = RobotState(
            joints=JointState.from_name(
                robot_joint_space=robot_joint_space,
                positions=(robot_joint_space, articulation.get_dof_positions()),
                velocities=(robot_joint_space, articulation.get_dof_velocities()),
            )
        )
        t += step  # Advance time by the control loop's time step
        desired_state = follower.forward(estimated_state, None, t)

The TrajectoryFollower follows the standard controller interface (reset and forward) and can be composed with other controllers in the Motion Generation API. For more details on trajectory execution, see the Trajectory Planning and Execution tutorial.
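
The reset/forward pattern can be illustrated with a toy follower that tracks a scalar trajectory defined by (time, position) knots: reset records the start time, forward samples the trajectory at the elapsed time, and forward returns None once the trajectory is exhausted. This is an illustrative analogue only, not the Motion Generation API:

```python
class LinearTrajectoryFollower:
    """Toy analogue of the reset/forward controller pattern."""

    def __init__(self, knots):
        # knots: list of (time_from_start, position) pairs, times increasing
        self.knots = knots
        self.t_start = None

    def reset(self, t):
        """Record the wall-clock time at which execution begins."""
        self.t_start = t

    def forward(self, t):
        """Return the interpolated target at time t, or None when finished."""
        elapsed = t - self.t_start
        if elapsed > self.knots[-1][0]:
            return None                      # trajectory exhausted
        if elapsed <= self.knots[0][0]:
            return self.knots[0][1]          # clamp before the first knot
        for (t0, p0), (t1, p1) in zip(self.knots, self.knots[1:]):
            if elapsed <= t1:                # linear interpolation in-segment
                alpha = (elapsed - t0) / (t1 - t0)
                return p0 + alpha * (p1 - p0)


# Example: a 1-second ramp from 0.0 to 2.0, started at t = 10.0.
follower = LinearTrajectoryFollower([(0.0, 0.0), (1.0, 2.0)])
follower.reset(10.0)
target = follower.forward(10.5)   # halfway through the ramp
```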

Note

The scenario.py example uses Articulation.set_dof_positions() to set the physics state directly, so the planned trajectory is reproduced exactly for demonstration purposes. Real robots must instead follow joint targets through a controller, so use one of the execution methods above.

Example Usage#

Note

To experiment with this tutorial interactively, see the scenario.py file in the isaacsim.robot_motion.cumotion.examples extension at isaacsim/robot_motion/cumotion/examples/trajectory_generator/scenario.py.

The following video demonstrates trajectory generation as shown in the isaacsim.robot_motion.cumotion.examples extension.

[Video] Trajectory generator generating trajectories from C-space waypoints and path specifications

Summary#

This tutorial demonstrated:

  1. Generator Initialization: Creating the TrajectoryGenerator with a robot configuration

  2. Parameter Configuration: Configuring trajectory generator parameters for advanced behavior

  3. C-Space Waypoints: Generating trajectories directly from joint space waypoints

  4. Path Specifications: Using cuMotion’s native path specification API

  5. Task-Space Paths: Converting world coordinates to base frame for task-space planning

  6. Trajectory Execution: Using the Trajectory interface to evaluate and apply trajectories over time

The cuMotion trajectory generator provides a unified interface for both C-space and task-space trajectory generation while maintaining direct access to cuMotion’s powerful features.

Next Steps#