NVIDIA Brings Generative AI Tools, Simulation and Perception Workflows to ROS Developer Ecosystem

At ROSCon in Odense, one of Denmark’s oldest cities and a hub of automation, NVIDIA and its robotics ecosystem partners announced generative AI tools, simulation, and perception workflows for Robot Operating System (ROS) developers.

Among the reveals were new generative AI nodes and workflows for ROS developers deploying to the NVIDIA Jetson platform for edge AI and robotics. Generative AI enables robots to perceive and understand the context of their surroundings, communicate naturally with humans and make adaptive decisions autonomously.

 

Generative AI Comes to the ROS Community

ReMEmbR, built on ROS 2, uses generative AI to enhance robotic reasoning and action. It combines large language models (LLMs), vision language models (VLMs) and retrieval-augmented generation to allow robots to build and query long-term semantic memories and improve their ability to navigate and interact with their environments.
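The core retrieval idea behind a long-term semantic memory of this kind can be sketched in a few lines: store VLM-generated captions of what the robot has seen alongside vector embeddings, then answer queries by similarity search. The class names and the toy embeddings below are illustrative only, not ReMEmbR's actual API:

```python
import math
from dataclasses import dataclass, field

@dataclass
class MemoryEntry:
    timestamp: float
    caption: str    # VLM-generated description of what the robot observed
    embedding: list # vector representation of the caption

@dataclass
class SemanticMemory:
    """Toy long-term memory: store captioned observations, retrieve by similarity."""
    entries: list = field(default_factory=list)

    def add(self, timestamp, caption, embedding):
        self.entries.append(MemoryEntry(timestamp, caption, embedding))

    def query(self, query_embedding, top_k=1):
        def cosine(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a))
            nb = math.sqrt(sum(y * y for y in b))
            return dot / (na * nb)
        ranked = sorted(self.entries,
                        key=lambda e: cosine(e.embedding, query_embedding),
                        reverse=True)
        return ranked[:top_k]

# Hand-made embeddings stand in for a real text-embedding model.
memory = SemanticMemory()
memory.add(10.0, "red fire extinguisher near the loading dock", [1.0, 0.0, 0.2])
memory.add(42.0, "charging station in the north hallway", [0.0, 1.0, 0.1])

best = memory.query([0.9, 0.1, 0.2], top_k=1)[0]
print(best.caption)  # → red fire extinguisher near the loading dock
```

In the real system an LLM would turn the retrieved captions into a navigation decision or a spoken answer; the sketch only covers the store-and-retrieve step.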

The speech recognition capability is powered by the WhisperTRT ROS 2 node. This node uses NVIDIA TensorRT to optimize OpenAI’s Whisper model to enable low-latency inference on NVIDIA Jetson, resulting in responsive human-robot interaction.
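The low-latency pattern at work here — transcribing short audio chunks as they arrive rather than waiting for a whole utterance — can be sketched in plain Python. The `transcribe` stub and the chunk size below stand in for the TensorRT-optimized Whisper engine; this is not the WhisperTRT API:

```python
from collections import deque

CHUNK_SAMPLES = 16000  # assumed: 1 s of 16 kHz mono audio per inference call

def transcribe(chunk):
    """Stub for the optimized Whisper engine: returns text for one audio chunk."""
    return f"<{len(chunk)} samples transcribed>"

def stream_transcripts(audio_stream):
    """Buffer incoming samples; emit a transcript whenever a full chunk is ready.

    Emitting per chunk keeps latency near the chunk length rather than the
    full utterance length, which is what makes the interaction feel responsive.
    """
    buffer = deque()
    for samples in audio_stream:
        buffer.extend(samples)
        while len(buffer) >= CHUNK_SAMPLES:
            chunk = [buffer.popleft() for _ in range(CHUNK_SAMPLES)]
            yield transcribe(chunk)

# Simulated microphone delivering 4000-sample packets: 8 packets = 2 full chunks.
packets = [[0.0] * 4000 for _ in range(8)]
transcripts = list(stream_transcripts(packets))
print(len(transcripts))  # → 2
```

In a ROS 2 deployment the same loop would live in a node's audio-topic callback, with transcripts published to a text topic for downstream consumers.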

The ROS 2 robots with voice control project uses the NVIDIA Riva ASR-TTS service to make robots understand and respond to spoken commands. The NASA Jet Propulsion Laboratory independently demonstrated ROSA, an AI-powered agent for ROS, operating on its Nebula-SPOT robot and the NVIDIA Nova Carter robot in NVIDIA Isaac Sim.

At ROSCon, Canonical is demonstrating NanoOWL, a zero-shot object detection model running on the NVIDIA Jetson Orin Nano system-on-module. It allows robots to identify a broad range of objects in real time, without relying on predefined categories.
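Zero-shot detection replaces a fixed label set with free-text prompts scored against image regions at run time. A toy version of that matching step (stub embeddings and function names, not NanoOWL's actual interface) looks like:

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def detect(region_embeddings, prompt_embeddings, threshold=0.5):
    """Label each image region with the best-matching text prompt, if any.

    Because prompts are embedded at run time, the "category list" is just
    whatever text the user supplies - no retraining for new object types.
    """
    detections = []
    for region_id, r_emb in region_embeddings.items():
        best_prompt, best_score = None, threshold
        for prompt, p_emb in prompt_embeddings.items():
            score = cosine(r_emb, p_emb)
            if score > best_score:
                best_prompt, best_score = prompt, score
        if best_prompt is not None:
            detections.append((region_id, best_prompt))
    return detections

# Stub 2-D embeddings stand in for a real vision-language encoder.
regions = {"box_0": [0.9, 0.1], "box_1": [0.1, 0.95]}
prompts = {"a forklift": [1.0, 0.0], "a pallet of boxes": [0.0, 1.0]}
dets = detect(regions, prompts)
print(dets)  # → [('box_0', 'a forklift'), ('box_1', 'a pallet of boxes')]
```

In the real model the region and prompt embeddings come from a shared vision-language encoder, which is what lets a single network handle object categories it was never explicitly trained on.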

Developers can get started today with ROS 2 Nodes for Generative AI, which brings NVIDIA Jetson-optimized LLMs and VLMs to enhance robot capabilities.

 

Enhancing ROS Workflows With a ‘Sim-First’ Approach

Simulation is critical to safely test and validate AI-enabled robots before deployment. NVIDIA Isaac Sim, a robotics simulation platform built on OpenUSD, provides ROS developers a virtual environment to test robots by easily connecting them to their ROS packages. A new Beginner’s Guide to ROS 2 Workflows With Isaac Sim, which illustrates the end-to-end workflow for robot simulation and testing, is now available.

Foxglove, a member of the NVIDIA Inception program for startups, demonstrated an integration that helps developers visualize and debug simulation data in real time using Foxglove’s custom extension, built on Isaac Sim.

 

New Capabilities for Isaac ROS 3.2

NVIDIA Isaac ROS, built on the open-source ROS 2 software framework, is a suite of accelerated computing packages and AI models for robotics development. The upcoming 3.2 release enhances robot perception, manipulation and environment mapping.

Key improvements to NVIDIA Isaac Manipulator include new reference workflows that integrate FoundationPose and cuMotion to accelerate development of pick-and-place and object-following pipelines in robotics.

NVIDIA Isaac Perceptor also gains a new visual SLAM reference workflow, along with enhanced multi-camera detection and 3D reconstruction, improving an autonomous mobile robot’s (AMR) environmental awareness and performance in dynamic settings like warehouses.

 

Partners Adopting NVIDIA Isaac 

Robotics companies are integrating NVIDIA Isaac accelerated libraries and AI models into their platforms.

  • Universal Robots, a Teradyne Robotics company, launched a new AI Accelerator toolkit to enable the development of AI-powered cobot applications.
  • Miso Robotics is using Isaac ROS to speed up Flippy Fry Station, its AI-powered french-fry-cooking robot, and drive advances in efficiency and accuracy in food service automation.
  • Wheel.me is partnering with RGo Robotics and NVIDIA to create a production-ready AMR using Isaac Perceptor.
  • Main Street Autonomy is using Isaac Perceptor to streamline sensor calibration.
  • Orbbec announced its Perceptor Developer Kit, an out-of-the-box AMR solution for Isaac Perceptor.
  • LIPS Corporation has introduced a multi-camera perception devkit for improved AMR navigation.
  • Canonical highlighted a fully certified Ubuntu environment for ROS developers, offering long-term support out of the box.

 

Connecting With Partners at ROSCon

ROS community members and partners, including Canonical, Ekumen, Foxglove, Intrinsic, Open Navigation, Siemens and Teradyne Robotics, will be in Denmark presenting workshops, talks, booth demos and sessions. Highlights include:

  • “Nav2 User Meetup” Birds of a Feather session with Steve Macenski from Open Navigation LLC
  • “ROS in Large-Scale Factory Automation” with Michael Gentner from BMW AG and Carsten Braunroth from Siemens AG
  • “Integrating AI in Robot Manipulation Workflows” Birds of a Feather session with Kalyan Vadrevu from NVIDIA
  • “Accelerating Robot Learning at Scale in Simulation” Birds of a Feather session with Markus Wuensch from NVIDIA
  • “On Use of Nav2 Docking” with Open Navigation’s Macenski

Additionally, Teradyne Robotics and NVIDIA are co-hosting a lunch and evening reception on Tuesday, Oct. 22, in Odense, Denmark.

The Open Source Robotics Foundation (OSRF) organizes ROSCon. NVIDIA is a supporter of Open Robotics, the umbrella organization for OSRF and all its initiatives.

For the latest updates, visit the ROSCon page.

 
