Research
Coco Uses NVIDIA Isaac Sim to Power Urban Autonomy
Author:
Bolei Zhou (Chief AI Scientist) and Brad Squicciarini (Co-Founder and CTO)
Posted Date:
From Simulation to Sidewalks: How Coco Robotics Uses NVIDIA Simulation Technology to Power Urban Autonomy
On a crowded city sidewalk, a small delivery robot pauses as a pedestrian crosses its path. A cyclist moves quickly behind it. A car door opens unexpectedly nearby. Within seconds, the robot adjusts and continues forward, aiming to fulfill its delivery mission on time.
Moments like this are deceptively complex. For an autonomous system, they represent a dense web of perception, prediction, and decision-making in cluttered urban environments, where safety is non-negotiable.
At Coco Robotics, solving this challenge at scale means going far beyond real-world testing. Before our robots encounter these situations on the street, they have already experienced them thousands of times in simulation, inside virtual environments powered by NVIDIA Omniverse libraries and NVIDIA Isaac Sim.
Coco Robot navigates safely in a realistic urban simulation enabled by NVIDIA Omniverse libraries and UrbanVerse (Video by Patrick and Akshat)
Enhancing Safety Through Simulation
We operate hundreds of autonomous delivery robots in highly dynamic urban environments across major cities like Los Angeles and Miami. Every deployment introduces new interactions: pedestrians with varying behaviors, changing traffic patterns, and rare yet critical edge cases such as tree roots and broken curbs on sidewalks.
Relying solely on real-world testing would make it difficult to iterate quickly or safely. Some scenarios are too dangerous to recreate intentionally, while others occur too infrequently to provide meaningful coverage.
To address this, we have built a simulation-first evaluation pipeline on top of NVIDIA Omniverse libraries and the open Isaac Sim simulation framework.
Within this environment, our models are exposed to a wide range of urban scenarios, from everyday interactions to unusual and safety-critical events. Our engineers can systematically test the autopilot system's behavior, identify weaknesses, and refine models before real-world deployment.
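To make the shape of such a systematic sweep concrete, here is a minimal sketch in plain Python. It is hypothetical, not Coco's actual pipeline: the `Scenario` parameters, the `run_episode` stub, and the pass/fail criterion are illustrative stand-ins for episodes that the real system would run inside Isaac Sim.

```python
from dataclasses import dataclass
import random

# Hypothetical sketch of a simulation-first evaluation sweep.
# In the real pipeline, run_episode would launch a simulated
# episode in Isaac Sim; here a seeded RNG fakes the outcome so
# the harness structure is runnable on its own.

@dataclass
class Scenario:
    name: str
    pedestrian_density: float  # pedestrians per square meter (illustrative)
    obstacle: str              # e.g. "tree_root", "broken_curb"

def run_episode(scenario: Scenario, seed: int) -> bool:
    """Stand-in for one simulated episode.

    Returns True if the robot completes its delivery without a
    safety violation in this toy model.
    """
    rng = random.Random(f"{scenario.name}:{seed}")
    # Denser scenes are harder in this toy model.
    return rng.random() > scenario.pedestrian_density * 0.5

def evaluate(scenarios, episodes_per_scenario=100):
    """Run every scenario many times and report failure rates,
    so weak spots can be ranked before real-world deployment."""
    report = {}
    for sc in scenarios:
        failures = sum(
            not run_episode(sc, seed) for seed in range(episodes_per_scenario)
        )
        report[sc.name] = failures / episodes_per_scenario
    return report

scenarios = [
    Scenario("sidewalk_crowd", pedestrian_density=0.8, obstacle="none"),
    Scenario("broken_curb", pedestrian_density=0.1, obstacle="broken_curb"),
]
print(evaluate(scenarios))
```

The value of this structure is that each scenario is parameterized and seeded, so a failure found in one sweep can be reproduced exactly and re-tested after the model is refined.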
This approach allows us to scale evaluation in a way that would be impossible in the physical world, accelerating development while maintaining a strong focus on safety and reliability.
NVIDIA Omniverse Libraries as the Foundation for Realistic Urban Simulation
Achieving meaningful simulation requires more than synthetic environments. It requires fidelity in terms of visual appearance, physics, and dynamics. NVIDIA Omniverse libraries provide the core infrastructure that makes this possible by combining RTX-powered rendering with physically accurate simulation in Isaac Sim to closely mirror real-world urban conditions.
Lighting, materials, and geometry behave consistently with physical environments, narrowing the gap between simulated and real perception. At the same time, robot dynamics, sensor behavior, and environmental interactions are modeled with high accuracy. Cameras and LiDAR operate as they would in reality, generating data that can be directly used in autonomy pipelines. Built on OpenUSD, Omniverse also enables scalable scene construction, allowing the creation and evolution of complex urban environments rather than isolated test cases.
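As a concrete illustration of the scene composition that OpenUSD enables, the hypothetical `.usda` layer below assembles a city block by referencing separately authored asset files. The file names and prim paths are invented for this sketch; they are not actual UrbanVerse or Coco assets.

```usda
#usda 1.0
(
    defaultPrim = "CityBlock"
    metersPerUnit = 1.0
    upAxis = "Z"
)

def Xform "CityBlock"
{
    # Each child prim references a separately authored asset layer,
    # so scenes are composed from reusable pieces rather than
    # modeled monolithically.
    def "Sidewalk" (
        references = @assets/sidewalk_segment.usd@
    )
    {
    }

    def "Curb_Broken" (
        references = @assets/broken_curb.usd@
    )
    {
        double3 xformOp:translate = (4.0, 0.0, 0.0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Because each reference stays live, an updated asset propagates into every scene that uses it, which is what lets complex urban environments evolve rather than remain isolated test cases.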
For Coco Robotics, this combination of realism and scalability is essential, as it ensures that insights gained in simulation translate into tangible improvements in how robots behave on real sidewalks.
Scaling Urban Simulation with UrbanVerse
The impact of simulation becomes clear in its direct connection to deployment. By validating autopilot systems in high-fidelity virtual environments with realistic scene layouts, Coco Robotics can identify and address failure modes before they occur in the real world. Models are stress-tested against dense pedestrian flows, unexpected obstacles, and rare interactions that are difficult to capture through real-world testing alone.
This scale is enabled by UrbanVerse, an urban simulation platform developed by UCLA’s Vision & Autonomy Lab (VAIL) on top of NVIDIA Omniverse libraries. With over 100K 3D urban assets and diverse city scenes, UrbanVerse expands simulation beyond a limited set of handcrafted scenarios, allowing autonomy systems to be evaluated across a wide range of environments.
UrbanVerse further extends this capability through a pipeline that converts real-world video into simulation-ready environments, enabling rapid recreation of complex scenarios and continuous expansion of coverage. The result is a shift from static testing to a scalable, data-driven process, where new environments and edge cases can be generated on demand to accelerate iteration and strengthen the real-world reliability of Coco’s autopilot model.
Open-sourced 3D assets from UrbanVerse (Video by Mingxuan and Honglin)
Open-sourced crafted scenes from UrbanVerse (Video by Mingxuan and Honglin)
A New Paradigm for Urban Autonomy
The collaboration between Coco Robotics and NVIDIA reflects a broader shift in how autonomous systems are developed. Simulation is no longer just a supporting tool; it has become a central pillar of the autonomy stack, enabling engineers to explore scenarios at scale, test systems under controlled conditions, and iterate with speed and precision. NVIDIA Omniverse libraries provide the technological foundation for this shift, bringing together realism, physics, and scalability in a unified platform, while Coco Robotics demonstrates how these capabilities translate into real-world impact on city streets.
As urban environments grow more complex, the demands on autonomous systems will only increase. Meeting these challenges requires not just better models, but better ways to evaluate them. By combining NVIDIA Omniverse libraries with a simulation-first approach, Coco Robotics is helping define how autonomy can be developed safely and efficiently at scale. Every successful real-world interaction is backed by countless experiences in simulation, and the future of urban autonomy is being built one scenario at a time.