You don’t normally see tech giants like Nvidia and Apple collaborating, but at this week’s Nvidia GTC 2024, the two companies announced a partnership around the Vision Pro. Nvidia is bringing its Omniverse Cloud platform to Apple’s headset, allowing users to view and interact with 3D objects and designs directly through the Vision Pro.
The foundation of this support is a set of Omniverse Cloud APIs that stream Omniverse assets to Apple’s headset. Rather than running the software on the Vision Pro itself, designers can stream scenes built in Omniverse with Universal Scene Description (OpenUSD) to the headset and interact with the 3D objects in a native spatial environment.
Nissan demonstrated this capability in a video where users could swap paint colors, adjust trim, and explore the interior of a car using the Vision Pro’s spatial awareness.
While this collaboration will have its biggest impact on the enterprise sector, it has consumer implications as well. Nvidia is showing that it can stream interactive 3D applications to the Vision Pro over its Graphics Delivery Network (GDN), the same network it already uses to stream 3D applications from the cloud.
Key to this development are the Omniverse Cloud APIs. At GTC, Nvidia introduced five new APIs built around OpenUSD in the cloud, which can be used individually or collectively:
- USD Render: generating ray-traced renders of OpenUSD data
- USD Write: allowing users to modify and interact with OpenUSD data (see the sketch after this list)
- USD Query: enabling scene queries and interactive scenarios
- USD Notify: tracking changes and updates in USD data
- Omniverse Channel: connecting users, tools, and projects across scenes
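To make the idea of editing OpenUSD data more concrete, here is a minimal sketch using the open-source OpenUSD Python bindings (pxr), not the Omniverse Cloud APIs themselves. The prim path and color value are made up for illustration, but it shows the kind of modification USD Write exposes and USD Notify would track.

```python
# Illustrative only: uses the open-source OpenUSD Python bindings (pxr),
# not Nvidia's Omniverse Cloud APIs. The prim path and color are hypothetical.
from pxr import Usd, UsdGeom, Gf

# Build a small scene in memory with one prim standing in for a car body panel.
stage = Usd.Stage.CreateInMemory()
panel = UsdGeom.Cube.Define(stage, "/Car/BodyPanel")  # hypothetical prim path

# The kind of edit USD Write exposes: changing OpenUSD data,
# e.g. swapping the paint color as in the Nissan demo.
panel.CreateDisplayColorAttr([Gf.Vec3f(0.8, 0.1, 0.1)])

# Serialize the edited layer; with Omniverse Cloud, a rendered view of this
# scene would be streamed to the Vision Pro rather than written to disk.
print(stage.GetRootLayer().ExportToString())
```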
Currently, Omniverse Cloud on the Vision Pro is concentrated on enterprise applications, in line with where Apple has focused its headset. Still, it lays a crucial foundation for streaming interactive 3D applications to the device in the future. The Vision Pro is powerful, but rendering highly detailed 3D scenes with features such as ray tracing on the headset itself would demand far more resources than it has, which makes cloud streaming a promising prospect for future apps.