Nvidia announces open beta for Omniverse as a ‘metaverse’ for engineers

Nvidia has announced an open beta for its Omniverse, a virtual environment the company describes as a “metaverse” for engineers.

Company CEO Jensen Huang showed a demo of the Omniverse, where engineers can work on designs in a virtual environment, as part of the keynote talk at Nvidia’s GPU Technology Conference, a virtual event being held online this week. More than 30,000 people from around the world have signed up to participate.

The Omniverse is a virtual tool that allows engineers to collaborate. It was inspired by the science fiction concept of the metaverse, the universe of virtual worlds that are all interconnected, like in novels such as Snow Crash and Ready Player One.

“The metaverse analogy is excellent,” Nvidia media and entertainment general manager Richard Kerris said in a press briefing. “It’s actually one that we use internally, quite a lot. You’ll be able to collaborate anywhere in the world in this virtual environment. And your workflow is key, whether you’re an end user or developer. So we really are excited about it as a platform.”

Leveraging Nvidia technology

Above: The Omniverse is where robots learn to be robots.

Nvidia has worked on the technology for some time, running an early access program for the past 18 months. The Omniverse enables photorealistic 3D simulation and collaboration. It is intended for tens of millions of designers, engineers, architects, and other creators and will be available for download this fall.

The Omniverse uses Nvidia’s RTX 3D simulation tech to enable engineers to do things like work on a car’s design inside a simulation while virtually walking around it or sitting inside it and interacting with it in real time. Engineers on remote teams will be able to work alongside architects, 3D animators, and other people working on 3D buildings simultaneously, as if they were jointly editing a Google Doc, Kerris said. He added, “The Omniverse was built for our own needs in development.”

Above: Nvidia’s Marbles at Night demo showcases complex physics and lighting in the Omniverse.

The open beta of Omniverse follows an early access program in which customers such as Foster + Partners and ILM — along with 40 other companies and 400 individual creators — have been evaluating the platform. The cloud-based platform runs in the datacenter using servers based on chips from Nvidia, such as the Nvidia RTX A6000 chips being introduced today.

Huang views the Omniverse as the beginning of the Star Trek Holodeck concept “realized at last.”

Huang said in his speech, “The metaverse is coming. Future worlds will be photorealistic, obey the laws of physics or not, and inhabited by human avatars and AI beings.”

He said games like Fortnite, Minecraft, and Roblox are like early versions of the metaverse. But he said the metaverse is not only a place to play games: it’s a place to simulate the future.

Pixar and other allies

Omniverse is based on Pixar’s widely adopted Universal Scene Description (USD), the leading format for universal interchange between 3D applications. Pixar developed USD for its own animated films before open-sourcing it. The platform also uses Nvidia technology such as real-time photorealistic rendering, physics, materials, and interactive workflows between industry-leading 3D software products.

“With the entire community starting to move toward this open platform [USD] for exchanging 3D information, including the objects, scenes, materials, and everything else, it was the best place for us to start,” Kerris said. “And because of that, we now are able to work with all kinds of third-party applications.”
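To make USD’s role as an interchange format concrete, here is a minimal sketch of what a USD text-format (.usda) layer looks like. The scene contents and file name are invented for illustration; real pipelines would author this through Pixar’s `pxr` Python bindings (`Usd.Stage`, `UsdGeom`, and so on) rather than by writing text directly.

```python
# Illustrative only: a hand-written minimal USD text-format (.usda) layer.
# Real tooling uses Pixar's `pxr` bindings; this just shows why USD works
# as a human-readable interchange format between 3D applications.

MINIMAL_USDA = """#usda 1.0
(
    defaultPrim = "Car"
)

def Xform "Car"
{
    def Sphere "Wheel"
    {
        double radius = 0.35
    }
}
"""

def write_layer(path: str) -> None:
    """Write the minimal layer so any USD-aware tool could open it."""
    with open(path, "w") as f:
        f.write(MINIMAL_USDA)

if __name__ == "__main__":
    write_layer("car.usda")
    print(open("car.usda").read().splitlines()[0])  # -> #usda 1.0
```

Because every participating application reads and writes the same layer description, objects, scenes, and materials can round-trip between tools without lossy per-tool converters.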

Omniverse enables collaboration and simulation that could become essential for Nvidia customers working in robotics, automotive, architecture, engineering, construction, manufacturing, media, and entertainment.

Above: Nvidia’s Omniverse can be used for entertainment creation.

Industrial Light & Magic, a Lucasfilm company and maker of visual effects for movies such as the Star Wars series, has been evaluating it for creative and animation pipelines.

Other early adopters include architectural design and engineering firms, among them Foster + Partners and Woods Bagot, as well as telecommunications companies. The tech gives them a hybrid cloud workflow for designing complex models and visualizations of buildings.

Omniverse has support from many major software leaders, such as Adobe, Autodesk, Bentley Systems, Robert McNeel & Associates, and SideFX. Blender is working with Nvidia to add USD capabilities to enable Omniverse integration with its software.

Simultaneous real-time access

Above: Nvidia’s Omniverse works with a lot of other technologies.

So it looks like engineers will be the first to kick the tires on the metaverse, which I’m hoping will someday replace the Zoomverse we’re all stuck in right now. Damn, I should have been an engineer. Being an engineering thinker, I asked Kerris whether the Omniverse would be able to deal with latency, or interaction delays across the cloud.

He noted that the only information that has to be transmitted across the internet to other users is the part of a project being changed. Most of what you’re looking at doesn’t have to be constantly refreshed on everybody’s screen, which makes the Omniverse’s real-time updating more efficient. Nvidia’s Nucleus tech acts as a kind of traffic cop, communicating what is changing in a scene as multiple parties work on it at once.
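The idea of transmitting only what changed can be sketched in a few lines. This is a hypothetical illustration of delta-based scene sync in the spirit of what Kerris describes; the `diff` and `apply_delta` functions and the flat attribute paths are invented for this example, not Nucleus’s actual API.

```python
# Hypothetical sketch of delta-based scene sync: only changed attributes
# cross the wire, so unchanged geometry never has to be re-sent.
# Function names and the flat path keys are illustrative, not Nvidia's API.

def diff(old: dict, new: dict) -> dict:
    """Return only the attributes whose values changed (or were added)."""
    return {k: v for k, v in new.items() if old.get(k) != v}

def apply_delta(scene: dict, delta: dict) -> dict:
    """Merge a broadcast delta into a replica of the shared scene."""
    merged = dict(scene)
    merged.update(delta)
    return merged

# One designer edits the car's paint color; only that change is broadcast.
shared = {"car/paint": "red", "car/wheels": 4, "building/floors": 30}
edited = {"car/paint": "blue", "car/wheels": 4, "building/floors": 30}

delta = diff(shared, edited)          # {'car/paint': 'blue'}
replica = apply_delta(shared, delta)  # replica now matches the edit
```

Shipping a one-entry delta instead of the whole scene is what keeps the bandwidth and latency cost proportional to the size of the edit, not the size of the model.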

Above: COVID-19 simulation in Omniverse.

“A decent connection to the cloud gives you the real-time performance that you’ll need to have that kind of workflow feel like you’re in the same room with one another, even if you are in different parts of the world,” Kerris said.

What can you do in this engineer’s metaverse? You can simulate the creation of robots through a tool dubbed Isaac. That lets engineers create variations of robots and see how they would work with realistic physics. So they can simulate what a robot would do in the real world by first making the robot in a virtual world. There are also Omniverse Connectors, which are plugins that connect third-party tools to the platform. That allows the Omniverse to be customized for different vertical markets.

If you’ve been wondering what technology might be useful for the regular person’s metaverse, the Omniverse offers a pretty big clue. As an FYI, VentureBeat is holding a metaverse conference on January 27.
