The DeanBeat: A Big Bang week for the metaverse


The metaverse had a couple of Big Bangs this week that should put it on everyone’s radar. First, Epic Games raised $1 billion at a $28.7 billion valuation. That is $11.4 billion more valuable than Epic Games was just nine months ago, when it raised $1.78 billion at a $17.3 billion value.

And it wasn’t raising this money to invest more in Fortnite. Rather, it explicitly said it was investing money in building out its plans for the metaverse, the universe of virtual worlds that are all interconnected, like in novels such as Snow Crash and Ready Player One. Epic Games CEO Tim Sweeney has made no secret of his ambitions for building the metaverse and how it should be an open metaverse.

And while that might sound crazy, he received $200 million from Sony in this round, on top of $250 million received from Sony in the last round. I interpret this to mean that Sony doesn’t think Sweeney is crazy, and that it too believes in his dream of making the metaverse happen. And if Sony believes in the metaverse, then we should expect all of gaming to set the metaverse as its North Star. Epic’s $1 billion in cash is going to be spent on the metaverse, and that amount of money is going to look small in the long run.

Epic Games has a foothold to establish the metaverse because it has the users and the cash. It has 350 million-plus registered users for Fortnite. And it has been investing beyond games into things like social networks and virtual concerts, as Sweeney knows that the metaverse — a place where we would live, work, and play — has to be about more than just games. Games are a springboard to the metaverse, but they’re only a part of what must be built.

Above: These people are not people. They are MetaHumans.

One of the keys to the metaverse will be making realistic animated digital humans, and two of Epic’s leaders — Paul Doyle and Vladimir Mastilović — will speak on that topic at our upcoming GamesBeat Summit 2021 conference on April 28 and April 29. This fits squarely into the notion of building out the experience of the metaverse. As my friend Jon Radoff, CEO of Beamable, noted in a recent blog post, we need avatars to engage in games, have social experiences, and listen to live music. Meanwhile, Nvidia announced this morning something called GanVerse, which can take a 2D picture of a car and turn it into a 3D model. It’s one more tool to automate creation for the metaverse.

To make the metaverse come to life, we need so many more layers, including discovery tools, a creator economy, spatial computing to deliver us the wow 3D experience, decentralization to make commerce between worlds seamless and permissionless, human interface and new devices that make the metaverse believable, and infrastructure too.

The Omniverse

Above: BMW Group is using Omniverse to build a digital factory that will mirror a real-world place.

And when you think about those things, that is what we got in another Big Bang this week as Nvidia announced its enterprise version of the Omniverse, a metaverse for engineers. By itself, that doesn’t sound too exciting. But drilling deeper into it, I learned a lot about how important the Omniverse could be in providing the foundational glue for the metaverse.

“The science fiction metaverse is near,” said Jensen Huang, CEO of Nvidia, in a keynote speech this week at the company’s GTC 21 online event.

First, Nvidia has been working on the Omniverse — which can simulate real-world physics — for four years, and it has invested hundreds of millions of dollars in it, said Richard Kerris, Nvidia media and entertainment general manager, in a press briefing.

Nvidia started this as Project Holodeck using proprietary technology. But it soon discovered the Universal Scene Description language invented by Pixar for describing 3D data in an open, standardized way. Pixar invented this “HTML of 3D” and shared it with its vendors because it didn’t want to keep reinventing 3D tools for its animated movies.

“The way to think about USD is the way you would think about HTML for the internet,” Huang said. “This is HTML for 3D worlds. Omniverse is a world that connects all these worlds. The thing that’s unique about Omniverse is its ability to simulate physically and photorealistically.”

Pixar open sourced USD about eight years ago, and it has spread to multiple industries. One of the best things about it is that it enables remote collaboration, where multiple artists can work on the same 3D model at once.
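To give a sense of why USD works as an interchange format, here is a minimal hand-written `.usda` scene file. This is my own toy illustration — the names are hypothetical and it is nowhere near a production asset — but it shows the human-readable, declarative structure that lets different tools read and layer the same scene:

```usda
#usda 1.0
(
    defaultPrim = "Car"
)

# A "prim" is the basic unit of a USD scene, arranged in a hierarchy.
def Xform "Car"
{
    def Sphere "Wheel"
    {
        double radius = 0.5
    }
}
```

Because the format is plain text and hierarchical, separate applications (or separate artists) can contribute overriding "layers" on top of the same prims rather than exporting and re-importing whole files — which is the collaboration property described above.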

Above: The metaverse market map

Nvidia made USD the foundation for the Omniverse, adding real-time capabilities. Now BMW Group, Ericsson, Foster + Partners, and WPP are using it, as are 400 enterprises. It has application support from Bentley Systems, Autodesk, Adobe, Epic Games, ESRI, Graphisoft, Trimble, Robert McNeel & Associates, Blender, Marvelous Designer, Reallusion, and Wrnch. That’s just about the entire 3D pipeline for tools used to make things like games, engineering designs, architectural projects, movies, and advertisements.

BMW Group is building a car factory in the Omniverse, replicating exactly what it would build in the real world but doing it first in a “digital twin” before it has to commit any money to physical construction. I saw a demo of the factory, and Nvidia’s engineers told me you could zip through it at 60 frames per second using a computer with a single Nvidia GeForce RTX card (if you can get one).

“You could be in Adobe and collaborate with someone using Autodesk or the Unreal Engine and so on. It’s a world that connects all of the designers using different worlds,” Huang said. “As a result, you’re in a shared world to create a theme or a game. With Omniverse you can also connect AI characters. They don’t have to be real characters. Using design tools for these AI characters, they can be robots. They can be performing not design tasks, but animation tasks and robotics tasks, in one world. That one world could be a shared world, like the simulated BMW factory we demonstrated.”

Above: Bentley’s tools used to create a digital twin of a location in the Omniverse.

Nvidia hopes to test self-driving cars — which use Nvidia’s AI chips — inside the Omniverse, driving them across a virtual U.S. from California to New York. It can’t do that in the real world. Volvo needs the Omniverse to create a city environment around its cars so that it can test them in the right context. And its engineers can virtually sit in the car and walk around it while designing it.

The Omniverse is a metaverse that obeys the laws of physics and supports things that are being created by 3D creators around the world. You don’t have to take a Maya file and export it in a laborious process to the Omniverse. It just works in the Omniverse, and you can collaborate across companies — something that the true metaverse will require. Nvidia wants tens of millions of designers, engineers, architects, and other creators — including game designers — to work and live in the Omniverse.

“Omniverse, when you generalize it, is a shared simulated virtual world. Omniverse is the foundation platform for our AR and VR strategies,” Huang said. “It’s also the platform for our design and collaboration strategies. It’s our metaverse virtual world strategy platform, and it’s our robotics and autonomous machine AI strategy platform. You’ll see a lot more of Omniverse. It’s one of the missing links, the missing piece of technology that’s important for the next generation of autonomous AI.”

Why the Omniverse matters to games

Above: Nvidia Omniverse

By building the Omniverse for real-time interaction, Nvidia made it better for game designers. Gamers zip through worlds at speeds ranging from 30 frames per second to 120 frames per second or more. With Nvidia’s RTX cards, they can now do that with highly realistic 3D scenery that takes advantage of real-time ray tracing, or realistic lighting and shadows. And Kerris said that most of what you see doesn’t have to be constantly refreshed on every user’s screen, making the real-time updating of the Omniverse more efficient.
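That efficiency point — sending each user only what changed in the scene rather than the whole scene — can be sketched in a few lines of Python. This is purely my illustration of the general delta-update idea, not Nvidia’s actual synchronization mechanism:

```python
def scene_delta(previous, current):
    """Return only the prims whose attributes changed between two snapshots."""
    return {
        prim: attrs
        for prim, attrs in current.items()
        if previous.get(prim) != attrs
    }

# Two snapshots of a tiny scene: only the car moved.
old = {"Car": {"pos": (0, 0, 0)}, "Factory": {"lights": "on"}}
new = {"Car": {"pos": (1, 0, 0)}, "Factory": {"lights": "on"}}

print(scene_delta(old, new))  # only the "Car" prim appears in the delta
```

Broadcasting deltas like this scales with how much is changing, not with how big the shared world is — which is why a collaborative scene can stay responsive for many users at once.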

Tools like Unreal or Unity can plug into the Omniverse, thanks to USD. They can be used to create games, but once the ecosystem becomes mature, they can also absorb assets created by other industries. Games commonly include realistic replicas of cities. Rockstar Games built replicas of New York and Los Angeles for its games. Ubisoft has built places such as Bolivia, Idaho, and Paris for its games. Imagine if they built highly realistic replicas and then traded them with each other. The process of creating games could be more efficient, and the idea of building a true metaverse, like the entire U.S., wouldn’t seem so crazy. The Omniverse could make it possible.

Some game companies are thinking about this. One of the studios playing with Omniverse is Embark Studios, founded by Patrick Soderlund, former head of studios for Electronic Arts. Embark is backed by Nexon, one of the world’s biggest makers of online games. And since the tools for Omniverse will eventually be simplified, users themselves might one day be able to contribute their designs to the Omniverse.

Huang thinks that game designers will eventually feel more comfortable designing their worlds while inside the Omniverse, using VR headsets or other tools.

Above: Nvidia’s Omniverse can simulate a physically accurate car.

“Game development is one of the most complex design pipelines in the world today,” Huang said. “I predict that more things will be designed in the virtual world, many of them for games, than there will be designed in the physical world. They will be every bit as high quality and high fidelity, every bit as exquisite, but there will be more buildings, more cars, more boats, more coins, and all of them — there will be so much stuff designed in there. And it’s not designed to be a game prop. It’s designed to be a real product. For a lot of people, they’ll feel that it’s as real to them in the digital world as it is in the physical world.”

Omniverse enables game developers working across this complicated pipeline, allowing them to be connected, Huang said.

“Now they have Omniverse to connect into. Everyone can see what everyone else is doing, rendering in a fidelity that is at the level of what everyone sees,” he said. “Once the game is developed, they can run it in the Unreal engine that gets exported out. These worlds get run on all kinds of devices. Or Unity. But if someone wants to stream it right out of the cloud, they could do that with Omniverse, because it needs multiple GPUs, a fair amount of computation.”

He added, “That’s how I see it evolving. But within Omniverse, just the concept of designing virtual worlds for the game developers, it’s going to be a huge benefit to their work flow. The metaverse is coming. Future worlds will be photorealistic, obey the laws of physics or not, and be inhabited by human avatars and AI beings.”

Brands and the metaverse

Above: Hasbro’s Nerf guns are appearing inside Roblox.

On a smaller scale, Roblox also did something important. It cut a deal with Hasbro’s Nerf brand this week, under which some brand-new blasters will come out on Roblox. Roblox doesn’t make the blasters itself. Rather, it picks some talented developers to make them, so that it stays true to its user-generated content mantra. But the fact that Roblox can partner with a company like Hasbro shows that brands have confidence in Roblox, as it has also demonstrated in deals with Warner Bros.

Usually, user-generated content and brands don’t mix. The users copy the copyrighted brands, and the brands have to take some legal action. But Roblox invests a lot in digital safety and it doesn’t seem to have as big a problem as other entities. That’s important. We know that Roblox is a leading contender for turning into the metaverse because it has the users — 36 million a day. But the real test is whether the brands will come and make that metaverse as lucrative as other places where the brands show up, like luxury malls.

And FYI, we’ve got a panel on Brands and the Metaverse at our GamesBeat Summit 2021 event on April 28-29. Kudos to Steven Augustine of Intel for planting that thought in my brain months ago.

I feel like the momentum for the metaverse is only getting stronger, and it is embedding itself in our brains as a kind of Holy Grail — or some other lost treasure in other cultures — that we must find in order to reach our ultimate goals.

