Jensen Huang received the chip industry’s highest honor, the Robert N. Noyce Award, last night at an event with his peers. And while the evening was all about semiconductors and AI, one thing the cofounder and CEO of Nvidia said caught my attention.
Speaking on stage with former New York Times journalist John Markoff, Huang said that Nvidia has found a purpose for the artificial intelligence and simulation technology made possible by its graphics chips and deep learning algorithms. His company is gathering all of the experts it needs to simulate climate science, so that the world’s biggest supercomputers will be able to model climate change and predict how the Earth will change over decades. Huang will use this to warn us all about the fate of the planet. And on top of that, he will build us the metaverse for free.
To do that, Nvidia will have to build something on the foundation of its Omniverse, the “metaverse for engineers” that has become a platform for simulating “digital twins.” For instance, BMW is building a digital twin of a car factory, and once it gets the simulation right, it will build the exact same thing in the physical world.
On Thursday evening, Huang echoed something he announced at the Nvidia GTC conference last week. He said that Nvidia plans to build the world’s most powerful AI supercomputer dedicated to predicting climate change. Named Earth-2, or E-2, the system would create a digital twin of the Earth in Omniverse.
“All the technologies we’ve invented up to this moment are needed to make Earth-2 possible. I can’t imagine a greater or more important use,” Huang said last week.
He said the simulation would be so precise that it would need meter-level accuracy, and if necessary, Nvidia would spend the money to offset the computing power used to run it. The challenge, Huang said, would be to take decades of climate data from satellite recordings and ingest it into the supercomputer for simulation.
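To get a feel for the kind of reduction that “ingesting decades of climate data” implies, here is a minimal sketch in Python. Everything in it is illustrative: the grid resolution, the warming rate, and the noise are synthetic stand-ins for real satellite records, not anything Nvidia has described.

```python
import numpy as np

# Synthetic stand-in for decades of satellite temperature readings:
# 40 years of monthly global grids at a coarse 4-degree resolution.
years, months = 40, 12
lat, lon = 45, 90
rng = np.random.default_rng(0)

# Baseline climate plus a small warming trend plus noise (all made up).
trend_per_year = 0.02  # degrees C per year, an illustrative number
base = 14.0 + rng.normal(0, 0.5, size=(lat, lon))
time = np.arange(years * months) / months  # fractional years
data = (base[None, :, :]
        + trend_per_year * time[:, None, None]
        + rng.normal(0, 0.3, size=(years * months, lat, lon)))

# "Ingest" step: collapse the raw record into yearly global means,
# the kind of reduction a simulation would start from.
yearly = data.reshape(years, months, lat, lon).mean(axis=(1, 2, 3))

# Fit a linear trend to recover the warming rate from the record.
slope, intercept = np.polyfit(np.arange(years), yearly, 1)
print(f"estimated warming: {slope:.3f} C/year")
```

Averaging over the grid washes out the noise, so the fitted slope lands close to the trend that was put in; real ingestion pipelines do the same kind of aggregation, just on petabytes of data instead of a toy array.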
And last night, he added, “We’re going to go build that digital twin of the Earth. It’s going to be gigantic. This is going to be the largest AI supercomputer on the planet. It’s going to bring some of the brightest computer scientists, the brightest physical scientists and climate scientists on the planet to go and use that computer to predict” how the Earth will change over decades.
It hit me that if Nvidia builds this physically accurate world in the Omniverse, that would be the equivalent of a real-world metaverse. So I went up to Huang afterward and took a selfie with him. Then I asked him, “If you build this digital twin of the Earth, do you get the metaverse for free?”
And he said, “If we build the digital twin of the Earth, we will get the metaverse for free.”
It’s going to be a huge mission. By building out the Omniverse, Nvidia can justify finding the funding sources and expertise necessary to take on climate change as a problem. And the Omniverse could replicate many of the details of the Earth that game developers and other metaverse builders would need to create their own virtual worlds, such as a replica of New York City that we could use in a game.
For sure, some of the details are going to be different. As Richard Kerris, who leads Omniverse development, told me in an interview, game developers will want much of the world to run fast enough to render at a smooth 60 frames per second, and he noted that some of the simulation’s details would be too fine-grained for that purpose.
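The tradeoff Kerris describes comes down to a frame budget: at 60 frames per second a renderer has less than 17 milliseconds per frame, so a game client would swap the meter-accurate geometry for a coarser level of detail (LOD). The sketch below illustrates the idea; the LOD names, triangle counts, and costs are invented numbers, not anything from Omniverse.

```python
# Why a game client down-samples a meter-accurate digital twin:
# at 60 fps the renderer has under 17 ms per frame, so it picks
# the finest level of detail (LOD) that still fits the budget.

FRAME_BUDGET_MS = 1000.0 / 60.0  # ~16.67 ms per frame at 60 fps

# Hypothetical LODs: (name, triangle count, estimated render cost in ms),
# ordered from finest to coarsest.
LODS = [
    ("meter-accurate", 2_000_000, 40.0),
    ("medium", 200_000, 8.0),
    ("coarse", 20_000, 1.5),
]

def pick_lod(remaining_budget_ms):
    """Return the most detailed LOD whose cost fits the remaining budget."""
    for name, triangles, cost_ms in LODS:
        if cost_ms <= remaining_budget_ms:
            return name
    return LODS[-1][0]  # nothing fits: fall back to the coarsest mesh

print(pick_lod(FRAME_BUDGET_MS))  # the full-detail mesh would blow the frame
```

Under these made-up costs, the scientific-grade mesh at 40 ms can never ship a 60 fps frame, which is exactly why the game-facing version of the twin would be a simplified copy rather than the simulation itself.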
But this is the thing: Nvidia is going to build this version of the world anyway, and game developers can use the Omniverse as a free foundation for whatever they want to build for entertainment.
I have always wondered how ambitious companies are going to build something like the metaverse. Now I see that some things are more important to simulate, and if we accomplish them, we’ll get a very special bonus: a replica of the Earth that we can use as a jumping-off point for our imaginations to build our own versions of the metaverse.
This reminds me of “dual use” technologies once developed for the military, like the original 3D simulators that trained soldiers how to drive tanks or fly jets. Those simulators became the foundations of modern 3D video games, from virtual reality simulations to fighter jet games. Here, we’ll have one of the most important scientific projects of all time, with contributions from so many experts.
Maybe the governments of the world can provide Nvidia with some of the funding it needs, or Nvidia, which has a market value of $789 billion, could afford to build this on its own. And once it’s built, it can become the foundation for the modern metaverse.
My point is this. We don’t often get things for free. But if we invest in deciphering our climate and helping everybody understand something that could kill our planet, then we should be grateful to get something like the metaverse, which would help us enjoy our virtual worlds without killing the planet, as a byproduct.
Omniverse Avatars in enterprises and games
On another level, Nvidia is also creating AI-driven characters to model the behavior of both pedestrians and self-driving cars. It is making the investment to re-create human-like behavior, and last week it also unveiled its Omniverse Avatars, which are 3D characters that use AI to serve as, among other things, non-player characters in games.
Huang said these Omniverse Avatars will be packed with technologies, including speech recognition, natural-sounding speech synthesis from text, multi-language translation, face animation, eye tracking, and other capabilities that go into making quality game characters.
“When you’re playing these games, you’re just going to talk to the characters inside,” he said. “They’ll understand. They’ll literally understand. You’ll say, ‘Go up there and take a right.’ You’ll be able to talk to your team, talk to your partners. It’s going to be so interesting. They’ll talk back. They’ll have computer vision. They’ll go forward and see there’s a tank coming around the corner, and because they see it, not just because it was coded into the game that way. They’ll see it. It’ll be a lot easier for characters to use perception than for the programmer to write everything into the software.”
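The stack Huang describes is a chain of models: hear the player, understand the command, then answer with speech and animation. Here is a minimal sketch of one conversational turn with that shape. Every stage is a hypothetical stub standing in for a real model (speech recognition, language understanding, speech synthesis, face animation); none of the function names are actual Nvidia APIs.

```python
# A toy pipeline with the shape of an Omniverse Avatar-style game
# character: audio in, understanding, a decision, speech plus
# animation out. All stages are stubs, not real models or APIs.
from dataclasses import dataclass, field

@dataclass
class AvatarResponse:
    text: str                        # what the character says back
    viseme_track: list = field(default_factory=list)  # mouth shapes for face animation

def recognize_speech(audio: bytes) -> str:
    # Stand-in for an automatic speech recognition model; here we
    # pretend the "audio" bytes are already their own transcript.
    return audio.decode("utf-8")

def understand(command: str) -> dict:
    # Stand-in for natural language understanding: map phrasing to intent.
    if "take a right" in command.lower():
        return {"intent": "move", "direction": "right"}
    return {"intent": "unknown"}

def respond(intent: dict) -> AvatarResponse:
    # Stand-in for dialogue, text-to-speech, and face animation.
    if intent["intent"] == "move":
        text = f"Copy that, heading {intent['direction']}."
    else:
        text = "Say again?"
    # A speech-synthesis model would emit real visemes; fake them here.
    return AvatarResponse(text=text, viseme_track=[c for c in text if c.isalpha()])

def avatar_turn(audio: bytes) -> AvatarResponse:
    """One conversational turn: hear, understand, answer."""
    return respond(understand(recognize_speech(audio)))

reply = avatar_turn(b"Go up there and take a right")
print(reply.text)
```

The point of the sketch is the architecture, not any one stage: each stub is a slot where a perception or generation model plugs in, which is why Huang can describe the same pipeline powering both game characters and drive-through attendants.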
Huang thinks that one of the more important reasons to create the Omniverse Avatars is to provide better customer service.
“There’s a severe shortage in drive-throughs, fast food. We’ve now made it possible for you to have a very good conversation with an avatar,” he said. “You can say it in just about any way you like. You could ask it for recommendations. There are a lot of ways to describe a burger now, but it will still recognize what you mean. Everything from customer service, intelligent retail checkout, you name it. There are 25 million restaurants and stores, and everybody’s short on labor. This is going to be one of the top areas.”
The Omniverse Avatars are another example of dual-use technologies. They will be built for customer service, which is a compelling business model. (And I hope it won’t simply eliminate human jobs.) But they will also provide the human-like avatars that we’re going to need by the millions as realistic characters inside a game world or a metaverse world.
These Omniverse Avatars could be detached from the Omniverse if we’d like them to be, Huang said. But if they learn new behavior that is more realistic, that should be fed back into the Omniverse to improve the models for the characters. Again, I brought up the notion of how the Omniverse could become the metaverse — at least the kind that game developers would create.
“There are so many different applications for the metaverse,” Huang said. “For consumers, for video games and consumers we’ve already spoken about, it will likely be that we’re the engine. We’re the underlying technology. We’re the engine of it. In the case of enterprise use, particularly industrial use, we will be the entire engine, the entire simulation engine for the digital twin.”
He added, “In the case of edge — this retail application we were just talking about is really edge computing from enterprise terms. It’s just so cute that you don’t think about it as an edge computing application anymore. But this is almost the ultimate edge computing application. Instead of a fleet of tractors or a fleet of AMRs, autonomous moving robots, it’s a bunch of little animated characters. This is the thing that moves those pixels. These are really robotics applications. They’re just really cute.”
And he said, “Enterprise edge remains one of the great opportunities for the metaverse, for Omniverse. On the other extreme, of course, is all of the consumer stuff. With consumers, the way we’ll work with the industry is very similar to the way we work with the video game industry today. We’ll provide the underlying engine, the infrastructure. It might be in the cloud. It might be GFN. It might be in their cloud. We’ll provide the engine-level infrastructure.”