If there are three “buzzword” topics that have generated more hype than most in recent years, they would be digital twins, generative AI and the metaverse.
However, one area where they’re undoubtedly creating more than just hot air is gaming and 3D design, where companies like Unity and Epic Games are leading the effort to bring these technologies together.
Unity’s and Epic Games’ platforms are best known for powering many of the most popular video games in history, but they are also widely used to create experimental 3D designs, virtual reality environments and all kinds of simulations for industrial and leisure markets.
To discuss this convergence and the potential it offers for democratizing access to real-time 3D design, I recently took the opportunity to speak with Marc Whitten, President of Create at Unity.
We talked about some of the ways artificial intelligence (AI) — specifically the emerging class of generative AI applications like ChatGPT and Stable Diffusion — will soon make it easier for anyone to create digital twin simulations and interactive 3D experiences and environments. This has the potential to revolutionize the many industries that have already embraced these technologies – including gaming, automotive, manufacturing and healthcare.
But the potential doesn’t stop there. As well as democratizing the creative process, the benefits will no doubt be felt at “runtime” – when users will be able to interact with simulations, digital twins and immersive environments using natural language, and retrieve the information and insights they generate in real time.
He told me, “When you connect (3D simulated environments) to natural language and AI-based tools, you can literally ask… ‘Hey computer, what’s happening on the shop floor?’ or ‘What are the three most important issues that I need to pay special attention to now?’”
In this theoretical situation, a supervisor oversees the operation of their premises or facility through a real-time 3D graphical representation. It could be a factory, a retail store, or an entertainment venue such as an amusement park or sports stadium. Viewed on a screen, through a virtual reality (VR) headset, or even as images superimposed on the real-world view through augmented reality (AR) glasses, they can track what is happening – and predictions of what will happen next – in real time.
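To make the idea concrete, here is a minimal, purely illustrative sketch (in Python rather than Unity’s own tooling) of how a natural-language question might be routed against the live state of a facility model. Everything here – the FacilityTwin class, the metrics, the answer helper – is a hypothetical stand-in of my own; a production system would hand the retrieved state to a language model rather than matching keywords.

```python
from dataclasses import dataclass, field

@dataclass
class FacilityTwin:
    """Hypothetical live snapshot of a shop floor, fed by real-world sensors."""
    machine_status: dict = field(default_factory=dict)  # machine id -> "running" / "fault" / "idle"
    throughput_per_hour: float = 0.0

    def top_issues(self, n=3):
        # Surface machines that are not running (toy heuristic for "most important issues").
        return [(m, s) for m, s in self.machine_status.items() if s != "running"][:n]


def answer(twin: FacilityTwin, question: str) -> str:
    """Toy stand-in for 'Hey computer, what's happening on the shop floor?'
    A real system would pass the twin's live state to a language model;
    here we simply branch on keywords to show the data flow."""
    if "issues" in question.lower():
        issues = twin.top_issues()
        return "Top issues: " + ", ".join(f"{m} is {s}" for m, s in issues) if issues else "No open issues."
    running = sum(s == "running" for s in twin.machine_status.values())
    return f"{running} machine(s) running, throughput {twin.throughput_per_hour:.0f} units/hour."


twin = FacilityTwin({"press-1": "running", "press-2": "fault", "conveyor-A": "idle"}, 420.0)
print(answer(twin, "What's happening on the shop floor?"))
print(answer(twin, "What are the three most important issues right now?"))
```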
Could this – rather than the whimsical 3D worlds of Meta’s Horizon Worlds or Decentraland – be the truer vision of how the much-discussed metaverse will develop?
“The term is overrated almost to the point of meaninglessness,” Whitten tells me.
“It started out by covering everything… What people really want is to think about the next generation of these experiences where 3D plays a part – the ability for everyone to interact, whether it’s in a group or in a broader setting… That’s why I prefer terms like ‘digital twin’… they describe (better) what a given company is trying to do to get value out of the technology.”
More and more companies and industries are turning to the world of gaming to help them achieve that vision – because that’s where the expertise lies.
In the 1990s, 3D graphics technology advanced enough that video games began to move away from the two-dimensional, Pac-Man-style bitmap images that characterized the first two decades of their existence.
Since then, game developers have pushed the boundaries of creating realistic, simulated worlds using tools like Unity and Unreal Engine. That know-how is now being leveraged by companies like those responsible for building the new Vancouver airport, who created a realistic 3D simulation before a single construction tool was picked up, and automakers like Daimler, who have adopted the technology for configurator apps that buyers can use to select options for new vehicles.
Whitten looks beyond these existing applications to a future where industries of all kinds transform their static, flat data into real-time 3D models and redesign their value chains from the ground up.
In his view, companies will typically start with a “representative” digital twin – where a 3D environment is identifiable as the environment, process or system it models. From there, they progress to a “connected” digital twin and then to a “live data” twin – a level of maturity where the simulation is fed by data collected from real-world sensors and scanners, allowing the model to be updated as events happen.
He refers to the next and final stage of maturity in this model as the “predictive twin”. Here, the simulation and data modeling are sophisticated enough that the operator can effectively look into the future.
He says, “That’s part of the beauty of connecting these things through a real-time engine. Real-time 3D means it’s physical, so you can run the clock forward and say, under those conditions, we’ll simulate the next period – based on the real-world data coming in, as well as future simulated data.
“What does that tell us to react to? Having that kind of information flowing through the company will enable everyone to make more informed decisions much more quickly.”
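As a rough illustration of “running the clock forward”, here is a small Python sketch – not Unity code, and every name and number in it is invented for the example – in which a twin’s state is projected ahead from the latest sensor reading using a simple cooling model, so an operator can preview the next few hours under current conditions.

```python
import random

def simulate_forward(current_temp_c, ambient_temp_c, hours, cooling_rate=0.1):
    """Project a machine's temperature forward in time with a toy
    Newton-cooling model plus noise standing in for simulated workload.
    In a real predictive twin this step would be a physics simulation
    driven by live sensor data."""
    projections = []
    temp = current_temp_c
    for hour in range(1, hours + 1):
        load_heat = random.uniform(0.5, 2.0)  # stand-in for simulated future load
        temp += load_heat - cooling_rate * (temp - ambient_temp_c)
        projections.append((hour, round(temp, 1)))
    return projections

# The latest real-world reading says the machine is at 68 C in a 22 C hall.
for hour, temp in simulate_forward(current_temp_c=68.0, ambient_temp_c=22.0, hours=6):
    alert = "  <-- over 75 C threshold, schedule maintenance" if temp > 75.0 else ""
    print(f"+{hour}h: {temp} C{alert}")
```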
Returning to the world of gaming, Whitten sees applications for generative AI and natural language technology that promise to deliver richer and more immersive experiences for the gamers of the future.
Again, he sees benefits both in the creative phase – where generative AI will reduce the amount of work involved in creating content and environments by allowing designers to simply describe what they want rather than painstakingly building everything with 3D modeling tools – and in gameplay itself, where players will interact with non-player characters (NPCs) that display seemingly realistic intelligence and conversational skills. In other words, no more identical guards patrolling the medieval towns or space stations of our game environments.
To bring this exciting vision to life, however, ways will need to be found to integrate AI processing into the game loop in a commercially viable manner.
As Whitten explains, “Because games are potentially played by 100 million people – and if they all had to go to the cloud to use AI and figure out what an NPC is going to say, it would cost too much money.”
However, this is a problem Unity has been working to solve, and edge computing will likely be part of the solution — with smaller neural networks running on players’ own devices. Maybe not as much processing power as GPT-4, but definitely enough to make today’s NPCs look like the wooden toys our grandparents played with.
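A minimal sketch, assuming a small dialogue model that fits on the player’s device, of what that might look like. The LocalDialogueModel class and its generate method are placeholders of my own – Unity has not published this as its API – but the shape of the loop shows why on-device inference keeps the per-player cost of smarter NPCs close to zero.

```python
class LocalDialogueModel:
    """Placeholder for a small on-device language model. A real implementation
    would load quantized weights and run inference locally instead of calling
    a cloud endpoint for every line of NPC dialogue."""
    def generate(self, prompt: str, max_words: int = 30) -> str:
        # Toy canned response; real local inference would go here.
        return "Rumour has it the mines to the north closed last week."


def npc_reply(model, npc_persona, memory, player_line):
    """Build a prompt from the NPC's persona and recent memory, then
    generate the reply entirely on the player's device."""
    prompt = (
        f"Persona: {npc_persona}\n"
        f"Recent events: {'; '.join(memory[-3:])}\n"
        f"Player says: {player_line}\n"
        "NPC replies:"
    )
    reply = model.generate(prompt)
    memory.append(f"Player asked: {player_line}")  # keep context for the next exchange
    return reply


model = LocalDialogueModel()
memory = ["Guard saw the player enter the city gate at dusk."]
print(npc_reply(model, "weary town guard, suspicious of strangers", memory,
                "Anything interesting happening around here?"))
```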
Overall, Whitten tells me he’s extremely excited about the impact these three cutting-edge technology trends will have on gaming, as well as the broader industry and economy.
He tells me, “The mission at Unity is that we believe the world will get better with more creators… We’ll understand things that we’ve never had the opportunity to before, and those things will feel more alive and real because we can infuse them with AI… and that will lead to extraordinary things.”
You can click here to watch my full conversation with Marc Whitten, President of Create at Unity, in which we delve deeper into the convergence of generative AI, digital twins and the metaverse, and the impact he predicts they will have on many industries.
To stay on top of new and emerging business and technology trends, make sure to subscribe to my newsletter, follow me on Twitter, LinkedIn and YouTube, and check out my books Future Skills: The 20 Skills and Competencies Everyone Needs to Succeed in a Digital World and The Future Internet: How the Metaverse, Web 3.0 and Blockchain Will Transform Business and Society.