Facebook went all in on the “metaverse” a year and a half ago. The company changed its name to Meta and started pumping a billion dollars a month into a Hail Mary boondoggle to achieve relevance in the post-social world to come.
Now Meta finds itself entering a “metaverse winter” — a general decline in investment and excitement around the idea. Meta itself has laid off thousands in its metaverse and social businesses alike.
The metaverse isn’t a set of technologies; it’s a vision about future human culture. It’s about what product companies and the public might do with a set of technologies — mainly live and work in virtual spaces and play in virtual worlds.
Apple has been developing what it calls “extended reality” hardware for two decades, and is now expected to roll out its first goggles later this year. The glasses will be virtual reality (VR) capable, but Apple will emphasize augmented reality (AR).
Apple now owns a quarter of the enterprise PC market, half the enterprise smartphone market and most of the enterprise tablet market. One under-appreciated question: How will Apple leverage its extended reality platforms to expand its dominance in the enterprise?
It’s a reasonable prediction that over the next five years, Apple will target business communications (the bionic meeting room) and other white-collar applications, industrial design and — you guessed it! — the coming “digital twin” revolution. There is no VR or AR without 3D virtual spaces and virtual objects, which have to be designed and built and — in the case of AR — placed into a digitized scan of the actual 3D environment.
The most advanced version of all this technology for navigating virtual spaces and conjuring up virtual objects in the real world — and designing and building and scanning — will happen not just for the “metaverse,” but also for the benefit of “digital twin” platforms.
Digital twins: when failure is not an option
On April 11, 1970, three astronauts found themselves in a spaceship hurtling toward the moon at 400 miles per minute. The plan was to make NASA’s third manned moon landing. Suddenly, the astronauts on board Apollo 13 heard a loud “bang!” That was the sound of a small explosion that blew off the side of the spacecraft, cut its power and dumped the crew’s oxygen supply into space.
With no new air replenishing the cabin, the astronauts scampered into the lunar module (LM) — the separate, detachable spacecraft designed to actually land on the moon while the main craft remained in lunar orbit.
The landing was cancelled. Now, the mission had only one objective: to somehow, against all odds, get the astronauts back to Earth alive. To do so, the crew had to repurpose and re-engineer different parts of their spacecraft to do a large number of things those parts weren’t designed to do.
In the end, their lives were saved in part because NASA had what was essentially the world’s only “digital twin” system.
A “digital twin” is a virtual replica of an existing physical object, system or infrastructure. In NASA’s case, this came in the form of 15 simulators used for training and for testing mission parameters. NASA engineers used the computer-simulation capabilities of the simulators to figure out what went wrong, test a variety of potential solutions, and choose the best, which they relayed to the Apollo crew.
The concept was so successful that NASA began deliberately creating “digital twins” of spacecraft separate from the simulators. NASA coined the term “digital twin” in 2010.
A “digital twin” is not an inert model. It’s a personalized, individualized, dynamically evolving digital or virtual model of a physical system. It’s dynamic in the sense that everything that happens to the physical system also happens to the digital twin — repairs, upgrades, damage, aging, etc.
Companies are already using “digital twins” for integration, testing, monitoring, simulation, and predictive maintenance of bridges, buildings, wind farms, aircraft and factories. But these are still very early days in the “digital twin” realm.
How to understand digital twins
A digital twin system has three parts: The physical system, the virtual digital copy of that physical system and a communications channel linking the two. Increasingly, this communication is the relaying of sensor data from the physical system.
It’s made from three major technology categories. If you imagine a Venn diagram of “metaverse” technologies in one circle, “IoT” in a second circle and “AI” in the third, “digital twin” technology occupies the overlapping center. Digital twins are different from models or simulations in that they are far more complex and extensive and change with incoming data from the physical twin.
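That three-part structure — physical system, virtual copy, and a channel carrying sensor data between them — can be sketched in a few lines of code. This is a minimal, hypothetical illustration; the class and field names are invented for this example and don’t come from any real digital twin platform:

```python
from dataclasses import dataclass, field

@dataclass
class SensorReading:
    """One message on the physical-to-digital communications channel."""
    sensor_id: str
    value: float

@dataclass
class DigitalTwin:
    """The virtual copy: mirrors the physical system's last known state."""
    asset_id: str
    state: dict = field(default_factory=dict)

    def ingest(self, reading: SensorReading) -> None:
        # Each incoming reading updates the twin, so it tracks the
        # physical system instead of drifting like a static model would.
        self.state[reading.sensor_id] = reading.value

# The physical system (say, an instrumented bridge) streams readings in:
twin = DigitalTwin(asset_id="bridge-042")
twin.ingest(SensorReading("strain-gauge-1", 0.0042))
twin.ingest(SensorReading("strain-gauge-1", 0.0051))  # newer reading replaces older
```

The point of the sketch is the update loop: what distinguishes a twin from an ordinary model is that it keeps changing as data arrives from its physical counterpart.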
The digital twin implementations that exist today in many industries are all nascent. Detailed digital twins are still impossible for complex systems. We’re still waiting for better AI, better sensors and better tools such as the ones we assume will drive the “metaverse.”
Let’s look ahead a few years to see how digital twins will serve as a cornerstone for enterprise digital transformation.
It’s 2027, and a delivery drone company goes all-in on digital twins, creating a separate digital twin of each of 15,000 drones in operation in major cities worldwide. Every actual part of each individual drone is mapped one-to-one with a digital, virtual counterpart. Dozens of sensors embedded all over the physical drone measure temperature, humidity, vibration, wing stresses and the operational efficiency of moving parts. Conditions of the drone itself — altitude, speed, direction, external moisture levels and many other metrics — update the digital drone in real time. All this data is fed into the digital drone, changing its operations and affecting its virtual state.
Suddenly, one of the drones falls out of the sky and crashes. But why?
Engineers working from home don VR goggles and bring up the crashed drone’s digital twin in a high-resolution 3D shared virtual environment. They replay the crash while moving around inside the drone, which shows 3D copies of all parts, plus sensor-based contextual data — basically AR inside VR. They quickly realize that the rudder controller failed because of overheating.
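Replaying a crash like this implies that the twin keeps a timestamped history of its states, not just the latest one. Here’s a hedged sketch of that idea — the snapshot fields and values are invented for the drone scenario, not drawn from any real system:

```python
import bisect
from dataclasses import dataclass

@dataclass(frozen=True)
class Snapshot:
    """Twin state at one moment (hypothetical fields for the drone example)."""
    t: float               # seconds since takeoff
    rudder_temp_c: float
    altitude_m: float

class TwinHistory:
    """Append-only log of twin states, so a failure can be replayed later."""
    def __init__(self):
        self._log = []

    def record(self, snap: Snapshot) -> None:
        self._log.append(snap)

    def state_at(self, t: float) -> Snapshot:
        # Most recent snapshot at or before time t (binary search on timestamps).
        i = bisect.bisect_right([s.t for s in self._log], t) - 1
        return self._log[max(i, 0)]

history = TwinHistory()
history.record(Snapshot(t=0.0, rudder_temp_c=41.0, altitude_m=0.0))
history.record(Snapshot(t=60.0, rudder_temp_c=88.5, altitude_m=120.0))
history.record(Snapshot(t=61.5, rudder_temp_c=97.2, altitude_m=118.0))

# Engineers "scrub" back to just before the crash:
before_crash = history.state_at(61.0)
```

In a real deployment the log would live in a time-series database rather than a Python list, but the replay operation — query the state at an arbitrary past moment — is the same.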
In a normal aviation scenario, all 15,000 controllers would be replaced at very high cost and without any assurances the new controllers wouldn’t also fail. But in the digital twin scenario, there’s a better way.
Digital twins to the rescue
Partnering with AI, the engineers determine that this particular controller failed because it operated in Phoenix, AZ, where ground temperatures can exceed 115 degrees in the shade, and rise higher in direct sun. The repeated heating, cooling, and heating over time weakened a chemical adhesive in the controller.
It gets better! The company also maintains a digital twin of its drone factory — a detailed virtual replica of the entire system, updated in real time by myriad sensors on every part of the physical factory. So it can trace the history of the specific failed controller; the AI points out that it was manufactured in summer and ranked in the top five percent for heat exposure during assembly. It appears that the damaging heat stress probably began in the factory.
Like a chess computer, the AI considers 57 possible “moves,” or remedies, recommending the safest and most cost-effective: 1) manufacture all future controller parts in winter and store them for assembly; 2) switch to a more heat-resistant adhesive in the part; and 3) preemptively replace the controller on the 47 other drones that operate in hot climates.
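Stripped of the AI framing, picking among remedies is an optimization over cost and residual risk. A toy sketch — all the names, dollar figures and risk estimates below are made up for illustration:

```python
from dataclasses import dataclass

@dataclass
class Remedy:
    """A candidate fix; cost and risk figures are illustrative, not real."""
    name: str
    cost_usd: float
    residual_risk: float  # estimated probability of another failure, 0..1

def pick_remedy(remedies, max_risk=0.05):
    """Cheapest remedy among those under the acceptable risk threshold."""
    safe = [r for r in remedies if r.residual_risk <= max_risk]
    return min(safe, key=lambda r: r.cost_usd) if safe else None

candidates = [
    Remedy("replace all 15,000 controllers", 4_500_000, 0.02),
    Remedy("replace controllers on 47 hot-climate drones", 14_100, 0.01),
    Remedy("do nothing", 0, 0.30),
]
best = pick_remedy(candidates)
```

A production system would weigh many more factors (downtime, supply chains, regulatory exposure), but the shape of the decision — filter by safety, then minimize cost — is why the twin-plus-AI combination beats the replace-everything default.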
In this example, using the digital twin system saved money, prevented accidents, helped the environment (by not requiring the replacement of all controller parts), and made positive changes in operations and manufacturing without serious downtime for either the factory or the drones.
This is the sharp end of the digital transformation revolution, using advanced technologies for agility, cost-efficiency, time-efficiency, and safety.
It’s time to reassess the benefits of the technologies we’re always talking about. IoT becomes mission-critical technology. AI partners with engineers to optimize every process in real time. And AR and VR make digital twins come to life as vividly as their physical counterparts.
Virtual spaces aren’t just about creating metaverse fantasy worlds. They’ll be better used to improve the real world.