The Metaverse Defined, AR/VR Meets DC Comics

DC Comics Dr. Manhattan

DC Comics' Dr. Manhattan explains the metaverse as a set of independently observed parallel universes that either fork or converge to one sane truth for all observers, despite momentary divergences observable only to superheroes, because only they can observe multiple universes at the same time.

HOLLYWOOD, CA (GoshRobin) 2022/7/15 – An interesting metaverse article from Kevin Niekrawietz popped up on my feed. Kevin and I know each other from Capgemini Invent and Frog. Working with Capgemini, I researched how to bring human resources and banking into the metaverse.

One point in Kevin’s article, which I’ll expand upon here, says:

There is still no clear definition of the Metaverse.

How can this be true? Can we not answer with a clear definition?

There is a definition of the metaverse at Wikipedia, but it’s not very satisfying:

In futurism and science fiction, the metaverse is a hypothetical iteration of the Internet as a single, universal and immersive virtual world that is facilitated by the use of virtual reality (VR) and augmented reality (AR) headsets. In colloquial use, a metaverse is a network of 3D virtual worlds focused on social connection.

DC Metaverse

For a more metaphysical definition, we may turn to the bible of DC canon, to the insights of radioactive physicist Dr. Manhattan:

“The Birth of the Speed Force rattles the METAVERSE, Superman’s TimeLine shifts forward and Reality Divides for THE FIRST TIME, Creating the Multiverse. Every time there is a CHANGE in the METAVERSE, the Multiverse Grows”

DC Metaverse Analysis

So what does Dr. Manhattan mean by all that?

The Metaverse we all know is the one DC calls Prime Earth. We only experience our one metaverse, but superheroes can move between different metaverses. From the superhero point of view, The Multiverse is the collection of universes to travel between. Usually, separate universes in The Multiverse do not affect each other.

Every time a player in his or her metaverse experiences a crisis and makes a decision that changes it, the changes may ripple out to the other universes in the multiverse to keep the superhero’s metaverse consistent, by creating, changing, or eliminating a universe (per Schrödinger’s Cat). However, the speed of light, a universal physical constant, ensures time passes self-consistently in each metaverse, so that each player avoids observable paradoxes and thereby remains sane.

There is not only the DC Multiverse; other multiverses together make up the Omniverse, a grand universe of infinite dimensions. By the way, there is a useful mathematical model for this: tensors in infinite dimensions.
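Purely as an illustrative sketch of my own (not DC canon, and not from Kevin’s article), that model can be pictured by giving each universe its own state space and treating the omniverse as their tensor product, which gains another factor every time a new universe is created:

```latex
% Sketch only: \mathcal{H}_i is the state space of universe i; an omniverse
% state is an infinite tensor product over all universes.
\Psi_{\text{omniverse}} \;=\; \bigotimes_{i=1}^{\infty} \psi_i ,
\qquad \psi_i \in \mathcal{H}_i
```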

Meanwhile, Back on Prime Earth

The metaphysics described above may be easier to think about by relating it to everyday experiences we understand as software developers and product designers.

Code as Metaverse

When programming 3D in Unreal Engine (UE) or Unity, we may think of that system as our coding universe. Anything we create with UE is our own metaverse. The UE source code exists in a GitHub repo, along with other related and unrelated codebases, something like a multiverse.

If we create our own fork of UE on GitHub, we are splitting the codebase into a new universe. Going forward, our fork diverges, but it may later merge again with Epic’s UE codebase if we issue a pull request.
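As a minimal sketch of that fork-diverge-merge flow, here is what the local half might look like using the GitPython library. The repository URLs, paths, and branch names below are placeholders of mine, not Epic’s actual setup (access to their UE repo requires a linked Epic Games account), and the pull request itself still happens on GitHub.

```python
# Minimal sketch of fork -> diverge -> converge, using GitPython.
# Repository URLs and branch names are hypothetical placeholders.
from git import Repo

# The "fork" was already made on GitHub; clone our copy into its own universe.
repo = Repo.clone_from("https://github.com/your-org/UnrealEngine-fork.git",
                       "UnrealEngine-fork")

# Diverge: create a branch and commit our own changes.
repo.git.checkout("-b", "my-divergent-feature")
# ... edit files here ...
repo.git.add(all=True)
repo.index.commit("Diverge from upstream")

# Converge again later: pull in the upstream changes before opening
# a pull request back to the original codebase.
upstream = repo.create_remote("upstream",
                              "https://github.com/EpicGames/UnrealEngine.git")
upstream.fetch()
repo.git.merge("upstream/release")
```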

GitHub is not the only place to host source code. I keep codebases at both GitHub and GitLab, as separate projects (universes). These and all the other repos in the world, thought of together, are something like an omniverse.

Converging Diverging Game Universes

When creating a First-Person Shooter (FPS) game, we must contend with network latency: network packets are not instantaneous. Information limited to traveling at the speed of light can create observable discrepancies that must be resolved to keep the game’s physical world sane.

Cold Waters

Consider a video game like Cold Waters, with two virtual 3D ships engaged in a battle. One ship fires a torpedo at the other. At the network level, Ship One dispatches a packet to the game server, to be relayed to the other players, announcing that Ship One has fired.

Anticipating the coming attack, Ship Two is already veering away sharply. Ship Two dispatches a packet to the game server, to be relayed to the other players, announcing that Ship Two has changed course.

Meanwhile, each player’s UE game client continues tracking stale data at 120 fps (a screen update roughly every 8 milliseconds). Ship One sees Ship Two hit. Ship Two has veered away and sees Ship One’s torpedo miss.
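To make the stale-data problem concrete, here is a hedged sketch in illustrative Python of my own (not ModSAF or UE source) of a client dead-reckoning a remote ship forward from the last packet it received; until a newer packet arrives, every frame is drawn from an extrapolated guess.

```python
import time

TICK = 1.0 / 120.0  # 120 fps client update, roughly 8 ms per frame

class RemoteShip:
    """Last state we heard about another player's ship over the network."""
    def __init__(self, x, y, vx, vy, heard_at):
        self.x, self.y = x, y      # last reported position
        self.vx, self.vy = vx, vy  # last reported velocity
        self.heard_at = heard_at   # when that packet arrived

    def predicted_position(self, now):
        """Dead reckoning: extrapolate the stale data forward in time."""
        dt = now - self.heard_at
        return self.x + self.vx * dt, self.y + self.vy * dt

# Until the "Ship Two veered away" packet arrives, Ship One's client keeps
# drawing Ship Two sailing straight ahead into the torpedo, frame after frame.
ship_two = RemoteShip(x=0.0, y=0.0, vx=10.0, vy=0.0, heard_at=time.time())
for _ in range(3):
    time.sleep(TICK)
    print(ship_two.predicted_position(time.time()))
```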

The players’ individual views of the game do not agree. Is Ship Two destroyed, or not? And to resolve it either way at our game server, we can’t go backwards in time. In Star Trek parlance, this is a physical warp in our universe, a region where physical laws no longer apply. Ship Two is both destroyed and unharmed at the same time. How do we make it consistent, despite the packet latency, so game reality will make sense to all players?

A solution is to split the difference. Ship One sees the hit on Ship Two. However, Ship Two is only damaged, not destroyed. Ship Two sees the torpedo miss. However, it causes an explosion nearby, so the near miss causes damage. Now the game universe is back together again. That Ship Two is damaged makes sense to all players.
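Here is a hedged sketch, in illustrative Python of my own rather than actual game code, of how an authoritative game server might apply that split-the-difference rule when two clients report conflicting outcomes.

```python
from dataclasses import dataclass

@dataclass
class Report:
    ship: str
    event: str      # what that client believes happened
    sent_at: float  # client timestamp, delayed in transit by network latency

def reconcile(ship_one: Report, ship_two: Report) -> str:
    """Authoritative server resolves conflicting views without rewinding time.

    Ship One's client saw a direct hit; Ship Two's client saw a miss.
    Splitting the difference, the server rules it a near miss that damages
    but does not destroy Ship Two, an outcome consistent with both views.
    """
    if ship_one.event == "torpedo_hit" and ship_two.event == "torpedo_missed":
        return "near_miss_damage"  # Ship Two damaged, not destroyed
    return ship_one.event          # no conflict: accept the report as-is

verdict = reconcile(
    Report("ship_one", "torpedo_hit", sent_at=12.40),
    Report("ship_two", "torpedo_missed", sent_at=12.41),
)
print(verdict)  # near_miss_damage, broadcast to all clients
```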

This is how we did it in 1995 for the Department of Defense war game ModSAF/NPSNET, the first realistic networked FPS game, a military war game used to train NATO Special Forces for missions. Programmers who worked on it went on to build entertainment games like Call of Duty. Cheap VR games were not feasible in 1995, before consumer 3D graphics cards existed.

Call of Duty Motion-Capture System

I designed motion-capture software, used by AAA games and major motion pictures, that inserts live performers into VR. Here’s a behind-the-scenes peek from Call of Duty:

Call of Duty players in person on mo-cap sound stage perform as their virtual selves in Call of Duty in real time.

An Anti-Submarine Warfare (ASW) War Game for Sonar Operators

While working with the Office of Naval Research (ONR), I designed a war game for training Navy sonar operators. Deploying a real submarine for a destroyer to practice chasing is expensive. My Anti-Submarine Warfare (ASW) training game was integrated with Navy destroyers so sonar operators could practice hunting virtual submarines while at sea, a training game designed to use the existing sonar waterfall displays.

Sonar operator monitoring sonar waterfall displays

Arleigh Burke-class destroyer USS Decatur (DDG 73)

Star Trek Warp Physics

A note about Star Trek warp physics: warp drive is not just science fiction, but an area of active research. The Faster-Than-Light (FTL) travel concept is something like a supercavitating torpedo, which creates a bubble around itself in order to travel faster underwater than the physics of water would otherwise allow. For FTL, a gravity warp shortens space around the ship.

For a ship warping space to shorten the distance it must travel FTL, it could make sense to describe a journey by the perceived distance instead of the perceived time. In the film Star Wars, Han Solo describes the Millennium Falcon as “the ship that made the Kessel Run in less than twelve parsecs!” However, it takes about 3.26 years to travel a parsec at the speed of light, and about 39 years to travel 12 parsecs. In 1977, when making Star Wars, Han Solo actor Harrison Ford was 35 years old, too young to have traveled for 39 years. He isn’t just speedy; Han Solo is a time traveler.
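The arithmetic behind that aside is easy to check; here is a quick illustrative calculation, using the standard conversion of roughly 3.26 light-years per parsec.

```python
LIGHT_YEARS_PER_PARSEC = 3.26156  # approximate standard conversion

# At the speed of light, one light-year takes one year to cross,
# so years per parsec equals light-years per parsec.
years_per_parsec = LIGHT_YEARS_PER_PARSEC
years_for_12_parsecs = 12 * years_per_parsec

print(round(years_per_parsec, 2))      # ~3.26 years per parsec
print(round(years_for_12_parsecs, 1))  # ~39.1 years for the Kessel Run
```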

About Robin Rowe

Robin Rowe, Director of Metaverse Engineering,