The Metaverse: Next Gen ’Net With a Human Touch

May 15, 2023 9:30 AM ET

Jenn Mullen, CONTRIBUTOR

Over the course of its ten-episode inaugural season, Source De[Code]’s mission was to deconstruct the myths surrounding today’s most buzz-worthy technologies: artificial intelligence, digital twins, and big data. In the season finale, host Ben Coffin talks with three guests from previous episodes (Sarah Laselva, Jonathan Wright, and Dr. Silviu Torin) and a special new guest, Emmy Award winner and University of Texas at Austin professor Dr. Alan Bovik, to explore the place where these three technologies intersect: the Metaverse.

The Metaverse: The point between physical and digital is human

In 2021, Mark Zuckerberg announced Facebook’s Meta rebrand and ushered in a tsunami of press speculation about the Metaverse, a term coined by Neal Stephenson in his 1992 dystopian sci-fi novel Snow Crash. Despite what Mark Zuckerberg might have us believe, however, the concept of a technology-immersive reality was first introduced to the world in The Chamber of Life, a story by G. Peyton Wertenbaker published in 1929 by the pulp sci-fi magazine Amazing Stories. In the story, the narrator is invited to enter the Chamber of Life by its inventor, who sees in the young narrator his own youthful ambition and passion for progress. Inside, the narrator meets a host of people who introduce him to marvels of technology that augment every aspect of daily life. Considering the prescience of his prose, Wertenbaker and his short list of published works deserve a modern revival.

Even before the pandemic, contactless alternatives to traditionally physical interactions were on the rise. Debates about screen time and the merits of virtual work and school have replaced the prime-time queries of evening newscasters asking parents whether they know where their children are. We can work, shop, work out, and connect with friends without ever needing to put on pants or shoes. Technology has offered so many new and wonderful forms of convenience and entertainment that we have only recently realized how thirsty we have become for authentic human connection. As with all things, its absence during the pandemic ignited a new appreciation for it. That appreciation is, surprisingly, tempered: most people want a middle ground where the contactless conveniences of Covid and the magic spark intrinsic to real-world dynamics meet.

Charting a Course to the Metaverse

Every new breakthrough swings a pendulum: some reject new ideas out of hand, while others jump in with both feet. Eventually, innovations arrive that appease both camps, and we land somewhere in the middle. The Metaverse’s seamless, immersive marriage of the physical and digital acts like gravity pulling this pendulum toward the center. Yet neither Stephenson nor Wertenbaker, nor any of the countless other visionary creators, was able to succinctly describe the intertwining of our physical and digital realities. The reason (and your middle school language arts teacher would back me up on this) is that you cannot write with succinct authority about something you have not experienced.

None of the future-gazing visionaries who have given readers a glimpse of a world where the digital and physical are unified would have satisfied the demands of a language arts teacher. While they may be able to answer the ‘5 W’s’ of the Metaverse (who, what, where, when, and why) adequately enough, the ‘how’ would evade them. Ben’s panel of experts, on the other hand, understands the complexities behind realizing the full and as-yet-undiscovered potential of the Metaverse.

Hazards and Roadblocks Ahead

“I don’t think anyone can predict at this point what it’s going to look like,” says Sarah Laselva about the Metaverse, “but I think everyone would be foolish to say it’s not going to be very different—and a huge opportunity.” Realizing something completely new on the scale of the Metaverse will require a different sort of visionary thinking than that of the sci-fi futurists who painted portraits of a future defined by a tech-immersive human experience.

Revolutionizing digital experiences requires revolutionary new testing methods, and the Metaverse is, by definition, a revolution. It’s “really focusing on collaboration,” says Jonathan Wright. But it “completely changes the dynamic of how we test everything. We have to test how we’re actually viewing the content in stereo.” Creating a truly connected digital experience requires app developers to properly convey the full spectrum of human communication, both spoken language and body language. Doing so requires a deeper understanding of biometrics (body tracking, gesture tracking, and gaze) and adds new layers of complexity to testing workflows.

The Metaverse also represents a new revenue channel. Brands like Gucci and Nike see how consumers have responded to the ability to customize their avatars in apps like Roblox and Fortnite, and the rise of non-fungible tokens (NFTs) reflects the same growing desire for exclusivity and digital individualism. But commerce is only a limited view of how technology immersion will impact the human world. The testing process that Jonathan described will benefit from simulated realities as well. Dr. Silviu Torin explains that simulation software “can provide the virtual environment whereby a lot of tier-one automakers can validate and verify their algorithms and a lot of their components.”

The simulation capabilities predicted in this wider view of the Metaverse will improve road safety. These simulations aggregate greater volumes of better-defined data that would otherwise be unwieldy and cost-prohibitive to collect manually. Rapidly changing weather conditions and unpredictable pedestrian and cyclist movements are difficult to bake into testing algorithms right now. Dr. Torin expects that simulation technology will enable better machine learning algorithms that “identify these corner cases from different sources and generate the scenarios. This is much more efficient.”

Putting the Metaverse to Good Use

Intimacy is required for any simulated interaction to feel truly realistic, and that intimacy is difficult to achieve when you connect to the Metaverse while isolated in your home. We say things in online interactions that we would never say to someone standing in front of us; face-to-face interaction is intimate, and it makes it far harder to utter words that seem acceptable online. This is exactly what dystopian sci-fi leans into: the sterile, cold digital environment grants permission for equally cold, cruel interactions that, outside in the sunlight while chatting with neighbors, we would deem intolerable. This disparity in social behavior is one of the clearest markers of the boundary between the physical and digital worlds.

Despite being more connected to each other than ever before, we are also more isolated. We create echo chambers online that allow us to block out the beautiful, messy chaos of the real world, and we are seeing more people act on the shared sentiments of those echo chambers. Dr. Alan Bovik hopes the Metaverse can bring intimacy back to online interactions. “I want the Metaverse to be something where you’re experiencing the world. And it is augmented.” The Metaverse will be a dramatic paradigm shift in how we interact with our world. By giving people the ability to experience people, places, cultures, and wonders they would otherwise never encounter, it can destigmatize the “other” and humanize things we would otherwise fear.

“I want you outside,” Dr. Bovik says on the show, inciting riotous cheers from every parent tuning in. He envisions the Metaverse as something that lets us bring our online community with us as we explore the world. In his vision, wearable devices like glasses will let you interact with true digital twins of the people in your network rather than the cartoonish avatars we currently picture. “You can create a model of that person at your receiver, hopefully inside those glasses,” he explains. You would be able to see the physical micro-expressions that communicate so much more than words alone can convey. Simulations that allow us to literally see the world through others’ eyes have world-changing potential: they create a new opportunity for global empathy and compassion that inspires action in ways words and pictures never can.

About the Guest: Dr. Alan Bovik

Dr. Alan Bovik is a vision scientist, engineer, and educator. He holds the Cockrell Family Regents Endowed Chair at the Cockrell School of Engineering at the University of Texas at Austin, where he directs the Laboratory for Image and Video Engineering (LIVE). Dr. Bovik is also a faculty member in the university’s Department of Electrical and Computer Engineering, the Machine Learning Laboratory, the Institute for Neuroscience, and the Wireless Networking and Communications Group.

In addition to these esteemed positions, Dr. Bovik is also a two-time Emmy Award winner. In 2015, Dr. Bovik earned a Primetime Emmy Award in recognition of the perception-based video quality measurement tools he developed that have now become the industry standard in television. In 2021, he won a Technology and Engineering Emmy Award for developing perceptual metrics for video encoding optimization.

Catch Up on Season 1 of Source De[Code]

Before you dive into the Metaverse, catch up on all that season 1 of Source De[Code] has to offer on Apple Podcasts, Spotify, Amazon Music, and Google Podcasts. Visit Source De[Code] online to learn more about the show, its host, and guests, and to access resources that will deepen your understanding of the technologies discussed throughout the season. Can’t get enough technology podcast content? Get updates on upcoming seasons and other Keysight podcasts by subscribing to the Source De[Code] mailing list.