Q+A: Is This The Metaverse?

Taking a break from reality and stepping into a world where you have a little more control over things probably sounds pretty good right now. Tech companies, like the one formerly known as Facebook, think so. And they’re betting big on building that world: “the metaverse.”

While the concept of an alternate, virtual reality — integrated seamlessly enough to pop into and out of as easily as walking through a door — has been the stuff of science fiction for decades, recent technological developments, like blockchain and advances in game engines, are making it easier to envision a reality where the trappings of a virtual life could actually be connected to the real world.

To better understand how close, or far, we are to realizing a metaverse-like virtual reality – or if it’s already upon us – three Drexel University faculty experts, who study and develop some of the technology that could enable it, shared their insights with the News Blog.

Youngmoo Kim, PhD, is a professor in the College of Engineering and director of Drexel’s Expressive and Creative Interaction Technologies (ExCITe) Center. Kim and his team work to build community around the study and development of technology that can bring people together and improve society.

In a recent edition of his “Creating at a Distance” e-newsletter, Kim discussed the topic of a metaverse, and the efforts of Meta (formerly Facebook) to create one, writing:

If we’re talking about an alternate reality that is indistinguishable from our physical reality, we still have a long way to go. But if the Metaverse is an alternate space where people go to express themselves, interact, and transact, we’re already there. In gaming environments, such as Roblox, Minecraft, and other massive online worlds, gamers have created alternate representations of themselves and spend much of their time interacting through their “avatars”. Arguably, even some messaging and social media platforms are primitive ‘Metaverses.’

If we look at the popularity of games like Minecraft and Roblox, it seems obvious that there’s plenty of interest in building an alternate reality or digital representation of reality — but are games like these actually an indicator that a Zuckerberg-like “metaverse” is near? Or is participation in these games driven by something distinct from wanting to create or participate in something like Meta’s vision of a metaverse?

My short answer is “no.” Gamers like playing these games, and some enjoy designing and creating content within these games (appearance skins, mini-games, items). But that’s quite different from Meta’s vision, which is that the “metaverse” is a second reality that we inhabit, meaning that we live in it and it becomes indispensable. While a “Matrix” or “Ready Player One”-style “Oasis” may be possible sometime in the future, it’s not just over the horizon.

Despite the sci-fi appeal of such visions, there needs to be a compelling draw for people to want to be there… to be able to express things, accomplish meaningful things, and acquire things (money, prestige). Other than gaming, is there anything that makes you *want* to be in VR? Perhaps I’m wishcasting, but I believe actual reality is still pretty compelling, and most people are still drawn to real in-person experiences vs. virtual ones. 

In the newsletter piece, you argue that metaverse-enabling technology is still not quite there: VR goggles are bulky and can cause motion sickness, and haptics aren’t quite right yet. What technological challenge do you think will be most difficult to overcome in creating a ubiquitous virtual/augmented reality?

Joanna Stern of The Wall Street Journal did a great video trying to live 24 hours in today’s Metaverse… it’s not recommended! There’s also this weird effect known as the “uncanny valley,” where humans seem to be able to accept real human characters and those that are figurative (cartoons, animations), but things that are “close, but not quite” human (zombies, pseudo-realistic animation) are just creepy. Perhaps the most prominent example is the movie “The Polar Express,” one of the first animated movies driven by facial tracking, which many people found off-putting (whereas audiences eagerly accepted the non-human Na’vi of “Avatar” a few years later).

I think a similar principle applies to VR experiences — if it doesn’t “feel” right, it won’t be accepted. I don’t think the current technology gets us that close. By that, I mean small screens mounted on goggles/glasses and other accessories (controllers, gloves, etc.). Ultimately, it may require brain-computer interfaces (BCI), a field that is in its infancy and nowhere near ready for consumers. BCI also brings up so many medical and ethical issues that, as a society, we won’t be ready for it for a long time.

What are the most important lessons metaverse architects might learn from gaming environments like Roblox and Minecraft when it comes to inclusion/accessibility?

It’s well-established that most game designers are white males, particularly at the largest studios that produce the most popular games. This has led to severe and destructive myopia (see the criminal mass misogyny of “Gamergate” or the recent turmoil at Activision, which for years turned a blind eye to rampant sexual harassment within the company). It would be an absolute mistake to replicate commercial game dev culture for metaverse efforts. But Facebook, now Meta, has not been exemplary (to put it lightly) regarding diversity, equity, and inclusion issues, either. If they are responsible for the foundational architecture of the mass metaverse, I fear we’re all in trouble.

We’ve also seen the enormous appetite for other genres emerging from non-U.S.-centric (white-cis-male-dominated) cultures: Hip hop, anime, Afro-futurism, K-pop, and Bollywood, just to name a few. I encourage content developers to hire designers and creators with deep knowledge of different cultures, experiences, and non-Western forms into leadership roles. There’s a much larger market out there than is traditionally targeted and nearly infinite stories to tell and new experiences to create. Don’t stay planted in one place when there’s the whole of human experience to explore!

Frank Lee, PhD, is a professor in the Westphal College of Media Arts & Design and director of the Entrepreneurial Game Studio in Drexel’s ExCITe Center. In the EGS, Lee’s team explores new possibilities in interactive gaming. Outside of the Studio, Lee’s work and vision have transformed Philadelphia’s built environment into grand interactive video game installations.

How have video games already evolved toward the portable alternative reality that seems to be a defining feature of “the metaverse”?

If we consider a metaverse as an environment that is virtual, interconnected and persistent, then I would say video games are the platform that best showcases all three of these properties. Video games can be seen as an early prototype of the metaverse.

One nice example of this would be a game like “World of Warcraft.” The game is virtual — it takes place online in a fantasy world populated by people, shops, and cities with its own currency and economy. There is even an exchange system in place to convert the in-game currency to real currency and vice versa. The game is interconnected — while servers are limited to 1,200 concurrent players, there are millions of players playing across many servers. The game is persistent: even if you log out of the game, time continues to pass on the server.

The millions of people who have played and are currently playing WoW have already been living in a rough prototype of the metaverse for the last 16 years.

How long have some of these features, like avatars, porting characters and artifacts between games, and paying real money for upgrades/artifacts, existed in video games?

Within games, the concept of avatars can be directly attributed to the classic tabletop game “Dungeons and Dragons,” where you create a fully realized avatar that grows and persists through many different sessions of games, potentially across many years. For some D&D players, their avatars are as real to them as you and me. And it isn’t just the concept of avatars. The whole concept of world building can be traced from D&D to video games, to the metaverse.

As for paying real money for upgrades, called “microtransactions” in video games, it’s common these days in popular games like “Fortnite.” Historically, people point to the sale of horse armor in “The Elder Scrolls IV” — for $2.50 in 2006 — as the first microtransaction in games. But you can go even a little earlier, to “Second Life,” which was released in 2003. While it isn’t a game per se, it was conceived as a virtual world and had its own currency and economy, as well as an exchange system for real currency.

What do you think will be the next step in this evolution of video games?

There are several technological steps that I am hoping for in games that will bring us closer to the metaverse. The first is more interconnected worlds. Today a WoW server can handle up to 1,200 concurrent players, which means I can fully connect and interact with only up to 1,200 players at any given time. For a fully realized metaverse, however, it should be more like email. With email, I can connect to billions of people, and it takes about the same time to email someone in Philly as it does someone in Korea — the message gets to their mailbox in seconds. Ideally, an interconnected game world should allow me to connect with millions of people as well.

The second is a more immersive experience. VR, while impressive, is entirely visual. Recent advances in audio, such as spatial audio, have also been impressive. However, advancement in interfaces for the other three senses (touch, smell and taste) has been minimal. Of the three, the most likely advances will be in tactile interfaces, though companies are also working on the other two senses, including smell.

What lessons do you think Meta, or other tech companies that are looking at creating a metaverse, could learn from the speed bumps and pitfalls video games have experienced over the years?

Three big lessons they could take from the development of a game like “World of Warcraft” are:

  1. You need to provide a constant stream of new content. Whenever a new expansion for WoW is released, it is followed by a rapid rise in users and then a rapid drop-off. The obvious solution is user-generated content, but that brings its own problems.
  2. You need to make it easier for new users while keeping it interesting for experienced users. It is not an easy balance to maintain, as Blizzard, the company that created WoW, has learned. The longer the game exists and the bigger it gets, the harder it is to meet the needs of experienced players while also making it easy to onboard new players.
  3. Problems in the real world will follow into the virtual world, and can sometimes be worse — such as the issue of harassment. This has been an ongoing problem in multiplayer games, where the feeling of anonymity and the cover of virtuality can sometimes bring out the worst in people.

Nick Jushchyshyn, PhD, is a professor in the Westphal College of Media Arts & Design, director of the College’s Virtual Reality & Immersive Media program and director of Drexel’s Immersive Research Lab. Jushchyshyn’s team studies the latest in immersive media technology, including the hardware and graphic design technology driving virtual and augmented reality, immersive projection and virtual production.

When it comes to the technology that would enable a seamless immersive VR experience, what are some of the biggest pieces that are still coming into place? Which have the farthest to go?

Overall, these technologies have yet to achieve ubiquity and integration, both within their own ecosystems and within the daily lives of society. We’re at the earliest stages of developing that now.

But if you think back to just 20 years ago, iPhones, iPads, the Android operating system, Netflix Streaming, Facebook — just a few of the technological experiences and resources that have transformed the way we routinely interact and conduct commerce today — simply didn’t exist in 2002.

We appear to be in a similar position with immersive technology and blockchain today. While the core technologies themselves exist and are in regular, active use in select industries — in some cases having decades of history — the accessibility and intuitive ease of use needed for broad adoption are still very much a work in progress.

As transformative as these technologies can be, there haven’t been effective implementations released (yet) that are readily applicable to day-to-day life for the general public. But that is certainly the direction we appear to be moving toward in the coming years.

It seems like video games have been at the forefront of creating a metaverse of sorts. How has gaming technology evolved to improve the VR experience?

Improvements in graphics quality, fidelity and speed are the most readily apparent. As a result, the toolset has been adopted by an ever-growing range of industries outside of games, including not just movie and episodic entertainment production, but also areas such as fashion, automotive, architecture, medicine and more.

But under the hood there have also been developments in social interaction and the ease of in-game purchases. The big deal is that “metaverse technology” has broken out of being purely a VR thing.

For the gaming industry to grow, it was imperative to create game experiences that could run on a variety of platforms, from Xbox and PlayStation to PC, Mac, iPhone and Android. So the graphics technology behind video games already works on all of those platforms — VR and AR being just two of them.

Now you can access and participate in a metaverse through a phone, tablet or web browser.

This is significant because traditional VR technology is not necessarily convenient or accessible to everyone. But anything created with these gaming engines is accessible on any platform today, and that is thanks to technologies that began in the gaming world.

Can you explain how gaming VR technology is being more widely used outside of the gaming world?

Virtual production tools, also called “real-time graphics tools” or game engines — Unity and Unreal Engine are two examples. Until about five years ago, those were mainly used in the creation of video games and occasionally in some film/TV production. Now, their use has really exploded beyond gaming.

This is primarily because the technology has improved so much that the graphics are photo-realistic, but it’s also because of COVID-19. These technologies let people interact in a remote, distributed fashion, even on projects that are highly visual and detailed. Film and television production is one example. Directors and producers were already using VR game engines to produce samples of their film concepts as part of the previsualization and location-scouting portion of the production process. But during the pandemic, whole films and shows were created using game engines and through the remote collaboration they enable. A good bit of Disney+ content, such as the Marvel series and “The Mandalorian,” was created in this way during the pandemic.

This video was created by Nick Jushchyshyn using the Unreal Engine

The pandemic has accelerated the adoption of the technology of the metaverse in a number of other industries as well.

Many retailers had to develop apps that allowed people to try on products virtually. Now they’re continuing to develop the technology as a way of saving costs and becoming more sustainable. If people can virtually try on clothing, for example, then stores don’t have to keep as much in stock, which saves on shipping, storage and production costs, as well as the energy consumed to produce and deliver these products.

COVID-19 became a proving ground for distributed collaboration using real-time 3D technology.

What new jobs/specialties in digital media could you see emerging as tech companies strive to create these fully realized virtual environments that would constitute the metaverse?

The most in-demand, specialized skills right now are in 3D computer graphics and machine learning/programming disciplines. The ability to create 3D models, spaces and video in digital 3D is increasingly a high-demand skill set.

This is a very exciting time for people working in this area of technology. It’s a lot like when film was transitioning to digital. This technology has been in place in many industries for years, but it is still brand new in industries like retail, fashion and education. This is a huge space for experts to explore and into which they can expand the use of this technology.

At Drexel we’re preparing our students by weaving 3D graphics technology into a number of educational tracks. We teach it in game design, the visual effects major and immersive media classes. So, all students on these tracks have 4-5 years of experience specifically with this technology by the time they graduate.

This academic year, we also launched a new program for students who want to go into business and entrepreneurship in the metaverse. It’s a collaboration between Westphal College of Media Arts & Design, LeBow College of Business and The Close School of Entrepreneurship where students get experience side-by-side with experts in fields related to business and the technology of the metaverse.

So, our digital media students are learning from and along with classmates in business and entrepreneurship, and the business and entrepreneurship students are learning about the digital technology that enables the metaverse. The goal of all this is to demystify these things for our students so they are prepared to succeed in this important technology and economic sector of the future.

News media interested in speaking with Kim, Lee, or Jushchyshyn, should contact Britt Faulstick, associate director, media relations, bef29@drexel.edu or 215.796.5161.