The metaverse is all the rage: a universe of interconnected virtual worlds, like in novels such as Snow Crash and Ready Player One. But we have a host of ethical issues to consider about how to build and run the metaverse, and we need to consider them before we plunge in too quickly.
That was the message from a panel moderated at our GamesBeat Summit: Into the Metaverse 2 event by Kate Edwards, the CEO of Geogrify and executive director of the Global Game Jam.
Kent Bye, who runs the Voices of VR Podcast, said on the panel that he is concerned about privacy and general ethical frameworks around XR (extended reality, which includes virtual reality, mixed reality, and augmented reality). What data is collected through these metaverse systems, and where does that data go and how is it used?
Another panelist was Micaela Mantegna, an affiliate at the Berkman Klein Center for Internet and Society at Harvard University (and founder of Women in Gaming Argentina). She also does research on AI ethics. And Jules Urbach, CEO of Otoy and a longtime visual technologist, rounded out the panel.
“We can move into a situation where we have all this really intimate biometric and physiological data that is being radiated from our bodies, captured by this technology, and start to undermine what the Morningside Group considers to be fundamental neurorights,” Bye said.
Mental and biological privacy
If we don’t have mental privacy and biological privacy, some of the new technologies could essentially read our minds, model our identity, reach fine-grained and contextually relevant conclusions, and then nudge our behaviors to the point where it undermines our intentional actions, Bye said.
That would hurt our ability “to make decisions with integrity without being influenced by all these external influences,” Bye said.
Urbach said his own particular concern is eye tracking, “which is obviously something that VR and AR devices are going to be able to do.” It reminds him of when advertisers did eye-tracking to see how people read and react to the words that they’re reading.
“Eye tracking was really a map of somebody’s mind and intent,” Urbach said. “It also is something that, like the way you move the mouse in a web browser, is used to identify you. So my concern is that things like that can be used to infer intent, even subconscious intent. That should be controlled by the owner. And it shouldn’t be something that gets turned into an advertising cue. And it shouldn’t be used for tracking. So even when you externalize things in the metaverse and you’re looking at how these eyes are moving in a VR space outside of the goggles, you can still build a digital fingerprint, just like we can fingerprint walking and other things. So those things all need to be protected. And users should have the right to privacy” and not surrender those rights in the metaverse.
Mantegna said she agreed with what Bye said. She said that we’re talking about the metaverse, but we’re also still debating what the metaverse is without having a real consensus on the definition.
“One of the current definitions is this iteration of the internet, social networks, and gaming coming into convergence,” Mantegna said. “And this also carries over the problems that were already known about social media, about internet governance, and about AI ethics. What Kent was referring to, about the data, is that it is currently the fuel of artificial intelligence. Autonomous systems have already been very troublesome in the ethical” domain.
She added that the problems could get tougher in the metaverse because we’re adding this layer of immersiveness to the technology.
“A lot of this data is being taken out of our bodies in a very unconscious way,” she said. “So we are not able to prevent that. So the problems that were already known, and the rights we already [surrendered], in talking about artificial intelligence ethics and talking about human rights on the internet — we should translate them into the conversation about the metaverse.”
Specific privacy challenges
From the opening comments, Edwards concluded the common thread of concerns was about privacy.
“It’s certainly been a concern already in our common internet usage and use of other devices. And so it’s a pretty strong theme that even the public at large is very attuned to, I believe, even though we still readily give up our data and our location playing Pokémon Go and all kinds of other fun stuff,” Edwards said. “I know you mentioned a couple of examples of privacy concerns. Is there anything else, more specifically, dealing with the privacy issue? Jules, you mentioned eye tracking as one particular issue, like being able to track people and keep a record of what people are looking at. Are there other things along those lines, like specific technologies or metadata that we are concerned about when it comes to privacy?”
Urbach said the feedback loop bothers him a lot. If a system understands subconsciously what somebody is thinking and doing before they do it, and then serves some sort of ad that triggers them to buy something, that dynamic could be a lot worse in the metaverse, Urbach said.
Bye said we can expect sensor fusion, where all of these data-collection devices come together, gathering data not only from our bodies but also from things like brain-computer interfaces and neural data, eventually making it possible to decode our thoughts.
“So our thoughts, our ideas, but also our actions and what we’re doing,” Bye said. “These technologies are aware of our context. I think there’s a paradigm shift that needs to happen in thinking about identity, and about privacy in terms of our identities, as something that’s a static, immutable object, because all the laws are defined by whether or not the information that gets out is going to be able to identify us, which I think is a concern.”
We’ll have the biometric data and get contextually relevant AI, “on top of all of the sensor fusion, so it can model your actions, your behaviors, your emotional reactions, your physiological reactions to things that you can’t even control,” Bye said.
Like subliminal advertising, it operates at this unconscious level.
“It is going to start to get to this point where you’re sleepwalking into this dystopia, and there is not a clear way, legally, to” put up much resistance, Bye said.
A paternalistic approach is to say you should never use any of this data, or should only use it for medical applications. But from an entertainment perspective, developers will need this data to refine the experience.
“How do you draw the line between the contextual relevance and the use and appropriate use of that data in that entertainment context?” Bye said. “You can set up a last bastion of privacy or create the worst surveillance technology that we have ever seen.”
With great power comes great responsibility
“To quote the very wise Uncle Ben from Spider-Man, with great power comes great responsibility,” Mantegna said. “We are heading into a world where technology is going to be able not only to decode information from our brains, but also to implant or manipulate what is going on inside.”
She said we talk about these dystopian nightmares as something of the future, but she noted this is something we have already seen with generative artificial intelligence.
“That’s why I like to think about magnitudes, because this is going to become worse, for sure,” she said. “We already have generative artificial intelligence. We already have artificial intelligence creating inferences about us, siphoning our data.”
We already have problems related to bias, transparency, efficiency, and those are going to be ingrained in this new technology, she said. The technology that is going to power [those things] is already here.
“So my concern is, how are we going to shape this as we move forward into the metaverse?” she said.
Edwards said that the question is how we will handle data ownership, personal sovereignty, and the modeling of your identity in a digital space.
“What level of ethical responsibility are the companies, the platforms, going to have over allowing that personal sovereignty?” Edwards asked.
One of the problems is that the internet is universal and globally decentralized, but law is often territorial in nature. That makes it hard to govern technologies on a global basis, and it will get harder to govern as the technology is decentralized with Web 3.
Whether the tech for the metaverse emerges in an open environment or concentrated in walled gardens, we will need to push for stronger consumer protections, Mantegna said. Otherwise, we will be at the mercy of each platform’s terms and conditions.
Europe has gotten tough on issues like privacy with the General Data Protection Regulation (GDPR), and its approach has begun to spread to privacy laws in other parts of the world, Urbach said. He said that the laws should require that we have opt-in rights for the use of data for AI training and other purposes.
“My concern is that if you just have a vertical platform that has one browser, one app, that’s not great,” he said. “We should try to go back to the open web model for the open metaverse. And if we miss that, I think that’s going to be bad.”
He thinks that decentralization and crypto payments could be a strong force to push for an open metaverse. Bye noted that Unity’s Tony Parisi has come up with the seven rules of the metaverse and one of them is that the metaverse is hardware independent.
He noted that we have duopolies like Android and iOS in mobile, and that could carry over to where we have very few players providing metaverse services to us, since big companies will control the metaverse the way they control social media today.
We could see different models emerge like Facebook/Meta’s emphasis on less privacy but cheaper technologies, while Apple will support privacy but require you to pay for it with higher product costs.
The ethics of interoperability
Edwards asked if we have an ethical responsibility to provide interoperability because if it doesn’t exist, then neither does the metaverse.
“It is in a way very similar to the longstanding internet access model where we all go through ISPs,” Edwards said. “But generally, the ISP experience tends to be invisible because the service that we are provided with, basically the internet that we’re getting to our homes, is pretty much the same and just depends on the technology, whether that’s cable or fiber or something like that.”
Edwards added, “If you can’t easily move between platforms, if you’re stuck in a particular walled garden, which is a model we tend to see pop up frequently, is that really the metaverse? And what responsibility do these companies have to actually work with each other to ensure that kind of cross-platform access?”
Bye said groups like the Khronos Group, through standards such as OpenXR, can create a standard set of interoperable application programming interfaces (APIs). Work is also happening on WebXR, but the question is whether big players like Apple will support it. And each of these companies must make business decisions about how interoperability should work.
Urbach said active discussions are happening now about how to make interoperable technologies like glTF (the GL Transmission Format) for the efficient transmission and loading of 3D scenes and models by applications. But we still need to pull together a lot of different technologies. Apple, Nvidia, and others agreed to the 3D data format of the Universal Scene Description (USD), which originated with Pixar and powers Nvidia’s Omniverse simulation technology. All of that is promising, Urbach said.
Mantegna agreed with Urbach, and said that technical interoperability has to go hand in hand with intellectual property law, which provides legal and durable interoperability as a layer on top of the technical kind.
She said one of the promises of the metaverse is true ownership of your digital assets.
“You’re not going to be able to take it or have the same functionalities from one metaverse to another, because there’s going to be this other layer of regulation and intellectual property and licensing and contracts and terms of service that will prevent you from doing so,” Mantegna said. “And one of the huge discussions still ongoing is how the first-sale principle is going to apply to digital goods. If you buy a T-shirt in the analog world, you can take it and wear it wherever you want. But that might not be true for a cosmetic item in the metaverse. It is very similar to how you can buy an ebook and you cannot take it from one platform to another.”
Edwards brought up the problem of socio-economic disparity throughout the globe and how that changes from locale to locale. How can companies grant equal access to others around the world?
Urbach believes that access to the metaverse may be very similar to getting access to the internet. Mobile phones have enabled access to the internet worldwide.
“I think that the mobile phone revolution will continue into the metaverse, and as far as the hardware and the bandwidth needed, you’ll probably have pretty good coverage just as an evolution of what the mobile phone has done for most of the world,” he said. “To me, it’s an extension of what mobile phone hardware and bandwidth have done.”
He thinks the metaverse will be distributed via cloud services that can be decentralized, and that could give equal access for information through the open web.
Bye noted that VR has become more accessible because one company, Meta, has been subsidizing the cost of the headsets with a “model of surveillance capitalism.”
“You’re getting more accessibility, but at the same time, you’re maybe mortgaging people’s privacy,” Bye said. “And so there is kind of a tension there: in order to really financially pay for some of that, you have these trade-offs that are inherent. And so how do you do things perfectly? Well, you can’t do things for free, so you do have to decide what is more valuable for having a diverse, inclusive [policy that] gets the technology to as many people as possible.”
The challenge is the lack of a clear path to how we get to both an accessible technology and one that guarantees a lot of the rights that we would otherwise surrender with surveillance capitalism. Bye believes a new federal privacy law could accomplish this balance.
“When we think about ethics, that might not seem actionable. And it’s difficult to translate it into good practices that can be applied in everyday work,” Mantegna said.
She said that the access issue involves a few things. One is access to the internet itself: she noted a recent United Nations report found that a third of the global population has never been on the internet. Another issue is access to the technology that delivers the best results. And then we need to address access to hardware, which won’t be the same based on what everybody can afford.
“How are we going to ensure this balance?” Mantegna asked.
As long as ethical principles are just a guideline without concrete obligations, we are going to fail, Mantegna said.
“We don’t want to fail,” Edwards said. “We want to do this right. It’s not something we are going to solve easily, but it is something that I’m hoping that as we go into the development of the metaverse we are going to be eyes wide open on this issue.”