By Andrea Vittorio
The expansion of digital experiences accessible through immersive headsets and related technologies is generating concerns about the information collected about people who wear the devices, and how to protect it.
The metaverse promises to bring innovation to sectors from education to e-commerce, with the potential to generate up to $5 trillion in economic impact by 2030, according to a June report by management consulting firm McKinsey & Co.
“There’s not a lot of clarity on what data the device is collecting and what data apps are collecting,” said Jeremy Nelson, director of the University of Michigan’s XR Initiative, which is exploring ways to bring extended-reality devices into teaching.
Device makers are tasked with educating the public as well as policymakers on how the technology works and where it's headed, raising questions about how data protection laws might apply.
In educational contexts, XR device makers and app developers must navigate existing laws covering children’s personal information and academic records. Other applications of the technology are primarily governed by privacy laws such as Europe’s General Data Protection Regulation and those enacted in US states like California and Illinois. But new laws may be necessary to keep data from the metaverse’s immersive digital worlds private and secure.
Companies and investors already have poured about $120 billion into the metaverse in the first five months of 2022, more than double the $57 billion invested in all of 2021, the McKinsey report says.
“Extended reality” is the umbrella term for the different technology types in this space: virtual, augmented, and mixed reality. Virtual reality comprises wholly computer-generated environments. Augmented reality, meanwhile, superimposes digital elements onto the real world, as with the popular game Pokémon GO. Then there’s mixed reality, which blends the digital world with the physical world using holograms.
Devices designed to access digital realms got a recent boost in the new $280 billion US law incentivizing domestic production of computer chips. “Immersive technologies” are eligible for the CHIPS and Science Act’s research and development funding, a provision pushed for by the XR Association.
The industry trade group is working with a bipartisan congressional caucus focused on the various extended-reality technologies to educate lawmakers on beneficial uses of such devices, such as for medical research or telehealth services during the Covid-19 pandemic. Rep. Suzan DelBene (D-Wash.) and other members of what they’ve dubbed the Reality Caucus—its full name is the Congressional Caucus on Virtual, Augmented and Mixed Reality Technologies—also are exploring the technology’s policy implications, including for people’s privacy.
“Privacy is paramount,” said Joan O’Hara, senior vice president of public policy at the XR Association. The group wants to “take some of the mystery out of the technology” by explaining to device users how it works and how it uses data, O’Hara said in an email.
The trade association also formed an advisory council tasked with making recommendations on policy questions surrounding the industry, including privacy issues. The council is made up of tech company executives and representatives from civil society and academia.
“The industry needs to get this right,” O’Hara said. “If users don’t trust the industry to protect their privacy, they won’t use the technology.”
Consumers may not be aware of what kinds of data are gathered when they use extended-reality devices. XR devices with hand tracking, for example, might estimate a user’s hand size and follow their hand movements.
Privacy advocates say information about a person’s physical traits and movements is sensitive data that should be subject to heightened protections.
“People have almost like a fingerprint about how you move your head and your hands when you talk,” said the University of Michigan’s Nelson. He is collaborating with the XR Safety Initiative, a group that issued a set of standards for safeguarding data in the metaverse—especially data that’s associated with children.
Tracking data from virtual reality devices can be used to identify individuals, researchers at Stanford University found in a study of about 500 participants published in 2020. Devices with privacy policies that allow sharing of de-identified information may accomplish “very little” by taking a user’s name off the data set, the study suggests.
A person’s interactions in the metaverse also could reveal characteristics about them.
“Imagine there are in-world ads on the wall,” said Jon Callas, director of public interest technology at the nonprofit Electronic Frontier Foundation. “They can tell which one your eyes linger on and thus construct a personality profile of you based on things that interest you.”
Businesses that make extended-reality devices or apps face the challenge of making sure they have permission to collect data at different points during a user’s experience, said Sarah Bruno, a partner at Reed Smith LLP.
A privacy notice must be given at or before the time of data collection, according to Bruno, whose law firm recently issued a report on legal issues raised by the rise of the metaverse.
People also input information about themselves to create accounts to use a digital reality device, such as their name and payment information.
In the past, users of Meta’s VR devices had to log in with a Facebook account, tying their credentials to another platform. Meta is rolling out a new login system that allows VR accounts to be created with an email address, Facebook account, or Instagram account.
Users of Microsoft’s HoloLens 2 can choose to sign in using iris authentication, meaning a person wearing the headset is recognized via an eye scan. The data, stored locally on the device, isn’t shared and is protected by two layers of encryption, the company says.
To make an account, Meta’s device asks for a date of birth to verify that a user is age 13 or older. Though the US currently has no overarching federal privacy law covering adults or teenagers, kids age 12 and under are subject to the Children’s Online Privacy Protection Act, which governs digital data collected about them.
Common Sense Media, an organization that rates media and technology for kids, has reviewed several extended-reality devices and their privacy policies, with plans to release a report on its findings in the fall.
“Virtual reality technology is moving so quickly, we have to stop and reflect,” said Girard Kelly, director of Common Sense’s privacy program.
“We have to think about how we can define what privacy means in virtual reality,” Kelly said at a forum for state attorneys general earlier this month.