Facebook’s Metaverse Could Be Overrun By Deep Fakes And Other Misinformation If These Non-Profits Don’t Succeed

Mark Zuckerberg’s virtual-reality universe, dubbed simply Meta, has been plagued by a number of problems, from technology issues to difficulty holding onto staff. That doesn’t mean it won’t soon be used by billions of people. The latest issue facing Meta is whether the virtual environment, where users can design their own faces, will look the same to everyone, or whether companies, politicians and others will have more flexibility to change who they appear to be.

Rand Waltzman, a senior information scientist at the non-profit research organization RAND Corporation, last week published a warning that the lessons Facebook learned in customizing news feeds and allowing hyper-targeted information could be supercharged in Meta, where even the speakers could be customized to appear more trustworthy to each audience member. Using deepfake technology, which creates realistic but falsified videos, a speaker could be modified to share 40% of an audience member’s facial features without the audience member even knowing.

Meta has taken steps to tackle the problem, but other companies are not waiting. Two years ago, the New York Times, the BBC, CBC/Radio-Canada and Microsoft launched Project Origin to create technology that proves a message actually came from the source it purports to be from. Project Origin is now, in turn, part of the Coalition for Content Provenance and Authenticity, along with Adobe, Intel, Sony and Twitter. Early versions of this software, which traces the provenance of information online, already exist; the only question is who will use them.

“We can offer extended information to validate the source of information that they're receiving,” says Bruce MacCormack, CBC Radio-Canada’s senior advisor of disinformation defense initiatives, and co-lead of Project Origin. “Facebook has to decide to consume it and use it for their system, and to figure out how it feeds into their algorithms and their systems, to which we don't have any visibility.”

Launched in 2020, Project Origin is building software that lets audience members check whether information claiming to come from a trusted news source actually came from there, and prove that it arrived in the same form it was sent. In other words, no tampering. Instead of relying on blockchain or another distributed-ledger technology to track the movement of information online, as might be possible in future versions of the so-called Web3, the technology tags information with data about where it came from, and that data moves with the content as it’s copied and spread. An early version of the software was released this year and is now being used by a number of members, he says.
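
For readers curious how such provenance tags work in practice, the sketch below illustrates the general signed-manifest idea in Python: a publisher hashes the content, attaches metadata about its origin and signs the bundle, so anyone holding the publisher’s public key can confirm nothing was altered along the way. This is only an illustration of the broad approach, not Project Origin’s actual software; the publisher name, manifest fields and choice of signature scheme are assumptions.

```python
# Minimal sketch of a signed provenance manifest (assumed design, not
# Project Origin's or C2PA's real implementation). Requires the third-party
# "cryptography" package: pip install cryptography
import hashlib
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- Publisher side: hash the content, bundle provenance data, sign it. ---
private_key = Ed25519PrivateKey.generate()   # newsroom's signing key
public_key = private_key.public_key()        # shared publicly for verification

article = b"Officials confirmed the report at a press conference on Tuesday."
manifest = {
    "source": "example-news.org",  # hypothetical outlet name
    "content_sha256": hashlib.sha256(article).hexdigest(),
}
manifest_bytes = json.dumps(manifest, sort_keys=True).encode()
signature = private_key.sign(manifest_bytes)

# --- Audience side: recheck the hash, then verify the publisher's signature. ---
def verify(content: bytes, manifest: dict, signature: bytes) -> bool:
    """Return True only if the content matches the manifest and the manifest
    was really signed by the publisher's key, i.e. no tampering anywhere."""
    if hashlib.sha256(content).hexdigest() != manifest["content_sha256"]:
        return False
    try:
        public_key.verify(signature, json.dumps(manifest, sort_keys=True).encode())
        return True
    except InvalidSignature:
        return False

print(verify(article, manifest, signature))          # True: untouched copy
print(verify(b"Edited text", manifest, signature))   # False: content changed
```

Because the manifest travels with the content, the same check works no matter how many times an item is copied or re-shared, which is the property the Project Origin approach is built around.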

But the misinformation problems facing Meta are bigger than fake news. To reduce overlap between Project Origin’s solutions and other similar technology targeting different kinds of deception, and to ensure the solutions interoperate, the non-profit co-launched the Coalition for Content Provenance and Authenticity in February 2021 to prove the originality of many kinds of intellectual property. Similarly, Blockchain 50 lister Adobe runs the Content Authenticity Initiative, which in October 2021 announced a project to prove that NFTs created using its software actually originated with the listed artist.

“About a year and a half ago, we decided we really had the same approach, and we're working in the same direction,” says MacCormack. “We wanted to make sure we ended up in one place. And we didn't build two competing sets of technologies.”

Meta knows that deep fakes and distrust of the information on its platform are a problem. In September 2016, Facebook co-launched the Partnership on AI, along with Google, Amazon.com, Microsoft and IBM, to ensure best practices for the technology used to create deep fakes and more; MacCormack advises the group. In June 2020, the social network published the results of its Deep Fake Detection Challenge, showing that the best fake-detection software was only 65% successful.

Fixing the problem isn’t just a moral issue; it will affect an increasing number of companies’ bottom lines. A June report by the research firm McKinsey found that metaverse investments in the first half of 2022 were already double the previous year’s total and predicted the industry would be worth $5 trillion by 2030. A metaverse full of fake information could easily turn that boom into a bust.

MacCormack says deep fake software is improving faster than detection software can be deployed, one of the reasons the group decided to focus on proving that information came from where it purports to come from. “If you put the detection tools in the wild, just by the nature of how artificial intelligence works, they are going to make the fakes better. And they were going to make things better really quickly, to the point where the lifecycle of a tool or the lifespan of a tool would be less than the time it would take to deploy the tool, which meant effectively, you could never get it into the marketplace.”

The problem is only going to get worse, according to MacCormack. Last week, Stable Diffusion, an upstart competitor to OpenAI’s DALL-E that lets users create realistic images just by describing them, opened up its source code for anyone to use. According to MacCormack, that means it’s only a matter of time before the safeguards OpenAI implemented to prevent certain types of content from being created are circumvented.

“This is sort of like nuclear non-proliferation,” says MacCormack. “Once it's out there, it's out there. So the fact that that code has been published without safeguards means that there's an anticipation that the number of malicious use cases will start to accelerate dramatically in the forthcoming couple of months.”
