6 Ways Web3 and Decentralized Identity Technologies Could Stop Deep Fakes

Artificial intelligence (AI) has many upsides, but a significant downside is its role in creating deep fakes and other identity theft and fraud mechanisms. The technology is increasingly accessible and affordable, media that can be manipulated is freely available across the internet, and distinguishing real from fake is growing more difficult by the day.

A ‘deep fake’ is a video, image, or audio recording of a real or fictitious person’s likeness or voice that has been substantially modified or generated using machine learning and generative AI. The result is convincing content that looks and sounds like it comes from a specific person. In the popular grandparent scam, that ‘specific person’ is the victim’s grandchild, whose voice and image have been manipulated to convey a fearful message; in the growing number of corporate and political deep fakes, the subject could be a CEO, a political opponent, or even a president.

Deep fakes are used to defraud people of money and to spread disinformation. They threaten privacy, reputation, finances, and peace, potentially on a global scale.

The US government is moving to regulate deep fakes, and some commercial vendors have introduced deep fake detection technology, so there are solutions on the horizon. But one solution we see enormous potential in is Web3 and decentralized identity (DI) technology, grounded in blockchain.

In a nutshell:

  • Web3 and DI technologies would attach secure content “locks” that could be quickly checked and verified as irrefutable proof of authenticity.
  • Blockchain would make it incredibly difficult for malicious actors to manipulate or tamper with digital content, since every change would be permanently recorded and verifiable.

There are a few ways to leverage this technology:

  1. A person would possess an official decentralized identifier (DID), which includes a public key, anchored on a public blockchain. Any piece of content that person creates would be cryptographically hashed and signed with the private key associated with that public key, proving its authenticity (see the signing sketch after this list), or
  2. A person would hold verifiable credentials for their digital persona in a digital wallet, which again would prove the authenticity of their created content. In the digital wallet, they would also store other verified real-world credentials, such as a driver’s license, passport, proof of residency and so on, or
  3. A person could store in their digital wallet non-fungible tokens (NFTs), backed by blockchains, to prove ownership of digital assets, including video and voice recordings. If a recording is protected by an NFT owned by a trusted person or organization, then the recording’s provenance can be relied on (see the ownership-check sketch below).
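
To make the first option concrete, here is a minimal sketch of the hash-and-sign flow in Python, assuming an Ed25519 key pair whose public key is published in the person’s DID document and whose private key stays in their wallet; the content bytes below are a stand-in for a real video or audio file.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The private key stays in the creator's wallet; the matching public key
# would be published as a verification method in their DID document.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

content = b"example media bytes"        # stand-in for the real video/audio file
signature = private_key.sign(content)   # Ed25519 hashes the content internally

# Anyone who resolves the DID can check authorship and integrity.
try:
    public_key.verify(signature, content)
    print("Signature valid: the content comes from the DID holder, unmodified.")
except InvalidSignature:
    print("Signature invalid: possible tampering or impersonation.")
```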

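And for the third option, a rough sketch of checking who owns a recording’s NFT, assuming web3.py (v6+) and the standard ERC-721 ownerOf function; the RPC endpoint, contract address, and token ID are placeholders, not real deployments.

```python
from web3 import Web3

# Placeholder RPC endpoint and contract details (hypothetical values only).
w3 = Web3(Web3.HTTPProvider("https://example-rpc.invalid"))

ERC721_OWNER_OF_ABI = [{
    "name": "ownerOf",
    "type": "function",
    "stateMutability": "view",
    "inputs": [{"name": "tokenId", "type": "uint256"}],
    "outputs": [{"name": "owner", "type": "address"}],
}]

recording_nft = w3.eth.contract(
    address=Web3.to_checksum_address("0x" + "00" * 20),  # placeholder address
    abi=ERC721_OWNER_OF_ABI,
)

# ownerOf() is defined by the ERC-721 standard; if the returned address
# belongs to a trusted person or organization, the ownership claim holds.
owner = recording_nft.functions.ownerOf(1234).call()
print(f"Recording NFT is owned by {owner}")
```
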
With all these solutions, anyone watching, listening to, or reading a piece of content could be assured of who authored it and that it has not been modified. This provides protection against deep fakes, including those created using AI.

Sudo Platform offers DI solutions. Explore them now, or read our DI whitepapers.

This great article from Deloitte summarizes six ways Web3 and DI can act as safeguards against deep fakes and other generative AI scam mechanisms:

1. Provenance tracking: Cryptography and distributed ledger technology make it almost impossible to conceal the origins of manipulated content (a sketch after this list shows how this pairs with tamper-proof metadata).

2. Self-sovereign identity: DI gives users complete control over their identity, including their content, shutting out the potential criminal element.

3. Tamper-proof metadata: Embedding cryptographically verified metadata, such as timestamps, location, and device information, makes it easy to distinguish authentic content from manipulated versions.

4. Authenticity certificates: Digital certificates attached to validated original media act as irrefutable proof of authenticity.

5. Algorithmic detection: Blockchain-based platforms can use deep fake detection software that algorithmically flags suspected fakes and deters bad actors.

6. Incentives alignment: Leveraging tokenized incentives and reputation staking, common in Web3 models, may discourage the production and dissemination of manipulated content.
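
As a rough illustration of how provenance tracking and tamper-proof metadata work together, the sketch below signs a small manifest containing the content’s SHA-256 hash, a capture timestamp, and device information. The manifest fields and key handling are illustrative assumptions rather than any specific standard; the point is that any later change to the content breaks the hash comparison.

```python
import hashlib
import json
from datetime import datetime, timezone

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


def build_manifest(content: bytes, device: str) -> dict:
    """Provenance metadata bound to the content by its hash."""
    return {
        "sha256": hashlib.sha256(content).hexdigest(),
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "device": device,
    }


creator_key = Ed25519PrivateKey.generate()  # creator's signing key
original = b"original interview footage"    # stand-in for the real media file

manifest = build_manifest(original, device="NewsCam-01")
manifest_bytes = json.dumps(manifest, sort_keys=True).encode()
signature = creator_key.sign(manifest_bytes)

# Later, a viewer re-hashes whatever content they received and checks it
# against the signed manifest.
received = b"original interview footage"    # alter this to see verification fail
try:
    creator_key.public_key().verify(signature, manifest_bytes)
    signature_ok = True
except InvalidSignature:
    signature_ok = False

hash_ok = hashlib.sha256(received).hexdigest() == manifest["sha256"]
print("authentic" if signature_ok and hash_ok else "tampered or unverified")
```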

But wait, there’s more …

Interestingly, on the flip side, AI has the potential to combat the very scams it helped create by shoring up Web3 and DI systems. AI can help secure identities and monitor social networks for suspicious activity or profiles, weeding out forgeries and inauthentic content.

A Forbes writer describes this “quid pro quo” arrangement between Web3/DI and AI this way: “Blockchain-verified profiles can offer a type of ‘proof of humanity,’ a place to store digital assets and identifiers, and a single point of access into the Web3 ecosystem—all while keeping the user’s privacy under their control. Meanwhile, AI can work to monitor and secure these profiles, keeping people from becoming the next victims of AI-wielding criminals.”

We can act right now

This is an interesting and rapidly evolving topic. For now, the key takeaway is that AI-driven fraud will likely remain a threat for some time, but leveraging Web3 and DI technology grounded in blockchain, and even pairing it with AI technology, presents a viable solution that’s available right now. Explore the solutions we offer in the Anonyome Platform.
