Crypto deepfakes prove that X is failing as an industry platform
X, formerly Twitter, has a deepfake problem, one that I believe could — and, frankly, should — end its role as the premier social platform for the crypto industry.
Scrolling through the endless chaos of X, I don't have to go far before encountering a video, usually starring a stilted Michael Saylor or Brad Garlinghouse, promising access to a crypto airdrop. Solana co-founder Anatoly Yakovenko's likeness has also been used.
These videos are, of course, scams, putting a worrisome new technological spin on an old problem, one akin to the ever-present threat of crypto phishing attacks.
The videos employ AI-generated voice recordings and what appear to be added lip animations to present “opportunities” to would-be crypto holders.
These “major announcements” share a common call to action: Scan a QR code or follow a link to receive “money.” Obviously, I didn’t click or scan my way through, but the implications are clear for anyone who has encountered the dangers of the modern internet. Interaction means security exposure, and exposure means the potential for stolen digital assets, personal information or more.
One account, live as of the time of writing, had scored 276,000 views for a Saylor deepfake and 139,000 views for a Garlinghouse deepfake.
The fact that these posts are often tagged as advertisements, lending them a veneer of legitimacy, is deeply problematic. Far worse is the reality that X is making money off the problem rather than making any visible effort to confront it.
It wasn’t that long ago that X owner Elon Musk pledged to solve the spam bot problem. The reality seems to be that we’ve traded scam accounts that pretend to be Musk for scam accounts that act like crypto industry leaders — in both cases, enticing would-be victims with giveaways.
This is all playing out against the backdrop of a wave of other types of spam advertisements, as The Information detailed earlier this month. The result, fueled by an exodus of major advertisers that has slowly played out since Musk took over the now-former Twitter, is a minefield of questionable products, porn bots and these deepfake videos.
It’s a disaster. The situation further muddies the waters of a platform that has, for many years, served as a central hub for inter-industry relations. And if recent technological trends hold, the problem will likely worsen over time.
News that Sam Altman’s OpenAI has developed a text-to-video product may put additional tools in the hands of would-be scammers. To be sure, OpenAI appears to be taking a cautious approach, at least publicly, to allowing access to the tool, doubtless an acknowledgement of the inherent risks. But AI-generated video tools will inevitably become widespread, opening the door to deepfakes even more sophisticated than the crypto ones popping up today.
Where to from here? Should alternative social media platforms be embraced? Do crypto-enabled platforms and protocols like Lens or Farcaster hold the key to social media free of these scam videos?
I don’t know. I do believe the industry should get louder about this issue, if for no other reason than to prompt a response from X and Musk. Remember, the Musk bot issue did go away, but by all appearances the fundamental problem, fake accounts masquerading as sources of authority, has metastasized.
This situation puts an industry long dogged by its worst impulses at the center of a dangerous collision of technology and exploitation. The crypto industry would be wise to confront it head-on.
Michael McSweeney has worked in crypto media since 2014, including editorships at CoinDesk and The Block. In his spare time, he writes fiction and plays disc golf. Contact Michael at [email protected].