A fun new consequence of deepfakes: more convincing crypto scams


Lopatto, E. (January 3, 2024). Crypto scams are getting better — is YouTube ready? The Verge.

And the deepfakes are improving.

The video is almost convincing, with Solana co-founder Anatoly Yakovenko declaring a "historic day" for Solana. He thanks the "S-O-L" community and offers a giveaway via a QR code and website. Sure, he sounds robotic — his voice is a monotone, which is unusual for him — and he barely makes eye contact with the camera, but it's a video, so seeing is believing, right?

Of course, it's a forgery. It's also been up on YouTube for a day. Not only that, but at least one internet user claims to have seen it served as an advertisement. And it's not just YouTube. The phony video is surfacing in advertisements on the network once known as Twitter, which Elon Musk would rather you call X.

"There has been a substantial increase in deepfakes and other AI-generated content recently," says Austin Federa, head of strategy at the Solana Foundation. (He emphasizes that it is not merely a crypto issue.) He assured me that Solana takes these forgeries seriously and reports them as soon as possible. But Solana isn't in charge of removing the forgeries. That's up to platforms like YouTube and X. And they're being slow about it; Solana reported the video to YouTube last night.

YouTube's parent company, Google, did not respond to a request for comment. YouTube terminated the account associated with the video shortly after this article was published.

For years, people in the cryptocurrency business have complained to me that Big Tech platforms do not act quickly enough to remove scammers. But as deepfakes improve, scams become more convincing, and prompt removal becomes increasingly critical.

Of course, this is a moderation issue, but it has real ramifications, especially with the possibility of a Bitcoin ETF looming on the horizon. One advantage of a financialized product is that it is managed by finance specialists, which makes it appear safer. After all, those participants are less likely to be duped by a phony video.

Update, January 3rd, 5:25 p.m. ET: The video was removed after this article was published.
