Crypto investors have been warned to watch out for “deepfake” crypto scams as the technology behind these digital doppelgangers advances, making it harder for viewers to separate fact from fiction. And because crypto markets move so quickly, investors are under intense pressure to decide fast whether a video message is genuine.
According to David Schwed, chief operating officer of blockchain security firm Halborn, the crypto industry is more “vulnerable” to deepfakes than ever because “time is of the essence in making judgements,” leaving less time to verify whether a video is real.
According to OpenZeppelin technical writer Vlad Estoup, deepfakes use deep learning artificial intelligence (AI) to create highly realistic digital content by manipulating and altering original material, such as swapping faces in videos and images or mimicking voices in audio.
Estoup said crypto criminals frequently use deepfake technology to produce fabricated videos of well-known figures in order to carry out scams. In one example from November, scammers used old interview footage of FTX’s former CEO along with a voice emulator to direct users to a fraudulent website promising to “double your Bitcoin.”
According to Schwed, the volatile nature of cryptocurrency pushes people to panic and take a “better safe than sorry” approach, which can make them easy targets for deepfake scams. He explained:
“If a video of CZ is released claiming withdrawals will be halted within the hour, are you going to immediately withdraw your funds, or spend hours trying to figure out if the message is real?”
However, Estoup argues that, while deepfake technology is rapidly evolving, it is still not “indistinguishable from reality.”
One quick way to spot a deepfake, Schwed suggests, is to watch how the subject blinks. If the blinking doesn’t look natural, the video is probably a deepfake.
This is because deepfakes are built from image files scraped from the internet, where the subject’s eyes are usually open, Schwed explains. As a result, the blinking in a deepfake has to be simulated, and it often looks off.
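The blink cue Schwed describes can even be quantified. A common heuristic in facial-analysis research is the eye aspect ratio (EAR): the ratio of an eye’s vertical opening to its width, computed from six landmark points around the eye. During a natural blink the EAR dips sharply for a few frames and then recovers; a long clip with implausibly few such dips is suspect. The sketch below is illustrative only — it assumes eye landmarks have already been extracted by some face-tracking tool, and the 0.2 threshold and two-frame minimum are conventional starting values, not fixed standards:

```python
import math

def eye_aspect_ratio(eye):
    """Compute EAR from six (x, y) landmarks around one eye, ordered:
    [left corner, top-left lid, top-right lid, right corner,
     bottom-right lid, bottom-left lid]."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    # Two vertical distances between the upper and lower eyelids
    v1 = dist(eye[1], eye[5])
    v2 = dist(eye[2], eye[4])
    # Horizontal distance between the eye corners
    h = dist(eye[0], eye[3])
    return (v1 + v2) / (2.0 * h)

def count_blinks(ear_series, threshold=0.2, min_frames=2):
    """Count dips of the per-frame EAR below `threshold` that last at
    least `min_frames` consecutive frames, i.e. plausible blinks."""
    blinks, run = 0, 0
    for ear in ear_series:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:  # dip still in progress at end of clip
        blinks += 1
    return blinks
```

Comparing the blink count against the roughly 15–20 blinks per minute a person typically makes is the kind of sanity check detection tools automate.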
The best test of all, according to Schwed, is to ask a question only the real person could answer, such as “what restaurant did we meet at for lunch last week?”
Estoup added that AI software capable of detecting deepfakes already exists, and advised watching for significant advances in the field.
He also offered some sound advice: “If it sounds too good to be true, it often is.”
Patrick Hillman, Binance’s chief communications officer, stated in an August blog post last year that a sophisticated hoax was executed using a deepfake of him.
Hillman said the scammers built the deepfake from years of his news interviews and TV appearances in order to “trick some highly intelligent crypto members.” He only became aware of the scam after receiving online messages thanking him for his time spent speaking with project teams about potentially listing their assets on Binance.com.
SlowMist, a blockchain security firm, recorded 303 blockchain security incidents in 2022, with phishing, rug pulls, and other frauds accounting for 31.6% of them.
Disclaimer: The information provided is not trading advice; Bitcoinworld.co.in holds no liability for any investments made based on the information provided on this page. We strongly recommend independent research and/or consultation with a qualified professional before making any investment decisions.