Deepfake technology used by crypto fraudsters to bypass know-your-customer (KYC) verification on crypto exchanges such as Binance is only going to get more advanced, Binance’s chief security officer warns.
Deepfakes are made using artificial intelligence tools that use machine learning to create convincing audio, images or videos featuring a person’s likeness. While there are legitimate use cases for the technology, it can also be used for scams and hoaxes.
Deep fake AI poses a serious threat to humankind, and it’s no longer just a far-fetched idea. I recently came across a video featuring a deep fake of @cz_binance , and it’s scarily convincing. pic.twitter.com/BRCN7KaDgq
— DigitalMicropreneur.eth (@rbkasr) February 24, 2023
Speaking to Cointelegraph, Binance chief security officer Jimmy Su said there has been a rise in fraudsters using the technology to try to get past the exchange’s customer verification processes.
“The hacker will look for a normal picture of the victim online somewhere. Based on that, using deep fake tools, they’re able to produce videos to do the bypass.”
Su said the tools have become so advanced that they can even correctly respond, in real time, to audio instructions designed to check whether the applicant is human.
“Some of the verification requires the user, for example, to blink their left eye or look to the left or to the right, look up or look down. The deep fakes are advanced enough today that they can actually execute those commands,” he explained.
However, Su believes the faked videos are not yet at a level where they can fool a human operator.
“When we look at those videos, there are certain parts of it we can detect with the human eye,” for example, when the user is required to turn their head to the side, said Su.
“AI will overcome [them] over time. So it’s not something that we can always rely on.”
In August 2022, Binance’s chief communications officer Patrick Hillmann warned that a “sophisticated hacking team” was using his previous news interviews and TV appearances to create a “deepfake” version of him.
The deepfake version of Hillmann was then deployed to conduct Zoom meetings with various crypto project teams promising an opportunity to list their assets on Binance — for a price, of course.
Hackers created a “deep fake” of me and managed to fool a number of unsuspecting crypto projects. Crypto projects are virtually under constant attack from cybercriminals. This is why we ask most @binance employees to remain anonymous on LinkedIn. https://t.co/tScNg4Qpkx
— Patrick Hillmann (@PRHillmann) August 17, 2022
“That’s a very difficult problem to solve,” said Su, when asked about how to combat such attacks.
“Even if we can control our own videos, there are videos out there that are not owned by us. So one thing, again, is user education.”
Related: Binance off the hook from $8M Tinder ‘pig butchering’ lawsuit
Binance is planning to release a blog post series aimed at educating users about risk management.
In an early version of the blog post featuring a section on cybersecurity, Binance said that it uses AI and machine learning algorithms for its own purposes, including detecting unusual login and transaction patterns and other abnormal activity on the platform.