Deepfake AI brings ‘new wave’ of cyber attacks. This new tool is fighting back

IdentifAI founders can expose deepfakes and keep them ‘from gaining traction’

Two University of Virginia graduates have teamed up to develop a system to flag deepfake content appearing on social media platforms from Facebook to YouTube.

Just last week, 21-year-old Paul Vann and 28-year-old Justin Marciano caught an AI version of Coinbase CEO Brian Armstrong offering a “double your money” bitcoin investment deal on YouTube.

Of course, Armstrong never recorded the commercial, which YouTube later pulled.

“The domains that they (consumers) were being redirected to were created only three days before the attack,” IdentifAI cofounder Paul Vann told News 6. “So the idea is you go there, you send your Bitcoin and you’re waiting for it to come back double, and it never does.”

AI-generated fakes are making waves in everything from audio to video, and they are convincingly realistic.

On Friday, the Better Business Bureau issued a scam alert writing in part, “Scammers can use new AI technology to mimic the voice of someone you know and create a phone call or voicemail recording. This ‘voice cloning’ technology has recently advanced, and anyone with the right software can clone a voice from a very small audio sample.”

That same technique is now being used in deepfake videos of Taylor Swift, Elon Musk and Brian Armstrong.

“It looked like him, sounded like him, and they even got a Coinbase-esque office in the background,” Vann said.

Marciano views the deepfakes as a “new wave of cybersecurity attack.”

“Just the fact that this video was able to get through the cracks is pretty surprising,” Marciano said. “We ran each of the five still shots that were taken from the video through our system, and we were able to confirm that this image is not real.”
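
For readers curious what that per-frame check looks like in practice, here is a minimal sketch: pull a handful of still frames from a suspect video and score each one with an image-level deepfake detector. IdentifAI's actual model and tooling are not public, so the file name, threshold, and `score_frame` function below are hypothetical placeholders, not the company's method.

```python
import cv2  # pip install opencv-python


def extract_stills(video_path: str, num_stills: int = 5):
    """Grab num_stills frames spaced evenly across the video."""
    cap = cv2.VideoCapture(video_path)
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    frames = []
    for i in range(num_stills):
        cap.set(cv2.CAP_PROP_POS_FRAMES, int(i * total / num_stills))
        ok, frame = cap.read()
        if ok:
            frames.append(frame)
    cap.release()
    return frames


def score_frame(frame) -> float:
    """Placeholder: probability that the frame is AI-generated.

    A real system would call a trained detector here; this stub
    always returns 0.0 and is only meant to show where it plugs in.
    """
    return 0.0


def check_video(video_path: str, threshold: float = 0.5) -> bool:
    """Flag the video as a likely deepfake if any still scores above threshold."""
    scores = [score_frame(f) for f in extract_stills(video_path)]
    return any(s > threshold for s in scores)


if __name__ == "__main__":
    # "suspect_ad.mp4" is an example path, not a real file from the story.
    print("likely deepfake:", check_video("suspect_ad.mp4"))
```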

Marciano noted that it was surprising that “it was a paid advertisement on YouTube, just like you would see for Windex or a Kellogg’s cereal ad.”

“These ads a lot of times are not fully vetted,” Vann told News 6. “Companies pay a lot of money to put the ads out there, but hackers are doing the exact same thing.”

At this point, no federal law clearly bans deepfakes, but 10 states have enacted statutes criminalizing “non-consensual deepfake pornography.”

In Florida, House Bill 919, passed in Tallahassee this session, requires “certain political advertisements, electioneering communications, or other miscellaneous advertisements to include a specified disclaimer.”

The law mandates that the disclaimer in video must “occupy 4 percent of the vertical picture height.”

For audio AI, the law calls for the disclaimer to be “at least 3 seconds in length.”

If the governor signs the legislation into law, it will go into effect July 1, 2024.

