The danger of Deepfakes

Aditi Dosi
2 min read · Jan 9, 2023


Deepfakes are ranked the biggest AI threat by experts

Deepfakes use technology to graft real people's faces and voices onto convincing fake footage, highlighting the need to question the rise of Artificial Intelligence and its misuse.

AI-supported deepfake technology offers improved capabilities, but it also widens the scope for manipulation and bad-actor intervention. Start with the obvious examples: check out this TikTok compilation about Tom Cruise. FAKE. Or this video of Barack Obama calling Trump “a total and complete dipshit.” Also FAKE. The list goes on. There was once a meme in which Nicolas Cage became the fake leading actor in a series of different movies (video compilation). Today, anyone can create a deepfake; no programming skills are needed.

As deepfake technology continues to advance, specialists warn that fake content will become harder to identify and stop, and could serve bad actors in a variety of schemes, from discrediting a public figure to extracting funds by impersonating a couple’s son or daughter in a video call.

Scammers increasingly use deepfake face-swapping to deceive people. Nefarious uses of the technology include celebrity pornography, election manipulation, identity theft, and financial fraud.

How to spot a deepfake?

It is hard, and it is getting harder. In 2018, US researchers discovered that deepfake faces do not blink the way real people do.

The reason: the machine-learning algorithm is trained on still images, and photos rarely show people with their eyes closed, so the AI never learns to blink. This discovery looked like a reliable benchmark for detecting deepfakes.
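To make that heuristic concrete, here is a minimal sketch of the kind of eye-aspect-ratio (EAR) blink check that 2018-era detectors relied on. It assumes dlib, OpenCV, and NumPy are installed, along with dlib's freely downloadable 68-point landmark model; the file name and threshold are illustrative, not a production detector.

```python
# A minimal sketch of the 2018-era blink heuristic: count blinks via the
# eye aspect ratio (EAR) over dlib's 68-point facial landmarks.
# Assumes dlib, opencv-python, and numpy are installed, plus dlib's
# downloadable model file "shape_predictor_68_face_landmarks.dat"
# (file name and threshold here are illustrative).
import cv2
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

LEFT_EYE = range(36, 42)    # landmark indices in the 68-point scheme
RIGHT_EYE = range(42, 48)

def eye_aspect_ratio(pts):
    # EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|); it drops sharply when
    # the eye closes, so a blink shows up as a brief dip.
    p = np.asarray(pts, dtype=float)
    vertical = np.linalg.norm(p[1] - p[5]) + np.linalg.norm(p[2] - p[4])
    horizontal = np.linalg.norm(p[0] - p[3])
    return vertical / (2.0 * horizontal)

def count_blinks(video_path, ear_threshold=0.21):
    cap = cv2.VideoCapture(video_path)
    blinks, eyes_closed, frames = 0, False, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames += 1
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector(gray)
        if not faces:
            continue
        shape = predictor(gray, faces[0])  # track the first detected face
        ears = [eye_aspect_ratio([(shape.part(i).x, shape.part(i).y)
                                  for i in eye])
                for eye in (LEFT_EYE, RIGHT_EYE)]
        closed = np.mean(ears) < ear_threshold
        if closed and not eyes_closed:
            blinks += 1  # falling edge of the EAR signal = one blink
        eyes_closed = closed
    cap.release()
    return blinks, frames
```

A real speaker blinks roughly every two to ten seconds, so a long clip with a blink count near zero was the red flag. As the next paragraph notes, newer deepfakes defeat exactly this check.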

But soon, deepfakes appeared in which people blinked normally. As soon as a weakness is discovered, it gets fixed. For now, cross-checking a clip against several other reliable sources may be the only way to confirm that a person genuinely said something.
