Pine, from Aofei Temple | Quantum Bit (WeChat Official Account: QbitAI)
DeepFake has been used in telecom fraud. How can we fight back?
Just ask the other person to turn their head and show their profile.
DeepFake has always had this vulnerability: when a forged face turns fully sideways (a 90° profile), its realism drops sharply.
Why does this happen?
An article online analyzes why facial forgeries degrade so badly when viewed from the side.
Reasons for Profile Distortion
Horizontal Limitations
When DeepFake is used to swap faces, realism falls off sharply once the face turns away from the camera.
This is because most 2D-based facial alignment algorithms can recover only 50%–60% of the feature points visible in a frontal view once the face turns toward profile.
Take the Multi-view Hourglass facial alignment model from “Joint Multi-view Face Alignment in the Wild” as an example: it identifies facial feature points and is trained on that annotated data.
As shown in the image, 68 feature points are identified on the frontal view, but only 39 on the side view.
A profile view thus hides nearly half of the feature points, which not only hinders recognition but also degrades the accuracy of training and of any subsequent facial synthesis.
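To see how quickly alignment degrades with head rotation, here is a minimal sketch of the classic landmark-plus-solvePnP head-pose recipe, using dlib’s 68-point predictor and OpenCV. The file path, generic 3D model coordinates, and camera approximation are illustrative assumptions, not the pipeline of any particular DeepFake tool; note that at a true 90° profile the frontal detector often fails outright, which is exactly the failure mode described above.

```python
import cv2
import dlib
import numpy as np

# Assumption: dlib's standard 68-point predictor file sits next to the script.
PREDICTOR_PATH = "shape_predictor_68_face_landmarks.dat"

detector = dlib.get_frontal_face_detector()   # note: a *frontal* face detector
predictor = dlib.shape_predictor(PREDICTOR_PATH)

# A generic sparse 3D face model (rough millimetre units) matched to six of
# the 68 dlib landmark indices -- the classic solvePnP head-pose recipe.
MODEL_3D = np.array([
    (0.0,     0.0,    0.0),    # 30: nose tip
    (0.0,  -330.0,  -65.0),    # 8:  chin
    (-225.0, 170.0, -135.0),   # 36: left eye outer corner
    (225.0,  170.0, -135.0),   # 45: right eye outer corner
    (-150.0, -150.0, -125.0),  # 48: left mouth corner
    (150.0,  -150.0, -125.0),  # 54: right mouth corner
], dtype=np.float64)
LANDMARK_IDS = [30, 8, 36, 45, 48, 54]

def head_yaw_degrees(frame):
    """Return estimated head yaw for the first detected face, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector(gray)
    if not faces:
        return None  # at a full 90-degree profile, detection often just fails
    shape = predictor(gray, faces[0])
    pts_2d = np.array([(shape.part(i).x, shape.part(i).y)
                       for i in LANDMARK_IDS], dtype=np.float64)
    h, w = frame.shape[:2]
    # Crude pinhole camera: focal length ~ image width, centre of the frame.
    cam = np.array([[w, 0, w / 2], [0, w, h / 2], [0, 0, 1]], dtype=np.float64)
    ok, rvec, _ = cv2.solvePnP(MODEL_3D, pts_2d, cam, None,
                               flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return None
    rot, _ = cv2.Rodrigues(rvec)
    # Yaw = rotation about the vertical axis, extracted from the matrix.
    return float(np.degrees(np.arctan2(-rot[2, 0],
                                       np.sqrt(rot[0, 0]**2 + rot[1, 0]**2))))
```

As yaw grows, more of the six anchor landmarks become self-occluded guesses, and both the pose estimate and any alignment built on it degrade together.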
DeepFake expert Dr. Siwei Lyu stated:
For current DeepFake technology, the side profile is indeed a big problem. Facial alignment networks perform very well on frontal faces, but poorly on profiles.
These algorithms have a fundamental limitation: if you cover only part of your face, the alignment mechanism still works, and works remarkably well; but once you turn your head, more than half of the feature points are lost.
An Image-Data “Desert” for Ordinary People
To achieve a relatively realistic effect in face swapping, a large amount of training is required, which means sufficient training data is needed.
Someone online trained a model on a large dataset to swap Jerry Seinfeld’s face into scenes from Pulp Fiction (1994).
Even the resulting side-profile frames are hard to identify as forgeries:
Achieving such realism, however, requires extensive training data: in this example, the TV show “Seinfeld” supplied 66 hours of usable footage.
In contrast, image data of ordinary people is scarce, and very few people have photos capturing a full 90° profile.
Therefore, faces forged through DeepFake can easily show flaws when viewed from the side.
A netizen on Hacker News joked:
I recently went to open a card at a bank I’d never heard of, and they actually required a side-profile photo. I was puzzled at the time, but now I finally understand why.
Waving a Hand in Front of the Face Can Also Expose Forgeries
When judging whether the person on a video call is a DeepFake forgery, besides checking the side profile, there is another simple trick: ask them to wave a hand in front of their face.
If the face is forged, the overlay of the hand and the facial image may glitch, and the waving motion may lag noticeably.
△ In face swaps of Black Widow and Professor X, the overlay of hand and face breaks down.
Real-time DeepFake runs into exactly this problem: it must composite real occlusions on top of the synthetic face, a step commonly called “masking” or “matting.”
A real-time model has to perform this matting on demand and to a convincing standard, yet many occlusions confuse it; anything that overlaps the facial features can throw the model off and make the matting difficult.
A hand waved in front of a forged face is a fast-moving occluder that the matting step genuinely struggles to track, producing noticeable lag and a degraded overlay.
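Conceptually, the per-frame compositing (“matting”) step reduces to an alpha blend. The toy sketch below, with hypothetical array names, shows why a waving hand is such an effective probe: the mask must be re-estimated on every frame, and a fast-moving occluder changes it faster than the model can track.

```python
import numpy as np

def composite(frame, fake_face, mask):
    """Alpha-composite a synthesized face back into the live frame.

    frame, fake_face: uint8 H x W x 3 images (hypothetical inputs).
    mask: float H x W array in [0, 1] -- 1 where the fake face should show,
    0 where the real frame (hand, hair, background) must shine through.

    A real-time deepfake has to re-estimate `mask` every frame; a hand
    sweeping across the face changes its shape between frames faster than
    the model can follow, which produces the smearing and lag described above.
    """
    mask = mask[..., None]  # broadcast the mask over the colour channels
    return (mask * fake_face + (1.0 - mask) * frame).astype(frame.dtype)
```

Any per-frame error in the mask shows up directly in the output as ghosting around the occluder, which is why this failure is visible to the naked eye.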
One More Thing
Face-swapping crime is not far off: media outlets have reported suspects using DeepFake face swaps in remote interviews for IT jobs, attempting to infiltrate companies and gain access to customer data, financial data, corporate IT databases, and proprietary information.
The FBI’s Internet Crime Complaint Center (IC3) has stated that it received multiple complaints of people using stolen personal information together with deepfaked video and audio to apply for remote technical jobs.
In a case described in a May report from a federal agency, some of the face-swapping suspects operated through several shell companies, making their identities harder to trace.
Reference links:
[1] https://metaphysic.ai/to-uncover-a-DeepFake-video-call-ask-the-caller-to-turn-sideways/
[2] https://news.ycombinator.com/item?id=32384653
[3] https://arxiv.org/pdf/1708.06023.pdf
[4] https://gizmodo.com/deepfakes-remote-work-job-applications-fbi-1849118604
— End —