
How scientists are fighting ‘Deepfake’ celebrity porn videos with AI



Deepfake videos are getting better and better, and it’s terrifying – there’s an intense community trading fake celebrity porn, and manipulated political speeches are furthering the serious fake news epidemic. ‘Deepfakes’ are AI-edited videos in which one person’s face is superimposed onto another person’s body – everyone from Katy Perry to Vladimir Putin has become a victim.
Since Motherboard first reported on the new trend at the beginning of 2018, an online crackdown against AI-assisted fake porn and other videos has been taking place. Reddit and Gfycat were among the first platforms to begin a ‘mass purge’ of hardcore clips featuring prominent musicians and actors with their faces mapped onto porn performers. Twitter, Pornhub, and Discord soon followed, labeling the videos as “nonconsensual” – nevertheless, the volume of videos on these networks is huge and proving difficult to contain.
Now, researchers at the State University of New York are working on AI to detect and combat Deepfakes. The scientists at SUNY noticed that, though hyper-realistic, the videos lacked subtle human behaviours and physical quirks like blinking and breathing. This is because Deepfake generators are trained on still images rather than video, and people are rarely photographed with their eyes closed. SUNY’s rival AI uses computer vision to determine whether blinking is taking place, flagging videos where it isn’t as likely Deepfakes.
Example of blinking-eye detection by AI combatting Deepfakes (via SUNY)
The team continues to work on the new AI, hoping to better its ability to detect micro-behaviours like breathing and visible pulse, as well as advancing how it sees blinking. Their research is outlined in “In Ictu Oculi: Exposing AI Generated Fake Face Videos by Detecting Eye Blinking”.
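To get a feel for how blink detection can work, here is a minimal sketch in the spirit of the approach described above. It does not reproduce SUNY’s actual model (which is a deep network); instead it uses the well-known eye aspect ratio (EAR) heuristic computed from six eye landmarks, and counts blinks as runs of frames where the EAR dips below a threshold. The landmark ordering, threshold, and frame counts are illustrative assumptions, not values from the paper.

```python
import math

def ear(eye):
    """Eye aspect ratio from six (x, y) landmarks: two horizontal
    corner points (p1, p4) and four eyelid points (p2, p3, p5, p6).
    The ratio drops sharply when the eye closes."""
    p1, p2, p3, p4, p5, p6 = eye
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def count_blinks(ear_series, threshold=0.2, min_frames=2):
    """Count blink events in a per-frame EAR series: each run of at
    least min_frames consecutive frames below threshold is one blink."""
    blinks, run = 0, 0
    for value in ear_series:
        if value < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:  # blink that lasts until the final frame
        blinks += 1
    return blinks
```

A detector built this way would run the EAR over every frame of a clip and flag footage whose blink rate falls far below the typical human rate of roughly 15–20 blinks per minute – exactly the tell the SUNY team exploited.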
Some of the most chilling videos include the likes of Gal Gadot, Katy Perry, Emma Watson, and Daisy Ridley, their faces superimposed onto hardcore porn scenes. Others include fake speeches from Donald Trump, Putin, and Barack Obama.
The deep learning method behind Deepfakes continues to improve, and may surpass this new AI quite quickly. It’s possible we’ll reach a stage where it’s impossible to tell the difference between real-life and neural network-mapped videos. Researchers suggest that the public themselves will need to be well equipped to discern fiction from reality. “It is hard to predict at what point in time such ‘fake’ videos will be indistinguishable from real content for our human eyes,” writes Michael Zollhöfer in the Stanford blog on Deepfakes, via Gizmodo.
“In my personal opinion, most important is that the general public has to be aware of the capabilities of modern technology for video generation and editing. This will enable them to think more critically about the video content they consume every day, especially if there is no proof of origin.”

