Deepfake danger: what a viral clip of Bill Hader morphing into Tom Cruise tells us

Are deepfakes a threat to democracy? The creator of a series of viral clips says he is raising awareness of their subversive potential

You’ve heard of deepfakes – doctored videos fabricating apparently real footage of people – and their potential to disrupt democracy. But this might be the clip that makes you believe it.

A YouTube clip of Saturday Night Live’s Bill Hader in conversation with David Letterman on his late night show in 2008 has gone viral for showing Hader doing an impression of Tom Cruise – as his face subtly shifts into Cruise’s.

The mind-melting video has been viewed nearly 3m times since being uploaded to the YouTube channel Ctrl Shift Face a week ago. “Well, I guess I don’t need to do mushrooms any more,” is the top-voted comment.

Other commenters have expressed disquiet at the potential of such realistic AI manipulation. “I’m always amazed with new technology,” wrote one commenter who claimed to be a software engineer, “but this is scary.”

Gavin Sheridan, a tech commentator, tweeted of the clip: “Imagine when this is all properly weaponised on top of already fractured and extreme online ecosystems and people stop believing their eyes and ears.”

But the video’s creator – a Slovakian citizen resident in the Czech Republic, who gave his name as Tom to the Guardian – says his unsettling face-swapping videos are an attempt to raise awareness in the age of fake news and footage.

In the past three months, Tom has made and shared 20-odd deepfake videos, among them The Shining reimagined with Jim Carrey in Jack Nicholson’s role and A Knight’s Tale starring Heath Ledger’s Joker.

The Matrix, with Bruce Lee replacing Keanu Reeves.

On his YouTube channel, he describes them as “windows to parallel universes”, and says his principal aim is to entertain – though he has accepted paid work, mostly in the film and entertainment business. “I don’t want to get tangled in politics too much,” he says.

Although his videos have led to him being described as a “deepfake master”, Tom says he is self-taught; working as a 3D artist in the film and games industry, however, means he has access to a sufficiently powerful computer. Each video takes about three to five days to make, using an open-source program. “I started by faking myself into shows and movies to entertain me, my friends, family, and experimenting. It’s a lot of trial and error.”

In another deepfake, David Bowie morphs into Rick Astley.

The key to creating a convincing deepfake is using footage that is of high enough resolution to create an accurate data set for the AI network to learn from, says Tom. “That’s why it’s hard to deepfake historical figures, like Churchill, for example – there’s not enough data.”
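
To illustrate the point about training data, here is a minimal, purely hypothetical sketch of the kind of screening step a deepfake pipeline might apply to source footage: it keeps only frames whose detected face crop exceeds a minimum resolution. The function name, tuple format and 256-pixel threshold are illustrative assumptions, not details of the tool Tom actually uses.

```python
# Hypothetical sketch: screening source footage for deepfake training data.
# Assumes a face detector has already produced (frame_id, width, height)
# tuples for each frame of the source video; none of these names or
# thresholds come from any real deepfake tool.

MIN_FACE_PIXELS = 256  # assumed minimum face-crop size for useful training data

def usable_frames(frames, min_size=MIN_FACE_PIXELS):
    """Keep only frame ids whose face crop is at least min_size on both axes."""
    return [
        frame_id
        for frame_id, width, height in frames
        if width >= min_size and height >= min_size
    ]

# A modern high-definition interview yields plenty of large face crops;
# low-resolution archival footage of a historical figure yields almost none.
modern_interview = [(1, 512, 540), (2, 480, 500), (3, 120, 130)]
archive_newsreel = [(1, 90, 96), (2, 110, 104)]

print(usable_frames(modern_interview))  # frames 1 and 2 pass the screen
print(usable_frames(archive_newsreel))  # nothing passes: too little data
```

The sketch captures why historical figures are hard to fake: after screening, the archival footage leaves the network with almost no frames to learn from.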

He inserted Carrey into the famous “Here’s Johnny” scene in The Shining and was disappointed: “It didn’t look very good, as you’re too close.” An attempt to replace Christian Bale in the business card scene of American Psycho with Cruise failed because of similarly extreme close-ups – and because Bale was wearing glasses.

To those who might criticise Tom for contributing to the misinformation that abounds on the internet, he says he is transparent. “I always mention that it’s a deepfake in the title and description – I don’t want to mislead anyone.”

‘It didn’t look very good, as you’re too close.’ Jim Carrey inserted into The Shining.

Though he says he is deeply concerned by fake news, and how it has been weaponised by the “alt-right”, he says deepfakes pose less of a threat than fake news articles, which are harder for the platforms to detect.

“It’s an arms race: someone is creating deepfakes, someone else is working on other technologies that can detect deepfakes. I don’t really see it as the end of the world like most people do.”

He singles out two manipulated videos recently spread by supporters of President Trump: one edited to make Nancy Pelosi slur her speech, and one apparently sped up to show CNN’s Jim Acosta accosting a White House intern. “These are exactly the type of videos that are debunked very easily – you can show the original, and, boom, you have the proof.”

Fake news articles, presented as being from a legitimate source and “spreading on Facebook like wildfire”, are of greater concern, says Tom. “To do a good deepfake takes a lot of effort. If you can spread fake news in an easier way, that’s what they will do.”

Tom describes himself as a hobbyist, but says he hopes that his deepfakes raise awareness of the potential of the technology. “People need to learn to be more critical. The general public are aware that photos could be Photoshopped, but they have no idea that this could be done with video.”

As his YouTube profile says: “Do not believe everything that you see on the internet, OK?”

