TikTok Tom Cruise deepfake creator says the public shouldn’t worry about ‘one-click fakes’

When a series of eerily convincing Tom Cruise deepfakes went viral on TikTok, some suggested it was a worrying sign of things to come – the harbinger of an era when AI would let anyone make fake videos of anyone else. The videos’ creator, however, Belgian VFX specialist Chris Ume, says this is far from the case. Speaking to The Verge about his viral clips, Ume emphasizes the amount of time and effort it takes to make each deepfake, as well as the importance of working with a top-notch Tom Cruise impersonator, Miles Fisher.

“You can’t do that just by pressing a button,” says Ume. “That’s important; it’s a message I want to send to people.” Each clip took weeks of work, he says, using the open-source DeepFaceLab algorithm as well as established video editing tools. “By combining traditional CGI and VFX with deepfakes, it’s better. I make sure you don’t see any of the flaws.”

Ume has been working with deepfakes for years, including creating the effects for the “Sassy Justice” series by South Park creators Trey Parker and Matt Stone. He started working on Cruise when he saw a video of Fisher announcing a fictional presidential run by the Hollywood star. The two teamed up on a follow-up and decided to post a series of “harmless” clips to TikTok. The account, @deeptomcruise, quickly accumulated tens of thousands of followers and likes before Ume took the videos down a few days ago.

“It fulfilled its purpose,” he says of the account. “We had fun. I created awareness. I showed my skills. We made people smile. And that’s it, the project is done.” A TikTok spokesperson told The Verge that the account was well within the platform’s rules on parody use of deepfakes, and Ume notes that Cruise – the real Tom Cruise – has since created his own official account, perhaps as a result of seeing his doppelgänger go viral.

Deepfake technology has been developing for years, and there’s no doubt the results are becoming more realistic and easier to produce. While there has been much speculation about the potential damage such technology could do to politics, those effects have so far largely failed to materialize. Where the technology is unquestionably causing harm is in the creation of revenge porn, or non-consensual pornography, of women. In such cases, the fake videos or images don’t have to be realistic to do tremendous damage. Simply threatening someone with the release of faked images, or spreading rumors that such content exists, can be enough to ruin reputations and careers.

The Tom Cruise fakes, however, show a far more benign use of the technology: as another part of the CGI toolkit. Ume says there are many uses for deepfakes, from dubbing actors in film and TV, to restoring old images, to animating CGI characters. What he emphasizes, though, is how incomplete the technology is when operating on its own.

Creating the fakes took two months just to train the base AI models (using a pair of NVIDIA RTX 8000 GPUs) on footage of Cruise, plus additional days of processing for each clip. After that, Ume had to go through each video frame by frame, making minor adjustments to sell the overall effect: smoothing a line here and covering a gap there. “The most difficult thing is to make it look alive,” he says. “You can see it in the eyes when it’s not right.”
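To give a rough sense of what that frame-by-frame compositing stage involves, here is a minimal, purely illustrative Python sketch that blends a pre-generated face back onto original footage using OpenCV. It is not Ume’s actual DeepFaceLab pipeline; the file names, directory layout, and fixed face box are all assumptions made for the example.

```python
# Hypothetical illustration only: a bare-bones frame-by-frame compositing pass.
# Assumes a face-swap model has already produced one generated face crop and a
# matching mask per frame; this script simply blends them onto the source video.
import cv2

SRC_VIDEO = "cruise_original.mp4"    # assumed input footage
OUT_VIDEO = "cruise_composited.mp4"  # assumed output path
FACE_BOX = (420, 160, 256, 256)      # assumed x, y, w, h of the face region

cap = cv2.VideoCapture(SRC_VIDEO)
fps = cap.get(cv2.CAP_PROP_FPS)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
writer = cv2.VideoWriter(OUT_VIDEO, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height))

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Per-frame model outputs assumed to exist on disk, named by frame index.
    face = cv2.imread(f"swapped_faces/{frame_idx:06d}.png")
    mask = cv2.imread(f"swapped_masks/{frame_idx:06d}.png", cv2.IMREAD_GRAYSCALE)
    if face is None or mask is None or mask.max() == 0:
        writer.write(frame)  # no usable face for this frame: pass it through untouched
        frame_idx += 1
        continue
    x, y, w, h = FACE_BOX
    face = cv2.resize(face, (w, h))
    mask = cv2.resize(mask, (w, h))
    # Poisson blending hides hard seams between the generated face and the frame.
    center = (x + w // 2, y + h // 2)
    frame = cv2.seamlessClone(face, frame, mask, center, cv2.NORMAL_CLONE)
    writer.write(frame)
    frame_idx += 1

cap.release()
writer.release()
```

Even a sketch like this only automates the blending step; the manual, shot-by-shot cleanup Ume describes is what sits on top of it.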

A lot of credit goes to Fisher, says Ume, who captured Cruise’s exaggerated mannerisms, from his manic laugh to his intense delivery. “He’s a very talented actor,” says Ume. “I just do the visual stuff.” Even so, if you look closely, you can still see moments where the illusion fails, as in the clip below when Fisher’s eyes and mouth glitch for a second as he puts on his sunglasses.


Blink and you’ll miss it: look closely and you can see Fisher’s mouth and eyes glitch.
GIF: The Verge

While Ume’s point is that his deepfakes take a lot of work and a professional impersonator, it’s also clear that the technology will improve over time. It’s difficult to predict exactly how easy it will become to create seamless fakes in the future, and experts are busy developing tools that can automatically identify fakes or verify unedited footage.

Ume, however, says he isn’t too concerned about the future. Similar technologies have emerged before, and society’s conception of truth has more or less survived. “It’s like Photoshop 20 years ago: people didn’t know what photo editing was, and now they know about these fakes,” he says. As deepfakes become more common in TV and movies, people’s expectations will adjust, just as they did with edited images in the Photoshop era. One thing is certain, says Ume: the genie can’t be put back in the bottle. “Deepfakes are here to stay,” he says. “Everyone believes that.”
