Deepfake Videos: An Issue for Video Translators and Voiceover Artists?

October 16, 2019

Do you translate videos for a living or work as a voiceover artist? If so, have you ever come across a ‘deepfake’ video? 

Deepfake videos are quite terrifying in their potential, particularly given how quickly fake news stories can spread. As such, it’s important that the video translation industry stays alert to the deployment of deepfake technology, in order to identify any videos that have been manufactured using it. 

What Is Deepfake Technology?

Deepfake software allows people to manipulate videos with such skill that they look real (anyone watching The Capture in the UK right now can see deepfake technology explored in thrilling detail). Essentially, deepfake technology can be used to make people appear to do things that they actually haven’t done. 

Initially the preserve of intelligence and propaganda agencies (as well as special effects studios), deepfake technology is now available to anyone with an interest in making fake videos, thanks to advances in machine learning and artificial intelligence. 

The technology works by using neural networks to analyse video footage and algorithmically transpose the face and movements of one human onto another. 
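For the more technically curious, the sketch below shows that idea in heavily simplified code. It is a rough illustration only, assuming PyTorch is available: one shared encoder learns a generic representation of faces, a separate decoder is trained per person, and swapping decoders at playback time is what maps one person’s face onto another’s footage. The layer sizes and tensors are placeholders rather than anything taken from a real deepfake tool.

```python
# A heavily simplified sketch (assumption: PyTorch installed) of the shared-encoder,
# per-identity-decoder autoencoder idea behind most face-swap deepfakes.
# Illustrative only; real tools add face detection, alignment and long training runs.
import torch
import torch.nn as nn


class Encoder(nn.Module):
    """Compresses a 64x64 face crop into a small latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)


class Decoder(nn.Module):
    """Rebuilds a face from the latent vector; one decoder is trained per person."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(), # 32 -> 64
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 64, 16, 16)
        return self.net(x)


encoder = Encoder()
decoder_a = Decoder()  # would be trained on face crops of person A
decoder_b = Decoder()  # would be trained on face crops of person B

# The "swap": encode a frame of person A, then decode it with person B's decoder,
# producing B's face with A's pose and expression.
frame_a = torch.rand(1, 3, 64, 64)   # stand-in for a real 64x64 face crop
swapped = decoder_b(encoder(frame_a))
print(swapped.shape)                 # torch.Size([1, 3, 64, 64])
```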

Are you concerned about the potential of deepfake technology yet? If not, you should be! 

Imagine not being able to trust in the legitimacy of anything you watch because you know it could be a fake. Then think about how many people there are out there who believe everything they read in the papers. Scared yet? 

The potential to control the way populations think and behave is enormous. The Cambridge Analytica scandal shows what there is to be gained from the manipulation of voters (watch Netflix’s The Great Hack, if you haven’t already – it’s truly chilling). Politicians who are happy to use voters’ data, targeted advertising and fake news have the potential to rig elections in their favour. Deepfake videos provide the potential to do this even more effectively. 

Imagine it’s late October 2020, just a few days before the next US presidential election. Then picture the whole of America waking up to a video of Donald Trump’s rival candidate engaging in an extra-marital sexual encounter or committing a crime. Deepfake videos can make this happen. This readily available technology is so sophisticated that it can make people appear to do and say anything. And that’s the kind of fake scandal that could swing a tightly fought election race. 

Deepfake Video Examples 

If you want to see deepfake videos in action, just Google ‘deepfake Barack Obama’ and you’ll see the former US president ‘saying’ some truly shocking things. The video is actually from the production company of American actor and comedian Jordan Peele, with Peele putting the words into Obama’s mouth. 

Watching it, you would be hard pressed to tell from the quality of the video itself that it’s a deepfake. Only common sense tells us that Barack Obama wouldn’t refer to President Trump (at least in public) in the way that he does in the clip. The video highlights perfectly the power that deepfake technology has to smash through the trust we place in what we see. 

So far, deepfake creators are known mainly for their efforts in relation to celebrity pornography – where pornographic videos are forged by editing real celebrities’ faces over those of the actors. An unpleasant experience for the celebrities concerned, no doubt, but a focus that hasn’t caused widespread harm. 

However, the ease with which it’s possible to make deepfake videos means that it’s only a matter of time before we see them used to smear political candidates and stir up tense situations. 

The fact that huge swathes of the population have never heard of deepfake videos makes the situation even more alarming. A 2019 iProov survey found that 72% of UK adults had never heard of deepfake videos. The personal, professional and even national security implications of this are staggering. Almost three quarters of the population would likely believe a deepfake video without even realising that the technology to fake it existed. 

How Video Translators Can Spot a Deepfake Video

It stands to reason that all those who undertake video translation or voice actor work for a living need to be on the lookout for deepfakes. Whether your work involves captioning, subtitling, closed captioning, editing or providing voiceover services, the technical focus of the job puts freelancers and translation agencies who provide video translation services to clients around the globe in a strong position to spot deepfake videos. 

But how do you spot something that looks and sounds so convincingly realistic? This isn’t just a question of spotting a lookalike or an impressionist, after all. 

Deeptrace’s first report on the evolution of deepfake videos observes a ‘lack of reliable technology for detection’ when it comes to such videos. As such, human vigilance is front and centre of the effort to detect deepfakes. So, how can you spot one? 

One of the first tell-tale signs of a deepfake video relates to blinking. Deepfake technology uses images to create the ‘skin’ that will appear on the original video. Most politicians and other well-known figures are photographed with their eyes open, so deepfake software can struggle with the blinking element of the deception. The result is that deepfake subjects often blink less than real people do, and their blinking may seem slightly off. 

Not enough to go on? OK, in that case, try slowing the video down. Deep learning expert Jonathan Hui suggests that this may reveal some blurring around the face, but not around other elements of the video – another sign of a deepfake. 

Examine the skin tone at the edges of the face, too, for any irregularities, and check the chin and eyebrows for any blurring or doubling up. If a hand or another object briefly obscures the face, check for blurring there as well. 
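If you want to go beyond eyeballing the clip in a normal player, a few lines of code can slow it right down for you. The sketch below is a rough, hypothetical example assuming Python and OpenCV are installed; ‘suspect_clip.mp4’ is just a placeholder file name. It steps through the video at roughly five frames per second so that blinking, face edges, chins and eyebrows can be studied frame by frame.

```python
# A rough sketch (assumption: Python with OpenCV installed) of the "slow it down" tip:
# step through a clip at ~5 fps so blurring and odd blinking are easier to spot.
import cv2

cap = cv2.VideoCapture("suspect_clip.mp4")  # placeholder file name
frame_index = 0

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break  # end of the clip
    frame_index += 1
    # Overlay the frame number so suspicious frames can be noted and revisited.
    cv2.putText(frame, f"frame {frame_index}", (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("deepfake check", frame)
    # Hold each frame for 200 ms (~5 fps); press 'q' to stop early.
    if cv2.waitKey(200) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```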

What to Do if a Client Asks You to Translate a Deepfake Video 

It’s up to you whether or not you are happy to translate deepfake videos. Often, the content of the video itself may make the decision easier. Is it clearly humorous? The work of an entertainer showing off their ability to mimic and impersonate celebrities? Or is it something more sinister? 

If you’re concerned, you have the option to address your issues with the client. Freelance translators and voice artists are never obliged to work on projects to which they have a moral objection. As such, if you feel the video has the potential to cause harm, you always have the option to decline the work. 

Remember that audio translation work can contribute to the production of deepfake videos too. If you’re asked to undertake an audio translation that you’re uncomfortable with, you can decline the job. And, of course, if there’s any indication of criminal or harmful content, it’s worth converting the audio to text and reporting the transcription to the authorities. Even if the recording itself is in a language they don’t understand, the text transcript should give them sufficient grounds to look into the situation. 
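For the transcription step, something as simple as the sketch below can do the job. It assumes the open-source openai-whisper package is installed, and ‘suspicious_audio.mp3’ and ‘transcript.txt’ are placeholder file names; any speech-to-text tool you already trust would work just as well.

```python
# A minimal sketch (assumption: the open-source openai-whisper package is installed)
# for turning a suspicious audio file into a text transcript that can be passed on.
import whisper

model = whisper.load_model("base")                 # small, multilingual model
result = model.transcribe("suspicious_audio.mp3")  # placeholder file name

with open("transcript.txt", "w", encoding="utf-8") as f:
    f.write(result["text"])

print(result["text"][:200])  # preview the start of the transcript
```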

If you come across a deepfake video on a website, it’s possible to report it. Twitter and Gfycat have committed to deleting deepfake videos and blocking those who publish them. Reddit removes such content under its ‘involuntary pornography’ policy, while Google’s ban list includes ‘involuntary synthetic pornographic imagery.’

There are also legal challenges to deepfakes underway in some US states, with Virginia, Texas, New York and California all passing legislation designed to tackle them. Nationally, the Malicious Deep Fake Prohibition Act was introduced to the Senate in 2018 and the Deepfakes Accountability Act to the House of Representatives in 2019. 

With the pace of deepfake software advancement outstripping the pace of technological regulation, the video translation industry is well placed to support the human effort to identify deepfakes and ensure that they don’t cause harm. Whether you run a translation agency or work freelance, you have the power to spot deepfakes before they spread.

You can also help by spreading the word (for example by sharing this article!) so that the wider translation community becomes aware of what deepfakes are and what can be done to spot and report them. 

Are you ready to be part of that effort? If so, make sure you keep up to date with the latest deepfake software developments and be on the lookout for videos that just don’t quite feel right. 

Good luck! 

By Ofer Tirosh

Ofer Tirosh is the founder and CEO of Tomedes, a language technology and translation company that supports business growth through a range of innovative localization strategies. He has been helping companies reach their global goals since 2007.
