Just about every year since 1952, Queen Elizabeth II of the United Kingdom has delivered a Christmas address to the masses, and 2020 will be no different. Shortly after she gives her remarks, however, British broadcaster Channel 4 will air an “alternative message” from the Queen, brought to life by deepfake software and an actress with a pseudo-regal affect.
“On the BBC, I haven’t always been able to speak plainly and from the heart,” the “Queen” said in a promo posted to the broadcaster’s Twitter. “So I’m grateful to Channel 4 for giving me the opportunity to say whatever I like without anyone putting words in my mouth.”
There’s relatively little risk that anyone would look at Channel 4’s deepfake and regard it as a genuine message from the Queen. Apart from the fact that actress Debra Stephenson just doesn’t do a great impression, the topics supposedly up for discussion — a candid take on Harry and Meghan’s departure from the royal family and Prince Andrew’s history with financier and sex trafficker Jeffrey Epstein — are far afield of typical Queen-ly protocol. And from a technical perspective, the visuals seen here fall well short of being lifelike. For deepfakes, these kinds of gaps in verisimilitude aren’t uncommon.
To create these kinds of doctored images, a neural network has to be trained on as much footage as possible of the person being faked. Once that’s done, its learned model of what the subject looks like can be used to map their likeness onto someone else’s performance. Advances in machine learning, and in the hardware that powers it, have made creating deepfakes feasible for small teams and individuals, but crafting truly convincing ones takes a level of sophistication that few seem to have mastered. For better or worse, this attempt is easily seen through.
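To make that mechanic a bit more concrete, here is a minimal, hypothetical sketch of the shared-encoder, dual-decoder autoencoder design popularized by open-source face-swap tools. It is an illustrative assumption about the general technique, not Channel 4’s actual pipeline; the layer sizes, learning rate, and variable names are all placeholders.

```python
# Hypothetical sketch of the shared-encoder / dual-decoder autoencoder behind
# many open-source face-swap tools. All sizes and settings are illustrative.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses an aligned 64x64 face crop into a latent pose/expression code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.1),    # 64 -> 32
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.1),  # 32 -> 16
            nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.LeakyReLU(0.1), # 16 -> 8
            nn.Flatten(),
            nn.Linear(256 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a 64x64 face for one specific identity from the shared code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 256 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.1),  # 8 -> 16
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.1),   # 16 -> 32
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),          # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 256, 8, 8))

# One encoder shared by both identities; one decoder per identity.
encoder = Encoder()
decoder_a = Decoder()  # reconstructs faces of person A (e.g. the performer)
decoder_b = Decoder()  # reconstructs faces of person B (e.g. the subject being faked)

optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=5e-5,
)
loss_fn = nn.L1Loss()

def train_step(faces_a, faces_b):
    """Each identity is reconstructed through its own decoder; the encoder is shared."""
    optimizer.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    optimizer.step()
    return loss.item()

# At inference time, the swap comes from crossing the decoders: a frame of person A
# goes through the shared encoder, then is decoded with person B's decoder.
with torch.no_grad():
    frame_a = torch.rand(1, 3, 64, 64)      # stand-in for an aligned face crop
    swapped = decoder_b(encoder(frame_a))   # A's pose and expression, B's face
```

The point of the shared encoder is that it learns pose and expression features common to both people, while each decoder learns to render one specific face; the more footage of the subject the decoder sees during training, the more convincing the swap tends to be, which is why results vary so widely.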
Channel 4 says the message is intended to serve as a “stark warning” about the worrying potential of fake news and manipulated messages, and the broadcaster’s director of programmes told The Guardian that the video is “a powerful reminder that we can no longer trust our own eyes.”
While Channel 4’s intent might have been to educate, that hasn’t prevented the national broadcaster from taking heat for the stunt. Since the teaser was shared on Twitter last night, the platform has been alight with people deriding the decision for (among other things) its perceived bad taste and disrespect, as well as the effect it could have on how people react to deepfakes in the future:
This is appalling. For a channel that highlights the dangers of fake news and doctored content online in its news output, it’s very disappointing to see it indulging in the same methods for ‘comedy’ and normalising the blurred lines.
— Benjamin Butterworth (@benjaminbutter) December 24, 2020
via https://AiUpNow.com, December 24, 2020 at 12:39PM, by Khareem Sudlow