A Deep Fake Fooled My Mother

A few days after Rosh Hashanah, I met up with my mother at a restaurant. Ironically, or not, it was at a place called Gan Ha’em, or “Garden of the Mother”. I wasn’t expecting the conversation to cover Elvis.

“Did you know that Elvis sang a Jewish song about Rosh Hashanah?” she asked me. A friend had told her a story about how the famous Israeli singer Naomi Shemer had met Elvis back in the sixties and taught him a song. Then she showed me a clip someone had sent her of Elvis performing on the Ed Sullivan Show, singing a version of the Twelve Months Song (think Neil Sedaka’s Calendar Girl, but for the Hebrew calendar). She seemed convinced by the story.

The video instantly screamed deep fake. A few key giveaways:

  • Big heads. I mean, the heads were just not proportional to the bodies. They also didn’t blend at the neck.
  • The overuse of filters to give it a sixties television look.
  • The eye movements looked like they came straight out of the uncanny valley.
  • The camera tracking was too smooth and looked modern.

Oh, the biggest giveaway was that the credits mentioned it was a deep fake. It was a New Year’s greeting by Israeli ad agency B.Y. Creative and Productions and singer Ishay Raziel.

Beyond just creating a deep fake, the creators clearly put effort into the backstory. They translated a well-known holiday song, they adapted the words, they made it sound Elvis-y, and they had to act it out. It wasn’t a simple effort and it was fun to watch. My young daughter, who had just learned the Hebrew version, now wanted to hear the English version over and over again.

However, for someone consuming the story through the torrent of messages and posts, the title and the 30 seconds of focus allotted to consuming it can make it seem plausible. Facts about history are being revealed all the time, especially as archives are digitized. It’s in this state of being overwhelmed with information that we can be manipulated and deceived.

If we can’t focus for more than 30 seconds on news or posts, we’re going to be prone to fake news. The fake is looking more real. In three to four years, deep fakes will be indiscernible from the real thing, and they’ll come paired with equally indistinguishable voice synthesis. Compounding this problem is that even the real is fake in some way: post-processing of images makes us look better (or “more real”) by enhancing our appearance.

While the creative implementation of deep fakes today can create confusion, in the very near future it may create chaos. The democratization of deep fake making, together with creative storytelling, may mean we’ll need to deeply scrutinize all of the videos and news we consume.

We won’t be able to keep up, and there’s a growing need for AI support. Generative adversarial networks are often used to build deep fakes: one AI creates, the other tries to discern whether the result is real, and the process repeats until the discerning AI can’t tell a fake from the real thing. We need another AI adversary to spot the fakes, or at least rouse our suspicion. This AI should present us with a confidence score, along with plain-language explanations of why it suspects what we’re looking at is a deep fake.
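As a sketch of what that output could look like, the snippet below combines per-signal anomaly scores (like the giveaways listed above) into an overall confidence plus plain-language reasons. Everything here — the signal names, weights, and threshold — is hypothetical, not taken from any real detector:

```python
# Hypothetical sketch: combine per-signal anomaly scores (0.0-1.0) into an
# overall "likely deep fake" confidence, plus plain-language explanations.
# Signal names and weights are illustrative only.

SIGNALS = {
    "head_body_proportion": ("Head size is out of proportion to the body", 0.30),
    "neck_blend": ("Face does not blend naturally at the neck", 0.25),
    "eye_movement": ("Eye movements look unnatural", 0.25),
    "camera_tracking": ("Camera motion is too smooth for the claimed era", 0.20),
}

def explain_fake(scores: dict, threshold: float = 0.5):
    """Return (confidence, explanations) for the suspicious signals."""
    # Weighted sum of anomaly scores gives the overall confidence.
    confidence = sum(SIGNALS[name][1] * score for name, score in scores.items())
    # Only signals above the threshold get a plain-language explanation.
    explanations = [SIGNALS[name][0] for name, score in scores.items()
                    if score >= threshold]
    return confidence, explanations

conf, reasons = explain_fake({
    "head_body_proportion": 0.9,
    "neck_blend": 0.8,
    "eye_movement": 0.7,
    "camera_tracking": 0.4,
})
```

A real system would derive those scores from learned models rather than hand-set heuristics, but the user-facing shape — a number plus reasons a person can check — is the point.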

The other solution is to use a blockchain for content creation. We should be able to know the origin of every frame and all of the underlying software used to shoot, encode, and edit the content. While this would remove anonymity, it would ensure some authenticity to what we’re viewing.
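A minimal sketch of the idea: hash each frame's bytes together with the previous frame's hash, so that altering any frame breaks every link after it. This is an illustration of the chaining concept, not any real provenance standard:

```python
import hashlib

def chain_frames(frames):
    """Hash each frame together with the previous hash, forming a tamper-evident chain."""
    chain, prev = [], b""
    for frame in frames:
        digest = hashlib.sha256(prev + frame).hexdigest()
        chain.append(digest)
        prev = digest.encode()
    return chain

def verify(frames, chain):
    """Recompute the chain from the frames; any altered frame invalidates it."""
    return chain_frames(frames) == chain

frames = [b"frame-1", b"frame-2", b"frame-3"]
chain = chain_frames(frames)

tampered = [b"frame-1", b"FAKE", b"frame-3"]
# verify(frames, chain) is True; verify(tampered, chain) is False
```

A production scheme would also sign the chain and record the capture and editing toolchain, which is roughly what content-provenance efforts aim for.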

Beyond a tech solution to spotting deep fakes, ongoing education and awareness campaigns around media consumption will help us separate the entertaining from the deceptive.

Leor Grebler

Independent daily thoughts on all things future, voice technologies and AI. More at http://linkedin.com/in/grebler