Baby and Animal Translators

Leor Grebler
3 min read · Jun 18, 2023


Generated by author using Midjourney

In a recent episode of The Tim Ferriss Show, Tim co-hosts with Kevin Kelly a game called “Heresies,” in which each participant reveals something they think others would find heretical and the group then discusses it. Tim’s heresy was a belief that we would soon be able to communicate much better with the animal kingdom.

It instantly brought me back to two cartoon instances of translators. The first was from an ancient episode of The Simpsons in which, as a comeback from bankruptcy, a character named Herb invents a baby translator that turns baby sounds into intelligible sentences. He becomes rich off the device.

The second was an episode of Rick and Morty in which Morty is gifted a headset that lets him hear what animals are saying. He discovers that squirrels are behind a global conspiracy and, based on his shocked appearance, they realize he can hear them and begin to follow him.

From an episode of Rick and Morty

In his “heresy,” Tim suggests that AI and, to a lesser extent, psychedelics will allow us to better understand and communicate with animals. The hope is that this would give us better insight into the natural world and maybe lead to more environmentalism. Others on the panel suggested that humans recognizing other creatures as sentient might just lead to sentience no longer being a reason to treat someone with compassion. That would mean we end up treating each other worse, rather than better.

I believe Tim is right that AI for deciphering animal communication and translating it into human language is likely in the very short term. The same techniques used to build generative tools can be applied to large samples of animal communication, along with the small set of signals we actually understand. We can then validate the resulting translations against animal reactions.
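As a rough illustration of what such a pipeline might look like (not the specific approach Tim describes), here is a minimal sketch that clusters unlabeled calls by simple acoustic features and maps the clusters onto the few calls whose meaning we think we know. It assumes librosa and scikit-learn, and it uses synthetic tones as stand-ins for real recordings; the labels and frequencies are made up for illustration.

```python
# Rough sketch: cluster unlabeled animal calls, then map clusters to the few
# calls whose meaning we think we know. All data here is a synthetic stand-in.
import numpy as np
import librosa
from sklearn.cluster import KMeans

SR = 22050  # sample rate (Hz)

def fake_call(freq_hz, seconds=0.5):
    """Synthetic stand-in for a recorded vocalization (a simple tone)."""
    t = np.linspace(0, seconds, int(SR * seconds), endpoint=False)
    return np.sin(2 * np.pi * freq_hz * t).astype(np.float32)

# Large unlabeled corpus plus a tiny labeled subset (hypothetical labels).
unlabeled = [fake_call(f) for f in np.random.uniform(300, 3000, size=200)]
labeled = {"food_call": fake_call(400), "alarm_call": fake_call(2500)}

def embed(waveform):
    """Summarize a call as the mean of its MFCCs (a crude acoustic embedding)."""
    mfcc = librosa.feature.mfcc(y=waveform, sr=SR, n_mfcc=13)
    return mfcc.mean(axis=1)

X = np.stack([embed(w) for w in unlabeled])
clusters = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X)

# Map each known call to a cluster, so new recordings that land in the same
# cluster can be given a tentative "translation" to check against behavior.
for name, wav in labeled.items():
    cluster_id = clusters.predict(embed(wav)[None, :])[0]
    print(f"{name!r} falls into cluster {cluster_id}")
```

A real system would presumably swap the MFCC-and-KMeans step for a large model trained on raw recordings, and the final check would be whether animals react to played-back calls the way the tentative labels predict.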

Another example from sci-fi is the ’90s show SeaQuest. In it, a dolphin named Darwin is equipped with a translation system. For some reason, the dolphin’s synthesized speech sounded like a version of Gudetama. However, Darwin ended up being indispensable and saved the crew of the submarine many times. Without the animal translator, they would have been sunk (literally).

With both baby and animal translators, the main challenges are twofold:

  • Acquiring data
  • Tagging the data

Getting clean audio is difficult, and filtering the audio can remove artifacts that might actually be key to the communication. The other challenge is acquiring enough variety of data. One might get data from a zoo, but the way animals communicate there is different from how they communicate in non-captive animal cultures. For a baby translator, there could be additional ethical considerations and consents required to capture the data.
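To make the filtering concern concrete, here is a small, made-up example: a “cleanup” band-pass filter chosen with human hearing in mind quietly removes a high-frequency component of a call. The frequencies and filter band are invented for the illustration, and it assumes NumPy and SciPy.

```python
# Illustration of the filtering risk: a band-pass filter tuned for "clean"
# human-audible recordings silently discards an ultrasonic component.
import numpy as np
from scipy.signal import butter, sosfiltfilt

sr = 96000  # Hz, high enough to capture ultrasonic content
t = np.linspace(0, 1.0, sr, endpoint=False)

# A call with an audible component (2 kHz) and an ultrasonic one (30 kHz).
call = np.sin(2 * np.pi * 2000 * t) + 0.5 * np.sin(2 * np.pi * 30000 * t)

# "Cleanup" band-pass of 300 Hz - 8 kHz, a typical speech-style band.
sos = butter(4, [300, 8000], btype="bandpass", fs=sr, output="sos")
filtered = sosfiltfilt(sos, call)

def band_energy(x, low, high):
    """Fraction of signal energy between low and high (Hz), via an FFT."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1 / sr)
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].sum() / spectrum.sum()

print("ultrasonic energy before:", band_energy(call, 20000, 40000))
print("ultrasonic energy after: ", band_energy(filtered, 20000, 40000))
```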

Tagging the data is a whole new problem. What do we actually understand from the data set? What if the data can be interpreted differently based on geography? I’m getting dizzy thinking about it.
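One way to picture the tagging problem is as a schema for each recorded clip. The fields and example values below are hypothetical, but they show how much context, including geography and observed reactions, a single tag would need to carry.

```python
# Hypothetical tagging schema for one recorded call: the same sound may need
# different labels depending on where and in what context it was recorded.
from dataclasses import dataclass, field

@dataclass
class CallTag:
    clip_id: str
    species: str
    region: str                  # geography; call "dialects" can differ by area
    context: str                 # what was happening when the call was recorded
    observed_reaction: str       # how other animals responded
    tentative_meaning: str = "unknown"
    notes: list = field(default_factory=list)

tag = CallTag(
    clip_id="clip_0042",
    species="vervet monkey",
    region="East Africa",
    context="raptor overhead",
    observed_reaction="group dove into bushes",
    tentative_meaning="aerial predator alarm",
)
print(tag)
```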

Hopefully, new techniques, and maybe the plumbing to tag and capture what we know, can allow us to create our own Dr. Dolittle device in the not-too-distant future.


Leor Grebler

Independent daily thoughts on all things future, voice technologies and AI. More at http://linkedin.com/in/grebler