The US Presidential Election sparked a conversation about the polarization of Americans in two-party politics. The term echo chamber came up quite a bit, along with the idea that we become more entrenched in our opinions as we're exposed to others with similar opinions.

The issue is compounded by technology and how easily we can connect with the like-minded. Before, we had to tolerate the differing opinions of those in our immediate proximity; now we can find an almost endless supply of people who agree with us. This tunnelling of opinion can happen across many facets of our personality, and our technology amplifies it.

I’m a ketogenic ovo-lacto-pollo-pesco-vegetarian who’s also a self-experimenter, and I hold strong views about certain technologies and the Singularity. I’ve noticed that, more and more, I’m patting myself on the back for being exposed to opinions that validate my choices.

The reason?

Medium recommends articles based on what I’ve read. Google Assistant surfaces news around the topics I’ve searched, and my Facebook feed does the same. YouTube recommends videos based on my viewing habits, as does Google Play for podcasts. Even our music is recommended based on the songs we already like. I just keep going further down the rabbit hole.

The result is that we’re exposed to fewer ideas and less news outside our areas of interest. Our worldview becomes smaller, and we wonder whether there’s anything beyond it. We then become easier to control, because our feed informs us instead of the person sitting across the table.

Six years ago, I thought the distraction of devices was the biggest unintended consequence affecting us, and that we needed a more natural way to interact with the world around us to stay connected. Today, it’s that AI is so good at exposing us to what we want to hear that we become more insular and harder to communicate with. This is the biggest challenge for the end of this decade. We’ve unintentionally let AI influence us.

Since the AI genie is already out of the bottle, we’re going to have to influence it to influence us in ways that make us better to each other. Could it reserve 20% of our feed for opinions we find distasteful or counter to our views? What’s the right mix of opinion? If it can predict that a given article would have an outsized influence on us, could it bury it in our feed?

Maybe we need our own personal AI-based filters to review the media other AIs present to us and add randomness to reduce their effect.
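
To make that concrete, here’s a minimal sketch in Python of what such a personal re-ranking filter might look like, assuming each item arrives tagged with a hypothetical aligns_with_user flag from whatever model already scores our preferences. The names and parameters (FeedItem, rerank_feed, counter_quota) are illustrative, not any platform’s actual API.

```python
import random
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class FeedItem:
    title: str
    aligns_with_user: bool  # hypothetical flag from a preference model

def rerank_feed(items: List[FeedItem], counter_quota: float = 0.2,
                seed: Optional[int] = None) -> List[FeedItem]:
    """Re-rank a recommended feed so roughly `counter_quota` of the slots
    go to items that don't align with the user's views, shuffling within
    each group so the ordering isn't fully predictable."""
    rng = random.Random(seed)
    aligned = [i for i in items if i.aligns_with_user]
    counter = [i for i in items if not i.aligns_with_user]
    rng.shuffle(aligned)
    rng.shuffle(counter)

    reranked: List[FeedItem] = []
    counter_shown = 0
    for slot in range(len(items)):
        # Pick a counter-view item whenever we're below the target quota,
        # or when we've run out of aligned items.
        below_quota = counter_shown < counter_quota * (slot + 1)
        if counter and (below_quota or not aligned):
            reranked.append(counter.pop())
            counter_shown += 1
        else:
            reranked.append(aligned.pop())
    return reranked
```

The interesting part isn’t the re-ranking, it’s where aligns_with_user would come from: that scoring is what the recommendation engines already do. A filter like this would just repurpose it to widen the mix instead of narrowing it.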

Independent daily thoughts on all things future, voice technologies and AI. More at http://linkedin.com/in/grebler
