The Paperclip Factory

Leor Grebler
2 min read · May 17, 2023


Generated by author using Midjourney

I listened today to Eliezer Yudkowsky on Russ Roberts’ EconTalk. You can check out the episode here. Yudkowsky believes it’s too late for humanity and that AI is going to kill us all at some point.

There are many who share this view. Yudkowsky’s view is probably more extreme than most in that he believes the time to have made an impact and averted our inevitable demise was in the 1930s. Of course, there were other events of the 1930s we would probably change had we the benefit of hindsight.

While the potential scenarios in which an AI could kill humanity are terrifying, I still didn’t understand the motive. An often-cited parallel is Homo sapiens killing off the Neanderthals, sometimes outright and sometimes by crowding them out, because they (we) were competing for resources. There were only so many woolly mammoths we could slay and eat.

However, what resources would we compete with an AI for? Energy? Reality? Existence?

In the grey goo scenario, AI runs amok and nano-factories replicate themselves until they consume all matter on Earth. However, this really illustrates bad code, which doesn’t imply intelligence at all. Other scenarios, such as maximizing paperclip production by killing all the humans, also imply code written without safeguards, something an intelligence greater than our own should be able to account for.

We still need to safeguard the systems we use from bad actors leveraging AI to break them and cause damage. It may be multiple AIs competing against one another rather than a single all-knowing AI trying to crush us.

The nice thing about being a doomsday prophet is that you win either way. You can tell the final remaining humans that you were right, and if not, you can still write about the topic.

I could see a scenario where an AI needs to consume matter to use for computing, gathering more information and insights. Matter could form the basis for a certain type of quantum computer.

However, that would mean the end of the universe and not just humanity.


Written by Leor Grebler

Independent daily thoughts on all things future, voice technologies and AI. More at http://linkedin.com/in/grebler
