“Suppose we have an AI whose only goal is to make as many paper clips as possible. The AI will realize quickly that it would be much better if there were no humans because humans might decide to switch it off. Because if humans do so, there would be fewer paper clips. Also, human bodies contain a lot of atoms that could be made into paper clips.”
— Nick Bostrom, Swedish philosopher
AI and Machine Learning.
These two phrases are everywhere. Vague ideas heralding the ultimate potential of The Future. Honestly, I don’t really know what they mean. I’ve seen the movies, read the books, and conversed with Alexa. But I don’t really know. And yet, I want to keep up with the times and maintain my self-image as that most limited of specialists: the well-rounded man. Although I cannot vouch for specific uses of AI and Machine Learning, I do recognize some potential pitfalls I would like to share.
Pit #1: The road to hell is paved with good intentions.
Whether you are maximizing conversions as a digital marketer or trying to maximize the supply of paper clips, you need a balanced approach to your goal. There are always multiple factors interacting simultaneously in ways we cannot always anticipate. While the single-minded pursuit of a goal can be worthy of praise, we should ask ourselves whether the goal itself is worthwhile. Nick Bostrom’s paper-clip-maximizing AI (which I hereby christen Clippy) obviously needs to learn the value of human beings, and to consider who exactly will end up using all those paper clips.
Pit #2: Beware of the wisdom of the crowd.
Data from the Google Books Ngram Viewer below shows how often the phrase “machine learning” occurs in published English books from Alan Turing’s death in 1954 to 2008. The increase in frequency is instantly obvious, although its cause is not. The invention and widespread adoption of useful gadgets in the latter half of the 20th century, coupled with Moore’s Law, is a straightforward answer. Or is everyone talking about machine learning because everyone is talking about machine learning? As the old saying goes: it’s not what you don’t know that gets you into trouble, it’s what you know for sure that just isn’t so.
Pit #3: Clippy Tyranny
Stories of people ending up stranded in the middle of nowhere because of a faulty map app, or the sensational deaths of people in self-driving cars, teach us to always trust our own nose. If something smells like poop, it’s probably poop. The seductive siren song of a smarter machine that will always handle everything properly on our behalf could end up getting us shipwrecked. Whether we are trying to make more paper clips, get to a certain destination hassle-free, or just get more customers to buy and save, fate has a knack for finding our weak spots and cracking them wide open.
Clippy isn’t a tyrant; it doesn’t even know what tyranny is. Clippy just wants to make more paper clips, for better or for worse.