Escaping Skinner's Box: AI and the New Era of Techno-Superstition
Philosophical Disquisitions - A podcast by John Danaher
[The following is the text of a talk I delivered at the World Summit AI on the 10th October 2019. The talk is essentially a nugget taken from my new book Automation and Utopia. It's not an excerpt per se, but does look at one of the key arguments I make in the book. You can listen to the talk using the plugin above or download it here.]

The science fiction author Arthur C. Clarke once formulated three “laws” for thinking about the future. The third law states that “any sufficiently advanced technology is indistinguishable from magic”. The idea, I take it, is that if someone from the Paleolithic was transported to the modern world, they would be amazed by what we have achieved. Supercomputers in our pockets; machines to fly us from one side of the planet to another in less than a day; vaccines and antibiotics to cure diseases that used to kill most people in childhood. To them, these would be truly magical times.

It’s ironic, then, that many people alive today don’t see it that way. They see a world of materialism and reductionism. They think we have too much knowledge and control — that through technology and science we have made the world a less magical place. Well, I am here to reassure these people. One of the things AI will do is re-enchant the world and kickstart a new era of techno-superstition. If not for everyone, then at least for most people who have to work with AI on a daily basis. The catch, however, is that this is not necessarily a good thing. In fact, it is something we should worry about.

Let me explain by way of an analogy. In the late 1940s, the behaviorist psychologist B.F. Skinner — famous for his experiments on animal learning — got a bunch of pigeons and put them into separate boxes. Now, if you know anything about Skinner you’ll know he had a penchant for this kind of thing. He seems to have spent his adult life torturing pigeons in boxes. Each box had a window through which a food reward would be presented to the bird. Inside the box were different switches that the pigeons could press with their beaks. Ordinarily, Skinner would set up experiments like this in such a way that pressing a particular sequence of switches would trigger the release of the food. But for this particular experiment he decided to do something different. He decided to present the food at random intervals, completely unrelated to the pressing of the switches. He wanted to see what the pigeons would do as a result.

The findings were remarkable. Instead of sitting idly by and waiting patiently for their food to arrive, the pigeons took matters into their own hands. They flapped their wings repeatedly, they danced around in circles, they hopped on one foot, convinced that their actions had something to do with the presentation of the food reward. Skinner and his colleagues likened what the pigeons were doing to the ‘rain dances’ performed by various tribes around the world: they were engaging in superstitious behaviours to control an unpredictable and chaotic environment.

It’s important that we think about this situation from the pigeons’ perspective. Inside the Skinner box, they find themselves in an unfamiliar world that is deeply opaque to them. Their usual foraging tactics and strategies don’t work. Things happen to them, food gets presented, but they don’t really understand why. They cannot cope with the uncertainty; their brains rush to fill the gap and create the illusion of control.
Now what I want to argue here is that modern workers, and indeed all of us, in an environment suffused with AI, can end up sharing the predicament of Skinner’s pigeons. We can end up working inside boxes, fed information and stimuli by artificial intelligence. And inside these boxes, stuff can happen to us, work can get done, but we are not quite sure if or how our actions make a difference. We end up resorting to odd superstitions and rituals to make sense of it all and give ourselves the illusion of control, and one of the things I worry about, in