by Andrew Binstock, Dr. Dobb’s Journal
In June of this year, the Association for Computing Machinery (ACM) celebrated the centenary of Alan Turing's birth by holding a conference with presentations by more than 30 Turing Award winners. The conference was filled with unusual lectures and panels (videos are available here), both about Turing and about present-day computing. During a break in the proceedings, I interviewed Alan Kay — a Turing Award recipient known for many innovations and for his articulated belief that the best way to predict the future is to invent it.
[A side note: Re-creating Kay's answers to interview questions was particularly difficult. Rather than linear explanations in response to interview questions, his answers were more a cavalcade of topics, tangents, and tales threaded together, sometimes quite loosely — always rich, and frequently punctuated by strong opinions. The text that follows attempts to bring somewhat more linearity to the content. — ALB]
Binstock: Let me start by asking you about a famous story. It holds that you had read more than 100 books by the time you entered first grade, and that this reading enabled you to realize that your teachers were frequently lying to you.
Kay: Yes, that story came out in a commemorative essay I was asked to write.
Binstock: So you’re sitting there in first grade, and you’re realizing that teachers are lying to you. Was that transformative? Did you all of a sudden view the whole world as populated by people who were dishonest?
Kay: Unless you’re completely, certifiably insane, or a special kind of narcissist, you regard yourself as normal. So I didn’t really think that much of it. I was basically an introverted type, and I was already following my own nose, and it was too late. I was just stubborn when they made me go along.
Binstock: So you called them on the lying. …
The European Invasion in Computer Science
Kay: You should talk to William Newman, since he’s here. He was part of the British brain-drain. There was also Christopher Strachey, whom I consider one of the top 10 computer scientists of all time. The British appreciate him. They also had Peter Landin. They had memory management and they had timesharing before we did. Then there was a crisis in the early 1960s. And suddenly the young Brits were coming to the United States.
Kay: The Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs.
The most disastrous thing about programming — to pick one of the 10 most disastrous things about programming — there’s a very popular movement based on pattern languages. When Christopher Alexander first did that in architecture, he was looking at 2,000 years of ways that humans have made themselves comfortable. So there was actually something to it, because he was dealing with a genome that hasn’t changed that much. I think he got a few hundred valuable patterns out of it. But the bug in trying to do that in computing is the assumption that we know anything at all about programming. So extracting patterns from today’s programming practices ennobles them in a way they don’t deserve. It actually gives them more cachet.
The best teacher I had in graduate school spent the whole semester destroying any beliefs we had about computing. He was a real iconoclast. He happened to be a genius, so we took it. At the end of the course, we were free because we didn’t believe in anything. We had to learn everything, but then he destroyed it. He wanted us to understand what had been done, but he didn’t want us to believe in it.
Binstock: Who was that?
Kay: That was Bob Barton, who was the designer of the Burroughs B5000. He's at the top of my list of people who should have received a Turing Award but didn't. The award is given by the ACM, so that omission is ridiculous, but it represents the academic bias and the software bias that the ACM has developed. It wasn't always that way. Barton was probably the number-one living person who deserved it. He died last year, so it's not going to happen unless they go to posthumous awards.
DCL: This interview is well worth a read. I absolutely agree with Alan Kay about Christopher Strachey, whom I had the good fortune to know from 1965 to 1968, when he was running the Programming Research Group at Oxford. Strachey was ahead of John McCarthy on ideas such as time-shared computers. And of course his ideas on computer languages are legendary.