
Interview with Alan Kay


In June of this year, the Association for Computing Machinery (ACM) celebrated the centenary of Alan Turing's birth by holding a conference with presentations by more than 30 Turing Award winners. The conference was filled with unusual lectures and panels (videos are available here) about both Turing and present-day computing. During a break in the proceedings, I interviewed Alan Kay — a Turing Award recipient known for many innovations and his articulated belief that the best way to predict the future is to invent it.

[A side note: Re-creating Kay's answers to interview questions was particularly difficult. Rather than a linear explanation in response to an interview question, his answers were more a cavalcade of topics, tangents, and tales threaded together, sometimes quite loosely — always rich, and frequently punctuated by strong opinions. The text that follows attempts to impose somewhat more linearity on the content. — ALB]

Childhood As A Prodigy

Binstock: Let me start by asking you about a famous story: that you'd read more than 100 books by the time you entered first grade, and that this reading enabled you to realize your teachers were frequently lying to you.

Kay: Yes, that story came out in a commemorative essay I was asked to write.

Binstock: So you're sitting there in first grade, and you're realizing that teachers are lying to you. Was that transformative? Did you all of a sudden view the whole world as populated by people who were dishonest?

Kay: Unless you're completely, certifiably insane, or a special kind of narcissist, you regard yourself as normal. So I didn't really think that much of it. I was basically an introverted type, and I was already following my own nose, and it was too late. I was just stubborn when they made me go along.

Binstock: So you called them on the lying.

Kay: Yeah. But the thing that traumatized me occurred a couple of years later, when I found an old copy of Life magazine that had the Margaret Bourke-White photos from Buchenwald. This was in the 1940s — no TV, living on a farm. That's when I realized that adults were dangerous. Like, really dangerous. I forgot about those pictures for a few years, but I kept having nightmares without remembering where the images came from. Seven or eight years later, I started getting memories back in snatches, and I went back and found the magazine. That probably was the turning point that changed my entire attitude toward life. It was responsible for getting me interested in education. My interest in education is unglamorous. I don't have an enormous desire to help children, but I have an enormous desire to create better adults.

The European Invasion In Computer Science

Kay: You should talk to William Newman, since he's here. He was part of the British brain-drain. There was also Christopher Strachey, whom I consider one of the top 10 computer scientists of all time. The British appreciate him. They also had Peter Landin. They had memory management and they had timesharing before we did. Then there was a crisis in the early 1960s. And suddenly the young Brits were coming to the United States.

William was one of the guys who literally wrote the book on computer graphics: Principles of Interactive Computer Graphics with Robert Sproull. William came to Harvard and was Ivan Sutherland's graduate student — got his Ph.D. in 1965 or 1966. William followed Ivan out to Utah; then when Xerox PARC was set up, William came to PARC.

A similar thing happened, but I think for different reasons, in France. So one of the things we benefited from is that we got these incredibly well-prepared Brits and French guys reacting to the kind of devil-may-care attitude, and funding like nobody had ever seen before. These guys were huge contributors. For example, the first outline fonts were done by Patrick Baudelaire at PARC, who got his Ph.D. at Utah. The shading on 3D is named Gouraud shading after Henri Gouraud, who was also at Utah — also under Ivan, when Ivan was there.


Computing as Pop Culture

Binstock: You seem fastidious about always giving people credit for their work.

Kay: Well, I'm an old-fashioned guy. And I also happen to believe in history. The lack of interest, the disdain for history is what makes computing not-quite-a-field.

Binstock: You once referred to computing as pop culture.

Kay: It is. Complete pop culture. I'm not against pop culture. Developed music, for instance, needs a pop culture. There's a tendency to over-develop. Brahms and Dvorak needed gypsy music badly by the end of the 19th century. The big problem with our culture is that it's being dominated, because the electronic media we have is so much better suited for transmitting pop-culture content than it is for high-culture content. I consider jazz to be a developed part of high culture. Anything that's been worked on and developed and you [can] go to the next couple levels.

Binstock: One thing about jazz aficionados is that they take deep pleasure in knowing the history of jazz.

Kay: Yes! Classical music is like that, too. But pop culture holds a disdain for history. Pop culture is all about identity and feeling like you're participating. It has nothing to do with cooperation, the past or the future — it's living in the present. I think the same is true of most people who write code for money. They have no idea where [their culture came from] — and the Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs.



Comments:

ubm_techweb_disqus_sso_-26be4e708c84c817e84abd2af4fd4ba2
2013-02-20T18:46:59

Yes, a lot of technology is created hastily, and it winds up anemic and flimsy. But what about the opposite problem: how do you know when you're spending too much money hiring too many geniuses to build systems that are too robust and won't pay off for 30 years?

You know it when it leads to financial losses.

And, yes, competition in the business world can lead to a duplication of effort. But how do you know when there is too much cooperation between people? How do you know when your corporation is so large it can't act rationally because it can't assign accurate prices to internally exchanged goods and services?

Again, you know it when it leads to financial losses.

The "pop culture" nature of technology is simply due to economics, i.e. entrepeneurs managing their scarce resources as the world enjoins them to.


winslow17
2012-07-23T20:06:29

Re Kay's thoughts on education and teaching: Just search for him on YouTube and Google Video and you will find a wealth of great footage in which Kay discusses these topics, often at some length.
His wisdom - and I really do believe that's what it is - is both exciting and disturbing. He shines a bright light onto largely unexplored, unpursued directions and territory in computer science (see what he has to say about the Web in general, and how it should have been done), enabling one to see how things might be, had so much good work not been overlooked.
And his critique of education goes well beyond the simplistic idea that all we need is to put more technology into the classroom, a la Bill Gates and his gang. As Kay sees it, the computer represents a new medium for conveying and exploring ideas, which is what writing is, too. But most of the use of computers in schools he sees as just "air guitar": a lot of people going through the motions but not really learning how to think better, or at all. They are not making the kind of music that the computer, which he compares to a musical instrument, is meant to create. And that is largely because the teachers in the room are not themselves thinking properly. Putting a piano in a classroom without a musician to coax music from it will not help many people learn music. But with a musician in the room, one doesn't even need a piano for students to learn music. Likewise math and science, Kay argues: both get taught pretty much in all the wrong ways because teachers have been taught to do it that way.
Anyway, Kay's thought on all this is well worth checking into. True, he can come off as just a guy who invented a great deal of stuff back at Xerox and who is still quite miffed that Microsoft and others pretty much ignored and eclipsed his inventions with a machine whose design was already decades old ca. 1982. But having heard him out, I'd say he is not holding a grudge, just speaking the truth - bearing witness, that is, to the setback we've all been forced to suffer through. Personally, I find his thinking about as refreshing as any I have encountered in the fields of computing and education. (Well, perhaps Ivan Illich tops Kay in the latter.)


ubm_techweb_disqus_sso_-d11f01e82bf6d78cb33acfe49bafcfea
2012-07-18T00:25:11

Barton (and others at Burroughs like Paul King and Bill Lonergan) also inherited a legacy from Turing - and thus so does Alan Kay himself, as he gladly and humbly recognizes. So the computing landscape we have today owes a lot to Barton's thinking.

Turing would have designed computers quite differently from the now widely-used von Neumann architecture. A recent article in the July 2012 Communications of the ACM makes this clear, describing Turing's design for the Pilot ACE computer, which avoided the von Neumann bottleneck of a single register.

This thinking influenced the British computer industry through the English Electric Company, which developed the DEUCE (a copy of the ACE) and the KDF9 stack-based machines.

Barton designed the ultimate stack-based machine, the B5000 (now Unisys ClearPath MCP). Barton's machines had no programmer-accessible registers at all, avoiding the von Neumann bottleneck, so the hardware architecture could be optimised without changing the basic system architecture. Virtual machines like the JVM owe their existence to this idea (which makes Android strange, because it puts programmer-accessible registers back in - surely a retrograde, anti-multi-processing approach).
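To see what that means in practice, here is a minimal sketch using CPython, whose bytecode interpreter is itself a stack machine with no programmer-accessible registers (the standard dis module prints the bytecode; exact opcode names vary by Python version):

import dis

def add(a, b):
    return a + b

dis.dis(add)
# Typical output, annotated:
#   LOAD_FAST    a    - push a onto the evaluation stack
#   LOAD_FAST    b    - push b
#   BINARY_ADD        - pop both, push a + b (BINARY_OP in Python 3.11+)
#   RETURN_VALUE      - pop the result and return it
# Operands live on an anonymous evaluation stack, never in named registers.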

Virtual memory, too, was adopted by Barton (via Paul King, who attended the seminar in May 1960 at UCLA) and Burroughs ten years before IBM 'invented' it. This came from Manchester University's Atlas computer.

Thus, we could see Barton and maybe Turing as being the first 'think different' people. They saw and solved many of the problems the industry is still grappling with.


ubm_techweb_disqus_sso_-d11f01e82bf6d78cb33acfe49bafcfea
2012-07-17T07:44:34

I wholeheartedly agree with Alan that Bob Barton should get a Turing Award, or at least be much more recognised than he is in this industry. He did get the inaugural Eckert-Mauchly Award. He is the greatest computer architect of the 20th century, yet people think of Amdahl and Cray. Barton developed machines that you could program.

http://dl.acm.org/citation.cfm...

It is nice that you cite the Wikipedia article on Barton that I started.


ubm_techweb_disqus_sso_-8a95dfa51af96042aa1a8038e1f9cc5d
2012-07-16T18:55:15

Very interesting interview, but... business is about competition but athletics is about cooperation??? That is a mystifying comment -- maybe tuning in to the Olympics this summer will help Dr. Kay. And the notion that competition is any worse for business than it is for athletics is historical and economic poppycock.

Great insights into computing. Business and theology -- not so much.


ubm_techweb_disqus_sso_-b651efdb98a5d6bd2b3935d0c3f4a5e2
2012-07-13T20:51:14

Actually, Socrates *is* a saint of the Orthodox Church, at least in Romania and perhaps in Russia. Together with Plato. Both those names are fairly common in those countries.


ubm_techweb_disqus_sso_-156cc853200c48d18ad813c1a9a4eced
2012-07-12T16:06:44

You are correct that Catholicism has a different view on the absolute necessity of baptism (given the theological teaching of baptism of desire/blood, although not de fide teaching of the Church), but your own "knock" on creationists is a little off, given that they doubt the ability of Darwinism to produce life as we know it (preferring the more authoritative account of Genesis) - I didn't know creationism also doubted the use of drugs and modern medical procedures!

Personally, I always thought Christianity would want to accommodate Aristotle, rather than Socrates, given the former's contribution to classical scholasticism.

Regarding the interview, I would dearly like to hear more of Alan Kay's in-depth thoughts on teaching school children, if so many of the current programming languages are inadequate. I have been teaching a group of students Python over the last year - in other languages, many concepts would have been much harder to explain (especially in C++).
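For instance, a minimal sketch of the kind of first exercise that reads almost like pseudocode in Python (the function and the sample data here are hypothetical, purely to illustrate), but would drag a beginner through headers, templates, and type declarations in C++:

def largest(items):
    # Walk the sequence once, remembering the biggest element seen so far.
    result = items[0]
    for item in items[1:]:
        if item > result:
            result = item
    return result

print(largest([3, 1, 4, 1, 5, 9, 2, 6]))  # 9
print(largest(["pear", "apple", "fig"]))  # pear - the same code handles strings

One function, no type declarations, and the same lesson covers sequences, comparison, and iteration at once.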


ubm_techweb_disqus_sso_-f3076f5537d6dbeacba7a6dbeba1e936
2012-07-12T07:49:39

Ah, very good, Andrew, you had me worried there!

Regarding why I attributed the false claim to Kay himself, it's because the interview transcript literally says so: "we did objects first". But maybe Kay didn't mean to imply that his Norwegian colleagues weren't first. Still, unless I'd known better I certainly would have come away thinking that he'd invented OO. Credit where credit is due ;-)


AndrewBinstock
2012-07-12T04:03:02

Thanks. I knew that but put it in the deck incorrectly! I've corrected the deck to read "pioneer in object orientation." Note, however, that Kay does not claim that he did objects first. I'm not sure how you came to attribute that to him.


ubm_techweb_disqus_sso_-6aff1359a3f338306ad66954d717979e
2012-07-11T09:33:39

Despite his philosophical digression, which I quite enjoyed, I really think the guy should be more realistic about technology. It's the thing that gave him his fame, but of course, as time passed, he was chewed up and spat out. Because technology has eternal youth, it will quickly pick up on a wild idea but also dump it just as quickly for the next one. Live by the sword, die by the sword. My 2 cents.


ubm_techweb_disqus_sso_-6df66cfa4767d541772f9b6756f91efe
2012-07-11T09:24:16

Hi Andrew - any chance you could post the original audio of the interview?


ubm_techweb_disqus_sso_-d661dfe7e1347140423ee13a8fddc023
2012-07-11T09:14:18

From the article:

Binstock: You really radicalized the idea of objects by making everything in the system an object.

Kay: No, I didn't. I mean, I made up the term "objects." Since we did objects first, there weren't any objects to radicalize.

Problem with comprehension, or are you a web developer feeling a little butthurt?


ubm_techweb_disqus_sso_-f3076f5537d6dbeacba7a6dbeba1e936
2012-07-11T02:12:25

"The inventor of object-orientation"? And "... we did objects first"? Uh, no. Sadly ironic, in the context of an article which says "You seem fastidious about always giving people credit for their work", to be taking credit for inventing OO when in fact it was invented by someone else.

OO was invented by Dahl and Nygaard. Kay, under the influence of Simula 67, may have invented the term "object-oriented programming", but he didn't invent OO.


AndrewBinstock
2012-07-11T01:19:23

Kay did not mention Romney. In fact, at no point that I recall did he get into discussing politics as such.


metanews
2012-07-11T00:34:02

It's an interesting remark in the context of Kay's comment on the inability of modern developers to take advantage of, or even acknowledge, the past. Socrates has a special place in the origin of the Christian religious philosophy, as is rather clearly revealed by even a casual reading of St. Augustine's "City of God".

The notion of the baptism of Socrates is really something that points towards this difficulty of absorbing all this knowledge and wisdom that is "pagan". The struggle continues today, though instead of pagan knowledge, it is scientific knowledge that Christianity struggles with. Both cases (which are, arguably, the same case) remind me of that old joke repeated by Woody Allen about the guy who tells a psychiatrist his brother is so crazy he thinks he's a chicken. The psychiatrist suggests treatment, but the guy replies they can't afford that -- they need the eggs. Christianity has always been in that exact position. "Creationists" will gladly accept modern medical treatment for, say, breast cancer, even though the development of that treatment is based on scientific principles they deny.

Kay's point in the interview, if one reads it closely rather than having a knee-jerk reaction to a single word, is that Bob Barton did not receive the award he so richly deserved (deserves) because he tore down the work of those who did receive it. Yet that work is intensely valuable. So, as with Socrates and the Christian church (and Kay is, indirectly, referring to the mother Church, Catholicism, which has a vastly different attitude to baptism etc. than that suggested by Mr. Deadrock), they continue to eat the eggs while deploring the manner of their production.

Interestingly, Kay's remark about Socrates actually has a political element. The Mormons have an odd tendency to baptise dead people by proxy. Not only has Socrates received the treatment, but also Anne Frank and other Jews -- along with Adolf Hitler. According to the Mormons, the deceased don't have to accept the baptism, but if they do, they will be admitted to the lowest of the three possible heavens.

Indirectly (possibly edited out?) Kay is commenting on the wonder of having a presidential candidate whose religion goes about baptising dead people. Mitt Romney has been asked by a number of Jewish leaders to please put a halt to the practice, as it is (obviously) very offensive.

And as just a final, nit-picking note, nowhere does Kay say that Socrates deserves to go to heaven. He says that "if anyone" does, it is Socrates.


metanews
2012-07-11T00:00:39

More, please. Really, please. And, deeply, thank you.

I don't know if I'm speaking for others, but I realised some time ago I would never come even close to Kay, Ingalls, and especially Engelbart. Anything I've done that was any good I did by trying to understand what they said. I have quotes by Engelbart pasted over my desk, along with one by Kay where he speaks of the Japanese notion of "ma", the interstitiality between things, such as objects, and how this was the essence of what he wanted from O-O. As Ingalls has said, in many ways the aspects of messaging were even more important than the containment, the encapsulation, of the object. The only quote I look at more often is Engelbart's, about how sometimes problems are best solved by finding a greater level of abstraction.
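A minimal sketch of that emphasis on messaging, in Python (the classes are hypothetical, purely to make the point): the sender commits only to a message, never to the receiver's type or internals.

class Duck:
    def speak(self):
        return "quack"

class Robot:
    def speak(self):
        return "beep"

def converse(receiver):
    # No type checks and no shared base class: just send the message
    # and let the receiver interpret it in its own way.
    return receiver.speak()

print(converse(Duck()))   # quack
print(converse(Robot()))  # beep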

It's so shocking that the most advanced languages we have today were invented over 30 years ago -- Smalltalk and Lisp. C++ was like a defensive move to make a place for C programmers in an object future -- not that C++ doesn't have its place, but that place is far less general than its actual uses. (Note Ward Cunningham's remarks on patterns in DDJ as a means to help people get over C++ eventually.) Java is some bizarre construct that actually is not watered-down O-O or improved imperative, but some new thing of its own (just look at anything it does on the class-side).

Scala makes more sense, but there is something about the half-stumble in its syntax that indicates its attention is focused elsewhere than the quest inbuilt in Smalltalk.

The only other language of promise is Self, and that is unfunded and locked in non-development, though it has been so incredibly influential. There are only a few bright lights out there, like David Chisnall, who is working on adding his "pragmatic Smalltalk" directly into Objective-C in the GNUstep/Étoilé project.

Or try choosing a web framework. Ruby on Rails is -- I won't go there. Lift. Play. I end up back with Seaside, though it needs to be completely re-written to take full account of AJAX and HTML5. Weblocks, which is Lisp-based, does this, and is closer to the idea of following the Naked Objects "pattern", which makes more sense on the Web than MVC (represent objects directly as textual structures in HTML, and develop flexibility of presentation through CSS). And yet something is somehow missing from it that Seaside possesses.
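A minimal sketch of that Naked Objects idea, in Python (the Book class and to_html helper are hypothetical): the object is rendered directly as semantic HTML, and CSS alone decides how it looks.

class Book:
    def __init__(self, title, author):
        self.title = title
        self.author = author

def to_html(obj):
    # Render each attribute as a classed span inside a classed div,
    # so a stylesheet can restyle the object without touching the code.
    kind = type(obj).__name__.lower()
    spans = "".join(
        '<span class="%s-%s">%s</span>' % (kind, name, value)
        for name, value in vars(obj).items()
    )
    return '<div class="%s">%s</div>' % (kind, spans)

print(to_html(Book("Smalltalk-80", "Goldberg and Robson")))
# -> <div class="book"><span class="book-title">Smalltalk-80</span><span class="book-author">Goldberg and Robson</span></div>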

LivelyKernel by Ingalls, so delightful, the freedom of it. But encumbered by its infrastructure. Where is the browser that would make this a deep part of its code? Instead, even the "open source" browsers from Mozilla worry only about market share against IE, as though that were supposed to motivate those community efforts, ever. Microsoft's worst effect has not been its mediocre, market-dominating developments, but the extent to which it remains a distraction.

Kay looked at huge computer machinery and knew one day the same and more would fit in a shoebox, and began designing for the shoebox. Today we have the same systems fitting in a sardine can, but we are designing to make them appear to work with the same capabilities as the shoebox-sized systems. Five years from now an iPhone will be as powerful as a present-day Macintosh. But there is no development in that; it's all just miniaturisation.

It must be, has to be, societal, cultural, this resistance to taking the next steps. And that should be expected. But honestly, I find I myself cannot get any purchase on this problem.

So reading what Kay has to say makes me feel sorrowful, and a little disappointed in myself. But it's a good sorrow, and a true disappointment.


AndrewBinstock
2012-07-10T23:42:01

Oh, come now. It's a delightful piece of conversation. I don't read it as a slam at your beliefs or at Christian religions. Now, if you were one of the first developers of the Web or one of the first developers of the browser, then I could see why you'd take offense. But I didn't exclude those remarks either, b/c I think they're relevant to Kay's general view of things.


ubm_techweb_disqus_sso_-5c471a50164a7b307706e8eba25b8488
2012-07-10T22:58:29

I didn't notice a "knock" on Christianity, but I did notice the interviewee has different beliefs from deadrocks.


ubm_techweb_disqus_sso_-de42446e6f004f2e1842dac0ef039664
2012-07-10T22:55:06

Why not? I could argue the same thing about religion being preached in other technical and non-technical aspects of our lives. I usually ignore it, as I am not a believer; why shouldn't you?

Don't get me wrong, I don't want to get into an argument about religion.

BTW, thanks for the clarification on the issue regarding baptism and heaven - I for one am completely ignorant on the matter.


ubm_techweb_disqus_sso_-657fc4a5fdc90f0ecc04ceaacd48b054
2012-07-10T21:16:11

Hmmm, why the knock on Christianity? Was that a necessary part of the conversation? It seems that it could have been left out to avoid offense and/or controversy. But, since you opened the door: baptism isn't required; it's an outward sign of an inward transformation. The thief on the cross was not baptised, yet he was promised heaven that day - for his belief in Christ as Savior of the world, not his behavior. Oh, and no one deserves heaven...

