Today, we mark the death of another giant of programming, "Uncle" John McCarthy. Like the subject of this month's earlier commemoration, Dennis Ritchie, McCarthy created an elegant programming language whose syntax persists today. He also foresaw the need for several technologies that have subsequently been widely adopted in software development.
His most notable contribution by most measures is Lisp, the language he designed in the late 1950s and whose syntax still looks revolutionary today, despite our long familiarity with it. While Lisp was originally designed for artificial intelligence (a term that McCarthy coined), the functional programming model it embraced is at last gaining traction in mainstream programming and pushing into the general business programming world. In addition, McCarthy's elegant syntactical choices in Lisp's S-expressions have influenced many downstream products, whether Richard Stallman's Emacs editor, which adopted a Lisp dialect as its extension language, or Rich Hickey's more recent language, Clojure. The various languages derived from Lisp, and Lisp's own variants, testify to the enduring attraction of the model McCarthy created at a time when most languages above the assembly level hewed to a strict do-this, do-that imperative format.
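The uniformity of that syntax is easy to demonstrate: in an S-expression, code and data share a single shape, the nested list. Here is a minimal sketch in Python of a reader and evaluator for such expressions (the function names and the tiny operator environment are illustrative, not drawn from any real Lisp implementation):

```python
# A tiny illustration (not McCarthy's original design) of S-expression
# uniformity: programs and data share one shape, the nested list.
import operator

def tokenize(src):
    # "(+ 1 (* 2 3))" -> ["(", "+", "1", "(", "*", "2", "3", ")", ")"]
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    # Consume tokens, building nested Python lists -- the S-expression tree.
    tok = tokens.pop(0)
    if tok == "(":
        lst = []
        while tokens[0] != ")":
            lst.append(parse(tokens))
        tokens.pop(0)  # discard the closing ")"
        return lst
    return int(tok) if tok.lstrip("-").isdigit() else tok

ENV = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr):
    # A list is a function application: the first element names the operator.
    if isinstance(expr, list):
        op, args = ENV[expr[0]], [evaluate(a) for a in expr[1:]]
        result = args[0]
        for arg in args[1:]:
            result = op(result, arg)
        return result
    return expr  # atoms (numbers) evaluate to themselves

print(evaluate(parse(tokenize("(+ 1 (* 2 3))"))))  # prints 7
```

The same nested-list representation the parser produces could just as easily be handed back to the program as data, which is the property that made Lisp macros possible.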
His work in AI explored many themes that would later become important fields in their own right: robotics, logic programming, and modeling, among many others.
Even in those early days, McCarthy foresaw the benefits of technologies whose need lay far beyond the horizon of the times. One of those, garbage collection (GC), is now de rigueur in any newly proposed language. The benefits of GC are today so obvious that one wonders why it took so long to be widely embraced, especially when one considers the plague of memory leaks that ensued without it and the correspondingly long hours spent in debuggers tracking down misbehaving RAM gobblers.
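To see concretely what a collector buys, consider the reference cycle, the classic case that defeats naive manual bookkeeping: each object keeps the other alive, so neither is ever freed by hand. In this sketch, CPython's cycle collector stands in for the mark-and-sweep collector McCarthy devised for Lisp (the class and field names are illustrative):

```python
# Illustrative sketch: a reference cycle that only a collector, not manual
# bookkeeping, can reclaim. CPython's gc module stands in here for the
# mark-and-sweep garbage collector McCarthy introduced with Lisp.
import gc

class Node:
    def __init__(self):
        self.ref = None  # will point at the other node

a, b = Node(), Node()
a.ref, b.ref = b, a   # a cycle: each node keeps the other "in use"
del a, b              # the cycle is now unreachable from the program

collected = gc.collect()  # the collector finds and frees the cycle
print(collected > 0)      # prints True: unreachable objects were reclaimed
```

Without a collector, that unreachable pair would simply be leaked, which is precisely the scenario described above that once consumed so many debugging hours.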
McCarthy also foresaw the concept of "utility computing" as a widespread way of making CPU cycles available to customers. In his time, this came to pass as "time sharing," implemented as time slicing, which was the primary way most scientists, researchers, students, and businesses could access computers. They used teletypes (TTYs) as terminals hooked up to mainframes over telephone lines and were accorded a time slice on a rotating basis. Today, the model of on-demand access to computing on remote devices is gaining widespread attention as cloud computing (and earlier, in a somewhat different form, as grid computing). The key difference, of course, is that McCarthy first described this model in 1961, an era that predated the Web, PCs, minicomputers, and certainly widespread remote access to processing power.
His was, in some senses, a vision that was more science fiction than potential reality. And in the same way that science fiction astounds us when it correctly adumbrates an eventual invention, McCarthy's work and vision surprise us today for their insight and their longevity. The analogy would have pleased McCarthy, who wrote science-fiction stories and was a frequent commentator on world events. (Some of McCarthy's writings on these topics can be found on his web page at Stanford, where he taught for many years after co-founding the AI project at MIT.)
All visionaries see things that others cannot, but John McCarthy's unique contribution was seeing so very far into the future.