

Gary Kildall and Collegial Entrepreneurship

In the early days of the personal-computer revolution, the atmosphere at those shoestring startup companies with names like "Golemics" and "Loving Grace Cybernetics" was often more academic than businesslike. This collegiate ambiance touched everything, from the ways in which decisions were made and respect allocated, right down to sophomoric pranks and styles of dress.

There's a fairly obvious reason for this, or at least for some of it: Microcomputers were a new field, ripe for rapid advances, and that's a situation that fits neatly into a collegial atmosphere in which information is openly shared. When discoveries are freely shared, it's easier to build quickly on those discoveries; conversely, when progress is rapid, there is less reason to hold onto yesterday's discoveries. This natural synergy between rapid progress and information sharing is one key factor in the spectacular growth in the use and acceptance of computers over the past 20 years. It's one of the reasons that the personal-computer revolution really has been a revolution.

In time, companies like Apple and Microsoft would emphasize this synergy, explicitly calling their corporate headquarters "campuses." Even today, computer hardware and software companies often have a lot of the look and feel of grad school. But this predilection for a collegial atmosphere predates Apple and Microsoft. And while it didn't start there either, it was nowhere more evident in the early days than at one of the first personal-computer software companies—Digital Research. Digital Research could hardly have been anything but collegial: The company that gave personal computers their first operating system was the brainchild of a perpetual academic and born teacher. His name was Gary Kildall.


Gary Kildall seemed fated to be a teacher. His uncle would later claim that it was what Gary had always wanted. Teaching certainly was in his blood: The Kildall family actually owned and operated a small college. More precisely, it was a school for the teaching of navigation, based in Seattle, Washington. The Kildall College of Nautical Knowledge, the family called it; it was founded in 1924 by Gary's grandfather. Many Kildalls taught or worked at the school, including Gary himself, who taught there for a while after graduating from high school.

But he had decided that year that he was going to be a math teacher, so he enrolled at the University of Washington. Newly married to high-school sweetheart Dorothy McEwen, he buckled down and applied himself to his studies, trying to put a childhood of mediocre grades, fast cars, and pranks behind him.

Somewhere along the way to a math degree he got hooked on computers. On finishing his degree, Gary went on to graduate school in computer science. He was still headed for a career in teaching, only now it would be teaching computer science at one of the few colleges that had programs back then. But there was a hitch. He had joined the Naval Reserve, and it was the '60s, with the Vietnam war in full flower. The Navy gave him a choice: Go to Vietnam or take a faculty position at the Naval Postgraduate School in Monterey, California.

Gary thought about it for a microsecond and chose Monterey. Even when the Navy told him what to do, the answer was the same: Teach.


It was in Monterey that Gary created CP/M, the program that brought him success and that became the unquestioned standard operating system throughout the microcomputer industry. CP/M was a good product and deserved, for many technical reasons, to be the standard. But getting there first always helps, too. And CP/M actually appeared a year before the first commercial microcomputer arrived on the scene.

Unlike operating systems before and since, CP/M was not the result of years of research by a team of software engineers. It was, like most software of its time, the invention of one individual. That individual was Gary Kildall, and if chance put Kildall in just the right place at just the right time, you would have to say, in retrospect, that chance chose well. As it did with Bill Gates, chance spoke to Gary Kildall through a note on a college bulletin board, college bulletin boards apparently being the Schwab's Drug Store of personal-computer fame.

The note talked about a $25 "microcomputer," a pretty good deal even at 1972 prices. It was actually describing not a computer but the first microprocessor, the 4004 that Ted Hoff had designed at Intel. Presumably, this note was an advertisement torn from a recent issue of Electronics News. Intel had hired Regis McKenna to write the ad at Hoff's urging. Hoff was convinced that techies would see the virtue of this new device, this general-purpose processor, and urged that it be advertised, extravagantly but not altogether inaccurately, as a "microcomputer." This would make it absolutely clear that it was not just another limited-purpose device, but something fundamentally different. Hoff was sure that engineers and programmers would get it.

Kildall got it, literally, sending off his $25 for one of the first Intel 4004 chips.

It was 1972. Kildall was busy teaching computer science at the United States Naval Postgraduate School in Monterey. He and Dorothy (and son Scotty) had moved into a house in neighboring Pacific Grove. The Seattle natives loved this scenic coastal town, with its laid-back, fog-draped ambiance. The place suited the easy-going professor. Whether in class or among family and friends, the lanky, shaggy-maned Kildall spoke with the same soft voice, the same disarming wit. Although he was teaching at a naval installation, he wouldn't have been out of place on any college campus in his customary sport shirts and jeans. When he had a point to make he would often cast about for chalk or a pencil; he was an incurable diagram drawer.

Gary was happy in his marriage, happy to be living by the ocean, happy not to have gone to Vietnam, and most definitely happy in his job. He loved teaching, and the work left him time to program. Nothing in his life was preparing him to run a business, to handle a spectacularly successful software company supplying the essential software for hundreds of different computer models in an industry running wild. Everything argued for his staying right where he was forever, teaching and programming. At first, the 4004 seemed to fit in with that scenario.

Gary started writing programs for the 4004. His father, up at that little navigation school in Seattle, had always wanted a machine that would compute navigation triangles. Gary made that his project, writing some arithmetic programs to run on the 4004, thinking idly that he might come up with something that his father could use. He was really just fooling around with the device, trying to see how far he could push it, and with what speed and accuracy.

Not all that far, he soon learned. The 4 in 4004 meant that the device dealt with data in 4-bit chunks—less than a character. Getting anything useful done with it was a pain, and performance was pitiful. Although he was frustrated by the limitations of the 4004, he was fascinated by what it promised. Early in 1972 he visited Intel and was surprised to see how small the microcomputer division (dedicated to the 4004 and the new 8008) was: The company had set aside only a few small rooms for the entire division. Gary and the Intel microcomputer people got along well, though, and he began working there as a consultant on his one free day a week. He spent months programming the 4004 in this day-a-week mode until he "nearly went crazy with it." He realized—and it was a radical idea for the time—that he would never go back to "big" computers again. Which is not to say that he stopped using "big" computers. With both the 4004 and the significantly more powerful 8008 that he soon moved on to, he was doing his development work on a minicomputer, much as Bill Gates and Paul Allen did later in writing software for the breakthrough MITS Altair computer. Like Paul Allen, he wrote programs to simulate the microprocessor on the "big" minicomputer, and used this simulated microprocessor, with its simulated instruction set, to test the programs he wrote to run on the real microprocessor.

But unlike Gates and Allen, Gary had the benefit of a development system, essentially a full microcomputer spun out around the microprocessor, so he could try out his work on the real thing as he went along. In a few months he had created a language implementation called "PL/M," a version of the mainframe language PL/I that was significantly more sophisticated than Basic.

The Lab

As partial payment for his work, Gary received a development system of his own, which he immediately set up in the back of his classroom. This allowed him to combine his new obsession with microcomputers and his love of teaching. The system in the back of the classroom became the Naval Postgraduate School's first — if not the world's first — academic microcomputer lab.

And academic it was. This was not just Gary's toy; he used it to teach students about the technology, and encouraged them to explore it. His curious students took him up on it, spending hours after class tinkering with the machine. When Intel upgraded this Intellec-8 from an 8008 to its new 8080 processor and gave Gary a display monitor and a high-speed paper tape reader, he and his students were working with a system comparable to—favorably comparable to—the early Altair computer before the Altair was even conceived.

Gary realized, though, that he was missing an essential ingredient of a really useful computer system — an efficient storage medium. In the early '70s, paper tape was one of the standard storage media, along with the infamous punched card. Neither was very efficient, and the issue was particularly critical on microcomputer systems because the relatively slow microprocessors couldn't offset the inherent slowness of the mechanical process of punching holes in pieces of paper.

IBM had recently introduced a new storage medium that was much faster and more efficient. It was based on the existing technology of recording data as patterns of magnetization on large, rapidly spinning disks, a medium that had everything going for it except price. But IBM engineers figured out how to scale down this technology to something smaller and more affordable, creating the floppy-disk drive.

One $5 floppy disk held as much data as a 200-foot spool of paper tape, and a floppy-disk drive could be had for around $500. The combination of the microprocessor and the floppy-disk drive meant that, in Kildall's words, "It was no longer necessary to share computer resources." In other words, the elements of a personal computer were at hand. Well, most of the elements. Gary soon found that some important components were still annoyingly missing.

By this time, an industry was developing to create these floppy-disk drives in volume, and Shugart was the pioneer of this industry. Once again, Gary traded some programming for some hardware, getting himself (and the microcomputer lab) a Shugart disk drive. But for the disk drive to work with the Intellec-8, another piece of hardware was needed, a controller board that fit in the Intellec-8 and handled the complicated communication between the computer and disk drive. This piece of hardware, unfortunately, did not exist.

Gary tried his hand more than once at building the controller. When that proved more challenging than he expected, he explored the idea of using a different magnetic medium — ordinary audio tape, mounted on a conventional tape recorder. His efforts in interfacing a tape recorder with the Intellec-8 were no more successful than his efforts to build a disk controller. It soon became clear that his considerable programming expertise was no substitute for the hardware knowledge needed to build a device that would connect the Intellec-8 with an efficient storage device. It is worth noting that Kildall was well ahead of his time: When MITS, IMSAI, and other companies began marketing microcomputers, they began with paper-tape or magnetic-tape storage. It would be several years yet before disk drives came into common use on microcomputers.

Finally, in 1973, admitting hardware defeat, Gary turned to an old friend from the University of Washington, John Torode. Torode would later found his own computer company, but in 1973, he was just doing a favor for his old friend. "John," Gary said, "we've got a really good thing going here if we can just get this drive working." Torode got the drive working.


Meanwhile, Gary found himself involved with another hardware engineer on another microprocessor-based project. This project, for all its apparent frivolousness, was the first hint of any genuine commercial ambitions on the part of Gary Kildall. The project was the ill-fated Astrology Machine.

Ben Cooper was a San Francisco hardware designer who had worked with George Morrow on disk systems and later would, like Torode, start his own computer company, Micromation. In the early '70s, he thought he could build a commercially successful machine to cast horoscopes, and he enlisted Gary's help.

The business was not a success — "a total bust," Gary later called it. Still, the Astrology Machine gave Gary the first field test of several programs he had written and rewritten over the past months: a debugger, an assembler, and part of an editor. He also wrote a Basic interpreter that he used to program the Astrology Machine. Since, for Gary, there was little distinction between his academic work and his commercial or consulting work, he passed the tricks he came up with in writing the Basic interpreter on to his students, among them a young naval officer named Gordon Eubanks (today, president and CEO of Symantec). All the programs, with the exception of the interpreter, became part of the disk operating system he was writing to control the controller that Torode was building.
