Programming & The PC Revolution

Twenty-five years ago, computer programming was big-iron heavy. Then the personal computer came along and everything changed. In this special anniversary retrospective, Eugene Kim looks back at what computing was like then, and at DDJ's role in that revolution.


January 01, 2001
URL:http://www.drdobbs.com/programming-the-pc-revolution/184404434


Eugene, formerly a senior technical editor for DDJ, is a freelance programmer and writer. He can be contacted at eekim@eekim.com.


Sidebar: The Most Significant Event in the Last 25 Years of Computing: Comments from DDJ Luminaries
Sidebar: A DDJ Timeline

Twenty-five years ago, computer programming was a mature and well-understood field. By the late 1960s, computer scientists had already invented and exhaustively studied programming's most fundamental data structures and algorithms, from lists and trees to searches and sorts. Programming paradigms, from functional to structured to object-oriented, had all been introduced. The term "software engineering" had been coined, and many researchers had begun to study the discipline of programming. Computer scientists spoke excitedly about the future of programming.

And then, software development seemed to take a giant step backward.

The culprit was the microcomputer, a $400 jumble of wires and switches and lights that ostensibly did the same thing as the sleek and expensive cabinet-sized machines that real programmers used. Most professionals ignored the microcomputer, and so the machine became the domain of so-called hobbyists.

Most of these newly minted microcomputer junkies had no prior experience with software or programming, and it showed. To the horror of real programmers, who advocated high-level languages and structured programming, these hackers coded mostly in machine language or in Basic. Programmers in the know had officially declared the GOTO statement harmful. Hobbyists sprinkled GOTOs liberally throughout their Basic programs.
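The professionals' complaint is easy to see on the page. As a minimal sketch (rendered in C for the modern reader; the function names and the toy computation are ours, not from any period listing), here is the same loop written first with the jump-driven control flow typical of hobbyist code, then in structured form:

#include <stdio.h>

/* Hypothetical example: summing 1..10 with goto-driven control flow,
   mimicking the style of early unstructured hobbyist programs. */
int sum_with_goto(void)
{
    int i = 1, sum = 0;
top:
    if (i > 10) goto done;   /* control jumps around the code */
    sum += i;
    i++;
    goto top;
done:
    return sum;
}

/* The same computation as a structured loop: one entry, one exit,
   and the control flow is visible at a glance. */
int sum_structured(void)
{
    int sum = 0;
    for (int i = 1; i <= 10; i++)
        sum += i;
    return sum;
}

int main(void)
{
    printf("%d %d\n", sum_with_goto(), sum_structured()); /* 55 55 */
    return 0;
}

Both functions return 55, but only the second can be read top to bottom without tracing labels, which was precisely the structured programmers' point.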

Twenty-five years ago this month, Bob Albrecht and Dennis Allison started a magazine devoted to these programmers and their octal-coding, unstructured-programming ways. They called it Dr. Dobb's Journal of Tiny BASIC Calisthenics & Orthodontia: Running Light Without Overbyte, a name that confused professional programmers and practicing dentists everywhere. Members of both professions subscribed.

The only word that could accurately describe this magazine, with its whimsical name and its psychedelic cover, was "unconventional." Not only was it unique in its ragged look and irreverent tone, it was unconventional in its coverage. At the time, computer magazines fell into one of three categories: theory, hardware, or business. Dr. Dobb's Journal was all about software.

DDJ was written by and for practicing programmers, and it published source code with abandon. It didn't matter that many of the early readers and contributors didn't know a parser from a preprocessor. More often than not, articles written by these less experienced hackers would cause so-called real programmers to rear their ugly heads. However, instead of passing judgment on their unseasoned peers, they chose to educate them through letters and follow-up articles.

What played out on the pages of DDJ in those early days proved a point made by Grace Hopper in the late 1970s. After she had delivered a conference speech on the history of programming languages, Bob Rosin of Bell Laboratories complained about attending "a microcomputer discussion in which the phrases 'hand assembly' and 'octal patch' were prominent," and asked Hopper, "How can we enhance the probability that microcomputers will illuminate truths rather than allow people to ignore the hard-won and hard-learned discipline of programming?"

Hopper explained that it was "an educational and marketing problem," and remarked, "By and large, the people concerned with the large computers have totally ignored the so-called hobbyist community, though I would point out they are not a hobbyist community: They are small businessmen, doctors, lawyers, small towns, counties! They are a very worthwhile audience."

This audience proved worthwhile, indeed. Personal computers, née microcomputers, brought computing to the masses, and the masses helped revolutionize computing. Far from sending software development back to the vacuum-tube age, personal computers whipped the software industry into adulthood, and in so doing, changed the way software was built.

A Culture of Sharing

The Basic language served as a starting point for this revolution in software and software development. In many ways, Basic was the perfect language for hobbyists. Thomas Kurtz and John Kemeny had designed the language in the 1960s for the student body at Dartmouth, only a quarter of whom were majoring in the sciences or engineering. Their driving motivation was simplicity and accessibility over all else. In a presentation on the history of Basic, Kurtz explained, "If ordinary persons are to use a computer, there must be simple computer languages for them. Basic caters to this need by removing unnecessary technical distinctions (such as integer versus real), and by providing defaults (declarations, dimensioning, output formats) where the user probably doesn't care."

Unfortunately, the capabilities that made Basic so accessible and popular also made it unsuitable for larger, more sophisticated software. Its weak typing was great for beginning programmers, but inappropriate for people needing more robust programs and more complex data structures. Its subroutine capability was primitive at best, making it a poor language for writing modular, structured programs.

Despite its deficiencies, Basic was vital because it made computers more accessible to a broader audience, which, in turn, fostered a curiosity among hobbyists about bigger and better things in the world of software development.

The articles published in early issues of DDJ were representative of this trend. In the first issue of DDJ, Allison presented Tiny Basic, a compact interpreter that he had built, and challenged others to improve on it. In the next issue, Allison, who taught computer science at Stanford, wrote an article entitled "A Critical Look at Basic," in which he lambasted the language, stating, "The language lacks the mechanisms to structure the problem's algorithm and data well. It breeds bad habits, habits which are difficult to unlearn."

That Allison could both encourage the use of Basic and at the same time criticize the language for its deficiencies was indicative of why computers in the hands of hobbyists did not kill progress in programming, as some had predicted. Unlike the white-hooded programmers of the 1950s, who guarded their knowledge with guild-like secrecy, the personal computer programmers encouraged sharing and community — and by the mid 1970s, there was much knowledge to share.

First and foremost, there were algorithms and data structures. Fundamental problems cropped up over and over again. How do you shuffle a deck of cards? How do you sort a list of names? How do you calculate the square root of a number? Fortunately for the hobbyists, there was no need to figure out the answers to such questions (although many did, either out of ignorance or for the sheer thrill), because the answers were already known. Additionally, many algorithms gained new relevance due to the nature of these new machines. Compression algorithms, for example, were far more important to programmers dealing with harsh memory constraints than they were to programmers writing software for machines with memory to spare.
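Two of those stock answers are worth sketching. The following C fragment (our own illustrative code, not drawn from any DDJ listing) shuffles a deck with the Fisher-Yates algorithm and approximates a square root with Newton's method — exactly the kind of well-known solutions hobbyists could look up instead of reinventing:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Fisher-Yates shuffle: every ordering of the deck is equally likely
   (assuming an unbiased index source; rand() % n is fine for a demo). */
void shuffle(int *deck, int n)
{
    for (int i = n - 1; i > 0; i--) {
        int j = rand() % (i + 1);    /* 0 <= j <= i */
        int tmp = deck[i];
        deck[i] = deck[j];
        deck[j] = tmp;
    }
}

/* Newton's method for sqrt(x): repeatedly average the guess with x/guess. */
double newton_sqrt(double x)
{
    double guess = x > 1.0 ? x / 2.0 : 1.0;
    for (int iter = 0; iter < 50; iter++)
        guess = 0.5 * (guess + x / guess);
    return guess;
}

int main(void)
{
    int deck[52];
    for (int i = 0; i < 52; i++) deck[i] = i;
    srand((unsigned) time(NULL));
    shuffle(deck, 52);
    printf("first card: %d, sqrt(2) ~ %.6f\n", deck[0], newton_sqrt(2.0));
    return 0;
}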

Second, there were languages — and over time, lots of them. Initially, personal computer programmers gravitated to system languages like Forth and introductory languages like Basic. However, as programmers ran up against the limits of these languages, they began to explore the many others at their disposal.

Pascal was one such language. Niklaus Wirth had introduced the language in 1970 to promote good programming practices, but it was a different innovation that led to its broad acceptance among personal computer programmers. Wirth had developed a virtual machine architecture, known as the P-machine, to make compiled Pascal programs portable. The University of California at San Diego's Kenneth Bowles recognized the value of the P-machine architecture and ported it to a number of personal computers, leading to the development of the UCSD Pascal system. Bowles's implementation not only helped evangelize structured programming to the microcomputing masses, it also introduced a sophisticated virtual machine technique and underscored the importance of portability.
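The idea behind the P-machine is compact enough to sketch. Below is a toy bytecode interpreter in C — the opcodes and layout are our invention and far simpler than real P-code — illustrating the principle: compile a program once into such instructions, and only this small fetch-decode-execute loop needs rewriting for each new processor:

#include <stdio.h>

/* A toy stack machine in the spirit of P-code. Programs compile to
   these opcodes once; porting means rewriting only this interpreter.
   (Invented opcodes for illustration, not real P-code.) */
enum { OP_PUSH, OP_ADD, OP_MUL, OP_PRINT, OP_HALT };

void run(const int *code)
{
    int stack[64];
    int sp = 0;                      /* stack pointer */
    for (int pc = 0; ; ) {           /* fetch-decode-execute loop */
        switch (code[pc++]) {
        case OP_PUSH:  stack[sp++] = code[pc++];           break;
        case OP_ADD:   sp--; stack[sp - 1] += stack[sp];   break;
        case OP_MUL:   sp--; stack[sp - 1] *= stack[sp];   break;
        case OP_PRINT: printf("%d\n", stack[--sp]);        break;
        case OP_HALT:  return;
        }
    }
}

int main(void)
{
    /* (2 + 3) * 4 */
    int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD,
                      OP_PUSH, 4, OP_MUL, OP_PRINT, OP_HALT };
    run(program);                    /* prints 20 */
    return 0;
}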

Portability was one reason why many programmers of personal computers started expressing interest in another language. Bell Labs's Dennis Ritchie introduced C around the same time as Wirth introduced Pascal. Ritchie designed C, developed for the new UNIX operating system, to be a high-level assembly language for writing portable system software. Unlike UCSD Pascal's interpretive system, compiled C was extremely fast, an attribute that was especially attractive to developers dealing with slow microprocessors.

The third and vital lesson that hobbyists wanted to and did learn from their more experienced peers concerned software engineering. Knowing the best algorithms and languages for a particular task was not enough to build great software. Technique was just as important as the tools. While not all of the existing experience necessarily applied to what these PC coders were trying to do, at the very least, these programmers could learn about the importance of coding style, testing methods, and the different programming paradigms.

Objects: The Center of Change

As this new class of programmers rapidly learned what was already known, programming as a discipline continued to progress. While there were many important innovations in software development overall, perhaps the most significant over the past 25 years was the widespread adoption of object-oriented programming.

Object-oriented programming has many roots, both in computer science and elsewhere. Object-oriented languages can be traced back to the 1950s with Lisp and the late 1960s with Simula. Alan Kay, who coined the term "object-oriented programming" as a researcher at Xerox PARC, found some inspiration for his Smalltalk language from cell biology and the works of the philosopher/mathematician Leibniz.

Most importantly, Kay was motivated to create Smalltalk and the principles of object-oriented programming by a larger goal: pervasive personal computing. In 1971, Kay predicted that personal computers would be widespread and networked, and that for this to happen, software had to exhibit three characteristics — zero replication time and cost, high development time and cost, and low change time and cost. Certainly, all three characteristics were desirable for all software, even in a world without personal computers. However, personal computers made such characteristics urgent necessities.

Smalltalk (and object-oriented programming in general) was a way to achieve these goals. In theory, programs would be easier to write because they were modeled on things that were easier to understand. Smalltalk, in fact, was designed as a programming language for children. Additionally, programs would be easier to read, and hence, easier to maintain, vastly reducing the largest costs associated with software.

Personal computers made graphical user interfaces important because they improved the interaction between people and machines. Object-oriented languages like Smalltalk enabled the widespread adoption of these interfaces. How many programmers ever have to code a button or a scrollbar from scratch today? And would graphical user interfaces be as common as they are if more programmers did have to create and integrate these widgets and their behavior from scratch?
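The reuse argument can be made concrete even without an object-oriented language. In this illustrative C sketch (all type and function names are ours), a button bundles its state with overridable behavior, so an application supplies a handler instead of rewriting the widget:

#include <stdio.h>

/* A toy widget "object": state plus behavior bundled together.
   All names here are invented for illustration. */
typedef struct Button Button;
struct Button {
    const char *label;
    void (*on_click)(Button *self);  /* overridable behavior */
};

/* The toolkit draws and dispatches; applications never reimplement this. */
void button_click(Button *b)
{
    printf("[%s] clicked\n", b->label);
    if (b->on_click)
        b->on_click(b);
}

/* Application code supplies only the handler. */
static void save_handler(Button *self)
{
    printf("saving document via '%s'...\n", self->label);
}

int main(void)
{
    Button save = { "Save", save_handler };
    button_click(&save);   /* reused widget, custom behavior */
    return 0;
}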

These early graphical widgets eventually led to more general software components, with the hope that fully functional applications could be wired together using prewritten objects and minimal coding. While no one has completely achieved this ambitious goal yet, software today requires far less coding than its complexity would seem to indicate. Object-oriented application frameworks such as the Microsoft Foundation Classes — love them or hate them — are largely responsible.

Most importantly, object-oriented programming helped lay the foundation for many advances in software engineering, from testing techniques to programming methodologies. Perhaps the most important recent innovation is design patterns, which were defined and documented in the early 1990s by a number of specialists in the field of object-oriented design.

Their observation was simple: Software design, even with the benefit of objects, is complex, and programmers can learn many things from those who do design well. With this in mind, these researchers began exploring and analyzing well-designed object-oriented software, and cataloging the patterns that they discovered. Many of the most common patterns are recorded in Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides's now classic Design Patterns: Elements of Reusable Object-Oriented Software.
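For a taste of what such a catalog records, here is a deliberately minimal C rendering (names ours) of one pattern from the book, Observer: a subject notifies registered listeners of changes without knowing anything about them:

#include <stdio.h>

/* Observer pattern, reduced to essentials: a subject keeps a list of
   callbacks and notifies them on change. Names are illustrative only. */
#define MAX_OBSERVERS 8

typedef void (*Observer)(int new_value);

typedef struct {
    int value;
    Observer observers[MAX_OBSERVERS];
    int count;
} Subject;

void subject_attach(Subject *s, Observer o)
{
    if (s->count < MAX_OBSERVERS)
        s->observers[s->count++] = o;
}

void subject_set(Subject *s, int v)
{
    s->value = v;
    for (int i = 0; i < s->count; i++)   /* notify everyone */
        s->observers[i](v);
}

static void logger(int v)  { printf("log: value is now %d\n", v); }
static void display(int v) { printf("display shows %d\n", v); }

int main(void)
{
    Subject temperature = { 0, { 0 }, 0 };
    subject_attach(&temperature, logger);
    subject_attach(&temperature, display);
    subject_set(&temperature, 42);   /* both observers react */
    return 0;
}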

Coming Full Circle

While object-oriented programming came of age alongside the personal computer, its influence has spread well beyond the PC. In this sense, over the past quarter of a century, software development has come full circle. Twenty-five years ago, computing old-timers were responsible for most of the innovations in software, as microcomputer enthusiasts rushed to catch up and apply the new knowledge. Today, most of the innovation comes from those working in networked, personal computing environments.

Computing of all kinds has truly converged into a single discipline, and the events of the last few decades demonstrate this. Fifty years ago, programmers had to deal with the challenges of writing compact programs for slow computers with minimal memory. The resulting knowledge rapidly became moot, as computers got faster and cheaper. Then, personal computers arrived on the scene and suddenly the old lessons became pertinent again.

Recently, embedded systems have become significant, and once again, the old lessons have become important. Despite the software bloat that is now commonplace on personal computers, there is still a need for software that runs quickly, efficiently, reliably, and predictably on slow processors with limited memory. However, programmers are better prepared this time around. They simply have a far greater understanding of how to build good software.

Software development has flourished over the past quarter century, and, at 25 years old, Dr. Dobb's Journal continues to flourish as well. Sure, the magazine is glossier, the editors a little grayer, and the tone slightly less irreverent. However, while the topics have changed over the years, DDJ continues to maintain its breadth and depth of coverage. Readers and contributors are still passionate about their craft. And the occasional dentist still calls asking about a subscription.

Some things never change.

DDJ


The Most Significant Event in the Last 25 Years of Computing:
Comments from DDJ Luminaries

The worst idea ever to arise in the history of computing is shared code libraries, which are absurd in a time of cheap 30-GB hard drives. Most of the instability of the Windows platforms is due to DLL conflicts. One app, one block of code. The best idea in computing history is component-based visual programming, as implemented in Visual Basic and (especially) Delphi. I can write in a weekend what would have taken me a month 10 or 15 years ago, because RAD allows me to inherit rather than reinvent my wheels.

—Jeff Duntemann

Future programmers will look back on the latter part of the 20th century as a kind of dark ages—so many programming resources turned to focus on user interfaces (Mac, Windows, X) that algorithm development suffered. Not to say that there were no important algorithms developed, simply that the majority of programmers were no longer looking at algorithms—or even data structures—but instead became slaves of window handles, menus, and the like. Just as record blocking and I/O management eventually became operating system functions, user interface code is becoming more of a commodity you expect from your operating system and this is bringing a focus back to algorithms.

—Al Williams

The world changed in the late 1980s when programmers I knew at several startups became multimillionaires. That was great news for the individuals (they're rich), for the public (motivated builders make great products), and for the field (money attracts good people). The dark side has been an obsession with business plans, focus groups, and stock prices.

—Jon Bentley

The most significant event in the last 25 years of computing is not a single event, but the general accelerated progress in the micro-miniaturization of digital and mixed-signal electronic devices. This is driving the economics of computing and communications, making them affordable to an ever-expanding, worldwide population of users.

—James Hendrix

The IBM PC. More than the Altair 8800 or the Apple II, it truly opened up the market for personal computing.

—P.J. Plauger

There is no most significant event in computing. What we have represents a confluence of ideas and implementations. Today, the Web is an obvious choice as Most Significant, but the Internet would not be so popular and powerful if there were not millions of personal computers available, which required the invention of the microprocessor and so on. Take-it-for-granted computing would not have taken place if we still demanded that everybody had to write shell scripts and memorize DOS commands.

—Jef Raskin

We live in the age of the algorithm, in which enormously sophisticated and flexible artifacts are created at keyboards, yet economies and lives depend on their working. The flexibility is so great that we have come to see algorithms everywhere we look. Biologists speak of analog computer modules within DNA transcription. Physicists perform experiments with their computers first and look at the world second. But analogy can warp the mind. When we see the world and think that controllability is the default and design always possible, then we view surprise as a bug that must be stamped out. But surprise is what makes our life journey so interesting. If there is a deity out there, we still underestimate her.

—Dennis Shasha

As a user, I nominate VisiCalc, the application that made desktop computers necessary by automating a tedious, error-prone manual task that most business people must perform and, in a tie for most significant event, the Mosaic web browser for spawning the technology that made computers an essential home appliance. As a businessman, I nominate the IBM PC—an open-architecture, clonable, business-oriented platform. Prior to that, businesses viewed small computers as unreliable toys built by hippies in garages with frivolous names such as Apple and Altair. The PC and its IBM logo validated the technology. As a programmer, I nominate the common-user-interface, device-independent GUI operating environment. Many hardware and software technologies converged to make it possible for programmers to target common platforms leaving details of the user interface to the environment. As an author, I nominate as a negative event the "...for dummies" book phenomenon, a publishing model that spawned computer book "lines," sacrificing content for format and emphasizing visual eye candy over meaningful information.

—Al Stevens

As a harbinger of a business-unusual peer-to-peer computing revolution, as an example of open-source guerrilla development inside the walls of corporate America, and as one of the early and defining artifacts of a new Internet culture that is neither lawless nor unethical but that is evolving new loyalties and new values, Gnutella gets my nomination.

—Michael Swaine

The most significant event in the past 25 years of computing is undoubtedly the rise of the Internet. The ability of programmers and technicians to collaborate worldwide has led to advances that might themselves be singled out as hugely significant developments. Among them are the rise of Java and, especially, of Linux, which was born as an Internet project. The next development enabled by the Internet will be embedded, dedicated devices that serve specialized purposes, from metering power usage to medical monitoring and remote diagnostics of complex equipment. We are actually merely standing on the threshold of what an expanded and high-bandwidth Internet can make possible.

—Tom Williams

The Xerox Alto. These folks redefined the way that most of us prefer to interact with our machines, and did so in a way that transcends the operating-system wars. Whether we run a Microsoft product, Mac, Linux—most of us use the UI pioneered by the Alto.

—Tom Genereaux

First was Borland's introduction of Turbo Pascal for (I think) $29.95. Not only did this change the entire pricing model for computer software, but it made a real development tool available to the masses, taking software development out of the hands of large companies. The other milestone is the publication of the Gamma, Helm, Johnson, and Vlissides "Design Patterns" book, which not only changed the nature of the design process at the implementation level, but more importantly, introduced a common vocabulary of design that made efficient communication (and thus synergy) possible between designers.

—Allen Holub


A DDJ Timeline



1975



February 1975
Bill Gates and Paul Allen license their Basic to MITS.
March 1975
Homebrew Computer Club holds its first meeting.
September 1975
First issue of BYTE magazine is published.



1976



January 1976
Dr. Dobb's Journal of Tiny BASIC Calisthenics & Orthodontia: Running Light Without Overbyte debuts, thanks to Dennis Allison and Bob Albrecht.
February 1976
Jim Warren joins DDJ as editor.
March 1976
Steve Wozniak and Steve Jobs finish the computer circuit board they call the "Apple I."
August 1976
First article on cryptography appears in DDJ. Apple cofounder Steve Wozniak contributes first of several articles, "Floating Point Routines for the 6502."
September 1976
Macintosh creator Jef Raskin joins DDJ and publishes his first article. His author bio states, "He is well known for his heretical belief that people are more important than computers, and that computer systems should be designed to alleviate human frailties, rather than have the human succumb to the needs of the machine."
December 1976
Michael Shrayer writes the first word processor for microcomputers, the Electric Pencil. Steve Wozniak and Randy Wigginton demonstrate the first Apple II at the Homebrew Computer Club. Bill Gates drops out of Harvard.

1977

January 1977
"Lawrence Livermore Lab's 8080 Basic," by John Dickenson, et al.
February 1977
Tom Pittman, one of the first software entrepreneurs for personal computers, writes an article advocating commercial over free software entitled "Free Software? Or—Support Your Local Software Vendor." In 1977, free software with source code was the norm within the microcomputer community, and publications like DDJ were the standard means of distribution. Ward Christensen, bulletin-board systems pioneer and coauthor of CBBS, contributes an article to DDJ on disassembling 8080 code. Bill Gates and Paul Allen sign the formal partnership agreement establishing Microsoft. "An 8080 Disassembler Written in MITS 3.2 Basic," by Jef Raskin.
August 1977
Tandy/Radio Shack announces the TRS-80 microcomputer.
September 1977
"Computer Applications for the Handicapped," by Warren J. Dunning. "An Interactive Programming Language for Control of Robots," by Lichen Wang.



1978



February 1978
Gary Kildall, computer pioneer and creator of PL/M and CP/M, writes "A Simple Technique for Static Relocation of Absolute Machine Code."
March 1978
Kenneth Bowles writes "Status of the UCSD Pascal Project." Steve Wozniak contributes his second DDJ article, this one entitled "Renumbering and Appending Basic Programs on the Apple-II Computer."
May 1978
First article on Forth appears in DDJ. Forth became a regular topic for many years, and resulted in several special issues devoted entirely to Forth. DDJ continues to publish articles about Forth today. "Proposed Standard for the S-100 Bus," by George Morrow and Howard Fullmer.
June/July 1978
"A Tiny Basic Extension Package," by Leor Zolman.
September 1978
"Lisp For the 6800," by Fritz van der Wateren.

1979

April 1979
"Of Interest" section debuts, making it the second oldest running column in DDJ (the editorial being the first).
June/July 1979
Curt Noll and Laura Nickel describe how they used a computer to discover the 25th and 26th Mersenne primes, one of the early and important contributions computers made to mathematics. It helped stir the controversy over whether real mathematicians used computers. Intel introduces the 8088 microprocessor.
November/December 1979
"Preliminary Programming Specs from the VDM-2/Graphic Display," by Lee Felsenstein.

1980

March 1980
"Mathematical Typography," by Donald Knuth.
April 1980
"An Introduction to Algorithm Design," by Jon Bentley. The first issue devoted to algorithms. In addition to a reprinted article by Jon Bentley, the author of Programming Pearls and currently a contributing editor to DDJ, the first column devoted to algorithms appeared. In this column, Dennis Allison wrote about merge sorts.
May 1980
"The C Programming Language," by Dennis Ritchie, Brian Kernighan, et al. First issue devoted to the C programming language.
September 1980
Tim Paterson shows Microsoft his 86-DOS operating system written for the 8086. "A Runtime Library for the Small C Compiler," by Ron Cain.

1981

August 1981
IBM announces its IBM Personal Computer.

1982

July 1982
"CP/M-86 vs MSDOS: A Technical Comparison," by Dave Cortesi. "Graphics on IBM's Personal Computer," by Ray Duncan.
December 1982
James Hendrix releases the Small-C Compiler Version 2.

1983

January 1983
First article on the Ada programming language appears. Roger Gregory writes about Ted Nelson's Xanadu project in "Xanadu: Hypertext From the Future."
July 1983
DDJ breaks 100 pages for the first time. AT&T Bell Labs designs C++.
October 1983
Anthony Skjellum starts a bimonthly column on C and UNIX called "C/UNIX Programmer's Notebook."
November 1983
Borland releases Turbo Pascal.

1984

January 1984
Apple introduces the Macintosh.
March 1984
"RSA: A Public Key Cryptography System," by C.E. Burton. The first article on the RSA public key cryptography algorithm.
May 1984
First article on Modula-2 appears.
June 1984
DDJ reviews Turbo Pascal.
December 1984
First special issue devoted to the UNIX operating system.

1985

January 1985
Apple had been widely criticized for the Macintosh's limited memory and lack of expansion capability. Thomas Lafleur and Susan Raab address this in their article "Fatten Your Mac," which provides step-by-step, unauthorized instructions for increasing the RAM in Macintosh computers to 512 KB.
February 1985
"Tiny Basic for the 6800," by Gordon Brandly
March 1985
DDJ's first article on Prolog appears: "Tour of Prolog," by David Cortesi. Allen Holub's "C Chest" replaces Skjellum's "C/UNIX Programmer's Notebook." Richard Stallman publishes his GNU Manifesto under the title "Realizable Fantasies."
May 1985
"A Compiler Written in Prolog," by G.A. Edgar.
November 1985
Microsoft ships Windows 1.0. "The Software Designer," by Paul Heckel.
December 1985
First article on windowing operating environments appears.

1986

January 1986
Bob Blum's long-running "CP/M Exchange" column finally retires. DDJ is available electronically for the first time on CompuServe.
October 1986
Article on programming the 80386 is published.

1987

January 1987
Issue devoted to programming the 68000 CPU. Article on the OS-9 operating system appears.
February 1987
Ernest Tello premieres a column on artificial intelligence. This wide-ranging but unfortunately short-lived column first introduced object-oriented programming to DDJ readers.
April 1987
Neural networks had come back into vogue five years earlier, when then-Caltech scientist John Hopfield introduced his Hopfield networks. The first article on neural networks in DDJ appears in this issue.

1988

March 1988
Issue devoted to object-oriented programming.
May 1988
Robert Carr, creator of the Framework integrated software package and the PenPoint operating system, contributes the article "Developing for the User." Michael Swaine's "Programming Paradigms" column debuts.
August 1988
Al Stevens takes over Allen Holub's C column, which is renamed "C Programming."

1989

February 1989
Rabindra Kar and Kent Porter publish "Rhealstone: A Real-time Benchmarking Proposal." Jeff Duntemann begins his popular "Structured Programming" column.
March 1989
Jim Gettys publishes an article on the X Window System, "Network Windowing Using the X Window System." Tim Berners-Lee proposes the World Wide Web.
September 1989
Michael Abrash and Dan Illowsky publish "Roll Your Own Minilanguages with Mini-Interpreters."
November 1989
Anders Hejlsberg, author of Turbo Pascal and architect of Microsoft's C# language, publishes "Container Object Types in Turbo Pascal."
December 1989
Bertrand Meyer publishes "Writing Correct Software With Eiffel."

1990

February 1990
Tim Paterson, original author of MS-DOS, coauthors "Managing Multiple Data Segments Under Microsoft Windows," his first article for DDJ.
October 1990
Al Williams publishes the first segment of his two-part article "Roll Your Own DOS Extender."
November 1990
The League for Programming Freedom publishes an article warning of the dangers of software patents.

1991

January 1991
William and Lynne Jolitz start the multipart series "Porting UNIX to the 386." Their work—a port of BSD UNIX to the 80386 architecture—resulted in 386BSD, which eventually spawned FreeBSD and NetBSD. Rob Pike, Dave Presotto, Ken Thompson, and Howard Trickey write about the Plan 9 OS. Lotus founder Mitchell Kapor writes "A Software Design Manifesto."
April 1991
DDJ publishes articles on neural nets and genetic algorithms.
September 1991
David Betz presents a tiny, object-oriented language called "Bob." Bruce Schneier publishes "One-Way Hash Functions," his first of many articles for DDJ.



1992



April 1992
Mac Cody writes "The Fast Wavelet Transform." Ron Avitzur publishes "Your Own Handprinting Engine."
June 1992
DDJ steps into the world of the "Personal Supercomputer" with Ian Hirschsoln's three-part article.
October 1992
Looking toward today's e-commerce, Brad Cox writes about electronic distribution of software objects and pay-per-use software.
December 1992
DDJ looks at new data types, including spatial data and sound. Dick Gabriel examines persistence in a programming environment.

1993

January 1993
64-bit programming first appears in DDJ.
September 1993
Andrew Schulman publishes "Examining the Windows AARD Detection Code," foreshadowing Microsoft's antitrust troubles.
October 1993
First article on the Perl programming language appears. Eric Bina and Marc Andreessen develop the Mosaic web browser. Intel announces the Pentium microprocessor.

1994

February 1994
Kent Beck, Smalltalk guru and inventor of the Extreme Programming methodology, introduces design patterns.
April 1994
Scott Guthery examines algorithms for mobile computing.
May 1994
Richard Burgess presents "MMURTL: Your Own 32-bit Operating System."
December 1994
First article on the World Wide Web and HTML appears.
Special Issue 1994
Devoted to interoperable objects, with introductory and in-depth articles on COM, CORBA, and other technologies, this issue quickly became one of DDJ's most popular. One of the contributors was Joe Firmage, who recently gained notoriety for his views on UFOs.

1995

January 1995
Ron Rivest publishes an article on the RC5 encryption algorithm.
March 1995
DDJ awards its first Excellence in Programming Awards to Linux creator Linus Torvalds and STL author Alexander Stepanov.
June 1995
Mark Coast and Terry Mellon introduce their software methodology in "Constructing Operational Specifications."
July 1995
DDJ publishes its first article on the PNG graphics file format.
August 1995
Arthur van Hoff publishes the first technical article on the Java programming language, called "Java and Internet Programming."
December 1995
Marc Najork's "Visual Programming in 3-D" is one of the first articles on visual languages and the future of software development.

1996

January 1996
Ian Goldberg and David Wagner write about a security flaw in the Netscape browser.
March 1996
Andy Yuen presents "A Tiny Preemptive Multitasking Forth."
April 1996
Peter Danzig writes about the Harvest object cache.
August 1996
Mark Russinovich, Bryce Cogswell, and Andrew Schulman uncover the SoftRAM 95 scam.
September 1996
Alan Cooper, creator of Visual Basic, writes an article entitled "Goal-Directed Software Design."

1997

August 1997
Robert Collins goes inside the Pentium II math bug.
September 1997
T.V. Raman's Emacspeak speech-feedback system provides an alternative UI for the visually impaired.
October 1997
The precursor to the recently announced Advanced Encryption Standard is published in DDJ. It is entitled "The Block Cipher Square Algorithm," by Joan Daemen, Lars R. Knudsen, and Vincent Rijmen.

1998

January 1998
Peer-to-peer programming for the Internet is covered in an article by Louis Thomas, Sean Suchter, and Adam Rifkin.
February 1998
XML, Python, Perl, Tcl, and others of today's mainstream scripting and markup languages are covered.
July 1998
DDJ goes inside DVDs with an article by Linden deCarmo.

1999

June 1999
Lincoln Stein's "A DNA Sequence Class in Perl" examines the role Perl plays in the Human Genome Project.
October 1999
DDJ returns to Small-C with "The Small Scripting Language," by Thiadmer Riemersma.

2000

February 2000
Philip Wadler presents "GJ: A Generic Java."
March 2000
Wireless communication gains momentum, as James Wilson and Jason Krontz examine the Bluetooth spec. Gnutella is released.
September 2000
Dan Farmer and Wietse Venema launch a DDJ series on "Forensic Computer Analysis."
October 2000
"The C# Programming Language," by Scott Wiltamuth is published; it's the first technical article on the new language from Microsoft.
November 2000
"Kerberos vs The Leighton-Micali Protocol," by Aviel Rubin.
December 2000
"Dr. Dobb's Software In the 21st Century" special issue is released.
