Programming Paradigms

Michael ventures into the land of the lizards before adding his two cents to LEO lore. He then introduces a new "Paradigms Past" feature.


July 01, 1998
URL: http://www.drdobbs.com/programming-paradigms/184410614

Dr. Dobb's Journal July 1998: Programming Paradigms

Michael is editor-at-large for DDJ. He can be contacted at [email protected].


Sidebar: Paradigms Past

Embarrassed at never having read Finnegans Wake, I recently tried to struggle through one of the many guides to James Joyce's dense masterpiece. All I got for my efforts, though, was a lamentable penchant for portmanteau words, as in the subhead below. A portmanteau word is an invented word constructed by compressing two or more words into the space reserved for one. "Thissue" compresses "this issue" and "calmtense" compresses "column contents," as well as suggesting a portmanteau mood for this month's column: calm/tense. You know: Relax and enjoy the roller-coaster ride.

Tracking the ups and downs of computer technology, the paradigmatic undulations, the huckster hype and the cotton candy fluff of vaporware, has always felt to me like a day at the carnival. To lessen any vertigo induced by this column's roller-coaster ride, I've instituted the next section, a sort of table of contents for the column. Relax and read on.

Thissue Calmtense

Land of the Lizards

This spring, Netscape did what it had promised and made the source code available for its Communicator product, inviting the entire software-development community to join in the process of developing the next version of Communicator, and, not incidentally, any products of their own they'd like to spin off. Within two weeks, 100,000 people had downloaded the code.

Some of the code was missing, for the very good reason that Netscape didn't control the copyright on software it had licensed from others. Some was missing for a dumb reason: the U.S. government policy on encryption software. Within seven hours, the missing crypto had been replaced, and the U.S. government made a monkey of, by Australian programmers.

Days later, James Clark (no relation to Netscape cofounder Jim Clark), the technical lead on the World Wide Web Consortium's XML working group, added his XML parser to the Mozilla code base, and Netscape released the source for JavaScript 1.3. And so on. Collaborative programming was off and running.

Netscape likes to call this Open Source software. But Netscape didn't invent it (and, to be fair, doesn't pretend otherwise). It was called Free Software when Richard Stallman seemed to be all alone out there promoting the spirit of sharing what he learned at MIT in the '60s. Stallman's crusade always seemed noble but quixotic. Sure, it would be a nicer world if everyone shared their programming discoveries with their programming peers. Sure, technology would progress more rapidly if programmers didn't have to invent the virtual wheel every day because some earlier wheel inventor had locked up the plans. Sure, we'd all stand taller if we could stand on the shoulders of giants. But software development is (often) a business, and intellectual property has dollar value.

And so Stallman and the GNU heroes labored almost unknown. Until the Web arrived.

Maybe it was just because the suits hadn't figured out yet how to make money on the Web, maybe it was because the Web got its start among academics, but for some reason the tools of choice among webmasters were often free tools. Linux, Perl, Apache. The Apache Web server's name even suggested the collaborative development process that Netscape is encouraging -- a patchy construction, a quilt. And Netscape's cofounder was both savvy enough to understand the virtues of the approach and gutsy enough to push his company to put on a quilting bee.

Now, thanks to Netscape and Marc Andreessen, the Free Software/Open Source movement has momentum. Let's hope it continues. One place to watch or join the movement is http://www.mozilla.org/, but there's also http://www.opensource.org/.

The Tangled WEB

I have recently been looking at a source code editor whose author provides the source code with the product, although I think that Ed planned to release the source for LEO long before Netscape did ditto for Communicator.

LEO was inspired by Donald Knuth's model of literate programming. Its inventor, Ed Ream, who also invented RED (which long-time DDJ readers will recognize as a text editor published in DDJ some 15 years ago), has been fiddling with literate programming for over a decade, but it never quite worked for him.

Literate programming came onto the scene about the time Ed wrote RED. Knuth published the first paper on literate programming, and the language-plus-programs that embodied it, back in 1984. Looking for a three-letter English word that hadn't already been applied to computers, Knuth decided to call his literate programming system WEB. (Later CWEB, but I'll use the earlier term as a generic here.)

When you write a program in WEB, you break your code down into sections, and these sections, written in C or Pascal or whatever plus WEB syntax, serve as the source code for two different WEB routines. One, which Knuth called WEAVE, produces documentation that describes the program clearly and facilitates debugging; its output goes to a text-formatting program such as Knuth's own TeX. The other, TANGLE, produces the machine-executable code, which serves as input to a compiler or interpreter. Since the same source generates both the documentation and the executable code, they are sure to be consistent with one another. The best introduction to literate programming is the book of the same title, by Donald E. Knuth (CSLI Lecture Notes No. 27, 1992; ISBN 0-937073-80-6).
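The tangle side of the scheme is easy to sketch. The following toy Python script is an invented miniature, not Knuth's WEB or Ream's SWEB; the `<<name>>=` / `<<name>>` section syntax here is made up for illustration. It collects named sections from a literate source and splices them together into compilable code:

```python
# Toy "tangle" sketch: expand named sections in a literate source.
# The syntax is invented for illustration (not actual WEB syntax):
# a line "<<name>>=" begins a section body; a line "<<name>>" inside
# a body is a reference to be replaced by that section's expansion.
import re

def tangle(source, root="*"):
    sections = {}
    current = None
    for line in source.splitlines():
        m = re.fullmatch(r"<<(.+)>>=", line.strip())
        if m:                      # start of a new section definition
            current = m.group(1)
            sections.setdefault(current, [])
        elif current is not None:  # body line of the current section
            sections[current].append(line)

    def expand(name):
        out = []
        for line in sections.get(name, []):
            m = re.fullmatch(r"<<(.+)>>", line.strip())
            if m:                  # reference: splice in the named section
                out.extend(expand(m.group(1)))
            else:
                out.append(line)
        return out

    return "\n".join(expand(root))

literate = """\
<<*>>=
int main(void) {
<<print a greeting>>
    return 0;
}
<<print a greeting>>=
    puts("hello");
"""
print(tangle(literate))
```

Run through the tangler, the two sections collapse into one ordinary C function, which is the whole point: the human-oriented ordering and the compiler-oriented ordering need not be the same.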

Ed's problem with WEB was that it was too hard to know when to create new sections and to keep track of all the sections once they were created. He now says that "flat literate programs have too little structure." His solution: Add outlines.

There's a devilish irony in Ed's decision. One reason that Knuth chose the name WEB for his system was that he was coming to realize that programs are better understood as webs than as hierarchical structures. He wanted to get away from top-down, hierarchical programming models. WEB was designed to allow programmers to write top-down, bottom-up, some-of-each, or stream-of-consciousness, and still be able to convey to a human reader the relationships among the parts of the program. Ed put back the hierarchy, using MORE, the outliner from Living Videotext written by Doug Baron et al., as his model.

What we're talking about here: An outline consists of headlines that can be moved simply by dragging them. Each headline contains body text. Headlines define the organization of the document; a parent headline can contain zero or more children. Children are indented from their parents. You can expand or contract headlines by double-clicking near the headline. (Yes, double-clicking. Ed is currently developing on a Mac, although his real target platform is Rhapsody.) Ed soon decided that outlines solved all his problems with literate programming. Since then he's been doing all his programming using LEO and literate outlines, and says he'd never willingly program in any other form.
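As a data structure, that model is tiny. Here's a minimal Python rendition (invented for illustration, not LEO's actual implementation): each headline holds body text and zero or more children, and rendering indents children and honors a collapsed flag:

```python
# Minimal outline sketch (invented, not LEO's actual data structures):
# a headline carries body text and children; rendering indents children
# one level per depth and hides the subtree of a collapsed headline.
class Headline:
    def __init__(self, title, body=""):
        self.title = title
        self.body = body
        self.children = []
        self.expanded = True

    def add(self, child):
        self.children.append(child)
        return child

    def render(self, depth=0):
        # "+" marks a headline with children, "-" a leaf.
        lines = ["  " * depth + ("+ " if self.children else "- ") + self.title]
        if self.expanded:
            for child in self.children:
                lines.extend(child.render(depth + 1))
        return lines

root = Headline("print function")
root.add(Headline("Define print vars"))
dlg = root.add(Headline("Run the print dialog"))
dlg.add(Headline("Handle cancel"))
dlg.expanded = False           # collapsed: its child is hidden
print("\n".join(root.render()))
```

The interesting property is the one Ed exploits: because the body text hangs off the headlines, reorganizing the program is just dragging nodes around the tree; the code itself never has to move.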

Knuth has said the same thing about his approach, but, as Ed points out, the programming world has not beaten a path to Knuth's door demanding WEB. Maybe it lacks something. Maybe that something is outlines.

LEO is specifically a text editor for C programs (it also supports C++ and Objective-C syntax). It employs a language called SWEB, based on Knuth's WEB but considerably simpler (Ed took out all the typesetting code). Your program is expressed primarily in the body text of headlines that you create. This body text consists of plain C (C++, Objective-C) code, with additional directives defined by the SWEB language. You use LEO's Tangle command to translate the SWEB code to C code.

The combination of outlines and SWEB, Ed claims, makes programming significantly easier and more fun. And while outlines impose a hierarchical structure, they don't impose a single hierarchy. You can organize and reorganize your program at will, creating different views and organizations of the same program.

LEO in Action

In LEO, as in WEB, you use this notion of sections to structure your code, and Ed claims several virtues for sections.

As I said, Ed is making the source to LEO available (not for free, though, and subject to a license that I haven't seen yet). Example 1 is a snippet of the code for the print function, written using LEO, to demonstrate that coding with LEO isn't too unfamiliar.

Up, Up, and Away

I was wondering whatever became of the Japanese Fifth Generation Computing effort, so I did some research.

In 1981 Kazuhiro Fuchi announced the Fifth Generation Project, the same year U.S. artificial-intelligence pioneer Edward Feigenbaum started the first successful artificial-intelligence company, Teknowledge. The following April, the Fifth Generation Project was officially launched with a multimillion-yen budget to develop hardware and software, focusing on parallel processing and logic, to solve problems requiring inference.

Hundreds of researchers worked on the project at any one time, and it became one of the best training grounds for computer scientists in Japan. Five different computers came out of the project over the next ten years, as well as a specialized operating system, database systems, and programming tools. In 1992, at the end of the project, Feigenbaum visited Japan and judged the Japanese work to be at least on a level with work done independently in the U.S. Some of the work was definitely the most advanced in the world. But traditional PCs and workstations soon surpassed the power of the Fifth Generation hardware. And the software, written for the obsolete hardware, was largely ignored. In 1992, the Ministry of International Trade and Industry granted the researchers a two-year extension to port the software to UNIX, and the Fifth Generation Project dissolved in 1995.

The software, which was placed in the public domain, may yet be used on UNIX-based highly parallel multiprocessor machines, but the most important outcome of the Fifth Generation Project may be that it gave an unparalleled education to a generation of young computer scientists.

Two Bits Make a Quantum Leap

Man was not meant truly to understand quantum physics. Not this man, anyway. I read a lot about it, and just when I think I've really understood some crucial concept, I look at it from another point of view and it makes no sense to me.

When quantum computing takes over, I'm going to be in trouble. Fortunately, it looks like quantum computers won't bump the PC off my desk or yours. They will be used, when they are actually practical, for certain specialized tasks, such as factoring large numbers. Okay, some of you are thinking of reasons why you might want to factor large numbers on your desktop PCs. Maybe you'll have QCs alongside or instead of your PCs.

By encoding information in the spin states of, say, a proton, which can exist in a superposition of both 0 and 1 until a measurement is made, a QC can theoretically explore different paths through a mathematical problem simultaneously, making it useful for factoring and cryptographic work.

But quantum computers are a long way from practical today. It's amazing enough that they are even considered possible. Now Isaac Chuang of IBM's Almaden Research Center and Neil Gershenfeld of MIT are claiming that they've built one and that it can answer questions about four numbers, like "which of the numbers 1, 2, 3, and 4 is both greater than 2 and odd?" The "computer" consists of the nuclei of a carbon atom and a hydrogen atom in a chloroform molecule, manipulated by magnetic fields and radio waves.
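The search that experiment performs is Grover's algorithm on two qubits, and with four items a single Grover iteration suffices. A classical pencil-and-paper simulation in Python (illustrative only; an actual QC does not store the state vector) shows the marked item ending up with probability 1:

```python
# Classical simulation of two-qubit Grover search (invented sketch):
# one oracle query plus one "inversion about the mean" suffices for N=4.
def grover_4(marked):
    n = 4
    state = [1 / n**0.5] * n          # equal superposition over 4 items
    # Oracle: flip the sign of the marked item's amplitude.
    state[marked] = -state[marked]
    # Diffusion step: reflect every amplitude about the mean amplitude.
    mean = sum(state) / n
    state = [2 * mean - a for a in state]
    return state

# Items 1..4; "greater than 2 and odd" singles out 3, i.e., index 2.
amps = grover_4(2)
print([round(a * a, 6) for a in amps])   # measurement probabilities
```

A classical search over four unsorted items needs up to three yes/no queries; Grover's trick answers it with one, which is exactly the speedup the chloroform experiment was built to demonstrate.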

According to Lov Grover, a physicist at AT&T Bell Labs, Chuang and Gershenfeld have demonstrated "that quantum computing works, not just with pencil and paper, but in the lab."

LEO Gets Down to Business

The first business computer was not a quantum computer but a cold-cut computer. It was built by a British catering company, according to a new book by David Caminer, John Aris, Peter Hermon, and Frank Land (LEO: The Incredible Story of the World's First Business Computer, McGraw-Hill, 1997).

If that's not weird enough for you, how about this: In 1947, two bean counters from Lyons & Co., the catering company, visited the U.S. to see ENIAC and the machines being built to succeed it. They went home and told management, in effect, "We oughtta get us one of these things." Lyons funded some computer research at Cambridge, where some of the fundamental steps in inventing the digital computer were being taken, and a couple of years later took the fruits of that investment, hired some engineers, and built a computer. They were up and running with custom catering-business software by 1951.

The computer was called the Lyons Electronic Office, or LEO. Then again, establishing firsts in computer history is like defining nationhood. A fictional Irishman once defined a nation as "the same people living in the same place. Or also living in different places." That was Leopold Bloom, from James Joyce's Ulysses, and another Leo.

DDJ


Copyright © 1998, Dr. Dobb's Journal

void print(void)
{
 FTAG("print");
 <<< Define print vars >>>
 STATB(ftag);
 <<< Return if there is nothing to print >>>
 <<< Save the old port >>>
 PrOpen(); if (PrError() != noErr) goto done;
 <<< Create a TPrint Record.  goto close on error >>>
 cur_res_file = CurResFile();
 PrValidate(print_h);
 if (PrJobDialog(print_h) == FALSE) {
  cancel_flag = TRUE; goto close;
 }
 <<< Initialize the page counts >>>
       ...
 <<< Warn about any print errors >>>
done:
 <<< Restore the old port >>>
 STATX(ftag);
}

Example 1: Sample LEO code.



Paradigms Past

By Michael Swaine


It seems like a simple question: Who invented the calculator? But questions of paternity are often tricky. According to the U.S. Patent Office, the inventor was a bank clerk in St. Louis named William Seward Burroughs, in 1886. Burroughs, the namesake and ancestor of beat author William S. Burroughs, built both a calculator and a company to sell it (American Arithmometer Company, later Burroughs Adding Machine Co.).

But Burroughs was beaten to the punch (by fully 66 years) by one Charles Xavier Thomas, Thomas of Colmar to his friends, who built the first commercial mass-produced calculator in 1820. It could add, subtract, and multiply, and, if you helped it a little, even divide. It took up most of a desktop, and continued to be sold for 90 years.

Tom had got the idea, though, from a 17th-century invention of Gottfried Wilhelm von Leibniz. Leibniz's Stepped Reckoner was definitely a calculator: It added, subtracted, and did multiplication by repeated addition and shifting. Although Leibniz was an early booster of the binary system, his machine was decimal.
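Repeated addition with shifting is just long multiplication done one digit at a time. A quick sketch, with Python standing in for Leibniz's gears (the function is invented for illustration):

```python
# Stepped-Reckoner-style multiplication (illustrative sketch): for each
# decimal digit of the multiplier, add the multiplicand that many times,
# shifted one decimal place per digit position -- just as the machine
# cranked out one digit's worth of additions, then shifted the carriage.
def shift_and_add(multiplicand, multiplier):
    total = 0
    shift = 0
    while multiplier > 0:
        digit = multiplier % 10
        for _ in range(digit):                 # repeated addition
            total += multiplicand * 10**shift  # shifted by digit position
        multiplier //= 10                      # next digit of multiplier
        shift += 1
    return total

print(shift_and_add(47, 23))   # 47 added 3 times, then 470 added twice
```

The decimal shift is why the machine had a movable carriage: shifting is free mechanically, and only the per-digit additions cost the operator cranks.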

But Leibniz wasn't first: Three years before he even planned his machine, a Brit named Samuel Morland had built a machine for toting up (decidedly nondecimal) British currency.

Was Morland the first? Nope. Both he and Leibniz had merely expanded on an invention of Blaise Pascal. The Pascaline, built in 1642 for Pascal's tax-collector dad, was (aha!) the first digital adding machine (and the first digital business machine). Pascal sold about a dozen of them. But the story doesn't end there. Still earlier, in 1624, a fellow named Schickard had built a Calculating Clock that could add and subtract. If you really needed to multiply, you could use the slide rule affixed to the front.

And even earlier, sometime in the 1500s, an artist drew some (recently discovered) sketches for a mechanical device that would add and subtract numbers. When a machine was built based on these sketches, it actually worked.

The name of this artist, arguably the true inventor of the calculator, was Leonardo da Vinci. Another Leo.

-- M.S.


