Michael is editor-at-large for DDJ. He can be contacted at [email protected]
Some sort of transformation is underway in how we interact with the Internet. Where it will lead, I don't think anyone knows. But could it be that one of the defining metaphors of the Internet is facing metaphorical death?
In the book Cyberspace: First Steps, editor Michael Benedikt began by defining cyberspace. Ten different ways. In retrospect, one wonders if he should have seen that the concept was shaky. Maybe he did. "Cyberspace as described in this book," Benedikt immediately went on to say, "does not exist."
Fair enough; after all, the book was subtitled "First Steps." But the book came out in 1992. Thirteen years later, we still haven't arrived at the cyberspace so abundantly defined in that book, and it may be time to ask whether we're even headed in its general direction.
The Metaphor of Place
A common theme in those 10 definitions of cyberspace, and arguably the sine qua non of cyberspace, is location. Cyberspace is a place, or a virtual geography that embeds notional places.
Presaged by Roy Trubshaw (MUDs), created and reified by Tim Berners-Lee and Marc Andreessen (the WWW and the first visual web browser), its future direction predicted by Mark Pesce and many others (VRML), cyberspace has been variously imagined as a network of places to go and things to see, a web of semistable locations in a rich visual environment, a virtual space that eliminates distance and moves inexorably toward a less-dystopian version of the communal hallucination scripted by William Gibson.
It seems to me that in recent years, we have been moving away from this metaphor of cyberspace, and specifically from the metaphor of place. I point to several trends in support of this claim.
The present-day weblog is the descendant of the personal web site of a decade ago, but it differs from its ancestor in several significant ways. Most blogs are still pretty egocentric, but they are usually more dynamic and newsy, more rooted in the events of the day or other current developments, than personal web sites were. Blogs comment on the passing scene rather than catalog the owner's obsessions. Blogs are sometimes collaborative, and even when they aren't, they tend to be more responsive to input, and as a result more responsible, than personal web sites.
Blogs are also less style-conscious. They are generally less interested in decor than in information. It seems to me significant that bloggers put a large amount of effort into the blog's content and very little into its skin, usually using an off-the-rack blogging template.
I'm not trying to define or delimit blogs here. I realize that there are a number of people who think that's their job, and they're welcome to it. I'm not arguing with those who question Arianna Huffington's celeb blog as an oxymoron. ("Arianna Huffington, the columnist and onetime candidate for governor of California, is about to move blogging from the realm of the anonymous individual to the realm of the celebrity collective... Among those signed up to contribute [to her blog] are Walter Cronkite, David Mamet, Nora Ephron, Warren Beatty, James Fallows, Vernon E. Jordan Jr., Maggie Gyllenhaal..." The New York Times.) Nor with those who say that Kos is too much of a celebrity to be a blogger now. Nor with those who say that Matt Drudge isn't really a blogger (Drudge is one who says this). Maybe Arianna will end up reinventing the Algonquin Round Table. Or the Well. Or Crossfire. This stuff is open-ended and keeps redefining itself.
Besides, any time you write about blogs in any other medium, some blogger will say that you're wrong, and that even if you aren't, he read it on a blog first.
But it does seem that there is an emerging message that even someone writing in a print publication can read, which is that blogging is about information more than about eye candy or personal promotion. And furthermore, that it's about information as a flow that one sticks one's hand into and stirs, rather than information as the output of one's personal (or corporate) infofactory.
Blogs are still web sites, still arguably places, but that's not what matters about them. The Web's unit of meaningful granularity is the site, but blogging's is the post.
RSS feeds move us out even farther from the metaphor of place. RSS feeds are exclusively concerned with information. RSS invites us to think in terms of a sort of information component architecture. You want to build a blog commenting on the events of the day? Use these handy factoid plug-ins.
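The "information component" idea is easy to see in the RSS format itself: a feed is nothing but a channel of items, each a small bundle of title, link, and description that any client can pull apart and remix. Here is a minimal sketch in Python using only the standard library; the feed content is a made-up example, not a real one.

```python
# A hypothetical RSS 2.0 feed: the unit of granularity is the <item>,
# not the site. (Example data, not a real feed.)
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item>
      <title>First post</title>
      <link>http://example.com/1</link>
      <description>Commentary on the passing scene.</description>
    </item>
    <item>
      <title>Second post</title>
      <link>http://example.com/2</link>
      <description>More of the same.</description>
    </item>
  </channel>
</rss>"""

def items(feed_xml):
    """Return each item as a (title, link) pair."""
    root = ET.fromstring(feed_xml)
    return [(i.findtext("title"), i.findtext("link"))
            for i in root.iter("item")]

for title, link in items(SAMPLE_FEED):
    print(title, "->", link)
```

The point of the sketch is that nothing here cares where the items live; an aggregator can interleave items from a dozen feeds as easily as from one, which is exactly the shift away from "place" described above.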
There is a different feeling and a different set of values that comes along with RSS. It's a difference you can examine in your own mind if you've ever explored a MUD and now publish an RSS feed. These are not different answers to the same question. RSS is simply not concerned with the things that MUDs were (are?) concerned with.
That said, RSS and blogs still reflect many of the values of earlier Internet modalities: democratization, decentralization, implementation of a gift economy, and so on.
Another big trend involves web services. Is it too glib to call web services RSS for robots? Probably. But at least in terms of the claim I'm making about the direction of the Internet, it seems that the web-services story is similar to the RSS story: web services emphasize capability rather than locale.
"The average IQ loss [due to e-mail] was measured at 10 points, more than double the four point mean fall found in studies of cannabis users."
Martin Wainwright, The Guardian
"If watching TV really makes you smarter, as Steven Johnson argued in an article in yesterday's New York Times Magazine (in an excerpt from his forthcoming book), then I guess I need to watch a lot more of it, because try as I might, I could make no sense of Johnson's piece."
Dana Stevens, Slate
E-mail makes you dumber and TV makes you smarter, according to recent news items. Pardon me if I doubt both claims.
E-mail is the oldest Internet modality, but I think that it's also part of what defines where the Internet is going today. That The Guardian runs an article saying that e-mail lowers your IQ by 10 points may or may not say something about The Guardian's ability to exercise sober judgment in the face of what appears to be a silly claim by some headline-seeking researchers, but it does say something about how centrally important e-mail is. Every complaint you hear from a taxi driver or grocery clerk about the amount of junk mail they get just shows to what extent e-mail has insinuated itself into our society. That they complain about its drawbacks is less important than the simple fact that everybody is using it.
But for such a hot technology, e-mail doesn't look very hot. After all these years, e-mail hasn't really embraced the idea that you might want to put pictures in your messages: Images are typically handled as "attachments," which is to say, less important, poorly integrated afterthoughts. Contrary to most technological trends, e-mail shows little sign of wanting to be anything but the stodgy drab thing it is. I think that this just means that this is one more area where the information is what's important.
It's the Information, Stupid
Weigh all these trends and it begins to feel as though Internet technologies are not following the cyberspace real estate model (location, location, location). The future direction of the net seems to be more and more about information, delivered in various ways and in whatever unit size makes the most sense for that information.
Of course, one might claim that this was Tim Berners-Lee's vision for the Web all along.
Death of the PC?
Delivered in various ways, I said, and it sure is. The desktop computer is ceding more and more mindspace and marketplace to devices that can be carried in one hand: phones, music players, PDAs. Although these handhelds are still often single-function devices, the trend is toward more functionality, and the technological imperative will probably ensure that handheld devices continue to take on more and more of the tasks we use computers for. What will it mean on that morning when we wake up to a USA Today graphic telling us that handhelds have won: that more information is being processed, or more time is being spent, with handheld devices than with PCs?
Won't we suddenly realize that we've paid far too little attention to the user interface issues associated with these devices? Not "no attention," surely, but too little? Because the amount of thinking directed toward handheld UIs seems minuscule compared to the comparable work on PC UIs, and it's surely a harder problem.
Assuming that downsizing Windows isn't the ultimate answer for handheld device user interface design, could it be that there simply is no digital solution? That is, no truly satisfactory user interface for handheld computers that depends on the use of your 10 (or factoring in toes, 20) digits? Are we knocking on a door that only speech can unlock?
And if so, will unlocking that door unleash all the drawbacks of speech? If we, to quote Tolkien, "speak, friend, and enter," will we encounter a Balrog?
Last month's column was a little bit death-obsessed, and this one seems to share the dark motif. The following items were written for last month's column, but fit here just as well.
Legacy Code and Coding Legacy
Another software company packed with mathematics Ph.D.s, and about which a question of legacy has also arisen, is Wolfram Research, maker of Mathematica.
The issue came to my attention in the form of an e-mail from Jeroen van Dael of The Netherlands. Van Dael very respectfully points out that, while Stephen Wolfram's achievements are great and Wolfram's mathematical software Mathematica is very powerful, the general impression that Wolfram more or less invented software for doing symbolic mathematics is incorrect.
Now I was aware that Wolfram has been taken to task more than once for not granting proper credit to his intellectual antecedents. To my way of thinking, he has a way of leaving his readers with the impression that he invented big chunks of mathematics, computer science, physics, and other disciplines, only mentioning predecessors briefly, sort of in the spirit of a Christian theologian acknowledging that earlier religions came up with the virgin birth and resurrection ideas first but dismissing this as an unimportant historical detail.
Van Dael recounts how, in 1979, Wolfram visited Dr. Martinus J.G. Veltman at the University of Utrecht to learn about Veltman's program Schoonschip, which was one of the first programs for doing computer algebra. Veltman generously gave of his time, not to mention his source code, and Wolfram went back to Illinois and wrote his first algebra program, SMP, the predecessor to Mathematica, announcing it in 1981. Without much acknowledgment to Veltman.
So I did a little research.
Actually, I didn't need research to tell me that neither SMP nor Schoonschip was the first symbolic math program; there was at least Macsyma, which grew out of Project MAC back in the late '60s and early '70s at MIT. I remember Macsyma, which today lives on as an open-source project named "Maxima." This name destroys the original acronym (Project MAC's SYmbolic MAnipulator), but it still has an appropriately mathematical sound.
One of the more curious artifacts I unearthed in researching the Mathematica-Schoonschip matter was a speech given 10 years ago at the First International IMACS Conference on Applications of Computer Algebra by one Michael Wester (http://www.math.unm.edu/~wester/papers/ACApoem/). In it, Wester rattles off all the computer algebra programs he can think of (and he can think of a lot of them) in verse. His list includes:
Macsyma, Vaxima, MuPAD, Reduce, Axiom, Derive, Maple, Mathematica, FORM, PARI, Schoonschip, Theorist, FELIX, Weyl, Aljabr, Paramacs, FLAC, GNU-calc, SENAC, STENSOR, SHEEP, SIMATH, Magma, Macaulay, KANT, MuMath, Ricci, AMP, JACAL, Galois, Mercury, Calculus, LIE, UBASIC, Number, Cayley, XPL, and PFSA.
But that didn't help me with the question of the relationship between Mathematica and Schoonschip.
I checked and found that the Mathematica documentation states clearly that Wolfram started work on SMP (which it calls "the first modern computer algebra system") in 1979, the same year he acquired the source code to Schoonschip. And that documentation was written by Wolfram himself.
Wolfram did in fact mention both Schoonschip and Macsyma in announcing SMP, in the following terms:
"Of the existing symbolic manipulation programs, SCHOONSCHIP was the only one designed to handle the very large expressions encountered, and MACSYMA the only one of any generality."
But nothing in that acknowledges any influence on his own work. This is, I'm afraid, typical of Wolfram. It's clear that Mathematica today bears little resemblance to the fast but limited program that Veltman developed all those years ago. And Mathematica differs from SMP for a very practical reason: Wolfram wanted commercial rights to SMP and couldn't get them. But did Wolfram base Mathematica on Schoonschip? And if so, did he properly acknowledge that debt?
I found several items that bear on the question, but the most interesting was this item, published on the Wolfram.com web site on the occasion of Veltman's being awarded the Nobel Prize in Physics in 1999 (the prize was for physics, but the honored work was based on computer algebra):
"In 1979, when Stephen Wolfram began designing SMP, the first modern computer algebra program and predecessor to Mathematica, he visited with Dr. Veltman to learn more about Veltman's experience in creating SCHOONSCHIP. According to Dr. Veltman, 'Later, on the basis of this program [SCHOONSCHIP], the commercially successful software program Mathematica was developed by Wolfram...'"
So whether or not Wolfram properly acknowledges Veltman, Wolfram.com does so with grace and magnanimity. May all our web sites represent us so well.
Death of Consensus?
I'm puzzling over the significance of a device called FOXBlocker. Plugged into the back of your TV, this coaxial cable filter allows you to block reception of FOX News (make that FOX "news"), much as some parents block Adult channels (make that "Adult" channels). Now a few disclaimers are in order: You can probably block any channel you like (make that "dislike") without spending $9, the device may degrade reception on adjacent channels, and it doesn't work on digital cable or satellite. But if you buy one, the company will e-mail FOX News advertisers for you, to "send them a message."
(Note: FOXBlocker is different from TVBGONE, a device that turns off other people's televisions. Technology serves all shades of the political spectrum.)
But let's say that FOXBlocker catches the public imagination. Will it inspire a rash of related products, like a PBSblocker device to protect the unwary from being psychically scarred by an accidental viewing of Antiques Roadshow? Will feminists demand a SpikeTVBlocker and Spike viewers lust after a LifetimeChannelBlocker? With a careful deployment of blocking devices, we could all filter out any viewpoints we don't agree with, although rather than dangle a dozen dongles behind our screens, we might skip the Blockers at that point and opt for Allowers that only let friendly channels pass our portals.
What I'm puzzling over is whether FOXBlocker is a symptom of (and a potential contributor to) the decline of the consensus worldview, and if so, how I should feel about that. Will we one day all have our own carefully selected collection of TV news sources that reinforce the worldviews of the communities with which we identify? If so, is FOXBlocker just an extension of a process that began with cable television's erosion of the mindshare of broadcast TV?
And then I wonder, is this really something new, or is it a return to what once passed for normalcy? Did the three-network model of broadcast TV create a temporary artificial beige consensus, putting us all in a persistent vegetative state, waiting for someone to plug the feeding tube? Are cable and satellite TV and FOXBlocker simply reintroducing the print publication model, in which it doesn't get into your home unless you choose to bring it in? Or, looking forward rather than backward for a precedent, is TV becoming bloggified?