Al is a DDJ contributing editor. He can be contacted at firstname.lastname@example.org.
A while back Michael Swaine took a poke in one of his columns at the "...for Dummies" line of books from IDG Books. Subsequently, DDJ got a letter from IDG's lawyers enjoining Michael from infringing their trademarks merely to denigrate such a prestigious body of work. Michael, now a person of property in Oregon and proprietor of his own business with, no doubt, a trademark or two of his own to protect, has dutifully refrained from showing further disrespect to the property of others, particularly those with deep pockets and huge staffs of snarling lawyers.
Several years before, when it was okay to mention dummies in public, Michael remarked in his column that people who converse with dummies, also known as ventriloquists, are, in his opinion, geeks. I responded in a column of my own that in my youth I was a ventriloquist and did not consider myself a geek until such time as I read Michael's comments. It's embarrassing to find yourself forced out of a closet when you didn't know you were in one.
I once published a cartoon that I drew (Yeah, I used to be a cartoonist, too, Michael, what of it?) in a book named, Welcome to Programming, published by MIS:Press in 1994 and now out of print. The cartoon's caption similarly parodied the "...for Dummies" titles to explain that people who consider themselves dummies don't usually try to learn programming. IDG's lawyers probably did not read that book since they sent no such baleful warnings my way or to MIS:Press.
Too late. Last year IDG Books purchased MIS:Press. They'd have to enjoin themselves.
Now they are thinking about reprinting the book. I wonder what they'll do about the cartoon. If you want to see it and read its caption, you'll have to find a copy of the book's first edition somewhere and turn to page xxiv in the front matter. (That's page 24 for those of you who haven't read "Roman Numerals for..." oh, never mind.) Or you could get a copy of The Al Stevens Cram Course on C/C++ CD-ROM, published by DDJ, which includes the text and illustrations of that and two of my other MIS:Press books. DDJ has snarling lawyers, too, and they won't let me reproduce the cartoon and caption here because of the precedent that Michael's earlier indiscretion established.
Inasmuch as I am now one of IDG's favorite authors, a legacy of their acquisition and author of a best-seller or two, I wonder if they'll object to the title of this column, which includes a picture of my Jerry Mahoney collection, a vestige of former times. (The big guy in the middle is a home-built figure named Jack. That's right, Michael, I even build and dress them. So what's it to ya?)
By the way, I have to find a better place for my collection. If you think publishers' lawyers are grouchy, you should see the snarl on Judy's face as she tells me that she wants her living room sofa back. Right now.
Anyway, how can anyone complain if, on my 11th anniversary at DDJ, I dedicate this column to my collection of -- er -- ventriloquial figures? If anyone does object, I'll simply advise them that they have leaped to conclusions, rushed to judgment, and totally misinterpreted my meaning. The title of this column is obviously, "C++ for Publishers and Lawyers."
Speaking of C++ for lawyers, a debate is underway among the language lawyers and standardizers on the comp.std.c++ newsgroup about a C++ language feature that quietly slipped into the standard definition with no one noticing it. Well, perhaps the committee's deliberations weren't so quiet because they seem to be the ones who disagree the loudest, but the change came as a surprise to some of us on the outside. What surprised me wasn't the new behavior but what it replaced. Consider the program in Example 1(a). Does the code look correct? Well, it does to me on the surface, and it is indeed correct, according to the new rules of the standard. The program in the example instantiates a pointer to const Foo and initializes the pointer with the address of a Foo object built on the heap by the new operator. Because of the const type qualifier, the program may not modify the object through the pointer except by casting away the pointer's constness or by changing mutable members. The program may, however, delete the object. This program compiles with the newer, nearly compliant compilers, such as egcs 1.1, and is well-formed according to the C++ standard definition.
What surprised me was that this code used to be incorrect. The program fails to compile under older compilers such as VC++ 5.0, which reports:
error C2664: 'delete' : cannot convert parameter 1 from 'const int *' to 'void *'
The ARM says, "A destructor can be invoked for a const or volatile object," which at first reading seems to support the correctness of Example 1(a) since the delete operator calls the class destructor. But then it says, "A pointer to constant cannot be deleted. The reason is that the deletion in principle modifies the object pointed to...The purpose of prohibiting the deletion of pointers to constants is to allow programmers to rely on const objects being immutable...Without the restriction against deleting pointers to const objects the delete operator would have the effect of implicitly removing the const attribute from a pointer. No operator should have that property."
I not only didn't know that, I never gave it the first thought. On the rare occasion when a compiler complained about my violation of that rule, I shrugged it off as yet another anomaly in the compiler or the language and removed whatever const qualifiers were in the way so I could get on with the task at hand -- const can do that to you.
Somewhere during the standardization process, they changed the rule. The change had its advocates and its opposition, none of whom seem to have changed their opinions, and both sides make compelling arguments that the other side deems spurious. I'll paraphrase some of the arguments that appeared on the newsgroup.
Those who support the current and new behavior argue that constness begins when the constructor completes processing and ends just before the destructor begins processing. They also argue that inasmuch as Example 1(b) has always been correct code and invokes the destructor for the const object when the function exits, Example 1(a) should be correct also. It is not easy to see the difference between the effects of the two examples. Supporters of the new behavior go on to say that changing an object's value is unrelated to causing the object to cease to exist and that if delete were not permitted, memory leaks would be the result.
Those who support the earlier behavior and oppose the change argue that a destructor, which by definition cannot be const, changes the state of the object and, consequently, should not be permitted to execute for a const object when invoked by the delete operator. One debater said, "The result [of the new rule] is that a whole class of preventable errors are not caught by a standard-conforming compiler..." Example 1(c) illustrates this concern. The caller of bar fully expects that bar cannot change the caller's Foo object through the pointer. Yet bar now has the freedom to delete the object. This is not good; the caller's guarantee of constness has been seriously compromised by the new rule. If the caller tries to make further use of the object, the program probably crashes. Someone pointed out that the guarantee really never was there because the older compilers permit explicit calls to the destructor through pointers to const as in Example 1(d).
They address the question of the difference between Examples 1(a) and 1(b) by observing that the compiler destroys the object in Example 1(b), whereas the programmer destroys the object in Example 1(a). "There are...many things that are allowed to the compiler and not allowed to the user," said one participant. Another participant pointed out that, "A local const variable is being destroyed with full knowledge of its creator, in the same context where it was created, and where it is lexically impossible to try to continue using it." This argument, compelling at first, weakens when you realize that the language has always permitted a function that instantiates a local const variable to pass the variable's address to another function that may then call the object's destructor as Example 1(d) shows.
Examples 1(c) and 1(d) are, in my opinion, really bad code. Example 1(c) assumes that the object was instantiated with the new operator, which might be wrong. The explicit call to the destructor in Example 1(d) does not do what delete does, because it does not return the memory to the heap. If you write code like that, you deserve what you get. If you are the unlucky caller of those functions as written by someone else, however, you do not deserve what they might do to you. Sometimes I write code wherein a function (I'm not discussing destructors now) deletes a pointer that the function did not initialize with new. Rarely if ever is that pointer passed to the deleting function. It is usually a data member.
Someone in the group asked if there was any empirical evidence that the old rule caused problems that mandated the change. This question reflects once again a larger concern that I, in my role as founder, president, sole member, and chief spokesperson of the C++ Dead Horse Molester's Society, continue to express. That concern has to do with the wisdom of using the standardization process to invent language features with which no one has gained any experience. In this particular case, I have no complaints. The new rule is what I always thought the rule was without any thought as to what it should be. I used to be wrong. Now I am right. How nice. Whether the language is wrong or right with respect to deleting pointers to const is likely to be argued for years to come.
Java programmers must be laughing themselves to sleep at night.
Speaking of being wrong, in a recent column, I spoke to the issue of wild card expansion on the MS-DOS command line, observing that the MS-DOS command.com shell provides none and that programs written for UNIX, which enjoy wild card expansion through the shell, do not work properly when ported to MS-DOS.
No column in all these years of my tenure has generated as much interest. You dear readers often enjoy disagreeing with my opinions or offering up a few of your own, but nothing stirs you like me being dead bang wrong. To date, I have over half a hundred e-mail messages that all say essentially the same thing. Most MS-DOS compilers include a relocatable object file named setargv.obj, wildargs.obj, or some such that, when linked with a program, provides wild card expansion through the compiler's startup code. All you have to do is link with the object file, and the argc and argv parameters automatically reflect properly expanded wild card arguments rather than the raw wild card specifications that the user types on the command line.
Some of your messages to me were quite understanding, realizing that no one can know everything and that mistakes do happen. But others were less forgiving, insisting that people like me who set themselves up as experts at least get their facts right before going public with them.
Fortunately, it happened in a magazine, a mostly disposable medium once its value has been extracted, and not in a more enduring vehicle, such as a book that might languish on shelves for years waiting for future readers to discover my lapse in journalistic judgment. But wait. There's that Dr. Dobb's Journal CD-ROM with all the editorial content. They never wear out. I'm doomed to leave my legacy of ignorance and stupidity for generations to follow. Oh, the shame.
Besides the correction to my bad information, not everyone agrees that the UNIX behavior is all that desirable, and several readers offered additional insight into the issue.
From Paul Sobolik,
I found out about UNIX's alternate behavior when I was working with Perl, the Win32 implementation of which emulates UNIX's expansion of command line file specs. The trouble was, I wanted my little program to recurse subdirectories and process the files found there as well as those in the current directory, and automatically expanded wildcards were unwanted. I eventually discovered that expansion could be suppressed by putting the argument in quotes, but that made the program a little bit harder to remember how to use. (Another common feature of UNIX, as I understand it.)
From Greg Miller,
...you seem to have missed one of the biggest reasons why...only UNIX shells do automatic expansion: It's simply a less powerful approach. Consider a common archiver syntax:
<archiver> d *.ext *.txt
On DOS archivers, this normally deletes all files with the extension ".txt" from all archives with the extension ".ext" in the current directory. Doing the same in UNIX requires that the user manually disable globbing using quotation marks. As another example, with 4DOS and 4NT, I can copy *.c to *.bak to make quick-and-dirty backups. Under UNIX, the "cp" command simply can't do this.
From Andy Glew,
UNIX shell's argv wildcard expansion is a half-hearted feature -- it works maybe 80% of the time, but not all of the time. Certainly, I have learned the hard way that you should never use wildcard expansion in shell scripts, Perl scripts that invoke the shell, and frequently you should not use it in Makefiles. It can be really annoying when such tools break just when you really need them to be able to work, to clean up a large mess of temporary files.
From Robert Dibley,
I have written many utilities which need to allow a very flexible approach to my development directory structures, and one of the best ways to handle this seems to be a recursive directory tree scanning approach combined with full wildcard specification. This has allowed me to do such things as process all the town models in all the different data-sets I'm using, from one single command.
...the real culprits are the ones who thought they should modify what you typed without you asking.
From Vadim Slesarev,
I was shocked by this new knowledge... I...never expected [the] UNIX shell [to] behave so UGLY.
From Tarjei Tjøstheim Jensen,
UNIX does not cope with unlimited expansion of file names. The limit may vary from version to version, but there is a finite limit. If my memory serves me right, the limit is around 10 KB. Once the shell command-line expansion reaches that limit, it will barf (print an error message and refuse to run the program you specified). So if there are a lot of files involved, there is a good chance that non-UNIX programs are more able to cope with that situation.
From Julius Caesar,
You make me think about a time when the world was young and I was a little bit younger (as you and K&R) and 48K was really a lot (my first computer had 4K) and data got saved on tape in Kansas City Standard.
Julius, was it really that long ago?
From Chris Jennings,
...MS-DOS offers very little flexibility with its wildcards. For example, there is no standard way that I know of to deactivate one with an escape character. Presumably this is a side effect of the fact that in the original design escaping a wildcard wasn't required since the programmer got to pick which arguments he or she expanded (if any). Chicken and egg.
From John Wendel, who quoted a common tagline found in Usenet newsgroups,
Those who don't understand UNIX are doomed to recreate it, often poorly.
Finally, you'll remember that I began that particular column with a story about my grandson Landon and his friends yelling "Doorknob" whenever someone did "a bad thing." I didn't quite know what it meant until Landon explained it to me.
This from Scott Seely,
When I was growing up, my friends and I had a similar game. Whenever someone "left one," the first one to smell it and figure out who let it go shouted "Doorknob." If the offender didn't touch a doorknob before he was caught, he got noogies from the first one to catch him.
Now if someone can just tell me what a "noogie" is...
Copyright © 1999, Dr. Dobb's Journal