Programmer's Bookshelf Newsletter - April 2004


DDJ Programmer's Bookshelf

Book Reviews


Perl Cookbook

review by Russell J.T. Dyer

As a Perl programmer, you probably own or have access to O'Reilly's Perl Cookbook. It's an essential book for developing advanced Perl skills. Thanks to the stability of Perl, this is one computer book that doesn't become obsolete very quickly. It was first published in the summer of 1998 and is still useful today. If your copy of the cookbook, however, is anything like some that I've seen belonging to senior Perlists—filled with Post-it notes, notes written in the margins, highlighting, and dog-eared pages throughout, and looking more than a little worn—you're probably due for a replacement. Fortunately, you now have an additional excuse to buy a fresh copy of the Perl Cookbook, since the second edition is out.

The Perl Cookbook is filled with dozens of common and somewhat uncommon dilemmas that one might encounter with Perl in daily life. Scenarios are laid out in clear language, then resolved with excellent explanations. Sometimes the solutions (or recipes) are straightforward and limited. Often, though, the authors give more than one solution, depending on what they imagine the reader may be seeking or may need. The result is a deeper understanding by way of more examples, and a greater likelihood that the nuances of your particular problem are addressed.

The first edition of the Perl Cookbook is based on Perl 5.004.04. Because of the stability and backward compatibility of Perl, just about all of the first edition still applies. The authors have updated the text in the second edition for Perl 5.8.1. Many of the changes to old recipes are based on the newer version of Perl, but many others were made to give greater clarity through expanded discussions, and to give the reader more examples, since there's always more than one way to solve a problem in Perl. The 200 additional pages in the new edition are composed of changes to more than 100 recipes, as well as 80 new recipes. With that many changes, I can't list them all here, but I will highlight several of them briefly.

Math fans will be pleased to find that a new recipe on named Unicode characters (1.5) has been added, as well as a recipe on normalizing similar Unicode characters (1.9) and treating them as octets (1.10). There's a new recipe to format text as title case (1.14). The recipes for trimming blank spaces from the ends of strings (1.19) and for parsing comma-separated data (1.20) were expanded. Chapter 2 on numbers has some recipes that have been reworded and reworked as well, including 2.2 on rounding floating-point numbers and 2.3 on comparing them. The recipes on converting binary, octal, and hexadecimal numbers have been rewritten and combined into one lengthy recipe (2.15). There are new recipes in Chapter 8 on dealing with a file's contents: treating a file as an array (8.18); setting the default I/O layers (8.19); converting Microsoft text files into Unicode (8.21); comparing the contents of files (8.22); treating strings as files (8.23); and dealing with flat-file indexes (8.27). In Chapter 9 on directories, a recipe has been included on how to handle symbolic file permissions instead of their octal values (9.11). In Chapter 10 on subroutines, a recipe has been introduced for creating a switch statement using the Switch module with the case command--a very handy way to consolidate multiple if and elsif statements into a clean format.
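For readers who haven't tried that recipe, a minimal sketch of the Switch-module style described above might look like the following; the variable and the cases are invented here for illustration and are not taken from the book.

use Switch;    # bundled with Perl 5.8; adds switch/case via a source filter

my $command = 'list';    # hypothetical input value

switch ($command) {
    case 'list'   { print "listing files\n" }
    case 'delete' { print "deleting files\n" }
    case /^h/     { print "help requested\n" }    # case can also match against a regex
    else          { print "unknown command\n" }
}

Each case block replaces what would otherwise be another elsif branch, which is exactly the cleanup the recipe is after.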

Chapter 11 on references has a new recipe for dealing with memory problems common in self-referential data structures (11.15). There's also a new recipe on using program outlines (11.16). Chapter 12 on packages, libraries, and modules has a new recipe that provides a solution for making a function private (12.5). There's another one on customizing warnings in your own Perl module (12.15). And Chapter 13 on objects has a new recipe using the dclone() function to give the user a copy method for a class.
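As a rough idea of what such a copy method looks like, here is my own minimal sketch rather than the book's recipe; the class and its fields are invented. Storable's dclone() returns a deep copy of a data structure, blessing included.

use Storable qw(dclone);

package Counter;    # hypothetical class for illustration

sub new {
    my ($class, %args) = @_;
    return bless { count => 0, %args }, $class;
}

# A copy method built on dclone(): nested references are duplicated,
# so changes to the copy never leak back into the original object.
sub copy {
    my $self = shift;
    return dclone($self);
}

package main;

my $orig = Counter->new(tags => ['x', 'y']);
my $dup  = $orig->copy;
push @{ $dup->{tags} }, 'z';    # $orig->{tags} still holds only 'x' and 'y'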

In Chapter 14 on database accessing, there are several new recipes: escaping embedded quotes (14.10); handling database errors (14.11); setting up database queries within a loop statement (14.12); determining the number of rows returned by a database query (14.14); and displaying data retrieved one page at a time (14.16). Chapter 15 is on user interfaces, or rather, interactivity. There are several new recipes on this topic as well: graphing data (15.18); creating thumbnails of images (15.19); and adding text to an image (15.20).

Chapter 17 on sockets has an additional recipe on handling multiple clients from within a process using an operating system's threads (17.14). There's another on managing multiple inputs from unpredictable sources (17.19). Chapter 20 on web automation has many new recipes: one using cookies (20.14), another two on retrieving password-protected pages using LWP (20.15) and https pages (20.16), and two particularly good ones on parsing HTML (20.18) and on extracting data from an HTML table (20.19).

Finally, two new chapters have been added. Chapter 21 on mod_perl contains 17 recipes, from authentication to dealing with cookies to redirection. It has recipes on Apache logs, migrating from CGI to mod_perl, and working with the HTML::Mason Perl module. Chapter 22 on XML, another new chapter, includes a quick introduction to XML, as well as a few lengthy recipes on parsing XML and validating it gracefully. It also provides advice on searching an XML tree.

In summary, the authors and editors of the Perl Cookbook have managed to retain what is good and of value in the first edition. They've fine-tuned the existing recipes for the latest version of Perl and added many more to keep up with the developing needs of Perl programmers. This was quite an undertaking on their part, and they've succeeded nicely.

Russell is a Perl programmer, MySQL developer, and web designer living and working on a consulting basis in New Orleans. He is also an adjunct instructor at a local college where he teaches Linux and other open-source software. He can be reached at [email protected].

Perl Cookbook, Second Edition
Tom Christiansen and Nathan Torkington
O'Reilly & Associates, 2003
1000 pp., $49.95
ISBN 0596003137

SQL Server Performance and Tuning

review by Douglas J. Reilly

One of the more interesting things about Microsoft's SQL Server is the volume of information available to help developers and administrators make it work better. That there continue to be new books focusing on SQL Server 2000—a three-year-old version whose successor, known as "Yukon," has already been introduced at Microsoft's Professional Developers Conference—is a testament to the community that exists around it.

The first book I examine is actually a somewhat older title, copyrighted 2001, but relatively new to me. Microsoft SQL Server 2000 Performance Optimization and Tuning Handbook, by Ken England, is a replacement for the author's previous The SQL Server 6.5 Performance Optimization and Tuning Handbook--one of the first books I read when moving to SQL Server 6.5 years ago.

The book begins by covering the basics of how SQL Server organizes databases. England covers both the underlying details and the tools that let you explore those details. While it is possible to do almost everything you need to do with Query Analyzer and some SQL Script, the reality is that most users use Enterprise Manager to do much of their maintenance.

Next come three chapters on indexing. These chapters go beyond the basics into what makes a good index and when to use clustered versus nonclustered indexes. Just as important (and something missing from some similar books) is the examination of the impact of indexing on the performance of inserts and updates.

The meat of the book is the chapter on query optimization, which alone makes up more than 100 pages of the 370-page book. Once the basics of query optimization are covered, the chapter moves on to the client tools, especially the graphical Showplan offered in Query Analyzer.

The balance of the chapters covers SQL Server's use of memory, disk, and locks, followed by a chapter devoted to the Profiler--the most underused yet most useful of SQL Server's tools. This chapter alone is worth the cost of the book if you work with code that you do not fully understand. Seeing what the program is actually doing can help tremendously when trying to track down a difficult performance problem. For instance, I have used the Profiler to see why Crystal Reports was performing terribly when using OLE DB, yet performing wonderfully using the same SQL code against an ODBC data source. Interesting indeed.

The Guru's Guide to SQL Server Architecture and Internals by Ken Henderson is a new book, copyright 2004. You might wonder if a new book on SQL Server 2000 makes sense, given the new version of SQL Server waiting in the wings. In the case of this book, yes, it certainly does.

This is a huge book (about 1000 pages), and the level of detail and the breadth of Henderson's knowledge of SQL Server and the Win32 world in general are breathtaking. The first half of the book covers things that no other SQL Server book I have seen covers. In fact, while the book focuses on SQL Server and how it interacts with the Win32 environment, I would argue that the first 400 pages or so would serve as a complete introduction to server-side development.

About 300 pages into the book, I was questioning how important all of this stuff was. Processes and threads, memory handling, I/O (including a good discussion of I/O completion ports and their impact on server scalability), networking, and COM are all covered in some detail. It only took me about 30 pages into Part II to realize that the initial portion of the book was required for the level of detail offered in the balance of the book. Henderson was indeed smart to include what he did in the first section. If you have ever had any question about what SQL Server is doing, you will be able to figure it out after reading this book.

For instance, if you have ever used SQL Profiler on a program and didn't see a line of SQL that you're certain the program emits, it could be because the Profiler specifically has code that filters out any call to sp_password, presumably to ensure that the new password won't be sniffable by unscrupulous administrators. As luck would have it, the code that does this is primitive, so even adding --sp_password (that is, the literal sp_password in a comment) causes the line to be missing from the SQL Profiler's output. This is certainly not the most useful detail in the book, but it does illustrate the level of detail.
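To make that quirk concrete, here is a small, admittedly contrived sketch in Perl; the DSN, credentials, and table are invented, and the hiding behavior is exactly the filtering Henderson describes, not something the snippet itself adds.

use DBI;

# Hypothetical connection; any SQL Server data source would do.
my $dbh = DBI->connect('dbi:ODBC:DSN=MyServer', 'user', 'secret',
                       { RaiseError => 1 });

# Because the filter looks for the literal text "sp_password", even a
# trailing comment keeps this statement out of the Profiler's output.
$dbh->do(q{SELECT OrderID FROM Orders -- sp_password});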

If you can buy only one of these books, which is it? If you are just interested in SQL Server from a somewhat naive programmer's standpoint, you might be happier with Microsoft SQL Server 2000 Performance Optimization and Tuning Handbook. This is a more traditional book covering SQL Server performance issues, and it does a good job at its intended task.

On the other hand, The Guru's Guide to SQL Server Architecture and Internals gives you a more detailed view of what SQL Server does and how it does it. There is also lots of code in the book, and a great deal (especially Part I) is in C++. Henderson's response to possible complaints concerning his use of C++ is, essentially, "get over it." That's reasonable. You don't need to understand every line of code, and most programmers familiar with any language can follow along, given his excellent explanations. And best of all, the code is included on a CD, making it easy to follow along in your favorite editor.

Douglas is president of Access Microsystems and can be contacted at [email protected].

Microsoft SQL Server 2000 Performance Optimization and Tuning Handbook
Ken England
Digital Press, 2001
370 pp., $47.95
ISBN 1555582419

The Guru's Guide to SQL Server Architecture and Internals
Ken Henderson
Addison-Wesley, 2004
1040 pp., $54.99
ISBN 0201700476

Book Excerpt: How to Fail at Software Development

This article is an excerpt from the book How to Fail at Software Development by Arthur Griffith. The book is available at your local bookstore and online through Amazon and Barnes & Noble. The web site for the book is http://www.anchorpointbooks.com/ and you can reach the author through [email protected]. Arthur Griffith worked as a programmer and a project manager for twenty-five years before becoming a writer in 1997. His varied software experience includes embedded systems, telephony, insurance company data processing, and nuclear power plant construction. As a compiler writer, he has implemented four languages. Among his books are GCC: The Complete Reference and COBOL for Dummies.

How to Fail at Software Development
Chapter 12

It is possible to spend so much time planning and working on preliminaries and prerequisites that the actual work never gets done. This sort of activity can be justified because it masquerades as actual work.

For example, everybody knows that there must be documentation written before you can begin writing software, so you first need to decide exactly what documents are to be written and what they are to contain. Before you can do that, decisions need to be made about who is going to write them, so first it is necessary to staff the committee to review word processors. But first...and so on. All of this effort can go on for days and days and you haven't even started designing yet. Handled in a properly bureaucratic way, this sort of thing can delay the actual start indefinitely.

For those who don't understand the difference between productive work and bottle washing, let me provide some examples.

* Discussing which is the better of two text editors is not developing software.
* Exploring possible options for archiving source files is not developing software.
* Producing metrics measurements of programmer productivity is not developing software.
* Making changes to the requirements is not developing software.
* Discussing who is responsible for what is not developing software.
* Installing a new compiler is not developing software.
* Holding a meeting is not developing software.
* Changing the names of files and directories is not developing software.
* Forgetting your password again is not developing software.

Levels of Levels

Some people have a bureaucratic mentality. This is apparently a genetic mental condition, much like being a sports fan or believing what a politician says. It manifests itself in the belief that once some sort of multi-faceted organization is in place, something good will come out of it. If this kind of mentality takes hold in your project, it is quite likely you will create an organization that does nothing other than fulfill its own internal requirements. And almost none of those internal requirements have to do with software development.

Studying this kind of organization is difficult because watching it do what it does is similar to watching a recently sliced apple turn brown. You know something is happening, and you can see the long-term effect it's having, but you can't actually see it move. But some facts about it can be discerned if you can stay awake long enough. For example, there seems to be a critical minimum number of full-time people who require the presence of one other full-time person just for support. The number seems to be between five and seven, but it can be smaller in a group with a more inherently bureaucratic mindset. In a highly refined and tuned bureaucratic setting, the ratio can be pulled all the way down to one to one. This is the ultimate level of bureaucracy, at which point the entire team does nothing but support itself.

But to completely lock up a project so it cannot move forward, it isn't necessary to bring it all the way down to a one-to-one ratio. Even with people who would otherwise be productive, it is possible to build a support organization sufficiently complex to prevent anything from getting done.

The following list is made up of job titles for things that must be done. By staffing up to the point that you have a separate individual responsible for each of these tasks, you will almost certainly impose the disabling weight of crushing gridlock.

Systems Analyst. This person is a data scout. You can constantly have your systems analyst going outside to try to find out how the customer does their job, and why they are so successful not using your software that they can now afford to pay you for software development. The information retrieved by a systems analyst provides no end of options for changes to the specifications and writing more documents.
Programming Technician. This person acts as an assistant programmer. The work involves copying files, keeping track of the various versions of things, entering source code into the machine, writing small programs, compiling software, running quick software tests, and so on. Basically, this is a programmer that costs less.
Office Administrator. This is someone who specifies where the desks go and who sits in what chair. Among the duties of the administrator are making certain that everyone has a stapler, handing out paychecks, memos, and those strange little messages that come from human resources.
Secretary. Besides being the only person that actually knows what's going on, both in this office and in the rest of the company, the secretary does all the stuff the office administrator was supposed to do.
Technical Writer. This person has a powerful command of the English language and has the job of documenting the system. The technical writer takes the undecipherable jargon and grammatical disasters scribbled by those who develop software and converts it all into grammatically correct and easy to read undecipherable jargon.
Clerk. This person keeps track of all the paperwork and information produced as the project proceeds. Along with filing every document and program, the clerk also maintains a running history of the project. This history is often verbal. You will find the clerk going into someone's office or cubicle and starting a conversation with something like, "Do you know what that idiot down the hall did now?"
Network Administrator. The apparent purpose of this person is to wonder about things. You will hear the network administrator say things like, "I wonder why that same server goes down every morning at ten o'clock" and "I wonder why those three users can never access the printer."
System Administrator. Fundamentally, the system administrator's job is to listen to people. He listens to vendors explaining the wondrous new stuff they've got that can be used to monitor the network. He listens to the project manager talk about security. He listens to complaints from members of the software development team and says things like, "I'll get that fixed." and "Are you sure?" and "Really? Did that happen again?"
Language Expert. This person knows more about the project's programming language than the guys who wrote the compiler. He can write one-line programs that can do amazing things—like a program that can reproduce its own source code while simultaneously producing large prime numbers. It's always been a shame that these things serve no actual purpose.
Interface Designer. This individual has responsibility not for laying out the screens, but for specifying the fonts, size, color, and artwork that determine the user's experience when interfacing with the software. His expertise in this area comes from the fact that he has spent eighteen hours a day playing computer games since he was twelve years old. The quality of a human interface is measured on the cool scale. The levels, from best to worst, are awesome, real cool, cool, bogus, and broken.
Database Administrator. During the first half of the schedule for the project the database administrator is installing the database software on a computer, or a group of computers, that are dedicated for that purpose. During the second half of the schedule, the database administrator is making changes to the database installation to keep up with the constantly changing specification. During the third half of the schedule the database administrator is trying to make the thing run fast enough to be usable.
Software Tester. This individual has been hired to repeatedly test the software being produced by comparing it to the software described in the specification. Day after day he runs the same programs over and over and writes down the discrepancies he finds and reports them to the programmers. This person is usually a down-on-his-luck programmer whose area of expertise is no longer in demand, and he grabbed this job so he could eat while he is taking courses in some of the new stuff. Software testers are usually very easily irritated.
Quality Assurance. This individual (or this team, if it's a large project) spends almost all of its time doing research with the idea in mind of framing lots of procedures to be followed to certify that a quality product has been produced. Usually, however, by the time the software is complete enough to warrant the use of some of these procedures the entire project is so far behind schedule that there is no time for anything beyond slapping an approval sticker on the box, releasing the software, and hoping for the best.
Programmer. Actually, if you have adequately staffed the other positions, staffing one or more of these positions is not entirely necessary. But you may need to have some around to justify the existence of the project. Or you could refer to the overall task as a system integration project and just play around with software written by other people.

Tool Polishing

A programmer can be banging away at writing and debugging code all day every day and get nothing done. Programmers like tools. And most programmers like to make their own tools. A good tool can be a terrific time saver and an accurate short cut for getting a job done, but a poorly crafted tool can be the long way around. Some tools are just too cute or too baroque for their own good.

It may be very handy to have a script that will, say, copy all the files of certain types from one place to another while changing all the names to create a new version of each one. However, it's not worth spending a day to write such a script if it's only going to be used once or twice.

Even a very useful tool can be costly. What often happens is that the project's established procedures for developing software cause half, or more, of the programmers on the team to want the same tool. So each one develops his own version of the tool. If writing the tool takes half a day, and half the team each develop their own version, the entire project has just slipped by a quarter of a day. By the time four tools have been written this way, the project has lost an entire day. And only half the team has access to these new tools. And software being what software is, only half the tools will work right. And one will be so buggy it will reduce some files to being fond memories--the result is a situation from which it may take days to recover.

The truth is that there is no way out of this situation. It's going to happen. Trying to prevent it only makes things worse.

If you set up a standard policy that tools be published for everyone to use, three things will happen. First, enough time will be taken that every tool will be polished to a high shine, because every programmer wants his stuff to be well thought of by other programmers. Second, some programmers won't like the way a published tool works and will write and publish their own versions anyway. This means that not only are several tools written to do each job, but now every programmer has access to the one that damages the files. Third, programmers will discover a published tool that they like, and wouldn't have thought of on their own, so they think about it a while and then write their own versions of it anyway. The result of all this is that more time, not less, is spent on tools.

Another thing you could try is to appoint one person as the tool writer. The tools he writes are made available for use by the entire team. But the other programmers won't like the way he does things and they will write their own private versions anyway. The same amount of time will be consumed and the project will be plagued with just as many rogue tools as before.

The Great Space Saver

It's difficult to make general statements about doing the wrong job because each situation is unique. There are an infinite number of wrong jobs available, so you have to be able to recognize one when you see it. Of course, if your intentions are to overburden a project with useless work, then all you need to do is accept everything at face value as a job worthy of being done.

An example of this occurred during a software development project that had been turning out small programs for about two years. There were about ten programmers involved and the whole thing was happening on a collection of minicomputer systems in the 1980s. Disk space was very expensive in those days, and this particular operating system had a peculiarity that would eat up disk space. Whenever a program created a new file, the file would be allocated a large amount of space. If the program failed to close the file using the correct sequence of system calls, the large size would remain allocated to the file even though it wasn't all being used. And it seemed to be standard operating procedure to not use the correct sequence of system calls.

A junior member of the team did a eureka one day when he discovered a command that would reduce the allocated size of a file to the amount actually being used. He presented his find to the project manager and received approval for the time necessary to go through the system and shrink all the files. After he had spent about two weeks doing nothing but going from one directory to another condensing files, his activity came to the attention of a senior member of the team. The senior member kindly demonstrated another command that compressed all the files on the computer at once. The job was completed in less than a minute.

The senior member then explained that the files were not really that big anyway. The size reported was the original potential extent to which a file could be expanded, not the actual size.

It turned out that not only was the junior member spending a lot of time on this task, but nothing was being accomplished. This example should serve as another reminder to RTFM.

RTFM (are-tee-eff-imm) a. (Read The Manual) It seems to be a very well kept secret that all computer gurus have become computer gurus by reading computer manuals. It's easy to do. The manuals are sitting right there. You just pick one up, start at page one, and keep reading. If everyone were to read the manuals then there would be no need for computer gurus. Computer gurus are not worried.

guru (goo-roo) n. 1. A person who knows more about the computer system than anyone else in the room at the moment. 2. Someone who once answered a question asked by an individual who had not read the manual. 3. A computer expert.

Rearranging

A large and very high profile retailer had, over the years, developed a sophisticated cash register system. It consisted of a personal computer with software that was close to fifteen years old, and over the years it had been modified to within an inch of its life. Another feature was to be added to it and a contract programmer was hired to do the job. Because of the complexity of the system and the strangeness of the requirements, the development project from design through final testing was estimated at six months.

This was a classic example of a bureaucracy-driven dead-end task. It would have been a multi-year task to fix the whole system to handle anything other than two-digit years. And this was being done in 1999. If the project were finished on schedule, the new feature would start working and be deployed in stores across the country just in time for the Y2K event.

Adding this new feature was rearranging deck chairs on the Titanic.

But all was not lost. It turned out that a shiny new point of sale system was being purchased by upper management, and it was to be installed for the new year. The new system was flexible enough that the new feature could be added by making some entries in configuration files.

But nothing stops a good software development project. The new feature to the old system was completed, tested, accepted, paid for, and thrown away.

Switch From Scratch

Technological advances can cause entire industries to shift. A prime example of this has been the telephone industry. As things moved from mechanical relays to digital logic for making telephone connections, one equipment manufacturer after another developed its own version of a computerized switching system.

A modern telephone switching system is fundamentally a collection of computers that all share one large block of memory. Communications are achieved by digitized audio being stored in memory by one computer and retrieved by another. Each computer is attached to an outside phone line. The result of all this is that a telephone system today is mostly composed of software.

An existing telephone equipment manufacturer started a project to develop its own switching system. The company had no prior experience in software development, and hired a team of highly educated, and therefore highly qualified, individuals and set them to work on the problem.

They began by selecting the fundamental hardware architectural elements, such as the type of memory and the computer processor. Of course, a computer has to have an operating system, and there were several available, but they couldn't find one that suited them exactly, so they decided to write their own.

The next objective was to select a programming language and a compiler for it. It made sense to use the same language throughout for all the software, including the operating system. But they couldn't find a compiler that did everything they wanted it to, so they decided to write their own. With this decision they realized that they were no longer limited by the language choice. They could make the language capable of doing everything they wanted it to do. So they made the decision to design their own language.

It was almost a year later, as the programming language was reaching its final stages of design, that the whole project was cancelled.

An Ideal Replacement

A small software development firm was formed and won a contract to develop a system to monitor a large and widely scattered collection of equipment. The purpose of the software was to detect faults and report conditions exceeding preset tolerances. The system was complex enough that, although the task was fundamentally simple, software development took the better part of a year. It was installed and, after a bit of tweaking, the customer was satisfied.

That left the software development company with nothing to do. But they had the leftovers of the successful development project--pages and pages of notes about what could have been included as part of the system. This was information that had come to light during development. None of these nifty new ideas had been included in the original software because they had been closely focused on finishing the job. But they decided that the time had come to cash in on this extensive list of insights and brilliant ideas.

A proposal was written to produce new software that would include everything. It would be capable of accepting complex rules and equipment relationship definitions from the user--rules that would examine the data from several related pieces of equipment and reach conclusions about the current status of the network. It would also have a window-and-mouse interface instead of simply scrolling text up a screen. The proposal bounced around the offices of the client company until, one day, it was approved.

Work began. Everything was brand new. The software was being developed from scratch. It was whistle and bell time--this new software contained everything that could be imagined to be a part of the system. This alone was a formula for disaster, but we will never know how it would have turned out.

The folks at the large company who were actually using the original system came up with some features they wanted to have added to the existing system. None of these features from the actual users were in the plans for the new stuff being worked on because software developers designed the new system, and they never imagined these particular features. It's always surprising what pesky problems can be generated from the minds of actual users.

The result of all this was that development on the new system was halted while everyone turned back to working on the additional features for the existing system. The developers wanted to get back to work on the new system, so they buckled down and updated the existing system quickly. They wrote some, shall we say, less than elegant software to get the job done. But that was okay, see, because they knew the software was only a temporary solution and would be replaced by the new stuff real soon now. So the modified software was installed, put through its testing, and, after a few tweaks, was accepted and went into production in place of the first system.

The little company then turned its attention back to the new system, which was now obsolete because it didn't contain any of the new stuff that had just been added to the old system. This required a new round of specifications and approval. After that was all done, they pitched in again on developing the new wonder system with the intelligence and the windowing interface.

But history repeated itself.

Once again, the real users came up with some stuff nobody had thought of before. And once again, the new project was put on hold. The customer insisted that the existing system be upgraded again. And this was going to make the design of the new system obsolete again.

There is a moral to this story. There must be. I just don't know what it is.

How to Fail at Software Development
Arthur Griffith
Anchor Point Books, 2004
264 pp., $24.95
ISBN 0974550302

Jerry Pournelle Recommends...

The book of the month is by Thomas Powers, Intelligence Wars: American Secret History from Hitler to al-Qaeda. Powers is not nonpartisan, but his is probably the most objective account in the unclassified press. His biography of Richard Helms was based on interviews with Helms, and Helms himself vetted at least two of Powers's books on intelligence. The intelligence game is shadowy, and no history of that period can be complete without some reference to my old mentor, the late Stefan T. Possony (who doesn't appear in this book at all); but this book is well worth reading for its broad scope, and at times it gets into depths that surprised me. It's certainly a better book than most of the conspiratorial volumes.

The computer book of the month is Mitch Tulloch's Windows Server 2003 In A Nutshell. For Windows 2000 Server you will want Roger Jennings's Using Windows 2000 Server (Que, 2000; ISBN 0789721228), which is complete and has been invaluable in my efforts to get my Macintosh and Windows Active Directory network communicating. I don't believe there is anything similar for Windows 2003 Server, and the O'Reilly book, while more reference than introductory, is the best I've seen on 2003 Server so far.

Intelligence Wars: American Secret History from Hitler to al-Qaeda
Thomas Powers
New York Review of Books, 2004
500 pp., $16.95
ISBN 1590170989

Windows Server 2003 In A Nutshell
Mitch Tulloch
O'Reilly, 2003
650 pp., $39.39
ISBN 0596004044

Using Windows 2000 Server
Roger Jennings
Que, 2000
1392 pp., $49.99
ISBN 0789721228

Recent Releases

Game Programming Gems 4
edited by Andrew Kirmse
Charles River Media, 2004
600 pp., $69.95
ISBN 1584502959

Charles River Media has released Game Programming Gems 4. Featuring over 60 new techniques, the book is a definitive resource for developers. Written by expert game developers who make today's amazing games, these articles not only provide quick solutions to cutting-edge problems, but they provide insights that readers will return to again and again. Most code is written in C++, but some interpreted languages (Java and Python) are also represented. The graphics articles make use of OpenGL, DirectX, and the various available shader languages. Volume 4 also includes an all new physics section that teaches innovative techniques for implementing real-time physics.

Linux Pocket Guide
Daniel Barrett
O'Reilly, 2004
191 pp., $9.95
ISBN 0596006284

The new Linux Pocket Guide by Daniel Barrett provides a useful alternative for new and experienced Linux users who need a quick and handy means to look up Linux commands.

Mac OS X 10.3 Panther Little Black Book
Gene Steinberg
Paraglyph Press, 2004
576 pp., $29.99
ISBN 1932111867

Paraglyph Press is pleased to announce the publication of the Mac OS X 10.3 Panther Little Black Book. Written by best-selling Mac author and expert Gene Steinberg, this latest version of the Little Black Book on the Mac operating system is designed to be a problem-solving guide and complete reference to help users get the most out of the Mac OS X 10.3 (Panther) operating system. The book provides valuable upgrading and troubleshooting tips in a concise format.

Software Forensics: Collecting Evidence from the Scene of a Digital Crime
Robert M. Slade
McGraw-Hill Professional, 2004
215 pp., $39.95
ISBN 0071428046

The first book on software forensics—analyzing program code to track, identify, and prosecute computer virus perpetrators. Written by a certified CISSP trainer and software forensics specialist, it addresses a new and rapidly growing field, crucial to both corporate and national security.

Check Point Next Generation with Application Intelligence Security Administration
Chris Tobkin and Daniel Kligerman
Syngress, 2004
608 pp., $59.95
ISBN 1932266895

This book covers everything from the basic concepts of security and the configuration of a simple firewall all the way to advanced VPN and firewall scenarios. Written by experts in the field as well as certified instructors to give the depth desired by the most advanced users, the book also serves as a study tool for Check Point's Exam 156-210.4 and covers all the new features, such as the SmartDefense options now available.

Contact Us

To contact Dr. Dobb's Programmer's Bookshelf Newsletter, send e-mail to Deirdre Blake, DDJ Managing Editor, at [email protected].

