Overlays, EMS, Cooperative Multitasking, and Invincibility
If you've ever played Metal Gear Online or Modern Warfare 2 Online, then hopefully you've had the good fortune of being on a roll once you've learned a map or stage.
You know, when you see them coming long before they see you and it seems like you just can't miss. You've been playing for months now and you've got a ratio of 4, 5, 6 to 1, and every day, every shot just seems to get easier and easier. Everything is going your way, and for a moment it's like you're invincible.
That is, until the game vendor decides to put out an innovative upgrade that you didn't exactly ask for. Sure, you could hold out and play the old version, but all of your mates will have the new version, so you upgrade too. Now, for some reason, your productivity goes down. Your ratio goes negative and it seems like you can't win for losing. You realize that you're stuck with the new innovation and the only way to get back on top of your game is to go back and master the fundamentals and maps all over again. It's a very familiar refrain. I seem to run into it at suspiciously regular intervals, both in gaming and software development, perhaps for obvious reasons.
The good old days! 8 MHz, 16 MHz, 33 MHz, 50 MHz, 90 MHz, 100 MHz, 500 MHz, 1 GHz: the processors were just getting faster and faster. Our programs would take advantage of the speedup without any effort on the programmer's part. And the RAM, oh my, the RAM: 16 MB, 64 MB, 256 MB, 1, 2, 3, 4 GB! Wow, the programs just got fatter and fatter. We could add more and more features with no thought of efficiency. Really, who cared about optimal algorithms and data structures? The users were happy and the money was good. Ah ... Shangri-La!
For a while there we were on a roll. But for some reason it's always a temporary roll. It's like the easy roll you're on when you're going down a steep hill. It's fun while it lasts, but it doesn't last forever, and we seem to forget the initial climb.
At one point I remember having to break my programs into pieces called "overlays". Overlays were necessary if the program was too big to fit into main memory (RAM). You know, back in the days when 16k and 64k of RAM were considered respectable (I can still hear the Turbo Pascal 3 disk spinning in the drive). If the computer had 16k of RAM and your program was 128k in size, then your program was a candidate for overlays. Tedious programming: measuring the size of sub-modules, worrying about performance during overlay swapping, and managing memory between overlays were all part of the overlay experience. Argh!
This overlay programming was a kind of do-it-yourself virtual memory management. Because RAM was so small, and hard drives were not yet widely available, a programmer had to really know and control some fairly low-level details of program loading, swapping, and memory management. Not for the weak of heart. Oh, but once 640k and the hard drive were standard, it was like being set free. No worries at all. Gone were the days of "segment not found" and overlay file read errors. We could once again lazily write our programs, and the operating system and some kind of virtual memory manager would take it from there. We were on a roll, until somebody decided to add expanded and then extended memory to the mix. Although the application programmer escaped the details and drudgery of overlay programming, we got tossed right back into the soup with the tedium and error-prone activity of memory management programming. We had to jump over special hurdles with special instructions and memory management libraries in order to access the 384k of memory on top of the 640k, or to access the 16 MB of memory once we jumped into 16-bit programming (in the PC world). But we did it.
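The overlay discipline described above can be sketched in miniature. What follows is a toy model, not period-accurate DOS code: the "overlay files" are plain C arrays, and only one module at a time gets swapped into a fixed-size resident buffer, mimicking a program larger than the machine's RAM. All the names (`overlay_load`, `resident`, and so on) are illustrative, not taken from any real overlay manager.

```c
#include <assert.h>
#include <string.h>

/* Toy overlay model: RAM has room for exactly one overlay at a time. */
#define OVERLAY_SIZE 16  /* pretend 16 "KB" */

typedef struct {
    const char *name;
    char code[OVERLAY_SIZE]; /* stand-in for the module's machine code */
} Overlay;

/* Two overlays "on disk" -- together they exceed the resident buffer. */
static const Overlay on_disk[2] = {
    { "report", "REPORT-MODULE" },
    { "editor", "EDITOR-MODULE" },
};

static char resident[OVERLAY_SIZE]; /* the single slot of precious RAM */
static int  resident_idx = -1;      /* which overlay currently occupies it */
static int  swap_count = 0;         /* each swap was a slow disk read */

/* Swap the requested overlay in, evicting whatever was there. */
static void overlay_load(int idx) {
    if (resident_idx == idx) return;  /* already resident: no disk hit */
    memcpy(resident, on_disk[idx].code, OVERLAY_SIZE);
    resident_idx = idx;
    swap_count++;
}
```

The `swap_count` bookkeeping is the point: every eviction was a disk read, which is why call patterns that ping-ponged between modules in different overlays were a performance disaster, and why you laid out your sub-modules with the call graph in mind.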
After our joust with overlays, we had the tenacity to deal with memory management programming and all the add-on libraries it required (Phar Lap, anyone?).
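The expanded-memory hurdle was bank switching: EMS let a program reach far more memory than it could address by mapping 16 KB pages of expanded memory, a few at a time, into a small page frame that conventional code could see. Here is a toy model of that idea in plain C, with shrunken sizes; `ems_map` loosely echoes the role of the EMS map-page call, but none of this is real INT 67h programming.

```c
#include <assert.h>

/* Toy model of EMS bank switching: a small "page frame" the program can
 * address directly, backed by a much larger pool of expanded memory. */
#define EMS_PAGE_SIZE   4   /* real EMS pages were 16 KB; shrunk for the sketch */
#define FRAME_SLOTS     4   /* the page frame held a handful of mappable pages */
#define EMS_TOTAL_PAGES 16  /* pool far bigger than the frame itself */

static int ems_pool[EMS_TOTAL_PAGES][EMS_PAGE_SIZE]; /* "expanded" memory */
static int frame_map[FRAME_SLOTS];                   /* which pool page is where */

/* Map a logical EMS page into one of the frame's physical slots. */
static void ems_map(int slot, int logical_page) {
    frame_map[slot] = logical_page;
}

/* All reads and writes go through the frame -- only mapped pages exist,
 * as far as the program is concerned. */
static int *frame_addr(int slot, int offset) {
    return &ems_pool[frame_map[slot]][offset];
}
```

The tedium came from exactly this indirection: before touching any data you had to make sure the right page was mapped in, and forgetting to remap after a library call quietly handed you someone else's page.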
We climbed the steep hill of overlays, and the ride down the other side was smooth and fun, our programs able to breathe the fresh air of 640k. We climbed the steep hill of expanded memory and the EMS specification, as well as extended memory and the XMS specification, and the ride down the hill was smooth and fun, allowing us to write fatter and more feature-rich programs. It seems like innovation always provides us with a nice, fun, and productive ride down the steep hill. But just as the joy ride seems to be the most fun, innovation introduces us to another hill. Usually a hill that we weren't exactly asking to climb. My nice monolithic 16 MB programs were running just fine, the users were happy, the money was good. Ah ... Shangri-La!

Then all of a sudden we were told multitasking is good for you and good for your users. Why run just one of your programs at a time when the user can run two or three of your programs at a time? So this time, instead of only hardware innovations, operating system innovations provided us with another steep hill to rain on our parade. We had to learn to conquer the ways of cooperative multitasking. Our programs had to be constructed and instructed to yield the processor and resources at the right moment and in regular fashion, or else you ran the risk of locking up all other running applications, with the inevitable result of the entire computer coming to a halt. If you yielded too soon, you would experience poor performance in your own application. If you yielded too late, the user would experience poor performance in the other applications they were running. If you didn't yield at all, lock turned to crash. Of course, I see the point of it all now.
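That yield discipline can be sketched with a toy round-robin scheduler. This is a minimal model of the cooperative idea, not any real OS's API: each task is a plain function that does a slice of work and then *returns* (yields) to the scheduler, asking to be rescheduled if it isn't finished. Nothing preempts a task, so one that never returns hangs every other task, which was exactly the failure mode.

```c
#include <assert.h>

#define MAX_TASKS 4

typedef int (*task_fn)(void *state); /* return 0 = done, 1 = run me again */

static task_fn tasks[MAX_TASKS];
static void   *states[MAX_TASKS];
static int     ntasks = 0;

static void spawn(task_fn fn, void *state) {
    tasks[ntasks] = fn;
    states[ntasks] = state;
    ntasks++;
}

/* Round-robin loop: give each live task one timeslice per pass.
 * The scheduler only regains control when a task voluntarily returns. */
static void run_all(void) {
    int live = ntasks;
    while (live > 0) {
        for (int i = 0; i < ntasks; i++) {
            if (tasks[i] && !tasks[i](states[i])) {
                tasks[i] = 0;   /* task finished; drop it from the rotation */
                live--;
            }
        }
    }
}

/* Example task: counts toward a limit, yielding after each increment. */
typedef struct { int count, limit; } Counter;

static int count_task(void *state) {
    Counter *c = state;
    c->count++;                  /* one small slice of work ... */
    return c->count < c->limit;  /* ... then yield back to the scheduler */
}
```

The too-soon/too-late tradeoff from the paragraph above lives in how much work `count_task` does per call: do too little per slice and your own task crawls under the scheduling overhead; do too much and every other task starves while you hold the processor.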
I recall being totally happy programming in a single-process world. With 16 MB of RAM, processors getting faster and faster, and no overlays to worry about, for a moment I felt invincible. But every time I start to feel like I'm on a roll and cruising down the hill, up pops some new hardware or software innovation that I wasn't exactly looking for. It wasn't enough that I had to add cooperative multitasking to my programs; then I had to add network capability. It wasn't enough that I added network capability; then I had to add Web capability, then mobile capability, then cloud capability. Each steep hill was followed by a nice, worry-free, fun ride down the other side. After the first couple of new hills, I had to go back to the fundamentals, because the fun ride down the previous hill always relieved me from worrying about some aspect of programming, a tendency toward technical laziness. So that aspect I was relieved from, I kinda forgot. But the hills kept coming, and I kept having to go back to the fundamentals in order to get my bearings. Eventually I realized that the only thing that remains pretty constant is the fundamentals. So these days I tend to keep my fundamentals honed and undergo regular baptisms in software engineering.
Overlays, expanded memory with the EMS specification, extended memory with the XMS specification, and cooperative multitasking are all artifacts. All these innovations, in the short run, made the programmer's (developer's, if you wish) job more tedious, error prone, and challenging. All these innovations assumed a programmer's grasp of the fundamentals prior to implementation; innovations that, in an odd way, had the potential to lead to technical laziness. Although each of these artifacts is different, they all sang a familiar refrain when they were introduced. At least part of that refrain was 'here-we-go-again'. 'Here-we-go-again' is the bridge that ties artifacts of the past to the future. Now here we are at the bottom of the steep hill of application-level, massively parallel programming. The hardware vendors are adding multithreaded, multiprocessor cores to the target application environment whether the application programmer (developer, if you wish) wants it or not. The new bag requires multi-core and parallel programming skills.
Just like the shift from Call of Duty to Modern Warfare 2, or Metal Gear 3 to Metal Gear Solid 4, if you want acceptable positive productivity and your ratio to be 4, 5, 6 to 1, you're going to have to rely on the fundamentals. There is no replacement for a solid software design and development life cycle. Software modeling is your friend. A good knowledge of algorithms and data structures will never let you down. Once you get the hang of it, there will be moments when you'll feel invincible!