Cameron and Tracey Hughes

Dr. Dobb's Bloggers

Altair 8800 vs. Watson in the Cloud

April 07, 2011


We're always paranoid and suspicious on April 1st. Maybe it's because we've pulled so many pranks on so many friends and foes that we realize we're bound to get nailed sooner or later. Or maybe it's because we laugh so hard when others get caught. Whichever the case, we proceed very cautiously with everything on April 1.

The day was almost over and we had spotted and subverted all attempts at turning us into April fools. As the day drew to a close, we were hanging out in one of the weird circles we hang out in, arguing over why the Dragon of Dojima has been wearing the same clothes for Yakuza 2, Yakuza 3, and Yakuza 4, when one of our associates asked me, "What are the processor instructions that support parallelism at the hardware level?" First of all, it was a suspicious question coming from the one who asked it. Secondly, what an odd question given the current circumstances (we had just found out that it was the CIA, and not the Chinese Triads, that was behind the recent hits) ... I did say weird circles. So I immediately tried to see the April Fool's joke angle to the question. I looked at my mate's face and he didn't break a smile. He was quite serious. As a matter of fact, everyone kinda had a serious face, looking at me, waiting for the answer. I'm thinking to myself, there's a punchline in here somewhere; I just don't see it. But to my pleasant chagrin, it was a serious question. I guess the sobering thing here is that this question seemed so unlikely at the time that I thought it was part of an April Fool's Day joke.

Look how far we've come (I think?). I don't know whether to say how much we've grown or how far we've fallen. Is it for the better or for the worse? In the early days of the personal computing revolution, there was something called the Altair 8800. It was one of the earliest popular personal computers. I won't digress with a lot of hand waving trying to demarcate the differences between a mainframe, a minicomputer, and a microcomputer, a distinction that almost has a "who cares" designation these days; but early on, the idea of having your own computer was a really big deal. The Altair 8800 could be ordered from hobbyist magazines like Popular Electronics. Some of the more sought-after versions of the 8800 came in kit form. Yep, it had to be put together, and I don't mean just plugging in the monitor, mouse, and keyboard. Most proud owners of the Altair were excited about every little detail, from the S-100 bus to the 8080 processor. Upon querying a proud owner, you could get any level of detail you might imagine, from the teletype specification to the RS-232 compliance of their Altair. Lots of talk about the 100-pin edge connector, data buses, voltages, the whole kit and caboodle ... I mean scary-level detail about the computer. There was a time when processor capabilities, architectures, and instruction sets were the subject of bragging rights: you know, the "my-processor-has-bigger-and-faster-registers-than-yours" variety. Here we are now in one of the most exciting times for processor development and evolution, where not only can an individual have their own microcomputer, but individuals can own supercomputers because of low cost and advancements in chip multithreading. Supercomputers used to be the prerogative of universities, government agencies, and research facilities; but now, with a modest investment, anyone can own supercomputing or cluster-computing capability. But ironically, in many circles, the conversation and focus have shifted from "what's under the hood?" to "how many places have you traveled in the car?"

More and more I hear, it's all about SaaS (Software as a Service), PaaS (Platform as a Service), and HaaS (Hardware as a Service), aka the Cloud. It's all in the Cloud. I stroke my Android 4G and there it is, the app that solves whatever problem I don't know I have yet. And not only that, I don't have to worry about hardware configurations, software compatibility, or load scalability, because the cloud will handle it. Just touch, slide, and tap. My droid is wireless; I have no idea where the computers are that are running my miracle app, or what their capabilities are, or what platform they're running. Maybe they're not computers at all. For all I know, it could be thousands of well-trained gerbils taught to retrieve and process information reliably. It's in the Cloud! And one of the distinguishing features of the Cloud is that it protects you from the details of how things are done, who's doing them, and what they're doing them on.

I know we're going to get e-mails accusing us of being too cynical. But here's the deal: we've blogged about the turning point in history that we've just witnessed as we watched the Watson computer defeat a couple of human wizards at the game of Jeopardy. We blogged about the complexity of the parallelism, natural language processing, data mining, information retrieval, text mining, and other artificial intelligence techniques that are at work when Watson is doing its thing. And we noted how removed so many of our technical friends are from this type of technology. Everyone watched Watson beat the Jeopardy wizards on TV, but relatively few on the planet truly understand how Watson works. Within days after Watson's performance, Tracey and I were in conversations that were placing Watson in the Cloud for use in medicine, law, government, etc. Watson is spooky enough for most; now we're gonna put Watson in the Cloud? We're not cynical, really we're not. Actually, we're trying to keep our sense of humor about the whole thing. But we can see some corollaries emerging here.

The easier a technology is for the masses to use, the fewer people understand how the technology works. The fewer people understand how the technology works, the more they have to pay for it in the long run. Well, maybe those aren't corollaries; maybe they're just fears (Cloud == Magic, or Worse!). But in the days of the Altair 8800 and the other early personal microcomputers, computer geeks prided themselves on how close they were to their computers. Now, at least one class of computer geek prides themselves on how far away they are from the details of their computers.

I've been hearing so much Cloud talk lately that a valid question about processor instruction support for multicore architectures, on first hearing, appeared to me to be an April Fool's Day joke. But the fact of the matter was, it was a very interesting and exciting question. First, there is the recognition that a processor has an instruction set (a long-lost fact for many). Second, there is the question of how, or if, that instruction set has been impacted by the inclusion of multiple processors on a chip, or multiple threads within a processor. Third, there is the discussion that demarcates the microarchitecture instructions from the macroarchitecture instructions. So the question is really worth pursuing for those who are interested. The answer would be different, of course, depending on which processor architecture we were talking about: whether it's Intel's IA-64 architecture, with its support for code bundles at the assembly-language level; or Sun's UltraSPARC T1 processor, with its eight cores, four hardware threads per core, and its pipelining and memory-cascading schemes; or what processor instructions (if any) support the AMD Opteron's HyperTransport architecture. I'm giddy just thinking about all of the elegant architectures out there. This is the kind of question that a proud owner of an Altair 8800 would have made it their business to answer, if computers had had multiple processors in those days. First, they would have come across the notion of instruction-level parallelism versus task-level parallelism.
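To make that distinction concrete, here is a minimal sketch of parallelism baked right into an instruction set, assuming an x86 processor with SSE; that's our choice of example, not one of the architectures named above, and the file name is ours. Each intrinsic below maps to a single processor instruction that operates on four floats at once:

/* simd_add.c -- a minimal sketch of SIMD parallelism in the
 * instruction set, assuming an x86 processor with SSE.
 * Compile with: gcc -std=c99 -msse -O2 simd_add.c */
#include <xmmintrin.h>   /* SSE intrinsics */
#include <stdio.h>

int main(void)
{
    float a[4] = { 1.0f, 2.0f, 3.0f, 4.0f };
    float b[4] = { 10.0f, 20.0f, 30.0f, 40.0f };
    float c[4];

    __m128 va = _mm_loadu_ps(a);    /* one MOVUPS: loads 4 floats   */
    __m128 vb = _mm_loadu_ps(b);    /* one MOVUPS: loads 4 floats   */
    __m128 vc = _mm_add_ps(va, vb); /* one ADDPS: 4 adds at once    */
    _mm_storeu_ps(c, vc);           /* one MOVUPS: stores 4 results */

    for (int i = 0; i < 4; i++)
        printf("%.1f ", c[i]);      /* prints 11.0 22.0 33.0 44.0 */
    printf("\n");
    return 0;
}

One instruction, four additions: that's the "my-processor-does-more-per-instruction" bragging right in its modern form.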

The idea of parallel programming at the assembly level is definitely not for the faint of heart, and it is exactly what my buddy was asking about on April 1st. Depending on the architecture under discussion, there is more or less support for parallelism in the instruction set. But a discussion about the capabilities of the instruction set, at that level, is exactly what engendered pride in one's microcomputer.
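And on the task-level side of the question, multicore chips expose atomic read-modify-write instructions (LOCK CMPXCHG on x86, CAS on SPARC) that all safe sharing between threads ultimately rests on. Here is a hedged sketch using GCC's __sync builtins; the spinlock and the worker are our own illustrative names, not production code:

/* spin_sketch.c -- a sketch of task-level parallelism resting on one
 * processor instruction: an atomic compare-and-swap. The __sync builtin
 * below compiles to LOCK CMPXCHG on x86 (CAS on SPARC).
 * Compile with: gcc -std=gnu99 -O2 -pthread spin_sketch.c */
#include <pthread.h>
#include <stdio.h>

static volatile int lock = 0;   /* 0 = free, 1 = held */
static long counter = 0;

static void spin_acquire(volatile int *l)
{
    /* Loop until we atomically swing the lock from 0 to 1. */
    while (!__sync_bool_compare_and_swap(l, 0, 1))
        ;   /* spin */
}

static void spin_release(volatile int *l)
{
    __sync_lock_release(l);     /* atomic store of 0, release semantics */
}

static void *worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < 100000; i++) {
        spin_acquire(&lock);
        counter++;              /* protected by the hardware CAS above */
        spin_release(&lock);
    }
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, worker, NULL);
    pthread_create(&t2, NULL, worker, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    printf("counter = %ld\n", counter);  /* 200000, every time */
    return 0;
}

Take away the hardware's compare-and-swap and that counter comes out wrong; the entire multithreaded software stack is leaning on a handful of instructions like it.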

Aha! My processor has instructions optimized for multithreading and yours does not. My assembler allows the programmer to bundle groups of instructions to be executed in parallel and yours does not!

But the Cloud geeks have no idea whether Watson even has processors, let alone whether Watson's processors have support for parallelism at the instruction set level. I guess that means Tracey and I haven't quite made it to the magic and wonder of the Cloud yet.
