Building Findable Websites

Putting your link in front of the person who wants to see it


July 29, 2008
URL: http://www.drdobbs.com/open-source/building-findable-websites/209800309


Building Findable Websites: Web Standards SEO and Beyond
Aarron Walter
Peachpit, 2008
240 pp., $39.99
ISBN 0321526287

Whether a website's mission is to attract advertising dollars or simply to share information on something the owner is passionate about, a lot of time and effort goes into developing online marketing strategies and devising ways to make pages "sticky." This usually involves the "look" and "feel" of the site more than what's under the hood. Aarron Walter's Building Findable Websites: Web Standards SEO and Beyond branches off in a new direction. Its central theme is the importance of organizing and labeling information so that searchers and spiders can find it and put your link in front of the person who wants to see it. This isn't done by making the colors more eye-catching, but by optimizing the website under the skin, where developers live and spiders crawl.

Aarron Walter's goal is to get developers to take a hard look at how their source code and site layout affect the way search engines see their sites. At a time when more and more sites are putting Flash graphics up front and burying the words, this is an important message for anyone involved in web development. It also turns out to be a boon in terms of good e-citizenship, because the way search engines see the world -- by reading text instead of seeing pictures -- also happens to be the way visually impaired people access the Internet.

Like any good author, Walter starts out by defining his terms. "Findability," the author explains, "is the broader (than Search Engine Optimization) discipline that unites all strategies to help your audience find what they seek." This is an interdisciplinary skill with implications for project managers, who need to explain findability to the client, and information architects, who need to develop keyword strategies and use them to label site sections to drive traffic internally. It also impacts the designer, who needs to highlight opportunities for user interaction such as signups and mailing lists, and the copywriter, who needs to create content that contains the desired keywords.

At the end of the day, however, it's the developer who has to draw all these threads together into a structure that presents the site's content to search engines and users alike. This is often tedious, exacting work that goes unseen by the client and project manager, and hence falls out of the spotlight. Developers, therefore, need to be proficient not only in web standards and best practices, such as using alt text for images and separating content from presentation, but also in articulating the value of these techniques to the success of the website.

Though I've been a web developer for some time, the book framed a number of important issues that I either hadn't thought about or had failed to properly appreciate. For instance, I'd never given much thought to the role of 404 pages beyond the obvious -- alerting visitors that the page they wanted isn't there. Accurate as that may be, few things are as effective in driving traffic away from your site. Well-designed 404 pages, on the other hand, use the opportunity to lead users to the correct or alternate content and help them find what they need quickly and efficiently. Poorly designed pages frustrate users, and unless they are exceptionally tenacious, they will click away to another site in hope of finding what they need more easily. Another frustration, of course, is slow-loading pages, which also shunt traffic away. If you notice your pages are loading slowly, Building Findable Websites recommends YSlow, a Firefox add-on that extends Firebug to report what's holding up your pages.
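To make the 404 advice concrete, here's a minimal sketch of a "helpful" 404 handler -- my own illustration, not code from the book -- assuming a Python/Flask application; the KNOWN_PATHS list and the fuzzy-matching trick are placeholders you'd swap for your real site map and page templates.

"""Sketch of a helpful 404 page: suggest likely alternatives instead of a dead end."""
import difflib
from flask import Flask, request
from markupsafe import escape

app = Flask(__name__)

# Stand-in for a real site map (illustrative paths only).
KNOWN_PATHS = ["/articles/seo-basics", "/articles/web-standards", "/contact"]

@app.errorhandler(404)
def helpful_404(error):
    # Find existing pages whose paths resemble the one that was requested.
    matches = difflib.get_close_matches(request.path, KNOWN_PATHS, n=3, cutoff=0.3)
    links = "".join(f'<li><a href="{p}">{p}</a></li>' for p in matches)
    body = (f"<h1>Page not found</h1>"
            f"<p>We couldn't find <code>{escape(request.path)}</code>."
            f" Were you looking for one of these?</p><ul>{links}</ul>")
    return body, 404

if __name__ == "__main__":
    app.run()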

An interesting bit of wisdom is to honor the If-Modified-Since request header so that search spiders can bypass pages they have already seen but which haven't changed. Nice for the bots, but I'm not sure they care enough to up the site's SEO score. Not a bad practice, but more because it shows you to be a good web citizen than for any measurable improvement in site findability. More good citizenship comes from space management -- keeping files and names small so bots can zip through them. Walter recommends a compression tool, gzip, to shrink them. Fortunately, Walter also outlines a strategy for doing this automatically, or the manual effort would get old fast.
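Here is a rough sketch of what honoring If-Modified-Since looks like, using only Python's standard library -- my illustration, not the book's code; the file name and server wiring are placeholders. The server advertises a Last-Modified date and answers 304 Not Modified when a crawler already has the current copy. (Compression, as Walter suggests, is usually better delegated to the web server's gzip module than handled by hand.)

"""Conditional GET sketch: send 304 Not Modified when a page hasn't changed."""
import os
from email.utils import formatdate, parsedate_to_datetime
from wsgiref.simple_server import make_server

DOC = "index.html"  # placeholder static page

def app(environ, start_response):
    mtime = int(os.path.getmtime(DOC))
    last_modified = formatdate(mtime, usegmt=True)

    # If the client (or spider) already has an up-to-date copy, skip the body.
    since = environ.get("HTTP_IF_MODIFIED_SINCE")
    if since:
        try:
            if parsedate_to_datetime(since).timestamp() >= mtime:
                start_response("304 Not Modified",
                               [("Last-Modified", last_modified)])
                return [b""]
        except (TypeError, ValueError):
            pass  # unparsable date header: fall through and serve the page

    with open(DOC, "rb") as f:
        body = f.read()
    start_response("200 OK", [("Content-Type", "text/html"),
                              ("Last-Modified", last_modified)])
    return [body]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()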

I did not agree with all of his rules, such as using .htaccess customization to avoid dynamic URL parameters. I don't doubt that it has some merits, but I'm not big on customization. It tends to require upkeep when anything else changes, and unless you are keen on spending a lot of time or money on website maintenance, this seems like more effort than it's worth. It's also predicated on the belief that search engines can't parse dynamic PHP URLs, which boggles the mind.

It turns out that there are tweaks to be made on the server side as well, including using descriptive, obvious names for folders, files, and domains, since search engines pay attention here too. An interesting side note about domains is that they age well but travel poorly: search engines reward stable domains but restart their clocks when the owner changes. So you can buy a name that has relevance, but you can't buy the goodwill from its former use.

After dealing with server issues, Walter gets to "Creating Content That Drives Traffic," bringing us back to the reality that viewers may come for the bling, but content is king. No one can argue with admonitions to create useful and unique content, though it seems more the domain of copywriters than developers. This is one of the few times the author strays from his "developer as driver" theme. Subsequent discussions of RSS (Really Simple Syndication) and UGC (user-generated content) are more in line with the core theme of the book.

Keyword tracking with Wordtracker, evaluating keyword density, and other tips about this critical piece of the puzzle are included, along with a detailed overview of the value of, and tools for, adding RSS content to your site. While I don't disagree that RSS-fed content is an easy way to populate your site, aggregating content seems at odds with the "useful and unique" content approach. One variation might be to pull content from RSS feeds, then comment on it to provide unique value for the reader, while also offering links to the original content, as sketched below.
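Here's roughly what that variation might look like in Python -- my own sketch, not the book's code -- using the third-party feedparser package; the feed URL and the commentary entries are hypothetical and would come from your own editorial workflow.

"""Aggregate-plus-comment sketch: pull RSS items and pair each with original commentary."""
import feedparser  # third-party: pip install feedparser

FEED_URL = "https://example.com/feed.xml"  # placeholder feed address

# Your own take on each item, keyed by the entry's link (placeholder content).
COMMENTARY = {
    # "https://example.com/post-1": "Why this matters to our readers...",
}

feed = feedparser.parse(FEED_URL)
for entry in feed.entries[:5]:
    note = COMMENTARY.get(entry.link, "Our take: add unique commentary here.")
    # Emit a teaser that links back to the original source and adds unique value.
    print(f'<h3><a href="{entry.link}">{entry.title}</a></h3>')
    print(f"<p>{note}</p>")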

Many fledgling web developers and bloggers will find the section on creating "The Findable Blog" particularly useful. It is full of good code snippets, information on WordPress plug-ins, and organizational tips. Focused content, linking to be linked, making your own template, putting keywords in titles, organizing archives, and creating abstracts to drive traffic are all covered in detail, along with hints on working with WordPress. More might be said about using social networking sites to bring traffic to your site, but that would stray from the central theme of the book.

While this book reads perfectly well from a marketing-theory standpoint, it isn't shy about offering code to put its money where its mouth is. The book's companion website includes all of the code from the book, and considerably more, as well as five bonus chapters that don't appear in print: Free Search Engine Tools and Services, Building Viral Marketing Tools, Places to Promote Your Site, Analyzing Your Traffic, and Black Hat SEO Techniques to Avoid.

Overall, Building Findable Websites lays out an excellent collection of tools, tips, and strategies for making websites findable, and the author follows up by providing the code and other tools you'll need to follow his guidance. Anyone serious about getting their website found by the bots should give these approaches some thought, because the battle for eyeballs on the web, like any arms race, requires constant innovation.
