Building Websites with Findability in Mind – Stuart Colville [WSG notes]


Who needs to find your content? Readers, potential users, customers, etc.

Basic requirements:

  • You need to understand your potential audience. If you want to make your site findable, you first have to know who you want to find it.
  • Use semantic markup.
  • Good context
  • Accessibility
  • Search engine friendliness
  • No barriers to indexation

Making content more appealing to your audience: niche subjects have a lot less competition from other sites, so finding your niche is good. Originality is good, and re-mixing other people's content is also really good. It's important to stay on topic for your subject. Comments and discussions create lots of new content for your site and give a new perspective on it.
A powerful audience can create great stuff: user reviews, tagging, etc.

Folksonomies (tags) are a great thing for your site. They can form a basis for your search, and they can replace fixed categorisation in your blog. Tagging enables association in a different way, and tags are keyword-rich, which is why search engines love them.
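As a sketch, a tag link marked up with the rel-tag microformat might look like this (the URL is invented; in rel-tag, the last path segment of the href is the tag name):

```html
<!-- rel-tag: machine-readable tag, with the tag name as the final path segment -->
<a href="http://example.com/tags/findability" rel="tag">findability</a>
```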

Biggest search engine around is still Google.

What can we do to make our content more findable?

  • Web standards
  • Semantic markup
  • Recommendation SEO test poster
  • Accessible content is indexable content

Markup: meta. The meta element is not such a big deal anymore, but meta descriptions are still important, as they are still used to describe your web page.
One good plugin for WordPress is “Meta-description plugin”.
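A minimal sketch of a meta description in the head (the description text is made up for illustration):

```html
<head>
  <title>Building Websites with Findability in Mind</title>
  <!-- Often used by search engines for the result snippet; keep it a concise summary of the page -->
  <meta name="description" content="Notes on making web content easier for people and search engines to find.">
</head>
```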

Markup: Titles and Headings. You want to make the title very relevant, as it is the title that will show up in del.icio.us and in people's bookmarks. Use headings, especially the h1 element, with one h1 per page, as it tells the search engine what the most important piece of information is.
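A sketch of the title and heading pattern described above (the text is just this page's own title reused as an example):

```html
<!-- One descriptive title per page: this becomes the bookmark and del.icio.us label -->
<title>Building Websites with Findability in Mind</title>

<!-- One h1 per page, carrying the most important information -->
<h1>Building Websites with Findability in Mind</h1>
<h2>Markup: Titles and Headings</h2>
```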

Markup: Text. Strong and emphasis elements do carry higher weight, but it's usually better to err on the side of what is semantic than to optimize for search engines.
Duplicate content is not something to worry about too much, especially if it's natural duplication.

Markup: Imagery. If something is a pure design element, background images are good for that. If an image carries data, place it inline with an img element. If you are thinking about performance, spriting images is really great. Use all attributes correctly, and make sure the alt text is set properly.
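A sketch of the decoration-versus-content distinction (filenames and alt text invented):

```html
<!-- Pure decoration: keep it out of the markup, in CSS as a background image -->
<div class="banner"></div>

<!-- Content-bearing image: inline img, with alt text that conveys the same data -->
<img src="traffic-chart.png" alt="Site traffic doubled between January and June">
```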

Markup: Microformats. They add a simple layer of structure around common types of content, providing an easy way for machines to read it. There are some powerful tools around this: X2V, which takes a site and extracts calendar and hCard information, and the Operator plugin for Firefox. The real power comes from sites that can then reuse this information to create powerful new products.
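A minimal hCard sketch (the class names come from the hCard microformat; the person and URL are invented):

```html
<!-- hCard: vcard/fn/url/org are microformat class names that tools like X2V and Operator recognise -->
<div class="vcard">
  <a class="fn url" href="http://example.com/">Jane Example</a>,
  <span class="org">Example Ltd</span>
</div>
```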

Markup: JavaScript. Unobtrusive JavaScript is the way to go. It's important not to put content on your site that is useless without JavaScript. If an element is only needed when JavaScript is running, you should generate/inject it into the page with JavaScript.
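A sketch of that injection pattern (the toggle button is a hypothetical example): the navigation list is in the HTML and works for everyone, while the JS-only control is created by the script itself.

```html
<!-- The base page works without JavaScript -->
<ul id="nav">
  <li><a href="/archive/">Archive</a></li>
</ul>
<script>
  // Hypothetical enhancement: a show/hide toggle only makes sense with
  // scripting available, so it is injected here rather than hard-coded in the HTML.
  var nav = document.getElementById('nav');
  var toggle = document.createElement('button');
  toggle.appendChild(document.createTextNode('Toggle navigation'));
  toggle.onclick = function () {
    nav.style.display = (nav.style.display === 'none') ? '' : 'none';
  };
  nav.parentNode.insertBefore(toggle, nav);
</script>
```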

To Ajax or not to Ajax? Different approaches to the same problem: a single page with div switching, or emulating the whole experience with server-generated pages.

Performance and indexation. Better performance is good for your users and for search engines. Last-Modified headers are a good thing, as they let search engines focus on your newest content. One helpful tip: if you have Google ads on a page, Google will visit the site more often, as it re-indexes the page to make sure the ads are still relevant.
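A sketch of the conditional GET that a Last-Modified header enables (URL and dates invented): the crawler sends back the date it last saw, and if nothing changed the server can answer 304 Not Modified instead of resending the whole page.

```
HTTP/1.1 200 OK
Last-Modified: Tue, 15 Apr 2008 10:00:00 GMT

(crawler's next visit)
GET /articles/findability HTTP/1.1
If-Modified-Since: Tue, 15 Apr 2008 10:00:00 GMT

HTTP/1.1 304 Not Modified
```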

URLs are important because they make your pages better for both users and search engines.

“A cool URI is one which does not change” – Tim Berners-Lee

How to move your content around

De-indexing: when you want to remove some content from search engines. By returning a proper 404 you tell search engines that the page went away. Alternatively you can return a 410 Gone, to say that the page was purposefully removed.
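The two status lines side by side, as an annotated sketch of what the server would return for a removed page:

```
HTTP/1.1 404 Not Found    (the resource simply isn't there)
HTTP/1.1 410 Gone         (the resource was deliberately removed)
```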

Robots.txt allows you to set up rules defining what you do and don't want search engines to index.
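A minimal robots.txt sketch (the disallowed path is invented for illustration):

```
# Applies to all crawlers; keep the hypothetical admin area out of the index
User-agent: *
Disallow: /admin/
```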

A 301 redirect is the best way to tell search engines that a page has permanently moved.
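One way to issue such a redirect, as a sketch using Apache's mod_alias (paths and domain invented):

```
# Old URL permanently redirected to its new home; crawlers update their index accordingly
Redirect 301 /old-post.html http://example.com/new-post/
```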

Putting it all together

Google Webmaster Tools is a useful resource that tells you how you are doing with regard to standards and how search engines see your pages.
Google Analytics or your log files are ways to learn about your site.

Summary

It’s important to think about findability before you write your code. Good content together with semantic markup is something you should always aim for. The content should be accessible with or without JavaScript. You can control indexation with robots.txt, and you can handle moved content with a 301 Moved Permanently header.
