What’s the secret to SEO? I’ve seen this question pop up a lot recently, so I figured I’d reiterate my thoughts here.
First off, sure, let’s get the little things out of the way. Always put alt text on your images, use clean and concise URLs, wrap important text in header tags in a sensible hierarchy, avoid using JavaScript and form posts for navigation, and so on. These are all good things to do, but there’s a bigger, more important rule: never forget that you’re building a website for humans.
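To make those little things concrete, here’s a minimal page fragment sketching what they look like in practice. The URLs, headings, and image are hypothetical; the pattern is the point: descriptive alt text, a sensible heading hierarchy, and navigation built from plain links that bots and humans can both follow.

```html
<!-- Hypothetical fragment illustrating the basics above -->
<body>
  <nav>
    <!-- Clean, concise URLs as plain anchors; no JavaScript or form posts needed -->
    <a href="/articles/seo-for-humans">SEO for Humans</a>
    <a href="/about">About</a>
  </nav>

  <article>
    <!-- One h1 for the page's topic, h2s for its sections, in order -->
    <h1>SEO for Humans</h1>
    <h2>Start with the basics</h2>
    <p>Build the site for people first; the rankings follow.</p>

    <!-- Alt text that describes the image, not a keyword dump -->
    <img src="/images/rankings-chart.png"
         alt="Chart of search rankings rising after a content rewrite">
  </article>
</body>
```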
Years ago, the SEO landscape was very different. It was all a game to see who could exploit search engine bots the most. The web was littered with mass gateway pages, excessive meta keywords, irrelevant search terms, invisible text, and all sorts of other annoying things that appealed to bots, but were useless to humans.
This was a problem for search engines. Why would a human use a search engine that only found bot bait? To be successful, search engines needed to close the gap between what a human likes and what a bot likes. The industry has made huge leaps in this regard (Google in particular) and is continuing to evolve to this end. Because of this, if you build a good site for humans, the SEO will follow.
All those bot exploitation tricks I mentioned? None of them work anymore. Search engines have made these short-sighted tactics obsolete since they didn’t provide any value to humans. For good, dependable search engine rankings, a site needs to provide good content, have natural popularity, and employ sensible SEO practices that don’t resort to spam and brittle tricks.
The New York Times recently ran a great article about the dirty little secrets of search. It discusses both sides of the coin, from people who consider black-hat SEO necessary to people who have been burned for gaming the system. Of particular interest is JC Penney’s story: they held the top spot for a number of search terms, but saw their rankings plummet after “manual action” from Google.
Oh, and by the way, meta keywords are a placebo.