
Content, Popularity, and Architecture: Things All Search Engines Look For

To keep their results relevant, all search engines need to understand the main subject of a Web site. You can help the search engines find your Web site by keeping in mind the three major factors they’re looking for — content, popularity, and architecture:

  • Content: Content is the meat and bones of your Web site. It’s all the information your Web site contains — not just the words but also the engagement objects. Your page’s relevancy increases with your perceived expertise, and expertise is demonstrated through useful, keyword-containing content. Search engine spiders also measure whether you have enough content to suggest that you know what you’re talking about. All else being equal, a Web site with ten pages of content is going to rank worse than a Web site with ten thousand pages of content.

  • Popularity: The Internet is a little like high school in that you are popular as long as a lot of people know you exist and are talking about you. Search engine spiders look at how many other sites link to your Web site, along with the number of outgoing links you have on your own site. Google, especially, seems to love this factor.

  • Architecture: If you walk into a grocery store and find everything stacked haphazardly on the shelves, it’s going to be harder to find things, and you might just give up and go to another store that’s better organized. Spiders do the same thing. Search engines love Wikipedia because of how it’s built. It’s full of searchable text, Alt attribute text, and keyword-containing hyperlinks that support terms used on the page.
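To get a feel for what a spider actually reads, here’s a minimal Python sketch using only the standard library. The page fragment and its keywords are invented for the example; the point is that a crawler sees visible text, Alt attributes, and the anchor text of hyperlinks — and little else:

```python
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Collects what a search engine spider can read from a page:
    visible text, image Alt attributes, and hyperlink anchor text."""
    def __init__(self):
        super().__init__()
        self.text = []        # all visible text on the page
        self.alt_text = []    # Alt attributes on images
        self.link_text = []   # keyword-containing anchor text
        self._in_link = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" in attrs:
            self.alt_text.append(attrs["alt"])
        if tag == "a":
            self._in_link = True

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_link = False

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())
            if self._in_link:
                self.link_text.append(data.strip())

# A made-up page fragment; a real page is, of course, far larger.
page = """
<h1>Classic car restoration</h1>
<p>We restore classic cars. See our
<a href="/engines.html">classic car engine rebuilds</a>.</p>
<img src="mustang.jpg" alt="1965 Mustang restoration">
"""

spider = SpiderView()
spider.feed(page)
print(spider.alt_text)   # ['1965 Mustang restoration']
print(spider.link_text)  # ['classic car engine rebuilds']
```

A page built like Wikipedia’s gives a parser like this plenty to chew on; a page that hides its text inside images or scripts gives it almost nothing.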

You also have some control over two variables that search engines are looking at when they set the spiders on you. One is your site’s response time. If you’re on a server that serves one page per second, the bots can only request pages at that slow rate. A second may seem fast to you, but it’s an eternity for a bot that wants five to seven pages per second. If the server can’t handle more than one page per second, imagine how long it would take the bots to go through 10,000 pages. In order not to crash the server, spiders request fewer pages, which puts a slow site at a disadvantage compared to sites with faster load times. Chances are, bots will index sites on a fast server more frequently and thoroughly.
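The arithmetic behind that disadvantage is easy to sketch. This little Python example just plugs in the figures from the paragraph above — 10,000 pages, a slow server at one page per second versus a bot’s preferred pace of five to seven pages per second:

```python
# How long does a full crawl of the site take at different speeds?
# Figures from the text: a slow server serves 1 page per second,
# while bots would happily fetch 5-7 pages per second.
pages = 10_000

def crawl_hours(pages_per_second):
    """Total crawl time in hours at a given fetch rate."""
    return pages / pages_per_second / 3600

slow = crawl_hours(1)   # server-limited: 1 page per second
fast = crawl_hours(6)   # a bot's preferred pace, mid-range

print(f"At 1 page/sec:  {slow:.1f} hours")   # about 2.8 hours
print(f"At 6 pages/sec: {fast:.2f} hours")   # under half an hour
```

Nearly three hours versus under thirty minutes for the same site — and that gap only widens as the site grows or the spiders throttle back further to avoid crashing the server.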

The second variable is somewhat contested. Some search engine optimization specialists believe that your rank could be affected by your site’s bounce rate. The search engines can detect when a user clicks a result and then returns to click another result a short time later. If a Web site constantly has people loading the first page for only a few seconds before hitting the Back button to return to the search results, it’s a good bet that the Web site is not very relevant. Remember, engines strive for relevancy in their results, so this could very likely be a factor in how they’re determining rankings.
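No one outside the search engines knows exactly how (or whether) they use this signal, but the idea is simple enough to sketch. In this hypothetical Python example, the session data and the five-second cutoff are both invented for illustration — it just counts how often visitors bounce straight back to the results:

```python
# Each tuple: (landing page, seconds spent before returning to results).
# The sessions and the 5-second threshold are made up for illustration.
sessions = [
    ("/widgets.html", 3),    # bounced back almost immediately
    ("/widgets.html", 4),
    ("/widgets.html", 180),  # stayed and actually read the page
    ("/widgets.html", 2),
]

BOUNCE_THRESHOLD = 5  # seconds; a hypothetical cutoff

bounces = sum(1 for _, secs in sessions if secs < BOUNCE_THRESHOLD)
bounce_rate = bounces / len(sessions)

print(f"Bounce rate: {bounce_rate:.0%}")  # 75% -- a bad sign for relevancy
```

A page where three out of four visitors flee within seconds is telling the engine something, whatever weight the ranking algorithm actually gives it.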

So if all search engines are looking at these things, does it matter if you’re looking at Yahoo! versus Google? Yes, it does, because all search engines evaluate subject relevance differently. All of the Big Players have their own algorithms that measure things in a different way than their competition. So something that Google thinks belongs on page 1 of listings might not pop up in the top ten over on Yahoo!