Invite Search Engine Spiders to Index Your Web Site - dummies


You may discover that important pages on your Web site haven’t been indexed by the search engines. In that case, you can invite the search engine spiders to your site to follow all of your internal links and index your site’s contents. Here are several effective ways to deliver that invitation to the search engine spiders:

  • External links. Have a link to your missing page added to a Web page that gets crawled regularly. Make sure that the link’s anchor text relates to your page’s subject matter. Ideally, the anchor text should contain your page’s keywords. Also, the linking page should relate to your page’s topic in some way so that the search engines see the link as relevant. After the link is in place, the next time the spiders come crawling, they follow that link right to your page. This sort of “natural discovery” process can be the quickest, most effective way to get a page noticed by the search engines.

  • Direct submission. Each search engine provides a way for you to submit a URL, which then goes into a queue waiting for a spider to go check it out. It’s not a fast or even reliable method to get your page noticed, but it won’t hurt you to do it.

  • Internal links. You should have at least two links pointing to every page in your Web site. This helps ensure that search engine spiders can find every page.

  • Site map. You should have a site map for your users, but for the search engines, you want to create another site map in XML (eXtensible Markup Language) format. Make sure that your XML site map contains the URLs of the missing pages, as well as every other page you want indexed. When a search engine spider crawls your XML site map, it follows the links and is more likely to index your site thoroughly.
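An XML site map is just a list of URLs wrapped in a standard format. Here is a minimal sketch of one; the domain, page paths, and date below are placeholders, so substitute your own URLs, including any pages the spiders have missed:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want indexed -->
  <url>
    <loc>https://www.example.com/</loc>
    <!-- <lastmod> is optional; it tells spiders when the page last changed -->
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/classic-car-faq.html</loc>
  </url>
</urlset>
```

Save the file (commonly as sitemap.xml in your site’s root directory) and submit it through each search engine’s webmaster tools.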

The two versions of your site map provide direct links to your pages, which is helpful for users and important for spiders. Search engines use the XML site map file as a central hub for finding all of your pages. But the user’s site map is also crawled by the search engines. If the site map provides valuable anchor text for each link (for example, “Frequently Asked Classic Car Questions” instead of “FAQs”), it gives the search engines a better idea of what your pages are about. Google specifically states in its guidelines that every site should have a site map.
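In HTML terms, the difference between a vague link and a descriptive one on your user site map is small but meaningful. The file name and link text below are made up for illustration:

```html
<!-- Vague anchor text tells the search engines little about the target page: -->
<a href="faq.html">FAQs</a>

<!-- Descriptive, keyword-rich anchor text is far more informative: -->
<a href="faq.html">Frequently Asked Classic Car Questions</a>
```

Both links point to the same page, but only the second one tells the spiders (and your users) what they’ll find there.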

There is a limit to the number of links you should have on the user-viewable site map. Small sites can place every page on their site map, but larger sites should not. Having more than 99 links on a page looks suspicious to a search engine because spammers have tried to deceive the search engines by setting up link farms for profit, which are just long lists of unrelated hyperlinks on a page. So include only the important pages, or split the site map into several pages, one for each main subject category.

Unlike a traditional site map, an XML site map doesn’t have a 99-link limit. The protocol still has limits (each XML site map file can hold at most 50,000 URLs), but the file is meant to act as a “feed” directly to the search engines. For full details on how to create an XML site map, visit the official site map guideline site run by the search engines.
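If your site is large enough to bump into the per-file URL limit, splitting the list across several XML site map files can be automated. The sketch below, in Python, chunks a list of URLs into separate site map documents; the domain and page names are hypothetical, and a tiny chunk size is used here just to show the splitting (in practice you would keep the 50,000 default):

```python
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemaps(urls, max_per_file=50000):
    """Split a URL list into one or more XML site map documents.

    The sitemap protocol caps each file at 50,000 URLs, so larger
    lists must be spread across several files.
    """
    sitemaps = []
    for start in range(0, len(urls), max_per_file):
        chunk = urls[start:start + max_per_file]
        # escape() protects URLs containing &, <, or > characters
        entries = "\n".join(
            f"  <url><loc>{escape(u)}</loc></url>" for u in chunk
        )
        sitemaps.append(
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            f'<urlset xmlns="{SITEMAP_NS}">\n{entries}\n</urlset>'
        )
    return sitemaps

# Hypothetical example: 5 URLs split into files of at most 2 each.
pages = [f"https://www.example.com/page{i}" for i in range(1, 6)]
files = build_sitemaps(pages, max_per_file=2)
print(len(files))  # 3 files: 2 + 2 + 1 URLs
```

Each resulting document can be saved as its own file (sitemap1.xml, sitemap2.xml, and so on); the protocol also defines a site map index file that ties multiple site maps together.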