Make Your Web Site W3C-Compliant for Better Search Engine Results
One way to improve your Web site’s search engine results is to validate your code. Validating code means making your Web site W3C compliant. The World Wide Web Consortium (W3C) is an international consortium where member organizations, a full-time staff, and the public work together to develop Web standards. Their mission statement is “to lead the World Wide Web to its full potential by developing protocols and guidelines that ensure long-term growth for the Web.”
W3C goes about achieving this by creating Web standards and guidelines. It’s basically like health code guidelines for a restaurant: a Web site needs to meet certain standards in order to keep its flaws to a minimum. Since 1994, W3C has published more than 110 such standards, called W3C Recommendations.
A compliant page is known to be spiderable and the links crawlable, so although the search engines do not require W3C compliance, it’s not a bad idea. If you have complex or just plain ugly Web page code, or you’re having issues getting your pages crawled and indexed, validating your code to W3C standards might help.
On the front page of the W3C’s Web site is a sidebar called W3 A to Z, which contains all sorts of links. Bookmark this page: these links are a great reference to help you understand the standards that the Web is built on.
Here’s something about search engines: The harder they have to work to read your site, the less often and less thoroughly they index it. Because indexed content tends to build authority, a site that is only shallowly indexed is less likely to receive top rankings. In fact, if a search engine has to work too long at reading your page, it might just abandon it altogether.
So it’s a good idea to follow the W3C standards, simply because they make for a faster, more efficient page that is set up the way a search engine expects to find things. It’s like having your house swept clean and in order when the spiders come to visit: It makes them like what they see. (Internet spiders, that is. It doesn’t work that way for the arachnid variety.) If your site doesn’t comply with the W3C standards, the search engines might not crawl all of the pages on your site. Because you can’t rank pages the search engines don’t know about, that’s a big problem.
To comply with W3C, every page should declare a doc type (document type) and validate against it. To declare your doc type, include a line at the very top of your HTML code that declares the document as an HTML document and identifies the version of HTML you are following. Because HTML has changed since the early days, the various versions differ from one another. Declaring a doc type tells the search engine what it’s going to be reading. It’s important to comply with your declared doc type. If you don’t, you confuse the search engine spider, and it takes longer to crawl your pages.
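For example, a modern HTML5 page declares its doc type with a single line before the `<html>` tag (older versions, such as XHTML 1.0, used much longer declarations). The page title and content below are placeholders, just to sketch a minimal page that validates:

```html
<!DOCTYPE html>
<!-- The doc type above must be the very first line of the file -->
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>Example page</title>
  </head>
  <body>
    <p>Hello, spiders.</p>
  </body>
</html>
```

Whichever version you declare, the rest of the page has to follow that version’s rules; that match is exactly what the validator checks.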
To validate your page, go to the W3C Web site and use the free tools on that page, as shown in the above figure.
These are tools that you can use to basically “proofread” your site in order to make sure it complies with the W3C standards.
The first tool on the site is the MarkUp Validation Service, shown in the above figure. Also known as the HTML validator, it helps check Web documents in formats like HTML, XHTML, SVG, or MathML. Plugging your site’s URL into the box allows the tool to check your Web site to see if the code matches the declared doc type.
The second tool is the Link Checker (shown in the above figure). It checks anchors (hyperlinks) in an HTML/XHTML document. It’s useful for finding broken links, redirected pages, server errors, and so on. As you can see in the figure above, the search has some options, such as ignoring redirects and checking the links on the pages linked to from the original page. You can also save the options you set in a cookie, to make it quick to run the tool again in the future. If you don’t select Summary Only, you can watch it go through each link on the page. Most of the time, you can just run the tool without making any adjustments, so don’t stress about the options.
Many other link checkers are also available. The Link Checker tool is great for checking one page (if you were putting up a single new page with a lot of links, for example), but for spidering a whole site, you may prefer Xenu’s Link Sleuth, a great (and free!) link-checking tool that makes sure all of your links work.
The third tool on W3C’s site is the CSS Validation Service, which validates CSS style sheets or documents using CSS style sheets. As shown in the above figure, it works a lot like the Markup Validation Service. Just put in the URL of the site you want to have validated.
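As an illustration of what the CSS validator catches (the `.sidebar` class name here is invented for the example), common slips include misspelled property names and missing units:

```css
/* Invalid: "colour" is not a CSS property, and "10" lacks a unit */
.sidebar {
  colour: #333;
  margin: 10;
}

/* Valid equivalents that the validator will pass */
.sidebar {
  color: #333;
  margin: 10px;
}
```

Browsers quietly ignore invalid declarations like the first block, so your page may look subtly wrong without any error message; the validator makes those silent failures visible.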
Validating your CSS helps ensure that your site looks picture perfect whenever a standards-compliant browser (like Firefox) or search engine spider (like Googlebot) comes by and checks it out.