Server Side Files Important for SEO


Focusing the Google Crawl

We use an .htaccess file to canonicalize the site to a single URL, and we create a robots.txt file and an XML sitemap to streamline the crawl for search engine bots and keep it focused on real content.


These tasks are important enough that Google Webmaster Tools (WMT) reports on whether they have been completed.


Canonicalization

Canonicalization is the process of designating a single, preferred URL for Google to index when the same content is reachable at more than one address.

Without canonicalization, a website can be indexed as a duplicate site under a separate URL, typically appearing once with and once without the www prefix.

We create an .htaccess redirect file and upload it to the server, forcing every request to resolve to a single URL. Google recommends this procedure and confirms in Webmaster Tools that it has been done.
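
As a minimal sketch, assuming an Apache server with mod_rewrite enabled and example.com standing in for the real domain, the redirect rules in the .htaccess file look like this:

    # Minimal sketch, assuming Apache with mod_rewrite enabled.
    # example.com is a placeholder for the actual domain.
    RewriteEngine On
    # Match requests arriving on the bare (non-www) host...
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    # ...and 301-redirect them to the canonical www host,
    # preserving the requested path.
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

The 301 status code tells Google the move is permanent, so ranking signals consolidate on the www version rather than being split across duplicates.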


XML Sitemap File

An XML sitemap enables Google to crawl your website more thoroughly. We create a custom XML sitemap that lists every indexable page on the site along with a last-modified timestamp and a page-priority value.
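
A minimal sketch of the sitemap format follows, with placeholder URLs, dates, and priorities:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per indexable page; loc, lastmod,
           and priority values below are placeholders. -->
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2024-01-15</lastmod>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/services/</loc>
        <lastmod>2024-01-10</lastmod>
        <priority>0.8</priority>
      </url>
    </urlset>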

We create an account in Google Webmaster Tools, upload the sitemap to the server via FTP, and submit its URL through the account tools so Google can crawl and index the site promptly.


Robots File

Web servers often accumulate clutter and unused file structures alongside the actual website content, and Google's crawlers cannot always tell the two apart. A robots.txt file keeps the crawl smooth and relevant by disallowing irrelevant content from being crawled and indexed.
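
A minimal sketch of a robots.txt, with placeholder paths standing in for whatever clutter actually sits on the server:

    # Minimal sketch; the disallowed paths are placeholders.
    # The wildcard applies these rules to all crawlers.
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/
    Disallow: /old-site/
    # Pointing bots at the sitemap is optional but conventional.
    Sitemap: http://www.example.com/sitemap.xml

The file lives at the root of the domain, so crawlers find it at a predictable address before they request anything else.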