Server SEO

Simplify Google Crawl

Server files can perform many SEO functions, such as page redirects, enforcing a single canonical URL, disallowing select pages from indexing, and signaling crawl rate and page importance. Previously, accomplishing this meant uploading three separate files. That was easy on the HTML-based websites of the past, but today many websites sit on the WordPress platform, so we use a different method involving a WordPress plugin.


canonicalization

Canonicalization is the process of establishing a single URL for Google to index. A website without canonicalization can be indexed as a duplicate site under a separate URL, one version with and one without the www. prefix. By creating an .htaccess redirect file and uploading it to the server, we force a single URL to occur. Google recommends a single URL and verifies it as the 'Preferred URL'.
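As a rough sketch of how such a redirect can be written (assuming an Apache server with mod_rewrite enabled, and example.com standing in for the actual domain), the .htaccess file might contain:

    # Send non-www requests to the www version with a permanent (301) redirect
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

Because the redirect returns a 301 status, crawlers treat the www version as the permanent address and consolidate indexing on that single URL.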


xml sitemap

An XML sitemap signals the crawl rate and page importance of the website. We create a custom XML sitemap that lists each indexable page in the website along with a timestamp and a page importance value. The sitemap is uploaded via FTP, and we verify its presence in Google's webmaster tools.
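For illustration, a minimal sitemap listing two pages might look like the following (the URLs, dates, and priority values are placeholders, not real data):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2024-01-15</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>https://www.example.com/about/</loc>
        <lastmod>2024-01-10</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>

Each url entry pairs a page address with its last-modified timestamp and a priority value between 0.0 and 1.0 that hints at its relative importance.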


robots.txt file

Unused file structures and other clutter often sit on a web server alongside the intended website content, making it difficult for Google's crawlers to determine what the actual content is. A robots.txt file makes the crawling process smoother by disallowing irrelevant content from being crawled and indexed.
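As an illustrative example of what such a file might contain on a WordPress site (the directory names and sitemap URL are typical placeholders, not a universal template):

    # Allow all crawlers, but keep them out of admin and server clutter
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /cgi-bin/
    Allow: /wp-admin/admin-ajax.php

    # Point crawlers at the XML sitemap
    Sitemap: https://www.example.com/sitemap.xml

The Disallow lines steer crawlers away from directories that hold no indexable content, while the Sitemap line ties back to the XML sitemap described above.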