Review Sitemaps, Robots.txt, QA Google Search Console

Category: SEO

The website's sitemap.xml and robots.txt files communicate directly with the search engines, telling them how the website should be crawled and indexed.

Sitemap.xml

The sitemap.xml is a file that lists the URLs on a website that should be indexed. It is submitted to the search engines to make it easier for them to discover and index the website's URLs. It is important that the URLs we want indexed are listed accurately.
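A minimal sitemap.xml looks like the fragment below (the domain and URLs are placeholders, not taken from a real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/floor-plans/</loc>
  </url>
</urlset>
```

Each `<loc>` entry is one URL we want the search engines to index; `<lastmod>` is optional and tells crawlers when the page last changed.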

What does Tractorbeam do?

We run a crawl of the website to get every URL. We compare our crawl to the sitemap.xml file to ensure accuracy.
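The comparison step can be sketched as simple set math between the two URL lists. This is an illustrative sketch with made-up URLs, not our actual crawl tooling:

```python
# Hypothetical data: URLs found by crawling the live site vs. URLs in sitemap.xml
crawled = {
    "https://www.example.com/",
    "https://www.example.com/floor-plans/",
    "https://www.example.com/contact/",
}
sitemap = {
    "https://www.example.com/",
    "https://www.example.com/floor-plans/",
    "https://www.example.com/old-page/",
}

# Live pages the sitemap is missing, and sitemap entries the crawl couldn't find
missing_from_sitemap = sorted(crawled - sitemap)
stale_in_sitemap = sorted(sitemap - crawled)

print("Add to sitemap:", missing_from_sitemap)
print("Remove from sitemap:", stale_in_sitemap)
```

Any URL in `missing_from_sitemap` should be added to the sitemap; anything in `stale_in_sitemap` is a candidate for removal or a redirect.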

Robots.txt

The robots.txt file lists the URLs that should NOT be crawled, and it also points to the sitemap.xml file. Search engine crawlers check the robots.txt file to see which URLs they should avoid and to find the sitemap.xml file.
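A typical robots.txt is only a few lines. The paths and domain below are placeholders for illustration:

```
User-agent: *
Disallow: /admin/
Disallow: /thank-you/

Sitemap: https://www.example.com/sitemap.xml
```

`User-agent: *` applies the rules to all crawlers, each `Disallow` line blocks a path from being crawled, and the `Sitemap` line tells crawlers where to fetch the sitemap.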

What does Tractorbeam do?

When reviewing the robots.txt file, we ensure the right URLs are disallowed and that the sitemap is present.

Technical Audit

Having a technically sound website is vital to achieving organic rankings. A technical audit refers to optimizing a website for crawling and indexing. While "technical audit" is a broad catch-all term, in this section we are specifically crawling the website to find 404 errors and 301 and 302 redirects.

When a crawler hits a page, it should return a 200 status code. This means the page loaded successfully.

A 404 status code means the page could not be found; the page is broken or no longer exists.

A 302 redirect is a temporary redirect, signaling to the search engines that the page will come back in the future.

A 301 redirect is a permanent redirect, signaling to the search engines that the page has permanently moved to a new URL.
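The status codes above can be summarized in a small lookup, the kind of mapping a crawl report uses to label each URL. This is an illustrative sketch, not a real crawler:

```python
def describe_status(code: int) -> str:
    """Map common HTTP status codes to their meaning in a crawl report."""
    meanings = {
        200: "OK - page loaded successfully",
        301: "Permanent redirect - page moved for good",
        302: "Temporary redirect - page expected to return",
        404: "Not found - page is broken",
    }
    return meanings.get(code, "Other - review manually")

print(describe_status(200))
print(describe_status(404))
```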

What does Tractorbeam Do?

We use redirects on pages we want to delete so that people end up in the right location instead of an error page. For example, if a floor plan is renamed, we add a 301 redirect to the new floor plan page. This signals to the search engines that the old page is no longer around and that the new page should be indexed instead.
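On an Apache server, the floor plan example might look like the rule below in an .htaccess file. The server type and both URLs are assumptions for illustration only:

```
# Hypothetical rule: permanently redirect a renamed floor plan page
Redirect 301 /floor-plans/the-oak/ https://www.example.com/floor-plans/the-maple/
```

Other platforms (nginx, CMS redirect plugins) express the same 301 rule in their own syntax.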

Our goal in the technical audit is to identify and clean up any error pages so the entire site can be crawled without errors.