SEO Checklist

With this checklist you can verify that your site is still search-engine friendly. It contains only checks that are relevant to the site as a whole; checks for optimizing individual pages (keyword density, markup, etc.) are not included.

As always, we welcome suggestions. Contact us!

Are pages that should be indexed accessible to robots?

Make sure that all pages that should be indexed are accessible to robots.

  1. Check your robots.txt:
    • If it exists, make sure it only blocks pages that really should be blocked. You can use Google's Webmaster Tools for this.
    • If it does not exist, make sure that the server returns a 404 status code. Otherwise, search engines might assume a temporary error and halt crawling until either a robots.txt is successfully retrieved or a 404 is returned.
  2. Make sure that the pages are not blocked by a robots meta tag.
  3. Make sure that your server does not send an X-Robots-Tag header (blocking pages this way is highly unusual).
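
As a quick automated check, here is a minimal sketch of steps 1 and 3 using only Python's standard library; https://www.example.com is a placeholder for your own site. Step 2 (the robots meta tag) would additionally require parsing each page's HTML.

    import urllib.error
    import urllib.request

    SITE = "https://www.example.com"  # placeholder for your own site

    def check_robots_txt(site):
        """Fetch /robots.txt and report its HTTP status."""
        try:
            with urllib.request.urlopen(site + "/robots.txt") as resp:
                print("robots.txt found (status %d) - review which pages it blocks" % resp.status)
        except urllib.error.HTTPError as e:
            if e.code == 404:
                print("No robots.txt; the server correctly returns 404")
            else:
                print("robots.txt request returned %d - crawlers may halt crawling" % e.code)

    def check_x_robots_header(url):
        """Warn if the server sends an X-Robots-Tag header for a page."""
        with urllib.request.urlopen(url) as resp:
            tag = resp.headers.get("X-Robots-Tag")
            if tag:
                print("%s sends X-Robots-Tag: %s" % (url, tag))
            else:
                print("%s sends no X-Robots-Tag header" % url)

    check_robots_txt(SITE)
    check_x_robots_header(SITE + "/")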

Are pages that should NOT be indexed really blocked for robots?

On most web projects, there are a few pages that should not be indexed. Make sure that these are blocked via robots.txt or a robots meta tag.
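
A minimal sketch using Python's urllib.robotparser to verify that a page is disallowed; the /admin/ URL is a hypothetical example of a page you expect to be blocked. Note that this covers only robots.txt, not meta tags.

    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")  # placeholder site
    rp.read()

    blocked_url = "https://www.example.com/admin/"  # hypothetical private page
    if rp.can_fetch("*", blocked_url):
        print("WARNING: %s is crawlable" % blocked_url)
    else:
        print("OK: %s is blocked for robots" % blocked_url)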

Did you use the right redirects?

If you are redirecting some pages to other pages, did you use a 301 redirect?

There are two kinds of redirects: one with the HTTP status code 301 and one with the status code 302. Code 301 signals that the page has moved permanently, while 302 signals only a temporary redirect. If you use a permanent (301) redirect, search engines will transfer all "link power" (PageRank and other factors) to the new page.
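
If you want to verify this from a script, the following minimal sketch suppresses urllib's automatic redirect handling so the raw status code becomes visible; the old-page URL is a placeholder.

    import urllib.error
    import urllib.request

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # returning None stops urllib from following the redirect

    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open("https://www.example.com/old-page")  # placeholder URL
        print("No redirect - status %d" % resp.status)
    except urllib.error.HTTPError as e:
        # With redirect-following suppressed, 301/302 surface as HTTPError
        if e.code == 301:
            print("OK: permanent redirect to %s" % e.headers.get("Location"))
        elif e.code in (302, 307):
            print("Check: temporary redirect (%d) - use 301 for moved pages" % e.code)
        else:
            print("Status %d" % e.code)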

Is the site still fast enough?

Site speed has become a ranking factor. Check that your site has not become too slow. You can use Google PageSpeed Insights for this.
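
As a very rough proxy, you can time the raw HTML download from a script; PageSpeed Insights measures far more (rendering, assets, caching), so treat this sketch only as a first smoke test. The URL and the one-second threshold are placeholder values.

    import time
    import urllib.request

    url = "https://www.example.com/"  # placeholder
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    elapsed = time.perf_counter() - start
    print("Downloaded %s in %.2f seconds" % (url, elapsed))
    if elapsed > 1.0:  # threshold is an arbitrary example value
        print("Slow response - investigate with PageSpeed Insights")
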
Is there no duplicate content?

If large blocks of content are basically the same on multiple pages, this might lead to a ranking problem. See Google's Webmaster Tools Help for more information on how to avoid duplicate content.

Careful: It is easy to create duplicate content accidentally if you don't handle domain names consistently (www.domain.com vs. domain.com) or when dynamic pages accept URL parameters in varying order.
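
The following minimal sketch checks how the two domain variants respond; ideally one of them answers with a 301 redirect to the other. The domain names are placeholders for your own.

    import urllib.error
    import urllib.request

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # do not follow redirects automatically

    opener = urllib.request.build_opener(NoRedirect)
    for host in ("https://domain.com/", "https://www.domain.com/"):
        try:
            resp = opener.open(host)
            print("%s answers directly with status %d" % (host, resp.status))
        except urllib.error.HTTPError as e:
            print("%s redirects (%d) to %s" % (host, e.code, e.headers.get("Location")))

    # If both variants answer with status 200, search engines may index every
    # page twice - 301-redirect one variant to the other.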

Is there an XML sitemap and is it up to date?

If your site is difficult to crawl (complicated URL structure, fast-changing content, etc.), consider creating an XML sitemap to help search engines crawl your site. See Google Webmaster Tools Help for more information.

If you have added new pages or sections, make sure they are reflected in the sitemap.
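
To spot-check a sitemap from a script, the following minimal sketch downloads it and prints the listed URLs with their lastmod dates; the sitemap location is a placeholder.

    import urllib.request
    import xml.etree.ElementTree as ET

    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    with urllib.request.urlopen("https://www.example.com/sitemap.xml") as resp:
        tree = ET.parse(resp)

    urls = tree.findall("sm:url", NS)
    print("%d URLs listed in the sitemap" % len(urls))
    for url in urls[:5]:  # print a small sample
        loc = url.findtext("sm:loc", namespaces=NS)
        lastmod = url.findtext("sm:lastmod", default="(no lastmod)", namespaces=NS)
        print(loc, lastmod)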

Are there no broken links?

Make sure that there are no broken links on your site.

You can check the reports in Google Webmaster Tools or use software like Xenu's Link Sleuth to automatically find broken links on your site.
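
For a quick single-page spot check, the following minimal sketch collects the links on one page and reports those that return an HTTP error; the start URL is a placeholder. Dedicated tools like the ones above cover the whole site.

    import urllib.error
    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class LinkCollector(HTMLParser):
        """Collect href values from all anchor tags on a page."""
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href and href.startswith(("http", "/")):
                    self.links.append(href)

    page = "https://www.example.com/"  # placeholder start page
    with urllib.request.urlopen(page) as resp:
        parser = LinkCollector()
        parser.feed(resp.read().decode("utf-8", errors="replace"))

    for link in parser.links:
        absolute = urljoin(page, link)
        try:
            with urllib.request.urlopen(absolute):
                pass  # reachable
        except urllib.error.HTTPError as e:
            print("Broken link: %s -> %d" % (absolute, e.code))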