Site owners should periodically verify that their site is fully accessible to both search engine spiders and users.

Robots.txt, for example, is useful when you want to keep crawlers away from certain pages, but accidentally blocking pages that should be crawled will damage rankings and traffic.
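For illustration, here is what a safe, minimal robots.txt might look like for a hypothetical site (the domain and paths are placeholders): only the paths you genuinely want kept out of the crawl are disallowed, and the sitemap is declared so crawlers can find every important URL.

```
# Hypothetical robots.txt at https://www.example.com/robots.txt
User-agent: *
# Only disallow paths you deliberately want kept out of the crawl,
# e.g. internal search results; a stray "Disallow: /" would block the whole site.
Disallow: /internal-search/

# Declaring the sitemap helps crawlers discover every important URL.
Sitemap: https://www.example.com/sitemap.xml
```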

Brands should also look closely at their JavaScript to ensure that the site's vital content remains easily discoverable once pages are rendered. And since customers regularly complain about error messages and pages that fail to load, brands should routinely check for 404 pages and related errors.
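Checking for these errors is easy to script. The sketch below is a minimal example using Python's requests library; the URLs are placeholders, and in a real audit the list would come from your sitemap or crawl data.

```python
import requests

# Illustrative URLs; in practice these would come from your sitemap or a site crawl.
urls_to_check = [
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/old-landing-page/",
]

for url in urls_to_check:
    try:
        response = requests.get(url, timeout=10, allow_redirects=True)
        if response.status_code >= 400:
            # 404s and other 4xx/5xx responses are the pages to fix or redirect.
            print(f"{url} returned {response.status_code}")
    except requests.RequestException as exc:
        # Connection failures matter too: users see these as a site that will not load.
        print(f"{url} failed to load: {exc}")
```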

Given that more searches now occur on mobile than on desktop, and that Google's switch to a mobile-first index is impending, brands should also ensure that any content they publish is built for mobile usage.
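One rough, scriptable signal of mobile-readiness is the presence of a responsive viewport meta tag. The sketch below (again with a placeholder URL) simply fetches a page and looks for one; a missing tag is a hint worth investigating, not proof that the page renders poorly on phones.

```python
import requests

# Placeholder URL; this is a crude string check, so treat a "missing" result as a prompt
# to test the page on real devices rather than a definitive verdict.
url = "https://www.example.com/blog/some-article/"
html = requests.get(url, timeout=10).text.lower()

if 'name="viewport"' not in html:
    print(f"{url}: no viewport meta tag found; check how this page renders on mobile")
else:
    print(f"{url}: viewport meta tag present")
```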

On the user experience side, visitors also pay considerable attention to load speed. Brands should optimize for it, keeping an eye on site features such as cookies and images that can slow pages down when not implemented correctly.
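A simple timing of the page fetch gives a first impression of load speed. The sketch below uses a placeholder URL and only measures the HTML response; rendering, scripts, and image weight also shape what visitors actually experience.

```python
import requests

# Placeholder URL; server response time and HTML size are only part of perceived load speed.
url = "https://www.example.com/"
response = requests.get(url, timeout=10)

print(f"Server response time: {response.elapsed.total_seconds():.2f}s")
print(f"HTML size: {len(response.content) / 1024:.0f} KB")
# Large images, heavy scripts, and cookie/consent widgets loaded on top of this
# HTML are common culprits when the full page still feels slow.
```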

Things to do to improve your site’s accessibility:

  • Check that robots.txt is not blocking important pages from being crawled (see the sketch after this list)
  • Make sure the robots.txt file contains the sitemap URL
  • Verify that all important resources, including JS and CSS, are crawlable
  • Find and fix any 404 errors
  • Check that all content, including videos, plays easily on mobile
  • Optimize for load speed
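
The first three items can be spot-checked with a short script. The sketch below (placeholder site and URLs) reads a live robots.txt using Python's urllib.robotparser, flags any important page or resource that the file blocks, and warns if no sitemap is declared.

```python
import urllib.robotparser

# Placeholder site and URLs; substitute your own pages, JS, and CSS files.
parser = urllib.robotparser.RobotFileParser("https://www.example.com/robots.txt")
parser.read()

important_urls = [
    "https://www.example.com/products/",
    "https://www.example.com/assets/app.js",
    "https://www.example.com/assets/styles.css",
]

for url in important_urls:
    if not parser.can_fetch("*", url):
        print(f"Blocked by robots.txt: {url}")

# site_maps() requires Python 3.8+ and returns None if no Sitemap line is present.
if not parser.site_maps():
    print("No sitemap declared in robots.txt")
```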