There are many things that website developers need to change, update and check before a new website is deployed to a live environment. More often than not, I review a website that has just been deployed and find little things that annoy me; things that should have been checked, changed and updated before going live. Sadly, that is not always the case, which is why I am creating this checklist to help me and my team deliver better websites for our customers.
I find that many web developers are unconcerned with, or simply unaware of, search engine optimisation (SEO), the size of the .js and .css files they deploy, and the little things that can make a customer’s website rank slightly better with search engines than others.
I would like to use this post to build my own checklist of things that I think should be checked before a new or updated website goes live. I will update this post from time to time with additional checklist items.
Website go-live checklist:
- Have you used a ‘Release’ build of the code base (when using compiled code such as .NET)?
- Have you included a robots.txt file?
- Have you validated your CSS files?
- Have you validated your HTML?
- Have you removed unused .js and .css file references?
- Have you disabled any unnecessary logging in your code base?
- Have you enabled exception logging and reporting?
- Have you removed any testing data and made sure a set of clean reference data is available?
- Have you removed any test user accounts that you may have created?
- Do all your pages have unique page titles? (Meta title)
- Have you selected your canonical domain name, and do you issue 301 redirects for any other aliases you may have? (It is nice to try and keep the path when redirecting a visitor)
- Have you enabled caching on the website? (There are many caching options to choose from, such as query caching, output caching and reverse proxies)
Nice to have:
- Do all your pages have Meta Descriptions?
- Have you published a sitemap file and linked to it in your robots.txt file?
- Have you submitted sitemaps to Google, Bing and Yahoo webmaster sites?
- Have you added Google Analytics (or equivalent) to the site? (It only takes 5 minutes and it is free)
- Have you combined your .css files?
- Have you combined your .js files?
- Have you minified your .css and .js files?
- Have you got 404 handlers in place to handle links from the old website?
- Are you serving common script files such as jQuery from a Content Delivery Network (CDN)?
- Have you sprited images where possible?
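As an example of the robots.txt and sitemap items above, here is a minimal robots.txt that allows all crawlers and points them at a sitemap. The domain and file name are placeholders; adjust them to your site:

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

An empty Disallow line means nothing is blocked; add Disallow rules for any admin or staging paths you do not want indexed.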
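For the canonical domain item, here is one way to issue path-preserving 301 redirects on IIS 7, assuming the URL Rewrite module is installed. The host name `www.example.com` is a placeholder for your canonical domain:

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Permanently redirect any alias host to the canonical domain,
           keeping the requested path (query strings are appended by default) -->
      <rule name="Canonical host name" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^www\.example\.com$" negate="true" />
        </conditions>
        <action type="Redirect" url="http://www.example.com/{R:1}"
                redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

If you are on Apache instead, the equivalent is a RewriteRule with the `[R=301,L]` flags in .htaccess.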
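As a quick illustration of output caching in ASP.NET Web Forms, a single page directive is enough to cache a rendered page. The duration here is an arbitrary example value:

```
<%@ OutputCache Duration="60" VaryByParam="None" %>
```

This caches the page output for 60 seconds regardless of query string; use `VaryByParam` to cache separate copies per parameter value when the page content depends on the query string.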
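For the CDN item, a common pattern is to load jQuery from a public CDN and fall back to a local copy if the CDN is unreachable. The version number and local path below are placeholders:

```html
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
<script>
  // If the CDN failed, window.jQuery is undefined, so load a local copy
  window.jQuery || document.write('<script src="/scripts/jquery-1.7.2.min.js"><\/script>');
</script>
```

Using the protocol-relative `//` prefix lets the same tag work on both HTTP and HTTPS pages.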
Firefox tools that enable early notification of issues:
If you have some good ideas that I can add to this list, please add a comment and I will add them.