Google’s Search Console has been one of the most valuable assets available to webmasters for years, and the search engine giant have recently released new guidelines – here’s our guide to what’s changed.

You've launched your website. So what next?


Unnecessary Pages

Too many pages, or pages with little relevance, will clutter your site. It is crucial that you direct users to your most important pages whilst maintaining a clear theme and message about your product or service.


Link Count

Previously, the more links you had, the better your site would perform on Google. Now, Google are suggesting you narrow your links to a few thousand at most – shed the unnecessary links to improve your site’s performance.


Robots.txt Files

It is more important than ever to keep your robots.txt file up to date. An outdated robots.txt file can clutter your site as much as unnecessary pages and links.
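As a rough sketch, a tidy robots.txt file looks something like the following – the directory names, and the sitemap location, are assumptions for illustration only:

```
# Keep crawlers out of areas that add no search value
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Point crawlers at an up-to-date sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Reviewing this file whenever you add or retire sections of your site keeps crawlers focused on the pages that matter.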

Alt tags

In short, make sure you use them. Google’s crawlers read alt text to understand the content of an image and categorise it correctly.
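In practice that means something like this – the filenames and descriptions here are hypothetical:

```html
<!-- Descriptive alt text tells crawlers what the image shows -->
<img src="red-running-shoes.jpg" alt="Pair of red running shoes on a white background">

<!-- Purely decorative images can carry an empty alt attribute -->
<img src="divider.png" alt="">
```

The description should say what the image shows, not repeat keywords.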

Obvious Hierarchy

If your site looks messy, it will perform poorly. Google’s crawlers behave much like users; if they can’t navigate your site, what chance does a human visitor have?

Broken Links

Any links that don’t point to a live page with valid HTML could cause your entire site to be disregarded by Google – run thorough tests often.
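Testing for broken links can be partly automated. As a minimal sketch in Python, the standard library’s HTML parser can pull every link out of a page; each link could then be requested (with `urllib.request`, for example) and flagged if it doesn’t return a success status:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all link targets found in an HTML string."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# Hypothetical page fragment for illustration
page = '<a href="/about">About</a> <a href="https://example.com/">Home</a>'
print(extract_links(page))  # ['/about', 'https://example.com/']
```

A scheduled run of a script like this over your sitemap catches dead links before Google does.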

Dynamic Pages

Google now frowns on special characters appearing in your URLs (! or ?, for example). Before long, these characters could stop being recognised altogether.
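A quick audit script can flag URLs containing these characters. This is a sketch, and the set of characters treated as risky is an assumption based on the examples above:

```python
from urllib.parse import urlsplit

# Characters to flag in URLs (assumed list, based on the examples above)
RISKY = set("!?&=%$")

def risky_characters(url):
    """Return any flagged characters found after the domain of a URL."""
    parts = urlsplit(url)
    tail = parts.path + ("?" + parts.query if parts.query else "")
    return sorted(RISKY & set(tail))

print(risky_characters("https://example.com/products!special"))  # ['!']
print(risky_characters("https://example.com/products/shoes"))    # []
```

Running a check like this over your sitemap highlights the dynamic URLs worth rewriting.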

Session IDs and Parameters


When bots crawl your site, you would usually have a session ID or URL parameter in place to track their movement. This is no longer needed, as Google’s bots now crawl frequently to check for broken links and similar issues as part of their own research.
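If your URLs still carry session parameters, stripping them keeps crawled URLs clean. A minimal Python sketch, assuming a hypothetical list of session parameter names:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameter names commonly used for session tracking (assumed list)
SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid"}

def strip_session_params(url):
    """Remove session-tracking parameters from a URL's query string."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_session_params("https://example.com/shop?sid=abc123&page=2"))
# https://example.com/shop?page=2
```

Genuinely useful parameters (page numbers, filters) survive; only the tracking ones are dropped.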

Blocked Resources

Bots are still fairly simple programs, so it would be wise to allow all assets such as JavaScript, CSS and HTML to be accessible during crawling – bots view sites as the user does. Blocking these resources gives a distorted view, and the bots won’t get an accurate picture of your site.
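In robots.txt terms, that means making sure your asset directories aren’t disallowed – the directory names below are assumptions for illustration:

```
User-agent: Googlebot
# Keep stylesheets and scripts crawlable so pages render as users see them
Allow: /css/
Allow: /js/
Disallow: /private/
```

Google Search Console’s URL inspection tools can confirm that a page renders fully when crawled.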

NoFollow Ads

Google’s crawling bots will access everything on your site, including advertisements. If these are not set up as NoFollow, the bots will follow the links your adverts point to instead of fully covering your own site.
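Marking an advert link up looks like this – the advertiser URL and image are hypothetical:

```html
<!-- rel="nofollow" tells crawlers not to follow the advert's link -->
<a href="https://advertiser.example.com/offer" rel="nofollow">
  <img src="banner.jpg" alt="Advertiser banner">
</a>
```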

Multiple Devices & Browsers

Google now suggest testing on more than one medium and on several browsers. With so many options available to view your site, you need to know it functions correctly on all platforms.


HTTPS

At last, it has been declared that HTTPS is the preferred option over HTTP.
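If your site runs on Apache, a common way to move visitors over is a permanent redirect in your .htaccess file – a sketch, assuming mod_rewrite is enabled:

```
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The 301 status tells Google the move is permanent, so existing rankings transfer to the HTTPS versions of your pages.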

Visual Impairment

Google has also suggested that designers and webmasters begin testing their sites with screen readers – visually impaired users rely on this kind of software to access content. Make sure you are prepared.


Page Speed

Your website’s loading speed plays a crucial role in the user’s experience and, ultimately, the site’s growth. Page speed is a critical yet often overlooked aspect of SEO. The average website loads in around 7.72 seconds, and online businesses often lose the opportunity to convert visitors into paying customers simply because of slow loading times.

In Brief

Google haven’t changed all that much in their new guidelines, but that isn’t to say there is nothing you should change. By following this brief guide, you can develop your site appropriately and reach the first page of the SERPs.