If your website has been live for between six months and two years, you may well have started to think about small ‘tweaks’ to improve its looks, navigation or user experience.
Small changes in that timeframe - too soon for a new site, but long enough for the current design to have become familiar - are common. What is far less common amongst firms is any formal assessment of the effect of the change.
Even minor tweaks can affect how your site performs across a range of important metrics. A change to the most prominent homepage image, for example, could increase your bounce rate (the percentage of visitors who get no further than the homepage) if familiar clients assume they’ve entered the wrong site. Adding a small call-to-action box in your header asking clients to ‘sign up to our newsletter’ could increase subscriber volumes - but will it also discourage users from going further into the site? Without testing, how can you be sure either way?
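To make the bounce-rate metric concrete, here is a minimal sketch of how it is calculated from per-session page-view counts. The sample data is invented purely for illustration and is not taken from any real site:

```python
# Each entry is the number of pages one visitor viewed in a session.
# A "bounce" is a session where the visitor saw only the landing page.
sessions = [1, 3, 1, 2, 5, 1, 4, 1, 2, 1]  # hypothetical data

bounces = sum(1 for pages in sessions if pages == 1)
bounce_rate = bounces / len(sessions) * 100

print(f"Bounce rate: {bounce_rate:.1f}%")  # → Bounce rate: 50.0%
```

Analytics tools report exactly this ratio, just computed over thousands of real sessions rather than a ten-item list.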
The ‘Experiments’ facility within Google Analytics allows you to carry out A/B split testing, assessing the case for a change - or for keeping the current design. Setting it up requires some technical expertise, and there are options to configure, but once it is running the data can be invaluable. Take the example below.
Case Study: Using Google Analytics Experiments to Test a Homepage Change
In this scenario, there were two equally valid arguments for what to do with a client’s homepage. One argument ran that, aesthetically, the page worked better without links to the client’s most recent blog posts, especially considering the page had recently been streamlined to emphasise the services they offered.
The other argument was that blog posts have lots of positive effects; showing off expertise, engaging clients in communication, showing the site is regularly updated and encouraging the user to explore.
In order to gauge the effect of both ideas, two home pages were established. Google Experiments automatically forwarded a percentage of visitors to one page (without the links to recent blog posts), whilst the remainder went to the original page (with recent blog posts).
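The traffic split works by consistently assigning each visitor to one variant, so a returning visitor always sees the same page. Google does not publish its routing logic, but the general technique can be sketched as deterministic hash-based bucketing - the function name, hash choice and 50/50 threshold below are illustrative assumptions, not Google’s implementation:

```python
import hashlib

def assign_variant(visitor_id: str, split: float = 0.5) -> str:
    """Bucket a visitor into variant 'A' or 'B' (illustrative sketch only)."""
    # Hash the visitor ID so the same visitor always lands in the
    # same bucket across repeat visits.
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # map hash to [0, 1]
    # 'A' = original page (with blog posts), 'B' = variant (without).
    return "A" if bucket < split else "B"

print(assign_variant("visitor-1234"))
```

Because the assignment is a pure function of the visitor ID, no per-visitor state needs to be stored to keep the experience consistent.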
The outcome of the A/B split test
The page with the blog posts (Subject A) gave the site a very low bounce rate of just 39.63%. The alternative page without the blog posts (Subject B) saw a higher bounce rate of 48.84% - still respectable, but a tangible increase.
Average session duration was markedly better for Subject A, whose visitors remained on the site for 6.03 minutes and visited 3.02 pages on average. Subject B’s visitors stayed for 3.34 minutes and visited 2.80 pages.
Google Experiments rated Subject B as having just a 3.5% probability of outperforming Subject A in the long term.
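Google’s probability figure comes from its own statistical model, which is not public. As a rough independent check, the bounce-rate difference can be tested with a standard two-proportion z-test, sketched below. Note that the session counts n_a and n_b are assumed for illustration, since only the rates are reported here:

```python
import math

# Bounce rates from the experiment; session counts are assumed.
n_a, rate_a = 2000, 0.3963   # Subject A (with blog posts)
n_b, rate_b = 2000, 0.4884   # Subject B (without blog posts)

# Pool the two samples to estimate the common bounce proportion.
p_pool = (rate_a * n_a + rate_b * n_b) / (n_a + n_b)
# Standard error of the difference between the two proportions.
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (rate_b - rate_a) / se

# |z| > 1.96 indicates significance at the 5% level; with these
# assumed sample sizes, z comes out well above that threshold.
print(f"z = {z:.2f}")
```

The larger the real session counts, the more confident the verdict - which is why the test should run long enough to gather a healthy sample before declaring a winner.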
Conclusions and next steps
Clearly, Subject A was the better homepage: the argument that blog posts encourage interaction won out over pure aesthetics. The metrics for both subjects were actually very good, reflecting a well-built, well-used website that clients like.
Having established Subject A as the way forward for the website, there is no reason to stop experimenting there. A new version of the homepage, with the blog posts given greater prominence, will now be trialled against Subject A to see whether the increased visibility produces even better metrics.
Thinking about a change to your website? If you would like help collecting data on the likely effects of a change on your visitor levels, get in touch here and we’ll be happy to discuss your requirements.