Am I normal?

I should probably have written this a few days ago.


Are we normal?

It all started when I was training some local government web folk on the dark arts of Google Analytics.

One of the pieces of data that Google Analytics (or indeed many other analytics tools) will give you about visits to your website (which Google Analytics calls sessions) is what proportion of them are “bounces”.

Someone bounces on your site if they only view one page in their visit*.
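In other words, the bounce rate is simply the share of visits that viewed a single page. Here’s a minimal sketch of that sum, assuming each visit has been reduced to a count of pages viewed — the function name is mine, purely for illustration, and real analytics tools apply extra rules, as the footnote at the end explains:

```python
def bounce_rate(pages_per_session):
    """Return the percentage of sessions that viewed exactly one page."""
    if not pages_per_session:
        return 0.0
    bounces = sum(1 for pages in pages_per_session if pages == 1)
    return 100.0 * bounces / len(pages_per_session)

# Example: five visits to a site, three of which viewed only one page.
print(bounce_rate([1, 4, 1, 2, 1]))  # prints 60.0
```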

So if your page is there to provide helpful information, you’d probably be quite pleased with a high bounce rate because it suggests people are searching for information, finding it and leaving again.

On the other hand you might have designed a site which encourages people to browse around, stumbling across new things. In which case bouncing would be bad.

And, of course, local authorities are service providers. If people come to your website to pay for things, request things, or book things then they (hopefully) won’t bounce either.

So the question these folk asked was “Is our bounce rate typical for local government?” Which is an interesting question, and one I couldn’t directly answer.

Is that the right question?

Now obviously the key question should be “Is it the bounce rate you expected when you designed the site?” In fact, I argue you should always know what good looks like before delving into analytics tools.

But it’s always nice to compare to others. It’s always useful to be able to think “I would expect my site to be behaving very much like X council” and then find out if it is or isn’t.

And a question that always bothered me when I was responsible for local government digital services was what level of mobile traffic I should expect. Other councils had higher proportions of mobile traffic than my council. But was this a factor of serving a rural population? Was it a factor of the sort of services and content we provided? Was it an aspect of the way we marketed our services?

The more I thought about it, the more interesting (and useful) I thought it would be if we could see this data for each local authority.

So I decided to ask them.

It’s my fault

So, I’m afraid I’m responsible for a flurry of FOIA requests flying around the country. I’ve tried to keep the request specific and simple to answer.

I hadn’t quite appreciated how much email traffic it would generate or how much admin it would involve me in. But hopefully in a few weeks I should have a large dataset of some key metrics.

This is not about league tables or declaring winners: the only correct bounce rate is the one you intended for your site. This is about trying to help describe the ranges and differences between types of authorities and different parts of the UK.

I’ll be sharing it all back and I hope that it will be a useful contribution to the sector.

That said, I may think twice before embarking on a mass FOIA again…

*Inevitably it’s actually more complex than that but that’ll do for the purposes of this blog post.

7 thoughts on “Am I normal?”

  1. First of all it’s worth saying that this is worthy work, if only to draw things like the existence of bounce rates and analysis of mobile usage to people’s attention. I bet a lot of people never look at this. I’m a stats nerd too so will be eagerly awaiting the findings of your survey.

    A word of caution though. Whilst a measure across all council services and information will be of interest to many including myself, it won’t give you what good looks like for specific content types, of which the LocalGov Digital Content Standards have defined 11. It won’t measure individual tasks either.

    To use a real world example, it’s a bit like measuring how quickly a council met a service request across all services provided. So one figure that included both missed bin collections (which, one would hope, would be pretty speedy to resolve) and re-housing an elderly relative (which I would guess might take a bit of time) wouldn’t be that useful. The former would almost always be under the measure whilst the latter would be over.

    Looking in the digital world, where I work we have two main sites, in line with best practice in the retail sector. One’s for reading information, the other for doing stuff. Desktop/Tablet & Mobile use for the former is around 50/50%, for the latter it’s 65/35%. I could (and should) go into the reasons for this elsewhere, but you can see that a common figure for the two isn’t very useful for us because they meet different user needs, so user behaviour is different.

    We’re creating a content assessment, including user testing and a peer review, and it’d be great to include you in this, as I suspect you’d have plenty of ideas to contribute. Do let me know if you’re interested.

    1. I think we violently agree.

      Bounce rate for a whole site (or several sites depending on how you’ve got your analytics tool set up) is several layers of abstraction from the user experience. And the only good bounce rate is the one you designed the site for.

      Which is not to say it is without value. To take your service speed example. Even though different services have different target completion times you could still create an overall metric to represent this. And if one service starts to struggle you’ll see that reflected in the metric. You’d have to drill into the data to find out what was going wrong, but one number can tell you to start looking…

      But if I manage to raise the general profile of bounce rates across the sector I’ll call that job done.

      I’ll be in touch about the peer review.

  2. This is good work, Ben, and I’ve submitted my response and suspected you had a plan behind it 🙂

    Moving forward it might be useful to think about particular pages where most people would expect a high bounce rate. For example, on county council websites we have designed our school term dates page to hopefully achieve a high bounce rate, as we know the majority (around 80%) of people simply want to check when the next one is. Our bounce rate is almost exactly that, currently showing as 80.92%, so we are very pleased with the success of that page.
    In addition we have designed “finding a school place” to be one which we hope helps people through a process, and our bounce rate is currently 32%, so maybe there’s some work to do on unpicking that. But given the huge improvement from the previous version, we are happy with the impact of the work we have done.
    We tend to focus our bounce rate stats on specific tasks/pages where we expect to see a particular type of behaviour.
    I look forward to seeing the overall results/data too

    Carl

  3. Hi Ben
    Ok. So your FoI has landed on my desk – sort of. As an FoI, I’m not sure you’ll get quite the answer(s) you want. But, I guess since you don’t have 400+ helpful people like me (!) that’s the best way to get the data.
    Having said that, I’m not sure you will get comparable data.
    We use SiteImprove for our web stats which will give you very different numbers to GA. For example, SiteImprove doesn’t count robot visits. Also, I have a sneaking suspicion that SiteImprove counts too many bounces.
    You probably know that there’s a more generic issue that you might encounter: the dreaded third party sites. As set up, a visitor landing on the www subdomain who then clicks to the planning subdomain is likely to count as a bounce. Added to that, I don’t have control of the planning subdomain and certainly don’t have the stats (yes, I’ve asked). So, your www bounce may then spend 15 minutes clicking all over the planning subdomain, not finding what they’re looking for.
    Anyway, you’ll get an answer from us, but if you’d like to discuss the detail I’m more than happy to find time to do so – increasingly less of that as we’re at the brink of a CMS/re-design project.
    In the meantime, I’ve started to load up stats on DataShare, based on your question. More to come.
    Pete
