Does bounce rate matter?

Invasion of the SpaceHoppers: "space hopper invasion" by Paul Stevenson, used under CC BY 2.0.

An interest in bounce rate can severely damage your health.

It’s true.

I recently started to muse about what a “normal” bounce rate might be for local government websites and this led me to spend a surprising amount of time gathering data from (almost) every council in the country.

Now that I'm safely out the other side of that, I can muse a bit more about what it all tells us.

Nerdy explanation of bounce rate

In principle this is very straightforward. If I visit a page on your website and then don’t visit any other pages for a while then I “bounced” on that page. If half of the visits to your website were “bounces” then the bounce rate is 50%.

In reality it is slightly more complex than that. If you use Google Analytics to record this data (and 86% of local authorities do) then a bounce occurs if a user triggers only one event. A pageview is an event, but Google Analytics can be configured to measure other events, such as playing a video. So if you set up Analytics to measure plays of videos, your bounce rate will go down even if people don't visit any other pages.
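To make the logic concrete, here is a minimal sketch of the bounce calculation as described above, using made-up session data (this illustrates the rule, not Google Analytics' actual implementation):

```python
# Illustration of the bounce rule described above: a session counts as a
# bounce only if it triggered exactly one interaction event.
# The session data here is made up.
sessions = [
    ["pageview"],                  # one event: a bounce
    ["pageview", "pageview"],      # two pageviews: not a bounce
    ["pageview", "video_play"],    # the video event "saves" this session
]

bounces = sum(1 for events in sessions if len(events) == 1)
print(f"Bounce rate: {bounces / len(sessions):.0%}")  # 33%
```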

And if people visit a second page but Google Analytics is not set up on that page (which happens reasonably often in local government when people are passed to back-office services, though only if those services are poorly configured or a bit pants, which is not uncommon) then Google Analytics will record a bounce even though the user definitely did not bounce.

And finally there is the cross-domain issue, which arises as people move between domains on your web estate (strictly speaking GA properties, but it's easier to think of them as domains). Imagine you have a main site www.marchford.gov.uk and then your comms team think it's a jolly wheeze to set up a microsite to promote the council's vision for the area: visionforthearea.marchford.gov.uk. When someone visits a news page on the main site and follows a link to the second site, Google Analytics will record the first visit as a bounce. This can be fixed using cross-domain tracking, though there are some situations in which you might not find that helpful.

Time is also a factor in bounce rate. If I visit a page now and then come back in 15 minutes and visit a second page, should we record that as one visit involving two pages (and so no bounces) or two visits each involving one page (two bounces)? Google Analytics by default ends a visit after 30 minutes of inactivity and so would plump for the one-visit option. If the second page were visited 31 minutes after the first, that would make it two visits.
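A quick sketch of that session-splitting rule, again with made-up timestamps, shows how a 31-minute gap turns one journey into two visits:

```python
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)

def sessionise(timestamps):
    """Split one user's pageview times into visits: a gap of 30 minutes
    or more starts a new visit (Google Analytics' default rule)."""
    visits = []
    for t in sorted(timestamps):
        if visits and t - visits[-1][-1] < SESSION_TIMEOUT:
            visits[-1].append(t)   # close enough: same visit
        else:
            visits.append([t])     # too long a gap: a new visit
    return visits

first = datetime(2016, 1, 1, 12, 0)
print(len(sessionise([first, first + timedelta(minutes=15)])))  # 1 visit, no bounce
print(len(sessionise([first, first + timedelta(minutes=31)])))  # 2 visits, two bounces
```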

What bounce rate is best?

It’s definitely not possible to say “local government websites should have a bounce rate of X%”. The correct bounce rate, ultimately, is the one that shows the website is being used as it was designed.

Many public sector sites are designed to work well with search engines (in the UK, let's face it, that's basically Google) and to resolve the user's question immediately. So if I type "Christmas holidays Barnet" into my favourite search bar I get a link to the Barnet School Term and holiday dates page. This gives me the answer I was looking for and so I go about my business. I just bounced on Barnet's site (they can thank me later).

So if we’ve designed the site to work this way and it is working as designed, we would expect high bounce rates. If we believe that people are more likely to visit the front page of the website and then navigate to the answer and the site is working as designed then we would expect lower bounce rates.

Of course the local government digital estate is about more than delivering small packets of information. It's about (or should be about) much more complex interactions: looking for support with housing, working out what support at home would be best for mum, understanding what options my visually impaired daughter has to get the best out of her further education.

These interactions are likely to be very different: longer, more interactive, perhaps paused halfway through for a bit of consideration and discussion. So if you've designed the site for these sorts of interactions and the site is working as designed, bounce rate will be lower. The degree to which this affects the overall bounce rate will depend on the mix of straightforward and complex interactions, and on the way you expect users to interact with your site.

So is it useful to monitor bounce rate for your site?

Yes and no.

Some eloquent explanations of why it is not useful to look at bounce rate were posted as comments on my blog a couple of months ago.

While I don't disagree with the points made there, I do have a different take on the situation.

Consider the rev-counter on your car. It displays a single figure indicating the number of revolutions per minute your engine is making. When you depress the throttle the rate goes up and the indicator shows this change.

But the situation under the hood is much more complex: valves are opening and closing, fuel is being sprayed, exhaust gases are leaving the cylinders and being expelled.

RPM is a result of all of this activity; it's not a measure of all of it, but it is affected by changes in the whole system.

I think bounce rate is a bit like this. Overall bounce rate is a result of the design decisions you make for your site and how users actually use it. A web manager should have a good idea of what bounce rate they expect across their site(s). So if the rate they actually see is radically different, something odd is going on. And if the rate changes suddenly, that's another indicator that something needs investigation (or, conversely, if the rate doesn't change when they expected it to).

What bounce rate should I expect?

The simple answer, without knowing anything much about your site, is between 40% and 51% because 50% of all local government sites are within that range.

Which doesn’t mean that if you have a bounce rate of 60% you are doing something wrong. It means that your site has an unusual bounce rate. If that’s what you expect then that’s great news (isn’t it?).
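Incidentally, if 40% and 51% are the lower and upper quartiles (the middle 50% of sites), checking where a set of sites falls takes only a few lines. The figures below are placeholders, not the survey data:

```python
import statistics

# Placeholder bounce rates, one per council site (not the real survey data).
bounce_rates = [0.38, 0.42, 0.44, 0.47, 0.50, 0.52, 0.55, 0.61]

# statistics.quantiles with n=4 returns the three quartile cut points;
# the middle 50% of sites sit between the first and the third.
q1, median, q3 = statistics.quantiles(bounce_rates, n=4)
print(f"Middle 50% of sites: {q1:.0%} to {q3:.0%} (median {median:.0%})")
```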

On the whole, English district councils and London Boroughs tend to be a bit less bouncy than other councils. The explanation for this could be as simple as not having public transport responsibilities, which we might expect to drive lots of "when's the next bus" type traffic.

People visiting council sites from internal addresses tend to be much less bouncy than visitors overall. So if you have a site targeted at school professionals, for example, you would expect much lower bounces.

And the design decisions you take will also play a key role. Are you handing people to third-party sites for example?

Cross-domain tracking rocks

One thing that struck me when I was asking councils for data on their website visits was the number of councils tracking each domain (or microsite) separately. If you treat your digital newsroom as a distinct analytics property from your main site, both will look artificially bouncy. If you use cross-domain tracking you'll get a much more realistic sense of the traffic flowing between those sites.
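A toy example makes the point, reusing the fictional marchford.gov.uk estate from earlier: one journey that touches both domains looks like a bounce to each property tracked separately, and like a perfectly healthy two-page visit when tracked together.

```python
# One user journey that crosses two domains on the same web estate.
journey = [
    ("www.marchford.gov.uk", "/news/article"),
    ("visionforthearea.marchford.gov.uk", "/home"),
]

def bounce_rate(sessions):
    return sum(1 for s in sessions if len(s) == 1) / len(sessions)

# Tracked as separate properties, each domain sees a one-page session.
separate = [[hit for hit in journey if hit[0] == domain]
            for domain in {d for d, _ in journey}]
print(bounce_rate(separate))   # 1.0 -- both properties record a bounce

# With cross-domain tracking the same journey is one two-page visit.
print(bounce_rate([journey]))  # 0.0 -- no bounce at all
```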

A tool, not a grade

To me using website analytics data is all about understanding what your site is designed for, how it is being used and then tweaking and improving the design to meet the needs of your users. A single figure (like bounce rate) can never give you the full detail of what’s going on under the hood but it can tell you whether things seem to be working as they should or need closer investigation.

Investigating crime data at small geographies

A bit of background (skip if you’re familiar with h-map)

I’ve been doing some work with Herefordshire charity the Bulmer Foundation (and a lot of other organisations) to create a central resource of data about how sustainable Herefordshire is becoming. The Foundation has been working on this for some time and, following a lot of consultation, they have identified the key indicators of sustainability for the county.

If these things are getting better, Herefordshire is developing more sustainably. If these things are getting worse, Herefordshire is developing less sustainably.

Identifying the indicators is important but it's not the whole task. Next we have to identify the data that tells us whether things are improving or not for each indicator. Then we have to display it in a useful way.

The Foundation has commissioned some development work which is still in progress but play-about-with-able at h-map.org.uk.

Understanding how things are changing at the whole-county level might be interesting, but for most practical purposes it may not be useful. We need to get into the data at much smaller geographic levels. As a way of investigating how this might work I decided to look at crime data, which is a nice, rich dataset.

Processing

This post is really a set of notes about how I processed the data. Partly for my records but also to encourage les autres and to enable people to suggest ways this could be done better.

I’ll write about the implications for the h-map project on the project blog in due course.

The police.uk website publishes crime data in a number of ways. There is an API but I haven't explored that. Humans can see some data on maps and they can elect to download some CSVs.

I asked for everything they had on West Mercia (the police force serving Herefordshire, Worcestershire, Shropshire and Telford). I received a zip file containing a bunch of CSVs: one per month, starting January 2011 and ending December 2014. Each CSV contains a row for each crime recorded across the force area. Against each row is a type of crime, lat, long, outcome and sundry other data including the LSOA.

That’s great but much more than I need. All I ultimately want to know is the number of crimes recorded per year in each LSOA. If I had better coding/scripting skills I could probably have got a few lines of Python working on the task for me. But I don’t.
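For anyone who does fancy scripting it, a few lines of pandas along these lines would probably do the job (the police.uk files I've seen use the column headers "Month", in YYYY-MM format, and "LSOA name", but check yours before running anything):

```python
import glob
import pandas as pd

# Read every monthly CSV from the police.uk download into one frame.
frames = [pd.read_csv(path) for path in glob.glob("west-mercia/*.csv")]
crimes = pd.concat(frames, ignore_index=True)

# Keep only Herefordshire LSOAs, mirroring the Datagraft transformation below.
hereford = crimes[crimes["LSOA name"].str.contains("Herefordshire", na=False)]

# "Month" is formatted YYYY-MM, so the year is the first four characters.
hereford = hereford.assign(year=hereford["Month"].str[:4])

# Count crimes per LSOA per year -- the pivot-table step in one line.
totals = hereford.groupby(["LSOA code", "year"]).size().unstack(fill_value=0)
totals.to_csv("herefordshire-crimes-by-lsoa-year.csv")
```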

Also it was an excuse to play with Datagraft.net. This new cloud thingy has sprung from the DaPaaS project involving the ODI and other cool linked/open data folk. It allows you to create transformations in a reasonably point-and-click manner. There is a pipeline metaphor, so transformations proceed sequentially. There is a bit of a learning curve and the documentation is early stage, but I got the hang of it. It allows you to preview the effect of your transformation on real data, which enables effective pressing of buttons until the right thing happens.

So, after a bit, I managed to build a little transformation that pulls out rows where the name of the LSOA contains "Herefordshire" and then creates a two-column table from this, with LSOA and month as the respective columns.

I still have one CSV for each month. It might be that Datagraft will append rows onto an existing dataset but I couldn't work out how to do this.

So I had a happy couple of hours uploading each CSV and downloading the processed file.

What I was aiming for was the number of crimes per year per LSOA, so I had to get my monthly files into one big yearly file. Which I did manually with the assistance of the excellent TextWrangler application. It really made this tedious manual task a breeze.

Then a simple pivot table for each year gives me the totals I was looking for.

There was a little bit of spreadsheeting to decide if crime was improving, worsening or not changing.
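That spreadsheet logic is simple enough to sketch in code too: compare each LSOA's latest yearly total with the previous one and assign red, amber or green. The 5% "no real change" band below is an arbitrary illustration, not the threshold I actually used:

```python
def rag_status(previous, latest, band=0.05):
    """Classify year-on-year change in crime counts as red/amber/green.
    The 5% band for "no real change" is illustrative only."""
    if previous == 0:
        return "amber"  # no baseline to compare against
    change = (latest - previous) / previous
    if change > band:
        return "red"     # crime worsening
    if change < -band:
        return "green"   # crime improving
    return "amber"       # broadly unchanged

print(rag_status(previous=40, latest=52))  # red
print(rag_status(previous=40, latest=33))  # green
print(rag_status(previous=40, latest=41))  # amber
```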

And finally, with the application of Google Fusion Tables to link the LSOA codes in my dataset to polygons (described as KML), I have a lovely map painted red, amber and green.

Datagraft enables me to save the transformation, so when all of 2015's data becomes available I'll be able to repeat the process. It also enables me to publish my dataset and to RDF it.

Maybe next week for that.

If you have any suggestions for ways I could cut out steps, or improve my data wrangling I would love to hear them.

big numbers on council websites: a guide for comms folk

(I forgot to add this at the time, I’ve put it here for completeness, apologies if it is clogging up your RSS feed)


I wrote a piece on Comms2Point0 about some of my findings on local government websites, asking whether 10 visits per person per year is, in fact, a lot.

Read big numbers on council websites: a guide for comms folk now.