How to reduce the FOI burden on local government


(photo is: Secret Comedy Podcast 06 – 2 August 2013 by Amnesty International UK used under CC BY 2.0)

 

As reported by the Press Gazette, the LGA has provided evidence to the Freedom of Information Commission [PDF].

The LGA is disappointingly negative about FOIA, seeing it as a cost to authorities rather than a boon to their communities.

It prompted me to get round to a blog post I’ve been meaning to write for a while, reflecting on my experience of making an FOIA request to every council in the country.

Asking for data

I wanted to know some things about the usage of council websites, and I later wrote a report about websites based on this data.

It is reasonably trivial to run a mailmerge and issue an email to every council. I was conscious that this would generate work in each authority, so I tried to pick a small number of datapoints that would be easy to obtain.
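(For the curious, the mailmerge really is just a loop. Here's a minimal sketch in Python, assuming a hypothetical councils.csv of contact addresses and an SMTP server you can send through; every name and address below is a made-up placeholder.)

```python
# Minimal FOI mailmerge sketch. File name, column names, addresses
# and SMTP host are all hypothetical placeholders.
import csv
import smtplib
from email.message import EmailMessage

TEMPLATE = """Dear {council},

Under the Freedom of Information Act 2000 I would like to request
the following data about your website: ...
"""

with open("councils.csv") as f:  # assumed columns: council, email
    councils = list(csv.DictReader(f))

with smtplib.SMTP("smtp.example.com") as smtp:
    for row in councils:
        msg = EmailMessage()
        msg["From"] = "me@example.com"
        msg["To"] = row["email"]
        msg["Subject"] = "Freedom of Information request: website usage data"
        msg.set_content(TEMPLATE.format(council=row["council"]))
        smtp.send_message(msg)
```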

I had never issued an FOIA request before (though I’ve answered plenty) and I did feel a bit of a sense of guilt/fear when I pressed go.

This was assuaged somewhat by receiving a response within an hour (well done Cardiff) and exacerbated by receiving a call from a web manager apparently wanting to know what lay behind my request. I was a bit taken aback but when other people contacted me (in a less defensive way) I wrote a blog post explaining what I was up to.

Not that bad in the end

And I have to say the vast majority of councils responded promptly and in an extremely helpful manner. A small minority had a much more defensive attitude and some councils attached very restrictive licences to the data (despite the obligation, in England and Wales, to provide datasets under open licences).

A very small number of councils (sadly I didn’t keep an accurate count) responded to say “We already publish this”. Now to be honest, it was less convenient for me when they did this because it usually meant more work. But I was still delighted because it’s clearly such a sensible thing to do.

Gold star goes to East Sussex, who just give you the login details of a read-only account for their Google Analytics.

It would be easy to publish this

It is technically trivial to publish data from Google Analytics (the tool used by the majority of councils). Website data is not secret, not personal and its publication is of benefit to the sector and potentially to the wider community.
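As a sketch of quite how trivial: the (then-current) Core Reporting API will hand over headline figures in a dozen lines of Python. This is my illustration, not any particular council's setup; the key file and view ID are placeholders.

```python
# Sketch: pull headline usage figures from the Google Analytics Core
# Reporting API (v3), ready to publish as open data. The key file and
# view ID are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analytics", "v3", credentials=creds)

result = analytics.data().ga().get(
    ids="ga:12345678",  # your view ID
    start_date="365daysAgo",
    end_date="today",
    metrics="ga:sessions,ga:users,ga:pageviews,ga:bounceRate",
).execute()

print(result["totalsForAllResults"])  # publish this, job done
```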

And if it had been published the cost to the public sector of my report would have been marginal to nothing.

In fact the only reason not to publish this data is a cultural inclination not to tell people stuff.

The way to reduce the FOIA “burden” on local government is to answer people’s questions before they ask them.

And if local government routinely published its non-personal data then it would have a stronger argument when raising concerns about the cost of FOIA compliance.

What the evidence tells us

In fact the LGA evidence to the FOIA Commission reveals a sector stuck in a suspicious, closed, and secretive culture.

 

Which suggests, of course, we need the FOIA even more than before.

Oh. And what chance do closed, suspicious, secretive organisations have of being effective in the digital age?

(Don’t answer that)

Does bounce rate matter?

Invasion of the SpaceHoppers

(photo is: space hopper invasion by Paul Stevenson used under CC BY 2.0)

An interest in bounce rate can severely damage your health.

It’s true.

I recently started to muse about what a “normal” bounce rate might be for local government websites and this led me to spend a surprising amount of time gathering data from (almost) every council in the country.

Now I’m safely out of the other side of that I can muse a bit more about what it all tells us.

Nerdy explanation of bounce rate

In principle this is very straightforward. If I visit a page on your website and then don’t visit any other pages for a while, I “bounced” on that page. If half of the visits to your website were “bounces” then the bounce rate is 50%.

In reality it is slightly more complex than that. If you use Google Analytics to record this data (and 86% of local authorities do) then a bounce occurs if a user triggers only one interaction. A pageview is one kind of interaction, but Google Analytics can be configured to measure other events, such as playing a video. So if you set up Analytics to measure plays of videos, your bounce rate will go down even if people don’t visit any other pages.
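A toy illustration of that definition, if it helps (this is my paraphrase of the rule, not Google’s code):

```python
# A session is a bounce if it contains exactly one interaction hit.
def is_bounce(session_hits):
    return len(session_hits) == 1

sessions = [
    ["pageview"],                      # a classic bounce
    ["pageview", "event:video-play"],  # NOT a bounce once video plays are tracked
    ["pageview", "pageview"],          # not a bounce
]
bounce_rate = 100 * sum(is_bounce(s) for s in sessions) / len(sessions)
print(f"bounce rate: {bounce_rate:.0f}%")  # 33%
```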

And if people visit a second page on which Google Analytics is not set up, it will record a bounce even though the user is definitely not bouncing. This happens reasonably often in local government when people are passed to back office services, though only when those services are poorly configured or a bit pants (which is not uncommon).

And finally there is the cross-domain issue, which arises as people move between domains on your web estate (strictly speaking GA properties, but it’s easier to think of them as domains). Imagine you have a main site www.marchford.gov.uk and your comms team think it’s a jolly wheeze to set up a microsite to promote the council’s vision for the area: visionforthearea.marchford.gov.uk. When someone visits a news page on the main site and follows a link to the second site, Google Analytics will record the first visit as a bounce. This can be fixed using cross-domain tracking, though there are some situations where you might not find that helpful.

Time is also a factor in bounce rate. If I visit a page now and then come back in 15 minutes and visit a second page, should we record that as one visit involving two pages (and so no bounces) or two visits each involving one page (two bounces)? Google Analytics by default only starts a new visit after a gap of at least 30 minutes, and so would plump for the one-visit option. If the second page is visited 31 minutes after the first one, that would make it two visits.
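Again as a toy sketch, the 30-minute rule amounts to something like this (my own simplification, not Google’s implementation):

```python
# Group a user's hits into sessions: a gap of more than 30 minutes
# starts a new session, so a returning one-page visit is a new bounce.
from datetime import datetime, timedelta

TIMEOUT = timedelta(minutes=30)

def sessionise(hit_times):
    sessions = []
    for t in sorted(hit_times):
        if sessions and t - sessions[-1][-1] <= TIMEOUT:
            sessions[-1].append(t)   # within 30 minutes: same visit
        else:
            sessions.append([t])     # longer gap: a new visit begins
    return sessions

hits = [datetime(2015, 9, 1, 9, 0), datetime(2015, 9, 1, 9, 31)]
print(len(sessionise(hits)))  # 2: the 31-minute gap makes two one-page visits
```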

What bounce rate is best?

It’s definitely not possible to say “local government websites should have a bounce rate of X%”. The correct bounce rate, ultimately, is the one that shows the website is being used as it was designed.

Many public sector sites are designed to work well with search engines (in the UK, let’s face it that’s basically Google) and to resolve the user’s question immediately. So if I type “Christmas holidays Barnet” into my favourite search bar I get a link to the Barnet School Term and holiday dates page. This gives me the answer I was looking for and so I go about my business. I just bounced on Barnet’s site (they can thank me later).

So if we’ve designed the site to work this way and it is working as designed, we would expect high bounce rates. If we believe that people are more likely to visit the front page of the website and then navigate to the answer and the site is working as designed then we would expect lower bounce rates.

Of course the local government digital estate is about more than delivering small packets of information. It’s about (or should be about) much more complex interactions: looking for support with housing, working out what support at home would be best for mum, understanding what options my visually impaired daughter has to get the best out of her further education.

These interactions are likely to be very different: longer, more interactive, perhaps paused halfway through for a bit of consideration and discussion. So if you’ve designed the site for these sorts of interactions and the site is working as designed, the bounce rate would be lower. The degree to which this affects the overall bounce rate will depend on the mix of straightforward and complex interactions, and on the way you expect users to interact with your site.

So is it useful to monitor bounce rate for your site?

Yes and no.

Some eloquent explanations of why it is not useful to look at bounce rate were posted as comments on my blog a couple of months ago.

While I don’t disagree with the points made there, I do have a different take on the situation.

Consider the rev-counter on your car. It displays a single figure indicating the number of revolutions per minute your engine is making. When you depress the throttle the rate goes up and the indicator shows this change.

But the situation under the hood is much more complex: valves are opening and closing, fuel is being sprayed, exhaust gases are leaving the cylinders.

RPM is a result of all of this activity; it’s not a measure of all of it, but it is affected by changes in the whole system.

I think bounce rate is a bit like this. Overall bounce rate is a result of the design decisions you make for your site and how users actually use it. A web manager should have a good idea of what bounce rate they expect across their site(s). So if the rate they actually see is radically different that means something odd is going on. And if the rate changes suddenly that’s another indicator something needs investigation (or conversely if the rate doesn’t change when they expected it to).

What bounce rate should I expect?

The simple answer, without knowing anything much about your site, is between 40% and 51% because 50% of all local government sites are within that range.

Which doesn’t mean that if you have a bounce rate of 60% you are doing something wrong. It means that your site has an unusual bounce rate. If that’s what you expect then that’s great news (isn’t it?).
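(For the statistically minded: the 40% to 51% band is just the interquartile range. If you had the raw figures in a CSV — the file and column names here are hypothetical — it falls out in a few lines:)

```python
# Compute the middle-50% band of bounce rates across councils.
import csv
import statistics

with open("council_bounce_rates.csv") as f:  # hypothetical file
    rates = [float(row["bounce_rate"]) for row in csv.DictReader(f)]

q1, median, q3 = statistics.quantiles(rates, n=4)
print(f"Half of all sites sit between {q1:.0f}% and {q3:.0f}%")
```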

On the whole, English district councils and London Boroughs tend to be a bit less bouncy than other councils. The explanation could be as simple as not having public transport responsibilities, which we might expect to drive lots of “when’s the next bus” type traffic.

People visiting council sites from internal addresses tend to be much less bouncy than visitors overall. So if you have a site targeted at school professionals, for example, you would expect much lower bounces.

And the design decisions you take will also play a key role. Are you handing people to third-party sites for example?

Cross domain tracking rocks

One thing that struck me when I was asking councils for data on their website visits was the number of councils tracking each domain (or microsite) separately. If you treat your digital newsroom as a distinct analytics property from your main site, both will look artificially bouncy. If you use cross-domain tracking you’ll get a much more realistic sense of what is happening to traffic between those sites.

A tool, not a grade

To me using website analytics data is all about understanding what your site is designed for, how it is being used and then tweaking and improving the design to meet the needs of your users. A single figure (like bounce rate) can never give you the full detail of what’s going on under the hood but it can tell you whether things seem to be working as they should or need closer investigation.

Investigating crime data at small geographies

A bit of background (skip if you’re familiar with h-map)

I’ve been doing some work with Herefordshire charity the Bulmer Foundation (and a lot of other organisations) to create a central resource of data about how sustainable Herefordshire is becoming. The Foundation has been working on this for some time and, following a lot of consultation, they have identified the key indicators of sustainability for the county.

If these things are getting better, Herefordshire is developing more sustainably. If these things are getting worse Herefordshire is developing less sustainably.

Identifying the indicators is important but it’s not the whole task. Next we have to identify the data that tells us if things are improving or not for each indicator. Then we have to display it in a useful way.

The Foundation has commissioned some development work which is still in progress but play-about-with-able at h-map.org.uk.

Understanding how things are changing at the whole county level might be interesting but for most practical purposes it may not be useful. We need to get into the data at much smaller geographic levels. As a way of investigating how this might work I decided to investigate crime data which is a nice, rich dataset.

Processing

This post is really a set of notes about how I processed the data. Partly for my records but also to encourage les autres and to enable people to suggest ways this could be done better.

I’ll write about the implications for the h-map project on the project blog in due course.

The police.uk website publishes crime data in a number of ways. There is an API but I haven’t explored that. Humans can see some data on maps and they can elect to download some CSVs.

I asked for everything they had on West Mercia (the police force serving Herefordshire, Worcestershire, Shropshire and Telford). I received a zip file containing a bunch of CSVs: one CSV per month for all of the months starting January 2011 and ending December 2014. Each CSV contains a row for each crime recorded across the force area. Against each row is a type of crime, lat, long, outcome and sundry other data including the LSOA.

That’s great but much more than I need. All I ultimately want to know is the number of crimes recorded per year in each LSOA. If I had better coding/scripting skills I could probably have got a few lines of Python working on the task for me. But I don’t.
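For anyone who does have the skills, those few lines might look something like this with pandas (assuming the standard police.uk column names and a folder of the monthly CSVs; the folder and output names are placeholders):

```python
# Crimes per year per Herefordshire LSOA from a folder of police.uk
# monthly CSVs. Folder name is a placeholder; column names are the
# standard police.uk ones ("Month" is formatted YYYY-MM).
import glob
import pandas as pd

crimes = pd.concat(
    (pd.read_csv(path) for path in glob.glob("west-mercia/*.csv")),
    ignore_index=True,
)

hereford = crimes[crimes["LSOA name"].str.contains("Herefordshire", na=False)]
totals = (
    hereford.assign(Year=hereford["Month"].str[:4])
    .groupby(["LSOA code", "Year"])
    .size()
    .unstack(fill_value=0)  # the pivot-table step: one row per LSOA
)
totals.to_csv("herefordshire_crimes_by_lsoa.csv")
```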

Also it was an excuse to play with Datagraft.net. This new cloud thingy has sprung from the Dapaas project involving the ODI and other cool linked/open data folk. It allows you to create transformations in a reasonably point-and-click manner. There is a pipeline metaphor, so transformations proceed sequentially. There is a bit of a learning curve and the documentation is early stage, but I got the hang of it. It allows you to preview the effect of your transformation on real data, which enables effective pressing of buttons until the right thing happens.

So, after a bit, I managed to build a little transformation that pulls out rows where the name of the LSOA contains “Herefordshire” and then creates a two-column table from this, with LSOA and month as the respective columns.

I still have one CSV for each month. It might be that Datagraft will append rows onto an existing table but I couldn’t work out how to do this.

So I had a happy couple of hours uploading each csv and downloading the processed file.

What I was aiming for was the number of crimes per year per LSOA, so I had to get my monthly files into one big yearly file. I did this manually with the assistance of the excellent TextWrangler application, which made a tedious manual task a breeze.

Then a simple pivot table for each year gives me the totals I was looking for.

There was a little bit of spreadsheeting to decide if crime was improving, worsening or not changing.

And finally the application of Google Fusion Tables to link the LSOA codes in my dataset to polygons (described as KML) and I have a lovely map painted red, amber and green.

Datagraft enables me to save the transformation so when all of 2015’s data becomes available I’ll be able to repeat the process. It also enables me to publish my dataset and to RDF it.

Maybe next week for that.

If you have any suggestions for ways I could cut out steps, or improve my data wrangling I would love to hear them.

big numbers on council websites: a guide for comms folk

(I forgot to add this at the time, I’ve put it here for completeness, apologies if it is clogging up your RSS feed)


I wrote a piece on Comms2Point0 about some of my findings on local government websites, asking whether 10 visits per person per year is, in fact, a lot.

Read big numbers on council websites: a guide for comms folk now.

Things your web team didn’t tell you about Google Analytics.

Photograph: Monkey seems to whisper into the ear of another monkey.

You have a lovely shiny website. Your in-house team or your external contractors wired it up to a Google Analytics account and handed it over to you (in marketing, digital or comms). They may even have muttered something mysterious as they left.

But there are some things they didn’t tell you.

They (probably) don’t actually know much about Google Analytics.

Now some of my best friends build websites and I promise you that they know a lot of clever things. (I am already wincing in anticipation of their outrage as they read this, but let me explain).

Your web team definitely know some things about Google Analytics. They installed the tracking script. They made sure it worked and, I imagine, they can talk with great confidence about bounce rates, pageviews and unique users.

Which is the sort of thing webby people care about.

But that’s not even half of what you can find out about your users. Analytics tells you who is visiting, how they got there, and what they do on the site. And it blends those data so if you want to know “these people that signed up for our newsletter, where did they come from?” it will tell you that.

Google Analytics is an immensely powerful tool and your web team probably knows how to get the most out of it for web things. But there is so much more.

They are oblivious to the joys of Campaign Tagging

One of the many fascinating things about Google Analytics is that the useful stuff doesn’t happen on your website. It happens in Google’s data centres.

The tracking script your team installed collects a bunch of stuff each time your users visit a page and sends it off to Google who turn it into useful reports for you.

But the tracking script is limited by what the user’s web browser will tell it. And that is limited by what it knows.

One of the areas that is a real problem is “referrers”. Essentially the Google Analytics script asks the browser: what page brought you to this site? But the browser often doesn’t know, especially if the user clicked on a link in the Facebook app or in an email (like the ones you send out to get people to your site).

So you are missing some crucial data.

But don’t call your web team, they can’t help you.

The solution is in your own hands. You need Campaign Tagging. Essentially you add some codes to the end of the links you share. The tracking script sees these and uses them to understand where the user found the link.
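If you want to see quite how mechanical it is, here’s a sketch that builds a tagged link (the utm_ parameter names are Google’s standard ones; everything else below is made up):

```python
# Append standard Google Analytics campaign parameters to a link.
from urllib.parse import urlencode

def tag_link(url, source, medium, campaign):
    params = urlencode({
        "utm_source": source,      # where the link appears, e.g. "newsletter"
        "utm_medium": medium,      # the channel, e.g. "email"
        "utm_campaign": campaign,  # your name for this push
    })
    return url + ("&" if "?" in url else "?") + params

print(tag_link("https://www.example.gov.uk/bins", "newsletter", "email", "bin-day-changes"))
# https://www.example.gov.uk/bins?utm_source=newsletter&utm_medium=email&utm_campaign=bin-day-changes
```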

It’s all handled away from your website so your web team is quietly oblivious.

They are unmoved by the awesomeness of attribution modelling

Create some goals (you know, the things you want people to actually be able to do on the website) and Google Analytics will tell you amazing things about your users.

It will tell you which pages are really helping get people to the goal and which pages are really causing a problem.

It will tell you which marketing channels are really working for you and which are a waste of time (and money). And it’s pretty sophisticated. Let’s say you run an advert on Facebook which brings people to your website but they don’t go on to buy your thing. Should you give up on that advertising channel? What if those people come back and are more likely to buy next time? You could be cutting off your nose to spite your face.

Google Analytics will give you the answer to this.

But, I’m afraid, your web team probably won’t.
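To get a feel for what attribution modelling is doing, here’s a deliberately crude sketch comparing last-click with a linear model (GA’s real models are more sophisticated, and the channel data here is invented):

```python
# Compare last-click and linear attribution over invented conversion
# paths (each path is the ordered list of channels a buyer touched).
from collections import Counter

paths = [
    ["facebook", "organic"],
    ["facebook", "email", "organic"],
    ["email"],
]

last_click, linear = Counter(), Counter()
for path in paths:
    last_click[path[-1]] += 1             # all credit to the final channel
    for channel in path:
        linear[channel] += 1 / len(path)  # credit shared along the path

print(last_click)  # Facebook gets nothing here...
print(linear)      # ...but its assist role shows up under a linear model
```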

And if you want a little help, give me a call.

Image credit: Can you keep a secret? by jinterwas used under a Creative Commons licence

Am I normal?

I should probably have written this a few days ago.

 

Are we normal?

It all started when I was training some local government web folk on the dark arts of Google Analytics.

One of the pieces of data that Google Analytics (or indeed many other analytics tools) will give you about visits to your website (which Google Analytics calls sessions) is what proportion of them are “bounces”.

Someone bounces on your site if they only view one page in their visit*.

So if your page is there to provide helpful information, you’d probably be quite pleased with a high bounce rate because it suggests people are searching for information, finding it and leaving again.

On the other hand you might have designed a site which encourages people to browse around, stumbling across new things. In which case bouncing would be bad.

And, of course, local authorities are service providers. If people come to your website to pay for things, request things, or book things then they (hopefully) won’t bounce either.

So the question these folk asked was “Is our bounce rate typical for local government?”. Which is an interesting question and one I couldn’t directly answer.

Is that the right question?

Now obviously the key question should be “Is it the bounce rate you expected when you designed the site?” In fact I argue you should always know what good looks like before delving into analytics tools.

But it’s always nice to compare to others. It’s always useful to be able to think “I would expect my site to behave very much like X council” and then find out if it is or isn’t.

And a question that always bothered me when I was responsible for local government digital services was what level of mobile traffic I should expect. Other councils had higher proportions of mobile traffic than my council. But was this a factor of serving a rural population? Was it a factor of the sort of services and content we provided? Was it an aspect of the way we marketed our services?

The more I thought about it, the more interesting (and useful) I thought it would be if we could see this data for each local authority.

So I decided to ask them.

It’s my fault

So, I’m afraid I’m responsible for a flurry of FOIA requests flying around the country. I’ve tried to keep the request specific and simple to answer.

I hadn’t quite appreciated how much email traffic it would generate or how much admin it would involve me in. But hopefully in a few weeks I should have a large dataset of some key metrics.

This is not about league tables or declaring winners: the only correct bounce rate is the one you intended for your site. This is about trying to help describe the ranges and differences between types of authorities and different parts of the UK.

I’ll be sharing it all back and I hope that it will be a useful contribution to the sector.

That said, I may think twice before embarking on a mass FOIA again…

*Inevitably it’s actually more complex than that but that’ll do for the purposes of this blog post.

Four steps that will transform your relationship with Google Analytics


The world is divided into two groups of people

The first group has Google Analytics wired into their website. They have dashboards set up that they check daily. They run weekly and monthly reports and pore over the data with confidence and enthusiasm. They advise managers in their organisation of changes that should be implemented in the website and in services in general, based on the insights from this data. And then they see the fruits of their labour reflected in the reports they run next month.

The second group also has Google Analytics wired into their website. This group logs in every so often and pokes about in the back-end in a desultory fashion. They look at some graphs which seem to be going in the right direction and print them out and send them to some people who examine them in a somewhat confused manner. Nobody changes anything as a result of insights gained from the data.

I’m going to guess that you are in the second group. Most people are. Google Analytics is extremely widely used and very powerful, but its very power can make it seem confusing. Even if you are confident interpreting what it is telling you, effecting change in the rest of the organisation can be tricky.

Here are four simple things you could do differently that should help

1. Never look at Google Analytics unless you have a question that needs answering.

 

Google Analytics is a reporting tool. It tells you what happened and when. It reports an awful lot of things. Most of these things aren’t relevant to you right now (possibly ever). Decide what it is that you want to know and your journey into the data will be smooth and pain-free.

2. Decide what you think the answer should be before you look

Have you just run a marketing campaign? Did it go to plan? Logically, then, lots more people should have visited the service you were promoting, right? You would expect to see sessions and probably pageviews increase. You would expect to see lots of people visiting your site via Facebook (if that’s where you ran the campaign) and so on.

Decide what you think success would look like and then you are looking for a yes/no answer: did it happen as I was expecting?

3. Be clear about what the link is to the organisation / service objective

It’s nice when people visit your website but there has to be more to it than that. If you sell things then, presumably, the service objective would be to… er… sell more things.

In other walks of life (say local government), perhaps you hope that people will be less likely to phone you if they’ve gone online (and so will save you money) or will be able to do something new that will help them in their lives.

If you can’t see the link to the organisation / service objective then you’re going to find it hard to do anything useful with the Google Analytics insights. If that is where you are, put the analytics book down and go and talk to the service.

4. Understand what you will do with the answer

Essentially you can recommend three broad things as a result of what the data show:

  • carry on doing the same thing (because what you are doing is working)
  • do something different (because what you are doing is not working)
  • stop doing anything (because what you did worked so well the job is done)

Make sure you know who will actually make that decision in your organisation, and then understand what information they are going to want from you.

Some people might simply want an email saying

“I’ve had a look at the website stats, things aren’t going to plan. Unless we do something different you’re not going to meet your service target. I suggest we switch our focus to Twitter for the next few weeks”.

Some people (like me) will want a graph for every step of the way

“Before we started the campaign, visits to the page looked like this: [GRAPH]. We have been running the campaign for four weeks and visits have changed like this: [GRAPH]. As you can see, the few people that have used the service came via Twitter [GRAPH], so I recommend we switch our emphasis to Twitter for the next few weeks. I’m sure I don’t need to remind you that your service target is for lots more people to use this service [GRAPH].”

And some people just love the numbers instead (accountants especially).

Every journey starts with a first step

These steps will not turn you into a Google Analytics ninja overnight. But they will mean that your relationship with the platform starts to become more productive. Gradually, step by step, it will become less confusing and more useful. And before you know it you will be well on the way to joining that hallowed group of people making analytics work for them and their organisation.

My company The Likeaword Consultancy can help you get the most out of Google Analytics.

It’s all about the podcast

For several months now I’ve been recording a podcast with Helen Reynolds from Social for the People. We talk about things that have been in the news that interest us: a bit of social media, government, housing, data, emergencies and PR generally. Sometimes we have a guest.

It has been and continues to be an interesting learning curve. In fact Helen has just posted some tips for new podcasters.

The feedback so far has been pretty positive and so we’ve gone so far as to set up a proper home for the podcast online: The Natteron Podcast. We really appreciate the fact that people listen and that they go to the trouble to get in touch about the show.

If you’ve got a suggestion, comment or question please do drop me a line or comment below.

Oh, and the latest podcast, Sixth (non)sense, is up. Have a listen, it includes:

  • Mike Bracken’s dramatic departure from Government Digital Service
  • The company that banned emails and managers
  • Hacking cars
  • Festival of Code winners
  • The value of the Chartered Institute of Public Relations
  • some stuff about analytics