Data maturity in local government

Black and white photo of someone running up steep stairs

I’d like to talk about Data Maturity

Data maturity has recently become a big thing in my life, not least because of my involvement in Data Evolution: a project looking into data maturity in charities and social enterprises. As part of that project some of my colleagues have undertaken a desk review of data maturity models.

I’ve found these models helpful in thinking how I work with specific organisations in a couple of ways:

  • When I turn up to start talking about Google Analytics, it is really helpful first to get a sense of the organisation’s culture around using data to inform decisions
  • Some of the tacit assumptions I have about the use of data and targets to manage organisations come from my experience in a reasonably data-mature organisation. In less mature organisations I find that we have little shared frame of reference.

Data vs open data

I’m also interested in the relationship (if there is one) between the data maturity of an organisation (the culture in the organisation around the use of data to inform and improve decision making) and the open data maturity (the publication and use of open data to support and enable a wider ecosystem).

So obviously I pitched a session on this at LocalGovCamp. We were a select band but I was delighted that anyone else at all wanted to talk about this topic.

What, actually, is data maturity?

We kicked around the idea that organisations can be at different stages on a data maturity gradient. Within Data Evolution my colleague Sian Basker often describes a broad sweep:

  1. ad-hoc gathering of data in some areas
  2. pulling data together centrally
  3. starting to use data looking backwards (how did we do last year? what should we do differently?)
  4. using data in real time to manage the organisation and move resources rapidly
  5. modelling the future before making decisions to enable better decisions to be taken
  6. modelling the future the organisation wants and working backwards to understand what needs to happen now to deliver that future

And all the evidence is that it’s hard work (and takes a long time) to progress along this gradient.

This seemed to resonate with our experience of local government.

Would a model help local government?

We concluded that a local government data maturity model might be really helpful:

  • to begin to structure conversations in organisations
  • to help people understand where their organisation is
  • to help people understand where their organisation might get to
  • to help inform decision making, investment and planning

Lucy Knight shared her experience of using the ODI Open Data Maturity model in just this way (to have useful conversations around open data).

There are some specific things to consider around local government. In England certainly, parliament has made a set of assumptions around the data maturity not just of councils but also of the local public service system. The requirement to undertake Joint Strategic Needs Assessments for example, assumes a minimum level of maturity. It will be interesting to reflect on how realistic these assumptions are.

Minimum viable models

Then we did what any self-respecting group of govcampers would do in such a situation: we got out a flipchart and started thinking about the sort of things that would be helpful to include in a local government data maturity model. Lucy Knight has already spun up a Google Doc with our first thoughts on a data maturity model for local government.

More, as they say, as we get it.

PS. Though no-one from NESTA was there it’s worth noting the Local DataVores research project which is investigating these sorts of areas (NESTA is also part-funding Data Evolution: thanks NESTA!)


Using web stats to engage colleagues and improve performance


(Not my choice of title)

I gave a presentation at Better Connected Live yesterday (25 May). The slidedeck is available online, including a cheesy stock image that amused me (but, it would seem, no-one else). And a slide only included so I could make a “Why did the chicken cross the road” gag. Which I totally failed to do.

A talk of two halves

The first half of the talk was a rambling discourse of my, possibly ill-advised, research into the use of local government websites. I have written extensively around this research so I will spare the reader what I failed to spare the audience.

Engaging colleagues

First of all let me confess that I was never brilliant at engaging colleagues. My tactic of repeating “I don’t care what you think” was often not seen as an offer to collaborate.

However it did happen occasionally and since I’ve fled the shores of local government I have seen good examples of other people working closely with their colleagues.

Let’s not talk about data. Let’s talk about services.

The more I work with data the more I think it’s the wrong thing to talk about. No-one (apart from data-geeks) really cares about the data. They care about what the data tells them about their life or work. So let’s talk about that.

This also shifts the power dynamic. The web team “owns” the webstats. The service manager “owns” the service. So let’s talk about the service.

Who used your service? How did they get there?

My old friend Google Analytics (don’t forget Google Analytics is used by 89% of local authorities) is good at capturing and reporting referral headers. Referral headers, broadly, tell the stats engine which website the user was on before they arrived here.

Except referral headers are not passed by many email clients or by social media apps (though typically they are passed by social media platforms opened in web browsers). Which has the effect, for many organisations, of under-reporting referrals from social networks and emails.
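To illustrate why this under-reports social and email traffic, here is a minimal sketch of referrer classification. The channel rules below are my own illustrative guesses, not the actual classification logic of Google Analytics or any other tool, but they show how a visit with no referrer header (as sent by most email clients and apps) gets lumped in with direct traffic:

```python
from urllib.parse import urlparse

# Illustrative channel rules only -- real analytics tools use far
# more elaborate (and different) classification logic.
SOCIAL_HOSTS = {"twitter.com", "t.co", "facebook.com", "m.facebook.com"}
SEARCH_HOSTS = {"www.google.com", "www.bing.com"}

def classify_referrer(referrer: str) -> str:
    """Bucket a raw Referer header into a reporting channel."""
    if not referrer:
        # Email clients and many social media apps send no referrer,
        # so those visits are counted as genuinely direct traffic.
        return "direct"
    host = urlparse(referrer).netloc.lower()
    if host in SOCIAL_HOSTS:
        return "social"
    if host in SEARCH_HOSTS:
        return "search"
    return "referral"

print(classify_referrer(""))                     # visit from an email client
print(classify_referrer("https://t.co/abc123"))  # link opened in a web browser
```

The first call is the problem case: a link clicked in an email newsletter arrives with no referrer and is indistinguishable from someone typing the address in.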

There is a solution but I don’t see it widely implemented across local government: campaign tagging. Essentially manually appending extra information to the URL when you share it.

The Google URL Builder tool makes this easy and the process can be automated or semi-automated for enterprise use.
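For the semi-automated case, a sketch of what campaign tagging actually does is below. The `utm_source`, `utm_medium` and `utm_campaign` parameters are the standard ones the Google URL Builder appends; the URL and the example values are invented for illustration:

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def tag_url(url: str, source: str, medium: str, campaign: str) -> str:
    """Append standard Google Analytics campaign (UTM) parameters to a URL,
    preserving any query string that is already there."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

# e.g. a link shared in a council email newsletter (hypothetical URL)
print(tag_url("https://example.gov.uk/bins", "newsletter", "email", "missed-bins"))
# -> https://example.gov.uk/bins?utm_source=newsletter&utm_medium=email&utm_campaign=missed-bins
```

Because the tags travel in the URL itself, they survive even when no referrer header is passed, which is the whole point.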

Referrals tell you not just which channel people used, but can also let you infer some information about the user (if they clicked a link on the St Mary’s School website maybe they are a parent or pupil there). Device use and browser choice all help build a profile of who is visiting your content or accessing your service.

What did they do next?

What people do next is one of the most powerful signals that your content or service is working well (or not).

If people visit the missed bin page and then vanish from your site it suggests that they got what they were looking for. If people visit the missed bin page and then visit other pages in the waste area it suggests that page (or potentially the navigation leading to it) is failing the user.
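That “did they vanish?” question can be made concrete. Here is a rough sketch, on invented session data, of measuring what share of visits to a page ended there (the page names and figures are made up; real analytics tools report this as exit rate):

```python
# Hypothetical session data: each session is the ordered list of pages viewed.
sessions = [
    ["/bins/missed-bin"],                                     # found it, left
    ["/home", "/bins/missed-bin"],                            # found it, left
    ["/bins/missed-bin", "/bins", "/bins/collection-days"],   # kept hunting
]

def exit_share(sessions, page):
    """Of the sessions that hit `page`, what share ended there?"""
    hits = [s for s in sessions if page in s]
    exits = [s for s in hits if s[-1] == page]
    return len(exits) / len(hits) if hits else 0.0

# Two of the three sessions ended on the missed bin page.
print(exit_share(sessions, "/bins/missed-bin"))
```

A high exit share on a task-completion page is usually good news; a high share of people wandering onward suggests the page (or the navigation) is failing them.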

What did you expect?

The killer question.

For the service manager.

This is your service. Who are you expecting to use it? Where are you expecting them to come from? What are you expecting them to do next?

It’s OK if this is a back-of-an-envelope calculation but my golden rule for not getting hopelessly lost in analytics data is never to look at it without a question. The best question (at least to begin with) is "did this work the way we expected?".

The answer is almost certainly “No it did not work the way we expected”.

Why did it happen this way?

Your chosen webstats package can tell you what happened and when but it cannot tell you why.

The why is the interesting question of course. It’s probably because the service isn’t working for the user. The best way to fix it, of course, is to go and talk to some users.

But now you know what to talk to them about.

Use simple infographics

In the same way that the longer I spend working with data the less I talk about data, the longer I spend with graphs (or infographics) the less I want them to do.

My favourite infographic is a single word.

Image of a single word (this worked as expected) or

Image of a single word (this did not work as expected)

Time series bar charts and scatter plots are terrifically useful for investigating “Why did it happen this way” but they are, in my humble opinion, largely rubbish for engaging colleagues.

Keep it nice and simple. Add complexity only when the user needs it.

The goal is your friend

Goal tracking is a very powerful tool in Google Analytics. It’s not expressed in language that resonates with local government (lots of stuff about ecommerce). But goals can be expressed flexibly and give you really powerful insights into how people are interacting with your site over multiple visits.

It can be a challenge to define goals for your website. But if you don’t know what the most important tasks are right now then what do you know?
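To make the idea less abstract: one of the ways Google Analytics lets you define a goal is simply reaching a destination page. Here is a minimal sketch on invented session data (the page names and numbers are hypothetical):

```python
# Invented sessions; a "goal" here is reaching a destination page,
# which is one of the standard ways Google Analytics defines goals.
sessions = [
    ["/home", "/waste", "/waste/report-missed-bin/thanks"],
    ["/home", "/waste"],
    ["/waste/report-missed-bin/thanks"],
    ["/home"],
]

GOAL_PAGE = "/waste/report-missed-bin/thanks"  # the "thank you" confirmation page

conversions = sum(1 for s in sessions if GOAL_PAGE in s)
rate = conversions / len(sessions)
print(f"{conversions} goal completions, {rate:.0%} conversion rate")
```

The hard part, as above, is not the counting: it is deciding which handful of pages genuinely count as the organisation’s goals.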

The unit of delivery is the team

(As someone once said)

This stuff works well when everyone gets focused on the same task. I achieved most as a web manager when I worked alongside service managers looking at all of our data: web, calls, service levels together. I achieved least when I used data to try to win arguments (or service managers did the same with me).

In conclusion

There is mixed practice in local government around the use of webstats but I don’t think that can be separated from the organisation’s practice around the use of data generally.

In fact data was a recurring theme at Better Connected Live. Which is good.

I find it helpful to remember that organisations don’t switch between binary states of “using data well” and “not using data well”. Instead data-sophistication is a journey.

In fact I’m involved in a data sophistication project in the voluntary sector called Data Evolution for just that reason.

(Photo credit: why by Art Siegel used under CC BY-NC 2.0)

Telling stories with data.

This is a placeholder.

More will be along soon.

At OD Camp I (Ben) suggested a little project exploring creative writing with data.

A small but perfectly formed group kicked around a few ideas.

We looked at a range of datasets but they didn’t seem to inspire.

I had this idea of the story of an osprey.

I used Twine to give the reader some choices. The consequences of the choices are informed by some (public and/or open) datasets.

It’s quick and dirty.

I’ll add new versions here:

Version 0. Very rough and ready.

Public bodies that use Google Analytics do hold the data collected

The dangers of email mail-merge

A long time ago I rather rashly made an FOIA request for website usage data from every principal local authority in the UK. Things got a bit fuzzy in Northern Ireland owing to their local government reorganisation but in general most people handed over the information. Sometimes incredibly swiftly (take a bow Cardiff), sometimes with a bit of nagging (I’ll spare your blushes).

Two councils refused my request on the grounds that they don’t hold the data. Being the suspicious type I investigated their home pages. One of them did not appear to be running any tracking script which (though eccentric) seemed to be in line with their response. The other was running the Google Analytics tracking script. I pointed this out and following a brief email exchange my request was rejected, and this rejection was upheld by an internal review.

So I referred the matter to the ICO.

The ICO investigates

I have to say the investigating officer was a remarkably nice and helpful man, despite my erratic phone answering and the comparative nerdiness of my request. After some discussions and contemplation the ICO issued a decision notice in which, you will not be surprised to learn, my appeal was upheld.

The decision notice has been published. They’ve taken my name out, which is nice, and you can read the decision notice in a PDF.

So apart from crowing at my (let’s face it: pretty minor) victory why else am I here?

Well though I always thought the council was wrong I could kind of see where they were coming from and so I think aspects of the ICO decision notice are helpful to note.

Things to note

The ICO decided:

17. Having considered the above, it is evident that Google Analytics holds the usage data because the council has previously instructed it to do so (i.e. by actively placing a tracking script within the code of its webpages). Whilst the council has explained that it no longer needs this usage data for any business reason, it is clear that Google Analytics continues to collate and store the usage data because it has not received instruction from the council not to (i.e. through the removal of the tracking script). On this basis, the Commissioner has concluded that the raw usage data is held on behalf of the council by Google Analytics.

The Council also pointed out that they would need to run a report to answer the question, which would be the creation of new data: something they are not obliged to do. The ICO has given them quite a lot to consider on this point but concludes:

22. Having considered the above, it would appear to the Commissioner that running a report on the electronically held raw usage data would result in a statistical summary. It would also appear that it may be reasonably practicable for the council to provide such a summary, due to it having both the Google Analytics tool and council officers with the necessary skill to use it. On this basis the Commissioner would be likely to conclude that the provision of a summary based on the raw usage data would not represent the creation of new information.

So. If you collect the data, you hold the data. If someone asks you for a statistical summary of the data you hold that is (within limits) covered.

(I haven’t actually received the data yet mind).

What should define a Multi Agency Information Cell?

Why the gripping headline?

I’m not sure I’m getting the hang of writing click-bait headlines. But this is a significant question for some people. And some of those people read this blog.

What’s it all about?

Version 2 of the JESIP doctrine has been published for consultation. JESIP is the Joint Emergency Services Interoperability Programme and the JESIP Doctrine lays out how the emergency services should work together around major incidents.

Though JESIP is about the emergency services the doctrine actually affects hundreds more organisations because they (local authorities, heath bodies, utility companies and so on) have a duty to work with the emergency services (and each other) to sort out emergencies.

The original JESIP doctrine was pretty clear and sensible. Version 2 builds on these pragmatic and sensible foundations but adds in a couple of years of learning since the original. You can see the draft JESIP Doctrine here.

Get to the point Ben

Section 5 of the draft doctrine covers Information Assessment and Management. It touches on a range of things that will be of interest to people in my network (like essentially recommending ResilienceDirect as the way you should exchange data).

Section 5.4 issues a “Framework for Information Assessment” which is really saying “let’s be consistent when talking about how reliable information is”. The question of how you assess the reliability of publicly available information (like reports on social media) is something VOST and Digital Humanitarian groups have considerable expertise in.

Most exciting is section 5.5 which mandates a Multi-Agency Information Cell. This is a dashed good idea. In fact many people might think it sounds rather like a Virtual Operational Support Team (or VOST). In the current draft though the MAIC does seem a bit inward looking, pooling the geographic data that agencies have.

This sparked a bit of a discussion on Twitter and I said I would fire something up to see if we can get some sort of consensus from the VOST/BlueLight and possibly the CrisisMapping communities.


The consultation is open to anyone to respond. Responses have to be sent in a fairly structured way (using an Excel spreadsheet; I’ll park the discussion on the use of open formats for a more appropriate time). So anyone can (and probably should) respond in their own right.


I’d really appreciate the insight of the wider digital and emergencies community specifically on the sections about the Framework for Information Assessment and the Multi-Agency Information Cell. I’ve pulled those sections (and only those sections) into a Google Doc.


Living on the edge of devolution

Picture of a walker on a hill side with a farm in a valley far below and behind

Trains are devolving

Responsibility for the West Midlands rail franchise could be handed to the putative mayor of the West Midlands, it has been reported.

That’s probably a good thing. This devolution idea seems to be likely to hang around for a bit. Politicians based in Birmingham are likely to be more interested in rail services based in Birmingham than the bunch based in London. They may not run it any better but it’ll be a shorter drive to go and shout at them.

I’m not based in London or Birmingham so the whole thing is of largely academic interest to me.

Except actually I use that franchise quite a lot. The West Midlands franchise (currently badged London Midland) provides the (rather slow) rail link between Hereford and Birmingham (and Malvern, Worcester etc). We are, quite literally, the end of the line (or I suppose, the start).

London Midland isn’t the only operator serving Hereford. We travel north to Manchester and south to Cardiff and beyond on Arriva Trains Wales. Responsibility for this Wales and Borders Franchise is being handed over to the Welsh Assembly.

We’re also the end of the line for the occasional Great Western service (not complaining I love my early morning chug through the Cotswolds).

From next year the vast majority of our rail services will be under the control of devolved institutions. That’s exciting. Closer to the people, more accountable, more joined up.

Except, here in Herefordshire we don’t get a vote in either of those institutions. We’re not in Wales nor are we part of the West Midlands Combined Authority.

This might not matter. It’s not like transport planners have been focused on making an effective and efficient service at Hereford station up until now.

But it probably will matter.

The whole point of devolving decision making is that it will make it more responsive to local people. So the West Midlands franchise should be, under devolution, run to be in the best interests of Birmingham and the Black Country. The Wales and Borders franchise should be (despite the name) run to be in the best interests of the Welsh.

If it serves Herefordshire well it will be by accident or because our interests coincide with the interests of the people of Wales or the West Midlands.

I suspect you probably still don’t care. We’re only talking about trains after all.

But let’s think about some of the other issues that are going to be devolved to the West Midlands and are already devolved to Wales: health, social care, road transport, economic infrastructure investment.

In all these cases we’re on the edge. Potentially squeezed between two institutions created with the explicit purpose of not having to worry about the impact of their decisions on us.

What can we do?

As I see it we have three broad options

  1. We can ask to join one of these devolved bodies.

    Joining Wales might be a hard sell in the county (though maybe not, they are our neighbours after all (yes, OK, it would be a hard sell)). It would probably be an even harder sell to the people of Wales. And even then I can see negotiations over the use of the language and the location of “Welcome to England” signs spiralling out of control.

    The West Midlands might be even tougher. We don’t share a land border with the West Midlands county and, let’s face it, no more than 1 in 100 of their citizens could point to Hereford on a map, let alone Leominster.
  2. We can get a devolution settlement of our own.

    We’ve already missed out on some of the juicy deals but I think most Herefordians would have some strong opinions about local priorities for road spending or health. But the independent state of Herefordshire? It’s difficult to see it. Cornwall managed to get an early devolution deal so it’s not unheard of for a single county. But Cornwall has over half a million inhabitants (that’s not far off 3 times our population).

    So a combined authority then? Joining up with Shropshire and Telford? Or maybe with Worcestershire? That worked really well before didn’t it?

    No. No it didn’t.

    Personally I’m quite attracted to the idea of devolution to the Marches (currently configured as Shropshire, Telford and Herefordshire). If Worcestershire joined we’d match the police force boundary. If they can do it maybe politicians can too. But then I grew up in Hereford, lived in Shrewsbury and worked in Ironbridge. My perspective may not be typical.

    At least we could expect more of a focus on improving the A49.

  3. We can say a plague on all your houses and just ignore it.
    This also has its attractions. English devolution may not take. And compared to what we see in Wales it’s really quite limited.

    There are real reasons to be cautious. In 2008 Herefordshire Council merged with the Primary Care Trust (then the thing that made most of the decisions for the NHS in Herefordshire). This was an uncharacteristically innovative thing to happen in what is, to say the least, a small-c conservative county.

    That was unwound in 2013 not because the people of Herefordshire thought it was a bad idea but because the government struck Primary Care Trusts out of existence (as part of fulfilling their election pledge that there would be no top-down reorganisation of the NHS).

    That should create within our leaders locally a healthy scepticism of how much power the folks in London are really going to hand over.

To devolve or not to devolve

I’m not trying to sell one option (except I think joining Wales is probably out) but I do think it’s something we should probably talk about.

Wouldn’t it be great if the people of the county had a say in where the decisions that affect the county were taken?

Is that a foolish hope?

If you want to know more about English devolution this blog post from the LGIU is a good place to start.

Photo credit: Fran heads for the edge by Alastair Campbell used under CC BY-SA 2.0

A Minimum Viable LocalGov Web Analytics Platform

A couple of weeks ago I wrote about what a jolly good idea it would be if we could pull all Council website stats into one place.

As a next step in this journey I’ve set up a minimum viable analytics platform.

It’s very minimum. Out of the 400+ local authorities, it features data from 2.

They are:

  • East Sussex, my favourite council in this space, which publishes a Google account that will give anyone read access to their Google Analytics property
  • North Yorkshire, which doesn’t publish an account like East Sussex but kindly set one up to allow me to experiment

So. The first order of business is to pull some data out of their Google Analytics account. I wanted to use the Google Analytics add-on for Google Sheets because it makes life very easy.

Each spreadsheet has to be tied to one Google Account. In this case that meant having to create 2 spreadsheets. Luckily I was able to create each spreadsheet on Google Drive using the account details I had been given (yes, I can see a problem with this too: bear with).

So I created a spreadsheet for each account and used the Add On to run some simple reports. Then I gave myself access to those spreadsheets and copied their contents into a Spreadsheet of my own devising.

And was able to draw a nice graph of sessions per month for each council.
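The merge step is trivial once the per-council reports exist. A sketch of the shape of it, with invented figures standing in for the monthly session counts pulled out of each council’s Google Analytics account:

```python
# Invented figures standing in for the monthly session counts
# exported from each council's Google Analytics account.
east_sussex = {"2016-01": 210_000, "2016-02": 195_000}
north_yorks = {"2016-01": 180_000, "2016-02": 175_000}

councils = {"East Sussex": east_sussex, "North Yorkshire": north_yorks}

# Build one row per month, one column per council -- ready to graph.
months = sorted({m for data in councils.values() for m in data})
rows = []
for month in months:
    row = {name: data.get(month, 0) for name, data in councils.items()}
    rows.append((month, row))
    print(month, row)
```

The same loop extends to any number of councils; the hard part, as the rest of this post argues, is getting access to the data in the first place.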

It would be straightforward to add more councils in the same way if they published an account as these two have. I don’t think that’s the solution though.

Creating a Google Account and publishing the password is a really attractive idea but it does enable people to use Google services completely anonymously. They might well want to do that for nefarious reasons. The risk is pretty low whilst only one or two councils do it but the risk would grow significantly if it became standard practice.

This is annoying of course but there we are. What could we do instead?

  • We could encourage councils to run a standard set of reports and publish these. Google Sheets would be a neat way of doing this because the integration is simple and the data can be pulled out in many useful formats.
  • We could set up a series of dedicated accounts and ask councils to grant us access to their properties. That would save them the trouble of having to schedule reports and we could hand their data back to them nicely
  • We could ask councils to submit annual returns from their websites in a nice simple form
  • We could do all of those and give councils the choice

Going through this process makes me see why the US Government uses its own tracking script for Federal Departments and Websites. From my point of view that would be the most satisfactory approach but 650,000,000 visits a year probably equates to 1.5bn records a year so that’s not a project we’re going to manage successfully as a spare time endeavour.

But I’d really like to hear from others on this.

Five things corporate communications teams should know about their website.

Cloudy sky seen through the red net of a football goal
Soccer goal by ewiemann used under CC BY 2.0

It’s funny how often I find corporate comms people who are divorced from their organisation’s websites.

Sometimes it is clear that internal turf wars ensure that they are kept at arm’s length. Even when corporate comms teams are welcomed into the digital/webby fold or placed in charge of websites they still seem to miss out on some of the fundamentals.

So here are my top five things they should know. As always I refer to Google Analytics because it is incredibly widespread and powerful, but other website analytics tools exist.

1. How do people get to your website?

It’s a simple enough question and one that Google Analytics tries hard to answer. Left to its own devices Google Analytics will underestimate the number of people who come following links from social media and email (for more on this try this interactive story).

Crucially, of course, ask not just “how do people get to us?” but also “is that what we expected?”. If you are running a campaign focused on email marketing you would expect more people to visit you via email.

2. Which campaigns work best?

You are planning and delivering some of the finest, cutting edge multi-channel campaigns ever devised (possibly). But really: are they any good?

Some aspects of your campaigns are easy to capture in Analytics (there’s a nice integration with AdWords for example) but it’s a bit harder to align that radio ad alongside. But think a bit laterally: maybe give people a voucher code in this campaign, make the call to action a specific search on your site (I ran a radio ad like that once, I was able to establish that precisely one person searched as a result, we didn’t do that again).

Analytics has powerful reporting tools that allow you to draw a lot of this data together and, if that’s not enough, a beautiful API and more customisation than you can shake a stick at.

3. Are your goals being met?

I’m always surprised at how poorly understood the goals feature in Google Analytics is. It makes sense for organisations with a small number of clear goals (buy our stuff, buy as much as possible). But if the main feature of the site is to provide content (say) organisations can struggle to see how to describe goals usefully.

This has to be a challenge back to the organisation. If you can’t define the goals of your website that’s a problem. If your website has 300 goals all equally important that’s almost as bad.

Right now some things have to be more important to the organisation than others. That’s prioritisation. It’s something corporate comms teams should be on top of anyway.

So now you have some priority goals you can test.

4. Which channels are working best for our goals?

So we really want people to sign up for this new service. We’ve done a groundbreaking and massively successful multi-channel campaign. Loads of people signed up. Job done. Have a cigar (not really, smoking’s really bad for you).

Hang on a minute though. Was the Facebook Campaign worth the effort? That blogger outreach campaign got us loads of mentions but did it actually get people to our service?

Luckily Google Analytics has the answer to these questions.

5. What does an effective multi-channel campaign really look like?

Maybe the blogger outreach got people to have a peek at your site but they only signed up when they saw a Facebook ad? That’s important data. If you ditched the blog work for your next campaign you might see a drop in success even if you spend more on Facebook.

If only there were some way of working out the contribution that each channel made to the final goal.

Wait, there is? Wow. That’s great.

As so often when dealing with metrics and analytics the challenge is not with the technology and the tools.

The challenge is to ask the right questions.


One analytics site to rule them all

All tomorrow’s data

I love website usage data. Can’t get enough of it. I love it so much that last August I asked every council in the country to send me some.

And they did (well nearly all of them did). And I poured it all into a big beautiful spreadsheet and put it on the web. The usage of local government websites in Great Britain.

Which was nice.

Unfortunately my love for website usage data is such that it was not enough. I want to know more. What are the annual trends and the seasonal trends? Do areas with significant tourism industries get more interest in the summer (or the winter)? What areas of websites are getting the most traffic?

More FOIA, more hassle

Now I could just ask for this data. That worked tolerably well last time but it’s a pretty unsatisfactory idea. Each FOIA request generates work in each council and when the data comes in, it creates work for me. And, though I love website usage data, that is work time that might be better spent doing things for paying clients so that I can afford to feed my dog.

And also, you know, it’s the 21st century: machines are supposed to do this sort of thing. Some (surprisingly few) councils already publish their website usage data. Getting more of them to do so would be a start but unless we can get the data marked up against an agreed standard it is still going to take a human being a distressingly long time to collate.

The Americans will solve it

In the USA there is a project that could provide a nice model. It’s called the Digital Analytics Program.

Participating departments and agencies insert an additional tracking script in their webpages. This sends packets of data back to the project server, and this is then available to anyone who shares my website stats interests.

We could do that here, couldn’t we? It would be easy for councils to implement and should ensure that people like me cease to trouble them with FOIA requests in this area. And it would provide really rich benchmarking and research data. If we included mobile app use tracking, that would provide really useful evidence in the “I want an app” / “No-one will use your app” arguments.

It wouldn’t be entirely free. We’d need some server capacity and some support to maintain the analytics tool. But it would be very low cost.

What’s not to love?

This is not the only way

I know what you’re thinking (no, not that). You’re thinking: couldn’t we just use Google Analytics for this? And the answer is yes, partially.

In principle we could set up a new Google Analytics property for “All council websites” and harvest that data but the combined traffic would significantly exceed the maximum allowed in the Google Analytics free tool.
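A back-of-the-envelope check shows why. Both inputs below are assumptions: the roughly 650 million visits a year figure used elsewhere in these posts, a guessed average of a little over two pageview hits per visit, and the classic Google Analytics free-tier ceiling of 10 million hits per property per month:

```python
# Back-of-envelope only; all three inputs are assumptions.
visits_per_year = 650_000_000   # rough all-council total used elsewhere here
hits_per_visit = 2.3            # guessed average pageview hits per visit
free_tier_limit = 10_000_000    # classic GA free-tier hits/property/month

hits_per_month = visits_per_year * hits_per_visit / 12
print(f"{hits_per_month / 1e6:.0f}M hits/month vs {free_tier_limit / 1e6:.0f}M free-tier limit")
```

Even if the guessed averages are off by a factor of two in either direction, a single shared property would blow well past the free tier.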

All but one council already uses an analytics tool so, as an alternative, we could automate the collection of data from their existing tools. Overwhelmingly they use Google Analytics, which has a beautiful API, so that is certainly feasible. Feasible but practically complex: each council would have to manage user credentials and those would also have to be managed by the central datastore. If a council switches analytics tool that would create an additional (and little used, so easily forgotten) admin load.

Good idea / bad idea

Who’s with me?

What would be the best tool?

Why is this not the best idea you’ll hear about all day?


The wind in the website

Photo of toy rat and mole in winter woodland

I’ve been thinking about using different ways to communicate ideas lately.

I wanted to write something simple about Campaign Tagging and why it might be a good idea to try it.

I could just knock out a simple blog post. That would actually probably be the best idea in terms of SEO. But it’s dull.

I could create an exciting video. Maybe a 2 minute YouTube or a quirky and imaginative Vine.

But I wanted to play about with the idea of story and to try the excellent Twine.

So here is my first effort, “The Wind in the Website” (with apologies to Kenneth Grahame).

I’d really like to know what you think.
It’s also an opportunity to have a play about with GitHub HTML hosting.

(Photo credit: In the Woods by John Nolan used under CC BY-SA 2.0)