All tomorrow’s data
I love website usage data. Can’t get enough of it. I love it so much that last August I asked every council in the country to send me some.
And they did (well nearly all of them did). And I poured it all into a big beautiful spreadsheet and put it on the web. The usage of local government websites in Great Britain.
Which was nice.
Unfortunately my love for website usage data is such that it was not enough. I want to know more. What are the annual trends and the seasonal trends? Do areas with significant tourism industries get more interest in the summer (or the winter)? Which areas of websites are getting the most traffic?
More FOIA, more hassle
Now I could just ask for this data. That worked tolerably well last time but it’s a pretty unsatisfactory idea. Each FOIA request generates work in each council and when the data comes in, it creates work for me. And, though I love website usage data, that is work time that might be better spent doing things for paying clients so that I can afford to feed my dog.
And also, you know, it’s the 21st century: machines are supposed to do this sort of thing. Some (surprisingly few) councils already publish their website usage data. Getting more of them to do so would be a start, but unless we can get the data marked up against an agreed standard, it is still going to take a human being a distressingly long time to collate.
The Americans will solve it
In the USA there is a project that could provide a nice model. It’s called the Digital Analytics Program.
Participating departments and agencies insert an additional tracking script in their webpages. This sends packets of data back to the project server, and this is then available to anyone who shares my website stats interests.
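A central collector of this kind can be very small. As a hypothetical sketch (the field names here are illustrative, invented for this post, not taken from the real programme), each pageview ping might arrive as a query string that the collector parses into a simple record:

```python
from urllib.parse import parse_qs

def parse_ping(query_string):
    """Turn a tracking ping's query string into a pageview record.

    Hypothetical field names: 'council' identifies the site, 'path'
    the page viewed. A real scheme would need an agreed standard.
    """
    params = parse_qs(query_string)
    return {
        "council": params.get("council", ["unknown"])[0],
        "path": params.get("path", ["/"])[0],
    }

# e.g. parse_ping("council=barnet&path=/bins")
#      -> {"council": "barnet", "path": "/bins"}
```

The point is that once every council embeds the same snippet, the records arrive in one agreed shape and collation stops being a human job.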
We could do that here, couldn’t we? It would be easy for councils to implement and should ensure that people like me cease to trouble them with FOIA requests in this area. And it would provide really rich benchmarking and research data. If we included mobile app use tracking, that would provide really useful evidence in the “I want an app” versus “No-one will use your app” arguments.
It wouldn’t be entirely free. We’d need some server capacity and some support to maintain the analytics tool. But it would be very low cost.
What’s not to love?
This is not the only way
I know what you’re thinking (no, not that). You’re thinking: couldn’t we just use Google Analytics for this? And the answer is yes, partially.
In principle we could set up a new Google Analytics property for “All council websites” and harvest that data, but the combined traffic would significantly exceed the maximum allowed on the free Google Analytics tier.
All but one council already uses an analytics tool so, as an alternative, we could automate the collection of data from their existing tools. Overwhelmingly they use Google Analytics, which has a beautiful API, so that is certainly feasible. Feasible but practically complex: each council would have to manage user credentials, and those would also have to be managed by the central datastore. If a council switched analytics tools, that would create an additional (and, being little used, easily forgotten) admin load.
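To give a flavour of the central-datastore side, here is a hypothetical Python sketch of the easy bit: combining per-council pageview exports into one benchmark table. The input shape and names are assumptions made up for illustration, not the actual response format of any analytics API:

```python
from collections import defaultdict

def combine_reports(reports):
    """Merge per-council monthly pageview exports into one table.

    reports: {council_name: [(month, pageviews), ...]}
    Returns {month: total_pageviews} across all councils.
    """
    totals = defaultdict(int)
    for council, rows in reports.items():
        for month, pageviews in rows:
            totals[month] += pageviews
    return dict(totals)

# e.g. combine_reports({"Barnet": [("2015-01", 120)],
#                       "Leeds":  [("2015-01", 80), ("2015-02", 50)]})
#      -> {"2015-01": 200, "2015-02": 50}
```

The merging is trivial; the hard, ongoing work is the credential management around it, which is exactly why the single-snippet model above it is more attractive.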
Good idea / bad idea
Who’s with me?
What would be the best tool?
Why is this not the best idea you’ll hear about all day?