“Has the milk tanker been yet..? I’m waiting for the Internet”

Photo of the tank of a Milk Tanker which prominently shows United Dairies
United Dairies glass-lined milk tank – freight train tanker carriage by David Precious. https://flic.kr/p/f1MQTB used under CC-BY-2.0


So this afternoon I went to a conference about highways.

Despite what you might imagine, this was geeky even for me. But I had been persuaded to run a workshop on “Smart Rural”, a half-formed idea that I (and other rural types) have: that smart city initiatives may not have much to offer the countryside.

I thought we would be talking about autonomous vehicles and intelligent tractors. But in fact we ended up talking about internet connectivity.

This was slightly galling because I try not to talk about internet connectivity. It’s a big problem in rural areas but it’s not going to be resolved at the sort of scale and speeds that would make a lot of smart city type projects viable.

But it does seem to sit at the heart of many issues in this space.

So what, I asked the group, are the solutions that don’t involve the answer “Gigabit fibre”?

And one of our participants told us the story of a hack used in Cuba to get round the fact that Internet access is largely unavailable. People move files (video, magazines, books) physically, by regular courier or truck. It’s an obvious solution. And actually in the West we move very large files (or collections of files) physically because of the time taken to stream across the Internet.

So, this got me thinking, could we do something similar in rural areas? Could we arrange local (in-village) caching of, for example, the BBC iPlayer? The BBC already uses Content Delivery Networks to cache files locally to your ISP. This would be an iteration of that approach. The data could be distributed across a local network: say a WAN or a mesh. The files could be updated over the internet pipe into the village or physically brought to the location, or a combination of the two.

And maybe the same system could serve other content. Unlike the Cuba model there is likely to be a connection to the network, just one of limited bandwidth. So the server could be intelligent about which data it pulled (and sent) down the pipe and which it stored for physical transport.
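To make that concrete, here is a toy version of the decision the cache server would face each night. Everything in it (file names, sizes, priorities, the bandwidth budget) is invented for illustration:

```python
# A toy scheduler for a village cache server: decide which files to pull
# over the limited pipe tonight and which to leave for physical delivery.
# All numbers and file names are made up for illustration.

def plan_transfers(queue, nightly_budget_mb):
    """Sort by priority, fill the pipe budget, defer the rest to transport."""
    over_pipe, by_courier = [], []
    budget = nightly_budget_mb
    for name, size_mb, priority in sorted(queue, key=lambda f: -f[2]):
        if size_mb <= budget:
            over_pipe.append(name)
            budget -= size_mb
        else:
            by_courier.append(name)
    return over_pipe, by_courier

queue = [
    ("news_bulletin.mp4", 40, 9),    # small and time-sensitive: use the pipe
    ("boxset_episode.mp4", 900, 5),  # huge and not urgent: wait for the van
    ("software_update.bin", 150, 7),
]
pipe, courier = plan_transfers(queue, nightly_budget_mb=200)
```

A real server would weigh freshness and demand too, but the shape of the decision is the same.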

There are a range of vehicles that visit rural communities on a regular basis: most obviously (and, in this context, pleasingly) the Royal Mail, but also milk tankers, feed transport, the cars of commuters, buses, refuse lorries and so on.

Maybe as the connection-enabled Royal Mail van enters the village it connects to the WAN, handshakes and starts pulling the data off the network as it travels around. Then it stores it on-board and handshakes with a server connected to a (bigger, faster) pipe back at the depot. What the Royal Mail didn’t have time to capture can be loaded onto the milk tanker a bit later.
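The van-and-depot handshake is essentially a set difference: carry whatever one side has that the other lacks. A deliberately naive sketch, with invented file names (real delay-tolerant networking systems are far more careful about capacity, ordering and conflicts):

```python
# A data-mule handshake in miniature: copy across whatever the destination
# is missing, up to its remaining storage capacity.

def transfer(source, destination, capacity):
    """Copy items the destination lacks, up to its remaining capacity."""
    missing = sorted(source - destination)  # sorted for a deterministic order
    taken = missing[:max(0, capacity - len(destination))]
    destination.update(taken)
    return taken

depot = {"iplayer_ep1", "iplayer_ep2", "magazine_oct"}
village = {"iplayer_ep1"}
van = set()

transfer(depot, van, capacity=10)                  # load the van at the depot
new_files = transfer(van, village, capacity=100)   # unload in the village
```

The same `transfer` call, run in the other direction, would carry the village’s outbound data back to the depot.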

I can’t decide if this is a good idea (in which case it’s probably already being used somewhere) or over-engineered silliness (in which case someone in my network will probably let me know).

For completeness here are photos of the flipcharts that we created in our discussion.


Data maturity in local government

Black and white photo of someone running up steep stairs

I’d like to talk about Data Maturity

Data maturity has recently become a big thing in my life, not least because of my involvement in Data Evolution: a project looking into data maturity in charities and social enterprises. As part of that project some of my colleagues have undertaken a desk review of data maturity models.

I’ve found these models helpful in thinking about how I work with specific organisations, in a couple of ways:

  • If I turn up and start talking about Google Analytics, it is really helpful to first get a sense of the organisation’s culture around using data to inform decisions
  • Some of the tacit assumptions I have about the use of data and targets to manage organisations come from my experience in a reasonably data-mature organisation. In less mature organisations I find that we have little shared frame of reference.

Data vs open data

I’m also interested in the relationship (if there is one) between the data maturity of an organisation (the culture in the organisation around the use of data to inform and improve decision making) and the open data maturity (the publication and use of open data to support and enable a wider ecosystem).

So obviously I pitched a session on this at LocalGovCamp. We were a select band but I was delighted that anyone else at all wanted to talk about this topic.

What, actually, is data maturity?

We kicked around the idea that organisations can be at different stages on a data maturity gradient. Within Data Evolution my colleague Sian Basker often describes a broad sweep:

  1. ad-hoc gathering of data in some areas
  2. pulling data together centrally
  3. starting to use data looking backwards (how did we do last year? what should we do differently?)
  4. using data in real time to manage the organisation and move resources rapidly
  5. modelling the future before making decisions to enable better decisions to be taken
  6. modelling the future the organisation wants and working backwards to understand what needs to happen now to deliver that future

And all the evidence is that it’s hard work (and takes a long time) to progress along this gradient.

This seemed to resonate with our experience of local government.

Would a model help local government?

We concluded that a local government data maturity model might be really helpful:

  • to begin to structure conversations in organisations
  • to help people understand where their organisation is
  • to help people understand where their organisation might get to
  • to help inform decision making, investment and planning

Lucy Knight shared her experience of using the ODI Open Data Maturity model in just this way (to have useful conversations around open data).

There are some specific things to consider around local government. In England certainly, parliament has made a set of assumptions around the data maturity not just of councils but also of the local public service system. The requirement to undertake Joint Strategic Needs Assessments for example, assumes a minimum level of maturity. It will be interesting to reflect on how realistic these assumptions are.

Minimum viable models

Then we did what any self-respecting group of govcampers would do in such a situation: we got out a flipchart and started thinking about the sort of things that would be helpful to include in a local government data maturity model. Lucy Knight has already spun up a Google Doc with our first thoughts on a data maturity model for local government.

More, as they say, as we get it.

PS. Though no-one from NESTA was there, it’s worth noting the Local DataVores research project which is investigating these sorts of areas (NESTA is also part-funding Data Evolution: thanks NESTA!)


Telling stories with data.

This is a placeholder.

More will be along soon.

At OD Camp I (Ben) suggested a little project exploring creative writing with data.

A small but perfectly formed group kicked around a few ideas.

We looked at a range of datasets but they didn’t seem to inspire.

I had this idea of the story of an osprey.

I used Twine to give the reader some choices. The consequences of the choices are informed by some (public and/or open) datasets.

It’s quick and dirty.
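Twine handles the branching for me, but the underlying structure is simple enough to sketch in Python. The survival figures below are invented placeholders for the public datasets the story draws on:

```python
# A Twine-style branching passage, sketched in code. The routes and the
# odds attached to them are hypothetical stand-ins for real datasets.

SURVIVAL = {           # invented odds of a successful crossing
    "coast": 0.9,      # longer route, easier fishing
    "direct": 0.6,     # shorter route, open water
}

def passage(route):
    """Return the narrative consequence of a reader's choice."""
    odds = SURVIVAL[route]
    if odds >= 0.75:
        return f"You follow the {route} route and feed well. ({odds:.0%} make it.)"
    return f"You risk the {route} route over open water. ({odds:.0%} make it.)"

print(passage("coast"))
```

The point is only that each choice’s consequence is looked up in data rather than made up on the spot.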

I’ll add new versions here:

Version 0. Very rough and ready.

Public bodies that use Google Analytics do hold the data collected

The dangers of email mail-merge

A long time ago I rather rashly made an FOIA request for website usage data from every principal local authority in the UK. Things got a bit fuzzy in Northern Ireland owing to their local government reorganisation but in general most people handed over the information. Sometimes incredibly swiftly (take a bow Cardiff), sometimes with a bit of nagging (I’ll spare your blushes).

Two councils refused my request on the grounds that they don’t hold the data. Being the suspicious type I investigated their home pages. One of them did not appear to be running any tracking script which (though eccentric) seemed to be in line with their response. The other was running the Google Analytics tracking script. I pointed this out and, following a brief email exchange, my request was rejected and the rejection was upheld by an internal review.

So I referred the matter to the ICO.

The ICO investigates

I have to say the investigating officer was a remarkably nice and helpful man, despite my erratic phone answering and the comparative nerdiness of my request. After some discussions and contemplation the ICO issued a decision notice in which, you will not be surprised to learn, my appeal was upheld.

The decision notice has been published. They’ve taken my name out, which is nice, and you can read the decision notice in a PDF.

So apart from crowing at my (let’s face it: pretty minor) victory why else am I here?

Well, though I always thought the council was wrong, I could kind of see where they were coming from, and so I think aspects of the ICO decision notice are helpful to note.

Things to note

The ICO decided:

17. Having considered the above, it is evident that Google Analytics holds the usage data because the council has previously instructed it to do so (i.e. by actively placing a tracking script within the code of its webpages). Whilst the council has explained that it no longer needs this usage data for any business reason, it is clear that Google Analytics continues to collate and store the usage data because it has not received instruction from the council not to (i.e. through the removal of the tracking script). On this basis, the Commissioner has concluded that the raw usage data is held on behalf of the council by Google Analytics.

The council also pointed out that they would need to run a report to answer the question, which would be the creation of new data: something they are not obliged to do. The ICO has given them quite a lot to consider on this point but concludes:

22. Having considered the above, it would appear to the Commissioner that running a report on the electronically held raw usage data would result in a statistical summary. It would also appear that it may be reasonably practicable for the council to provide such a summary, due to it having both the Google Analytics tool and council officers with the necessary skill to use it. On this basis the Commissioner would be likely to conclude that the provision of a summary based on the raw usage data would not represent the creation of new information.

So. If you collect the data, you hold the data. If someone asks you for a statistical summary of the data you hold, that is (within limits) covered.
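In code terms, the “report” the council resisted is nothing more exotic than this sort of aggregation over records it already holds (the dates, pages and counts here are invented):

```python
# What "running a report" amounts to: summarising raw usage records that
# are already held. A monthly page-view count is a statistical summary of
# existing data, not the creation of new data.
from collections import Counter

raw_hits = [            # (timestamp, page) records as an analytics tool stores them
    ("2015-06-01", "/bins"),
    ("2015-06-14", "/bins"),
    ("2015-07-02", "/council-tax"),
]

def monthly_summary(hits):
    """Count page views per month by truncating each timestamp to YYYY-MM."""
    return Counter(ts[:7] for ts, _page in hits)

summary = monthly_summary(raw_hits)   # e.g. two hits in June, one in July
```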

(I haven’t actually received the data yet mind).

A Minimum Viable LocalGov Web Analytics Platform

A couple of weeks ago I wrote about what a jolly good idea it would be if we could pull all Council website stats into one place.

As a next step in this journey I’ve set up a minimum viable analytics platform.

It’s very minimum. Out of the 400+ local authorities, it features data from two.

They are

East Sussex, my favourite council in this space, which publishes on data.gov.uk a Google account that gives anyone read access to its Google Analytics property.

North Yorkshire, which doesn’t publish an account like East Sussex but kindly set one up to allow me to experiment.

So. The first order of business is to pull some data out of their Google Analytics account. I wanted to use the Google Analytics add-on for Google Sheets because it makes life very easy.

Each spreadsheet has to be tied to one Google account. In this case that meant having to create two spreadsheets. Luckily I was able to create each spreadsheet on Google Drive using the account details I had been given (yes, I can see a problem with this too: bear with).

So I created a spreadsheet for each account and used the Add On to run some simple reports. Then I gave myself access to those spreadsheets and copied their contents into a Spreadsheet of my own devising.

And I was able to draw a nice graph of sessions per month for each council.
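For flavour, here is roughly what that copy-and-combine step amounts to once the per-council exports exist. The session figures are invented, not the councils’ real numbers:

```python
# Merge per-council monthly session exports into one table: one row per
# month, one column per council. Figures below are invented examples.

east_sussex = {"2016-01": 210_000, "2016-02": 195_000}
north_yorks = {"2016-01": 180_000, "2016-02": 240_000}

def combine(**councils):
    """One (month, {council: sessions}) row per month; missing months read 0."""
    months = sorted(set().union(*[c.keys() for c in councils.values()]))
    return [
        (m, {name: data.get(m, 0) for name, data in councils.items()})
        for m in months
    ]

table = combine(east_sussex=east_sussex, north_yorks=north_yorks)
```

Adding a third council is just one more keyword argument, which is rather the point.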

It would be straightforward to add more councils in the same way if they published an account as these two have. I don’t think that’s the solution though.

Creating a Google Account and publishing the password is a really attractive idea but it does enable people to use Google services completely anonymously. They might well want to do that for nefarious reasons. The risk is pretty low whilst only one or two councils do it but it would grow significantly if this became standard practice.

This is annoying of course but there we are. What could we do instead?

  • We could encourage councils to run a standard set of reports and publish these. Google Sheets would be a neat way of doing this because the integration is simple and the data can be pulled out in many useful formats.
  • We could set up a series of localgovernmentawesomewebstats@gmail.com (that’s not me if it exists) accounts and ask councils to grant us access to their properties. That would save them the trouble of having to schedule reports and we could hand their data back to them nicely.
  • We could ask councils to submit annual returns from their websites in a nice simple form
  • We could do all of those and give councils the choice

Going through this process makes me see why the US Government uses its own tracking script for Federal Departments and Websites. From my point of view that would be the most satisfactory approach but 650,000,000 visits a year probably equates to 1.5bn records a year so that’s not a project we’re going to manage successfully as a spare time endeavour.

But I’d really like to hear from others on this.

Five things corporate communications teams should know about their website.

Cloudy sky seen through the red net of a football goal
Soccer goal by ewiemann used under CC BY 2.0

It’s funny how often I find corporate comms people who are divorced from their organisation’s websites.

Sometimes it is clear that internal turf wars ensure that they are kept at arm’s length. Even when corporate comms teams are welcomed into the digital/webby fold or placed in charge of websites they still seem to miss out some of the fundamentals.

So here are my top five things they should know. As always I refer to Google Analytics because it is incredibly widespread and powerful, but other website analytics tools exist.

1. How do people get to your website?

It’s a simple enough question and one that Google Analytics tries hard to answer. Left to its own devices Google Analytics will underestimate the number of people who come by following links from social media and email (for more on this, try this interactive story).

Crucially, of course, ask not just “how do people get to us?” but also “is that what we expected?”. If you are running a campaign focused on email marketing you would expect more people to visit you via email.
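One practical fix for the undercounting is campaign tagging: links in your emails and posts carry the standard `utm_*` parameters, which Google Analytics understands natively. A sketch (the URL and campaign names are placeholders, not real ones):

```python
# Build a campaign-tagged URL so Analytics can attribute the visit to the
# right source, medium and campaign instead of lumping it into "direct".
from urllib.parse import urlencode

def tag_url(base, source, medium, campaign):
    """Append standard utm_* parameters to a landing-page URL."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return f"{base}?{params}"

link = tag_url("https://example.org/offer", "newsletter", "email", "spring_sale")
```

Every link in the newsletter gets a tagged URL like `link`, and the email traffic stops masquerading as direct visits.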

2. Which campaigns work best?

You are planning and delivering some of the finest, cutting edge multi-channel campaigns ever devised (possibly). But really: are they any good?

Some aspects of your campaigns are easy to capture in Analytics (there’s a nice integration with AdWords for example) but it’s a bit harder to line up that radio ad alongside them. So think a bit laterally: maybe give people a voucher code in this campaign, or make the call to action a specific search on your site. (I ran a radio ad like that once; I was able to establish that precisely one person searched as a result. We didn’t do that again.)

Analytics has powerful reporting tools that allow you to draw a lot of this data together and, if that’s not enough, a beautiful API and more customisation than you can shake a stick at.

3. Are your goals being met?

I’m always surprised at how poorly understood the goals feature in Google Analytics is. It makes sense for organisations with a small number of clear goals (buy our stuff, buy as much as possible). But if the main feature of the site is to provide content (say) organisations can struggle to see how to describe goals usefully.

This has to be a challenge back to the organisation. If you can’t define the goals of your website that’s a problem. If your website has 300 goals all equally important that’s almost as bad.

Right now some things have to be more important to the organisation than others. That’s prioritisation. It’s something corporate comms teams should be on top of anyway.

So now you have some priority goals, you can test them.

4. Which channels are working best for our goals?

So we really want people to sign up for this new service. We’ve done a groundbreaking and massively successful multi-channel campaign. Loads of people signed up. Job done. Have a cigar (not really, smoking’s really bad for you).

Hang on a minute though. Was the Facebook Campaign worth the effort? That blogger outreach campaign got us loads of mentions but did it actually get people to our service?

Luckily Google Analytics has the answer to these questions.

5. What does an effective multi-channel campaign really look like?

Maybe the blogger outreach got people to have a peek at your site but they only signed up when they saw a Facebook ad? That’s important data. If you ditched the blog work for your next campaign you might see a drop in success even if you spend more on Facebook.

If only there were some way of working out the contribution that each channel made to the final goal.

Wait, there is? Wow. That’s great.
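That way is Analytics’ attribution modelling. To show the idea rather than the product, here is the simplest multi-touch option, a linear model, applied to some invented conversion paths:

```python
# Linear attribution: each conversion's credit is split equally across
# every channel on the path that led to it. The paths are invented.
from collections import defaultdict

def linear_attribution(paths):
    """Give each channel on a converting path an equal share of one conversion."""
    credit = defaultdict(float)
    for path in paths:
        share = 1 / len(path)
        for channel in path:
            credit[channel] += share
    return dict(credit)

paths = [
    ["blog", "facebook"],         # blogger outreach first, Facebook closed it
    ["facebook"],
    ["blog", "email", "facebook"],
]
credit = linear_attribution(paths)
```

Under a last-click model the blog would get nothing at all here, which is exactly the trap the paragraph above describes.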

As so often when dealing with metrics and analytics the challenge is not with the technology and the tools.

The challenge is to ask the right questions.


One analytics site to rule them all

All tomorrow’s data

I love website usage data. Can’t get enough of it. I love it so much that last August I asked every council in the country to send me some.

And they did (well nearly all of them did). And I poured it all into a big beautiful spreadsheet and put it on the web. The usage of local government websites in Great Britain.

Which was nice.

Unfortunately my love for website usage data is such that it was not enough. I want to know more. What are the annual trends and the seasonal trends? Do areas with significant tourism industries get more interest in the summer (or the winter)? What areas of websites are getting the most traffic?

More FOIA, more hassle

Now I could just ask for this data. That worked tolerably well last time but it’s a pretty unsatisfactory idea. Each FOIA request generates work in each council and when the data comes in, it creates work for me. And, though I love website usage data, that is work time that might be better spent doing things for paying clients so that I can afford to feed my dog.

And also, you know, it’s the 21st century: machines are supposed to do this sort of thing. Some (surprisingly few) councils already publish their website usage data. Getting more of them to do so would be a start but unless we can get the data marked up against an agreed standard, it is still going to take a human being a distressingly long time to collate.

The Americans will solve it

In the USA there is a project that could provide a nice model. It’s called the Digital Analytics Program.

Participating departments and agencies insert an additional tracking script in their webpages. This sends packets of data back to the project server, and this is then available to anyone who shares my website stats interests.

We could do that here, couldn’t we? It would be easy for councils to implement and should ensure that people like me cease to trouble them with FOIA requests in this area. And it would provide really rich benchmarking and research data. If we included mobile app use tracking, that would provide really useful evidence in the “I want an app” / “no-one will use your app” arguments.

It wouldn’t be entirely free. We’d need some server capacity and some support to maintain the analytics tool. But it would be very low cost.

What’s not to love?

This is not the only way

I know what you’re thinking (no, not that). You’re thinking: couldn’t we just use Google Analytics for this? And the answer is yes, partially.

In principle we could set up a new Google Analytics property for “All council websites” and harvest that data but the combined traffic would significantly exceed the maximum allowed in the Google Analytics free tool.

All but one council already uses an analytics tool so, as an alternative, we could automate the collection of data from their existing tools. Overwhelmingly they use Google Analytics, which has a beautiful API, so that is certainly feasible. Feasible but practically complex: each council would have to manage user credentials, and those would also have to be managed by the central datastore. If a council switched analytics tool that would create an additional (and little used, so easily forgotten) admin load.

Good idea / bad idea

Who’s with me?

What would be the best tool?

Why is this not the best idea you’ll hear about all day?


Things to worry about before you worry about SEO

Very close up of an apparently worried face in black and white

I was at a business networking meeting the other day (because that, ladies and gentlemen, is how I roll). When I explained what I do most people nodded sagely and said “SEO”. Some of them went so far as to say

“yes we do that, targeting keywords, making sure we come up top in search results”

And this made me worry for them. Because, though I’m sure it does them no harm to go chasing search terms on Google, there are, almost certainly, other things they could do that would deliver a return faster.

Here are a few of those things:

1) Does your website work?

By this I don’t mean, is it there (though that’s worth checking). I mean does it fulfil its purpose. For most businesses its purpose will be to sell things, or to generate leads.

Lots of websites, and I mean LOTS, are really bad at this.

It’s really easy to build websites that don’t work. Much harder to build websites that do.

Your website stats package should help you answer this question. My favourite is Google Analytics and it has powerful tools to help you understand what proportion of your visitors actually go on to buy or to contact you.

2) Does your website work on mobile?

When you visit your website you’re probably sitting at your desk at work. You probably showed it to the directors by projecting it onto a massive screen. It probably looks tremendous.

But your customers aren’t visiting it at a desk (they’re really not). They are visiting it on iPhones, cheap and cheerful LG phones and Windows phones (well, probably not the latter). They’re sitting in their cars waiting for the kids to finish Cubs, or in a pub pretending to look up the quiz answers.

It’s not as simple as saying “Oo that looks nice” on your iPad. It has to be incredibly clear and simple to use on mobile. Because if it isn’t, people will go elsewhere, it’s a big Internet.

3) Is it accessible?

Broadly, can people with disabilities use it?

Now I know that for many, many businesses the question of whether your potential blind customers will struggle with your interactions is of very little significance. That’s a moral (not to say legal) problem but that’s not, in fact, my argument.

There are a range of rather geeky things that can really help search engines to understand and classify your site. Proper mark-up (the correct use of HTML) is one of them. Not hiding key information in images and videos is another. All of these are covered by accessibility. So, in fact, by serving people with disabilities you make the site work better for everyone.

4) Is it fast?

How fast?

As fast as it can be.

Consider the sense of satisfaction one gets making a purchase with Amazon. If you have one-click ordering turned on you can go from wanting a thing to the thing being ordered in a very few seconds.

Some of this is down to the fact that Amazon have very big computers driving their site. But mostly it’s because they really focus on getting the job done. It’s in the design, in the way the pages load and in the decisions they make about how to offer you services.

Website speed is actually a fiendishly difficult thing to assess objectively but, again, Google Analytics will make a good stab at it. A page load speed of under a second is what I’d be looking for. But even then don’t get complacent. That customer sitting in their car might have an incredibly flaky and slow connection. Your 0.75-second page might take 15 seconds for them. And the next one. And the next one.

5) Have a retention plan

Most people, even if they are delighted by the speed and simplicity of your site, are not going to buy straight away. They might want to compare prices on your competitor’s site. They might want to talk it over with their partner. They might get distracted by a phone call.

So you need to focus on reaching people who have grazed your site. Encourage them to leave their email for a newsletter (a newsletter that will actually benefit them). Ask them to follow you on Twitter. Run a remarketing campaign so you can target adverts to them.

And reward your existing customers. Give them nice things. Give them exclusive discounts. Make it really easy for them to recommend your products (and services) to their friends and family.

And, inevitably, your favourite web stats package (it’s Google Analytics isn’t it) can give you rich data about what happens on the several visits that people make.

Here comes the SEO

And when you’ve got all that in place, then it’s time to think about targeting search terms in Google.

My company, Likeaword, can help you with all of this, including finding and targeting the right search terms.

(Photo credit: Worried life blues… by Joe Sampouw used under CC BY 2.0)

Navigating around maps on websites. A guide for local authorities (and others)

Map showing gritting routes

I was supposed to be sorting out the garage. In a desperate act of procrastination, I checked Twitter. Sure enough Dan Slee had posed a gnomic question to the world.

I’m not totally sure what the argument was or who was arguing. But there are some issues that are worth exploring. Though this post has local authorities in mind, most of these issues actually apply to anyone using online maps.

1) Why put maps on your website?

Maps can be useful for displaying (and allowing people to report) lots of information. If the information has a spatial component then a map can be a very helpful way of understanding it. So if this is information about a specific location, a map can help people understand where that location is relative to their own location or a third one.

Here’s a map of the routes that are gritted in Herefordshire [disclosure: I was responsible for creating this map in the first place, though I no longer work for the council].

Maps aren’t suitable for all users or in all circumstances. People who can’t see the screen, who have difficulty processing this sort of abstract information or are unfamiliar with using maps need other ways of navigating this sort of data.

2) GIS and the printing of maps

Usually (though not always) when people sit down to plan gritting routes they draw them on an electronic map. Specialist Geographical Information Systems make this sort of thing easy and make it simple to change routes (when a new school is built for example). And, in principle, this information can be shared with other departments (the bin lorries like to know which routes are going to be gritted for example).

In a local authority, inevitably, these routes will be drawn on electronic versions of Ordnance Survey maps. It’s easy to forget that the Ordnance Survey dataset is amongst the best quality mapping that any country has. Under the Public Sector Mapping Agreement local authorities (and town and parish councils) get to use the OS data for their patch under a licence.

The simplest way to get the gritting routes from the GIS software onto the website is to output a screenshot as a PDF (or a JPEG). Easy but not very useful. PDFs are unusable in almost all cases (reading long reports offline on a mobile phone would be one exception). Neither format can be zoomed and so, to cover a county like Herefordshire, a large number of PDFs or JPEGs would have to be shoved out.

3) Bring on the Slippy Map

A much more useful concept is to use the awesome power of the Internet to display the mapped data in an interactive way. You’re probably most familiar with this from Google Maps. If the bit you are interested in lies to the right of the map you’re viewing you just reach with your cursor (or finger these days) and drag the map to the right.

Already this is a much better way to display gritting routes. As good technology should, this now-familiar approach quietly relies on a series of clever and complex interactions.

This is a post aimed at a reasonably general audience so I’ll risk the ire of GIS geeks with a simple description.

In order to get the gritting routes on a slippy map on a council website, several things need to be delivered.

The background mapping has to be available in a dynamic format, so that as you drag the map to the right, a service sends the maps covering the new area. These are just images (called tiles) though they have to be delivered so they can be shown at the right scale and in the right place.

The lines for the gritting routes have to be delivered in a similar way. They form a separate layer and are drawn on top of the background mapping. They also have to be delivered so they can be shown at the right scale and position.

Then you need a tonne of code in your webpage that will go and get the background maps and the routes and display them, and handle the interaction with the user.
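To give a feel for one of those clever interactions: which numbered tile a given point falls on is pure arithmetic. This sketch uses the standard Web Mercator tiling scheme shared by OpenStreetMap, Google Maps and most other slippy maps:

```python
# Standard slippy-map tile maths: at zoom level z the world is a 2^z by 2^z
# grid of tiles, and any lat/lon maps to exactly one (x, y) tile index.
import math

def tile_for(lat, lon, zoom):
    """Return the (x, y) tile index containing a lat/lon at a given zoom."""
    n = 2 ** zoom
    x = int((lon + 180) / 360 * n)
    lat_rad = math.radians(lat)
    y = int((1 - math.asinh(math.tan(lat_rad)) / math.pi) / 2 * n)
    return x, y

# Central London at zoom 10 sits on tile (511, 340)
tile = tile_for(51.5074, -0.1278, 10)
```

As you drag the map, the page simply works out which tile indices have scrolled into view and fetches those images from the tile server.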

4) Give it to Google

One of the very attractive things about the Google Maps service is that it makes it really easy to do all of the things described above. You can draw your gritting routes in its service, grab a simple embed code and, bosh, an interactive slippy map.

Here’s Herefordshire Council using Google Maps to display car park locations [disclosure: I totally failed to stop using Google for this service in my time at the council].

So that’s it then?

Well no. Back in step two we saw that the gritting routes were drawn on top of Ordnance Survey data. The licence the council uses the data under means they can’t give that information (which in this case is incredibly accurate information about where roads are) to Google.

5) So are we stuck?

We most certainly are not. There is a huge range of solutions, open and closed source, for delivering mapping on council sites. And under the PSMA the council is perfectly able to publish maps.

6) Maps are good. Data is better.

Just supposing you want to drive from Hereford to Worcester, and you want to make sure you follow gritted routes. You need to visit two websites: Herefordshire Council for the Herefordshire part of your journey and Worcestershire County Council for the Worcestershire part.

It’s not a brilliant user experience. There is, of course, an alternative. Just supposing you want to drive from West Bromwich to Edgbaston. You could visit the relevant websites, or you could visit this map from Mappa Mercia, which displays all of the gritting routes across the West Midlands conurbation.

Brilliant. Why don’t they include gritting routes further afield? Well they would like to but they encounter the licensing problem. Mappa Mercia is an OpenStreetMap project and OpenStreetMap can’t use data with restrictive licences.

There are many, many reasons why local authorities might want to support OpenStreetMap but they’ll have to wait for another post.

7) Data is rubbish. Open data is a resource.

Imagine a world where the gritting routes the council used were derived in an open way. Perhaps by putting GPS loggers (or as I like to call them “phones”) in the cab of the gritters.

Those gritting routes wouldn’t be restricted by the Ordnance Survey licence. They could be used in Google Maps, OpenStreetMap (or in Bing or ESRI or a whole host of other services). Who knows what use people might make of them.
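To make the idea concrete, here is a minimal sketch of turning points from a GPS logger in a gritter cab into reusable open data. The input format, field names and coordinates are illustrative assumptions, not a real council feed:

```python
# Sketch: convert logged GPS points from a gritting run into a GeoJSON
# LineString that any mapping service could consume.
import json


def points_to_geojson(points):
    """points: list of (lat, lon) tuples logged along a gritting run."""
    return {
        "type": "Feature",
        "properties": {"route": "gritting"},
        "geometry": {
            "type": "LineString",
            # GeoJSON coordinate order is [longitude, latitude]
            "coordinates": [[lon, lat] for lat, lon in points],
        },
    }


# Hypothetical points roughly between Hereford and Worcester.
route = points_to_geojson([(52.056, -2.716), (52.10, -2.60), (52.192, -2.221)])
print(json.dumps(route))
```

Because nothing here derives from OS base mapping, the resulting file could sit on an open data portal under an open licence.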

The local authority would carry on using them against OS data in its back office.

The open and the un-open

This blog post is not complaining about OS licensing restrictions (not least because the OS is, in fact, opening ever more of its data). It’s an issue. It can be worked round. Like many situations where data can be open or non-open, there is an imbalance. The way the local authority chooses to collect (and publish – I haven’t really gone into that) its data has real impacts on the use of the data by third parties.

This post will change nothing

To some of us, these downstream impacts are clear and urgent. But to most people it’s abstruse and abstract. We need to find ways to encourage people across public services (and other sectors too) to understand some of these issues. This blog post is probably not going to achieve that. But it has got me out of tidying up the garage.

(Image credits: Screengrab from Mappa Mercia site (c) OpenStreetMap contributors)


Should you be doing that? Social media, ickiness and privacy


Let me be absolutely clear. I am not a lawyer. I know a little about technology and I have been thinking about these issues for a while. But being an avid reader of Jack of Kent does not substitute for actual qualifications in legal practice.

Ickiness

A couple of years ago at BlueLightCamp, Andrew Fielding pitched a session asking “when does social media use get ‘icky’?”

It’s a good question.

He was really thinking about a public sector comms team, say a police force’s. What are the limits of what they should be getting up to on Twitter? Is it OK to run searches for mentions of your local town? What about going back through the messages posted by someone tweeting about your local town? What about running a search on LinkedIn to find out more about them? What about building a file on them?

(There is no suggestion that Andrew, or indeed anyone in public service comms, is doing these things; this is a thought experiment.)

Clearly (at least hopefully, clearly) there is a point when the normal use of these technologies for engagement and customer service steps over a line.

My initial response was to suggest that organisations should publish a policy on what they will or won’t do on social media. I started something off on PenFlip.

My thinking on how a policy should be framed has evolved a bit since then.

The view that it is necessary or desirable to have a policy covering these areas is not widely shared. I don’t know of any public body that has one, and when I talk to people working in digital comms they are surprised, and sometimes angry, at my position.

My position

My starting point is that citizens have a right to expect the state to respect their privacy within reasonable limits. Chapter 2 of David Anderson’s Report of the Investigatory Powers Review provides a nice primer on privacy generally (how often do you read that sentence?).

In fact the right to privacy is enshrined in the European Convention on Human Rights (Article 8). This Article does allow the state to infringe your right to privacy when it is legal, reasonable and proportionate to do so. This is one of the issues at the heart of the debate around investigatory powers. That debate is (rightly) focused on the powers the state should have to look at things you have chosen to keep private.

What I’m concerned with is the limits the state should have when looking at things you have placed into a public sphere. There is a perfectly coherent argument to the effect that if you have chosen to put information into a public forum, you should accept the consequences. That makes some assumptions about the nature of the public spaces online. Is your Facebook update public like graffiti, or public like a chat down the pub? As a society we would be much more relaxed about the council monitoring messages sprayed on walls than we would about them hanging around in pubs on the off-chance that they will hear something interesting.

I think that, in reality, some online spaces are public like graffiti and others are public like the pub.

I would like to see public bodies thinking through these issues and helping their staff understand what is acceptable and what is ‘icky’.

The three areas of relative ickiness

Generally acceptable (not really icky)

There are a set of actions that should be uncontroversial. It is a good idea for organisations to use social media for customer relations, policy development and to be “networked”. They should respond to messages clearly sent to them (or written on their page). They should seek out statements that are clearly intended for a wide audience: blog posts, comments on the local paper website, Tweets using relevant hashtags. All this helps organisations to understand their online community and should be encouraged.

Need to be authorised and limited (icky)

There are a set of actions that are not part of a public body’s investigatory functions but should be thought through and only undertaken within limited circumstances. To me these become relevant when the organisation becomes more interested in individual people.

Here are a couple of examples (again thought experiments not real world):

The comms team is asked a couple of interesting questions on Twitter from a new account. They wonder if this is a new blogger. Keeping track of who is writing about matters relevant to the authority is part of the comms team’s job, so they visit the account’s profile, but the information there is opaque. They want to do some more investigating: reading back through the Twitter timeline, searching for the name / username on other accounts.

Social workers are working with a family. Dad is not happy following a meeting with social workers. There is concern that he might encourage people to harass the social workers in question. In order to understand the potential threat to their staff a team leader wants to search Facebook and keep an eye on Dad’s profile and maybe the profiles of his friends.

To my mind, neither of the proposed actions is something that public bodies should be doing routinely. Given the specific circumstances, they seem to me to be potentially reasonable and proportionate.

So I would suggest that they should be authorised on a case-by-case basis by someone reasonably senior. We are not in an area where warrants are necessary, but we are in an area where the potential infringement of people’s privacy has to be considered and balanced against the need to (in these cases) protect public employees or provide better public services.

Constitutes investigatory action (beyond icky)

Beyond these actions are a whole set of actions where public employees are undertaking formal investigations for the detection or investigation of crimes. The Chief Surveillance Commissioner thinks there should be a policy covering social media:

“I strongly advise all public authorities empowered to use RIPA to have in place a corporate policy on the use of social media in investigations.”  Annual Report of the Chief Surveillance Commissioner to the Prime Minister and to the Scottish Ministers for 2013-2014 para 5.33 

Personally I think a policy covering the use of social media overall would make the most sense: these things are generally permitted, those things must be authorised, these other things are dealt with under RIPA-like procedures.

Don’t we have better things to worry about?

While we attempt to dissuade the government from granting far-reaching powers to the police and security services to break into computers and messaging systems this may seem like a distraction.

One does not discount the other. We should strike a sensible balance between security, utility and privacy all the time, not just when people remember to whack up the privacy settings.

I am also aware that I could potentially unite the “Human Rights gone mad” brigade with the “JFDI” digital engagement gang.

I am also aware that I’ve been focusing on public bodies here. This is deliberate because, as I understand it, public bodies are directly bound by Article 8: it is a right that protects you from the state. All of the things I have described can be undertaken by anyone, in any country.

Should your district council be able to find out less about you than a Chinese company?

All I can say is: these seem like relevant issues. We have not sorted them out as a society. Talking about them seems like as good a way of approaching them as any.