The London problem (a #ukgc16 post)






Photo of house by river with the house and the plants on the bank reflected in the water

Tech city?

London

London’s great, right?

It is.

Obviously I wouldn’t want to live there but I love visiting.

Which is good because I have to visit it a lot.

Because London is where everything happens. Or, if not everything, a massive proportion of everything.

Take the explosion of interest in and recruitment of digital talent in government.

That’s great too. But it seems to be heavily focused on London. If you want to be part of making it all happen, you’ve got to be in the capital.

And this seems, to me, to be a problem.

A problem

It’s a problem in a couple of ways:

  • I think it will be hard for people based in London to build tech that meets the needs of people living in places very different to London*
  • I think it will suck talent out of the rest of the country

But what do I know?

Govcampers will know

So I pitched a session on this at UKGC16. It was merged with a session Jessica Figueras had pitched about creating sustainable and stable digital teams. This was a very good thing. Without the merge it would probably have been a bunch of us from the sticks talking about how crap London was.

There was agreement that there is a problem in this general area, though different people see different aspects of it.

The things people think are problems

A presenting problem at the moment is recruitment and retention of digital/tech staff. At least with some skills. At least in some areas.

The reasons for this are complex and intertwined but they seem to include:

  • shortage of some key skills
  • competition within the London market (fishing in the same pool as many private sector organisations and, increasingly, other government departments)
  • internal civil service recruitment and development processes may not be attractive to tech workers

There are also system inertia or resistance problems which include:

  • government as a whole is London-focused so it isn’t surprising that tech in government would have the same bias and, in some cases, it may be deliberate: since tech is about transformation, it must be visible to senior decision makers
  • tech in government is about delivering things now. So pragmatic decisions are taken to live with problems (like fishing in the same pool) because the sort of changes that would tackle them would delay delivery (or require several years to pay off)
  • since so many organisations are fishing in the London pool, London is an attractive place to base yourself as a potential employee or contractor so there are, in fact, many skilled staff available in London
  • working practices are (often) configured based on physical co-location. A tour round Aviation House reveals a cornucopia of post-it notes on walls. Attempting to bolt distributed working onto this culture will be hard. In contrast it is possible to deliver real projects via completely distributed teams, assuming that is the way the process is designed. Shifting from one to the other would be non-trivial.

It’s also worth highlighting (and it was highlighted) that some departments already have significant numbers of staff based outside London and DWP and HMRC (for example) are developing “digital hubs” in Leeds, Newcastle and other places.

London goggles

One other aspect I picked up from the discussion was a certain sense of how people who live and work in London frame this sort of issue. This was not explicit in the discussion and, accordingly, might be entirely me projecting my own prejudices but these were some of the implicit assumptions that seemed to run through the discussion:

  • ambitious, talented people will seek the opportunities presented in London, ergo people outside London lack talent and ambition
  • a normal career path is to move to London and work hard when young and then to move to rural areas (though within easy access of London) to raise a family: ergo rural areas serve and are dependent on urban areas: they aren’t economies of their own
  • the value of being in London is so high that no solutions that would involve significant change to London focus are worth considering (you might as well discuss moving to the moon)

We didn’t really explore the question of whether London bias would lead to less appropriate solutions for other parts of the country. Though there was a general sense that this could be a problem.

Solutions

I wasn’t too worried about exploring solutions. I was really keen to see if we could define what (if any) problems there might be. But you can’t stop digital folk trying to fix things.

It was highlighted that there are things in the system trying to address at least some of these problems.

  • Work on recruitment and retention is trying to make a civil service career more attractive to tech-types
  • As I mentioned, digital hubs are being developed outside the capital (and we heard a nice example of where DWP in Leeds is working with other employers to demonstrate to undergrads that there are high quality opportunities if they stay in Yorkshire)
  • Developments like the Digital Services Framework are intended to make it easier for government to buy specialist services from SMEs and, we were told, these suppliers are widely geographically spread
  • Flexible working (hot-desking etc) initiatives are spreading through the civil service

It would be great if we could shift working practices to entirely virtual working. Though this discussion was about digital there is no reason, in principle, why the whole civil service should not operate virtually.

This was UKGovCamp and so we were focused on the public sector but any solution would have to see the whole economy rebalancing to spread skills across the UK. The much vaunted Northern Powerhouse is probably the right sort of idea.

Success metric

I live in Herefordshire (gold star if you have *any* idea where that is). A brand new university is planned to open in the county in 2017. It will focus on STEM subjects. My dream is that someone graduating from there in 2020 could join the civil service and pursue their career to become, in due course, Permanent Secretary while still being based in Herefordshire. Contributing to our economy, being part of our society, making sure the government has a fuller sense of the country it governs.

Is that realistic?

About UKGovCamp

UKGovCamp is an extraordinary thing really. You can find out more about it here: http://www.ukgovcamp.com/ukgc2016/

An edited version of the session Jessica and I pitched appears on the UKGovCamp special natteron podcast. People collaborated on notetaking during the session.

Photo credit

Autumn River Wye Reflections #Hereford #dailyshoot by Les Haines used under CC BY 2.0.

*by way of example, while at Herefordshire Council we were constantly trying to improve our public transport information. We looked at some simple integrations to a suitable looking API: like the “give me the time of the next bus on this service” request. Unfortunately if there was no bus within 24hrs it returned “no bus”. If you don’t understand why this is a problem have a go at moving around the Welsh borders by bus.
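To make the problem concrete, here is a minimal sketch of the kind of defensive handling that integration needed. The endpoint and field names are hypothetical (the real API simply returned “no bus”):

```python
# A minimal sketch with a hypothetical API: treat "no bus" as an answer
# worth explaining, not an error. On the Welsh borders the next bus may
# well be on Thursday, outside the API's 24-hour window.
import requests

def next_bus_message(stop_id):
    # Hypothetical endpoint and response fields; real bus APIs vary widely.
    resp = requests.get(f"https://api.example.com/stops/{stop_id}/next-bus")
    resp.raise_for_status()
    data = resp.json()
    if data.get("status") == "no bus":
        return "No bus in the next 24 hours. See the weekly timetable."
    return f"Next bus: {data['due_time']}"
```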

Things to worry about before you worry about SEO

Very close up of an apparently worried face in black and white

I was at a business networking meeting the other day (because that, ladies and gentlemen, is how I roll). When I explained what I do, most people nodded sagely and said “SEO”. Some of them went so far as to say

“yes we do that, targeting keywords, making sure we come up top in search results”

And this made me worry for them. Because, though I’m sure it does them no harm to go chasing search terms on Google, there are, almost certainly, things they could be doing that would deliver a return faster.

Here are a few of those things:

1) Does your website work?

By this I don’t mean “is it there?” (though that’s worth checking). I mean: does it fulfil its purpose? For most businesses its purpose will be to sell things, or to generate leads.

Lots of websites, and I mean LOTS, are really bad at this.

It’s really easy to build websites that don’t work. Much harder to build websites that do.

Your website stats package should help you answer this question. My favourite is Google Analytics and it has powerful tools to help you understand what proportion of your visitors actually go on to buy or to contact you.

2) Does your website work on mobile?

When you visit your website you’re probably sitting at your desk at work. You probably showed it to the directors by projecting it onto a massive screen. It probably looks tremendous.

But your customers aren’t visiting it at a desk (they’re really not). They are visiting it on iPhones, cheap and cheerful LG phones and Windows phones (well, probably not the latter). They’re sitting in their cars waiting for the kids to finish cubs or in a pub pretending to look up the quiz answers.

It’s not as simple as saying “Oo that looks nice” on your iPad. It has to be incredibly clear and simple to use on mobile. Because if it isn’t, people will go elsewhere, it’s a big Internet.

3) Is it accessible?

Broadly, can people with disabilities use it?

Now I know that for many, many businesses the question of whether your potential blind customers will struggle with your interactions is of very little significance. That’s a moral (not to say legal) problem but that’s not, in fact, my argument.

There are a range of rather geeky things that can really help search engines to understand and classify your site. Proper mark-up (the correct use of HTML) is one of them. Not hiding key information in images and videos is another. All of these are covered by accessibility. So, in fact, by serving people with disabilities you make the site work better for everyone.

4) Is it fast?

How fast?

As fast as it can be.

Consider the sense of satisfaction one gets making a purchase with Amazon. If you have one-click ordering turned on you can go from wanting a thing to the thing being ordered in a very few seconds.

Some of this is down to the fact that Amazon have very big computers driving their site. But mostly it’s because they really focus on getting the job done. It’s in the design, in the way the pages load and in the decisions they make about how to offer you services.

Website speed is actually a fiendishly difficult thing to assess objectively but, again, Google Analytics will make a good stab at it. A page load of under a second is what I’d be looking for. But even then don’t get complacent. That customer sitting in their car might have an incredibly flakey and slow connection. Your 0.75 second page might take 15 seconds for them. And the next one. And the next one.
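If you want a very rough check without opening Analytics, a sketch like this times the HTML response itself. It ignores images, scripts and rendering, so the real experience will always be slower:

```python
# A crude first approximation of page speed: time to fetch the HTML alone.
# Full page load (assets, rendering) will be slower, especially on 3G.
import time
import requests

start = time.time()
resp = requests.get("https://www.example.com/")  # swap in your own page
elapsed = time.time() - start
print(f"HTML fetched in {elapsed:.2f}s ({len(resp.content):,} bytes)")
```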

5) Have a retention plan

Most people, even if they are delighted by the speed and simplicity of your site, are not going to buy straight away. They might want to compare prices on your competitor’s site. They might want to talk it over with their partner. They might get distracted by a phone call.

So you need to focus on reaching people who have grazed your site. Encourage them to leave their email for a newsletter (a newsletter that will actually benefit them). Ask them to follow you on Twitter. Run a remarketing campaign so you can target adverts to them.

And reward your existing customers. Give them nice things. Give them exclusive discounts. Make it really easy for them to recommend your products (and services) to their friends and family.

And, inevitably, your favourite web stats package (it’s Google Analytics isn’t it) can give you rich data about what happens on the several visits that people make.

Here comes the SEO

And when you’ve got all that in place. Then it’s time to think about targeting search terms in Google.

My company, Likeaword, can help you with all of this, including finding and targeting the right search terms.

(Photo credit: Worried life blues… by Joe Sampouw used under CC BY 2.0)

Navigating around maps on websites. A guide for local authorities (and others)

Map showing gritting routes

I was supposed to be sorting out the garage. In a desperate act of procrastination, I checked Twitter. Sure enough, Dan Slee had posed a gnomic question to the world.

I’m not totally sure what the argument was or who was arguing. But there are some issues that are worth exploring. Though this post has local authorities in mind most of these issues actually apply to anyone using online maps.

1) Why put maps on your website?

Maps can be useful for displaying (and allowing people to report) lots of information. If the information has a spatial component then maps can be a very helpful way of understanding that information. So if this is information about a specific location a map can help people understand where that location is relative to their location or a third location.

Here’s a map of the routes that are gritted in Herefordshire [disclosure: I was responsible for creating this map in the first place though I no longer work for the council].

Maps aren’t suitable for all users or in all circumstances. People who can’t see the screen, who have difficulty processing this sort of abstract information or are unfamiliar with using maps need other ways of navigating this sort of data.

2) GIS and the printing of maps

Usually (though not always) when people sit down to plan gritting routes they draw them on an electronic map. Specialist Geographical Information Systems make this sort of thing easy and make it simple to change routes (when a new school is built for example). And, in principle, this information can be shared with other departments (the bin lorries like to know which routes are going to be gritted for example).

In a local authority, inevitably, these routes will be drawn on electronic versions of Ordnance Survey maps. It’s easy to forget that the Ordnance Survey dataset is amongst the best quality mapping that any country has. Under the Public Sector Mapping Agreement local authorities (and town and parish councils) get to use the OS data for their patch under a licence.

The simplest way to get the gritting routes from the GIS software onto the website is to output a screenshot as a PDF (or a JPEG). Easy but not very useful. PDFs are unusable for almost all cases (reading long reports offline on a mobile phone would be one exception). Neither format can be zoomed and so, to cover a county like Herefordshire, a large number of PDFs or JPEGs would have to be shoved out.

3) Bring on the Slippy Map

A much more useful concept is to use the awesome power of the Internet to display the mapped data in an interactive way. You’re probably most familiar with this from Google Maps. If the bit you are interested in lies to the right of the map you’re viewing you just reach with your cursor (or finger these days) and drag the map to the right.

Already this is a much better way to display gritting routes. Like much good technology, this now-familiar approach actually relies on a series of clever and complex interactions.

This is a post aimed at a reasonably general audience so I’ll risk the ire of GIS geeks with a simple description.

In order to get the gritting routes on a slippy map on a council website, several things need to be delivered.

The background mapping has to be available in a dynamic format, so that as you drag the map to the right, a service sends the maps covering the new area. These are just images (called tiles) though they have to be delivered so they can be shown at the right scale and in the right place.

The lines for the gritting routes have to be delivered in a similar way. They form a separate layer and are drawn on top of the background mapping. They also have to be delivered so they can be shown at the right scale and position.

Then you need a tonne of code in your webpage that will go and get the background maps and the routes and display them, and handle the interaction with the user.
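You don’t have to hand-write that tonne of code, though. As a sketch of how little is needed these days: the Python library folium will generate a Leaflet-based slippy map (background tiles, a routes overlay, all the pan and zoom interaction) in a handful of lines. The GeoJSON file name here is an assumed export from the GIS:

```python
# A minimal sketch using folium, which generates a Leaflet slippy map as
# a self-contained HTML page: tiles, a routes overlay, pan/zoom handling.
import folium

m = folium.Map(location=[52.06, -2.72], zoom_start=10)  # roughly Hereford
# Assumed file: gritting routes exported from the GIS as GeoJSON.
folium.GeoJson("gritting_routes.geojson", name="Gritting routes").add_to(m)
folium.LayerControl().add_to(m)
m.save("gritting_map.html")  # link to or embed this from the council site
```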

4) Give it to Google

One of the very attractive things about the Google Maps service is that it makes it really easy to do all of the things described above. You can draw your gritting routes in its service, grab a simple embed code and, bosh, an interactive slippy map.

Here’s Herefordshire Council using Google Maps to display car park locations [disclosure: I totally failed to stop using Google for this service in my time at the council].

So that’s it then?

Well no. Back in step two we saw that the gritting routes were drawn on top of Ordnance Survey data. The licence the council uses the data under means they can’t give that information (which in this case is incredibly accurate information about where roads are) to Google.

5) So are we stuck?

We most certainly are not. There is a huge range of solutions, open and closed source, for delivering mapping on council sites. And under the PSMA the council is perfectly able to publish maps.

6) Maps are good. Data is better.

Just supposing you want to drive from Hereford to Worcester. You want to make sure you follow gritted routes. You need to visit two websites: Herefordshire Council for the Herefordshire part of your journey and Worcestershire County Council for the Worcestershire part.

It’s not a brilliant user experience. There is, of course, an alternative. Just supposing you want to drive from West Bromwich to Edgbaston. You could visit the relevant websites or you could visit this map from Mappa Mercia, which displays all of the gritting routes across the West Midlands conurbation.

Brilliant. Why don’t they include gritting routes further afield? Well they would like to but they encounter the licensing problem. Mappa Mercia is an OpenStreetMap project and OpenStreetMap can’t use data with restrictive licences.

There are many, many reasons why local authorities might want to support OpenStreetMap but they’ll have to wait for another post.

7) Data is rubbish. Open data is a resource.

Imagine a world where the gritting routes the council used were derived in an open way. Perhaps by putting GPS loggers (or as I like to call them “phones”) in the cab of the gritters.

Those gritting routes wouldn’t be restricted by the Ordnance Survey licence. They could be used in Google Maps, OpenStreetMap (or Bing or ESRI or a whole host of other services). Who knows what use people might make of them.

The local authority would carry on using them against OS data in its back office.

The open and the un-open

This blog post is not complaining about OS licensing restrictions (not least because the OS is, in fact, opening ever more of its data). It’s an issue. It can be worked round. Like many situations where data can be open or non-open there is an imbalance. The way the local authority chooses to collect (and publish – I haven’t really gone into that) its data has real impacts on the use of the data by third parties.

This post will change nothing

To some of us, these downstream impacts are clear and urgent. But to most people it’s abstruse and abstract. We need to find ways to encourage people across public services (and other sectors too) to understand some of these issues. This blog post is probably not going to achieve that. But it has got me out of tidying up the garage.

(Image credits: Screengrab from Mappa Mercia site (c) OpenStreetMap contributors)


Should you be doing that? Social media, ickiness and privacy


Let me be absolutely clear. I am not a lawyer. I know a little about technology and I have been thinking about these issues for a while. But being an avid reader of Jack of Kent does not substitute for actual qualifications in legal practice.

Ickiness

A couple of years ago at BlueLightCamp Andrew Fielding pitched a session asking “when does social media use get ‘icky’?”

It’s a good question.

He was really thinking about a public sector comms team, say a police force. What are the limits of what they should be getting up to on Twitter? Is it OK to run searches for mentions of your local town? What about going back through the messages posted by someone tweeting about your local town? What about running a search on LinkedIn to find out more about them?  What about building a file on them?

(There is no suggestion that Andrew or indeed anyone in public service comms is doing these things, this is a thought experiment).

Clearly (at least hopefully, clearly) there is a point when the normal use of these technologies for engagement and customer service steps over a line.

My initial response was to suggest that organisations should publish a policy on what they will or won’t do on social media. I started something off on PenFlip.

My thinking on how a policy should be framed has evolved a bit since then.

The view that it is necessary or desirable to have a policy covering these areas is not widely shared. I don’t know of any public body that has a policy covering these issues and when I talk to people working in digital comms they are surprised and sometimes angry at my position.

My position

My starting point is that citizens have a right to expect the state to respect their privacy within reasonable limits. Chapter 2 of David Anderson’s Report of the Investigatory Powers Review provides a nice primer on privacy generally (how often do you read that sentence?)

In fact the right to privacy is enshrined in the European Convention on Human Rights (Article 8). This Article does allow the state to infringe your right to privacy when it is legal, reasonable and proportionate to do so. This is one of the issues at the heart of the debate around investigatory powers. That debate is (rightly) focused on the powers the state should have to look at things you have chosen to keep private.

What I’m concerned with is the limits on the state’s freedom to look at things you have placed into a public sphere. There is a perfectly coherent argument to the effect that if you have chosen to put information into a public forum, you should accept the consequences. That makes some assumptions about the nature of the public spaces online. Is your Facebook update public like graffiti or public like a chat down the pub? As a society we would be much more relaxed about the council monitoring messages sprayed on walls than we would about them hanging around in pubs on the off-chance that they will hear something interesting.

I think that, in reality, some online spaces are public like graffiti and others are public like the pub.

I would like to see public bodies thinking through these issues and helping their staff understand what is acceptable and what is ‘icky’.

The three areas of relative ickiness

Generally acceptable (not really icky)

There are a set of actions that should be uncontroversial. It is a good idea for organisations to use social media for customer relations, policy development and to be “networked”. They should respond to messages clearly sent to them (or written on their page). They should seek out statements that are clearly intended for a wide audience: blog posts, comments on the local paper website, Tweets using relevant hashtags. All this helps organisations to understand their online community and should be encouraged.

Need to be authorised and limited (icky)

There are a set of actions that are not part of a public body’s investigatory functions but should be thought through and only undertaken within limited circumstances. To me these become relevant when the organisation becomes more interested in individual people.

Here are a couple of examples (again thought experiments not real world):

The comms team is asked a couple of interesting questions on Twitter from a new account. They wonder if this is a new blogger. Keeping track of who is writing about matters relevant to the authority is part of the job of the comms team. So they visit their profile but the information there is opaque. They want to do some more investigating, reading back through the Twitter timeline, searching for the name / user name on other accounts.

Social workers are working with a family. Dad is not happy following a meeting with social workers. There is concern that he might encourage people to harass the social workers in question. In order to understand the potential threat to their staff a team leader wants to search Facebook and keep an eye on Dad’s profile and maybe the profiles of his friends.

To my mind neither of the proposed actions is something that public bodies should be doing routinely. Given the specific circumstances they seem to me to be potentially reasonable and proportionate.

So I would suggest that they should be authorised on a case by case basis by someone reasonably senior. We are not in an area where warrants are necessary but we are in an area where the potential infringement of people’s privacy has to be considered and balanced with the need to (in these cases) protect public employees or provide better public services.

Constitutes investigatory action (beyond icky)

Beyond these actions are a whole set of actions where public employees are undertaking formal investigations for the detection or investigation of crimes. The Chief Surveillance Commissioner thinks there should be a policy covering social media:

“I strongly advise all public authorities empowered to use RIPA to have in place a corporate policy on the use of social media in investigations.”  Annual Report of the Chief Surveillance Commissioner to the Prime Minister and to the Scottish Ministers for 2013-2014 para 5.33 

Personally I think a policy covering the use of social media overall would make the most sense: these things are generally permitted, those things must be authorised, these other things are dealt with under RIPA-like procedures.

Don’t we have better things to worry about?

While we attempt to dissuade the government from granting far-reaching powers to the police and security services to break into computers and messaging systems this may seem like a distraction.

One does not discount the other. We should strike a sensible balance between security, utility and privacy all the time, not just when people remember to whack up the privacy settings.

I am also aware that I could potentially unite the “Human Rights gone mad” brigade with the “JFDI” digital engagement gang.

I am also aware that I’ve been focusing on public bodies here. This is deliberate because, as I understand it, public bodies are directly bound by Article 8: it is a right that protects you from the state. All of the things I have described can be undertaken by anyone, in any country.

Should your district council be able to find out less about you than a Chinese company?

All I can say is: these seem like relevant issues. We have not sorted them out as a society. Talking about them seems like as good a way of approaching them as any.

How to reduce the FOI burden on local government

Secret

(photo is: Secret Comedy Podcast 06 – 2 August 2013 by Amnesty International UK used under CC BY 2.0)


As reported by the Press Gazette, the LGA has provided evidence to the Freedom of Information Commission [PDF].

The LGA is disappointingly negative about FOIA, seeing it as a cost to authorities rather than a boon to their communities.

It prompted me to get round to a blog post I’ve been meaning to write for a while, reflecting on my experience of making an FOIA request to every council in the country.

Asking for data

I wanted to know some things about the usage of council websites. I wrote a report about websites based on this data.

It is reasonably trivial to run a mailmerge and issue an email to every council and I was conscious that this would generate work in each authority so I tried to pick a small number of datapoints that it would be easy to obtain.

I’d never issued an FOIA request before (though I’ve answered plenty) and I did feel a bit of a sense of guilt/fear when I pressed go.

This was assuaged somewhat by receiving a response within an hour (well done Cardiff) and exacerbated by receiving a call from a web manager apparently wanting to know what lay behind my request. I was a bit taken aback but when other people contacted me (in a less defensive way) I wrote a blog post explaining what I was up to.

Not that bad in the end

And I have to say the vast majority of councils responded promptly and in an extremely helpful manner. A small minority had a much more defensive attitude and some councils attached very restrictive licences to the data (despite the obligation, in England and Wales, to provide datasets under open licences).

A very small number (sadly I didn’t keep an accurate count of this) of councils responded to say “We already publish this”. Now to be honest, it was less convenient for me when they did this because it usually involved me in more work. But I was still delighted because it’s clearly such a sensible thing to do.

Gold star goes to East Sussex who just give you the login details of a read-only account for their Google Analytics.

It would be easy to publish this

It is technically trivial to publish data from Google Analytics (the tool used by the majority of councils). Website data is not secret, not personal and its publication is of benefit to the sector and potentially to the wider community.
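To illustrate just how trivial: here is a minimal sketch, assuming a service account with read-only access to the relevant view, that pulls headline figures from the (Universal Analytics era) Reporting API and writes them to a CSV anyone could publish. The key file and view ID are placeholders:

```python
# A minimal sketch, assuming a service-account key with read-only access
# to the GA view. Pulls a month of headline usage and writes an open CSV.
# (Universal Analytics Reporting API v4; the view ID here is a placeholder.)
import csv
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "analytics-key.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=creds)

report = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": "12345678",
        "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
        "metrics": [{"expression": "ga:sessions"},
                    {"expression": "ga:users"}],
    }]
}).execute()

totals = report["reports"][0]["data"]["totals"][0]["values"]
with open("website-usage.csv", "w", newline="") as f:
    csv.writer(f).writerows([["sessions", "users"], totals])
```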

And if it had been published the cost to the public sector of my report would have been marginal to nothing.

In fact the only reason not to publish this data is a cultural inclination not to tell people stuff.

The way to reduce the FOIA “burden” on local government is to answer people’s questions before they ask them.

And if local government routinely published its non-personal data then it would have a stronger argument when raising concerns about the cost of FOIA compliance.

What the evidence tells us

In fact the LGA evidence to the FOIA Commission reveals a sector stuck in a suspicious, closed, and secretive culture.


Which suggests, of course, we need the FOIA even more than before.

Oh. And what chance do closed, suspicious, secretive organisations have of being effective in the digital age?

(Don’t answer that)

My initial reactions to Techfugees


Image lifted from @asalvaire’s Twitter stream. I hope they don’t mind.

I spent 2 December 2015 at the Techfugees conference in London, UK. I was wearing my Standby Task Force hat (mostly).

These are my instant reactions on the train back home.

1. Wow.

There is a lot going on. Amazing energy, talent and thought going into all sorts of innovative solutions. It was an amazing, invigorating, mind-numbing day.

2. This is a complex situation.

Now I know that’s a statement of the blindingly obvious but we have very fluid flows of refugees from a range of different countries entering Europe by very changeable routes and then making their way around in countries they know little about before claiming asylum in potentially other countries. Each country has its own state and civil society structures, cultural attitudes and legal complexities. As well as languages. And then there’s the politics. And this is just to get people to the stage of claiming asylum. If they are accepted as refugees they face, potentially, years of challenges such as dealing with trauma, learning a new language, understanding a new culture, integrating into their new communities on top of the usual stuff people want to do: falling in love, raising families, earning money, having a laugh.

The app that fixes that is going to be very impressive.

3. There is an urgent need to make it easy for people (and let’s just start with refugee agencies) to be able to work out what support and help is available to whom and where. This is not really a technical issue: it just needs a bunch of people to focus on gathering and presenting the data (well, not just that, but that is a necessary task).

4. Facebook and WhatsApp are already being used for an amazing amount of coordination by refugees (and their friends and family) themselves. Let’s not reinvent the wheel. Let’s add gears and steering.

5. In the UK let’s not lose sight of the fact that one fundamental underlying problem is the shortage of affordable housing. Let’s build more houses.

6. From a Standby Task Force perspective Google’s Crisis Info Hub looks utterly awesome. It goes on the long list of things I want to play with (but very near the top). https://github.com/google/crisis-info-hub

7. We frame refugees as a problem (it’s a crisis haven’t you heard?). And by definition refugees are people fleeing persecution. We heard from the excellent Hassan today that he would never have left Syria if it was safe. But refugees are also an opportunity, people with skills, ideas, energy. Imagine if we could see them as an opportunity for our communities, and our economies.

Imagine.

8. I really like spending time with geeks and looking at these problems as service design issues. I think that’s a useful way to think about things.

But the politics matters too.

9. There are many potential users of the potential and actual projects that are spinning out here. We need to make sure we stay close to the users. Which is easier said than done when so many of them are constantly moving.

10. People are amazing.

That is all.

Does bounce rate matter?

Invasion of the SpaceHoppers

space hopper invasion by Paul Stevenson used under CC BY 2.0

An interest in bounce rate can severely damage your health.

It’s true.

I recently started to muse about what a “normal” bounce rate might be for local government websites and this led me to spend a surprising amount of time gathering data from (almost) every council in the country.

Now I’m safely out of the other side of that I can muse a bit more about what it all tells us.

Nerdy explanation of bounce rate

In principle this is very straightforward. If I visit a page on your website and then don’t visit any other pages for a while then I “bounced” on that page. If half of the visits to your website were “bounces” then the bounce rate is 50%.

In reality it is slightly more complex than that. If you use Google Analytics to record this data (and 86% of local authorities do) then a bounce occurs if a user triggers only one event. A pageview is an event but Google Analytics can be configured to measure other events: such as playing a video. So if you set up Analytics to measure plays of videos, your bounce rate will go down even if people don’t visit any other pages.

And if people visit a second page but Google Analytics is not set up on the second page (which happens reasonably often in local government when people are passed to back office services (though only if the back office services are poorly configured or a bit pants (which is not-uncommon))) then Google Analytics will record a bounce even though the user is definitely not bouncing.

And finally there is the cross-domain issue, which arises as people move between domains on your web estate (strictly speaking GA properties, but it’s easier to think about them as domains). Imagine you have a main site www.marchford.gov.uk and then your comms team think it’s a jolly wheeze to set up a microsite to promote the council’s vision for the area: visionforthearea.marchford.gov.uk. When someone visits a news page on the main site and follows a link to the second site Google Analytics will record the first visit as a bounce. This can be fixed using cross-domain tracking though there are some situations when you might not find that helpful.

Time is also a factor in bounce rate. If I visit a page now and then come back in 15 minutes and visit a second page should we record that as one visit involving two pages (and so no bounces) or two visits each involving one page (two bounces)? Google Analytics by default only starts a new visit after a gap of at least 30 minutes and so would plump for the one visit option. If the second page is visited 31 minutes after the first one that would make it two visits.
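A toy sketch of those two rules working together (a gap of more than 30 minutes starts a new visit; a visit with exactly one event is a bounce):

```python
# Toy illustration: sessionise one user's event times with a 30-minute
# timeout, then count one-event sessions as bounces.
from datetime import datetime, timedelta

SESSION_GAP = timedelta(minutes=30)

def bounce_rate(event_times):
    sessions = []
    for t in sorted(event_times):
        if sessions and t - sessions[-1][-1] <= SESSION_GAP:
            sessions[-1].append(t)  # within 30 minutes: same visit
        else:
            sessions.append([t])    # otherwise: a new visit starts
    bounces = sum(1 for s in sessions if len(s) == 1)
    return 100 * bounces / len(sessions)

# Two pages viewed 31 minutes apart: two visits, both bounces.
print(bounce_rate([datetime(2016, 1, 4, 9, 0),
                   datetime(2016, 1, 4, 9, 31)]))  # 100.0
```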

What bounce rate is best?

It’s definitely not possible to say “local government websites should have a bounce rate of X%”. The correct bounce rate, ultimately, is the one that shows the website is being used as it was designed.

Many public sector sites are designed to work well with search engines (in the UK, let’s face it that’s basically Google) and to resolve the user’s question immediately. So if I type “Christmas holidays Barnet” into my favourite search bar I get a link to the Barnet School Term and holiday dates page. This gives me the answer I was looking for and so I go about my business. I just bounced on Barnet’s site (they can thank me later).

So if we’ve designed the site to work this way and it is working as designed, we would expect high bounce rates. If we believe that people are more likely to visit the front page of the website and then navigate to the answer and the site is working as designed then we would expect lower bounce rates.

Of course the local government digital estate is about more than delivering small packets of information. It’s about (or should be about) much more complex interactions, looking for support with housing, working out what support at home would be best for mum, understanding what options my visually impaired daughter has to get the best out of her further education.

These interactions are likely to be very different, longer, more interactive, perhaps paused halfway through for a bit of consideration and discussion. So if you’ve designed the site for these sorts of interactions and the site is working as designed bounce rate would be lower. The degree to which this affects the overall bounce rate will depend on the mix of the straightforward and the complex interactions. And the way you expect users to interact with your site.

So is it useful to monitor bounce rate for your site?

Yes and no.

Some eloquent explanations of why it is not useful to look at bounce rate were posted as comments on my blog a couple of months ago.

I don’t disagree with the points made there but I do have a different take on the situation.

Consider the rev-counter on your car. It displays a single figure indicating the number of revolutions per minute your engine is making. When you depress the throttle the rate goes up and the indicator shows this change.

But the situation under the hood is much more complex: valves are opening and closing, fuel is being sprayed, exhaust gases are leaving the cylinders.

RPM is a result of all of this activity, it’s not a measure of all of it but it is affected by changes in the whole system.

I think bounce rate is a bit like this. Overall bounce rate is a result of the design decisions you make for your site and how users actually use it. A web manager should have a good idea of what bounce rate they expect across their site(s). So if the rate they actually see is radically different that means something odd is going on. And if the rate changes suddenly that’s another indicator something needs investigation (or conversely if the rate doesn’t change when they expected it to).

What bounce rate should I expect?

The simple answer, without knowing anything much about your site, is between 40% and 51% because 50% of all local government sites are within that range.

Which doesn’t mean that if you have a bounce rate of 60% you are doing something wrong. It means that your site has an unusual bounce rate. If that’s what you expect then that’s great news (isn’t it?).
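For the statistically minded, that 40% to 51% band is just the interquartile range: by definition half of the observations sit between the first and third quartiles. A sketch with made-up numbers (not the real survey data):

```python
# Illustrative only: made-up bounce rates, not the real survey data.
# Half of all sites sit between the first and third quartiles by definition.
import statistics

bounce_rates = [38.2, 40.5, 42.0, 44.1, 45.9, 47.3, 48.8, 50.6, 53.0, 61.4]
q1, median, q3 = statistics.quantiles(bounce_rates, n=4)
print(f"Middle half of sites: {q1:.1f}% to {q3:.1f}%")
```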

On the whole English district councils and London Boroughs tend to be a bit less bouncy than other councils. The explanation for this could be as simple as not having public transport responsibilities which we might expect to drive lots of “when’s the next bus” type traffic.

People visiting council sites from internal addresses tend to be much less bouncy than visitors overall. So if you have a site targeted at school professionals, for example, you would expect much lower bounces.

And the design decisions you take will also play a key role. Are you handing people to third-party sites for example?

Cross domain tracking rocks

One thing that struck me when I was asking councils for data on their website visits was the number of councils tracking each domain (or microsite) separately. If you treat your digital newsroom as a distinct analytics property from your main site, both will look artificially bouncy. If you use cross-domain tracking you’ll get a much more realistic sense of what is happening to traffic between those sites.

A tool, not a grade

To me using website analytics data is all about understanding what your site is designed for, how it is being used and then tweaking and improving the design to meet the needs of your users. A single figure (like bounce rate) can never give you the full detail of what’s going on under the hood but it can tell you whether things seem to be working as they should or need closer investigation.

Investigating crime data at small geographies

A bit of background (skip if you’re familiar with h-map)

I’ve been doing some work with Herefordshire charity the Bulmer Foundation (and a lot of other organisations) to create a central resource of data about how sustainable Herefordshire is becoming. The Foundation has been working on this for some time and, following a lot of consultation, they have identified the key indicators of sustainability for the county.

If these things are getting better, Herefordshire is developing more sustainably. If these things are getting worse Herefordshire is developing less sustainably.

Identifying the indicators is important but it’s not the whole task. Next we have to identify the data that tells us if things are improving or not for each indicator. Then we have to display it in a useful way.

The Foundation has commissioned some development work which is still in progress but play-about-with-able at h-map.org.uk.

Understanding how things are changing at the whole county level might be interesting but for most practical purposes it may not be useful. We need to get into the data at much smaller geographic levels. As a way of investigating how this might work I decided to investigate crime data which is a nice, rich dataset.

Processing

This post is really a set of notes about how I processed the data. Partly for my records but also to encourage les autres and to enable people to suggest ways this could be done better.

I’ll write about the implications for the h-map project on the project blog in due course.

The police.uk website publishes crime data in a number of ways. There is an API but I haven’t explored that. Humans can see some data on maps and they can elect to download some CSVs.

I asked for everything they had on West Mercia (the police force serving Herefordshire, Worcestershire, Shropshire and Telford). I received a zip file containing a bunch of CSVs. One CSV per month for all of the months starting January 2011 and ending December 2014. Each CSV contains a row for each crime recorded across the force area. Against each row is a type of crime, lat, long, outcome and sundry other data including the LSOA.

That’s great but much more than I need. All I ultimately want to know is the number of crimes recorded per year in each LSOA. If I had better coding/scripting skills I could probably have got a few lines of Python working on the task for me. But I don’t.
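For anyone who does have those skills, here is a sketch of what those few lines might look like in pandas (assuming the police.uk column names “Month”, in YYYY-MM form, and “LSOA name”, and the monthly CSVs unzipped into one folder):

```python
# A sketch in pandas: read every monthly CSV, keep Herefordshire LSOAs,
# and count crimes per LSOA per year in one pass.
# Assumes police.uk column names "Month" (YYYY-MM) and "LSOA name".
import glob
import pandas as pd

frames = [pd.read_csv(path) for path in glob.glob("west-mercia/*.csv")]
crimes = pd.concat(frames, ignore_index=True)
crimes = crimes[crimes["LSOA name"].str.contains("Herefordshire", na=False)]
crimes["year"] = crimes["Month"].str[:4]

totals = crimes.groupby(["LSOA name", "year"]).size().unstack(fill_value=0)
totals.to_csv("herefordshire-crimes-per-lsoa-per-year.csv")
```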

Also it was an excuse to play with Datagraft.net. This new cloud thingy has sprung from the Dapaas project involving the ODI and other cool linked/open data folk. It allows you to create transformations in a reasonably point-and-click manner. There is a pipeline metaphor so transformations proceed sequentially. There is a bit of a learning curve and the documentation is early stage but I got the hang of it. It allows you to preview the effect of your transformation on real data which enables effective pressing of buttons until the right thing happens.

So, after a bit, I managed to build a little transformation that pulls out rows where the name of the LSOA contains “Herefordshire” and then creates a two-column table from this with LSOA and month as the respective columns.

I still have one CSV for each month. It might be that Datagraft will append rows onto an existing dataset but I couldn’t work out how to do this.

So I had a happy couple of hours uploading each csv and downloading the processed file.

What I was aiming for was number of crimes per year per LSOA. So I had to get my monthly files into 1 big yearly file. Which I did manually with the assistance of the excellent TextWrangler application. It really made this tedious manual task a breeze.

Then a simple pivot table for each year gives me the totals I was looking for.

There was a little bit of spreadsheeting to decide if crime was improving, worsening or not changing.
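One way that little bit of spreadsheeting might look in code, with a tolerance band so small wobbles in low-crime areas don’t flip the colour (the 5% threshold is my own arbitrary choice):

```python
# Classify the year-on-year change for an LSOA as red/amber/green.
# The 5% tolerance is an arbitrary illustrative threshold.
def trend(previous, current, tolerance=0.05):
    if previous == 0:
        return "worsening" if current > 0 else "no change"
    change = (current - previous) / previous
    if change <= -tolerance:
        return "improving"   # fewer crimes: green
    if change >= tolerance:
        return "worsening"   # more crimes: red
    return "no change"       # amber

print(trend(120, 100))  # improving
```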

And finally the application of Google Fusion Tables to link the LSOA codes in my dataset to polygons (described as KML) and I have a lovely map painted red, amber and green.

Datagraft enables me to save the transformation so when all of 2015’s data becomes available I’ll be able to repeat the process. It also enables me to publish my dataset and to RDF it.

Maybe next week for that.

If you have any suggestions for ways I could cut out steps, or improve my data wrangling I would love to hear them.

big numbers on council websites: a guide for comms folk

(I forgot to add this at the time, I’ve put it here for completeness, apologies if it is clogging up your RSS feed)


I wrote a piece on Comms2Point0 about some of my findings about local government websites, asking whether 10 visits per person per year is, in fact, a lot.

Read big numbers on council websites: a guide for comms folk now.