I’ve had a lot of conversations with folks from within the Scottish public sector over the last year, and a theme emerges. A wistful look comes across a civil servant’s eyes as they imagine a world where Scotland can be a disruptive innovator on the global stage. Then the wistful look disappears. Yet the fact is that we have awesome things going on: look at OneTeamGov, Service Design, or my personal favourite, the Local Government Benchmarking Framework (LGBF).
There are good examples around the world of dashboards and open data portals that aim to connect the data that runs places and governments with the public at large:
- London Dashboard – http://citydashboard.org/london/
- New York Dashboard – https://opendata.cityofnewyork.us/dashboard/
- Hong Kong Open Data – https://data.gov.hk/en-datasets
Local Government Benchmarking Framework
Unlike those examples, however, the Scottish Local Government Benchmarking Framework is a system set up to measure performance consistently. All of this data is available for the public to engage with: argue with it, make suggestions about it, or, even better, complain about it. You will help move the civil service towards better efficiency and cost-effectiveness. The frustrating part is that the LGBF seems to have just missed greatness before it could become world-leading.
The Improvement Service runs the framework. In their own words: “The LGBF is a high-level benchmarking tool designed to support senior management teams and elected members to ask questions about key council services.
The framework provides high-level ‘can openers’ which are designed to focus questions on why variations in cost and performance are occurring between similar councils. They do not supply the answers. That happens as councils engage with each other to ‘drill down’ and explore why these variations are happening.”
Nobody knows why, but the Improvement Service really likes the can-opener analogy for the LGBF; I’d just go with it. Excitingly, they have data going back to 2010, and they repeatedly mention that it is a statutory requirement for all 32 local authorities to submit data.
As it currently stands, the system appears to have four key challenges blocking it from greatness. Working through each in turn, let’s describe the challenge and suggest ways it might be addressed for all our benefit:
Lack of a Clear Mandate
This is exactly as it sounds. The Improvement Service mentions in several places that this data collection is a statutory requirement in Scotland, yet I can’t seem to get my hands on any specific legislation. My searches currently point to the Local Government Act 1992 and the Local Government in Scotland Act 2003 as the likely sources of the statutory requirement they refer to.
I’m particularly interested in whether the legislation sets the metrics to collect, or whether those metrics are dynamic. My gut instinct is that the legislation is vaguer, like the Acts mentioned above, and that three bodies (SOLACE Scotland, COSLA, and the Improvement Service) manage the metrics and disseminate the requirements for local authorities to submit data annually.
I’m chasing the Improvement Service for clarification on this challenge, and will share finer details on how the process operates.
How is Data Collected?
This follows on from the first challenge. As yet, I cannot find definitive legislation directing the collection of specific data for benchmarking. If my assumption above is right and the legislation is vague, then there is likely a guideline document distributed to local authorities to follow in order to comply with submitting data. It is this document we’ll want to get our hands on and scrutinize. But, as I say, that is hypothetical until we have more information on the process.
Understanding the requirements behind how local authorities submit data for these indicators, and which sources are acceptable, is key to judging the veracity of the data. Right now, a lot of the data within the LGBF reads like a Third Year’s research report on Siberian tigers: they spout facts about diet and habitat, and we don’t doubt those are true, but without sources we can’t give that student a good mark on their paper.
Moving forward, we need to understand exactly what data is being collected. We have to be specific here. Where is it coming from, specifically (even if that is different systems for different local authorities)? And what processes have been applied to it (presumably plenty of cobbling together data from various sources, lining it up, making connections, or even just anonymization and aggregation)?
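To make that concrete, here is a minimal sketch in Python of the kind of “lining up” and aggregation that presumably happens before figures reach the framework. The council names, field names, and figures are all invented for illustration; nothing here reflects the real LGBF submission feeds.

```python
from collections import defaultdict

# Hypothetical raw submissions: each council's source system uses
# its own field names (an assumption, not the real LGBF schemas).
council_a_rows = [
    {"authority": "Council A", "indicator": "cost_per_pupil", "value_gbp": 5200},
    {"authority": "Council A", "indicator": "cost_per_pupil", "value_gbp": 5400},
]
council_b_rows = [
    {"la_name": "Council B", "metric": "cost_per_pupil", "spend": 4900},
]

def normalise(row):
    """Map the differing source schemas onto one common record shape."""
    if "authority" in row:
        return {"council": row["authority"], "indicator": row["indicator"],
                "value": row["value_gbp"]}
    return {"council": row["la_name"], "indicator": row["metric"],
            "value": row["spend"]}

def aggregate(rows):
    """Average each indicator per council: one plausible 'process applied'."""
    sums, counts = defaultdict(float), defaultdict(int)
    for r in rows:
        key = (r["council"], r["indicator"])
        sums[key] += r["value"]
        counts[key] += 1
    return {key: sums[key] / counts[key] for key in sums}

records = [normalise(r) for r in council_a_rows + council_b_rows]
benchmarks = aggregate(records)
print(benchmarks)
```

Even in this toy version, every choice (which fields map to what, how values are averaged) changes the published number, which is exactly why the processing steps deserve scrutiny.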
Lack of Coverage
Across the 7 topics there are 74 indicators, with the largest share (27 indicators) in Children’s Services. Reading through the indicators, we see a few things you’d expect: they measure the basic cost of education, and they look at attainment, perceptions, and participation. Education plays a large role in a child’s life, and in terms of cost and effort it forms one of the most significant services a local authority delivers, so it makes sense to see a disproportionate number of indicators attempting to understand it.
The challenge lies in what appear to be metrics picked for ease of data collection rather than ones that paint a specific picture of Children’s Services. I’m not advocating for thousands of education metrics; if 27 is the number, then that is brilliant. However, measuring education without looking at educational staff in any real way seems misguided. Do we not value teachers enough to measure their impact beyond how many sick days they take? Do school facilities not rank high enough to get measured? Teacher/pupil ratios? Unfilled teaching posts? How about taking some pages from the playbook the Organisation for Economic Co-operation and Development (OECD) has built over 60 years of measuring educational performance across the world?
I’m ready for the rebuke that teacher sickness absence days are the best measurement of performance, but my gut tells me this is off. A fundamental aspect of this challenge is that there is not enough information about where, specifically, the data comes from, and nothing detailing how it is processed.
Time is a Factor
Currently, metrics are collated only annually. When our institutions and environments are transforming every day under an unprecedented reduction in budgets, we should measure more often. As the system currently operates, though, that would put undue strain on local authority resources to gather and prepare data.
What would it take to digitally connect to source systems and automate this process? We could remove that resource requirement while creating an up-to-date source of performance data.
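As a sketch of what such automation might look like, here is a hypothetical Python example in which each council’s source system is represented by a stubbed “connector”. In reality these would be calls to finance or education systems whose interfaces we don’t yet know; the councils and figures below are invented.

```python
from statistics import mean

# Hypothetical per-council connectors: in a real pipeline each would
# query a council's source system, but here they return sample data.
def fetch_council_a():
    return [5200, 5400]  # e.g. periodic cost-per-pupil figures

def fetch_council_b():
    return [4900, 5100]

CONNECTORS = {"Council A": fetch_council_a, "Council B": fetch_council_b}

def refresh_benchmarks():
    """Pull the latest figures from every connected source system and
    recompute the benchmark. A job like this could run monthly or even
    daily, instead of waiting for an annual manual submission."""
    return {council: mean(fetch()) for council, fetch in CONNECTORS.items()}

print(refresh_benchmarks())
```

The point of the sketch is the shape of the solution: once the connectors exist, refreshing the benchmark costs no officer time at all.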
The LGBF is tantalizingly close to awesome. It gives not just our local government but our citizens access to the basic building blocks of how our government performs. We can kick-start citizen participation, and a dialogue about why we do what we do, what it costs, and how we can improve. We need to stop letting politics get in the way of getting better at delivering government. Only citizens can move the needle on this initiative. Let’s not take no for an answer, and get some real data we can use to help improve performance.
Check it out for yourself: http://www.improvementservice.org.uk/benchmarking/