UK government's poor data practices unpicked in NAO report
HMRC, the ONS and the Department for Business, Energy and Industrial Strategy were all found to be misrecording data
The UK government's data practices across departments have come under scrutiny in a National Audit Office report published today, which details the data failures committed by various departments and draws on examples such as the Windrush scandal.
The report's findings were alarming and showed a consistent disregard for common data collection and storage practices.
The report focused on three core issues surrounding the government's data practices, the first of which was data quality. While the report acknowledged that "perfect data is both impossible to achieve and is very costly", it said data should at least be "good enough", a standard the government's data currently fails to meet.
The Windrush scandal was cited as an example of what can happen as a result of poor data quality. "The Home Office undertook elements of data-sharing activities without fully assessing the impact of the quality of the underlying data," the report read.
It's not the first time that Windrush has drawn scrutiny of the Home Office's data handling. Back in March, a separate report from the House of Commons' Public Accounts Committee revealed that MPs were "seriously concerned" by the Home Office's repeated disregard for proper data handling.
Poor data quality can also leave government departments unable to assess their own effectiveness. The Department for Education wasn't able to properly assess academies and maintained schools because the only data it collected on them related to annual finances, according to the NAO report. The Ministry of Justice, likewise, made ill-informed decisions about the delivery of probation services based on inadequate data on previous probation services' methods and costs.
Secondly, the NAO called into question the data standards used by the government, saying it's important that data is presented in a consistent way so that departments, where necessary and legally free to do so, can exchange information effectively to provide effective public services.
Across just 10 government departments, more than 20 different data collection methods were in use, leading to issues such as the misrecording of businesses by HMRC, the Office for National Statistics and the Department for Business, Energy and Industrial Strategy.
Individuals' details were also being stored in different ways across departments, the report reveals. For example, 'Sam Jackson' on one department's system could be logged as 'Samantha Jackson' on another.
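The practical effect of that kind of mismatch can be sketched with a toy record-linkage example (the records, field names and matching rules below are hypothetical illustrations, not drawn from the report): an exact-match join silently fails to link the same person recorded under two name variants, while even a crude normalisation step recovers the link.

```python
# Hypothetical records: the same person held by two departments
# under different name formats, as in the NAO report's example.
dept_a = [{"name": "Sam Jackson", "dob": "1984-03-02"}]
dept_b = [{"name": "Samantha Jackson", "dob": "1984-03-02"}]

def exact_match(a, b):
    # Naive linkage: exact string equality on the name field.
    # 'Sam Jackson' != 'Samantha Jackson', so no pairs are found.
    return [(x, y) for x in a for y in b if x["name"] == y["name"]]

def relaxed_match(a, b):
    # A crude mitigation: link on date of birth plus surname,
    # tolerating differing forename spellings. Real record linkage
    # is far more involved than this sketch.
    def key(rec):
        return (rec["dob"], rec["name"].split()[-1].lower())
    return [(x, y) for x in a for y in b if key(x) == key(y)]

print(len(exact_match(dept_a, dept_b)))    # no link found
print(len(relaxed_match(dept_a, dept_b)))  # one link found
```

The point is not the matching heuristic itself but that every department improvising its own format forces this kind of after-the-fact guesswork, which is exactly the burden consistent data standards are meant to remove.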
The challenge the government faces, the report says, is implementing uniform collection and storage methods while weighing the benefits and drawbacks of their introduction. Such methods can "take time to embed and will require effort to assure compliance", the report explained, highlighting the trade-off between organising data for maximum value and the burden placed on information providers who must adhere to a stricter format.
It's long been known that the government operates on legacy IT systems and, much like the implementation of consistent data collection methods, switching to modern technology would require an enormous logistical and financial effort.
The issue with data storage in legacy IT systems is well documented: the mish-mash of systems and incompatible software has left NHS England unable to aggregate patient records effectively, with its hospitals trudging along on software that doesn't support newer record-keeping methods.
The interplay between these core issues is exemplified by the government's latest investment in AI-powered data analytics tools, which can process data effectively at large scale but can also magnify the department-wide problems with data quality.
The way IT systems are developed was also criticised for lacking a data-first focus. Systems are designed around specific policy objectives without considering data needs, such as the ability to merge data between departments, and the problem is replicated across government.
If there's any part of government one would expect to have an effective data-sharing policy, it's the justice system. With so many of its institutions (police, prosecutors, courts, prisons, probation) needing to work together, a uniform data collection process would seem essential. This isn't the case: the institutions all collect data in different ways owing to constitutional boundaries, making it difficult to link data across systems.
While issues persist, the report claimed there is an appetite for change, with demonstrable efforts made across departments to clean up data practices. But with compliance and legislative restrictions, as well as technical and financial limitations, it proves to be a highly complex task, and one that could take a long time to fix.