
Don’t let data barriers stifle innovation

A challenge often faced when exploring and developing new data tools, techniques and capabilities for policing is access to realistic data – whether live, historic or synthetic. Without this it is impossible to innovate, experiment and develop new ways of working that meet law enforcement’s needs. In a world where data exploitation technologies continue to evolve at pace, barriers to making better use of data mean that law enforcement is missing out on opportunities to advance these capabilities.


The biggest barrier to using real policing data is the extent to which it contains personal data, and the legal obligations that follow from this. The Data Protection Act defines ‘personal data’ as any information which could identify an individual, such as a name, address, phone number, IP address or vehicle registration number; in fact, any information that could be linked back to an individual can be considered personal data. The General Data Protection Regulation (GDPR) came into force in 2018 to give individuals greater control over their personal data and to increase transparency and accountability in how organisations handle it.
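To make this concrete, here is a minimal and deliberately incomplete sketch of spotting direct identifiers in free text. The patterns are hypothetical illustrations, and pattern matching can only ever flag a subset – the legal definition covers any information linkable to an individual, which no regex can capture.

```python
import re

# Illustrative patterns for a few directly identifying fields.
# These are simplified sketches, not validated formats.
PATTERNS = {
    "ip_address": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "uk_phone": re.compile(r"\b0\d{3}[ -]?\d{3}[ -]?\d{3,4}\b"),
    "uk_vehicle_reg": re.compile(r"\b[A-Z]{2}\d{2}\s?[A-Z]{3}\b"),
}

def flag_identifiers(text: str) -> dict[str, list[str]]:
    """Return, for each pattern that appears in the text, the matches found."""
    hits = {name: rx.findall(text) for name, rx in PATTERNS.items()}
    return {name: found for name, found in hits.items() if found}
```

Even a record with none of these direct identifiers can still be personal data if it can be linked back to an individual, which is precisely why a DPIA looks at the processing as a whole rather than at individual fields.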


Any organisation which processes personal data is legally required to carry out a Data Protection Impact Assessment (DPIA) under UK-GDPR. The purpose of a DPIA is to identify and minimise the data protection risks of a project or activity that involves the processing of personal data and ensure that individuals’ rights and freedoms are protected. It sets out why the processing of personal data is necessary and justified, for instance for the effective prevention and detection of crime, ensuring public safety, or the effective administration of justice.


So what does ‘processing’ mean in a policing context? The types of activities that law enforcement might carry out which require a DPIA include any processing that involves profiling or evaluating individuals (as might happen in police intelligence systems), automated decision-making with legal or similar significant effects (such as the use of AI algorithms), and large-scale processing of personal data to analyse crime at a national level to detect threat and risk.

In parallel, as part of the drive to improve productivity in policing, there is ever-increasing interest in ‘black box’ technologies like ChatGPT, which process information in ways that may not be easily explained or understood. Understanding the capability of these technologies relies on real-life (and therefore personal) data, despite the significant challenge of seeking approval to use this data without knowing exactly how it will be processed by AI tools.


As a result, little experimentation can be carried out without access to real or realistic data, and using real data requires a DPIA. Although completing a DPIA may sound straightforward, it can be challenging to achieve in a timely manner. There is no standardised DPIA approach across policing: each force has developed its own format and process, which limits opportunities to streamline DPIAs when forces collaborate at a regional or national level. The forms themselves can be confusing – they tend to be high-level, with little explanation of the information required – and there are often very few people in a police force who know how to navigate the relevant legislation. This means that officers do not receive the support they need to complete DPIAs successfully, with stretched Data Protection Officers unable to provide training or detailed guidance to everybody who needs it.




Completing and signing off a DPIA within a force inevitably takes longer than expected, with a range of different stakeholders required to provide insight across the various factors shown above; it is not unheard of for the process to take up to a year. Understandably, the need to complete a DPIA is not only a barrier to releasing data for the purpose of technology projects and exploring the ‘art of the possible’, but also a deterrent to making the investment in the first place.

Law enforcement cannot afford to let a poor understanding of DPIAs stifle innovation and the development of new ways of working. New technologies are emerging all the time and no doubt hold significant potential for policing to improve its services to the public. Unlocking this potential, however, requires experimentation and testing, and experimentation and testing require realistic data.


While the DPIA process is underway, synthetic data can offer a starting point for experimentation. The difficulty with most synthetic data, however, is that it may not provide the volume, realistic behaviour patterns or complexity of the real world, and therefore is not sufficient on its own as the basis for developing and testing new tools. ‘Realistic’ should mean not just reflecting the complexity of real data, but also all its flaws in terms of data quality, standards and omissions. Much of the time synthetic data is oversimplified and does not include the vast swathes of ‘noise’ – irrelevant data that needs to be worked through and filtered out in an investigation. At Principle One we have been developing a synthetic data ‘generator’ which creates realistic synthetic data in terms of format, volume and complexity, tailored to different investigative scenarios. The data it generates comprises both the ‘needle’ (data pertinent to the investigation) and the ‘haystack’ (realistic volumes of irrelevant data that must be sifted out). You can read more about this here.
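As an illustration of the ‘needle and haystack’ idea – this is a minimal sketch, not Principle One's actual generator – a handful of records involving a number of interest can be buried in a much larger volume of randomly generated noise:

```python
import random
from datetime import datetime, timedelta

def generate_call_records(needle_number: str, n_haystack: int = 1000, seed: int = 0):
    """Sketch of needle-and-haystack generation: a few call records
    involving a number of interest, hidden among realistic volumes of noise."""
    rng = random.Random(seed)  # seeded so a scenario is reproducible
    start = datetime(2024, 1, 1)

    def random_number():
        return "07" + "".join(rng.choice("0123456789") for _ in range(9))

    def record(caller, callee):
        return {
            "caller": caller,
            "callee": callee,
            "start": start + timedelta(minutes=rng.randrange(60 * 24 * 30)),
            "duration_s": rng.randrange(5, 600),
        }

    # The 'haystack': irrelevant traffic between random parties.
    haystack = [record(random_number(), random_number()) for _ in range(n_haystack)]
    # The 'needle': a few records pertinent to the investigation.
    needle = [record(needle_number, random_number()) for _ in range(5)]
    records = haystack + needle
    rng.shuffle(records)  # the needle should not be trivially findable by position
    return records
```

A real generator would also need to reproduce the flaws described above – missing fields, inconsistent formats, duplicate entries – for the data to exercise a tool the way live data would.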


Using synthetic data is only a temporary solution however; a DPIA can only be deferred for so long if new tools and ways of working are to be used in day-to-day operations. Once development and testing are complete, there will be a point at which real live or historic data will need to be entered into the tool, and processing this data in a new way will require a DPIA. Failure to make the transition from a technical demonstrator developed against synthetic data to deployment against real data is one of the most significant barriers to realising benefit from innovation.


To accelerate the process, we have created an app to guide an officer through completing a DPIA. The app is designed to be used as preparation for filling out a DPIA form, recognising that formats and layouts of DPIAs differ across forces. The aim is to get the applicant thinking about the information they need to provide to get the DPIA accepted, and why and how to apply the right pieces of relevant legislation.



Its goal is to raise the standard of DPIA submissions by improving understanding of the legislation and requirements among those making the applications, thereby reducing the burden on the Data Protection Officer and cutting the time it takes to get approval.


Since DPIA forms can be very high-level and provide minimal guidance on the information required, our app is broken down into the logical areas that an applicant will need to consider: what the issue is and why it is necessary to use personal data; what their proposal is; the data and processes involved (including a section on AI, which carries particular data protection risks); who will be affected by the proposal; and the risks involved. This ensures that the information needed, and the reasons why, are well understood. There is also a glossary on hand to explain the legislation and key terms, as shown below.
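The areas described above could be sketched as a simple checklist structure. The section names and prompts here are paraphrased from the text for illustration – this is not the app's actual implementation:

```python
# Hypothetical checklist of DPIA preparation areas, paraphrased from the text.
DPIA_SECTIONS = [
    ("Problem and necessity", "What is the issue, and why is personal data needed?"),
    ("Proposal", "What is being proposed?"),
    ("Data and processes", "What data and processing are involved, including any AI?"),
    ("People affected", "Who will be affected by the proposal?"),
    ("Risks", "What are the data protection risks?"),
]

def missing_sections(answers: dict[str, str]) -> list[str]:
    """Return the names of sections that have no substantive answer yet."""
    return [name for name, _prompt in DPIA_SECTIONS
            if not answers.get(name, "").strip()]
```

Structuring the preparation this way is what lets the material be reviewed as a whole before it is transferred into whichever DPIA form a given force uses.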



At the end of the process the user can see a summary of the questions and their responses, and this material can be copied and pasted into the relevant parts of the DPIA form (or printed for review). With the right guidance on completing a DPIA, the process will be less daunting and less of a barrier to accessing data for the development of new policing tools, technologies and capabilities.


Phil Tomlinson, Digital Intelligence Lead at Principle One, has significant experience of completing DPIAs to tackle complex data and analytical challenges in policing. He worked on the development of our DPIA app and says:


“It is essential that DPIAs are considered, and considered early, in any policing technology project. It is very common for the DPIA process to cause significant delay to accessing the necessary data for developing and testing new tools and capabilities, and in some cases completing a DPIA is felt to be ‘too hard’ and projects never get off the ground. Sometimes proof-of-concept pieces do not progress beyond technology demos because access to data has not been considered in good time. We would encourage anybody commissioning or launching a police technology project to consider the data that is required and the data protection implications of this from the outset, and to begin the DPIA process as soon as possible.”


It is time for policing to refresh its approach to DPIAs and technology projects and recognise that the current difficulties around their completion are a substantial blocker to progress. Access to data does not have to be the barrier that it currently is to developing new tools, technologies and capabilities for policing; we simply need to create better understanding of DPIAs and make them a less daunting task for police officers. If you’d like to build confidence within your force, better understand DPIAs and the relevant legislation, and accelerate GDPR compliance, contact us to find out more about our DPIA app at policingapps@principleone.co.uk.
