Recommendations
Area of action: coordinated international action to ensure the responsible development of PETs for public benefit
Recommendation 1
National and supranational organisations, including standards development organisations (SDOs), should establish protocols and standards for PETs, and their technical components, as a priority.
PETs have been developed by experts in different fields and with little coordination between them to date. The greatest potential for PETs – whether used in isolation or in combination – is as components of data governance systems. Open standards (available for use by anyone) are likely to help drive the development, accessibility and uptake of PETs for data governance. Furthermore, standards will be necessary for audit and assurance, encouraging a marketplace of confident PETs users supported by effective regulation and quality assurance marks where appropriate.
SDOs such as the British Standards Institution (BSI) (UK), the National Physical Laboratory (UK), the Institute of Electrical and Electronics Engineers (IEEE) (US), the National Cyber Security Centre (UK) and the National Institute of Standards and Technology (NIST) (US) should identify and convene international expert groups to address gaps in PETs technical standards. These should build on existing standards in cryptography and information security (Chapter 3). Open standards will be especially important for PETs that enable information networks, such as secure multi-party computation or federated learning (much as HTTP (footnote 12) provided a common set of rules that enabled communication over the Internet).
Alongside technical standards, process standards should guide best practice in the application of PETs in data governance. Privacy best practice guides, codes of conduct and process standards (such as the draft Institute of Privacy Design Process Standard (footnote 13)) could be used to integrate PETs into a privacy-by-design approach to data governance systems. Whereas technical standards will be essential for technical interoperability, codes of conduct for PETs in data management and use will be critical for ‘social interoperability’ and acceptance in partnerships and digital collaborations on new scales (such as international or cross-sector partnerships).
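To illustrate the kind of primitive such technical standards would need to cover, the following is a minimal sketch (not drawn from any existing standard or library, and with hypothetical example values) of additive secret sharing, a building block of secure multi-party computation: each party splits its private value into random shares, so that a sum can be computed across parties without any single party revealing its input.

```python
import random

PRIME = 2**61 - 1  # arithmetic over a finite field keeps individual shares uniformly random


def make_shares(secret: int, n_parties: int) -> list[int]:
    """Split `secret` into n additive shares; any n-1 shares together reveal nothing."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    # final share is chosen so that all shares sum to the secret (mod PRIME)
    shares.append((secret - sum(shares)) % PRIME)
    return shares


def mpc_sum(private_inputs: list[int]) -> int:
    """Each party distributes shares of its input; each party locally sums the
    shares it holds, and only the combined total is ever reconstructed."""
    n = len(private_inputs)
    # all_shares[i][j] = the j-th share of party i's input
    all_shares = [make_shares(x, n) for x in private_inputs]
    # party j sums the j-th share received from every party
    partial_sums = [sum(all_shares[i][j] for i in range(n)) % PRIME for j in range(n)]
    return sum(partial_sums) % PRIME


# hypothetical per-site patient counts that no site wishes to disclose
hospital_counts = [120, 340, 95]
print(mpc_sum(hospital_counts))  # the total (555), with no site revealing its own count
```

A real deployment would also need agreed message formats, authenticated channels and dropout handling – precisely the interoperability gaps that common technical standards would address.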
Recommendation 2
Science funders, including governments and intergovernmental bodies, should accelerate and incentivise the development and maturation of PETs by funding prize challenges, pathfinder projects (such as topic guides or resource lists) and cross-border, collaborative test environments (such as an international PETs sandbox).
Science funders should foster a network of independent researchers and universities working on PETs challenges in security, partnership and transparency applications. They could involve the private sector (for example cloud providers and social media platforms) in designing challenges and through international cooperation on standards, guidance and regulation. To date, exemplary programmes include the UK-US PETs Prize Challenge led by the UK’s Centre for Data Ethics and Innovation (CDEI) and the US White House Office of Science and Technology Policy; the Digital Security by Design Challenge (footnote 14) funded through UK Research and Innovation; the Data.org Epiverse Challenge funding call; and the French data protection authority sandbox on digital health and GDPR (footnote 15).
Intergovernmental bodies such as the United Nations and the Global Partnership on Artificial Intelligence should lead by creating test environments and providing data for demonstrations to test the security, privacy and utility potentials of specific PETs, as well as test configurations of PETs. An international PETs sandbox would allow national regulators to collaborate and evaluate PETs solutions for cross-border data use according to common data governance principles.
Recommendation 3
Researchers, regulators and enforcement authorities should investigate the wider social and economic implications of PETs: for example, how PETs might enable novel harms (such as fraud, or linking datasets for increased surveillance) or how they might affect competition in digitised markets (such as monopolies arising through new network effects).
The potential follow-on effects of PETs adoption are not well understood, particularly whether and how they might amplify data monopolies, or what oversight mechanisms are required to prevent the type of collaborative analysis that might be considered state surveillance (footnote 16). For example, the Arts and Humanities Research Council could consider the ethical, social and economic implications of PETs within their programme on AI (particularly where PETs could be dual-use or surveillance technologies) (footnote 17, footnote 18).
Regulators, such as the Information Commissioner’s Office (ICO) and the Competition and Markets Authority (CMA), could investigate the wider economic implications of PETs, particularly where they could enable competition through greater interoperability (as with open banking, for example). It is not yet understood how the adoption of PETs aligns with the FAIR (footnote 19) principles, particularly where PETs (such as privacy-preserving synthetic data) are used as an alternative to open data. In collaborative analysis, the limits on auditing data that is not shared should be better understood by those who might use PETs (to identify potential for biased outcomes, for example). The relationship between PETs and data trusts also remains ambiguous.
Area of action: a strategic and pragmatic approach to PETs adoption in the UK, led by the public sector through public-private partnerships, demonstration of use cases and communication of benefits
Recommendation 4
The UK Government should develop a national PETs strategy to promote the responsible use of PETs in data governance: as tools for data protection and security, for collaboration and partnership (both domestically and cross-border) and for advancing scientific research.
PETs could reform the way data is used domestically and across borders, offering potential solutions to longstanding problems of siloed and underutilised data across sectors. To ensure PETs are used for public good, PETs-driven information networks should be stewarded by public sector and civil society organisations operating data infrastructure in the public interest. A coordinated national strategy for the development and adoption of PETs for public good will ensure the timely and responsible deployment of these technologies, with the public sector leading by example.
PETs have a role to play in achieving the objectives outlined in Mission 2 of the National Data Strategy, securing a ‘pro-growth and trusted data regime’ and positioning the UK internationally as a trusted data partner, with wider implications for national security. This recommendation reflects emerging, coordinated PETs work in foreign governments (such as that led in the US by the White House Office of Science and Technology Policy) (footnote 20).
The PETs strategy should offer a vision that complements the Government’s National Data Strategy (footnote 21) and National AI Strategy (footnote 22). The PETs strategy should prioritise a roadmap for public sector PETs adoption, addressing public awareness and the PETs marketplace (Chapter 2), technological maturity, appropriate regulatory mechanisms and responsibilities, alongside standards and codes of conduct for PETs users (Chapter 3).
Recommendation 5
Local, devolved and national governments across the UK should lead by example in the adoption of PETs for data sharing and use across government and in public-private partnerships, improving awareness by communicating PETs-enabled projects and their results.
Public sector organisations could partner with small and medium-sized enterprises (SMEs) developing PETs to identify use cases, which could then be tested through low-cost, low-risk pilot projects. Legal experts and interdisciplinary policy professionals should be involved from project inception, ensuring PETs meet data protection requirements and that outcomes and implications are properly communicated to non-technical decision-makers.
Use cases illustrated in Chapter 5 highlight areas of significant potential public benefit in healthcare and medical research, for reaching net zero through national digital twins and for population-level data collaboration.
Communication of PETs and their appropriate use in various contexts will be key to building trust with potential users (footnote 23), encouraging the PETs marketplace (Chapter 2). The ICO should continue its work on using PETs for wider good and communicating the implications – including barriers and potential benefits. The CDEI should continue to provide practical examples that will help organisations understand and build a business case for PETs’ adoption.
Proof of concept and pilot studies should be communicated to the wider public to demonstrate the value of PETs, foster trust in public sector data use and demonstrate value-for-money (footnote 24).
Recommendation 6
The UK Government should ensure that new data protection reforms account for the new systems of data governance enabled by emerging technologies such as PETs and ensure any new regulations are supported by clear, scenario-specific guidance and assessment tools.
While data protection legislation should remain technology neutral so as to be adaptable, current plans to review UK data protection laws provide an opportunity to consider the novel and multipurpose nature of these emerging technologies, particularly as they provide the technical means for new types of collaborative analysis. The ICO should continue its work to provide clarity around PETs and data protection law, encouraging the use of PETs for wider public good (footnote 25) and drawing from parallel work on AI guidance where relevant (such as privacy-preserving machine learning).
Further interpretation may be required to help users understand how PETs might serve as tools for meeting data protection requirements.
For example, clarification may be needed on data protection obligations where machine learning models are trained on personal data in federated learning scenarios (footnote 26), or on the degree to which differentially private or homomorphically encrypted data meets anonymisation requirements (footnote 27). Where PETs enable information networks and international data collaborations, the ICO might anticipate clarification questions specific to international and collaborative analysis use cases. Regulatory sandboxes (as in Recommendation 2) will be useful for testing scenarios, particularly for experimentation with PETs in structured transparency (footnote 28) (such as open research or credit scoring systems) and as accountability tools (footnote 29).
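To make the anonymisation question concrete, the following is a minimal sketch (an illustrative toy, not any regulator-endorsed implementation; the cohort data is hypothetical) of the Laplace mechanism, the basic construction behind differential privacy: a counting query is released with random noise calibrated to the query’s sensitivity and a privacy parameter epsilon, so no exact individual-level value is ever published.

```python
import math
import random


def dp_count(records: list[bool], epsilon: float) -> float:
    """Release a count with epsilon-differential privacy via the Laplace mechanism.
    A counting query has sensitivity 1 (adding or removing one person changes
    the true count by at most 1), so the noise scale is 1 / epsilon."""
    true_count = sum(records)
    # inverse-CDF sampling from Laplace(0, 1/epsilon)
    u = random.random() - 0.5
    noise = -(1 / epsilon) * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise


# hypothetical query: how many patients in a cohort have a given condition?
cohort = [True] * 130 + [False] * 870
print(dp_count(cohort, epsilon=1.0))  # close to the true count of 130, but randomised
```

Smaller epsilon values add more noise (stronger privacy, lower utility); whether such noisy outputs count as anonymous for data protection purposes is exactly the legal question the guidance would need to settle.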
The ICO could expand on its PETs guidance, for example, through developing self-assessment guides. Data ethics organisations, such as the CDEI, might also develop impact assessment tools, for example, a PETs impact assessment protocol that considers downstream implications on human rights. The Alliance for Data Science Professionals certification scheme (footnote 30), which defines standards for ethical and well-governed approaches to data use, could specifically consider the role of PETs in evidencing Skill Areas A (Data Privacy and Stewardship) and E (Evaluation and Reflection).
Area of action: foundational scholarship and professionalisation to encourage maturation of PETs, foster trust and drive uptake of PETs in data-using organisations
Recommendation 7
Universities, businesses and science funders should fund foundational scholarship in PETs-related fields, such as cryptography and statistics.
Foundational training and fellowships in PETs fundamentals (such as cryptography) at graduate level will create the skilled workforce required for widespread development and implementation of PETs. Critical future-proofing questions could be addressed through fellowships and research posts (for example, evaluating the security guarantees of PETs in a post-quantum context, or the energy proportionality, sustainability and scalability of energy-intensive, cryptography-based PETs). Internships and work placement programmes in organisations developing PETs could assist new graduates in moving from academic fields into applied PETs research and development.
Recommendation 8
Organisations providing certifications and continuing professional development courses in data science, cybersecurity and related fields should incorporate PETs modules to raise awareness among data professionals.
Professional certifications and Continuing Professional Development opportunities (including British Computer Society Professional Certifications such as the Alliance for Data Science Professionals certification, Data Science Professional Certificates offered by Microsoft or IBM, or (ISC)² Certifications (footnote 31)) should include a primer on PETs to raise awareness and encourage baseline knowledge of PETs amongst in-house data professionals. For example, the International Association of Privacy Professionals now includes a module on PETs in their Certified Information Privacy Technologist Certification (footnote 32).
Footnotes
12. Hypertext Transfer Protocol.
13. Institute of Privacy Design (The DRAFT Design Process Standard). See https://instituteofprivacydesign.org/2022/02/11/the-draft-design-process-standard/ (accessed 2 September 2022).
14. UK Research and Innovation (Digital Security by Design challenge). See https://www.ukri.org/what-we-offer/our-main-funds/industrial-strategy-challenge-fund/artificial-intelligence-and-data-economy/digital-security-by-design-challenge/ (accessed 20 September 2022).
15. Commission Nationale de l’Informatique et des Libertés (Un « bac à sable » RGPD pour accompagner des projets innovants dans le domaine de la santé numérique [a GDPR ‘sandbox’ to support innovative projects in digital health]). See https://www.cnil.fr/fr/un-bac-sable-rgpd-pour-accompagner-des-projets-innovants-dans-le-domaine-de-la-sante-numerique (accessed 15 September 2022).
16. Liberty Human Rights (Challenge hostile environment data-sharing). See https://www.libertyhumanrights.org.uk/campaign/challenge-hostile-environment-data-sharing/ (accessed 20 September 2022).
17. Ongoing research highlights the negative consequences of data sharing in dual-use or otherwise sensitive contexts. For example: Papageorgiou V, Wharton-Smith A, Campos-Matos I, Ward H. 2020 Patient data-sharing for immigration enforcement: a qualitative study of healthcare providers in England. BMJ Open. (https://doi.org/10.1136/bmjopen-2019-033202)
18. Liberty Human Rights (Liberty and Southall Black Sisters’ super-complaint on data-sharing between the police and Home Office regarding victims and witnesses to crime). See https://www.libertyhumanrights.org.uk/issue/liberty-and-southall-black-sisters-super-complaint-on-data-sharing-between-the-police-and-home-office-regarding-victims-and-witnesses-to-crime/ (accessed 20 September 2022).
19. Go FAIR (FAIR principles). See https://www.go-fair.org/fair-principles/ (accessed 20 September 2022).
20. US Office of Science and Technology Policy (Request for Information on Advancing Privacy-Enhancing Technologies). See https://public-inspection.federalregister.gov/2022-12432.pdf (accessed 17 July 2022).
21. HM Government (National Data Strategy). See https://www.gov.uk/government/publications/uk-national-data-strategy/national-data-strategy (accessed 9 September 2022).
22. HM Government (National AI Strategy). See https://www.gov.uk/government/publications/national-ai-strategy (accessed 9 September 2022).
23. The Royal Society (Creating trusted and resilient data systems: the public perspective). To be published online in 2023.
24. This is in line with the Digital Economy Act 2017. See: The Information Commissioner’s Office (Data sharing across the public sector: the Digital Economy Act codes). See https://ico.org.uk/for-organisations/guide-to-data-protection/ico-codes-of-practice/data-sharing-a-code-of-practice/data-sharing-across-the-public-sector-the-digital-economy-act-codes/ (accessed 2 September 2022).
25. The Information Commissioner’s Office (ICO consults health organisations to shape thinking on privacy-enhancing technologies). See https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2022/02/ico-consults-health-organisations-to-shape-thinking-on-privacy-enhancing-technologies/ (accessed 20 March 2022).
26. Nguyen T, Sun K, Wang S, Guitton F, Guo Y. 2021 Privacy preservation in federated learning: an insightful survey from the GDPR perspective. Computers & Security 110. (https://doi.org/10.1016/j.cose.2021.102402)
27. See for example: Koerner K. 2021 Legal perspectives on PETs: homomorphic encryption. Medium. 20 July 2021. See https://medium.com/golden-data/legal-perspectives-on-pets-homomorphic-encryption-9ccfb9a334f (accessed 30 June 2022).
28. Trask A, Bluemke E, Garfinkel B, Cuervas-Mons CG, Dafoe A. 2020 Beyond privacy trade-offs with structured transparency. See https://arxiv.org/ftp/arxiv/papers/2012/2012.08347.pdf (accessed 6 February 2022).
29. See for example: Meta AI (Assessing fairness of our products while protecting people’s privacy). See https://ai.facebook.com/blog/assessing-fairness-of-our-products-while-protecting-peoples-privacy/ (accessed 15 August 2022).
30. Alliance for Data Science Professionals (Homepage). See https://alliancefordatascienceprofessionals.co.uk/ (accessed 20 September 2022).
31. (ISC)² ((ISC)² Information Security Certifications). See https://www.isc2.org/Certifications# (accessed 13 May 2022).
32. International Association of Privacy Professionals (Privacy Technology Certification). See https://iapp.org/media/pdf/certification/CIPT_BOK_v.3.0.0.pdf (accessed 30 June 2022).