Privacy by design

Privacy by design is an approach to systems engineering initially developed by Ann Cavoukian and formalized in a 1995 joint report on privacy-enhancing technologies by the Information and Privacy Commissioner of Ontario (Canada), the Dutch Data Protection Authority, and the Netherlands Organisation for Applied Scientific Research.[1][2] The privacy by design framework was published in 2009[3] and adopted by the International Assembly of Privacy Commissioners and Data Protection Authorities in 2010.[4] Privacy by design calls for privacy to be taken into account throughout the whole engineering process. The concept is an example of value sensitive design, i.e., taking human values into account in a well-defined manner throughout the process.[5][6]

Cavoukian's approach to privacy has been criticized as vague,[7] difficult to enforce,[8] difficult to apply to certain disciplines,[9][10] and challenging to scale up to networked infrastructures,[10] as well as for prioritizing corporate interests over consumers' interests[7] and placing insufficient emphasis on minimizing data collection.[9] Recent developments in computer science and data engineering, such as support for encoding privacy in data[11] and the availability and quality of privacy-enhancing technologies (PETs), partly offset those critiques and help to make the principles feasible in real-world settings.

The European Union's General Data Protection Regulation (GDPR) incorporates privacy by design.[12]

History and background

The privacy by design framework was developed by Ann Cavoukian, Information and Privacy Commissioner of Ontario, following her joint work with the Dutch Data Protection Authority and the Netherlands Organisation for Applied Scientific Research in 1995.[1][12] In 2009, the Information and Privacy Commissioner of Ontario co-hosted an event, Privacy by Design: The Definitive Workshop, with the Israeli Law, Information and Technology Authority at the 31st International Conference of Data Protection and Privacy Commissioners.[13][14]

In 2010 the framework achieved international acceptance when the International Assembly of Privacy Commissioners and Data Protection Authorities unanimously passed a resolution on privacy by design[15] recognising it as an international standard at their annual conference.[14][16][17][4] Among other commitments, the commissioners resolved to promote privacy by design as widely as possible and foster the incorporation of the principle into policy and legislation.[4]

Foundational principles

Privacy by design is based on seven "foundational principles":[3][18][19][20]

  1. Proactive not reactive; preventive not remedial[3][18][19][20]
  2. Privacy as the default setting[3][18][19][20]
  3. Privacy embedded into design[3][18][19][20]
  4. Full functionality – positive-sum, not zero-sum[3][18][19][20]
  5. End-to-end security – full lifecycle protection[3][18][19][20]
  6. Visibility and transparency – keep it open[3][18][19][20]
  7. Respect for user privacy – keep it user-centric[3][18][19][20]

The principles have been cited in over five hundred articles[21] referring to the Privacy by Design in Law, Policy and Practice white paper by Ann Cavoukian.[22]

Proactive not reactive; preventive not remedial

The privacy by design approach is characterized by proactive rather than reactive measures. It anticipates and prevents privacy invasive events before they happen. Privacy by design does not wait for privacy risks to materialize, nor does it offer remedies for resolving privacy infractions once they have occurred — it aims to prevent them from occurring. In short, privacy by design comes before-the-fact, not after.[18][19][20]

Privacy as the default setting

Privacy by design seeks to deliver the maximum degree of privacy by ensuring that personal data are automatically protected in any given IT system or business practice. If an individual does nothing, their privacy still remains intact. No action is required on the part of the individual to protect their privacy — it is built into the system, by default.[18][19][20]
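In software terms, "privacy as the default" means every data-sharing option is opt-in rather than opt-out. A minimal Python sketch (with hypothetical setting names, not drawn from any cited framework) might look like this:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Hypothetical user settings: every sharing option defaults to off (opt-in)."""
    share_location: bool = False
    share_contacts: bool = False
    analytics_tracking: bool = False
    marketing_emails: bool = False

    def enabled(self):
        # List only the options the user has explicitly turned on.
        return [name for name, value in vars(self).items() if value]

# A user who does nothing keeps full privacy: no action is required of them.
settings = PrivacySettings()
assert settings.enabled() == []

# Sharing requires an explicit, affirmative choice by the user.
settings.share_location = True
assert settings.enabled() == ["share_location"]
```

The design choice is that the zero-configuration state is the most protective one, so inaction by the individual never weakens their privacy.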

PbD practices
  • Purpose Specification - The purposes for which personal data are collected, used, retained, and disclosed must be clearly communicated to data subjects at or before the time of collection, and the purpose(s) must be limited and relevant to the stated needs.[18]
  • Collection Limitation - Collection of data must be fair, lawful, and limited to the stated purpose.[18]
  • Data Minimization - The collection of personal data should be kept to a strict minimum, and technologies should default to making users non-identifiable and non-observable, or to minimizing identifiability where collection is strictly necessary.[18]
  • Use, Retention, and Disclosure - Use, retention, and disclosure of data must be limited to the purposes the individual has consented to, with exceptions only as provided by law. Information should be retained only for the stated amount of time needed and then securely erased.[18]
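Collection limitation and data minimization can be made concrete in code. The following is a hedged sketch (the field names, purpose, and salt handling are illustrative assumptions, not prescribed by the framework) showing two common moves: dropping fields the stated purpose does not require, and replacing a direct identifier with a salted hash:

```python
import hashlib
import secrets

# Fields the stated purpose actually requires (hypothetical example: shipping).
REQUIRED_FIELDS = {"name", "address", "postcode"}

def minimize(record: dict) -> dict:
    """Drop every field not needed for the stated purpose (collection limitation)."""
    return {k: v for k, v in record.items() if k in REQUIRED_FIELDS}

def pseudonymize(identifier: str, salt: bytes) -> str:
    """Replace a direct identifier with a salted hash so the raw value is not stored."""
    return hashlib.sha256(salt + identifier.encode()).hexdigest()

raw = {"name": "A. User", "address": "1 Main St", "postcode": "00000",
       "birthday": "1990-01-01", "phone": "555-0100"}
stored = minimize(raw)
assert set(stored) == REQUIRED_FIELDS  # birthday and phone never enter the system

salt = secrets.token_bytes(16)
user_key = pseudonymize("user@example.com", salt)
assert "user@example.com" not in user_key  # the raw identifier does not appear
assert user_key == pseudonymize("user@example.com", salt)  # stable lookup key
```

Minimizing at the point of collection, rather than filtering later, keeps unneeded data from ever being retained or disclosed.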

Privacy embedded into design

Privacy by design is embedded into the design and architecture of IT systems as well as business practices. It is not bolted on as an add-on, after the fact. The result is that privacy becomes an essential component of the core functionality being delivered. Privacy is integral to the system without diminishing functionality.[18][19][20]

Full functionality – positive-sum, not zero-sum

Privacy by design seeks to accommodate all legitimate interests and objectives in a positive-sum “win-win” manner, not through a dated, zero-sum approach, where unnecessary trade-offs are made. Privacy by design avoids the pretense of false dichotomies, such as privacy versus security, demonstrating that it is possible to have both.[18][19][20]

End-to-end security – full lifecycle protection

Privacy by design, having been embedded into the system prior to the first element of information being collected, extends securely throughout the entire lifecycle of the data involved — strong security measures are essential to privacy, from start to finish. This ensures that all data are securely retained, and then securely destroyed at the end of the process, in a timely fashion. Thus, privacy by design ensures cradle-to-grave, secure lifecycle management of information, end-to-end.[18][19][20]
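Full-lifecycle protection implies that every record carries a retention deadline from the moment it is collected. A minimal sketch (hypothetical store and retention period; plain deletion stands in here for the secure erasure the principle calls for):

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=30)  # hypothetical stated retention period

class DataStore:
    """Minimal sketch: each record carries an expiry; purge destroys expired data."""
    def __init__(self):
        self._records = {}

    def store(self, key, value, now=None):
        now = now or datetime.now(timezone.utc)
        # The destruction date is fixed at collection time, not decided later.
        self._records[key] = (value, now + RETENTION)

    def purge(self, now=None):
        """Delete every record whose retention period has elapsed."""
        now = now or datetime.now(timezone.utc)
        expired = [k for k, (_, exp) in self._records.items() if exp <= now]
        for k in expired:
            del self._records[k]
        return expired

store = DataStore()
t0 = datetime(2024, 1, 1, tzinfo=timezone.utc)
store.store("order-1", {"item": "book"}, now=t0)
# Within the retention window nothing is purged...
assert store.purge(now=t0 + timedelta(days=10)) == []
# ...after it elapses, the record is destroyed.
assert store.purge(now=t0 + timedelta(days=31)) == ["order-1"]
```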

Visibility and transparency – keep it open

Privacy by design seeks to assure all stakeholders that whatever business practice or technology involved is in fact operating according to the stated promises and objectives, subject to independent verification. The component parts and operations remain visible and transparent, to users and providers alike. Remember to trust but verify.[18][19][20]

Respect for user privacy – keep it user-centric

Above all, privacy by design requires architects and operators to keep the interests of the individual uppermost by offering such measures as strong privacy defaults, appropriate notice, and empowering user-friendly options. Keep it user-centric.[18][19][20]

Design and standards

The International Organization for Standardization (ISO) approved the Committee on Consumer Policy (COPOLCO) proposal for a new ISO standard: Consumer Protection: Privacy by Design for Consumer Goods and Services (ISO/PC317).[23] The standard will aim to specify the design process to provide consumer goods and services that meet consumers’ domestic processing privacy needs as well as the personal privacy requirements of data protection. The standard has the UK as secretariat with thirteen participating members[24] and twenty observing members.[24]

The Standards Council of Canada (SCC) is one of the participating members and has established a mirror Canadian committee to ISO/PC317.[25]

The OASIS Privacy by Design Documentation for Software Engineers (PbD-SE)[26] Technical Committee provides a specification to operationalize privacy by design in the context of software engineering. Privacy by design, like security by design, is a normal part of the software development process and a risk-reduction strategy for software engineers. The PbD-SE specification translates the PbD principles into conformance requirements for software engineering tasks and helps software development teams produce artifacts as evidence of adherence to the principles. Following the specification facilitates the documentation of privacy requirements from software conception to retirement, providing a plan for adherence to privacy by design principles and to other privacy best-practice guidance, such as NIST's 800-53 Appendix J (NIST SP 800-53) and the Fair Information Practice Principles (FIPPs) (PMRM-1.0).[26]

Relationship to privacy-enhancing technologies

Privacy by design originated from privacy-enhancing technologies (PETs) in a joint 1995 report by Ann Cavoukian and John Borking.[1] In 2007 the European Commission provided a memo on PETs.[27] In 2008 the British Information Commissioner's Office commissioned a report titled Privacy by Design – An Overview of Privacy Enhancing Technologies.[28]

Privacy by design has many facets: a technical side such as software and systems engineering,[29] administrative elements (e.g., legal, policy, and procedural), other organizational controls, and operating contexts. It evolved from early efforts to express fair information practice principles directly in the design and operation of information and communications technologies.[30] In his publication Privacy by Design: Delivering the Promises,[2] Peter Hustinx acknowledges the key role played by Ann Cavoukian and John Borking, then Deputy Privacy Commissioners, in the joint 1995 publication Privacy-Enhancing Technologies: The Path to Anonymity.[1] This 1995 report focused on exploring technologies that permit transactions to be conducted anonymously.

Privacy-enhancing technologies allow online users to protect the privacy of their Personally Identifiable Information (PII) provided to and handled by services or applications. Privacy by design evolved to consider the broader systems and processes in which PETs were embedded and operated. The U.S. Center for Democracy & Technology (CDT) in The Role of Privacy by Design in Protecting Consumer Privacy[31] distinguishes PET from privacy by design noting that “PETs are most useful for users who already understand online privacy risks. They are essential user empowerment tools, but they form only a single piece of a broader framework that should be considered when discussing how technology can be used in the service of protecting privacy.”[31]
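One classic PET technique of this kind is generalization of quasi-identifiers, as used in k-anonymity-style anonymization: precise values are coarsened so that individuals are harder to single out. A minimal sketch with hypothetical field names (not taken from any cited source):

```python
def generalize(record: dict) -> dict:
    """Coarsen quasi-identifiers: exact age becomes a decade band,
    and the postcode is truncated to its first three characters."""
    decade = (record["age"] // 10) * 10
    return {
        "age": f"{decade}-{decade + 9}",
        "postcode": record["postcode"][:3] + "**",
    }

# A precise record becomes one that many individuals could plausibly match.
assert generalize({"age": 34, "postcode": "10115"}) == \
    {"age": "30-39", "postcode": "101**"}
```

Techniques like this illustrate the CDT's point: the tool itself is a single piece, while privacy by design asks how the surrounding system decides when and why such a transformation is applied.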

Global usage

Germany enacted a statute (§ 3 Sec. 4 Teledienstedatenschutzgesetz [Teleservices Data Protection Act]) as early as July 1997.[32] The EU General Data Protection Regulation (GDPR) includes 'data protection by design' and 'data protection by default',[33][34][12] the latter reflecting the second foundational principle of privacy by design. Canada's Privacy Commissioner included privacy by design in its report Privacy, Trust and Innovation – Building Canada's Digital Advantage.[35][36] In 2012, the U.S. Federal Trade Commission (FTC) recognized privacy by design as one of its three recommended practices for protecting online privacy in its report Protecting Consumer Privacy in an Era of Rapid Change,[37] and the FTC included privacy by design as one of the key pillars in its Final Commissioner Report on Protecting Consumer Privacy.[38] In Australia, the Commissioner for Privacy and Data Protection for the State of Victoria (CPDP) has formally adopted privacy by design as a core policy to underpin information privacy management in the Victorian public sector.[39] The UK Information Commissioner's Office website highlights privacy by design[40] and data protection by design and default.[41] In October 2014, the Mauritius Declaration on the Internet of Things was made at the 36th International Conference of Data Protection and Privacy Commissioners and included privacy by design and default.[42] The Privacy Commissioner for Personal Data, Hong Kong held an educational conference on the importance of privacy by design.[43][44]

In the private sector, Sidewalk Toronto commits to privacy by design principles;[45] Brendon Lynch, Chief Privacy Officer at Microsoft, wrote an article called Privacy by Design at Microsoft;[46] whilst Deloitte relates certifiably trustworthy to privacy by design.[47]

Criticism and recommendations

The privacy by design framework has attracted academic debate, particularly following the 2010 International Data Commissioners' resolution, with legal and engineering experts offering criticism and suggestions for how to apply the framework in various contexts.[7][9][8]

Privacy by design has been critiqued as "vague"[7] and as leaving "many open questions about their application when engineering systems".[9] It has been suggested to instead start with, and focus on, minimizing data, which can be done through security engineering.[9]

In 2011, researchers at K.U. Leuven published Engineering Privacy by Design, noting that "The design and implementation of privacy requirements in systems is a difficult problem and requires translation of complex social, legal and ethical concerns into systems requirements". The principles of privacy by design "remain vague and leave many open questions about their application when engineering systems". The authors argue that "starting from data minimization is a necessary and foundational first step to engineer systems in line with the principles of privacy by design". The objective of their paper is to provide an "initial inquiry into the practice of privacy by design from an engineering perspective in order to contribute to the closing of the gap between policymakers' and engineers' understanding of privacy by design."[9] Extended peer consultations performed ten years later in an EU project, however, confirmed persistent difficulties in translating legal principles into engineering requirements. This is partly a structural problem: legal principles are abstract and open-ended, with different possible interpretations and exceptions, whereas engineering practices require unambiguous meanings and formal definitions of design concepts.[10]

In 2011, the Danish National IT and Telecom Agency published a discussion paper in which it argued that privacy by design is a key goal for creating digital security models, extending the concept to "security by design". The objective is to balance anonymity and surveillance by eliminating identification as much as possible.[48]

Another criticism is that current definitions of privacy by design do not address the methodological aspect of systems engineering, such as using sound systems engineering methods, e.g. those that cover the complete system and data life cycle.[7] This problem is further exacerbated in the move to networked digital infrastructure initiatives such as the smart city or the Internet of Things. Whereas privacy by design has mainly focused on the responsibilities of single organisations for a given technology, these initiatives often require the interoperability of many different technologies operated by different organisations. This requires a shift from organisational to infrastructural design.[10]

The concept of privacy by design also focuses not on the role of the actual data holder but on that of the system designer. This role is not known in privacy law, so the concept of privacy by design is not based on law. This, in turn, undermines trust among data subjects, data holders, and policy-makers.[7] Scholars in science and technology studies have questioned whether privacy by design will change the meaning and practice of rights through implementation in technologies, organizations, standards, and infrastructures.[49] From a civil society perspective, some have raised the possibility that bad use of these design-based approaches can lead to the danger of bluewashing: the minimal, instrumental use of privacy design by organizations, without adequate checks, in order to portray themselves as more privacy-friendly than is factually justified.[10]

It has also been pointed out that privacy by design is similar to voluntary compliance schemes in industries affecting the environment, and thus lacks the teeth necessary to be effective, and its implementation may differ per company. In addition, the evolutionary approach currently taken to the development of the concept will come at the cost of privacy infringements, because evolution also implies letting unfit phenotypes (privacy-invading products) live until they are proven unfit.[7] Some critics have pointed out that certain business models are built around customer surveillance and data manipulation, and therefore voluntary compliance is unlikely.[8]

In 2013, Rubinstein and Good used Google and Facebook privacy incidents to conduct a counterfactual analysis identifying lessons of value for regulators recommending privacy by design. The first was that "more detailed principles and specific examples" would be more helpful to companies. The second was that "usability is just as important as engineering principles and practices". The third was that there needs to be more work on "refining and elaborating on design principles – both in privacy engineering and usability design", including efforts to define international privacy standards. The final lesson was that "regulators must do more than merely recommend the adoption and implementation of privacy by design."[8]

The advent of the GDPR, with its maximum fine of 4% of global turnover, now weighs privacy compliance against a concrete business cost, addressing the voluntary-compliance criticism and Rubinstein and Good's requirement that "regulators must do more than merely recommend the adoption and implementation of privacy by design".[8] Rubinstein and Good also highlighted that privacy by design could result in applications exemplifying its principles, and their work was well received.[50][8]

In May 2018, European Data Protection Supervisor Giovanni Buttarelli's paper Preliminary Opinion on Privacy by Design stated: "While privacy by design has made significant progress in legal, technological and conceptual development, it is still far from unfolding its full potential for the protection of the fundamental rights of individuals. The following sections of this opinion provide an overview of relevant developments and recommend further efforts".[12]

The executive summary makes the following recommendations to EU institutions:

  • To ensure strong privacy protection, including privacy by design, in the ePrivacy Regulation,
  • To support privacy in all legal frameworks which influence the design of technology, increasing incentives and substantiating obligations, including appropriate liability rules,
  • To foster the roll-out and adoption of privacy by design approaches and PETs in the EU and at the member states’ level through appropriate implementing measures and policy initiatives,
  • To ensure competence and resources for research and analysis on privacy engineering and privacy-enhancing technologies at EU level, by ENISA or other entities,
  • To support the development of new practices and business models through the research and technology development instruments of the EU,
  • To support EU and national public administrations to integrate appropriate privacy by design requirements in public procurement,
  • To support an inventory and observatory of the “state of the art” of privacy engineering and PETs and their advancement.

The EDPS will:

  • Continue to promote privacy by design, where appropriate in cooperation with other data protection authorities in the European Data Protection Board (EDPB),
  • Support coordinated and effective enforcement of Article 25 of the GDPR and related provisions,
  • Provide guidance to controllers on the appropriate implementation of the principle laid down in the legal base, and
  • Together with data protection authorities of Austria, Ireland and Schleswig-Holstein, award privacy friendly apps in the mobile health domain.[12]

Implementing privacy by design

The European Data Protection Supervisor Giovanni Buttarelli set out the requirement to implement privacy by design in his article.[51] The European Union Agency for Network and Information Security (ENISA) provided a detailed report, Privacy and Data Protection by Design – From Policy to Engineering, on implementation.[52] The Summer School on real-world crypto and privacy provided a tutorial on "Engineering Privacy by Design".[53] The OWASP Top 10 Privacy Risks Project for web applications gives hints on how to implement privacy by design in practice. The OASIS Privacy by Design Documentation for Software Engineers (PbD-SE)[26] offers a privacy extension/complement to OMG's Unified Modeling Language (UML) and serves as a complement to OASIS' eXtensible Access Control Mark-up Language (XACML) and Privacy Management Reference Model (PMRM). Privacy by design guidelines have been developed to operationalise some of the high-level privacy-preserving ideas into more granular, actionable advice,[54][55] such as recommendations on how to implement privacy by design in existing (data) systems. However, applying privacy by design guidelines remains a challenge for software developers.[56]
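To illustrate how a high-level guideline can become granular, actionable code, here is a hedged sketch (the decorator, purpose names, and functions are all hypothetical, not from PbD-SE or any cited guideline) of enforcing purpose limitation at the point of data access:

```python
import functools

ALLOWED_PURPOSES = {"shipping", "billing"}  # hypothetical consented purposes

def requires_purpose(purpose):
    """Refuse data access unless the call is made for a consented purpose."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if purpose not in ALLOWED_PURPOSES:
                raise PermissionError(f"purpose {purpose!r} not consented to")
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@requires_purpose("shipping")
def get_address(user_id):
    return {"user": user_id, "address": "1 Main St"}

@requires_purpose("profiling")
def build_ad_profile(user_id):
    return {}

# Access for a consented purpose succeeds; a non-consented one is refused.
assert get_address("u1")["address"] == "1 Main St"
try:
    build_ad_profile("u1")
    raise AssertionError("should have been refused")
except PermissionError:
    pass
```

The point of such a pattern is that the purpose check lives in the code path itself, so adherence can be demonstrated as an artifact rather than asserted in a policy document.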

See also

References

  1. Hes, R. "Privacy Enhancing Technologies: the path to anonymity" (PDF).
  2. Hustinx, Peter (2010). "Privacy by Design: Delivering the Promises". Identity in the Information Society. 3 (2): 253–255. doi:10.1007/s12394-010-0061-z.
  3. Cavoukian, Ann. "7 Foundational Principles" (PDF).
  4. "32nd International Conference of Data Protection and Privacy Commissioners Jerusalem, Israel 27-29 October, 2010 Resolution on Privacy by Design" (PDF).
  5. Xu, Heng; Crossler, Robert E.; Bélanger, France (2012-12-01). "A Value Sensitive Design Investigation of Privacy Enhancing Tools in Web Browsers". Decision Support Systems. 54 (1): 424–433. doi:10.1016/j.dss.2012.06.003. ISSN 0167-9236. S2CID 14780230.
  6. Cavoukian, Ann (2011). "Privacy by Design" (PDF). Information and Privacy Commissioner.
  7. van Rest, Jeroen (2014). "Designing Privacy-by-Design". Privacy Technologies and Policy. Lecture Notes in Computer Science. Vol. 8319. pp. 55–72. doi:10.1007/978-3-642-54069-1_4. ISBN 978-3-642-54068-4.
  8. Rubinstein, Ira; Good, Nathan (2012-08-11). "Privacy by Design: A Counterfactual Analysis of Google and Facebook Privacy Incidents". SSRN 2128146.
  9. Gürses, Seda; Troncoso, Carmela; Diaz, Claudia. "Engineering Privacy by Design" (PDF).
  10. van Dijk, Niels; Tanas, Alessia; Rommetveit, Kjetil; Raab, Charles (2018-04-10). "Right engineering? The redesign of privacy and personal data protection". International Review of Law Computers & Technology. 32 (2): 230–256. doi:10.1080/13600869.2018.1457002. hdl:20.500.11820/fc11577d-3520-4ae4-abfd-3d767aeac906. S2CID 65276552.
  11. "Toward Privacy by Design for Data" (PDF). IEEE Data Engineering Bulletin, Special issue on the system implications of GDPR. Retrieved 2022-07-29.
  12. "Preliminary Opinion on privacy by design" (PDF). Giovanni Buttarelli.
  13. "Privacy Conference 2009 Fifth Plenary Session – Privacy by Design".
  14. "Report on the State of PbD to the 33rd International Conference of Data Protection and Privacy Commissioners" (PDF).
  15. Cavoukian, Ann (2010). "Privacy by Design: the definitive workshop. A foreword by Ann Cavoukian, Ph.D" (PDF). Identity in the Information Society. 3 (2): 247–251. doi:10.1007/s12394-010-0062-y. S2CID 144133793.
  16. "'Privacy by Design' approach gains international recognition". 2010-11-04.
  17. "Landmark Resolution passed to preserve the Future of Privacy". Archived from the original on 2010-11-08.
  18. Cavoukian, Ann (January 2011). "The 7 Foundational Principles Implementation and Mapping of Fair Information Practices" (PDF). Information and Privacy Commissioner of Ontario.
  19. Cavoukian, Ann. "Privacy by Design – Primer" (PDF).
  20. Cavoukian, Ann. "Privacy by Design – The 7 Foundational Principles" (PDF). Privacy and Big Data Institute.
  21. "Citations for Privacy by Design in Law, Policy and Practice". Google Scholar.
  22. Cavoukian, Ann. "Privacy by Design in Law, Policy and Practice – A White Paper for Regulators, Decision-makers and Policy-makers" (PDF).
  23. "ISO/PC 317 - Consumer protection: privacy by design for consumer goods and services". 2018-05-11.
  24. "ISO/PC 317 - Participating Members".
  25. "SCC ISO/PC 317 - Consumer protection: privacy by design for consumer goods and services". 2018-04-09.
  26. "OASIS Privacy by Design Documentation for Software Engineers (PbD-SE) TC".
  27. "Privacy Enhancing Technologies (PETs)".
  28. "Privacy by Design – An Overview of Privacy Enhancing Technologies" (PDF).
  29. Danezis, George; Domingo-Ferrer, Josep; Hansen, Marit; Hoepman, Jaap-Henk; Le Metayer, Daniel; Tirtea, Rodica; Schiffner, Stefan (2015). "privacy and data protection by design from policy to engineering". ENISA. arXiv:1501.03726. doi:10.2824/38623. ISBN 9789292041083. S2CID 7917275.
  30. Cavoukian, Ann. "Privacy by Design: Origins, Meaning, and Prospects for Assuring Privacy and Trust in the Information Era)".
  31. "The Role of Privacy by Design in Protecting Consumer Privacy".
  32. "Bundesgesetzblatt".
  33. "Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation)". European Commissioner (January 2012).
  34. "European Commission - Fact Sheet Questions and Answers – General Data Protection Regulation".
  35. "Privacy, Trust and Innovation – Building Canada's Digital Advantage". 2010.
  36. "Towards Privacy by Design: Review of the Personal Information Protection and Electronic Documents Act. Report of the Standing Committee on Access to Information, Privacy and Ethics" (PDF).
  37. "Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for businesses and policy-makers" (PDF). FTC Report (March 2012).
  38. "FTC Issues Final Commission Report on Protecting Consumer Privacy". 2012-03-26.
  39. "Office of the Victorian Information Commissioner - Privacy by Design".
  40. "UK ICO - Privacy by Design". Archived from the original on 2018-05-24.
  41. "UK ICO - Data protection by design and default". 2018-11-23.
  42. "Mauritius Declaration on the Internet of Things" (PDF).
  43. "About the Privacy by Design Conference".
  44. "Privacy Commissioner for Personal Data – Privacy by Design".
  45. "Sidewalk Toronto commits to privacy by design principles amid citizen concerns". 2018-05-07.
  46. "Privacy by Design at Microsoft". 2010-11-30.
  47. "Ryerson, Deloitte partner to offer privacy certifications".
  48. "New Digital Security Models" (PDF). Danish National It and Telecom Agency.
  49. Rommetveit, Kjetil; Van Dijk, Niels (2022). "Privacy Engineering and the Techno-regulatory Imaginary". Social Studies of Science. 52 (Online first): 853–877. doi:10.1177/03063127221119424. PMC 9676411. PMID 36000578. S2CID 251767267.
  50. "Why 'Privacy By Design' Is The New Corporate Hotness". Kashmir Hill.
  51. "Privacy by Design - Privacy Engineering" (PDF). Giovanni Buttarelli.
  52. "Privacy and Data Protection by Design – from policy to engineering". ENISA.
  53. "Engineering privacy by design" (PDF).
  54. Perera, Charith; Barhamgi, Mahmoud; Bandara, Arosha K.; Ajmal, Muhammad; Price, Blaine; Nuseibeh, Bashar (February 2020). "Designing privacy-aware internet of things applications". Information Sciences. 512: 238–257. arXiv:1703.03892. doi:10.1016/j.ins.2019.09.061. S2CID 60044.
  55. "Implementing Privacy By Design". Privacy Policies. Retrieved 2020-12-13.
  56. Tahaei, Mohammad; Li, Tianshi; Vaniea, Kami (2022-04-01). "Understanding Privacy-Related Advice on Stack Overflow". Proceedings on Privacy Enhancing Technologies. 2022 (2): 114–131. doi:10.2478/popets-2022-0038.
This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.