Addressing Myths of Privacy by Design
From time to time, my office becomes aware of comments questioning the value of Privacy by Design (PbD), our win/win approach to privacy protection, which calls for privacy to be built proactively into the design of information technologies, networked infrastructures and information practices. By embedding privacy directly into systems and practices from the outset, PbD not only ensures that privacy is an essential component of the core functionality being delivered, but also enables other, equally important functionalities to co-exist. PbD offers a holistic, proactive model for privacy protection that represents a significant shift from traditional reactive approaches, which set out minimum standards for information management practices and then provide post-hoc remedies for privacy breaches.
One criticism of PbD as a viable approach to privacy protection rests on a fundamental misunderstanding of PbD's purposes and scope. For example, in response to the paper co-authored by the IPC and Canada Health Infoway advocating a PbD approach to the design and implementation of electronic health records (EHRs) ["Embedding Privacy into the Design of EHRs to Enable Multiple Functionalities – Win/Win"], one blogger questioned whether the very notion of “Privacy by Design” could exist at all. The blogger argued that since privacy is a social challenge, not a technological one, it cannot be addressed solely through technological “tricks” such as a de-identification tool (described in our paper as one example of PbD in action in the context of EHRs). In his view, only humans, not technology, can truly protect privacy, by raising societal awareness of the importance of privacy and by developing policies to address new privacy challenges as they arise; a model like PbD therefore cannot ensure privacy, because the technological tools it encourages are useless unless they are implemented within a culture that respects privacy. It is precisely this either/or mindset that we must overcome: privacy protection need not proceed in only one way when multiple streams of action can move forward concurrently.
I fully agree that technology by itself is insufficient to protect privacy; what is needed is a holistic approach that embeds privacy not only within technology, but also within business practices, processes and networked infrastructure, in order to support a culture of privacy. This is precisely what PbD provides. PbD does not merely call for the deployment of certain technological “tricks,” as the blogger suggested, but for the careful consideration of privacy implications upfront, at the design stage, across all information technologies, business processes, physical spaces and networked infrastructures, so that privacy is built into all functionalities from the outset. If privacy is not built into technological solutions at the outset, those tools will not support whatever privacy policies are later implemented. Legacy systems illustrate this problem: it can be extremely difficult, if not impossible, to build privacy into these systems after the fact, creating difficulties in complying with privacy legislation, with policies requiring that access to information be limited on a “need to know” basis, and with requirements that records of access in these systems be maintained.
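To make the “need to know” limitation and the access-record requirement concrete, here is a minimal sketch of how a system designed with these rules in mind might enforce them. The role names, record types, and function names below are hypothetical, invented purely for illustration; a real EHR system would enforce such rules at every layer, not in one function.

```python
from datetime import datetime, timezone

# Hypothetical mapping of staff roles to the record types they
# "need to know" in order to do their jobs (illustrative only).
NEED_TO_KNOW = {
    "attending_physician": {"diagnosis", "medications", "lab_results"},
    "billing_clerk": {"billing_codes"},
}

audit_log = []  # records of access, retained for oversight and review


def access_record(user, role, patient_id, record_type):
    """Grant access only on a need-to-know basis, and log every attempt."""
    allowed = record_type in NEED_TO_KNOW.get(role, set())
    # Every attempt is recorded, whether granted or denied, so that
    # access to the system can later be audited.
    audit_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "role": role,
        "patient": patient_id,
        "record_type": record_type,
        "granted": allowed,
    })
    if not allowed:
        raise PermissionError(f"{role} has no need to know {record_type}")
    return f"contents of {record_type} for patient {patient_id}"
```

The point of the sketch is that both the access limit and the audit trail are part of the design itself, rather than policies bolted onto a system that was never built to observe them.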
Contrary to what was suggested, PbD does not merely propose that EHR designers “add on” a de-identification tool and conclude that technology has satisfied privacy requirements. Instead, PbD requires designers to think about privacy upfront and throughout the planning process, so that privacy is designed directly into EHR systems, intentionally and deliberately, in a manner that supports privacy legislation and internal privacy policies and procedures. As described in our paper, applying PbD in the context of EHR systems encompasses not only technological solutions (such as de-identification, encryption and secure auditing tools) but also the statutory requirements and privacy policies with which these tools are designed to comply, as well as the other elements of an appropriate governance framework, including independent privacy oversight, appropriate privacy and security training, robust data breach policies and procedures, and public education and awareness.
By correcting misinformation about the scope and purposes of PbD, I hope to raise awareness of what PbD can achieve: a positive-sum, win/win result in which all legitimate interests are enabled, while at the same time protecting privacy to the greatest degree possible. My office is not alone in championing PbD's proactive, holistic approach to privacy protection: PbD has been embraced internationally by the European Commission and the U.S. Federal Trade Commission, and was unanimously adopted as an International Standard by Data Protection and Privacy Commissioners in 2010. That landmark resolution recognized PbD as an “essential component of fundamental privacy protection,” and urged its adoption in regulations and legislation around the world.