Making Privacy by Design a Reality?
Lachlan Urquhart
Mixed Reality Laboratory & Horizon Digital Economy CDT, University of Nottingham
We have developed a tool that aims to
take the principle of data protection by design from theory into practice.
Article 23 of the General Data Protection Regulation (GDPR) reform package mandates
data protection (DP) by design and by default (DPbD). This requires system designers to
be more involved in data protection regulation early in the innovation
process. Whilst this idea makes sense, we need better tools to help designers
actually meet their new regulatory obligations. [1]
Guidance on what DPbD actually requires
in practice is sparse, although work from usable privacy and security and from
privacy engineering offers some direction [5, 6]. These approaches may favour
technical measures, like anonymisation or tools to increase user control over
personal data [7], or organisational measures, like privacy impact
assessments [2].
By calling on design to be part of
regulation, the law calls upon the system design community, one that is not
ordinarily trained or equipped to deal with regulatory issues. Law is not
intuitive or accessible to non-lawyers, yet by calling for privacy by design,
the law mandates that non-lawyers be involved in regulatory practice. We argue
that there is a need to engage, sensitise and guide designers on data
protection issues on their own terms.
Presenting a non-legal community with
legislation, case law or principles framed in complex, inaccessible legalese is
not a good starting point. Instead, a truly multidisciplinary approach is
required to translate principles from law into design. In our case, we
bring together information technology law and human-computer interaction. [4]
Our data protection by design cards are
an ideation technique that helps designers explore the unfamiliar or
challenging issues of EU DP law. [8] The cards focus on the newly passed GDPR,
which comes into effect in 2018, and are designed to be sufficiently
lightweight for deployment in a range of design contexts, e.g. connected home
ecosystems or smart cars. We have been testing them through workshops with
teams of designers in industry and education settings, seeking to
understand the utility of the cards as a privacy by design tool. [9]
A further challenge for privacy by
design goes beyond how to communicate regulatory requirements to communities
unfamiliar with the law and policy landscape. Whilst finding mechanisms for
delivering complex content in more accessible ways, like our cards, is one
issue, finding the best forums for engagement with these concepts is another.
Two such forums could be state regulators and industry or professional
associations. State regulatory bodies, like the UK Information Commissioner's
Office (ICO) or the EU Article 29 Working Party, have a role to play in
broadcasting compliance material and supporting technology designers'
understanding of law and regulation. The needs of each business will vary, and
support has to adapt accordingly. The size and resources a business has at its
disposal, for example, will likely dictate how much support it needs to
understand regulatory requirements: contrast an under-resourced small or
medium-sized enterprise with a multinational that has in-house legal services.
Industry and professional associations,
like the British Computer Society, the Association for Computing Machinery or
the Institute of Electrical and Electronics Engineers, may also be suitable
forums for raising awareness among members about the importance of regulation.
Sharing best practice is a key element of this, and these organisations are
well placed to feed their experience into codes of conduct, like those
suggested by Article 40 GDPR.
[1] L Urquhart and E Luger, "Smart Cities: Creative Compliance and the Rise of 'Designers as Regulators'" (2015) Computers and Law 26(2)
[2] D Wright and P De Hert, Privacy Impact Assessment (Springer 2012)
[3] Article 29 Working Party, "Opinion 8/2014 on the Recent Developments on the Internet of Things" (2014) WP 223
[4] We are conducting a project in the EU and US involving researchers from the University of Nottingham (Tom Rodden, Neha Gupta, Lachlan Urquhart), Microsoft Research Cambridge (Ewa Luger, Mike Golembewski), Intel (Jonathan Fox), Microsoft (Janice Tsai), the University of California, Irvine (Hadar Ziv) and New York University (Lesley Fosh and Sameer Patil). The EU project page and cards are available at designingforprivacy.co.uk
[5] J Hong, "Usable Privacy and Security: A Grand Challenge for HCI" (2009) Human Computer Interaction Institute
[6] Danezis et al, "Privacy and Data Protection by Design: From Policy to Engineering" (2014) ENISA; M Dennedy, J Fox and T Finneran, The Privacy Engineer's Manifesto (Apress 2014); S Spiekermann and LF Cranor, "Engineering Privacy" (2009) IEEE Transactions on Software Engineering 35(1)
[7] H Haddadi et al, "Personal Data: Thinking Inside the Box" (2015) 5th Decennial Aarhus Conference; R Mortier et al, "Human-Data Interaction: The Human Face of the Data-Driven Society" (2014) http://hdiresearch.org/
[8] IDEO Method Cards, https://www.ideo.com/work/method-cards; M Golembewski and M Selby, "Ideation Decks: A Card-Based Ideation Tool" (2010) Proceedings of ACM DIS '10, Aarhus, Denmark, https://dl.acm.org/citation.cfm?id=1858189
[9] E Luger, L Urquhart, T Rodden and M Golembewski, "Playing the Legal Card" (2015) Proceedings of ACM CHI '15, Seoul, South Korea