Wednesday, May 18, 2016

Seminar 6 Vian Bakir Position Statement


Engaging Publics and Policy-Makers on core DATA-PSST! Issues

 Vian Bakir

Professor of Political Communication & Journalism, Bangor University


Drawing on this seminar series’ policy recommendations, their experience of the series, and their subject expertise, we asked participants in this final DATA-PSST! seminar to reflect on what they think the general public and policy-makers most need to know; why they need to know this; and how these messages can be creatively and feasibly communicated.

What do the general public most need to know?

Seminar 2 concluded that ‘Given the many conflicting opinion polls and studies conducted since Snowden, a definitive analysis is needed on public perceptions of intelligence, surveillance, oversight and accountability.’

We took this challenge on and published, in 2015, Public Feeling on Privacy, Security and Surveillance - A Report by DATA-PSST and DCSS. This is a synthesis of opinion polls on state and commercial surveillance, and of EU-wide citizen summits on trade-offs between security and privacy conducted by the project on Surveillance, Privacy and Security (SurPRISE). This produced the following observations:

-        Unlike the UK government, the British public sees bulk data collection as constituting mass surveillance.

-        The topics of UK state surveillance of digital communications and online privacy matter to the British and the wider EU public. This is confirmed by opinion poll data since 2013 and by in-depth studies.

-        The EU and UK public think that although certain surveillance technologies are useful/effective for combating national security threats, they compromise human rights and are abused by security agencies. These concerns especially apply to deep packet inspection.

-        There are identifiable criteria for what makes security-oriented surveillance technologies acceptable for EU publics. Targeted rather than blanket surveillance is preferred, as are clear communications to citizens about what is going on, with strong regulatory oversight.


Various recommendations emerged, but perhaps the most important is that:
‘governments seeking a popular mandate for digital surveillance should ensure that such surveillance is targeted rather than blanket, accompanied by strong regulatory oversight and clear communications to citizens about what is going on.’

From this, we can conclude that the public needs to know, in clear terms, what is going on when their data is surveilled, and how this surveillance is overseen. This would be the first stage in enabling them to make informed decisions on whether they are OK with such surveillance, and if not, what they can do about it.

This recommendation feeds into the second question set for this seminar: what do policy-makers most need to know?

What do policy-makers most need to know?


A constellation of policy recommendations from the past five DATA-PSST! seminars converges on the same two points, both revolving around what citizens want: namely, better oversight of surveillant entities (i.e. intelligence agencies and commercial firms); and better communication of what is going on:

      On better oversight of surveillant entities, Policy Recommendations include:
-        We recommend a particular form of transparency – with opacity built in to protect necessary secrets, but with regular and periodical review of all stages of the data process by diverse actors drawn from citizenry, civil liberties groups, technologists, industry and of course intelligence agencies. (Seminar 5)
-        To improve oversight, and trust in this process, independent members of the public should be able to contribute to the Intelligence and Security Committee of Parliament. (Seminar 2)
-        We suggest greater transparency about data collection and processing, and about the effectiveness of policies based on such surveillance. (Seminar 5)
-        More accountability, not only transparency, concerning the actions of the state and secret-services is needed if public trust is to be rebuilt. (Seminar 3)
-        At policy-making level, participants recommend that: the government’s definition of its targets and who extremists are needs to be much more narrow; and selling surveillance technologies to non-democratic states must be regulated with better monitoring. (Seminar 3)
-        There needs to be meaningful review of the oversight for surveillance in the UK as well as greater openness regarding the systems in place to ensure targeting is carried out in a way that protects minorities and respects free speech and civil/human rights. We must target incitement and planning of violent activities. However, extreme views are not illegal. (Seminar 2)

     On better communication of what is going on, Policy Recommendations include:
-        We suggest that the aims of any governmental or commercial surveillant organisation involved in data collection and processing are publicly articulated more fully and clearly. They should provide more detail than blanket terms such as ‘protecting national security’, and more meaningful clarity than complex Terms and Conditions and associated tick-boxes of consent and compliance. (Seminar 5)
-        For these aims to be better understood within society we suggest the need for greater public engagement by surveillant entities with citizens. This would help generate challenges, dialogue and perhaps even consensus and greater trust.  (Seminar 5)
-        The public needs more digital and data literacy. As an ethical starting point, governments should more fully share with the public what their capacities to surveil are. The public needs to understand the surveillant black boxes that pervade everyday life, and what it gives up if it withholds data from commercial surveillers. We need a public debate involving mainstream media on whether we are able to understand these abstract surveillant processes. (Seminar 4)
-        More education and a better quality public debate (eg in the media) are required to inform the public on matters of surveillance and national security. The complexity of the issue makes it difficult to explain, and we need to find ways of making these issues both clearer and more relevant for a general public, bearing in mind that social change can happen through ‘agitators’ creating a better debate. (Seminar 3)

This seminar’s third question is on how these messages can be creatively and feasibly communicated.

How can these messages be creatively and feasibly communicated?


Specifically addressing communication of some of the messages above, Andrew McStay, drawing on his expertise in advertising and privacy, has initiated a creative advertising brief for seminar participants to consider.

In terms of how to engage creatively and feasibly with a range of users, we have a range of highly innovative ideas in this seminar’s Position Statements. For instance, reflecting the inter-disciplinary focus of DATA-PSST!, some focus on explaining these abstract, complex ideas and processes to students reading very different types of university degree. Yuwei Lin explains how she has been encouraging data literacy, especially knowledge of big data, privacy and surveillance, with her arts and design students at the University for the Creative Arts. Lachlan Urquhart explains his innovative ‘data protection by design’ playing cards developed at the University of Nottingham. These help computer designers and engineers explore the unfamiliar or challenging issues of forthcoming EU Data Protection law, so moving the principle of data protection by design from theory into practice.

More broadly reflecting on the creative tensions and opportunities when theory is married with practice, Clare Birchall asks what media form is best suited to disseminating the outcomes from a seminar series on transparency, surveillance and privacy. She urges us to think about the role and limits of revelation in public life and to experiment with media forms to highlight the problematics inherent in the ‘objects’ we study. An exemplary practice here is her colloquium on the politics and practices of secrecy. Reflecting also on the political economy of media forms, she further urges us to adopt ethical publishing practices.
Whatever we decide in this seminar, there should be some interesting outcomes. Watch this space!

Tuesday, May 17, 2016

Seminar 6: Steve Wright Position Statement


UNIVERSAL TARGETING – Can it be avoided?

Steve Wright

Applied Ethics, Leeds Beckett University


Much of the surveillance debate in criminology quarters has been taken up with issues of privacy and accountability, and rightly so. Technological capacities have advanced in ways which were undreamt of when surveillance studies emerged as a discipline in the late seventies and early eighties. Now respected scientific magazines such as New Scientist have asked whether we have entered a new era: is this ‘the end of anonymity’, and ‘what happens when we cannot hide who we are anymore?’ (New Scientist, 26 October 2013). These are significant issues for debate, but to my mind they do not capture the core future issue of this important field. To me the single most important future question is: what happens, in human terms, when you pass your surveillance agenda over to the military and their ancillary toolboxes, ideology and architecture, with all of their capacities?

Paul Lashmar has recently circulated this news item: David Rose, ‘Britain’s top secret kill list: How British police backed by GCHQ fed names of drug lords to a US assassination unit, which, under cover of the war on terror, wiped out an innocent family with a missile strike’, The Mail on Sunday, 9 April 2016. http://www.dailymail.co.uk/news/article-3531814/Britain-s-secret-kill-list-British-police-backed-GCHQ-fed-names-drug-lords-assassination-unit-cover-war-terror-wiped-innocent-family-missile-strike.html
In a nutshell, it is about extrajudicial killing using surveillance, and British collusion with a process which has gone beyond the limits of the law. And they got it wrong, so who can the family call? At the time of writing I am sensitive to the subject, since the European Parliament Library Information Service has recently published a report on fighting the trade in tools of execution and torture: https://epthinktank.eu/2016/04/05/fighting-trade-in-tools-for-torture-and-executions-eu-legislation-in-progress/. I spent time serving as an expert on the relevant committee working to make the EU a torture- and execution-technology-free zone. Yet here we have an operation which, outside of Northern Ireland, would never be legally countenanced in Europe. But we outsourced it from here and brokered the necessary surveillance and killing tools, which then got the wrong people; the military, however, are in the luxurious position of burying their mistakes.

Military-style surveillance operates on budgets which are beyond the reach of most civilian counterparts. And yet that technology, and its integration with civilian counterparts, is beginning to proliferate down into our street operations and overall apparatus of policing, albeit with a C4I dimension of command, control, communications, computers and intelligence. This cybernetic approach is goal-orientated and is essentially about targeting, and of course there is an economic benefit for key or prime manufacturers advocating such ‘solutions’ using big data for civilian policing.

The danger is that such military approaches are custom-built to move beyond simply observational surveillance into targeted functions, if permissions are incorporated into such systems, including intelligence and geo-location of wanted groups, suspect communities or high-security installations around which designated new rules apply. So there will always be the danger of new norms arising, of which there is precious little awareness and against which there is little effective opposition. Right now, in Europe, that propensity is most likely to arise under the guise of counter-terror operations or border-management operations, where the status of the group under surveillance is suspect, or where the borders, including maritime borders, are simply too extensive for humans to do an effective job. Here we can foresee a perfect storm of migrants, suspect communities and potential threats demanding not just new levels of scrutiny but new capacities of exclusion if need be.

Surveillance, targeting and area denial are natural operations in military practice, but in most European areas such an overt military presence on the ground is unusual, or was. Military personnel are now common sights in both Paris and Brussels after recent attacks, and they are, in their own words, on ‘a war footing’, which means operationalizing targeted surveillance backed up by lethal force. Future attacks in Europe are almost inevitable and will accelerate this process. The biggest challenge is talking about such realities now, rather than in the highly emotional aftermath of some directed terrorist swarm attack with casualties in the hundreds across more than one state. If any part of such a nightmare scenario were to be realized, the rules of surveillance as we know them would be rapidly dropped in favour of the militarized paradigm outlined here.
I hope we can discuss the pragmatics of avoiding such a depressing future.



Seminar 6 Clare Birchall Position Statement



Thoughts on Dissemination


Clare Birchall

Contemporary Culture, King’s College London



To accompany this final seminar in the series, I wanted to ask what media form would be best suited to the dissemination of the outcomes from a seminar series on the subjects of transparency, surveillance and privacy. The blog on which this position statement appears has been the medium of choice up until this point. There is also a special issue of Big Data and Society in the pipeline, as well as a documentary and a Vine in development. Announcements have been made over listservs, and commentary has circulated via Twitter. We have also turned to traditional media outlets to communicate the pressing concerns of this network. The final seminar is dedicated to discussing dissemination and outputs.

In this position statement, I want to think about the same issue from a slightly different angle. Running alongside my pragmatic concerns as a co-investigator with how to achieve impact and maximum outreach, I want to think about the role and limits of revelation in public life as well as the ways we can experiment with media forms to highlight the problematics inherent in the ‘objects’ we study.

Jodi Dean insists that revelation and transparency are beside the point. “All sorts of horrible political processes are perfectly transparent today. The problem is that people . . . are so enthralled to transparency that they have lost the will to fight” (2002: 174). She calls for “decisive action” as a remedy. Alasdair Roberts makes a similar argument: “The significance of Abu Ghraib,” he writes in this context, “may also lie in the extent to which we overestimated the catalytic effect of exposure” (2006: 238). For him, democracy has to involve the responsibility of the public to act upon the information it apparently has a right to. Jeremy Gilbert asserts that any tendency towards transparency “has to go beyond the mere telling of secrets and become real acts of what we might call . . . ‘publication,’ or ‘publicity,’” (2007: 38) which involves the politicization of an event or issue – making it an object of debate, discussion, and intervention. While coming from different political angles, all of these writers insist on the need for action, decision and politicization to accompany transparency measures, exposé, and revelation. Something has to ‘happen’ because of the new information and data released into the public sphere.

If we call on the language of Jacques Rancière, we could say that “the distribution of the sensible” has to alter because of the new space such information and data takes up. Rancière’s distribution of the sensible is an aesthetico-political settlement. It is, in his words:

a delimitation of spaces and times, of the visible and the invisible, of speech and noise, that simultaneously determines the place and the stakes of politics as a form of experience. Politics revolves around what is seen and what can be said about it, around who has the ability to see and the talent to speak, around the properties of spaces and the possibilities of time. (2004: 12-13)

The distributive regime determines what action, reaction, and thought is possible in any given situation. It is political precisely because in every ‘distribution of the sensible’ equality is either undermined or affirmed.

A distribution determines “those who have a part in the community of citizens” (7); it “reveals who can have a share in what is common to the community based on what they do and on the time and space in which this activity is performed” (8). Equality is when those without part, the unrepresented, come to take part; those without a share, have a share. In a process of subjectivisation, this involves refuting the subject positions one is ascribed by the system, and finding a name or identity-in-relation that will enable full participation and recognition – akin to the work the term ‘proletariat’ once performed. An instantiation of politics based on equality, then, is when demands for a new division and sharing of the social whole are granted.

In this way, the Snowden revelations or the Panama Papers potentially alter the distribution of the sensible, changing what can be known, but the revelations can always be absorbed into the white noise of communicative capitalism, be contained by various discursive manoeuvres, or prompt only weak tweaks to a robustly inequitable system before any new division has taken place.

In light of this, it might not always be desirable to reach the largest possible number of readers or audience members. It might, rather, be preferable to reach readers in particular ways or at particular times, to maximize attention and the chance for action. Such an approach to dissemination means keeping two things in mind:

1)     the confluence between form and content;
2)     the political economy of media forms.

To take the first of these, I’d like briefly to describe a project I devised with the help of my collaborator, Pete Woodbridge. Having put on a colloquium concerned with the politics and practices of secrecy, I wanted to disseminate recordings of the talks and sessions in a manner that did justice to the theme with which they were concerned. Pete and I decided that we wanted to emulate the experience of the secret. To do so, we “leaked” an instruction on listservs and social media to visit a website and enlist for more instructions. For those who signed up, a message was secreted to them with instructions on how to find and decrypt the talks online, along with the necessary passwords. The instructions self-destructed, erasing themselves before the viewers’ eyes. The secret was briefly revealed. We also made the data available as a torrent, so that files of the talks were distributed over the network like fragments of a secret. If users downloaded the files as a torrent onto their computers, they were emulating the secret societies that the grandfather of secrecy studies, Georg Simmel, wrote so much about.

In terms of the second point, it is necessary, as academics engaged with the politics of transparency, that we self-reflexively consider the political economy of the publishing and distribution networks we engage with. The links major academic publishers have with ethically dubious enterprises have been well documented by Ted Striphas, Gary Hall, Janneke Adema and others. Though we all have to work within the constraints and demands of modern academic jobs (not least the REF in the UK), it is important to place the ethics of publishing above metrics. With these concerns in mind, and working with like-minded colleagues such as the aforementioned Gary Hall and Janneke Adema, I have been involved with various alternative publishing projects such as Liquid Books (an experimental series of open-edited and open-access books); Living Books About Life (a series of open-edited and open-access books funded by JISC); and Open Humanities Press (an open access publisher focused on critical and cultural theory that acts on principles of access, diversity and transparency).

In practice, ethical publishing means, whenever possible, fulfilling the following goals:

·       working on a non-profit basis – all OHP books and journals, for example, are available open access on a free gratis basis, some of them libre too;
·       using open source software – OHP journals generally use either Open Journal Systems or WordPress;
·       operating as a collective – of theorists, philosophers, researchers, scholars, librarians, publishers, technologists and others. OHP operates as a networked, cooperative, collaborative, unpaid multi-user collective;
·       gifting our labour – rather than insisting on being paid for it. We see this as a means of helping to de-center waged work from its privileged place in late capitalist neoliberal society;
·       and working horizontally in a non-rivalrous fashion - OHP freely shares its knowledge, expertise and even its books with other presses such as Open Book Publishers at Cambridge, Open Edition in France, and the Hybrid Publishing Lab at Leuphana University in Germany.

I mention all of this by way of raising the following questions: What form of dissemination will do justice to the concerns of privacy, transparency, and surveillance? And: What would ethically informed modes of transparency/revelation/dissemination look like? How can we ensure that our revelations will be actionable? That they will alter, for the better, the distribution of the sensible?

Seminar 6 Lachlan Urquhart Position Statement


Making Privacy by Design a Reality?


Lachlan Urquhart

Mixed Reality Laboratory & Horizon Digital Economy CDT, University of Nottingham



We have developed a tool that aims to take the principle of data protection by design from theory into practice. Article 25 of the newly adopted General Data Protection Regulation (GDPR) mandates data protection by design and by default (DPbD). This requires system designers to be more involved in data protection regulation, early on in the innovation process. Whilst this idea makes sense, we need better tools to help designers actually meet their new regulatory obligations. [1]

Guidance on what DPbD actually requires in practice is sparse, although work from usable privacy and security and from privacy engineering does provide some pointers [5, 6]. Such approaches may favour technical measures, like anonymisation or tools to increase users’ control over their personal data [7], or organisational measures, like privacy impact assessments [2].

By calling for design to be part of regulation, the law is calling upon the system design community, one that is not ordinarily trained or equipped to deal with regulatory issues. Law is not intuitive or accessible to non-lawyers, yet by mandating privacy by design, the law requires non-lawyers to be involved in regulatory practices. We argue that there is a need to engage, sensitise and guide designers on data protection issues on their own terms.

Presenting a non-legal community with legislation, case law or principles framed in complex, inaccessible legalese is not a good starting point. Instead, a truly multidisciplinary approach is required to translate legal principles from law to design. In our case, we bring together information technology law and human computer interaction. [4]

Our data protection by design cards are an ideation technique that helps designers explore the unfamiliar or challenging issues of EU DP law. [8] Our cards focus on the newly passed GDPR, which comes into effect in 2018. They are designed to be sufficiently lightweight for deployment in a range of design contexts, e.g. connected-home ecosystems or smart cars. We have been testing them through workshops with teams of designers in industry and education contexts, trying to understand the utility of the cards as a privacy by design tool. [9]

A further challenge for privacy by design goes beyond how to communicate regulatory requirements to communities unfamiliar with the law and policy landscape. Whilst finding mechanisms for delivering complex content in more accessible ways, like our cards, is one issue, finding the best forums for engagement with these concepts is another. Two examples could be the roles of state regulators and of industry/professional associations. State regulatory bodies, like the UK ICO or the EU Article 29 Working Party, have a role to play in broadcasting compliance material and supporting technology designers’ understanding of law and regulation. The needs of each business will vary, and support has to adapt accordingly. One example is the size and resources a business has at its disposal: these are highly likely to dictate how much support it needs to understand regulatory requirements, e.g. an under-resourced Small or Medium-sized Enterprise vs. a multinational with in-house legal services.

Industry and professional associations, like the British Computer Society, the Association for Computing Machinery or the Institute of Electrical and Electronics Engineers, may also be suitable forums for raising awareness among members about the importance of regulation. Sharing best practice is a key element of this, and these organisations are in a good position to feed their experience into codes of conduct, like those suggested by Article 40 GDPR.

[1] – L Urquhart and E Luger, “Smart Cities: Creative Compliance and the Rise of ‘Designers as Regulators’” (2015) Computers and Law 26(2)
[2] – D Wright and P De Hert, Privacy Impact Assessment (Springer 2012)
[3] – A29 WP, “Opinion 8/2014 on the Recent Developments on the Internet of Things”, WP 233
[4] – We are conducting a project in the EU and US involving researchers from: University of Nottingham (Tom Rodden, Neha Gupta, Lachlan Urquhart), Microsoft Research Cambridge (Ewa Luger, Mike Golembewski), Intel (Jonathan Fox), Microsoft (Janice Tsai), University of California Irvine (Hadar Ziv) and New York University (Lesley Fosh and Sameer Patil). The EU project page and cards are available at designingforprivacy.co.uk
[5] – J Hong, “Usable Privacy and Security: A Grand Challenge for HCI” (2009) Human Computer Interaction Institute
[6] – Danezis et al, “Privacy and Data Protection by Design - from Policy to Engineering” (2014) ENISA; M Dennedy, J Fox and T Finneran, The Privacy Engineer’s Manifesto (2014) Apress; S Spiekermann and LF Cranor, “Engineering Privacy” (2009) IEEE Transactions on Software Engineering 35(1)
[7] – H Haddadi et al, “Personal Data: Thinking Inside the Box” (2015) 5th Decennial Aarhus Conferences; R Mortier et al, “Human-Data Interaction: The Human Face of the Data-Driven Society” (2014) http://hdiresearch.org/
[8] – IDEO, https://www.ideo.com/work/method-cards; M Golembewski and M Selby, “Ideation Decks: A Card Based Ideation Tool” (2010) Proceedings of ACM DIS ’10, Aarhus, Denmark, https://dl.acm.org/citation.cfm?id=1858189
[9] – E Luger, L Urquhart, T Rodden and M Golembewski, “Playing the Legal Card” (2015) Proceedings of ACM CHI ’15, Seoul, South Korea


Seminar 6 Yuwei Lin Position Statement


A pedagogy for artivism


Yuwei Lin

University for the Creative Arts


The importance of data literacy, and the need to improve it through both formal educational channels and public engagement, has been flagged up in every past DATA-PSST! workshop I have attended. There is a real demand for action. This academic year, 2015-16, I took advantage of the knowledge I gained from the DATA-PSST! workshops and devised a curriculum teaching the concepts of ‘big data’, ‘privacy’ and ‘surveillance’ to my Level 5 undergraduate students at a specialist arts and design university in the UK, the University for the Creative Arts. We not only approached the issues from a legal perspective (studying a number of problematic laws, including the Data Retention and Investigatory Powers Act (DRIPA) and the Digital Economy Act), but also engaged in the debate in a hands-on, practical manner.

In studying these laws, the students demonstrated a good understanding of the flaws and problems of DRIPA and the Digital Economy Act. They were fascinated by the grand gestures of figures such as Edward Snowden and Julian Assange, and by the campaigns that various civil-society groups have staged, including the Open Rights Group. However, many of them still subscribed to the government’s ‘propaganda’ line: ‘nothing to hide, nothing to fear’. This is a generation of believers in ‘the death of privacy’. Since their engagement with digital technologies is so deeply entangled with everyday life, it is difficult to ask them to keep a ‘critical distance’.

One of the assignments asked the students to make a short video expressing their idea of ‘privacy’. Their interpretations varied widely. Here is a list of exemplary videos made by the UCA students that you can view online:

https://youtu.be/n5TQ1EYgcm8
https://youtu.be/Fy8JzK5dxNw
https://youtu.be/inHevgHzs2I
https://www.youtube.com/watch?v=7ABPuOQUoUI     
https://www.youtube.com/watch?v=9ij2JTC8Y3A
https://vimeo.com/149351313
https://vimeo.com/149265579
https://youtu.be/dswcOI_K05k
https://www.youtube.com/watch?v=Fac3UuVYGyQ&feature=em-upload_owner

When asked to turn their ideas into an interactive piece to engage more people, quite a few designed quiz games or made narrative choice-based games.


Other interactive projects include a ‘You’ve been framed!’ performance, and an audio-visual installation with unsettling images from GCHQ.

Although I felt a bit disappointed that few of my undergraduate students actually became more critical of ‘privacy’ and ‘surveillance’ issues in today’s data-centric society, it motivates me to try harder next time. After all, it only shows how difficult it is to change people’s mindsets, beliefs and behaviours. It has definitely been an interesting journey.