Tuesday, May 17, 2016

Seminar 6: Steve Wright Position Statement


UNIVERSAL TARGETING – Can it be avoided?

Steve Wright

Applied Ethics, Leeds Beckett University


Much of the surveillance debate in criminology has been taken up with issues of privacy and accountability, and rightly so. Technological capacities have advanced in ways undreamt of when surveillance studies emerged as a discipline in the late Seventies and early Eighties. Respectable science magazines such as New Scientist now ask whether we have entered a new era – “The end of anonymity” – and “what happens when we cannot hide who we are anymore?” (New Scientist, 26 October 2013). These are significant issues for debate, but to my mind they do not capture the core future issue of this important field. For me, the single most important future question is: what happens, in human terms, when you pass your surveillance agenda over to the military and their ancillary toolboxes, ideology and architecture, with all of their capacities?

Paul Lashmar has recently circulated this news item: “Britain's top secret kill list: How British police backed by GCHQ fed names of drug lords to a US assassination unit, which – under cover of the war on terror – wiped out an innocent family with a missile strike”, by David Rose for The Mail on Sunday, 9 April 2016. http://www.dailymail.co.uk/news/article-3531814/Britain-s-secret-kill-list-British-police-backed-GCHQ-fed-names-drug-lords-assassination-unit-cover-war-terror-wiped-innocent-family-missile-strike.html
In a nutshell, it is about extrajudicial killing based on surveillance, and about British collusion with a process that has gone beyond the limits of the law. And they got it wrong, so who can the family call? At the time of writing I am particularly sensitive to the subject, since the European Parliament Library Information Service has recently published a report on fighting the trade in tools of execution and torture: https://epthinktank.eu/2016/04/05/fighting-trade-in-tools-for-torture-and-executions-eu-legislation-in-progress/. I spent time serving as an expert on the relevant committee working to make the EU a torture- and execution-technology-free zone. Yet here we have an operation which, outside of Northern Ireland, would never be legally countenanced in Europe. But we outsourced it from here and brokered the necessary surveillance and killing tools, which then hit the wrong people; and the military are in the luxurious position of being able to bury their mistakes.

Military-style surveillance operates on budgets beyond the reach of most civilian counterparts. Yet that technology, and its integration with civilian systems, is beginning to proliferate down into our street operations and the overall apparatus of policing – albeit with a C4I dimension of command, control, communications, computers and intelligence. This cybernetic approach is goal-oriented and essentially about targeting, and of course there is an economic benefit for key or prime manufacturers in advocating such big-data “solutions” for civilian policing.

The danger is that such military approaches are custom-built to move beyond merely observational surveillance into targeting functions, if permissions are incorporated into such systems – including intelligence on, and geo-location of, wanted groups, suspect communities, or high-security installations around which newly designated rules apply. So there will always be the danger of new norms arising of which there is precious little awareness and to which there is little effective opposition. Right now, in Europe, that propensity is most likely to arise under the guise of counter-terror or border-management operations, where the status of the group under surveillance is suspect or where the borders, including maritime borders, are simply too extensive for humans to police effectively. Here we can foresee a perfect storm of migrants, suspect communities and potential threats demanding not just new levels of scrutiny but, if need be, new capacities of exclusion. Surveillance, targeting and area denial are natural operations in military practice, but in most of Europe such an overt military presence on the ground is unusual – or was. Military personnel are now common sights in both Paris and Brussels after the recent attacks, and they are, in their own words, on “a war footing”, which means operationalizing targeted surveillance backed up by lethal force. Future attacks in Europe are almost inevitable and will accelerate this process. The biggest challenge is talking about such realities now rather than in the highly emotional aftermath of some directed terrorist swarm attack with casualties in their hundreds across more than one state. If any part of such a nightmare scenario were to be realized, the rules of surveillance as we know them would be rapidly dropped in favour of the militarized paradigm outlined here.
I hope we can discuss the pragmatics of avoiding such a depressing future.



Seminar 6: Clare Birchall Position Statement



Thoughts on Dissemination


Clare Birchall

Contemporary Culture, King's College London



To accompany this final seminar in the series, I wanted to ask what media form would be best suited to disseminating the outcomes of a seminar series on transparency, surveillance and privacy. The blog on which this position statement appears has been the medium of choice up to this point. There is also a special issue of Big Data and Society in the pipeline, as well as a documentary and a Vine in development. Announcements have been made over listservs and commentary offered via Twitter. Traditional media outlets, too, have been turned to in order to communicate the prescient concerns of this network. The final seminar is dedicated to discussing dissemination and outputs.

In this position statement, I want to think about the same issue from a slightly different angle. Alongside my pragmatic concerns as a co-investigator about how to achieve impact and maximum outreach, I want to think about the role and limits of revelation in public life, as well as the ways we can experiment with media forms to highlight the problematics inherent in the ‘objects’ we study.

Jodi Dean insists that revelation and transparency are beside the point. “All sorts of horrible political processes are perfectly transparent today. The problem is that people . . . are so enthralled to transparency that they have lost the will to fight” (2002: 174). She calls for “decisive action” as a remedy. Alasdair Roberts makes a similar argument: “The significance of Abu Ghraib,” he writes in this context, “may also lie in the extent to which we overestimated the catalytic effect of exposure” (2006: 238). For him, democracy has to involve the responsibility of the public to act upon the information it apparently has a right to. Jeremy Gilbert asserts that any tendency towards transparency “has to go beyond the mere telling of secrets and become real acts of what we might call . . . ‘publication,’ or ‘publicity,’” (2007: 38), which involves the politicization of an event or issue – making it an object of debate, discussion, and intervention. While coming from different political angles, all of these writers insist on the need for action, decision and politicization to accompany transparency measures, exposés and revelations. Something has to ‘happen’ because of the new information and data released into the public sphere.

If we call on the language of Jacques Rancière, we could say that “the distribution of the sensible” has to alter because of the new space such information and data take up. Rancière’s distribution of the sensible is an aesthetico-political settlement. It is, in his words:

a delimitation of spaces and times, of the visible and the invisible, of speech and noise, that simultaneously determines the place and the stakes of politics as a form of experience. Politics revolves around what is seen and what can be said about it, around who has the ability to see and the talent to speak, around the properties of spaces and the possibilities of time. (2004: 12-13)

The distributive regime determines what action, reaction, and thought are possible in any given situation. It is political precisely because in every ‘distribution of the sensible’ equality is either undermined or affirmed.

A distribution determines “those who have a part in the community of citizens” (7); it “reveals who can have a share in what is common to the community based on what they do and on the time and space in which this activity is performed” (8). Equality is when those without part, the unrepresented, come to take part; when those without a share come to have a share. In a process of subjectivisation, this involves refuting the subject positions one is ascribed by the system and finding a name or identity-in-relation that will enable full participation and recognition – akin to the work the term ‘proletariat’ once performed. An instantiation of politics based on equality, then, is when demands for a new division and sharing of the social whole are granted.

In this way, the Snowden revelations or the Panama Papers potentially alter the distribution of the sensible, changing what can be known, but the revelations can always be absorbed into the white noise of communicative capitalism, be contained by various discursive manoeuvres, or prompt only weak tweaks to a robustly inequitable system before any new division has taken place.

In light of this, it might not always be desirable to reach the largest number of readers or the widest audience. It might, rather, be preferable to reach readers in particular ways or at particular times, to maximize attention and the chance for action. Such an approach to dissemination means keeping two things in mind:

1)     the confluence between form and content;
2)     the political economy of media forms.

To take the first of these, I’d like to briefly describe a project I devised with the help of my collaborator, Pete Woodbridge. Having put on a colloquium concerned with the politics and practices of secrecy, I wanted to disseminate recordings of the talks and sessions in a manner that did justice to the theme with which they were concerned. Pete and I decided that we wanted to emulate the experience of the secret. To do so, we “leaked” an instruction on listservs and social media to visit a website and enlist for further instructions. Those who signed up were secretly sent a message with instructions on how to find and decrypt the talks online, along with the necessary passwords. The instructions self-destructed, erasing themselves before the viewers’ eyes. The secret was briefly revealed. We also made the data available as a torrent, so that files of the talks were distributed over the network like fragments of a secret. Users who downloaded the files as a torrent onto their computers were emulating the secret societies that the grandfather of secrecy studies, Georg Simmel, wrote so much about.

In terms of the second point, it is necessary, as academics engaged with the politics of transparency, that we self-reflexively consider the political economy of the publishing and distribution networks we engage with. The links major academic publishers have with ethically dubious enterprises have been well documented by Ted Striphas, Gary Hall, Janneke Adema and others. Though we all have to work within the constraints and demands of modern academic jobs (not least the REF in the UK), it is important to place the ethics of publishing above metrics. With these concerns in mind, and working with like-minded colleagues such as the aforementioned Gary Hall and Janneke Adema, I have been involved with various alternative publishing projects, such as Liquid Books (an experimental series of open-edited and open-access books); Living Books About Life (a series of open-edited and open-access books funded by JISC); and Open Humanities Press (an open-access publisher focused on critical and cultural theory that acts on principles of access, diversity and transparency).

In practice, ethical publishing means, whenever possible, fulfilling the following goals:

·       working on a non-profit basis – all OHP books and journals, for example, are available open access on a free gratis basis, some of them libre too;
·       using open source software – OHP journals generally use either Open Journal Systems or WordPress;
·       operating as a collective - of theorists, philosophers, researchers, scholars, librarians, publishers, technologists etc. OHP operates as a networked, cooperative, collaborative, unpaid multi-user collective;
·       gifting our labour – rather than insisting on being paid for it. We see this as a means of helping to de-center waged work from its privileged place in late capitalist neoliberal society;
·       and working horizontally in a non-rivalrous fashion - OHP freely shares its knowledge, expertise and even its books with other presses such as Open Book Publishers at Cambridge, Open Edition in France, and the Hybrid Publishing Lab at Leuphana University in Germany.

I mention all of this by way of raising the following questions: What form of dissemination will do justice to the concerns of privacy, transparency, and surveillance? And: What would ethically informed modes of transparency/revelation/dissemination look like? How can we ensure that our revelations will be actionable? That they will alter, for the better, the distribution of the sensible?

Seminar 6: Lachlan Urquhart Position Statement


Making Privacy by Design a Reality?


Lachlan Urquhart

Mixed Reality Laboratory & Horizon Digital Economy CDT, University of Nottingham



We have developed a tool that aims to take the principle of data protection by design from theory into practice. Article 23 of the EU General Data Protection Regulation (GDPR) reform package mandates data protection by design and by default (DPbD). This requires system designers to be more involved in data protection regulation early on in the innovation process. Whilst this idea makes sense, we need better tools to help designers actually meet their new regulatory obligations [1].

Guidance on what DPbD actually requires in practice is sparse, although work from usable privacy and security and from privacy engineering does provide some direction [5, 6]. Such approaches may favour technical measures like anonymisation or tools to increase user control over personal data [7], or organisational approaches like privacy impact assessments [2].

By calling for design to be part of regulation, the law is calling upon the system design community, one that is not ordinarily trained or equipped to deal with regulatory issues. Law is not intuitive or accessible to non-lawyers, yet by calling for privacy by design, the law mandates that non-lawyers be involved in regulatory practices. We argue that there is a need to engage, sensitise and guide designers on data protection issues on their own terms.

Presenting a non-legal community with legislation, case law or principles framed in complex, inaccessible legalese is not a good starting point. Instead, a truly multidisciplinary approach is required to translate legal principles from law to design. In our case, we bring together information technology law and human-computer interaction [4].

Our data protection by design cards are an ideation technique that helps designers explore the unfamiliar or challenging issues of EU data protection law [8]. The cards focus on the newly passed GDPR, which comes into effect in 2018. They are designed to be sufficiently lightweight for deployment in a range of design contexts, e.g. connected-home ecosystems or smart cars. We have been testing them through workshops with teams of designers in industry and education contexts, trying to understand the utility of the cards as a privacy by design tool [9].

A further challenge for privacy by design goes beyond how to communicate regulatory requirements to communities unfamiliar with the law and policy landscape. Whilst finding mechanisms, like our cards, for delivering complex content in more accessible ways is one issue, finding the best forums for engagement with these concepts is another. Two examples could be the role of state regulators and of industry/professional associations. State regulatory bodies, like the UK ICO or the EU Article 29 Working Party, have a role to play in broadcasting compliance material and supporting technology designers’ understanding of law and regulation. The needs of each business will vary, and support has to adapt accordingly. One example is the size and resources a business has at its disposal: these will likely dictate how much support it needs to understand regulatory requirements, e.g. an under-resourced small or medium-sized enterprise vs. a multinational with in-house legal services.

Industry and professional associations, like the British Computer Society, the Association for Computing Machinery or the Institute of Electrical and Electronics Engineers, may also be suitable forums for raising members’ awareness of the importance of regulation. Sharing best practice is a key element of this, and these organisations are in a good position to feed their experience into codes of practice, like those suggested by Article 40 of the GDPR.

[1] – L Urquhart and E Luger “Smart Cities: Creative Compliance and the Rise of ‘Designers as Regulators’” (2015) Computers and Law 26(2)
[2] – D Wright and P De Hert Privacy Impact Assessment (2012 Springer)
[3] – A29 WP “Opinion 8/2014 on the Recent Developments on the Internet of Things” WP 223
[4] – We are conducting a project in the EU and US involving researchers from: University of Nottingham (Tom Rodden, Neha Gupta, Lachlan Urquhart), Microsoft Research Cambridge (Ewa Luger, Mike Golembewski), Intel (Jonathan Fox), Microsoft (Janice Tsai), University of California Irvine (Hadar Ziv) and New York University (Lesley Fosh and Sameer Patil).
EU project page and cards are available at designingforprivacy.co.uk
[5] – J Hong “Usable Privacy and Security: A Grand Challenge for HCI” (2009) Human Computer Interaction Institute
[6] – Danezis et al “Privacy and Data Protection by Design – from Policy to Engineering” (2014) ENISA; M Dennedy, J Fox and T Finneran “Privacy Engineer’s Manifesto” (2014) Apress; S Spiekermann and LF Cranor “Engineering Privacy” (2009) IEEE Transactions on Software Engineering 35(1)
[7] – H Haddadi et al “Personal Data: Thinking Inside the Box” (2015) 5th Decennial Aarhus Conference; R Mortier et al “Human-Data Interaction: The Human Face of the Data-Driven Society” (2014) http://hdiresearch.org/
[8] – IDEO https://www.ideo.com/work/method-cards; M Golembewski and M Selby “Ideation Decks: A Card Based Ideation Tool” (2010) Proceedings of ACM DIS ’10, Aarhus, Denmark https://dl.acm.org/citation.cfm?id=1858189
[9] – E Luger, L Urquhart, T Rodden and M Golembewski “Playing the Legal Card” (2015) Proceedings of ACM CHI ’15, Seoul, South Korea


Seminar 6: Yuwei Lin Position Statement


A pedagogy for artivism


Yuwei Lin

University for the Creative Arts


The importance of data literacy, and the need to improve it through both formal educational channels and public engagement, has been flagged up in every past DATA-PSST! workshop I have attended. There is a real demand for action. In the 2015-16 academic year, I drew on what I had learned from the DATA-PSST! workshops and devised a curriculum teaching the concepts of ‘big data’, ‘privacy’ and ‘surveillance’ to my Level 5 undergraduate students at a specialist arts and design university in the UK, the University for the Creative Arts. We not only approached the issues from a legal perspective (studying a number of problematic laws, including the Data Retention and Investigatory Powers Act (DRIPA) and the Digital Economy Act), but also engaged in the debate in a hands-on, practical manner.

In learning about the laws, the students demonstrated a good understanding of the flaws and problems of DRIPA and the Digital Economy Act. They were fascinated by the grand gestures of the hackers Edward Snowden and Julian Assange, and by the campaigns that various civic NGO groups, including the Open Rights Group, have staged. However, many of them still subscribed to the government’s propaganda line: ‘nothing to hide, nothing to fear’. This is a generation of believers in ‘the death of privacy’. Since their engagement with digital technologies is so deeply entangled in everyday life, it is difficult to ask them to keep a ‘critical distance’.

One of the assignments asked the students to make a short video expressing their idea of ‘privacy’. Their interpretations varied widely. Here is a list of example videos made by the UCA students, which you can view online:

https://youtu.be/n5TQ1EYgcm8
https://youtu.be/Fy8JzK5dxNw
https://youtu.be/inHevgHzs2I
https://www.youtube.com/watch?v=7ABPuOQUoUI     
https://www.youtube.com/watch?v=9ij2JTC8Y3A
https://vimeo.com/149351313
https://vimeo.com/149265579
https://youtu.be/dswcOI_K05k
https://www.youtube.com/watch?v=Fac3UuVYGyQ&feature=em-upload_owner

When asked to turn their ideas into an interactive piece to engage more people, quite a few designed quiz games or made a narrative choice-based game, like this one:


Other interactive projects include a ‘You’ve been framed!’ performance, and an audio-visual installation using unsettling images from GCHQ.

Although I felt a little disappointed that few of my undergraduate students actually became more critical of ‘privacy’ and ‘surveillance’ issues in today’s data-centric society, it motivates me to try harder next time. If anything, it only shows how difficult it is to change people’s mindsets, beliefs and behaviours. It has definitely been an interesting journey.

Welcome to DATA-PSST! Seminar 6 - Cohering Inter-disciplinary Responses & Rebuilding the Agenda






DATA-PSST! Cohering Inter-disciplinary Responses & Rebuilding the Agenda


20 May 2016, 10am to 4pm
Bangor University
John Philips Building, JP Studio (ground floor)

Your seminar leaders, Martina Feilzer and Yvonne McDermott Rees, welcome you to the last of six seminars in the ESRC-funded seminar series DATA-PSST! The aim is to draw together what we have learned from the many academics, journalists, NGOs, intelligence and security professionals, companies, policy-makers and artists who have contributed to these seminars, and to identify ‘what’, ‘why’ and ‘how’ to rebuild the public and policy agendas.

We will identify and articulate the most important messages we should be communicating to the general public and policy-makers. We will then discuss creative ways of communicating them and build a communications plan to make the best use of our findings from this seminar series.

The seminar will be organized around a number of key questions and all participants will be able to actively contribute to the debate:
-        What does the general public most need to know about life in the post-Snowden era, if anything? (See our report on Public Feeling on Privacy, Security and Surveillance: A Report by DATA-PSST and DCSS.)
-        What do policy-makers most need to know from our discussions? (See the policy recommendations we have collectively generated from the previous five seminars.)
-        How can abstract and complex data sur/sous/veillance, privacy, security and transparency practices be best explained to a lay audience and to policy-makers?
-        Can these processes be explained in an engaging and creative way?
-        What would an effective communications plan to promote this material look like?
-        What are the constraints and opportunities offered by our proposed format of communicating to the public – namely, a short online documentary and a promotional clip shared via social media (e.g. VINE)?

The event relies on all participants engaging, so in addition to short keynote talks on modes of communicating DATA-PSST issues to different audiences, the seminar will function through Position Statements and structured roundtable discussions.


Schedule 
19 May 2016 
7.15 pm - DRINKS AND NO HOST DINNER – Dylans, Menai Bridge.

20 May 2016 
10.00 - 10.15: Registration
10.15 - 10.30: Introduction to the final seminar (Martina Feilzer/Yvonne McDermott)
10.30 - 11.00: What has come out of DATA-PSST so far? (Vian Bakir)
11.00 - 11.45: What do the public and policy-makers most need to know? (Led by Andrew McStay, drawing on participants’ Position Statements)
11.45 - 12.00: Coffee
12.00 - 1.15: Roundtable 1: How can abstract, complex data sur/sous/veillance, privacy, security and transparency practices be creatively and engagingly explained to a lay audience and policy-makers? (Key Position Statements from artist Ronan Devlin and legal scholar Lachlan Urquhart)
1.15 - 2.00: Lunch
2.00 - 3.15: Roundtable 2: What would an effective communications plan to promote this material look like? (Key Position Statements from national security journalist Paul Lashmar and documentary maker Dyfrig Jones)
3.15 - 3.30: Coffee
3.30 - 4.00: Plenary: Looking Forwards. Vian Bakir to summarise what comes next with the DATA-PSST documentary, special issue, policy report and listserv.