Disclaimer: This is a repost of an opinion piece from early April 2020.
With or without your consent
In times of a pandemic we can all feel that we are part of society, no matter how much we think of ourselves as individuals. We can feel our interconnectedness more than ever, which has led to some positive changes in how we relate to each other as human beings. But there is also a sense of uneasiness. Our current situation has brought into the spotlight how much of how we organise, plan and govern our day-to-day lives depends on data.
How much data has become part of our narratives is evident in the new illustration of society that has emerged: society as a curve. This early cybernetic ideal incorporates the idea that we can directly influence and control this curve (as a representation of us as a collective and of our collective actions) by tweaking the input data, and consequently the output, through a multitude of measures.
Without data we are blind
Current emergency measures like the lockdown are a response to a lack of data: How many people are or have been infected? How fast does the virus spread? How long does it survive on different surfaces? There are many open questions and uncertainties that leave blind spots in the picture we paint of our societies.
The lack of reliable data and basic scientific understanding, and therefore the lack of authoritative knowledge, trustworthy numbers, and the resulting capacity to create actionable claims that guide parts of our policy and decision making, has made us witnesses to a global experiment in which countries are resorting to different steps and approaches at the national level. Some of these approaches, such as privacy-invasive contact-tracing apps, concern data protection scholars and activists.
Times of crisis and uncertainty are often used by governments to introduce increasingly repressive measures against civil society and to limit fundamental rights. Certain temporary restrictions, like the current limitation of the freedom of movement, might be justified. However, looking back at history, there are worries that some restrictions or changes will not be lifted even after the crisis has ended. A recent example is France, which lived in a state of emergency for almost two years after the attacks in Paris, with many of the temporary provisions becoming law, such as easier access for police to execute search warrants. Another is Hungary, which passed a bill that allows for rule by decree during this state of emergency without any clear time limit; its special measures include jail terms for spreading misinformation, which worries journalists and citizens. This demonstrates the ambivalence of crises: they show us the breaking points of our societies, bring to the forefront what or who is essential, and can thus strengthen old dynamics or shift boundaries.
But without sufficient data, which of these measures will prove to be effective? And with a barely coping health care system, what are our options? There is a lot of uncertainty around the available data, and people are desperate to clutch at every straw that promises relief. Many companies are now pitching data-driven solutions without there being a basic, democratic discussion around the dimensions of the data. As Mireille Hildebrandt, Research Professor on ‘Interfacing Law and Technology’ at Vrije Universiteit Brussels, tweeted:
What can go wrong with data-driven solutions?
On 28 March 2020, the U.K. government announced in a [blog post](https://healthtech.blog.gov.uk/2020/03/28/the-power-of-data-in-a-pandemic/) partnerships between the NHS and several big tech companies, including Microsoft, Google, Amazon Web Services and Palantir, to help them “[…] execute a coordinated and effective Covid-19 response […]”
While most of the companies involved are not known for business models that honour privacy or data protection, the involvement of Palantir is especially concerning. The data-mining and analytics company offers questionable surveillance tools to the military and predictive-policing techniques to law enforcement, and has a record of involvement in operations undermining democratic values, such as those against journalists and whistleblowers of Wikileaks, or its partnership with Cambridge Analytica.
While the government claims that strict data protection laws will be implemented, that all data will remain within the U.K., and that the solutions will be privacy-by-design, these developments are concerning on multiple levels:
What power structures will be introduced?
Times of crisis are being used to establish power structures and broaden monopolies within the public health sector. It is unlikely that these partnerships will end after corona. Building up data infrastructures will rather lead to long-term dependencies of the NHS on a few big corporations. These will be in the position to sell their products expensively in an area that is already chronically under-financed and struggling with tight budgets due to austerity. It is, further, likely that even corona-specific processes (such as the tracking of ventilators) will be transferred to other areas. The next illness or problem will surely come! Then the goal for companies like Palantir will not be to improve health care in the public interest but, at the very best, to re-structure it according to profit-driven interests to create revenue, and at worst to turn the NHS into a surveillance and monitoring machine.
No open data, no open analysis, no transparency
This is not a problem specific to these companies. Privately owned monopolies over the processing of public data are unlikely to be open about their methods of data analysis and the algorithms they use (with reference to the ownership of intellectual property and the protection of their business models). Hence, we will not be able to understand the assumptions and biases built into them. As a result, there is almost no way to counter-check or contest the decisions based on these results, or to hold them accountable. This is not only problematic in terms of lived democracy, but also very dangerous, as it limits other knowledge production and potentially has horrendous consequences for how health care is delivered. In emergencies it might seem helpful, for a short time, to limit the factors that feed into decision-making as a way to reduce complexity and to be able to act swiftly. However, it is exactly this complexity that is needed to deal with the uncertainties inherent in the situation (in particular, the unknown factors mentioned above).
In order to balance out assumptions and biases, to correct potential errors, to challenge and counter-check decision making, to provide accountability, and for more holistic knowledge production, we need a multitude of perspectives. For example, as Dr Caroline Criado Perez highlights, the lack of sex-disaggregated data in medicine is already disadvantaging women, and in the current crisis it leaves the women-dominated frontline workforce at risk because default-male PPE does not fit them correctly.
We need different models to compare, and different teams from various disciplines and backgrounds working with different methods on and with open data, transparently. For this it is necessary that the data, along with the parameters and variables of their analysis, are not locked away with private entities. Further, in times like these some data might need to remain confidential; even a commitment to transparency and openness needs limitations and safeguards grounded in data protection principles. Any technical or analytical measures need to be balanced against whether and how they interfere with fundamental rights (such as informational self-determination), asking in which cases these interventions might be necessary and in which they might be disproportionate.
The general rhetoric around using sensitive data is often that only aggregated or anonymised data is being used. Even if we set aside concerns about whether this is actually true, data protection concerns still hold, as re-identification is often possible with additional data sets and via statistical methods. And how can we check this if we hand data access exclusively to transnational companies like Google?
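To make the re-identification risk concrete, here is a minimal sketch of a classic linkage attack on entirely synthetic, hypothetical data: an "anonymised" health dataset that retains quasi-identifiers (postcode, birth year, sex) can be joined against a public register that contains the same attributes plus names.

```python
# Illustrative linkage attack on "anonymised" records (synthetic data).
# The health dataset has no names, but its quasi-identifiers also appear
# in a public register; joining on them re-identifies individuals.

anonymised_health = [
    {"postcode": "EC1A", "birth_year": 1975, "sex": "F", "diagnosis": "covid-19"},
    {"postcode": "SW1A", "birth_year": 1990, "sex": "M", "diagnosis": "asthma"},
]

public_register = [
    {"name": "A. Example", "postcode": "EC1A", "birth_year": 1975, "sex": "F"},
    {"name": "B. Example", "postcode": "SW1A", "birth_year": 1990, "sex": "M"},
]

QUASI_IDENTIFIERS = ("postcode", "birth_year", "sex")

def reidentify(health_rows, register_rows):
    """Link health records to names via uniquely matching quasi-identifiers."""
    matches = []
    for record in health_rows:
        key = tuple(record[q] for q in QUASI_IDENTIFIERS)
        candidates = [r for r in register_rows
                      if tuple(r[q] for q in QUASI_IDENTIFIERS) == key]
        if len(candidates) == 1:  # a unique match means re-identification
            matches.append((candidates[0]["name"], record["diagnosis"]))
    return matches

print(reidentify(anonymised_health, public_register))
# [('A. Example', 'covid-19'), ('B. Example', 'asthma')]
```

The names, fields and values here are invented for illustration; the point is only that dropping names is not the same as anonymisation when auxiliary data sets exist.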
Democratic data infrastructures and data integrity
Scholars, academics and advocates have called for open data, the expansion and harmonisation of data infrastructures, and the establishment of standards over the last decades. Further, approaches to using data in ways that value privacy, such as data stewardship or data visiting, are well known. An excellent example of what functional data integrity, transparent methods and infrastructures can look like is the genetic sciences, which have vast networks for data and information exchange. In the case of Covid-19, the sequences of its genome were made available so that scientists around the world could start their research.
However, efforts in the public health sector have been blocked, massively delayed, or limited to short-term investments by governments and the NHS. These hesitant approaches and delays have often been legitimised by ethical concerns and the inability to harmonise information governance. These concerns are now being wiped away and by-passed to fast-track the introduction of data solutions from ethically highly questionable companies. The boundaries of what will be possible in terms of the data protection of patient data in the future have been massively shifted by this decision. The government is effectively killing long-standing initiatives that offered solutions to preserve and protect the public and their fundamental rights. No ethics, no consent, no democratic discussion.
Against the simplistic rhetoric of violating fundamental rights
This shift has to be a concern for all of us, as it marks a moment in which hegemonic structures are being expanded or introduced. Many of us have become used to the violation of our privacy and the limitation of fundamental rights. These are brought forward by a variety of schools of thought, e.g. behavioural marketing or the security discourse around the so-called “war on terror.” These discourses often deploy a black-and-white rhetoric à la “Human lives or data protection!” It is important to differentiate and not to reinforce these dividing arguments.
From a data protection standpoint, the introduction of many of these systems is concerning and at times questionable. It is important to note, however, that technology and access to data can help us cope with the current situation. For this, certain limitations of data protection and fundamental rights can be legitimate if they happen within the right framework.
In order to ensure this, we need to take measures:
1. Ensure accountability! All introduced measures have to be balanced against the principle of proportionality. Are they justified or overreaching? Moreover, we need to track all the actions and measures taken at the moment and evaluate them periodically, especially after the exceptional situations that led to them have ended.
2. Let’s create decentralised tech! There are plenty of possibilities and initiatives to create technology that values privacy. For example, the Pan-European Privacy-Preserving Proximity Tracing initiative has announced that it will develop an app that uses decentralisation, local storage and encryption to perform privacy-preserving contact tracing. Compliant with the GDPR, such apps are an alternative to background surveillance and enable civil engagement to help contain the virus.
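The decentralised idea above can be sketched in a few lines. This is a heavily simplified illustration inspired by DP-3T-style designs, not the actual PEPP-PT protocol: each phone keeps a secret daily key, broadcasts unlinkable ephemeral IDs derived from it, and stores the IDs it overhears locally. Only a user who tests positive publishes their daily key; everyone else re-derives the ephemeral IDs on their own device to check for contact, so no central database of encounters ever exists.

```python
# Simplified sketch of decentralised proximity tracing (DP-3T-inspired;
# NOT the actual PEPP-PT protocol). All names and values are illustrative.

import hashlib
import hmac
import os

EPOCHS_PER_DAY = 4  # toy value; real designs rotate IDs every few minutes

def ephemeral_ids(daily_key: bytes) -> list:
    """Derive the day's broadcast IDs from a device-local secret key."""
    return [hmac.new(daily_key, f"epoch-{i}".encode(), hashlib.sha256).digest()[:16]
            for i in range(EPOCHS_PER_DAY)]

# Alice's phone: the secret key never leaves the device while she is healthy.
alice_key = os.urandom(32)
alice_broadcasts = ephemeral_ids(alice_key)

# Bob's phone locally records the IDs it overheard via Bluetooth.
bob_observed = {alice_broadcasts[2]}

# Alice tests positive and publishes only her daily key.
published_keys = [alice_key]

# Bob checks locally, without revealing his encounters to anyone.
at_risk = any(eid in bob_observed
              for key in published_keys
              for eid in ephemeral_ids(key))
print(at_risk)  # True: Bob overheard one of the infected user's IDs
```

The design choice worth noting is that matching happens on Bob's device: the server only relays keys of confirmed cases, so neither the server nor other users learn who met whom.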
3. Establish responsible ways of data sharing! In this moment, we can positively shift the discourse to demand responsible and responsive data infrastructures that reflect our values and protect our rights. We can demand that these infrastructures and processes (including standards and APIs) allow for multiple perspectives and diversity in knowledge making. This thinking can be taken further, beyond health data, to issues like climate change, education, digital competences, etc.
4. Embed solidarity in tech! Let’s create sustainable technology and solutions that we can trust, that don’t hollow out our fundamental rights and aren’t just there to create revenue for transnational companies or rich individuals. Against the monopolisation of public data, and in the interest of the public and civil society!
With this in mind: keep safe, and let us all remember that physical distancing doesn’t mean we have to be socially distant!