Over the past years much has been written about the balance between security and individual freedom, particularly on the false trade-off between privacy and security (Solove 2011). While a pandemic such as the spread of COVID-19 requires comprehensive measures, we must keep in mind that the use of location data and other (potentially) personally or demographically identifiable data on such a scale results in the production of a ‘data exhaust’ that invariably has consequences. The fact that we face an emergency does not mean that anything goes.
The arguably under-considered use of location data is surprising at this point, given the unintentional revelation of the location and features of US military bases through the use of the fitness app ‘Strava’ by members of the forces (Liptak 2018), or the recent work of the New York Times based on the analysis of a comprehensive set of pseudonymized mobile phone records, which allowed several prominent and influential individuals to be identified upon closer scrutiny (Thompson and Warzel 2019). No executive powers enshrined in regulatory frameworks were necessary to acquire these datasets and carry out the analysis, which in itself shows that our societies lack appropriate governance frameworks for such practices. Not only is effective oversight of the use of such data missing; it also remains open how individuals would be safeguarded against abuse, and what kind of remedies they could use to defend themselves. Considering this misuse of location data, the Federal Communications Commission in the US on 28 February 2020 proposed fines of over 200 million US dollars for mobile phone network operators that repackaged and resold location data (FCC Proposes Over $200M in Fines for Wireless Location Data Violations 2020).
Furthermore, research over the past years has demonstrated again and again that the combination of unprecedented amounts of data being produced and ever-improving techniques for analysing large datasets renders most – if not all – state-of-the-art practices to pseudonymize/anonymize datasets meaningless, at least as time moves on (Rocher et al. 2019). The United Nations Special Rapporteur on the right to privacy has rightfully highlighted the risks resulting from the combination of ‘closed’ datasets with ‘open’ ones (Cannataci 2017). In our work on ‘Mobile devices as stigmatizing security sensors’ we have proposed the concept of ‘technological gentrification’, which describes our lives in environments that are permanently monitored, and where those who believe in the benefits of omnipresent data render the choices of others de facto obsolete (Gstrein and van Eck 2018).
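The re-identification risk described above can be illustrated with a minimal, hypothetical sketch of a linkage attack: joining a ‘pseudonymized’ dataset with an open, identified dataset on shared quasi-identifiers (here an invented postal code, birth date and sex) is often sufficient to recover identities. All records, names and field values below are fabricated for illustration only.

```python
# Hypothetical linkage attack: a "pseudonymized" dataset (names replaced
# by opaque IDs) is joined with an open, identified register on shared
# quasi-identifiers. All data below is invented.

pseudonymized = [  # e.g. leaked mobility or health records
    {"pid": "a91f", "zip": "9712", "birth": "1985-03-04", "sex": "F"},
    {"pid": "7c02", "zip": "9712", "birth": "1990-11-23", "sex": "M"},
]

open_register = [  # e.g. a public voter or population register
    {"name": "Alice Example", "zip": "9712", "birth": "1985-03-04", "sex": "F"},
    {"name": "Bob Example", "zip": "9712", "birth": "1990-11-23", "sex": "M"},
]

def link(pseudo, register):
    """Re-identify records whose quasi-identifiers match exactly one person."""
    key = lambda r: (r["zip"], r["birth"], r["sex"])
    # Index the open register by quasi-identifier combination.
    index = {}
    for person in register:
        index.setdefault(key(person), []).append(person["name"])
    # A pseudonymous record with a unique match is re-identified.
    matches = {}
    for record in pseudo:
        candidates = index.get(key(record), [])
        if len(candidates) == 1:
            matches[record["pid"]] = candidates[0]
    return matches

print(link(pseudonymized, open_register))
# {'a91f': 'Alice Example', '7c02': 'Bob Example'}
```

The point of the sketch is that no special access or sophisticated tooling is required: a handful of quasi-identifiers shared between a ‘closed’ and an ‘open’ dataset can defeat pseudonymization entirely, which is precisely the risk the Special Rapporteur highlights.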
While a crisis like the coronavirus pandemic requires dedicated, quick and effective measures, we must not forget that data is contextual. One and the same dataset can be sensitive in different contexts, and we need appropriate governance frameworks to ensure that such data is generated, analysed, stored and shared in legitimate and responsible ways. In light of the COVID-19 pandemic, location data might be very useful for epidemiological analysis. In the context of a political crisis, the same location data can threaten the rule of law, democracy and the enjoyment of human rights.
Luckily, some authorities across the world have already reacted to the potential threats resulting from the use of location data to tackle the current pandemic (Data Protection Law and the COVID-19 Outbreak n.d.). On 16 March 2020 the European Data Protection Board released a statement in which chair Andrea Jelinek underlines that “[…] even in these exceptional times, the data controller must ensure the protection of the personal data of the data subjects. Therefore, a number of considerations should be taken into account to guarantee the lawful processing of personal data. […]” (Olbrechts 2020).
While these efforts are commendable, it would be preferable to have dedicated legal frameworks, created through democratic processes in parliaments, as well as transparent policies. Given the necessity to act quickly, one might at least expect governmental decrees or executive acts describing the means, objectives and undertaken practices in a detailed manner, rooted in a proper legal basis and competences, and including the establishment of oversight mechanisms. Instead, the current picture suggests that ad hoc practices have to be justified by independent data protection authorities, which have to compromise their long-term supervisory objectives for short-term support of the greater good.