Using Location Data to Control the Coronavirus Pandemic
In times of crisis like the coronavirus pandemic, strong and decisive measures are needed to save the lives and livelihoods of people across all parts of the world. There is an increased need for governments to monitor and control the public, which might make it necessary to limit individual freedom. The use of location data to control the coronavirus pandemic can be fruitful and might improve the ability of governments and research institutions to combat the threat more quickly. However, the use of data at such a scale has consequences for data protection, privacy and informational self-determination. Therefore, such measures should be carefully planned, executed with effective oversight, and transparently evaluated. Potential risks should ideally be mitigated through dedicated legal frameworks. At the very least, the use of location data should adhere to general principles of fundamental rights protection.
Social stability versus individual freedom
In times of crisis there is an increased need for governments to monitor and control the public, which might make it necessary to limit individual freedom. To consider such developments from a formal perspective, it is useful to take a look at the legal and institutional framework of the Council of Europe (CoE). This international organization administers and controls one of the most important international human rights treaties guaranteeing individual freedoms, the European Convention on Human Rights (ECHR). The CoE has established procedures and case-law for times of crisis like the current one.
On 16 March 2020 the government of Latvia notified the Secretary General of the Council of Europe that it would derogate from its obligations under the ECHR in response to the pandemic. It did so according to Article 15 ECHR, with the derogation limited until 14 April 2020. At the time of writing Latvia seems to be the only one of the 47 member states of the CoE to have taken this step, but with increasing reports on the restriction of movement for citizens, the prohibition of public gatherings and festivals, and the closure of borders it is not unlikely that more will follow. The guide on Article 15 ECHR for derogations in times of emergency was recently updated on 31 December 2019. States may derogate
- in time of war or other public emergency threatening the life of the nation,
- only to the extent strictly required by the exigencies of the situation,
- and provided that the measures are not inconsistent with the state's other obligations under international law.
Furthermore, Article 4 of the United Nations International Covenant on Civil and Political Rights (ICCPR) is similarly worded, but additionally requires state parties to report to all other parties via the UN Secretariat. Certain rights, such as the right to life (except in respect of deaths resulting from lawful acts of war), the prohibition of torture and other forms of ill-treatment, the prohibition of slavery or servitude, and the rule of no punishment without law, are non-derogable. However, many other rights are subject to derogation, including, notably, the right to privacy, freedom of expression, and the freedom of assembly and association. Such derogations may only be of a temporary nature.
Location data to monitor and control the public
When fighting a large-scale crisis such as a pandemic, it is important for governments to understand why a threat is emerging, how the threat scenario develops, and whether the general population complies with measures for containment. Governments and research institutions need data to develop insights on these aspects, with location data being particularly attractive, as work in the humanitarian sector has shown for many years. One core player in this field is the UN-OCHA’s Centre for Humanitarian Data, which, together with many experts, has worked on detailed guidance notes that help to deal with data responsibly.
When it comes to the use of location data and COVID-19 specifically, just a few weeks ago many commentators were surprised by the fact that the government of the PRC co-developed a mobile phone application informing users whether they have been in close contact with someone infected with COVID-19. The insights presented by this app are most likely based on the analysis of location data collected through mobile phone networks, WiFi connections and other surveillance assemblages producing data that reveal the location of individuals and crowds. Furthermore, apps with maps to track the disease also became popular very quickly in Hong Kong and South Korea. In the PRC, this approach seems to have evolved into the ‘Alipay Health Code’, a system that classifies residents based on an opaque methodology. Once a user has filled out a survey, this data is combined with other sources such as location data. Once the data has been analyzed, a QR code in one of three colors is generated: green entitles its bearer to unrestricted movement, the ‘owner’ of a yellow code may be asked to stay at home for seven days, and a red QR code results in two weeks of quarantine.
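To make the mechanics of such a system more tangible, the following minimal sketch shows how a health-code style classifier could combine survey answers with location history to produce one of the three colors. The actual methodology behind the ‘Alipay Health Code’ is opaque; all rules, field names and thresholds below are invented purely for illustration.

```python
# Hypothetical sketch of a health-code style classifier.
# The real methodology behind the 'Alipay Health Code' is opaque; the rules,
# field names and thresholds below are invented for illustration only.
from dataclasses import dataclass
from typing import List

@dataclass
class SurveyAnswers:
    has_symptoms: bool
    visited_outbreak_area: bool

@dataclass
class LocationEvent:
    place_id: str               # e.g. a cell-tower or venue identifier
    overlapped_with_case: bool  # derived from matching against known cases

def classify(survey: SurveyAnswers, history: List[LocationEvent]) -> str:
    """Return 'red', 'yellow' or 'green' based on purely illustrative rules."""
    if survey.has_symptoms or any(e.overlapped_with_case for e in history):
        return "red"      # two weeks of quarantine
    if survey.visited_outbreak_area:
        return "yellow"   # asked to stay at home for seven days
    return "green"        # unrestricted movement

# Example: a user without symptoms who visited an outbreak area
user_survey = SurveyAnswers(has_symptoms=False, visited_outbreak_area=True)
user_history = [LocationEvent("cell_4711", overlapped_with_case=False)]
print(classify(user_survey, user_history))  # -> yellow
```

Even this toy version shows why the opacity of such a system matters: the thresholds and data sources chosen by its designers directly determine who may move freely and who must stay at home.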
As COVID-19 makes its way westward, Israel has enabled emergency surveillance powers to track the mobile-phone data of people suspected of being infected with the coronavirus, while mobile phone operators in Germany and Austria are already sharing their insights on the (non-)movement of groups of the population, divided into groups of twenty to thirty individuals each, with research institutes and the government. In the meantime, the US government is in active talks with several large technology corporations such as Google and Facebook to explore avenues for how location data could be used to combat the pandemic, including tracking whether people are keeping a safe distance from one another to counter the spread of the virus. The usefulness of these measures is subject to sceptical public scrutiny, especially considering the ethical implications. Finally, surveillance corporations such as Athena Security and the infamous spyware firm NSO advertise specialized surveillance cameras and dedicated data analysis services that use location data to track the spread of the disease based on the movement of individuals and groups.
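By way of illustration, the following sketch shows how such aggregate, group-level sharing could work in principle: per-area counts of distinct subscribers are reported only if a minimum group size is reached, so that no individual movements are exposed. The operators’ actual pipelines are not public; the data model and the minimum group size used here are assumptions.

```python
# Minimal sketch of sharing only aggregate, group-level movement data.
# The actual pipelines of the mobile network operators are not public; the
# data model and the minimum group size below are assumptions for illustration.
from typing import Dict, List, Set, Tuple

MIN_GROUP_SIZE = 20  # only report areas with at least this many distinct people

def aggregate_movements(records: List[Tuple[str, str]]) -> Dict[str, int]:
    """records: (subscriber_id, area_id) pairs observed in one time window.
    Returns per-area counts of distinct subscribers, suppressing small groups."""
    per_area: Dict[str, Set[str]] = {}
    for subscriber_id, area_id in records:
        per_area.setdefault(area_id, set()).add(subscriber_id)
    return {
        area: len(subscribers)
        for area, subscribers in per_area.items()
        if len(subscribers) >= MIN_GROUP_SIZE  # small groups are dropped entirely
    }

# Example: only areas with enough distinct subscribers appear in the output
sample = [(f"user{i}", "district_A") for i in range(25)] + [("user99", "district_B")]
print(aggregate_movements(sample))  # -> {'district_A': 25}
```

A minimum group size of this kind reduces, but does not eliminate, re-identification risk, which is precisely why the governance questions discussed below remain relevant.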
Potential concerns
Over the last years much has been written about the balance between security and individual freedom, particularly on the false tradeoff between privacy and security. While a pandemic such as the spread of COVID-19 requires comprehensive measures, we must keep in mind that the use of location data and other personally or demographically identifiable data at such a scale results in the production of a ‘data exhaust’ that invariably has consequences for data protection and privacy. Just because there is an emergency does not mean that anything goes.
The arguably under-considered use of location data is surprising at this point, when one thinks of the unintentional revelation of the location and features of US military bases through the use of the fitness app ‘Strava’ by members of the forces, or the recent work of the New York Times based on the analysis of a comprehensive set of pseudonymized mobile phone records, which upon closer scrutiny allowed several prominent and influential individuals to be identified. No executive powers enshrined in regulatory frameworks were necessary to acquire these datasets and carry out the analysis, which in itself shows that our societies lack appropriate governance frameworks for such practices. Not only is effective oversight of the use of such data missing; it is also open how individuals would be safeguarded against abuse, and what kind of remedies they could use to defend themselves. Considering this misuse of location data, the US Federal Communications Commission proposed on 28 February 2020 a fine of 200 million dollars for mobile phone network operators repackaging and reselling location data.
Furthermore, research over the past years has shown again and again that the combination of unprecedented amounts of data and ever-improving techniques to analyze large datasets renders most, if not all, state-of-the-art practices to pseudonymize or anonymize datasets meaningless, at least as time moves on. The United Nations Special Rapporteur on the right to privacy has rightfully highlighted the risks resulting from the combination of ‘closed’ datasets with ‘open’ ones. In our work on ‘Mobile devices as stigmatizing security sensors’ we have proposed the concept of ‘technological gentrification’, which describes our lives in environments that are permanently monitored and in which those believing in the benefits of omnipresent data render the choices of others de facto obsolete.
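The following sketch illustrates why such re-identification is often straightforward once a ‘closed’ pseudonymized dataset is combined with ‘open’ information: a pseudonymous location trace can be linked to a named person by matching it against a handful of publicly known anchor points, such as home and workplace addresses. All names, coordinates and the matching threshold below are invented for illustration.

```python
# Illustrative sketch of re-identifying a pseudonymized location trace by
# combining it with publicly known information ('open' data). All names,
# coordinates and the matching threshold are invented for illustration.
from math import hypot
from typing import Dict, List, Tuple

Point = Tuple[float, float]  # simplified planar coordinates

# 'Closed' dataset: pseudonymized traces (pseudonym -> frequently visited points)
traces: Dict[str, List[Point]] = {
    "device_7f3a": [(52.52, 13.40), (52.51, 13.39)],
    "device_91bc": [(48.21, 16.37), (48.19, 16.35)],
}

# 'Open' dataset: publicly known anchor points of named individuals
known_anchors: Dict[str, List[Point]] = {
    "Alice Example": [(52.52, 13.40), (52.51, 13.39)],  # e.g. home and office
    "Bob Example":   [(40.71, -74.01)],
}

def matches(trace: List[Point], anchors: List[Point], tol: float = 0.01) -> bool:
    """A trace 'matches' a person if every anchor point appears in the trace."""
    return all(
        any(hypot(px - ax, py - ay) <= tol for px, py in trace)
        for ax, ay in anchors
    )

def reidentify(traces: Dict[str, List[Point]],
               anchors: Dict[str, List[Point]]) -> Dict[str, str]:
    """Link pseudonyms to names whenever a trace matches someone's anchor points."""
    return {
        pseudonym: name
        for pseudonym, trace in traces.items()
        for name, points in anchors.items()
        if matches(trace, points)
    }

print(reidentify(traces, known_anchors))  # -> {'device_7f3a': 'Alice Example'}
```

The point is not the specific matching rule, but that a handful of anchor points is usually enough to collapse a pseudonym back into an identity once open and closed datasets are combined.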
While a crisis like the coronavirus pandemic requires dedicated, quick and effective measures we must not forget that data is contextual. One and the same dataset can be sensitive in different contexts, and we need appropriate governance frameworks to make sure that this data is being generated, analyzed, stored and shared in legitimate and responsible ways. In light of the COVID-19 pandemic location data might be very useful for epidemiological analysis. In the context of a political crisis, the same location data can threaten the rule of law, democracy and the enjoyment of human rights.
The need for appropriate governance frameworks
Luckily, some authorities across the world have already reacted to the potential threats resulting from the use of location data to tackle the current pandemic. On 16 March 2020 the European Data Protection Board released a statement in which its chair Andrea Jelinek underlines that “[…] even in these exceptional times, the data controller must ensure the protection of the personal data of the data subjects. Therefore, a number of considerations should be taken into account to guarantee the lawful processing of personal data. […]”. A list of guidance documents by European data protection authorities has also already been compiled.
While these efforts are commendable, it would be preferable to have dedicated legal frameworks, created through democratic processes in parliaments. Given the necessity to act quickly, one might at least expect governmental decrees or executive acts that describe the objectives and the practices undertaken in a transparent manner, are rooted in a proper legal basis and competences, and include the establishment of oversight mechanisms. Instead, the current picture suggests that ad-hoc practices have to be justified by independent data protection authorities, which have to compromise their long-term supervisory objectives for short-term support of the greater good.
Furthermore, crisis response is increasingly a matter of collaboration between NGOs, corporate and governmental stakeholders. To that end, international guidance notes and draft guidelines on responsible data use, such as those issued by UN OCHA’s Centre for Humanitarian Data, are invaluable. Particularly since the use and misuse of data concerns not only states but also corporate actors, such guidelines become indispensable.
Additionally, more profound questions remain around the meaningfulness of concepts such as individual consent and the nature of effective pseudonymization and anonymization. Unfortunately, it goes beyond the scope of this short piece to explore these in detail, but considerations on ‘group privacy’ and informational self-determination in the digital age would be potential starting points for such an in-depth discussion. It needs to be highlighted that the humanitarian field is working on this subject extensively and with a mindset that focuses on using data responsibly, rather than on mere compliance with regulatory frameworks, which are limited in scope and application and therefore resort to abstract human rights provisions too quickly. Hopefully, this gap can be filled quickly so that we can fully focus on the containment of the pandemic, instead of additionally creating worries around the responsible use of data.
Adhering to principles of fundamental rights protection
It is important to note that location data is not the only useful data that can be used to curb the current crisis. Genetic data can be relevant for AI-enhanced searches for vaccines, and monitoring online communication on social media might be helpful to keep an eye on peace and security. However, the use of such large amounts of data comes at a price for individual freedom and collective autonomy. The risks of using such data should ideally be mitigated through dedicated legal frameworks which describe the purpose and objectives of data use, its collection, analysis, storage and sharing, as well as the erasure of ‘raw’ data once insights have been extracted. In the absence of such clear and democratically legitimized norms, one can only resort to fundamental rights provisions such as Article 8 paragraph 2 of the ECHR, which reminds us that any infringement of rights such as privacy needs to be in accordance with the law, necessary in a democratic society, in pursuit of a legitimate objective and proportionate in its application. The adherence to these principles is particularly relevant in times of crisis, where they mark the difference between societies that focus on political control and repression on the one hand, and those that believe in freedom and autonomy on the other.