Perspectives April 6, 2020
How to Protect Both Public Health and Privacy
Fighting COVID-19 does not have to mean abandoning the right to privacy
Governments around the world are racing to adopt new surveillance tools in response to the COVID-19 pandemic. Many are greenlighting invasive systems to monitor infections, trace infected people’s contacts, ensure quarantine compliance, and map the spread of the disease. Some governments are seeking real-time location data from mobile providers, while others are turning toward emerging technologies like facial recognition.
These steps may usher in a long-term expansion of the surveillance state. COVID-19 has arrived after two decades of rapid technological change, in which both the public and private sectors have exponentially increased their capacity to incorporate surveillance into various aspects of governance and commercial activity. Many democracies have tried, not always with success, to build legal barriers that constrain authorities’ ability to access and exploit the personal information collected by private companies. Coronavirus surveillance could dismantle these structures.
To avoid such an uncontrolled shift, policymakers must ensure that any new surveillance program complies with human rights principles, like those outlined by Freedom House, which safeguard basic freedoms while allowing the government to do what is necessary to protect public health.
Testing for necessity and proportionality
International human rights standards give states some leeway to adopt surveillance measures in the current crisis, but the programs must be scientifically justified and narrowly tailored, minimizing what data are collected and using the least intrusive options to accomplish legitimate goals.
South Korea has been comparatively effective at containing its coronavirus outbreak, but its Infectious Disease Control and Prevention Act (IDCPA) allows authorities to tap into broad surveillance powers, raising questions about epidemiological necessity and proportionality. For example, officials have pulled information from credit card records, phone location tracking, and security cameras—all without court orders—and combined it with personal interviews for rapid contact-tracing and monitoring of actual and potential infections. Importantly, the IDCPA requires the data collected to “be destroyed without delay when the relevant tasks have been completed.”
Credit card histories reveal intimate details about people’s lives that go far beyond basic information for contact tracing, including sexual orientation or religious beliefs. Mobile-phone location data is also personal information, and some South Korean officials have publicized it to notify residents about patients’ movements. Yet the data may not be precise enough to discern whether two people were at least six feet apart—the suggested distance to avoid virus transmission. This ambiguity is especially problematic if the records are cited to penalize people for not complying with quarantine or social-distancing rules.
Instituting independent oversight
Surveillance programs need robust and independent oversight that can assess what types of data are collected, who manages the collection, and how and by whom that information is used. As the pandemic evolves, an independent legislative review process should routinely monitor programs to ensure that they remain necessary and proportionate. An avenue for judicial review should also be available so that affected individuals can appeal disproportionate restrictions and seek redress for any abuses.
Worryingly, this essential oversight is lacking in some surveillance initiatives. Israel’s caretaker government, for example, used emergency regulations to grant police and security officials access to a secretly obtained trove of sensitive smartphone metadata, including geolocation data, without parliamentary approval. The existence of this database and its underlying legal framework were previously undisclosed. After the government’s unilateral move, the High Court intervened to impose a temporary injunction and require some legislative involvement, but these controls do not appear to be sufficiently robust. With minor changes, the surveillance program continues.
Ensuring openness and transparency
Openness and transparency are crucial not only for keeping citizens safe during a health crisis, but also for helping them understand how and why their privacy is being affected. This builds public trust in the institutions tasked with containing the outbreak, while ensuring that surveillance programs and the officials running them remain accountable for their performance.
Many mobile applications that claim to track individuals’ movements and quarantine compliance fall short on transparency. They are generally opaque regarding how they collect and process data, and how and with whom they share that information.
In Poland, some residents are using the government’s new Home Quarantine app to prove compliance with isolation orders. Users first upload a profile image and then receive periodic requests to send a “selfie” to authorities. The app pulls geolocation data from the selfie while using facial recognition to match the image to the user’s original picture. It is unclear how much information the app is collecting and how that information will be stored and shared. It remains uncertain, for example, whether the data can be retained or made available for other private or public facial recognition initiatives.
Sunsetting and limiting data collection, access, and use
Surveillance programs should have unambiguous sunset clauses. Information collected should be firewalled from other uses and generally be destroyed after the virus is brought under control.
Emergencies provide a shortcut for governments to access people’s personal information or roll out new surveillance tools that under normal circumstances would either not be allowed or would require significantly stronger judicial or legislative review. Moreover, indiscriminate monitoring and mass collection of sensitive information sidestep due process standards, effectively treating everyone as a suspect.
Authorities could collect sensitive data or deploy facial recognition systems under the guise of countering the outbreak, only to use them later for political purposes, such as the repression of minority populations. Phone records can be, and have previously been, weaponized to track down and arrest journalists. Geolocation data could be used to identify and detain undocumented people for deportation. And police could repurpose data about people’s movement to identify civic organizing efforts and disrupt protests. Private entities such as insurance companies or advertising agencies may also seek to exploit such data for their own commercial ends without proper consent.
In practice, many programs lack or are ambiguous about sunset and firewall provisions. Certain mobile providers in Belgium, Germany, and Italy have supplied aggregated and “anonymous” location data to authorities. South African mobile carriers have also agreed to hand over location data. In the United States, mobile advertising companies, not mobile service providers, are reportedly giving government agencies access to similar information. It is unclear how and by whom this third-party data could be used during and after the outbreak, whether for law enforcement, immigration, or intelligence purposes.
Stopping the spread
While certain forms of monitoring—such as contact tracing—can be indispensable to COVID-19 containment efforts, they should remain in compliance with human rights standards. Aggressive expansion of surveillance programs without adequate checks could normalize privacy intrusions and create systems that may later be used for various forms of political and social repression.
Surveillance tools alone cannot solve a public health crisis. Enhanced technical monitoring does not provide rapid tests to patients, protective equipment to medical workers, or ventilators and staffing to hospitals. But broad and disproportionate surveillance imposed today can cause long-term harm. As democracies build out their responses to the pandemic, they should ensure that their efforts do not also institute a lasting deterioration in privacy rights.
This article was also published by WIRED on April 9, 2020.