Hallam Stevens (Associate Professor of History at NTU and Associate Director (Academic) of the NTU Institute of Science and Technology for Humanity) and Monamie Bhadra Haines (Assistant Professor in Sociology at NTU) comment on the privacy implications of the proposed wearable device for COVID-19 contact tracing.
On 5 June, the Singapore government announced that it had been developing “wearable” devices for use in COVID-19 contact tracing. This came in response to the low uptake of, and shortcomings in, the TraceTogether app rolled out by GovTech in March. For instance, on iOS (Apple) phones, the Bluetooth functionality that TraceTogether relies on is disabled when the app runs in the background.
The details released so far suggest that the new devices will work much like the app, without requiring the use of a smartphone. Like TraceTogether, the devices will use Bluetooth connections (not GPS or WiFi) and will not collect geo-location information. Rather, when two devices are in close proximity, they will exchange unique ID numbers, which are then stored on each device for twenty-five days. Infected individuals will hand their devices over to contact tracers, who will upload the data to reveal the ID numbers of the other devices that came into contact with them. The aim is to distribute these devices to all residents of Singapore.
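The mechanism described above can be illustrated with a short sketch. This is a hypothetical reconstruction based only on the public description (exchange of unique IDs on proximity, twenty-five-day retention, export only on infection); the class and method names are our own, and the real devices' internals have not been disclosed.

```python
from collections import deque
from datetime import datetime, timedelta

RETENTION = timedelta(days=25)  # retention period stated in the announcement

class Wearable:
    """Illustrative model of a TraceTogether-style wearable (names are assumptions)."""

    def __init__(self, device_id: str):
        self.device_id = device_id  # unique ID broadcast over Bluetooth
        self.log = deque()          # (timestamp, peer_id) pairs, oldest first

    def record_encounter(self, peer_id: str, now: datetime):
        """Store the peer's ID when another device comes into close proximity."""
        self.log.append((now, peer_id))
        self._prune(now)

    def _prune(self, now: datetime):
        """Auto-erase entries older than the retention window."""
        while self.log and now - self.log[0][0] > RETENTION:
            self.log.popleft()

    def export_for_contact_tracing(self):
        """Run only by authorised contact tracers if the wearer tests positive."""
        return [peer_id for _, peer_id in self.log]

# Two devices in proximity exchange IDs.
a, b = Wearable("ID-A"), Wearable("ID-B")
day0 = datetime(2020, 6, 5)
a.record_encounter(b.device_id, day0)
b.record_encounter(a.device_id, day0)

# Twenty-six days later, A's record of that first encounter has expired.
a.record_encounter("ID-C", day0 + timedelta(days=26))
print(a.export_for_contact_tracing())  # ["ID-C"]
```

Note that nothing in this sketch records where an encounter happened, only that it happened; this is the sense in which the device collects associational rather than locational data.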
This announcement appears to have drawn an overwhelmingly negative reaction: at the time of writing, over 49,000 people have signed a petition opposing the device. In response, Vivian Balakrishnan, the minister in charge of Singapore’s “Smart Nation” initiative, issued clarifications:
This is NOT a tracking device. Unlike the countries you mentioned our contact tracing app and device does NOT track location. There is no GPS or mobile internet connectivity.
It acts as a personal diary, uses Bluetooth proximity data to collate prolonged close contacts, encrypts the data in your personal device, auto erases after 25 days and never leaves the device unless you are infected. If and only if that happens, then the data is used by authorised contact tracers to find people who may have [been] inadvertently infected by you.
We believe we are actually being far more protective of privacy than in many other jurisdictions. We have to get the balance right between public health and personal privacy.

– Vivian Balakrishnan’s Facebook page
Balakrishnan argues that with these safeguards, the devices represent a proportionate response to the need for rapid COVID-19 contact tracing. That may well be true. Certainly, an acceptable balance between public health and personal privacy must be found for each nation.
But in evaluating this trade-off, it is important to carefully consider what is at stake. Balakrishnan’s central claim is that the device respects privacy because it does not use geo-location. However—as we have recently argued—there is more to privacy than locational data. Recent debates have also highlighted many other kinds of personal data that can be collected, online and off, such as data on “friends” on social media, “likes” or “favourites”, photographs or videos of us, Internet browsing history, credit card purchase histories, exercise data (e.g. collected by Fitbit), and more.
Private industry, especially Internet titans like Google and Facebook, is at the forefront of gathering such data, usually for targeted advertising. Sometimes, corporations know our deepest desires even better than we do. But this data is increasingly used in other ways: in screening individuals for jobs or college admissions, in setting insurance premiums, and in deciding who gets a loan. Shoshana Zuboff has called this growing phenomenon “surveillance capitalism”—an increasing concentration of personal data, collected by both governments and corporations, that will be used to both predict and control our lives.
The proposed wearable device collects data about proximity and associations—who we have been near or with. Although location data certainly seems private, information about our associations potentially reveals far more about us. Balakrishnan’s comparison to a “personal diary” is unintentionally revealing: for most people, a diary is one of their most personal items. Just because the device does not track location does not mean it is not tracking us. Associational data adds yet another layer to the troves of personal data now linked to individuals.
Moreover, although the device may not track location by itself, it could be coupled—even inadvertently—with other devices that do reveal location. For instance, if the device is used in combination with smartphones as part of a broader system (as has already been proposed), a wearable carried by one person could exchange IDs with a smartphone carried by another. That second person’s smartphone is linked to a location (via GPS), tying the first person to a place by proxy.
The wearable’s Bluetooth signals might also be picked up by a compatible device placed in a fixed location. For instance, a wearable could be placed on a street corner or in a restaurant. Uploading the data from that fixed device would reveal who had spent time at that location.
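The fixed-device scenario above amounts to a simple join between a proximity log and a known location. The sketch below is purely illustrative: the beacon placement, the location string, and the log format are all assumptions of ours, meant only to show how location emerges by proxy even though no wearable records it.

```python
from datetime import datetime

# A hypothetical wearable left at a fixed, known spot acts as a "beacon".
# Its proximity log accumulates (timestamp, peer_id) pairs like any other device.
fixed_device_location = "a street corner in Orchard Road"  # assumed, for illustration
fixed_device_log = [
    (datetime(2020, 6, 5, 12, 30), "ID-A"),
    (datetime(2020, 6, 5, 18, 10), "ID-B"),
]

# Uploading the fixed device's data ties each peer ID to a place and a time,
# even though no individual wearable ever recorded location itself.
sightings = [
    {"who": peer_id, "where": fixed_device_location, "when": ts}
    for ts, peer_id in fixed_device_log
]

for s in sightings:
    print(f"{s['who']} was near {s['where']} at {s['when']:%H:%M}")
```

The same logic applies to the smartphone pairing described earlier: any one device in an exchange that knows its own location can anchor the other party to that place.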
Recent history suggests that we should pay attention to these possibilities. The Aadhaar system, for example, seeks to create a unique digital identity for each of India’s 1.1 billion citizens. Vulnerabilities in the system have allowed hackers and scammers to access personal information to steal identities and benefits. In the Cambridge Analytica scandal, personal data harvested from Facebook was used to make predictions about individuals’ political preferences, and their susceptibility to political persuasion via advertising. This information was then deployed in an attempt to swing the 2016 US presidential election.
In these scenarios, and many others, private data has been used in ways that were not anticipated—or perhaps even conceived of—by the users or the designers of the systems. The gathering, circulation, aggregation, and analysis of data using artificial intelligence tools is increasingly inscrutable and opaque. Balakrishnan argues that most of the data from Singapore’s wearables will not be centralised. But contact tracing will require the centralisation and aggregation of at least some data.
Even this partial centralisation leaves significant room for concern. Are adequate legal safeguards in place to protect this data? Who is responsible if it is misappropriated or misused? What recourse will individuals or the public have? In the context of surveillance capitalism, with the propensity of data to migrate easily between the public and private realms, these are critical questions, and ones that deserve wide public debate (and not just on Facebook). Some safeguards, such as the Official Secrets Act, are already in place. But the government is still in the process of implementing its public sector data security recommendations, and this process is not expected to be wholly complete until 2023.
Regulations often lag behind the development of technologies. But if COVID-19 is an opportunity for Singapore to accelerate data collection efforts, it should also be an opportunity for speeding the implementation of governance, regulation, and accountability. In doing so, the government can reinforce public trust in its governance of such technologies. We are all now constantly reminded that our “data trails” are silently and invisibly collected everywhere we go. Even if the government promises that the devices will not be used in this way, it is not surprising that anyone would be wary of yet another means of collecting information about them—especially when it seems to add a new stream to those already rapidly swirling around us.
This article is a SOAP Awards ‘Commentary of the month’ winner (June 2020)