Technology-assisted contact tracing (“TACT”), including contact tracing apps, has quickly become a component of many organizations’ and communities’ plans to combat COVID-19. Just as quickly as TACT entered the conversation, so did privacy concerns. In the haste to implement TACT solutions, there are concerns that TACT systems are not being narrowly tailored to achieve the goals of contact tracing, and that broader TACT operations may create unnecessary vulnerabilities for users’ privacy. Norway dealt with these privacy issues head-on when the country’s data protection authority raised these tailoring concerns in June, determining that the app posed a disproportionate threat to user privacy and requiring the country to halt use of its nationwide contact tracing app entirely.
It is not just privacy watchdogs who are policing the use of TACT and its privacy implications. Individual users and potential users have enormous influence over the effectiveness of TACT. Widespread adoption is crucial to the success of a TACT system, with some studies suggesting that 60% of a population would need to install and use a contact tracing app to effectively slow the spread of the virus. However, consumers’ privacy concerns can be a significant barrier to adoption. TACT requires collection not only of static personal data about users, but also (depending on the particular technology used) of their locations, movements, and relationships with other users. Privacy-conscious consumers are particularly sensitive to location tracking, and many will not participate in a TACT system they do not trust to protect such personal and vulnerable data. This may be particularly true in the United States, where there is no national or uniform set of laws regulating how personal information may be collected or used by TACT systems.
While TACT systems by nature need to collect personal data about their users, there are technical and administrative safeguards TACT system vendors and authorities can take to assuage user concerns, reduce privacy risks, and build the trust the system requires to succeed. Norway’s fraught implementation illustrates some of the privacy implications and risks associated with TACT that users are wary of, as well as the protective measures that organizations and communities should consider implementing in connection with any TACT system to help mitigate those vulnerabilities and protect user privacy.
Scope of Data Collection: TACT Systems May Be Collecting Too Much Data
A common concern raised by privacy experts and consumers alike is the scope of data being collected by TACT systems, with location data being a particular point of concern. Location data raises special privacy concerns not only because it is often difficult or impossible to anonymize and can reveal detailed personal information about a user’s movements and associations (even without names attached), but also because it is not actually necessary for effective contact tracing. Norway’s contact tracing app collected location data, contributing to the national data protection authority’s determination that the app collected and stored more data than necessary to achieve its contact tracing purposes. Effective contact tracing typically needs to identify whether two users have been in close contact, not where they met. TACT systems like the joint Apple-Google solution use Bluetooth technology, measuring the strength of the Bluetooth signal between two users’ devices to determine whether they have come into close contact with each other, without having to track their locations.
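To make the Bluetooth approach concrete, a proximity check of this kind can be sketched in a few lines. This is only an illustration under assumed values: the path-loss model, measured power, distance threshold, and exposure window below are hypothetical and are not the parameters of the actual Apple-Google Exposure Notification system.

```python
# Illustrative sketch of signal-strength-based proximity detection.
# All constants below are hypothetical assumptions for demonstration.

MEASURED_POWER_DBM = -59   # assumed signal strength (RSSI) at 1 meter
PATH_LOSS_EXPONENT = 2.0   # free-space propagation assumption
CLOSE_CONTACT_METERS = 2.0
MIN_EXPOSURE_MINUTES = 15

def estimated_distance(rssi_dbm: float) -> float:
    """Estimate distance in meters from a Bluetooth RSSI reading
    using a simple log-distance path-loss model."""
    return 10 ** ((MEASURED_POWER_DBM - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))

def is_close_contact(rssi_readings_dbm: list[float],
                     minutes_observed: float) -> bool:
    """Flag an encounter as a potential exposure when the median
    estimated distance stays within the contact threshold for long
    enough -- no location data is ever involved."""
    if not rssi_readings_dbm:
        return False
    distances = sorted(estimated_distance(r) for r in rssi_readings_dbm)
    median = distances[len(distances) // 2]
    return median <= CLOSE_CONTACT_METERS and minutes_observed >= MIN_EXPOSURE_MINUTES
```

The point of the sketch is that everything needed for the decision is relative: signal strength between two devices and duration, with no GPS coordinates or map data anywhere in the computation.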
Centralization v. Decentralization: Storing and Sharing Data with a Central Authority May Make Data Collected Through TACT More Vulnerable
Another privacy concern raised by privacy experts, and cited by Norway’s data protection authority in its suspension of the nation’s contact tracing app, is the centralization of data collected through TACT. This becomes especially concerning for users when TACT data is centralized in the control of an authoritative entity like a government, employer, or university, which may have particular power over the individual. Further, centralization, even in the hands of a good actor, makes data more vulnerable to attack by bad actors: it creates a single point of access, allowing the breach of one source to compromise the data of all users. TACT systems like the Apple-Google joint solution rely on minimal centralization, storing data collected through the solution locally on user devices until and unless a user voluntarily elects to push information, e.g., a positive test result, out to a central authority. This decentralized approach gives users more control over the use of their information and reduces the risk of a wide-reaching breach or abuse.
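The decentralized pattern described above can be sketched as follows. This is a simplified illustration, not the actual Apple-Google protocol: the key rotation, ID derivation, and reporting flow shown here are assumptions chosen for brevity.

```python
# Minimal sketch of a decentralized contact tracing pattern: data stays
# on the device, and only a voluntary positive report leaves it.

import os
import hashlib

class DecentralizedTracer:
    def __init__(self):
        self.daily_key = os.urandom(16)          # never leaves the device by default
        self.observed_ids: set[bytes] = set()    # IDs heard from nearby devices

    def broadcast_id(self, interval: int) -> bytes:
        """Derive a rotating, pseudonymous ID from the local key;
        the key itself is never transmitted during normal operation."""
        return hashlib.sha256(self.daily_key + interval.to_bytes(4, "big")).digest()[:16]

    def record_nearby(self, rolling_id: bytes) -> None:
        """Store IDs observed over Bluetooth locally, not on a server."""
        self.observed_ids.add(rolling_id)

    def report_positive(self) -> bytes:
        """Only on a voluntary positive report is the key pushed out
        to the central authority for publication."""
        return self.daily_key

    def check_exposure(self, reported_key: bytes, intervals: range) -> bool:
        """Matching happens on-device: re-derive IDs from a published
        key and compare them against locally observed IDs."""
        for i in intervals:
            derived = hashlib.sha256(reported_key + i.to_bytes(4, "big")).digest()[:16]
            if derived in self.observed_ids:
                return True
        return False
```

Because the central authority only ever holds keys that users chose to publish, a breach of the server exposes far less than a centralized database of every user’s full contact history would.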
Commitment to Transparency and a Minimum Necessary Principle
A lack of clarity as to how data collected through TACT systems will be used was the final nail in the coffin for Norway’s national contact tracing app, with the nation’s data protection authority citing the ambiguity as yet another concern contributing to its suspension of the app. Clarity as to the use of TACT data may be of particular importance in the United States, since there is no central privacy scheme guaranteeing protections for users or outlining permissible uses of such information. This creates uncertainty both for individuals, who cannot be sure what privacy rights or protections they may have, if any, with respect to their TACT data, and for organizations and governmental authorities looking to implement TACT systems, who cannot be sure whether their use of the data may violate some component of the United States’ decentralized web of privacy regulations.
TACT system vendors and designers, as well as organizations and authorities looking to implement TACT systems, may want to take a cue from Apple and Google, who have publicly committed to minimizing the data used by their joint solution and to not monetizing the project. This kind of transparency, paired with limiting data usage to the minimum necessary for contact tracing, may both help build user trust and reduce the risk of an entity’s use of the data running afoul of privacy regulation.
With the power to collect and use vital personal information about their users, contact tracing apps are a double-edged sword in the fight against COVID-19: the same data that can reduce a community’s vulnerability to the virus can increase the data subjects’ vulnerability to privacy violations. The information collected by TACT is not only protected personal information, like health status and personal health information, but also the type of information particularly vulnerable to abuse and stigmatization. Contact tracing app providers and technology vendors supplying the foundational technology systems and services should be transparent about their privacy and data collection practices, and should commit to restricting their use of the data to contact tracing purposes. Organizational users, like governments, employers, and universities implementing TACT systems, should make similar commitments to limited use and transparency to reduce the risk of privacy violations and establish trust with users, encouraging the pervasive adoption the system requires to be effective.