Technology, Power and the Pandemic

Thanks to Tilburg’s Global Data Justice project for the invitation to speak on their CPDP panel this week to launch their excellent new report, Digital disruption or crisis capitalism? I used the case of England’s contact-tracing app to further illuminate some elements of the report, based on a book chapter of mine which should be published this autumn. Here are my notes:


  • March 2020: NHSx, a “digital transformation” unit of the UK government’s Department of Health and Social Care, commissioned a private software company to produce a centralised contact-tracing app for England
  • While the epidemiologists at Oxford University advising the government suggested a centralised model would allow better top-level monitoring of the pandemic, technical experts and the media loudly criticised the delay in producing the app, and questioned why it was not built on the Apple and Google system for reasons of simplicity, effectiveness, and privacy protection
  • This pressure increased as other governments announced they were switching to a decentralised model, particularly the Republic of Ireland at the end of April 2020 (tens of thousands of people cross the Irish border each day)
  • A large-scale trial on the Isle of Wight, a small island off the south coast of England, found that 96% of iPhones were not detectable by the app, because iOS frequently put it to sleep in the background to save energy
  • This was the final straw for the centralised model, and NHSx switched to a design using the Google/Apple Exposure Notification (GAEN) framework. Uptake was significant (used by 28% of the total population), and an academic/industry review published in Nature concluded the app averted 300,000–600,000 cases by the end of 2020
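To make the centralised vs decentralised distinction concrete, here is a minimal sketch of the decentralised, GAEN-style approach, in which exposure matching happens on the device rather than on a government server. This is an illustration, not the real protocol: GAEN derives identifiers with HKDF and AES over 10-minute Bluetooth intervals, whereas here HMAC-SHA256 stands in for the key schedule, and the `Phone` class and its methods are invented for the example.

```python
# Simplified sketch of a decentralised (GAEN-style) exposure-notification
# scheme. Illustration only: HMAC-SHA256 stands in for GAEN's HKDF/AES key
# schedule, and "broadcast" is a function call rather than Bluetooth LE.
import hmac
import hashlib
import secrets

INTERVALS_PER_DAY = 144  # one rolling identifier per 10-minute interval

def rolling_id(daily_key: bytes, interval: int) -> bytes:
    """Derive the rolling proximity identifier for one interval."""
    return hmac.new(daily_key, interval.to_bytes(4, "big"),
                    hashlib.sha256).digest()[:16]

class Phone:
    def __init__(self):
        self.daily_key = secrets.token_bytes(16)  # never leaves the device
        self.observed = set()                     # IDs heard over Bluetooth

    def broadcast(self, interval: int) -> bytes:
        return rolling_id(self.daily_key, interval)

    def hear(self, identifier: bytes):
        self.observed.add(identifier)

    def check_exposure(self, published_keys: list) -> bool:
        # Matching happens locally: regenerate every identifier a diagnosed
        # user could have broadcast and intersect with what we heard. No
        # contact graph or location data is ever sent to a central server.
        for key in published_keys:
            for interval in range(INTERVALS_PER_DAY):
                if rolling_id(key, interval) in self.observed:
                    return True
        return False

alice, bob, carol = Phone(), Phone(), Phone()
bob.hear(alice.broadcast(interval=42))  # Bob was near Alice

# Alice tests positive and uploads only her daily key.
published = [alice.daily_key]
print(bob.check_exposure(published))    # True: Bob is notified locally
print(carol.check_exposure(published))  # False: Carol was never nearby
```

In a centralised design, by contrast, the observed identifiers (or the contacts they represent) are uploaded to a government server, which performs the matching and so can reconstruct a contact graph; that is what made the epidemiological case for centralisation and, equally, the privacy case against it.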

Democratic accountability

The report found: “We have watched a reduction of the role of government in the delivery of public services, a reduction in the degree to which regulators can scrutinise and set limits for the technologies that affect everyday life, and a profound challenge to the capacity of the civil society organisations (CSOs) who act as the watchdogs and the voice of human rights and social justice when technological overreach occurs.”

  • NHSx controlled the overall design of the app ✅, but there was limited transparency in its decision-making, and huge pressure on the govt to “do something” quickly. Civil society and other stakeholders were “consulted”, but I suspect a few individual experts had the biggest impact, by explaining the issues to journalists and in media interviews. So how democratic was it, ultimately?
  • While the UK (and other governments) put extensive pressure on Apple to open up iOS Bluetooth functionality in the background, there was no discussion of a legal requirement to do so; would that not have been the democratic route? Emergency legislation can be passed very quickly in the UK Parliament.
  • The govt also rejected the advice of Parliament’s human rights committee to legislate specifically on the use of personal data to combat Covid, saying the GDPR was enough. But one could easily imagine a centralised model in the UK political context producing outcomes similar to Singapore’s, where police were using contact-tracing data to investigate crime (later limited to “serious” crime)
  • UK GDPR Art. 25: the Information Commissioner’s Office advised the govt on the design of the app, so ✅ for data protection by design and by default. But did that make it more difficult for the ICO to be involved later in the public debate on centralised vs decentralised models? Elizabeth Denham told Parliament it did not. But could the ICO have gone anywhere near as far as the Norwegian authority, which required the public health body to stop collecting location data at a week’s notice?
  • The govt was constrained by the impact of network effects, and lacked the time to develop interoperability/standards between different models

Moving beyond a privacy/bias legal focus

The report found: “We argue that the pandemic shows us it is time to move the analysis of technological interventions beyond a privacy/data protection or algorithmic bias approach that asks whether companies are complying with the law.” 

  • NHSx commissioned a rapid evidence review on technical constraints and societal impact from the Ada Lovelace Institute (completed 20 April 2020), which concluded:
    • 1) the significant technical limitations and deep social risks of digital contact tracing outweighed the value offered to the crisis response, and
    • 2) extensive work would be required to overcome these issues and justify any roll out.
  • Final report of Ethics Advisory Board noted the speed of app development made it difficult “to explore the ethical implications of the technical choices and tests that were being undertaken to evaluate the effectiveness of the app” (p.8). 
  • The Board expressed frustration at the quality of information it was given to conduct its work, and “did not always feel that suitably comprehensive or final answers were supplied to their questions” (p.8). 
  • The Board further noted that “journalists had received briefings that appeared to be at variance with those being presented to the board. Sometimes technical information that we regarded as ethically significant, and had requested, was not available to us at the point when we needed to crystallise advice” (pp.15-16). 
  • The Board was scrapped in July 2020, but the government committed to making use of the ethical framework it had developed in its broader test-and-trace programme (based on previous work by the government’s Centre for Data Ethics and Innovation)

Constitutionalisation of infrastructure

The report found “Technology firms are increasingly taking on the functions of states: they are providing essential digital infrastructure to the public sector, and in turn providing the parameters for what policy can achieve in fields such as public health, transport and welfare systems.”

  • Constitutional/human rights responsibilities of such firms?
  • In an important sense, Google and Apple acted to constrain the capacity of states to track their citizens; they had a bigger impact than many privacy regulators over exposure notification
  • Importance for human rights of public procurement and governments (trying) to stay in control of the technology they use for public services, as well as ensuring they have meaningful choices over tools and infrastructure

Reducing dependencies with competition and smarter procurement?

The report identified five types of dependencies introduced by technology firms entering previously public sectors: economic and infrastructural; ethical; political; civic; and regulatory.

  • “Sector creep” = economies of scope; “scope creep” = economies of scale
  • Importance of competition enforcement and regulation via new laws such as DMA and Data Act
  • Do governments need to become much smarter (and more ethically aware) commissioners of public-service tech? A role here for the CDEI. Clearly, transparency should be a starting point (but Foxglove had to sue to access other Covid-related technology contracts, DPIAs, etc.)
  • Can governments collaborate more to stimulate the development of “public service tech” (including directly via open source), as well as ensure effective competition regulation of infrastructure?