Last month the UK government published a review of the Investigatory Powers Act 2016 (IPA) by David Anderson (former independent watchdog on terrorism legislation, author of two important reviews of surveillance legislation following the Edward Snowden leaks, and now an independent member of the House of Lords). His most significant recommendation is to approve (with modifications) a Home Office proposal to “creat[e] a new, light-touch regulatory regime for the retention and examination by the UK Intelligence Community (UKIC) of bulk personal datasets in respect of which individuals have a low or no expectation of privacy.”
Lord Anderson highlighted this recommendation in a speech this week during a House of Lords debate on AI, because of its relevance to feeding the maw of machine learning tools within the UKIC. I tweeted some brief concerns:
- You hear endlessly from GCHQ and friends that intrusion only happens when someone reads intercepted material, not when it is collected or processed. This sounds similar: “the focus on how bulk data is acquired and retained may further evolve…towards a focus on how it is used”.
- I also lost track, during the IPA debates, of how many times I heard from spooks “but Facebook/Google does this, so why can’t we!” and here we have the AI version: “in a world where everybody is using open-source datasets to train large language models, UKIC is uniquely constrained…”
- Two answers to this point: it isn’t clear FB/Google ever WERE quite the data-gobblers of GCHQ’s dreams, and (as is now becoming ever clearer) some of what they were doing in 2015 was never legal under the Data Protection Directive/GDPR; we may soon enough find this out about OpenAI etc. And more broadly: private companies, even those the size of GAFAM, don’t have states’ monopoly on violence, or their ability to put people in jail, or to reject them at a border, as was highlighted just this week as a consequence of US mass surveillance.
Lord Anderson kindly replied, asking for my views on the relevant chapter (3) of the report. Rather than try to squeeze them into tweets, I’ve set them out below. Bonus reading: some further views at the end from my co-blogger, Douwe Korff!
The report thoroughly and helpfully sets out and tests the current IPA Chapter 7 legal regime relating to use of bulk personal datasets (BPDs). While the recommendations are carefully argued, I disagree with three underlying premises, relating to (1) the strength of safeguards applied to the intelligence agencies compared to other government bodies; (2) the particular human rights approach applied; and (3) the complexity of the existing let alone proposed regime, and the difficulties for democratic legitimacy this creates.
Firstly: I’d argue it is entirely appropriate that there should be stronger safeguards on the use of data within the intelligence community than in e.g. the health service (s3.33), because of the (by necessity) lower levels of transparency and scrutiny of the former. And data scientists’ disappointment that they don’t get to play with all their wonderful new toys (s3.34) isn’t a good justification for weakening fundamental rights protection.
I was slightly alarmed to read officials are spending so much time (s3.36) on “directly delivering BPD authorisations” to enable the agencies to ingest new datasets (with MI6 using automated tools in the process, s3.29) — and not because I think they are wasting their time on safeguards.
Secondly: The UKIC references a European Convention on Human Rights (ECHR) Art. 8 standard of “reasonable expectation of privacy” (s3.48). The more modern rights regime in the EU Charter of Fundamental Rights (CFR) (of course no longer applicable in the UK) does not impose this standard in relation to the right to data protection — rightly, in my view, given how analysis and combinations of seemingly non-sensitive data (and especially with more sensitive datasets) can reveal all kinds of sensitive information — even more so using machine learning tools.
The report compares the UK framework only with three other very culturally similar jurisdictions (indeed, whose legal systems are heavily based on England & Wales’, s3.58). What do European countries with a higher regard for privacy in particular (e.g. Germany) do?
More broadly, I think the Court of Justice of the EU’s (CJEU) approach to data protection is one that will ultimately better protect the essence of fundamental rights, rather than the European Court of Human Rights’ (to me) overly procedural and administrative approach relying on allowing intrusions with endless safeguards. As the report well illustrates, these layer complexity on complexity until those agencies subject to them beg for mercy — and to how effective a long-term end? (Obviously, the ECHR is the applicable law in the UK via the Human Rights Act, at least for now!)
Finally, the report says a “regime which reads logically enough on the pages of the IPA and Code of Practice is proving cumbersome in practice” (s3.28). I’d say it’s heading towards the levels of anti-democratic complexity for which the Regulation of Investigatory Powers Act was rightly criticised, with only a handful of individuals outside government following, barely able to keep up.
As much as I supported Lord Anderson’s A Question of Trust recommendations, which led to the IPA, in practice so far I haven’t been particularly impressed by the Investigatory Powers Commissioner’s Office (IPCO) which resulted, or the Court of Appeal’s interpretation of key IPA provisions relating to the encrypted messaging platform EncroChat. MI5’s now well-known illegal data-handling practices are alarming (and I acknowledge IPCO’s role in challenging them, even if they did go on for at least three years).
One specific problem, highly relevant here too, is limiting public debate to — as the report puts it — “bland and unspecific terms appropriate to an unclassified report” (s3.59). We need to find better ways of doing this than trying to somewhat-blindly delegate democratic decision-making to ministers and highly security-cleared individuals in the Home Office, IPCO and the parliamentary Intelligence and Security Committee (as we know from at least one case, this will exclude experts whose views the Home Office is not keen on).
Despite their use for “more than a century”, it is astonishing that the government only publicly acknowledged its use of these bulk personal dataset capabilities in 2015 (hence they are barely considered in A Question of Trust) (s3.2), and there has been little public debate since. Given the rapid advances in machine learning techniques in the last decade, this will make it particularly difficult for UKIC officials or a judicial commissioner (proposed by the Home Office and Lord Anderson respectively) to decide which datasets could be included in a “low/no expectation of privacy” regime.
Given all of these disagreements on basic premises, I am reluctant to support the report’s specific recommendations, no matter how thoughtfully and carefully they have been reached.
Douwe Korff adds:
Re Ian’s point that “the CJEU’s approach to data protection is one that will ultimately better protect the essence of fundamental rights, rather than the ECtHR’s … overly procedural and administrative approach relying on allowing intrusions with endless safeguards”: I agree. The CFR is a (quasi-)constitutional document that forms part of the constitutional legal order of the EU, an international entity with its own legal order. That means the judicial body charged with upholding that legal order (by the constitutional instruments establishing it, the Treaty on European Union and the Treaty on the Functioning of the EU), the CJEU, can be more forceful (“braver”, if you like) in enforcing it than a supra-national court that is not as such part of the legal orders of the entities to which its judgments and rulings apply (even if in some, like the Netherlands, the ECHR and the judgments of the ECtHR apply directly).
Luxembourg is also more influenced by Continental constitutional doctrines than Strasbourg — specifically also re: the doctrine that the “essence” of fundamental rights may never be compromised: that comes directly from the German Constitutional Court (and is now also enshrined in the CFR). And there is no “margin of appreciation” doctrine in Luxembourg (or at least it is not so stretched as in Strasbourg — especially in relation to national security). And on a more political level, the spines of the Strasbourg judges (never that firm to start with) have been weakened considerably over recent years, not least by the departure of Russia and authoritarian developments in Hungary, Poland (and the UK).
But of course (a) the CFR no longer applies in the UK and (b) the current government wants to change the law so UK judges would no longer have to follow ECHR law or European Court of Human Rights judgments or [interim] rulings…