I’ve found it fascinating to spend much of my time over the last year exploring the digital competition reviews and reforms taking place around the world, particularly in relation to the use of mandated interoperability as a mechanism to improve competition in digital markets with strong network effects — something Chris Marsden and I have been calling for since 2008 🙂
This experience has reminded me how specialised competition law and economics have become, and how both competition and wider Internet regulation might be improved if there were more interaction between these for-now largely separate worlds. Privacy is an example of a policy area where there is much better multidisciplinary cooperation between lawyers, sociologists, behavioural economists (and very slowly and grudgingly, mainstream economists), and computer scientists. (Colleagues such as Chris, Simonetta Vezzoso, Tommaso Valletti, Cristina Caffarra, Liza Lovdahl Gormsen, Rob Nicholls and Kevin Coates are rare exceptions; and kudos to EU Competition Commissioner Margrethe Vestager for including computer scientist Yves-Alexandre de Montjoye in her 2019 review of digital competition, and the UK Treasury similarly for including Derek McAuley in its expert panel on the same subject.)
The meaning of “data interoperability” is a concrete example of how better cross-disciplinary interaction could improve competition policy. This term seems to be used in slightly different ways by different authors, yet is one of the key components of (for example) the German government’s Competition 4.0 recommendations. Some use it broadly, to cover interaction between technical systems run by different firms — for example, lawyer Herbert Hovenkamp:
The phone network is a successful example of this remedy. An antitrust decree issued by a federal court in 1984 changed it from a single firm into a network operated by hundreds of competitors that share interoperability protocols and information. Interoperability works so well that a caller cannot even identify the equipment or carrier used by another caller.
Yet to me, as a computer scientist, the AT&T decree — and telecoms interoperability more broadly, as may also be imposed by EU regulators on electronic communications companies “which reach a significant level of coverage and user uptake” (Art.61(2)(c) EECC) — is the quintessential example of broader interoperability, where competitor systems can cause each other to take actions (such as connecting a call), not just access data from a multi-homing user’s account on another firm’s system. (An interviewee told me the UK’s Open Banking programme moved from a focus on data interoperability to connecting financial services providers via APIs, for efficacy reasons.)
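The distinction can be made concrete in code. Here is a minimal sketch, with entirely invented class and method names, contrasting data interoperability (a competitor reading a user's records on another firm's system) with the broader kind, where one system can cause another to take an action, in the spirit of Open Banking payment initiation:

```python
# Hypothetical sketch: two senses of "interoperability". All names are
# invented for illustration, not drawn from any real Open Banking API.

class BankA:
    def __init__(self):
        self.balances = {"alice": 100}

    # Data interoperability: another system can *read* account data.
    def get_balance(self, user):
        return self.balances[user]

    # Broader interoperability: another system can make this one *act* --
    # here, moving money to an account at a competitor.
    def initiate_payment(self, user, amount, payee_bank):
        self.balances[user] -= amount
        payee_bank.receive(user, amount)

class BankB:
    def __init__(self):
        self.received = []

    def receive(self, payer, amount):
        self.received.append((payer, amount))

a, b = BankA(), BankB()
a.initiate_payment("alice", 30, b)
print(a.get_balance("alice"))  # alice's BankA balance after the payment
```

The first method only exposes state; the second lets an external party drive behaviour, which is what connecting a call, or initiating a payment, actually requires.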
This confusion now appears to be causing issues with EU competition law reforms, with a “data interoperability” clause — Article 6(1)(h) of the proposed Digital Markets Act — unsurprisingly focusing on user data and building on the EU’s existing “data portability” right in the General Data Protection Regulation:
Article 6 Obligations for gatekeepers susceptible of being further specified
1. In respect of each of its core platform services identified pursuant to Article 3(7), a gatekeeper shall…
(h) provide effective portability of data generated through the activity of a business user or end user and shall, in particular, provide tools for end users to facilitate the exercise of data portability, in line with Regulation (EU) 2016/679, including by the provision of continuous and real-time access;
From a deeply nerdy computer science perspective, full data interoperability — the ability of one system to read and write all of the data on another — is simply a mirror image of interoperability focused on actions (such as sharing content with a user). But I have never seen the term used precisely this way by a non-computer scientist. And it is very bad programming practice for external components to directly manipulate data in a system, rather than going through an interface (e.g. an API), because it means the system designer must freeze the internal representation of the system (its data schema), and loses the ability to check that changes made are consistent or otherwise comply with the schema. This is the whole basis of object-oriented programming, which has only been mainstream software engineering practice since… the 1980s 🙂
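A toy example, with invented names, of why going through an interface matters. External code never touches the internal dictionary, so the store can validate inputs and later change its internal schema without breaking callers — exactly what direct data manipulation forecloses:

```python
class ContactStore:
    """Toy store whose internal data schema stays private (encapsulation).

    Callers use add_contact/get_email; the store is free to validate input
    and to change its internal representation without breaking them.
    """

    def __init__(self):
        # Internal schema: name -> {"email": ...}. If external code wrote
        # to this dict directly, this layout could never safely change.
        self._contacts = {}

    def add_contact(self, name: str, email: str) -> None:
        # The interface can enforce invariants that raw data access cannot.
        if "@" not in email:
            raise ValueError(f"invalid email: {email!r}")
        self._contacts[name] = {"email": email}

    def get_email(self, name: str) -> str:
        return self._contacts[name]["email"]


store = ContactStore()
store.add_contact("Ada", "ada@example.org")
print(store.get_email("Ada"))  # caller never sees the dict layout
```

Swap the dictionary for a database tomorrow and callers are unaffected; mandate direct access to the dictionary and the schema is frozen forever.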
I also suspect the resistance to interoperability per se from some economists on cost grounds comes from a lack of understanding of how straightforward it can be, using freely available open source libraries, to add support for open standards to software (compared to, for example, installing extremely expensive new hardware in telecoms networks.)
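To give a sense of how little code this can take: the sketch below (illustrative data, freely chosen field names) exports the same records in two open standard formats, JSON (RFC 8259) and CSV (RFC 4180), using only Python's standard library — no new hardware required:

```python
import csv
import io
import json

# Illustrative records a service might make portable.
contacts = [
    {"name": "Ada", "email": "ada@example.org"},
    {"name": "Grace", "email": "grace@example.org"},
]

# Export as JSON (RFC 8259), via the standard library.
json_blob = json.dumps(contacts, indent=2)

# Export as CSV (RFC 4180), likewise.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "email"])
writer.writeheader()
writer.writerows(contacts)
csv_blob = buf.getvalue()

print(json_blob)
print(csv_blob)
```

Real systems add authentication, consent checks and schema design, of course — but the marginal cost of emitting an open format, given mature open source libraries, is nothing like the cost of new telecoms plant.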
I’m sure I will continue to make mistakes on competition economics and law, and am grateful for the colleagues mentioned above who have already taught me a great deal. I hope we can continue to broaden this conversation, so digital competition policy becomes as diverse a field as privacy, and also better informs (and is informed by) all the other vital areas of debate on Internet regulation currently taking place (for example, over the EU’s parallel proposed Digital Services Act).