About Interoperability

Sophie Bloemen from Commons Network reached out to me about an interview she wanted to do. I asked her to send the questions; here they are, along with my answers. I am sharing them here for feedback before I send them out tomorrow.

Please introduce yourself: what is your experience with interoperability? In particular, have you worked on any projects that implemented interoperability in practice?

I am natacha from petites singularités, which is a founding member of the IN COMMON collective. I have been busy with IN COMMON since its inception and continue to be involved with its latest projects, such as IN COMMON RDF and DREAM. IN COMMON is a collective structure working towards shared protocols and models for the data of the commons.

What are, in your opinion, the core advantages of interoperability, and why is it important? Who can benefit from interoperability, and in what way? We can relate here to the examples of your own work that you provided, and to your field of expertise [technical and standards / legal and regulatory / social and political].

Well… interoperability is a characteristic of systems; it involves many things, from code to infrastructure and human organisation. Talking about interoperability in the abstract does not make much sense: it is important to understand how interoperable systems are organised, and to whose benefit. Interoperability, like everything else, should take into account the power imbalances in existing systems.

I will speak here about the type of interoperability we are busy with at IN COMMON, that is: making citizen-generated data of the commons interoperable across the different actors of the commons. Interoperability is essential for decentralised and distributed commons actors, so that they can organise from their local point of anchorage towards a larger system communicating across borders, and hopefully across continents. By providing models and protocols that allow for the standardisation and decentralised hosting of citizen-generated data, we ensure that the data commons infrastructure is robust and resilient, and we distribute the costs across its actors, allowing structures with less financial capacity to access information without bearing the costs of hosting it themselves. However, the technical features offered by IN COMMON do not resolve everything: the digital divide is overwhelmingly present, and only the most privileged populations have the time and technical capacity to organise commons data, let alone host it.

And what are the challenges, or risks related to interoperability?

Many challenges! Interoperability in itself does not mean anything; what is important is who decides what is interoperable, and how.
As an example, as of today I would certainly not want my community-based decentralised networks to be interoperable with Facebook, because we would be flooded by their data and, in the very short term, become invisible to each other. Furthermore, because of network effects, their modalities and the choices they make to organise their users, including censoring them, would become prevalent over our own community models.
We are facing companies with a worldwide quasi-monopoly over online exchanges; it is crucial that the system we agree upon leaves enough space for diversity and for the multiplicity of projects that the Fediverse and other smaller structures propose.
We published an article last winter about this issue: What is at stake with interoperability - PUBLIC (a translation of Quelques enjeux de l'interopérabilité - petites singularités). Laurent Chemla, whom we cite, also has very interesting views on these issues, and a great deal of experience.
Laurent Chemla also explains very well the confusion between protocols that allow data to be exchanged, and private platforms that track users and require them to log in first, thereby preventing interoperable exchanges outside their control.

We are looking for key examples of interoperability in practice - what are they, in your opinion?

Well, the Internet of course, as the name indicates. The Internet has very clear binding conditions for interoperability inscribed in the structure of its TCP/IP protocols, which require that packets not be discriminated between but treated equally, on a first-come, first-transmitted basis across the network; this is often called net neutrality. It is crucial to the existence of smaller and diversified providers. Preserving net neutrality is essential, and also a constant fight, as corporate monopolies constantly make moves that threaten to overturn it. Thankfully, a consensus in civil society, and the many organisations representing us, have allowed us to keep this model; this is quite unique.

Another widely used example, and among the oldest examples of standardisation and decentralised protocols, is e-mail. E-mail relies on an interoperable addressing model, loginname@example.net, the SMTP protocol to send messages, and the open POP and IMAP standards to retrieve them.
Again, providers have organically organised around these interoperable e-mail protocols and share modalities and tools for spam management, building on each other's experience. However, the overwhelming presence of GMail threatens interoperability: its privatised spam management system declares as spam, or censors, e-mail coming from small domains, which contributes to an important imbalance within the system and effectively breaks interoperability.
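To make the point concrete, here is a minimal sketch in Python of what e-mail interoperability looks like in practice: a message built to the open standard can be handed to any compliant server, and the addresses alone tell the network where to route it. The addresses and the SMTP host name are hypothetical placeholders.

```python
import smtplib
from email.message import EmailMessage


def build_message(sender: str, recipient: str, subject: str, body: str) -> EmailMessage:
    """Build a standard (RFC 5322) message that any provider can handle."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    return msg


def send(msg: EmailMessage, smtp_host: str) -> None:
    """Hand the message to any SMTP (RFC 5321) server; the recipient's
    domain tells it where to relay, whatever software the other side runs."""
    with smtplib.SMTP(smtp_host, 587) as server:
        server.starttls()
        server.send_message(msg)


# Two different providers, one shared protocol: no coordination needed
# beyond the open standard itself.
msg = build_message("alice@example.org", "bob@example.net",
                    "Hello across providers", "Interoperability in practice.")
print(msg["To"])
```

The same symmetry holds on the receiving side: Bob can read the message with any POP or IMAP client, without either provider having agreed to anything beyond the standards.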

My domain of understanding is open standards and protocols that allow civil society to build decentralised software and data management.

ActivityPub is a great set of protocols; it gave rise to a number of software projects that can share activity notifications across their different fields, be it file sharing, photography, text, blogging, book reviews, social networking and more. Most of them meet on the SocialHub forum, where they can discuss the evolution of the standard and further interoperability.
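What these very different projects share is the ActivityStreams vocabulary: each activity is a JSON-LD object that any ActivityPub server can parse, whether it does micro-blogging, photo sharing or book reviews. Below is a minimal sketch of such an activity, built in Python; the actor URL and the note's content are hypothetical.

```python
import json

# A minimal ActivityStreams 2.0 "Create" activity wrapping a "Note",
# the common vocabulary exchanged between ActivityPub servers.
# The actor URL and content are made up for illustration.
activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://example.social/users/natacha",
    "object": {
        "type": "Note",
        "content": "Interoperable by design.",
        "to": ["https://www.w3.org/ns/activitystreams#Public"],
    },
}

# Any receiving server, whatever its specialty, can interpret this
# same structure and decide how to display or federate the Note.
print(json.dumps(activity, indent=2))
```

Because the vocabulary is extensible, a video platform and a blogging engine can exchange activities while each only fully renders the object types it understands.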

Then you have protocols facilitating operations in specific professions, or across public services, for example.

We want to argue that Europe should invest in and support an interoperable public civic digital ecosystem. What are the most important building blocks of a policy that ensures this? What should we recommend as most important steps?

Interoperability is a quality; the word in itself does not mean much. Usually interoperability is associated with a protocol shared across certain people, institutions and software, and more than anything it requires a lot of human organisation to be maintained equitably over time.

As we have seen, interoperable systems are fragile ecosystems that need to be actively maintained in order to keep their interoperable properties.
I would say that the conditions for a properly working interoperable public civic digital ecosystem are well known, and have long been claimed by digital rights defenders:

  • Public Money, Public Code is of course the first requirement;
  • Supporting the existing grass-roots European free software ecosystem by consulting its actors, using it, and contributing to it;
  • Organising real democratic public consultations about technological choices, and democratically setting the next agenda for technology, outside of existing monopolies;
  • Taking back education into our own hands: both the platforms used to transmit (open) knowledge and know-how, and the way we teach technology in school.

Thinking of the many potential uses of interoperability for public institutions and civic projects, what is currently holding these public institutions back from adopting it, or pioneering in this field?

This is a very vast question. I am not sure public institutions are always holding back on interoperability; I think there are a number of fields where they are working towards it, such as border control through agencies like Frontex, police files through dedicated institutions such as Interpol, or, more recently, health data.
Probably the issue is, again: who is included in the conversation, and about what type of interoperability? Maybe listing the possibilities you are thinking about could help answer this question.
However, I think it is important in that process to make sure that citizens' data and personal information are secured and not shared across services; this has not always been the case lately.
In all cases, those issues need to be judged case by case, along clear guidelines of participation discussed and agreed among civil society organisations.

Interoperability could lead to ecosystems, in which commercial, public and civic actors share data and services - how do we make sure that (large) commercial actors do not dominate this space? How large is the risk of commercial exploitation / capture of the IPCE? Do we need additional market regulation to safeguard this?

The risk of capture is very high and has been experienced multiple times in the short history of digital technologies.
It is very difficult to respond to this question in an abstract manner. I guess the key to this process lies in a proper analysis of the ecosystem, with organised systems of participation where the different actors can express and discuss their mutual views.
As of now, it appears that when it comes to discussing technology, the actors that are heard are U.S.-based monopolies.
I personally do not think market regulation can do much here; I also feel it is crucial that we stay aligned with an understanding of the commons as existing only outside the market (Ostrom 1991). I do think, however, that applying existing regulation to protect privacy is essential and absolutely necessary, and most often it is not respected.

From where I stand, it also feels that the only way to protect ourselves from capture is to stay busy with things that have no value for the market: resistance and radical care, for example, I imagine. The good news is that we need a lot more people in those fields.

If there is time, let’s explore the specific cases the interviewee has experience with, to get in-depth information about them.
Please point us to other people and projects that, in your opinion, are doing key work on interoperability.

The recent rejection of the DID interoperability working group at the W3C demonstrates how the big players will never let go of their control over identification, thereby rendering decentralisation impossible.