
Decentralized identity discussed: An INATBA roundtable round-up

Flashbacks  •  
Feb 08, 2021

One of Jolocom’s founding principles is its mission to advance the establishment and maintenance of standards in our industry. As such, we are very happy to have our team member Kai Wagner serve as an active board member of INATBA (the International Association for Trusted Blockchain Applications) and as co-chair of the organization’s Identity Working Group.

It was in this role that Kai chaired a recent INATBA event, a roundtable discussion based on the organization’s position paper, Decentralised Identity: What’s at Stake?, published in late 2020. Joining Kai were four fellow stakeholders, each with their own view on where decentralized identity is at present and how it should take shape over the months and years to come. 

Sitting around the virtual table with Kai was Karyl Fowler, CEO of Transmute, a software company working in the sector and an active member of the Decentralized Identity Foundation’s (DIF) steering committee. Also from the US came Anil John, technical director of the Silicon Valley Innovation Program at the US Department of Homeland Security’s Science and Technology Directorate.

Joining them was Dr. Loretta Anania, scientific officer at the European Commission’s DG Connect who, through funding, helped get INATBA off the ground and was there in her capacity as project officer of eSSIF-Lab. And finally, from Belgium came Daniël Du Seuil, convener of the European Self-Sovereign Identity Framework and leader of the Belgian national self-sovereign identity (SSI) project.

The paper, written over the course of a year by six key authors, was produced by INATBA with the aim of defining the expectations for an open and innovative market in decentralized identity. It analyzes the current state of SSI developments and opens up discussion of the next steps, taking a clear stance on how INATBA would like to see development continue, making demands of the key stakeholder groups and setting goals for the future of SSI.

Answering the paper’s core question of what’s at stake, it gives three essential scenarios:

  1. Ideal – full convergence of SSI technology with interoperability by default.
  2. Functional – partial convergence resulting in detached ecosystems.
  3. Dysfunctional – no convergence and isolated, locked-in ecosystems.

With an awareness that the INATBA paper’s ideal position might be seen by some as “utopian”, Kai then went round the table to gauge the participants’ initial responses to it.

Responding first was Karyl, who agreed wholeheartedly with the INATBA position both from her perspective as a member of DIF and as a business person. Without achieving interoperability, she said, we “risk implementing a solution that further locks individuals out of mobility – socially and otherwise – as well as locks businesses out of future business agility.” 

Karyl added: “We don’t want to implement innovation that takes two steps back.”

Anil, having told the audience of his earlier doubts that blockchain was the universal panacea some of its champions believed it to be, went on to say, “I tend not to buy into the techno-utopianism of self-sovereign identity.” Despite this, Anil also said, “I do believe in personal agency and control.”

This, he explained, was the starting point for work he was funding to try to create a competitive ecosystem of services, noting, “As a US government agency, we have often been walked into a corner by vendors and told this is the only magic thing that will solve your problem.”

“We are not interested in being locked into another platform. If you want to go down that path, you need to ensure that there exists a competitive ecosystem of service providers that actually interoperate at a base level on the security and privacy side. That is the motivation for us.”

With 30 years of experience funding Internet research for the European Commission and time spent at MIT before that, Loretta explained that this, the funding of research, was core to her perspective on prioritizing investments. Specifically, with eSSIF-Lab she is supporting the strategic planning for the next generation Internet and laying the foundations for the proposed new European digital ID. To do this, Loretta said, we need to discover “how to put more trust in the network, a more decentralized trust framework and more permissionless autonomy to safeguard accountability and cherish free speech on the Internet.”

“Open is easy to say and hard to do,” she added. However, with the European Commission’s plan to make the next decade one of twin transition, putting green and digital into everything it does, Loretta noted that, although this will be difficult, it is also a beginning and that, especially when standards are being set, “It’s important to be there, involved at the beginning.”

Daniël, coming from a perspective of wanting to empower citizens, made it clear he thought the INATBA paper addressed important questions about the evolution of the technology and its adoption. “It’s not that easy,” he said. Both public and private sectors need to know the best ways to migrate from the current centralized to new, decentralized systems.

SSI, he argued, could be the answer to what the new call for a European digital ID is looking for. If the European digital ID is all about empowerment, privacy and new models of data sovereignty, it needs to be technology neutral. In the end, Daniël noted, the question is always the same: How can we empower the citizens?

Tying up the common threads between the panelists’ first responses, Kai noted a clear difference between the US and European approaches. While in Europe the exploration and funding of research into digital identity technology and SSI came from a perspective of data sovereignty for citizens, in the US there was a broader requirement: a desire for no system lock-in across all types of exchanges, such as the provenance of goods and information as well as private identities. As such, there was a difference in the motivation behind the funding of projects on either side of the Atlantic. However, as these were essentially similar, if not the same, problems, how could we ensure that the solutions that emerge are compatible?

Here, Loretta was quick to come in, arguing that use cases were key, both in making sure that we’re not trying to solve problems that don’t exist and that we are solving those that do. But, she went on to say, she didn’t believe that the core problem was seamless compatibility; rather, it was one of waste, and the answer is data minimization. “The internet of waste is about unsustainable pollution, consuming resources at a time when the green transition is our biggest problem.”

“I don’t think that it’s an impossible task,” she said. But, for now, we should look to the use cases, “start small and use the best possible solution.”

In his experience, said Anil, “Everybody talks interoperability but seems to have a different idea of what it is.” He went on to argue that true interoperability requires two things. First, standards and specifications must be baked into the implementation of the technology, with conformance proven by publicly open, interoperable test suites that are developed and made available to check that implementations actually conform to the standards. Second, you need end-to-end matrix testing between implementations. Only then can you ensure both conformity to standards and genuine interoperability at the product and vendor level. These were principles that all the participants generally agreed with.

In light of this, Anil explained that the Department of Homeland Security had decided that interoperability test suites could not be developed and owned by the companies it was funding to carry out the research. They should be owned and made available independently – under the W3C umbrella. “It makes sense for global collaborations,” he argued, “to work for the common benefit of our citizens.”

Agreeing, Karyl also put forward her belief that the way business owners or individuals come to the conclusion that interoperability is important is very use case-dependent. It’s easier to see in public sector applications. “We have to acknowledge that for each of these use cases, there are existing standards,” she said, addressing the fact that switching technologies is a costly affair and not one that businesses can take on lightly. “In many cases, what we’ve found is that a lot of these processes are not standard across regions or even within industry itself… Technology is not a silver bullet. If we implement it on a process that is fundamentally not serving the way that we’re interacting in the 21st century, we’ve just created more problems.”

Returning to the issue of standards, Daniël suggested that we need to think both of domain standards and technology ones. With a new technology, he argued, you need stability. You don’t need lots of new initiatives all the time and, at the moment, the technology is not mature enough to say that interoperability can be arranged in one way or another. We all know what the end goal is, Daniël said, but getting there isn’t going to be easy.

At this point, Kai took the opportunity to put a final question to the panelists. “If we actually want to get closer to some kind of proper interoperability,” he asked, “what is the kind of interaction we want to enable, irrespective of the use case, that should then serve as a basis for further work?”

A passionate Anil leapt in to answer this, saying that the hardest part when it comes to interoperability is following what he called the three Ps: “Pipes, payloads and policies.”

Pipes, he said, are the protocols you need to agree on. Payload is the data format of what’s moving through the pipes. Everything else is the policy construct that you put in place to make sure that everybody’s talking about the same thing.
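To make the taxonomy concrete, here is a minimal illustrative sketch of how the three Ps might map onto common SSI building blocks. The concrete choices (DIDComm as a pipe, a W3C Verifiable Credential as a payload, an issuer-trust rule as policy) and all identifiers are our own hypothetical examples, not ones given at the roundtable:

```python
# Illustrative only: one possible mapping of the "three Ps" onto
# self-sovereign identity components. All DIDs here are hypothetical.

# Pipe: the transport protocol both agents agree to speak.
pipe = {"protocol": "DIDComm v2", "transport": "https"}

# Payload: the data format moving through the pipe -- here a minimal
# W3C Verifiable Credential-shaped document.
payload = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:university",  # hypothetical issuer DID
    "credentialSubject": {"id": "did:example:alice", "degree": "BSc"},
}

# Policy: the rules ensuring everybody is "talking about the same thing".
policy = {"accepted_issuers": ["did:example:university"]}

def accept(credential: dict, policy: dict) -> bool:
    """Toy policy check: accept only credentials from trusted issuers."""
    return credential["issuer"] in policy["accepted_issuers"]

print(accept(payload, policy))
```

Even in this toy form, the point of the taxonomy shows through: two parties can share a pipe and a payload format yet still fail to interoperate if their policies disagree on which issuers to trust.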

He also argued, in agreement with Karyl’s earlier comments, that “rip and replace is not a strategy or path to success,” saying you can’t expect any sector to tear up its existing infrastructure just because something “bright and shiny” turns up. Technologists need to understand where people are and meet them there, providing something that reuses the technology those people have already spent a lot of time and money setting up. What we need to do is “provide connectivity to the new.”

Karyl addressed matters from the perspective of an American watching how things are progressing in the EU, and recognized how productive it was to have the public sector buy into the need for interoperability and standards. This, she said, was liberating, as it would redefine what was pre-competitive and where a business’s value lay. As a business, working with your competition rather than against it could only be productive, as you could tackle problems together rather than in parallel. It was liberating for innovation.

However, Karyl also recognized that regulation and policy are, depending on the use case, the things that will define what is pre-competitive. It will look different when considering education, global trade, or identity – be it an individual’s, an enterprise’s or an organization’s.

Agreeing with Anil, Loretta said that, in the EU, “We build on what works.” In her experience, standardization takes decades. “Do not underestimate,” she said, “the amount of effort that the EU had to put in to get a passport that’s recognized in one country to be also recognized in all the others.” 

What’s more, standards were not the right place to start, Loretta argued. “I don’t start with standards,” she explained. “I start with people who have good ideas, who are innovative.”

“I totally agree that standardization is a long process,” said Daniël, going on to argue that for him, interoperability is next to security in its importance. However, regulation is also essential as openness will not come easily. “It’s not something that emerges naturally,” he noted.

With that, and the clock running out, Kai called for the panelists’ closing thoughts.

Sadly, Anil had already had to leave the table. Karyl meanwhile suggested that eventually interoperability and the development of standards are key to driving adoption. Without that, she argued, we would be locking businesses out of a more agile future and locking individuals into limited mobility. 

Coherence with common standards will come, thought Loretta, and ultimately people will vote with their feet. However, reiterating her earlier point, she said it will take time. “But I’m patient.”

Striking an optimistic final note, Daniël said he thought that this is an interesting time. Right now, both industry and the public sector are interested in decentralized identity and what it can offer them. Now, he argued, we need to prove that this technology really works. If we collaborate in an open way then interoperability will come.


Cover image © INATBA
