The interoperability of technologies, data and applications across different government agencies, tiers and jurisdictions has been a keystone of e-government and government transformation programs for almost a decade. The nirvana of any such program is to achieve seamless integration between processes and applications, to make the structure of government invisible (or irrelevant) to service delivery, and to set the basis for agile, truly transformational government.
Jurisdictions worldwide have published government interoperability frameworks in multiple versions (see a few examples from the UK, Germany, Australia, Denmark, New Zealand) and all these have helped move technical interoperability in the right direction. Compatible technical architectures or document standards are necessary conditions for processes and applications to interoperate, but they are not sufficient. As the European Commission’s work on the European Interoperability Framework shows, there are several levels of interoperability, and technical is just one: the toughest are semantic (do data have the same meaning?), organizational (are processes compatible?) and legal (do similar laws apply to the same issues?).
This reminds me of when I was doing research on software reusability, back in the late ’80s. At the time, barriers to software reuse were technical (e.g. different platforms, operating systems, programming languages) as well as organizational (e.g. reuse cost models, design methods and processes). However, even with most of those technical challenges overcome (through the adoption of open standards and increasingly interoperable platforms), reuse is still relatively low, because organizational challenges are far more difficult to solve.
The same applies to interoperability: we can have all countries and regions using the same (open) standards, and yet data models may be incompatible. Technical interoperability provides a common lexicon and part of a common syntax, but can’t help with semantics. The realm of semantic and organizational interoperability dangerously borders whole-of-government enterprise architecture, another venture that has rarely provided much value, besides creating a context for compliance and scrutiny.
An interesting question to pose is: how much more effort will be put into cracking the toughest interoperability problems, taking into account the difficult times ahead? This question would clearly be amenable to some scenario planning, but let me just take the most controversial view. If this is going to be a long and deep recession and governments need to deploy exceptional resources to sustain economic development and social cohesion within their respective boundaries, government interoperability may become irrelevant.
Let me start with the counterargument to my own thesis: in order to rapidly achieve challenging political objectives around economic recovery and job protection and creation, different government agencies and tiers need to collaborate more effectively and efficiently. However, a focus on national and local priorities will cause efforts like “European Interoperability” to slow down or even grind to a halt. Within countries, states and provinces too, what is likely to happen is a recognition that there is no more time to devote to semantic or organizational interoperability, to government-wide enterprise architectures and global transformational programs.
On the one hand governments will have to slash the cost of their operations (finance management, HR, procurement, general administration) and put some programs on the back burner (as they contribute less to economic recovery or welfare). On the other hand they will have to focus on reinforcing critical programs and launching new ones (such as bail-outs for entire industrial sectors or new infrastructure investments) of a nature and a scale that are almost unprecedented.
This will lead to consolidating and commoditizing IT related to general administration, wherever feasible, by imposing shared or centralized services: interoperability won’t help much (besides using legacy data), as agencies will need to transition toward common services, applications and infrastructures, as opposed to running their own in an interoperable fashion. Where this is unfeasible or requires too much time, things will be left as they are, with budgets simply reduced.
Low-priority programs will be left lagging behind, with just the IT support that is required to let them survive, but little or no money for any significant enhancement, let alone major architectural redesign.
Finally, high-priority programs will have very specific interoperability requirements that are likely to break the boundaries of current efforts in the field. They will have to engage resources, information and processes across different industry sectors. For example, how to measure the impact of tens or hundreds of millions of dollars in spending? How to prevent misuse of such funds? How to adapt job creation and support measures to the changing landscapes in various sectors and parts of the jurisdiction?
Interoperability will become (or – in a certain sense – remain) a tactical issue, to be faced in conjunction with a specific problem to be solved. There will not be much space left for one-size-fits-all measures, for further enterprise architecture, or for religious battles about which flavor of an open standard is open enough.
Interoperability will still play a role in cushioning government agencies against the risk of defaulting IT suppliers responsible for maintaining legacy products. In many cases, these products will be proprietary in nature and hardly interoperable with the new products that are candidates for replacement. Therefore, backward compatibility with proprietary data formats will become a key criterion in selecting new, open-standard-compliant alternatives.
In essence, in a long and deep recession the attitude to interoperability will become much more pragmatic. This is an opportunity to show what its true value is, and a challenge for those who still pursue an overly ambitious and (sometimes) abstract approach to it.