Gartner Blog Network


What is Wrong with Interoperability (in healthcare)?

by Andrew White  |  September 30, 2016

I have to put the tired old record on….so here we go….

Interoperability (we need to argue what we mean by this term) cannot work effectively if organizations only focus on data standards; they need to recognize that business process standards are required as a prerequisite.  Since 99% of all interoperability efforts focus on technology, communication, and data standards, is it any wonder that so many fail to yield the value they promised?

I saw this article this morning: List 9 Crucial Steps for Achieving Interoperability.  The article concerns healthcare data and the need, the requirement, to share data across the industry.  It is a good article, and a good list, but I think it misses a key point that most similar efforts miss, and I see it all the time.  Here is the list:

  1. Data Standards
  2. Enterprise and Consumer Data (standards and scope)
  3. Patient identification and matching (data quality)
  4. SOA and web-based services (technical interoperability)
  5. Alignment of Incentives
  6. Data Sharing
  7. Aggregation and privacy
  8. Workforce enablement
  9. Trust – privacy and security

The hidden tenth and most important missing element would in fact replace item 1 and give real meaning to item 5.  The missing link is process standards and process interoperability.  When organizations focus on data standards they tend to deliver some kind of standard that seeks to ensure that all stakeholders use the same data definitions.  In other words, many such efforts focus on the metadata describing the data that is supposed to interoperate: we should all use the same rules for how we define the data.  This is all goodness.  It even works, sometimes.  But assuring the data itself is the same does not actually deliver interoperability.  It might in some cases, but in many cases it is the use of the data, the context, that creates the challenge.

We might all agree to a specific set of Units of Measure for defining the scale of a volume, and we might even agree to a definitive or permissible set of values for good measure.  What can go wrong with this?  First, data quality; second, process interoperability.  The quality of the actual data stored and shared might vary.  This we know about, of course, and we can apply some techniques and tools to help.  So let’s assume this is resolved and quality is acceptable.  So what of process interoperability?
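
To make that concrete, here is a minimal sketch in Python (the field names and unit codes are made up for the example) of what a shared unit-of-measure vocabulary actually buys you: a record either conforms to the agreed values or it does not.  Notice that the check says nothing about how the volume will be used downstream.

```python
# Hypothetical illustration: an agreed unit-of-measure vocabulary for volumes.
AGREED_VOLUME_UNITS = {"mL", "L"}  # the "definitive or permissible set of values"

def conforms_to_standard(record: dict) -> bool:
    """Check only what a data standard can check: the unit code is permitted
    and the value is a usable number. It says nothing about how the volume is used."""
    try:
        volume = float(record["volume"])
    except (KeyError, TypeError, ValueError):
        return False  # a data quality failure, not a semantic one
    return record.get("unit") in AGREED_VOLUME_UNITS and volume > 0

# Both of these pass: one could be a specimen draw, the other an infusion order.
# The standard cannot tell them apart; that distinction lives in the process.
print(conforms_to_standard({"volume": "10", "unit": "mL"}))   # True
print(conforms_to_standard({"volume": "0.5", "unit": "L"}))   # True
print(conforms_to_standard({"volume": "ten", "unit": "mL"}))  # False (quality)
```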

Process design defines what kind of data is needed and, more importantly, why and how it is used.  For example, patient data is needed in financial systems as well as patient care systems.  The context, though, is very different, and so how the data is used differs widely.  It is in the details of “how the data is used” that the semantics of the data, even the same data, change.  As such, a focus on data standards has to take context into account.
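
As a rough illustration (the record fields and the 5 mg/kg figure are invented for the example), here are two consumers of the same patient record: the billing process and the care process read identical fields but attach very different meaning, and very different quality requirements, to them.

```python
# Hypothetical patient record shared, unchanged, by two processes.
patient = {
    "patient_id": "12345",
    "encounter_date": "2016-09-30",
    "weight_kg": 82.0,
}

def billing_view(record: dict) -> str:
    """Financial process: the weight is irrelevant; the date anchors the claim period."""
    return f"Claim for patient {record['patient_id']}, service date {record['encounter_date']}"

def care_view(record: dict) -> str:
    """Clinical process: the same weight drives dosing, so its currency and
    accuracy matter in a way the billing process never notices."""
    dose_mg = round(record["weight_kg"] * 5)  # illustrative 5 mg/kg calculation
    return f"Dose for patient {record['patient_id']}: {dose_mg} mg"

print(billing_view(patient))
print(care_view(patient))
```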

Again, we know this.  So why don’t data standards deliver this?  Because the focus is on data, not process.  A focus on data standards soon ends up as a focus on APIs and how technology moves data around.  That is quite easy, overall.  If the focus were on process, the right kind of data standards would have to follow.  But because we ask the industry to focus on the lowest level of the stack (data), they miss, they are incented to miss, the really important point.

So what does a focus on process interoperability really mean?  It means that conceptual and logical process models need to be defined and compared, so that all uses of the data by all the processes are understood and the necessary semantics captured (in the metadata) in order to support effective data standards.  If the processes themselves can interoperate, the data will necessarily interoperate.  But when the data is merely exchanged mechanically, without an understanding of the processes and process interoperability, the result will just be the usual: lots of IT spend and little real usable or value-adding interoperability.
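
One way to picture “semantics captured in the metadata” is to record, per data element, which processes consume it and what each one needs, before arguing about formats.  The sketch below is hypothetical (the element and process names are invented), but it shows the direction: the process-level requirements drive the data standard, not the other way around.

```python
# Hypothetical: capture process context for a data element before standardizing it.
# Names and contexts are illustrative only, not a real healthcare information model.
element_metadata = {
    "patient_weight": {
        "definition": "Body weight in kilograms at time of measurement",
        "used_by": {
            "medication_dosing":    {"freshness": "current encounter", "precision_kg": 0.1},
            "population_reporting": {"freshness": "any prior value",   "precision_kg": 1.0},
        },
    },
}

def strictest_precision(element: str) -> float:
    """The data standard has to satisfy every consuming process, so the most
    demanding process-level requirement wins."""
    uses = element_metadata[element]["used_by"].values()
    return min(use["precision_kg"] for use in uses)

for process, needs in element_metadata["patient_weight"]["used_by"].items():
    print(process, "needs", needs)
print("standard must record weight to within", strictest_precision("patient_weight"), "kg")
```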

I wrote a note a long time ago: New Peer to Peer Solutions Will Redefine the B2B Supply Chain.  This is old stuff for sure, but there is some sound logic behind the research that explains nicely what process interoperability is and how it should work.  If only we could get the data guys to think more about process (and the process guys to think more about data)….


Category: health-it  healthcare  information-exchange  information-sharing  interoperability  peer-to-peer-p2p  

Andrew White
Research VP
8 years at Gartner
22 years IT industry

Andrew White is a Distinguished Analyst and VP. His roles include Chief of Research and Content Lead for Data and Analytics. His main research focus is data and analytics strategy, platforms, and governance.




