Wes Rishel

A member of the Gartner Blog Network

VP Distinguished Analyst
12 years at Gartner
45 years IT industry

Wes Rishel is a vice president and distinguished analyst in Gartner's healthcare provider research practice. He covers electronic medical records, interoperability, health information exchanges and the underlying technologies of healthcare IT, including application integration and standards.

Healthcare Interop and the ARRA: Hope Happens

by Wes Rishel  |  February 16, 2009  |  18 Comments

For a while, we will all be trying to estimate how the American Recovery and Reinvestment Act of 2009 (ARRA) will affect our bailiwicks. On listservs, some writers read the tea leaves and foresee a much broader and more systematic approach to healthcare informatics and interoperability than ONCHIT has pursued. They argue, with some justification, that the true “electronic health record” is more about information and the way individual health IT systems are interconnected than it is another term for the EMR with some unspecified interoperability thrown in.

Soothsayers, however, should say a different sooth. Congress is clearly not looking to go back to the drawing board. The ARRA encodes in law the approach adopted by the Bush administration through executive orders and administrative actions.

Some believe that the current approach is doomed to failure because of the lack of the aforementioned systematic framework for current interoperability efforts. Others, including me, believe that application integration is always messy, if for no other reason than that the goal will always be to integrate systems that are at different points in their life cycles and have different information models. Successful application integration depends entirely on identifying specific scenarios that are either within easy reach for a majority of systems in use or of such clear and measurable value to the owners of the systems (not the vendors) that they will pay for the re-engineering.

In other words, national interoperability in the U.S. cannot rise above rough-cut precision in the short term (say, ten years).

One can only hope that the U.S. gets that far. Interoperability under HIPAA has been successful in a few cooperating communities but generally a disaster. HIPAA absolutely proved that the full Federal regulatory process is too ponderous for promulgating IT standards.

The current U.S. approach is faster than HIPAA, but it suffers from one of HIPAA’s main problems: the entity responsible for producing the specifications is not responsible for their being implemented. For some time, I have been speaking about the notion of profiler-enforcer organizations (PEOs). (Gartner clients can read a more detailed piece on this topic here.) Organizations such as Connecting for Health in England and Infoway in Canada take multiple standards, develop harmonized profiles and see the process through to contracting with HIT vendors to implement them. Their charter is to get HIT implemented, and standards are a part of the process. IHE is similar in that it at least takes responsibility for a full cycle through Connectathons.

The fundamental point is that interoperability is not achieved by a waterfall process, but by recycling. Early deliverable specifications are rough drafts to be tuned by implementation experience. Unless the entity responsible for creating the specs is responsible for them being used well, the feedback loop will remain open and confusion will prevail.

In the U.S., the work is split between HITSP, CCHIT, the NHIN Trial Implementation project and perhaps other implementation projects. The feedback loop is broken.

The ARRA allows a hopeful person some justification for short-term and long-term optimism about healthcare interoperability. In the short term, the language includes support for rejiggering the U.S. process so that the goal of creating actual interoperation might be under a tighter span of control.

Long-term hope arises from language directing NIST to award academic grants for multidisciplinary “centers for health care information enterprise integration.” We can hope that NIST develops the grants in a way that supports a more systematic look at the challenges of healthcare interoperability. One also hopes potential applicants for such grants will work immediately to bone up on the excellent if incomplete work done by HL7, the Object Management Group and the National Cancer Institute in establishing a more rigorous basis for interoperability. Principles developed there can find their way into the architecture of HIT systems over time and can get us from rough-cut to a smooth finish, if not to finely honed cabinetry.


Category: Healthcare Providers, Interoperability, Vertical Industries

18 responses so far

  • 1 J. Marc Overhage, MD, PhD   February 16, 2009 at 10:51 pm

    Hear! Hear! Wes. As many of us have called for, we need to create healthcare interoperability 0.1 and refine it to get to 1.0 and then, eventually, 2.0. Your clear identification of the problem and the charge “The fundamental point is that interoperability is not achieved by a waterfall process, but by recycling. Early deliverable specifications are rough drafts to be tuned by implementation experience.” is correct and critical to any success we may have. The corollary is that we cannot design every specification up-front, without implementation, and have any hope that they are “right”. We need to focus on a limited number of high-value specifications, refine those, and then expand as we are able.

  • 2 Norman Daoust   February 16, 2009 at 10:58 pm

    Your mention of “Unless the entity responsible for creating the specs is responsible for them being used well, the feedback loop will remain open and confusion will prevail.” reminds me of a story I heard recently about why German roads are of considerably higher quality than US roads. Living in the Boston area, I find the word “considerably” an understatement. The reason given was that a contract to build a road in Germany includes a provision to maintain the road for ten years. In contrast, in the US we typically select the lowest bidder to build roads.

    Should we consider only allowing implementers to create healthcare standards? Or perhaps we should require standards developers to implement their standards twice (à la software development organizations using their own software, aka “eating your own dog food”).

    I recall once asking a group of people developing healthcare standards how many had implemented a version of the standard at least once: fewer than half raised their hands. How many had implemented a version of the standard more than five times? Only two of twenty raised their hands. (In case you’re wondering, I was one of the two!)

    By the way, is that a new organization you’re a part of: the Operating Management Group? :)

    In my efforts to help US healthcare standards organizations harmonize a portion of their standards, I’ll keep in mind a quote from an optimist I admire:

    “What you do is insignificant, but it is very important that you do it.” -
    Mohandas Gandhi

  • 3 charles kennedy   February 16, 2009 at 11:10 pm

    Wes,
    Good to hear from you. The more I hear the word interoperability, the more concerned I am that this one word will create expectation mismatches. When I speak to Congressional staff, there is an expectation that this deployment will create interoperable health care (as opposed to interoperable systems), such that ideal decisions will get made and care fragmentation will be resolved. Yet, when I look at where we are with the CCD and the notion of interconnected EMRs, I worry that we will all execute exactly what we think the other is expecting and still have some very significant expectation mismatches.

  • 4 Sumit Nagpal   February 17, 2009 at 1:34 am

    First, a disclaimer – this was posted at 1:31 am – hence my apologies in advance for wordiness.

    Wes – as always, you are right on about organic growth of interoperability around specific use cases that provide high value to large numbers of people (clients of vendors). This is one of a small number of approaches that have succeeded in our style of free market. Financial incentives from payors (including CMS) are another.

    NHIN-II demonstrated that this use-case approach could work in “internet time” by gathering a set of operating HIEs and focusing their strategic and technical teams on cooperating with one another under a strict deadline and under the consistent guidance of a watchful cheerleader – the ONC. We all succeeded in designing and implementing a handful of high-value use cases and the “core infrastructure” necessary to support them.

    The end result proved that:

    - Given the right incentives, deep interoperability can be achieved across many *competing* vendor systems spanning a dozen plus sizable organizations in “internet time” – a first for our industry

    - When tightly led under a single, unified vision (think ONC rather than each state or region creating its own standards), a national-scale project such as the NHIN-II that required participants to be “production ready” by its conclusion can in fact deliver production ready interoperable solutions on a national scale. MedVirginia is set to be the first NHIN connected HIE (NHIE) transacting patient summaries supporting disability claims with the Social Security Administration, and others are soon to follow

    - Deep interoperability can happen without the hand-crafting that was required with “legacy” approaches. MedVirginia’s and CareSpark’s Wellogic systems can “plug and play” with the VA’s VistA and Kaiser’s EHR and others simply because we all worked together to create, and then followed, the same standards and passed the same rigorous certifications. NHIN-II has advanced the state of interoperability by a light year – all within one calendar year. Plug and play via web services is within reach and in fact now exists in each NHIN-ready vendor system

    - Normalized data exchange (semantic interoperability) is a reasonable (and invaluable) requirement for healthcare interoperability. Being able to review and make sense of patient observations from multiple venues of care without the usual paper chase and without having to translate values across systems – not just moving data between systems, but ensuring that it means the right things when it gets there and is used – is available here and now. Each NHIN-II participant complies with a HITSP-derived terminology set for all the defined use cases, converting internal codes and terms to this normalized form for data exchange (a simplified sketch of that mapping step follows this list)

    - The holy grail of interoperability – doing all of the above so that evidence-based guidelines and decision support rules may be consistently applied to normalized patient data across all venues of care, and therefore inform patient care *at the point of care, at the moment of truth* rather than retrospectively – is now readily available in systems such as MedVirginia’s that have taken the NHIN-II charter to its logical conclusion. MedVirginia – and several other Wellogic clients who have adopted this model – are ready to support safer, more cost-effective care delivery via real-time application of such rules and guidelines assembled from the AMA, PQRI, AQA, and others.

    - Solutions are not only feasible but now “shrink wrapped” via NHIN-II for complex topics like patient consent, clinical transaction data content, authorization/authentication and others
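
    To make the normalization point concrete, here is a deliberately simplified sketch of what that code-mapping step can look like. The local codes, the mapping table, and the Python helper below are invented purely for illustration; they are not drawn from any particular vendor system or from the HITSP specifications themselves.

        # Illustrative only: map one sending system's local lab codes to a
        # shared vocabulary (LOINC) before the data leaves the organization.
        # The local codes and the mapping table are hypothetical examples.
        LOCAL_TO_LOINC = {
            "GLU_SER": ("2345-7", "Glucose [Mass/volume] in Serum or Plasma"),
            "HBA1C":   ("4548-4", "Hemoglobin A1c/Hemoglobin.total in Blood"),
        }

        def normalize_result(local_code, value, units):
            """Translate a locally coded observation into the shared terminology."""
            if local_code not in LOCAL_TO_LOINC:
                # Unmapped codes are flagged rather than passed through silently,
                # so the receiving system never has to guess at meaning.
                return {"status": "unmapped", "local_code": local_code}
            loinc_code, display = LOCAL_TO_LOINC[local_code]
            return {
                "status": "ok",
                "code_system": "LOINC",
                "code": loinc_code,
                "display": display,
                "value": value,
                "units": units,
            }

        print(normalize_result("HBA1C", 6.8, "%"))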

    2009 is going to be an exciting year for healthcare interoperability because of the tremendous foundation laid by the ONC, SSA, and NHIEs that participated in NHIN-II and are now gearing up to go live. We look forward to helping accelerate the pace of this change. The standards, techniques, and incentives are smaller hurdles now. It is time for the vendor community in particular to step up and deliver the impact on safety, outcomes, and cost that we have been proposing – in internet time.

    Sumit Nagpal
    President and CEO
    Wellogic

  • 5 Mark Frisse   February 17, 2009 at 6:41 am

    Thanks to Wes and Gartner for this realistic and thoughtful perspective. Few have a better view and perspective.

    There is in ARRA a great opportunity to make some adjustments and close the loop. We all need optimism (and clear heads) as we go forward.

  • 6 Charlene Marietti   February 17, 2009 at 8:56 am

    Whether you are an idealist or a realist, interoperability remains a shorthand term to indicate a streamlined and more effective/efficient system of care, but is as elusive as it ever was.

    So far, healthcare SDOs are to be commended for their selfless work toward achieving the goal of interoperability. Unfortunately, the advancements are primarily effective only among products in the pipeline–not in all the bits and pieces of legacy apps currently in use.

    Oh, and let’s not forget the cultural aspects of privacy, business advantage, etc. To date, many have focused on the “if onlys”: if only we had more money, if only we could get physicians to digitize, etc.

    It’s time to move on. Many people are working hard to make interop a reality, but it’s unlikely to happen overnight (How long have HITSP, HL7, CCHIT, and others been working?) and it won’t happen without commitment and buy-in from end users and purchasers.

  • 7 Ted Klein   February 17, 2009 at 9:17 am

    Thank you, Wes, for a thoughtful viewpoint on the impact of ARRA. As you wisely point out, there have been many efforts in the past to create a single unified vision and architecture for all of healthcare interoperability, and such efforts have crashed on the shoals of reality. None of us will live long enough to see the end game of any universal “Big Bang” solution.

    The current approach of targeting cooperating communities to implement specific use cases, although appearing piecemeal and fragmented on the surface, has proven to be an effective means of moving things forward. I too am hopeful that the additional funding and direction in the bill will help to accelerate this process, but I am also mindful of the danger of unrealistic expectations. It is unclear how, at the end of the day, these myriad implementations in many locales for multiple use cases will bolt together to provide at least the simulation of the expected ‘seamless interoperability’ that so many politicians seem to prattle on about. But the large number of implementers who are now actually working with the standards, which themselves are more mature and comprehensive, is a positive sign that a critical mass to enable wider development and deployment, if not already reached, may be imminent.

  • 8 Gary Christopherson   February 17, 2009 at 11:21 am

    Wes,

    Based on my discussions and my reading of the enacted ARRA, I see nothing that will significantly limit our ability to deliver a truly interoperable system at the local or national levels.

    The bigger question is whether or not the Federal leadership, hopefully in partnership with other governments and the private sector, will actually lead and help deliver an interoperable system across the whole United States and in an expeditious manner. I’m not convinced we are yet on the right path or the right timetable.

    In 2001, a path was laid out that would have produced a truly nationwide, interoperable system by 2010. As you can see, we are failing to deliver on that timetable or on the promise it held for improving health and healthcare.

    Gary

  • 9 John Mattison   February 17, 2009 at 11:37 am

    Sorry to agree with you Wes, but I think your synthesis is spot on.
    I would, however, extend one of your observations: that “the goal will always be to integrate systems that are at different points in their life cycles and have different information models”. I agree that we will always be integrating systems at different points in their life cycles, but I haven’t given up hope on an incremental convergence on similar information models, hopefully based on the HL7 RIM. The current work of Peter Hendler, Gunther Schadow, and Rene Spronk and the RIMBAA group in HL7 holds promise for demonstrating, if not inspiring, such a migration. Clearly it is unrealistic to expect any vendor to change their information model except as an opportunistic response to a need to replatform an application. But when a vendor recognizes the need to replatform, it is often an opportune time to update their information model, and converging on a RIM-like information model could offer many attractive virtues, including lower cost of integration for potential purchasers. If this optimism is justified, there might even be a classical “tipping point”, at which having a RIM-based information model could be a critical criterion in system selection. Only time will tell, but given how painful and expensive it is today to integrate systems with different information models, one can only hope.

  • 10 Andy Wiesenthal   February 17, 2009 at 11:40 am

    Wes – you have characterized the issue accurately, and the organic nature of NHIN development that you describe is likely to be much less expensive and much more likely to succeed over time. It will not succeed instantly, but we will get some effective data exchange fairly soon. That should be allowed to grow, and the rest of the country should be able to take advantage of the successes and failures (what not to do). I would hesitate to point to the NHS as a successful venture, because they have created some important problems for themselves and do not have a successful deployment to point to, or even on the horizon. Canada Health Infoway and NEHTA in Australia look a little more promising at the moment.

  • 11 Jan Root   February 17, 2009 at 12:05 pm

    Wes,

    As always, very good food for thought. Interoperability is definitely an iterative process. Not only does practice make perfect (or at least better) but inevitably we’re always playing catch-up as the industry comes up with new ‘silver bullets’ for how to lower health care costs (DRGs, P4P, etc).

    However, I think it is key to remember that interoperability does not occur in a vacuum. It makes a lot of sense to us techy types, but it obviously does not to many others. For it to be truly successful, it must occur in a context where there is an economic incentive for the industry as a whole not only to adopt standards but to require them. As you say, HIPAA has been a disaster in most of the country. However, in Utah it’s been successful. Why the difference?

    In Utah a significant portion of the health care community has bought into the idea of standards as a way to grow market share and to reduce costs (long story). So, somehow we, as a country (or maybe region by region?), need to figure out how to move the industry from “my way is the best (only?) way” to “interoperability (standards) is the only way.”
    That’s a different discussion!

    Thanks again for the interesting thoughts!

  • 12 Glenn Keet   February 17, 2009 at 12:37 pm

    Wes, I believe your analysis is largely accurate. And I fully agree with the notion that the feedback loop is missing with regards to the standards bodies. Who ever really believed, though, that ARRA would be directly responsible for achieving interoperable health care? In my mind the free market is and has been working towards that, and I think that ARRA will only hasten it by making that free market bigger for a short period.

    In my twenty-year tenure integrating health care record systems, I have seen health care data interoperability progress from Rube Goldberg contraptions, to wholly unreproducible science projects, and later to fairly reproducible exchanges between unrelated commercial products that begin to make health care data “liquid” (to steal the recent Booz Allen term). Yes, it has a ways to go, but the market is already pushing vendors toward this.

    So I suppose I am an optimist (and a Darwinist) when it comes to interoperability. There will be products that are interoperable with others in real and useful ways in my fiscal lifetime, while many other products that are not will disappear from the marketplace.

    Now, the really broken thing in U.S. health care is the current ‘pay for procedure’ reimbursement system, and the challenge is to replace that without necessarily moving to a single payer model. ARRA doesn’t take any steps towards that, and that might be prudent since it is meant to be short term stimulus.

  • 13 Kenneth A. Kleinberg   February 17, 2009 at 1:14 pm

    If you had asked folks a couple of decades ago whether they believed that effectively all the world’s computers and companies would be connected in one giant, more-or-less interoperable network (the Internet), they would have laughed. Could it happen in healthcare? I’ve heard many say “not in my lifetime”. I tend to go along with what I heard another person say recently: “It can’t not happen.” We learned a long time ago with OOP that getting folks to agree on one “perfect” data model is unrealistic. Providing a lower common denominator of what information can be exchanged via the most needed and valuable use cases is the key. With some jumpstart to get more organizations exchanging information, the benefits of receiving will start to exceed the benefits of giving, and adoption will take off – probably faster than most of us could predict.

  • 14 Patrice Kuppe   February 17, 2009 at 1:32 pm

    Wes,
    Here we are, over 10 years since we passed HIPAA “Administrative Simplification,” and we are still struggling with interoperability for those transactions. We have a new law in MN that requires providers to bill ALL PAYERS electronically (what a concept)! My company is advanced, but we are finding we can’t achieve this mandate because we don’t have interoperability – as in connections. The workers’ comp, property and casualty, and auto insurers aren’t all linked up to clearinghouses. The ones I use don’t connect to theirs, or they don’t even have one. We have missed a great opportunity to reduce administrative costs in health care because we do not yet have seamless connectivity. We have the standards – we need a pipe.

  • 15 Rob Bush   February 17, 2009 at 4:40 pm

    Who is steering the ship?

    Someone with the blessing and support of the administration is working on a plan right now – but who knows who that is? As you clearly point out, the best plan will come from accumulating the knowledge of the people in the trenches who are working today to make interoperability happen.

    How can this industry close the loop you describe by getting involved with the planning? I want to volunteer!

    Rob Bush
    Orchard Software

  • 16 Joan Duke   February 26, 2009 at 3:26 pm

    I agree with all of Wes’s analysis. I am also very much aligned with Glenn Keet, who wrote that “the really broken thing in U.S. health care is the current ‘pay for procedure’ reimbursement system”. It worries me that our reimbursement system does not yet provide incentives for the coordination of care that is facilitated by the exchange of clinical data. As long as our health care services are reimbursed on an incident/procedure basis (unrelated to outcome) rather than on an episode basis, there is no substantial financial reward for the expense or the changes in behavior needed to promote the improvements that can be gained by access to information across the many settings of care.

  • 17 Wes Rishel   February 26, 2009 at 5:17 pm

    Thanks, Joan. I think we are all aware that promoting healthcare IT in an environment with perverse incentives is like trying to push a rope. An interesting moot question is whether we would recommend stopping health IT incentives until the fundamental health incentives are in alignment.

    The question is moot in the sense that we don’t get to say. It is not so moot if a hospital or vendor is determining a business strategy. If one really believes that improper incentives will doom EHR incentives and interoperability to failure, one should bet against it. That would mean gambling on CMS becoming the provider of health IT as it takes on a greater percentage of the population or a market that comes to be dominated by one or two very large vendors.

  • 18 Nicholas Bessmer   April 14, 2009 at 1:13 pm

    The recent Harvard Medical School article about EMRs states that the chief barrier to the acceptance of automation is the cost of implementation. My partner and I are working in the private sector on a start-up that delivers a low-cost data quality solution for HL7 interoperability. We have found very few private equity funds that want to have any involvement with regulatory or standards-based health care software systems.

    I am still unclear whether stimulus money is available to start ups working on EMR / EHR solutions. No one seems to have the definitive answer.