Today I had an interesting conversation with a government client who – over ten years ago – architected a domain-specific integration broker supporting mission-critical applications running across jurisdictional boundaries. After being detached to another agency for several years, he returned to his old agency at the beginning of this year and discovered that – apart from minor updates – the system had not evolved significantly: he could still recognize most of the architecture and building blocks he had helped put together at the time.
The “problem” (if it is a problem at all) is that the broker has been running almost without a glitch for a long time, and has become almost invisible to those who use it. From the user perspective, this is clearly a system component that has become a commodity, even though it was a custom development and is still maintained by the agency.
As part of their increasing interest in cloud computing, the client said he considers this a good example of something that could be consumed as a service in the cloud. However, isn’t that what has happened already? The broker became a commodity and – as a result – it did not evolve, and now there are doubts about its ability to meet new requirements and to take advantage of new technologies.
While services available through a public cloud provided by a vendor are likely to see constant innovation as a consequence of competitive pressures, those that governments embed in their own private clouds may not have enough incentives to be innovated over time. This argument has been made before about government shared services, but it becomes even more pertinent for cloud services.
But, after all, is that a bad thing? On the path to increasing commoditization, the best future of a private government cloud may not be a better private government cloud: it may actually be a public cloud.