Over the last month or so there have been a couple of worthwhile events about the US Federal Government’s cloud computing activities.
The first was the publication of an interesting report, authored by the federal CIO Vivek Kundra, on the State of Public Sector Cloud Computing. This is a compendium of what the federal government is doing, as well as a list of actual and announced projects at the federal, state and local levels. The purpose of this report is to boost confidence in the progress of the cloud computing agenda and to show reluctant agencies how several government organizations are making, or considering, the move.
The second was the launch of a request for proposals by the General Services Administration (GSA) for cloud-based email and collaboration services. This is probably the highest-profile move to email as a service after the one undertaken by the City of LA. According to Vivek’s report, the Department of the Interior has also announced an agency-wide e-mail-as-a-service program, but ironically its CIO, Sonny Bhagowalia, has been hired by the GSA Office of Citizen Services to deal with – guess what? – the cloud portfolio.
Taken at face value, these events attest to the unstoppable force of cloud computing in the US government. Indeed, my colleagues who cover cloud computing and I receive an increasing number of inquiries and requests to help frame RFPs for cloud-based services. So reality seems to be matching perception.
However, it is interesting to note that Vivek’s report contains quite a few “announced projects” where no contract has been awarded yet, nor is there any evidence that savings have accrued or value has been created. One of the oldest GSA projects, the move of USA.gov to the cloud, is still ongoing, and I guess it will take some time before one can assess total cost of ownership and savings (this was a subject of debate after an erroneous statement about expected savings a few months ago).
Vivek’s report also mentions interesting developments spearheaded by NIST, such as SAJACC (Standards Acceleration to Jumpstart Adoption of Cloud Computing) and FedRAMP (Federal Risk and Authorization Management Program), both of which are deemed essential to overcoming some of the residual skepticism about using cloud computing in government. Unfortunately, all it says about either is the general goal and what NIST’s role is; it says nothing about progress, nor about the extent to which either is going to make life easier.
While there is increasing awareness of cloud computing, and the offerings are rapidly maturing, all this buzz keeps reminding me of the excitement of European, South American and Asian governments about open source (see here for an article from 2001). That was complemented and reinforced by the well-justified interest in open formats (see Belgium’s and Denmark’s rulings in favor of ODF, and the never-ending debate about whether Microsoft-backed OOXML is good enough).
While the use of open formats and open standards is becoming a requirement in most system and software public procurements in Europe and elsewhere, some governments also actively push for open source, or mandate that open source alternatives be considered in any procurement. However, with few exceptions, the advice is always to use open source where it demonstrably delivers value for money (see the UK’s latest open source policy).
What Vivek’s report says about federal budget planning reminds me of those positive-discrimination policies forcing agencies to consider the use of open source and, in effect, prove why it cannot be used. In fact the text says:
By September 2011 – all newly planned or performing major IT investments acquisitions must complete an alternatives analysis that includes a cloud computing based alternative as part of their budget submissions.
By September 2012 – all IT investments making enhancements to an existing investment must complete an alternatives analysis that includes a cloud computing based alternative as part of their budget submissions.
By September 2013 – all IT investments in steady-state must complete an alternatives analysis that includes a cloud computing based alternative as part of their budget submissions.
I almost feel I could replace “cloud computing” with “open source” and things would still look the same.
I have raised a few times (see here) the intriguing relationship between open source and cloud computing.
But I find it quite ironic that governments use the same mechanisms (hype and pilots) for both, as they actually serve different purposes and benefit different audiences. Indeed, one could say that both help save costs (although this must be demonstrated on a case-by-case basis). But while open source (together with open standards) aims at reducing and preventing vendor lock-in, cloud computing is all about being locked in with vendors.
Indeed, as Vivek says, there are standardization activities going on, but they are still far from being well articulated, let alone producing any actual results. If there is anything to learn from the ODF vs. OOXML battle, it is that standardization processes can be difficult, and adoption of standards by vendors is not a straightforward task.
I am not sure that those who are pushing for massive cloud computing adoption today are putting enough thought into the total cost of ownership of cloud computing solutions, including their exit cost.
Exactly as with open source, the right approach lies in the middle. Cloud computing is one possible sourcing option to consider, alongside traditional outsourcing, remote hosting, infrastructure utility, on-premises software and so forth. It is not an end, but a means to an end (cost-effective IT).
It would be great if the cloud “knights” in the US federal administration studied the recent history of open source and open standards in European governments. They might conclude that they need more caution in promoting vendor solutions, a heavier hand on standardization and interoperability, and a less religious approach to the cloud as THE solution to all IT evils.