Those who have been reading some of my earlier posts about the Open Government initiative in the US Federal Government (see here and here for a start) know that I have been critical of the way open government has been implemented. The Open Government Directive made it a compliance-oriented rather than a value-oriented exercise and put too much emphasis on publishing data sets for the sake of transparency, or on creating opportunities for external engagement, without asking fundamental questions about the purpose this would serve and the target audience.
At the end of January, a report by the Congressional Research Service, “The Obama Administration’s Open Government Initiative: Issues for Congress,” restated the very objections I had raised in previous posts, some of which did not win me many friends.
While recognizing the potentially positive impact of open government, the report stresses some of its downsides:
Conversely, Congress may find that increased transparency and public attention make the federal government more susceptible to information leaks of sensitive materials. Additionally, increased collaboration and participation may make the sometimes slow process of democratic deliberation even slower. Congress may also choose to evaluate the monetary costs associated with implementation of the open government policies
When it comes to data sets:
Although the datasets released to the public may be useful in many ways, it is unclear how some of them will increase the transparency of the operations and actions of the federal government. […] this type of transparency does not give Congress or the public much insight into how the federal government itself operates or executes policies. […] Crowdsourcing may improve data and operations, but only for agencies that make useful data accessible.
Releasing these datasets to the public also assumes that the public will have the knowledge, capacity, and resources to evaluate the data, offer valid insights, and reach replicable results and verifiable conclusions.
On the latter point, I said that open government serves a captive audience of the usual suspects (lobbyists, corporations, NGOs, political parties) who have a vested interest in the data.
Irresponsible manipulation of the datasets may allow certain groups or individuals to present unclear or skewed interpretations of government datasets, or come to questionable conclusions. Moreover, users of this data will have to know exactly what datasets they seek, especially in agencies that release hundreds of thousands of datasets. Counter intuitively, agencies that release vast amounts of datasets may become even less transparent because the public will be unable to decipher which data are important to their needs. […] Making the data public, in this way, does not necessarily make the data more accessible or usable. Without the ability of the public to access and use the datasets that are released, the government may not be more transparent
I have been making many of these points since the very beginning of the open government initiative, and I have been warning other jurisdictions not to enter a competition over how many data sets they publish or how many Facebook pages and idea contests they run compared to others.
What very few have done, and what should have been hardwired into the directive, is to link openness to value and mission objectives. There is still time, although the clock is ticking as Congress, which is no longer as favorable as it was at the beginning of the Obama administration, starts looking more closely into this matter.
I hope that some of the people who judged my comment about the departure of Deputy CTO for Open Government Beth Noveck as disrespectful and ungrateful will now see, with greater clarity and less emotion, why a change of course is sorely needed.