by Andrea Di Maio | September 18, 2013 | 7 Comments
Almost every day I read an article, hear from a client, or discuss with a colleague about "digital government". Every single jurisdiction has cranked up, or is in the process of cranking up, a digital plan of some sort, or is in the middle (or, more likely, the beginning) of its implementation.
As I sift through countless documents, statements, white and green papers, I can't help noticing the parallel between digital government and what used to be called e-government. Verbiage like "citizen-centric", bridging the digital divide, enhancing collaboration and joined-up activity in the back office: these are all areas that we saw in well-reputed e-government plans and that we are seeing again in digital plans.
There are different reasons for this.
The most mundane is that the generation of "new kids on the block" now put in charge of "digital" in some jurisdictions were still at college or high school during the eGov days, and they are living this adventure as if it were all new.
Another, more serious reason might be that technology is pervasive today among citizens and businesses, and principles like “digital first” make much more sense than 10 years ago.
A third reason is the recognition that e-government failed, or at least fell short of expectations, and must be rebranded, with new roles that may have a better chance than their predecessors of achieving the desired outcomes.
Whatever the driver, it is important to avoid reinventing the wheel. It is true that today we have cloud, big data and pervasive mobility. However, the fundamental reasons why some of those earlier endeavors failed are still there: in the cross-agency and cross-tier governance challenges, in the lack of maturity in managing an evolving base of service providers, in the lack of key technology skills inside the public sector, and in the weight and complexity of legacy applications and infrastructure.
Replacing "e" with "digital" won't take us very far, unless we start taking a close look at where previous programs failed or stumbled, and understand the fundamental differences that new technologies bring to the table in terms of architectures and ownership of data, services and assets. The irony is that while data is taking center stage (think of open data, big data, social data), the CIO role (where "I" stands for Information) gets challenged, repurposed or replaced by Chief Digital Officers and the like.
If digital government is just a rebranding of e-government, and Chief Digital Officers just a front-office-focused version of the CIO, I suspect we won't get much more from digital government than we did from e-government.
Category: e-government Tags: digital government
by Andrea Di Maio | June 9, 2013 | 9 Comments
When discussing the impact of information technology on economy and society, there are two prevailing viewpoints.
The first one emphasizes the benefits created by the mass availability of information through increasingly affordable devices and increasing communication bandwidth. This has evident impacts on the establishment and strengthening of democracies; it gives people the ability to be better informed about their rights, their health, their jobs. It makes education more affordable for families who can hardly afford expensive textbooks. And so forth.
The second one stresses the drawbacks, looking at the intentional and unintentional loss of privacy through the abuse of social networking tools as well as government eavesdropping, and highlighting that digital divides multiply rather than close.
I took part in a recent conversation on Facebook, sparked by an article (in Italian) by the Italian writer Umberto Eco, who claims that e-books will not totally replace physical books when it comes to novels or poetry. Irrespective of whether he is right or wrong, it occurred to me that the replacement of physical books with e-books will eliminate bookshelves from our homes and offices. We have seen this with music already: disc collections are being replaced by music stored on a file server, so that people still have their earlier CDs or vinyl on their shelves, but there is little trace of what they have been listening to more recently.
Undoubtedly, looking at somebody's library tells you something about him or her. Sure, some people treat books as pieces of furniture, and there is no guarantee that displaying Joyce's Ulysses or Dante's Divine Comedy means they have ever opened them. Yet, in the vast majority of cases, the warmth of books that you can glance through to get a feel for a person's taste can't be matched or compensated for even by the coolest technology toy.
And it goes further. Borrowing a used book from a relative or a friend, with their underlined or highlighted sentences and handwritten footnotes, makes that object something alive, with its own story to tell beyond the author's. Actually, the very experience of "borrowing" goes away, as digital rights management will prevent even temporary use by a different user, unless one hands over the e-reader itself (which is clearly not possible, as it gives access to one's entire library).
Also, the amazing experience of visiting a bookstore, where your senses are captured by the sight, the touch, the sound, the smell of thousands of books and their pages, will gradually vanish, as the disappearance of some major bookstore chains attests. The same is happening, and much faster, with music stores.
And what about watching people read books on a train, a bus, a plane? What do those books tell us about them, and how many times have we decided to read a book because somebody else was reading it?
So, what will the future look like? Will reading lose its social dimension, or will technology help recover some of it? Maybe the cover of the book we are reading will be shown on an OLED screen on the cover of our e-reader. Maybe our virtual bookshelves will appear on screens that cover our walls, much like the screens that replace windows and show us whatever landscape we fancy (watch the excellent movie Cloud Atlas for an example of this). Or we will see our guests' virtual libraries projected on our glasses.
In the Facebook discussion above, many people compared the defense of physical books to the defense of horse-drawn carriages or wood-fired heaters, which disappeared almost a century ago and which none of us misses.
The difference, though, is that those innovations demonstrably improved our productivity and comfort: we could move faster and get warmer. E-books touch upon the emotional sphere. They do not upload the content of the book into our brain in a matter of minutes; we are still supposed to hold an object in our hands. There is little we gain by moving from a three-dimensional to a two-dimensional experience, and from engaging at least four senses to engaging only one.
Category: Uncategorized Tags: e-books
by Andrea Di Maio | May 31, 2013 | 4 Comments
The Australian government just published the National Cloud Computing Strategy, which goes beyond the usual domain of how government is going to use the cloud (something that several other jurisdictions, including Australia, have done over the last couple of years).
This strategy comes as a response to what the Australian Prime Minister promised back in October 2012, and builds on the roll-out of the National Broadband Network, which will provide universal access to very fast Internet services across the nation. The national cloud strategy suggests a path to leverage this great resource.
The strategy looks at three distinct and important areas:
- the use of cloud computing in government, building on a previous paper on strategic directions
- the use of cloud computing by small businesses, not-for-profit and consumers
- the support of the cloud services sector
As far as the first pillar is concerned, the strategy maintains the down-to-earth, no-nonsense approach that I have learned to like in Australia. Rather than pushing for an aggressive "cloud first" approach, as the US did a few years ago and the UK stated more recently, the strategy says:
Government agencies will be required to consider cloud services (including public cloud services) for new ICT procurements. Government agencies will choose cloud services, where the service represents the best value for money and adequate management of risk, compared to other available options
For those who are familiar with the open-source "religious" battles of the last decade, this is far closer to what most of those ended up producing: strategies that would recommend that agencies consider the option, without forcing them to adopt it or to justify why it was not adopted. I actually drew a parallel between the two in an old blog post, which I believe is still very current. However, the strategy is more decisive when it comes to public-facing web sites as well as test and development environments, which are expected to be moved to the public cloud as soon as practical.
There is an emphasis on educating IT leaders in agencies about understanding benefits and risks as well as about how to procure and manage cloud services. Trials and experience sharing are encouraged, and lessons learned from the earlier Data Center as a Service (DCaaS) experience will be used to evolve this mechanism.
What is slightly less convincing is the reference to the feasibility of a community government cloud: exploring the feasibility is fine, but it should be rather clear from trends in other countries that the most crucial aspect is to buy cloud services rather than build them (or have them built). In this respect, the other potential weakness is the failure to push earlier for other Multi Use Lists beyond DCaaS, or for other lighter-weight service and vendor catalogs similar to what the UK is doing with its G-Cloud CloudStore (which probably gives the UK a strong enough case for a more aggressive approach).
Another point that will require additional detail as AGIMO reviews the strategic directions paper by mid-2013 is cloud service certification. The strategy document says:
AGIMO is developing a certification framework. This framework will provide agencies
with a user friendly way of determining whether the services offered by a cloud vendor meet the
legal and operational requirements of government. The certification framework being considered by AGIMO will be a light touch framework that builds on, rather than duplicates, the existing framework of relevant technical standards. The framework will differentiate between different kinds of cloud service, and allow agencies to assess whether different platform, software or infrastructure cloud service offerings meet their needs.
However, there is no clear deadline for when this framework will be issued, nor any indication of whether it will meet all certification requirements or whether agencies will still have to perform their own certification.
As for the second pillar, concerning use outside government, the proposed actions are quite sensible. They include engaging influencing bodies to help reach out to smaller enterprises, promoting a Cloud Consumer Protocol that would give sufficient confidence about consumers' rights, and helping smaller businesses access technology expertise. Of course, the uptake of cloud computing will also be influenced by the attitude taken by larger enterprises and how that will reverberate across their supply chains. It is wise for the strategy not to be prescriptive or condescending when it comes to larger enterprises, but I expect that working with industry associations and other intermediaries will have to explore the role of value chains.
Finally, as far as supporting the cloud supply side is concerned, the strategy pulls together all the necessary levers (education, research and trade) to provide the necessary skills, to address outstanding technical issues and seize opportunities, and to promote Australia as a "trusted hub for data storage and processing".
The broad scope of the strategy requires the involvement and collaboration of several ministries, from Finance to Trade, from Broadband and Digital Economy to Industry and Innovation, from Education to Research.
Some of the objectives and deadlines may be vulnerable to the upcoming elections in September, but the overall structure is well-founded and it is difficult to find any major weakness.
Some commentators may wish to see the government take a more aggressive stance but, given the context and the early stage of "cloud first" strategies elsewhere, I believe the Australian government has taken a smart direction.
Category: Uncategorized Tags: Australia
by Andrea Di Maio | May 31, 2013 | 5 Comments
Yesterday I attended a meeting where different European jurisdictions described their experiences and plans for government interoperability frameworks, which address the thorny problem of supporting meaningful and timely data exchange, as well as the coordination and synchronization of business processes across different agencies within the same or different tiers. There were several mentions of enterprise service buses, service-oriented architecture, open standards, and the like.
What struck me were the references to shared services and reuse that all speakers made. These came in two flavors: reusable components to support interoperability and shared services that would actually replace the need for interoperability.
The former category includes data transformation functionality as well as identity and access management services. The latter would include application services such as financials, accounting, payroll and base registries, which could be shared to reduce the size and complexity of the interoperability problem. It was interesting to hear that two jurisdictions that have been quite successful with their interoperability efforts also enjoy centralized base registries (holding people's names and birth dates, lists of registered companies, and land and other real-estate records).
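A back-of-the-envelope sketch of why shared registries shrink the interoperability problem (the numbers are my own illustration, not figures from the meeting): with N agencies exchanging data point-to-point, the number of pairwise interfaces grows quadratically, whereas a shared central registry needs only one connection per agency.

```python
# Illustrative arithmetic: point-to-point interoperability vs. a shared registry.

def point_to_point_interfaces(n: int) -> int:
    """Pairwise interfaces needed when every agency integrates with every other."""
    return n * (n - 1) // 2

def shared_registry_connections(n: int) -> int:
    """Connections needed when every agency talks only to a shared registry."""
    return n

for n in (5, 20, 50):
    print(f"{n} agencies: {point_to_point_interfaces(n)} pairwise interfaces "
          f"vs. {shared_registry_connections(n)} registry connections")
```

With 50 agencies, that is 1,225 potential pairwise interfaces versus 50 registry connections, which is why centralized base registries so visibly ease the interoperability burden.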
It appears that interoperability matters most where there is limited ability to share and centralize. Establishing an interoperability framework is a way to work around the otherwise insurmountable obstacles to rearranging data and applications in ways that make services truly citizen-centric.
On the other hand, implementing a framework and enforcing its use is not an easy undertaking either. It requires strong governance and the ability to drive agency behaviors in the same direction. But, then, how much more difficult would it be to take the extra step and go for sharing or even centralization? Is interoperability just a temporary fix, pending a more sustainable solution based on rearranging competencies and responsibilities in order to manage information in a far more citizen-centric way?
Category: Europe and IT Tags: interoperability
by Andrea Di Maio | May 28, 2013 | 1 Comment
I spent my first week ever in Beijing, where my local colleague gave me ample opportunities to meet clients and prospects – both government organizations and IT vendors – as well as to discover some of the beauties of this formidable country, like the well-known Forbidden City or the less-known but equally impressive contemporary architectural complex built close to the Great Wall.
When my colleagues prepped me for this trip, and during a first lovely dinner on Sunday night, they highlighted how different China really is and that, while clients are eager to hear about good practices and issues elsewhere, one has to appreciate the major differences and put those practices into perspective.
However, as I went through the list of top issues that government CIOs are struggling with in China, and after my first conversations with them, it was clear that I could have had those same conversations in a Western country. IT governance across different agencies, portfolio management, vendor management and, more generally, the choice of effective sourcing strategies are issues that I discuss every day with clients around the world.
One might think that with five-year plans and relatively little churn, the Chinese government should enjoy sufficient stability to make things happen.
For instance, their current strategic plan puts a great emphasis on smart cities, also taking into account the extraordinary urban development caused by people moving from the countryside looking for better-paid jobs in a still growing (albeit slower) economy. However, when one looks at the majority of smart city projects in China, they are as siloed as those in the West. Smart transportation, smart public safety and smart buildings each run as separate projects, with separate drivers and responsibilities and very little integration, if any.
Vendors are very similar to those in the West. Some are really good, with quite impressive talent, significant IP and a good level of innovation. But they package all their offerings as "smart something" in response to the government's strategy, so I find the same confusion I find elsewhere between offerings that look more like rebrandings of existing products and truly innovative, cross-domain smart solutions.
Some Chinese government CIOs believe that Western countries, and the US in particular, have accomplished a lot in terms of IT management and governance. I told them that there is a huge difference between having roles, mandates, directives and architectural frameworks, and being really able to benefit from them rather than simply complying in order to stay out of budget trouble. They seemed surprised to realize that they are not alone in struggling with the fundamental governance issues that make shared-service implementations, as well as other types of cross-agency endeavors, particularly difficult.
This is one of the aspects that makes being an analyst a wonderful job. While official papers and press interviews show one side of the story, talking to clients on a daily basis often reveals a different and more nuanced picture.
The world is really smaller and issues are much closer to each other than we think.
Category: e-government smart government Tags: China, smart city
by Andrea Di Maio | May 22, 2013 | 2 Comments
Last week I spoke to a local government CIO about innovation. We discussed the impact of digital, open data, social media and more. When I made my usual point about the imbalance in the use of open data, and of digital innovation more generally, toward external impact, and the apparent disregard for how these can be leveraged internally, he was in violent agreement with me. This was both surprising and rewarding, as his jurisdiction has been quite active on the open government front.
The point he was making is that, as the chief information officer, he has to be on top of all the data that impacts service delivery or operations, be that data internal or external. Therefore I assume he does not see the need to separate new roles, such as chief digital or chief data officer, from the CIO role, and certainly not to give them distinct reporting lines. This challenges the belief that, in order to boost transformation and innovation, these roles must be independent. However, the CIO needs to have a pretty clear view of:
- all the stakeholders involved in the innovation process
- how the boundaries between different stakeholder categories evolve
- what the direction of engagement is
The most obvious stakeholder categories are: government (different agencies), other tiers of government, constituents (citizens, businesses, visitors), industry and consumer associations, NGOs, citizen social networks.
Open government has shown that the boundaries between different categories of stakeholders are blurring: shared-service initiatives on the one hand, and citizen engagement in service delivery on the other, are examples of how boundaries are changing. In particular, the evolving role of employees, who are empowered by consumer technology used in the workplace, is moving the boundaries between government agencies and citizens.
Finally, government can be a driver of engagement, by determining the place and pace of collaboration and engagement, or a follower, by joining existing communities. As the latter mostly involves individual employees and the former refers to the government organization as a whole, this creates interesting dynamics around workforce management.
CIOs who understand all different dimensions and nuances will thrive. Those who don’t will struggle and gradually lose their relevance.
Category: smart government Tags: CIO, digital government
by Andrea Di Maio | May 10, 2013 | 5 Comments
On May 9, after a longer-than-expected preparation, the Open Data Policy announced as part of the US Digital Government Strategy has been issued together with an executive order signed by President Obama about Making Open and Machine Readable the New Default for Government Information.
As one reads the order, browses through the first few pages of the policy or watches the short video that CIO Steve VanRoekel and CTO Todd Park released to explain the policy, the first impression is that this is just a reinforcement of prior open government policies. The order is quite explicit in saying that (emphasis is mine):
The default state of new and modernized Government information resources shall be open and machine readable. Government information shall be managed as an asset throughout its life cycle to promote interoperability and openness, and, wherever possible and legally permissible, to ensure that data are released to the public in ways that make the data easy to find, accessible, and usable. In making this the new default state, executive departments and agencies (agencies) shall ensure that they safeguard individual privacy, confidentiality, and national security
Looking at the definition of open data in the policy itself, the first attribute of open data is being public, followed by accessible, fully described, reusable, complete, timely and managed post-release. One might therefore think that this policy is mostly about encouraging agencies to pursue what was started four years ago with the Open Government Directive and to build on the success of the many initiatives that Todd Park has relentlessly pushed since he became US CTO.
Even if this were the only focus of this policy, it would be a great accomplishment. The policy provides clarity on issues like the so-called "mosaic effect" (i.e. the risk that combining individual datasets may lead to identifying individuals), the need to prioritize data releases by engaging customers, the need to enforce privacy and confidentiality, and more. The policy also announces the establishment of a new resource called Project Open Data, an online repository of tools, best practices and schemas to help agencies.
But there is more, and this is where the policy gets really interesting. As the Scope section says,
The requirements in part III, sections 1 and 2 of this Memorandum apply to all new information collection, creation, and system development efforts as well as major modernization projects that update or re-design existing information systems
Section 1 is about collecting or creating information in a way that supports downstream processing and dissemination activities, while section 2 is about building information systems to support interoperability and information accessibility. In the former, the policy asks agencies to "use machine readable and open formats for information as it is collected or created". The latter states that "the system design must be scalable, flexible, and facilitate extraction of data in multiple formats and for a range of uses as internal and external needs change, including potential uses not accounted for in the original design". Still in section 1, one can read: "Agencies must apply open licenses, in consultation with the best practices found in Project Open Data, to information as it is collected or created so that if data are made public there are no restrictions on copying, publishing, distributing, transmitting, adapting, or otherwise using the information for non-commercial or for commercial purposes".
The scope section also says that
The requirements in part III, section 3 apply to management of all datasets used in an agency’s information systems
Section 3 is about strengthening data management and release practices: "agency data assets are managed and maintained throughout their life cycle", and "agencies must adopt effective data asset portfolio management approaches". Agencies must develop an enterprise data inventory that accounts for the datasets used in the agency's information systems: "The inventory will indicate, as appropriate, if the agency has determined that the individual datasets may be made publicly available".
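As a minimal sketch of what one record in such an enterprise data inventory might look like, here is an illustrative example. The field names are loosely in the spirit of the Project Open Data metadata schema but are my own simplification, not the normative schema; the dataset names are invented.

```python
# Illustrative sketch of an enterprise data inventory: every dataset is listed,
# and each record indicates whether the agency has determined it may be public.

def make_inventory_entry(title, description, fmt, public):
    """Build a minimal inventory record for a single dataset."""
    return {
        "title": title,
        "description": description,
        "format": fmt,  # a machine-readable format, e.g. "CSV" or "JSON"
        "accessLevel": "public" if public else "non-public",
    }

def publishable(inventory):
    """Datasets the agency has determined may be made publicly available."""
    return [e for e in inventory if e["accessLevel"] == "public"]

inventory = [
    make_inventory_entry("Grant awards 2012", "Awards by program and state", "CSV", True),
    make_inventory_entry("Internal case notes", "Operational records", "JSON", False),
]

for entry in publishable(inventory):
    print(entry["title"])
```

The point the sketch tries to capture is that the inventory covers all data, public and non-public alike, with publication as a per-dataset determination rather than a separate catalog.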
Now, let's forget the first attribute of open data for a moment and look at how this applies to any data, even non-public data. Most of what is said above still holds: the enterprise data inventory is for all data, machine-readable and open formats apply to all data, and interoperability and information accessibility apply to all data. Some data (maybe most, for some agencies) will be public, but other data will not, and yet the same fundamental principles that treat data as the most fundamental asset still apply.
A while ago I wrote about the concept of basic data that the Danish government had come up with, and more recently I have written a research note about the importance of data-centricity in government transformation (subscription required). This policy seems to go in the same direction.
While its packaging and external focus are mostly about open public data, and in this respect it further develops policies we saw a few years ago, its most disruptive implication is that the concept of "open by default" applies to any data.
It would have been beneficial to make a clear distinction between "open data" and "open public data", but I understand that the constituencies pushing for transparency and openness would not welcome the distinction, fearing that it would give the government the ability to decide at leisure what to share and what to hide.
Nonetheless, the policy can be read and used as a means to initiate a tidal shift in how data is used across government. Section 5 of the policy is about incorporating the new interoperability and openness requirements into core agency processes: Information Resource Management (IRM) strategic plans must align with agency strategic plans and "provide a description of how IRM activities help accomplish agency missions".
Finally, the implementation section puts the CIO at the very center of this change, without calling, at least explicitly, for any new role (such as Chief Data Officer), and stresses that cost savings are expected and that potential upfront investments "should be considered in the context of their future benefits and be funded through the agency's capital planning and budget processes". Which is to say that openness is not a nice-to-have for which additional financial support should be expected, but is at the core of how agencies should operate to be more effective and efficient.
As I am a cynical analyst, I can't just be complimentary of an otherwise brilliant policy without flagging one minor point where it might have been more explicit. In section 3.d.i the policy assigns responsibility for "communicating the strategic value of open (public) data to internal stakeholders and the public". This is great, as selling open public data internally is absolutely fundamental to gaining support and making openness a sustainable practice. However, I would have loved an explicit mention of the need for agencies to use and leverage each other's open public data, rather than the suggestion that the only target is "entrepreneurs and innovators in the private and nonprofit sector".
Let's be clear: there is nothing in the policy that would either prevent or discourage internal use of open public data. But as the policy gets implemented, the balance and collaboration between the CTO Todd Park, who will most likely continue pursuing the external impact of open public data, and the CIO Steve VanRoekel, who chairs the CIO Council and will be mostly concerned with the internal use of information, will be crucial to making sure that openness by default becomes the new mantra.
Category: open government data Tags: open data, US CIO, US CTO
by Andrea Di Maio | May 9, 2013 | Comments Off
Yesterday I attended a Gartner event held in Milan, where the Italian analyst team addressed over 300 executive-level clients on how the so-called nexus of forces, i.e. the confluence of social, cloud, mobile and information (big data) is going to create new business opportunities, disrupt existing business models and challenge the IT department.
In one session, six executives from IT vendor organizations were put on stage and asked a number of questions. They would answer individually on a scale from "strongly agree" to "strongly disagree", and then the audience would express their own opinion on the same question.
For most of the questions, differences among vendors and between the vendors and the audience were rather predictable and made for a good dynamic.
Then came the question of whether people think that the price drop in increasingly commoditized IT services will fail to be compensated by increased volumes. Basically, the question probed what people thought about the combined impact of consumer technology and the public cloud creeping into the enterprise.
All the vendors responded that they strongly disagreed. I think that was the only question on which they all gave the same answer. Then again, none of the vendors is in the consumer space, so what else should one expect? Of course they believe they will be able both to sell enough volume and to sell loads of value-added services.
What struck me was that the majority of the audience also agreed. Even discounting the few vendors in the mix, this means that most IT executives in enterprises believe that nothing will really change: that they will keep buying more lower-cost services and a whole lot of new value-adding services, pretty much from the same vendors.
As I look at the struggle to impose enterprise collaboration tools over consumer ones, at the acceleration of BYOD, at the increasing use of the public cloud, and at the reality that CIOs are playing catch-up with some of these trends, I suspect people are burying their heads in the sand.
Although different industries will go through this transformation at different speeds, one thing is for sure: in many cases the CIO will be caught by surprise or caught in the middle. And neither is a good place to be.
Category: IT management Tags: Italy, nexus of forces
by Andrea Di Maio | April 22, 2013 | 3 Comments
When people talk about smart cities, there seems to be an implicit assumption that those who get the concept and are able to lead the change must be demographically young. The same applies to other recent technology-driven phenomena, such as open government or government 2.0. At datapaloozas, hackathons, hangouts and unconferences, people over 40 can barely be seen, and older people are allowed only if they are recognized as innovators or are part of the audience to be inducted into the "new way" of doing business.
Last Friday I attended the first public hearing for Milan as a smart city. There were several speakers, from cities, associations and enterprises, discussing various aspects of a smart city program: what technologies are required, the importance of alliances, the many examples around the world, the networking mechanisms among leading cities and so forth.
Just as I was about to be overwhelmed by a sense of déjà vu, Piero Bassetti, an 85-year-old veteran entrepreneur and politician, took the floor and, in a few minutes and with incredible acumen, nailed the reason why smart cities often turn into little more than a talking shop, a pilot or a showcase for technology. He basically said that people focus on what a smart city is and how to make it smart, but not on why smartness is needed. Every city has its own vocation, problems and peculiarities: comparing across cities is pointless, and the key is to understand whether and how smartness can help the city strengthen its vocation or solve one of its top-priority problems.
His message resounded in the room after a stream of previous speeches, especially by representatives of city associations, that had delivered little more than rankings, comparisons and endless lists of "smart" projects.
Ironically, Italy has just re-elected its 87-year-old president after a short competition in parliament with two other candidates, both over or close to 80. Many commentators flag this as a sign of a country that is unable to change and innovate. And yet the fact that the most senior speaker in a traditionally younger and geekier environment was also the most disruptive and inspirational may prove them wrong.
Category: smart government Tags: Italy, smart city
by Andrea Di Maio | April 4, 2013 | 2 Comments
Earlier today I was briefed by a major vendor about their public sector offering. This vendor provides, among other things, content management solutions. In passing, I asked whether they plan to add support for open data creation, management and mash-up, but they answered that they are seeing less demand for this than just a year ago, mostly due to budget constraints. They said that, as CIOs need to focus on how to deliver services more efficiently, openness and transparency are more of a "nice to have" than something indispensable to running the business.
At the same time, more datapaloozas and similar events are taking place around the world, with Chief Technology Officers, Chief Data Officers and Chief Digital Officers mostly supportive and convinced of the key role of open data.
The reality is that the community of interest around open data is not changing. It is certainly growing and as vocal as ever, but for the most part it is still self-referential and either unable or unwilling to make a clearer connection between open data and the resource challenges that many government CIOs are facing.
The good news is that there is still time for this to happen. The bad news is that roles like the Chief Data Officer or the Chief Digital Officer, at least as they seem to be interpreted in government, are biased toward outward-facing outcomes (such as citizen engagement and contribution to an open data ecosystem), and risk missing the opportunity to internalize the value of open government.
Category: open government data Tags: open data