by Andrea Di Maio | May 10, 2013 | 5 Comments
On May 9, after a longer-than-expected preparation, the Open Data Policy announced as part of the US Digital Government Strategy was issued, together with an executive order signed by President Obama on Making Open and Machine Readable the New Default for Government Information.
As one reads the order, browses through the first few pages of the policy or watches the short video that CIO Steven VanRoekel and CTO Todd Park released to explain the policy, the first impression is that this is just a reinforcement of prior open government policies. The order is quite explicit in saying that (emphasis is mine):
The default state of new and modernized Government information resources shall be open and machine readable. Government information shall be managed as an asset throughout its life cycle to promote interoperability and openness, and, wherever possible and legally permissible, to ensure that data are released to the public in ways that make the data easy to find, accessible, and usable. In making this the new default state, executive departments and agencies (agencies) shall ensure that they safeguard individual privacy, confidentiality, and national security.
Looking at the definition of open data in the policy itself, the first attribute of open data is being public, followed by accessible, fully described, reusable, complete, timely and managed post-release. Therefore one might think that this policy is mostly about encouraging agencies to pursue what was started four years ago with the Open Government Directive and to build on the success of the many initiatives that Todd Park has relentlessly pushed since he became US CTO.
Even if this were the only focus of this policy, it would be a great accomplishment. The policy provides clarity on issues like the so-called “mosaic effect” (i.e. the risk that combining individual datasets may lead to identifying individuals), the need to prioritize data releases by engaging customers, the need to enforce privacy and confidentiality, and more. The policy also announces the establishment of a new resource called Project Open Data, which will be an online repository of tools, best practices and schema to help agencies.
But there is more, and this is where the policy gets really interesting. As the Scope section says,
The requirements in part III, sections 1 and 2 of this Memorandum apply to all new information collection, creation, and system development efforts as well as major modernization projects that update or re-design existing information systems.
Section 1 is about collecting or creating information in a way that supports downstream processing and dissemination activities, while section 2 is about building information systems to support interoperability and information accessibility. In the former, the policy asks agencies to “use machine readable and open formats for information as it is collected or created”. The latter requires that “the system design must be scalable, flexible, and facilitate extraction of data in multiple formats and for a range of uses as internal and external needs change, including potential uses not accounted for in the original design”. Still in section 1, one can read: “Agencies must apply open licenses, in consultation with the best practices found in Project Open Data, to information as it is collected or created so that if data are made public there are no restrictions on copying, publishing, distributing, transmitting, adapting, or otherwise using the information for non-commercial or for commercial purposes”.
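As a concrete illustration (mine, not the policy's), “machine readable and open formats” simply means serializing records at collection time into openly documented, parseable structures such as CSV or JSON, rather than locking them into presentation-oriented files. A minimal sketch, with hypothetical field names:

```python
import csv
import io
import json

# Hypothetical records collected by an agency; the field names are
# illustrative, not taken from the policy.
records = [
    {"permit_id": "P-001", "issued": "2013-04-02", "status": "approved"},
    {"permit_id": "P-002", "issued": "2013-04-09", "status": "pending"},
]

# JSON: an open, machine-readable, self-describing format.
as_json = json.dumps(records, indent=2)

# CSV: an open, machine-readable, tabular format.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["permit_id", "issued", "status"])
writer.writeheader()
writer.writerows(records)
as_csv = buf.getvalue()

print(as_csv)
```

Either output can be re-parsed by any downstream consumer without special tooling, which is exactly what a scanned PDF of the same records could not offer.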
The scope section also says that
The requirements in part III, section 3 apply to management of all datasets used in an agency’s information systems
Section 3 is about strengthening data management and release practices. It says that “agency data assets are managed and maintained throughout their life cycle” and that “agencies must adopt effective data asset portfolio management approaches”. Agencies must develop an enterprise data inventory that accounts for the datasets used in the agency’s information systems: “The inventory will indicate, as appropriate, if the agency has determined that the individual datasets may be made publicly available”.
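To make the idea tangible, here is a minimal sketch of what such an inventory could look like. The field names are my own illustration, not a schema prescribed by the policy (Project Open Data was announced as the place where agencies would find actual schema):

```python
# A hypothetical enterprise data inventory: every dataset the agency uses
# gets an entry, and "access_level" records whether the agency has
# determined that the dataset may be made publicly available.
inventory = [
    {"title": "Building permits", "format": "CSV", "access_level": "public"},
    {"title": "Inspection schedules", "format": "JSON", "access_level": "public"},
    {"title": "Casework files", "format": "JSON", "access_level": "non-public"},
]

def publicly_releasable(entries):
    """Return the subset of the inventory flagged as publicly releasable."""
    return [e for e in entries if e["access_level"] == "public"]

# The same inventory covers all data assets; release status is just one
# attribute, not a precondition for being managed.
for entry in publicly_releasable(inventory):
    print(entry["title"])
```

The design point is that public and non-public datasets live in one inventory under one management discipline, and publication is a per-dataset determination rather than a separate regime.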
Now, let’s forget the first attribute of open data for a moment and look at how this applies to any data, even non-public data. Most of what is said above still holds. The enterprise data inventory is for all data, machine-readable and open formats apply to all data, interoperability and information accessibility apply to all data. Some data, maybe most for some agencies, will be public, but other data will not, and yet the same fundamental principles that treat data as the most fundamental asset still apply.
A while ago I wrote about the concept of basic data that the Danish government had come up with, and more recently I have written a research note about the importance of data-centricity in government transformation (subscription required). This policy seems to go in the same direction.
While its packaging and external focus are mostly about open public data, and in this respect it further develops policies we have seen in recent years, its most disruptive implication is that the concept of “open by default” applies to any data.
It would have been beneficial to make a clear distinction between “open data” and “open public data”, but I understand that the constituencies pushing for transparency and openness would not welcome the distinction, fearing that it would give the government the ability to decide at leisure what to share and what to hide.
Nonetheless, the policy can be read and used as a means to initiate a tidal shift in how data is used across government. Section 5 of the policy is about incorporating new interoperability and openness requirements into core agency processes: Information Resource Management (IRM) strategic plans must align with agency strategic plans and “provide a description of how IRM activities help accomplish agency missions”.
Finally, the implementation section puts the CIO at the very center of this change, without calling – at least explicitly – for any new role (such as a Chief Data Officer), and stresses that cost savings are expected and that “potential upfront investments should be considered in the context of their future benefits and be funded through the agency’s capital planning and budget processes”. Which is to say that openness is not a nice-to-have, for which additional financial support should be expected, but is at the core of how agencies should operate to be more effective and efficient.
Being a cynical analyst, I can’t just be complimentary of an otherwise brilliant policy without flagging one minor point where it might have been more explicit. In section 3.d.i the policy assigns the responsibility for “communicating the strategic value of open (public) data to internal stakeholders and the public”. This is great, as selling open public data internally is absolutely fundamental to gaining support and making openness a sustainable practice. However, I would have loved an explicit mention of the need for agencies to use and leverage each other’s open public data, rather than the suggestion that the only target is “entrepreneurs and innovators in the private and nonprofit sector”.
Let’s be clear: there is nothing in the policy that would either prevent or discourage internal use of open public data. But as the policy gets implemented, the balance and collaboration between CTO Todd Park – who will most likely continue pursuing the external impact of open public data – and CIO Steven VanRoekel – who chairs the CIO Council and will be mostly concerned with the internal use of information – will be crucial to making sure that openness by default becomes the new mantra.
Category: open government data Tags: open data, US CIO, US CTO
by Andrea Di Maio | May 9, 2013 | Comments Off
Yesterday I attended a Gartner event held in Milan, where the Italian analyst team addressed over 300 executive-level clients on how the so-called nexus of forces, i.e. the confluence of social, cloud, mobile and information (big data) is going to create new business opportunities, disrupt existing business models and challenge the IT department.
In one session, six executives from IT vendor organizations were put on stage and asked a number of questions. They would each answer on a scale from “strongly agree” to “strongly disagree”, and then the audience would express its opinion on the same question.
For most of the questions, differences among vendors and between the vendors and the audience were rather predictable and made for a good dynamic.
Then came a question about whether the price drop in increasingly commoditized IT services will fail to be compensated by increased volumes. Basically, the question was meant to probe what people thought about the combined impact of consumer technology and public cloud creeping into the enterprise.
All vendors responded that they strongly disagreed. I think that was the only question on which they all gave the same answer. Then again, none of the vendors was in the consumer space, so what else should one expect? Of course they believe they will be able to sell both enough volume and loads of value-added services.
What struck me was that the majority of the audience also agreed. Even if I discount a few vendors in the mix, this means that most IT executives in enterprises believe that nothing will really change: that they will keep buying more lower-cost services and a whole lot of new value-adding services, pretty much from the same vendors.
As I look at the struggle of imposing an enterprise collaboration tool over a consumer one, at the acceleration of BYOD, at the increasing use of public cloud, and at the reality that CIOs are playing catch-up with some of these trends, I suspect people are burying their heads in the sand.
Although different industries will go through this transformation at different speeds, one thing is for sure: in many cases the CIO will be caught by surprise or caught in the middle. And that’s not a good place to be.
Category: IT management Tags: Italy, nexus of forces
by Andrea Di Maio | April 22, 2013 | 3 Comments
When people talk about smart cities, there seems to be an implicit assumption that those who get the concept and are able to lead the change must be demographically young. The same applies to other recent technology-driven phenomena, such as open government or government 2.0. Datapaloozas, hackathons, hangouts, unconferences: people over 40 are barely to be seen, and older people are admitted only if they are recognized as innovators or as part of the audience to be inducted into the “new way” of doing business.
Last Friday I attended the first public hearing for Milan as a smart city. There were several speakers, from cities, associations, enterprises, to discuss various aspects of a smart city program, what technologies are required, the importance of alliances, the many examples around the world, the networking mechanisms among leading cities and so forth.
When I was about to be overwhelmed by a sense of déjà vu, Piero Bassetti, an 85-year-old veteran entrepreneur and politician, took the floor and in a few minutes, with incredible acumen, nailed the reason why smart cities often turn into little more than a talking shop, a pilot or a showcase for technology. He basically said that people focus on what a smart city is and how to make it smart, but not on why smartness is needed. Every city has its own vocation, problems and peculiarities: comparing across cities is pointless, and the key is to understand whether and how smartness can help the city strengthen its vocation or solve one of its top-priority problems.
His message resonated in the room after a stream of previous speeches, especially by representatives of city associations, that had delivered little more than rankings, comparisons and endless lists of “smart” projects.
Ironically, Italy has just re-elected its 87-year-old president after a short competition in parliament with two other candidates, both over or close to 80. Many commentators flag this as a sign of a country that is unable to change and innovate. And yet the fact that the most senior speaker in a traditionally younger and geekier environment was also the most disruptive and inspirational may prove them wrong.
Category: smart government Tags: Italy, smart city
by Andrea Di Maio | April 4, 2013 | 2 Comments
Earlier today I was being briefed by a major vendor about their public sector offering. This vendor provides – among other things – content management solutions. In passing, I asked whether they plan to add support for open data creation, management and mash-up, but they answered that they are seeing less demand for this than just a year ago, mostly due to budget constraints. They said that as CIOs need to focus on how to deliver services more efficiently, openness and transparency are more a “nice to have” than something indispensable to run the business.
At the same time, more datapaloozas and similar events are taking place around the world, with Chief Technology Officers, Chief Data Officers and Chief Digital Officers mostly supportive and convinced of the key role of open data.
The reality is that the community of interest around open data is not changing. It is certainly growing and as vocal as ever, but for the most part it is still self-referential and either unable or unwilling to make a clearer connection between open data and the resource challenges that many government CIOs are facing.
The good news is that there is still time for this to happen. The bad news is that roles like the Chief Data Officer or the Chief Digital Officer – or at least the way they seem to be interpreted in government – are biased toward outward-facing outcomes (such as citizen engagement, contribution to an open data ecosystem, and so forth), potentially missing the opportunity to internalize the value of open government.
Category: open government data Tags: open data
by Andrea Di Maio | March 7, 2013 | Comments Off
I spent three days at the Gartner Symposium in Dubai, where over 500 attendees from countries in the Gulf area gathered to listen to a number of Gartner analysts presenting current trends in the use of information technology. Our overarching message, as at our recent symposia in North America, Brazil, Europe, India, Japan and South Africa, revolved around the so-called Nexus of Forces, i.e. the confluence of social, cloud, mobile and information (big data), which creates an inflection point in how technology supports and enables businesses in all industry sectors.
I had several sessions: a workshop on social media and open government, a presentation on our “smart government” scenario, a round table on cloud computing in government, and several one-on-ones or many-to-ones on these topics.
I was very positively impressed with the quality of interactions and the relevance of questions. During the presentation, I got more questions than I did for the same presentation in Europe, and all questions were extremely topical. They were not only relevant for the region, but they were questions that I would have loved to see asked in other parts of the world as well.
We had a whole discussion on the boundaries between personal and professional identities on social media, as well as on who should be in charge of social media strategies and policies. Participants were very sharp in grasping almost instantly the relevance of focusing on how to enable positive behaviors rather than only on preventing inappropriate ones. They were clearly acutely aware of the challenges posed by social networks, as the Arab Spring showed a couple of years ago, but they seemed eager to move beyond that point and explore other, more beneficial options. One clear demonstration of this was how rapidly the discussion on open government shifted from citizen engagement in policy making (which – despite what some people may believe – is something that most of them take very seriously) to citizen involvement in service delivery and transformation.
There are probably two main risks standing in the way of success.
The first one is complacency. The lack of budget pressures, and the fact that authority is clear and can be enforced, may make government CIOs and other executives believe that they can keep operating in an environment they control. Be it an employee’s desire to use a personal device, or a business unit’s attempt to purchase a cloud SaaS solution without IT even knowing, these may be seen as alien situations in the region. However, talking to representatives of a shared service provider, it was evident they understood that, irrespective of any mandate to use their services, if they underperform they are going to be challenged by alternative approaches. What may be less clear – and it is not very clear to many government shared services organizations around the world – is that the ability to prevent such a situation is rooted in the willingness to constantly challenge themselves and ask whether they should really keep delivering all the services they currently deliver. Being able to identify service areas to discontinue, because they can be procured almost as a commodity from the market, is an important element of a sustainable shared services offering.
The other one is an obsession with importing best practices from western countries. While taking inspiration from technology-intensive solutions in other jurisdictions is always a good thing, aiming to imitate or outdo them can be a futile exercise. The context – in terms of governance models, market maturity, regulatory constraints and cultural traits – is different, and – as I said before – GCC IT executives seem savvy and aware enough of their context (as well as of their options) to make the right decisions. However, as often happens in the Western world too, political leaders like to see their countries at the top of comparative rankings, and vendors may have an easy time amplifying their accomplishments in other jurisdictions rather than helping their clients better understand the local context and opportunities.
From my limited experience at this symposium, I am very confident that most GCC governments will find effective approaches to deal with the challenges and opportunities offered by the nexus of forces. It won’t always be a smooth ride but, at the end of it, they may find themselves much higher in the rankings. Even unintentionally.
Category: IT management Tags: nexus of forces
by Andrea Di Maio | March 1, 2013 | Comments Off
Many commentators have been discussing the outcome of the last Italian elections, held a few days ago, which resulted in a tie between the two major coalitions (the center-left Democratic Party with its allies and the center-right People of Freedom) and a surprising success for the new grassroots Five Star Movement, led by former comedian Beppe Grillo.
In the months leading up to the elections, a lot was said about the new model proposed by the Five Star Movement, based on the direct engagement of people through the web and started with Grillo’s highly influential blog. Looking at the success of this model, as well as at the key role played by the web in several other elections over the last few years, ranging from the US to Malaysia, the two main coalitions jumped onto the web and social media bandwagon, suddenly discovering that, despite the official statistics and the never-ending lament about the need for a digital agenda, Italians are far more wired than many think.
While political leaders were spending time tweeting and holding online hangouts, Grillo was the only leader to spend as much time as possible on the ground, with an endless series of events, which started with a successful tour of Sicily that he opened by swimming across the Strait of Messina, and ended with a huge rally in Rome that attracted well over half a million people.
Of course he maintained his web presence, but many believe that the secret to his massive success, beyond the radically populist messages, was his closeness to Main Street.
While the Italian situation is very peculiar, there may be lessons here for all those who discount the importance of physical channels, both in politics and in administration. How many of us have stumbled over a web site or an automated voice-response system while trying to solve a problem, only to be helped through by a real person? How many of us remember a great online tax-filing experience more fondly than an employee who sat with us to explain how to file?
At the end of the day, which technology will be more essential to win an election? Big data analytics and social media, or a powerful PA system for the candidate’s voice to reach out to people in the streets?
Category: Europe and IT web 2.0 in government Tags: government 2.0, Italy
by Andrea Di Maio | January 30, 2013 | 2 Comments
Crowdsourcing can be an effective means to tap into the so-called “wisdom of the crowd” to solve complex problems, stimulate innovation, slash the cost of research and encourage collaboration across organizational boundaries. Examples like InnoCentive or IdeaScale come to mind, but there are plenty of areas where crowdsourcing can help.
Usually it is applied ex ante: when we recognize that we cannot find a solution, or that we need an out-of-the-box one, we engage a community – whose size depends on the problem at hand – to solve the problem.
However, there is another use of crowdsourcing that serves a different purpose. Rather than engaging a crowd to come up with an idea, a solution or a position, or to further develop one that is at a very early stage, the crowd can be engaged after the idea or position is cast in stone (hence ex post), to shield it from external criticism.
This tactic can be used by enterprises that see one of their products or services under attack by consumers on various social media platforms, and unleash an army of followers who will praise the product, boost the ratings and aim at tilting the balance in favor of the enterprise.
It can be used by individuals too.
This can be done very tactfully, by just arguing factually in favor of the position: irrespective of the merit of that position, if there are enough followers willing to support the individual, his or her critics are likely to be outnumbered.
It can be done less tactfully when the supporting crowd does not have enough elements to reinforce and defend the original position, or when the position is inherently weak. In this case the crowd, either spontaneously or building on a comment by the individual, will turn its attention on the critics, claiming an unfair attitude or even going as far as indirectly threatening some form of legal retribution.
The main benefit of this defensive tactic is that the average personal bandwidth of people following the discussion is often insufficient to grasp its origin, or even to discern between opposing viewpoints. If the debate is then colored by other allegations, attention may spike, but it moves even further away from the original topic.
The downside is that it is vulnerable to any sort of retrospective research that could highlight behavioral patterns by the enterprise or the individual.
Category: Uncategorized Tags:
by Andrea Di Maio | January 30, 2013 | 4 Comments
Those who happen to read my blog know that I am rather cynical about many enthusiastic pronouncements around open data. One of the points I keep banging on about is that the most common perspective treats open data as just something that governments ought to publish for businesses and citizens to use. This perspective misses both the importance of open data created elsewhere – such as by businesses or by people in social networks – and the impact of its use inside government. Also, there is a basic confusion between open and public data: not all open data is public, and not all public data may be open (although it should be, in the long run).
In this respect the new experimental site alpha.data.gov is a breath of fresh air. Announced in a recent post on the White House blog, it does not contain data, but explains which categories of open data can be used for which sort of purposes. And the nice surprise is that at the top of the page it says
A collection of open data from government, the private sector and non-profits that are fueling a new economy
There are examples of non-government open data, such as car data streams that already power new insurance business models. There are examples of personal open data, such as personal academic data for students to build personal learning profiles, around which one can imagine an ecosystem of services and applications; or personal health data, such as that supported by the Blue Button initiative. Besides, of course, plenty of government public data in areas like health, commerce, education, finance.
Alpha.data.gov hints at a new role for governments, which can shift from being simple open data providers to becoming open data hubs. While I suspect that large information service providers will be keen to position themselves as the open data hub of choice, alpha.data.gov can show the path, raise awareness and ultimately help governments move from being pure providers to being actual consumers of open data.
Category: open government data Tags: open data, open government
by Andrea Di Maio | January 28, 2013 | 2 Comments
For anybody who has been watching the evolution of consumer technology, it is quite clear that devices are becoming obsolete much sooner than in past years. My parents used the same fridge for over 30 years and the same TV set for almost 20, and my hi-fi has been serving me well for over 20 years. Things have changed with digital technology, and now laptops, tablets, cellphones and TVs get replaced every few years or – sometimes – every few months.
It is one thing to know this, another to experience it. I was one of the first owners of an iPad in my circle of friends and colleagues. I remember I bought it in Chicago a few days after its launch, and when I flew back I had all the flight attendants around me watching this strange new device (and making me feel so proud of getting their attention).
Now, less than three years after that purchase, people opening their shiny new tablets during a meeting look down on me with disdain at the unmistakable iPad 1 cover, and I can clearly read the question in their eyes: “Is there a problem with this guy? Can’t he afford a new one?”. Oddly enough, if somebody at the same table is holding just paper and a pen, people look at him or her with curiosity mixed with respect, and that person can say “I would miss the feel of paper in my hands”.
Until you concede and buy yourself a device, you have all sorts of defenses. If somebody asks “have you ever considered buying yourself a tablet?”, you can say many things, ranging from “I share one with my partner” to “I am not good at typing and my handwriting is horrible” to “I touch-type, so I’d rather use a laptop” to “I’m a Luddite” (although this would be hardly credible for a technology analyst). But if you have the old model, how do you defend yourself against the “what’s wrong with this guy?” question in your counterpart’s eyes?
Sometimes people approach me and ask more direct questions: “Don’t you miss the camera?”, “Hasn’t that become horribly slow?”, “How do you manage with apps that can’t be updated any longer?” and so forth.
The reality is that I can’t find the personal business case for upgrading. I have my bunch of applications for watching video, playing and recording music, and reading and annotating books and documents. I lost access to corporate email when I missed the upgrade to iOS 6, but I have other devices for that.
I admire those who keep upgrading and buying themselves the latest toys, and even find ways to hand their old models down to their kids. But when I try to picture that scene at my place, I can already hear my kids saying: “You got it wrong, Dad. You keep the old clunker, and we get the new model”.
Category: Uncategorized Tags: iPad
by Andrea Di Maio | January 25, 2013 | Comments Off
Government organizations around the world have been on a continuous path toward greater IT efficiency as a result of overall spending cuts and budget reductions driven by the economic and financial situation in most of the developed world.
An excellent report recently published by the UK National Audit Office shows that recipes for IT cost containment applied by the UK government, especially in the area of better and more consolidated procurement, are delivering the expected results.
There are jurisdictions where there is still a lot of room for improvement when it comes to IT cost containment: insufficient coordination and standardization, complexity and devolution of decision making processes, conflicts of interest or even corruption get in the way.
But in many places, and the UK is one of them, IT cost containment has been relentlessly pursued, and one might argue that government IT organizations, both at national and local level, are close to the bone and close to not being able to reduce their costs any further. On the other hand, spending projections for the next several years in the same jurisdictions indicate that more savings are expected through headcount reductions and other measures to bring down operational spending.
For them, a proportional cost reduction is no longer an option. In order to sustain citizen services and discharge their statutory obligations, they will be forced to automate, transform and digitize much further. Although individual technologies become cheaper, the sheer magnitude of the digitization ahead is such that IT spending cannot decline any further.
This seems to be confirmed by some Gartner data (in particular Forecast: Enterprise IT Spending for the Government and Education Markets, Worldwide, 2010-2016, 4Q12 Update – client access required), which shows, especially at state and local level but also at national level (albeit a bit later), a slowdown in IT spending reduction and a return to growth around 2014.
After years in which government IT professionals struggled to prove the value of IT, we may be at a point where their business colleagues finally understand it.
However, more IT spending does not mean more IT spending by government IT departments. The use of consumer and commodity technology is likely to shift IT spending from the IT department to IT users.
Therefore, in order for that spending to really help cushion the impact of overall budget cuts, rather than be wasted across multiple streams, it is essential for government CIOs to become good shepherds: they must strike the right balance between what they need to control and what they can leave to IT users to choose.
Category: IT management Tags: cost cutting