Thomas Bittman

A member of the Gartner Blog Network

Thomas J. Bittman
VP Distinguished Analyst
18 years at Gartner
29 years IT industry

Thomas Bittman is a vice president and distinguished analyst with Gartner Research. Mr. Bittman has led the industry in areas such as private cloud computing and virtualization. Mr. Bittman invented the term "real-time infrastructure," which has been adopted by major vendors and many…

How to Create a Cloud Strategy That Fails Big!

by Tom Bittman  |  June 24, 2014  |  6 Comments

The elephant in the room is – everyone sees a different part of the elephant.


Cloud computing is becoming widely adopted, and yet, there are very different schools of thought on exactly what “cloud” is all about. That’s partly due to the term “cloud computing” being used for a broad collection of services, delivered at many different layers (infrastructure, application platform, software, and business process), and implemented in a variety of different ways (public, private, hybrid, on-premises, and off-premises). And yet, many organizations see only one value proposition of “cloud.” “Cloud computing” is, essentially, a horrible, damaging, dangerous term that will cause tremendous pain and suffering, transmogrify and destroy the careers of IT professionals and IT executives, and it should be banished from the world of IT. But until we come up with better terms, we should at least understand what we can and cannot get from the varied world of “cloud computing.”

Three perceptions of “cloud computing” dominate today, and often drive enterprise IT strategies. Usually, the IT organization is focused on only one of these, to their peril:

It’s a way to save money. The belief here is that cloud computing’s economies of scale will always win, and that IT is a commodity. We see this all the time in “cloud first” (or worse, “cloud only”) strategies. The danger here is that cloud computing is not always cheaper. Services that can be effectively outsourced to “the cloud” to save money are often highly standardized commodity services whose demand for infrastructure varies over time (a back-of-envelope sketch follows this list). The cost of retrofitting existing applications can sometimes be prohibitive. Standardization is required for cloud, and standardization can save money – but standardization has eluded enterprise IT for years in certain areas. Cloud computing doesn’t, by itself, make standardization easier to adopt. Cloud providers also have little motivation to make interoperability with other cloud providers easy (although integration and sharing across services might be necessary), and certainly no motivation to help manage usage efficiently. While economies of scale reduce costs, cloud providers also need to make a profit margin, and in a highly competitive market, some of them will not survive – which is problematic, because providers also aren’t motivated to make migration away easy. Bottom line: cloud computing can save money, but only for the right services.

It’s a way to renovate enterprise IT. The belief here is that cloud computing is an ideal form of computing, and that enterprise IT should change to adopt the lessons of the “cloud.” This is often expressed by enterprise IT organizations attempting to turn their entire infrastructure into a “private cloud.” However, cloud computing is, by definition, turning IT services into fast food. Not everything fits this style of IT service design and delivery. Not everything requires speed of deployment, or rapid scaling up or down. Not every IT service benefits from runtime automation. Some services are unique, and run the same way, day in and day out, for years (and will struggle if the underlying service keeps changing). Some services require significant and unique enterprise differentiation and customization. For decades, the goal of enterprise IT has been to build efficient, synergistic horizontal management tools and consistent, complete operational processes that cover a broad portfolio of IT services. Cloud computing turns this cross-service construct into many stand-alone services that are optimized within their silos, but lack cross-service efficiencies. And, if a service does fit the cloud model, it might make more sense to deploy it in a public cloud rather than a private cloud. Bottom line: enterprise IT can learn from cloud computing, and private cloud, when applied to the right services (those that can’t be deployed to a public cloud provider), can drive the organization to more efficient and effective standards.

It’s a way to innovate and experiment. The belief here is that cloud computing is where start-ups are born. While this is certainly true, the low barrier to entry of cloud computing also enables large enterprises to behave like start-ups for new services. Cloud computing makes it extremely easy to get started, and to pilot new services. The challenge for enterprises is to enable innovation and experimentation, but to have a feasible path from pilots to production and operational industrialization. Cloud computing makes it possible for IT users to leverage services without engaging the central IT organization. While this might be fine for the right services, the fact is that central IT has often protected the enterprise in terms of compliance, security, performance management, availability management, risk management, disaster recovery, and so on. Unless the user of cloud computing services has a rich understanding of their requirements, and of the requirements that the business has for services and data (and they rarely do), there is a benefit to having some form of oversight and future-planning for IT use. Bottom line: cloud computing enables new forms of computing, and can enable experimentation and short-running services like never before – but there is a balance between innovation anarchy and long-term operational effectiveness and efficiency that needs to be managed just right.
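To make the “not always cheaper” argument concrete, here is a minimal back-of-envelope sketch in Python. Every number in it (server cost, instance rate, demand curves) is a hypothetical assumption for illustration, not real provider pricing:

    # Hypothetical numbers only: illustrative, not real provider rates.

    ONPREM_MONTHLY_PER_SERVER = 320.0  # amortized hardware + facilities + ops
    CLOUD_HOURLY_PER_SERVER = 0.75     # on-demand instance rate
    HOURS_PER_MONTH = 730

    def cloud_cost(hourly_demand):
        """Pay only for the server-hours actually consumed."""
        return sum(hourly_demand) * CLOUD_HOURLY_PER_SERVER

    def onprem_cost(hourly_demand):
        """Pay for enough servers to cover the peak, busy or idle."""
        return max(hourly_demand) * ONPREM_MONTHLY_PER_SERVER

    steady = [10] * HOURS_PER_MONTH    # flat ten-server load, all month
    bursty = [2] * 700 + [50] * 30     # mostly quiet, one short spike

    for name, demand in [("steady", steady), ("bursty", bursty)]:
        print(f"{name}: cloud ${cloud_cost(demand):,.0f} "
              f"vs. on-premises ${onprem_cost(demand):,.0f}")

With these toy numbers, the steady workload is cheaper on peak-provisioned, on-premises capacity (about $3,200 vs. $5,475 per month), while the bursty workload is far cheaper pay-per-use (about $2,175 vs. $16,000) – which is why “cloud first” as a blanket cost strategy misfires.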

There’s a big difference between “cloud” for bottom-line improvement and “cloud” for top-line improvement, and they can be diametrically opposed strategies. Which goes to the premise of this post. Cloud does not have a single value proposition for all enterprises and all services. A cloud computing strategy should include all of the above: are there opportunities to save money, and to shift from capital expenses to operational expenses? Are there internal IT services that would benefit from the lessons of cloud computing, and are there services that deserve their own, streamlined architectures and operational process models? And perhaps most importantly, are there services that were not feasible prior to cloud computing, that enable new services and new uses of IT that can truly help the enterprise innovate and grow?

In the end, most organizations should see cloud computing as a broad array of new possibilities that the enterprise and IT should leverage. And since it isn’t one simple thing, this will drive enterprise IT to a new core competency, away from solely being a provider of services, and toward being both a provider and a broker of services – what we call Hybrid IT. But I’ll leave that for another post.

For now, don’t get stepped on by the elephant.


Category: Agility, Cloud, Future of Infrastructure, Hybrid Cloud, IT Governance, Private Cloud

Hybrid Cloud is Three Years Behind Private Cloud

by Tom Bittman  |  October 24, 2013  |  2 Comments

Polls (and inquiries) imply that the hybrid cloud computing trend is almost exactly three years behind private cloud computing.

By any measure, private cloud computing has become a major trend since the concept was first publicly discussed in 2008. While it isn’t mainstream yet, many enterprises have deployed something, and have plans to expand. Market hype is being replaced with focused investments and realistic plans.

Gartner’s Data Center Conference tends to draw infrastructure and operations decision makers from larger enterprises. Polls taken in sessions at this conference have proven to be fairly good barometers of the large-enterprise data center market – even with small sample sizes. So I found it fascinating when the results of polls about hybrid cloud computing taken in December 2012 nearly matched polls for private cloud computing from December 2009.


In 2009, I asked attendees how many of them already had a private cloud service in place (this was after I explained Gartner’s definition of private cloud computing – so there wasn’t a semantic issue). Crickets. So while none had a private cloud service in place yet, three-fourths aspired to have one within three years. I asked the same question about hybrid cloud three years later – again, no one claimed to have a hybrid cloud service yet, but nearly three-fourths aspired to have one within three years.

Of course, aspirations rarely become reality. I also asked attendees in December 2012 about private cloud progress – 44% had deployed something, 45% were planning to deploy something, and only 11% had no plans. Not quite the 76% who aspired to be there – but progress, nonetheless.


While we can’t draw too many conclusions, indicators are pointing to hybrid cloud becoming an interesting trend in the next three years. Yes, hybrid is overhyped, and the idea of cloudbursting will need to be brought down to Earth a bit – most hybrid cloud deployments will be managed at provisioning time, not quite as dynamically as “bursting” implies. While there are certainly challenges to overcome, I actually believe the challenges are less daunting than the transformations that private cloud computing drives (in process, culture, funding models, skills and organization).


Category: Hybrid Cloud, Private Cloud

The Consumerization of Truth, Part II: Sandy Hook

by Tom Bittman  |  July 16, 2013  |  18 Comments

When I wrote The Consumerization of Truth, and Virtual Villages, little did I know that I was about to be thrust into the middle of wildly divergent worldviews. My blog post discussed how a connected world of people devolves into polarized and radicalized worldviews, as like-minded people confirm each other’s beliefs. Three days after my blog post, tragedy struck my school and neighbors.

We live in Sandy Hook, Connecticut. My kids all attended Sandy Hook School. I was a Cub Scout leader there for ten years. My wife ran the school newspaper. We’ve lived in Sandy Hook for nearly two decades. This is our town, and our school. On December 14, twenty first-graders and six adults lost their lives there in a mass shooting. We didn’t know all of them, but we knew many of them, and had connections to many others. We live in a small, rural town, and neighbors know and take care of their neighbors. So it wasn’t surprising when many of our neighbors immediately created meal circles, volunteered services, and organized. With several friends, I co-founded Sandy Hook Promise, and focused my energies on helping the families of the victims in whatever way I could.

All of us wanted something positive to come from this tragedy, so it wasn’t long before we were looking at what should change in school safety, parenting, mental health and gun responsibility. We had to make what happened to us less likely. It didn’t take long before we found ourselves on a national stage, in the middle of very polarized worldviews on guns.

Different points of view are perfectly understandable. What shocked us was how our very horrible truth became someone else’s conspiracy. I had no idea that there were Americans who simply could not accept what we knew was true, and instead preferred to believe that we were actors, that no children died, that Sandy Hook was a staged event designed for one purpose – gun control. These Americans consider themselves patriotic, many consider themselves religious, and yet they abused the families of victims, with posts on Facebook, hours and hours of YouTube videos, web sites, phone calls, letters, and face-to-face “investigative” visits to Newtown.

There will be two kinds of people who read this: those who find their behavior abhorrent, immoral and out of touch with reality, and those who believe I am a part of the conspiracy.

How can this be?

Our connected world makes it so much easier to find people who think like you do. So a person who is already biased in their opinion will seek out others who think the same way, seek out information that matches their opinion, and ignore information that doesn’t fit their worldview. It’s called confirmation bias. While it’s hard to understand how a person can believe something that is so out of touch with reality, it needs to be seen as a slippery slope. They likely didn’t start that way, but when two people start to question things, it amplifies. Now multiply that by hundreds, or even thousands. At one point, they weren’t sure. But when confirmed by others who weren’t sure, it becomes their truth. A virtual village is born.

Is there any way to avoid this divide? Will it get worse?

I don’t know. And that’s a scary statement about society.

I worry about the press becoming a mouthpiece for bias. I worry about an education system that continues to train people to be old-world bureaucrats and factory workers, when we need critical thinkers. It isn’t just important; it’s paramount that we make a tremendous effort to arm our children to be smarter information consumers.

As we say in Newtown, “Choose Love.” For the sake of our society, I can only hope that the bias that wins the day is a bias toward love, and grace. If it does, everything else will sort out – as long as we use our heads.


Category: Education

The Consumerization of Truth, and Virtual Villages

by Tom Bittman  |  December 11, 2012  |  7 Comments

How is it possible that we have wildly divergent worldviews in a society connected by email, the Internet, Google, and Facebook? It’s because of email, the Internet, Google, and Facebook. And more importantly, the people who use these tools, and the virtual villages we build.

When Thomas Friedman argued that the world is flat – that connectivity and speed have converted the world into a level playing field with near-zero market friction – I agreed with him in theory. But in practice, especially in society, we have a very different reality – because a level playing field means little when the players are people, and they stay in their various corners.

The world may be potentially flat, but humanity is wildly divergent, local, locally nurtured, and opinionated. I contend that connectivity is making things worse, more divergent, more radicalized, and not “flatter.”

Idealists saw the Internet and the connectivity it created as the great melting pot of ideas. We could have different opinions, but reach a consensus; different views of facts, but bad data would be self-correcting. The absolute opposite has occurred, because people are using the Internet. We can connect with anyone, and discuss any idea – but we don’t. People seek out like-minded people, and become more and more radicalized in their views. We are still villagers, but unlike the village of old, we get to define our own village, and we tend to build villages full of people who are just like us. Different views of facts create entire worldviews based on falsehood, or spin. Bad data propagates, mutates and spreads like wildfire. Topics like politics, religion, evolution, global warming, and race relations are becoming more and more polarized.

This is the “consumerization of truth.”

It’s certainly true that the Fourth Estate (i.e., the press) has evolved into something very different in the past few decades. Mass media has become more opinionated, and it is hard to identify a news source that isn’t labeled “liberal” or “conservative” anymore. But I think that’s only a part of the problem, and perhaps just a small part.

The Internet is a wonderful replication and transmission mechanism for memes (in the form of opinions, worldviews, “facts”). I believe that the consumerization of truth is much more about meme creation and replication on the Internet, than what a major news network says. More importantly, I believe worldviews are created not just by a single meme, but by a constant barrage.

Here’s an example of a single, common meme that originally replicated in email, and more recently, on Facebook – a photograph of a newspaper article. In the article, a man named “Tyrone” in Georgia stole a laptop, and as he was leaving the store, several Marines selling Toys for Tots stopped the thief, with one of the Marines (with an Irish name) getting wounded by a knife in the process. The police reported that Tyrone was injured by falling – many broken bones, broken nose, missing teeth, etc. – implying, of course, that the Marines levied their own justice.

In a recent post of this meme, 99% of those reading it (out of 400+) believed it was true and appropriate. Semper Fi. Only 1% realized that the article was a fake, based on a true story. The real bad guy was named Tracey (less “cultural”), not Tyrone. The Marines didn’t beat up the guy; in fact, he was captured by both Marines and store security. In other words, the meme tells a story of good-guy, white, John Wayne Marines dealing with a bad black guy, and the police turning a blind eye to vigilantism.

By itself, this is only one false data point in someone’s worldview. But when you are inundated with stories and pictures with a similar theme, it has an effect – views on race relations, the military, appropriate justice – they all evolve, just a little. Before long, a person can be radicalized. Someone might even act based on that, based on their own view of what is right and “normal.”

I think this also explains how the different camps in the 2012 U.S. presidential election had such completely different views on who would win the election. Certainly, mass media had an effect, but I think the virtual village effect was even more important – when your primary conversations about projections were with like-minded people, you became more and more convinced you were right. How else do you explain Karl Rove’s utter confusion when Fox News called Ohio for Obama?

There’s a parallel here with the consumerization of IT, where consumer devices are invading the workplace. In the good old days, the IT organization had control, and could erect security boundaries around all software and hardware connected to enterprise IT. Now users bring their own devices (BYOD), managed and unmanaged diversity are the rule, and security perimeters are shrinking. Enterprise IT has less control.

In consumerization of truth, there’s less control of information creation, replication and distribution. “The truth is out there,” but everywhere, and wildly divergent. We cannot “control” all of these truths. All we can really do is educate the consumers to be better at filtering and analyzing the flood of information coming their way. This starts with our education system. Unless we make a tremendous effort to arm our children to be smarter information consumers, email, the Internet, Google and Facebook will continue to divide and radicalize us. I fear for a world divided into polarized virtual villages, continuing to mutate their opposing worldviews.

But, let’s have some hope here. Maybe, just maybe, this is a generational problem. Maybe the generations born into this connected world will be smarter about navigating it. Maybe. But we can certainly help that along, can’t we?


Category: Education

Mind the Gap: Here Comes Hybrid Cloud

by Tom Bittman  |  September 24, 2012  |  18 Comments

Just when you thought you were starting to understand cloud computing and private cloud computing, here comes hybrid cloud!

Vendors are already flocking to the term – it means everything from remotely managed appliances to a mix of virtual and non-virtual servers to traditional applications using cloud services, and everything in between. So what is it?

Gartner defines a hybrid cloud service as a cloud computing service that is composed of some combination of private, public and community cloud services, from different service providers. A hybrid cloud service crosses isolation and provider boundaries so that it can’t be simply put in one category of private, public, or community cloud service. This definition is intentionally loose, because there really are a lot of interesting edge exceptions, and rather than draw a tight boundary around what is, and what isn’t, this seems to get to the central point of the matter well enough.

So why is hybrid cloud computing useful? It allows you to extend either the capacity or the capability of a cloud service, by aggregation, integration or customization with another cloud service. For example, there might be a community cloud service that needs to include data from public cloud services in its analysis – while retaining a certain amount of analytics or data privately. Or a private cloud service that needs to expand its capacity by extending temporarily into a public cloud service (or perhaps a somewhat private cloud service offered by a third party provider). It allows you to balance your privacy needs with additional capacity and capability needs.

The terms “overdrafting” and “cloudbursting” have been used to describe how a hybrid cloud service could be used for capacity, but they paint an extreme example. Hybrid cloud compositions can be static (designed to require multiple services), composed at deployment/usage time (e.g., perhaps choosing one service provider or another, or combining based on policies), or composed dynamically (e.g., cloudbursting – or perhaps at disaster recovery time). 
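To make the middle option – composition at deployment/usage time – concrete, here is a minimal sketch in Python. The provider names, capacities and policy rules are hypothetical assumptions, not any vendor’s API; the point is that the placement decision runs once, at provisioning time, rather than dynamically at runtime as “bursting” would imply:

    # Hypothetical provisioning-time placement policy; provider names,
    # capacities and rules are illustrative assumptions only.

    from dataclasses import dataclass

    @dataclass
    class Workload:
        name: str
        handles_regulated_data: bool
        expected_cores: int

    # Toy provider catalog: name -> spare core capacity.
    PROVIDERS = {
        "internal-private-cloud": 64,
        "public-provider-a": 10_000,
    }

    def place(workload: Workload) -> str:
        """Choose a provider once, when the workload is provisioned.

        Policy: regulated data must stay on the private cloud; everything
        else goes to the first provider with enough spare capacity.
        """
        if workload.handles_regulated_data:
            return "internal-private-cloud"
        for name, capacity in PROVIDERS.items():
            if capacity >= workload.expected_cores:
                return name
        raise RuntimeError("no provider can satisfy the request")

    print(place(Workload("payroll-batch", True, 16)))   # internal-private-cloud
    print(place(Workload("marketing-site", False, 8)))  # first provider with room

A dynamic composition would re-run a policy like this while the service is live; a static composition would hard-wire the answer into the service design.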

While these compositions can be designed into services and/or cloud-based applications, they will often be managed by cloud services brokerages – the intermediary that Gartner expects to become a major phenomenon in the next few years (something like the system integrator of the cloud world). Large enterprises will often take on this role themselves – in fact, this is central to Gartner’s vision for the future of IT.

So what does all this mean now? It means look out – the world is not going to be neatly divided into separate private and public cloud services. To maximize efficiency and take advantage of publicly available cloud services, we’re going to munge them together. Private clouds, in particular, will not stay simply “private” for long – Gartner expects most private cloud services to become hybrid. Minding the gap is key – planning for it, balancing isolation and value, leveraging it – hybrid will move from the realm of hype and vendor cloud-washing to reality in the next few years.


Category: Cloud, Hybrid Cloud, Private Cloud

Top Five Private Cloud Computing Trends, 2012

by Tom Bittman  |  March 22, 2012  |  16 Comments

Private cloud computing continues to heat up, and there are several key trends defining private cloud computing in 2012:

1) Real Deployments: We’ll see about a 10X increase in private cloud deployments in 2012. Enterprises will find where private cloud makes sense, and where it’s completely over-hyped. We’ll see successes – and there will also be a number of failures (we’ve seen some already).

2) Hybrid Plans: According to polls, enterprises are already looking beyond private cloud to hybrid cloud computing (not cloud bursting, per se, but resource pool expansion). Interest in hybrid is affecting architecture plans and vendor selection today – but actual hybrid cloud usage is really rare right now.

3) Choices Expand: The cloud management platform market is very immature, but there are choices, and four distinct categories are forming: a) virtualization platforms expanding “up”, b) traditional management vendors expanding “down”, c) open-source-centered initiatives (most notably OpenStack), and d) start-ups often focused on Amazon interoperability (note that Amazon just announced a tighter relationship with Eucalyptus Systems for exactly this).

4) Sourcing Alternatives: While on-premises private clouds are becoming the most common, there’s growing interest in private clouds managed by service providers – but these come with varying levels of “privacy”, and understanding those levels is critical.

5) Value is Shifting: Many enterprises have assumed that the primary benefit of private cloud is lower costs. That’s changing. According to recent polls, the majority of large enterprises consider speed and agility to be the primary benefit. This is making private cloud decisions more sophisticated, based more on understanding business requirements. Enterprises engaged in private cloud projects simply to reduce costs will usually fail to meet their objectives, and will miss the mark on potential business benefits.

2012 will be the year that private cloud moves from market hype to many pilot and mainstream deployments. So much will be happening in 2012 that winners and losers in the vendor sweepstakes will probably be pretty clear by year-end 2012, and certainly by year-end 2013. Also, enterprises are rushing so fast that there will be casualties along the way. Staying on top of best practices and learning from early adopters is a must.


Category: Agility, Cloud, Private Cloud

Top Five Server Virtualization Trends, 2012

by Tom Bittman  |  March 21, 2012  |  19 Comments

Server virtualization is a maturing but still very dynamic market, and there are several key trends reshaping server virtualization in 2012, and affecting the advice we are giving to our clients:

1) Competitive Choices Mature: Imagine how the virtualization market would be different if the server virtualization trend was starting today. VMware’s competition has greatly improved in the past few years, and price is becoming a big differentiator. Enterprises that have not yet started to virtualize (and they exist, but tend to be small) have real choices today.

2) Second Sourcing Grows: Existing VMware users may not be migrating away from VMware, but they’re concerned with costs and potential lock-in. A growing number of enterprises are pursuing a strategy of “second sourcing” – deploying a different virtualization technology in a separate part of the organization. Heterogeneous virtualization management is mostly aspirational, although there is interest.

3) Pricing Models in Flux: From expensive hypervisors to free hypervisors to core-based pricing and now memory-based entitlements – virtualization pricing has always been in flux, and trends toward private and hybrid cloud will ensure that virtualization pricing will continue to morph and challenge existing enterprise IT funding models.

4) Penetration and Saturation: Virtualization is hitting 50% penetration, and competition and new, smaller customers are driving down prices. The market is growing, but not like it used to, and vendor behavior will change significantly because of it. And don’t forget the impact on server vendors – the next few years will prove to be a challenge until virtualization growth slows down.

5) Cloud Service Providers Are Placing Bets: IaaS vendors can’t ignore the virtualization that is taking place in enterprises. Creating an on-ramp to their offerings is critical, which means placing bets – should they create their own standards (perhaps limiting their appeal), buy into the virtualization software used by enterprises (perhaps commoditizing themselves), or build/buy software that improves interoperability (which may or may not work well)? Not an easy choice, and winners and losers are being determined.

Maturity for the server virtualization market doesn’t mean that it stabilizes – if anything, trends in this market point to continued change and even some turmoil. Enterprises and vendors need to stay on their toes.

 


Category: Virtualization

Private Cloud and Hot Tubs

by Tom Bittman  |  February 28, 2012  |  3 Comments

Sitting in my hot tub with my wife in central Wisconsin in February made me think about private cloud computing. Why? I was perfectly comfortable where I was, but I knew that eventually I was going to need to get out and make a run across the deck through below-freezing temperatures. I was going to need speed and agility. Which, of course, is the same reason for implementing private cloud computing.

ROI isn’t always going to be in terms of reduced costs. Sometimes it’s about improved quality of service. Sometimes it’s about agility. Sometimes it’s about all three. IT tends to think about IT investments in terms of cost-recovery, which is a wonderful thing. But the biggest benefit of private cloud computing is not going to be lower costs. Yes, automation can eliminate rote manual tasks and save operational expenses, but automation isn’t free. The biggest benefit of private cloud computing is agility. As in, business agility. As in, an investment that helps IT’s customers do more, faster, experiment more often, ramp up, ramp down, beat the competition, grow the business.

The business case for private cloud really requires the business to be involved. If the business, for some reason, sees no value in speed and agility, private cloud is likely a wasted investment. I’ve seen examples of private clouds deployed by IT without business involvement, and then – surprise, surprise – no one used it. Cloud Fail. I’ve also had IT organizations come to me saying they weren’t going to build private cloud services because they couldn’t reduce IT costs in the process. That’s doing a disservice to business customers who might be willing to invest to improve IT services in certain areas.

The good news is that large IT organizations seem to get it. In a December 2011 poll at the Gartner Data Center Conference in Las Vegas, attendees were asked, “What is your main driver in moving to private clouds?” 59% said “agility” – only 21% said “cost”. These folks get it. Sometimes you need to improve the bottom line with lower costs. Sometimes you just have to get from the hot tub without getting frostbite. Or get inside and lock the door before your wife can make it there. Priceless.


Category: Agility, Cloud, Future of Infrastructure, Private Cloud

Mark Twain and the Open Virtualization Alliance

by Tom Bittman  |  May 19, 2011  |  1 Comment

On May 17, 2011, HP, IBM, Intel and Red Hat (as governing members) joined BMC Software, Eucalyptus Systems and SUSE to announce the “Open Virtualization Alliance”, or OVA (which means “eggs” in Latin, right?). Their stated purposes include “increase overall awareness” of KVM, “accelerate the emergence of an ecosystem” around KVM, and so on.

Sure, the server virtualization market is in dire need of good competition, no doubt about that. In fact, it needed competition ten years ago.

So what’s wrong with open source? Nothing! Xen was introduced in 2003, a mere two years after VMware introduced ESX Server. Xen is widely used – especially by service providers (such as Amazon’s EC2). Citrix XenServer and Oracle VM are based on open source Xen. Wait a minute – this alliance isn’t about Xen, it’s about KVM, right?

That’s concern number one. I have no issues with KVM – except that it’s very late to the market. What KVM is really, really good at is what was really interesting a few years ago, both to enterprises and service providers. What it isn’t so good at – ready-made, rich management and automation tools – is what customers need today (and service providers want to tap into an installed base of enterprise customers). So, “accelerating the emergence of an ecosystem” is, to me, a sad place to start today in a market that has been growing and evolving rapidly over the past ten years. Especially because this alliance helps to further fragment the open source response to VMware. Is VMware cheering this on?

No doubt, this little hypervisor concept has launched a huge trend toward infrastructure modernization and private and hybrid cloud computing. And HP and IBM have been somewhat on the outside looking in. Yes, they missed having a leadership role in a critical trend, and it is a dangerous one to miss, given its viral and mutating nature in all things infrastructure.

So, what do we make of OVA? Back to the egg reference – Mark Twain said “Noise proves nothing. Often a hen who has merely laid an egg cackles as if she laid an asteroid.”

Marketing and alliances and rhetorical use of “open” and “standard” all prove nothing. Let’s see some execution, some fire, some innovation. Show me a sense of urgency, some leadership. Not just about hypervisors and hypervisor ecosystems, and not just about catching up – but leaping ahead. Show me a rocket, and prove that there’s an asteroid out there.


Category: Cloud, Future of Infrastructure, Virtualization

How Cloud Computing Reboots the Channel

by Tom Bittman  |  April 7, 2011  |  9 Comments

In the past two months I’ve spoken to an audience of channel partners, had 6-7 lunch roundtables with channel partners in the U.S. and Canada, and met with a few channel partners in Europe. Two things are becoming increasingly clear to me: the channel will be critical to broader adoption of cloud computing (and private cloud), and the channel is not ready for this. The channel needs to be rebooted. Until it is, the midmarket, in particular, will leverage cloud computing in a slipshod, hit-or-miss manner. Likewise, channel partners who don’t reboot and adjust to the new reality (that more and more of the IT capabilities purchased by the midmarket will come from the cloud, and not through hardware and software sales) won’t survive for long.

I see three clear, broad opportunity areas for the channel with respect to cloud computing (I’m sure there are more):

(1) Assessments. Basic education. What is it, and what does it mean to a customer? What could leverage cloud computing, and what can’t? Where should an organization focus their cloud efforts? How do they get started? Private or public or both? The assessment helps put the channel partner into the decision-making process – rather than find themselves disintermediated and locked out.

(2) Transformation. Helping an organization (business and IT) change. Process change, management changes, organization and skills changes, culture, politics – this is a broad area, and one that goes beyond the skill base of most VARs and resellers. Application re-design fits here, too. And designing private cloud with hybrid in mind. Technology changes are easy; it’s everything else that is very, very hard.

(3) Broker. Assessments and transformation are large areas of opportunity, but once complete, the channel is no longer needed – unless they take on a broker and aggregation role. Most companies leveraging cloud computing will have several – perhaps many – providers. The channel has the opportunity to aggregate those services, provide value-add integration and other services, provide insurance, deal with failures, monitor SLAs, be a single throat to choke. The white box for cloud providers. For private cloud, the channel can smooth the way to hybrid cloud computing, and remain the broker in the equation.
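To make the broker and aggregation role concrete, here is a minimal sketch in Python. The Provider interface, the provider names and the SLA floor are hypothetical assumptions, not a real brokerage product – the idea is simply one front door across many providers, routing around any provider whose measured SLA has degraded:

    # Hypothetical cloud services brokerage facade; the interface, names
    # and SLA floor are illustrative assumptions only.

    from abc import ABC, abstractmethod

    class Provider(ABC):
        @abstractmethod
        def provision(self, service: str) -> str: ...
        @abstractmethod
        def uptime_pct(self) -> float: ...

    class Broker:
        """One front door (a 'single throat to choke') across providers."""

        def __init__(self, providers, sla_floor=99.9):
            self.providers = providers  # name -> Provider
            self.sla_floor = sla_floor

        def provision(self, service, preferred):
            # Try the preferred provider first; fall back if its SLA dropped.
            order = [preferred] + [n for n in self.providers if n != preferred]
            for name in order:
                if self.providers[name].uptime_pct() >= self.sla_floor:
                    return self.providers[name].provision(service)
            raise RuntimeError("no provider currently meets the SLA floor")

    class StubProvider(Provider):
        def __init__(self, uptime):
            self._uptime = uptime
        def provision(self, service):
            return f"{service} provisioned"
        def uptime_pct(self):
            return self._uptime

    broker = Broker({"provider-a": StubProvider(99.95),
                     "provider-b": StubProvider(99.5)})
    print(broker.provision("web-tier", preferred="provider-b"))  # routes to provider-a

The value-add is in the policy layer, not the plumbing – SLA monitoring, integration and failure handling are what keep the broker in the equation after the initial transformation work is done.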

Is the channel ready for any of this? No way! Are the provider and vendor business relationships with the channel making this easy? No way (vendors/providers are completely unclear whether they want to own the customer relationship or not)! Will the midmarket be able to adopt cloud computing in large scale without the channel? I don’t believe so. Cloud is simply too hard, too paradigm-shifting, too “cloudy”.

Time to start rebooting. Or watch part of the channel re-invent itself for cloud computing and leave the rest in the dust clouds.


Category: Cloud