by Tom Bittman | December 11, 2012 | 7 Comments
How is it possible that we have wildly divergent worldviews in a society connected by email, the Internet, Google, and Facebook? It’s because of email, the Internet, Google, and Facebook. And more importantly, the people who use these tools, and the virtual villages we build.
When Thomas Friedman argued that the world is flat – that connectivity and speed have converted the world into a level playing field with near-zero market friction – I agreed with him in theory. But in practice, especially in society, we have a very different reality – because a level playing field means little when the players are people, and they stay in their various corners.
The world may be potentially flat, but humanity is wildly divergent, local, locally-nurtured, and opinionated. I contend that connectivity is making things worse, more divergent, more radicalized, and not “flatter.”
Idealists saw the Internet and the connectivity it created as the great melting pot of ideas. We could have different opinions, but reach a consensus; different views of facts, but bad data would be self-correcting. The absolute opposite has occurred, because people are using the Internet. We can connect with anyone, and discuss any idea – but we don’t. People seek out like-minded people, and become more and more radicalized in their views. We are still villagers, but unlike the village of old, we get to define our own village, and we tend to build villages full of people who are just like us. Different views of facts create entire worldviews based on falsehood, or spin. Bad data propagates, mutates and spreads like wildfire. Topics like politics, religion, evolution, global warming, and race relations are becoming more and more polarized.
This is the “consumerization of truth.”
It’s certainly true that the Fourth Estate (i.e., the press) has evolved into something very different in the past few decades. Mass media has become more opinionated, and it is hard to identify a news source that isn’t labeled “liberal” or “conservative” anymore. But I think that’s only a part of the problem, and perhaps just a small part.
The Internet is a wonderful replication and transmission mechanism for memes (in the form of opinions, worldviews, “facts”). I believe that the consumerization of truth is much more about meme creation and replication on the Internet than about what a major news network says. More importantly, I believe worldviews are created not just by a single meme, but by a constant barrage.
Here’s an example of a single, common meme that originally replicated in email and, more recently, on Facebook – a photograph of a newspaper article. In the article, a man named “Tyrone” in Georgia stole a laptop, and as he was leaving the store, several marines selling Toys for Tots stopped the thief, with one of the marines (with an Irish name) getting wounded by a knife in the process. The police reported that Tyrone was injured by falling – many broken bones, broken nose, missing teeth, etc. – implying, of course, that the marines meted out their own justice.
In a recent post of this meme, 99% of those reading it (out of 400+) believed it was true and appropriate. Semper Fi. Only 1% realized that the article was a fake, based on a true story. The real bad guy was named Tracey (less “cultural”), not Tyrone. The marines didn’t beat up the guy, and in fact he was captured by both marines and store security. In other words, the meme tells a story of good-guy, white, John Wayne marines dealing with a bad black guy, and the police turning a blind eye on vigilantism.
By itself, this is only one false data point in someone’s worldview. But when you are inundated with stories and pictures with a similar theme, it has an effect – views on race relations, the military, appropriate justice – they all evolve, just a little. Before long, a person can be radicalized. Someone might even act on it, based on their own view of what is right and “normal.”
I think this also explains how the different camps in the 2012 U.S. presidential election had such completely different views on who would win the election. Certainly, mass media had an effect, but I think the virtual village effect was even more important – when your primary conversations about projections were with like-minded people, you became more and more convinced you were right. How else do you explain Karl Rove’s utter confusion when Fox News called Ohio for Obama?
There’s a parallel here with the consumerization of IT, where consumer devices are invading the workplace. In the good old days, the IT organization had control, and could erect security boundaries around all software and hardware connected to enterprise IT. Now users bring their own devices (BYOD), managed and unmanaged diversity are the rule, and security perimeters are shrinking. Enterprise IT has less control.
In consumerization of truth, there’s less control of information creation, replication and distribution. “The truth is out there,” but everywhere, and wildly divergent. We cannot “control” all of these truths. All we can really do is educate the consumers to be better at filtering and analyzing the flood of information coming their way. This starts with our education system. Unless we make a tremendous effort to arm our children to be smarter information consumers, email, the Internet, Google and Facebook will continue to divide and radicalize us. I fear for a world divided into polarized virtual villages, continuing to mutate their opposing worldviews.
But, let’s have some hope here. Maybe, just maybe, this is a generational problem. Maybe the generations born into this connected world will be smarter about navigating it. Maybe. But we can certainly help that along, can’t we?
Category: Education Tags: Education
by Tom Bittman | September 24, 2012 | 13 Comments
Just when you thought you were starting to understand cloud computing, and private cloud computing, here comes hybrid cloud!
Vendors are already flocking to the term – it means everything from remotely managed appliances to a mix of virtual and non-virtual servers to traditional applications using cloud services, and everything in between. So what is it?
Gartner defines a hybrid cloud service as a cloud computing service that is composed of some combination of private, public and community cloud services, from different service providers. A hybrid cloud service crosses isolation and provider boundaries, so it can’t simply be put in one category of private, public or community cloud service. This definition is intentionally loose: there really are a lot of interesting edge exceptions, and rather than draw a tight boundary around what is and isn’t hybrid, this gets to the central point of the matter well enough.
So why is hybrid cloud computing useful? It allows you to extend either the capacity or the capability of a cloud service, by aggregation, integration or customization with another cloud service. For example, there might be a community cloud service that needs to include data from public cloud services in its analysis – while retaining a certain amount of analytics or data privately. Or a private cloud service that needs to expand its capacity by extending temporarily into a public cloud service (or perhaps a somewhat private cloud service offered by a third party provider). It allows you to balance your privacy needs with additional capacity and capability needs.
The terms “overdrafting” and “cloudbursting” have been used to describe how a hybrid cloud service could be used for capacity, but they paint an extreme example. Hybrid cloud compositions can be static (designed to require multiple services), composed at deployment/usage time (e.g., choosing one service provider or another, or combining based on policies – see the sketch below), or composed dynamically (e.g., cloudbursting – or perhaps at disaster recovery time).
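To make the composition idea concrete, here is a minimal sketch of deploy-time composition: a placement policy that prefers private capacity and spills a workload over to a public provider when policy allows. The pool names, the Pool/Workload types and the policy rules are hypothetical illustrations, not any vendor’s API:

```python
# Minimal sketch of deploy-time hybrid composition. All names and the
# policy are illustrative assumptions, not a real cloud SDK.
from dataclasses import dataclass

@dataclass
class Pool:
    name: str
    kind: str           # "private", "public" or "community"
    free_capacity: int  # available VM slots

@dataclass
class Workload:
    name: str
    vms_needed: int
    data_sensitivity: str  # "restricted" or "general"

def place(workload: Workload, pools: list[Pool]) -> list[tuple[str, int]]:
    """Assign VMs to pools, private capacity first."""
    # Policy: restricted data never leaves private pools -- the isolation
    # side of the privacy/capacity balance described above.
    allowed = [p for p in pools
               if p.kind == "private" or workload.data_sensitivity == "general"]
    allowed.sort(key=lambda p: 0 if p.kind == "private" else 1)
    placements, remaining = [], workload.vms_needed
    for pool in allowed:
        take = min(pool.free_capacity, remaining)
        if take:
            placements.append((pool.name, take))
            remaining -= take
        if remaining == 0:
            return placements
    raise RuntimeError(f"insufficient capacity for {workload.name}")

pools = [Pool("corp-private", "private", 8), Pool("public-iaas", "public", 100)]
print(place(Workload("batch-analytics", 20, "general"), pools))
# -> [('corp-private', 8), ('public-iaas', 12)] -- the "overdraft" case
```

The same placement function, driven by a monitoring trigger instead of a deployment request, would be the dynamic (cloudbursting) case.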
While these compositions can be designed into services and/or cloud-based applications, they will often be managed by cloud services brokerages – the intermediary that Gartner expects to become a major phenomenon in the next few years (something like the system integrator of the cloud world). Large enterprises will often take on this role themselves – in fact, this is central to Gartner’s vision for the future of IT.
So what does all this mean now? It means look out – the world is not going to be neatly divided into separate private and public cloud services. To maximize efficiency and take advantage of publicly available cloud services, we’re going to munge them together. Private clouds, in particular, will not stay simply “private” for long – Gartner expects most private cloud services to become hybrid. Minding the gap is key – plan for it, balance isolation and value, leverage it. Hybrid will move from the realm of hype and vendor cloud-washing to reality in the next few years.
Category: Cloud Hybrid Cloud Private Cloud Tags: cloud computing, hybrid cloud computing, private cloud
by Tom Bittman | March 22, 2012 | 16 Comments
Private cloud computing continues to heat up, and there are several key trends defining private cloud computing in 2012:
1) Real Deployments: We’ll see about a 10X increase in private cloud deployments in 2012. Enterprises will find where private cloud makes sense, and where it’s completely over-hyped. We’ll see successes – and there will also be a number of failures (we’ve seen some already).
2) Hybrid Plans: According to polls, enterprises are already looking beyond private cloud to hybrid cloud computing (not cloud bursting, per se, but resource pool expansion). Interest in hybrid is affecting architecture plans and vendor selection today – but actual hybrid cloud usage is really rare right now.
3) Choices Expand: The cloud management platform market is very immature, but there are choices, and four distinct categories are forming up: a) virtualization platforms expanding “up”, b) traditional management vendors expanding “down”, c) open source-centered initiatives (most notably OpenStack), and d) start-ups often focused on Amazon interoperability (and note that Amazon just announced a tighter relationship with Eucalyptus Systems for exactly this).
4) Sourcing Alternatives: While on-premises private clouds are the most common, there’s a growing interest in private clouds managed by service providers – but with varying levels of “privacy”, and understanding those levels is critical.
5) Value is Shifting: Many enterprises have assumed that the primary benefit of private cloud is lower costs. That’s changing. According to recent polls, the majority of large enterprises consider speed and agility to be the primary benefit. This is making private cloud decisions more sophisticated, based more on understanding business requirements. Enterprises engaged in private cloud projects to reduce their costs will usually fail to meet objectives, as well as miss the mark on potential business benefits.
2012 will be the year that private cloud moves from market hype to many pilot and mainstream deployments. So much will be happening in 2012 that winners and losers in the vendor sweepstakes will probably be pretty clear by year-end 2012, and certainly by year-end 2013. Also, enterprises are rushing so fast that there will be casualties along the way. Staying on top of best practices and learning from early adopters is a must.
Category: Agility Cloud Private Cloud Tags: agility, cloud computing, private cloud
by Tom Bittman | March 21, 2012 | 16 Comments
Server virtualization is a maturing but still very dynamic market, and there are several key trends reshaping server virtualization in 2012, and affecting the advice we are giving to our clients:
1) Competitive Choices Mature: Imagine how the virtualization market would be different if the server virtualization trend was starting today. VMware’s competition has greatly improved in the past few years, and price is becoming a big differentiator. Enterprises that have not yet started to virtualize (and they exist, but tend to be small) have real choices today.
2) Second Sourcing Grows: Existing VMware users may not be migrating away from VMware, but they’re concerned with costs and potential lock-in. A growing number of enterprises are pursuing a strategy of “second sourcing” – deploying a different virtualization technology in a separate part of the organization. Heterogeneous virtualization management is mostly aspirational, although there is interest.
3) Pricing Models in Flux: From expensive hypervisors to free hypervisors to core-based pricing and now memory-based entitlements – virtualization pricing has always been in flux, and trends toward private and hybrid cloud will ensure that virtualization pricing will continue to morph and challenge existing enterprise IT funding models.
4) Penetration and Saturation: Virtualization is hitting 50% penetration, and competition and new, smaller customers are driving down prices. The market is growing, but not like it used to, and vendor behavior will change significantly because of it. And don’t forget the impact on server vendors – the next few years will prove to be a challenge until virtualization growth slows down.
5) Cloud Service Providers Are Placing Bets: IaaS vendors can’t ignore the virtualization that is taking place in enterprises. Creating an on-ramp to their offerings is critical, which means placing bets – should they create their own standards (perhaps limiting their appeal), buy into the virtualization software used by enterprises (perhaps commoditizing themselves), or build/buy software that improves interoperability (which may or may not work well)? Not an easy choice, and winners and losers are being determined now.
Maturity for the server virtualization market doesn’t mean that it stabilizes – if anything, trends in this market point to continued change and even some turmoil. Enterprises and vendors need to stay on their toes.
Category: Uncategorized Tags:
by Tom Bittman | February 28, 2012 | 3 Comments
Sitting in my hot tub with my wife in central Wisconsin in February made me think about private cloud computing. Why? I was perfectly comfortable where I was, but I knew that eventually I was going to need to get out and make a run across the deck through below-freezing temperatures. I was going to need speed and agility. Which, of course, is the same reason for implementing private cloud computing.
ROI isn’t always going to be in terms of reduced costs. Sometimes it’s about improved quality of service. Sometimes it’s about agility. Sometimes it’s about all three. IT tends to think about IT investments in terms of cost-recovery, which is a wonderful thing. But the biggest benefit of private cloud computing is not going to be lower costs. Yes, automation can eliminate rote manual tasks and save operational expenses, but automation isn’t free. The biggest benefit of private cloud computing is agility. As in, business agility. As in, an investment that helps IT’s customers do more, faster, experiment more often, ramp up, ramp down, beat the competition, grow the business.
The business case for private cloud really requires the business to be involved. If the business, for some reason, sees no value in speed and agility, private cloud is likely a wasted investment. I’ve seen examples of private clouds deployed by IT without business involvement, and then – surprise, surprise – no one used it. Cloud Fail. I’ve also had IT organizations come to me saying they weren’t going to build private cloud services because they couldn’t reduce IT costs in the process. That’s doing a disservice to business customers who might be willing to invest to improve IT services in certain areas.
The good news is large IT organizations seem to get it. In a December 2011 poll at the Gartner Data Center Conference in Las Vegas, attendees were asked, “What is your main driver in moving to private clouds?” 59% said “agility” – only 21% said “cost”. These folks get it. Sometimes you need to improve the bottom line with lower costs. Sometimes you just have to get from the hot tub without getting frostbite. Or get inside and lock the door before your wife can make it there. Priceless.
Category: Agility Cloud Future of Infrastructure Private Cloud Tags: agility, cloud computing, private cloud
by Tom Bittman | May 19, 2011 | 2 Comments
On May 17, 2011, HP, IBM, Intel and Red Hat (as governing members) joined BMC Software, Eucalyptus Systems and SUSE to announce the “Open Virtualization Alliance”, or OVA (which means “eggs” in Latin, right?). Their stated purposes include “increase overall awareness” of KVM, “accelerate the emergence of an ecosystem” around KVM, and so on.
Sure, the server virtualization market is in dire need of good competition, no doubt about that. In fact, it needed competition ten years ago.
So what’s wrong with open source? Nothing! Xen was introduced in 2003, a mere two years after VMware introduced ESX Server. Xen is widely used – especially by service providers (such as Amazon’s EC2). Citrix XenServer and Oracle VM are based on open source Xen. Wait a minute – this alliance isn’t about Xen, it’s about KVM, right?
That’s concern number one. I have no issues with KVM – except that it’s very late to the market. What KVM is really, really good at is what was really interesting a few years ago, both to enterprises and service providers. What it isn’t so good at – ready-made, rich management and automation tools – is what customers need today (and what service providers need to tap into an installed base of enterprise customers). So “accelerating the emergence of an ecosystem” is a sad place to start today, in a market that has been growing and evolving rapidly for the past ten years – especially because this alliance helps to further fragment the open source response to VMware. Is VMware cheering this on?
No doubt, this little hypervisor concept has launched a huge trend toward infrastructure modernization and private and hybrid cloud computing. And HP and IBM have been somewhat on the outside looking in. Yes, they missed having a leadership role in a critical trend, and it is a dangerous one to miss, given its viral and mutating nature in all things infrastructure.
So, what do we make of OVA? Back to the egg reference – Mark Twain said “Noise proves nothing. Often a hen who has merely laid an egg cackles as if she laid an asteroid.”
Marketing and alliances and rhetorical use of “open” and “standard” all prove nothing. Let’s see some execution, some fire, some innovation. Show me a sense of urgency, some leadership. Not just about hypervisors and hypervisor ecosystems, and not just about catching up – but leaping ahead. Show me a rocket, and prove that there’s an asteroid out there.
Category: Cloud Future of Infrastructure Virtualization Tags: cloud computing, private cloud, Virtualization
by Tom Bittman | April 7, 2011 | 9 Comments
In the past two months I’ve spoken to an audience of channel partners, had 6-7 lunch roundtables with channel partners in the U.S. and Canada, and met with a few channel partners in Europe. Two things are becoming increasingly clear to me: the channel will be critical to broader adoption of cloud computing (and private cloud), and the channel is not ready for this. The channel needs to be rebooted. Until it is, the midmarket, in particular, will leverage cloud computing in a slipshod, hit-or-miss manner. Likewise, channel partners who don’t reboot and adjust to the new reality (that more and more of the IT capabilities purchased by the midmarket will come from the cloud, not through hardware and software sales) won’t survive for long.
I see three clear, broad opportunity areas for the channel with respect to cloud computing (I’m sure there are more):
(1) Assessments. Basic education. What is it, and what does it mean to a customer? What could leverage cloud computing, and what can’t? Where should an organization focus its cloud efforts? How does it get started? Private or public or both? The assessment helps put the channel partner into the decision-making process – rather than leaving them disintermediated and locked out.
(2) Transformation. Helping an organization (business and IT) change. Process change, management changes, organization and skills changes, culture, politics – this is a broad area, and one that goes beyond the skill base of most VARs and resellers. Application re-design fits here, too. And designing private cloud with hybrid in mind. Technology changes are easy; it’s everything else that is very, very hard.
(3) Broker. Assessments and transformation are large areas of opportunity, but once complete, the channel is no longer needed – unless they take on a broker and aggregation role. Most companies leveraging cloud computing will have several – perhaps many – providers. The channel has the opportunity to aggregate those services, provide value-add integration and other services, provide insurance, deal with failures, monitor SLAs, be a single throat to choke. The white box for cloud providers. For private cloud, the channel can smooth the way to hybrid cloud computing, and remain the broker in the equation.
Is the channel ready for any of this? No way! Are the provider and vendor business relationships with the channel making this easy? No way (vendors/providers are completely unclear whether they want to own the customer relationship or not)! Will the midmarket be able to adopt cloud computing in large scale without the channel? I don’t believe so. Cloud is simply too hard, too paradigm-shifting, too “cloudy”.
Time to start rebooting. Or watch part of the channel re-invent itself for cloud computing and leave the rest in the dust clouds.
Category: Cloud Tags: cloud computing, private cloud, Virtualization
by Tom Bittman | April 6, 2011 | 13 Comments
(This blog post was written by my teenaged son, Danny)
I’ve been using the iPad in school since April 2010. Since I bought the iPad, I’ve probably gone through ten different note-taking apps, and four or five planners. So rather than dwell on each app and describe why I didn’t like them, I’ll just tell you my current system and along the way explain why the other apps didn’t work.
Right now I use a total of four apps in my everyday school work, so I’ll start with my notes apps. To take notes I use an app called UPAD. This app allows me to draw, type, highlight, change font sizes, add guidelines and so on. Using a feature that lets me write in a zoomed-in spot of the sheet while still seeing the whole piece of paper, I can do all of my math notes on the iPad without falling behind everyone else. Then, when a teacher gives us a word definition to write down, I can switch to typing mode by tapping one button, insert text wherever I want, and highlight anything important that I think I will need to study. When all is said and done, I send my notes to my “filing folder app,” which is a note-taking app itself, yet I use it as a filing folder because of its friendly interface. This app is called aNote (Awesome Note).

That’s basically what I had been doing with the iPad 1, but now with the iPad 2, I can do ALL of my work on the iPad using the cameras. UPAD lets you change the background image that you draw on, so when a teacher hands out a piece of paper or packet, I quickly snap a picture of all sides and set it as the background of my notes. If a teacher hands out a packet worksheet, I snap a picture of all sides and hand the worksheet right back to them. If the whole school were on this system, the teachers wouldn’t even need paper, because they could just email the documents to all the kids, so right as they walk into class, they would have the worksheet. When the kids finished, they could just email it right to the teacher, saving a good 5 to 10 minutes from passing out and collecting papers.
Now this filing cabinet app, aNote, is really something else. No matter what, when kids come into class they always have their papers; there is no possible way that they “lost” them, because it’s all saved on the iPad. But that’s just the short term. Due to the small file size of every note, you never have to delete notes or “throw out your papers / empty binders,” so when a quarter comes to an end, a student doesn’t have to scramble to collect their notes for tests. All they have to do is go to their classes tab, and there you go – the app even has a calendar where you can see your notes. Maybe you’re a senior in Spanish level 3 and you remember that you took extremely good notes on the command form of words sophomore year: all you would have to do is either jump to your sophomore Spanish folder and look, or just search “command form”.
The third app I use is Pages, Apple’s word processing app. This app allows me to work on all my essays in school, so if a teacher tells us what type of header to put on our essay, I can put it in right away. If our class goes to the computer lab to work, rather than waste (and I actually timed it) 14 minutes to turn on the computer, find your essay, then start up Word, I can just turn on my iPad and Pages will come up in less than 30 seconds. Also, don’t forget the fact that if all the kids had iPads, we wouldn’t need to take even more time out of class to walk to the computer labs.
The fourth app I use is called OmniFocus. Now there are plenty of planner apps out there that work just fine, like iStudiez Pro and iHomework, but OmniFocus beats everything by far. With OmniFocus I can create tasks, folders, projects, project start times, locations, contexts, anything. So I have a school folder, and inside that folder there are folders for every class. Then inside those are my projects: I have a homework project, which is where I put all my regular homework, and then I have my Project projects, like for lab reports and presentations. This way, instead of just writing down what’s due, I can enter every task I have to do to get a project done, and then order them sequentially to make sure I finish everything in a timely manner. Although this is great, a regular planner app would probably work best for most people, so I would suggest using iStudiez Pro, because you can add semesters, partners, teacher contacts, and much more.
Finally, there are a bunch of apps that I just use every now and then, but they save my back from carrying the extra weight – for example, iBooks. Most of the books we read in class are classics that are free in the iBooks store. So when my teacher hands me a copy of Macbeth, I can instantly go online and download a free copy of the book, which I can read, take notes on in the app, bookmark pages in, look up a sentence or word online, or even, without any type of online service, look up a word in the dictionary. Another app I use is Dictionary.com’s app, which gives you a 43MB full offline dictionary and thesaurus. My calculator app is also very useful. I’m sure there are others, but the apps above are the must-haves.
Prices (in addition to the iPad itself):
- UPAD = $4.99
- aNote = $4.99
- Pages = $9.99
- Calculator = FREE
- Dictionary = FREE
- iBooks = FREE
- iStudiez Pro = $2.99
Total cost = $23
Now think about all the pencils, binders, folders, papers, books, planners that people would have to buy in one year, and multiply that by the 4 years a kid is in high school. I’ve been collecting every piece of paper that my teacher gives me and bringing it home to put in a pile of papers to make a point. Without notebooks, that pile is already 8 inches tall from just my sophomore year. With 5 notebooks, that’s 11 inches tall of pure paper. If you do the math for the 1,700 kids in my school for 4 years, that’s about 1.2 miles of stacked paper!
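A quick sanity check of that estimate, in a few lines of Python, using the numbers above (11 inches of paper per student per year, 1,700 students, 4 years):

```python
# Sanity check on the paper-stack estimate from the post.
inches_per_student_per_year = 11   # 8" of handouts + 5 notebooks ~= 11"
students, years = 1700, 4
total_inches = inches_per_student_per_year * students * years
miles = total_inches / 12 / 5280   # 12 inches per foot, 5,280 feet per mile
print(f"{miles:.2f} miles of stacked paper")  # -> 1.18, about 1.2 miles
```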
So overall the iPad is extremely convenient, not a hassle, and I actually find it humorous to watch people walk down the hallway with these huge 20-pound backpacks, when sometimes I have nothing in mine because I carry my iPad in my hand. Then, when I get to my next class, there is no need for me to switch binders or file papers – I just sit down and take off the smart case.
Category: Education Uncategorized Tags: Education
by Tom Bittman | April 5, 2011 | 2 Comments
I’m a knowledge worker. I’m in Copenhagen, on business. My laptop is in Connecticut. And I’m OK with that.
Now let me preface this by saying as an analyst, I don’t cover client computing, or PCs or tablet computers. I’m writing this as Joe Knowledge Worker. Even so, I’m going to avoid using product brand names. I’m not promoting a specific product. But I am promoting a new way of getting things done.
I know I’m not the first to have this aha moment, and that’s a bit of a sore point with me. I still have a working 8080 system from the mid-1970s. I bought IBM’s first PC when it came out. I bought IBM’s first laptop computer – the PC Convertible – in 1986 (and yes, I still have it, and it still works). I jumped on the Palm Pilot as soon as it was available. I consider myself an early adopter. When it comes to tablet computers, however, my son is the early adopter and the pioneer. He’s been using his tablet computer in high school for a year now, and trying to convince me that it would work for me, too. I didn’t see it then, but I do now.
I tried it, on two business trips. On the first one, I pulled out the tablet computer and played with it a little. Still, I did most of my work on the laptop. On the second trip, my laptop battery died on a flight. I wrote a complete research note on the tablet. Suddenly, work was getting done – without a laptop.
I’m in love. I love the lo-ong battery life. I love the tactile user interface. I love the super-thin size and portability. These three are huge for a traveler.
There are trade-offs. A physical keyboard is helpful, but I’m finding that to be a non-issue, and possibly more of a rut than a need. A DVD player is nice to watch shows when away from home – but Netflix works just fine instead. A data warehouse on a hard disk is nice, but do I really need all of those files with me? Cloud storage works great when I’m connected – which is very often – and I have plenty of memory for offline files. Showing presentations? I have the adaptor, and it works perfectly.
I’m an inveterate planner and organizer. Spreadsheets and lists that used to live on my laptop don’t live there anymore. It’s all on the tablet. Frankly, at this point, there are only a few things that really require my laptop – and I’m working to reduce that, too.
So, I’m in Europe and away from the office for four days, and work has not stopped, and I’m not searching every airport for outlets to give my laptop a little more juice, and my backpack is extremely light (and probably unnecessary now), and I may actually do more “knowledge work” on my tablet computer on this trip than I would have with a laptop. And, of course, I’ve just posted my first blog entry from my tablet.
I’ve only had this device for about three weeks, but I suspect that bringing the laptop on trips will be the exception going forward. Not quite an early adopter – but I’m all in now.
Category: Cloud Education Future of Infrastructure Industry Analyst Tags: analyst, cloud computing
by Tom Bittman | February 11, 2011 | 5 Comments
Will virtualization, multicore, and cloud computing trends send x86 architecture server and processor volumes down for the next decade? It certainly is a realistic scenario – and perhaps the most likely.
At Gartner, we spend a lot of time trying to understand future scenarios, the likelihood of each, indicators that a scenario is likely to occur, impacts on our clients, and what our clients should do. We’ve studied the impact of virtualization on the server market since virtualization was first introduced <begin chest-thumping>and Gartner was the first firm to point out the negative ramifications of virtualization on server volumes<end chest-thumping>. But we’re getting to the moment of truth.
With the exception of the economic collapse in 2009, server volumes have been growing dependably for years. However, virtualization rates are hitting the point where the negative effect of virtualization on the server market is becoming unmistakable. Not in five years. Now.
2010 was a good year for servers – nearly 9 million were sold. My contention is that if virtualization didn’t exist, there would have been 13, or 14, or 15 million sold.
The engine of server market growth has been the growth of workloads. Since 2004, the compound annual growth rate (CAGR) in workloads has been about 16 percent. 2010 was certainly a much better year than that – but if you factor in the volume decline in 2009, the growth in 2010 just exactly made up the difference.
If the workload CAGR remains steady, server volumes will start to decline in 2011, and we won’t see 2010’s volumes again in this decade.
The good thing – virtualization (and cloud computing) makes it easier and faster to deploy a workload, and that has a tendency to increase the workload CAGR. However, even accounting for faster workload growth, 2010 is either at or near the peak of server volumes for the next ten years.
Moreover, if Moore’s Law is going to be driven by increasing numbers of cores, those cores are going to need VMs to leverage them. Multicore is going to drive higher virtualization densities, and even fewer servers.
What will it take to drive server volumes up? Low virtualization growth, high workload growth, low virtualization densities – a combination of factors that seems unlikely (the sketch below shows why).
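As a back-of-the-envelope illustration of how those factors interact, here is a toy projection. The round numbers are illustrative assumptions consistent with the figures above (16% workload CAGR), not Gartner’s actual model or data:

```python
# Toy model: servers needed when workloads grow at a 16% CAGR but
# virtualization penetration and VM density rise. Starting values are
# illustrative assumptions, not Gartner data.
workloads = 30_000_000   # assumed installed workloads in 2010
cagr = 0.16              # workload growth per year
penetration = 0.30       # assumed share of workloads virtualized in 2010
density = 5.0            # assumed VMs per physical host in 2010

for year in range(2010, 2016):
    hosts = workloads * penetration / density    # servers running hypervisors
    bare_metal = workloads * (1 - penetration)   # one workload per server
    print(f"{year}: {int(hosts + bare_metal):,} servers needed")
    workloads *= 1 + cagr
    penetration = min(0.80, penetration + 0.10)  # virtualization keeps spreading
    density = min(12.0, density + 1.0)           # multicore drives density up

# Even at 16% workload growth, server demand peaks around 2011 in this
# model, then declines once penetration and density outrun it.
```

Plug in low penetration growth, a higher workload CAGR or flat densities, and the curve turns up again – which is exactly why those are the indicators to watch.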
Bottom line – there are a number of realistic scenarios for server volumes in the next decade. Each scenario will drive different vendor behavior (and results), pricing, and end user strategies. But – anyone want to place a bet? I’m blogging it, so I’m placing mine right now.
Category: Cloud Virtualization Tags: cloud computing, servers, Virtualization