Thomas Bittman

A member of the Gartner Blog Network

Thomas J. Bittman
VP Distinguished Analyst
18 years at Gartner
29 years IT industry

Thomas Bittman is a vice president and distinguished analyst with Gartner Research. Mr. Bittman has led the industry in areas such as private cloud computing and virtualization. Mr. Bittman invented the term "real-time infrastructure," which has been adopted by major vendors and many…

How Cloud Computing Reboots the Channel

by Tom Bittman  |  April 7, 2011  |  9 Comments

In the past two months I’ve spoken to an audience of channel partners, had 6-7 lunch roundtables with channel partners in the U.S. and Canada, and met with a few channel partners in Europe. Two things are becoming increasingly clear to me: the channel will be critical to broader adoption of cloud computing (and private cloud), and the channel is not ready for that role. The channel needs to be rebooted. Until it is, the midmarket, in particular, will leverage cloud computing in a slipshod and hit-or-miss manner. Likewise, channel partners who don’t reboot and adjust to the new reality (that more and more of the IT capabilities purchased by the midmarket will come from the cloud, not through hardware and software sales) won’t survive for long.

I see three clear, broad opportunity areas for the channel with respect to cloud computing (I’m sure there are more):

(1) Assessments. Basic education. What is it, and what does it mean to a customer? What could leverage cloud computing, and what can’t? Where should an organization focus its cloud efforts? How does it get started? Private or public or both? The assessment helps put the channel partner into the decision-making process – rather than being disintermediated and locked out.

(2) Transformation. Helping an organization (business and IT) change. Process change, management changes, organization and skills changes, culture, politics – this is a broad area, and one that goes beyond the skill base of most VARs and resellers. Application re-design fits here, too. And designing private cloud with hybrid in mind. Technology changes are easy; it’s everything else that is very, very hard.

(3) Broker. Assessments and transformation are large areas of opportunity, but once those are complete, the channel is no longer needed – unless it takes on a broker and aggregation role. Most companies leveraging cloud computing will have several – perhaps many – providers. The channel has the opportunity to aggregate those services, provide value-add integration and other services, provide insurance, deal with failures, monitor SLAs, and be a single throat to choke. The white box for cloud providers. For private cloud, the channel can smooth the way to hybrid cloud computing, and remain the broker in the equation.

Is the channel ready for any of this? No way! Are the provider and vendor business relationships with the channel making this easy? No way (vendors and providers are completely unclear on whether they want to own the customer relationship or not)! Will the midmarket be able to adopt cloud computing at scale without the channel? I don’t believe so. Cloud is simply too hard, too paradigm-shifting, too “cloudy”.

Time to start rebooting. Or watch other channel partners re-invent themselves for cloud computing and leave the rest in the dust clouds.

Category: Cloud

Guest Blog: A Student’s iPad Experience

by Tom Bittman  |  April 6, 2011  |  14 Comments

(This blog post was written by my teenaged son, Danny)

I’ve been using the iPad in school since April 2010. Since I bought the iPad, I’ve probably gone through ten different note-taking apps, and four or five planners. So rather than dwell on each app and describe why I didn’t like them, I’ll just tell you my current system and along the way explain why the other apps didn’t work.

Right now I use a total of four apps in my everyday school work, so I’ll start with my notes apps. To take notes I use an app called UPAD. This app allows me to draw, type, highlight, change font sizes, add guidelines and so on. Using a feature that lets me write in a zoomed-in spot of the sheet while still seeing the whole piece of paper, I can do all of my math notes on the iPad without falling behind everyone else. Then, when a teacher gives us a word definition to write down, I can switch to typing mode by tapping one button, insert text wherever I want, and highlight anything important that I think I will need to study. When all is said and done, I send my notes to my “filing folder app,” which is a note-taking app itself, but I use it as a filing folder because of its friendly interface. This app is called aNote (Awesome Note).

This is basically what I had been doing with the iPad 1, but now with the iPad 2, I can do ALL of my work on the iPad using the cameras. UPAD lets you change the background image that you draw on, so when a teacher hands out a piece of paper or packet, I quickly snap a picture of all sides, set it as the background of my notes, and hand the worksheet right back to them. If the whole school were on this system, the teachers wouldn’t even need paper, because they could just email the documents to all the kids so that right as they walk into class, they have the worksheet. When the kids finish, they could email it right back to the teacher, saving a good 5 to 10 minutes of passing out and collecting papers.

Now this filing cabinet app, aNote, is really something else. No matter what, when kids come into class, they always have their papers; there is no possible way that they “lost” them, because it’s all saved on the iPad. But that’s just the short term. Because every note is such a small file, you never have to delete notes or “throw out your papers / empty binders,” so when a quarter comes to an end, a student doesn’t have to scramble to collect their notes for tests. All they have to do is go to their classes tab and there you go; the app even has a calendar where you can see your notes. Maybe you’re a senior in Spanish level 3 and you remember that you took extremely good notes on the command form of words sophomore year – all you would have to do is jump to your sophomore Spanish folder and look, or just search “command form”.

The third app I use is Pages, Apple’s word processing app. It lets me work on all my essays in school, so if a teacher tells us what type of header to put on our essay, I can put it in right away. If our class goes to the computer lab to work, rather than waste (and I actually timed it) 14 minutes turning on the computer, finding my essay, and starting up Word, I can just turn on my iPad and Pages will come up in less than 30 seconds. Also, don’t forget that if all the kids had iPads, we wouldn’t need to take even more time out of class to walk to the computer labs.

The fourth app I use is called OmniFocus. Now there are plenty of planner apps out there that work just fine, like iStudiez Pro and iHomework, but OmniFocus beats everything by far. With OmniFocus I can create tasks, folders, projects, project start times, locations, contexts, anything. So I have a school folder, and inside that folder there are folders for every class. Then inside those are my projects: I have a homework project, which is where I put all my regular homework, and then I have my Project projects, like lab reports and presentations. This way, instead of just writing down what’s due, I can create every task that I have to do to get a project done, and then order them sequentially to make sure I finish everything in a timely manner. Although this is great, a regular planner app would probably work best for everyone, so I would suggest iStudiez Pro, because you can add semesters, partners, teacher contacts, and much more.

Finally, there are a bunch of apps that I just use every now and then, but they save my back from carrying the extra weight – for example, iBooks. Most of the books we read in class are classics that are free in the iBooks store. So when my teacher hands me a copy of Macbeth, I can instantly go online and download a free copy of the book, which I can read, take notes on in the app, bookmark certain pages, look up a sentence or word online, or even, without any type of online service, look up a word in the dictionary. Another app I use is Dictionary.com’s app, which gives you a full 43MB offline dictionary and thesaurus. My calculator app is also very useful. I’m sure there are others, but the apps above are the must-haves.

Prices (in addition to the iPad itself):

- UPAD = $4.99
- aNote = $4.99
- Pages = $9.99
- Calculator = FREE
- Dictionary = FREE
- iBooks = FREE
- iStudiez Pro = $2.99

Total cost = $23

Now think about all the pencils, binders, folders, papers, books, planners that people would have to buy in one year, and multiply that by the 4 years a kid is in high school. I’ve been collecting every piece of paper that my teacher gives me and bringing it home to put in a pile of papers to make a point. Without notebooks, that pile is already 8 inches tall from just my sophomore year. With 5 notebooks, that’s 11 inches tall of pure paper. If you do the math for the 1,700 kids in my school for 4 years, that’s about 1.2 miles of stacked paper!
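
If you want to check that math, here’s a quick script (the app prices, stack heights and student count are just the figures above; everything else is unit conversion):

```python
# Quick sanity check of the numbers above (all inputs come from this post).
app_prices = [4.99, 4.99, 9.99, 0.00, 0.00, 0.00, 2.99]  # UPAD, aNote, Pages, Calculator, Dictionary, iBooks, iStudiez Pro
print(f"Total app cost: ${sum(app_prices):.2f}")          # ~$23

stack_per_student_in = 11    # inches of paper and notebooks for one school year
students = 1700
years = 4

total_inches = stack_per_student_in * students * years
miles = total_inches / 12 / 5280                          # inches -> feet -> miles
print(f"Stacked paper for the whole school: {miles:.1f} miles")  # ~1.2 miles
```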

So overall the iPad is extremely convenient, not a hassle, and I actually find it humorous to watch people walk down the hallway with these huge 20-pound backpacks when sometimes I have nothing in mine, because I carry my iPad in my hand. Then when I get to my next class, there is no need for me to switch my binders or file my papers; I just sit down and take off the smart case.

Category: Education, Uncategorized

Going Laptopless

by Tom Bittman  |  April 5, 2011  |  2 Comments

I’m a knowledge worker. I’m in Copenhagen, on business. My laptop is in Connecticut. And I’m OK with that.

Now let me preface this by saying that, as an analyst, I don’t cover client computing, PCs or tablet computers. I’m writing this as Joe Knowledge Worker. Even so, I’m going to avoid using product brand names. I’m not promoting a specific product. But I am promoting a new way of getting things done.

I know I’m not the first to have this aha moment, and that’s a bit of a sore point with me. I still have a working 8080 system from the early 1970s. I bought IBM’s first PC when it came out. I bought IBM’s first laptop computer – the PC Convertible – in 1986 (and yes, still have it and it still works). I jumped on the Palm Pilot as soon as it was available. I consider myself an early adopter. When it comes to tablet computers, however, my son is the early adopter and the pioneer. He’s been using his tablet computer in high school for a year now, and trying to convince me that it would work for me, too. I didn’t see it then, but I do now.

I tried it on two business trips. On the first, I pulled out the tablet computer and played a little with it, but still did most of my work on the laptop. On the second trip, my laptop battery died on a flight, and I wrote a complete research note on the tablet. Suddenly, work was getting done, and without a laptop.

I’m in love. I love the lo-ong battery life. I love the tactile user interface. I love the super-thin size and portability. These three are huge for a traveler.

There are trade-offs. A physical keyboard is helpful, but I’m finding that to be a non-issue, and possibly more of a rut than a need. A DVD player is nice to watch shows when away from home – but Netflix works just fine instead. A data warehouse on a hard disk is nice, but do I really need all of those files with me? Cloud storage works great when I’m connected – which is very often – and I have plenty of memory for offline files. Showing presentations? I have the adaptor, and it works perfectly.

I’m an inveterate planner and organizer. Spreadsheets and lists that used to live on my laptop don’t live there anymore. It’s all on the tablet. Frankly, at this point, there are only a few things that really require my laptop – and I’m working to reduce that, too.

So, I’m in Europe and away from the office for four days, and work has not stopped, and I’m not searching every airport for outlets to give my laptop a little more juice, and my backpack is extremely light (and probably unnecessary now), and I may actually do more “knowledge work” on my tablet computer on this trip than I would have with a laptop. And, of course, I’ve just posted my first blog entry from my tablet.

I’ve only had this device for about three weeks, but I suspect that bringing the laptop on trips will be the exception going forward. Not quite an early adopter – but I’m all in now.

Category: Cloud, Education, Future of Infrastructure, Industry Analyst

The End of Server Growth?

by Tom Bittman  |  February 11, 2011  |  5 Comments

Will virtualization, multicore, and cloud computing trends send x86 architecture server and processor volumes down for the next decade? It certainly is a realistic scenario – and perhaps the most likely.

At Gartner, we spend a lot of time trying to understand future scenarios, the likelihood of each, indicators that a scenario is likely to occur, impacts on our clients, and what our clients should do. We’ve studied the impact of virtualization on the server market since virtualization was first introduced <begin chest-thumping>and Gartner was the first firm to point out the negative ramifications of virtualization on server volumes<end chest-thumping>. But we’re getting to the moment of truth.

With the exception of the economic collapse in 2009, server volumes have been growing dependably for years. However, virtualization rates are hitting the point where the negative effect of virtualization on the server market is becoming unmistakable. Not in five years. Now.

2010 was a good year for servers – nearly 9 million were sold. My contention is that if virtualization didn’t exist, there would have been 13, or 14, or 15 million sold.

The engine of server market growth has been the growth of workloads. Since 2004, the compound annual growth rate (CAGR) in workloads has been about 16 percent. 2010 was certainly a much better year than that – but if you factor in the volume decline in 2009, the growth in 2010 exactly made up the difference.

If the workload CAGR remains steady, server volumes will start to decline in 2011, and we won’t see 2010’s volumes again in this decade.

The good thing – virtualization (and cloud computing) makes it easier and faster to deploy a workload, and that has a tendency to increase the workload CAGR. However, even accounting for faster workload growth, 2010 is either at or near the peak of server volumes for the next ten years.

However, if Moore’s Law is going to be driven by increasing numbers of cores, those cores are going to need VMs to leverage them. Multicore is going to drive higher virtualization densities, and even fewer servers.

What will it take to drive server volumes up? Low virtualization growth, high workload growth, low virtualization densities. A combination of factors that seems unlikely.
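
To make that interplay concrete, here’s a rough toy model (the ~16% workload CAGR and the 2010 numbers are from above; the virtualization penetration and VM density trajectories are purely illustrative assumptions, not forecasts):

```python
# Toy model: how rising virtualization penetration and VM density can push
# x86 server shipments down even while workloads keep growing ~16% a year.
# The penetration and density trajectories below are illustrative guesses.

workload_cagr = 0.16      # ~16% workload CAGR (from this post)
new_workloads = 14e6      # ~14M workloads deployed in 2010 (the "no virtualization" shipment estimate above)
virt_share = 0.40         # assumed share of new workloads deployed as VMs in 2010
vm_density = 8            # assumed average VMs per virtualized host

for year in range(2010, 2016):
    physical_servers = new_workloads * (1 - virt_share)      # one box per bare-metal workload
    virtual_hosts = new_workloads * virt_share / vm_density  # hosts needed for the VM share
    print(f"{year}: ~{(physical_servers + virtual_hosts) / 1e6:.1f}M servers shipped")
    # advance the trends: more workloads, higher penetration, denser hosts
    new_workloads *= 1 + workload_cagr
    virt_share = min(0.90, virt_share + 0.12)
    vm_density += 2
```

Plug in low virtualization growth, high workload growth and low densities instead, and shipments climb again – exactly the combination that seems unlikely.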

Bottom line – there are a number of realistic scenarios for server volumes in the next decade. Each scenario will drive different vendor behavior (and results), pricing, and end user strategies. But – anyone want to place a bet? I’m blogging it, so I’m placing mine right now.

Category: Cloud, Virtualization

Embracing the Blur

by Tom Bittman  |  February 9, 2011  |  4 Comments

We’re having an interesting discussion inside of Gartner (due credit to Neil MacDonald, Lydia Leong, Cameron Haight and David Cearley for the ideas in this post – I hope they post further on this). The concepts here aren’t new. For example, in 2004, I talked about “the walls coming down” between business, the data center and development. I wasn’t unique – others have discussed boundaries breaking down between different aspects of IT architecture for years. However, I’m not sure how many people are aware of how utterly pervasive this megatrend in IT really is, and how much it affects all of us. In a word, the megatrend is "blur." Think about it.

  • Whatever happened to the market where there were distinct servers, storage, and networks? Fabric is blurring that.
  • What the heck is an operating system any more, and what does it matter when I have a virtual pool of distributed resources I need to use?
  • Whatever happened to the boundary between consumer technology and enterprise technology? Consumerization of IT. And not just personal technology devices – some IT services are given away for free (and subsidized by advertising). Which leads to boundaries disappearing in business models.
  • Whatever happened to the boundary between outsourcing and insourcing? Now we have cloud computing: public, private, hybrid, and every other variation. Looking for a black and white definition of cloud computing? A waste of time – it’s gray!
  • What about ownership of intellectual property? Open source, community collaboration. Is it plagiarism if you add value to existing content? In a society of information, can you afford not to build on what’s already out there? What should 21st century students do?
  • What about the boundary between trusted enterprise data and untrusted data? Can we really afford to ignore any business information that might be useful? Isn’t it about what we do with the data, rather than whether the data is 100% trusted and owned by the enterprise? The boundaries of data used for business intelligence have been blown completely down. For that matter, we are entering a period of data overload – some we can trust, some we partially trust, some that is impartial, some that is partial. Successful people and businesses will be able to find value in that data. Unsuccessful people and businesses will drown in the data, or hide from it.
  • Whatever happened to the boundary between IT and the business? In some cases, being solidified in the form of services-orientation (e.g., cloud computing), in other cases, the boundary simply does not exist. How many business people can afford to be laggards in leveraging the latest IT capabilities? How many IT personnel can ignore business strategy?
  • What about the boundary between applications and operations – and security, for that matter? It used to be that developers threw their creations over the wall for operations to run, with a kiss “good luck”. New applications are being written based on operational models, with automated deployment/operations/optimization in mind. Security is being captured as policy that moves with the application.

Virtualization. Consumerization. Cloud. Instant connections and collaboration. I could go on.

An overall IT megatrend today is a complete and utter blurring of boundaries – which we could handle conceptually, but it directly affects people and market competition. It’s a lot harder to re-skill, re-organize, and react to partners that become competitors and competitors that become partners and partners who are also competitors depending on the situation.

If there is one “skill” that is critical for an enterprise to have, and for individuals to have who use and/or help deliver IT capabilities (which, by the way, is everyone) – it’s “agility.” If you depend on the predictability of competition, and the predictability of a job category, you’re not gonna make it. You or your company will become noncompetitive faster than you can say “blur.”

To use Neil MacDonald’s perfect phrase, success requires “Embracing the Blur.”

(By the way, Neil has pointed out an interesting book by Stan Davis, called – not surprisingly – “Blur.” I need to take a look!)

Category: Agility, Cloud, Education, Future of Infrastructure, Virtualization

Economies of Fail

by Tom Bittman  |  December 7, 2010  |  2 Comments

Interesting discussions here at Gartner’s Data Center Conference in Las Vegas. While discussing the importance of economies of scale to cloud providers, I pointed out that economies of scale are a double-edged sword.

While enterprises tend to provide many (often hundreds or even thousands of) IT services, cloud providers tend to have only one, or a handful, provided at huge scale. Standardization makes automation much easier, and certainly makes economies of large scale very attractive. But what happens when a “service” suffers a decline in demand? For an enterprise, diversification makes this much less of an issue – usually, a decline in one “service” will be made up by growth in another. The capital expense risk is real, but not huge. But what about a cloud provider that focuses on just that service?

Economies of fail.

Megaproviders in the cloud are not immune to economic declines, or changing demand. One of the benefits of cloud computing for end users is transferring their own capital risk to cloud providers. Doesn’t this sound an awful lot like the mortgage crisis in the U.S.?
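
A crude way to see the diversification point: simulate random demand swings hitting an enterprise with a broad portfolio of services versus a provider that bets everything on one. The numbers below are made up; the shape of the result is the point.

```python
# Toy illustration of diversification: random demand swings across many services
# largely cancel out, while a single-service provider absorbs them in full.
# Every number here is made up purely for illustration.
import random

random.seed(1)

def average_demand_change(n_services: int) -> float:
    """Average year-over-year demand change across n services, each swinging +/-30%."""
    return sum(random.uniform(-0.30, 0.30) for _ in range(n_services)) / n_services

trials = 10_000
enterprise = [average_demand_change(200) for _ in range(trials)]  # ~200 diverse IT services
provider = [average_demand_change(1) for _ in range(trials)]      # one service at huge scale

print(f"Worst year, diversified enterprise: {min(enterprise):+.0%}")
print(f"Worst year, single-service provider: {min(provider):+.0%}")
```

The diversified portfolio’s worst year is a shrug; the single-service provider’s worst year is an existential event.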

For cloud providers to be successful, they must protect themselves. As much as possible, they must find adjacent markets for their services that are not directly tied to their core service market – without abandoning the simplification and standardization that enables automation and economies of scale.

Potential customers of cloud providers should be very aware of a cloud provider’s business risk, and protect themselves. Cloud provider resiliency, market diversification and stability should be selection criteria. Remember: a provider cannot be too big to fail – in fact, some providers might become so big and so focused that failure is inevitable.

Category: Cloud

Virtualization Then & Now: Symposium 2009-2010

by Tom Bittman  |  October 18, 2010  |  19 Comments

My first presentation at Symposium 2010 was “Server Virtualization: From Virtual Machines to Private Clouds.” Attendance was crazy – the large room was packed, people were standing at the back, and apparently a few dozen were turned away at the door. This proves that server virtualization is not only a hot topic, it’s getting hotter right now (one stat I mentioned was that more virtual machines would be deployed during 2011 than in 2001 through 2009 combined).

I started the presentation with some fundamental changes in server virtualization since I presented a year ago.

1) Virtual machine penetration has increased 50% in the last year. We believe that nearly 30% of all workloads running on x86 architecture servers are now running on virtual machines.

2) Midsized enterprises rule. For the first time, the penetration of virtualization in midsized enterprises (100-999 employees) now exceeds that of the global 1000 (or it will before year-end). There has been a HUGE uptake in the last year. Also, unlike large enterprises, midsized enterprises tend to deploy all at once – with outside help.

3) Hyper-V is underperforming. Maybe my expectations were too high, but Hyper-V has not grabbed as much market share as I was predicting. I especially thought that Microsoft would be the big beneficiary of midmarket virtualization. Surveys show otherwise – VMware is doing pretty well there. Here’s a theory. Clients repeatedly told us that live migration was a big hole in Microsoft’s offering – even for midmarket customers (to reduce planned downtime managing the parent OS). Microsoft’s Hyper-V R2 (with live migration) came out in August 2009. Was that too late? Did the economy put pressure on midsized enterprises to virtualize early, before Hyper-V R2 was proven in the market? Or did VMware just have too much mindshare?

VMware’s competition is growing (especially Microsoft, Citrix and Oracle), but VMware is still capturing plenty of new customers.

4) Private clouds are the buzz. Every major vendor on the planet who sells infrastructure stuff has a private cloud story today. In the last year, the marketing, product announcements and acquisitions have been mind-numbing. Some of this is clearly cloudwashing (“old stuff, new name”), but we’ve seen a number of smart start-ups captured by big vendors, and important product rollouts (notably VMware’s vCloud Director). Now the question is – what will the market buy?

5) IaaS providers are shifting to commercial VMs. IaaS (infrastructure as a service) providers have focused on open source and internal technologies to deliver solutions at the lowest possible cost. But that’s changing. In the past year, there’s been a rapidly growing trend for IaaS providers to add support for major commercial VM formats – especially VMware, but also Hyper-V and XenServer. The reason? To create an easy on-ramp for enterprises. As enterprises virtualize (and in many cases, build private clouds), the IaaS providers know that they need to make interoperability, hybrid operation, overdrafting and migration as easy as possible. The question is whether that will require commercial offerings (such as VMware’s vCloud Datacenter Services, or Microsoft Dynamic Datacenter Alliance), or if conversion tools will be good enough. I tend to think that service providers had better make the off-premises experience as identical to the on-premises experience as possible – and I’m not sure conversion will get them there.

Category: Cloud, Virtualization

The Buzz at Gartner’s Symposium 2010: Cloud!

by Tom Bittman  |  October 18, 2010  |  1 Comment

Gartner’s Symposium this year is a blow-out – more than 7,500 attendees, and more than 1,600 CIOs. That means a very busy week of presentations and one-on-ones. As an analyst, what I always find interesting is “the buzz”. You get a really good sense of what’s hot based on one-on-one load, and one-on-one topics. I was one of a few analysts fully booked a few weeks before Symposium, so my topics are hot. The questions? Continued interest in virtualization, but shifting heavily to cloud computing, both private and public.

Because of presentations, roundtables and so forth, I only had 35 one-on-one slots available. Of those, 11 are on virtualization (mostly VMware and Microsoft), 9 are about cloud computing (mainly what’s ready, which services, which providers, customer experiences), and 14 are about private cloud (how do I start, VMware’s vCloud, etc.).

The sense I get so far is that interest in cloud computing continues to grow, but there is more real activity and near-term spending on private cloud solutions. There’s a lot of interest in VMware’s vCloud – but attendees want some proof first.

At the end of the week, I’ll summarize what I learned. Should be a great week!

Category: Cloud, Virtualization

IT Operations: From Day-Care to University

by Tom Bittman  |  May 24, 2010  |  2 Comments

After spending the day discussing IT operations, here are some musings on the future of IT ops.

Traditionally, IT ops has been responsible for managing operationally "dumb" applications. These legacy applications are like infants – they need constant care and feeding. They can’t take care of themselves, and they rely entirely on others to survive. Actually, these dumb applications are even less capable than infants – at least infants cry when they’re hungry!

IT operations today is like day-care. Every infant is different, has different needs, and signals those needs in different ways. There are hardly any economies of scale here at all. Not a lot that can be automated. And new infants are being added daily!

There are three major paths for IT operations in the future – and each of them is very different:

(1) The Day-Care for Clones: Limit IT operations to managing a single application (or a small number of them). Knowing exactly how these applications work allows you to custom-design IT operations and automation to their needs. This is what cloud providers typically do today, as do application-centric environments (around Oracle, for example).

(2) The Smart Day-Care: The effort for years has been to make the day-care smarter, more adaptive, more on-demand. This has been, and will continue to be, a huge challenge. One new concept has been the introduction of virtual machines, which can be used to encapsulate workloads – that doesn’t solve the problem, but it does enable more automation. Ideally, you still want metadata about what’s inside the virtual machine, describing service topology, security requirements, even service-level requirements (a sketch of what such a descriptor might look like follows below).

(3) The University: Expect more from the applications. They need to manage themselves, describe their requirements. They don’t "trust" infrastructure at all – if there are failures, the application is designed to be resilient and extremely self-reliant. On the other hand, IT operations still has a role. With "smart" applications, IT operations can’t necessarily trust them. The role of IT operations is to set constraints, manage the amount of resource that can be used, monitor behavior, look for changes in behavior.
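
As a sketch of what that kind of virtual machine metadata could look like (the field names and values here are hypothetical, not any particular vendor’s format):

```python
# Hypothetical metadata travelling with a virtual machine, describing what is
# inside so that operations tooling can automate placement and policy.
# Field names and values are illustrative only, not any vendor's format.
vm_metadata = {
    "workload": "order-entry-web-tier",
    "service_topology": {
        "tier": "web",
        "depends_on": ["order-entry-app-tier", "order-entry-db"],
    },
    "security": {
        "zone": "dmz",
        "data_classification": "internal",
    },
    "service_levels": {
        "availability": "99.9%",
        "max_latency_ms": 200,
    },
    "resources": {"vcpus": 2, "memory_gb": 8},
}

# A "smart day-care" layer could read this to drive automated decisions, e.g.:
if vm_metadata["security"]["zone"] == "dmz":
    print("Place this VM on hosts in the DMZ cluster")
```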

The issue in IT operations is that these three paths are each viable, but each has very different skill, architecture, process, and management tool requirements. This confusion will play out inside enterprise IT – managing a mixed bag of “dumb” applications, “smart” applications, virtual machines, private clouds, and public clouds. Get ready for a bumpy ride!

Category: Cloud, Future of Infrastructure

Clarifying Private Cloud Computing

by Tom Bittman  |  May 18, 2010  |  34 Comments

I continue to talk with clients who understand the concept of private cloud computing and think they know it when they see it, but can’t quite explain it in words. A year ago I described The Spectrum of Private to Public Cloud Services, but I didn’t put that in the form of a definition. Here’s a shot.

Gartner’s official definition of cloud computing is “A style of computing where scalable and elastic IT-enabled capabilities are delivered as a service to customers using Internet technologies.” We also describe five defining attributes of cloud computing: service-based, scalable and elastic, shared, metered by use, uses Internet technologies. A key to cloud computing is an opaque boundary between the customer and the provider. Graphically, that looks like this:

[Figure: cloud computing as an opaque boundary – the customer consumes the service, while the provider’s implementation sits behind the boundary]

When the customer does not see the implementation behind the boundary, and the provider doesn’t care who the customer is, you have a public cloud service. So what is private cloud?

Private cloud is “A form of cloud computing where service access is limited or the customer has some control/ownership of the service implementation.”

Graphically, that means that either the provider tunnels through that opaque boundary and limits service access (e.g., to a specific set of people, enterprise or enterprises), or the customer tunnels through that opaque boundary through ownership or control of the implementation (e.g., specifying implementation details, limiting hardware/software sharing). Note that control/ownership is not the same as setting service levels – these are specific to the implementation, and not even visible through the service.

[Figure: private cloud – either the provider limits who may access the service, or the customer reaches through the boundary to control or own the implementation]

The ultimate example would be enterprise IT, building a private cloud service used only by its enterprise. But there are many other examples, such as a virtual private cloud (the same as the example above, except replace ‘enterprise IT’ with ‘third-party provider’), and community clouds (the same as a virtual private cloud, except opened up to a specific and limited set of different enterprises).
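
Put another way, the definition boils down to a two-attribute test. Here’s a minimal sketch (the attribute names are mine, purely for illustration):

```python
# Minimal sketch of the definition above: a cloud service is "private" when
# access is limited or the customer has some control/ownership of the
# implementation; otherwise it is public. Attribute names are illustrative.
from dataclasses import dataclass

@dataclass
class CloudService:
    access_limited: bool          # provider restricts who may consume the service
    customer_controls_impl: bool  # customer specifies or owns implementation details

def classify(service: CloudService) -> str:
    if service.access_limited or service.customer_controls_impl:
        return "private cloud"
    return "public cloud"

print(classify(CloudService(access_limited=False, customer_controls_impl=False)))  # public cloud
print(classify(CloudService(access_limited=True, customer_controls_impl=False)))   # private cloud (limited access)
print(classify(CloudService(access_limited=False, customer_controls_impl=True)))   # private cloud (customer control)
```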

Still “foggy”, or is it “clear”?

Category: Cloud