by Dave Cappuccio | June 7, 2012 | Comments Off on The Case for the Infinite Data Center
When planning a new data center, the question of how much space will be needed is potentially the most difficult to answer. Yet the answer is often among the quickest made, with the least amount of analysis, and I would suggest it is rarely correct: in most cases the final size is far larger than what is actually needed.
The first mistake many people make is to base their estimates on what they currently have, extrapolating future space needs from historical growth patterns. It sounds like a logical approach, but there are two fundamental problems: the first is the assumption that the floor space currently used is being used properly, and the second is a two-dimensional view, the assumption that usable space is a horizontal construct rather than a combination of horizontal and vertical space.
Many times I have seen data center managers or facilities teams start with the following assumption: we are out of (or near) capacity in our data center, therefore when we build next we will need more space. If we have 5,000 square feet today, we must need at least 7,500 or more to sustain our growth. The error is that the focus is on square footage, not compute capacity per square foot.
By looking at compute capacity as the metric, things begin to change rather quickly. As an example, let's take a typical environment of 40 server racks. In a high percentage of data centers today these racks would be populated with servers one or two generations old, depending on corporate refresh cycles, and the average server would be a standard 2U height. The racks would rarely be nearing physical capacity but might actually be maxed out in logical capacity due to power or cooling constraints at the rack level (the mantra to avoid creating hot spots in data centers has actually made floor and rack use a lot less efficient).
Given a 60% load capacity on average (again, to avoid hot spots), our example would yield an average of 13 physical servers per rack (assuming 42U racks) and 520 physical servers in total. At 30 square feet per rack (which includes aisle ways, door swing space, etc.), the 40 racks would require 1,200 square feet of floor space.
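That baseline arithmetic is easy to sketch in a few lines; the constants below are simply the assumptions stated above:

```python
# Baseline scenario from the example above: 42U racks, 2U servers,
# 60% usable rack load to avoid hot spots, 30 sq ft per rack
# (including aisle ways and door swing space).
RACK_HEIGHT_U = 42
SERVER_HEIGHT_U = 2
LOAD_FACTOR = 0.60
SQ_FT_PER_RACK = 30
RACKS = 40

servers_per_rack = round(RACK_HEIGHT_U * LOAD_FACTOR / SERVER_HEIGHT_U)  # 13
total_servers = servers_per_rack * RACKS                                 # 520
floor_space_sq_ft = RACKS * SQ_FT_PER_RACK                               # 1,200
```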
So how big should the next data center be? If we assume 15% CAGR as an average growth target, in 10 years our small IT room would need to support at least 160 racks with over 2,000 physical servers, and would require almost 5,000 square feet of floor space.
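The ten-year projection is just compound growth applied to the baseline numbers; a rough sketch of the "spreadsheet math":

```python
def project(value: float, cagr: float, years: int) -> float:
    """Compound annual growth: value * (1 + cagr) ** years."""
    return value * (1 + cagr) ** years

GROWTH = 0.15  # 15% CAGR, the average growth target assumed above

racks_needed = project(40, GROWTH, 10)       # ~162 racks
servers_needed = project(520, GROWTH, 10)    # ~2,104 servers
floor_needed_sq_ft = project(1200, GROWTH, 10)  # ~4,855 sq ft
```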
But what if we thought both vertically and horizontally? The above assumes things stay status quo: I acquire the same type of equipment and apply the same configuration policies throughout. But let's assume whatever floor size you design was created to allow full use of rack space without the fear of hot spots (and there are many ways to do this, though often at considerable expense). Taking the same 40 racks, pushing each to 90% capacity on average (leaving some room for switches, etc.) and upgrading the existing server base to 1U servers over the next two years would support 1,520 physical servers.
So a data center of the exact same size, containing the same 40 racks, with the proper design, would support 15% growth every year for nearly eight more years. Now the question becomes: do we build it bigger to support the original target of 2,000 servers, or will a future technology refresh within those eight years double our capacity yet again?
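Working the redesigned scenario backwards shows how much runway it buys, using the same assumptions as above (42U racks at 90% use, 1U servers, 15% annual growth from today's 520 servers):

```python
import math

# Redesigned room: 38 usable 1U slots per rack (90% of 42U), 40 racks.
capacity = round(42 * 0.90) * 40  # 1,520 servers

# How many years of 15% growth can 1,520 servers absorb, starting from 520?
# Solve 520 * 1.15**n = 1520 for n.
years_of_runway = math.log(capacity / 520) / math.log(1.15)  # ~7.7 years
```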
Doing some simple spreadsheet exercises and asking these "what if" questions can yield some startling results when it comes to capacity estimates. And the logic works for storage as well as servers, as each device category continues to decrease in size, improve in capacity and performance, and reduce its power consumption per unit of work with each new generation.
If we look at these performance and density trends and assume the curve will continue, even at a much slower pace, it becomes clear that even small data center environments can sustain significant growth rates (well in excess of 20% CAGR) while maintaining the exact same footprint over the next 15 to 20 years.
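As a purely illustrative what-if (the refresh cadence and per-refresh density gain here are my own assumptions, not data from the examples above), you can test whether a fixed footprint keeps pace when per-rack density roughly doubles with each hardware refresh:

```python
def footprint_holds(years: int, demand_cagr: float = 0.20,
                    refresh_years: int = 3, density_gain: float = 2.0) -> bool:
    """True if cumulative density gains keep up with cumulative demand growth.

    Illustrative assumptions: demand compounds at demand_cagr per year,
    and rack density multiplies by density_gain at each completed refresh.
    """
    demand_mult = (1 + demand_cagr) ** years                 # demand growth
    density_mult = density_gain ** (years // refresh_years)  # completed refreshes
    return density_mult >= demand_mult

# At 20% CAGR demand, density doubling every 3 years covers 15-20 years:
footprint_holds(15)  # demand ~15.4x vs density 32x
footprint_holds(20)  # demand ~38.3x vs density 64x
```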
Food for thought; and, as an aside, food for thought as well when contemplating the life cycle of a container-based data center.