by Dave Cappuccio | March 30, 2012 | Comments Off on Steps to Ease Data Center Cooling – Number 4
Cooling within the data center has become our Achilles' heel in many cases. Historically, the folks in IT had virtually nothing to do with heat or cooling management; it was strictly under the purview of the facilities team (after all, if it wasn't IT gear, it didn't count). In today's world, though, the IT team has to get involved, since they are the ones who need to live with (and fix) the problem.
Well, the good news is that in most older data centers (older being 10+ years), there is plenty of low-hanging fruit to choose from when deciding which project to undertake in order to create a more efficient cooling environment within the data center.
In this series of posts I'll posit 10 easy steps you can take to solve, or at least mitigate, the cooling issue at your site.
7. Shut down CRACs
It is often the case that a data center has too much cooling rather than too little. Many companies find themselves with a sizeable data center cooled to a consistently low temperature across the floor space, even when some of that floor space is empty or contains equipment that needs minimal cooling. In these situations, especially in older data centers, the solution can be as simple as physically shutting down some CRACs. This technique is often overlooked by IT for a couple of reasons. First, responsibility for infrastructure equipment like CRACs falls under the purview of the facilities team, so IT staff rarely think about CRAC efficiency. Second, especially on older equipment, the system has often been set to a standard fan speed (frequently high) and left that way as standard operating procedure. These fans are in many cases the most energy-hungry devices on the data center floor, so any opportunity to either moderate them (see item 2) or shut them down should be taken.
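To see why moderating fan speed pays off so quickly, recall the fan affinity laws: power draw varies roughly with the cube of fan speed. The sketch below illustrates the math; the 80% speed figure is an illustrative assumption, not a recommendation from the post.

```python
# Fan affinity law sketch: power varies with the cube of fan speed.
# The 80% speed setting below is an illustrative assumption only.

def fan_power_fraction(speed_fraction: float) -> float:
    """Fraction of full-speed power drawn at a given fraction of full speed."""
    return speed_fraction ** 3

# A CRAC fan slowed from 100% to 80% of full speed:
fraction = fan_power_fraction(0.80)   # 0.8^3 = 0.512
savings = 1 - fraction

print(f"Running at 80% speed draws {fraction:.0%} of full power")
print(f"That is roughly a {savings:.0%} energy saving per fan")
```

In other words, a modest 20% reduction in fan speed cuts that fan's energy use nearly in half, which is why variable-speed operation (or shutting units down entirely) is such low-hanging fruit.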
8. Shrink Floor-space
Companies that have experienced M&A activity, or that have adopted newer (smaller) server and storage technologies, often find themselves with more floor space than they actually need. In many cases IT views this space as a value-add, since it provides room for growth in the years to come. However, this excess space also needs to be conditioned, and it is often kept at the same temperature as the rest of the floor since it's all one contiguous space. In the past few years we have seen an increasing trend toward shutting down this space and freeing it up for other uses. By walling off excess space, IT can reduce monthly operating costs (through lower energy use) while freeing up potential office space or IT work areas, or even releasing leased space. In situations where the asset is owned and IT isn't sure how much growth to expect over time, temporary moveable walls may be a viable alternative. With either method, the objective is to reduce the conditioned IT space to what is absolutely needed for the next few years, rather than keeping all available space just because it's there.
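A back-of-the-envelope calculation shows how the savings from walling off excess space add up. Every figure below (cooling load per square foot, electricity price, area reclaimed) is a hypothetical assumption chosen for illustration; substitute your own site's numbers.

```python
# Rough annual cooling-cost sketch for shrinking conditioned floor space.
# All figures here are hypothetical assumptions, not measured site data.

W_PER_SQFT = 5.0        # assumed average cooling load of lightly used space (watts/sq ft)
PRICE_PER_KWH = 0.10    # assumed electricity price, USD per kWh
HOURS_PER_YEAR = 8760

def annual_cooling_cost(sqft: float) -> float:
    """Approximate annual cost (USD) to condition the given floor area."""
    kw = sqft * W_PER_SQFT / 1000          # total load in kilowatts
    return kw * HOURS_PER_YEAR * PRICE_PER_KWH

# Walling off 2,000 sq ft of mostly empty floor:
print(f"Approximate yearly saving: ${annual_cooling_cost(2000):,.0f}")
```

Even at these conservative assumed rates, reclaiming a modest slice of empty floor avoids thousands of dollars a year in conditioning costs, before counting the value of the space itself.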