Data Center Energy Efficiency Trends

June 2013
Roger Schmidt, IBM Fellow
Chief Engineer - Data Center Energy Efficiency
c28rrs@us.ibm.com
845-473-2929
© 2010 IBM Corporation
The Emerging Data Center
New data centers are not like old ones.
New data centers are designed around efficiency:
• In power utilization
• In space allocation
• In capital expenditures
Key Issues
• What emerging trends will have the greatest impact on data center design and operations?
• What are the critical design considerations and best practices in emerging data centers?
• How can efficiency and scalability be implemented in data centers while keeping cost reasonable?
Data Center Market Drivers and Trends
• Total cost of ownership and environmental footprint
• Servers/infrastructure used 332 TWh of electrical energy in 2012 (1.8% of global energy use)
• Carbon emissions
Energy Efficiency Trends
Energy and Green IT
• PUE = Total Data Center Power / IT Power (a computation sketch follows this list)
• PUE is the de facto standard; EPA and EU are getting involved
• Performance per kW is key, forcing a review of IT efficiency
• Intersection between facilities and IT: the power issue is moving to the top of the food chain
• Power consumption is becoming as critical as performance; corporate social responsibility is tightly linked
• Increased use of air-side and water-side economizers
• New construction and retrofits focus on efficiency and reuse
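As context for the PUE line above, a minimal sketch of the calculation; the facility and IT power figures in the example are hypothetical, not measurements from any site.

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total data center power divided by IT power."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# Hypothetical example: 1,500 kW total facility draw against a 1,000 kW IT load
print(pue(1500.0, 1000.0))  # 1.5, i.e. 0.5 W of overhead per watt of IT power
```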
Design Trends
Emerging Design Trends
• Build small, build often
• Build for density
• Scale vertically, then horizontally
• Build (and rebuild) pods
• Build density zones
• Consider multi-tiered designs
• Use free air, and reuse heat
• Design for the unknown
What is Containment?
Hot Aisle Containment
Cold Aisle Containment
Exhaust Chimney Containment
Containment Solutions
Data center containment can reduce energy use by as much as 30%.
Increase in Rack Level Liquid Cooling Solutions
[Diagrams, side and top views: rear door heat exchanger, overhead heat exchanger, and in-row heat exchanger.]
Increase in Data Center Designs with Economizers
[Schematic: building with a water-side economizer added to the data center cooling plant. The cooling tower and condenser water system (CWS) can serve the load directly through the economizer, allowing the chiller to be turned off; the chilled water system (CHWS) feeds the CRAC and CDU/rack load in the datacom equipment center.]
Use of Air Side Economizers in Japan, with Wider Temperature Limits for Hardware
• ASHRAE Class A2: 14% of Japan avoids using chillers all year
• ASHRAE Class A3: 91% of Japan avoids using chillers all year
Two Major Field Corrosion Problems
• Copper creep corrosion: copper corrodes to copper sulfide, which creeps across the circuit board surface, shorting adjacent closely spaced features on the PCB.
• Silver termination metallization corrosion: the silver termination metallization in surface-mount resistors corrodes, leading to open-circuited resistors.
IT Equipment Environment – Gaseous Contamination Monitoring
[Chart: contamination severity levels G1, G2, G3, GX]
ASHRAE whitepaper added silver coupons:
• Now updated to include silver and copper corrosion rates < 200 Å/month and < 300 Å/month, respectively (a monitoring check sketch follows).
• 1 angstrom (Å) is a unit of length equal to 0.1 nanometer, or 1 × 10⁻¹⁰ meters.
ASHRAE Whitepaper, "Gaseous and Particulate Contamination Guidelines for Data Centers" (Spanish and Chinese versions): http://tc99.ashraetcs.org
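A minimal sketch of the coupon check implied above, using only the two limits quoted on the slide (copper < 300 Å/month, silver < 200 Å/month); the example readings are hypothetical.

```python
COPPER_LIMIT_ANGSTROM_PER_MONTH = 300   # from the slide
SILVER_LIMIT_ANGSTROM_PER_MONTH = 200   # from the slide

def within_recommended_limits(copper_rate: float, silver_rate: float) -> bool:
    """True if both coupon corrosion rates (angstrom/month) meet the recommended limits."""
    return (copper_rate < COPPER_LIMIT_ANGSTROM_PER_MONTH
            and silver_rate < SILVER_LIMIT_ANGSTROM_PER_MONTH)

# Hypothetical monthly coupon readings
print(within_recommended_limits(copper_rate=250, silver_rate=150))  # True
print(within_recommended_limits(copper_rate=450, silver_rate=150))  # False
```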
Environmental Trends
Environmental Guidelines
[Schematic: where the environmental guidelines apply. Air cooled IT equipment: cooling tower, condenser water system (CWS), chiller, chilled water system (CHWS), CRAC, and CDU serving the rack load in the datacom equipment center. Liquid cooled IT equipment: cooling tower, CWS, chiller, CHWS, and CDU serving the rack load directly.]
IT Equipment Environment – Measurement at Inlet
Four Key Environmental Requirements
1. Inlet Air Temperature
2. Inlet Humidity
3. Inlet Particulate Contamination
4. Inlet Gaseous Contamination
AIR INLET to datacom equipment IS the important specification to meet.
OUTLET temperature is NOT important to datacom equipment.
Air Cooled IT Equipment – ASHRAE Psychrometric Chart
• New Classes A1 and A2 are EXACTLY the SAME as previous Classes 1 and 2.
• Classes A1 and A2 apply to new and legacy equipment.
• New Classes A3 and A4 do NOT include legacy equipment (allows more economizer hours).
• These are the cold aisle requirements!!
[Psychrometric chart; dry-bulb markings of 81, 90, 95, 104, and 113 °F correspond approximately to the 27 °C recommended limit and the 32/35/40/45 °C maximums of Classes A1–A4.]
Can I Eliminate My Chiller?
[Bar chart, US locations: number of locations still needing a chiller, shown for wet-bulb (water-side economizer) and dry-bulb (air-side economizer) limits at Typical 20 °C, A1 32 °C, A2 35 °C, A3 40 °C, and A4 45 °C, under two assumptions: an 8 °C and a 4 °C delta between outside air (WB or DB) and computer room air. The count drops from roughly 1,000 locations at the typical 20 °C limit to essentially zero at the A4 wet-bulb limit; a sketch of the underlying check follows.]
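A minimal sketch of the check behind the chart: with an assumed delta between outside air and computer room air, a location still needs a chiller if the resulting supply air would ever exceed the allowable inlet limit. The hourly temperatures below are hypothetical, not the weather data used for the chart.

```python
def needs_chiller(hourly_outdoor_c, approach_delta_c, allowable_inlet_c):
    """True if free cooling alone cannot hold the inlet limit for every hour."""
    return any(t + approach_delta_c > allowable_inlet_c for t in hourly_outdoor_c)

hourly_dry_bulb_c = [28.0, 31.5, 33.0, 35.5, 29.0]   # hypothetical extreme hours
print(needs_chiller(hourly_dry_bulb_c, 8.0, 35.0))    # True: exceeds the A2 limit
print(needs_chiller(hourly_dry_bulb_c, 4.0, 40.0))    # False: stays within the A3 limit
```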
dT Reduction at Increased Operating Temp
[Chart: effective component temperature ceilings of 54 °C, 58 °C, and 60 °C; as inlet temperature rises, the available temperature rise (dT) to these ceilings shrinks.]
Some Key Features of PureSystems
Rack
• ASHRAE Class A3 (40 °C max. temperature)
• Rack and chassis architected for higher speed and future growth
• Water cooled rear door can extract up to 30 kW
• Improved controls and energy efficiency
• Scalable power and cooling
• Easily serviced
• New end-user optimization features for cooling and power
Chassis
Overview of ASHRAE's "Power Trends & Cooling" Book
Chapter 1 – Introduction
Chapter 2 – Background
Chapter 3 – Component Power Trends
Chapter 4 – Load Trends and Their Application
Chapter 5 – Air Cooling of Computer Equipment
Chapter 6 – Liquid Cooling of Computer Equipment
Chapter 7 – Practical Application of Trends to Data Center Design
Appendices A–C
ASHRAE TC 9.9 website: www.tc99.ashraetcs.org
IT Load: ASHRAE's Volume Server Power Trends to 2020
Market requirements force IT manufacturers to maximize performance per unit volume, creating high heat loads per rack.
These rack heat loads will drive an increased focus on improved data center ventilation solutions and localized liquid cooling solutions.
IT Power Trends – Simple Adjustment Factor Example
A simple ADJUSTMENT FACTOR can be applied, based on YOUR data center's MEASURED load, to better reflect how the trend impacts YOUR data center's FUTURE loads. A quick example (a sketch follows this list):
– Trend chart value for a 1U, 2-socket volume server in 2010: 600 Watts
– ACTUAL MEASURED value for YOUR 1U, 2-socket server: 300 Watts
– Calculated adjustment factor for YOUR 1U, 2-socket server: 300 W / 600 W = 0.50
– A 0.50 adjustment factor applied to future trend chart values for a 1U, 2-socket volume server yields your projected future loads.
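A minimal sketch of the adjustment-factor arithmetic above; the 2010 trend value (600 W) and measured value (300 W) come from the slide, while the later trend-chart values are placeholders, not ASHRAE figures.

```python
trend_chart_w = {2010: 600, 2015: 750, 2020: 900}   # 2010 from the slide; later years hypothetical
measured_2010_w = 300                               # your measured 1U, 2-socket server

adjustment = measured_2010_w / trend_chart_w[2010]  # 300 / 600 = 0.50
projected_w = {year: watts * adjustment for year, watts in trend_chart_w.items()}

print(adjustment)    # 0.5
print(projected_w)   # {2010: 300.0, 2015: 375.0, 2020: 450.0}
```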
Water Cooling
Liquid Cooled IT Equipment
[Schematic: air cooled vs. liquid cooled datacom equipment centers. Air cooled: cooling tower, condenser water system (CWS), chiller, chilled water system (CHWS), CRAC, and CDU serving the rack load. Liquid cooled: cooling tower, CWS, chiller, CHWS, and CDU serving the rack load directly.]
Typical Data Centers – what really happens
[Figure: CRAC return air]
Most of today's existing data centers attempt to cool the IT equipment by flooding the air supply with as much cool air as possible. Precision air flow, as opposed to flooding the space, reduces the costly and unnecessary excess of cooling air and the power needed to produce it.
Liquid Cooling IT Equipment – ASHRAE 2011 Guidelines
[Schematic (new for 2011): building with cooling tower, chiller, and CDU serving the rack load in the datacom equipment center.]
Why Water Cooling (vs. Air Cooling)?
Water advantages
• Order of magnitude lower unit thermal resistance
• ~3,500x the heat carrying capacity of air (see the sketch after this list)
• Greater performance: total control of the flow
• Greater efficiency: lower temperatures, lower power (less leakage)
• Better quality, better reliability
Water disadvantages
• Added complexity
• Added cost (but not necessarily cost/performance)
• The perception of water cooling
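As a rough check of the "3,500x" figure above, a minimal sketch comparing the volumetric heat capacity (density times specific heat) of water and air; the property values are textbook approximations, not numbers from the slide.

```python
rho_water, cp_water = 998.0, 4182.0   # kg/m^3 and J/(kg*K) for water near 20 C
rho_air, cp_air = 1.16, 1006.0        # kg/m^3 and J/(kg*K) for air near 30 C

water_per_m3_k = rho_water * cp_water   # ~4.2e6 J/(m^3*K)
air_per_m3_k = rho_air * cp_air         # ~1.2e3 J/(m^3*K)

print(round(water_per_m3_k / air_per_m3_k))  # ~3600: water carries roughly 3,500x more heat per unit volume per kelvin
```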
Preferred RDHx Implementation
[Chart: heat removed, Q_load (W), vs. water inlet temperature (°C) for a given rack inlet air temperature (°C).]
• At ~45 °F (7 °C) chilled water, the water temperature is below the dew point, so pipes require insulation.
• With water temperature above the dew point, pipes and hoses do not need insulation (a dew point check sketch follows).
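A minimal sketch of the dew point comparison described above, using a Magnus-type approximation; the coefficients and the example room condition are illustrative assumptions, not values from the slide.

```python
import math

def dew_point_c(dry_bulb_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point (Magnus formula) for a given temperature and relative humidity."""
    a, b = 17.62, 243.12
    gamma = math.log(rel_humidity_pct / 100.0) + a * dry_bulb_c / (b + dry_bulb_c)
    return b * gamma / (a - gamma)

room_dew_point = dew_point_c(dry_bulb_c=24.0, rel_humidity_pct=50.0)   # ~12.9 C

for supply_c in (7.0, 18.0):
    status = "above" if supply_c > room_dew_point else "below"
    print(f"{supply_c:.0f} C supply water is {status} the {room_dew_point:.1f} C dew point")
# 7 C supply sits below the dew point (insulate the pipes); 18 C supply sits above it.
```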
Production Data Center with Rear Doors
Typical CRAC vs. IBM Rear Door
[Photos: a traditional CRAC unit and an IBM rear door heat exchanger]
90% energy savings vs. traditional CRAC units.
Leibniz Rechenzentrum, Garching, Germany
SuperMUC: Warm-Water Cooled 3 PFLOPS System
1Q12–2Q12: ~10,000 IBM System x iDataPlex water cooled dx360 M4 nodes
dx360 M4 – Water Cooling Design
• Hot water cooled node with 90% heat recovery.
• 5–7% power advantage over an air cooled node (due to lower CPU temperatures and the absence of fans).
• Water inlet 18 °C to 45 °C at 0.5 liters/min per node (37 liters/min per rack).
A 45 °C supply water temperature allows reduced chiller hours (or elimination of chillers) and use of a water-side economizer (a heat-removal sketch follows).
[Photo: iDataPlex rack with water cooled nodes]
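A minimal sketch of the heat carried away per node by the loop above; only the 0.5 liters/min flow rate comes from the slide, and the assumed water temperature rise across the node is illustrative.

```python
flow_l_per_min = 0.5          # per node, from the slide
delta_t_c = 10.0              # assumed water temperature rise across the node
rho_kg_per_l, cp_j_per_kg_k = 0.998, 4182.0   # water properties

mass_flow_kg_s = flow_l_per_min / 60.0 * rho_kg_per_l
q_watts = mass_flow_kg_s * cp_j_per_kg_k * delta_t_c
print(f"{q_watts:.0f} W removed per node at these assumptions")   # ~348 W
```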
Water Side Economization
[Schematic: wet cooler/tower and CDU feeding rack mounted heat exchangers in the compute room at 22 kW/rack. Temperatures noted include a 28 °C outdoor wet bulb, 30 °C and 32 °C equipment-side water supply, and 36.3 °C in the compute room.]
This simple water-side economized configuration provides proper water temperatures to the iDataPlex direct water cooled and/or rear door heat exchanger cooled racks throughout the year (an approach-temperature sketch follows).
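A minimal sketch of the approach-temperature reasoning behind the diagram: tower water leaves a few degrees above the outdoor wet bulb, and the CDU heat exchanger adds another approach before the equipment loop. Only the 28 °C wet bulb and the roughly 30/32 °C water temperatures appear on the slide; the approach values themselves are assumptions.

```python
wet_bulb_c = 28.0          # design outdoor wet bulb, from the slide
tower_approach_c = 2.0     # assumed wet cooler / tower approach
cdu_approach_c = 2.0       # assumed CDU heat exchanger approach

tower_water_c = wet_bulb_c + tower_approach_c          # ~30 C
equipment_supply_c = tower_water_c + cdu_approach_c    # ~32 C, consistent with the slide
print(tower_water_c, equipment_supply_c)
```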
Measuring and Monitoring
Measuring & Monitoring
[Schematic: measuring and monitoring applied across both plants. Air cooled IT equipment: cooling tower, condenser water system (CWS), chiller, chilled water system (CHWS), CRAC, and CDU serving the rack load. Liquid cooled IT equipment: cooling tower, CWS, chiller, CHWS, and CDU serving the rack load in the datacom equipment center.]
Final Thoughts
Datacom Facility Planning
Unfortunately, for many companies the planning process for the growth of datacom facilities, or the building of new datacom facilities, is NOT a well-documented process.
What we can state with confidence is:
– Each datacom facility is UNIQUE.
– Each company has a UNIQUE culture, value system, business plan, and risk tolerance.
– Each company utilizes different applications, resulting in a different set of hardware.
This in turn makes the characteristics of datacom facilities VARY QUITE DRAMATICALLY.
Datacom Facility Planning
The hardware that makes up the datacom facility should not be the initial focus for planning a datacom facility.
Although the hardware physically occupies the space on the datacom facility floor, the software does all the work. Therefore, the planning should begin with an understanding of the business's goals, both now and in the future.
Application capacity drives hardware acquisition, which in turn drives the following requirements:
– Space allocation over time
– Total IT equipment (storage and servers) power over time
– IT equipment utilization over time
– Total images over time
– Number of assets and average age
The interrelationships of the other elements that go into the plan for datacom facility floor space must be understood. The following is an example of how to plan for growth.
Facility Space Utilization Metrics
IT Load Power Consumption History – Historical UPS Load
YE2008: 25,612 kW
YE2009: 26,989 kW
YE2010: 30,081 kW
YE2011: 31,396 kW
YE2012: 31,143 kW
YTD 2013: 31,218 kW
(A growth-rate sketch follows.)
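A minimal sketch of a growth-rate calculation on the UPS load history above; the figures come from the table, and the projection horizon is purely illustrative.

```python
ups_load_kw = {2008: 25_612, 2009: 26_989, 2010: 30_081, 2011: 31_396, 2012: 31_143}

years = 2012 - 2008
cagr = (ups_load_kw[2012] / ups_load_kw[2008]) ** (1 / years) - 1

print(f"CAGR 2008-2012: {cagr:.1%}")                                    # ~5.0% per year
print(f"Naive 2015 projection: {ups_load_kw[2012] * (1 + cagr) ** 3:,.0f} kW")
```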
Q&A