Adaptive Architecture - Cooling XD

The Future of Cooling High Density Equipment

Steve Madara
Vice President and General Manager, Liebert Precision Cooling Business
Emerson Network Power

2007 IBM Power and Cooling Technology Symposium
© Copyright 2007 Liebert Corporation. All rights reserved.
Agenda
• Issues facing the data center
  – Managing the explosive growth
  – Power consumption getting the attention of governments
• Developing the right cooling strategy for today and into the future
• Examples of how the right cooling strategy can lower the Total Cost of Ownership
• Myth – high density computing is more costly
Managing High Density Servers

Rack density trend when fully populated with the newest server technology:

Year | Rack Configuration          | Heat Load
2000 | 28 x 2U servers             | 2 kW
2002 | 42 x 1U servers             | 6 kW
2006 | 6 blade centers             | 24 kW
2009 | Blade servers (projected)   | 40 kW, heading toward 50 kW

The slide contrasts this with how most sites are actually dealing with server density: racks left only partially populated.
Progression to High Density

The average server replacement cycle is 3-4 years.

[Chart: rack kW rising with each replacement cycle – 1 kW, 2 kW, 5 kW, 10 kW, 15 kW, 20 kW]

Issues facing the IT Manager
• Getting air out of the racks
• Hot air mixing with the inlet of other racks
• Diversity of loads in the data center
• Not being aware that more fans create heat
• Flexibility of "on-demand cooling"
DCUG Survey Results – What Are the Biggest Issues Facing the IT Manager?

Heat density (cooling) – 22%
Space constraints / growth – 19%
Power density – 18%
Adequate monitoring capabilities – 8%
Availability (uptime) – 7%
Technology changes / change – 7%
Energy costs / equipment efficiency – 5%
Other – 4%
Security (physical or virtual) – 3%
Data center consolidations – 2%
Data storage – 2%
Hardware reliability – 1%
Regulatory compliance – 1%
Staffing / training limitations – 0%
Cooling Presents an Opportunity for Energy Savings

Data Center Power Draws (Source: EYP Mission Critical Facilities Inc., New York):
– IT equipment: 50%
– Cooling: 25%
– Air movement: 12%
– Electricity transformer / UPS: 10%
– Lighting, etc.: 3%
Cooling-related draws (cooling plus air movement) are about 37-45% of the total.

Sources of Energy Waste
• Fans / blowers running on redundant units
• Lack of air containment (cable openings, room leakage)
• Unnecessary cooling unit cycling on and off
• Lack of humidification control between units
• Mixing of hot and cold air, lowering the effectiveness of the cooling unit
• Excess fan energy that turns into heat
US EPA Report on Data Center Efficiency (Public Law 109-431)
• Energy consumed by servers and data centers has doubled in the last 6 years and is expected to double again in the next 5 years, to more than 100 billion kWh
• State-of-the-art technologies and best management practices could reduce electricity use by up to 55%
• Recommendations include:
  – Standardized performance measurements for data centers and equipment
  – Incentive programs
  – Research and development
  – Industry collaboration and partnerships
www.energystar.gov/ia/partners/prod_development/downloads/EPA_Datacenter_Report_Congress_Final1.pdf
Understanding the Basics
• Heat generated is directly related to the server power (100% of it must be removed)
• As server power (kW) increases, the airflow (CFM) through the rack increases proportionally
• Raised floor tiles are limited in airflow (about 500-1000 CFM)
• Higher entering air temperatures on a cooling unit provide more capacity and increased efficiency
• Higher density servers will have a greater range of temperatures leaving the rack over time (larger swings of server load)
• Fan horsepower to move the air is significant, and all of that power turns into heat (a 100 kW cooling unit typically uses a 10 HP motor that generates 8.5 kW of heat)
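These numbers can be sanity-checked with the standard sensible-heat rule of thumb (Q[BTU/hr] ≈ 1.08 × CFM × ΔT°F). A minimal sketch, using textbook air-side constants and an assumed ~88% fan-motor efficiency (neither figure is from the slides):

```python
# Back-of-the-envelope checks on the airflow and fan-heat bullets above.
# Assumptions (not from the slides): standard sea-level air, the usual
# rule of thumb Q[BTU/hr] = 1.08 * CFM * dT[F], and ~88% fan-motor efficiency.

HP_TO_KW = 0.7457          # 1 horsepower in kW
BTU_PER_HR_PER_KW = 3412   # 1 kW in BTU/hr

def required_cfm(heat_kw: float, delta_t_f: float = 20.0) -> float:
    """Airflow needed to carry heat_kw away at a given air temperature rise."""
    return heat_kw * BTU_PER_HR_PER_KW / (1.08 * delta_t_f)

def fan_heat_kw(motor_hp: float, motor_efficiency: float = 0.88) -> float:
    """Electrical input to the fan motor; essentially all of it becomes heat."""
    return motor_hp * HP_TO_KW / motor_efficiency

print(f"24 kW blade rack needs ~{required_cfm(24):,.0f} CFM")
# ~3,800 CFM, i.e. four to eight 500-1000 CFM floor tiles for one rack
print(f"10 HP cooling-unit fan adds ~{fan_heat_kw(10):.1f} kW of heat")  # ~8.5 kW
```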
Having the Right High Density Cooling Strategy Delivers...
• Cooling for high density racks
  – Increase servers per rack
  – Increase number of racks per room
• Energy efficiency
  – Lower operating costs
  – More power allocated for IT / server loads
Get more out of your existing facility.
Planning for High Density Requires a Systems Approach to the Cooling System

Use traditional floor-mount cooling through the first 100-150 W/sq ft (or 4-5 kW per rack) and supplemental cooling above that level:
– Traditional floor-mount units control humidity and filtration and create the base airflow distribution.
– Supplemental cooling provides high density sensible cooling at the source.

[Chart: rack loads from 0 to 40 kW per rack, with traditional floor-mount cooling covering roughly the first 5 kW and supplemental cooling covering the load above that]
Critical Requirements of the Base Cooling Load Equipment
• Cooling units with variable cooling capacity
  – DX compressors such as the Digital Scroll
  – VFDs on the fans
• Controls so that units can work as a team
  – Eliminate dehumidification / humidification fighting
  – Balance the load
  – Optimize the cooling performance
• High efficiency condensers
• "Green refrigerant" products
Moving the Heat

Heat path: Chip → Heat sink → Server (memory, other components, network devices, power supplies) → Rack → Room → CRAH → Chiller → Heat rejection

Areas for improvement:
• Reduced server fan power
• Higher temperatures over cooling coils (room temperature / less mixing of hot and cold air)
• Reduced resistances lower fan power
Cooling Solutions to Meet the Higher Density Requirements
• Requires moving the cooling closer to the heat source to more precisely cool the specific load, rather than wasting energy "brute force" cooling the whole room
• Cooling coils may be in multiple locations
  – External to the rack: overhead, back, side
  – In the rack: under, side
  – Part of the server
• Requires a fluid delivered to the cooling coils to transport the heat out of the data center
  – Chilled water (CW)
  – Refrigerant (pumped, low pressure)
Energy Efficiency Benefits of Cooling Closer to the Load

Traditional cooling only:
– Fan power: 8.5 kW per 100 kW of cooling (lost to blower and airflow resistances)
– Average entering air temperature of 80-84°F

Liebert XD plus base cooling:
– Fan power: 3.5 kW per 100 kW of cooling (XD modules alone @ 2 kW per 100 kW)
– Average entering air temperature of 96-98°F

Result: 65% less fan power and greater cooling coil effectiveness.
Liquid Cooling Configurations with Building Chilled Water

[Diagram: the building chiller supplies chilled water through a valve to a heat exchanger in the coolant distribution unit (CDU), which also contains a tank and pumps. The second loop delivers either chilled water (XDWP) or pumped refrigerant (XDP) to the cooling modules.]
Open and Closed Architectures

Open and closed architecture systems, as defined by ASHRAE:
– Open architecture systems utilize cooling coils near the heat load, either inside or outside the server rack, and use the room air volume as thermal storage to ride through short power outages.
– Closed architecture systems fully enclose the rack with the cooling coil inside. Other provisions are required for power-loss ride-through.
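The ride-through distinction matters more as density rises, because the room-air buffer is small relative to the load. A rough estimate (room size, allowable temperature rise, and air properties are illustrative assumptions, not from the slides):

```python
# Rough ride-through estimate for the "room air as thermal storage" point above.
# Hypothetical inputs: 4,000 sq ft room, 10 ft ceiling, air allowed to rise 15 F.
AIR_DENSITY_LB_FT3 = 0.075   # sea-level air
AIR_CP_BTU_LB_F = 0.24       # specific heat of air
BTU_PER_KWH = 3412

room_ft3 = 4000 * 10
storage_kwh = (room_ft3 * AIR_DENSITY_LB_FT3 * AIR_CP_BTU_LB_F * 15) / BTU_PER_KWH
for load_kw in (100, 500, 960):
    seconds = storage_kwh / load_kw * 3600
    print(f"{load_kw:>4} kW room load: ~{seconds:.0f} s of ride-through")
# ~114 s at 100 kW but only ~12 s at 960 kW, hence the closed-architecture
# requirement for other ride-through provisions.
```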
Secondary Fluid Comparisons

Pumped Refrigerant System
• Advantages
  – No water in the data center, and no electrical hazards
  – Micro-channel coil efficiency (+50%) and lower air-side pressure drop => lower operating costs
  – Smaller piping requirements
  – Cooling modules can be located anywhere
  – Scalable capacity (2-3x vs. CW)
• Disadvantages
  – Small-room scalability
  – Higher fluid cost

Chilled Water Based System
• Advantages
  – Lowest fluid cost
  – No limitation to room size
• Disadvantages
  – Electrical hazard
  – Lower operating efficiency
  – May require fluid treatment to prevent fouling
  – Limited overhead cooling options
Flexibility – Extends Your Existing Infrastructure Investment
• On-demand, plug-and-play flexibility to add additional capacity
• Cooling at the source of heat with advanced compact heat exchangers
• Multiple module configurations to meet any data center layout
• Works with any brand of racks
• Cooling fluid is a gas (no water)
• Self-regulating capacity
• 100% sensible cooling

[Images: Liebert XDV, XDO, and XDH modules in hot aisle / cold aisle configurations, addressing hot spots, zones, and hot rooms]
Plug and Play Capacity on Demand
Liebert XD Energy Efficiency Benefits
• Cooling closer to the source
  – Dramatically less fan power required to move the air
  – Higher air temperature entering the coil results in increased performance
• Coil technology
  – Microchannel coils: the most efficient coil surface
• Sensible cooling
  – All cooling modules operate the coil at 5°F above the room dew point
  – Does not unnecessarily dehumidify and then require additional humidification (worth about 7% in efficiency)
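The 5°F coil setpoint can be illustrated with the Magnus dew-point approximation; a minimal sketch (formula constants and room conditions are illustrative, not from the slides):

```python
import math

def dew_point_c(t_c: float, rh_pct: float) -> float:
    """Magnus approximation for dew point (deg C) from dry-bulb temp and RH."""
    a, b = 17.62, 243.12
    gamma = (a * t_c) / (b + t_c) + math.log(rh_pct / 100.0)
    return b * gamma / (a - gamma)

# Example: a 72 F (22.2 C) room at 45% relative humidity
dp_f = dew_point_c(22.2, 45) * 9 / 5 + 32
coil_setpoint_f = dp_f + 5  # per the slide: coil held 5 F above room dew point
print(f"dew point ~{dp_f:.0f} F, coil setpoint ~{coil_setpoint_f:.0f} F")
# ~49 F dew point -> ~54 F coil: cold enough to cool, never cold enough to condense
```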
The Traditional Way (120 racks @ 8 kW/rack)
• (12) 30-ton CW air handlers
  – 10 operational for the load
  – 2 standby
• Floor space: 4,256 sq ft
• Requires a raised floor of 48 in.

[Floor plan: perimeter AC units surrounding rows of racks and PDUs]
Liebert XD Solution (120 racks @ 8 kW/rack)
• (4) 20-ton CW air handlers
  – 3 operational for the load
  – 1 standby
• (6) XDP pumping units with (96) XDV cooling modules
  – 5 XDP / 80 XDV operational for the load
  – 1 XDP / 16 XDV redundant
• Floor space: 3,640 sq ft

[Floor plan: fewer perimeter AC units, with XDP units and rack-mounted XDV modules distributed among the PDU and rack rows]
Total Room Load Calculations (120 racks, 8 kW per rack)

Load (kW)         | Floor Mount AH | Liebert XD & Floor Mount
Rack loads        | 960.0          | 960.0
Fans (BHP)        | 101.7          | 44.6
Room latent       | 5.1            | 5.1
Excess latent     | 137.3          | 29.8
PDU               | 28.8           | 28.8
People            | 1.5            | 1.5
Building envelope | 7.9            | 7.9
Lights            | 5.6            | 5.6
Total (kW)        | 1,247.9        | 1,083.3

Liebert XD benefits: less fan power and 100% sensible cooling, giving a smaller load to size the chiller.
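A quick cross-check of the table, summing the line items as transcribed from the slide:

```python
# Cross-check of the room-load table; values in kW, transcribed from the slide.
loads = {
    # item: (floor-mount AH, Liebert XD & floor mount)
    "rack loads":        (960.0, 960.0),
    "fans (BHP)":        (101.7, 44.6),
    "room latent":       (5.1, 5.1),
    "excess latent":     (137.3, 29.8),
    "PDU":               (28.8, 28.8),
    "people":            (1.5, 1.5),
    "building envelope": (7.9, 7.9),
    "lights":            (5.6, 5.6),
}
trad = sum(t for t, _ in loads.values())
xd = sum(x for _, x in loads.values())
print(f"floor mount: {trad:.1f} kW, XD & floor mount: {xd:.1f} kW")  # 1247.9 / 1083.3
print(f"XD trims {trad - xd:.1f} kW of fan and latent overhead")     # 164.6 kW
```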
Summary of Equipment (120 racks, 8 kW per rack)

                          | Floor Mount AH | Liebert XD & Floor Mount
Chillers                  | (3) 250 ton    | (3) 200 ton
CW pumps                  | (3) 25 HP      | (3) 20 HP
Floor mount units         | (12) 30 ton    | (4) 20 ton
Liebert XD                | –              | (96) XDV
Floor space (sq ft)       | 4,256          | 3,640
Raised floor height (in.) | 48             | 24

Versus the traditional method: about 15% less floor space, a 20% smaller chiller plant, and a scalable design – the same platform serves racks from 8 kW to 20 kW.
Annual Energy Consumption (kW) (120 racks, 8 kW per rack)

Load (kW)           | Floor Mount AH | Liebert XD & Floor Mount
Precision air units | 101.7          | 25.4
XDV                 | 0.0            | 19.2
XDP pumps           | 0.0            | 6.3
CW pumps            | 42.4           | 33.9
Chiller             | 195.1          | 169.4
Tower fans          | 15.3           | 13.2
Condenser pumps     | 15.3           | 13.2
Rehumidification    | 51.5           | 11.2
Total (kW)          | 421.3          | 291.9

Operating costs (@ $0.08/kWh): $295,231 vs. $204,571 – a savings of $90,660 (-31%). Cooling closer to the source is more efficient.
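The operating-cost line follows directly from average kW × 8,760 h × $0.08/kWh; a quick check (the few-dollar differences from the slide are rounding):

```python
# Annual operating cost from the consumption totals above, at $0.08/kWh.
HOURS_PER_YEAR = 8760
RATE = 0.08  # $/kWh, the rate used on the slide

def annual_cost(avg_kw: float) -> float:
    return avg_kw * HOURS_PER_YEAR * RATE

trad_kw, xd_kw = 421.3, 291.9
print(f"floor mount: ${annual_cost(trad_kw):,.0f}")               # ~$295,000
print(f"XD + floor:  ${annual_cost(xd_kw):,.0f}")                 # ~$205,000
print(f"savings:     ${annual_cost(trad_kw) - annual_cost(xd_kw):,.0f} "
      f"({1 - xd_kw / trad_kw:.0%} lower)")                       # ~$91,000 (31%)
```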
Capital Costs – Total Impact (120 racks, 8 kW per rack)

              | Floor Mount AH | Liebert XD & Floor Mount
Chiller       | $187,500       | $150,000
Cooling units | $135,360       | $258,015
Installation  | $322,860       | $452,265
Floor space   | –              | ($123,200)
Total (E,I,S) | $645,720       | $737,080

Delta: $91,360 more capital; with annual operating savings of $90,660, the payback is about 1.0 year.

Floor space credit: traditional 4,256 sq ft vs. Liebert XD 3,640 sq ft (616 sq ft less), valued in this example at $200/sq ft; the industry range is $250-1,000/sq ft.
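The delta and payback follow from the two tables; a quick check using the slide's figures:

```python
# Capital-vs-operating trade-off, using the slide's figures.
trad_capital = 187_500 + 135_360 + 322_860                 # $645,720
floor_credit = 616 * 200                                   # 616 sq ft saved @ $200/sq ft
xd_capital = 150_000 + 258_015 + 452_265 - floor_credit    # $737,080
extra_capital = xd_capital - trad_capital                  # $91,360
annual_savings = 90_660                                    # from the energy slide
print(f"extra capital: ${extra_capital:,}")
print(f"payback: {extra_capital / annual_savings:.1f} years")  # ~1.0 year
```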
Liebert XD Technology – Fluid Cooling at the Source Drives Down Operating Costs

[Charts comparing a traditional CW CRAC, a CW enclosed rack, and refrigerant modules: (1) annual power consumption, in kW of power to cool 1 kW of sensible heat, split into fan, pump (CDU), pump (CW), and chiller shares, with the refrigerant modules about 30% lower; (2) kW of chiller capacity per kW of sensible heat load, split into sensible, latent, and fan loads]

The smaller load means about 20% less capacity in the support equipment:
– Chiller(s)
– Cooling tower / condensers
– Chilled water circulating pumps
– Emergency generators
– Electrical switchgear
Liebert XD: Full Range of Opportunities
• Base infrastructure: XDC or XDP pumping units (160 kW), with future pumping units of larger capacity
• Standard cooling modules (XDO20, XDV10, XDH20/32): 10-35 kW and beyond per rack
• Embedded cooling (microchannel intercoolers): tested at 35-60 kW, developing up to 100 kW
• Embedded and chip cooling (microchannel intercoolers plus Cooligy chip cooling): tested at 50 kW (100% redundant), capable of over 100 kW
Additional System Opportunities for Improved Cooling Efficiencies

Energy savings are driven by reductions in fan power (cooling system and server) plus heat-transfer efficiency.

[Chart: kW for cooling per kW of server heat load, from traditional cooling (1990-2000) at roughly 0.6-0.65, through data center best practices, into the Liebert XD range: enclosed rack (Egenera, 2004-2006) at 0.42, XD modules at 10-35 kW (2005-2007) at 0.36, embedded cooling at 35-60 kW (2006-2007) at 0.31 (about 30% lower), and component cooling above 50 kW (2010 and beyond) near 0.2 (about 45% lower)]
Cooling Process Throughout the Range of Rack Loads

[Chart: for each rack-load band (>0-5, >5-10, >10-15, >15-20, >20-25, >25-30, >30-35, >35-50, and >50 kW), the cooling processes – traditional CRAH, rack mounted CW, rack mounted refrigerant, embedded rack refrigerant, and chip/component cooling – are rated on TCO, flexibility, and availability, with the preferred process shifting from traditional CRAH at the low end to chip/component cooling at the high end]
Improving the Total Cost of Ownership with the Liebert XD Cooling Systems

How → Results in...
• Cooling closer to the source of the heat makes the heat exchangers more efficient (higher entering air temperatures) → more cooling capacity for the energy consumed
• Lower total fan HP → less power (energy consumed)
• Sensible cooling eliminates the wasted energy of dehumidifying unnecessarily and then having to re-humidify → less power (energy consumed)
• Less chiller or DX infrastructure required → less power and capital equipment
• Overhead cooling modules require no additional floor space → less floor space consumed
• Cooling solutions that meet the requirements to fill racks with high density servers → less floor space consumed
• Infrastructure that serves today's modules and future server / rack designs → extends your capital life
Lower Power for Cooling Provides More Power for IT Equipment

Data center power draws, before and after:
– Traditional: IT equipment 50%, cooling 25%, air movement 12%, electricity transformer / UPS 10%, lighting, etc. 3%
– With cooling at the source: IT equipment 59%, cooling and air movement 26%, electricity transformer / UPS 12%, lighting, etc. 3%

For the same building power, you can allocate more power to the IT equipment (18% more).
Take Aways
• Cooling solutions for higher density will need to move closer to the load and will require a reliable fluid delivery means
  – Cost effective cooling solutions exist today that can meet future needs
  – They allow racks to be fully populated
• Cooling at the source of the heat load will actually lower your incremental energy consumption
  – Less power for the cooling system provides more power for the IT equipment
• The result is more available floor space and growth capability for the IT equipment
Thank You

Contact: Steve.Madara@EmersonNetworkPower.com