A Customer Success from the Experts in Business-Critical Continuity™

San Diego Supercomputer Center at the University of California

Background
The San Diego Supercomputer Center’s role as a national data repository and cyber-infrastructure center, coupled with a renewed role within the University of California system as a key resource for the system’s researchers, led to the need to add both high- and low-density data storage capacity. In 2006 the center broke ground on a new 90,000-square-foot building, doubling the Center’s physical size and adding trillions of bytes of data capacity and powerful supercomputers.
Within this facility, SDSC maintains two data centers totaling
approximately 19,000 square feet. The larger of the two data centers, at
approximately 14,500 square feet, is also the older of the two. Known
as the West Room, it utilizes a hot-aisle containment configuration.
The newer data center, known as the East Room, is approximately 4,500
square feet. But as data needs grow with technology, so too does the need for energy efficiency. With this in mind, SDSC partnered
with Emerson Network Power’s Liebert Associates of Southern California
representative office to optimize two unique data center deployments
in pursuit of a singular goal: improve energy efficiency to establish the
facility as a “green” data center without impacting the availability of
critical systems.
Case Summary

Location: San Diego, California.

Products/Services:
• Aisle Containment System from Emerson Network Power
• Emerson Network Power 19” server racks
• Modular ceiling solution
• Vertical curtain door and ceiling solutions

Critical Need:
Improve energy efficiency in both the existing large data center and the new small data center.

Founded in 1985, the San Diego Supercomputer Center (SDSC) is an organized research unit of the University of California, San Diego, and is located on the UC San Diego campus. Its mission is to extend the reach of scientific accomplishments by providing tools such as high-performance hardware technologies, integrative software technologies, and deep interdisciplinary expertise to the community. SDSC has served more than 10,000 researchers at 300 academic, government and industrial institutions globally.
The Situation
Design of the SDSC’s new 4,500-square-foot East data center started in 2003. The new data center, comprising disaster recovery equipment, low-density storage and commodity-based servers such as e-mail, file and web servers, was designed for a draw of just under one megawatt and was expected to use only one-half of the available power and cooling capacity. While average rack densities were projected at 8 to 10 kW, the original design of the new data center did not include an aisle containment strategy.
The SDSC’s cold-aisle containment design incorporated 130 Knurr racks, most of which are open-framed.
Results
• Designed an efficient, optimized cooling infrastructure for both data centers, taking into account the unique characteristics of each (density, rack composition, etc.) to provide two different, yet effective, solutions.
• Enabled SDSC to meet its need to add both low- and high-density data capacity while providing sustainable, energy-saving cooling solutions.
• Implemented next-generation cooling technologies in the new data center to reduce the PUE (Power Usage Effectiveness) rating from 1.5 to 1.3, and utilized existing technologies in the older data center to reduce the PUE from 1.8 to 1.5.
The new construction would supplement SDSC’s existing West room, which has the potential to draw up to 12 megawatts of utility power feeding approximately 240 production racks containing high-performance supercomputer systems, enabling rack densities of up to 30 kW. Unlike the equipment to be deployed in the new East room, racks in the West room are not standard, with heights ranging between six feet and 12 feet, a characteristic attributable to the data center’s organic growth over the life of the facility. As in the East room, aisle containment was not an initial consideration for the SDSC team.
While the concept of a PUE rating was in its infancy
during the initial design phase for the East room, as the
facility’s 2006 groundbreaking approached, energy savings
became more of an issue for the university, the community,
and the State of California, which had recently introduced
its “Code of Regulations, Title 24” for energy-efficient
construction. As a result, SDSC’s data center design team
needed to reevaluate its data center infrastructure and
identify opportunities to optimize for these emerging
efficiency initiatives.
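For context, PUE is the ratio of total facility power to the power delivered to the IT equipment itself:

PUE = total facility power / IT equipment power

A PUE of 1.5 therefore means that for every watt consumed by servers and storage, roughly another half watt goes to cooling, power distribution and other overhead, while a PUE of 1.0 would describe a facility with no overhead at all.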
“Things just changed over time,” said Matt Campbell, Data
Center Services Manager. “There was a push toward more
energy-efficient operation and new products were being
introduced at the time, so we conducted an energy study
involving both data centers.”
“It’s like having a room within a room, once the
plexi-canopy was installed,” said Campbell. “We had to
make sure we were in compliance with the fire code, so we
added stainless steel flex lines for both gas and water, and
added fire suppression equipment.
“While that stretches out the ROI, my facilities guys also love the energy improvement since they now can reprovision things like chilled water and electricity.”

The design also incorporated 130 Knurr racks, most of which are open-framed.
The Solution
The preliminary PUE for the East room was estimated at
1.5 based on the types of power supply and racks planned
to be deployed; a similar energy study of the existing West
room revealed a PUE ranging from 1.7 to 1.8. Based on
these findings, the center’s design and management team
determined that it could improve PUE in both rooms by
utilizing containment systems tailored to the unique needs
of each room.
“The lower the PUE, the better the efficiency, so we projected we’d be able to shave 0.15 to 0.2 off the East room design, getting it down to around 1.3,” Campbell said. “We knew that we wouldn’t be able to do as well with containment in the West room because the racks are all of varying heights; it’s like a city skyline of racks.”
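As a rough illustration of what that projection implies, the sketch below assumes a hypothetical IT load of 1,000 kW, in line with the East room’s “just under one megawatt” design draw; it is not based on metered SDSC figures.

# Rough illustration of how a lower PUE translates into less overhead power.
# The 1,000 kW IT load is an assumption for illustration, not an SDSC figure.

def facility_power_kw(it_load_kw, pue):
    """Total facility power implied by a given PUE (facility power / IT power)."""
    return it_load_kw * pue

it_load_kw = 1000.0                          # assumed IT load (~1 MW)
before = facility_power_kw(it_load_kw, 1.5)  # preliminary East room estimate
after = facility_power_kw(it_load_kw, 1.3)   # post-containment projection

print(f"Overhead at PUE 1.5: {before - it_load_kw:.0f} kW")  # 500 kW
print(f"Overhead at PUE 1.3: {after - it_load_kw:.0f} kW")   # 300 kW
print(f"Overhead reduction:  {before - after:.0f} kW")       # 200 kW

At that assumed load, shaving 0.2 off the PUE frees roughly 200 kW of overhead power for the same computing output.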
The design team sought to utilize state-of-the-art technology in the newly constructed East room, and
investigated the Emerson Network Power aisle containment
system. This containment system creates a physical
separation between cold and warm zones, providing
enhanced cooling while saving large amounts of fan energy.
“A lot of manufacturers use doors as baffles, but we don’t
use them unless there’s a security requirement,”
said Campbell.
Because the Emerson Network Power modular system,
designed specifically for Knurr racks, was being introduced
in the United States at the time of construction, much
of the equipment was brought in from Emerson’s Knurr
manufacturing facility in Germany. Understanding SDSC’s
need for rapid deployment, Emerson personnel worked to
ensure the custom installation took just over three months
to complete.
In the larger West room, the modular solution was not an
option due to the varying heights of the server racks. As
a result, SDSC’s data center managers chose to utilize the
Liebert vertical curtain solution, using four-foot wide vinyl
curtain drops for rack-to-ceiling coverage and one-foot
by four-foot wide vinyl curtains as entry doors for the hot
aisles. The Liebert vertical solution is a tool-less installation
system that uses small clips to mount on the drop ceiling
runners. The clips have small, round balls that slide onto a
track on the aluminum extrusion holding the vinyl curtain.
The ball turns freely, allowing the extrusion to install at any
angle on the drop ceiling.
“With nearly 240 production racks, we decided to do a
phased build-out,” Campbell said. “We knew we wouldn’t
be able to achieve PUE levels quite as low as in the East room, but we hoped to get it down to around 1.5. We did a small
implementation on a couple of rows to see how the system
would react. We saw an immediate improvement. In one
particular rack, we saw a 12 degree delta, but in the overall
pilot area, the delta flattened out to about 1 degree. Still, it
was obvious what we should do.”
The Results
Because the East room is at partial capacity (approximately 60 percent), Campbell estimates that the energy savings to date have been greater in the West room, based on metering. But with the containment and pressure monitoring afforded by the modular system in the East room, there are occasions when air handlers can be turned off completely and cooling accomplished with makeup air, something that could not have been done without the modular aisle containment solution.
Having achieved the goal of lowering PUE through the thermal separation provided by aisle containment, the East room currently operates at a PUE of 1.3, while the West room is at 1.5. Beyond energy savings, the data center also expects that equipment life will be extended in both rooms.
In achieving a lower PUE for both rooms, using two very distinct cooling strategies, Emerson Network Power provided expertise in both new-construction and retrofit applications. It was clear from the outset that “one size would not fit all,” yet by combining Emerson equipment and local Liebert expertise, the site has realized significant energy savings.
“It’s nice to be green,” said Campbell. “We get a lot of
people who want to visit to see how we’ve done this. We
participate in a Green Grid data center users’ group. We
host green conferences and share our experience
with people.”
For more information on Emerson Network Power and
Liebert solutions, visit: www.liebert.com
For more information on the San Diego Supercomputer
Center, visit: http://www.sdsc.edu
Emerson Network Power.
The global leader in enabling Business-Critical Continuity™.
EmersonNetworkPower.com
Emerson, Business-Critical Continuity and Emerson Network Power are trademarks of Emerson Electric Co. or one of its affiliated companies. ©2011 Emerson Electric Co.
CH-00025 (10/11)