CRITICAL ISSUES: Power Quality and Reliability

Building A Data Center: A Construction Minefield
Critical construction projects require special expertise

BY MIKE HELLMANN, SIEMENS ENERGY & AUTOMATION

While every design and construction project presents its own unique set of problems, few can rival the myriad of complex challenges involved in the construction of a modern data center, data farm, or telecom hotel. Although differing greatly in square footage and configuration, all are essentially host sites for some of the largest and most powerful computer servers in the world. Many are web-hosting facilities hatched to serve the explosive growth of the Internet. Some are in downtown high-rise office buildings, others in formerly abandoned warehouses near rail yards. Still others are in suburban office parks. But there is a reason for each location, just one of the many variables to balance when planning a data farm.

"While the growth of the last few years has slowed, e-commerce is still growing. During the Internet boom everyone wanted a piece of the action, and you couldn't even find the building materials or hardware you needed," said Bruce W. Bleser, director of Mission Critical Facilities for Black & Veatch, a Kansas City, MO-based engineering and construction firm with over 25 years of data center experience.

Bleser notes there are three main types of data centers: POPs, or points of presence, which provide access to fiber; Internet data centers (IDCs) such as Sprint E|Solutions, Exodus, and Genuity, which lease rack space or provide fully managed services; and enterprise facilities that house huge collections of data for customer-owners such as Wal-Mart or insurance companies. All, however, share similar requirements.

Power and Fiber

"We have helped to build data centers in almost every type of building you can imagine," said Julio Herdocia, a principal at MTH Engineers in Santa Clara, CA, "and it can be a real challenge. But there are always two main ingredients: power and fiber."

Data centers consume huge amounts of power, sometimes so much that they aren't welcome in some communities. The Wall Street Journal recently reported that a new 45,000-square-foot data center planned in Sunnyvale, CA, by Qwest Communications would not receive power from the local utility, PG&E, until 2002. "They are using enormous amounts of power," said Keith Reed of PG&E. In just one year the utility received requests from data centers for enough power to equal that used by 1.2 million families.

The Yankee Group, a consulting firm, predicted that power supplies in areas like Santa Clara would have to double in three years just to supply power for data centers. Considering the power situation in California, that isn't likely to happen. Some local utilities are making data farms pay a surcharge for their high power needs. Bob Royer of Seattle City Light told The Wall Street Journal, "We don't want the old economy paying for the effects of the new economy." Data centers must pay additional costs to locate in his city.

Critical Power

Just the availability of sufficient power for a data center raises the bigger question: what is the quality of that power, and is it reliable?

"This is the critical question," says MTH's Herdocia. "Data centers contract with their customers for high rates of reliability; they can't go down. We reference the six 9s of reliability, or 99.9999% uptime. That means the data center will experience only about 30 seconds of downtime per year. At this point you are talking a lot of redundancy, with battery backups, then parallel generators. It gets to a point where you are designing a system that is so complicated it is no longer economically feasible," Herdocia said.

In fact, however, the "only" thirty seconds of downtime per year implied by six 9s is not a real-time figure. When all backup systems do fail, it can take hours, sometimes even days, to restore the integrity of the system and bring it back up to normal operations. Six 9s is simply a statistical measurement, not what happens in reality.

"You must consider the N (norm) plus 1, 2, or 3 (redundancy), a formula for ultimate reliability vs. design/build expense," Herdocia said. And each complete redundancy added to the design can double the total amount of floor space needed.

Bleser of Black & Veatch says he has heard of power demands of over 300 watts per square foot (W/ft2). "I would say the average for IDCs is approaching 200 W/ft2, and 75-100 W/ft2 for enterprise data centers. That is a lot of concentrated power."

Still, data centers can be very profitable. Many new installations make money after just a few months. Revenue can equal as much as $1,200 per ft2, about three times that of a regional shopping center. These numbers can make developers and investors very happy.
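For readers who want to check the arithmetic behind those availability figures, the short sketch below (a minimal illustration, not a calculation from the article's sources) converts an availability percentage into the downtime it allows over a year. Six 9s works out to roughly 31.5 seconds per year, in line with the roughly 30 seconds Herdocia cites.

```python
# Minimal sketch (illustration only): convert an availability percentage
# into the downtime it allows over one year.
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000 seconds

def downtime_per_year_seconds(availability_pct: float) -> float:
    """Seconds of downtime permitted per year at a given availability."""
    return SECONDS_PER_YEAR * (1.0 - availability_pct / 100.0)

for label, pct in [("three 9s", 99.9), ("four 9s", 99.99),
                   ("five 9s", 99.999), ("six 9s", 99.9999)]:
    secs = downtime_per_year_seconds(pct)
    print(f"{label} ({pct}%): {secs:,.1f} s/year ({secs / 60:,.2f} min)")

# six 9s (99.9999%): 31.5 s/year (0.53 min)
```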
Down by the Tracks

If reliable power is available, the next consideration in siting a data center is access to fiber optic cable, and lots of it. Ironically, this requirement often brings the new economy in touch with the old. Fiber optic cables are often located along rail rights-of-way, and the old warehouse buildings near these tracks can often be refurbished to house a new data center. They offer a number of advantages as well as some unique problems. The high ceilings and open spaces allow for an open interior for racks of computers and the huge amount of HVAC equipment needed to cool them.

According to Bleser of Black & Veatch, nearly every watt going into the computers produces heat, which must be dissipated through the HVAC system. "As a rule of thumb, you need to double the total power used by the computers to approximate the power required for the HVAC systems and the balance of the facility," Bleser said.

The racks, fans, and raised floors can usually be accommodated in older warehouse locations, but sometimes the entire roof must come off. And then there is the problem of where to locate the large (and multiple) generators needed and the fuel tanks that serve them. Up to 24 hours of fuel is stored to cover a potential utility failure.
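As a rough illustration of Bleser's rule of thumb, the sketch below doubles an assumed IT load to approximate whole-facility demand. The floor area and power density are example values only, borrowing the 45,000 ft2 building size and the roughly 200 W/ft2 IDC average mentioned earlier; this is not a design calculation from the article.

```python
# Rough whole-facility load estimate from the rule of thumb quoted above:
# HVAC and the balance of the facility draw roughly as much power as the
# computer load itself, so total demand is about double the IT load.
# The area and density below are illustrative values, not a real design.

def facility_load_kw(floor_area_sqft: float, it_density_w_per_sqft: float):
    """Return (IT load, total facility load) in kW."""
    it_load_kw = floor_area_sqft * it_density_w_per_sqft / 1000.0
    total_kw = 2.0 * it_load_kw  # double the computer load per the rule of thumb
    return it_load_kw, total_kw

it_kw, total_kw = facility_load_kw(45_000, 200)  # hypothetical 45,000 ft2 at 200 W/ft2
print(f"IT load:             {it_kw:,.0f} kW")
print(f"Total facility load: {total_kw:,.0f} kW")

# IT load:             9,000 kW
# Total facility load: 18,000 kW
```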
Crisis Planning

Data centers must be installed in a secure location with 24/7 controlled access, and a plan to perform regular scheduled maintenance without interruption of service must be in place. Fire protection, the ability to cool the equipment during an interruption, and a fortressing plan for any conceivable disaster, including earthquakes and even chemical rail car leaks, must all be accounted for. Any of these can bring a data center down, and not just for thirty seconds. Even the availability and delivery schedule of tankers carrying fuel for the generators must be evaluated in the design plan. A recovery plan for any conceivable type of emergency is the final chapter in this crisis planning.

If all this sounds like you're building a war room for the U.S. military, you're not far wrong. "Information has become the new utility of the 21st century," said Bleser. "And highly reliable and highly secure data centers are the cornerstone upon which this new utility is being built."

About the author: Mike Hellmann is marketing manager, Critical Power Markets, for Siemens Energy & Automation, Atlanta. His responsibilities include identifying mission critical facilities (including data centers, web hosting, colocation, financial, and telecommunications facilities) for a broad range of Siemens technologies, typically working with consulting engineers, developers, general and electrical contractors, OEMs, and end users. He received a bachelor of science degree in engineering from Clarkson University.