Data Center Site: How Deep Can You Go? Efforts to retrofit subterranean bunkers into functional data center space have been underway for years. But as power requirements and security considerations have intensified, a new trend has emerged: underground sites designed from day one to house mission-critical infrastructure.
Demand for ultra-secure underground data center space is being fueled by projected energy, security and regulatory benefits relative to above-ground alternatives.
The Bunker News – KENT, UK – The Bunker, a provider of ultra-secure Managed Hosting, Cloud Computing, Colocation and Outsourced IT from within Europe’s most secure data centres, is pleased to announce that it has opened a significant new high-specification data hall in Kent.
The former military operations room at its South East facility becomes the latest high-specification data hall to open in Europe’s most secure data centre. The new data hall has been fitted out with state-of-the-art power and cooling infrastructure, providing up to 32 amps of UPS power per rack and N+1 fully redundant power and air conditioning.
The facility also benefits from hot and cold aisle separation, dual power feeds, generator back-up and over 150 ISP connections available.
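For rough context, the per-rack power implied by a 32-amp UPS feed can be computed directly. A minimal sketch, assuming a 230 V single-phase UK supply and the common practice of loading circuits to 80% of rating (both assumptions; the release states neither):

```python
# Rough per-rack power budget implied by a 32 A UPS feed.
# Assumptions (not from the press release): 230 V single-phase
# supply and an 80% continuous-load derating factor.
VOLTAGE_V = 230.0
BREAKER_A = 32.0
DERATING = 0.8  # common practice: load circuits to 80% of rating

nameplate_kw = VOLTAGE_V * BREAKER_A / 1000.0
usable_kw = nameplate_kw * DERATING

print(f"Nameplate capacity: {nameplate_kw:.1f} kW per rack")   # ~7.4 kW
print(f"Usable at 80% derating: {usable_kw:.1f} kW per rack")  # ~5.9 kW
```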
The new data hall in Kent is part of The Bunker’s continuing investment programme, to meet the growing demand for high specification, ultra secure data facilities for fully managed and colocation customers.
Simon Neal, Director of Data Centre Services at The Bunker, said: “I’m delighted with the new data floor at our Kent facility. The ongoing investment allows us to expand our business and provide security conscious organisations with the reassurance of knowing that their hosting and managed services needs are being met in an ultra secure, high specification environment.”
Growing awareness of limited power availability and high prices in London is driving up demand for high power density, out-of-London locations. The Bunker is an obvious choice for organisations that require secure, high specification, managed services and colocation hosting facilities.
“The first customer has already moved in and we anticipate a rapid take-up of space and the opening of phase 2 in the near future,” continued Neal.
Bunker 2
State of the art and military grade
The Bunker is expanding its Kent operations. The Bunker2 maintains the high standards set by The Bunker, blending state-of-the-art M&E infrastructure with military-grade security. The new data centre facility provides an alternative highly secure, low-risk Internet hub outside London, serving clients who wish to establish systems in the UK but who also require access to the World Wide Web without entering the high-risk London zones.
The Bunker2 offers a 12,000m2 Tier 3 data floor within the existing defences of The Bunker’s facility, along with 2,000m2 of office and support accommodation, staffed 24/7/365 by security and technical personnel. The development will be of a modular, low-profile design, cut into the landscape with a grass roof. Each 2,200m2 module blends into the surrounding environment with minimal visual impact.
London hosts the second-largest Internet exchange in the world. Unfortunately, London is also one of the highest-risk cities in the world.
The Bunker2 is a Carrier Neutral Data Centre, situated at a strategically important location at the crossroads of the main Internet fibre routes leaving the UK towards Europe, Asia and the USA. For our clients this affords peace of mind: traffic can be routed via the traditional channels across London, yet in the event of a catastrophe disabling the main Internet hubs in the London Docklands and the City, traffic can still route to any location in the world via the main Internet exchanges in Europe.
The next step for a trusted brand
The Bunker2 facility
The Bunker2 is owned and developed by The Bunker Secure Hosting Limited. The Bunker has delivered secure Managed Hosting and Data Centre solutions from within Europe’s most secure Data Centre since 1994. Our technical leaders are recognised experts in security and cryptography, renowned for their work on Apache-SSL. Our management team includes Data Centre experts with over 10 years’ experience. The Bunker is ISO 27001 accredited, is a recognised expert in Open Source technology, is a Microsoft Gold partner, and practises PRINCE2 and ITIL standards.
About The Bunker
The Bunker delivers Ultra Secure Managed Hosting, Cloud Computing, Colocation, and Outsourced IT from within Europe’s most secure data centres.
Our data centres, which are outside the M25 yet within easy reach of London, are military-grade nuclear bunkers purpose built to house the UK’s air defence systems. We run 24x7x365 – our NOC monitors systems both nationally and internationally and is staffed around the clock by system and network engineers and security staff. The Bunker is ISO 27001 and PCI DSS accredited and follows ITILv3 best practice and PRINCE2 project management standards.
Our clients are financial services organisations, technology companies, healthcare, government and other regulated businesses that value a premium service built around security.
For more information visit www.thebunker.net or email info@thebunker.net
Green Mountain Data Center
Green Mountain Data Center is a prime piece of real estate tucked inside a scenic Norwegian mountain. Built next to a cool-water fjord and surrounded by evergreens and lush rock-clinging mosses, the space boasts bright, airy subterranean halls carved out of natural cave walls and almost transcendental settings above ground. This will be the comfortable new home for many of Norway’s data servers. Green Mountain is one of the first pioneering data centers to greatly reduce its costs by harnessing the cooling power of the environment, namely the steady flow of cool water from an adjacent fjord. Alas, the grass seems to be consistently greener in Scandinavia.
The Green Mountain Data Center contains nine ‘Mountain Halls’, each spanning well over 1,000 square meters of space to host rows and rows of servers, plus a workshop and an administration building. Its servers will be hooked up to an uninterrupted supply of power from eight independent generators as well as three supply lines connected to the central Norwegian grid, and its carbon footprint has been thoroughly eliminated.
Of course, its most compelling feature, aside from its generally pleasant, Hobbit-like atmosphere noted by Gizmodo, is the cooling system, which relies on the nearby Rennesøy fjord to provide an abundance of cold water year round to cool its resident motherboards. Facebook has gone a similar route by planting a server farm in the Arctic, but it’s not hard to say we like the hospitable environment of this data farm better, and it’s nice to see yet another Scandinavian mountain bunker to add to our favorites!
The Mountain Hall
Approx. 21,500 m2 floor space in the mountain
The area consists of:
- 6 mountain halls of 1,855 m2 each (11 x 164 m)
- 2 mountain halls of 1,546 m2 each (19 x 82 m)
- 1 mountain hall with internal structure, 1,370 m2
- I.e. combined mountain halls of 15,692 m2
- Warehouse/workshop: 520 m2
- Administration building: 840 m2
- Quay with "roll on-roll off" option
Fire safety and fire protection
Closed caverns enable the use of inert/hypoxic air ventilation: the oxygen level is reduced to prevent fire and smoke.
- O2 reduced to 15-16%
- Fire cannot arise, as the combustion process does not get enough oxygen
- Corresponds to an altitude of approx. 3,000 m (see the sketch below)
Hypoxic air ventilation/inert ventilation system:
- Reduces/limits smoke formation
- Prevents combustion/fire
- Ensures continuous operation
- No fire damage
- No secondary extinguishing damage (corrosion, harm to the environment, poisoning, etc.)
- No problems with hard disks due to the triggering of fire extinguishing equipment
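The claimed altitude equivalence can be sanity-checked with the isothermal barometric formula. A minimal sketch, assuming a sea-level O2 fraction of 20.9% and an atmospheric scale height of roughly 8,400 m (standard physical values, not figures from the fact sheet):

```python
import math

SEA_LEVEL_O2_PCT = 20.9  # % oxygen by volume in normal air
SCALE_HEIGHT_M = 8400.0  # approximate isothermal atmospheric scale height

def equivalent_altitude_m(o2_pct: float) -> float:
    """Altitude where normal air's O2 partial pressure matches a
    reduced-O2 atmosphere held at sea-level pressure."""
    ratio = o2_pct / SEA_LEVEL_O2_PCT
    return -SCALE_HEIGHT_M * math.log(ratio)

for o2 in (15.0, 16.0):
    print(f"{o2:.0f}% O2 feels like ~{equivalent_altitude_m(o2):,.0f} m of altitude")
# 15% O2 -> ~2,800 m; 16% O2 -> ~2,200 m, consistent with "approx. 3,000 m"
```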
Safe as a vault
Located inside the mountain in a former NATO ammunition depot (the largest in northern Europe), built for the highest military security level:
- Secured against electromagnetic pulses (EMP)
- "Nuclear secure" facility
- Secured against sabotage and direct attack from the sea
"Best in class" data security

Communication redundancy
- High capacity and redundancy
- Local broadband operators
- Good connectivity to the world
- Multiple high-capacity lines to Oslo
- Multiple high-capacity lines directly to the UK
- Multiple high-capacity lines to continental Europe
- Carrier-neutral availability
Subterranean “data bunkers” in unusual locations continue to stir the imagination of the technology world. The latest data hideout to enter the spotlight is the “Swiss Fort Knox” facility deep below the Swiss Alps, which offers ultra-secure data storage in a nuke-proof data center. The facility was featured in the November issue of Wired magazine.
Like many of the data bunkers, the Swiss Fort Knox facility takes advantage of existing infrastructure, in this case an old Cold War bunker built by the Swiss military and designed to survive a nuclear blast. The facility is really two separate data centers about 10 kilometers apart, which were developed over the past 15 years by SIAG (Secure Infostor AG), a Swiss provider of IT security solutions. Two related companies, Mount 10 Swiss Data Backup and SISPACE AG, provide services within the Swiss Fort Knox bunker, with Mount 10 providing secure data backup while SISPACE focuses on records storage and management.
The data center also takes advantage of the cooling potential presented by its location deep under the mountains. The facility uses Mother Nature as its chiller, pulling glacial water from an underground lake to use in its cooling systems. It also features survivalist-level security measures, including face-recognition surveillance software, bulletproof plastics and vault doors courtesy of the Swiss banking industry.
More details of the facility’s operation are available at the web sites for Mount 10 and Swiss Fort Knox.
CHICAGO – From the outside, the Gothic brick and limestone building a few blocks south of downtown almost looks abandoned.
Plaques identify it as a landmark completed in 1929, a former printing plant that once produced magazines, catalogs and phone books. The sign over the main door says "Chicago Manufacturing Division Plant 1."
There are hints, though, that something is going on inside. Cameras are aimed at the building's perimeter. A small sign at the back entrance says "Digital Realty Trust."
Sturdy gates across the driveway keep the uninvited out.
There's good reason for the intentional anonymity and security, says Rich Miller: "The Internet lives there."
Miller, editor of Data Center Knowledge, which tracks the industry, and Dave Caron, senior vice president of portfolio management for Digital Realty, which owns the 1.1 million-square-foot former R.R. Donnelley printing plant, say it is the world's largest repository for computer servers.
Caron won't identify its tenants, but he says the building stores data from financial firms and Internet and telecommunications companies. "The 'cloud' that you keep hearing about … all ends up on servers in a data center somewhere," he says.
There are about 13,000 large data centers around the world, 7,000 of them in the USA, says Michelle Bailey, a vice president at IDC, a market research company that monitors the industry. Growth stalled during the recession, but her company estimates about $22 billion will be spent on new centers worldwide this year.
The need for data centers is increasing as demand for online space and connectivity explodes. Some are inside generic urban buildings or sprawling rural facilities. For all of them, security is paramount. Inside, after all, are the engines that keep smartphones smart, businesses connected and social networks humming.
Some data centers have "traps" that isolate intrusions by unauthorized individuals, technology that weighs people as they enter and sounds an alarm if their weight is different when they depart, bulletproof walls and blast-proof doors, Bailey says.
When Wal-Mart opened a data center in McDonald County, Mo., a few years ago, County Assessor Laura Pope says she signed a non-disclosure agreement promising "I wouldn't discuss anything I saw in there." She hasn't.
Borrowing a line from a 1999 movie, Miller says, "I used to kid about the Fight Club rule: Rule No. 1 is you don't talk about the data centers, and Rule No. 2 is you do not talk about the data centers."
Although the rapid growth of data centers has diminished their ability to "hide in plain sight," he says, many owners and occupants are "very secretive and … sensitive about the locations."
That makes sense, Miller says. "These facilities are critical to the financial system and the overall function of the Internet."
Making new use of the old
Some data centers — sometimes called carrier hotels because space is leased to multiple companies — are in large urban buildings where they can tap into intersecting networks, Miller says.
Old manufacturing facilities such as Chicago's Donnelley printing plant often are repurposed because they have high ceilings and load-bearing floors to support heavy racks of servers.
"They are interesting examples of the new economy rising up in the footprints of the old," he says.
Giant companies such as Google, Facebook, Apple, Yahoo and Amazon often build their data centers in rural areas. "They're looking for cheap power and cheap real estate," Miller says. While the number of private centers grows, the federal government is consolidating. It has more than 2,000 data centers and this summer announced plans to close 373 by the end of 2012.
Communities such as Quincy, Wash., population 6,750, and Catawba County in western North Carolina want to become data center hubs. Catawba and neighboring counties dubbed themselves "North Carolina's data center corridor," says Scott Millar, president of the Catawba County Economic Development Corp.
Apple last fall opened a 500,000-square-foot, $1 billion facility in Catawba County. Google and Facebook have data centers in nearby counties and more are under construction.
Catawba County is building a second data center park in hopes of attracting more, Millar says. Because data centers don't require many employees, most of the permanent jobs are created by contractors who provide electrical, cooling or security support, he says. About 400 people work at the giant Chicago data center; many centers employ far fewer.
The Apple data center, Millar says, is "pretty secretive." No signs indicate what the building holds, he says, "but everybody knows what it is."
James Lewis, a senior fellow in technology and security at the Center for Strategic and International Studies, a public policy research group in Washington, D.C., compares the evolution of data centers to changes in the way electricity is generated.
A century or more ago, he says, factories and other companies operated their own electric plants to power their lights, elevators and other functions. Those with spare capacity began to sell it to their neighbors. "That's what happened to computing," Lewis says.
Instead of maintaining computer servers in their own facilities for rapidly growing data storage needs, some businesses locate their servers or backup servers in data centers, he says. They can save money because the centers minimize energy consumption, ensure security and allow computers to share tasks. Data centers also give companies places to store backed-up data that is crucial to their businesses.
"The amount of data in the world doubles every couple of years and people … are willing to pay for it to be stored," Lewis says.
He doesn't think it's essential to conceal the centers' locations, though, because hackers won't try to come in through the front door. "The main source of risk isn't physical, it's cyber," he says. "If hiding the location … is all that they're doing, they're not doing enough."
Tall building, low profile
Keeping a low profile is just the beginning of the security measures at Digital Realty Trust's massive Chicago data center.
The exterior is embellished with terra cotta shields depicting printers' marks. The building occupies almost a full block, is nine stories tall and has a 14-story tower. Inside, there are visible and unseen protections, some of which the company won't talk about publicly. There are guards at both entrances, cameras inside and out, motion sensors and much more. To access the rooms where rows of servers live, a card must be scanned and a fingerprint recognized.
The interior of the building is a mix of old and new. Because it is a landmark, its wood-lined two-story library, which has been used for photo shoots, must be kept intact. Some corridors feature stone arches overhead, and some offices are paneled in English oak.
Other hallways are sterile and silent. Inside the locked doors of the individual data centers are locked metal-grid cages and, inside them, rows of black shelving with the blinking lights of servers visible through the doors. The only sound is an electronic buzz. Cameras scan every square foot of the room.
Between the rows of servers are "cooling aisles" with thousands of round holes in floor tiles feeding cool air into the space. Over the server shelving are ladder racks that suspend "raceways" — yellow plastic casing enclosing fiber optic cables. The shelving doesn't extend to the ceiling; air must circulate above the servers to keep temperatures down.
Caron says it costs $600 to $800 per square foot to build a data center, compared with often less than $70 a square foot, including land, for a normal industrial building. The giant printing presses that once filled space in the former Donnelley building made it ideal for conversion to data center use, he says. A data center floor must be able to handle at least 150 pounds per square foot and as much as 400 pounds per square foot. By comparison, most office buildings are built for 70 pounds per square foot.
Huge amounts of electricity power all those servers, he says: 100 to 150 watts or more per square foot, compared with 3 to 5 watts per square foot in an office building. To keep the servers running, there's more than one electrical feed into the building, and backup systems and generators ensure there's never an interruption in power. The Chicago facility has 63 generators.
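A back-of-the-envelope comparison makes that density gap concrete. This sketch uses the midpoints of the ranges quoted above; the floor area is illustrative, not a figure reported for any tenant:

```python
# Data center vs. office power density, using midpoints of the
# ranges quoted in the article (100-150 W/sq ft vs. 3-5 W/sq ft).
DC_W_PER_SQFT = 125.0
OFFICE_W_PER_SQFT = 4.0

FLOOR_SQFT = 10_000  # illustrative floor area, not a reported figure

dc_kw = DC_W_PER_SQFT * FLOOR_SQFT / 1000.0
office_kw = OFFICE_W_PER_SQFT * FLOOR_SQFT / 1000.0

print(f"Data center floor: {dc_kw:,.0f} kW")               # 1,250 kW
print(f"Office floor:      {office_kw:,.0f} kW")            # 40 kW
print(f"Ratio: {dc_kw / office_kw:.0f}x the office load")   # ~31x
```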
Digital Realty Trust, which bought the building in 2005, owns 96 properties, most of them data centers, in the USA, Europe and Asia, Caron says. There is, he says, "a lot of demand," and the company expects to spend up to $500 million this year on acquisitions. Last year it spent more than $1 billion, he says.
'You have no idea what's here'
Not every data center is a fortress. The one owned by the city of Altamonte Springs, Fla., is a former 770,000-gallon water tank next to City Hall.
Lawrence DiGioia, information services director in the city of 40,000, says he relocated the city's servers after being forced by three hurricanes to pack everything up to keep them out of harm's way. The tank has 8-inch-thick walls. "It did a great job holding water in," he says, "so we knew it could keep water out."
Even a small-scale data center needs security, though. DiGioia says his is protected by video surveillance, requires dual authentication to enter, and has a biometric lock limiting access to the server room.
It's even harder to get into the five data centers 200 feet deep in a former limestone mine in Butler County, Pa.
"The facility affords a very high level of security, not only physical — armed guards, steel gates, layers of security, biometrics — but also we're protected from the elements, civil unrest, terrorist-type things," says Chuck Doughty, vice president of the Underground, as it's called, for Iron Mountain, an information management company.
Except for the cars parked outside, he says, "you'd have no idea what's here." Besides 7 million gigabytes of digital data, including e-mail, computer backup files and digital medical images such as MRIs, the Underground is home to documents, film reels and computer backup tapes owned by the U.S. Patent and Trademark Office, Sony Music and Universal, among others.
Doughty worked for years on Room 48, an experiment in making data centers more energy-efficient and reliable, and is working now on ways to utilize some of the cold water in the mine to cool the computer space without using chillers or cooling towers. He hopes to begin construction next year.
The security of data centers, Doughty says, is becoming increasingly important for companies and governments "not only because of the situation in the United States with terrorism, but because of the world situation."
Lewis says one of the lessons of the Sept. 11 terrorist attacks was the importance of having data stored in more than one place. As more data centers are built, he says there will be more debate about legal issues: What happens if law enforcement has a warrant for a server that also contains data owned by other companies? Should there be standards for protecting consumers, including requirements that they be notified of breaches? Should data centers be regulated by the government?
John McKay, a visitor to Chicago from Vancouver, Canada, snapped photos of the former printing plant recently. A brochure highlighting historic buildings in the neighborhood had led him to it.
Technology companies are building new data centers in old mines, caves and bunkers to host computer equipment below the Earth's surface.
Underground secure data center operations are on an upward trend.
Operations have launched in inactive gypsum mines, caves, abandoned coal mines, abandoned solid limestone mines deep below the bedrock, and decommissioned nuclear bunkers deep underground, secure from disasters both natural and man-made.
These facilities have advantages over traditional data centers, such as increased security, lower cost, scalability and ideal environmental conditions. Their economic model works, despite the proliferation of data center providers, thanks largely to the natural qualities inherent in underground data centers.
With anywhere from 10,000 to over 1,000,000 square feet available, there is ample space to be subdivided to accommodate the growth needs of clients. In addition, underground data centers have an unlimited supply of naturally cool, 50-degree air, providing the ideal temperature and humidity for computer equipment at minimal HVAC cost.
They are among the most secure data centers in the world and unparalleled in terms of square footage, scalability and environmental control.
Yet while the physical and cost benefits of being underground make them attractive, operators have also invested heavily in high-speed connectivity and redundant power and fiber systems to ensure their operations are not just secure, but also state-of-the-art.
These operators initially focused on providing disaster recovery solutions and backup colocation services.
Some clients lease space for their own servers, while the operator provides secure facilities, power and bandwidth. Operators offer redundant power sources and multiple high-speed Internet connections through OC lines connected to a SONET ring, which links to outside connectivity providers through redundant fiber cables.
Underground data center companies augment their core services to include disaster recovery solutions, call centers, NOCs, wireless connectivity and more.
Strategic partnering with international and national information technology companies enables them to offer technology solutions ranging from system design and implementation to the sale of software and equipment.
The natural qualities of underground data centers allow them to offer the best of both worlds: premier services and security at highly competitive rates.
Underground data centers were first established in the 1990s but really came into their own after the September 11 attacks in 2001, when their founders realized that former mines and bunkers offered optimal conditions for a data center: superior environmental conditions for electronic equipment, almost invulnerable security, and locations near power grids.
Adam Couture, a Massachusetts-based analyst for Gartner Inc., said underground data centers could find a niche serving businesses that want to reduce vulnerability to any future attacks. One underground data center fact sheet claims the facility would protect the data center from a cruise missile explosion or plane crash.
After the September 11 attacks, companies everywhere went back and re-evaluated their business-continuity plans. That doesn't mean everybody changed them, but everybody revisited them in the wake of what happened, and the underground data center may be just that answer.
Comparison chart: Underground data centers
Five facilities compared
Robert L. Mitchell

InfoBunker, LLC
- Location: Des Moines, Iowa*
- In business since: 2006
- Security/access control: biometric; keypad; pan, tilt and zoom cameras; door event and camera logging
- Distance underground: 50 feet
- Ceiling height in data center space: 16 feet
- Original use: military communications bunker
- Total data center space: 34,000 square feet
- Total space in facility: 65,000 square feet
- Data center clients include: insurance company, telephone company, teaching hospital, financial services, e-commerce, security monitoring/surveillance, veterinary, county government

The Bunker
- Location: Dover, UK
- In business since: 1999
- Security/access control: CCTV, dogs, guards, fence
- Distance underground: 100 feet
- Ceiling height in data center space: 12 to 50 feet
- Original use: Royal Air Force military bunker
- Total data center space: 50,000 square feet
- Total space in facility: 60,000 square feet
- Data center clients include: banking, mission critical Web applications, online trading

Montgomery Westland
- Location: Montgomery, Tex.
- In business since: 2007
- Security/access control: gated, with access control card, biometrics and a 24x7 security guard
- Distance underground: 60 feet
- Ceiling height in data center space: 10 feet
- Original use: private bunker designed to survive a nuclear attack. The complex was built in 1982 by Louis Kung (nephew of Madame Chiang Kai-shek) as a residence and headquarters for his oil company, including a secret, 40,000-square-foot nuclear fallout shelter. The office building uses bulletproof glass on the first floor and reception area and 3-inch concrete walls with fold-down steel gun ports to protect the bunker 60 feet below.
- Total data center space: 28,000 square feet, plus 90,000 square feet of office space in a hardened, above-ground building
- Total space in facility: 28,000 square feet

Cavern Technologies
- Location: Lenexa, Kan.
- In business since: 2007
- Distance underground: 125 feet
- Ceiling height in data center space: 16 to 18 feet
- Original use: limestone mine originally developed by an asphalt company that used the materials in road pavement
- Total data center space: 40,000 square feet
- Total space in facility: 3 million square feet

Iron Mountain The Underground
- Location: Butler County, Penn.*
- In business since: opened by National Storage in 1954; acquired by Iron Mountain in 1998
- Security/access control: 24-hour armed guards, visitor escorts, magnetometer, x-ray scanner, closed-circuit television, badge access and other physical and electronic measures for securing the mine's perimeter and vaults
- Distance underground: 220 feet
- Ceiling height in data center space: 15 feet (10 feet from raised floor to dropped ceiling)
- Original use: limestone mine
- Total data center space: 60,000 square feet
- Total space in facility: 145 acres developed; 1,000 acres total
The Editor:
My name is Gary E. Weller; this is what I am thinking about in the time and space in which I write this newsletter, so you will never know what to expect from my mind or others'. When I let my hair grow long, some folk say I look like Albert Einstein; what a compliment. I will try to keep my hair as long as I can. "I prefer to die on my feet than to live on my knees!"
ACAE - Air conditioning airflow efficiency, the amount of heat removed per standard cubic foot of airflow per minute
AHU - Air handling unit
Air Mixing - The unintended mixing of cold and hot air
Airside Economizer - An economizer that directs exterior air into the data center when the air temperature is at or below the cooling set point
Aisle - The open space between rows of racks. Best practice dictates racks should be arranged with consistent orientation of front and back to create 'cold' and 'hot' aisles.
AMS - Asset management system
ASHRAE - American Society of Heating, Refrigerating and Air-Conditioning Engineers is an international technical society organized to advance the arts and sciences of air management.
BACnet - A data communication protocol for building automation and control networks
BAS - Building automation system
Blanking Panel - A device mounted in unused U spaces in a rack that restricts bypass airflow, also called blanking or filler plates
BMS - Building management system, synonymous with BAS, AMS and other computer-based tools used to manage data center assets
BTU - British thermal unit, a standard measure of cooling equipment capacity
Bypass Airflow - Conditioned air that does not reach computer equipment, escaping through cable cut-outs, holes under cabinets, misplaced perforated tiles or holes in the computer room perimeter walls
C - Degrees Celsius
C/H - Cooling/Heating
Cabinet - Device for holding IT equipment, also called a rack
CAC - Cold aisle containment system that directs cooled air from air conditioning equipment to the inlet side of racks in a highly efficient manner
CADE - Corporate average data center efficiency
CapEx - Capital expense, the cost of purchasing capital equipment
Carbon Footprint - A measurement of the volume in pounds of carbon dioxide generated by business operations
CFD - Computational fluid dynamics, scientific calculations applied to airflow analysis
CFM - Cubic feet per minute, an airflow volume measurement
Close-Coupled Cooling - Cooling technology that is installed adjacent to server racks and enclosed to direct airflow directly to the rack without mixing with DC air
Coefficient of Effectiveness (CoE) - Uptime Institute metric based on the Nash-Sutcliffe model efficiency coefficient
Cold Aisle - An aisle where rack fronts face into the aisle. Chilled airflow is directed into this aisle so that it can then enter the fronts of the racks in a highly efficient manner.
Cold Spot - An area where ambient air temperature is below acceptable levels. Typically caused by cooling equipment capacity exceeding heat generation.
CR - Computer room
CRAC - Computer room air conditioner (pronounced crack) that uses a compressor to mechanically cool air
CRAH - Computer room air handler (pronounced craah) that uses chilled water to cool air
Critical Load - Computer equipment load delivered by PDU output
CSI - Cold supply infiltration index, quantifies the amount of hot air mixing with cold inlet air prior to entering the rack
Cutout - An open area in a raised floor that allows airflow or cable feeds
CW - Chilled water
DC - Data center
DCiE - Data center infrastructure efficiency, an efficiency measure calculated by dividing the IT equipment power consumption by the power consumption of the entire data center. This measure is the inverse of PUE.
Dead Band - An HVAC energy saving technique whereby sensitivity set points of equipment are set more broadly to improve coordination of the equipment and avoid offsetting behaviors, also dead band control strategy
Delta T - Delta temperature, the spread between the inlet and outlet air temperatures of air conditioning equipment, measured as the maximum achievable difference between inlet (return) and outlet (supply) temperatures. (This is not the astronomical Delta T, Terrestrial Dynamical Time minus Universal Time.)
Dewpoint - The temperature at which air reaches water vapor saturation, typically used when examining environmental conditions to ensure they support optimum hardware reliability
D/H - Dehumidifying/Humidifying
Dry-Bulb Temperature - The temperature of the air measured using a dry-bulb thermometer, typically taken in conjunction with a wet-bulb reading in order to determine relative humidity
EFC - Equivalent full cabinets, the number of full cabinets that would exist if all the equipment in the data center were concentrated in full cabinets
ESD - Electrostatic discharge, more commonly 'static discharge'
F - Degrees Fahrenheit
Ft2 - Square feet or foot
GPM - Gallons per minute
HAC - Hot aisle containment system that directs heated air from the outlet side of racks to air conditioning equipment return ducts in a highly efficient manner
Harmonic Distortion - Multiples of power frequency superimposed on the power waveform that cause excess heating in wiring and fuses
Heat Exchanger - A device used to transfer heat energy, typically used for removing heat from a chilled liquid system
HDG - Hot dipped galvanized
Hot Aisle - An aisle where rack backs face into the aisle. Heated exhaust air from the equipment in the racks enters this aisle and is then directed to the CRAC return vents.
HPDC - High-performance data center, a data center with above average kW loading, typically greater than 10 kW/rack
Hot Spot - An area, typically related to a rack or set of racks, where ambient air temperature is above acceptable levels. Typically caused by heat generation in excess of cooling equipment capacity.
Hp - Horsepower
Hr - Hour
HVAC - Heating, ventilation and air conditioning system, the set of components used to condition interior air including heating and cooling equipment as well as ducting and related airflow devices
In-Row Cooling - Cooling technology installed between server racks in a row that delivers cooled air to equipment more efficiently
Inlet Air - The air entering the referenced equipment. For air conditioning equipment this is the heated air returning to be cooled, also called return air. For racks and servers this is the cooled air entering the equipment.
IP - Internet protocol, a communications technology using the internet for communications
IR - Infrared spectrum used by thermal imaging technologies
JVM - Java virtual machine, Java interpreter; software that converts the Java intermediate language into executable machine language
kBTU - Kilo British thermal unit, one thousand BTU, a unit of measurement for the cooling capacity of a CRAH
kCFM - Kilo-cubic feet per minute, 1,000 CFM
kV - Kilovolt
kW - Kilowatts
kWc - Kilowatts of cooling, alternate unit of measurement for the cooling capacity of a CRAH
kWh - Kilowatt hours
kVA - Kilovolt amperes = voltage x current (amperage)
KVM - Keyboard, video, mouse; an interface technology that enables users to access multiple servers remotely from one or more KVM sites. More obscurely, can also mean K Virtual Machine: a version of the Java Virtual Machine for small devices with limited memory.
Latent Cooling Capacity - Cooling capacity related to wet bulb temperature and objects that produce condensation
Line Noise - Distortions superimposed on the power waveform that cause electromagnetic interference
Liquid Cooling - A general term used to refer to cooling technology that uses a liquid circulation system to evacuate heat as opposed to a condenser, most commonly used in reference to specific types of in-row or close-coupled cooling technologies
Load - The kW consumption of equipment, typically installed in a rack. Also, the heat level a cooling system is required to remove from the data center environment.
MAH - Makeup air handler, a larger air handler that conditions 100% outside air. Synonymous with MAU.
Make-Up Air - The conditioned air delivered by a MAU or MAH
MAU - Makeup air unit, a larger air handler that conditions 100% outside air. Synonymous with MAH.
Maximum Temperature Rate of Change - An ASHRAE standard established to ensure stable air temperatures. The standard is 9 degrees F per hour.
MERV - Minimum efficiency reporting value, ASHRAE 52.2, for air filtration measured in particulate size
N+1 - Need plus one, a redundancy concept where capacity is configured to include planned capacity plus one additional device to enable continued operations with the failure of one system in the configuration. This presumes immediate detection and remediation of the failed unit.
NEBS - Network equipment-building system design guidelines applied to telecommunications equipment
No. - Number
Nominal Cooling Capacity - The total cooling capacity of air conditioning equipment, including both latent and sensible capacities. Due to humidity control in data centers, the latent capacity should be deducted from nominal capacity to determine useful capacity.
OpEx - Operating expense, the ongoing expenses related to operating the data center
Overcooling - A situation where air is cooled below optimum levels. Typically used in reference to rack inlet temperatures.
PDU - Power distribution unit
Pole - A row of power receptacles with power supplied from a PDU
Pole Position - A power receptacle on a pole
Pressure Differential - The difference in pressure between two locations in the data center used to analyze air flow behaviors
PH - Phase, electrical phase 1-3
Plenum - A receiving chamber for air used to direct air flow
PU - Packaged unit, an air handler designed for outdoor use
PUE - Power usage effectiveness, a measure of data center energy efficiency calculated by dividing the total data center energy consumption by the energy consumption of the IT computing equipment. This measure is the inverse of DCiE.
Rack - Device for holding IT equipment, also called a cabinet
RAH - Recirculation air handler, a device that circulates air but does not cool the air
Raised Floor - Metal flooring on stanchions that creates a plenum for airflow and cabling, synonymous with RMF
Recirculation - Chilled airflow returning to cooling units without passing through IT equipment, also referred to as short cycling
Return Air - The heated air returning to air conditioning equipment
RFI - Radio frequency interference
Rh - Relative humidity
RMF - Raised metal floor, an alternate term for the more commonly used 'raised floor'
ROI - Return on investment, a measure of the amount of time required to recover an investment
RPM - Revolutions per minute, used to measure fan speeds
RPP - Remote power panel
RTU - Rooftop unit, an air handler designed for outdoor use mounted on a rooftop. A typical application of a PU.
S+S - System plus system
SCFM - Standard cubic feet per minute, the volumetric flow rate of a gas corrected to standardized conditions of temperature, pressure and relative humidity
Sensible Cooling Capacity - Cooling capacity related to dry bulb temperature and objects that do not produce condensation
Sensitivity - An equipment setting that bounds the set point range and triggers a change in device function when exceeded. Most commonly referring to CRAC/CRAH temperature and humidity set points.
Set Point - Typically used in reference to air conditioning equipment thermostat temperature and humidity settings
Short Cycling - Chilled airflow returning to cooling units without passing through IT equipment, also referred to as recirculation
STS - Static transfer switch
Sub-Floor - The open area underneath a raised computer floor, also called a sub-floor plenum
Supply Air - The cooled airflow emitted from air conditioning equipment
TCE - Triton Coefficient of Effectiveness (SM), synonymous with UCE
Thermistor - A type of resistor with resistance varying according to its temperature
U - Rack mount unit; the standardized height of one unit is 1.75"
UCE - Upsite Coefficient of Effectiveness (SM), synonymous with TCE
UPS - Uninterruptible power supply, a device used to supply short-term power to computing equipment for brief outages or until an alternate power source, such as a generator, can begin supplying power
VFD - Variable frequency drive
W - Watt(s)
Waterside Economizer - An economizer that redirects water flow to an external heat exchanger when the exterior ambient air temperature is at or below a temperature required to chill water to a given set point, simultaneously shutting down the mechanical chiller equipment
Wet-Bulb Temperature - The temperature of the air measured using a wet-bulb thermometer, typically taken in conjunction with a dry-bulb reading in order to determine relative humidity
Wg - Inches of water column, an approximate unit of measurement for static air pressure
Work Cell - The area of a rack and the related area immediately in front of and behind the rack. Standard racks are 2 feet wide and 4 feet deep. Standard aisles are 4 feet wide, so half of that space is workspace for a given rack. This results in a standard work cell of 16 square feet. Actual work cell size varies with data center design.
WPSF - Watts per square foot
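Since PUE and DCiE are defined above as inverses of each other, both metrics fall out of the same two measurements. A minimal sketch with illustrative input figures (not measurements from any facility in this issue):

```python
# PUE and DCiE as defined in the glossary above.
# Input figures are illustrative only.
total_facility_kw = 1500.0  # everything: IT, cooling, lighting, losses
it_equipment_kw = 1000.0    # the IT computing load alone

pue = total_facility_kw / it_equipment_kw   # 1.5
dcie = it_equipment_kw / total_facility_kw  # ~0.667

assert abs(pue * dcie - 1.0) < 1e-9  # the two metrics are exact inverses

print(f"PUE:  {pue:.2f}  (lower is better; 1.0 is ideal)")
print(f"DCiE: {dcie:.1%} (higher is better; 100% is ideal)")
```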