Friday, March 21, 2014

Questions to Ask Yourself When Selecting a Data Center: Is the infrastructure built to meet or exceed Tier III standards?

According to a study by the Ponemon Institute, the cost of data center downtime across industries is approximately $7,900 per minute, which is a 41% increase from the $5,600 cost in 2010. This same study also showed that 91% of data centers have experienced an unplanned outage in the past 24 months. (Read more about the average costs involved with outages for 2013 here). Facility outages are not only financially devastating, but seriously harmful to an organization’s reputation.
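The arithmetic behind those figures is easy to sanity-check. A quick sketch (Python, with the study's per-minute numbers hard-coded; the 90-minute outage length is an illustrative assumption, not a figure from the study):

```python
# Downtime-cost figures from the Ponemon study cited above.
COST_2010 = 5600  # dollars per minute of downtime
COST_2013 = 7900  # dollars per minute of downtime

# Percentage increase from 2010 to 2013.
increase = (COST_2013 - COST_2010) / COST_2010
print(f"Increase since 2010: {increase:.0%}")  # ≈ 41%

# Illustrative: total cost of a single 90-minute outage
# (the outage length here is an assumption for the example).
outage_minutes = 90
print(f"90-minute outage: ${COST_2013 * outage_minutes:,}")  # $711,000
```

Even a brief incident at these rates runs well into six figures, which is why the availability tiers discussed below matter.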
Thankfully, the data center industry has adopted a standardized methodology to determine availability in a facility, which will help you determine what is right for your business so you can make an informed decision.  Developed by the Uptime Institute, this tiered system offers companies a way to measure both performance and return on investment (ROI).
To be considered Tier III, a facility must meet or exceed the following standards:
  • Multiple independent distribution paths serving the IT equipment
  • Concurrently maintainable site infrastructure with expected availability of 99.982%
  • 72-hour power outage protection
  • All IT equipment must be dual-powered and fully compatible with the topology of a site’s architecture
Another important element in Tier III compliance is N+1 redundancy on every main component, which provides greater protection and peace of mind for crucial IT operations by ensuring a redundant system is always available in case a component fails or must be taken offline for maintenance.
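It helps to translate that availability percentage into something concrete: allowable downtime per year. A minimal sketch (the Tier I, II and IV figures are the Uptime Institute's commonly cited expected-availability values, not numbers from this article):

```python
HOURS_PER_YEAR = 8760

def annual_downtime_hours(availability_pct: float) -> float:
    """Convert an availability percentage into expected downtime per year."""
    return (1 - availability_pct / 100) * HOURS_PER_YEAR

# Commonly cited Uptime Institute expected-availability figures by tier.
tiers = {"I": 99.671, "II": 99.741, "III": 99.982, "IV": 99.995}
for tier, pct in tiers.items():
    print(f"Tier {tier}: {annual_downtime_hours(pct):.1f} hours/year")
```

Tier III's 99.982% works out to roughly 1.6 hours of expected downtime per year, versus nearly 29 hours for Tier I.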

Each LightEdge data center, including the new Kansas City facility currently being built at SubTropolis Technology Center, meets or exceeds the concurrent maintainability requirements of the Uptime Institute's Tier III standards. With our Tier III infrastructure, any one component can fail and the data center will remain operational.

LightEdge’s Kansas City data center is scheduled to open during the spring of this year. Check out our Facebook, Twitter, Google+, and LinkedIn pages for the most recent photos of our construction progress. Download the spec sheet to learn more about the facility here.

Some things to consider when running a data center

As businesses go increasingly digital, the need for data centers to secure company information is more important than ever before. It is not only tech giants like Google and Facebook that need a place to house their information - businesses across the healthcare, government and industrial sectors are looking to data centers as a solution to their storage needs. But running a data center is not something that can be done impulsively. Whether your company has the funds and scale of operations to occupy its own center or ends up looking into existing facilities, here are some important considerations to keep in mind to maximize an enterprise data center operation.

Consider renewable energy solutions
In Hollywood movies, data centers are generally represented as massive, noise-intensive operations that actively drain energy out of whatever area they are occupying. This public perception of such facilities is understandable given that data centers must rely on a constant supply of energy - after all, their functionality depends on remaining active at all times. But just because they harness energy sources does not mean data centers can't function in an environmentally-minded, sustainable way.
Just ask Google, a company that has been dealing with data storage needs ever since it rented its first data storage facility - a closet-sized, 7-foot-by-4-foot operation with a mere 30 computers - in 1998, according to CNET. Google has come a long way since then, and so has its dedication to sustainable methods of data center operation. The tech giant now has a vast network of data centers spanning the globe.
What unites Google's facilities is a singular commitment to renewable energy. With renewable energy currently powering more than a third of Google's data storage facilities, the company is always looking for ways to expand the use of solar and wind power, according to its site. Because it is challenging to have a renewable power generator on location, the company did the next best thing: it reached out to renewable energy providers in the area - such as wind farms - and made deals to buy energy from them. Among Google's energy suppliers are wind farms in Sweden and Oklahoma. Through these sources, the company is not only able to power its facilities sustainably but also to benefit the local communities.

Have good outside air cooling

When it comes to maintaining an optimal data room temperature, it's best to follow the lead of companies well-versed in data storage. Google and Microsoft are two such businesses, and they both share a commitment to harnessing natural resources to keep their data centers cool.
In Dublin, Microsoft has invested more than $800 million to date in order to build a data center that covers almost 600,000 square feet. The enormous size of the facility would seem to present a major cooling challenge, but the company has been able to surmount that by using fresh air cooling, Data Center Knowledge reported. By building the center in Ireland, where the temperature remains optimal for data room cooling, Microsoft is able to maximize the location as a natural cooling solution - a move that saves significant energy costs while keeping the company environmentally friendly as well. And its commitment to environmentally sound solutions does not end with cooling: the center also recycles 99 percent of the waste it produces.
Google has a similarly cooling-minded approach at its data facility in Finland, which it hopes will be almost completely powered by wind energy by 2015, according to Data Center Knowledge. The wind energy will come from a wind park located nearby. But the center is not waiting until then to implement good temperature practices. Instead of relying on chillers and other machine-based cooling techniques, Google uses seawater from the nearby Gulf of Finland to cool the facility. Its efforts in Finland are part of a broader effort to expand the Google sphere of influence.
"The Google data center in Hamina offers Eastern Finland a tremendous opportunity to jump from the industrial to digital age," said Will Cardwell, a professor at a nearby university.

But just as important as what goes on inside a center is the environment around it, because data centers are invariably shaped by their physical surroundings. With that in mind, here are some more things to look into in order to maximize your data center's potential.

Choose the location wisely

Considering that data centers are necessarily connected to the physical environment they inhabit, it is important to pinpoint the best location possible. Data centers are always going to require top-notch capabilities to maintain a good server room temperature, but the ease with which that happens can depend on the location of the center. As always, Google is at the top of the game with regard to location selection. Its Hamina, Finland center is strategically placed near the Gulf of Finland, enabling an easy and natural data room cooling solution.
But Google is not the only company maximizing natural environments for data center growth. Iron Mountain specializes in underground data center solutions, according to Data Center Knowledge. Formerly a storage company for physical records, Iron Mountain already had a 145-acre underground storage facility in a former limestone mine before it got into the data center business. This location turned out to be perfect for data center needs. Blocked from the sunlight and other external heat sources, the underground facility stays at about 52 degrees without any kind of additional cooling function. An underground lake provides further protection against ever needing to bring in a machine cooling system. The company's so-called "data bunker" gained so much popularity that Iron Mountain decided to expand its sphere of operations.

Give back to the community the center is in

Data centers often require a large staff to operate. Fortunately, they're usually built near communities from which workers can be hired. But as much as data centers plan to benefit from the community they inhabit, it is just as important to look for ways to give back. This kind of behavior encourages connectedness with the community and improves the reputation of the center - and therefore the company - in the public eye.
Google paid special attention to the local community as it developed its Hamina center. When Google began mapping out the concept for the center, it realized that construction would take about 18 months, and so it turned to the locals for help. In the process, it provided steady employment for 800 workers in the engineering and construction sectors, according to Data Center Knowledge. Google's willingness to involve locals in the construction process helped forge a lasting bond between the tech giant and the city.
This bond did not go unnoticed.
"Google's investment decision is important for us and we welcome it warmly," Finnish Prime Minister Jyrki Katainen said.
And for those who work at the center, life is good.
"No two days are the same as we change our roles around frequently to keep things fresh and new," said Julian Cooper, a hardware operations worker at the facility.

Be prepared to surmount environmental obstacles

In the event of a disaster like a hurricane or earthquake, it is vitally important for all enterprises - especially data centers - to make sure their stock is safe. Iron Mountain understands the principle of environmental preparedness quite well, which is why it offers underground data storage solutions. By storing data underground, Iron Mountain protects it against virtually any natural disaster. This nature-proof construction is especially important for companies like Marriott, which chose to house data at the Iron Mountain bunker because of the sense of complete security it afforded.
"We have always had a rigorous and constant focus on having disaster preparedness in place," said Marriott operational vice president Dan Blanchard. "Today we have a data center that provides Marriott with a tremendous capability for disaster recovery, and we have a great partner in Iron Mountain."
According to tech journalist David Geer, earthquakes pose a huge threat to data centers in many areas around the world, since they can be difficult to predict and potentially cause large-scale damage. If a company intends to build its facility in an area susceptible to earthquakes, it should apply the most stringent safeguards, including building a center that is capable of withstanding a quake one degree higher than the requirement for the zone it occupies.

Iron Mountain
Charles Doughty is Vice President of Engineering, Iron Mountain, Inc.
We’re all familiar with Moore’s Law, which states that the number of transistors on integrated circuits doubles approximately every two years. Whether we measure transistor counts, magnetic disk capacity, price-performance, computations per joule, or any other metric, one fact persists: they are all improving exponentially. This growth is the cause of the density issues plaguing today’s data centers. Simply put, more powerful computers generate more heat, which results in significant additional cooling costs each year.
Today, a 10,000 square-foot data center that is running about 150 watts per square foot costs roughly $10 million per megawatt to construct, depending on location, design and cost of energy. If the roughly 15 percent annual rate of data growth of the last decade continues over the next decade, that same data center would cost $37 million per megawatt. A full 30 percent of these costs are related to the mechanical challenges of cooling a data center. While the industry is experienced with the latest chilled water systems and high-density cooling, most organizations aren’t aware that Mother Nature can deliver the same results for a fraction of the cost.
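The projection is simple compound growth. A quick sketch (note that a full 15 percent per year actually compounds to about $40 million over a decade; the $37 million quoted above implies an effective rate closer to 14 percent):

```python
def compound(cost_now: float, annual_growth: float, years: int) -> float:
    """Project a cost forward under constant compound annual growth."""
    return cost_now * (1 + annual_growth) ** years

base = 10_000_000  # dollars per megawatt today
print(f"15%/yr for 10 years: ${compound(base, 0.15, 10):,.0f}")  # ≈ $40.5M
print(f"14%/yr for 10 years: ${compound(base, 0.14, 10):,.0f}")  # ≈ $37.1M
```

Either way, the order of magnitude is the same: costs roughly quadruple in a decade if nothing changes on the cooling side.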

The Efficiency Question: Air vs. Water

Most traditional data centers rely on air to cool the IT equipment directly. But when we analyze the heat-transfer formulas, it turns out water is far more efficient at cooling a data center, and the difference is in the math, namely the denominator of the flow calculation.
With the example above, the energy consumed by the 10,000 square-foot data center creates over 5 million BTUs per hour of heat rejection. Applying the standard heat-transfer formulas and assuming a delta T of 10 degrees, this data center would require more than 470,000 cubic feet per minute (CFM) of air to cool the facility, but only 1,000 gallons of water per minute. To move that much air, the system would need between 150-200 horsepower, versus only 50-90 horsepower to convey 1,000 gallons per minute. Measured by required flow, water comes out roughly 462 times more efficient, and if analyzed on a per-cubic-foot basis - one cubic foot of air against one cubic foot of water - water is actually about 3,400 times more efficient than air.
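Those numbers follow from the standard sensible-heat sizing formulas used in HVAC work: Q = 1.08 × CFM × ΔT for air and Q = 500 × GPM × ΔT for water, with Q in BTU/hr. A quick sketch using a nominal 5 million BTU/hr load (the article's slightly higher CFM figure suggests its load was a bit above 5 million):

```python
# Standard sensible-heat formulas (Q in BTU/hr):
#   air:   Q = 1.08 * CFM * dT
#   water: Q = 500  * GPM * dT
Q = 5_000_000   # BTU/hr of heat rejection from the example
dT = 10         # degrees F temperature rise

cfm = Q / (1.08 * dT)  # airflow needed to carry the heat
gpm = Q / (500 * dT)   # water flow needed to carry the same heat

print(f"Air:   {cfm:,.0f} CFM")            # ≈ 463,000 CFM
print(f"Water: {gpm:,.0f} GPM")            # 1,000 GPM
print(f"Flow ratio: {cfm / gpm:,.0f}x")    # ≈ 463x in water's favor
```

The denominator really is the whole story: water's far higher density and specific heat mean a given heat load needs orders of magnitude less volumetric flow.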

Physics 101: The Thermodynamics of the Underground

However, for an underground data center, there’s more at work. In a subterranean environment, Mother Nature provides a consistent ambient temperature of about 50 degrees, so you depend far less on mechanical cooling from the start. Further efficiencies come from using an underground water source, or aquifer.
The ideal environment for a subterranean data center is made of aquifers, or stone with open porosity such as basalt, limestone and sandstone; aquicludes, such as dense shales and clays, will not work as effectively. In a limestone subterranean environment, heat rejection can increase by as much as 500 percent because of the natural heat-sink characteristics of the stone: the limestone absorbs heat, which further reduces the need for mechanical cooling. The most appealing implication is that the stone can manage the energy fluctuations and peaks inherent to any data center.
As the water system funnels 50-degree water from the aquifer to cool the data center, heat is rejected into the water, which returns about 10 degrees warmer. Mother Nature deals with that heat by obeying the second law of thermodynamics, which governs equilibrium and the transfer of energy. For the subterranean data center operator, this means working within the conductivity of the surrounding rock, so it is important to know the lithology and geology of the local strata, along with understanding the effects of a continuous natural water flow and the psychrometric properties of air.

The Cost of Efficiency

Of course, there are other data center cooling strategies being used aside from the subterranean lake designs including well systems, well point systems and buried pipe systems to name a few. Right now, well systems are being used in Eastern Pennsylvania to cool nuclear reactors producing hundreds of megawatts of energy with mine water. Well point systems are generally used in residential applications, but the concept doesn’t scale well without becoming prohibitively expensive. Buried pipe systems are used quite a bit and require digging a series of trenches backfilled with a relatively good conductive granular material, but beyond 20-30 kilowatts, this method does not scale well.
How much does each of these methods cost? An underground geothermal lake design will cost less than $500 per ton, while well-designed chilled-water systems range from $2,000-4,000 per ton. The discrepancy in cost is created by the mechanics - in a geothermal lake, there are no mechanics: water is simply pumped at grade. Well and buried-pipe systems can cost more than $5,000 per ton, and these systems do not scale very well.
By understanding Mother Nature and using her forces to our advantage, we can increase the capacity and further improve the effectiveness of the geothermal lake design. By drilling a borehole from the surface into the cavern, air transfer mechanisms can easily be incorporated; anytime the air at the surface is at or below 50 degrees, that cool air will drop into the mine. Even without motive force or air handling units, a four-to-five-foot borehole can contribute about 30,000 cubic feet of air per minute. If an air handling unit is added, the 30,000 CFM of natural flow can easily become 100,000-200,000 CFM. What was a static geothermal system is now a dynamic geothermal cooling system with incredible capacity at minimal incurred cost.

Opportunities for the Future

When analyzing and predicting what data centers are going to look like in the future, a recurring theme starts to emerge: simplicity and lower cost. Because of the cost pressures facing IT departments and CFOs alike, underground data centers using hybrid water, air and rock cooling mechanisms are an increasingly attractive option.
There are even opportunities to turn these facilities into energy creators. For example, by adding power generating turbines atop boreholes, operators can harness the power of heat rising from the data centers below. Furthermore, by tapping into natural gas reserves, subterranean data centers could become a prime energy source, thus eliminating the need for generators and potentially achieving a power usage effectiveness measurement of less than one. The reality is that if you know Mother Nature well, you can work with her – she’s very consistent – and the more we learn, the more promising the future of data center design looks.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena.

Friday, March 14, 2014

Going underground: Will data centers become data bunkers?

Hong Kong is one of the most vertical cities in the world. Like many cities, when it ran out of space, it simply built up instead of out.
But Hong Kong has almost reached its upper limit for going skyward. And as a financial capital of the world with an incredibly dense population, space is at a premium. That means that large dedicated computing areas, like data centers, have to compete for space with everything else, even though they are in great demand.

The solution for Hong Kong might be to stop building skyward and to start looking under its feet. Data Center Knowledge recently reported that Hong Kong may dig out rock caves under the city and build new data centers down there. Putting a data center in a deep cave isn't such a bad idea, because the naturally cooler below-ground air could help maintain temperatures as long as the cave is properly ventilated.
The biggest problem with the underground concept is likely to be price. In the Hong Kong scenario, it was estimated that digging out a tunnel for a data center would cost up to $600,000 per meter. That makes for an expensive project that all but the most profitable data center operators would be hard-pressed to justify.
But there may be other advantages to building underground. Other underground data center projects are in the works, or have already been completed, in other places, using decommissioned military bunkers as their base of operations. Swedish ISP Bahnhof converted a bunker below central Stockholm into a state-of-the-art data center back in 2008.
The main advantage of using a military bunker, besides the fact that it has already been dug out, is that bunkers were built to survive a nuclear war. Governments looking for the ultimate level of security may want to consider it.
Even with the structure already in place, it will still be expensive to store data in a nuclear-proof bunker, and credit card companies will probably be among the first to take advantage of it. So you can rest easy knowing that in the event of a nuclear war, both cockroaches and your MasterCard bill will survive.

Cavern Technologies builds area’s largest data center underground

Pete Clune, CEO of Cavern Technologies, is pictured in one of the data suites in its underground data center at Meritex Lenexa Executive Park.

Cavern Technologies, located in the underground portion of Meritex Lenexa Executive Park, has begun a $10 million, 100,000-square-foot expansion that will make it the largest data center in the Kansas City region.
The company, which has developed 60,000 square feet of underground space since its founding in 2007, will leapfrog ahead of 1102 Grand LLC in Kansas City, now the region’s largest data center with 110,000 square feet.
Pete Clune, founder and CEO of Cavern Technologies, said several factors were driving the company’s growth.
One is the fact that the data-storage needs of Corporate America are roughly doubling every year and a half, Clune said. Cavern Technologies’ more than 90 existing tenants are expanding their data-storage capacity by 30 percent annually, he said.
Another growth driver is the cost and reliability of electric power provided by Kansas City Power & Light, Clune said. KCP&L charges 8 cents per kilowatt hour, about half of what utilities on the coasts charge.
Cavern Technologies charges its tenants based on power consumed rather than square footage occupied. But as part of its unique data center colocation model, Clune said, it developed the concept of data suites, which allow clients to house servers in dedicated spaces rather than on racks in a huge shared space.
Tenants also like the advantages of having their data center space underground, Clune said. Being “a data center without walls,” he said, Cavern Technologies gives tenants the flexibility of moving and expanding their space quickly. It also protects data from natural disasters and offers a 65-degree ambient temperature that helps tenants minimize cooling costs. A remote energy monitoring system and suite-design recommendations from Cavern Technologies’ staff also keep power costs down, said Scott Herron, the company’s vice president of data center operations.
In addition to providing access to multiple KCP&L substations, Cavern offers high-capacity bandwidth from multiple carriers. The data infrastructure is so robust, Clune related, that a London-based company with space in the center reported that it could send data from London to Lenexa to Scotland faster than it can send it directly to Scotland.
“With Cavern’s focus on the infrastructure piece, including the space, power, cooling, security and bandwidth, our client’s IT department can focus on their unique mission-critical business operations,” Clune said.
The model has attracted some of the nation’s leading health care, financial services, legal and tech companies, said Clune, whose son John is president of the company.
They have guided the underground business to full occupancy in its present 60,000-square-foot space. In addition, the company has secured commitments for 25 percent of the 100,000 square feet now being built out.
The company will finish the year with nearly $6 million in revenue, Pete Clune said, and will be posting $15 million to $20 million by the time the additional 100,000 square feet is fully occupied.
JE Dunn Construction is the contractor for the expansion. Bell/Knott & Associates is providing the data center design, and Gibbens Drake Scott Inc. is the engineering firm.
Bill Seymour, a senior vice president with Meritex Enterprises, which owns the underground park, said he began working with Pete Clune 10 years ago, when he operated a managed services firm at Meritex.
“I never imagined the type of scale Cavern has now achieved,” Seymour said. “Pete and John have proven the concept, done what they said they would do and give the customers what they want.”

Helsinki's underground master plan

CNN's Richard Quest takes a look at the development of Helsinki's vast underground and eco-friendly programme.

Iron Mountain

SOLABS Chooses Iron Mountain to Host its Data Center

Life sciences eQMS software provider selects Iron Mountain for its secure and compliant data center services

Iron Mountain(R) Incorporated (NYSE: IRM), the storage and information management company, announced it has signed a multi-year colocation agreement with SOLABS, an Enterprise Quality Management software (eQMS) provider for life sciences companies. SOLABS will co-locate aspects of its data center operations within Iron Mountain's underground data center complex in western Pennsylvania.
SOLABS' quality management software helps life sciences organizations automate quality operations and comply with Food and Drug Administration (FDA) regulations. Until now, the SOLABS QM software solution has been sold to customers as licensed software and installed on the client's network. Now, in response to customer demand, the company is preparing to offer the software as a SaaS/cloud offering. In order to make this switch, the company sought a secure and compliant data center home for the new offering. SOLABS selected Iron Mountain based on its track record of protecting customer information in secure and compliant facilities.
"Our customers entrust us to protect their electronic records, and we take that extremely seriously," said Philippe Gaudreau, chief executive officer, SOLABS. "When we made the decision to co-locate our data center operations, we wanted our customers to have peace of mind knowing where their data is being stored. Iron Mountain is a well-known and trusted storage and information management company that already provides services to many of our life sciences clients. This seemed like a natural fit for us."
"We appreciate SOLABS' business and believe our secure data center colocation facility is a perfect home for their company's new SaaS offering," said Mark Kidd, senior vice president and general manager, data centers, Iron Mountain. "Iron Mountain Data Centers is designed to help companies like SOLABS in highly regulated businesses. From employee training to the infrastructure of our buildings, we take a stringent approach to complying with industry-specific regulations such as PCI, FISMA and HIPAA. Our DNA in managing information assets from creation to destruction also differentiates us to these organizations. No one in today's data center market has our track record in security and facilitating compliance."
Gaudreau added: "We wanted a cloud platform that allows us to provide a 24/7 SaaS offering and options to our on-premise clients such as constant monitoring, remote test environments, offsite backup and disaster recovery programs. By partnering with Iron Mountain, we will move closer to our vision of 'feeling local' for every client, no matter where the SOLABS QM software resides."
After more than a decade of providing wholesale data center space to corporate and government organizations, Iron Mountain now offers retail colocation services for companies that do not require a dedicated data hall. Iron Mountain's wholesale offering provides dedicated, secure space for all or part of an organization's data center operations and offers a range of services, including engineering and design, development and construction, and ongoing facility operations and management. Iron Mountain's retail colocation solution provides customers a shared environment with highly reliable, scalable, and secure power and cooling. Iron Mountain Data Centers customers also have access to additional services such as migration, networking, tape handling, IT asset tracking, disposition and more.
Based in western Pennsylvania, Iron Mountain's underground data center is ideal for enterprises and government clients seeking an ultra-secure environment. The former limestone mine is 220 feet below ground, providing both efficient geothermal cooling and natural protection from extreme weather.
For more than thirteen years, SOLABS has helped organizations in the Life Sciences industry improve their operational efficiency and maintain compliance with '21 CFR Part 11' by automating their Quality Operations. SOLABS QM 10.0, released in 2013, is the company's sixth major release, exclusively dedicated to serving the Life Science Industry. The software allows companies to manage Quality Processes such as CAPA, Complaint, Change Control, Audit, Controlled Documents (such as SOPs, Work Instructions and Methods) as well as Employee Training Activities in one single user interface. SOLABS QM can be implemented as a 'stand-alone' system but it is also compatible with and may be fully integrated into ERP, LIMS and other enterprise software. For more information on SOLABS, contact Ericka Moore ( or 1-877-322-1368 ext. 219).
About Iron Mountain
Iron Mountain Incorporated (NYSE: IRM) is a leading provider of storage and information management services. The company's real estate network of over 64 million square feet across more than 1,000 facilities in 36 countries allows it to serve customers with speed and accuracy. And its solutions for records management, data backup and recovery, document management, and secure shredding help organizations to lower storage costs, comply with regulations, recover from disaster, and better use their information for business advantage. Founded in 1951, Iron Mountain stores and protects billions of information assets, including business documents, backup tapes, electronic files and medical data. Visit for more information.

After more than a decade of operation, City Utilities is looking to sell off the climate-controlled data center it operates out of the Springfield Underground.
It's not because the facility, known as SpringNet Underground, hasn't been successful - the cavernous space in the former limestone quarry has proved popular with businesses that need a secure place to store computer servers and other equipment.
So popular, in fact, that "we're coming to the point where we're out of space," CU General Manager Scott Miller said.

SpringNet® is a division of CU that offers telecommunication services.

 SpringNet Continues Driving Jobs and Revenue for Local Community
A year has passed since we covered SpringNet in Springfield, Missouri, and its remarkable impact on local businesses and economic development. We recently spoke with SpringNet Director, Todd Murren, and Network Architecture Manager, Todd Christell, to get an update on how the network is progressing.
Demand for SpringNet’s high-speed data services continues to grow steadily. Financial statements for City Utilities of Springfield show the network generated $16.4 million in operating revenue last year against costs of $13.2 million. Better yet, revenues have increased around 3% per year while cost increases are closer to 0.5%. The end result is close to $3 million in annual net income for SpringNet. And all of this comes from a network that only serves commercial and public sector clients, because Missouri state law restricts municipal network provision to only “Internet service,” meaning SpringNet cannot offer triple-play packages to compete with incumbent providers.
One of the highlights of SpringNet’s economic development success has been the attraction and retention of travel giant Expedia. After a large national provider failed to deliver on negotiations with the company, SpringNet stepped in to make sure Expedia brought its call center to Springfield. That effort has paid off handsomely for SpringNet and the local community. Expedia now employs close to 900 in the area after announcing in July that it was hiring another 100 employees in Springfield.
Up next for SpringNet is an effort to leverage its fiber infrastructure to create even more jobs. Believing that future job growth will revolve around the advancements enabled by gigabit networks, SpringNet is working with the Mid-America Technology Alliance (MATA) to host a hackathon with partners in Kansas City to explore what is possible between gigabit cities.
As Murren and Christell tell it, someone in Springfield can now send data to Kansas City with a 5-millisecond delay. It’s as if the two cities were in the same building despite being hundreds of miles apart. This capability spells opportunity for new ways of doing business and delivering services. SpringNet wants to help the gigabit community develop these opportunities.
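The 5-millisecond figure is plausible on physics alone. Light in optical fiber travels at roughly two-thirds of its vacuum speed, so a back-of-the-envelope check (the 300 km route length is an assumption; actual fiber paths between Springfield and Kansas City vary) looks like this:

```python
# Sanity check on the quoted 5 ms Springfield-to-Kansas City delay.
C = 299_792.458      # speed of light in vacuum, km/s
FIBER_FACTOR = 0.67  # light travels at roughly 2/3 c in optical fiber
ROUTE_KM = 300       # assumed one-way fiber route length

one_way_ms = ROUTE_KM / (C * FIBER_FACTOR) * 1000
round_trip_ms = 2 * one_way_ms
print(f"one-way ≈ {one_way_ms:.2f} ms, round trip ≈ {round_trip_ms:.2f} ms")
```

Propagation alone comes to about 3 ms round trip; switching and routing overhead easily accounts for the rest, so 5 ms is consistent with a direct fiber path.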
Deep underground in Butler County, the Iron Mountain National Data Center functions like its own little city. With more than 3,000 badged employees, the former mine has its own fire department, restaurant, transit systems, water treatment center, and medical center. (Justine Coyne/Pittsburgh Business Times).

But what may be most surprising about this unique facility, located outside of Pittsburgh in Boyers, are the treasures stored inside.
A limestone mine owned by United States Steel Corp. (NYSE: X) in the early 1900s, the facility today houses everything from original prints of classic motion pictures to data storage for more than 2,300 businesses and government agencies across the U.S.
Nick Salimbene, director of business development for Iron Mountain, said what makes the facility, located 220 feet underground, so unique is that it is virtually impervious to natural disasters. Salimbene said being housed underground also provides a stable environment that naturally maintains a temperature of 52 degrees.
Of the 145 acres that are developed, Salimbene said, about 70 percent is used for physical storage. But demand is shifting toward data storage, which he said has been increasing between 20 percent and 30 percent annually in recent years. Today, he said, about 80 percent of what comes into the center is data.
For the privacy of its customers, Iron Mountain does not disclose who uses the Boyers facility by name, but the list includes many big names. One tenant Salimbene could discuss is Corbis Corp., which houses its collection of over 20 million photographs there.
"There's a little bit of everything here," Salimbene said. "But the most important thing for us, is that our customers feel secure having their items located here."
Obviously, with such valuable objects in its facility, security is very tight at Iron Mountain, with armed guards keeping watch 24 hours a day, seven days a week.
"We have companies trusting us with their most valuable assets," Salimbene said. "That's not something we take lightly."

Saturday, November 2, 2013

Surprise Visitors Are Unwelcome At The NSA's Unfinished Utah Spy Center (Especially When They Take Photos)

 Government is not reason; it is not eloquent; it is force. Like fire, it is a dangerous servant and a fearful master.
George Washington

If the freedom of speech is taken away then dumb and silent we may be led, like sheep to the slaughter.
George Washington

Most people who visit Salt Lake City in the winter months are excited about taking advantage of the area’s storied slopes. While skiing was on my itinerary last week, I was more excited about an offbeat tourism opportunity in the area: I wanted to check out the construction site for “the country’s biggest spy center.”

An electrifying piece about domestic surveillance by national security writer James Bamford that appeared in Wired last year read like a travel brochure to me:
In the little town of Bluffdale, Big Love and Big Brother have become uneasy neighbors. Under construction by contractors with top-secret clearances, the blandly named Utah Data Center is being built for the National Security Agency. A project of immense secrecy, it is the final piece in a complex puzzle assembled over the past decade. Its purpose: to intercept, decipher, analyze, and store vast swaths of the world’s communications as they zap down from satellites and zip through the underground and undersea cables of international, foreign, and domestic networks. The heavily fortified $2 billion center should be up and running in September 2013. Flowing through its servers and routers and stored in near-bottomless databases will be all forms of communication, including the complete contents of private emails, cell phone calls, and Google searches, as well as all sorts of personal data trails—parking receipts, travel itineraries, bookstore purchases, and other digital “pocket litter.”
My outing to the facility last Thursday was an eventful one. I can confirm that the National Security Agency’s site is still under construction. It was surprisingly easy to drive up and circle its parking lot. But if you take photos while there, it is, much like Hotel California, very hard to leave.

When the University of Utah professor who invited me to Salt Lake City to talk to his students asked how I wanted to spend three hours of downtime Thursday afternoon, the super-secret spy center was at the top of my list. The professor, Randy Dryer, was dubious about the value of visiting the construction site, assuming there would be a huge fence that would prohibit us from getting close or seeing anything significant. That turned out not to be the case.
View from the highway. There's a similar bird's eye view available on Google Earth.
We drove about 30 minutes south of downtown Salt Lake City to an area described to me as “out in the desert.” As we got close, I could see from the highway four grey mortared buildings that will soon be holding massive amounts of the world’s data. They appeared half-finished. I snapped some photos with my iPad (which, yes, does make me feel like a ridiculous person).
Then we came to a paved turn-off on the right that led directly to the facility. Driving up the road, we came to a sign emblazoned with the seals of the National Security Agency and the Office of the Director of National Intelligence; it was topped with a digital banner that proudly declared in flashing lights, “Look This… Sign Works!!!!” Behind the sign was a building that looked like a gas pumping station, minus the pumps. We took a right into a parking lot, where I snapped photos of the majestic view of the mountains that NSA data workers will have, another building that looked almost like a visitor’s center (it is #1 — a $9.7 million Visitor Control Center — in this diagram from Wired), and closer views of the data center and the unimposing, barbed-wire-topped fence that surrounded it. That seemed to be the end of the tour. I expressed surprise to Randy Dryer that no one had come out to see why we were slowly driving through the lot.
Two minutes later as we circled back to the flashing sign to take a few more photos, including one of a green sign with an arrow that read, “Rejection Lane,” a uniformed but baby-faced officer with NSA and “K9 unit” badges came out and walked up to the car.
Where we stopped for an hour to 'engage in a chat'
“Were you taking photos?” he asked. I said that I was. He responded, “You’re going to need to delete those.”
I explained that I was a journalist and that I preferred not to. He insisted, saying we were on restricted federal property and that taking photos there was illegal. Luckily for me, Randy Dryer is not just a university professor but a practicing and long-experienced media lawyer. He explained to the officer who we were, why we were there and that we hadn’t realized we were on restricted property. The officer, who carried a gun and a portable radio, began writing everything we said down in a little green notebook. When the officer insisted again that the photos be deleted, Dryer asked if we could talk to his supervisor.
At this point another uniformed officer pulled up behind us. He came up to the car and went essentially through the same question-asking routine while the first officer, who took our driver’s licenses, walked away from the car to call his supervisor. Officer #2, who seemed slightly older than the first but who also carried a little green notebook to record what we had to say, told us he would like for me to delete the photos, and mentioned that it would be easier if we did and that we could be charged with a crime for trespassing and for taking the photos.
Honestly, I was starting to feel pretty nervous at this point but also painfully aware of the irony of the situation. They didn’t want me to capture information about a facility that will soon be harvesting and storing massive amounts of information about American citizens, potentially including many photos they’ve privately sent.
I also remembered that I’d recently turned the passcode off on my iPad so it wouldn’t lock up on me during a presentation to political science students about “privacy watchdogs;” I suddenly had a strong urge to turn it back on.
View from the parking lot
We sat there for about 30 minutes with the car window down and the cold Utah air making its way inside. As we waited for “the supervisor,” we began chatting with the NSA officers. They asked for more information about us, including whether we had guns in the car. (This wouldn’t be hugely surprising in the state of Utah, but we did not.) I confessed that the photos I had weren’t terribly revealing. “You can see the facility from the highway,” I argued. One of the officers grimaced at that and suggested that this had occurred to him and that he “didn’t think they built it in the best spot.”
“We didn’t see any signs on our way in,” said Dryer. “They must be tiny.”
“Yeah, that road recently opened,” said Officer #1. “I was just thinking the other day as I was driving in that those signs are too small.”
I said that I expected the construction to be farther along at this point, given that the center is due to be completed in seven months. They said this is “just the half I can see.” Gesturing at the rather puny looking fence, I told them it didn’t seem very high-security.
“Oh it’s stronger than it looks,” replied Officer #2. “It would stop a tractor trailer.”
“Yeah, but too bad it’s not higher and not see-through,” added Officer #1.

Meanwhile, I saw a more casually dressed man make his way into the back entrance of the building that would hold the junk food, sodas, and cash register if this were indeed the gas station it appeared to be. Officer #1 went in to join him. We asked who the supervisor was. “Agent Federman,” we were told. (It sounded like “Federman;” I’m not sure about the spelling.)
The mountainous view for Data Center workers
We sat in the car some more, while they — I assume — ran background checks on us, Googled us, checked my Forbes credentials, poked around my Facebook page and called other supervisors, and perhaps a Public Information Officer to decide what to do about us. After maybe another 15 minutes, an aggressively chummy man with piercing blue eyes, wearing a sweater and slacks, came out to the car. He introduced himself as a special agent and asked us to explain why we were there, with an aside to Officer #1 that he wanted him to record everything. Dryer offered a lengthy explanation, including all of the classes I’d spoken to. Agent Federman responded with a direct question: “Did anyone send you to take those photos and do you plan to distribute them to enemies of the United States?”
I would have laughed at that had I not been so intimidated and nervous. I said no one sent me and that I didn’t intend to do that. He asked why I did take them. I said I was amused by the sign and wanted to document the trip, and that I’m a journalist and recording information is what I do. He asked whether I would distribute or publish them, and I said again that I was a journalist so that was a possibility. He asked if I had already sent them from my device elsewhere. While the thought had certainly crossed my mind, I had not emailed, Facebooked, or Instagrammed them (yet). He asked me to describe the photos I’d taken, which I did.
He asked me again if I would delete them saying this would make things easier. Feeling like Bartleby the Scrivener by that point, I told him that I would prefer not to. He told me I could have called the Public Information Office, requested a tour and gotten official photographs; he suggested I delete my photos and do that instead. (It struck me at that moment as his version of “come back with a warrant.”) Dryer asked if we could go on a tour now. “No,” he responded. He went back inside the building.
I later contacted James Bamford, the author of the Wired article, to ask whether he requested a tour of the facility. He did not as it was just a hole in the ground when he first wrote the article many months before it came out. “But, having written about NSA for years, I’ve had little success in getting ‘tours’ of NSA facilities,” he said by email.
Now Officer #1 began asking for more information, such as my home address, the name of my hotel in Salt Lake City, where we had been driving from and where we were driving to. (If I didn’t have a government intelligence file before, I certainly do now.) He also asked for our social security numbers. We declined to give them – though I suspect it wouldn’t be very hard for these types to get them if they wanted them.
We began chatting again. Officer #2 expressed some personal discomfort around having photos taken, saying that if a photo of him was taken and put on the Internet that someone might come after him just because of who he worked for. “I had enough of that in the Army,” he said.
Officer #1 said they had to protect against “just anyone coming up here.” After the Wired article came out, there were two “sovereign citizens” who drove up and wanted to know “exactly what was going on in there;” the guards turned them away. The sovereigns are considered a domestic terrorist movement by the FBI. Officer #1 mentioned that both Dryer and I had clean records.
Our encounter with the officers started around 3:30. At this point, it was nearing 4:30. I was wondering if there was going to be a showdown and whether they were going to seize my iPad. I started thinking about whether I might have anything sensitive on there that I needed to worry about.
Agent Federman came back out. This time he came around to my side of the car. “Can I see the photos?” he asked. I was hesitant but it seemed like a reasonable request; plus, I was starting to fear that a federal citation was going to be my souvenir from this trip. So I scrolled through the photos I’d just taken on my iPad for him. He apparently didn’t see anything too objectionable. He asked me to go through again and count them. There were 13. He asked if I would delete two of the photos, which showed a K9 unit SUV including its license plate. I didn’t want to out of principle, but after an hour of being detained in a cold car – or as they described it, “engaging in a chat,” – I was really wanting to leave. I agreed to do so. That’s when they let us depart.
Coincidentally, the gas station-looking building where we were questioned turned out to be an entrance where people working in the facility will presumably have to show their credentials. Our interrogation took place in the lane with the green sign that read “Rejection Lane.” We were the first of the rejects.
The only warnings to stay away from the facility at this point

On the road on the way out, we noticed the signs we had missed: a speed limit sign, a small yellow one to the right side of the road that said “authorized personnel only” and another at the turn off – on the opposite side of the road, in front of a big field – that said “no trespassing.” We stopped at each one, of course, so I could take pictures of them.
It was an intimidating hour. While I’ve interviewed federal agents for stories, I’ve never been interrogated by them before. We may have been treated as gently as we were because I’m a mainstream journalist with a prominent platform and because I was accompanied by a lawyer. I was grateful that I could hold up “professional journalist” as my own badge; it felt protective.
I suspect this would’ve been a much more difficult encounter for someone without journalism credentials. That’s despite the fact that people have legitimate questions about the lengths to which intelligence agencies are going in order to monitor our communications and electronic activity to look for threats. My trespass and capture of information about the center was easy for NSA officers to spot, but the extent of the electronic trespassing against American citizens that might occur inside that data center when it’s finally completed will be much harder for us to discern. And, as the Supreme Court recently ruled in turning back a challenge to U.S. government surveillance of communications with people abroad, if you can’t prove that an unconstitutional invasion of privacy is happening, you can’t stop it from happening.
Kashmir Hill, Forbes Staff

NSA's Utah Data Center Suffers New Round Of Electrical Problems

The NSA’s Utah data center is still struggling to get up and running. The Wall Street Journal reported earlier this month that the site slated to hold exabytes of NSA spy data has been suffering from lightning arcs and meltdowns that have destroyed hundreds of thousands of dollars worth of equipment and prevented the NSA from using the center for its intended purpose: massive data storage and mining. The WSJ reported there had been ten incidents thus far. A source familiar with the project says the center underwent yet another shutdown over the weekend after electrical problems on Thursday and Friday.
The data center was shut down through Tuesday. The source says there aren’t “arcs and fires anymore,” but the experts on site still haven’t figured out what’s causing the problems. They have, however, figured out how to prevent the lightning-like flashes.
“They’re seeing a pattern of where it gets to the meltdown point and they stop it before it blows again,” says the source. The source says that contractors have been injured and taken to the hospital due to electrocution, but not in the most recent shutdown.

NSA spokesperson Vanee Vines provided a statement about the problems at the site that had previously been provided: “The failures that occurred during testing have been mitigated. A project of this magnitude requires stringent management, oversight, and testing before the government accepts any building.”

“As we’ve said, acceptance testing is underway,” she added.
The facility’s headaches are not limited to what’s happening inside the building. A protest group, Restore The Fourth, has adopted the highway in front of the data center.
“The group plans to carry picket signs as it picks up litter,” reports the Salt Lake Tribune. According to a Facebook event posting, the group is planning its first highway cleanup/mass surveillance protest on Saturday, October 26. With the electrical problems plaguing the billion-dollar data center, rendering it inoperable, the group might prefer protesting government waste rather than spying.

 AMERICAN BLACKOUT, Nuclear EMP Attack, Nuclear EMP--The Ultimate Cyber Threat, The Sun--The Inevitable Cyber Threat, A Real Life Nightmare Nearer Than You Think.

National Geographic is to be applauded for American Blackout which is essentially a training film to educate the American people about the very real threat posed to their lives by a cyber attack on the electric grid.  If there is any fault or unrealism in the docudrama, it is that the blackout lasts only 10 days, and recovery is achieved so quickly.
In real life, terrorists or rogue states would probably not limit their attack on the nation's electric grid to computer viruses or hacking, as implied in the docudrama.  They would also use other more destructive means--that could cause a protracted national blackout lasting months or years.          
Moreover, Mother Nature can inflict a potentially protracted national blackout.  Regardless of how one weighs the threat to the national grid from terrorists or rogue states--the Sun will, sooner or later, hurl toward the Earth a geomagnetic super-storm, with catastrophic consequences for the national electric grid.  

The only one of its kind out there,

John Galt

May 22, 2013
As a collector of survivalist books and scenarios throughout the years, I know that the outlines of most run along the same lines of thinking: prepping, how-tos, and related topics. This book has a different theme: it is a Noah's Ark type of book, a manual on reconstructing society afterwards. If you had to rebuild society from scratch, like the hero in H.G. Wells's 'The Time Machine', who had only two books to choose to take with him, which ones would you choose? That question remained open, but now here is one of the books.

I would also recommend getting the CD, and I have subscribed to the free online newsletter, which I find very informative for related news that goes unreported elsewhere. The dedication this author provides in detailed analysis, and the time involved, is evident. Not motivated by reward but by selflessly promoting his idea of survival for humanity as a whole, at cost and hassle to himself, he has produced an unbiased and serious exercise in discussion of this area that is a first of its kind.

 Elizabeth Joyce "Writer - Spiritual teacher"  This book is tragically necessary as a guide for what to do after a major catastrophe, including World War III. Society seems to be in a great age of denial, and not many want to discuss, much less think about, this subject. Unfortunately we must face the possibility of a situation such as this and be prepared. If not, we'll be dead, period! How many of us have suffered minor losses, such as electricity, or damages after Katrina and Sandy, hurricanes, flooding, tidal waves, or tornadoes? What and who do we rely on in those times? "The guys," right? You know - the ones who fix the lines, rebuild our homes, or repair our damages. Them - out there - the ones we pay for. Well, in a true Doomsday there may be no one; it will be up to you and the other survivors.

The dedication this author, and the other researchers involved, provide in a detailed analysis of preparation for what may come is invaluable. Not motivated by rewards but by promoting his idea of survival for humanity as a whole, the author has produced an unbiased and serious exercise in discussion of this area of restarting the world from scratch. That is a first of its kind, for who would truly know how? Your politicians? This book should be a requirement for everyone's bookshelf!

Juan Felipe The author tackles a subject that many people refuse to consider or think about -- the destruction of modern society, most likely precipitated by nuclear war, EMP (electromagnetic pulse), pandemic, or widespread environmental catastrophe.

Unlike many other "gloom and doom" prognosticators, however, he believes not only that it will be possible to reconstruct society, but that eventually the result may actually be preferable to the one we live in today. He emphasizes that the focus is on "reconstruction" and not "restoration", since "restoration" would eventually result in the same sort of problems that society faces today.

From Chapter 10: "It is my fervent hope that humankind may learn from its mistakes and develop an attitude of universal charity and concern towards all of humanity, regardless of race, religion, culture or other coincidental distinguishing feature."

And a little later: "The earth is but one country, and mankind its citizens."

"The purpose for this book is to be a _How To_ manual for use in a post-cataclysmic recovery period by those who wish to reconstruct society in a just manner to _Build a Better World_ (the slogan of Ark Two)."

The heart of the book is the detailed explanation of LERNs (Local Economy Recovery Networks); how they are formed, what are their functions, and how they interact with other LERNs.

A couple of minor quibbles -- the e-book would be more readable with a consistent font size. Most of the boxes with definitions and supplemental information have a minuscule, almost unreadable font compared to the main body text. That's probably because they are graphics, which do not scale up when the main text size is set to a more readable size. And even within the main text, the font size often changes mid-sentence, although only by a point or two.

Gary E. Weller
Imagine if, one minute from now, most of the people on Earth disappeared. Maybe all 6 billion of us. Doomsday. What would happen to the world without most of its humans? How long would it be before our nuclear power plants erupted, skyscrapers crumbled, and satellites dropped from the sky? What would become of the household pets and farm animals? And could an ecosystem plagued with years of pollution ever recover?

In this book, Bruce explores a world you would never want to see: a world without most of its people. You are in control. Experience the aftermath. Rebuild, start over, and reboot society, farming, food, and life with Bruce Beach's guidance on survival after disaster scenarios.

Some people are gonna die and there isn't a whole lot we can do about it.

The Pros and Cons of Underground Data Centers: Tips from Data Bunker Veterans

One of the data halls at Cavern Technologies in Kansas, which offers few clues that the facility is 75 feet below ground. (Photo: Cavern Technologies)


October 9th, 2013 By: Rich Miller
The entrance to the Cavern Technologies data center in Lenexa, Kansas, which houses its customer servers more than 75 feet underground. The company is among a growing number of data bunkers storing data for high-security clients. (Photo: Cavern Technologies)

ORLANDO, Fla. - The data bunker industry is growing, as more customers seek out ultra-secure underground hosting for their IT operations. Operators of subterranean server farms say these environments are similar to above-ground facilities, but they often must address misperceptions about underground sites, many of which are housed in former limestone mines.
The emergence of underground data centers was the focus of a session at last week’s Data Center World Fall conference, in which several experts discussed the advantages and challenges of underground data centers, and offered tips to consider when evaluating a data bunker.
“The underground data center space is experiencing rapid growth due to the efficiency and speed to market it offers,” said John Clune, the president of Cavern Technologies, which operates a data center in a limestone mine in Lenexa, Kansas. “One of the bigger challenges has been the perception of underground data centers. People are imagining a tight cubbyhole with a guy with a light on his helmet. The reality is that we’ve got 18-foot ceilings.”
Cavern Technologies is among a cluster of underground facilities in the Midwest, which also includes SubTropolis, The Mountain Complex and SpringNet Underground in Missouri; and the InfoBunker and U.S. Secure Hosting in Iowa.

Tips from Data Bunker Veterans

Not all underground data centers are created equal, and potential customers need to shop carefully and be mindful of the differences between traditional and underground facilities, according to architect Kerry Knott of Bell/Knott & Associates. Knott has worked on a number of underground business parks and data centers in Kansas and Missouri, and offers some insights into evaluating a data bunker.
“Data center buildouts are a good use for these kind of facilities,” said Knott. “Once the data center is built, if you take someone in there blindfolded, they’d never know they were underground. You’ve got the same equipment; it’s just in an underground facility.”
But there are some differences. Here are some pros and cons to consider with facilities built in limestone mines:

Speed to Market: Clune says Cavern was recently able to deploy 5,000 square feet of data center space for a client in just 60 days. “The speed to market is impressive in the underground,” said Clune. One factor is that there’s no need to build or adapt a shell, as the underground space has already been created and all that is needed is the framing and buildout of the data halls. Another benefit is permitting from local officials. “In every underground I’ve worked with, we have had a blanket permit” once the initial underground space is created, said Knott. “It’s one of the advantages of underground structures. That could be an 8 to 10 week savings.” Another benefit is that construction can continue year-round, with no weather delays.

Construction Costs: Underground data centers can also be cheaper, Knott said, since there’s no expense to construct a concrete shell. Subterranean structures also offer potential savings on disaster-proofing, especially in the Midwest. “To build a tornado-proof building above ground can cost an extra $100 a square foot,” said Knott, who added that customers often inquire about other types of disasters. “People are concerned about collapse, and they’re worried about earthquakes,” he said. “An underground space, unlike the building above ground, doesn’t move and doesn’t need to be reinforced. An earthquake doesn’t affect the enclosure at all, but you do have to brace the improvements.”
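The $100-per-square-foot hardening premium Knott cites adds up quickly at data center scale. A quick sketch (facility sizes here are assumptions chosen for illustration, not figures from the article beyond the per-square-foot rate):

```python
# Illustrative cost-avoidance math: the extra expense of tornado-proofing an
# equivalent above-ground shell, per the $100/sq ft figure quoted by Knott.
HARDENING_PREMIUM_PER_SQFT = 100

for sqft in (5_000, 20_000, 60_000):
    extra = sqft * HARDENING_PREMIUM_PER_SQFT
    print(f"{sqft:>6,} sq ft: ${extra:,} hardening cost avoided underground")
```

Even a modest 5,000-square-foot buildout avoids roughly half a million dollars in hardening costs, before counting the savings on the shell itself.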

Facility History and Origin: Recently built underground facilities are usually appropriate, but those mined in the 1960s and earlier may not be. “To be an acceptable space for a data center, it has to have been mined for commercial development,” says Knott. “The limestone has to be preserved in the proper thickness and have structural integrity. The room size is also important, because the columnar support will be rock columns that may be 25 to 30 feet in diameter.”
The size and placement of these columns impacts the technical space. “Optimizing the layout within the property is essential,” said Knott. “It’s tough to get 90-degree corners with underground columns, so you have to be creative, since almost all your equipment is square. With the restrictions of the columns and placement of the corridor, you have to work with what you have. It can be awkward if these are haphazardly shaped.”

Cooling and Ventilation: Underground spaces are naturally cool, but that doesn’t mean they’ll stay that way once you fill them with servers. “Heat rejection is the biggest concern and the biggest challenge,” said Knott. “Most underground spaces have their own fresh air and ventilation system, but that’s generally for comfort rather than the kind of heat we’re putting into the space with the data center. Your options are to drill (ventilation) holes up through the top or horizontally to the exterior.”
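A rough heat-load calculation shows why comfort ventilation falls short. The numbers below are hypothetical (a power density of 100 W per square foot is assumed for illustration; the 5,000-square-foot hall size echoes the deployment Clune mentions above):

```python
# Why heat rejection dominates: IT load vs. the cooling it requires.
FLOOR_SQFT = 5_000        # assumed data hall size
IT_WATTS_PER_SQFT = 100   # assumed IT power density
BTU_PER_WATT_HR = 3.412   # conversion: 1 W = 3.412 BTU/hr
BTU_PER_TON = 12_000      # 1 ton of cooling = 12,000 BTU/hr

it_load_kw = FLOOR_SQFT * IT_WATTS_PER_SQFT / 1000
cooling_tons = it_load_kw * 1000 * BTU_PER_WATT_HR / BTU_PER_TON
print(f"{it_load_kw:.0f} kW of IT load needs ~{cooling_tons:.0f} tons of cooling")
```

Half a megawatt of servers needs on the order of 140 tons of dedicated cooling, which is far beyond what a mine's comfort ventilation system delivers, hence the drilled ventilation shafts Knott describes.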

Placement of Mechanical Equipment: Some mechanical and electrical equipment requires ventilation and must be housed in an exterior yard. There are several options to address this, which customers must consider if their goal is disaster avoidance, as this equipment will be more exposed. “Generators and air-cooled chillers can be placed against an exterior wall or protected with an outside wall,” said Knott. “You can also build another underground chamber to house them.” Another issue to consider is fire suppression systems, and what happens with water in the event the system is ever discharged in part of the facility.

Staff Considerations: There won’t be any daylight in an underground data center, but that’s not different from many above-ground data centers, Knott says. A bigger concern for staff might be parking, as underground facilities can be large, and that sometimes means that parking areas are a significant distance from the data center.

John Clune, President of Cavern Technologies, a Midwestern underground data center, talks about the pros and cons of underground data centers. While the underground temperature is a consistent 68 degrees, the data center engineers do have to accommodate for waste heat from servers and other gear. (Photo by Colleen Miller.)


Rich Miller is the founder and editor-in-chief of Data Center Knowledge, and has been reporting on the data center sector since 2000. He has tracked the growing impact of high-density computing on the power and cooling of data centers, and the resulting push for improved energy efficiency in these facilities.

Thursday, September 26, 2013
LightEdge Solutions plans underground data center in KC
The Kansas City Star
Sept. 19--A Des Moines-based cloud computing company owned by the Anschutz Corp. will be the premier tenant in a new SubTropolis Data Center in the Kansas City underground.
The company, LightEdge Solutions, and Hunt Midwest Real Estate Development Inc. have announced a partnership to begin work on a 60,000-square-foot "mission critical" data center.
LightEdge will invest nearly $60 million and add 21 jobs at the center, which the company said will be its fourth data center in the Midwest. An opening is planned for the first quarter of 2014.
The new operation will be in Hunt's SubTropolis, which bills itself as the world's largest underground business complex. LightEdge also will use part of a six-acre exterior equipment yard.
LightEdge said it expects to provide carrier-neutral service for AT&T, Surewest, TW Telecom, Time Warner, Unite and Windstream customers and up to 10 gigabits per second network connectivity. The center is served by KCP&L.
The LightEdge development represents the first phase of what could become millions of square feet for the technology center, Hunt Midwest officials said. They cited the underground's high level of security and climate control as pluses for the operation.

Premier colocation and IT cloud service provider expands in Kansas City market in expansive underground data center campus

KANSAS CITY, Mo. – September 18, 2013 – LightEdge Solutions, a premier cloud computing, colocation, and consulting company, will add a regional data center at the underground SubTropolis Technology Center (STC) in Kansas City. LightEdge will be the anchor tenant in the first phase of STC, a mission-critical data center campus owned and operated by Hunt Midwest Real Estate Development.

The new data center, which will be the fourth data center operated by LightEdge, is the result of a unique partnership between LightEdge Solutions and Hunt Midwest, a full-service real estate development company. LightEdge is owned by Qwest founder Phil Anschutz and Hunt Midwest is one of multiple entities owned by the Lamar Hunt family.

“The relationship between LightEdge and Hunt Midwest translates into the convergence of financial strength, best-in-class hosted IT services, experience with hybrid cloud environments, a highly secure and protected location, and low kW power costs compared to other areas of the U.S.,” said Jim Masterson, chief executive officer of LightEdge Solutions.

LightEdge will open the first phase of its 60,000-square-foot underground operation, built to Tier III standards, within STC during the first quarter of 2014. In addition, LightEdge will use a portion of STC’s six-acre equipment yard located on the exterior surface of the property. The highly secure data center will have 24x7x365 monitoring, key card and biometric access.
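For context on what “built to Tier III standards” implies in practice, here is a minimal sketch converting the Uptime Institute’s Tier III expected availability figure of 99.982% into allowable downtime per year. The arithmetic is illustrative of the tier system generally, not a LightEdge specification.

```python
HOURS_PER_YEAR = 8760

def annual_downtime_minutes(availability: float) -> float:
    """Minutes of downtime per year implied by a given availability."""
    return (1 - availability) * HOURS_PER_YEAR * 60

# Tier III expected availability per the Uptime Institute: 99.982%
print(f"{annual_downtime_minutes(0.99982):.1f} minutes/year")
```

The Tier III figure works out to roughly an hour and a half of allowable downtime per year, versus more than a full day at the 99.7% availability typical of basic facilities.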

“In the more than 20 years I’ve been involved in data center development and operations, I’ve never seen a property more appropriate for a data center,” said James deVenny, an independent consultant who was co-founder, president and chief executive officer of Dataside LLC, a provider of enterprise data center space, colocation and managed network services.

“This is an ideal partnership,” said Ora Reynolds, president of Hunt Midwest. “Together, Hunt Midwest and LightEdge have all of the pieces needed to create a world-class data center.”
SubTropolis Technology Center has a naturally hardened limestone infrastructure that provides a fortress six times stronger than concrete and features evenly spaced pillars with 40-foot clearance and 16-foot-high ceilings. Millions of square feet are available for contiguous expansion.

Power and HVAC
STC is served by KCP&L, an electric utility known for its award-winning reliability. There are two diverse substations and 161 kV transmission lines on the property. The raised-floor facility utilizes Liebert CRAH and Liebert UPS systems with system-level redundancy and N+1 components to safeguard power. In addition, parallel diesel-fired generators can deliver 1.5 megawatts of critical load power to the floor. High-efficiency chilled water systems cool the data center environment.
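As a hedged illustration of why N+1 component redundancy matters, the sketch below computes the availability of a redundant pair of units where either one can carry the full load: the system is down only if both fail at once. The 99% per-unit availability is an assumed round number for illustration, not a Liebert or KCP&L specification.

```python
def parallel_availability(unit_availability: float, units: int) -> float:
    """Availability of a redundant set of independent units where any
    one surviving unit is sufficient to carry the load."""
    return 1 - (1 - unit_availability) ** units

single = 0.99  # assumed availability of one UPS or generator unit
pair = parallel_availability(single, 2)  # N+1 with N = 1
print(f"single unit: {single:.2%}, redundant pair: {pair:.4%}")
```

Under the independence assumption, adding one redundant unit turns a 1-in-100 outage probability into 1-in-10,000, which is the basic economics behind duplicating every main component.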

“Hunt Midwest and LightEdge are creating a best-in-class data center infrastructure that will offer a great combination of redundancy and capacity,” said Chuck Caisley, vice president of marketing and public affairs for KCP&L. “The robust electric transmission and distribution network – with multiple substations in the area and proximity to our generating station – directly supports the needs of the technology industry.  In other words, LightEdge’s innovative data center facility at SubTropolis Technology Center makes perfect sense.”

Network Services
LightEdge’s carrier-neutral operation provides the ability to deliver high bandwidth, high reliability and low latency service to customers. LightEdge expects to have the following carriers available when operations begin: AT&T, Surewest, TW Telecom, Time Warner, Unite, and Windstream. Connection with multiple providers, and the nearby 1102 Grand carrier hotel, will provide local private network connectivity up to 10 gigabits per second (Gbps) plus connectivity to the LightEdge cloud for access to other LightEdge data centers.

LightEdge Solutions offers dynamic end-to-end network, colocation and IT cloud solutions, providing businesses with reliable access to Fortune 100-level infrastructure that quickly adapts and scales to meet changing business requirements. LightEdge, which is owned by The Anschutz Corporation, was founded in 1996 and is headquartered in Des Moines, Iowa. LightEdge serves both Midwestern and national companies from its five regional office locations with scalable and customized IT solutions that leverage a fully redundant network backbone and a team of experienced engineers.

SubTropolis Technology Center is a mission critical data center campus that provides a highly scalable infrastructure for millions of square feet of contiguous expansion; redundant power with low kW costs; and fiber networking to meet the ever-growing bandwidth requirements of businesses. STC is served by KCP&L, with two diverse substations and 161 kV transmission lines on the property. The eco-friendly underground STC campus is protected by a 150-foot layer of limestone, making it a Tier IV compliant structure. STC provides a Meet-Me-Room for interconnection with major carriers and service providers, and a six-acre equipment yard on the exterior surface. In addition to LightEdge Solutions, future phases will include government agencies and enterprise users.

Kansas City-based Hunt Midwest has developed over 6,200 acres of commercial, retail, industrial and residential property, and is owner/developer of SubTropolis, the world’s largest underground business complex. Hunt Midwest is a privately held company owned by the Lamar Hunt family. The Hunt family business is a diverse portfolio of entities involved in real estate, sports/media, energy/resources, private equity, and investments. Marquee entities include the Kansas City Chiefs, Hunt Midwest, Chicago Bulls, United Center, Toyota Stadium and FC Dallas Soccer Club.
Underground Secure Data Center Operations

Technology-based companies are building new data centers in old mines, caves, and bunkers to host computer equipment below the Earth's surface.

Underground secure data center operations are on an upward trend.

Operations have launched in inactive gypsum mines, caves, abandoned coal mines, abandoned solid limestone mines, and decommissioned nuclear bunkers: facilities positioned deep below the bedrock and secure from disasters, both natural and man-made.

These facilities have advantages over traditional data centers, such as increased security, lower cost, scalability, and ideal environmental conditions. Their economic model works, despite the proliferation of data center providers, thanks largely to the natural qualities inherent in underground spaces.

With anywhere from 10,000 to over 1,000,000 square feet available, there is ample space to be subdivided to accommodate the growth needs of clients. In addition, underground data centers have an unlimited supply of naturally cool, 50-degree air, providing ideal temperature and humidity for computer equipment with minimal HVAC cost.
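To make the “minimal HVAC cost” claim concrete, the sketch below compares annual cooling energy for a conventional chiller plant against fan-only free cooling with a cave’s naturally cool air. All the figures (IT load, chiller efficiency, fan power fraction) are assumed illustrative values, not data from any of the facilities named here.

```python
IT_LOAD_KW = 500       # assumed IT load
HOURS_PER_YEAR = 8760

chiller_cop = 4.0      # assumed chiller coefficient of performance
fan_fraction = 0.05    # assumed fan power as a fraction of IT load

# Chiller plant: electrical input = heat load / COP, running year-round
chiller_kwh = IT_LOAD_KW / chiller_cop * HOURS_PER_YEAR

# Free cooling: only fan power needed to move naturally cool air
free_cooling_kwh = IT_LOAD_KW * fan_fraction * HOURS_PER_YEAR

print(f"chiller plant: {chiller_kwh:,.0f} kWh/yr")
print(f"free cooling:  {free_cooling_kwh:,.0f} kWh/yr")
```

Under these assumptions, free cooling uses roughly a fifth of the chiller plant’s energy, which is the kind of margin underground operators point to when quoting low operating costs.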

Operators bill these as the most secure data centers in the world, unparalleled in terms of square footage, scalability, and environmental control.

Yet, while the physical and cost benefits of being underground make the facilities attractive, operators have also had to invest heavily in high-speed connectivity and redundant power and fiber systems to ensure their operations are not just secure, but also state-of-the-art.

Operators initially focused on providing disaster recovery solutions and backup colocation services.

Clients lease space for their own servers while the operator provides secure facilities, power, and bandwidth. Operators offer redundant power sources and multiple high-speed Internet connections, typically OC-level circuits on a SONET ring linked to outside connectivity providers through redundant fiber cables.

Underground data center companies augment their core services to include disaster recovery solutions, call centers, network operations centers (NOCs), wireless connectivity, and more.

Strategic partnerships with international and national information technology companies enable them to offer technology solutions ranging from system design and implementation to the sale of software and equipment.

The natural qualities of underground data centers allow them to offer the best of both worlds: premier services and security at highly competitive rates.

Underground data centers began appearing in the 1990s but really came into their own after the September 11 attacks in 2001, when their founders realized that former mines and bunkers offered optimal conditions for a data center: superior environmental conditions for electronic equipment, nearly invulnerable security, and locations near power grids.

Adam Couture, a Massachusetts-based analyst for Gartner Inc., said underground data centers could find a niche serving businesses that want to reduce their vulnerability to any future attacks. Fact sheets from some underground data centers claim the facilities would protect equipment from a cruise missile explosion or plane crash.

After the September 11 attacks, companies went back and re-evaluated their business-continuity plans. That doesn't mean everybody changed them, but everybody revisited them in the wake of what happened, and underground data centers may be well positioned to benefit.

Comparison chart: Underground data centers

Five facilities compared
InfoBunker, LLC
  Location: Des Moines, Iowa*
  In business since: 2006
  Security/access control: Biometric; keypad; pan, tilt and zoom cameras; door event and camera logging
  Distance underground: 50 feet
  Ceiling height in data center space: 16 feet
  Original use: Military communications bunker
  Total data center space: 34,000 sq. ft.
  Total space in facility: 65,000 sq. ft.
  Data center clients include: Insurance company, telephone company, teaching hospital, financial services, e-commerce, security monitoring/surveillance, veterinary, county government
  Number of hosted primary or backup data centers: 2
  Services offered: Leased data center space, disaster recovery space, wholesale bandwidth
  Distance from nearest large city: Des Moines, about 45 miles*
  Location of cooling system, including cooling towers: Underground
  Location of generators and fuel tanks: Underground

The Bunker
  Location: Dover, UK
  In business since: 1999
  Security/access control: CCTV, dogs, guards, fence
  Distance underground: 100 feet
  Ceiling height in data center space: 12 to 50 feet
  Original use: Royal Air Force military bunker
  Total data center space: 50,000 sq. ft.
  Total space in facility: 60,000 sq. ft.
  Data center clients include: Banking, mission critical Web applications, online trading
  Number of hosted primary or backup data centers: 50+
  Services offered: Fully managed platforms, partly managed platforms, co-location
  Distance from nearest large city: Canterbury, 10 miles; London, 60 miles
  Location of cooling system, including cooling towers: Underground
  Location of generators and fuel tanks: Above ground and below ground

Montgomery Westland
  Location: Montgomery, Tex.
  In business since: 2007
  Security/access control: Gated, with access control card, biometrics and a 24x7 security guard
  Distance underground: 60 feet
  Ceiling height in data center space: 10 feet
  Original use: Private bunker designed to survive a nuclear attack. Complex built in 1982 by Louis Kung (nephew of Madame Chiang Kai-shek) as a residence and headquarters for his oil company, including a secret, 40,000-square-foot nuclear fallout shelter. The office building uses bulletproof glass on the first floor and reception area and 3-inch concrete walls with fold-down steel gun ports to protect the bunker 60 feet below.
  Total data center space: 28,000 sq. ft., plus 90,000 sq. ft. of office space in a hardened, above-ground building
  Total space in facility: 28,000 sq. ft.
  Data center clients include: NASA/T-Systems, Aker Solutions, Continental Airlines, Houston Chronicle, Express Jet
  Number of hosted primary or backup data centers: 13
  Services offered: Disaster recovery/business continuity, co-location and managed services
  Distance from nearest large city: Houston, 40 miles
  Location of cooling system, including cooling towers: Above and below ground; all cooling towers above ground in secure facility
  Location of generators and fuel tanks: Two below ground, four above ground; all fuel tanks buried topside

Cavern Technologies
  Location: Lenexa, Kan.
  In business since: 2007
  Security/access control: Security guard, biometric scan, smart card access and motion detection alarms
  Distance underground: 125 feet
  Ceiling height in data center space: 16 to 18 feet
  Original use: Limestone mine originally developed by an asphalt company that used the materials in road pavement
  Total data center space: 40,000 sq. ft.
  Total space in facility: 3 million sq. ft.
  Data center clients include: Healthcare, insurance, universities, technology, manufacturing, professional services
  Number of hosted primary or backup data centers: 26
  Services offered: Data center space leasing, design, construction and management
  Distance from nearest large city: Kansas City, 15 miles
  Location of cooling system, including cooling towers: Air-cooled systems located underground; cooling towers located outside
  Location of generators and fuel tanks: Underground

Iron Mountain The Underground
  Location: Butler County, Penn.*
  In business since: Opened by National Storage in 1954; acquired by Iron Mountain in 1998
  Security/access control: 24-hour armed guards, visitor escorts, magnetometer, x-ray scanner, closed-circuit television, badge access and other physical and electronic measures for securing the mine's perimeter and vaults
  Distance underground: 220 feet
  Ceiling height in data center space: 15 feet (10 feet from raised floor to dropped ceiling)
  Original use: Limestone mine
  Total data center space: 60,000 sq. ft.
  Total space in facility: 145 acres developed; 1,000 acres total
  Data center clients include: Marriott International Inc., Iron Mountain, three U.S. government agencies
  Number of hosted primary or backup data centers: 5
  Services offered: Data center leasing, design, construction and maintenance services
  Distance from nearest large city: Pittsburgh, 55 miles
  Location of cooling system, including cooling towers: Chillers located above ground to take advantage of "free cooling"; pumps located underground
  Location of generators and fuel tanks: Underground

*Declined to cite exact location/distance for security reasons.