Image by Microsoft via The New York Times

There’s no question: the holy grail of big data analytics is performance, getting actionable insights from your data in real time without time-consuming bottlenecks. Faster insight means smarter business decisions, and it also raises big data ROI, which is essential for continued executive buy-in. To push those returns even higher, organizations are looking beyond performance and going to great lengths, or depths, to lower costs, especially when it comes to data center cooling.

Going to New Depths to Find a Solution
Conventional data centers, with hundreds of commodity server clusters, consume an incredible amount of energy powering equipment and lighting and keeping temperatures low enough for servers and networking gear to function properly. As a result, data center cooling is a growing concern for organizations of almost every size; even smaller enterprises with just a floor or a large room dedicated to their data centers consume massive amounts of energy for cooling. It’s no surprise that companies are looking for ways to lower that cost; what sometimes is surprising is how they go about it.
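To see why cooling looms so large, it helps to look at power usage effectiveness (PUE), the standard ratio of total facility power to the power actually delivered to IT equipment. The Python sketch below runs the back-of-envelope numbers; the IT load, PUE and electricity price are illustrative assumptions, not figures from any particular facility.

# Back-of-envelope cooling overhead estimate using PUE.
# All inputs below are hypothetical, for illustration only.

it_load_kw = 500.0    # power drawn by servers and networking gear
pue = 1.8             # total facility power / IT power (typical legacy range)
usd_per_kwh = 0.10    # assumed electricity price

total_load_kw = it_load_kw * pue
overhead_kw = total_load_kw - it_load_kw  # cooling, lighting, power distribution

annual_overhead_usd = overhead_kw * 24 * 365 * usd_per_kwh
print(f"Overhead load: {overhead_kw:.0f} kW")
print(f"Annual overhead cost: ${annual_overhead_usd:,.0f}")

At a PUE of 1.8, a 500 kW IT load drags along 400 kW of overhead, roughly $350,000 a year at $0.10/kWh, and in most facilities cooling is the largest piece of that overhead.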

Recently, Microsoft announced a prototype of a self-contained data center designed to be installed beneath the ocean’s surface, cutting the air conditioning bill and harnessing the water’s current to help power other operations. Sinking conventional data centers underwater shows just how far companies will go to reduce energy use. But it also makes the technology inside harder to run, maintain and repair. Not to mention, no one wants a recurring nightmare about water leaks around IT equipment.

Another, more popular, option is building data centers in cooler climates that provide natural air conditioning. Facebook is just one of many companies that weigh climate when siting their data centers, but it is one of the few that has gone nearly as far as the Arctic Circle to take advantage of cold air. While Facebook is smartly using that cool air to its advantage, moving data over such distances adds significant delays and places extra burdens on the network, a whole new layer of cost and complexity for data center operations.

Don’t Work Around the Problem. Fix It.
Instead of spending resources on new ways to lower cooling costs for the same inefficient infrastructure they have used for years, companies should look to smaller, more efficient appliances that reduce or eliminate the cooling problem from the start. This is especially true for data analytics infrastructures, where reliance on x86-based servers forces companies of all sizes to build out sprawling clusters that occupy valuable real estate and drive up cooling bills.

Moving from commodity x86 servers to higher-performance hybrid FPGA/x86 computing allows companies to shrink the footprint of their data analytics infrastructures and of their data centers overall. That not only slashes power consumption but also reduces the need for top-of-rack switches, cabling, lighting and more. Instead of finding creative places, like the Arctic Circle, to naturally cool servers that were never efficient to begin with, organizations should focus on building smaller, far more efficient, and therefore cooler, data center infrastructures that do not require sending operations to the ocean floor.
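To make the trade-off concrete, here is a minimal sketch comparing the annual electricity bill of a sprawling commodity cluster against a compact hybrid appliance delivering equivalent throughput. The node counts, per-node wattages and PUE values are assumptions chosen for illustration, not benchmarked Ryft figures.

# Hypothetical energy comparison: x86 cluster vs. hybrid FPGA/x86 appliance.
# Node counts, wattages and PUE values are illustrative assumptions only.

def annual_energy_cost(nodes, watts_per_node, pue, usd_per_kwh=0.10):
    """Yearly electricity cost, with cooling overhead folded in via PUE."""
    it_kw = nodes * watts_per_node / 1000.0
    return it_kw * pue * 24 * 365 * usd_per_kwh

# Sprawling commodity cluster: many nodes, legacy cooling overhead.
cluster_cost = annual_energy_cost(nodes=100, watts_per_node=400, pue=1.8)

# Compact hybrid appliance: far fewer nodes; less heat, so a lower PUE.
appliance_cost = annual_energy_cost(nodes=2, watts_per_node=800, pue=1.3)

print(f"Cluster:   ${cluster_cost:,.0f}/year")
print(f"Appliance: ${appliance_cost:,.0f}/year")
print(f"Ratio:     {cluster_cost / appliance_cost:.0f}x")

Shrinking the node count does double duty in this model: it cuts the IT load directly, and the smaller heat output justifies the lower PUE, so the savings compound.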

Why Submerge Data or Ship It to the Arctic When You Can Streamline Infrastructures by 100X?
Contact us to hear how Ryft supercharges existing big data ecosystems while decreasing energy needs and cooling costs by using a fraction of the traditional infrastructure’s space and power!

[Graphic: total cost of ownership (TCO) comparison]
