Storing for the Future: How Data Centers Will Advance - Success Knocks | The Business Magazine

Storing for the Future: How Data Centers Will Advance



The idea that data is an incredibly valuable resource in the modern business landscape isn’t new—but best practices for managing that data seem to change almost by the year. More than ever, enterprises leverage data centers to do their work, and savvy executives will be looking ahead in 2020 and beyond to learn how data can be managed more effectively.


Let’s consider three key questions here.


How will the advancement of AI improve the efficiency of data center technology?


Increasingly, artificial intelligence is being “baked in” to products from the get-go. A popular example of this concept would be IoT appliances—think a refrigerator that’s able to identify the items on its shelves, automatically facilitate restock orders and report on its own functioning and maintenance needs. Data center hardware can similarly benefit from AI:



  • Collecting Operational Data: IoT-empowered data centers keep track of their own systems on a more granular level, making it easy to compare actual performance with expected baselines. Data points might include temperature, battery functioning, data retrieval times and power usage.


  • Descriptive Analytics: Purpose-built analytics suites convert reams of data into useful insights—for customers and manufacturers alike.


  • Optimizing Efficiencies: AI can automatically regulate resource usage to save energy during low-usage periods, and take action when higher usage threatens to cause costly downtime.


  • Factoring in Context: There’s also the outside world to consider. By factoring in important contextual data, such as weather (which impacts cooling in each facility) and holidays (think Cyber Monday usage spikes), AI can tailor its functioning to adapt on the fly.


  • Detecting Malfunctions: Identify significant anomalies and take action to solve issues—before they become critical.


  • Anticipating Equipment Failure: AIs can project when components are likely to fail or fall below an acceptable level of efficiency. Having a clear understanding of a given piece of equipment’s natural lifecycle means having plans in place to keep things humming along.
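To make the malfunction-detection idea above concrete, here is a minimal sketch of one common approach: flagging telemetry readings that stray too far from a trailing baseline. The function name, window size, and sample temperatures are all illustrative assumptions, not a vendor's actual implementation.

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=10, threshold=3.0):
    """Flag readings more than `threshold` standard deviations
    away from the mean of the trailing `window` readings."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            anomalies.append((i, readings[i]))
    return anomalies

# Steady server-intake temperatures (in Celsius) with one sudden spike
temps = [22.0, 22.1, 21.9, 22.0, 22.2, 22.1, 22.0, 21.8,
         22.1, 22.0, 22.1, 29.5, 22.0]
print(flag_anomalies(temps))  # flags the 29.5-degree spike at index 11
```

Real data center monitoring stacks use far more sophisticated models, but the principle is the same: compare live measurements against an expected baseline and surface deviations before they become critical.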


In 2016, Google famously reduced the energy used to cool its data centers by a whopping 40% by letting its DeepMind AI optimize their operation. The company has since gone further, designing its own custom chipsets to squeeze out greater efficiencies and reduce the number of data centers it relies on for highly resource-intensive functions like speech recognition.



How will data centers be affected by 5G wireless becoming the eventual standard?


By the end of 2020, 5G will be well on its way to becoming the new standard. This has big implications for applications like driverless vehicles, which are too data-intensive to function properly with current-generation 4G connectivity. 5G wireless is expected to support speeds of up to 10 Gb/s, roughly 100 times faster than 4G—which is expected to cause a permanent spike in data usage.
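A quick back-of-envelope calculation shows what that speed gap means in practice. The link rates below are illustrative assumptions (roughly 100 Mb/s for a good 4G connection, 10 Gb/s for 5G's cited peak), and the calculation ignores protocol overhead:

```python
def transfer_seconds(size_gigabytes, rate_gbps):
    """Time to move a payload at a given link rate (overhead ignored)."""
    bits = size_gigabytes * 8e9   # decimal gigabytes -> bits
    return bits / (rate_gbps * 1e9)

payload_gb = 50  # e.g. a large batch of vehicle sensor data (hypothetical)
print(f"4G (~0.1 Gb/s): {transfer_seconds(payload_gb, 0.1):,.0f} s")
print(f"5G (~10 Gb/s):  {transfer_seconds(payload_gb, 10):,.0f} s")
```

Under these assumptions a transfer that takes over an hour on 4G completes in well under a minute on 5G—the kind of difference that makes latency-sensitive, data-heavy applications viable.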


You may recall talk in 2016 of the internet entering the so-called Zettabyte Era, when global IP traffic first exceeded one zettabyte. According to Cisco Systems, which coined the original term, 5G will usher in the Mobile Zettabyte Era. Given that, by some estimates, the internet already accounts for roughly 10% of global electricity consumption, this has massive implications for data centers.


On the one hand, the anticipated increased demand on data center hardware is already helping to spark a construction gold rush (see the next question for more on that)—a development that will benefit companies that can afford to build at hyper-scale for interconnectivity.


On the other hand, 5G offers so many potential benefits to enterprises—improved power efficiency, dynamic resource allocation, and massively improved support for IoT applications among them—that overall business for data centers should thrive.



Will the surge in cloud data center construction make the idea of an on-premise data center obsolete for enterprises?


Data center construction is big business, with cloud companies spending over US$150 billion on new construction in the first half of 2019 alone. Does this spell doom for the on-premise server farm?


Gartner Research VP David Cappuccio certainly thinks so. In a blog post called “The Data Center Is Dead,” the veteran infrastructure researcher predicts that by 2025 no fewer than 80% of enterprises will have shut down their on-premise data centers. The crux of his argument is that most of the advantages of traditional data centers have evaporated thanks to technological advancements—notably faster data transfer and the greater operational efficiencies that mammoth, hyper-scale server farms enable.


The real tipping point, though, is at the Edge.


Edge data centers are located close to customers’ physical locations, reducing latency. This improves service for more intensive needs like gaming, streaming and cloud computing. Having local nodes allows larger distributed cloud networks to also offer consistent enterprise-quality performance, even outside of high-tier regions like New York and San Francisco.
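The latency advantage of proximity has a hard physical floor: light in optical fiber travels at roughly two-thirds the speed of light in a vacuum, so distance alone sets a best-case round-trip time. This rough sketch (the distances are purely illustrative) shows why a nearby edge node beats a distant facility before any server processing is even counted:

```python
def round_trip_ms(distance_km):
    """Best-case round-trip propagation delay over fiber.
    Light in fiber travels at roughly 2/3 the vacuum speed of light."""
    speed_km_per_ms = 300_000 * (2 / 3) / 1000  # about 200 km per ms
    return 2 * distance_km / speed_km_per_ms

# A metro edge node vs. a data center across the country (illustrative)
print(f"  50 km edge node: {round_trip_ms(50):.1f} ms")   # ~0.5 ms
print(f"4000 km remote DC: {round_trip_ms(4000):.1f} ms") # ~40 ms
```

Real-world latency is higher once routing, queuing, and processing are added, but the propagation floor alone explains why latency-sensitive workloads gravitate toward edge facilities.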


Altogether, most of the key advantages of on-premise data centers have been obviated, and those that remain serve niche functions. Today’s IT decision-makers look for solutions based on their general business needs—such as the specific requirements for data centers in healthcare—rather than trying to force a solution to fit their existing data architecture.


This agility helps enterprises more easily hunt for efficiencies, which will remain the hallmark of a successful company in 2020.