Top Data Center Trends for 2017
The data center is under enormous pressure to become more scalable, more flexible and less expensive to build and operate, so that it can offer business users the experience and advantages they have come to expect from public cloud offerings like AWS. The push toward these goals is unfolding along multiple technology fronts at once, making it difficult for IT executives to align on-prem and cloud strategies and chart the best way forward to the next-generation data environment.
One thing is certain, however: within a relatively short time, the data center will emerge as a radically new entity in both form and function and will incorporate a wealth of new technologies that were considered by many to be the stuff of science fiction only a few years ago.
For the coming year, here are the top trends affecting data infrastructure:
Hyperconverged Infrastructure and the Software Defined Data Center
Hardware will continue to form the backbone of data infrastructure, but it will no longer demand the time and attention it commands in today’s sprawling facilities. Top IT vendors are rapidly transforming their platforms to hyperconverged infrastructure (HCI), in which modular compute-storage-networking components can be deployed, configured and managed with only a modicum of technical expertise.
This moves all the action surrounding resource provisioning, integration and management to the software layer, which provides far more flexibility to suit increasingly divergent and specialized application needs. What’s more, it introduces a level of self-service that allows knowledge workers to define and implement their own operating environments while fostering the creation of end-to-end software defined data center environments that can extend to wide-area cloud footprints at minimal cost.
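To make the idea concrete, here is a minimal Python sketch of the kind of decision a software-defined control plane makes: a declarative workload spec goes in, and the software, not an administrator, chooses which hyperconverged module hosts it. All class names, fields and capacity figures are hypothetical illustrations, not any vendor’s API.

```python
from dataclasses import dataclass

@dataclass
class WorkloadSpec:
    """Declarative request: what the user wants, not how to build it."""
    name: str
    vcpus: int
    memory_gb: int
    storage_gb: int

@dataclass
class HCINode:
    """One hyperconverged module: compute, storage and networking in a single unit."""
    node_id: str
    free_vcpus: int
    free_memory_gb: int
    free_storage_gb: int

    def can_host(self, spec: WorkloadSpec) -> bool:
        return (self.free_vcpus >= spec.vcpus
                and self.free_memory_gb >= spec.memory_gb
                and self.free_storage_gb >= spec.storage_gb)

def place(spec: WorkloadSpec, pool: list[HCINode]) -> HCINode:
    """The software layer, not an operator, decides where the workload lands."""
    for node in pool:
        if node.can_host(spec):
            node.free_vcpus -= spec.vcpus
            node.free_memory_gb -= spec.memory_gb
            node.free_storage_gb -= spec.storage_gb
            return node
    raise RuntimeError(f"No capacity for {spec.name}; add another HCI module")

pool = [HCINode("hci-01", 32, 256, 4096), HCINode("hci-02", 32, 256, 4096)]
print(place(WorkloadSpec("analytics-db", 8, 64, 500), pool).node_id)  # -> hci-01
```

Scaling out in this model means adding another identical module to the pool; the placement logic, not a human, absorbs the change.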
Cloud Federation
Already, most enterprises are juggling multiple workloads on multiple clouds, which runs the risk of recreating the same silo-laden infrastructure that hampers productivity in the local data center. To counter this, organizations are implementing broad interoperability between public, private and hybrid clouds to effectively create a geo-distributed data ecosystem.
This is more challenging than it sounds, however, because it requires a high degree of orchestration up and down the data stack and across end-to-end infrastructure that spans multiple third-party constructs. At the same time, says Column Technologies’ Brian Thoman, IT will have to recognize that decisions about this kind of management will be driven more by the needs of users than by technology, so there will be far more coordination between line-of-business managers and technicians than in today’s data center.
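A toy example of what user-driven federation decisions might look like in code: a business-policy check (compliance) runs before any cost optimization, reflecting Thoman’s point that user needs lead and technology follows. The provider names, attributes and policy below are invented purely for illustration.

```python
# Illustrative only: a tiny policy engine that picks a cloud for a workload.
CLOUDS = {
    "public-a": {"cost_per_hour": 0.09, "region": "us-east", "certified": True},
    "public-b": {"cost_per_hour": 0.07, "region": "eu-west", "certified": False},
    "private":  {"cost_per_hour": 0.15, "region": "on-prem", "certified": True},
}

def pick_cloud(needs_certification: bool, preferred_region: str) -> str:
    """Apply business policy first (compliance), then optimize on cost."""
    candidates = {
        name: attrs for name, attrs in CLOUDS.items()
        if attrs["certified"] or not needs_certification
    }
    # Prefer the requested region; fall back to the cheapest compliant option.
    in_region = {n: a for n, a in candidates.items() if a["region"] == preferred_region}
    pool = in_region or candidates
    return min(pool, key=lambda n: pool[n]["cost_per_hour"])

print(pick_cloud(needs_certification=True, preferred_region="us-east"))  # -> public-a
```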
The New Data Edge
Adding to both of these challenges is the advent of the Internet of Things, which will push the data center edge outward to a multitude of connected user devices that must be folded into highly dynamic data flows and geared for continuous monitoring and security. For an industry that already struggles to maintain security, governance, compliance and a host of other frameworks across its own enterprise and cloud infrastructure, imagine what it will be like when the data environment extends all the way to consumer products like washing machines and light bulbs.
To meet this challenge, the enterprise will deploy container-based microservices to address the myriad minute functions that are crucial to the high-speed, service-based economy, as well as micro data centers that push processing as close to users as possible to provide timely and accurate results.
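As a rough sketch of the micro data center idea, the snippet below routes a device’s request to the closest edge site. It uses straight-line distance purely for brevity; a production system would measure actual network latency. Site names and coordinates are hypothetical.

```python
# Illustrative sketch: send an IoT reading to the nearest micro data center
# so processing happens close to the device instead of in a distant core facility.
MICRO_DCS = {
    "edge-nyc": (40.71, -74.01),
    "edge-chi": (41.88, -87.63),
    "edge-sfo": (37.77, -122.42),
}

def nearest_edge(device_lat: float, device_lon: float) -> str:
    """Pick the micro data center with the smallest straight-line distance."""
    def dist2(site: str) -> float:
        lat, lon = MICRO_DCS[site]
        return (lat - device_lat) ** 2 + (lon - device_lon) ** 2
    return min(MICRO_DCS, key=dist2)

# A thermostat in Boston gets handled by the New York micro data center.
print(nearest_edge(42.36, -71.06))  # -> edge-nyc
```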
System Autonomy
Automation is not new to the enterprise, but system autonomy is a potentially game-changing development. Driven by artificial intelligence, machine learning and other advanced technologies, autonomy takes automation to the next level by enabling systems to learn from and adapt to their changing surroundings with little or no human intervention. This phenomenon is poised to not just remake IT processes but redefine functions across the entire enterprise landscape, from product development to customer fulfillment.
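The gap between automation and autonomy can be shown in a few lines: an automated system applies a fixed rule, while an autonomous one updates its own model of the world and acts on it. The sketch below folds each demand observation into a running forecast and resizes capacity accordingly; the smoothing factor, headroom and capacity figures are arbitrary assumptions, and real platforms would use far richer machine-learning models.

```python
# A deliberately tiny illustration of closed-loop autonomy: the system observes
# demand, updates its own forecast, and re-provisions capacity with no operator
# in the loop. All constants here are arbitrary assumptions.
class AutonomousScaler:
    def __init__(self, capacity_per_node: float = 100.0, alpha: float = 0.3):
        self.forecast = 0.0                      # exponentially weighted demand estimate
        self.alpha = alpha                       # how quickly the system adapts to change
        self.capacity_per_node = capacity_per_node
        self.nodes = 1

    def observe(self, demand: float) -> None:
        """Learn: fold the latest measurement into the running forecast."""
        self.forecast = self.alpha * demand + (1 - self.alpha) * self.forecast

    def act(self) -> int:
        """Adapt: size the cluster to the forecast plus 20% headroom."""
        self.nodes = max(1, round(self.forecast * 1.2 / self.capacity_per_node))
        return self.nodes

scaler = AutonomousScaler()
for demand in [80, 150, 400, 390, 120]:          # requests/sec, synthetic trace
    scaler.observe(demand)
    print(f"demand={demand:>3} -> nodes={scaler.act()}")
```

The point of the loop is that no threshold was hand-tuned for the spike: the system’s own forecast, not a human rule, drives the scale-out and the scale-back.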
This will have profound implications for the data workforce, of course, but it will also push data infrastructure, and the people who know how to use it properly, to new heights of productivity. If implemented correctly, and for the right reasons, autonomy will lower costs, improve profitability and foster new markets, new business models and perhaps entirely new industries.
The main theme running through all of these trends is that they promise what the front office has been demanding of IT for years: do more with less. With infrastructure that is smarter, more agile and distributed across great distances, the enterprise will not only have the means to thrive in the emerging digital economy, but to transcend the limitations that physical reality has imposed on data operations since the first mainframe was powered up nearly half a century ago.
It’s going to be an interesting journey, to say the least.
Ariel Maislos brings more than twenty years of technology innovation and entrepreneurship to Stratoscale. After a ten-year career with the IDF, where he was responsible for managing a section of the Technology R&D Department, Ariel founded Passave, now the world leader in FTTH technology, in 2001; it was acquired by PMC-Sierra (PMCS) in 2006. In 2006 Ariel founded Pudding Media, an early pioneer in speech recognition technology, and Anobit, the leading provider of SSD technology, which was acquired by Apple (AAPL) in 2012. At Apple, he served as a Senior Director in charge of Flash Storage until he left the company to found Stratoscale. Ariel is a graduate of the prestigious IDF training program Talpiot, and holds a BSc in Physics, Mathematics and Computer Science (Cum Laude) from the Hebrew University of Jerusalem and an MBA from Tel Aviv University.