Analysis: The Future of the Data Center

Published on: 4th September 2018 · 5.5 min read

The traditional data center as we know it is changing. For IT teams to continue to deliver the best solutions, they must understand business needs. However, businesses are becoming more agile and geographically far-reaching, and advances in technology are creating more pressure to deliver IT solutions that fit budgets and time-to-market constraints in order to stay competitive. But why is change necessary? What can we expect? And what could this mean for the future of storage?

Many people have had their say on the subject. There’s been an avalanche of articles published with various predictions for the datacenter, but how consistent are these visions? We’ve compiled a comprehensive analysis of the popular opinions on the future of the datacenter to provide some clarity to the debate.


Why is change predicted?

Gartner predicts that by 2025, 80% of enterprises will have shut down their traditional datacenter, versus 10% today. Why is this? What is it about the traditional datacenter that is no longer fit for purpose?

Increase in data

The world’s population is growing at an unprecedented rate, and an estimated 66% of the planet will be living in urban areas by 2050. This is expected to put a strain on traditional datacenters, which often sit just outside large towns and cities, and we could reach a stage where they run out of capacity to meet our needs. On its own, however, this could be resolved simply by building more, or bigger, datacenters.

Increase in outage incident rates

As well as creating more data, we are also increasingly expecting our data to do more. This means there is increasing complexity and high interdependency in what is being processed in traditional datacenters, and the strain is already starting to show. The Uptime Institute reported that infrastructure outages and “severe service degradation” increased 6 percent over the previous year, to 31 percent of those surveyed, and that over the past three years nearly half of respondents had experienced an outage at their own site or a service provider’s. As the volume of data increases, this is only likely to get worse. Downtime is never ideal, but as we shift towards more automated services, unexpected downtime could become critical. This is a major driver of the shift to more highly available server and storage solutions.

Change in needs

The need for highly available servers and storage leads into the final reason behind the demise of the traditional datacenter: the focus of IT strategy is moving away from an architecture and infrastructure focus and towards an application and service focus. As we do more with technology, we need to find a way to escape the shackles of the traditional datacenter. But what would this look like?

What are the needs of the ‘new’ datacenter?

Security has become a critical factor in how we store our data. Traditional datacenters often have very high levels of physical security, whereas alternatives such as cloud datacenters can be at greater risk of data breach. A viable alternative to the traditional datacenter must be able to ensure data security, especially in a post-GDPR world.

The shift away from the traditional datacenter must produce an alternative solution that at the very least reduces unplanned downtime, if not eliminates it entirely. It must do so not only to maintain the technology we currently have, but also to have the capacity to sustain advancements in AI and other future technologies.

What’s next?

More and more data is going to move to and come from the edge. Advancements in AI, autonomous cars, smart cities, and IoT devices will all mean that there will be more applications producing and processing data on the edge of the network. But how is the datacenter going to evolve to meet our data needs?

Cloud datacenters

Cloud datacenters are becoming increasingly popular due to their scalability. Storage can be scaled up or down, depending on demand, providing flexible centralized storage.

According to Cisco’s Global Cloud Index white paper, “By 2021, 94% of workloads and compute instances will be processed by cloud data centers; 6% will be processed by traditional data centers.” The simplicity and proliferation of both public and private cloud up until now demonstrates its popularity and capabilities, and these will be best utilized by connecting cloud datacenters to the edge.

Connecting the cloud to the edge

There are many reasons that edge sites might need connections to the cloud or a traditional datacenter (backup/archiving, data analytics/big data, reporting). One interesting example: running VDI workloads at the edge while maintaining uptime. The solution: creating hybrid setups that leverage the cloud at the edge, the first of which was launched this year by Citrix and StorMagic.

Edge and IoT

The development of IoT will mean that the sheer scale of data being produced becomes greater than the capacity of the cloud. Equally, some IoT devices will require data to be processed locally. This will see an increase in data being shared between multiple IoT devices at the edge of the network, without being sent back to the cloud. The combination of edge and IoT is likely to become more widespread as autonomous vehicles become more common, more smart cities are developed, and we rely more on locally processing data from IoT sensors.

Autonomous vehicles require near-instantaneous communication across thousands of vehicles. To avoid accidents, fog computing will allow data on hazards such as traffic or roadblocks to be passed continuously among all autonomous vehicles in an area.
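The fog-computing pattern described above can be sketched as a roadside node that relays hazard reports to every vehicle in its area without a round trip to a distant cloud datacenter. This is a toy, in-memory illustration only; all names (`FogNode`, `Vehicle`, `report_hazard`) are hypothetical, not from any real vehicle-to-infrastructure API.

```python
from dataclasses import dataclass, field

@dataclass
class Vehicle:
    """A vehicle that can receive hazard alerts from a local fog node."""
    vehicle_id: str
    alerts: list = field(default_factory=list)

    def receive(self, alert: str) -> None:
        self.alerts.append(alert)

class FogNode:
    """A roadside node that relays hazard reports within one area."""
    def __init__(self, area: str) -> None:
        self.area = area
        self.vehicles = {}  # vehicle_id -> Vehicle

    def register(self, vehicle: Vehicle) -> None:
        self.vehicles[vehicle.vehicle_id] = vehicle

    def report_hazard(self, sender_id: str, hazard: str) -> None:
        # Relay locally to every other vehicle in the area,
        # without involving a central cloud datacenter.
        for vid, vehicle in self.vehicles.items():
            if vid != sender_id:
                vehicle.receive(f"{self.area}: {hazard}")

node = FogNode("Junction 12")
a, b, c = Vehicle("car-a"), Vehicle("car-b"), Vehicle("car-c")
for v in (a, b, c):
    node.register(v)

node.report_hazard("car-a", "roadblock ahead")
print(b.alerts)  # ['Junction 12: roadblock ahead']
```

The point of the sketch is the locality: the reporting vehicle and its neighbours exchange data through nearby infrastructure, which is what keeps latency low enough to matter for collision avoidance.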

The ‘green’ datacenter

Microsoft recently made the news by deploying the first underwater datacenter. This experimental datacenter was a response to evidence that large, traditional cloud datacenters are unsustainable. If the underwater datacenter proves effective, Microsoft hopes to deploy similar models near big cities in an effort to meet the growing need for data. It is still too early to tell whether the technology will be effective, but we look forward to hearing more about what they find.

In summary, it is clear that there will be a significant shift to data processing at the edge. This will allow more data to be created and processed away from the cloud and large corporate datacenters, while still benefiting from high availability, security features such as data encryption, and simple infrastructure and deployment. By connecting edge technology to smaller cloud datacenters, we can hope to create a new wave of flexible, efficient storage that is focused on the needs of services and applications, rather than being made to fit the infrastructure normally associated with the traditional datacenter.
