
Gartner Report Shows IT Catch-22 and the Need for Cloud Automation

Earlier this year, analyst firm Gartner released its Innovation Insight for “Dynamic Optimization Technology for Infrastructure Resources and Cloud Services,” which looks at the current state of enterprise IT infrastructure and the need for new technologies for automating, optimizing and managing software-defined infrastructure. The report also paints an interesting picture of the challenges that IT departments are currently facing.

The analysis by the report’s authors, Donna Scott and Milind Govekar, effectively illustrates that IT departments are, quite frankly, stuck in a Catch-22 between the inefficiencies of traditional data center environments and cloud implementations that are becoming increasingly difficult to manage, secure and afford.

Worse, the cloud may not even be as efficient as many enterprises and IT leaders believe.

Is Cloud Really More Efficient?
IT departments are working to move an increasingly large number of applications and workloads from traditional data centers to the cloud for multiple reasons. First, the cloud offers faster and less expensive provisioning and implementation. Then, there’s the issue of flexibility, mobility and scalability – the cloud has those things baked in. Finally, there is the issue of efficiency.

IT departments traditionally – and smartly – built their IT infrastructures to handle peak demand. They may have even built them to handle more than that to help reduce downtime and to future-proof their infrastructure against growing demand. But this over-provisioning wasn’t efficient, and IT departments found themselves paying for and building infrastructures that were only partially utilized.

Although numbers vary based on what research you’re citing or who you’re asking, I’d estimate that the average utilization of data centers is somewhere between 20 and 30 percent across enterprises – which equates to significant waste and money effectively flushed down the drain.

The cloud is supposed to fix this. The elasticity of cloud offerings should effectively keep IT departments from having to over-provision their infrastructures. Instead, their cloud implementations can be ramped up and down – when necessary – to handle new workloads, provide development and test environments, and handle peaks and valleys in traffic or network utilization.
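To make that elasticity concrete, here is a minimal, provider-agnostic sketch of the kind of threshold-based scaling decision an autoscaler makes. The CPU thresholds, instance limits and function name are illustrative assumptions, not figures from the report.

```python
# Illustrative only: a provider-agnostic sketch of the threshold-based
# scaling decision that cloud autoscalers apply. Thresholds and limits
# are hypothetical, not taken from the Gartner report.

def desired_capacity(current_instances: int,
                     avg_cpu_percent: float,
                     scale_up_at: float = 70.0,
                     scale_down_at: float = 30.0,
                     min_instances: int = 2,
                     max_instances: int = 20) -> int:
    """Return the instance count the fleet should converge to."""
    if avg_cpu_percent > scale_up_at:
        target = current_instances + 1   # add capacity for the peak
    elif avg_cpu_percent < scale_down_at:
        target = current_instances - 1   # shed capacity in the valley
    else:
        target = current_instances       # load is within the comfort band
    return max(min_instances, min(max_instances, target))

# Example: a 10-instance fleet at 82% average CPU scales out to 11.
print(desired_capacity(10, 82.0))   # -> 11
# The same fleet at 12% average CPU scales in to 9.
print(desired_capacity(10, 12.0))   # -> 9
```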

But the Gartner report paints a different picture. Gartner’s research points to a situation that may be even less efficient than the traditional data center. According to the report, “traditional data center issues of low infrastructure asset utilization and over-provisioning of capacity resources have spilled over into the private cloud environment. As reflected by Gartner client inquiries, many have asset utilization of just 10% to 20%, suggesting significant room for improvement in cost and usage…”

And this problem is only exacerbated on the public cloud side, where the pay-as-you-go commercial model leads to rampant over-provisioning. According to the report, companies need ways to, “…enable better utilization of services at lower costs (for example, by changing instance types, rightsizing instances and turning off VMs that are not being used).”
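As an illustration of the “turning off VMs that are not being used” idea, below is a minimal sketch assuming an AWS environment and the boto3 SDK: it flags running instances whose average CPU has stayed below a threshold and, outside of dry-run mode, stops them. The 5 percent threshold, seven-day lookback and dry-run default are hypothetical choices, not recommendations from the report.

```python
# A minimal sketch of "turn off VMs that are not being used," assuming an
# AWS account and the boto3 SDK. The 5% CPU threshold, 7-day lookback and
# dry-run default are illustrative choices.
from datetime import datetime, timedelta, timezone

import boto3

CPU_THRESHOLD = 5.0          # percent; below this we treat the VM as idle
LOOKBACK = timedelta(days=7)

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")


def average_cpu(instance_id: str) -> float:
    """Average CPUUtilization for one instance over the lookback window."""
    end = datetime.now(timezone.utc)
    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
        StartTime=end - LOOKBACK,
        EndTime=end,
        Period=3600,
        Statistics=["Average"],
    )
    points = stats["Datapoints"]
    return sum(p["Average"] for p in points) / len(points) if points else 0.0


def stop_idle_instances(dry_run: bool = True) -> None:
    """Stop running instances whose average CPU stayed under the threshold."""
    reservations = ec2.describe_instances(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    )["Reservations"]
    idle = [
        inst["InstanceId"]
        for res in reservations
        for inst in res["Instances"]
        if average_cpu(inst["InstanceId"]) < CPU_THRESHOLD
    ]
    if idle and not dry_run:
        ec2.stop_instances(InstanceIds=idle)
    print(f"Idle instances: {idle} (dry_run={dry_run})")


if __name__ == "__main__":
    stop_idle_instances()
```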

If the report’s findings are accurate, the cloud may not be the efficiency-generating machine that many believe.

But Wait…There’s More…
The problems with cloud implementations don’t necessarily end with underutilization. There are other ways in which cloud implementations can act as thieves, siphoning dollars and proprietary data out of unsuspecting enterprises. The most common is the decentralization of cloud provisioning decisions.

One of the allures of the cloud is its ability to enable easy on-demand, self-service provisioning of compute resources. Put simply, it’s easy for anyone within an organization to pay for and spin up a cloud server or service with little to no help or oversight from the enterprise’s IT department.

This is a positive when it comes to speed of execution and agility, and it can also allow enterprise IT to focus less on provisioning resources and more on higher-level tasks. But there’s also a dark side. The lack of visibility can mean that the company loses track of cloud resources (a problem commonly referred to as “shadow IT”), loses control of its cloud costs and is otherwise in the dark when it comes to its cloud resources and spending.

The Gartner report confirms this and provides metrics that show that the problem is only going to get worse. According to the report, “As more public and private cloud services are provisioned, there is a great potential for waste in terms of resources consumed and/or dollars spent to achieve both agility objectives and SLAs. This is compounded by decentralized IT spending, which is expected to reach 50 percent by 2020…”

What does this mean? Effectively, the cloud programs that many companies began to help drive cost savings are actually costing them more. How much more? According to the report, “Gartner client inquiries show that it is not uncommon for public cloud service bills to be two to three times higher than expectations.”

And this is what makes the decision to move to the cloud a Catch-22 for IT departments. If they continue to rely on their traditional IT infrastructures and physical data centers, they’ll most likely be overpaying for systems that wind up underutilized, while also losing out on the agility, flexibility and scalability of the cloud. However, if they begin to migrate applications and workloads to the cloud, they may see similarly low or lower utilization rates and higher-than-anticipated cloud costs.

But there are ways they can get everything – scalability, agility, flexibility, efficiency and cost savings.

The Case for Dynamic Optimization Technology
The Gartner report defines Dynamic Optimization Technology as, “a technology capability that uses telemetry, algorithms, service and resource analytics, and policies to drive automated actions that reduce waste, cost and risk exposure, while simultaneously improving service levels.”

Simply put, these new technologies – including today’s advanced cloud management and cloud automation solutions – enable enterprises to better manage their infrastructures – including their cloud resources – in a way that requires little oversight but greatly increases transparency and cost savings.

These new technologies aggregate all of an enterprise’s cloud resources – regardless of cloud provider – onto a single operational framework, or “pane of glass.” In this management pane, policies can be set and implemented across all clouds, changes can be made universally and workloads and data can be shifted seamlessly between cloud resources.
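What that aggregation might look like in practice is sketched below, under the assumption of a simple adapter-per-provider design. The CloudAdapter interface, ControlPlane class and FakeCloud stand-in are hypothetical names used purely for illustration; a real implementation would wrap each provider’s SDK and support far richer policies.

```python
# A minimal sketch of the "single pane of glass" idea: every provider is
# wrapped in a common adapter interface, and one control plane applies the
# same policy across all of them. Names and sample data are hypothetical.
from dataclasses import dataclass
from typing import Iterable, List, Optional, Protocol


@dataclass
class Instance:
    provider: str
    instance_id: str
    owner_tag: Optional[str]
    monthly_cost: float


class CloudAdapter(Protocol):
    def list_instances(self) -> Iterable[Instance]: ...


class ControlPlane:
    """Aggregates inventory from every registered cloud into one view."""

    def __init__(self, adapters: List[CloudAdapter]) -> None:
        self.adapters = adapters

    def inventory(self) -> List[Instance]:
        return [i for a in self.adapters for i in a.list_instances()]

    def policy_violations(self) -> List[Instance]:
        # Example policy: every resource must carry an owner tag so spend
        # can be attributed; untagged resources are flagged for follow-up.
        return [i for i in self.inventory() if not i.owner_tag]


class FakeCloud:
    """Stand-in adapter; a real one would wrap the AWS, Azure or GCP APIs."""

    def __init__(self, instances: List[Instance]) -> None:
        self._instances = instances

    def list_instances(self) -> Iterable[Instance]:
        return self._instances


plane = ControlPlane([
    FakeCloud([Instance("aws", "i-123", "team-web", 210.0)]),
    FakeCloud([Instance("azure", "vm-9", None, 95.0)]),
])
print(plane.policy_violations())   # -> the untagged Azure VM is flagged
```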

This is just what enterprises need to escape the Catch-22.

Migrating to the cloud and implementing a cloud automation or dynamic optimization solution can deliver all of the benefits enterprises are looking for from a cloud implementation, while also providing the tools they need to address concerns about underutilization and cost overruns.

These solutions let IT departments offer self-service, decentralized cloud provisioning because every provisioned cloud resource is aggregated, and policy is applied consistently, within one control plane. Aggregating real-time cloud data and lifecycle controls makes it easier to track cloud expenditures and to manage the utilization of cloud servers: IT departments can move data and applications onto underutilized cloud resources and then spin down the cloud servers that are no longer needed.
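That consolidation step, packing workloads onto fewer servers so the rest can be spun down, can be approximated with a simple first-fit-decreasing heuristic. The toy sketch below uses illustrative capacity and workload numbers; a production optimizer would also weigh memory, affinity, SLA and licensing constraints.

```python
# A toy sketch of the consolidation step described above: pack workloads
# onto as few servers as possible (first-fit decreasing), then spin down
# whatever is left empty. Capacities and workload sizes are illustrative.

def consolidate(workloads: list, server_capacity: float) -> list:
    """Return the per-server packing produced by first-fit decreasing."""
    servers = []
    for load in sorted(workloads, reverse=True):
        for server in servers:
            if sum(server) + load <= server_capacity:
                server.append(load)
                break
        else:
            servers.append([load])   # no room anywhere: open a new server
    return servers


# Ten servers each running one small workload at ~20% utilization...
workloads = [0.2] * 10
packing = consolidate(workloads, server_capacity=1.0)
print(len(packing))   # -> 2 servers needed; the other 8 can be spun down
```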

But there’s another area in which these solutions can help enterprises – cybersecurity. By aggregating all clouds in one place and implementing universal security controls and system updates, every cloud server and resource can be protected from cyberattack by ensuring that vulnerabilities are immediately identified and remediated per standard policy. This is especially important in today’s constantly shifting and increasingly sophisticated threat environment.

According to the Gartner report, “Dynamic Optimization Technology enforces policies that reduce sprawl and wasted costs, increases policy compliance…reduces risks from accidental or malicious activities…[and]…enable[s] better utilization of services at lower costs (for example, by changing instance types, rightsizing instances and turning off VMs that are not being used).”

The Growth of the Dynamic Optimization Technology Market
The move to the cloud is well underway. Enterprises of all sizes are migrating their infrastructure, applications and data to the cloud. And many are implementing multiple, disparate clouds in a hybrid cloud environment – whether public and private clouds, or multiple public clouds from different vendors – as a way to increase resiliency and reap other benefits.

But as enterprises move to the cloud, they’re going to experience all of the pains discussed above – inefficiency, underutilization, security concerns and cost increases. That pain will drive many of them to explore cloud automation, cloud management and other Dynamic Optimization Technologies as a way to overcome these challenges.

According to Gartner, “IT leaders (including I&O, CSBs and IT finance) are realizing the need for Dynamic Optimization Technology because of the rapid expansion of cloud services, and the desire to contain costs or reduce risk.”

If these solutions are indeed the best salve for the pains impacting enterprise cloud implementations, it’s fair to expect significant growth in the Dynamic Optimization Technology market as an increasing number of enterprises move to the cloud. Unsurprisingly, that’s exactly what Gartner is predicting.

Gartner currently estimates adoption of these technologies at less than ten percent of medium and large enterprises. But those numbers are predicted to explode by 2020. According to the report, “we see penetration of public cloud optimization growing significantly, with Dynamic Optimization Technology’s penetration in large enterprises rising to 25 percent by 2020.”

The projection is even higher for private cloud environments. As the report claims, “…by 2020, 35% of large enterprises will implement Dynamic Optimization Technology for private cloud and virtualized infrastructure, up from less than 5% in 2016.”

Enterprises are faced with a Catch-22, having to choose between traditional IT infrastructures that are inefficient, expensive and underutilized, and cloud implementations that have the potential to create significant management and spending issues across the organization. But Dynamic Optimization Technology can break them out of that Catch-22. It can make the benefits of the cloud available to enterprises and help them eliminate many of the cloud’s shortcomings, without sacrificing the self-service and agility that are among the most alluring aspects of cloud implementations.

The capabilities of these technologies are not going unnoticed. As more enterprises migrate to the cloud and experience the pain, cost overruns and management issues that can come with it, Dynamic Optimization Technology will become essential to their IT departments. And that is reflected in the impressive growth Gartner is projecting for the market.

For additional information about Dynamic Optimization Technology, its benefits and the market’s potential for growth, download the Gartner report by clicking HERE.

