Despite the widespread advances of virtualization in the enterprise data center, we continue to see the familiar ratio of ~70% of the IT budget spent "keeping the lights on" versus ~30% invested in innovation—detailed in a recent post and shown in these cross-industry Gartner IT Key Metrics. Given that a significant portion of the typical IT budget is spent on IT Operations, the question arises whether there is an alternative approach with a razor-like focus on reducing operational costs.
One place to create more efficiency is in the deployment of IT staff. Look at large-scale web providers, such as Google, Facebook and others, whose server-to-admin ratios are widely purported to be extremely high—5000:1, compared with 40:1 at a typical IT shop. How are these cloud service providers driving operational efficiencies more than two orders of magnitude greater than many IT departments? Part of the answer is their radical simplification and standardization of infrastructure; a big part, however, is the dramatic automation of manual, time-consuming processes. Wikibon describes the Hyperscale model as:
“The application is put on the most suitable set of hardware, and the software management services are provided by software packages, open-source software services, or services provided by the application. The application is designed or chosen to be compatible with a scale-out commodity system design. All possible physical maintenance is eliminated, both from vendors and in the data center.”
VMTurbo provides Software-Defined Control that forms a major constituent of the aforementioned software management services. As the Software-Defined Data Center continues its inexorable march across compute, storage and network, the number of software control points is growing exponentially, and we need the right intelligence to ensure we are driving the appropriate actions to maximize utilization of the infrastructure while assuring application performance. Making these resource allocation decisions manually requires significant time, cost and human capital. In reality, modern virtualized and cloud environments change too quickly, and have too many interconnected moving parts, for such decisions to be made by hand.
The goal of Hyperscale operations is to dynamically match IT resources with application demand—or, as Dave Cartwright at The Register succinctly puts it, "How to grow and shrink like Alice." Fundamentally, this means understanding what workloads to run where and when, across both our private and public clouds, while respecting business policies and constraints.
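To make the "what workloads to run where" decision concrete, here is a toy sketch in Python of constraint-aware placement: a greedy first-fit pass that assigns each workload to a host with sufficient free capacity in a zone its business policy allows. This is purely illustrative—the host names, fields, and greedy strategy are assumptions for the example, not a description of how VMTurbo's control engine actually decides.

```python
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    cpu_free: float   # GHz remaining
    mem_free: float   # GB remaining
    zone: str         # e.g. "private" or "public"

@dataclass
class Workload:
    name: str
    cpu: float
    mem: float
    allowed_zones: set  # business-policy constraint: where this workload may run

def place(workloads, hosts):
    """Greedy first-fit: largest workloads first, each assigned to the
    first host with enough free CPU and memory in an allowed zone."""
    placement = {}
    for w in sorted(workloads, key=lambda w: w.cpu, reverse=True):
        for h in hosts:
            if (h.zone in w.allowed_zones
                    and h.cpu_free >= w.cpu
                    and h.mem_free >= w.mem):
                h.cpu_free -= w.cpu   # reserve capacity on the chosen host
                h.mem_free -= w.mem
                placement[w.name] = h.name
                break
        else:
            placement[w.name] = None  # no feasible host under current constraints
    return placement

hosts = [Host("h1", 4.0, 16.0, "private"), Host("h2", 8.0, 32.0, "public")]
workloads = [
    Workload("db",  6.0, 24.0, {"private", "public"}),
    Workload("web", 2.0,  4.0, {"private"}),  # policy: private cloud only
]
print(place(workloads, hosts))  # {'db': 'h2', 'web': 'h1'}
```

Even this trivial version shows why manual decision-making does not scale: every arrival, departure, or demand change can invalidate the current placement, which is exactly the continuous re-evaluation that software control automates.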
With close to 60% of surveyed VMTurbo customers reporting an increase in IT staff productivity of 20% or more, VMTurbo's Software-Defined Control offers a pragmatic path toward Hyperscale operational efficiencies in today's virtualized data centers.