The Evolution of the Datacenter

Virtualization has become a critical component in IT infrastructures of all sizes, from SMBs to the enterprise. Not so long ago, businesses were reluctant to virtualize mission-critical applications and servers such as SQL Server, Exchange, and SharePoint; however, that has now changed.

Michael Otey

May 16, 2011


Today, the corporate datacenter is undergoing a shift in the fundamental technologies that make up the IT infrastructure. In the not-so-distant past, almost every new server application was deployed on its own server platform. This was the established best practice because it prevented resource-intensive applications such as Exchange and SQL Server from interfering with each other. After a few years, however, businesses found that this tactic led to a proliferation of servers throughout the organization. This situation of multiple single-purpose servers, termed "server sprawl," created management problems and imposed significant power and cooling requirements. In addition, these single-purpose servers were typically underutilized: studies by several organizations, including Microsoft and Gartner, put the average server utilization rate for most organizations at less than 15 percent. This means that most companies are not getting full value from their investments. They are buying computing capacity they don't use, while expanding their operating costs and creating management headaches in the process.

The advent of production-ready virtualization radically changed the way IT staffs deploy servers and mission-critical applications. Hypervisor-based virtualization platforms such as Microsoft's Hyper-V and VMware's ESX Server can now deliver performance within 90 to 95 percent of what the same workload achieves on a physical server. This production-level performance has enabled businesses to deploy their mission-critical applications on virtualized servers. In fact, moving an application from an older hardware platform to a newer, high-performance platform often yields performance gains even when the application runs in a virtual machine.

Server virtualization has become a critical component in IT infrastructures of all sizes, from SMBs to the enterprise. Today, most organizations use virtualization primarily for server consolidation, which lets them maximize hardware resources and implement a scalable, efficient IT infrastructure. Recent Gartner estimates indicate that 30 percent of all enterprise workloads are now virtualized. In addition to increasing ROI, server consolidation through virtualization brings several other important benefits to the IT infrastructure, including enhanced business continuity and dynamic IT operations. IT organizations are increasingly comfortable leveraging virtualization while still meeting quality-of-service requirements for mission-critical applications.

One of the fundamental tenets of virtualization is that it abstracts the server platform and applications from the underlying hardware. This has a couple of vital implications for business continuity. First, on the disaster recovery side, running your servers in VMs lets you snapshot a VM and restore it very quickly on another virtualization host, eliminating the need for time-intensive bare-metal installations followed by equally time-consuming data restores. Second, technologies such as Hyper-V's Live Migration and VMware's VMotion reduce planned downtime by moving running VMs between virtualization hosts, so you can perform scheduled maintenance on those hosts with no interruption of end-user services. These operational advantages are driving virtualization adoption in IT today; Gartner estimates that by 2012 about half of all IT servers will be virtualized.
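The planned-maintenance flow described above can be sketched in a few lines of Python. This is an illustrative model, not the actual Hyper-V or VMware API: the `Host`/`VM` classes and the placement rule (send each VM to the target with the most free memory) are assumptions made for the example.

```python
# Sketch: drain a host by "live-migrating" its VMs to the remaining hosts
# so the empty host can be patched or rebooted with no end-user downtime.
# Host, VM, and drain_host are hypothetical names for illustration only.

class VM:
    def __init__(self, name, memory_gb):
        self.name = name
        self.memory_gb = memory_gb

class Host:
    def __init__(self, name, capacity_gb):
        self.name = name
        self.capacity_gb = capacity_gb
        self.vms = []

    def used_gb(self):
        return sum(vm.memory_gb for vm in self.vms)

    def free_gb(self):
        return self.capacity_gb - self.used_gb()

def drain_host(host, targets):
    """Move every VM off `host`, always to the target with the most free memory."""
    for vm in list(host.vms):
        target = max(targets, key=lambda h: h.free_gb())
        if target.free_gb() < vm.memory_gb:
            raise RuntimeError(f"no capacity left for {vm.name}")
        host.vms.remove(vm)
        target.vms.append(vm)   # in a real cluster, this step is the live migration
    return host                 # host is now empty and safe to maintain

# Usage: drain hostA before scheduled maintenance
a = Host("hostA", 64); b = Host("hostB", 64); c = Host("hostC", 64)
a.vms = [VM("sql01", 16), VM("exch01", 24)]
drain_host(a, [b, c])
print(a.vms)  # hostA now runs no VMs; sql01 and exch01 continue elsewhere
```

The capacity check matters in practice, too: a maintenance drain only works if the surviving hosts have enough headroom to absorb the displaced workloads.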

For most businesses, the next evolutionary step is the move toward dynamic IT operations and the integration of private or public cloud technologies, and virtualization is the cornerstone of both. A dynamic IT infrastructure automatically adjusts and configures IT resources to meet end-user demand. For instance, System Center Virtual Machine Manager 2008's Performance and Resource Optimization (PRO) feature works in conjunction with System Center Operations Manager to monitor virtualization host and guest workloads. If the resource utilization of either the host or a guest exceeds a predefined threshold, one or more VMs can be dynamically live-migrated to other hosts.

Likewise, the cloud builds on virtualization to supply dynamic end-user services. Building those services on your internal IT infrastructure is the basis of the private cloud. Successfully meeting quality-of-service requirements demands that IT organizations deliver simplified, automated management capabilities; industry notables such as EMC are working with virtualization leaders such as VMware and Microsoft to develop tools that integrate with existing management dashboards and simplify the management of these virtualized environments. Buying those services from an external provider such as Microsoft's Windows Azure is the basis of the public cloud. As you might guess, these aren't mutually exclusive; combining public and private clouds is known as the hybrid cloud. In all cases, the goal is greater flexibility and lower cost: the cloud lets you rapidly grow and shrink infrastructure as your business needs require, while paying only for the services your business consumes.
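The threshold-driven rebalancing that PRO performs can be sketched as a simple monitoring pass. This is a conceptual model only, not SCVMM's actual interface: the dictionary data model, the 80 percent trigger, and the "move the cheapest VM to the least-loaded host" policy are assumptions made for the example.

```python
# Sketch: one monitoring pass of threshold-driven VM rebalancing.
# If a host's utilization crosses THRESHOLD, live-migrate VMs off it
# to the least-loaded host until it drops back under the threshold.
# THRESHOLD and the host/VM dict layout are illustrative assumptions.

THRESHOLD = 0.80  # assumed utilization trigger

def utilization(host):
    return host["used"] / host["capacity"]

def rebalance(hosts):
    """Return the list of (vm, source, target) migrations performed."""
    moves = []
    for host in hosts:
        while utilization(host) > THRESHOLD and host["vms"]:
            vm = min(host["vms"], key=lambda v: v["load"])  # cheapest VM to move
            target = min(hosts, key=utilization)            # least-loaded host
            if target is host or utilization(target) > THRESHOLD:
                break  # nowhere better to put it; stop trying
            host["vms"].remove(vm);   host["used"] -= vm["load"]
            target["vms"].append(vm); target["used"] += vm["load"]
            moves.append((vm["name"], host["name"], target["name"]))
    return moves

# Usage: h1 is at 90% utilization, h2 at 20%
hosts = [
    {"name": "h1", "capacity": 100, "used": 90,
     "vms": [{"name": "vm1", "load": 50}, {"name": "vm2", "load": 40}]},
    {"name": "h2", "capacity": 100, "used": 20,
     "vms": [{"name": "vm3", "load": 20}]},
]
print(rebalance(hosts))  # [('vm2', 'h1', 'h2')]
```

After the pass, h1 sits at 50 percent and h2 at 60 percent, both under the threshold, which is the steady state this kind of automation aims to maintain without administrator intervention.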


