Storage Virtualization, Hyperscale, and the Abstraction of Data
Our ability to control information and data center resources continues to break new ground. Data abstraction and new levels of hyperscale now make for an even more efficient data center platform.
May 20, 2015
We’ve hit an interesting point in the data center world. Organizations are actively working to reduce complexity, optimize their existing infrastructure, and improve the end-user experience. The interesting part is how all of these organizations (large and small) are actually controlling all of this new data. At this point, businesses of all sizes are admitting that they have a series of storage challenges:
How do I continue to accelerate the user experience without adding more hardware?
What are my options in using logical optimizations around storage?
How do I seamlessly extend my data into the cloud…securely?
Is a commodity data center a reality for me?
Believe it or not, software-defined storage (aka storage virtualization) can overcome many of the challenges listed above. The amazing part is that your organization no longer has to worry about buying expensive gear to keep pace with the constantly changing demands of users. There are always more options.
Understanding data abstraction (virtualization) and hyperscale. New kinds of storage technologies revolving around storage virtualization and hyperscale are creating very real buzz in this industry. Software-defined storage allows you to point any and all storage resources to a virtual appliance for very granular control. From there, new kinds of hyperscale platforms allow a completely heterogeneous storage environment to work in unison. Most of all, SDS and hyperscale appliances can take older storage appliances and present next-gen storage features at the hypervisor layer. Not only can this extend the life of your existing storage arrays; it also directly optimizes any storage being presented to the intelligent, virtual, hyperscale (and SDS) appliance.
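To make the pooling idea concrete, here is a minimal sketch in Python of how an SDS controller might aggregate heterogeneous arrays into one logical pool and provision capacity by policy. Every class and field name here is hypothetical; real controllers expose far richer placement, tiering, and QoS logic.

```python
# Conceptual sketch only (not a vendor API): an SDS controller that
# pools any registered array and carves out volumes by policy.

class BackendArray:
    def __init__(self, name, capacity_gb, tier):
        self.name = name              # e.g. "netapp-fas01"
        self.capacity_gb = capacity_gb
        self.tier = tier              # e.g. "flash", "hybrid", "archive"

class SDSController:
    """Aggregates any registered array into one logical pool."""
    def __init__(self):
        self.backends = []

    def register(self, array):
        # Any array, new or legacy, joins the same virtual pool.
        self.backends.append(array)

    def provision(self, size_gb, tier):
        # Place the volume on the first backend that matches the
        # policy and has room; real controllers use smarter placement.
        for b in self.backends:
            if b.tier == tier and b.capacity_gb >= size_gb:
                b.capacity_gb -= size_gb
                return f"volume:{size_gb}GB on {b.name}"
        raise RuntimeError("no backend satisfies the policy")

pool = SDSController()
pool.register(BackendArray("nimble-01", 8192, "flash"))
pool.register(BackendArray("netapp-fas01", 24576, "hybrid"))
print(pool.provision(500, "flash"))   # -> volume:500GB on nimble-01
```

The abstraction lives in register() and provision(): the consumer asks for capacity and a policy, never for a specific array.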
Creating a logical controller head. Picking up on the last point: many organizations now have a number of storage appliances serving specific purposes. For example, a Nimble Storage array can be in place optimizing VDI workloads while a NetApp FAS acts as a filer and data storage array. With a logical SDS storage controller, you still have these powerful technologies in place, but you’re optimizing the presented storage via a software controller. Legacy storage appliances can now leverage powerful features to better enhance VM performance. For example, where an older physical appliance can’t support VAAI (and so hits the practical limit of 256 managed VMs per LUN), it can now point its storage to a logical layer and let that virtual appliance act as the VAAI engine, eliminating the limitation and prolonging the life of the appliance. Here’s the other reality: there is nothing stopping you from buying your own Cisco C-Series server and provisioning it with your own disks. Then simply point those storage resources to an SDS controller and congratulations, you’ve just created your own storage appliance with a virtual controller.
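As a rough illustration of that VAAI point, the Python sketch below (with invented names; this is not a real VAAI interface) shows a virtual controller head answering a full-copy-style offload request on behalf of a legacy array that can’t service it, so the hypervisor never has to shuttle the blocks itself.

```python
# Hypothetical model of a logical controller head fronting a legacy
# array. Names and interfaces are illustrative assumptions, not a
# real VAAI implementation.

class LegacyArray:
    supports_offload = False  # the physical appliance lacks VAAI

    def read_blocks(self, lun, start, count):
        return [f"{lun}:{i}" for i in range(start, start + count)]

    def write_blocks(self, lun, start, blocks):
        pass  # persist the blocks; stubbed out for the sketch

class VirtualControllerHead:
    def __init__(self, backend):
        self.backend = backend

    def clone(self, src_lun, dst_lun, start, count):
        # The hypervisor issues a single offloaded clone request.
        # Because the legacy array can't service the primitive, the
        # controller head performs the copy on the array's behalf.
        blocks = self.backend.read_blocks(src_lun, start, count)
        self.backend.write_blocks(dst_lun, start, blocks)
        return f"cloned {count} blocks: {src_lun} -> {dst_lun}"

head = VirtualControllerHead(LegacyArray())
print(head.clone("lun0", "lun1", 0, 1024))
```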
Using cloud extensions. This is a big one. Your data must be agile and able to scale between data centers and cloud points. SDS and data virtualization allow you to seamlessly connect with a variety of cloud providers to push and replicate data between your data center and a hybrid cloud architecture. New SDS solutions now integrate directly with technologies like OpenStack and VMware vCAC, along with other cloud connectivity platforms. What’s more amazing is that you can connect your Hadoop environment right into your SDS layer. This is true data control at the virtual layer, because the information can traverse any storage environment, cloud-based or on premises.
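For the cloud-extension piece, here is a minimal sketch of pushing a replica into an OpenStack Swift object store with the python-swiftclient library (pip install python-swiftclient). The endpoint, credentials, container, and file path are placeholders, not values from any particular deployment.

```python
# Minimal sketch: replicate a local snapshot into an OpenStack Swift
# container. All credentials and paths below are placeholders.

from swiftclient import client

conn = client.Connection(
    authurl="https://cloud.example.com/auth/v1.0",  # placeholder endpoint
    user="tenant:replicator",                       # placeholder account
    key="secret",                                   # placeholder key
)

conn.put_container("dr-replicas")  # create the target container if absent

# Push the replica into the hybrid-cloud tier as an object.
with open("/backups/vm-datastore-snap.img", "rb") as snapshot:
    conn.put_object("dr-replicas", "vm-datastore-snap.img",
                    contents=snapshot)

print("replica pushed to cloud object store")
```

An SDS platform automates and schedules this kind of movement for you; the sketch just shows how thin the on-ramp to a cloud tier can be.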
The proliferation of cloud computing and mobile devices will have an impact on your business. The cloud is producing so much new data that the modern data center must adopt new ways to control all of this information. Here’s the important part: controlling this data is only one of the steps. For many organizations, deploying a Hadoop cluster is critical to quantifying and analyzing vital data points.
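For a feel of the analysis side, here is the classic Hadoop Streaming word-count mapper in Python. It is a standard pattern rather than anything specific to the platforms above, and it assumes a matching reducer and the stock hadoop-streaming jar for submission.

```python
#!/usr/bin/env python
# Standard Hadoop Streaming mapper: read records on stdin, emit
# "<key>\t<count>" pairs on stdout. Hadoop shuffles identical keys
# to the same reducer, which sums the counts.
import sys

for line in sys.stdin:
    for token in line.strip().split():
        print(token + "\t1")
```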
Consider the following from the latest Cisco Global Cloud Index:
Quantitatively, the impact of cloud computing on data center traffic is clear. It is important to recognize that most Internet traffic has originated or terminated in a data center since 2008.
Data center traffic will continue to dominate Internet traffic for the foreseeable future, but the nature of data center traffic is undergoing a fundamental transformation brought about by cloud applications, services, and infrastructure.
The importance and relevance of the global cloud evolution is highlighted by one of the top-line projections from this updated forecast: by 2018, seventy-eight percent (over three-quarters) of data center traffic will be cloud data center based.
With these stats in mind, how ready are you to take on the data challenges surrounding next-gen workloads? Is your organization controlling data at the virtual layer? Remember, the amount of data hitting your storage environment will only continue to increase. Deploying powerful data abstraction and management solutions today can greatly impact your evolving business model moving forward.