2020 Saw Intelligent Data Storage Come Into Its Own
Few products on the market simply stash data anymore; smart data storage protects it, ensures it is available when needed, and manages it throughout its lifecycle.
While data storage has always been about more than just storing data, 2020 might be the year when it finally outgrew its name. Today, fewer storage products on the market than ever simply store data: they protect it, ensure it is available when needed, and manage it throughout its lifecycle. In other words, we've been witnessing the rise of intelligent data storage.
So how did this happen? Credit the recognition that data is a critical business asset: the basis for gaining valuable insight, making effective decisions, improving productivity and increasing the bottom line. Because of that focus, companies are looking for ways to harness the value of all of their data, not only from typical business applications but also from sensors, 5G networks, cameras and other sources that generate often unstructured data in real time. The nature of workloads also has shifted, demanding more of that data, and those demands trickle down to the technology used to store it.
The growing use of data for analytics is a major driver of this shift toward more intelligent data storage. While analytics used to be more of an add-on function, it is now rolled into data storage products. Vendors like Pure Storage, Qumulo, Scality and Dell EMC have made an art form of combining the two, enabling organizations to access and analyze all types of data, including log data, in real time. The push to incorporate more data sources at the edge is only accelerating this shift.
The trend toward hybrid cloud is another factor in the shift. While companies have been adopting cloud-based storage for years, there is something of a backslide toward a hybrid model. This happens when organizations begin to realize that the cloud is not a storage panacea. It's often not as secure, scalable or cost-effective as they had hoped, and it can cause the same type of data silos as on-premises storage. ESG actually predicted this shift last year, when it noted that most companies had moved at least one public cloud workload back on premises in 2019.
The growing use of containers is another major factor. DataCore's Eighth Consecutive Survey on the Storage Market found that 42% of respondents are deploying containers in some manner, and they want better storage tools to manage them.
More companies are looking for ways to support persistent storage for container-based workloads. They want greater portability and manageability across their container environments, along with the ability to move those workloads across hybrid cloud environments.
“Managing persistent storage with containers is very different from what we do with virtual machines and certainly bare metal,” said Steve McDowell, a senior analyst at Moor Insights & Strategy. “When you deploy a fleet of virtual machines across your data center, your storage administrator may be provisioning storage a few times a day. But containers are created and destroyed thousands of times a day. That makes their storage requirements very different.”
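To make that contrast concrete, here is a minimal sketch of how a containerized workload typically requests persistent storage in Kubernetes: rather than an administrator provisioning a volume by hand, the application declares a PersistentVolumeClaim and the storage layer provisions it on demand. The sketch uses the official Kubernetes Python client; the claim name, namespace and "fast-ssd" storage class are hypothetical placeholders, not details from DataCore or any vendor mentioned here.

# Minimal sketch: dynamically requesting persistent storage for a container
# workload via a Kubernetes PersistentVolumeClaim (PVC).
# Assumes a cluster with a storage class named "fast-ssd" (hypothetical).
from kubernetes import client, config

config.load_kube_config()   # use local kubeconfig credentials
core = client.CoreV1Api()

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="app-data"),   # hypothetical claim name
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="fast-ssd",                # assumed storage class
        resources=client.V1ResourceRequirements(requests={"storage": "10Gi"}),
    ),
)

# The claim is created on demand; the storage behind the class provisions
# and binds a volume without an administrator in the loop.
core.create_namespaced_persistent_volume_claim(namespace="default", body=pvc)

Because claims like this can be created and deleted thousands of times a day as containers come and go, the provisioning logic has to live in software rather than in a storage administrator's ticket queue.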
Data Storage, 2020 Version
So what is storage today? It's intelligent: everything associated with storage today is intelligent. Containers are intelligent enough to determine where data should be at every point in a process. Storage fabrics are intelligent enough to understand how to route your data to where it needs to be in the most effective way. Storage media itself is intelligent, as vendors build intelligence and even AI into their flash modules. Storage arrays are smarter. And storage management itself is more intelligent than ever, moving data to where it needs to be to minimize latency and other bottlenecks.
Getting data where it's needed, when it's needed and with the performance required also has driven more interest in software-defined storage. This model, which typically replaces dedicated storage arrays, uses software to direct the action. That allows organizations to be more flexible in managing, storing, sharing and protecting data. When combined with the high-performance capabilities of technologies like NVMe-oF (NVMe over Fabrics), everything gets better: flexibility, scalability, performance and availability.
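As a rough illustration of what "software directing the action" means, here is a deliberately simplified, hypothetical placement policy: the software layer, not a dedicated array, decides which backend a dataset lands on based on its performance needs and cost. Real software-defined storage products implement far richer logic; the tier names, latencies and costs below are invented for illustration only.

# Hypothetical sketch of software-defined placement: software chooses the
# storage backend for a dataset instead of tying data to a dedicated array.
from dataclasses import dataclass

@dataclass
class Backend:
    name: str
    latency_ms: float    # typical access latency
    cost_per_gb: float   # relative monthly cost

# Invented example tiers: an NVMe-oF pool, an SSD array, and an object tier.
BACKENDS = [
    Backend("nvme-of-pool", latency_ms=0.1, cost_per_gb=0.25),
    Backend("ssd-array", latency_ms=1.0, cost_per_gb=0.10),
    Backend("object-tier", latency_ms=50.0, cost_per_gb=0.02),
]

def place(dataset_gb: float, max_latency_ms: float) -> Backend:
    """Pick the cheapest backend that still meets the latency requirement."""
    candidates = [b for b in BACKENDS if b.latency_ms <= max_latency_ms]
    if not candidates:
        raise ValueError("no backend meets the latency requirement")
    return min(candidates, key=lambda b: b.cost_per_gb * dataset_gb)

# A latency-sensitive working set lands on the NVMe-oF pool, while an
# archive dataset falls through to the cheap object tier.
print(place(500, max_latency_ms=0.5).name)    # nvme-of-pool
print(place(500, max_latency_ms=100.0).name)  # object-tier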
As 2021 rolls in, the morphing of intelligent data storage will continue. Over the next year, McDowell expects to see advances that support this, including more intelligence in storage adapters, more disaggregated storage, more native integration between container orchestration and underlying storage fabrics, and even computational storage.
Companies are ready for this change. More organizations than ever now have chief data officers, responsible for making the most of data and embracing intelligent data storage. Gartner predicts that three-quarters of large companies will have a CDO by next year. Smaller companies have been slower to take the same serious approach to data, but there is progress: a CompTIA survey found that about one-third of small businesses now have internal IT staff dedicated to data management.