SolidFire: QoS in Cloud Storage is Architecture, Not a Feature
SolidFire, which designs and sells solid-state (SSD) storage systems for the cloud, is hoping to challenge the cloud storage industry to make QoS a priority. The company has laid out six requirements it says end users need to guarantee QoS.
March 21, 2013
According to SolidFire, QoS should be part of the cloud storage architecture rather than a feature. In part, this means drafting and adhering to service level agreements (SLAs) regarding storage performance.
“Cloud service providers should be looking at quality of service with the long-term goal of writing firm SLAs against storage performance,” said Dave Wright, SolidFire’s founder and CEO, in a prepared statement. “SolidFire’s QoS Benchmark educates providers with a clear methodology for understanding the conditions that are necessary to deliver guaranteed quality of service. Without it, cloud providers will remain unable to efficiently meet the rising performance requirements of their enterprise customers.”
SolidFire may have the end user’s best interests in mind, but of course there’s a self-serving element here, too: all-SSD storage products fit squarely into the benchmark, and that’s the market SolidFire focuses on. Even so, the six requirements make a certain amount of sense. The question is what impact they would have on cloud storage pricing. Take a gander at the six requirements laid out in the benchmark, and see for yourself:
An all-SSD architecture that enables the delivery of consistent latency for every IO.
True scale-out architecture that allows for linear, predictable performance gains as systems scale.
RAID-less data protection to ensure predictable performance regardless of the failure condition.
Balanced load distribution to eliminate hot spots that create unpredictable IO latency.
Fine-grain QoS control that completely eliminates the “noisy neighbor” problem and guarantees volume performance.
Performance virtualization that enables control of performance independent of capacity and on demand.
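The "fine-grain QoS control" requirement generally means per-volume performance limits, such as a steady-state IOPS cap with a short burst allowance, so one tenant's workload cannot flood the array and add latency for everyone else. As a rough illustration of the idea, here is a minimal, hypothetical sketch of a per-volume token-bucket limiter in Python; the `VolumeQoS` class and the numbers are illustrative only, not SolidFire's actual product API:

```python
import time

class VolumeQoS:
    """Hypothetical per-volume QoS limiter using a token bucket.

    Tokens refill at max_iops per second; burst_iops caps how much
    credit an idle volume can bank, so it may briefly exceed its
    steady-state rate but not sustain it.
    """

    def __init__(self, max_iops, burst_iops):
        self.max_iops = max_iops
        self.burst_iops = burst_iops       # bucket capacity
        self.tokens = float(burst_iops)    # start with full burst credit
        self.last = time.monotonic()

    def admit(self, n_ios=1):
        """Return True if n_ios may be issued now, False if throttled."""
        now = time.monotonic()
        # Refill at max_iops tokens/second, capped at burst_iops.
        self.tokens = min(self.burst_iops,
                          self.tokens + (now - self.last) * self.max_iops)
        self.last = now
        if self.tokens >= n_ios:
            self.tokens -= n_ios
            return True
        return False

# One bucket per tenant volume keeps a "noisy neighbor" in check:
# IOs beyond the volume's credit are simply not admitted.
vol = VolumeQoS(max_iops=1000, burst_iops=2000)
admitted = sum(vol.admit() for _ in range(3000))  # roughly the burst credit
```

A cap like this only bounds the loudest tenant; delivering a guaranteed *minimum* to every volume additionally requires the scheduler to reserve and redistribute capacity across all buckets, which is why the benchmark frames QoS as an architectural property rather than a bolt-on feature.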
It sounds like a dream come true in the cloud storage realm: fast, reliable, low-latency storage. But guarantees like these must come at a higher price. That might be fine for high-availability storage, but how would it affect the low-cost archival cloud storage offerings that have been growing and expanding over the last year? It might be too much for the average business.
“Quality of service should not be regarded as a feature that can simply be added to a storage product. QoS functionality that is bolted on after the fact tends to leave conditions in which performance is unpredictable and remains a non-starter for business-critical applications. Complete storage QoS requires that it must be considered and implemented at the very core of storage product design,” said Simon Robinson of The 451 Group in a prepared statement. “Without this approach, customers end up with features like rate limiting, prioritization schemes, and tiering algorithms that are designed more to protect the storage system than they are to deliver guaranteed performance to a customer.”