Hitachi Data Systems Updates its Content Platform Service
April 24, 2012
Data storage vendor Hitachi Data Systems (HDS) has announced enhancements to its main cloud services: Hitachi Content Platform (HCP) — its object store technology — and Hitachi Data Ingestor (HDI) — which the company bills as a “bottomless, backup-free cloud on-ramp.”
HCP is a virtualized object storage solution that enables simplified, automated data management. It stores data as self-describing objects, and the metadata attached to each object allows storage management and policies to be applied at a granular level. HCP works in conjunction with HDI, a cloud on-ramp and file-serving solution that sends data from distributed locations to a firm’s central infrastructure.
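To make the self-describing-object idea concrete, here is a minimal, hypothetical Python sketch (not HDS’s API; the class and field names are invented for illustration) showing how a policy can be evaluated against an object’s own metadata rather than against the application that created it:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta


@dataclass
class StorageObject:
    """A self-describing object: the payload plus metadata that describes it."""
    object_id: str
    payload: bytes
    metadata: dict = field(default_factory=dict)  # e.g. creator, source app, last access time


def eligible_for_archive(obj: StorageObject, max_idle: timedelta) -> bool:
    """A granular policy check driven entirely by the object's own metadata."""
    last_accessed = obj.metadata.get("last_accessed")
    return last_accessed is not None and datetime.now() - last_accessed > max_idle


# A file untouched for over a year qualifies for a lower storage tier.
doc = StorageObject(
    object_id="report-2010-q4",
    payload=b"...",
    metadata={
        "creator": "jsmith",
        "source_app": "finance-suite",
        "last_accessed": datetime.now() - timedelta(days=400),
    },
)
print(eligible_for_archive(doc, max_idle=timedelta(days=365)))  # True
```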
The enhancements center on managing unstructured data effectively, securing content in shared storage environments and reducing the complexity of distributed IT, all with an eye toward cost reduction.
“Organizations are looking for new, innovative approaches to deal with the expansive growth of unstructured content and extract the full value of their data,” said Miklos Sandorfi, chief strategist of Hitachi Data Systems, in a prepared statement. “Our content cloud approach allows organizations to store billions of data objects and use intelligence layers to index and search the data independently of the application that created it. The data is available across devices, anytime and anywhere.”
In a recent conversation with Tanya Loughlin, manager of Cloud Product Marketing, and Jeff Lundberg, senior product marketing manager for File, Content and Cloud at HDS, Loughlin described HCP’s object storage and metadata capabilities:
“Storing data as an object is storing data about the data: what it is, who created it, how it’s accessed, what application it comes from. We are able to do interesting things with that metadata, being able to find it, create automated policies to do things with data based on some of the criteria or information about the data, [for example] all data over this, do this with it, delete it, move it. It really opens up a lot of automation. And there’s compliance, and metadata certainly plays into the compliance aspect.”
Lundberg then elaborated on the compliance and storage benefits:
“So, to take it a step further, you could look at it from a compliance perspective. [You have] this certain set of data for 50 years, and it could be something along the lines of a tiering strategy, where I want to find all of my files created by a recently retired employee and I want to keep them for 10 more years until I don’t have any more requirements around that [data]. Or maybe I haven’t used this file for a year, let’s take it off of primary, let’s put it into the HCP, and then let’s put it into a volume that we turn the power off, we spin that individual disk down, so it’s not consuming power and generating heat, it’s [just there] to store this file for whatever time we may need it in the future.”
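The retention and tiering decisions Lundberg describes can be sketched as a simple, metadata-driven placement rule. The following Python snippet is purely illustrative; the rule names, thresholds and tier labels are hypothetical, not HCP features:

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical retention and tiering rules; thresholds and tier names are illustrative only.
RETIRED_EMPLOYEES = {"jdoe"}                            # creators whose files carry extended retention
RETENTION_AFTER_RETIREMENT = timedelta(days=365 * 10)   # keep a retired employee's files 10 more years
IDLE_BEFORE_SPIN_DOWN = timedelta(days=365)             # idle threshold before moving to a cold tier


def place_object(metadata: dict, now: Optional[datetime] = None) -> str:
    """Decide where an object belongs based solely on its metadata."""
    now = now or datetime.now()
    # Compliance hold: retain files created by a recently retired employee.
    if metadata.get("creator") in RETIRED_EMPLOYEES:
        if now - metadata["created"] < RETENTION_AFTER_RETIREMENT:
            return "retain-on-hcp"
    # Cold tier: move idle files to a volume whose disks are spun down.
    if now - metadata["last_accessed"] > IDLE_BEFORE_SPIN_DOWN:
        return "spun-down-volume"
    return "primary-storage"


print(place_object(
    {"creator": "jdoe", "created": datetime(2012, 1, 15), "last_accessed": datetime(2012, 3, 1)},
    now=datetime(2012, 4, 24),
))  # -> retain-on-hcp
```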
In December 2011, Hitachi Data Systems announced upgrades to HDI, including new content-sharing, file-restore and NAS migration capabilities. Last summer, VMware and Hitachi partnered to create a financial cloud solution built on VMware’s vSphere.
Talkin’ Cloud readers can click here to learn about the Hitachi Data Systems partner program.