IBM Taps Actifio For Cloud Storage Service

SmartCloud Data Virtualization puts virtualized copies of application data in the cloud, then makes them available wherever they're most needed.

IBM has launched SmartCloud Data Virtualization, a cloud storage service that depends on Actifio's Virtual Data Pipeline technology. Actifio's "one golden copy" approach to data may appeal to enterprises that need multiple copies of every application's data but want to rein in storage costs.

IBM previously recommended that some of its customers store data in the now-defunct Nirvanix's cloud datacenters. Its partnership with Actifio acknowledges that cloud storage must be backed by stronger vendors with more sophisticated data management systems. SmartCloud Data Virtualization is IBM's first major foray into cloud-based data management since Nirvanix declared bankruptcy last October.

Through IBM's SmartCloud Data Virtualization service, Actifio can put a front end on multiple applications, each with its own storage system. Enterprise users of the data continue to access it the same way, as though it were still down the hall in the datacenter, even though it may be in a distant IBM cloud environment. Actifio, in effect, allows an application's production data to be separated from its physical location and operation and placed in the most suitable storage location.

The Actifio system is equipped with enough storage software smarts to decide which data is in high demand and should be held in an on-premises cache, and which is less frequently accessed and can be placed in distant cold storage. IBM's February 4 announcement pitched the service primarily as a cloud-based recovery system, cheaper than a conventional disaster recovery system that depends on duplicate hardware in a separate location.

[Think backup is a pain? Virtualization just makes it worse. See Data Protection Must Change In Virtualization Age.]

Actifio's core idea is that there can be one golden copy of an application's data, much as there may be one golden copy of a virtual desktop, from which thousands of clones can be created and sent to users on short notice. That golden copy should then be available as a "virtualized copy" wherever it's needed. Actifio called the approach Copy Data Management when it launched as a startup in 2009.
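The golden-copy-plus-clones idea can be illustrated with a minimal copy-on-write sketch: clones share the golden copy's blocks until they are written to, which is why thousands of them can be created cheaply. All names here are illustrative assumptions, not Actifio's actual implementation.

```python
# Copy-on-write sketch of the "one golden copy" idea (illustrative only).

class GoldenCopy:
    def __init__(self, blocks):
        self.blocks = list(blocks)  # the single authoritative data set

class Clone:
    def __init__(self, golden):
        self.golden = golden
        self.overrides = {}  # block index -> private copy, populated on write

    def read(self, i):
        # Reads fall through to the golden copy unless the clone wrote the block.
        return self.overrides.get(i, self.golden.blocks[i])

    def write(self, i, data):
        # Writes never touch the shared golden copy.
        self.overrides[i] = data

golden = GoldenCopy([b"alpha", b"beta", b"gamma"])
clone = Clone(golden)    # created instantly; no data is copied
clone.write(1, b"BETA")
print(clone.read(0))     # b'alpha'  (shared with the golden copy)
print(clone.read(1))     # b'BETA'   (clone's private block)
print(golden.blocks[1])  # b'beta'   (golden copy is untouched)
```

Because a clone records only its own changes, creating one costs almost nothing regardless of how large the golden copy is.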

To move data, an IBM cloud storage user activates the Actifio Virtual Data Pipeline using a service-level agreement. The pipeline works with virtualized copies of application data, which can be handled as a file or object within the pipeline's distributed object file system. The file system can apply basic data-management commands, including copy, store, move, and restore.
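The four commands named above can be sketched against a toy object store. The class, method, and location names are assumptions for illustration, not IBM's or Actifio's actual API.

```python
# Hypothetical sketch of the basic data-management commands applied to a
# virtualized copy held in an object file system (illustrative only).

class VirtualCopyStore:
    def __init__(self):
        self.locations = {}  # location name -> {copy name: data}

    def store(self, location, name, data):
        self.locations.setdefault(location, {})[name] = data

    def copy(self, src, dst, name):
        # Duplicate a virtualized copy into another location.
        self.store(dst, name, self.locations[src][name])

    def move(self, src, dst, name):
        # Copy, then drop the source: the data changes location.
        self.copy(src, dst, name)
        del self.locations[src][name]

    def restore(self, location, name):
        # Hand the stored copy back to the caller.
        return self.locations[location][name]

vcs = VirtualCopyStore()
vcs.store("on-premises", "erp-data", b"production data")
vcs.copy("on-premises", "ibm-cloud", "erp-data")  # recovery copy in the cloud
print(vcs.restore("ibm-cloud", "erp-data"))       # b'production data'
```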

While IBM's announcement focused on disaster recovery, the technology has many potential uses. A recovery copy of an application's data is timestamped and shipped to a storage location in the cloud, then updated with snapshots as frequently as the owner desires. Unlike full-data replications or mirrored images, snapshots capture only the changes since the initial timestamp. They consume less network bandwidth and fewer compute resources because only the changes are shipped to the cloud, where they're used to update the original.
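The bandwidth saving can be made concrete with a block-level delta sketch: hash each block, ship only the blocks whose hashes changed, and apply them to the cloud copy. The block size and hashing scheme are illustrative assumptions, not details from the announcement.

```python
import hashlib

# Sketch of why snapshots are cheaper than full replication: only blocks
# that changed since the last snapshot travel to the cloud (illustrative).

BLOCK = 4  # tiny block size for demonstration

def block_hashes(data):
    blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
    return [hashlib.sha256(b).hexdigest() for b in blocks]

def snapshot_delta(old, new):
    """Return {block index: new block} for blocks that differ."""
    old_h, new_h = block_hashes(old), block_hashes(new)
    return {i: new[i * BLOCK:(i + 1) * BLOCK]
            for i, h in enumerate(new_h)
            if i >= len(old_h) or h != old_h[i]}

def apply_delta(old, delta):
    """Update the cloud-side copy using only the shipped blocks."""
    blocks = [old[i:i + BLOCK] for i in range(0, len(old), BLOCK)]
    for i, b in delta.items():
        if i < len(blocks):
            blocks[i] = b
        else:
            blocks.append(b)
    return b"".join(blocks)

original = b"AAAABBBBCCCC"
updated  = b"AAAAXXXXCCCC"
delta = snapshot_delta(original, updated)
print(delta)  # {1: b'XXXX'} -- one block shipped instead of the whole set
assert apply_delta(original, delta) == updated
```

Here one 4-byte block crosses the network instead of all 12 bytes; at datacenter scale the same ratio is what makes frequent cloud snapshots affordable.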

The Virtual Data Pipeline's SLA offers a set of options that govern the frequency of uploading data snapshots. Actifio founder and CEO Ash Ashutosh said a SmartCloud service user can "define an SLA in only a couple of clicks" and establish a recovery system with minimal labor, ongoing management, network bandwidth, and storage footprint. The data is deduplicated and encrypted at the user's direction, he said in the announcement. Rather than each application having its own data storage system, multiple applications' data can be virtualized, then moved through the pipeline into cloud storage.
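An SLA of this kind might boil down to a handful of fields like the ones below. The field names are assumptions; the article says only that the SLA covers snapshot frequency, with deduplication and encryption at the user's direction.

```python
from dataclasses import dataclass

# Illustrative sketch of a recovery SLA; field names are assumed, not
# taken from IBM's or Actifio's actual service definition.

@dataclass
class RecoverySLA:
    snapshot_interval_minutes: int  # how often changes are shipped to the cloud
    retention_days: int             # how long cloud copies are kept
    deduplicate: bool = True        # dedupe before upload to cut bandwidth
    encrypt: bool = True            # encrypt at the user's direction

    def snapshots_per_day(self):
        return (24 * 60) // self.snapshot_interval_minutes

sla = RecoverySLA(snapshot_interval_minutes=60, retention_days=30)
print(sla.snapshots_per_day())  # 24 hourly snapshots shipped per day
```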

The SmartCloud Data Virtualization service will "offer customers faster recovery times at better price points. It can enable them to leverage their protected data as a business asset rather than simply an insurance policy," said Laurence Guihard-Joly, general manager of IBM's Business Continuity and Resiliency Services, in the announcement.

That is, once a copy is established in the cloud, it can be called up and used by other applications. It can be used for dev/test projects, analytics, or active archive purposes, Guihard-Joly said.

Nasuni, which offers a front-end, appliance-based storage management system, announced a similar data recovery capability with its versioning file system last November.

Charles Babcock is an editor-at-large for InformationWeek, having joined the publication in 2003. He is the former editor-in-chief of Digital News, former software editor of Computerworld, and former technology editor of Interactive Week. He is a graduate of Syracuse … View Full Bio
