2014 State Of Storage: Cost Worries Grow

Solid state alone can't solve your problems, even if you could afford it. Think scale-out, virtualization, and cloud.

Business users worry about storage growth about as much as the NSA worries about your privacy. Sure, users might pay lip service to the virtue of restraint, but when it comes down to it, they want their stuff. And their stuff? It's digital content, and it's feeding double-digit annual growth in the amount of information under management, according to our 2014 InformationWeek State of Enterprise Storage Survey.

At 27% of businesses, IT is wrangling 25% or more yearly data growth. The top culprit: databases and data warehouses. Money's still tight, with 25% saying they lack the funds even to meet demand, much less optimize performance by loading up on solid state. IT leaders face a troublesome "pick two" conflict among performance, capacity, and price.

Data growth is an inescapable trend. In its "The Digital Universe in 2020" report, IDC estimates that the total volume of digital bits created, replicated, and consumed in the United States will hit 6.6 zettabytes by 2020. That represents a doubling of volume roughly every three years. For those not up on their Greek numerical prefixes, a zettabyte is 1,000 exabytes, or about 250 million 4-TB drives. Or take just one company as an example: UPMC, a major healthcare provider and insurer, has about 5 petabytes of data today, and that volume is doubling every 18 months.
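For readers who like to check the arithmetic, here's a quick back-of-the-envelope in Python. The 6.6 ZB figure and three-year doubling simply restate the IDC estimates above; the projection years are illustrative, not IDC's:

```python
# Back-of-the-envelope check of the zettabyte figures cited above.
ZETTABYTE = 10**21          # bytes (1 ZB = 1,000 EB)
DRIVE = 4 * 10**12          # one 4-TB drive, in bytes

drives_per_zb = ZETTABYTE / DRIVE
print(f"4-TB drives per zettabyte: {drives_per_zb:,.0f}")  # 250,000,000

# Doubling roughly every three years compounds quickly.
volume_zb = 6.6             # IDC's 2020 U.S. estimate
for year in (2023, 2026, 2029):
    volume_zb *= 2
    print(f"{year}: ~{volume_zb:.1f} ZB")
```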

For enterprise IT, we see three conclusions. First, don't expect all-solid-state storage to save your sanity. Don't get us wrong; it's the most disruptive digital storage technology ever. But talk of hard disks joining floppies on the ash heap of IT history is premature, for reasons we'll discuss. For now, most storage vendors offer hybrid architectures that can dynamically vary the flash-to-disk ratio for changing workloads.

[How will a smarter-connected world save money? Read Internet of Things: 8 Cost-Cutting Ideas For Government.]

Second, you must up your use of scale-out arrays, distributed file systems, and storage virtualization, a.k.a. software-defined storage (SDS). A software-defined storage strategy is your best bet to automatically place, migrate, and manage data and applications on hybrid arrays to meet demand without breaking the budget. Today's applications, particularly mobile apps, are sensitive to any variance in storage performance, which means architectures must be optimized for performance as well as capacity, as the sketch below illustrates. We expect 2014 to be the year of SDS, and in the nick of time.
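To make the idea concrete, here's a minimal sketch of the kind of placement logic an SDS layer automates on a hybrid array. The Volume class, tier names, and IOPS threshold are hypothetical illustrations, not any vendor's API:

```python
# Hypothetical hot/cold placement policy for a hybrid flash-plus-disk array.
from dataclasses import dataclass

@dataclass
class Volume:
    name: str
    iops: float    # recent average I/O operations per second
    size_gb: int

def place(volume: Volume, hot_threshold: float = 500.0) -> str:
    """Route latency-sensitive volumes to flash, cold bulk data to disk."""
    return "flash" if volume.iops >= hot_threshold else "disk"

volumes = [
    Volume("oltp-db", iops=4200, size_gb=800),
    Volume("archive", iops=12, size_gb=24000),
]
for v in volumes:
    print(f"{v.name}: {place(v)}")
```

A real SDS controller layers migration, quality-of-service limits, and failure handling on top of this kind of rule, but the core idea is the same: policy decides placement, not a human with a spreadsheet.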

Third, the cloud has matured into a legitimate tier in the enterprise storage hierarchy. Now IT must prevent cloud use, particularly SaaS, from creating new data silos.

As to what's driving demand, greater use of cloud services and social networks, alongside the proliferation of smartphones as information clients, plays a part. The migration of all media, particularly TV, from analog to digital formats is a culprit, too. But for corporations, what's really coming at us like a freight train is machine-generated data, notably security images and "information about information." This last bucket includes everything from the Internet of Things, in which devices generate details about their operations and environments, to analytics software that must crunch vast troves of raw data to produce the insights businesses crave.

The solid state revolution

These days, compromise is a dirty word, and not just in Washington. App developers and end users want it all: blazing performance and unlimited, dirt-cheap capacity. Solid-state storage has done more than any other technology since the hard disk drive to meet these demands, particularly where random I/O is important, and its importance can't be overstated. In fact, as flash densities increase and prices plummet, some industry experts argue we're on the verge of all-flash datacenters.

We say be careful what you wish for, because flash capacity still requires trade-offs in reliability and data protection. Solid state can give you loads of terabytes of capacity or hard-drive-like longevity, but not both at the same time.

High-density flash designs achieve capacity at the cost of media endurance, notes Radhika Krishnan, VP of marketing at Nimble Storage. A memory cell can be written only a limited number of times before it fails, so the overall system must be able to tolerate random bit errors and dying memory chips. To get technical, blame an inherent wear-out mechanism in flash technology resulting from the repeated tunneling of electrons through an insulating layer at relatively (at least for semiconductors) high voltages.
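To see why limited write cycles force trade-offs, here's a rough lifetime estimate. The P/E cycle rating, write amplification, and workload below are assumed, illustrative figures, not measurements of any particular drive:

```python
# Rough SSD endurance math behind the wear-out caveat above.
capacity_tb = 1.0            # drive capacity (assumed)
pe_cycles = 3000             # rated program/erase cycles per cell (assumed)
write_amplification = 2.0    # internal writes per host write (assumed)
daily_writes_tb = 5.0        # host writes per day (assumed heavy workload)

total_writes_tb = capacity_tb * pe_cycles / write_amplification
lifetime_years = total_writes_tb / daily_writes_tb / 365
print(f"Total endurance: {total_writes_tb:,.0f} TB written")
print(f"Estimated lifetime: {lifetime_years:.1f} years")  # under a year here
```

Shrink the cells or add bits per cell and the P/E rating drops, which is exactly the density-versus-endurance tension described next.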

The primary techniques for improving flash density, and hence capacity, have been tighter fabrication geometries and multilevel cell designs, in which each memory location stores more than one bit of information. The complexity involved in fighting these limitations explains why enterprise-grade SSDs sell at a considerable premium. It's also why storage vendors are fixated on hybrid arrays aimed at delivering the best of flash and hard disks in a single box.

Download our complete February Tech Digest issue on enterprise storage, distributed in an all-digital format (registration required).

Kurt Marko is an InformationWeek and Network Computing contributor and IT industry veteran.

In the 17 years since we began the InformationWeek U.S. IT Salary Survey, more than 200,000 IT professionals have completed the questionnaire. Participate in the 2014 U.S. IT Salary Survey — it's a great way to prepare for your next salary review, or that of the people you manage. Survey ends Feb. 21.
