It’s easy to lose track of the news streaming out of a giant-scale event like this week’s 11,000-attendee IBM Information On Demand (IOD) conference in Las Vegas. It’s all the tougher given IBM’s penchant for spinning high-concept yarns about “smart,” “predictive” and “cognitive” capabilities that will make sense of the unfathomable “2.5 quintillion bytes of information” that IBM says the world generates daily.
Unlike Microsoft, Oracle and SAP, IBM is far less inclined to discuss discrete products than the capabilities that can be assembled into solutions with the help of IBM Global Business Services consultants with deep industry expertise.
But announcements are sort of obligatory at big annual tech events, and IBM served up plenty of them at IOD, including a mix of recently released and soon-to-be-released big data and analytics services and capabilities. Here are key highlights of what’s new, what’s coming and what distinguishes IBM’s offerings from similar-sounding offerings already on the market.
[ Want more on IBM’s latest cloud infrastructure moves? Read IBM Shifts SmartCloud Customers To SoftLayer. ]
What’s New
IBM SmartCloud Analytics Predictive Insights is software aimed at transforming the high-scale machine data spinning out of IT systems — networks, servers, storage systems, applications and so on — into business intelligence. In the past, these log files and event streams were either used for simplistic, stovepipe monitoring and diagnosis or they were ignored entirely.
In the big data era, practitioners have realized that IT monitoring and event data can reveal leading indicators that help IT anticipate and prevent problems rather than diagnose failures after the fact. Where many IT monitoring systems are all about setting thresholds and alerts for one system at a time, the idea behind Predictive Insights is to combine large sets of data and find correlations and anomalies that yield predictive insights.
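The cross-stream correlation idea can be sketched in a few lines. To be clear, this is a hypothetical Python illustration of the general technique — standardize each metric stream against its own history, then flag time points where several streams deviate at once — not IBM’s Predictive Insights implementation:

```python
# Illustrative sketch of cross-stream anomaly correlation -- not IBM's
# Predictive Insights code, just the underlying idea: per-system thresholds
# miss problems that only show up when several metrics drift together, so
# we flag time points where multiple streams deviate simultaneously.
from statistics import mean, stdev

def zscores(stream):
    """Standardize one metric stream against its own history."""
    m, s = mean(stream), stdev(stream)
    return [(x - m) / s if s else 0.0 for x in stream]

def correlated_anomalies(streams, z_thresh=2.0, min_streams=2):
    """Return time indices where >= min_streams deviate together."""
    z = [zscores(s) for s in streams]
    hits = []
    for t in range(len(streams[0])):
        deviating = sum(1 for zs in z if abs(zs[t]) >= z_thresh)
        if deviating >= min_streams:
            hits.append(t)
    return hits

# Three hypothetical metric streams: CPU load, packet loss, buffer depth.
cpu  = [10, 11, 10, 12, 11, 10, 30, 11]   # spike at t=6
loss = [1, 1, 2, 1, 1, 1, 9, 1]           # spike at t=6
buf  = [5, 5, 6, 5, 5, 5, 5, 5]           # small blip at t=2 only

print(correlated_anomalies([cpu, loss, buf]))  # -> [6]
```

Note that the blip in the third stream at t=2 is not reported, because no other stream deviates at that moment — the point of correlating across systems rather than alerting on each one in isolation.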
Consolidated Communications, a cable operator headquartered in Illinois, is using Predictive Insights to track some 80,000 streams of data across its systems to monitor the health of its video delivery network. By spotting anomalies that couldn’t be seen by studying systems in isolation, the cable operator reports it has avoided service disruptions and related costs of roughly $300,000 per year.
Splunk was a pioneer in big data analysis across myriad IT system sources, but IBM says the Predictive Insights service differs from Splunk and other offerings in that it’s an analytic correlation and pattern-detection environment rather than an open-ended search-and-discovery tool. In other words, it surfaces conditions worthy of investigation on its own rather than relying on humans to drive the analysis.
Also on the “what’s new” list are an update to IBM’s SmartCloud Virtual Storage Center and three advances tied to Hadoop. The Storage Center is data-center software that applies machine learning and analytics to virtualized storage environments to automate complex migration and storage-tiering decisions.
Storage choices typically revolve around the tradeoff between fast data access and the cost of capacity. By analyzing usage patterns, the Storage Center identifies the best, most cost-effective storage choice for a given set of data, automatically making the change without administrator intervention or interruption to data access. Storage Center reportedly helped IBM itself cut per-terabyte storage costs by 50% at the company’s Boulder, Colo., data center.
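The speed-versus-cost tradeoff behind automated tiering can be illustrated with a minimal sketch. This is a hypothetical Python example of the general policy — route hot data to fast media and cold data to cheap capacity — and the tier names and thresholds are invented for illustration, not taken from the SmartCloud Virtual Storage Center:

```python
# Illustrative sketch of usage-based storage tiering -- not the SmartCloud
# Virtual Storage Center algorithm. Tier names and thresholds are made up:
# hot data earns its keep on fast, expensive media; cold data moves down.

TIERS = [
    # (tier name, min accesses per day to qualify), fastest listed first
    ("ssd",     100),
    ("sas",      10),
    ("nearline",  0),
]

def pick_tier(accesses_per_day):
    """Choose the fastest tier whose activity threshold the data set meets."""
    for name, threshold in TIERS:
        if accesses_per_day >= threshold:
            return name
    return TIERS[-1][0]  # fallback: cheapest capacity tier

def migration_plan(datasets):
    """Map each data set to a target tier based on observed usage."""
    return {name: pick_tier(rate) for name, rate in datasets.items()}

# Hypothetical usage stats: data set name -> average accesses per day.
usage = {"orders_db": 450, "q3_reports": 25, "2009_archive": 0}
print(migration_plan(usage))
# -> {'orders_db': 'ssd', 'q3_reports': 'sas', '2009_archive': 'nearline'}
```

The real product additionally performs the migrations themselves without interrupting access; the sketch covers only the placement decision.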
The three Hadoop-related introductions are:
— IBM PureData System for Hadoop. Released in September, this is IBM’s Hadoop appliance, incorporating the IBM BigInsights Hadoop distribution and complementary software. IBM says what distinguishes it from Apache, Cloudera, Hortonworks and other “standard” Hadoop deployments is four-times-faster performance, thanks to cluster-management and high-performance-computing capabilities adapted from IBM’s Platform Computing acquisition.
— InfoSphere Data Privacy for Hadoop. Coming later this quarter, this is a data-masking and data-activity-monitoring system that works across Hadoop as well as NoSQL and relational data sources, according to IBM. Data masking conceals sensitive data such as Social Security numbers at points of replication, so companies can go beyond access controls to ensure data privacy. The data-activity-monitoring capability tells administrators who is accessing data and when data-access patterns are atypical, even for authorized users.
— InfoSphere Governance Dashboard. Another tool that works across multiple data sources, including Hadoop and relational and non-relational databases, this dashboard gives data-management professionals a view of the lineage, quality and governance status of the data sets under management. The software is said to work hand-in-hand with ETL, data-privacy and information-security tools to ensure that governance policies are enforced.
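The masking-at-replication idea described above for InfoSphere Data Privacy for Hadoop can be sketched as follows. This is a hypothetical Python illustration of the general technique — deterministic surrogate values substituted for sensitive fields before data is copied downstream — not IBM’s masking algorithm:

```python
# Illustrative sketch of data masking at a replication point -- not the
# InfoSphere Data Privacy for Hadoop implementation, just the concept:
# sensitive fields are replaced with consistent surrogate values before
# data leaves the system of record, so downstream copies never hold the
# real numbers, yet masked records still join across data sets.
import hashlib
import re

SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def mask_ssn(ssn, salt="demo-salt"):
    """Replace an SSN with a deterministic surrogate in the same format
    (same input -> same mask, so referential integrity survives)."""
    digest = hashlib.sha256((salt + ssn).encode()).hexdigest()
    digits = "".join(c for c in digest if c.isdigit())[:9].ljust(9, "0")
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"

def mask_record(text):
    """Mask every SSN-shaped token in a record before replication."""
    return SSN_RE.sub(lambda m: mask_ssn(m.group()), text)

record = "name=Jane Doe ssn=123-45-6789 plan=gold"
print(mask_record(record))  # same layout, surrogate SSN
```

Because the surrogate keeps the original format, downstream applications and joins keep working, while the replicated copy no longer reveals the real identifier.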