Understanding Big Data Architectures

David Howard · January 26, 2015 · Short URL: https://vator.tv/n/3b9c

We're long past data warehousing...

First there was data warehousing, built on Relational Database Management Systems (RDBMS) and business intelligence reporting. Then came distributed processing and distributed storage with Hadoop for handling massive datasets, enabling advanced analytics of an explosion of data from an increasing variety of sources. Now that Hadoop has established a strong beachhead, we're seeing new data analytics architectures come to the forefront to leverage its capabilities.

Traditionally, getting data from source systems into a data warehouse was expensive and time consuming, due to the Extract, Transform and Load (ETL) processes necessary to make data queryable. Enterprises needed ETL programmers, data warehouse architects and administrators to make the data "dance" before an analyst could actually run reports and gain actionable intelligence with which to make business decisions. Because of the expense and time lag, only select portions of the data could be run through this process and made available to the analyst.
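To make "ETL" concrete, here is a minimal sketch in Python of what one extract-transform-load pass might look like. The file names and columns are hypothetical, and real warehouse pipelines are far more involved; the point is only to show why this step takes dedicated programmers and time.

    # Minimal ETL sketch (illustrative only; file names and columns are hypothetical).
    import csv

    def extract(path):
        # Extract: read raw rows from a source system's CSV export.
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        # Transform: normalize types and drop unusable records.
        cleaned = []
        for row in rows:
            try:
                cleaned.append({
                    "customer_id": int(row["customer_id"]),
                    "amount": round(float(row["amount"]), 2),
                    "region": row["region"].strip().upper(),
                })
            except (KeyError, ValueError):
                continue  # skip malformed rows rather than loading them
        return cleaned

    def load(rows, path):
        # Load: write the conformed rows to a warehouse staging file.
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["customer_id", "amount", "region"])
            writer.writeheader()
            writer.writerows(rows)

    load(transform(extract("orders_export.csv")), "staging_orders.csv")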

Then Hadoop came along, reaching its 1.0 release in 2011 and leveraging Google's work on MapReduce, enabling the enterprise to collect massive datasets in all forms. But the enterprise still needed the programmers, architects and admins to make the data usable to the people who actually needed it. The workflow really didn't change.
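For readers who haven't seen MapReduce, the canonical illustration is word counting. The sketch below is plain Python rather than a real distributed Hadoop job, so it only shows the shape of the computation: a map phase that emits key-value pairs and a reduce phase that aggregates them.

    # Word count, the canonical MapReduce example, sketched in plain Python.
    # A real Hadoop job would run the map and reduce phases across a cluster.
    from collections import defaultdict
    from itertools import chain

    def map_phase(document):
        # Emit a (word, 1) pair for every word in the document.
        return [(word, 1) for word in document.split()]

    def reduce_phase(pairs):
        # Group by word and sum the counts.
        counts = defaultdict(int)
        for word, n in pairs:
            counts[word] += n
        return dict(counts)

    documents = ["big data big insights", "data drives decisions"]
    pairs = chain.from_iterable(map_phase(d) for d in documents)
    print(reduce_phase(pairs))  # {'big': 2, 'data': 2, 'insights': 1, ...}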

A new crop of startups is now introducing new big data analytics architectures - ones that eliminate the time lag attributable to data prep and remove the layers of technicians between the analyst and the data - petabytes of it now - on Hadoop.

It's great for analysts, but the real value proposition of these new architectures is that they reduce the time-to-insight for business managers and the C-suite. Actionable intelligence is no longer months away from the realization that the data - all of the data, thanks to Hadoop's unprecedented mass-storage capabilities - is available; time-to-insight is now measured in hours.

And analytics is no longer limited to structured data that neatly fits into rows and columns. To give just one example, customer experience teams can pull in all sorts of structured and unstructured data, such as web clicks, chat logs, e-mail exchanges and traditional contact center data, to find opportunities to improve service or lower costs.
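As a toy sketch of that idea, here is how an analyst might join structured clickstream records with a simple signal derived from unstructured chat transcripts using Python and pandas. The file names, columns and keyword list are hypothetical; the point is that text and tabular data can be analyzed side by side.

    # Toy sketch: joining structured clickstream data with unstructured chat logs.
    # File names, columns, and the keyword list are hypothetical.
    import pandas as pd

    clicks = pd.read_csv("web_clicks.csv")   # structured: customer_id, page, timestamp
    chats = pd.read_csv("chat_logs.csv")     # unstructured text: customer_id, transcript

    # Derive a simple signal from the free text: did the customer mention a problem?
    problem_words = ["cancel", "refund", "broken", "complaint"]
    chats["flagged"] = chats["transcript"].str.lower().str.contains("|".join(problem_words))

    # Join the two views to see which pages flagged customers visited before contacting support.
    merged = clicks.merge(chats[["customer_id", "flagged"]], on="customer_id", how="inner")
    print(merged[merged["flagged"]].groupby("page").size().sort_values(ascending=False))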

Large enterprises, unsurprisingly, are reluctant to share the secrets of their success with the competition, but the case studies are trickling out, and the ROI is clear. Senior executives need to pay attention to these new architectures and understand their strategic significance for their business.
