Taking the big data plunge a challenge for enterprise: Cisco
Summary: The hardest thing for enterprises that want to tackle big data is actually getting started, according to Cisco Consulting Systems Architect Adam Radford.

TOPICS: Big Data, Australia

There's a lot of talk around big data, but actually getting the ball rolling on how to deal with it is one of the biggest hurdles that enterprises are facing, according to Cisco Consulting Systems Architect Adam Radford.

The IT architecture required for big data differs significantly from the one used for traditional business analytics and application delivery, which consists of a compute tier, a storage tier, and networking tied together in a centralised way.

For big data, it's a scale-out model rather than a scale-up one — there would be multiple clusters of compute and storage units, Radford said.

While businesses may think that this would require a significant investment in IT, he said that the cost of entry for dealing with big data is actually quite low.

"The cost of hardware, at least to get started, it's not massive," Radford told ZDNet Australia. "A lot of the big data software is public domain, so there's not a lot of licensing cost, and the hardware is very much a standard x86 compute platform."

Regardless, organisations are still grappling with changes to how they handle an influx of data, he said. But it's important to take advantage of big data, as it provides much more insight into things like internal operations and customer sentiment.

However, big data and traditional analytics require not only different IT architectures, but also different approaches to retrieving information, Radford said.

"My number one advice to enterprises is to make a start in a very small environment, to understand the types of thinking and processes that apply to big data," he said. "It's very different from traditional business intelligence and data warehousing questions you have asked in the past."

"This is very much an interactive, experimental, and agile approach versus the standard-based questions we've had previously."

Big data also requires different skill sets. Radford said that the skills needed in traditional data analysis have been commoditised over time, and are no longer something average users have to understand.

"It's so drag and drop," he said. "In contrast, for big data, it's really a very different type of skill set used, because it's in a very dynamic environment.

"The type of data we are looking at is inherently unstructured."

Spandas Lui

About Spandas Lui

Spandas forayed into tech journalism in 2009 as a fresh university graduate, spurred by her passion for all things tech. Based in Australia, Spandas covers enterprise and business IT.
