5 data analytics deployment pitfalls

Summary: Companies looking to implement data analytics should avoid overly complex planning and relying on one vendor's products to meet all their needs, among other commonly committed mistakes, insiders say.

Organizations are putting too much focus on building expensive backend infrastructure for data analytics while disregarding access and usability. They are also underutilizing their analytics engines by deploying them to solve just one or two business needs.

These, among others, are some common implementation pitfalls that companies should avoid when rolling out business analytics, industry insiders say.

Analysts have identified big data analytics as one of the key IT trends for 2012, as companies increasingly have to come to grips with the deluge of data and make sense of the information flowing in.

As more companies plan for or are in the midst of deploying analytics initiatives, market players highlight pitfalls to avoid, in terms of planning and implementation, in order to maximize investments and speed up operations.

1. Over-focusing on building expensive backend systems.
Simon Dale, head of innovation and technology at SAP, for one, said data analytics should be "all about end-users".

Companies, he noted, typically placed too much emphasis on building an expensive backend data infrastructure without realizing the importance of usability and access.

Dale explained: "The focus should be on the end-user experience and how the implementation simplifies the consumption of analytical data. An expensive backend data warehouse may be beneficial, but it all becomes obsolete when there is no way of accessing the data."

Since most data analytics projects are driven strongly by IT departments, there also is often a lack of involvement from business users during the planning and implementation stages, noted Terry Smagh, vice president of Southeast and North Asia at QlikTech.

Business users are typically involved only at the beginning of the requirements stage and at the end of the deployment phase, he pointed out. This results, for example, in wrongly defined requirements or insufficient data being available for analysis, delaying the entire project.

"[This is why] it is common to see traditional data warehouse or business intelligence projects take so long to go into production, and some don't even go live at all," Smagh said.

2. Existing infrastructure must go.
Organizations tend to ditch their existing analytical architectures once they commit to a new product, believing the older systems cannot be leveraged. This is not true, however, said Rohit Bakhshi, a solution architect at big data vendor Hortonworks.

He noted that such a perception can cause hesitation within an organization to adopt data analytics as decision makers might think it too expensive and time-consuming to have to replace existing infrastructure entirely.

"New analytical solutions can and should be implemented with complementary technical architectures that integrate with existing tools and solutions to enable new use cases," Bakhshi stated.

3. Only solve highlighted needs.
Another common mistake enterprises make is restricting their analytics tools to focus on solving only business needs based on data they have been able to collect or store in the past, Bakhshi said. This approach leads to siloed data and analytical systems across the company.

"Big data analytical systems such as Hadoop, on the other hand, enable a company to think more broadly about its analytical architectures by enabling it to store data it never stored before, and provide analytical solutions across business groups," he noted.

As such, he advised companies to take into account all new data sources they want to derive value from when planning for new analytical use cases. These can include existing sources such as transactional systems as well as newer ones such as social media feeds, detailed activity logs, and other forms of unstructured data.

Organizations should also plan for varying applications and types of users that will consume the processed data, Bakhshi said.

"This planning effort will lead to a well-designed analytical solution that supports current analytical applications and scales to new functionalities and processing powers," he asserted.

4. One-size-fits-all solution.
Bakhshi added that companies that believed one technical solution would solve all of their analytical needs should think again. Choosing one product, while easier to implement, would restrict the company's ability to scale its functionality as use cases grow, he explained.

For example, he pointed out that Hadoop is good for processing huge amounts of loosely structured data, whereas relational analytical databases are good for rapid iterative analysis on smaller sets of processed data.

"When these two technical solutions are implemented in tandem, the analytical platform will scale functionality and performance to grow with the company," he said.
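The tandem pattern Bakhshi describes can be sketched in miniature: a batch aggregation pass over loosely structured log lines (the kind of job Hadoop handles at scale, simulated here in plain Python) whose results are loaded into a relational store for fast, iterative queries. This is an illustrative sketch only; the log format, table name, and SQLite stand-in for an analytical database are all assumptions, not anything from the article.

```python
import sqlite3
from collections import defaultdict

# Stage 1: batch-process loosely structured log lines, Hadoop-style.
# A "map" step parses each record; a "reduce" step aggregates by key.
raw_logs = [
    "2012-01-03 page_view /home",
    "2012-01-03 page_view /pricing",
    "2012-01-04 purchase /checkout",
    "2012-01-04 page_view /home",
]

daily_counts = defaultdict(int)
for line in raw_logs:                  # map: parse one record
    day, event, _path = line.split()
    daily_counts[(day, event)] += 1    # reduce: count per (day, event)

# Stage 2: load the aggregates into a relational database, where
# analysts can run rapid, iterative ad-hoc queries on the smaller
# processed dataset.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE daily_events (day TEXT, event TEXT, n INTEGER)")
db.executemany(
    "INSERT INTO daily_events VALUES (?, ?, ?)",
    [(d, e, n) for (d, e), n in daily_counts.items()],
)

for row in db.execute(
        "SELECT day, SUM(n) FROM daily_events "
        "WHERE event = 'page_view' GROUP BY day ORDER BY day"):
    print(row)   # prints ('2012-01-03', 2) then ('2012-01-04', 1)
```

The division of labor is the point: the batch layer handles volume and messy input once, while the relational layer serves many cheap follow-up questions, so each can scale independently.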

The flip side would be companies that end up buying many different software products to meet their analytics requirements, Smagh noted.

Larger organizations, for example, often spend millions of dollars on an "end-to-end business intelligence" project, which results in a very large pool of resources requiring different software skillsets just to deliver and maintain the system.

The mistaken perception is that these offerings come as a single product but, in reality, they are a combination of various software products--usually gained via acquisitions--which are not tightly integrated and require differently skilled workers to learn and use, the QlikTech executive explained.

"[The purchasing of different software] results in developers spending more time deploying apps, causing business users to go back to using their familiar desktop applications for their data analytics requirements," he added.

5. Complex planning needed.
Dale added that many companies typically found it difficult to embark on data analytics projects because of the perception that it would require very complex planning.

This, however, can be avoided if they keep it simple, he urged.

"Companies should have a clear objective and be prepared to manage change properly," he explained. "They can also look for other power users who are multipliers and champions for the benefit of the new system."


Kevin Kwang

About Kevin Kwang

A Singapore-based freelance IT writer, Kevin made the move from custom publishing focusing on travel and lifestyle to the ever-changing, jargon-filled world of IT and biz tech reporting, and considered this somewhat a leap of faith. Since then, he has covered a myriad of beats including security, mobile communications, and cloud computing.

