Expectations for big-data projects are running high, but so is the price of failure. A quarter of CEOs say they would fire a CIO or CTO over a botched initiative.
The severe consequences of failure may be because so many business leaders have now bought into the idea that exploiting data is important. About eight out of 10 C-level executives see data analytics-based business growth as their top priority over the next 12 months, according to new research from software firm Actian, formerly Ingres.
IT leaders are also apparently right to be wary of diving into immature platforms. Investing in a technology that fails to scale in line with future demand is seen by 36 percent of CEOs as even greater grounds for dismissal, a figure outstripped only by buying a technology that leads to a security breach, which 43 percent treat as a sackable misstep.
There is some better news for Hadoop big-data technology in the Actian study, conducted last month among 106 senior managers, which follows a lukewarm Gartner survey from the same period.
"As with any technology advancement, Hadoop has areas for improvement. Many organisations have invested time and resources into making it work for them because they know that their current way of managing analytical workloads won't cut it," Actian business development SVP Ashish Gupta said in a statement.
He added that the number of CEOs who treat a failed big-data project as a fireable offence for their CTO or CIO is "putting immense pressure on top IT leaders to deliver on the promise of big data".
The new research shows that just over half the respondents view Hadoop as an option that could make existing data-analytics operations more efficient. A third of those polled said it will help them find value from existing data, with roughly the same number suggesting it could make a business more profitable.
On top of that, one in three business analysts and data scientists see Hadoop as a cost-effective and scalable way to store data.
However, after those findings, things take a turn for the worse for Hadoop. A third of those same business analysts and data scientists also cite Hadoop as being hard to use. One in five says it requires skills the business does not possess, and that it lacks the tooling needed to make it enterprise-grade in security and speed.
"Traditional database technologies are failing to deliver on analytical workloads, so they have turned to Hadoop for help. The problem is, while Hadoop is a very cost-effective place to store massive amounts of data, most are finding it too immature to manage enterprise-grade, high-performance analytics jobs needed to get ahead and stay ahead," Gupta said.
The global study, which polled representatives of 25 industries, revealed that 23 percent of respondents are very or extremely satisfied with their technology investments in reporting, analytics and big data.
That figure contrasts with the 11 percent who are not at all satisfied. Twenty-five percent rate themselves as slightly satisfied, with 41 percent moderately satisfied.