The company that invented MapReduce makes its cloud-based dimensional query offering public. But is it really a Big Data tool?
Big on Data
Veteran data geek Andrew Brust covers Big Data technologies including Hadoop, NoSQL, Data Warehousing, BI and Predictive Analytics.
Andrew J. Brust has worked in the software industry for 25 years as a developer, consultant, entrepreneur and CTO, specializing in application development, databases and business intelligence technology. He has been a developer magazine columnist and conference speaker since the mid-90s, and a technology book writer and blogger since 2005. Andrew is also Founder and CEO of Blue Badge Insights, an analysis, strategy and advisory firm serving Microsoft customers and partners.
Tellago CEO Jesus Rodriguez discusses the consulting project opportunities in Big Data and shines a light on his favorite Big Data technologies and companies.
IBM has a new partnership with Cloudera, and it's hardly the Palo Alto startup's first such deal. Is Cloudera becoming the de facto arbiter of Big Data credibility?
Two lions of BI have allied, bringing analytics into the parallel-processing, in-memory, appliance-based fast track.
Big Data in the U.S. Federal government isn't just for grandiose inter-agency projects. Tactical, operational applications can use Big Data technology too.
Fast Data, once used in large enterprises only for highly specialized needs, has become more affordable and available to the mainstream, just when corporations absolutely need it.
Big Data isn't just about clickstreams and status messages. It's about manufacturing turbines, jet engines and even consumer goods, too.
A Technical Fellow at Microsoft says we're headed for an in-memory database tipping point. What does this mean for Big Data?
Big Data is revolutionary, and not merely the evolution of BI and data warehousing technology. Here's why.
Hadoop 2.0 makes MapReduce less compulsory and the distributed file system more reliable.
Hadoop Streaming allows developers to use virtually any programming language to create MapReduce jobs, but it’s a bit of a kludge. The MapReduce programming environment needs to be pluggable.
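To make the "virtually any programming language" point concrete, here is a minimal sketch of a Streaming-style word-count job in Python. The mapper reads plain text from stdin and emits tab-separated key/value pairs; the reducer receives those pairs sorted by key (which Hadoop's shuffle phase guarantees) and sums them. The function names and the local pipeline simulation are illustrative; in a real job, the mapper and reducer would be two separate scripts passed to the hadoop-streaming JAR via `-mapper` and `-reducer`.

```python
#!/usr/bin/env python3
# Word count in the Hadoop Streaming style: line-oriented stdin/stdout,
# tab-separated key/value pairs. Illustrative sketch, not a tuned job.
import sys
from itertools import groupby

def mapper(lines):
    """Emit 'word\t1' for every word, as a Streaming mapper would."""
    for line in lines:
        for word in line.split():
            yield f"{word}\t1"

def reducer(kv_lines):
    """Sum counts per word. Input is assumed sorted by key, which the
    framework's sort/shuffle step provides between map and reduce."""
    pairs = (line.split("\t") for line in kv_lines)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(count) for _, count in group)}"

if __name__ == "__main__":
    # Simulate the map -> sort -> reduce pipeline locally, equivalent to:
    #   cat input.txt | mapper.py | sort | reducer.py
    mapped = sorted(mapper(sys.stdin))
    for out in reducer(mapped):
        print(out)
```

The kludge the paragraph alludes to is visible here: everything must be serialized to delimited text and re-parsed at each stage, with the process boundary (stdin/stdout) standing in for a real pluggable API.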
As innovative as Hadoop is in toto, its components can benefit from optimization, perhaps significantly. One vendor that’s been in the database business for three decades isn’t just talking about those optimizations. It’s building products around them.
Microsoft has a reputation for modifying external technology when adopting it. But in the case of Hadoop, Microsoft is so far staying true to the core technology, providing optional integration with its own stack, and making it easier for people to work with Hadoop and get excited about it.
Big Data is in a golden age of horizontal opportunity, where deep vertical-market expertise is not yet a prerequisite. That gives tech services firms an early window in which to build industry-specialist expertise. Big Data is a Big Equalizer.
The Hadoop Distributed File System (HDFS) is a pillar of Hadoop. But its single-point-of-failure topology and its write-once file semantics leave the enterprise wanting more. Some vendors are trying to answer the call.