
Sumo Logic claims to be the first Big Data service provider. Can you believe them?

Companies all over the world are looking for ways to gain an advantage by processing massive amounts of rapidly changing data. Sumo Logic just emerged from stealth mode and claims to be the first to offer Big Data services. The problem is that they're not the first.
Written by Dan Kusnetzky, Contributor

A startup, Sumo Logic, just emerged from stealth mode and announced that they have received funding and started operations. Their headline was "Sumo Logic Emerges to Deliver Industry’s First Big Data Service for Real-time IT Insight."

The company went on to describe their product in the following fashion:

Key tenets of Sumo Logic’s next-generation approach include:

  • Cloud-based Service: A multi-tenant, secure, reliable and highly available service that provides managed collection and retention. Unlike premise-based solutions that are expensive, complex to deploy, scale and maintain, the Sumo Logic service has a low total cost of ownership, can be deployed instantly, scales elastically and requires zero maintenance.
  • Patent-pending Technology: Powered by Sumo Logic’s Elastic Log Processing™ and LogReduce™ technologies:
    • Elastic Log Processing: A modular, linearly scalable architecture that enables log analytics at unprecedented petabyte-scale.
    • LogReduce: Adaptive algorithms that reduce millions of logs into a small number of patterns.

  • Real-time Forensics and Push Analytics™: Interactive analytics and forensics are driven by an intuitive web-based user interface, with a powerful query language for navigating mountains of data in real-time. Push Analytics engine provides proactive detection and notification of trends, changes and anomalies.
  • Expert Community and Global Trends: Sumo Logic’s cloud-based approach uniquely connects users, enabling native sharing and social insights; the service mines global trends and anomalies across customer organizations.
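
The LogReduce bullet above describes a recognizable technique: collapse near-duplicate log lines into shared templates. Here is a minimal sketch of that general idea in Python, masking the tokens that vary and counting what remains. The function names and regular expressions are my own illustration, not a reconstruction of Sumo Logic's patent-pending algorithms.

```python
# A sketch of log-pattern reduction: collapse many log lines into a few
# templates by masking the tokens that vary. Illustrative only; this is
# not Sumo Logic's patent-pending LogReduce algorithm.
import re
from collections import Counter

def to_template(line):
    """Replace variable tokens with placeholders, leaving the fixed text."""
    line = re.sub(r"\b\d{1,3}(?:\.\d{1,3}){3}\b", "<IP>", line)  # IPv4 addresses
    line = re.sub(r"\b0x[0-9a-fA-F]+\b", "<HEX>", line)          # hex identifiers
    line = re.sub(r"\b\d+\b", "<NUM>", line)                     # plain numbers
    return line

def reduce_logs(lines):
    """Group lines by template and count how often each pattern occurs."""
    return Counter(to_template(line) for line in lines).most_common()

logs = [
    "Accepted connection from 10.0.0.12 port 52114",
    "Accepted connection from 10.0.0.57 port 49220",
    "Request 8841 completed in 23 ms",
    "Request 8842 completed in 19 ms",
]
for pattern, count in reduce_logs(logs):
    print(count, pattern)
```

Four lines collapse into two patterns here; applied to millions of near-duplicate lines, the same trick yields the "small number of patterns" the company describes.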

Snapshot analysis

I always love it when a startup (or a well-established supplier, for that matter) opens the conversation about a new product or service with claims like "we're first" or "we're the only supplier." Unfortunately, that claim seldom holds up. When someone makes that claim, I immediately conduct a broad search to learn who did it before and when. In this case, I can find at least two other companies offering similar products, and a long history of companies taking similar approaches over the years.

If we examine the history of computing, the same concepts come up again and again in each new generation of systems.

Why does this happen?

It is clear that the industry still hasn't found a perfect solution to the problem or completely addressed the need. Almost always, the next generation attempts to lower overall cost or improve either the solution's performance or its ability to scale.

Although I'm sure Sumo Logic is applying new technology to the problem of managing and analyzing massive amounts of rapidly changing data, there is a long history of companies offering services that could be described as supporting "Big Data." I've spoken with two of them in the recent past.

Mainframes came first

If we take a stroll back in time, we'd discover that mainframes were used to address this need first. These very large systems are still deployed in many organizations to support what clearly were early "Big Data" applications. For some tasks, this approach is still the most cost-effective, thanks to the centralized management that is a standard component of mainframe environments.

Minicomputers and midrange machines came next

Then single-vendor midrange machines, typically supporting a UNIX flavor or a vendor's proprietary operating system, were deployed to address the same use cases, but at a lower cost of hardware and software. Often the approach here was to segment the application or data so that processing could be spread over a number of computers. Sometimes this configuration was called a "cluster" or a "grid."

Herds of industry standard systems are today's answer

Later, the same type of use case was addressed by a large number of relatively inexpensive industry standard systems (read: x86-based systems) that were linked together using parallel processing or workload management software (both forms of processing virtualization technology). This last approach has been the mainstay of technical computing, the industry segment out of which today's concept of "Big Data" emerged.
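
To make that concrete, here is a toy sketch of the workload-spreading idea: segment a dataset and map the same work unit over every segment in parallel. A real cluster or grid fans this out across many machines; Python's standard multiprocessing pool stands in for them here, and the log-scanning task is my own invented example.

```python
# Toy version of the "herd of systems" approach: segment the data, then
# run the same work unit over every segment in parallel. A local process
# pool stands in for the many machines a real cluster or grid would use.
from multiprocessing import Pool

def count_errors(chunk):
    """Work unit: scan one slice of log lines for error records."""
    return sum(1 for line in chunk if "ERROR" in line)

if __name__ == "__main__":
    lines = ["INFO ok", "ERROR disk full", "INFO ok", "ERROR timeout"] * 250_000
    chunks = [lines[i::4] for i in range(4)]    # segment the data four ways
    with Pool(processes=4) as pool:             # spread work over four workers
        total = sum(pool.map(count_errors, chunks))
    print("errors found:", total)
```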

If I review history, service offerings that allow companies to process massive amounts of data have been around for a long, long time. Service bureaus made this type of processing available in the 1960s and 1970s. Hosting and managed services companies have been offering services of this nature since the 1980s.

If we examine the narrow case of using a remote service to analyze operational log data quickly and efficiently to learn how systems are functioning, I've spoken with several suppliers who are offering similar services. Splunk and Loggly are two examples that immediately come to mind.

Sumo Logic's entry certainly cannot be seen as the first. The company, however, can claim to be one of the first to be able to use the industry catchphrases "cloud computing" and "Big Data."

That doesn't mean that they're first to address this type of use case. It only means that new industry catchphrases have emerged to describe a new approach to an old problem.
