ExtraHop introduces streaming data appliance

ExtraHop, already noteworthy for its network packet-level data access, delivers an appliance for working with streaming data, making IoT and other time-series analysis a plug-and-play affair.

Streaming data processing, and stream data analytics, are becoming central to many businesses' data strategies. The Lambda architecture, which promotes and facilitates the mashing up of streaming real-time data with data already captured and stored, is gaining in prominence. This has come about because companies want to respond to phenomena as they occur, rather than the following day, week, month or quarter.

Don't try this at home
But streaming platforms aren't easy to work with. Most of them require setting up event-handling infrastructure, like message queues, data sources and data sinks. There's a lot of configuration work involved and, typically, coding becomes necessary as well.
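To make the point concrete, here is a toy sketch (generic illustration only, not ExtraHop or any particular platform) of the moving parts a do-it-yourself streaming setup makes you wire together: a data source, a message queue, and a data sink, each needing its own plumbing.

```python
import queue
import threading

def source(q, events):
    """Data source: push each raw event onto the queue."""
    for event in events:
        q.put(event)
    q.put(None)  # sentinel: signal that the stream has ended

def sink(q, results):
    """Data sink: drain the queue and collect processed events."""
    while True:
        event = q.get()
        if event is None:
            break
        results.append(event.upper())  # stand-in for real processing

q = queue.Queue()
results = []
producer = threading.Thread(target=source, args=(q, ["login", "click", "logout"]))
consumer = threading.Thread(target=sink, args=(q, results))
producer.start()
consumer.start()
producer.join()
consumer.join()
print(results)  # → ['LOGIN', 'CLICK', 'LOGOUT']
```

Even this trivial pipeline needs threads, a queue, and an end-of-stream convention; real platforms multiply that configuration across brokers, partitions and serialization formats.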

Essentially, the middleware and SOA architectures of yesteryear have been resurrected and given a fresh Big Data sheen. If streaming data is to go truly mainstream, then all this duct tape and chewing gum won't do. The market needs this to be easier.

Turnkey solution, anyone?
Enter ExtraHop, which today is not just introducing a software abstraction layer over streaming data, but is wrapping that software up in a physical appliance that captures and indexes the data. In so doing, the product, dubbed the ExtraHop Explore appliance, makes the data searchable and provides for multi-dimensional analysis of the streaming data as well.

Add in the company's Discover appliance, and customers have the ability to visualize the data in real time as well. Plus, a REST API and connectivity to Apache Kafka (via ExtraHop's Open Data Stream facility) allow integration with numerous external applications and environments. Speaking of which, customers wishing to perform visual analysis of ExtraHop data in Tableau, or products from Qlik, may do that as well.
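A REST integration of this kind might be approached as follows. This is a hypothetical sketch: the host, the `/api/v1/metrics` path, and the API-key header format below are illustrative assumptions, not documented ExtraHop endpoints, so consult the product's actual API reference before use.

```python
import urllib.request

def build_metrics_request(host, api_key):
    """Build (but don't send) an authenticated REST request.

    The URL path and Authorization scheme are hypothetical placeholders
    standing in for whatever the real API documents.
    """
    url = f"https://{host}/api/v1/metrics"  # assumed endpoint path
    return urllib.request.Request(
        url,
        headers={
            "Authorization": f"apikey={api_key}",  # assumed auth scheme
            "Accept": "application/json",
        },
    )

req = build_metrics_request("extrahop.example.com", "SECRET")
print(req.full_url)
# Sending would be: urllib.request.urlopen(req)
```

Separating request construction from transmission, as above, also makes the integration easy to unit-test without a live appliance.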

Straight to the source
Together, these technologies comprise version 5.0 of the ExtraHop platform, which already employed the unique approach of gleaning data directly from the processing of network packets. While that may sound terribly low-level and raw, the company's protocol filters mean that well-structured, distilled data are delivered to users; meanwhile, ExtraHop has access to data that may never make it to a log file. That makes data capture more comprehensive and allows multiple data streams to be correlated.
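Correlating streams typically starts with putting events from different sources onto one timeline. A minimal sketch of that idea (a generic technique, not ExtraHop's implementation) is an ordered merge of timestamped events from two hypothetical streams:

```python
import heapq

# Two already-time-ordered event streams from different layers of the
# stack; the event names are made up for illustration.
http_events = [(1, "GET /login"), (4, "GET /report")]
db_events = [(2, "SELECT users"), (3, "SELECT reports")]

# heapq.merge performs an ordered merge of sorted inputs, yielding a
# single timeline tagged with each event's source stream.
merged = list(heapq.merge(
    ((ts, "http", ev) for ts, ev in http_events),
    ((ts, "db", ev) for ts, ev in db_events),
))
print([ts for ts, _, _ in merged])  # → [1, 2, 3, 4]
```

Once events share a timeline, analysis can ask questions that span streams, such as which database queries followed a given HTTP request.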

ExtraHop's products are appropriate for low-level applications like network security and optimization; DevOps-related pursuits like application performance monitoring; Internet of Things (IoT) data collection and analysis; and healthcare applications, including analysis of data from medical devices and/or transmitted electronic medical records.

Way to go
The prevailing state of the art in streaming data processing is about as evolved as batch data processing with Hadoop was four years ago. Just as Hadoop had to get easier in order to accelerate its adoption, so too must streaming data processing and analytics. Surprisingly few companies are offering anything close to turnkey solutions in this domain, and it will take a while before that changes. But ExtraHop is doing it now, with an innovative, layered approach derived from experience in streaming data processing, and a dedication to treating it as a generalized problem, instead of a portfolio of point-to-point solutions. The rest of the industry needs to get in gear.