Adaptive Computing's formula: Workflow management + Big Data = Big Workflow

Summary: Adaptive Computing aims to apply its experience in managing HPC workflows -- using its Moab HPC Suite -- to help companies manage Big Data workloads.

Adaptive Computing CEO Rob Clyde stopped by to discuss how the company has used its Moab high performance computing (HPC) software to manage Big Data workloads, and his company's attempt to coin a new industry catchphrase: "Big Workflow."

Moab High Performance Computing

Adaptive Computing has long offered products that help companies deploy HPC workloads. The company's products -- including Moab HPC Suite - Basic Edition, Moab HPC Suite - Enterprise Edition, Moab HPC Suite - Application Portal Edition, and Moab HPC Suite - Remote Visualization Edition -- are all based on the company's Moab Intelligence Engine. The company also offers the same capabilities for cloud-based HPC workloads in a product called Moab Cloud Suite.

The Moab Intelligence Engine optimizes scheduling and management across workloads and resources, based on policies. The goal is to accelerate results delivery and maximize utilization while simplifying workload management across complex, heterogeneous cluster environments or even HPC cloud environments. 

These policies rely on Moab's integration tools, which coordinate data and actions across heterogeneous resources, resource managers, and management tools. The Basic and Enterprise editions can be extended with the Moab HPC Suite - Grid Option to optimize scheduling and management across grid environments.
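To make the idea of policy-based scheduling across heterogeneous resource managers concrete, here is a minimal sketch in Python. It is purely illustrative: the class names, policy weights, and backends are hypothetical and do not reflect Moab's actual configuration language or APIs. The idea is to score each queued job against configurable policy weights, then dispatch the highest-priority jobs to whichever backend has capacity.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    cores: int
    wait_time: int      # seconds spent in the queue so far
    user_priority: int  # site-assigned priority of the submitting user

@dataclass
class Backend:
    # Stand-in for one heterogeneous resource manager (e.g., a cluster).
    name: str
    free_cores: int

    def can_run(self, job):
        return job.cores <= self.free_cores

    def dispatch(self, job):
        self.free_cores -= job.cores
        print(f"{job.name} -> {self.name}")

# Hypothetical policy weights: favor long-waiting jobs and important users.
POLICY = {"wait_weight": 1.0, "user_weight": 100.0}

def priority(job):
    return (POLICY["wait_weight"] * job.wait_time
            + POLICY["user_weight"] * job.user_priority)

def schedule(queue, backends):
    # Highest-priority jobs first; place each on the first backend with room.
    for job in sorted(queue, key=priority, reverse=True):
        target = next((b for b in backends if b.can_run(job)), None)
        if target is not None:
            target.dispatch(job)

schedule(
    [Job("render", 8, 600, 1), Job("genomics", 32, 60, 5)],
    [Backend("hpc-cluster", 64), Backend("cloud-pool", 16)],
)

In a real deployment the weights would come from site policy, and the backends would be actual resource managers rather than in-memory objects.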

Big Data Workflow

Since the company has had a great deal of success helping organizations manage workflows on HPC clusters, it believes that the same technology could be a winner in managing Big Data workflows.

Adaptive Computing's goal is to unify data center resources and optimize their use by Big Data applications. At this point, the technology can manage bare metal and virtual machines, technical computing environments (e.g., HPC, Hadoop), clouds (public, private, and hybrid), and even agnostic platforms that span multiple environments, such as OpenStack -- all as a single ecosystem that adapts as workloads demand.
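As a rough illustration of what managing multiple environments "as a single ecosystem" can look like, the sketch below routes each workload to the first environment that satisfies its constraints. Again, this is hypothetical Python, not Adaptive Computing's product: the environment names, capacity figures, and constraint flags are invented for the example.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    needs_hadoop: bool       # e.g., a MapReduce analytics job
    latency_sensitive: bool  # prefers local bare metal over remote cloud

# Invented capacity table; real figures would come from each platform's API
# (an HPC resource manager, a Hadoop cluster, an OpenStack or public cloud).
ENVIRONMENTS = {
    "hpc-bare-metal": {"hadoop": False, "local": True,  "free_slots": 4},
    "hadoop-cluster": {"hadoop": True,  "local": True,  "free_slots": 2},
    "public-cloud":   {"hadoop": True,  "local": False, "free_slots": 100},
}

def place(workload):
    # Pick the first environment that satisfies the workload's constraints.
    for name, env in ENVIRONMENTS.items():
        if workload.needs_hadoop and not env["hadoop"]:
            continue
        if workload.latency_sensitive and not env["local"]:
            continue
        if env["free_slots"] > 0:
            env["free_slots"] -= 1
            return name
    return None  # nothing fits; a real system might queue or burst to cloud

for w in [Workload("etl", True, False), Workload("mpi-sim", False, True)]:
    print(w.name, "->", place(w))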

Snapshot analysis

The company's Moab HPC Suite has been a well-known competitor in the HPC market. The company is taking the same basic technology and applying it to Big Data workloads. It hopes that the combination of Big Data workloads and Moab workflow management will come to be known by the catchphrase "Big Workflow."

The market for managing Hadoop and other Big Data workloads has become increasingly competitive. Technologies such as Apache Ambari, Apache Hadoop's YARN, Compuware APM, Puppet Labs' Puppet, MapR's Control System, and quite a few others are already available in the field. So, Adaptive Computing is a newcomer in a very hot market segment.

Adaptive Computing's Moab has certainly developed a following in its original market segment. Will that translate into success in the Big Data segment of the market? That isn't at all clear. The company's product, if properly marketed, has a great deal of promise. We'll have to watch to see how market forces evolve.
