Storage Analytics meets Tufte

Summary: The most powerful thing about Sun's new line of storage appliances isn't its price, its capacity, or anything else about the hardware - it's the general applicability of the command and control software, which should excite anyone who has ever had to guess at how a data center was performing.

Sun's new line of storage appliances supports NFS, CIFS, and iSCSI network connections to anything from two to forty-six terabytes of ZFS stored data per box.

In the short term what's important about the hardware is its low cost and compatibility with established network connected storage practices.

In the long term, however, what I think will prove most important about the package is the storage analytics capability - with the links between its integrated iSCSI capability and the CMT/SMP line's integrated network cryptography support coming in a close second.

If you haven't already looked at the storage analytics software that ships with the new gear, now is the time to do so. It's a whole new world - something that gives sysadmins the ability both to monitor and to act in real time on the basis of both instantaneous and long-term data.
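
The idea of acting on instantaneous readings judged against a long-term baseline is easy to sketch. The following is a minimal, hypothetical illustration of that pattern - none of it is Sun's code, and the class name, window size, and threshold are all invented for the example:

```python
from collections import deque

class LatencyMonitor:
    """Toy sketch: flag an operation whose instantaneous latency
    deviates sharply from a long-term rolling baseline."""

    def __init__(self, window=1000, threshold=3.0):
        self.samples = deque(maxlen=window)  # long-term view
        self.threshold = threshold           # alert at this multiple of the mean

    def observe(self, latency_ms):
        """Record one sample; return True if it warrants action right now."""
        baseline = (sum(self.samples) / len(self.samples)
                    if self.samples else latency_ms)
        self.samples.append(latency_ms)
        return latency_ms > self.threshold * baseline

monitor = LatencyMonitor()
for t in [5, 6, 5, 7, 6, 40]:   # hypothetical per-op latencies (ms)
    if monitor.observe(t):
        print(f"alert: {t} ms spike vs. baseline")
```

A real analytics package would of course sample many metrics at once and drive drill-down visualizations rather than a single alert, but the monitor-plus-baseline loop is the core of it.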

The place to start your research is key developer Bryan Cantrill's weblog - he has a discussion, with lots of embedded references, on how they got started, what they tried to do, and the circumstances in which they did it.

Two samples:

In October 2005, longtime partner-in-crime Mike Shapiro and I were taking stock. Along with Adam Leventhal, we had just finished DTrace -- and Mike had finished up another substantial body of work in FMA -- and we were beginning to wonder about what was next. As we looked at Solaris 10, we saw an incredible building block -- the best, we felt, ever made, with revolutionary technologies like ZFS, DTrace, FMA, SMF and so on. But we also saw something lacking: despite being a great foundation, the truth was that the technology wasn't being used in many essential tasks in information infrastructure, from routing packets to storing blocks to making files available over the network. This last one especially grated: despite having invented network attached storage with NFS in 1983, and despite having the necessary components to efficiently serve files built into the system, and despite having exciting hardware like Thumper and despite having absolutely killer technologies like ZFS and DTrace, Sun had no share -- none -- of the NAS market.

...

I believe that the result -- which you can sample in this screenshot -- does more than simply strike the balance: we have come up with ways to visualize and interact with data that actually function as a force multiplier for the underlying instrumentation technology. So not only does analytics bring the power of DTrace to a much broader spectrum of technologists, it also -- thanks to the wonders of the visual cortex -- has much greater utility than just DTrace alone. (Or, as one hardened veteran of command line interfaces put it to me, "this is the one GUI that I actually want to use!")

The key document to review is the storage analytics presentation he makes available in that discussion - and note also his link to a VMware instance you can download to actually try the package, complete with a 16-disk virtual storage implementation.

What I find most compelling about all this, however, is a combination of two different, but closely related, things:

  • the combination of control and analysis built into this package is, to my knowledge, unique in the industry - and it is the first real-time command and control package I've seen that meets the information presentation criteria taught by Edward Tufte.

  • there doesn't seem to be anything about this package that limits its applicability to storage management.

Thus what's compelling about the software is, first, that it is a real tool - something you can lay your hands on and work with right now - implementing what have previously been mere talking points in presentations about how to combine information and action; and, second, that it's easy to see the potential for applying the same software, and thus the same ideas, to much more complete systems - to, for example, an entire Project Blackbox data center.

And that, I think, is the real bottom line: not just storage rethought in low cost hardware, but new, powerful, and widely applicable systems management tools for us.

Talkback

  • Splunk

    Analytics reminds me of Splunk (http://www.splunk.com/).

    Splunk is an excellent tool to visualize what's happening in your Datacenter.

    Splunk is like "grep on steroids". You can suck all your log files and other unstructured data in and easily do large-scale analyses using its visualization features.
    Burana
    • I'll check it out

      At first glance, it looks interesting. Thanks for bringing it up.
      murph_z
  • Evidence for a long term view?

    With Sun shedding nearly 20% of its workforce and its belief in being able to get blood from a stone (or money from open source, as it's more commonly known), what evidence do you have that there will be any long term?

    Nevermind Paul, I thought DEC would be here forever too ;-(
    tonymcs1
    • off-topic

      Paul is talking about automatic trend analysis of storage availability.

      And you start talking about market share.

      Must be logic only numpties understand.
      TedKraan