
Hitachi Data Systems Analyst Day

Hitachi Data Systems brought together a group of industry and financial analysts. The goal was to inform the analysts and to keep HDS top of mind. Here are my experiences and thoughts from the event.
Written by Dan Kusnetzky, Contributor

Today I'm in San Jose attending the Hitachi Data Systems (HDS) analyst event.  Since I seldom have an opportunity to speak with HDS, I'm looking forward to meeting a number of new people.

The analyst relations folks described the event in the following way:

"You will be hearing from key executive management members, as well as VP/GMs of each region, and even 3 strategic customers to jointly highlight our successes. We think that this will provide the best opportunity for us to deliver a holistic view of our business."

I'll update this post with interesting information as it is presented.

8:00 AM PT Update

The conference room is set up with a number of circular tables. None of them offer power for analysts' PCs. The published WiFi code is invalid. The USB flash drive that was distributed to the analysts in the analyst packet was blank. I suspect that the day will get better for Hitachi.

I'm somewhat surprised that no one thought to check out the basics, WiFi and power, before analysts came into the conference room.

On another topic, I overheard an interesting discussion in the hallway. One analyst was commenting on one supplier's claim that it had a superior product. The other analyst replied, "How can they have the best product? Their sales don't support that claim." I've often seen superior products fail against products that were merely good enough but were marketed superbly.

HDS Vision and Strategy: Jack Domme, HDS CEO

Jack Domme, HDS CEO, is a dynamic speaker.

He did his best to present a rapidly growing, successful company. He also took a moment to present three recent acquisitions: ParaSystems, BlueArc and Shoden Data Systems. HDS' goal, Jack pointed out, is that all data be stored, governed and managed. The following bullets are a summary of his presentation:

  • He pointed out that unstructured data is growing 10 times faster than structured data, that the amount of data organizations use is expected to grow 42 times from 2009 to 2020, and that HDS expects 1 billion different applications will eventually access data. He didn't cite any research when making these assertions.
  • HDS is working to disconnect data from the applications making it possible for data to be useful far beyond the life of a single application. All data must be independently discoverable, accessible and useful forever. He also pointed out that organizations will need to govern that data forever. Data formats, standards and media types are constantly changing.
  • This rapid change, he points out, requires data, storage, file server and network virtualization.  (I agree with this statement.) HDS has been offering page-level virtualization for some time.  This, he asserts, makes it possible to move an object or a file from one storage tier to the next without applications having to know about the physical location or format of the data.  This capability works across storage devices, storage vendors or storage media. (This sounds a great deal like messages I've heard from several other suppliers.)
  • The next move is to virtualize content, making text-based objects discoverable and accessible across applications, storage locations and media types. Jack then expounded on the rapid growth and amazing scale of the amount of data in use today. Moving this data from tier to tier so that it can be accessed as needed is the challenge facing the industry today.
  • Searching non-text data is the next big challenge. Analytical search across data types is really a requirement to make sense of the data explosion we're facing. "Big Data" today is largely focused on structured data. HDS thinks that non-structured data is the next big thing and the company is working towards that capability today.
  • Jack imagines a future made up of completely virtualized, cloud-based infrastructure, which he called "the infrastructure cloud." Organizations would no longer have to care what the storage medium is, what tier of the storage system the data currently resides on, or what application created or uses that data.
  • He then posits the creation of a "content cloud" built on the foundation of this "infrastructure cloud," and discussed what the world would be like when this was a standard approach. He tied in how this would bring an important improvement in power consumption and heat production while still allowing management from a single dashboard. Data would automatically move to the best storage for its current use. He presented a few customer profiles to support HDS' story.
  • Forever data governance was his next topic. HDS believes that removing the obstacles to moving data from media type to media type, and from format to format, will be very important in the future.
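The location-transparency idea behind page-level virtualization can be sketched conceptually: applications hold on to a logical identifier while a mapping layer tracks which tier physically holds the data, so migrations never touch the application's read path. This is my own illustrative toy, not HDS code; all names and the three-tier layout are hypothetical assumptions:

```python
# Illustrative sketch only (not HDS's implementation): a mapping layer
# lets data migrate between tiers without the application noticing.

class TieredStore:
    def __init__(self):
        # Hypothetical tiers; real systems would map to flash, disk, tape, etc.
        self.tiers = {"flash": {}, "disk": {}, "archive": {}}
        self.location = {}  # logical ID -> name of the tier holding the data

    def put(self, key, value, tier="flash"):
        self.tiers[tier][key] = value
        self.location[key] = tier

    def get(self, key):
        # The caller never sees (or cares) which tier holds the data.
        return self.tiers[self.location[key]][key]

    def migrate(self, key, new_tier):
        # Move the data and update the mapping; callers are unaffected.
        old_tier = self.location[key]
        self.tiers[new_tier][key] = self.tiers[old_tier].pop(key)
        self.location[key] = new_tier

store = TieredStore()
store.put("report-2011", "quarterly numbers")
store.migrate("report-2011", "archive")   # data moves to a colder tier...
print(store.get("report-2011"))           # ...but the same read still works
```

The point of the sketch is the indirection: because readers go through `location`, a background process can demote cold data to cheaper media without any application change.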

He then pointed out that what HDS is doing, and what it plans to do in the future, is a requirement for the future of cloud computing. While the presentation was very persuasive, HDS' messages remind me of those presented by IBM, HP and others.

I'll continue this monolog in another post.

