
2012 Predictions: VCE FastPath, VI SAN Probe & Hadoop

Written by Archie Hendryx, Contributor

Yearly prediction blogs are so clichéd, which is why I've always tried to avoid writing one. Despite this, I've always made a mental note of the technologies, products or companies that I thought were going to do really well in the upcoming year. Back in 2008 I felt VMware were really going to take off after the release of 3.5. In 2009 I had a gut feeling DataDomain would explode, just before they were bought by EMC. In 2010 I spoke to a friend about how 3PAR's technology could no longer be ignored, and in 2011 I still wasn't convinced that FCoE would overtake FC in revenue despite all the analysts' claims. But why believe me when I'd never put these thoughts on paper? So now, at the beginning of 2012, I've decided to put my money where my mouth is, pull out my crystal ball and document my predictions.

First off I'm going with VCE's Vblock and their new FastPath feature. VCE (the company formerly known as Acadia) have always been an exciting prospect with their all-in-one Vblock solution. While other vendors such as HDS, HP and Dell all plot the launch of their own unified computing blocks, VCE have had the advantage of being first to market and consequently the first to learn and adapt their messaging and offering to customer needs. One such initiative is what is being coined FastPath. In essence, FastPath is a wizard-driven, GUI-based deployment of a Vblock VDI infrastructure, built on best-practice reference architectures, that accelerates deployment from months to days. I've often blogged about the many benefits of VDI and the immense CAPEX and OPEX savings that come with it; to be honest, it's a no-brainer. What I failed to mention was the sometimes long, drawn-out and painful PoC process required to prove the value of a VDI deployment to a potential customer. Well, FastPath is the solution to that conundrum.

Available in pre-configured Vblocks, FastPath allows the customer to choose from a variety of products that scale according to their needs, eliminating the risk of sizing errors and allowing them to scale out as requirements grow. So if you have a requirement for 500, 1,000 or 1,500 desktop users, you choose the appropriate pre-configured model and you're ready to go: your VDI roll-out is based on known capacities, avoiding unnecessary pre-purchasing of hardware. Added to this, the Vblocks leverage proven designs and reference architectures via an installation wizard that focuses on the performance and usage specific to your environment, mitigating any risk to a successful VDI deployment. The installation wizard configures the VMware View components, including the connection broker, in minutes, even creating and optimising the VDI storage layout that can be cloned as 'Gold' master images.

The business benefits are obvious: customers can now accelerate their turnaround time from order to installation and enjoy a seamless roll-out from PoC to production. TCO is easily quantifiable, as you know exactly what you're acquiring, how it will operate and perform, and how much the whole package costs. While FastPath is currently aimed at VDI deployments, it wouldn't be surprising to see VCE adopt a FastPath strategy for other Vblock deployments, such as Oracle or SAP P-to-V migrations, or primary and secondary Vblock DR set-ups that leverage Site Recovery Manager. The possibilities are numerous, and 2012 could well be the year when FastPath transforms an erroneous mindset that sees the Vblock as merely a unified hardware computing block into one that sees it as an all-in-one, quick-to-deploy and essential solution for the business.

Second is, obviously, a technology that I hold close to my heart having worked for the product's company, Virtual Instruments: namely the SAN Performance Probe. Initially VI relied on Finisar technology for their probe products and its unique ability to track millisecond latency across Fibre Channel SAN infrastructures. Now, with last year's launch of their own SAN Availability Probe, they've seen hardware sales rocket as they've empowered storage, server and VMware administrators to master the once complex art of FC SAN optimization via an easy-to-use dashboard GUI. What was initially seen as an FC SAN troubleshooting tool is, as customer use cases show, quickly proving to have value that extends far beyond the realms of the SAN administrator. Already customers have found the probe gives them the ability to de-risk disaster recovery, optimize backups, safeguard the virtualization of Tier 1 applications, and optimize the performance of their existing infrastructure while offsetting future procurement.

One of the keys to success for any company in such a highly competitive start-up market is to have a 'Blue Ocean Strategy', an experienced, top-class leadership team and a vision for the future. The reality is that the product has no competitor, and while this may have irked some vendors into producing FUD to the contrary, it speaks volumes that a company which has yet to reach the 200-employee mark can be rattling the cages of such big corporations. Add to the mix an executive team that includes an industry legend such as former Symantec CEO John W. Thompson, along with veterans from EMC, McData and HDS as VPs of Sales, Pre-Sales, Marketing and Services, and it's not surprising that a new product from a relatively young start-up can so easily walk into large enterprise accounts and justify its unique value. As VI's customer base inevitably grows in 2012, so too will the SAN Performance Probe's use cases and consequent business value.

Lastly is a technology that was named after a kid's toy elephant: Hadoop. Like all great things in life, this Java-based programming framework is free. An Apache project whose core ideas come from the papers Google published on how it turned the information it was indexing and collecting into meaningful results for its users, Hadoop is the solution to what will be the term of 2012: 'Big Data'. The long-standing problem that Google and its like faced, namely lots of structured and unstructured data and the need to run process-intensive analytics over it, was always an expensive proposition in the context of a traditional centralized database system. So instead of being limited to a single disk mapped to eight processors, Hadoop simply breaks an application up into numerous small fragments, each of which can run on any node in a cluster. Hence, in a cluster of servers that each have eight CPUs, Hadoop will send your code across those servers, enabling you to run your indexing job with all those processors working in parallel, quickly and efficiently, and still return your results as a single readable whole.
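To make that concrete, here is a minimal sketch of the canonical word-count job written against Hadoop's Java MapReduce API. Treat it as illustrative rather than production code: the class names and the HDFS input and output paths passed on the command line are my own assumptions, not anything prescribed by the framework.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        // Map phase: each node works on its own fragment of the input data,
        // emitting a (word, 1) pair for every word it sees, in parallel.
        public static class TokenizerMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(LongWritable offset, Text line, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer tokens = new StringTokenizer(line.toString());
                while (tokens.hasMoreTokens()) {
                    word.set(tokens.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        // Reduce phase: the framework gathers every count emitted for a given
        // word across the cluster and sums them into a single readable result.
        public static class IntSumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            private final IntWritable total = new IntWritable();

            @Override
            protected void reduce(Text word, Iterable<IntWritable> counts, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable count : counts) {
                    sum += count.get();
                }
                total.set(sum);
                context.write(word, total);
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = new Job(conf, "word count");          // Hadoop 1.x-era job setup
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);      // pre-aggregate counts on each node
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));    // input directory in HDFS
            FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output directory in HDFS
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Submitted with something like 'hadoop jar wordcount.jar WordCount /some/input /some/output' (paths again illustrative), the same code runs unchanged whether the cluster is a single machine or thousands of nodes; it is the framework, not your application, that decides how the map and reduce tasks fan out across all those CPUs.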

With the Hadoop framework already adopted by the likes of Yahoo, IBM and Google, 2012 could well be the year when Hadoop moves beyond search engine sites and finds more prominence in the retail and finance sectors. That is not to say that current data warehouses or transaction-processing systems are about to be ripped out of these sectors. Instead, when these traditional databases reach their peaks, running Hadoop will enable further analysis across multiple data feeds in a single platform at a relatively cost-effective price. In the finance sector, for example, Hadoop will easily find a useful space in identifying transaction fraud, where large data sets for modelling and backtesting need to be created. Other use cases could include supporting compliance by using Hadoop for the daily processing of equity markets data, or even utilizing it to consolidate the data warehouses that run loan, banking and credit card consumer products. As for the retail sector, its drive towards cost-effective ways of dealing with a growing amount of consumer and product information is another ideal fit for a Hadoop-based solution. What retail outlet wouldn't want to provide an online customer experience with product search results comparable to Google's? In fact, such is Hadoop's potential for the enterprise that even EMC have taken note, with the recent launch of an Isilon scale-out NAS that incorporates Hadoop's Distributed File System. This could be just the beginning for Hadoop as the big vendors start to give it their seal of approval.

So while there are a number of other technologies, products and vendors I feel are going to make waves this year, such as HDS' HAM, Tintri, EMC's VFCache, IBM's SVC 6.2 support for VAAI and of course VMware's eventual move into PaaS following the acquisition of SpringSource, I'm putting my neck on the line with these three as guarantees: VCE's Vblock FastPath, Virtual Instruments' SAN Availability Probe and the open-source Hadoop. Either way, 2012 looks to be another great year for technology, and storage innovation in particular.
