Ongoing increases in data-centre compute capacity are being largely driven by demand for compute-intensive data analytics as big-data strategies take hold and executives warm to the prospects of real-time data analytics, according to Intel.
Speaking at the launch of the company's "grunty and powerful" Intel Xeon E7v2 CPU – an upgrade to the Xeon E7 that packs up to 15 CPU cores and delivers twice the performance, three times the memory support, and four times the I/O bandwidth of its predecessor – Intel ANZ enterprise solutions group manager David Mellers said the latest boost in processor grunt would hasten the transition to a 'SMAC stack' among enterprise customers.
SMAC – an acronym that has gained currency to describe the new core enterprise model built around social, mobile, analytics and cloud technologies – reflects "the huge focus on agility," Mellers said.
"Enterprise IT is under enormous pressure as they implement technologies for virtualised solutions with much faster deployment times. That, in turn, is driving a flattening of the data centre to a fabric of standardised computing."
With emerging big-data strategies dramatically increasing raw compute requirements, the delivery of ever-faster computing platforms has become integral to meeting organisational expectations of increasingly near real-time analytics capabilities.
That marks a huge shift from years past, when management reports were designed and run by hand, with lag times of several days or weeks.
"The confluence of those four trends [SMAC] is creating a massive demand as companies want to be able to use information effectively and deliver it anywhere to any mobile device," Gordon Graylish, Intel's general manager of enterprise solution sales, explained.
"Analytics, and the stress that puts on the infrastructure at the moment, is resulting in some real questions about 'how can I afford this?' and 'how do I shift the expense of running the current environment?'."
Positioned as Intel's high-end server processor, the Xeon E7v2 is built on a 22nm manufacturing process and Intel's Ivy Bridge microarchitecture. Each chip includes between 6 and 15 cores – packing up to 4.31 billion transistors – and has up to 37.5MB of Level 3 cache with CPU frequencies of up to 3.4GHz.
With that kind of power, Intel believes the new CPUs will support ever-increasing expectations from big-data analytics as the open-source Hadoop platform gains currency and demand for analytics of unstructured data continues to grow.
"In the past, most things were quite structured in terms of what organisations had to analyse," Mellers said. "They had control of the data. But today, you have social media, Web logs, and other data that presents a better opportunity to understand what customers are doing. The ability to deliver a horizontal platform for this, as we've done in the past, makes this technology more affordable and effective for enterprises as well."
Reflecting that focus, Intel this week signed an agreement with Australian big-data consultancy Contexti, which will support the Hadoop-based Intel Data Platform that Intel launched earlier this month.
That partnership is intended to help push the new SMAC mentality into businesses across the region, with an increasing focus on in-memory computing further driving the growth of these new analytics capabilities.
In-memory analytics is part of the raison d'être for the Xeon E7v2, Mellers said, noting that its overall performance is "a 93 percent improvement on the previous version of Xeon."
"If you want to slice and dice data it can take a few days for analysts to produce a report and deliver it," he said. "That's why in-memory computing is a key enabler in terms of being able to change how CIOs and enterprises deal with their data, from analysing it to now being able to better predict what's going to happen."
"The speed at which in-memory computing operates makes these decisions more accessible – and this generates new experiments. With the tools and speed that in-memory computing produces, enterprises can access that data and do meaningful things with it in real time."