To support machine-to-machine (M2M) communication initiatives, data centers will need to become more scalable, stay always available, and employ distributed analytics to reduce the volume of data sent.
Craig Slattery, server line product manager for Dell Asia-Pacific Japan, said companies will need to re-examine their datacenter infrastructure when considering M2M projects. This means having a fault-tolerant, scalable infrastructure of network, server and storage within the data center, he stated.
Ideally, this infrastructure should be housed in a space that can grow alongside the use of M2M so that business owners can scale the environment cost-effectively, Slattery elaborated. M2M communication is the technology that allows electronic devices to communicate over a network to generate meaningful information, such as indicating when store shelves need to be restocked.
The scale and scope of the datacenter infrastructure required is dependent on the M2M application deployed though, as it will need to factor in the number of M2M connections as well as the size and frequency of the event being tracked and logged. The facility will also have to support the method of communication, whether it is wired, wireless, continuous feed or batch feed, the executive said.
As such, he suggested companies invest in data warehousing to effectively track, analyze and extract the commercial value from the data collected.
Furthermore, the networking tools used will be critical to ensuring any M2M project's success.
"Although optimized infrastructure is important in supporting your M2M initiatives, it is the application service layers and protocols utilized that are core to the success of the project--as it is these that enable the M2M communication," said Slattery.
Distributed analytics help reduce data volume
Shonali Krishnaswamy, head of data analytics department at Singapore's Institute for Infocomm Research (I2R), agreed that scalable infrastructure is important to deal with M2M data--which tends to be streaming, real-time and therefore continuous.
A key constraint, though, is the communication infrastructure through which the data has to be transported. Therefore, it is really important to consider deploying "distributed and ubiquitous analytics" wherein preliminary data mining is performed on the data collection devices, Shonali pointed out.
Such data analytics--scaled down and optimized to function on small devices typical of M2M environments--can significantly reduce the total amount of data sent to the datacenter, and also reduce the load on the M2M devices, she explained.
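The idea of performing preliminary data mining on the collection device can be sketched in a few lines. This is a hypothetical illustration, not Dell's or I2R's implementation: the function name, window contents and alert threshold are all invented for the example. Instead of streaming every raw reading to the datacenter, the device condenses a window of readings into one compact summary record and flags only the windows that need immediate transmission.

```python
from statistics import mean

def summarize_window(readings, threshold=30.0):
    """Reduce a window of raw sensor readings to one compact record.

    The threshold value is illustrative; a real deployment would tune it
    to the sensor and application.
    """
    summary = {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
    }
    # Flag the window for immediate transmission only if it crosses the
    # alert threshold; quiet windows can be batched or dropped entirely.
    summary["alert"] = summary["max"] >= threshold
    return summary

# e.g. one minute of temperature samples from a store-shelf sensor
raw = [21.5, 22.0, 21.8, 35.2, 22.1]
record = summarize_window(raw)
print(record)
```

Five raw readings collapse into a single record here; over millions of devices, that reduction in transmitted volume is the point Shonali makes above.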
"From a datacenter perspective, what is really needed is the ability to deal with streaming data and couple that with elastic allocation of virtual nodes in a cloud environment," Shonali said.
"From a datacenter perspective, what is really needed is the ability to deal with streaming data and couple that with elastic allocation of virtual nodes in a cloud environment."
- Shonali Krishnaswamy
"While there is considerable focus on the data management aspects, the real challenge is to combine the Hadoop and MapReduce style-approaches for scalable. These approaches operate in batch mode to more real-time or streaming data analytics--which is the norm in M2M environments."
She did acknowledge that platforms extending Hadoop and MapReduce architectures to cope with real-time analytics are new and only just starting to emerge.
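The contrast Shonali draws between batch and streaming analytics comes down to when the aggregate is available: a batch job reads the whole dataset and answers once, while a streaming operator keeps the answer current after every arriving event. A minimal sketch of the streaming style, using an invented sliding-window average (window size and values are illustrative):

```python
from collections import deque

class SlidingWindowAverage:
    """Keep a running average over the last `size` events of an
    unbounded stream, updated incrementally on each arrival."""

    def __init__(self, size=3):
        self.window = deque(maxlen=size)  # oldest event drops out automatically
        self.total = 0.0

    def push(self, value):
        if len(self.window) == self.window.maxlen:
            self.total -= self.window[0]  # evict the expiring event's contribution
        self.window.append(value)
        self.total += value
        return self.total / len(self.window)

avg = SlidingWindowAverage(size=3)
for v in [10, 20, 30, 40]:
    current = avg.push(v)
# After 40 arrives the window holds [20, 30, 40], so current == 30.0
```

Each `push` costs constant time, so the aggregate never waits for a batch run to complete; that incremental update is what makes the approach fit the continuous, real-time M2M data described above.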
Accommodating mobile players in ecosystems
Data centers, which have traditionally acted as the service hubs for wired services and applications, will now have to accommodate mobile players in their ecosystems too, said Mike Sapien, principal analyst for enterprise telecoms at Ovum.
"In many cases, the data center operator needs to encourage the mobile application providers and mobile access to existing applications just as they added vendors for a large variety of Internet access and internet applications previously," Sapien said.
Social network companies are a great example, he noted. Facebook and Twitter have been seeing more mobile access and users, so they need data center operators and application developers to provide ease of use and access for mobile users, which in turn facilitates M2M.
Business continuity even more critical
Ram Singlachar, Asia-Pacific regional product manager for data center services at Verizon, added that a cloud-enabled data center is the way to go in terms of ensuring business continuity.
The risks that today's data centers face include large-scale natural disasters, wars, power blackouts and submarine cable outages, and any of these could isolate a critical data center from the rest of the world. "With the global nature of M2M deployments, such isolation is unacceptable from a market expectation perspective," Ram said.
Cloud-enabled data centers with secure, private inter-connectivity, on the other hand, would be a viable alternative as they are able to host and seamlessly replicate critical data across two or more locations around the globe to guarantee 100 percent availability, he said.