
Top 5 system configuration oversights

IT security players list the most common mistakes administrators make when designing and configuring systems that handle customer data, and highlight ways to plug the holes.
Written by Ellyne Phneah, Contributor

In an era when cloud services are growing and virtualization is the new norm, an increasing volume of customer data is stored in company network systems, which may sometimes be configured in a way that puts personal information at risk.

In fact, a 2009 study by Dimension Data found that an average of 29 configuration issues or policy violations were identified for each networking device installed in the Asia-Pacific region. The IT services company also noted that most security breaches succeeded because of poor network configurations, which gave hackers easy access points.

ZDNet Asia spoke to IT security players to identify the top five blunders associated with the design and configuration of systems that handle customer data.

Lack of in-depth security
According to Vic Mankotia, security vice president of CA Technologies Asia-Pacific, many system designs and configurations may under- or over-emphasize different aspects of the "defense in depth" strategy. He added that one criterion of an ideal system configuration would be to adhere to a good defense in depth model.

Mankotia elaborated in an e-mail that outer layers of configurations should include firewalls, followed by network-access controls and host-access controls. Each level of access should be managed centrally and enforced with appropriate levels of segregation, he said.
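As a rough illustration of the layering Mankotia describes, the sketch below checks a request against a firewall rule, a network-access rule and a host-access rule in turn, granting access only when every layer agrees. The ports, subnet and host ACL values are invented for illustration, not drawn from any specific product.

```python
# Minimal sketch of a "defense in depth" request check: a request must pass
# every layer (firewall, network-access control, host-access control) in
# order. All names and values here are illustrative assumptions.

ALLOWED_PORTS = {443}                     # firewall layer: permitted ports
TRUSTED_SUBNET = "10.0.1."                # network layer: trusted subnet prefix
HOST_ACL = {"app01": {"alice", "bob"}}    # host layer: users allowed per host

def check_firewall(port: int) -> bool:
    return port in ALLOWED_PORTS

def check_network(source_ip: str) -> bool:
    return source_ip.startswith(TRUSTED_SUBNET)

def check_host(host: str, user: str) -> bool:
    return user in HOST_ACL.get(host, set())

def authorize(port: int, source_ip: str, host: str, user: str) -> bool:
    # Each layer is evaluated in sequence; a failure at any layer denies
    # access, so no single misconfiguration exposes the inner layers.
    return (check_firewall(port)
            and check_network(source_ip)
            and check_host(host, user))

print(authorize(443, "10.0.1.17", "app01", "alice"))    # True: all layers pass
print(authorize(443, "192.168.0.5", "app01", "alice"))  # False: network layer fails
```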

"Companies could utilize existing protection techniques in conjunction with data loss prevention, so that the system will advise users to encrypt data and be aware that data is protected under digital rights management," he advised.

Trent Mayberry, Accenture's Asean technology group platform managing director, added: "Multi-factor protections are essential, and not just an add-on."

Poor procedures for data access
Organizations should revisit data retention practices and processes to ensure sensitive third-party records are stored only on transient systems and conduits, advised Wong Loke Yeow, regional director and evangelist of Hewlett-Packard ArcSight.

He explained that with the emergence of cloud computing, mobility and consumerization of IT putting pressure on existing security models, many organizations face challenges in securing the data that resides with them. Wong pointed to cases where businesses lost customer data that should not have been kept on permanent record in the first place.
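One simple way to keep third-party records transient, along the lines Wong suggests, is a periodic purge job that deletes anything past its retention window. The 30-day window and record layout below are assumptions for illustration.

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention sweep: third-party records carry a received date,
# and a periodic job purges anything past the retention window so sensitive
# data never lingers on what should be a transient system.

RETENTION = timedelta(days=30)   # assumed policy: keep third-party data 30 days

records = [
    {"id": 1, "received": datetime(2011, 6, 1, tzinfo=timezone.utc)},
    {"id": 2, "received": datetime.now(timezone.utc)},
]

def purge_expired(records, now=None):
    now = now or datetime.now(timezone.utc)
    kept = [r for r in records if now - r["received"] <= RETENTION]
    print(f"Purged {len(records) - len(kept)} expired record(s); "
          f"{len(kept)} remain.")
    return kept

records = purge_expired(records)
```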

"In an environment where traditional defenses are no longer sufficient, the 'instant-on' enterprise requires a new approach to manage the risk it faces today and in the future," he explained in an e-mail interview. "Organizations need a solution that provides complete visibility and critical insights into its infrastructure, across all users, networks, data centers and applications."

Suhas Kelkar, BMC's Asia-Pacific CTO and global director of innovation and incubation lab, also noted that organizations place too much emphasis on technology and forget about the 3Ps--people, processes and policies--which are equally important in designing a secure system.

Overly complex infrastructure
Stephen Miles, CA's Asia-Pacific vice president of service assurance, noted that traditional event-automation tools worked well in previously static data environments, but not in the current landscape of cloud services and virtualization.

Miles said in an e-mail interview that services were typically built from a number of critical assets, and service quality suffered when the provider did not understand how these individual components were put together.

He noted that it would then be critical for Web hosting companies and service providers to consider managing their infrastructure through real-time, dynamic discovery-based models that can capture each service component, and unify management silos that would otherwise be disruptive to service quality.
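A discovery-based service model of the kind Miles describes can be as simple as a mapping from each business service to the components discovered beneath it, so an alert on one asset can be traced to every service it affects. The service and component names below are invented for illustration.

```python
# Sketch of a discovery-based service model: each business service maps to
# the components discovered underneath it. Names are hypothetical.

SERVICE_MODEL = {
    "web-hosting": ["lb-01", "web-01", "web-02", "db-01"],
    "email":       ["mail-01", "db-01"],
}

def services_affected_by(component: str) -> list[str]:
    # Unifies per-silo views: one lookup answers "which customer-facing
    # services depend on this asset?" regardless of which team owns it.
    return [svc for svc, parts in SERVICE_MODEL.items() if component in parts]

print(services_affected_by("db-01"))  # ['web-hosting', 'email']
```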

With regard to how complex infrastructures could be improved, Miles advised that with proper modeling, infrastructure optimization and analytics, hosting companies and service providers would be able to make their services more attractive.

"Service management must be viewed as strategic tool, rather than one used for information gathering," he added.

Weak data architecture
Many organizations focus only on the technical aspects of system design and implementation, and neglect to put in place proper data management processes to maintain it, according to Mayberry.

Asked what was critical in a good system design, he pointed to data architecture. Consideration should be given to the various categories of data, he said, adding that design rules and policies should be put in place to safeguard access to that data.
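Mayberry's point about categories and access rules could be expressed as a small policy table, as in the sketch below. The categories, role names and deny-by-default rule are assumptions made for illustration.

```python
# Sketch of a data-architecture rule set: each category of data gets an
# explicit access policy, so safeguards are designed in rather than
# bolted on. Categories and roles are hypothetical.

ACCESS_POLICY = {
    "public":       {"roles": {"anyone"},           "encrypt": False},
    "internal":     {"roles": {"employee"},         "encrypt": False},
    "customer-pii": {"roles": {"support", "admin"}, "encrypt": True},
}

def can_access(category: str, role: str) -> bool:
    policy = ACCESS_POLICY.get(category)
    if policy is None:
        return False          # unclassified data is denied by default
    return role in policy["roles"] or "anyone" in policy["roles"]

print(can_access("customer-pii", "support"))   # True
print(can_access("customer-pii", "employee"))  # False
```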

Poor understanding of data lifecycle
"Organizations must recognize and have a strong understanding of the 'data lifecycle', and ensure there are processes in place to manage end-to-end data," said Mayberry.

He explained that the lifecycle from "source to target" is important to ensure companies are managing information as an asset. Organizations should integrate their understanding of the data source from the "system of record", with business rules to ensure the right usage and application of the data. They should then monitor the eventual usage of that piece of data, he noted.

Kelkar added that a good understanding of the data lifecycle can also help organizations set appropriate strategies for the collection, use, deletion and retention of data.
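Tying these ideas together, the sketch below models a record that moves from collection through use to retention and deletion, keeping an audit trail of each transition so eventual usage can be monitored. The stage names and record fields are invented for illustration.

```python
# Illustrative end-to-end data lifecycle: a record moves from collection
# through use to retention and deletion, with each transition recorded.
# Stage names and fields are assumptions, not from any specific framework.

LIFECYCLE = ["collected", "validated", "in-use", "retained", "deleted"]

class DataRecord:
    def __init__(self, record_id: str, source: str):
        self.record_id = record_id
        self.source = source          # the "system of record" it came from
        self.stage = "collected"
        self.history = [("collected", source)]

    def advance(self, note: str = ""):
        i = LIFECYCLE.index(self.stage)
        if i + 1 >= len(LIFECYCLE):
            raise ValueError("record already deleted")
        self.stage = LIFECYCLE[i + 1]
        self.history.append((self.stage, note))   # audit trail of usage

record = DataRecord("cust-42", source="crm")
record.advance("checked against business rules")
record.advance("used in billing report")
print(record.stage, record.history)
```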
