Application testing has taken on added importance as more mobile apps enter the enterprise arena. However, shortening deadlines for app delivery and easier application programming interfaces (APIs) will increase the likelihood of human error during testing, observed security experts, who recommend a balanced workflow that includes both manual and automated testing.
Ronnie Ng, senior manager of systems engineering at Symantec Singapore, noted that as more mobile platform providers introduce easy-to-use APIs to entice developers to sign up, such efforts have also made it easier and less time-consuming for hackers to write malicious code that attacks mobile apps.
In addition, enterprise developers face a shortening app delivery timeframe, which may result in flaws creeping into the app coding and testing process, Ng said in an e-mail interview.
Paul Oliveria, technical marketing researcher at Trend Micro, elaborated on the testing environments for two popular smartphone platforms--Apple's iOS and Google's Android operating system.
He told ZDNet Asia that Apple has outlawed most third-party APIs, thereby making its app review process more stringent. And since the documented APIs provided by Apple are, in most cases, secure, the majority of apps published on its App Store are usually safe for use, Oliveria said.
Additionally, apps running on iOS do not "touch" any internal part of the mobile device's OS and operate solely within their own run-time environments, which makes iOS apps more secure, he noted.
This is not the case for the Android app ecosystem, though, he said. He pointed out that developers who pay US$25 to join the Android developer program will be able to submit any application instantly.
Oliveria said: "There is no review process from Google for the submitted app before it gets published.
"Even though Android has a sandbox-like environment [like the iOS], the openness of the OS and app review process creates a very insecure situation whereby the app may become a source of threats, or hackers may breach the app more easily," he said.
He added that Trend Micro in 2010 had discovered four malware-like apps that were submitted to the Android Market. Google also removed 50 suspicious-looking apps from its app store last December after determining they had used the names of various banks without prior permission.
According to Raja Neravati, senior vice president of software testing company AppLabs, such risks underscore the need for app store operators to take responsibility for apps once they are published, and to ensure those apps are safe for use.
Neravati noted that while an app is built by the developer, a security breach found in an app is a "collective responsibility" of stakeholders such as the network provider, phone maker and mobile app provider.
Furthermore, once a service provider has verified and certified an app for publication, it is the service provider's responsibility to find a resolution if loopholes are found and exploited, he said.
Follow best practices for app testing
Ng noted that while there are unique challenges today when it comes to mobile app testing, developers will have to consider a strategy that balances tradeoffs between cost, quality and time-to-market.
Due to the "high margin of human error", he said developers should rely on automated testing throughout the initial stages of coding, and run stringent security screenings regularly to detect any potential vulnerabilities in the app. That said, manual testing should not be abandoned altogether, but should be used at the end of the coding process as an operational test, he advised.
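The automated screening Ng describes typically starts with unit tests that exercise an app's input handling on every build. A minimal sketch of that idea, using Python's standard `unittest` module (the function `parse_quantity` is a hypothetical example of app logic under test, not from the article):

```python
import unittest

# Hypothetical app function under test: parses a user-supplied
# quantity field and rejects anything outside a sane range.
def parse_quantity(raw):
    value = int(raw)  # raises ValueError on non-numeric input
    if not 1 <= value <= 100:
        raise ValueError("quantity out of range")
    return value

class ParseQuantityTest(unittest.TestCase):
    def test_valid_input(self):
        self.assertEqual(parse_quantity("42"), 42)

    def test_rejects_non_numeric(self):
        # Malformed or malicious input must fail loudly, not silently
        with self.assertRaises(ValueError):
            parse_quantity("42; DROP TABLE users")

    def test_rejects_out_of_range(self):
        with self.assertRaises(ValueError):
            parse_quantity("0")

if __name__ == "__main__":
    unittest.main()
```

Wiring such tests into a build pipeline means every code change is screened without manual effort, which is the trade-off Ng highlights between cost, quality and time-to-market.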
Ng also called on organizations that have developed, or are planning to develop, mobile apps to plan their testing strategy across both manual and automated approaches, and to consider outsourcing to dedicated software testing companies where necessary.
"Outsourcing to [third-party security] vendors that operate an independent testing practice may be a viable option to manage the expertise, scalability, security and quality assurance requirements for mobile apps," he said.
Oliveria also encouraged developers to follow industry-standard secure coding practices from the start of the development process. These include, for example, ensuring that the code is not vulnerable to buffer overflows or format string attacks, as well as rigorously testing all inputs to the application, he stated.
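One common way to apply the input-testing practice Oliveria describes is whitelist validation: accept only input matching a strict pattern, rather than trying to blacklist dangerous characters. A minimal sketch in Python (the validator and `greeting` helper are hypothetical illustrations, not from the article):

```python
import re

# Hypothetical whitelist: usernames may only contain letters, digits
# and underscores, 3 to 20 characters long.
USERNAME_RE = re.compile(r"[A-Za-z0-9_]{3,20}")

def is_valid_username(raw):
    # fullmatch ensures the whole string matches, with no trailing junk
    return USERNAME_RE.fullmatch(raw) is not None

def greeting(username):
    if not is_valid_username(username):
        raise ValueError("invalid username")
    # Use an explicit placeholder rather than splicing untrusted input
    # into a format string, the same mistake behind C format string attacks.
    return "Hello, {}!".format(username)
```

Validating at the boundary like this keeps malformed input from ever reaching the parts of the code where a buffer overflow or injection flaw could be triggered.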
Furthermore, Neravati added that simulating the app on the network, and attempting to attack it at the points of data transmission and data exposure, would also help surface any inherent programming flaws.
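A basic first check in the kind of network-level testing Neravati describes is confirming that every endpoint the app talks to is served over TLS, since plain-HTTP traffic exposes data in transit to interception. A minimal sketch (the function and URLs are hypothetical examples, not from the article):

```python
from urllib.parse import urlparse

# Hypothetical audit helper: given the list of URLs an app was observed
# contacting during a test run, return those not protected by TLS.
def insecure_endpoints(urls):
    return [u for u in urls if urlparse(u).scheme != "https"]

observed = [
    "https://api.example.com/login",
    "http://cdn.example.com/config.json",  # flagged: plain HTTP
]
print(insecure_endpoints(observed))
```

In practice this would feed off a proxy or packet capture of the app's traffic; any flagged endpoint is a candidate for the data-transmission flaws Neravati warns about.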