* Ryan Naraine is traveling.
Guest editorial by Shyama Rose
The market for the development and implementation of source code analysis (static and dynamic) tools is swelling. Companies increasingly rely on these tools to identify security-related vulnerabilities, yet demand for sophisticated automated solutions outstrips the supply of quality tools. Because the tools themselves remain underdeveloped and immature, the risk that highly complex vulnerabilities go unidentified and unmitigated is high.
Code analysis tools should be used as guidelines or preliminary benchmarks as opposed to definitive software security solutions.
The usefulness of analysis tools for augmenting security reviews is undeniable. On large code bases they can reduce the time investment, provide insight into the code under review, and serve as a guide for human reviewers. However, a negative trend is emerging: enterprises relying solely on automated approaches to gain insight into risk. This invites a false sense of security, since the relying party is likely unaware of the deficiencies behind the security guarantees these tools promote.
The deficiencies of analysis tools are well known and documented. Current tools lack the ability to identify sophisticated bugs and lean toward flagging surface-level, common vulnerabilities. Regardless, companies believe the tools lend a good-faith sense of security to their products and customers. In their infancy and lack of sophistication, the tools fall far short of the analysis, and the contextual judgment, that a human reviewer brings. Even the most sophisticated source code analysis tools are signature based, focus on data flow while rarely addressing control flow, and fail on frameworks.
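To make the signature-based limitation concrete, here is a minimal, purely illustrative sketch (real tools are far more elaborate, but the principle is similar): a toy scanner that flags SQL built by string concatenation directly inside an execute() call catches the one-line case, yet goes silent the moment the same tainted string is assembled in a helper function, because the signature never follows the flow between lines.

```python
import re

# Toy "signature-based" check: flag execute() calls whose argument text
# contains string concatenation (+) or %-formatting on the same line.
SIGNATURE = re.compile(r"execute\([^)]*(\+|%)")

def scan(source: str) -> list[int]:
    """Return 1-based line numbers whose text matches the naive signature."""
    return [i + 1 for i, line in enumerate(source.splitlines())
            if SIGNATURE.search(line)]

# Case 1: the injection sits on a single line -- the signature fires.
direct = 'cursor.execute("SELECT * FROM users WHERE id=" + user_input)'

# Case 2: the same injection, but the tainted query is assembled in a
# helper before reaching execute(). A line-level signature sees nothing,
# because spotting it requires following data flow across functions.
indirect = '''
def build_query(user_input):
    return "SELECT * FROM users WHERE id=" + user_input

cursor.execute(build_query(request_value))
'''

print(scan(direct))    # the direct pattern is flagged
print(scan(indirect))  # the indirect flow is missed entirely
```

The indirection here is trivial; in production code the tainted value may pass through many layers or a framework dispatch, which is exactly where pattern-matching tools break down.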
The market is growing increasingly comfortable with regulatory compliance due to metrics-based management practices. The business angle is that this practice is measurable and ultimately cheaper. As a result, a lot of these analysis tools are based on compliance rule-sets such as PCI-DSS. However, regulatory compliance does not guarantee secure software. For instance, PCI-DSS focuses largely on OWASP Top 10 web vulnerabilities and not on a plethora of other vulnerabilities. "AsTech Consulting has found that an automated scan of source code using market-leading tools will find about 35% of the types of vulnerabilities that a manual analysis will discover."
Analysis tools attempt, but fall short of, effective detection of even these Top 10-type vulnerabilities.
The differential between a comprehensive security review and the analysis these tools actually perform is therefore massive, and harmful.
As dependence on tools increases, the security market will shift toward an architected solution rather than the live analysis environment that exists today. Companies will employ people to build tools, and under-qualified handlers to run them, instead of consultants who perform comprehensive, focused code analysis; both the strength of security guarantees and the market for consultants will dwindle. Unless source code analysis tools mature dramatically, the quality of analysis and the overall security of software will remain deficient.
The current consensus is that a combination of automated and manual approaches is the most effective way to ensure software security. However, the market is moving toward a tool-based security assurance model, leading to a “dumbing-down” of security professionals and uninformed business oversight.