Want to dodge a data breach? Do DevOps and let developers work from home, says Google

Developers are tracking which components are used in software, but most of them don't sign off on code changes, suggesting a long road ahead for protecting the software supply chain.
Written by Liam Tung, Contributing Writer

DevOps, which brings faster software updates, could help stem the avalanche of records exposed in data breaches, but Google's research finds that existing practices aren't yet up to the task.

Google surveyed 33,000 tech pros to explore how DevOps – which broadly means aligning software development with IT operations – impacts cybersecurity as part of its annual Accelerate State of DevOps Report. As it notes, more than 22 billion records were exposed in 2021 through 4,145 publicly known breaches.

The report comes as Australian telco Optus handles the fallout from a massive breach that exposed nearly 10 million residents' personally identifiable information (PII) after a hacker waltzed through an application programming interface (API) on a cloud-hosted endpoint that didn't require a password.

Also: The most popular programming languages and where to learn them

Google's survey focused on software supply chain security – an area that came under much closer scrutiny after the SolarWinds attack in 2020 and the open-source Log4Shell flaw this year. These two cases changed the way the tech industry manages software development processes and uses components, such as libraries and language packages, within other products and services.

DevOps aims to accelerate software releases while maintaining quality and increasingly focuses on security updates. But how much has changed since the SolarWinds breach and Log4Shell?

To gauge this, Google used Supply-chain Levels for Software Artifacts (SLSA), its take on the Software Bill of Materials (SBOM) concept, which the White House instructed US federal agencies to implement in 2021.

One of Google's key ideas is that, for major open-source projects, two developers should cryptographically sign changes made to source code. Google argues this practice could have stopped state-sponsored attackers from compromising SolarWinds' software build system by installing an implant that injected a backdoor during each new build. Google also used NIST's Secure Software Development Framework (SSDF) as a baseline in the survey.
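The two-person sign-off idea can be sketched as a gate in front of the build system: a change is admitted only if two distinct reviewers have produced valid signatures over the exact bytes of the diff. The sketch below is illustrative only – reviewer names and keys are made up, and HMAC with shared secrets stands in for the asymmetric signatures (e.g. GPG or Sigstore keys) a real system would use:

```python
import hashlib
import hmac

# Hypothetical reviewer keys. In practice these would be asymmetric
# key pairs, not shared secrets; HMAC keeps the sketch dependency-free.
REVIEWER_KEYS = {
    "alice": b"alice-secret-key",
    "bob": b"bob-secret-key",
}

def sign_change(reviewer: str, diff: bytes) -> str:
    """Reviewer signs the exact bytes of the proposed change."""
    return hmac.new(REVIEWER_KEYS[reviewer], diff, hashlib.sha256).hexdigest()

def build_is_allowed(diff: bytes, signatures: dict) -> bool:
    """Admit a change into the build only if two distinct reviewers
    produced valid signatures over this exact diff."""
    valid = {
        reviewer for reviewer, sig in signatures.items()
        if reviewer in REVIEWER_KEYS
        and hmac.compare_digest(sig, sign_change(reviewer, diff))
    }
    return len(valid) >= 2

diff = b"--- a/build.py\n+++ b/build.py\n+new_build_step()\n"
sigs = {"alice": sign_change("alice", diff)}
print(build_is_allowed(diff, sigs))  # False: one sign-off is not enough
sigs["bob"] = sign_change("bob", diff)
print(build_is_allowed(diff, sigs))  # True: two reviewers signed off
```

Because the signatures cover the diff bytes themselves, an implant inserted after review would invalidate every existing signature, which is the property the SLSA requirement is after.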

Google found that 63% of respondents used application-level security scanning as part of continuous integration/continuous delivery (CI/CD) systems for production releases. It also found that most developers were preserving code history and using build scripts.
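A security scan wired into CI/CD typically works as a gate: the pipeline checks dependencies or build output against known advisories and fails the release if anything matches. The following is a minimal sketch under assumed data – the vulnerability entries and package names are illustrative, where a real pipeline would query a scanner or an advisory feed such as OSV:

```python
# Hypothetical advisory database mapping (package, version) to an advisory.
KNOWN_VULNS = {
    ("log4j-core", "2.14.1"): "CVE-2021-44228 (Log4Shell)",
}

def scan(dependencies: dict) -> list:
    """Return advisories matching any pinned dependency version."""
    return [
        f"{name}=={version}: {advisory}"
        for (name, version), advisory in KNOWN_VULNS.items()
        if dependencies.get(name) == version
    ]

deps = {"log4j-core": "2.14.1", "requests": "2.28.1"}
findings = scan(deps)
if findings:
    # In CI, a non-empty result would fail the build before release.
    print("BLOCKED:", *findings)
```

The design choice that matters is running the scan on every production release rather than on a schedule, so a newly disclosed flaw like Log4Shell blocks the next deploy instead of shipping silently.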

Also: How we search the web is changing, so Google Search is changing too

That's a reassuring trend, although less than 50% were practicing two-person reviews of code changes and only 43% were signing metadata.

"Software supply chain security practices embodied in SLSA and SSDF already see modest adoption, but there is ample room for more," the report concludes.

Keeping staff happy can change security outcomes, too. Google found that employers who gave staff the option of hybrid working performed better and suffered less burnout.

"Findings showed that organizations with higher levels of employee flexibility have higher organizational performance compared to organizations with more rigid work arrangements. These findings provide evidence that giving employees the freedom to modify their work arrangements as needed has tangible and direct benefits for an organization," Google notes.   

Google also waded into murkier territory by asking respondents to forecast how work styles affect future incidents, predicting the likelihood that a security breach or a complete outage would occur over the next 12 months.

People working at "high-performing organizations were less likely to expect a major error to occur," Google said.
