I attended Jenkins World in San Francisco a couple of weeks back. One thing that stood out was CloudBees' advocacy of a new market concept, software delivery management (SDM). CloudBees provides the following definition of SDM:
Software delivery management helps organizations streamline CI/CD processes and foster meaningful collaboration across all functions involved in software development and delivery. Its purpose is to increase software delivery velocity, quality, predictability, and value, which consequently results in improved customer and user satisfaction and better business results.
Seems like a reasonable set of goals. SDM, as CloudBees describes it, is similar to value stream management (VSM). However, CloudBees really caught my attention when, in (on-the-record) briefings for analysts, it compared its SDM vision to enterprise resource planning (ERP) vendor SAP.
SAP? This might be surprising to many in the DevOps community. However, it makes perfect sense to me.
I've long wondered: Most of the C-suite is well served by ERP vendors such as SAP and Oracle; why not the CIOs of the world? Why are their tools so fragmented? Why does the cobbler go barefoot? It's neither a new question nor a new branding idea. Various IT service management (ITSM) vendors have experimented with "ERP for IT" messaging but didn't get much traction. Why not? And what might be different with SDM?
I think the biggest problem for the ITSM vendors was that they started at the end of the digital value stream, in the operations and support phase, when the harder and more valuable part is upstream, in software development.
Before continuous delivery, upstream was a world of project management where (oftentimes) the build and deploy toolchains were unique to each project. Now, the industry has a solidifying vision for a continuous delivery conceptual architecture that spans projects (which are increasingly turning into steady-state products).[i] Story, commit, build, package, provision, deploy, operate -- the deepening, DevOps-driven industry consensus here is a big step forward and might well be "what's different this time around."
However, what can we do with this consolidated "digital supply chain" data? Magic does not happen just because we have the data integrated into a common repository. That is cargo cult thinking. We need clear use cases. ERP systems, and their MRP (material requirements planning) predecessors, have great clarity of purpose -- reducing inventory carrying costs by significant factors through better order scheduling, for example. SDM needs to articulate such benefits with similar precision.
Can we decrease inventory (i.e., the invisible inventory of "design in process," as Don Reinertsen has called it)?[ii] Better and more quickly identify constraints and bottlenecks, such as not having enough of the right skills at the right time? Increase speed to market? Prune product options more quickly? Increase our speed of learning? Decrease operational or change risk? Restore service more quickly? How does the availability of this new information achieve these value propositions? Can we make an economic case for it?
It seems like there ought to be value there. But "seems like there ought to be" is not a business case, and clarifying that line of sight to value, I think, is a priority. My current hypothesis is that we are going to make more headway by building simulations, rather than trying to solve the digital pipeline's problems analytically.
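To make the simulation hypothesis concrete, here is a minimal sketch of what such a model might look like: a toy Monte Carlo simulation of work items flowing through a serial delivery pipeline. The stage names, service times, and arrival rates are invented for illustration; the point is only that even a crude model surfaces the Reinertsen-style insight that cycle time grows nonlinearly as demand approaches a bottleneck's capacity -- the kind of economic question analytics alone struggle to answer for messy real pipelines.

```python
import random

random.seed(42)

# Hypothetical serial pipeline: stage -> mean service time in hours.
# "test" is the deliberate bottleneck (max throughput 0.5 items/hr).
STAGES = {"build": 1.0, "test": 2.0, "deploy": 0.5}

def simulate(arrival_rate, n_items=1000):
    """Push n_items through the pipeline (FIFO, one server per stage,
    exponential inter-arrival and service times); return mean cycle time."""
    t = 0.0                                   # simulation clock (arrivals)
    stage_free = {s: 0.0 for s in STAGES}     # when each stage next frees up
    cycle_times = []
    for _ in range(n_items):
        t += random.expovariate(arrival_rate)  # next work item arrives
        done = t
        for stage, mean_svc in STAGES.items():
            start = max(done, stage_free[stage])          # wait in queue
            done = start + random.expovariate(1.0 / mean_svc)
            stage_free[stage] = done
        cycle_times.append(done - t)           # finish minus arrival
    return sum(cycle_times) / len(cycle_times)

# Cycle time balloons as the arrival rate nears the bottleneck's 0.5/hr limit.
for rate in (0.1, 0.3, 0.45):
    print(f"arrival rate {rate}/hr -> mean cycle time {simulate(rate):.1f} hr")
```

Even this toy version shows queueing effects that are invisible in a simple sum of stage durations, which is the sort of "line of sight to value" argument a real SDM simulation would need to make with actual pipeline data.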
[i] Mik Kersten, Project to Product: How to Survive and Thrive in the Age of Digital Disruption with the Flow Framework, IT Revolution Press, 2018.
[ii] Donald G. Reinertsen, Managing the Design Factory, Free Press, 1997.
This post was written by Principal Analyst Charles Betz, and originally appeared here.