Grady Booch has a career in software stretching back over 30 years.
The IBM Research software engineering chief scientist even has a development methodology — the Booch Method — named after him, and he was a co-creator of the Unified Modeling Language (UML) while at Rational Software. Who better, then, to turn to for a perspective on changes in the software landscape?
With the advent of multicore processors, an influx of new tools and the looming issue of how best to deal with big data, software development is not the beast it once was. Alongside those changes are the growth of the internet and cloud services. ZDNet UK asked Booch for his take on where software's been and where it's going.
Q: You're known by some as the software developers' software developer. How does the landscape of the web and development compare with four years ago when ZDNet UK last caught up with you?
A: If you look at the nature of software development, it has certainly exploded as the result of two things. One is the growth of web-centric platforms and the second is the growth of the internet of things.
By web-centric software development I mean this: in the early days of the web, when everyone was exploring what they could do, we were building on the naked web itself, which didn't have a lot of functionality. What has happened over the subsequent years — the past decade or so — is the growth of middleware for the web itself. We have the emergence of the semantic web, a rich set of Ajax-like things one can do, and domain-specific platforms that have a web-centric nature to them.
There are outwardly social-facing platforms such as Facebook and Google. In both cases there are a variety of fascinating APIs around which people can build things, thus you see the various apps that go into the Facebook world and the explosion of things around Google Maps. That is evidence of growing levels of abstraction in the middleware that is built upon the web.
That's one phenomenon that has happened, and it has reached a richness of capabilities. There's also stability in that market now. Individual operating systems are relatively immaterial these days; it's building on that infrastructure for a specific domain that has driven lots and lots of software development.
The other factor that has changed software development is the explosion of the internet of things, and I include in that the explosion of mobile devices — smartphones, which in my opinion are hardly phones, they happen to be mobile computers that have the ability to do phone calls — and with the movement toward cars becoming, in effect, IP nodes.
That's the publicly visible piece of the internet of things. So we have all of these devices that are intelligent. But it's not just those things. It's the invisible things, such as thermostats in buildings, power meters, the moving parts of various buildings, cars, warehouses.
All these automated things mean we have tens of millions, if not billions, of devices that are now connected to the web, and there's a lot of software to be built around them. Right now the work that's happening is in enabling those devices. There's a lot of activity there.
Another is big data. We're talking about the generation of petabytes of data, potentially. The challenge is sifting through it to extract meaning when much of it is noise. But there are elements of gold within it, and people are panning for those. There's going to be a lot of development there. That's why [IBM's] Watson is so interesting, because it represents a very different take on big data.
Are there specific issues facing programmers today in tackling these tasks?
The downside is that methodologically, tool-wise and language-wise, it's still going to be awfully hard to develop software. It's not going to get any easier. I see nothing on the horizon that's going to give a material change in the way we develop software.
Is there anything that could create that material change?
I'm not sure that anything can give us a real step change in developing software, and frankly it's an intellectually labour-intensive activity. Historically, the only way we have been able to tackle the growing complexity of software development is by raising levels of abstraction, and we see that in our middleware, in our languages and in our processes.
While there are some fascinating languages being proposed, the problem is that turning those niche ideas into mainstream ones is very, very hard, because it's not just the best technical idea that wins.
So, if you were just getting started in the industry now, what would concern you most?
It's hard to generalise, but there are a handful of things that would be keeping me awake. One of those is the problem of coding for multicore processors. In many domains the presence of multicore is going to be invisible to the developer, as it should be, because it sits a layer of abstraction below them.
What Apple has done with Grand Central Dispatch, what Intel is doing with its pattern languages, and what IBM is trying to do with the X10 programming language are all examples of masking the inherent parallelism from the average developer, and that's as it should be.
There are some domains for which the developer must get deep into knowing that there are multiple cores and exploiting them but that's a relatively small segment of the marketplace. But for that segment it's a really hard problem because we don't know how to solve it. Dealing with intimately concurrent systems is wickedly hard.
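The idea of masking parallelism that Booch credits to Grand Central Dispatch and X10 can be sketched in a few lines. This is an illustration, not the actual API of either: the hypothetical `parallelMap` helper below gives the caller an ordinary map function, while the library privately decides how to fan the work out across cores.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

// parallelMap applies f to every element of in, spreading the work
// across all available cores. The caller never touches goroutines,
// channels or core counts: the concurrency is hidden behind the API.
func parallelMap(in []int, f func(int) int) []int {
	out := make([]int, len(in))
	workers := runtime.NumCPU()
	chunk := (len(in) + workers - 1) / workers // ceiling division
	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		lo := w * chunk
		hi := lo + chunk
		if hi > len(in) {
			hi = len(in)
		}
		if lo >= hi {
			continue // more workers than elements
		}
		wg.Add(1)
		go func(lo, hi int) {
			defer wg.Done()
			for i := lo; i < hi; i++ {
				out[i] = f(in[i]) // each goroutine owns a disjoint slice range
			}
		}(lo, hi)
	}
	wg.Wait()
	return out
}

func main() {
	squares := parallelMap([]int{1, 2, 3, 4}, func(x int) int { return x * x })
	fmt.Println(squares) // [1 4 9 16]
}
```

The point of the sketch is the shape of the abstraction: for most developers this interface is all they should ever see, and only the small segment Booch describes needs to open the box.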
Another issue is the cacophony, or should I say cornucopia, of choices that a developer has. What scripting language do I use? If I'm doing a web-centric thing, what framework do I use? Is it Drupal, or is it whatever, or do I just hard-code it myself? There are millions of choices. It becomes a challenge for someone to say, "I'm going to choose this particular platform", and then move on. But the problem is the world changes out from under them.
There was a Google employee — and I'm not saying this to disparage anybody — who was complaining, saying, "Look at Google's code base. Many pieces of it are 10 years old and it's positively old and cranky."
It's really fun and interesting working in a new area. Multicore and all these different platforms are new development problems, but what we've not touched on are legacy problems. Google has a legacy problem, Facebook has a legacy problem. The moment you write a line of code, it becomes legacy. So it's not just a problem of old enterprises, it's a problem even for newcomers.
So what can enterprises do to help alleviate this problem?
There are things that help. One of those is the process of continuous refactoring: keeping those legacy systems alive means you have to apply some energy to them, but for cost-saving reasons many organisations choose not to do so.
They patch and patch and patch until it falls apart due to its own sheer weight or, more likely, some leaner competitor comes along who has no legacy constraints and blows them away in the marketplace. That's the way the market works.
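Continuous refactoring, as Booch uses the term, can be caricatured in a toy example. The pricing functions below are hypothetical (prices in integer cents is an assumption of this sketch): two copy-pasted routines are folded into one shared helper, and the refactor counts as safe only because the old and new code are checked to behave identically.

```go
package main

import "fmt"

// Before: duplicated 20% tax logic, copy-pasted into every caller.
// This is the kind of code that gets "patched and patched" over time.
func legacyRetailPrice(cents int) int { return cents + cents*20/100 }
func legacyOnlinePrice(cents int) int { return cents + cents*20/100 - 100 }

// After: the shared rule lives in one place; each caller keeps only
// its own difference (the online channel's 100-cent discount).
func taxedPrice(cents int) int { return cents * 120 / 100 }

func retailPrice(cents int) int { return taxedPrice(cents) }
func onlinePrice(cents int) int { return taxedPrice(cents) - 100 }

func main() {
	// The "apply some energy" Booch mentions is largely this check:
	// demonstrating the restructured code preserves legacy behaviour.
	for _, c := range []int{0, 3, 101, 1050} {
		fmt.Println(retailPrice(c) == legacyRetailPrice(c),
			onlinePrice(c) == legacyOnlinePrice(c))
	}
}
```

In a real system the legacy versions would then be deleted; the discipline is doing this continuously rather than letting the duplicates drift apart until the system "falls apart due to its own sheer weight".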