Here comes the decade of the developer

Summary: IT professionals were the heroes of recent decades, when they helped enable big productivity gains. The next decade will have a new set of heroes: developers. Learn why.

The major productivity gains of the late 1990s and early 2000s were powered by the spread of information technology through organizations of all sizes, and that spread made IT professionals invaluable. However, the technology world remains in the midst of a relentless transformation, and the changes sweeping the industry over the next decade will make developers, not IT pros, the new superstars.

We shouldn't forget that the innovations we now take for granted have streamlined business communications and commerce in a big way. I'm talking about PCs on every desk, computer networks for file and printer sharing, email, shared calendars, and ultimately the Web. These innovations gave a turbo boost to existing companies and fueled the rise of nimble new companies that were able to run circles around the incumbents in many industries.

These 1990s innovations made nearly all organizations deeply dependent on new technologies, and those organizations hired IT professionals in droves to troubleshoot problems, train employees on the new digital tools, and "keep the world running," as it were.

But that revolution is over. The technologies are deployed. The users are trained. The data centers are built. In recent years, no new IT innovation has arisen that can transform businesses the way corporate networks, email, groupware, or the Web did a decade ago. Nearly everything IT does now amounts to tweaks and incremental upgrades to existing technologies.

Worse, the new tech trends that are arising often adapt and iterate faster than traditional IT departments can handle. Take Web applications and smartphones, for example. Product and upgrade life cycles are so short that IT's time-tested procedures (test, harden, and deploy) leave IT rolling out technology that is already outdated by the time it gets rubber-stamped for employee use.

That's why many employees have started using their own laptops, smartphones, and Web apps to get work done -- sometimes clandestinely -- and it's why many companies have reduced the size of their IT departments. Some, such as author Nicholas Carr, have even predicted the demise of IT altogether.

TechRepublic recently asked its CIO Jury to weigh in on the future of the IT department and the verdict was mixed, with half thinking IT will continue to shrink and the other half thinking all boats will rise as technology becomes even more embedded in modern organizations.

I tend to side with the shrinkers, with the exception of a few industries such as health care that have traditionally lagged in IT innovation and are now quickly catching up (a phenomenon borne out in the comments from the CIO Jury poll). But, that won't last forever.

Why? Many computer products have become so cheap that it's often easier to replace than to repair. Plus, IT services have become highly commoditized. There are plenty of technicians and administrators to fill up the labor pool (unlike the IT labor shortage a decade ago), and there are plenty of consultants who can fill in the gaps when needed. The really good IT professionals will still cost you a pretty penny, but they're worth it because they can make your organization more efficient or innovative, or both.

The other factor is expectations. Now that workers have been using this stuff for over a decade, they just expect it to work. There's less tolerance than ever for downtime or buggy systems, especially when companies have outsourced IT to service providers who promise best-of-breed experts on the clock 24/7/365. But even when companies have an expensive in-house IT department that was hired to make everything run like clockwork, they don't tolerate interruptions.

IT is now a utility. And increasingly, it's just a utility that gets you to your apps.

Applications have always been king -- to an extent. It was apps that decided the winner of the PC wars of the 1980s, empowering IBM and then Microsoft to victories over Apple.

But today's app environment is different. It's faster. It's more incremental. It's multi-platform. It's more device-agnostic. And it's shifting more of the power in the tech industry away from those who deploy and support apps to those who build them. Oh, and did I mention that it's easier to get started, so there's also a lot more competition?

This new app model began in smartphones, with the App Store for Apple's mobile OS, but it has spread to other mobile platforms such as Android and more recently to netbooks and tablets as well. And it's only a matter of time before it spreads to desktop, Web, and enterprise apps, which will each get their own nuanced versions of app stores.

It will be especially interesting to see how well really big software packages can adapt and get more modular, and possibly embrace open standards so that data can flow more easily between best-of-breed apps and modules. But, however it plays out, the momentum behind this app model is too great for it not to affect traditional software packages and software implementation methods.

It's a simple equation. When there's less tech support, there's a lot more emphasis on apps that just work. And the new app model forces developers to make apps that can deliver immediate value to users, otherwise they'll get passed over for the next app.

This breeds a survival-of-the-fittest environment for developers, but make no mistake, there's never been a better time to be a developer.

In this environment, developers have tremendous opportunities for independence and creativity. Individual developers and small teams of developers (sometimes in concert with designers and project managers) can now build mini empires for themselves, thanks to the micropayment systems that allow one developer with a PayPal account to have virtually all of the infrastructure needed to start a consulting business.

Industrious developers can even work for a big company or an app development team as a day job and then moonlight as independent developers with a few apps of their own that can potentially generate residual income. Meanwhile, experienced app developers can freelance for multiple clients and build a small consultancy by helping businesses of all sizes get into the app game.

For these developers, location matters less than ever, too. A developer with email, a Skype account, and a half-decent Web portfolio can typically find pretty good work, even from clients in remote locations. Plus, the tools for team collaboration over the Web are getting better all the time and will continue to break down geographic barriers.

The demand for developers is increasing because everyone wants an app now -- from Target to Allstate to Joe's Garage down the street. They started with mobile but we should expect this to spread to tablets, desktop widgets, and eventually TVs (once platforms like Android and iOS get embedded in the living room experience). Having a multi-platform app strategy will become standard procedure for new companies the way having a website is today.

The sweeping changes in the tech industry will continue to have unpredictable consequences and will produce new sets of winners and losers. Traditional IT roles are not going to go away, but they will be under increased pressure and are likely to become more concentrated in service providers. At the same time, developers are about to step into the spotlight. This is going to be their decade.

This article was originally published on TechRepublic.

Topics: CXO, Software Development

Talkback

32 comments
  • Very insightful. I think that all of this means less and less written for Win32, and more written for mobile devices and HTML5.
    DonnieBoy
    • RE: Here comes the decade of the developer

      @DonnieBoy
      Don't count on it.
      As smartphones become more and more powerful, there is going to be an increasing demand for seamless transition functionality between computers (desktops/notebooks) and these mobile devices.
      Result: high demand in both areas....
      rhonin
      • agreed

        @zenwalker

        Agreed. I don't think people will ever only want one form factor. I think in the long run, we'll want several form factors to suit our situation - small for when we're walking, mid-size when at a desk, large for in the living room, etc.

        Are big screen TVs going away just because you can watch videos on an iPod?

        No, of course not.

        So why in the world is anybody trying to apply that kind of logic to PCs? It makes no sense whatsoever.
        CobraA1
      • RE: Here comes the decade of the developer

        @zenwalker
        Maybe you and CobraA1 should understand the terms HTML5 and Win32
        hawks5999
      • RE: Here comes the decade of the developer

        "Maybe you and CobraA1 should understand the terms HTML5 and Win32 "

        I assumed by "win32" he is talking about Windows in general, even though technically Windows is moving to 64 bit.

        And yes, I know what both terms mean.
        CobraA1
      • Well, yes, they are moving to 64 bits, but it is basically Win32.

        The point is, in the future there will be very few native Windows applications. They will be written to HTML5. You access your data and applications from wherever you are.
        DonnieBoy
      • RE: Here comes the decade of the developer

        "The point is, in the future there will be very few native Windows applications."

        I disagree. Just because some web apps are popping up doesn't mean development for Windows is gonna cease.

        "They will be written to HTML5."

        Some will, some won't.

        And even web apps aren't pure HTML 5 FYI - they also contain JavaScript, CSS, and sometimes even other languages. HTML is just the markup that lays it all out and provides scripts with access to its structure. In the case of HTML 5, it also allows offline access via a manifest that describes which files should be saved permanently.
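        To make the manifest idea concrete, here is a minimal sketch of an HTML5 cache manifest as specified in the HTML5 drafts of the time (the file names are hypothetical; a page would opt in with something like `<html manifest="offline.appcache">`):

```text
CACHE MANIFEST
# v1 -- resources listed here are downloaded and saved for offline use
index.html
style.css
app.js

NETWORK:
# everything not listed above requires a network connection
*
```

        Bumping the version comment is what signals the browser to re-fetch the cached files.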

        Will this improve online apps? Sure, it goes a long way to doing so.

        Will this displace native apps? Not really. All the native apps that are gonna be displaced have largely already been displaced, and those that haven't probably aren't. Offline access is important, but isn't really holding back a lot of apps.
        CobraA1
      • "Seamless access" IS "platform agnosticism"

        @zenwalker and @CobraA1: Leaving the Web and Web apps entirely aside for a moment, the trend Jason's commenting on here is tailor-made for high-performance, multiplatform development languages and tools, which then produce easily-deployed, platform-agnostic apps. It's not that Win32 (or Win64 or WinNAN) is Paleolithic in comparison, it's that tools better suited for rapid development and deployment (Python and a host of others) are sucking all the oxygen out of the room and using it to give life to systems that would have been impractical as traditional waterfall-cycle apps.

        At the rate things are going I expect that, within a decade, only a tiny fraction of the software that the average consumer (or business user) uses will be recognizable as "Windows." Or "OS X" or "Linux" or "JoeBob's OS." What we're starting to see is disintermediation of users' access to their data.

        The OS being used to run the device that lets users work with information is becoming trivial; even the applications themselves (thanks to open data and UI standards) tend to fade into the background. Pick up any MP3 player and you know how to use it; whether your word processor of choice is Word or Pages or OpenOffice, sit down in front of either of the others and you'll be able to get at least basic editing and formatting done with far less hassle/confusion than a decade ago.

        One of the demos I sometimes run when I'm doing a presentation to a new client reinforces this: I ask them to take one of their experienced office staff (who more than likely is most experienced with Word on Windows) and ask her or him to sit down in front of something different, type a two-page memo, make review corrections, and email it. The time to accomplish this -- in an application, an OS and/or a hardware device different than "the usual" -- is routinely only slightly longer than it would have taken in the current office standard for that individual who is inexperienced in at least one of the three major aspects of the task. This is a powerful argument against the obsolescent perception that standardizing on a single way of doing "everything" is a business necessity of greater value than the risk it enables.

        People don't want "only one form factor." They don't -- and shouldn't -- care about form factor except as a convenience. What's really important is the data, and what they can do with it.
        Jeff Dickey
      • RE: Here comes the decade of the developer

        "the trend Jason's commenting on here is tailor-made for high-performance, multiplatform development languages and tools,"

        It's a trend where? And what exactly do you mean by "tailor-made?"

        "(Python and a host of others)"

        Python does not qualify as "high-performance," sorry.

        All you're really doing is throwing around buzzwords about something you don't really seem to understand.

        "are sucking all the oxygen out of the room and using it to give life to systems that would have been impractical as traditional waterfall-cycle apps."

        A "waterfall-cycle" refers to a generalized approach to developing software that is independent of programming language or platform. To connect it to particular development languages and tools is rather silly. You can certainly use newer approaches with older languages.

        "At the rate things are going I expect that, within a decade, only a tiny fraction of the software that the average consumer (or business user) uses will be recognizable as 'Windows.' Or 'OS X' or 'Linux' or 'JoeBob's OS.'"

        Perhaps, perhaps not. The OS will certainly change, but as far as becoming unrecognizable, I have serious doubts. Especially when it comes to Windows and Mac platforms, being recognizable is a key part of their business model. Microsoft and Apple certainly want you to know when you are using their products.

        Even if the OS changes drastically, they're still gonna be concerned about branding.

        And as far as changing drastically goes - just look at what happened with Office 2007. It was the biggest leap in UI design for word processors in a very, very long time - yet it sorta divided people, with some liking it, others not.

        "I ask them to take one of their experienced office staff (who more than likely is most experienced with Word on Windows) and ask her or him to sit down in front of something different, type a two-page memo, make review corrections, and email it. The time to accomplish this -- in an application, an OS and/or a hardware device different than 'the usual' -- is routinely only slightly longer than it would have taken in the current office standard for that individual who is inexperienced in at least one of the three major aspects of the task."

        Let's see: Experienced in an alien environment vs inexperienced?

        Actually a surprising result, considering an experienced person should already know about the task, even in an alien environment, and would pick up cues about how to accomplish the task faster than the inexperienced individual. I would have expected the experienced person to take a shorter time.

        "This is a powerful argument against the obsolescent perception that standardizing on a single way of doing 'everything' is a business necessity of greater value than the risk it enables."

        No, it's not a powerful argument at all. A more proper experiment would not be changing two variables at once: The current experiment has the two variables of experience and environment. By changing both variables at once, you have invalidated the experiment and possibly skewed the results in your favor.

        All I really see here is you have no idea what you're talking about. You're using your vocabulary more like a marketing department rather than a skilled worker. It's not very convincing at all.
        CobraA1
    • RE: Here comes the decade of the developer

      Yes, I like it. It is very good!
      lincc275
    • RE: Here comes the decade of the developer

      @DonnieBoy despite your fantasy for years, I don't see it, the demand will be there for both areas.
      ItsTheBottomLine
    • RE: Here comes the decade of the developer

      @DonnieBoy
      The little fun apps you have on your phone do not equate to the enterprise apps that run the world. Windows apps will be around for a long time, just as COBOL has been around for a long time.
      rengek
    • It appears you guys do not understand what HTML5 is.

      NT.
      DonnieBoy
      • RE: Here comes the decade of the developer

        @DonnieBoy It's a markup language. An annoying one to write, filled with angle brackets, which need the shift key to type.
        CobraA1
      • Yep, I got the feeling you had no idea. I was right.

        NT
        DonnieBoy
      • RE: Here comes the decade of the developer

        @DonnieBoy LOL.
        CobraA1
  • RE: Here comes the decade of the developer

    "But today's app environment is different. It's faster. It's more incremental. It's multi-platform. It's more device-agnostic."

    We wish.

    Apple's native apps... only work with Apple devices. Same with most other platforms.

    Web apps... are still slower than native apps, and performance is very dependent on the network you're on.

    Very few apps try to leverage the advantages of both sides of the equation.

    "Oh, and did I mention that it's easier to get started, so there's also a lot more competition?"

    A lot more competition? Yup.

    Easier to get started? Not a chance. It's actually more difficult than ever. Instead of one or two languages, you need to know a bunch. HTML/XML/CSS/JavaScript/DOM/some server side language/possibly things like SVG or MathML depending on the website, etc.

    And you need to know the exact specs for the language - and you need to know how browsers don't follow those specs.

    And of course you need to find people to serve your products, whether it be an app store or a web server.

    Development these days is actually far more difficult than ever before; of that there is really no doubt. You seriously need to know so much these days.

    "to have virtually all of the infrastructure needed to start a consulting business."

    Anybody who wants to start a consulting business needs to be shot. They have done us no favors.

    I value people who actually work on code far more than people who go around charging you to tell you what your 12 year old could've told you.

    Frankly, you're better off actually getting a real education at a local college than messing around with consulting businesses. If you don't know how to do your job - maybe it's time you learned.
    CobraA1
    • RE: Here comes the decade of the developer

      @CobraA1
      lol I was going to quote the same thing about applications being faster and device agnostic.

      In fact, in the last two years, I believe development has gone backwards. If you write an app these days, you need to write it for Android, iOS, webOS, and Windows Phone 7, plus the equivalent web app.

      We've gone away from the write-once-run-anywhere mantra, right back to the late 80s, when we had to write multiple versions of software for different platforms. It used to be that publishers would write for Windows, OS/2, Atari, Apple, Commodore, etc. Luckily that nonsense ended, we as developers and the world matured, and we developed web applications. All was good.

      Then stupid cell phone apps marketed their way into a public who really don't know nor care (and they shouldn't) about the difference between a web app and a cell phone app. So now cell phone apps are a nice marketing tool, and we end up having tons of apps on all kinds of platforms.

      That in itself is a development nightmare. It's bad enough having to support multiple browsers for a web app. Now we have to deal with cell phones with their own standards. I am sure that will change as well. It will be interesting to see how, but realistically, it cannot continue on its current path. That's just a disaster.
      rengek
    • RE: Here comes the decade of the developer

      @CobraA1,

      There are plenty of consultants who work as coders. There are good consulting companies and bad consulting companies. There are successful projects that have been executed by consultants, and there are failed ones.
      bmonsterman
  • RE: Here comes the decade of the developer

    I sure hope so. I'm a developer :)
    bmonsterman