IT computing back to the future

The cutting edge of the computer world combines the PC's graphical interface (browser) with high-performance database applications. Are we rushing towards the centralized IT of the past?
Written by Ramon Padilla, Contributor

I was on vacation last week and found myself waxing philosophic about how computing has evolved in our organizations over the years and where it is headed over the next five to ten years. I can’t remember exactly when John Gage said it, but his statement, “The network is the computer,” becomes more of a reality each day.

If you are old enough to remember the days before the PC, you might recall a computing environment centered on a single machine. Depending on the size of your organization, all of your computing applications ran on that one machine or on a handful like it. It was very powerful, centrally controlled, and, apart from what some people would now deem primitive word processing and calendaring, focused on running core business applications. Management of this type of system was pretty straightforward and generally highly regimented.

When the PC revolution came along, it was a response to a central computing environment that had little time for the needs of the rest of the organization, or that couldn’t keep up with the demands of even the core areas. People found that they had a device on which they could do many of the things they had always needed to do; they just had to get the data from the mainframe and manipulate it themselves. History has shown that this was, in fact, very empowering for most organizations, but central IT was not quick to embrace the concept.

Soon afterwards came the advent of PC networking and client-server computing, and it wasn’t long before many organizations had two competing computing infrastructures: one mainframe-centric, the other client-server. The problem with client-server was that it was “messy.” It tended to sprout up all over the organization rather than being guided by a central strategy; after all, it was part of the “personal” computer revolution, and it suffered to some degree because of the ad hoc way it was created and implemented rather than being centrally planned.

Then came the era of control. Many organizations looked around and realized what a mess they had created. Mind you, it was a productive mess, but not as productive as it could have been. Magazines were filled with articles and advertisements for products and methods for getting one's computing environment “under control.” This went on for some time, and then the Internet burst onto the scene.

When the Internet started to become more mainstream in the late 80s, computing seemed to explode in organizations, and these messy networks grew bigger and bigger and, more importantly, more complex. The Internet brought many benefits but also added security headaches: viruses, worms, malware, bots, Trojans, and more made networks ever harder to manage.

The Internet phenomenon grew in the 90s, and the world didn't end in 2000, thanks to the hard work of IT professionals who did not panic under the pressure of tremendous hype. At about this time, the browser-based application really began to hit the forefront. Forget the fat client; let’s make all development browser-centric. And, for the most part, it has been a good thing. It is clear that as the tools mature, we are headed inexorably toward a future in which the majority of applications will be Web-based (see the success of eBay, PayPal, Google, and Salesforce.com, for example) and connectivity is almost ubiquitous.

Now add the potential of grid computing, defined as “a collection of low-cost network, storage, computing and software elements, lashed together to do work that historically required very expensive dedicated proprietary technologies,” and we have an interesting situation: IT’s desire to get things back under control by reconsolidating into a centralized structure, where applications are delivered to the machines they are presented on rather than run on them, brings us right back where we started.

Depending on their maturity, I see organizations joining a movement, one that will pick up speed over the next few years, to further reduce the complexity of their IT environments by having all of their applications, including the desktop, delivered by an application service provider; by going completely thin client and becoming an ASP themselves; or by some combination of both. Other than for gaming (and I’m not so sure the same thing won't happen there), people may one day wonder exactly what a PC was for in the first place.