Thanks to the dozens of you who sent me your views on Gartner's 10 predictions on the future of IT. While I can't reflect all of your opinions, I picked out a few predictions from the list to explore further with your comments. First up is the notion that network capacity will increase faster than computing component capacity, resulting in a shift in the relative cost of remote versus local computing by the end of this decade.
The majority of readers were skeptical that the last-mile bandwidth problem would be resolved easily or that users would be satisfied with thin clients, especially given the prodigious specs Gartner proposed for a 2008-vintage desktop PC in prediction #6.
Regarding a move toward a remote, centralized compute services model, some readers pointed out that the growth of broadband services has major hurdles from both corporate and consumer perspectives.
Some readers said the FCC should deregulate the telecoms, hoping that increased competition will lead to faster adoption of broadband. While most corporations aren't bandwidth constrained, their customers and partners are suffering from bandwidth deprivation for a variety of reasons. If e-commerce, e-entertainment, e-government and all the other e-services are to gain more rapid momentum, bandwidth (which is not in short supply) needs to be more affordable, more reliable and invested with compelling applications.
As a side note, Andrew Odlyzko of the University of Minnesota Digital Technology Center has some interesting proposals to stimulate broadband growth. Rather than deregulating the telecom monoliths, he argues, the industry should focus on delivering services that require broadband and that customers actually want.
He proposes that the telecom industry buy off the music studios, whose revenues (perhaps $15 billion) are a pittance compared to telecom spending, and make file swapping legal. Odlyzko admits his idea is impractical, but it highlights the perennial chicken-and-egg problem.
His more practical suggestions are to migrate voice calls to cellular phones (a shift already under way), which would give the telecom companies a good reason to promote wired broadband services, and to encourage more use of WiFi (802.11) and ultrawideband wireless technologies.
What is clear is that the demand for broadband is directly related to the cost and benefits derived from the technology, no matter whether the compute resources are local or remote. We all recognize the useful applications, ranging from distance learning to telemedicine to movies on demand, but it will take at least the remainder of this decade for the real shift to an economic model and ecosystem that leverages the bandwidth.
Is there a chance that the pipe owners and content providers would seed the market with compelling, lower-cost services that hook people the way cell phones took root in the last few years, spurred by improved technology, a healthy economy (currently lacking), global reach and increasingly aggressive pricing? Not likely. E-mail, eBay and instant messaging are still the killer apps, not e-commerce.
We still require too much computer literacy to compensate for often unfriendly and vulnerable computing systems. The user experience has greatly improved over the last 20 years, but the costs of training and maintenance still outweigh the costs of connectivity and devices. An entire ecosystem, ranging from electronic payment systems to self-healing software, needs to be in place before we can claim to have tamed the digital frontier.
Remote versus local resources
Now let's get to the issue of remote versus local resources as the model for future computing. Like most predictions, the real answer lies somewhere between the two extremes.
Reader Joseph F. Hull wrote that adequate network and application server security, as well as legal means for dealing with unmet quality-of-service claims, will condition corporate acceptance of the more remote computing model. Most people are not going to trust all of their data or their entire business to a cluster in the cloud without more assurances that adequate failsafe systems and security measures are in place. And both corporations and individual users will want to know specifically the total cost of connectivity (TCC) to measure the value of the service compared to other usage scenarios.
Peter William Lount doesn't buy the concept of all compute power being managed in a centralized network. "When high speed communications (gigabit Ethernet and faster) become common between geographically separated computers and bandwidth price rates drop to reasonable levels, there is less of a need for centralized grid computers. High-speed communications will primarily encourage 'distributed grid systems' not centralized ones."
Lount also points out the challenge in creating software for centralized or distributed grids that can manage the complexity and ensure a secure computing environment. Grid-enabling software is progressing, but most of the efforts for distributed grid systems are still in the research phase.
The majority of readers were not about to give up local control of their data or return to the days of dumb terminals. Storage, memory and processing will become less expensive and more mobile. Locally-based computing components may not be as cost-effective as a centralized network model, but we will see a wide variety of thin and thick devices, all connected to the network.
Blogger and pundit Mitch Ratcliffe takes a hybrid view in which you have local storage and processing, but they're not part of the physical device you use for input and output. He believes local storage will live in the walls rather than in conventional devices. "Terabytes of disk storage and gigabytes of random access memory, along with multiple processors, [will be] linked to devices throughout the home or office via wireless to conduct real-time, grid-computing-like functions. This will allow 'dumb' (even though they'll be a lot smarter than today) clients to perform more complex functions," Ratcliffe wrote.
For prediction #6, positing that Moore's Law continues unabated throughout the decade, most readers strongly agreed. Several readers pointed out that Moore's Law was misstated. Gordon Moore, the Intel co-founder who coined the law, observed that transistor density, rather than processor power, doubles roughly every 18 months. Packing more transistors onto a production die yields lower-priced, faster chips.
Gartner projects a typical desktop computer in six years will have four to eight 40GHz processors, 4 to 12 gigabytes of RAM, and 1.5 terabytes of storage. Again, most readers can see a natural progression of capability growth in semiconductors and storage, although a few questioned how you would keep a compact unit with eight processors from overheating or sounding like an air conditioner. One reader jokingly questioned whether Windows would require a 40GHz processor and 8 gigabytes of RAM in 2008. I'll leave it to you to decide, but you can be assured that some applications will use every available cycle.
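As a rough sanity check, Gartner's 2008 numbers line up neatly with a straight 18-month doubling cadence. The 2002 baseline specs in the sketch below are my own illustrative assumptions, not figures from Gartner:

```python
# Sketch: check Gartner's 2008 desktop projection against an
# 18-month (1.5-year) doubling cadence. The 2002 baseline figures
# are illustrative assumptions, not numbers from the article.
years = 2008 - 2002                        # projection horizon
doubling_period = 1.5                      # years per doubling
factor = 2 ** (years / doubling_period)    # 2^4 = 16x growth

baseline = {"clock_ghz": 2.5, "ram_gb": 0.5, "disk_gb": 80}
projected = {spec: value * factor for spec, value in baseline.items()}

print(f"growth factor: {factor:.0f}x")            # 16x
print(f"projected: {projected['clock_ghz']:.0f} GHz, "
      f"{projected['ram_gb']:.0f} GB RAM, "
      f"{projected['disk_gb'] / 1000:.2f} TB disk")
```

Four doublings in six years yields a 16x multiplier, which turns the assumed 2.5GHz, 512MB, 80GB machine of 2002 into a 40GHz, 8GB, 1.28TB machine, squarely inside Gartner's projected ranges.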
Reader Jim Irving was skeptical about Gartner's vintage 2008 desktop computer. "Typical users generally already have more power than they or the software knows what to do with. If software vendors can think of compelling things that require lots more power, some users may adopt them." Another reader, Jane Axtell, agreed, saying, "Raw power is not everything; customer utility matters too."
David Youkers argues that a law of diminishing returns will limit the practical applicability of Moore's Law. "We are beginning to see return on investment becoming the dominant consideration rather than new technology. Shortly, and for the foreseeable future, integration will be the issue. No doubt, speed and capacity will be factors as IS moves from a niche solution to a pervasive utility (grid computing). In that 'utility' environment there will be the need for a 10,000 (pick a number) fold increase in processing / storage / distribution capacity, but we cannot imagine today how that capacity will be achieved. I doubt that it will be with 'old' technology concepts such as individual processors."
David Griffith compares Gartner's predictions to those in the previous decade of Internet frenzy claiming that e-business would establish a new economy. "All of these things will happen, but not nearly to the extent that they forecast. [Similar to e-business] they don't work as well as they should and the cost advantages are much smaller than they appear. The bulk of the economy still operates on brick and mortar businesses that will be increasingly leery of technology mega leaps, preferring to stick with marginal improvements and utilization of existing equipment."
Griffith's comment reflects much of the current thinking about technology investments. Eight years from now, the current severe economic downturn may be a distant memory, but not too distant. IT executives are building up a high level of intolerance for technology projects that fail to deliver a reasonable, rapid return on investment.
As reader Ed Baumgarten writes: "Whether the normally glacial pace of change in business processes will allow the pace of change needed to meet the timeframes suggested is the big question. There may be too many dinosaurs in charge of too many businesses to make so many momentous cultural changes…These will not be easy changes to make in even the best intentioned business."
Next up: Your comments on predictions about interenterprise applications, business activity monitoring, business units leading IT decisions, and centralized versus decentralized focus. Keep the comments coming. Visit our TalkBack forum or e-mail me at email@example.com.