
"Free" content, open access and ad-based business models place far greater demands on datacenters

Written by Dana Gardner, Contributor

File this one under unintended but substantial consequences.

I'm at the annual Sybase TechWave user and analyst conference this week in Las Vegas, and something really jumped out at me from the testimony by a diversified panel of Sybase users during the keynote presentation today before 1,600 attendees. From the group -- AOL, Bear Stearns, Cox Communications and Shopzilla -- came the message that their need for datacenter integrity, speed, access reach, and fail-over support is growing dramatically in a way that redefines the notion of "mission critical."

The expanded notion of must-have performance levels not only includes more applications and services under more circumstances, but demands a much higher degree of responsiveness across many more edge devices that tap more types of data and content. And this need for lightning-fast heterogeneous data retrieval and delivery is in many cases driven by enterprises that are moving to a brave new world of media-like business models: Web 2.0, SaaS, on-demand, social networks, mobile access-oriented, and ads-driven monetization.

Why do these new business models require different kinds of back-end systems? Because, increasingly, companies like AOL and Shopzilla earn their income from the ads generated by page views and from access to communications services. Same for carriers serving up content to cell phones. Same for transaction-based services. Same for applications as services.

So when a server takes 4 seconds to deliver a data-laden Web page, extracting data from a variety of sources, it ends up costing many more businesses millions of dollars in lost revenue over a period of a few months. Making the page appear in 1 second is now mission critical.
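
To make those stakes concrete, here is a rough back-of-the-envelope sketch in Python. Every figure in it -- the traffic volume, the ad revenue per page view, the abandonment rate per extra second of load time -- is a hypothetical assumption, used only to show how a few seconds of latency compound into real money:

    # Hypothetical back-of-envelope: how page latency erodes ad revenue.
    # Every number below is an assumption for illustration, not a measured figure.
    daily_page_views = 10_000_000        # assumed traffic for a large portal
    revenue_per_view = 0.01              # assumed ad revenue per page view, in dollars
    abandon_per_extra_second = 0.07      # assumed share of visitors lost per extra second

    def monthly_loss(latency_seconds, baseline_seconds=1.0):
        """Estimate monthly ad revenue lost versus a 1-second page."""
        extra = max(latency_seconds - baseline_seconds, 0)
        lost_views = daily_page_views * extra * abandon_per_extra_second
        return lost_views * revenue_per_view * 30  # roughly 30 days per month

    print(f"4-second page vs. 1-second page: ~${monthly_loss(4.0):,.0f} lost per month")
    # With these assumed numbers: 10M views/day x 3 extra seconds x 7% per second
    # = 2.1M lost views a day, about $630,000 a month -- into the millions within a few months.
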

Datacenter and application-delivery downtime is no longer just a customer satisfaction issue; it's directly tied to the bottom line. It could easily make the difference between profits and losses at an increasing variety of companies -- from retail to media to finance to nearly all online transactions.

Web-based portals and application hosts are already familiar with this need for speed, speed, speed. But now bricks-and-mortar enterprises and government agencies -- as they seek to monetize based on a media model, and increasingly deliver data, applications, and services to cell phones, PDAs, and mobile email readers -- need to serve up lots of data (from an SOA?) in the speed-is-money milieu, even though that data began its life stuck in old, back-office applications.

So there's a Web 2.0 paradox: When the services are free, the investment in back-end performance needs to be higher. And the costs and complexity are higher for non-greenfield applications, which still need to be delivered far and wide. Indeed, when you charge for subscriptions or license services directly, you can get away with so-so performance. You already billed them and they paid, so let them wait a while.

Yet as more vendors and providers go to free delivery, a la AOL and its email and portals these days, the provider must now re-engineer its back-end systems to truly live up to the real-time economy. That means data needs to be extracted and cleansed from wherever it began, perhaps in a mainframe or client/server application. There will therefore be more directories, data instances, data-cleansing stages, and fail-safe systems to support these data delivery mechanisms.
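
As a rough sketch of what that extract-and-cleanse stage can look like, here is a minimal Python pipeline. The record layout, field names, and cleansing rules are invented for illustration; they are not drawn from any particular Sybase product or customer system:

    # Hypothetical extract/cleanse/stage pipeline for legacy, fixed-width records.
    from datetime import datetime

    def extract(raw_rows):
        """Parse fixed-width records, as they might arrive from a mainframe export."""
        for row in raw_rows:
            yield {"customer_id": row[0:8].strip(),
                   "name": row[8:40].strip(),
                   "last_order": row[40:50].strip()}

    def cleanse(records):
        """Normalize fields so a downstream, customer-facing service can rely on them."""
        for rec in records:
            if not rec["customer_id"]:
                continue                      # drop unusable rows
            rec["name"] = rec["name"].title()
            try:
                rec["last_order"] = datetime.strptime(rec["last_order"], "%Y-%m-%d").date()
            except ValueError:
                rec["last_order"] = None      # flag bad dates instead of failing
            yield rec

    def stage(records):
        """Hand cleansed records to whatever delivery tier serves the front end."""
        return list(records)

    raw = ["{:<8}{:<32}{:<10}".format("00000123", "John Q Public", "2006-08-01"),
           "{:<8}{:<32}{:<10}".format("00000456", "JANE DOE", "not-a-date")]
    print(stage(cleanse(extract(raw))))

The point is less the code than the extra moving parts: every one of these stages is something the back end now has to run, monitor, and fail over in real time.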

Developers need to be able to access data that is separated and agnostically freed from its original or integrated sets of applications. And that data needs to be able to be used and reused, added and subtracted, securely, perhaps SOA-like. So, increasingly, the data about the data needs to be easily accessible -- separate from the application logic -- so as to deliver the correct kinds of data in the format needed by mobile devices.
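
A toy illustration of that separation, in Python: the field names, metadata flags, and device types below are hypothetical, meant only to show the idea of keeping the "data about the data" outside the application logic, so one record can be shaped differently for different devices:

    # Hypothetical: keep metadata about each field outside the application logic,
    # then let one generic function shape the same record for different devices.
    FIELD_METADATA = {
        "product_name": {"mobile": True,  "max_len": 20},
        "price":        {"mobile": True,  "max_len": 10},
        "long_desc":    {"mobile": False, "max_len": 500},   # skipped on small screens
    }

    def render(record, device):
        """Return only the fields (and lengths) appropriate for the requesting device."""
        out = {}
        for field, meta in FIELD_METADATA.items():
            if device == "mobile" and not meta["mobile"]:
                continue
            value = str(record.get(field, ""))
            out[field] = value[:meta["max_len"]]
        return out

    record = {"product_name": "Noise-cancelling over-ear headphones",
              "price": "79.99",
              "long_desc": "A very long marketing description..."}
    print(render(record, "mobile"))    # trimmed, mobile-safe subset of fields
    print(render(record, "desktop"))   # full field set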

What's the investment impact from the move toward more Web 2.0 and on-demand apps on the big, old back-end systems? It turns out to be hugely important, particularly in non-greenfield situations. And that means we should expect a large re-investment in, and re-architecting of, enterprise back-end infrastructure ... but in a different way than the past.

The new datacenters must not just support the previous definition of "good enough" -- that is, pokey, slow applications served to one's internal employees. The real-time jargon is now real. Application and data delivery must support the new types of activities and business models driven by outward-facing Internet-based services, especially if they are ad-driven business models -- and even more especially if they are reaching mobile devices.

End consumers, and employees too, will not wait for the information they need to make decisions; they will go elsewhere for the services and data that serve them best. Workers in the field should get data enablement at the time of the task. Always. Anywhere. Clearly, bringing enterprise apps and data to the field in real time requires a better back end than the same applications had when they stayed inside the company.

Sybase is betting big on the infrastructure ramp-up to support the real-time information economy, and its recent financial results may be pointing to the trend now well under way. Sybase also this week, in a drive to support these needs, unveiled the Sybase Data Integration Suite 1.0, which ships late this year. It combines Sybase's existing data delivery portfolio, new technology through acquisitions, and a strong modeling capability to allow the high-performance access and delivery of data and objects from various sources. Sybase also updated the SQL Anywhere 10 product.

These products and the burgeoning demand for them in the market validate the path Sybase outlined at TechWave last year, which, incidentally, marked one of my first blogs on software strategies for ZDNet. My, how time flies. If only enterprise data could act as quickly.
