The forms-based applications model works pretty well. To build an application you lay out the form, associate data with each field, provide the FYI (single-line contextual help), and specify what actions the system takes when things happen - a user mouse click, a backspace, or the arrival of a message from outside the user session.
From a developer perspective this means that prototypes can be built with the customer looking on - and successful prototypes can go straight to production. From the user perspective it means that applications can be made to do the right things - because the developer writes the system essentially while the user writes the manual, and both can back up and start over easily and cheaply.
However, there's a much more subtle hidden benefit too: applications don't need menus. Automatic query by example, modality change (read/write), and next-form or script execution, combined with data inheritance, let you create applications in which the user's actions and (csh) environment variable values are sufficient to determine what happens next.
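To make that concrete, here's a minimal sketch of the dispatch idea - all names are hypothetical, and this stands in for what a 4GL like Accell did natively: fields inherit defaults from the user's environment, a bare click on a field means query by example, and the first edit flips the form's modality from read to write, so no menu is ever consulted.

```python
import os

class Form:
    def __init__(self, name, fields):
        self.name = name
        # Data inheritance: seed each field from a same-named environment
        # variable, if one is set in the user's (csh) environment.
        self.values = {f: os.environ.get(f.upper(), "") for f in fields}
        self.mode = "read"  # modality: read (query) vs write (update)

    def on_event(self, field, value=None):
        """Dispatch on the user's action itself, not on a menu choice."""
        if value is None:
            # A bare click on a field: run query by example over every
            # field that currently holds a non-default value.
            return ("query", {k: v for k, v in self.values.items() if v})
        self.values[field] = value  # the user typed into or over a value
        if self.mode == "write":
            return ("update", {field: value})
        self.mode = "write"  # first edit silently changes the modality
        return ("mode-change", "write")

os.environ["DEPT"] = "shipping"
form = Form("timesheet", ["dept", "employee"])
print(form.values["dept"])       # seeded from the DEPT environment variable
print(form.on_event("dept")[0])  # a bare click becomes a query, menu-free
```

The point of the sketch is only that action plus environment is enough context to decide what happens next; the real engine added next-form and script execution on top of the same dispatch.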
The combination makes applications absolutely intuitive - seeming to both follow and lead user action while silently (and often invisibly) enforcing business rules.
During the late eighties, when all this was being worked out and companies like Unify, Informix, Progress, and Oracle all had real contenders in the market, the data processing people still dominated purchasing; they were caught up in the information engineering hysteria and turned thumbs down on anything straying outside their traditional development model.
What really doomed it at the time, however, was the PC - because a lot of companies bet on the PC as a means of getting past mainframer opposition, and the PC couldn't handle it.
From both user and developer perspectives the forms-based applications model gains value relative to other approaches as application complexity increases, but does so partially at the cost of requiring larger, clearer, faster displays. This wasn't a problem for workstation or X-terminal users in development jobs: NCD's base 19C, for example, supported 1024 x 1280 on 21 inch screens. But, at the time, the Windows 3.11 PC was taking over the desktop and was not remotely up to the job.
In fact Windows 3.11 was heavily optimised to put up window frames in primary colors on 480 x 640 pixels. This made it look snappy, but did so at the cost of significantly delaying content display. As a result Windows developers were heavily "incented" to simplify forms - ultimately producing the Windows-standard single-line input form requiring the user to move the mouse to the window, click to establish focus, do something, click again, and wait.
Even today, that strategy persists. Look at any major Windows application and what you'll see is the dominance of the tiny pop-up, with major interactive screens limited to two or three functional lines - and a usage process that involves user mouse or keyboard action at every step.
This image (91K - opens in a new window), in contrast, shows the main screen for a time management system I prototyped using Accell/Vision. It was intended for use on a 21 inch high resolution (1280 x 1640 or better) screen and is simply too dense to be used effectively even on today's PCs with 19 inch or smaller screens. If you're into masochism, or really want to understand why the forms-based model never caught on with the PC crowd, just try to imagine using this on a 17 inch PC screen at 600 x 800 under Windows 9X or even 2000.
Notice that just about everything about this form is completely counter-intuitive in today's PC world - it's too complicated, too big, too messy, and there's no clear direction. Right? Actually, no: that's all dead wrong - but it's a logical consequence of what we've all been taught by how the PC does these kinds of things.
In reality, the user's environment variables define what gets filled in on application start or refresh - and any user action to change one thing on the screen changes almost every non-default value shown. Changing the time period has obvious effects; clicking on, or typing into or over, any displayed value triggers either a query or an update on the affected fields, and that changes many other things, from default actions to next forms. In practice this thing was menu-less, seamless, and actually got faster with more users (because of the way Informix loaded read buffers).
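The cascade described above can be sketched in a few lines - the field names and derivation rules here are hypothetical, standing in for what the 4GL's triggers did: one edit to a base field re-derives every dependent value, so the whole screen follows the user's single action.

```python
# Derived fields, each expressed as a function of the form's current state.
# Changing one base value recomputes all of them - including which form a
# follow-on click would open, which is how the app stays menu-less.
derived = {
    "budget":    lambda f: {"week": 40, "month": 168}[f["period"]],
    "next_form": lambda f: "day_detail" if f["period"] == "week"
                           else "week_summary",
}

def set_field(fields, name, value):
    """Update one field, then recompute everything that depends on it."""
    fields[name] = value
    fields.update({k: fn(fields) for k, fn in derived.items()})
    return fields

form = set_field({"period": "week"}, "period", "month")
print(form)  # {'period': 'month', 'budget': 168, 'next_form': 'week_summary'}
```

A real forms engine would also re-run queries against the database on each cascade; the sketch only shows why a single user action can legitimately change "almost every non-default value shown".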
What this illustrates above all, however, is a simple culture clash. The forms-based model drives designers toward complex-looking applications that are actually very simple and intuitive to use, while the PC does the opposite - driving designers to simple-looking screens that may look cool when used once but are inefficient in intensive use, because virtually every step requires the user first to do something and then to wait.
The PC won then, but no trend is forever - and tomorrow's Windows products, like MacOS X today, are approaching late-eighties smart-display capabilities: meaning that the logic behind the interface approach implicit in combining relational technology with the forms-based applications model should eventually prevail.