There are many cases where an applications development environment, like Unify's NXj, can be used to interactively produce a working prototype that goes directly into production - thus bypassing most of the traditional causes of failure in the applications development process.
There are other cases, however, where there is as yet no sensible way to do this, and you therefore face the challenge of structuring a team and a process to get the job done in the time available.
Suppose, for example, that you guess the application has five main elements and would take one guru thirty months to deliver - but you have a ninety-day deadline. So do you assign ten people and hope they function in some kind of telepathic union? Or do you trust the PERT chart you slapped together for the proposal, and appoint a project manager along with five teams of three to handle each major component?
I have an idea about this, but I haven't had a chance to try it. It's based, first, on recognizing that development doesn't end with project delivery, and second, on finding ways to use prototyping for the specifications even though the final product has to be built manually.
Let me explain. First, in the traditional project you may ultimately be handing over something that works, but it will require what is euphemistically called "maintenance." I say "euphemistically" because there's no such thing: computer code does not require maintenance, so what maintenance programmers mostly do is fix bugs that were put in by mistake and add functionality that was left out - either by mistake, or because the requirement didn't exist or didn't yet seem urgent. That reality has a budgetary corollary for development: a nickel spent avoiding bugs and limitations now can return a buck next year, and end up being worth a couple of tenners over the thing's lifetime. Lifetime manpower costs, not just development-period manpower costs, should therefore be considered in the project plan.
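To make the ratios concrete - and note that the dollar amount here is purely a made-up example; only the nickel-to-buck-to-tenners ratios come from the argument above:

```python
# Back-of-the-envelope sketch of the nickel/buck/tenners claim.
# Only the ratios come from the text (a nickel -> a buck is 20x;
# a nickel -> a couple of tenners is roughly 400x); the up-front
# spend is a hypothetical figure chosen for illustration.
prevention = 5_000                       # assumed up-front spend avoiding bugs
next_year_savings = prevention * 20      # "can return a buck next year"
lifetime_savings = prevention * 400      # "a couple of tenners over the lifetime"
print(next_year_savings, lifetime_savings)  # 100000 2000000
```

In other words, under these assumed ratios even a modest prevention budget dominates the comparison once lifetime costs are counted.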
Second, the reason prototyping may not seem applicable is usually something like a client requirement for compiled binaries, efficiency, license-free portability, or an unusually tight relationship between the application and its target hardware.
So with that in mind, here's the putative prescription: choose the highest-level language in which it's possible to deliver a working prototype regardless of efficiency, then assign your teams as you would for an object-based project plan - but have each team develop a prototype of the entire application. Test each prototype as part of a debriefing for the entire group, then let team members re-assign themselves to the components they want to work on, cherry-picking ideas from everyone else while coding for the target hardware and software environment.
You might, for example, have each team build the whole thing in Perl on Solaris, debrief everybody on each team's assumptions and results, let the members restructure the teams, and then have each team write its module or layer in the specified production language and OS environment.
This process places a much heavier burden on users, but getting sufficiently different interpretations from each team that a careful debriefing can produce a consensus specification - expressed directly in Perl, of course - offers a win-win: you should get the cleanest specification possible, along with a working reference implementation built from the smartest code produced by the entire group.
Translating that reference implementation to "C," or whatever final language is appropriate, then becomes an opportunity to gain efficiency, unify data definitions, clarify difficult code, and document processes - producing an application that covers most of what's really wanted, contains no surprises waiting to bring down production when some oddball edge case is finally encountered, and trades some additional upfront cost for less rework and a reasonable expectation of significantly below-average long-term costs.
So could this approach mean that thirty men could dig the metaphorical well in one thirtieth the time it would take one? No - but perhaps in one tenth of the time prior to hand-over, with the additional labor cost recoverable from "maintenance" reductions during the thing's useful lifetime.
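The arithmetic behind that closing claim, using the column's own hypothetical figures from the opening example (every number here is an assumption, not a measurement):

```python
# Staffing arithmetic for the hypothetical project: one guru needing
# thirty months, versus five teams of three racing a ninety-day deadline.
guru_months = 30                   # one expert working alone
deadline_months = 3                # the ninety-day deadline
teams, members_per_team = 5, 3     # five teams of three

person_months = teams * members_per_team * deadline_months  # labor to hand-over
speedup = guru_months / deadline_months  # ten times faster to delivery, not thirty
print(person_months, speedup)  # 45 10.0
```

So the fifteen-person effort spends 45 person-months against the guru's 30 - and the bet is that the extra 15 come back, with interest, out of the "maintenance" budget.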