If C (circa 1973) is the prototypical third generation language and we had undisputed fourth generation languages as early as 1982, why don't we have a wide variety of 5th or even 6th generation languages today? Did progress stop?
Here's how IEEE software engineering standard 610.12-1990 (subscription only!) defined 4th and 5th generation languages:
Fourth Generation Language (4GL) A computer language designed to improve the productivity achieved by high order (3GL) languages and, often, to make computing power available to non-programmers. Features typically include an integrated database management system, query language, report generator and screen definition facility. Additional features may include a graphics generator, decision support function, financial modeling, spreadsheet capability, and statistical analysis functions.
Fifth Generation Language (5GL) A computer language that incorporates the concepts of knowledge-based systems, expert systems, inference engines, and natural language processing.
Of course that was worked out during the late eighties, when MITI was a mighty threat to the American programming industry and AI was about to take over the world's programming tasks. Unfortunately it was also dead wrong: the victim of sloppy thinking and temporal coincidence.
What's wrong with it is simply that it incorporated a very natural, common, and widely repeated mistake about the meaning of "generation" in terms like "second generation language."
In its original meaning, as applied by people like Grace Hopper and John Backus when A-1 and SpeedCoding were being invented in 1954, the word "generation" referred to code generation: the process needed to convert human-readable code to machine code. Thus code written in binary and entered by flipping switches required no (automatic) generation; code written in the original assembler needed one step to generate binary, the one-to-one substitution of binary forms for human-readable ones; and macro assemblers needed two, the first being the expansion and linking of macros into blocks of simple assembler before conversion to binary.
SpeedCoding (or Formula Translation, as it later became known) added a very complex third step: replacing functional constructs and flow control logic with correctly linked assembler macros. The final binary thus required three generation stages.
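The stage-counting argument above can be made concrete with a toy pipeline. This is purely a sketch: the opcode table, macro, and function names are invented for illustration, not taken from any real assembler. Each pass lowers the program one step closer to binary, and the number of passes is the "generation" count in the essay's sense.

```python
# Stage 1: a classic assembler -- one-to-one substitution of binary
# opcodes for human-readable mnemonics. (Opcode table is made up.)
OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011"}

def assemble(lines):
    """One generation step: each mnemonic maps to one binary word."""
    return [OPCODES[op] + f"{int(arg):04b}" for op, arg in lines]

# Stage 2: a macro assembler -- expand each macro into a block of
# simple assembler, then hand the result to stage 1.
MACROS = {
    # ADD2 dst, a, b  expands to:  load a, add b, store dst
    "ADD2": lambda dst, a, b: [("LOAD", a), ("ADD", b), ("STORE", dst)],
}

def expand_macros(lines):
    """Second generation step: macros -> blocks of simple assembler."""
    out = []
    for name, *args in lines:
        out.extend(MACROS[name](*args) if name in MACROS else [(name, *args)])
    return out

# Stage 3: a SpeedCoding/Formula-Translation-style pass -- rewrite a
# functional construct as correctly linked macro invocations.
def translate_formula(stmt):
    """Third generation step: 'x = a + b' -> macro calls."""
    dst, expr = stmt.replace(" ", "").split("=")
    a, b = expr.split("+")
    return [("ADD2", dst, a, b)]

# Three passes from source to binary = three generation stages = a 3GL.
binary = assemble(expand_macros(translate_formula("3 = 1 + 2")))
```

Adding a fourth pass on top, a development environment that emits this source, is exactly the extra layer the next paragraph describes for a 4GL.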
A 4GL like Unify's 1990s Vision/Accell/SQL combination adds another layer: the development environment is itself an application whose output sits at the top of the binary generation process.
(Oddly, Unify ate its own dogfood - Vision/Accell was built in Vision/Accell - so you can easily fall into a recursive well here in counting generators, but I'm sticking with four as a way of preserving what's left of my sanity.)
Look at languages, like ProLog, that make fifth generation claims, and the best you can say is that there are four - not five - phases in the binary generation process, making each a 4GL, albeit not one aimed at business applications.
So why aren't there any true 5GLs? Because that would be a robot you could instruct to write a development toolset with which to write the robot - thereby creating, I think, yet another computing conundrum: a tool that has much in common with both APL (1963) and Lisp (1958).