
Programming language choices

BASIC and Java are doing what they were intended to do: respectively, supporting beginners and isolating third-party software from platform change.
Written by Paul Murphy, Contributor
Two programming languages, or actually development environments, dominate most current discussion: BASIC and Java. Whether used independently or within IDEs, Studios, or other marketing constructs, these are, however, out of place in any serious development context - and both for the same reason: no amount of marketing momentum and library accretion can help these products transcend their original design limitations.

BASIC was intended to help arts students understand a little bit about computers without requiring them to suffer the intellectual self-flagellation that is Fortran or learn enough math to become comfortable with APL (then being invented). Java, in contrast, was invented as yet another byte-code interpreter intended to isolate software developers from rapid change in hardware devices: set-top boxes initially, cell phones later.

Thus BASIC became a success driver for Microsoft again only after the company dropped Xenix as its key OS offering in favor of Tim Paterson's QDOS and focused on selling stripped-down systems into volume markets - i.e., to those without systems training or computer science educations. Thirty years of development later, Microsoft's BASIC appears quite powerful if used in the context of a Windows operating environment and a Microsoft database product - but that appearance is deceptive, since the value exists only within that proprietary environment and derives mainly from opaque external libraries.

Interestingly, Java became a huge success mainly because professionals trying to sell non-Microsoft products into that Microsoft market saw it as an effective way of insulating their code from Microsoft's tendency to change those same library APIs and other OS internals every time competing products started to gain market share.

So really, both product sets - BASIC and Java - are doing what they were intended to do: respectively, supporting beginners and isolating third-party software from platform change.

Great for them, but popularity is not itself an argument for using anything for Unix applications development. Try to look at them objectively, however, and you'll see that BASIC is still just BASIC, while Java can only really be used inside an utterly undisciplined furball of naming conventions, libraries, and co-products that has grown so large, and so complex, that getting much beyond "hello world" requires both serious learning time and significant human and technological infrastructure.

So what to use?

Obviously it depends on the application. Ruby, with or without Rails, is looking better for webapps every day, but so is PHP, and both seem perfectly reasonable where the consequences of a successful attack on the server are largely immaterial - meaning that their relative security weakness need not be frightening either.

Combine either one with Apache, SQL, a decent data management package, and a browser client, and what you get is a perfectly fine engine for developing production-ready prototypes for most small business applications. But if security is a killer issue, performance is important, reliability under extremely heavy loads is a requirement, or you're bent on developing one of the tools that stacks like LAMP rely on - the next Apache, a less brain-damaged data retrieval and manipulation tool, or a better data manager - then you need something else.

And right now that leaves Perl for rapid prototyping and C for the final implementation, with not much else out there to threaten more-or-less (i.e., more libs, fewer pointers) bog-standard 1978 K&R C for an application to be released nearly thirty years after the book. That's quite a tribute to the language's originators, but it also suggests that a lot of the object-oriented stuff we've been talking about during most of that period has essentially been a detour to nowhere - and that's bad news for an industry facing major hardware, and therefore software, change over the next few years.

Why? Because C as it stands today doesn't quite cut it on Cell, on CMT, or for interval arithmetic - and there's really nothing else out there just now that will.
