On the other hand (re: Java-based client-side computing).....

Summary: If you've been following this blog for the last couple of days, then you know that I've been exploring (here and here) the possibility that now may be a good time to re-examine developing some sort of standard thin (or rich) client architecture. For all of its faults, AJAX is solving a very real problem and adding to Google Mail's usability.


If you've been following this blog for the last couple of days, then you know that I've been exploring (here and here) the possibility that now may be a good time to re-examine developing some sort of standard thin (or rich) client architecture. For all of its faults, AJAX is solving a very real problem and adding to Google Mail's usability. By standard, I mean, at bare minimum, a collection of three technologies that developers of Internet-based software can count on being there, on the client side, all the time: a browser (support for HTML, JavaScript, etc.), a secure execution environment that's normally plug-in-based (e.g., a Java Virtual Machine or a Flash engine), and removable storage (e.g., a USB key, Compact Flash card, or SmartCard) so that the last known state of your computing environment can securely persist on your key chain or in your wallet instead of on a heavy device you have to carry with you.

The posts have provoked some passionately argued comments for and against the idea of thin client computing, and some have rightfully questioned the viability of Java on the client. That's because my blogs assume a best-case scenario in which Java is non-problematic, which it isn't. For example, citing a Java-based time-clock application from ADP, a ZDNet reader going by the alias JPR75 wrote:

I can't imagine a Java-based RDBMS. Our hourly people at work use a web-based time clock application by ADP. The app itself is very well constructed, but managing it (updating/changing employee clockings) is slooow. We have a broadband connection at work for the Internet which is quite fast; nevertheless, this app is slow.....Until we see faster web applications and faster Java (period), it scares me to think that everything could be Java and web based.

Based on his experience with the same application, another ZDNet reader, Justin James, concurs:

I had to use that app with one employer as well. Nothing like being marked "late" despite arriving 10 minutes early! It took about a week for management to discover that the application was completely unreliable for determining clock in/out times, as the Java applet was painfully slow to load and run.

Francois Orsini, who has been doing all the objection handling on behalf of Sun, responded:

Saying that Java is slow is anything but factual - there are many apps out there running slow over the web without involving Java....Once JavaDB is started in the application JVM context, all interactions are local and extremely fast.

Emphasis on "Once JavaDB is started." Note that James complained about the load time and Orsini responded that the speed is acceptable once the code is loaded. In the context of the broader discussion of a standard execution environment for thin or rich clients (Java, Flash, etc.), the Java Runtime Environment's (JRE) black eye has been and continues to be the initial load time. I was reminded of this today when, while researching today's news that the federal government was issuing its flu pandemic findings, I found my way (via Google News) to a Web page on TodayOnline. The Web page stalled. Not caring what the hold-up was and never patient with slow pages (and knowing there were plenty of alternatives on Google News to try), I hit the back button to get back to the search results. It wasn't until after hitting the back button that a window popped up on my screen -- one that clued me in to what the original hold-up was. The pop-up window was Java's console, and the hold-up was a Java applet on the Web page. The problem, of course, is that it's one thing to have a JRE on your computer. It's another for it to be loaded and for an applet to load into it in relatively short order. Even worse, by hitting my browser's back button, it appears as though I interrupted the applet's load, thus triggering the string of error messages that appears in the partial screenshot (pictured right). All of this activity slowed my system down a bit as the whirring of my disk drive was timed perfectly with the presentation of Java's console and its error messages.

Bear in mind that "showing the console" is an option for the JRE that I have checked because my preference is for a chatty system -- one that's set to tell me what the heck is going on when its resources suddenly get swallowed whole by some bursty application. Most users wouldn't see it (or its unfriendliness). But in the larger context of the local execution environment question, what is preloaded and ready to run is a big question. Preload nothing and there will be some delays upon first execution of some code (or plug-in). Have you ever noticed how slow your first PDF-formatted document is to load? But once Adobe's Acrobat Reader plug-in is loaded, subsequent PDFs load much faster. This is where that discussion about the client's girth comes in.
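
If the architecture were to settle on Java, one way to blunt that first-load penalty would be to warm up the JRE in the background instead of letting the user's first applet pay the cold-start cost. A minimal sketch of the idea (the one-pixel "Warmup" applet is hypothetical; any trivial applet would do):

    // After the page finishes loading, inject a tiny invisible applet whose
    // only job is to force the JRE and its core classes to initialize early.
    window.onload = function () {
        var holder = document.createElement("div");
        holder.innerHTML = '<applet code="Warmup.class" width="1" height="1"></applet>';
        document.body.appendChild(holder);
    };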

A relatively svelte client might consist of nothing but a browser and support for removable storage. If a user walks up to that client, the need for and type of local execution environment is determined at runtime and pulled in, plug-in style, off the user's USB key. But being able to dynamically load any runtime environment (Java, Flash, etc.) requires a lot of overhead. Overhead that translates into client girth. To eliminate that flexibility-driven overhead, the architecture could eliminate the flexibility and settle on one execution environment (i.e., Java or Flash), embed it to the exclusion of others, and preload it to speed up load times. But which of these historically plug-in-based technologies needs to be in every thin client to guarantee a seamless and speedy Internet experience? Java? Flash? Acrobat? Real? Perhaps Java or Flash can accommodate Acrobat- and Real-formatted content.

That leads us to the next question, which is what else needs to be preloaded. For example, are certain Java classes (e.g., JavaDB for data persistence and synchronization) more likely to get used than others every time a user sticks their USB key into some kiosk in an airport, and should those classes be preloaded to speed up execution? The more preloads we pile into some standard thin client architecture, the less chance we can continue to call it thin or even rich. Pretty soon, we're right back where we started with a full-blown OS.

My point is that the observations made by ZDNet's readers regarding the realities of thin or rich client computing are not to be dismissed. But, getting back to the glass being half full or half empty, that doesn't mean the challenges are insurmountable. Architectural decisions would have to be made, prototypes built, and concepts proven. Take all the pushback on synch, for example. I made the argument that JavaDB could be used to facilitate the persistence of user data within the context of a disconnected browser. The advantage of such persistence is that end users could continue to work on their data and documents with browser-based applications in a thin client environment even if the browser wasn't connected to the Web (lack of such offline capability is one of the leading objections to moving away from thick clients to Web-based end-user applications). Then, when the connection returned, those locally persisted data and documents could be synched up with the user's central storage repository on the Internet.
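
To make that concrete, here is a rough sketch of what the browser-side plumbing might look like -- assuming, and this is only an assumption rather than Orsini's actual demo code, a JavaDB-backed applet named "javadb" that exposes execute() and query() methods to page JavaScript via LiveConnect:

    // Persist a draft locally in the embedded JavaDB instance (hypothetical API).
    function saveDraftLocally(id, text) {
        var db = document.applets["javadb"];
        db.execute("INSERT INTO drafts(id, body) VALUES('" + id + "', '" + text + "')");
    }

    // When connectivity returns, push locally persisted drafts up to the
    // user's central repository on the Internet, then clear the local copies.
    function syncDrafts() {
        var db = document.applets["javadb"];
        var pending = db.query("SELECT id, body FROM drafts");  // hypothetical
        for (var i = 0; i < pending.length; i++) {
            autosave(pending[i].id, pending[i].body);  // see the autosave sketch below
        }
        db.execute("DELETE FROM drafts");
    }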

Via AJAX programming, Google Mail has autosave, for example. Justin James pooh-poohs AJAX. But let's face it: for all of its faults, AJAX is solving a very real problem and adding to Google Mail's usability. But suppose your connection goes down. Where do the autosaves go? What happens if you click send? Thousands of people use Gmail. Is it wrong to assume that just the same way they like autosave (I like it; I wish WordPress had it), they might like the ability to work offline? No.
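
Here's the shape an answer could take (a sketch with hypothetical names, not Gmail's code): attempt the normal AJAX save and, if the network is gone, hand the draft to local persistence for a later sync.

    // Create an XMLHttpRequest in a way that works in 2006-era browsers.
    function makeXHR() {
        return window.XMLHttpRequest ? new XMLHttpRequest()
                                     : new ActiveXObject("Microsoft.XMLHTTP");
    }

    function autosave(draftId, text) {
        var xhr = makeXHR();
        xhr.open("POST", "/autosave", true);  // asynchronous save in the background
        xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
        xhr.onreadystatechange = function () {
            if (xhr.readyState == 4 && xhr.status != 200) {
                saveDraftLocally(draftId, text);  // server unreachable: keep it local
            }
        };
        try {
            xhr.send("id=" + draftId + "&body=" + encodeURIComponent(text));
        } catch (e) {
            saveDraftLocally(draftId, text);      // no connection at all
        }
    }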

Now, you could take the glass-is-half-empty approach, dismiss both the need and the opportunity to innovate, and do nothing. Or you could take the glass-half-full approach that Morfik did when it applied its "web applications unplugged" approach to Gmail with its Gmail Desktop. Thanks to ZDNet reader MikeyTheK for pointing it out when he said, "Morfik has a sample application called Morfik Gmail that I've been playing with. It allows me to take gmail on the road - on an airplane, or wherever, and use the gmail interface to handle all my gmail." Last September, Morfik's Dr. Martin Roberts was here on ZDNet extolling the virtues of his platform as well.

Other proofs of concept -- where someone saw the glass half full -- exist too. Julien Couvreur has been working on something he calls TiwyWiki which, like the aforementioned Gmail Desktop, is a Web app that works even when the Web is unplugged. Via email, Couvreur wrote:

It can run entirely from the browser cache when you are disconnected and it will sync your changes when you go back online. The server isn't very smart about versioning and merging, but it's only a server limitation that could be addressed.

TiwyWiki is actually one step closer to the world I'm thinking about than what Morfik has because it relies on something that's relatively ubiquitous: Flash. With the exception of one comment about his choice of Flash 8, most of the comments about his innovation (on his blog, which describes the architecture in detail) read like rave reviews. But also in that email, Couvreur acknowledged the potential of something like a JavaDB to handle the persistence of data:

The client-side storage is primitive (get/set, implemented in Flash), and a richer storage (relational store?) would definitely help. From my experience so far, I'd say the synchronization issue is a much larger concern. In the thick client world, I have seen many database implementations and I'm sure we can migrate some of these to the browser world (using Java or Flash). But even in that world, I don't know of any good synchronization framework for occasionally disconnected applications.
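
For reference, the primitive get/set storage Couvreur describes maps onto Flash's local SharedObject mechanism, which page JavaScript can reach. A minimal sketch, assuming a hidden Flash movie with the id "store" has registered getValue and setValue callbacks via ExternalInterface (the Flash side is not shown):

    // Thin JavaScript wrapper around a Flash movie's local storage.
    function getStoredValue(key) {
        var movie = document.getElementById("store");  // the hidden .swf
        return movie.getValue(key);
    }

    function setStoredValue(key, value) {
        var movie = document.getElementById("store");
        movie.setValue(key, value);  // lands in a persistent SharedObject
    }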

Socialtext recently strapped the BlackBerry to its wiki with Miki. Why not a blend of TiwyWiki and Socialtext for offline wiki authoring? Call it TiwyText. You could be a naysayer, in which case you're free to never enter the room. But if you see the glass as being half full, you could join Julien in his attempt to get the conversation going. Under the alias dumky, Julien wrote:

Synch is a difficult problem. Having written an offline AJAX prototype (TiwyWiki), I can attest that the synchronization is most of the pain. Somewhat usable error handling of all the scenarios would come second. That's not to say that improvements in client-side storage wouldn't matter though: I'd love to see a richer API than [Flash's] getValue and setValue....Why not start an offline AJAX discussion group or mailing list?

Ask and you shall receive. Given the opportunity to collaborate, I'm guessing that these and other questions can be vetted and progress can be made in terms of innovation. Given the relevance of the discussion to mashups, I've established a forum called Offline AJAX and Web Apps Unplugged and am hosting it on the Mashup Camp forum server. The invitation is open to Julien, Sun's Francois Orsini, Brad Neuberg, and anybody else who thinks they can contribute. Additionally, for those who join the conversation and would like to continue it in the physical world (as well as the virtual), you're welcome to do so at the next Mashup Camp (Mashup Camp 2). It's free to attend and scheduled to take place in Silicon Valley on July 12. See the mashupcamp.com Web site for details on how to sign up.

Talkback

  • Does AJAX suck

    because the paradigm is wrong - or is the implementation bunged up? On XML - I tend to believe that ANY format you choose to include metadata will suck - so XML is "good enough". But the BIG reason AJAX sucks is the "J" - JavaScript. Why is the FIRST scripting language for HTTP the ONLY universal one? I like to call it the COBOL of web programming. It works, it's everywhere and it sucks.

    My suggestion is to replace the "J" with something else. Everyone has their opinions, but I see Ruby as a great choice. Go to W3C, Konqueror, Opera and Firefox and lobby to embed Ruby - and the "suckiness" can be eliminated.

    Embedding a non-scripting language such as Java or Flash into the browser presents other suckiness. Binary programs embedded in HTML files make for some icky environments. I would hate to be loading HTML files with strange and undiscernible symbols.

    We don't need a new approach - we just need to upgrade the one that exists.
    Roger Ramjet
    • Fair enough. How about doing something about it?

      Why not start the discussion in the forum I set up?
      dberlind
      • Forum?

        Which forum is that? This talkback thread or something else? Your talkback articles have about a 2 day shelf life . . .

        NOTE: Of all the browsers ever created, most still exist (in their own little niches). One big standout DOES NOT - HotJava. HotJava was an interesting idea - use Java to run the browser. After all the stuff that Sun has gotten into lately, why not revive HotJava? It sure wouldn't take much . . .
        Roger Ramjet
      • AJAX without the J is ready. It's ...

        AjaxAnywhere. See

        http://ajaxanywhere.sourceforge.net/

        From their intro: "AjaxAnywhere is designed to turn any set of existing JSP or JSF components into AJAX-aware components without complex JavaScript coding. In contrast to other solutions, AjaxAnywhere is not component-oriented. You will not find here yet another AutoComplete component. Simply separate your web page into multiple zones, and use AjaxAnywhere to refresh only those zones that need to be updated."

        Whatcha think?

        Jim
        Jim-MN
        • I think...

          ... that this "AjaxAnywhere" thing is total and utter garbage, [i]prima facie[/i]. All it does is take writing the JavaScript off of your hands and auto-generate it. Auto-generated code is totally worthless the vast majority of the time. It is still using JavaScript. It is still using JavaScript to process XML. This is what Roger was (rightfully) railing against.

          J.Ja
          Justin James
          • Yes and...

            [i]It is still using JavaScript to process XML[/i]
            OMG! what a disaster! well, maybe not, eh? Are we talking about processing zillions of XML elements on average for typical webapps out there - I mean, are we? I think not...I think it is good to be pragmatic once in a while...This does help for assessing real issues...but eh, let's use something else than JavaScript; oops, there is nothing else right now that's part of the browser - Oh, I forgot VBScript for IE, but eh, it is not really portable (at all), is it?
            Have you processed XML with JavaScript? I have, and I did not end up swearing at my monitor - was I parsing and processing thousands of XML elements? No - why? Because I don't have/need to and because I usually try to implement something in an optimized and smart way - not like, oh, let's get thousands of elements to parse and see if that flies...
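
            Here is about all it takes to handle the size of response a typical webapp actually gets back (a quick sketch - the URL and element names are made up):

                // Fetch a small XML document and read a handful of elements from it.
                var xhr = window.XMLHttpRequest ? new XMLHttpRequest()
                                                : new ActiveXObject("Microsoft.XMLHTTP");
                xhr.open("GET", "/inbox/summary.xml", false);  // small payload; synchronous for brevity
                xhr.send(null);
                var doc = xhr.responseXML;  // already parsed into a DOM - no hand-rolled parsing
                var subjects = [];
                var items = doc.getElementsByTagName("message");
                for (var i = 0; i < items.length; i++) {
                    subjects.push(items[i].getAttribute("subject"));
                }
                alert(subjects.join("\n"));  // a few dozen elements is nothing for the browser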

            What is your experience and where are the facts?
            forsini
          • Web apps vs. rich client apps

            You are totally backtracking here. First, you say that you want to see "rich clients" on the desktop using some sort of AJAX or Java or whatever technology in a browser-centric model. Then you say "typical webapps out there." You are correct; your typical webapp does not do this. But a typical desktop app WILL do this.

            How about the browser-based office suite everyone seems to want? They all tie it to ODF. What do you think ODF is? An ODF document of any amount of complexity or length is going to be (drumroll) an XML document with thousands of elements.

            It is resource-intensive enough just to walk that tree; add onto it the requirement to [b]validate it against the schema[/b]. As George Ou showed some time ago, OpenOffice is about 100 times slower at handling this than Excel is at handling a binary file (which is essentially a serialized copy of the Excel workbook object). And OpenOffice is compiled, native code, not Java, not JavaScript running in a browser.

            Sure, for some dinky "typical webapp" you won't be processing thousands of XML elements. But a "typical rich client application" does.

            Of course there is no alternative to JavaScript that is portable. Don't even mention VBScript; I hate working with it in legacy ASP scripts, and I have never even considered using it for client side scripting, for 1,001 reasons. I certainly never mentioned VBScript except maybe to knock it. I have more sense than that. :)

            You imply that I am a "bad coder" or a foolish one because I have tried parsing a few thousand XML elements. No, I merely have [b]foolish customers with foolish requests[/b]. I had a project a few months ago where the customer insisted that I display about 3,500 rows of data in an HTML table; each row had three form objects in it. It took about three minutes to walk through the objects in that object array to validate them. I told the customer when they made this request that they would not be happy with the results. They ended up changing the requirements after seeing the results. And that was merely to iterate through the objects on the form, and if one checkbox was checked, see if the corresponding text box had a particular value. It was a fairly simple routine, and JavaScript choked on it. I cannot imagine how slow it would have been to walk through an ODF file representing a spreadsheet (as a side note, VBA is ridiculously slow doing that as well, despite Excel typically being a swift application).
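
            Schematically, the loop was no more exotic than this (reconstructed with made-up names - not the actual code), and it still took minutes over 3,500 rows:

                // Walk every row's form objects; if the checkbox is checked,
                // make sure the matching text box holds the expected value.
                var form = document.forms["clockings"];
                for (var row = 0; row < 3500; row++) {
                    var check = form.elements["approve_" + row];
                    var box = form.elements["hours_" + row];
                    if (check.checked && box.value != "8.0") {
                        box.style.backgroundColor = "yellow";  // flag the bad entry
                    }
                }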

            Yes, JavaScript is just fine for the current "typical webapp." It is wholly inadequate for what the AJAX/Web 2.0/rich client vis-a-vis browser/thin client crowd wants. The two goals are not compatible. Perl is a scripting language, but it writes extremely elegant, extremely fast code. JavaScript is a scripting language that produces inelegant (but not horrible, I will grant it that) code that executes very slowly.

            Maybe the problem is within the browser or the interpreters; maybe it needs to JIT-compile JavaScript. If that is so, I suggest that someone give Mr. Gates and Mr. Ballmer a call. If Microsoft had spent as much effort improving their JavaScript interpreter as they have spent adding tabbed browser windows, then maybe JavaScript could execute fast. I really don't know. But I have the proof, because I have had the experience. I have walked large tree structures in JavaScript (and I am sorry, but a total of about 10,500 is not a "large" data structure, it is piddling) and the results stunk.

            Even if JavaScript were optimized out the wazoo, you would still need to be working with that XML data within the context of HTML. XSLT just "isn't there yet" to the point where it can/will replace HTML anytime soon. And as I have already said, JavaScript is super slow going through HTML.

            No matter how you cut it, you simply cannot write a rich client that is capable of handling more than a few hundred objects in JavaScript in a timely fashion. Thanks to the magic of not allowing any kind of multithreading, combined with the browser's limit of 2 concurrent server connections, JavaScript cannot even do something handy like provide a status update without jumping through some major hoops like creating a separate window and passing global variables back and forth or something. That is pretty sorry: a language where you need to create an entirely separate instance, complete with a window visible to the user, to do what nearly every other language in use today (in fact, every other language that I can think of in common usage) can do with a few lines of code, transparently to the end user.

            I really do not see why you are unable to evaluate JavaScript on its technical merits. You even say that a "typical webapp" does not do what JavaScript cannot do. Microsoft already discovered that JavaScript cannot effectively process XML (the number I heard was about 400 ms for 150KB of XML; I have written Perl that can handle 10MB of text in well under a second). Well-written code gets I/O bound for an operation like this; outside of a dialup connection, it is tough for JavaScript to get I/O bound. JavaScript is not capable of handling a rich client's amount of data. Web browsers as we know them are not able to do so either; a few days ago I dumped about 20 MB of plain text to IE and watched it spend a full minute trying to display it on screen. It was pretty pathetic to watch.

            I like the idea of a no-install-needed, cross-platform, write-once, run-anywhere system. Maybe if Sun hadn't messed up Java in so many ways (bonus points to Microsoft for helping to mangle that mess; they couldn't get rich off of Java so they did their best to destroy it) then we would be there already. But we aren't. And trying to polish the turds known as JavaScript, HTML, HTTP, and XML does not make the situation better, it makes it worse.

            One day AJAX will be legacy code. Some legacy code, like a well-written Perl script (note how I keep talking about that language!), is a joy to maintain. Some legacy code, like COBOL (regardless of how well written) or BASIC with line numbers, is not a joy to maintain. Just creating AJAX code is much closer to writing COBOL or BASIC with line numbers than writing Perl. Is that the legacy code you want to be supporting? Is that the code that you want everything relying upon five years from now when we hopefully move past the browser? I hope not.

            I have huge amounts of respect for Sun as a company run by and for engineers. The only really dumb technical mistakes I have known them to make were the switch away from BSD when the SunOS/Solaris switch occurred (I know the reasons; it did not make sense on a technical level), and breaking the backwards compatibility of Java every two minutes (along with failing "write once, run anywhere"). Don't let Sun disappoint me by telling me that they think that JavaScript is the best we can do.

            This isn't 1983; programming is not and should not be about stretching hardware past its intended limits. It should be about addressing the users' needs to the best of our abilities. Adding 2,000 lines of code and load time to an application to give it drag/drop capabilities (which 75% of users don't know exist, unless instructed by someone else or the directions) or right-click functionality (again, only a minority of users right-click), when those lines of code could be doing something more useful, or the slowdown eliminated, is just developer hubris at its most dangerous.

            If you really need me to, I will write a JavaScript benchmark and prove this to you.

            J.Ja
            Justin James
        • I've seen XML <b>AS</b> the script

          Something about embedding script commands in tags. Looked interesting . . .
          Roger Ramjet
    • Javascript is everywhere...

      The majority of websites that you're browsing on the web make use of JavaScript and have for many years...This is a fact, not a tale...

      Can you write portable JavaScript? Sure you can - does it have its own share of issues, etc.? Yes it has _but_ another fact is that it's been used extensively to develop websites and is still being used a lot as of today. It is not as sucky as people are trying to make it look - You can throw another scripting language in there and you will also have your share of issues, especially with new ones...It is not going to be the panacea for all the issues people might think...You will have other kinds of suckiness, believe me - OK, so at this point and to come back to David's original blog, the scripting language is just one part, and whether JavaScript is used or not to control the local Ajax handling of the introduced paradigm for offline webapp mode or local storage access, it does NOT matter...I picked JavaScript because it's already there, part of most browsers out there, and it is still being used a lot for most web pages you're coming across - I never stated that JavaScript *had* to be used; that is what I used - if another scripting language is there and available everywhere and is proven to be well-adopted and running fine, then so be it...but right now, this is JavaScript...What I was arguing about is when I hear that JavaScript is way too slow to be used on the client-side when it has been for years...Just a bit more clarification(s) here...
      forsini
  • The idea is great, the methodology is fine

    You can say "well, load times suck, but it runs fast once it gets going." As a user, I don't care. I am still waiting 5 minutes to accomplish a ten second task. End of story. And AJAX? I know when a site is trying to use AJAX, because the browser window indicates that a JavaScript error has occurred. Rarely do I encounter JavaScript that works in my browser. Incidentally, that browser is IE 6 on Windows XP SP2. If you can't get something to work on that combination, go home and start looking for a new job, because it is the most dominant platform.

    Really what you desperately want is a hardware-neutral client/server system, and this is a great idea. You also want to add to it the speed of a green screen, but make maximum possible use of the local system's resources, however plentiful or sparse they may be. Transparently shifting from local storage to central storage and vice versa is on the list as well. Add in the ability to shift processing to the server if local resources are limited, and back to the client when the client can handle it. Oh yeah, and it has to be just as fast. Give it a "snapshot" ability so when you power off or disconnect from the server, no data is lost.

    AJAX is not the answer. As I and many others have pointed out, JavaScript is a horrid language to develop in and run. XML is a lousy system for transporting data. HTTP is a lousy protocol for "chatty" connections (and AJAX is extremely chatty!). HTTP is a lousy protocol for anything other than what it was designed to do. HTTP is a lousy protocol for an application base.

    In other words, you really do have the right strategy. Your tactics (AJAX and/or Java and/or Flash) are the problem. Stop trying to push AJAX, and start pushing something better. What you want is a system that operates at a minimum like X Windows or Remote Desktop, where all that is being sent is how to draw the screen and user input, but at a maximum can shift the entire processing load and code base to the client (I know that X Windows and Citrix systems can't do this). That means a universal language. Java is not a "universal language". You may want to ask Orsini (or Gosling) why in the world they keep breaking Java's backwards compatibility and don't get it to live up to being cross-platform capable. And why there are separate mobile, desktop, and server versions too, so it is hard to write to a least common denominator. Perl is more cross-platform than Java. C using standard libraries is more cross-platform. JavaScript is even worse than Java on all counts. Flash isn't bad, but ActionScript is lousy.

    If you can get off of the idea of AJAX, and onto the idea of creating a system that is actually designed for this type of application, I can and will back you to the hilt.

    Heck, it sounds like maybe we can combine what you want with another tech you love (virtual machines). Why not just carry a full VM + filesystem around with you, and have a way of running that VM everywhere you go?

    BTW, you never responded to my email re: the podcast. Did it maybe get filtered out?

    J.Ja
    Justin James
    • Wrong topic for that...

      ... I meant, "the idea is great, the methodology is not". Bloody ZDNet TalkBack with its lack of preview or post-posting editing...

      J.Ja
      Justin James
    • Exactly what I was saying

      The "J" in AJAX is the real problem. The "X" is not. XML is non-optimal, but I see no other methodology of embedding metadata being any better. If you want metadata - you need tags, period. Making them as "short and sweet" as possible is the key. I see XML doing that - its just HTML with "key" tags made into "general" tags.

      How would YOU handle metadata embedding?
      Roger Ramjet
      • Human readable is XML's weakspot

        The biggest problem with XML is that someone decided that all data should be editable in vi, or some other bonehead decision along those lines. The more human-readable a format is, the less efficient it is. Look at source code... with all of the indentation, punctuation, etc. that goes into source code, 75% of it is whitespace that is being used as a form of metadata ("end this statement here, these lines of code are functionally related to each other," etc.). Why are we still stuck on the idea that raw, plain text is the best way to view anything? Why not a format with built-in compression? Or a system that takes human-readable XML code and translates it to a super-efficient binary file and provides binary-to-XML converter code to the reader in the file itself (similar to how an XSD tells the XML reader what the XML format is)?

        I am totally in agreement with you on the JavaScript being the COBOL of the modern day & age. Except that I did not mind writing COBOL too much. I bloody hate writing JavaScript.

        The HTTP part is a mess too. Read RFC 2616 very closely, and think about the hoops that a Web server must go through on each and every request. Even the authentication system is messy. JavaScript is the COBOL of today's age, and HTTP is the vodka of protocols: colorless, tasteless, odorless, and it serves only one purpose but serves it extremely well. There is only one reason to drink vodka, and there is only one reason to ever use HTTP; trying to write a "persistent" or "chatty" client, especially one that requires some sort of authentication, is *not* it. HTTP is designed to be roughly 90% - 95% read-only. POST and PUT are practically the only ways to send more than a trivial amount of data; GET with a query string is more like command line arguments. So HTTP needs to be dumped too, in favor of a protocol that is more like telnet, where a "chatty" conversation can exist with authentication only occurring at the initial session build-up. Add in the idea of multiple "conversations" within that one session, allow sessions to be disconnected & resumed to account for networking problems, and *now* you have the technical basis to start writing distributed, real-time syncing rich client apps.

        The problem with the AJAX/Web 2.0 crowd is that their imagination is too limited. They keep drooling about new ways to tie any data they have that has a physical address to Google Maps, or to replace basic word processors with some goofy online editor (I have a hard enough time finding a quality desktop text editor!), when they are totally missing the forest for the trees. AJAX, by its very nature, is limited. It's like trying to take a Geo Metro that is designed for one very narrow, explicit purpose (great gas mileage) and turning it into an all-around high performance car (great handling, acceleration, etc.) while maintaining its original purpose (gas mileage). It just does not work that way. Sure, you can make it happen, but it is clumsy and kludged together. A very complex AJAX app needs to maintain several different codebases for the client-side JavaScript. That alone is a show stopper for any but the most hardcore development team; imagine trying to push & test core functionality changes to 4 different, but nearly identical, versions? Sure, it's possible; open source OS groups like the Linuxes and the BSDs do it. But is it within the capabilities (in terms of manpower, time available, and technical skill) of your average code shop? Not really. Not unless they want to ignore programming to focus on ridiculous amounts of Q/A, version control, etc.

        This is why a standard, unified platform designed for this is needed. Java could have been a contender. Flash could have been a contender. ActiveX could have been a contender. .Net could have been a contender. But they all fail, partially because they are still way too HTML-, HTTP-, and (on the client side) JavaScript-centric. As you point out, XML stinks, but it is still usable. XML is a weak spot only insofar as it is bandwidth-wasteful and incredibly resource-intensive to parse and generate. But at least it is designed for what people are using it for, which is passing data around. Then again, they could be doing the same thing with CSVs or some other sort of flat file, and save bandwidth AND CPU, but no one is going to do that because they are too lazy to write a CSV parser, despite it taking all of three minutes to write one... they would rather use XmlHttpRequest() instead. They save five minutes of programming to cause a lifetime of pain for the user. Typical programmer hubris in action. Gotta love it.
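
        To put my money where my mouth is, here is the three-minute version (a sketch - no quoted fields or embedded commas):

            // Minimal CSV parser: rows split on newlines, fields split on commas.
            function parseCSV(text) {
                var rows = text.split("\n");
                var table = [];
                for (var i = 0; i < rows.length; i++) {
                    if (rows[i].length > 0) {
                        table.push(rows[i].split(","));
                    }
                }
                return table;  // array of arrays: table[row][field]
            }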

        J.Ja
        Justin James
        • Interesting perspective

          I'm not sure how you could make "source code" exist in binary form. People like working with ASCII text files - it's as WYSIWYG as you can get! You are fighting an uphill battle (on Everest) with that one.

          CSVs just don't cut it when it comes to STANDARD metadata. If I put items in a different order than you, then we need 2 different parsers to make sense of it all. Multiply by the number of programmers in the universe, and you can see that CSVs suck WORSE than XML.

          XML is a compromise. It is human readable (like most every "data entry" file), because people like it that way. The tags make for clumsy parsing, but it is STANDARD parsing - i.e., a single (recursive?) parser would work on any XML file. The hierarchical nature of the tags allows for all sorts of relationships - unlike a dull CSV. There is just NO OTHER "STANDARD" WAY to embed metadata.

          I would tend to agree about altering HTTP. You are right about XML files being too large and there being a need to compress them. But the files themselves SHOULDN'T exist in a compressed form - HTTP should compress on the fly and send compressed data over the wire. You could compress XML files for your own purposes - like saving space on that 400GB drive!
          Roger Ramjet
          • Compression actually makes the problem worse!

            Unfortunately, compression of XML makes the problem worse. The last thing an overloaded, poorly written browser that is about to attempt recursive tree walking via JavaScript needs to do is also fire up a compression library. That's why I think that a leaner format for transmitting the data (like JSON) is required.
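
            Compare the same record both ways (a sketch; the field names are made up):

                // The same record as XML and as JSON. The JSON form is smaller on
                // the wire and becomes a live object with one call - no tree walking.
                var xmlForm = '<employee><name>Smith</name><clockIn>08:52</clockIn></employee>';
                var jsonForm = '{"name": "Smith", "clockIn": "08:52"}';

                var employee = eval("(" + jsonForm + ")");  // how JSON gets consumed today
                alert(employee.clockIn);                    // displays "08:52"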

            Source code as plain text may be what we are used to, but that does not mean that it is a good system. For one thing, source code has a huge amount of implicit information and metadata within it that all gets treated the same as actual code. Why? Because we refuse to go beyond plain text.

            Plain text is not the Holy Grail of document formats. It is a least common denominator. A plain text editor is not anything special, and plain text is a format just like anything else, albeit a ridiculously common one that nearly every piece of software has the ability to open. Plain text is stored on a disk in the same fashion as anything else. There is no reason for us to not go beyond, far beyond, plain text for source code.

            J.Ja
            Justin James
          • LCD

            Since text is the least common denominator, shouldn't that make it easy to translate into whatever format you require? Isn't that what compiling does? I see little difference between coding with tags and a slick editor that highlights your selection and binary-tags it. When that binary file flips a bit - it's a lot harder to recover than a text file.

            So I agree with a new method of transport (HTTP replacement) and a new (interpreted) scripting language (I like Ruby, some like Perl and Python) - but I don't agree with a new file format. Whichever format you choose, you need to recreate all of the aspects of XML - so why reject XML (given that the other 2 things get accomplished)?
            Roger Ramjet
  • How did the Java VM get on the desktop in the first place?

    I still think you've failed to address a key point - how did the VM get on the client?

    Once you've got a VM, it's easy via Java Web Start to get an application down to the client.

    Then it's just a question of what you want to run on the desktop: JavaDB, a web server and an AJAX client? vs. a rich client.

    But please address the core question: how did the VM get onto the user's desktop? This is something I think Sun has failed to do.

    Mark Levison
    mlevison
    • Just try...

      ... to install the Java VM on anything other than a Solaris, Mac OS X, Windows, or Linux machine. I dare you. It took me hours of effort, jumping through hoops, and blood sacrifices to get it to happen on FreeBSD. Thank you, Sun, and your horrid licensing scheme!

      J.Ja
      Justin James
    • Whoa, where have YOU been?

      [But please address the core question how did the VM get onto the user's desktop. This something I think Sun has failed to do.]

      You mean that you don't remember M$ altering the Java VM, and Sun suing them over it? M$ refused to work with Sun and only gave their users a choice of Java VM version 1.0 or nothing. I WONDER why Sun failed to get it onto the user's desktop . . .
      Roger Ramjet
    • Already replied...

      I've replied to your question on the original blog that David posted.

      http://www.zdnet.com/5208-10532-0.html?forumID=1&threadID=20509&messageID=394875&start=17

      Let me know if you need more information, etc...
      forsini