Android's existential crisis: Why Java needs to die on mobile devices

Web-based standards and native code are the future of application development on mobile devices.

Image: ZDNet

Over the last several weeks, as part of my Smartphone Survival Test as a dyed-in-the-wool iPhone and iOS user, I've been immersing myself in the world of Android -- specifically, low-cost Android handsets made mostly in China.

I'm not a complete neophyte when it comes to Android. In fact, I used Android for several years before giving up on it in 2012. I am also well versed in the fundamental systems architecture of the OS itself and how it runs on various ARM-derived silicon designs.

One of the objectives of my Survival Test is to see if a lower-cost Android phone can, in fact, replace my much more expensive iPhone 6S. Three weeks into the test, I am increasingly of the opinion that the answer is yes.

I still have my issues with Android phone hardware and the OS itself, but they are not deal-breakers by any means. All of them are ultimately resolvable, and in time the OEMs, Google, and the Android developer community at large will find solutions to them.

But solving some problems will require hard choices. One of those may be a fundamental re-architecture of the Android OS itself, in order to address what I consider to be a major flaw: significantly higher resource utilization than iOS.

Android's Achilles heel, aside from the "toxic hellstew" of security patches and fragmentation -- which is outside the scope of this article and my Survival Test series -- is Java.

Java, beyond being an object-oriented programming language with a C-like syntax, also lends its name to the core execution environment within Android itself, which runs many of the applications and user-land processes in the mobile OS.

The execution/runtime environment for applications written in Java is a Java Virtual Machine, or JVM. A JVM can run on any computing platform. That's one of its best attributes. You can run Java programs on any system that runs a compatible JVM, and it doesn't matter if they have the same system architecture.

The architecture can be Intel or AMD x86, it can be IBM POWER, it can be Oracle's UltraSPARC. Or it can be one of the many variations of ARM-derived processors.

So the system running the JVM could be a mainframe, a big iron UNIX system, an x86 server, a PC, a Mac, a smartphone, a tablet, or even a smartwatch or a microcontroller in an embedded system.

Java's portability range is impressive: It scales from the biggest systems with massive amounts of memory and CPU to the smallest, low-power processors that are highly resource-constrained.

The only thing they need in common is a JVM, and the code is essentially portable between these systems with minor modifications.
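As a rough sketch of that portability: the same compiled bytecode runs unchanged on any compatible JVM, and a program can ask the runtime what it is running on. The property names below are standard; the values printed will of course vary by machine.

```java
// The same .class bytecode runs unchanged on any compatible JVM;
// only the JVM binary itself is platform-specific.
public class WhereAmI {
    public static void main(String[] args) {
        // Standard system properties exposed by every JVM
        System.out.println("OS:   " + System.getProperty("os.name"));
        System.out.println("Arch: " + System.getProperty("os.arch"));
        System.out.println("VM:   " + System.getProperty("java.vm.name"));
    }
}
```

Compile it once with `javac WhereAmI.java`, and the resulting class file runs as-is on an x86 server, a POWER box, or an ARM board, as long as a compatible JVM is installed.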

There are different editions of the Java platform, each with its own class of JVM: Java EE (formerly J2EE) on servers, Java SE (J2SE) on PCs and Macs, and Java ME (J2ME), which was once used on feature phones and the classic BlackBerry.

Each of these editions has a different performance and resource-utilization profile, because each is targeted at a different class of system.

Android's equivalent of the JVM, as of version 5.0 and up, is known as the Android Runtime (ART), which replaced the earlier Dalvik virtual machine.

ART is specific to Android and has never been Java bytecode compatible -- it executes its own DEX bytecode -- for reasons that have little to do with engineering and more to do with software licensing.

Google is now embroiled in a legal quagmire with Oracle over it, and is responding by moving Android's Java language libraries to OpenJDK in the forthcoming Android "N". I leave it here for your consideration.

Now that you've read that long wind-up, here is the pitch: JVMs are well-suited to many things, but they are rather resource intensive.

Google has done a great deal of engineering work to make ART more efficient than its predecessors, such as compiling apps ahead of time to native code at installation. But Android apps running on ART are still considerably more resource intensive than their iOS counterparts, which are compiled from languages such as C, C++, and Objective-C.

This overhead does not extend to games, which on Android are for the most part written in C and C++ (via the Native Development Kit), the same way they are on iOS.
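For a sense of what that split looks like in code, here is a minimal, hypothetical sketch of the Java side of such a game: a `native` method whose implementation would live in a C/C++ shared library built with the NDK. The library and method names are illustrative, and the fallback branch lets the sketch run even where no such library exists.

```java
// Hypothetical JNI boundary: the heavy lifting lives in native C/C++,
// outside ART's managed heap; only a thin Java shim remains.
public class GameRendererSketch {
    // Would be implemented in C/C++ inside libgameengine.so (illustrative name)
    public static native void renderFrame(long frameTimeNanos);

    public static void main(String[] args) {
        try {
            System.loadLibrary("gameengine"); // NDK-built shared library
            renderFrame(0L);
        } catch (UnsatisfiedLinkError e) {
            // Expected off-device: no native library is present here
            System.out.println("native library not available; sketch only");
        }
    }
}
```

On a real device, the native side does the rendering work at native speed, and ART is involved only at the call boundary.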

However, Android's own user-land processes also run on ART, and they carry the same overhead. That compounds the cost considerably.

Because of this compound overhead, Android devices need to have a lot more memory and more CPU cores than their iOS counterparts to run similar types of applications.

Adding CPU cores and RAM raises the Android OEM's overall bill of materials (BOM) and contributes significantly to the manufacturing complexity and cost for whoever designs the actual silicon.

This cost trickles down to you, the consumer who buys the products built from these components.

So to get similar levels of real-world performance out of an Android phone compared to an iPhone, you need to at least double the amount of RAM and add a lot more cores. More cores means higher silicon complexity, and higher power consumption as well.

If you look at the "flagship" Android phones on the market that compete directly with the iPhone, you will see devices with 3GB or 4GB of RAM, 8 or 10 general-purpose CPU cores, and 8 GPU cores.

There are even phones on the market being introduced now with 6GB of RAM.

To put this in perspective, the current generation iPhones with an A9 SoC have 2GB of RAM and a dual-core general purpose CPU with a six-core GPU. The previous generation iPhones had 1GB of RAM, and are still very usable devices even now.

And while it is clearly at the end of its lifecycle, the iPhone 4S has only 512MB of RAM and plenty of people are still holding on to them.

There are other optimizations that Apple's devices can take advantage of, thanks to the company's tight vertical integration. By designing its own silicon, Apple can optimize its OS to run on its chips and optimize its hardware to suit the performance characteristics of the software.

In contrast, Android is more general-purpose so it can run on a wider variety of devices made by third-party OEMs using a wide variety of components.

Apple has made other architectural decisions as well, such as prioritizing user-interface processes so that the experience stays extremely fluid -- in contrast to the "Android lag" that manifests itself on more resource-constrained devices.

However, generally speaking, Android's overhead in terms of CPU and memory bloat comes back to the JVM. The platform is always going to be at a serious disadvantage because Apple can do more with less.

This is roughly analogous to having two otherwise identical cars, one of which needs twin turbos installed just to go as fast as the other, because 400 lbs of dead weight is loaded in its trunk.

So for Android to become competitive with iOS devices and to reduce BOM complexity so we can really have high-performing devices at the $200 price point, we need to get rid of Java.

How did we get to this point? Well, these problems happen with every computing platform.

When computing platforms and application environments are designed, certain architectural decisions are made to address functional and non-functional requirements.

In Android's case, this happened in 2005, a little over ten years ago.

Eventually, these architectural requirements and decisions made early in a platform's evolution become impediments to further progress.

This happened three times with Microsoft Windows over its 30+ year history. It started with the 16-bit DOS era in the mid-'80s and continued through Windows 95/98/ME, which still carried 16-bit DOS underpinnings beneath their 32-bit code, until the line was unified on the NT architecture first introduced in 1993.

NT lasted until the mid-2000s, when the Vista/Windows 7 architectural changes were eventually implemented to add support for 64-bit processors and Trustworthy Computing, among many other things.

Windows 8.x added a new programmatic model, the Windows Runtime -- which has evolved into the Universal Windows Platform -- to remove significant encumbrances of legacy code and further streamline the development model so that applications could be more portable across device form factors and architectures.

The Macintosh had similar challenges, when classic Mac OS (System 7 through Mac OS 9) had to be abandoned for OS X, which in turn had to be reworked to handle 64-bit computing and a chip architecture change from PowerPC to x86.

It's now Android's turn at this stage of its evolution. Java may have made sense way back when, but now it is holding back the platform's progress.

Ripping out Java and undergoing a major rewrite is going to be painful. In the interim, the OEMs and silicon designers will probably need to keep boosting RAM, adding cores, and playing with clock throttling and other tricks.

But there are other things we can do.

We need to think about more lightweight ways of writing apps, using a combination of web-based standards and native code, such as the approach taken on Ubuntu for Phones.

We know this approach works: native code is proven on iOS, and the web-plus-native combination was seen in action on past platforms like BlackBerry 10 and Palm/HP webOS, which, while not commercial successes, were able to do far more with far less hardware than Android devices have today.

BlackBerry 10, in particular, is quite remarkable in that respect, pairing QNX's miserly, agile real-time microkernel with a very tight application programming environment.

It may have not captured wide consumer and enterprise interest, and it was years late to market due to executive fumbling, but the engineers in Ottawa and Waterloo did an amazing job.

I believe the Android and open-source community at large -- which includes competing vendors who build devices and ISVs who write the applications -- needs to form an alternative development model for Android, based on web standards and native code, that exists outside of Google's AOSP and developer platform. It could possibly even extend to other OSes, such as Ubuntu and Tizen.

This would be effectively the community taking matters into its own hands with Android.

CyanogenMod has already done this to address the fragmentation and patching issue and is forming OEM relationships to use its Android base instead of Google's.

We need to do the same for the application development platform. To do that, Java must die on mobile, the same way Adobe Flash died there -- and later in the desktop browser -- for the very same reasons.

Does Java need to be euthanized on Android and other mobile devices? Talk Back and Let Me Know.
