
Rejoice! Flash is taking us back to the 1990s!

Over on Icronics, there's a very useful list of which mobile phones will be able to run flavour-of-the-moment Android FroYo, and which will not. In general, if you have a very recent Android model - launched this year - you'll be in luck; if you don't, you won't.
Written by Rupert Goodwins, Contributor

The article gets more specific, though, listing the processors that will be served a new helping of roid rage - the TI OMAP3430, the Qualcomm Snapdragon and the Qualcomm MSM7227. It's not so much that other, older, lesser processors can't run the new code, but that they don't have the performance to make it worthwhile. Especially in the cause du jour, Flash: here, the ability of a phone to run that particular multimedia delivery mechanism has become something of a flashpoint (sorry) in the new war between Apple, Google and the rest.

Old soldiers may be excused for a touch of nostalgia: it's been a while since any of us really cared what's in our desktop or laptop. Excepting chip marketing departments, the gaming mob and tech journalists, most people have no idea about processor names and specifications, and care even less. But it wasn't always thus: in the golden age of clock speeds at the end of the last millennium, each notch up in raw computational welly was newsworthy. It gave the industry something to anticipate each season, with a high point of each Intel developer forum being the public display of yet another milestone reached. And if a competitor had a few hundred megahertz more on the clock, it mattered.

There were plenty of people who were quite snotty about this, because software didn't always get better in step with the hardware and seemed to just soak up the new speed with more crud - "Gordon Moore giveth, and Bill Gates taketh away". Yet it was a major industry driver, encouraging frequent updates and a steady stream of new applications that you really wanted but wouldn't quite run on your current rig.

Physics got in the way. As things got faster they got hotter and more power-hungry, until, like Concorde, the extra speed came at too great a price. The chip companies had to adopt the great two-for-the-price-of-one multicore strategy if they were to go anywhere at all.

Intel and AMD have made a good attempt at convincing us that multicore is a continuation of great times, and in servers and scientific computing this is mostly true. But for the rest of us, in consumer and personal computing, it's plainly not, and the protestations of the chip companies seem more an argument from necessity than actual, palpable progress.

Until now. Suddenly, speed is back on the agenda and we've got a lot better at power management. There's room to grow. Last year's model doesn't cut it in ways that really drive the urge to upgrade. There are things you want to do that you can't do, and relief is just a credit card away. Old times are back.

Expect Apple's next iPhone announcement to reflect this: Apple remembers those days too, and is more than capable of playing even if it's too savvy to be so gauche as to quote clock speeds. And as for the old chip companies who made the dynamics of the pre-millennial market the driver of their phenomenal growth: they're not looking so well placed and they know it. Intel in particular has done a lot of good research and has a lot of good ideas - yet the Atom has not delivered, and the game's now quite advanced.

All the pieces are in place for a burst of good old-fashioned, noisy, benchmarked, horns-locked silicon engineering with real results in the real world.

Happy days.
