Donald Knuth (yes, that Donald Knuth) has given a long and interesting interview to the InformIT website - which I hadn't come across before: it's a joint venture between various tech book publishers.
As you'd expect from a septuagenarian who's defined a lot of modern computing concepts, the interview covers acres of ground. From open source - "I believe that open-source programs will begin to be completely dominant" - to how he writes - "My general working style is to write everything first with pencil and paper, sitting beside a big wastebasket. Then I use Emacs to enter the text into my machine" - and his choice of platform: "I currently use Ubuntu Linux, on a standalone laptop—it has no Internet connection. I occasionally carry flash memory drives between this machine and the Macs that I use for network surfing and graphics; but I trust my family jewels only to Linux".
There's meatier stuff too, especially on his favourite field, literate programming. This is a way of writing programmes based on the idea that they should be readable by humans first, and contain a decent explanation of what it is you want the computer to do: Knuth says that this may be the only way to approach very complex tasks, and I find that a fascinating idea. However, I fear he's been stymied by the cliché that once you teach computers to understand English, you'll find programmers can't speak it.
Of most relevance today, though, are his thoughts on multicore. "To me, it looks more or less like the hardware designers have run out of ideas, and that they’re trying to pass the blame for the future demise of Moore’s Law to the software writers by giving us machines that work faster only on a few key benchmarks! I won’t be surprised at all if the whole multithreading idea turns out to be a flop, worse than the "Titanium" [I think he means Itanic. Ed.] approach that was supposed to be so terrific—until it turned out that the wished-for compilers were basically impossible to write."
There's more in that vein.
The trouble is, I think he's right. Ever since Intel introduced hyperthreading in 2002, we've been waiting to see the case for generic multiprocessing on the desktop. That there are plenty of special cases, nobody doubts: multiple processors have been a feature of computing for many decades. But in general? They do nothing, and I say that having sat through more IDF demos of hyperthreading, dual, quad, eight and manycore chippery than is allowed under the Geneva Convention. As Knuth says, the problem is palmed off onto the compilers: compilers nobody can write, because the job is fundamentally impossible.
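To make the compiler's predicament concrete, here's a minimal sketch (my illustration, not Knuth's) contrasting a loop any compiler could fan out across cores with one that no compiler can, because each iteration depends on the result of the previous one:

```python
# Embarrassingly parallel: every element is independent, so the work
# splits cleanly into chunks that any number of cores can chew through.
def parallel_friendly(values):
    return [v * v for v in values]

# Loop-carried dependency: step i needs the result of step i-1, so the
# iterations form a chain that must run one after another, however many
# cores are sitting idle.
def dependency_chain(seed, n):
    x = seed
    for _ in range(n):
        x = (x * x + 1) % 1_000_003  # each value depends on the last
    return x
```

The first function is the easy case the benchmarks love; the second is what an awful lot of real code looks like, and no amount of compiler cleverness can split that chain.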
Programming is at heart mathematics, even though most of it is the most inelegant, ugly and innumerate applied maths ever seen on the planet. I have never seen a mathematical treatment that shows that parallelism is theoretically applicable to linear tasks: I suspect no such thing exists. But if the reverse were true - if a computational mathematician could define the class of problems for which parallelism was useful - it would save us all a lot of time down on Bernard Matthews' Wild Goose Farm And Wrong Tree Barkery.
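There is at least one classical partial answer worth noting here (my addition, not from the interview): Amdahl's law, which doesn't define the parallelisable class but does bound the payoff. If only a fraction p of a task can be parallelised, n cores can never deliver a speedup beyond 1/((1-p)+p/n). A minimal sketch:

```python
def amdahl_speedup(p, n):
    """Upper bound on overall speedup when a fraction p of the work
    parallelises perfectly across n cores (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / n)

# A half-parallel task tops out below 2x regardless of core count:
# amdahl_speedup(0.5, 8) is about 1.78, and the limit as n grows is 2.0.
```

Which is the arithmetic behind the suspicion above: unless nearly all of a program parallelises, piling on cores buys almost nothing.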
Wouldn't half shake up Intel's marketing message too.