
Ushering in a multicore future

A blog post by Mary Foley pointed me toward Microsoft's parallel computing initiatives, and a side panel revealed that she has written a book about my favorite company (I'm definitely going to buy it). Parallel computing will become critically important as we bump up against the limits of Moore's law, the "law" that drove performance gains for most of the history of computing. Now that growth in transistor density has slowed (if not stopped), the alternative is to add more parallel processing units.

That, however, doesn't yield the same kind of speed gains as simply making an individual processing core faster. Higher clock speeds, made possible by Moore's law-style miniaturization, boost every program, whether it is single- or multi-threaded. More cores don't necessarily lead to similar gains.

A "thread" in a computer program is like one member of a construction team. Each member can do their work pretty much independently, and if they work together, they can complete tasks faster. If the equivalent of a Moore's law speed increase could be applied to the construction team, it would mean that every member would do their job faster.
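The crew analogy translates directly into code. Here is a toy sketch in Python's standard threading module (purely illustrative; the chunk sizes and worker count are arbitrary, and real speedups depend on the workload actually being parallelizable):

```python
import threading

numbers = list(range(1, 101))
results = [0, 0]  # one slot per "crew member"

def worker(idx, chunk):
    # Each worker sums its own chunk independently, like one
    # member of the crew handling a separate part of the job.
    results[idx] = sum(chunk)

# Split the job between two workers.
half = len(numbers) // 2
threads = [
    threading.Thread(target=worker, args=(0, numbers[:half])),
    threading.Thread(target=worker, args=(1, numbers[half:])),
]
for t in threads:
    t.start()
for t in threads:
    t.join()  # wait for the whole crew to finish

total = sum(results)  # same answer as summing sequentially: 5050
```

The point is that the work divides cleanly here; the hard part, as the rest of this post argues, is that most programs don't divide so neatly.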

Barring speed increases for individual workers, you could try adding more workers to your crew. The problem is that this may not shorten construction time at all. You have to find something useful for those extra workers to do so that they don't hang around drinking coffee and harassing passersby on the street. Extra workers can theoretically improve construction time, but only if you use them properly; in the worst case, they can even make matters worse (workers getting in each other's way, which is a reasonable approximation of the cost of thread context switching).

Like extra construction workers, extra threads are hard to put to good use, because breaking a program into useful chunks is difficult. Humans might manage it credibly with three or four cores, but imagine further down the road when we have hundreds of cores in a single processor. So long as humans write software, we are limited by what humans can reasonably model in their own heads.

That's why it's critically important for us to develop frameworks like ParallelFX that relieve (most) humans of the need to model such complex interactions (or at least, make it easier to model them, as is the case with the new parallelized "for" and "foreach" loops). Such frameworks lie on a continuum with managed frameworks like .NET and Java, which relieve programmers of a number of programming tasks (memory management, security issues, fault handling) that are difficult to implement correctly on one's own. Adding tools to simplify multicore programming is just another job for managed frameworks to handle.
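To make the parallelized-loop idea concrete, here is a rough analogue in Python using the standard concurrent.futures module (Python stands in for ParallelFX's C# here; this is not the ParallelFX API, just the same shape of abstraction, where the framework, not the programmer, decides how iterations are distributed):

```python
from concurrent.futures import ThreadPoolExecutor

def square(n):
    return n * n

# An ordinary sequential "for" loop over the work items.
sequential = [square(n) for n in range(10)]

# The parallel equivalent: map() hands iterations to a pool of
# workers, much as a parallelized "for" loop hands chunks to cores.
# The programmer never models the thread interactions directly.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(square, range(10)))

# Same results either way; only the scheduling differs.
assert sequential == parallel
```

The appeal of such frameworks is exactly this: the loop body stays the same, and the hard reasoning about how to carve up the work moves into the framework.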

It also makes a lot of sense for a company like Microsoft to spend time building such a framework, because as I've noted with increasing frequency in this blog, Microsoft is, at heart, a platform company.

Microsoft needs to learn new tricks given the consumer-oriented turn computing has taken over the years, but it's critical that Microsoft remain true to its core competency. Frameworks are what Microsoft has spent its 30-year history developing, guided by Bill Gates' vision of the central importance of software in the computer age. That Microsoft identity will be essential to keeping the company relevant in the post-Gates era (in answer to the subtitle of Foley's new book), even if other aspects of modern computing (user interface, style) have risen in importance.