What will many cores mean to future Windows releases?

Windows and existing Microsoft programming languages work just fine with one- to four-processor PCs. But when 8-, 16- and 64-core client machines become the norm -- in the not-so-distant future -- will Windows, C#, Visual Basic and Microsoft's other technologies be able to keep up?
Written by Mary Jo Foley, Senior Contributing Editor

Seemingly, the answer is no. Microsoft is fully aware of this fact and is starting to make noise about how to deal with the situation.

At the recent Future in Review conference, Microsoft officials told attendees that the next releases of Windows will have to be "fundamentally different" in order to accommodate future multicore machines. But that's about all they said.

At the upcoming annual Microsoft Research Faculty Summit, however, Microsoft officials are expected to shed more light on some of the parallel-programming and high-performance-computing work in which Microsoft Research and its university partners are engaged.

This year's Summit is slated for mid-July on the Microsoft Redmond campus. (Microsoft has allowed press to attend the opening keynote in the past, but not the full conference, which is designed for its academic partners.)

One of the hot buttons for the 2007 summit, according to the Web site: "What new approaches are required to drive fundamental advances in multi-core/many-core processing?"

On July 16, Microsoft Technical Fellow Burton Smith, a specialist in parallel and high-performance computing, is set to present on "The Future of Computing." The synopsis of his talk:

"The many-core inflection point presents a new challenge for our industry, namely general-purpose parallel computing. Unless this challenge is met, the continued growth and importance of computing itself and of the businesses engaged in it are at risk. We must make parallel programming easier and more generally applicable than it is now, and build hardware and software that will execute arbitrary parallel programs on whatever scale of system the user has. The changes needed to accomplish this are significant and affect computer architecture, the entire software development tool chain, and the army of application developers that will rely on those tools to develop parallel applications. This talk will point out a few of the hard problems that face us and some prospects for addressing them."
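Smith's point is that ordinary applications, not just scientific codes, will have to split their work across cores. As a loose sketch of what that means in practice, here is a data-parallel sum of squares using Python's standard library (this is purely illustrative -- it is not Microsoft's tool chain, and the function names are invented for this example):

```python
# Hedged illustration: splitting one computation across cores.
# Not Microsoft's approach -- just the kind of work-partitioning that
# parallel runtimes and languages aim to make automatic and safe.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(chunk):
    """Sum the squares of one slice of the data; runs on its own core."""
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # Split the input into one chunk per worker, then combine the results.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # Produces the same answer as the serial version, spread over 4 processes.
    print(parallel_sum_of_squares(list(range(1000))))
```

The hard part Smith alludes to is that today the programmer must do this partitioning by hand; the research goal is hardware and tools that run "arbitrary parallel programs on whatever scale of system the user has."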

Another session, slated for later that day, is entitled "Are New Programming Languages Needed to Exploit Manycore Architectures?" No synopsis is available for that one, but among the presenters is Mark Lewin, a program manager who focuses on programming languages, compilers, virtual machines, operating systems and "scalable manycore computing." Among the projects Lewin is working on are Microsoft's effort to add dynamic-language support to the Common Language Runtime, the Microsoft Research Bartok compiler and Singularity.

Speaking of Singularity, that's on the Faculty Summit agenda, too. Singularity is a Microsoft Research project that encompasses a new operating system, new programming language (Sing#) and new software verification tools. The Singularity OS revolves around software isolation of processes.

Singularity is a non-Windows-based microkernel that Microsoft researchers have written as 100 percent managed code. It is being designed, from the outset, to minimize internal subsystem dependencies. There's been talk that Microsoft is looking at what a Singularity plus Viridian hypervisor combo might bring to the OS table.
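Singularity's software-isolated processes share no memory at all; they interact only by exchanging messages over channels. As a rough analogy (using Python's multiprocessing as stand-in machinery -- nothing here is Singularity code, and the message shapes are invented for illustration):

```python
# Hedged analogy for software-isolated processes: the two sides share no
# state and communicate only via messages, as Singularity's processes do
# over channels. Python's Pipe is just a convenient stand-in.
from multiprocessing import Process, Pipe

def server(conn):
    # The "service" process: owns its own data and answers requests.
    while True:
        msg = conn.recv()
        if msg == "stop":
            break
        conn.send(msg.upper())  # replies with a value, never a shared pointer
    conn.close()

def run_demo():
    parent_end, child_end = Pipe()
    p = Process(target=server, args=(child_end,))
    p.start()
    parent_end.send("hello")
    reply = parent_end.recv()
    parent_end.send("stop")
    p.join()
    return reply

if __name__ == "__main__":
    print(run_demo())
```

Because neither side can reach into the other's memory, a crash or corruption in one process cannot silently damage the other -- the isolation property Singularity is designed to guarantee at the language level rather than with hardware protection.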

Maybe it's too soon to start thinking about this. But I'm wondering whether the move to many cores will necessitate such a thorough reworking of Windows and Microsoft's existing programming languages that they might have to be recreated from scratch. Will Microsoft finally decide it's time to cut the backwards-compatibility cord and move to a whole new architecture?
