What will next-generation multicore apps look like?

Summary: Microsoft has started thinking through what apps might look like in five-plus years. And Craig Mundie, Chief Research and Strategy Officer, last week started articulating publicly -- albeit at a very high level -- some of the things Microsoft is mulling in the area of multicore applications.

The move to multicore processing is well underway. But the creation of software that takes advantage of multicore capabilities is still little more than a twinkle in the eyes of developers -- including Microsoft's developers.

Microsoft has started thinking through what apps might look like in five-plus years. And Craig Mundie, Chief Research and Strategy Officer, last week started articulating publicly -- albeit at a very high level -- some of the things Microsoft is mulling in the area of multicore applications.

Of all the presentations at Microsoft's annual Financial Analyst Meeting last week, the one that got me thinking the most was Mundie's. When I've heard Mundie speak before, he has usually been showing off a bunch of Microsoft research projects that are typically a long way from seeing the commercialized light of day.

But at FAM this year, Mundie gave a different kind of talk.

"The world is going to move more and more away from one CPU that is multiplexed to do everything to many CPUs, and perhaps specialty CPUs. This is not the world that the programmers target today. This kind of complexity was historically reserved only for the wizards who wrote the core operating system; or, in the world of supercomputing in science and engineering, people who had the ultimate requirement for computational performance built big machines like this and have used them to solve some of the world's tough computational problems. That was always a niche part of the industry," Mundie told the Wall Street analysts and press attending the July 26 FAM event.

(You also can read the full transcript of Mundie's talk and/or check out his PowerPoint presentation.)

Microsoft focused heavily on what the wholesale change to multicore will mean to programming languages and software design during its recent Faculty Research Summit. But Mundie went beyond the usual pie-in-the-sky high-performance-computing talk. Instead, he spoke at length on what all this multicore power will let users do.

"This (move to multicore) presages a change where the industry at large, the whole concept of applications, will ultimately have to be restructured in order to think about how to take advantage of these machines, because they won't just get faster every year. They'll get more powerful, but in fact only if you're able to master these problems of concurrency and complexity."

It's not a matter of getting PowerPoint to run 50 percent faster. Multicore means there will be a whole new class of apps in the not-so-distant future, Mundie said.

"When those (apps like Lotus 123 and Wordstar) were the paradigm of word processing and spreadsheets, they were matched to the capability of that microprocessor," Mundie said. "But as the microprocessor has grown dramatically in capability, as has the whole system, the concept of the app hasn't fundamentally changed that much. And so the question that looms in my mind for Microsoft and ultimately for the industry is: What are those future applications and what might they look like? And in fact can we move to use all of the power that's there - not just to make them responsiveness to a new class of demand from you, but ultimately to do things for you that are more like what people who help you do for you? "

The cloud-services model meshes well with multicore, Mundie said, given that multicore apps will be more asynchronous, loosely coupled, concurrent, composable, decentralized and resiliently designed. These kinds of features almost by their very nature call for a cloud-computing model with giant datacenters' worth of power.
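To make those adjectives slightly more concrete, here is a minimal, hypothetical sketch (not anything Mundie presented; the "calendar" and "traffic" lookups are invented) of an asynchronous, loosely coupled, composable bit of application logic, written here with Java's CompletableFuture:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ComposableAppSketch {
    public static void main(String[] args) throws Exception {
        // A pool sized to the machine's cores stands in for "many CPUs".
        ExecutorService pool =
                Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());

        // Two independent, loosely coupled lookups run concurrently...
        CompletableFuture<String> localContext =
                CompletableFuture.supplyAsync(() -> "calendar: free after 3pm", pool);
        CompletableFuture<String> cloudContext =
                CompletableFuture.supplyAsync(() -> "traffic: 25 min to downtown", pool);

        // ...and are composed into a single result without either blocking the other.
        CompletableFuture<String> suggestion = localContext.thenCombine(
                cloudContext,
                (cal, traffic) -> "Suggestion: leave at 3:30 (" + cal + "; " + traffic + ")");

        System.out.println(suggestion.get());
        pool.shutdown();
    }
}
```

The only point of the sketch is that the two lookups run concurrently on whatever cores are free and are combined without either one waiting on the other.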

There were more adjectives on Mundie's multicore-app-design list: Context-aware, model-based, personalized, humanistic, adaptive and immersive. Sounds good, but what will these things mean to what the next generation of apps can do? Mundie offered a few broad-brush ideas:

"One of the things that will make (multicore apps) more useful is that they'll be more personalized. We're moving to an era where IT will make a lot of things more personal. People talk about moving from just-in-time manufacturing to personal manufacturing. Clearly the world of health care aspires to move in that direction to personalize medicine and health care. But think if the computer was really much more personalized in terms of what it did for you, even if it was derived from some generic capability. I think one of the big changes, it will become more humanistic - your ability to interact with the machine more the way you would interact with other people will accrue from this big increase in power. "

In the end, Mundie kind of spun off into sci-fi land, talking about artificial-intelligence kinds of scenarios where your computer would anticipate things you might be interested in, etc. But back here on earth, his ideas about creating new kinds of app and service combinations that might work more like we do and that could make us more productive sounded intriguing.

Granted, a lot of operating system and design tool changes need to happen before anything changes on the app side of the house. But Mundie offered up at least a taste of the kinds of things that Microsoft is contemplating could happen by next decade.

Talkback

  • PLINQ Is Part of the Multi-Core Story

    Microsoft is staffing up its Parallel LINQ (PLINQ) development group to take advantage of multicore processors in handling LINQ expressions (especially Group Join and Group By).

    Joe Duffy has more information and a set of PowerPoint slides about PLINQ, which he presented to the Declarative Aspects of Multicore Programming (DAMP) workshop at POPL in January 2007.

    This "PLINQ Gathers Momentum" post (http://oakleafblog.blogspot.com/2006/12/plinq-gathers-momentum.html) from December 2006 has more info on PLINQ.

    --rj
    Roger_Jennings
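PLINQ itself is a .NET feature, so a direct example would be C#; as a rough sketch of the same parallel Group By idea using Java's parallel streams (the Order record and the sample data here are invented purely for illustration):

```java
import java.util.List;
import java.util.concurrent.ConcurrentMap;
import java.util.stream.Collectors;

public class ParallelGroupBySketch {
    // Hypothetical record standing in for whatever the query groups over.
    record Order(String customer, double total) {}

    public static void main(String[] args) {
        List<Order> orders = List.of(
                new Order("Contoso", 120.0),
                new Order("Fabrikam", 80.0),
                new Order("Contoso", 45.5));

        // The grouping is evaluated across cores, roughly what PLINQ does
        // for a LINQ "group by" once AsParallel() is added on the .NET side.
        ConcurrentMap<String, Double> totalsByCustomer = orders.parallelStream()
                .collect(Collectors.groupingByConcurrent(
                        Order::customer,
                        Collectors.summingDouble(Order::total)));

        System.out.println(totalsByCustomer);
    }
}
```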
  • Multi-core has no advantage to

    the apps we use today. Look, a word processor, no matter how the code is written, is going to run faster than you or I can type or even click to update the layout. The same is true of almost all apps used by Joe Average. Multi-core CPUs simply provide nothing for the apps or their users.

    Sure, there are special apps (media/CAD) that can use them, but they are a long way from being mainstream apps.
    No_Ax_to_Grind
    • what?!

      Haven't you noticed that as you type, there is a spell and grammar checker running? More cores open the possibility of more demanding grammar and semantical checkers in a word processing app. This is just a small example.
      bond07
      • Yes, and my old 3.2 Ghz single core

        does all of it in the background and doesn't slow me down a bit. The limiting factor in the word processor is how fast I can type, not the CPU.
        No_Ax_to_Grind
        • You guys don't get it

          Computers have pretty much done the same thing since they were first created: manipulate data represented in mathematical form. All apps boil down to this. The only real benefit that has been achieved since the days of vacuum tubes is how the data is presented to the user. 70% of hardware utilization is human interface, 25% is sorting and organizing data; the programs do the same old things. Yes, a 500MHz K6 or Celeron can't number crunch nearly as fast as a Core 2, but that's not the point. Now, in terms of multi-core: can your 3.2GHz single core, say, read as you type, display the doc as a photo-realistic HD image, translate to other languages automatically, type as you speak, explain grammatical errors in English, etc.? And if you're thinking those aren't CPU intensive, try putting an 8800 GTX in a 3.2GHz single-core machine and see what happens. And how much money you wasted.
          usmcdvldg
      • I doubt this will affect grammar/spelling checkers

        Bah, I doubt that multicores will make them any better. They're not really intensive at all, and they depend more on how they are programmed than on how many cores they are given. I fail to see how multiple cores will benefit them.
        CobraA1
        • that's why I said semantical checker

          Today we don't have semantical checkers because that is a more demanding AI process. My point was that with more cores, you can run more sophisticated processes in the background, like complex AI programs. It's kind of too obvious a point to mention that today's spell/grammar checkers work just fine, don't you think?

          The goal is to demonstrate that you may want to run highly demanding background processes even for a word processor, and a semantical checker proves that.

          I'm assuming that you have a sense of what checking the semantics would entail.
          bond07
          • What do you mean by semantical checker?

            What do you mean by semantical checker? How would it be different from a grammar checker? Why would it need a "demanding AI?"
            CobraA1
    • People often run multiple apps and all the services running

      People may think they are running only one app, but there are lots of services running on a computer that can benefit from having multiple cores. Even for one app, two cores have often meant that the thread the GUI runs on does not get hung up anywhere near as often if the main threads are CPU-intensive.

      It also means that you can make apps hook up to each other more reliably. Run apps that hook up to data in your local MySQL or SQL Server Express without worrying about the performance hit of just having the db running at all.

      Basically it allows loosely-coupled interactions that would have had to be thread-optimised on a single CPU.
      Patanjali
      • And if you made most mainstream apps 10X faster

        the end user would not be able to tell the difference.
        No_Ax_to_Grind
        • you fail to see the point

          The end user needs to see useful results, that's all. And the sooner they see the results the better (e.g., after each keystroke). Multicore can allow you to do more things while the user types. The question is what you can do with the extra power that can be useful for the end user. As I mentioned, AI apps are famous for being multi-CPU hungry, since they usually can investigate multiple interpretation branches in parallel. This suggests the creation of more intelligent human-computer interfaces than the ones of today.
          bond07
          • WRONG - The user does not need Faster Apps but more powerful APPS.

            Guys,
            As long as the user is not slowed down by the system when working in a document, he/she does not need a FASTER app as much as a more powerful application, or set of applications.

            Face it, as the guy said earlier: as long as the user sees results on the screen in a reasonable time, then nothing else matters as far as apparent application speed is concerned.

            But if the application(s) are more powerful - i.e., get more done, make it easier for the user, or do new things, without slowing down the user in other areas - that's something people will pay for.

            Do you really care if your CPU spends 98% or 78% of its time in the idle process? No; as long as it is responsive to you, it is fast enough. Would mainstream people pay big bucks so that their machines could spend more time in the idle process? No, but they will pay if the machine does new things as well.
            -jeff
            poundjd
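The responsiveness argument running through this sub-thread boils down to one pattern: keep the thread that handles input and painting free, and push CPU-heavy work onto another thread (and, on a multi-core machine, another core). A minimal sketch of that pattern, assuming a Swing UI and an invented busy loop standing in for the "heavy" work:

```java
import java.awt.BorderLayout;
import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.SwingUtilities;
import javax.swing.SwingWorker;

public class ResponsiveUiSketch {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Responsive UI sketch");
            JLabel status = new JLabel("Idle");
            JButton run = new JButton("Run heavy job");

            run.addActionListener(e -> {
                status.setText("Working...");
                // The heavy loop runs on a worker thread (another core, if one
                // is free), so the event-dispatch thread keeps painting and
                // responding to clicks and keystrokes.
                new SwingWorker<Long, Void>() {
                    @Override
                    protected Long doInBackground() {
                        long sum = 0;
                        for (long i = 0; i < 2_000_000_000L; i++) sum += i;
                        return sum;
                    }

                    @Override
                    protected void done() {
                        status.setText("Done");
                    }
                }.execute();
            });

            frame.add(run, BorderLayout.NORTH);
            frame.add(status, BorderLayout.SOUTH);
            frame.setSize(300, 120);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);
        });
    }
}
```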
    • I see media becoming more mainstream

      Okay, so CAD isn't becoming mainstream.

      But media certainly is - both the video kind and the 3D kind. I see a lot more personal videos (think YouTube) than ever, and gaming is a huge industry on par with Hollywood.

      Not to mention Hollywood is becoming more common on PCs also - already iTunes has a large (and ever-expanding) collection of movies, and both Apple and Microsoft are really pushing hard to make the PC a media and entertainment center rather than just a work tool.

      Seriously, don't underestimate the impact of media. It's a lot larger than you think.
      CobraA1
      • Media already takes advantage of

        multi-core / CPUs.
        No_Ax_to_Grind
        • . . . which pretty much contradicts your first post

          . . . and you just contradicted your first post, which claimed "Multi-core has no advantage to the apps we use today."

          My point isn't that they are multi-core, which I already know. My point is that they're becoming mainstream.
          CobraA1
    • Multi-core advantages to common apps

      Multiple cores could provide numerous benefits to common end-user apps.

      Word not only does spell checking, but also reformats the document (figuring out page breaks) in the background. For docs with hundreds of pages that process is not "instant", and having an extra core to work with should certainly help. Assuming that process already runs in a separate thread, it should benefit automatically from a second core.

      Converting video files to different formats is another common, processor-intensive task that can max out all available cores. One "Joe average" example would be converting a DVD or home video to Apple TV and iPod resolutions.

      It's not hard to imagine an average user, today, editing a 200-page policy manual in Word while at the same time converting a 2-hour video for their iPod.
      pointzerotwo1
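The video-conversion example scales the same way on any number of cores: split the job into independent chunks and hand them to a pool sized to the machine. A minimal sketch of that fan-out, with convertSegment() as a made-up stand-in for real codec work:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelConvertSketch {
    // Hypothetical stand-in for real codec work on one chunk of a video.
    static String convertSegment(int index) {
        return "segment-" + index + ".m4v";
    }

    public static void main(String[] args) throws Exception {
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        // One task per segment; the pool keeps every core busy until the work is done.
        List<Callable<String>> tasks = new ArrayList<>();
        for (int i = 0; i < 16; i++) {
            final int index = i;
            tasks.add(() -> convertSegment(index));
        }

        for (Future<String> f : pool.invokeAll(tasks)) {
            System.out.println("finished " + f.get());
        }
        pool.shutdown();
    }
}
```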
    • You don't have a clue

      No axe to grind? Make that "no clue to be had."

      As a programmer who just traded up from a dual core to a double quad core, and who writes apps that use every milliwatt of those CPUs, I can tell you what I think: you don't have a clue.

      Just a run-of-the-mill clueless windbag, trying to make others think he has a clue.
      Metaphore
  • Linux...

    Since Linux Geek hasn't posted to this story yet I thought I'd beat 'em to the punch.

    Linux is already leveraging these multi-core capabilities; we're even making versions of Linux that work with stuff that hasn't even been invented yet by the chip manufacturers. Linux is already implementing new ways that our superior OS will interact with users on a much more human level than Windows ever could.

    Yes, over here in Linux-Land, things are going pretty good. Microsoft is a monopoly and couldn't code itself out of a wet paper bag. Victory is mine. --There I said it.
    BFD
    • Go back to your cave troll

      We are not talking about the OS, bonehead; we are talking about applications taking advantage of multi-core. Linux application developers have far less advanced tools to accomplish this than Windows application developers currently have. Go back to your cave, troll!
      ea01bg
  • GEE! WOOW!

    I would be happy if I could get any MSFT application to work correctly three times in a row. Instead of rolling out another wave of utterly false WOW! promises, why can't MSFT finally produce a product that actually does what it promises?

    Who needs more power when you cannot use the power that you have, because of the utter junk software that MSFT sells?

    From laptop software to mobile software to music players, MSFT has a well deserved reputation for producing JUNK.

    Enough with all the far-flung promises. Fix the junk you already sold me.
    Jeremy W