NVIDIA "de-optimizes" PhysX for the CPU

NVIDIA "de-optimizes" PhysX for the CPU

Summary: NVIDIA has purposefully de-optimized its PhysX realtime physics engine when it is run on the CPU, according to research carried out by Real World Tech.

NVIDIA has purposefully de-optimized its PhysX realtime physics engine when it is run on the CPU, according to research carried out by Real World Tech.

The article goes on to claim that NVIDIA makes use of the x87 instruction set for floating-point math rather than the more modern, and vastly superior, SSE.

The truth is that there is no technical reason for PhysX to be using x87 code. PhysX uses x87 because Ageia and now Nvidia want it that way.

...

Using x87 definitely makes the GPU look better, since the CPU will perform worse than if the code were properly generated to use packed SSE instructions.

How old is x87? This should give you a clue:

Intel started discouraging the use of x87 with the introduction of the P4 in late 2000. AMD deprecated x87 since the K8 in 2003, as x86-64 is defined with SSE2 support; VIA’s C7 has supported SSE2 since 2005. In 64-bit versions of Windows, x87 is deprecated for user-mode, and prohibited entirely in kernel-mode. Pretty much everyone in the industry has recommended SSE over x87 since 2005 and there are no reasons to use x87, unless software has to run on an embedded Pentium or 486.
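
To make the difference concrete, here is a minimal C sketch (not PhysX source; the function names and the use of SSE intrinsics are purely illustrative). The same per-element update is written once as ordinary scalar code, which an x87-targeting compiler evaluates one float at a time on the FPU stack, and once with packed SSE intrinsics, which process four floats per instruction:

    #include <xmmintrin.h>  /* SSE intrinsics (illustrative; not from PhysX) */

    /* Scalar version: compiled for x87, each multiply/add becomes a
       stack-based instruction that handles a single float at a time. */
    void integrate_scalar(float *pos, const float *vel, float dt, int n)
    {
        for (int i = 0; i < n; ++i)
            pos[i] += vel[i] * dt;
    }

    /* Packed SSE version: one _mm_mul_ps/_mm_add_ps pair handles four
       floats per iteration. For brevity this assumes n is a multiple of 4
       and that the arrays are 16-byte aligned. */
    void integrate_sse(float *pos, const float *vel, float dt, int n)
    {
        __m128 vdt = _mm_set1_ps(dt);
        for (int i = 0; i < n; i += 4) {
            __m128 p = _mm_load_ps(&pos[i]);
            __m128 v = _mm_load_ps(&vel[i]);
            p = _mm_add_ps(p, _mm_mul_ps(v, vdt));
            _mm_store_ps(&pos[i], p);
        }
    }

In practice a compiler can often generate the packed form automatically from the scalar source; the point is simply that this throughput gap between the two code paths is what the x87-versus-SSE argument is about.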

SemiAccurate's Charlie Demerjian offers an insight into the lengths NVIDIA must have gone to in order to use x87 over SSE:

What does this mean? Well, to not use SSE on any modern compiler, you have to explicitly tell the compiler to avoid it. The fact that it has been in every Intel chip released for a decade means it is assumed everywhere. Nvidia had to go out of their way to make it x87 only, and that wasn't by accident, it could not have been.
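
As a rough illustration of that point, here is how the instruction-set choice is made at build time, assuming GCC as the compiler (other toolchains have their own switches, and nothing here is a claim about the toolchain Ageia or NVIDIA actually used):

    /* dot3.c -- an ordinary floating-point routine, nothing x87- or SSE-specific */
    float dot3(const float *a, const float *b)
    {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    /* 64-bit target: SSE2 is part of the x86-64 baseline, so
           gcc -O2 -c dot3.c
       emits scalar SSE instructions (mulss/addss) by default.

       32-bit target: the floating-point unit is chosen explicitly:
           gcc -O2 -m32 -mfpmath=387 -c dot3.c        -> x87 stack code (fmul/fadd)
           gcc -O2 -m32 -msse -mfpmath=sse -c dot3.c  -> scalar SSE code

       In other words, on any 64-bit build the old instruction set is something
       you have to opt into rather than something you fall into by accident. */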

So, what's going on here? Well, it seems that NVIDIA has chosen an archaic instruction set for the CPU in order to make PhysX on the GPU seem so much better. Back to Real World Tech:

For Nvidia, decreasing the baseline CPU performance by using x87 instructions and a single thread makes GPUs look better. This tactic calls into question the CPU vs. GPU comparisons made using PhysX; but the name of the game at Nvidia is making the GPU look good, and PhysX certainly fits the bill in the current incarnation.
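
The single-thread half of that criticism is easy to picture. The sketch below (again illustrative C, not PhysX code; it assumes a compiler with OpenMP support, e.g. gcc -fopenmp) shows the same per-body update run on one core versus spread across however many cores the machine has:

    typedef struct { float px, py, pz, vx, vy, vz; } Body;  /* illustrative type */

    /* Serial baseline: one core does all of the physics work while the
       remaining cores sit idle. */
    void step_serial(Body *b, int n, float dt)
    {
        for (int i = 0; i < n; ++i) {
            b[i].px += b[i].vx * dt;
            b[i].py += b[i].vy * dt;
            b[i].pz += b[i].vz * dt;
        }
    }

    /* Threaded version: the per-body updates are independent, so they can
       be split across the available cores with a single OpenMP directive. */
    void step_parallel(Body *b, int n, float dt)
    {
        #pragma omp parallel for
        for (int i = 0; i < n; ++i) {
            b[i].px += b[i].vx * dt;
            b[i].py += b[i].vy * dt;
            b[i].pz += b[i].vz * dt;
        }
    }

Real collision detection and constraint solving are harder to parallelize than this, but the quote's point stands: a single x87 thread is a very low baseline for a modern multi-core CPU.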

Bottom line, NVIDIA wants to sell GPUs, and giving PhysX the advantage on the GPU fits in perfectly with this agenda.

Talkback

12 comments
  • RE: NVIDIA

    Again, Nvidia's new slogan should be, "Turning gold into crap!" They did it with 3dfx: they took a good software package, bought it, then killed it. Remember, all cards went to DirectX after that. I wonder if this information has anything to do with the current Intel/Nvidia spat over onboard GPUs? I don't think I would want my video card dragging down my processor; doesn't that defeat the whole purpose of the video card in the first place? Aren't they used to lessen the loads on both the processor and RAM? Isn't that the function of GDDR? And exactly how are these people getting into these positions of power in the first place? Maybe they should look into developing the heatsinks and fans for their next cards out of Tinkertoys?
    trust2112@...
  • RE: NVIDIA

    I don't like nVidia; they aren't out to produce amazing products and improve people's graphics experiences, they're out there to grab as much of your money as they can, and they'll be as backward and sly as they can in doing so.
    Naryan
    • RE: NVIDIA

      @Naryan
      Isn't that what all public companies are expected to do for their stockholders?
      I don't like it either, but that's the way it works.
      city_zen
  • when relevant?

    When does PhysX execute on the CPU instead of the GPU?
    GDF
    • RE: NVIDIA

      @GDF When you don't have a compatible GPU and run PhysX stuff.
      wcecsharp@...
  • It's sad

    Nvidia used to be the love of my video cards, but lately they just spam more and more crap to the market and do more and more things to ruin themselves. For a while there I remember hearing they were going to drop out of the high-end video card market. Now this? Looks like I'll just have to go with ATI. Anyway, you get more bang for your buck with them :)
    OneTwoc21
  • Adrian, More info please

    From the details you have given in this post, it is very difficult to form an honest opinion on this topic. I would request that you continue your research into whether this bit of info is even relevant to users of nVidia graphics adapters. See if it is possible to separate fact from fiction and give us just the information that would help us form a buying decision based on reliable facts. Thanks.
    zetacon4@...
    • RE: NVIDIA

      Erm... no it isn't. Nvidia are using an ancient instruction set rather than SSE, which has itself been around for ages. They have gone to great lengths to ensure that anyone not using their GPUs has less than satisfactory performance. Self-explanatory, I think?
      12312332123
  • RE: NVIDIA

    Find the very detailed report on which this article is based by following the link to 'Real World Tech.' This is provided in the first paragraph of Mr. Kingsley-Hughes' article and it should more than satisfy.
    Starbuckin
  • RE: NVIDIA

    Wow, it was so simple, maybe I should be a freaking blog writer. I took the simple route and simply *asked* NVidia directly; they promptly pointed me toward an article which TOTALLY makes sense. Again, people should stop and think for a minute before they mouth off about things they apparently know nothing about. Since I knew nothing about it, I simply posed a basic question and got an answer, imagine that!

    http://arstechnica.com/gaming/news/2010/07/did-nvidia-cripple-its-cpu-gaming-physics-library-to-spite-intel.ars

    So the gist of it is that GAME DEVELOPERS have no interest in a recompile for the SSE instruction set, and further, the GAME DEVELOPERS often compile as x87 themselves. There are various reasons behind the x87 code, and NVidia makes no claim that this is the best way to go, which is why version 3.0, coming next year, is a ground-up rebuild of PhysX. That is a clear indication NVidia does not intend to simply recompile an ancient code base, which, according to NVidia, would do nothing to improve performance since the bottlenecks lie elsewhere.

    So in short, NVidia is pleasing the GAME DEVELOPERS, because if they do not, then the developers will not use PhysX. So like it or not, conspiracy against Intel or not, NVidia has made the right business move to maintain its customer base.
    ryanstrassburg
    • RE: NVIDIA

      @ryanstrassburg From the same article you pointed out:

      "But still, when you boil it all down, we keep coming back to the point that it's so easy to switch from x87 to SSE, and x87 has been deprecated for so long, and it's so much to NVIDIA's advantage to be able to tout the GPU's superiority over the CPU for physics, that it's very hard to shake the feeling that there's some kind of malicious neglect going on."

      The article you based your post on points out that PhysX on the CPU uses a single thread. Given that a quad-core hyper-threaded system has "8 logical CPUs", such an approach on nVidia's behalf is asinine. Outside of the main game threads running on one CPU and the software-based PhysX calculations on another, you wind up with six idle CPUs.

      The reason for all this is simple: nVidia wants you to exercise its silicon (GPU) on the PC, not Intel's (CPU) or ATI's. While we're on ATI, nVidia has gone to great lengths to penalize the consumer if you elect to use anyone else's hardware. If you run an ATI card alongside an nVidia card (because you're driving multiple monitors, or you simply want to drive a secondary monitor AND use hardware-based PhysX), the nVidia drivers will explicitly disable hardware PhysX support. This saga is well documented:

      http://tech.slashdot.org/story/09/10/04/1729245/Patch-Re-Enables-PhysX-When-ATI-Card-Is-Present?from=rss

      <b>So even if you're willing to pay nVidia for its silicon, even if just for hardware PhysX, nVidia is not interested unless it runs everything (graphics included).</b>

      Thankfully, GenL's efforts to patch nVidia's drivers, which disable hardware PhysX if they see an ATI product, have paid off. I know because I've been leveraging the patches.

      The idea of asking nVidia for a straight answer is like asking BP about the environmental impact on the Gulf of Mexico due to its oil spill. Namely, if you believe the answers, I've got a bridge to sell you.

      -M
      betelgeuse68
  • If it's PCI Express, it's SSE-OK

    More to the point on legacy compatibility: it will be hard, if not impossible, to find a system with a slot for a modern graphics card but no SSE support.
    cquirke1