First AMD Bulldozer gaming desktop PCs announced

Summary: It hasn't taken long for desktop manufacturers to start releasing gaming PCs featuring the new AMD Bulldozer desktop chips, which were officially launched on Tuesday. A number of boutique builders now offer configurations with the new FX series processors, with the flagship eight-core FX-8150 garnering the most attention.

TOPICS: Hardware, Processors

Maingear is offering its top-notch Shift desktop with a choice of the eight-core FX-8120 for a base price of $2,080 or the FX-8150 for $49 more, as well as the mid-range F131 starting at $1,271 with an FX-8120 and, again, $49 more for an FX-8150. For an additional $49, Maingear will overclock your PC. Meanwhile, Origin PC is equipping its Genesis desktop with the FX-8150, which it will factory overclock on request.

Some vendors are adding other Bulldozers to the mix. While AVADirect is offering its Zambezi Gaming System with the FX-8150 starting at $1,142.71, or the FX-8120 for $1,074.31, it also comes standard with the six-core FX-6100 for $1,040.11. iBuyPower has a trio of desktops built around Bulldozers, one with the FX-6100 (the $759 Gamer Mage D415 FX), another with the FX-8120 (the $989 Gamer Mage D295 FX), and a third with the FX-8150 (the $1,219 Gamer Mage D355 FX).

CyberPower has no fewer than eight different systems that can come with Bulldozer chips, including the new Gamer Scorpius line. A few of its budget systems come standard with the FX-4100 quad-core chip, including the Gamer Scorpius 7500, which starts at just $575.

Brands like Dell/Alienware and HP have yet to embrace Bulldozer, but we'll see if any of their upcoming systems for the holiday shopping season will ship with AMD's latest and greatest chips.

  • Re: First AMD Bulldozer gaming PCs announced

    In my previous post I left off with some questionable issues with the FX series. Like AMD in the past, it's hit and miss, leaving the marketing department to attract newbies and AMD fanboys who want to upgrade the processor in their 990FX motherboards, or who want something, smeared by spin or not, that isn't made by Intel. Facts be known, the polycrystalline-silicon FX-series processors carry a new name for AMD, and the transistor count for the eight-core FX-8150 is only 94 more than their Phenom II X4 980, to boost input-power-output. The i7 2600 ($15 more) has 143 more transistors for "four cores" compared to the eight-core FX-8150, with a lower TDP at full Turbo Boost and superior overclocking on the K-series Intel part. Overclocking either the FX-8150 or the i7 2600K will eliminate any turbo effects altogether anyway.

    The plus side for gamers with the FX series will be gaming on SLI-certified motherboards, so keep this in mind when configuring your system: Nvidia graphics cards will be superior to AMD's for most games and have fewer scaling issues, since the standard Nvidia GTX 570 and GTX 580 cards are built for SLI. The GTX 560 Ti and the AMD HD 6870, HD 6950, and HD 6970 are best intended as single discrete gaming cards. Personally I have had good luck with AMD HD 6850s in CrossFireX, and I think the core clock of the 6990 would allow for a ridiculously power-hungry monster of a 2x16 PCIe 3.0 gaming rig. Good luck!

    Mainstream desktop users should really do their homework before upgrading. The floating-point units in the FX series are weak on calculation potential compared with Intel's, and the $294 i7 3820 3.6GHz (socket 2011), with quad-channel capability and all DIMMs filled with supported DDR3-1600, makes for only a $1,100 computer using integrated graphics, high-quality components, and a good motherboard. Ask your boutique builder to make it happen for you.

    (Disclosure: The author of this Talkback has purchased a gaming computer online from CyberpowerPC in the past six months.)
    Rob T.
    • Go Big with Boutique Computer before finalizing your purchase & save big $$

      I can tell you SLI is great for beginners, because your AI opponents will be scaled back to a slower clock rate running through your CPU memory controller; if you were to step up and go with a single Radeon HD 6970 or GTX 580, the AI is on par with the core and memory clock of the single card and your opponents become more competitive at high speed. For the "Top Gun" enthusiast gamer who can hang, $729 can buy you the fastest video card on the planet, the reference-version MSI Radeon HD 6990 4GB GDDR5, and let you switch it to HD 6970 2GB GDDR5 core and memory clock speeds with the flick of a side lever covered with a warning label that says, "removing this label will void your warranty." This is the only manufacturer with this feature in the HD 6990 line of video cards. Don't be fooled by the mid-year cards with improved coolers and proprietary PCBs; they won't have it. Neither the ASUS DCII nor the MSI Twin Frozr II has it, and the cheaper reference Gigabyte and PowerColor cards do not have the jump from 900MHz @ 375 watts to 990MHz @ 450 watts. (ASUS began 2011 with this reference card and sold out by April.) Spend the money now on PSU and video card!
      Rob T.
  • RE: First AMD Bulldozer gaming desktop PCs announced

    8 cores. I have not even been able to stress my 6-core AMD. It auto-adjusts its speed from 2.4 to 2.9GHz and normally runs at 2.6. It always runs BOINC work for various causes in the background; I do not have it set to pause when the computer is in use.
    Always had an Intel before
    • RE: First AMD Bulldozer gaming desktop PCs announced

      @MoeFugger Except not really. It's closer to an Intel Hyper-Threaded quad core than a true 8-core, which was their goal: challenging HT technology.
  • All those benchmarks were posted months ago before final silicon existed

    Lies. I am tired of Intel fanboys making stuff up. Intel has a 95W TDP for 4 x86 cores while AMD has a 125W TDP for 8 x86 cores; that means Intel consumes 55 percent more power per core. Furthermore, AMD's turbo disables half of the cores and can boost 4 cores to 4.2GHz, while Intel sits at a measly 3.9 using 55 percent more power. And if that isn't fail enough for you, let's compare everything else and see if Intel wins at ANYTHING.

    AMD has an 1866MHz RAM controller, Intel a 1333MHz RAM controller: AMD wins. AMD has 16MB of cache, Intel 9MB: AMD wins. On price, AMD wins. Intel doesn't have full SLI or Crossfire; sorry, but if you have to cut each video card's bandwidth in half to make it work, then it's 8x8, while AMD is true 16x16 SLI/Crossfire. Intel costs over 220 percent more per core; yes, their 4-core costs more than AMD's 8-core with faster EVERYTHING, yes I said everything. The new extreme-series Sandy Bridge, when released, will also be slower per core, have half as many cores, and cost much more than AMD. The Bulldozer quad-core 4170 runs at 4.2GHz but uses less power and costs less than half of Intel's quad-core 2600K at 2.6GHz! AMD also holds CPU and RAM speed world records.

    Side note for Intel fanboys: the new 2700K is not a new CPU. It comes from the same line as the 2600K; the only difference is that Intel will hand-select the best silicon from the same trays, mark it 2700K, and raise the frequency, since the hand-picked silicon scales better. Intel always pushes stock speeds too close to the limit, so it's not as safe to overclock an Intel. Remember when Intel had to recall a CPU because it made mistakes at clock speed? Overclocked from the start is bad. So if you bought a 2600K months ago, there is at least a 20 percent chance you got a 2700K; but now that they will sell them separately, all 2600K CPUs will be the worst 60 percent that come off the line, and the best 40 percent will go to the 2700K's price and slightly higher stock clock speeds.

    I'm sorry to inform you that Intel supports 1600 RAM but only runs it at 1333 unless overclocked, while AMD can overclock to 2600MHz, so let's not compare unfairly. Also, AMD has 16x16 for SLI while Intel is 8x8 for SLI and Crossfire, so your per-card bandwidth is cut in half when using SLI/Crossfire.
  • Just be careful with Gigabyte boards

    Don't know about other manufacturers, but Gigabyte just released a BIOS update this month to ready their AM3+ boards for the new FX chips. That then bit me in the behind when I bought my components over the weekend, because I *didn't* get the FX chip... & the BIOS update FUBARs its ability to work with the pre-FX chips. I spent over 4 hours trying to diagnose & fix the stupid "Loading Operating System..." message on my screen, until a quick Internet search turned up multiple people having the same problem.

    As to why I didn't get the FX... a couple of reasons:

    1. First off, assuming I even buy any games in the near future, I'm going to be getting games like StarCraft II: games that were out of reach of my old system (a single-core Athlon XP), but that require at most a dual core, let alone a quad-, hexa-, or octo-core system. "Bleeding edge" isn't what I look for, and the FX chips are bleeding edge. It's also why I went for a quad-core Phenom; for my usage, processor speed is more important than core count.

    2. The various hardware review sites are pointing toward performance issues with the FX chips, most likely because of the decision to provide only one FPU per two CPU cores. When I see a quad-core Phenom II outperforming an octo-core FX, I can expect the quad-core FX to perform even worse [see #1]. I don't want to have to replace my processor within a year just because they didn't get it right.
    • Hey Dude, Congrats on the new system

      @spdragoo@

      I just had to post at the top of the board; you already know that I have an AMD mainstream gamer build for my Motocross Madness 2 game (Foxconn A88GMV motherboard, 500W Thermaltake PSU, Samsung Spinpoint 320GB HDD, Athlon II X3 455 3.3GHz, Patriot DDR3-1600 XMP CL8 memory, and AMD Radeon HD 6870).

      Like I said before, I am completely satisfied with this computer for everything online, and it stays in the game, unlike my Intel i7 2600K computer with CrossFireX and dual Radeon HD 6850s scaled back to 800MHz/1000MHz in order to stop the frame-rate stuttering; that rig smoked the game's AI opponents from the starting-gate holeshot to the 40-second lead I could pull out by the end of a 20-lap race. With my AMD rig I have the HD 6870 set at 940MHz/1160MHz, and it is competitive with the AI: I cannot make any mistakes in an entire 20-lap race against AI opponents on three of my seven Supercross track choices. Second place is the best I can get with Extreme Racing in the Pro-level 500cc class on those three tracks, and with more than four mistakes I am dead last, with only the last two laps of the race to pick off up to two riders in an eleven-bike field of competitors.

      This is my personal testament to the Intel i7 2600K and all that is told on its behalf, from AnandTech to Tom's Hardware. I am sure the coming compiling tests will reveal the true ability of the FX-8150, but I agree with you that the quad-core FX-4120 faces some disappointment by every standard of measure. Also, the "power" of a processor is in the number of transistors relative to die size, for efficiency of TDP and heat loss. My i7 2600K has a die size of 216mm² with 995 million transistors, while the Phenom II X4 has a die size of 258mm² with 758 million transistors. We know the 32nm process boasts higher efficiency in Sandy Bridge, as mine runs at 37C with an oversized $18 aluminum heatsink at 8,000 rpm fan speed in a mid-tower with two 120mm fans, one 200mm fan, and one integrated 80mm right-side case fan for multiple hard drives; a CyberpowerPC build with all Cooler Master blue LED fans in a Raidmax Black Storm (black and blue).

      Finally, as far as memory is concerned, getting DDR3-1600 to post at boot-up is not a problem for us mainstream computer hacks, and XMP with my AMD processor does achieve a full 1600MHz past JEDEC #2 1.5V @ 533MHz; it goes right to full-throttle XMP 1.6V @ 800MHz 8-8-8-24, then XMP 1.7V @ 800MHz 7-7-7-20. I wouldn't have it any other way, but it will cost you extra for the added punch. DDR3-1333 is more forgiving for everyday web browsing, without the serious blitz of low-latency CL8 DDR3-1600 memory. G.Skill Ripjaws CL8 8-8-8-24 DDR3-1600 XMP is a great value at $36 (2x2GB) online, and it is the only remaining 8-8-8-24 XMP kit that hasn't been re-tuned for Sandy Bridge as 8-9-8-24 XMP, for all you AMD guys thinking about what I am doing. Otherwise Corsair offers a more expensive $64 option, 8-8-8-24 CL8 DDR3 designed especially for AMD, which compares in price with DDR3-1866. Good luck with that!
      Rob T.
    • RE: First AMD Bulldozer gaming desktop PCs announced

      @spdragoo@... The real reason Bulldozer did not stack up in the benchmarks is the compiler used for each of them. All of these closed-source benchmarks are compiled with the standard Intel compiler and the Intel libraries, which do not enable any instructions beyond SSE3 on processors other than Intel's own. SSE4.1, SSE4.2, AVX, and FMA4 significantly increase the floating-point performance of AMD processors, but they go unused in code built with the Intel compiler. Intel was sued and settled with AMD, but it hasn't fixed this, because undoing all the work its teams did to cripple AMD would be too expensive and would hand AMD a huge performance increase at the same time.

      If you look at the integer performance of the benchmarks, AMD almost always out-performs the intel chips and shows a 15-30% increase in performance over the Phenom II x6 processors. If the compiler used was completely optimized for both Intel and AMD, floating point performance would also show similar gains.

      Lastly, under full load where all of the threads are being used, the Intel chip is not physically capable of beating the AMD chip. 4 cores that complete one instruction each per cycle cannot physically beat 8 cores completing 1 instruction each per cycle, when threads are continually running.
      • RE: First AMD Bulldozer gaming desktop PCs announced


        You make valid points, and I agree Intel has bought its way into the benchmark wars; that is a given. I would like to see a better real-world comparison, if only in-game performance frame rates, where AMD GPUs are really competing with the GTX 570 and GTX 580 under settings heavy enough to also load the CPU. Then it would be the era of "The Fanboys," AMD and Intel. This is what makes for T-shirts and stickers, even posters on the wall of your kid's bedroom.

        Currently I have not used my Intel computer for about three weeks since building this AMD Rana computer running Windows 7 Professional. I honestly think there is a break-in period for processors and video cards, and this machine has shown that to be true for AMD's stuff; it is in every way comparable with my i7 2600K computer for what I do. Even with a different style of vertical peak performance in dual-graphics mode and competitive horizontal acceleration from the single card at increased frequencies, both of my computers give the same lap times and averages. It just beats all that no one has come forward, as of 2011, to discuss how increasing fps with SLI or CrossFireX mainly benefits the beginner.

        I remember playing "Far Cry" years ago at Best Buy on a boutique gaming computer (Velocity) with a single GTX 7900, and only one of the employees could get into the third level on that setup. I picked it up from there and got blasted from all angles while trying to take my first shot. It would be nice to see the high-end cards put to this challenge.
        Rob T.
  • Apparently a bit lackluster as a gamer

    From what I've read, the FX actually is topped by the Phenom II on a clock-for-clock basis in some games. Probably better for workloads that scale well to 8 cores, perhaps video editing or scientific simulation work.