Intel 'preparing' to put an end to user-replaceable CPUs

Summary: Reports suggest that Intel is preparing to kill off PC upgrades by adopting a BGA rather than an LGA package for its upcoming Broadwell architecture processors. This is the beginning of the end for the desktop PC.

Yesterday, a report emerged claiming that Intel is planning to release its upcoming 14-nanometer Broadwell architecture processors in a ball grid array (BGA) rather than a land grid array (LGA) package.

This would have several widespread implications, including bringing an end to processor (CPU) upgrades.

Traditionally, the processors in desktop systems are fitted into a socket on the motherboard that allows them to be removed and replaced, while systems such as notebooks and tablets have the CPU soldered onto the motherboard.

At present, Intel uses the LGA package design, which allows the processor either to be fitted into a socket or soldered directly to a motherboard. This gives OEMs further down the line a choice of how to mount the processor on the motherboard.

A switch to BGA would mean that the processor could no longer be fitted into a socket, where it could be removed or replaced; instead it would be soldered to the motherboard, much as processors for notebooks and tablets are today.

The rumor that Intel was planning a switch from LGA to BGA has been circulating for months, but earlier this week Japanese tech site PC Watch (translation here) was the first to break the news.

I now have independent confirmation from a PC-building OEM, who declined to be named, along with two motherboard makers, that Intel has briefed them on the switch from LGA to BGA for Broadwell architecture processors, which are expected to make an appearance next year.

Separately, tech site SemiAccurate has also received confirmation from two unnamed PC OEMs.

Why the switch?

First and foremost, at least from Intel's point of view, is that this move puts the chip giant in an even more commanding position, allowing it greater control over the motherboard market. More control means more money.

While it doesn't seem that Intel wants to cut existing motherboard makers out of the equation just yet, sources I have spoken to seem worried that this could happen in the mid to long term.

The vast array of motherboard choices that both enthusiasts and OEMs currently enjoy could be a thing of the past in a couple of years.

It's a move that could make PC OEMs happy too. Soldering a processor to a motherboard is cheaper than soldering on a socket and then fitting a processor into that socket. The difference might only be pennies, but spread over millions of PCs, those pennies add up.

As far as the PC OEMs are concerned, killing off the PC upgrade market would be a good thing because it would push people to buy new PCs rather than upgrade their existing hardware. The PC industry is currently stagnant, partly because consumers and enterprise are making existing hardware last longer.

The casualties of this move will be upgraders and PC 'modders', and the huge market that exists around them. While not many people bother to upgrade their PCs, instead choosing to buy new ones, the market is large enough to support countless manufacturers and vendors. This move by Intel would be the final nail in the coffin for that industry, taking down a number of players. This, unfortunately, would have a corresponding knock-on effect on jobs.

Intel wins. OEMs win. People wanting cheap PCs win. But there are a lot of losers.

According to SemiAccurate, the successor of the Broadwell architecture, called Skylake, will bring back a socketed CPU, "for a generation, possibly two," but I have not been able to confirm this independently.

It seems that this is the beginning of the end for upgrades, and not just CPU upgrades. Apple is already soldering RAM onto the motherboards of its MacBook Pro systems.

This feels to me like the beginning of the end for the desktop PC. Modularity made the desktop PC, and removing this key feature will break it. 

Topics: Intel, Hardware, Processors

Talkback

  • But wait...

    Intel aren't the only circus in town, and while AMD keep doing what they're doing, they'll likely cannibalise a large chunk of Intel's market share.
    wakieAU
    • Or that for the many that don't upgrade their CPU,

      it won't make a difference. And depending on your MB, how many OEMs make MBs that are upgradable anyway?

      Besides, companies like Tyco Electronics and Advanced Interconnects make solderless BGA sockets, but that would require Intel making all their new CPUs BGAs, and MB makers using such sockets on their designs, which would increase costs.

      But to say that it's just to give Intel greater control over the MB market also overlooks some advantages of BGAs over PGAs.
      William Farrel
      • Modders and Geeks

        They will be hitting a market that goes out of its way to make a rig that is great. I choose the right motherboard chipset to match the CPU. I guess it would not be a problem if the right match is made, but that's a lot of combinations to cover.
        happyharry_z
        • Modders and geeks are very influential

          They are a fraction of the market, but they usually lead the direction where the rest of the market heads to catch up. The same applies to gamers, a largely overlapping category. AMD must be laughing out loud at this news. Not only will DIY consumers tend to buy more of their products, but mainboard manufacturers will tend to gravitate more toward AMD and push sales further.

          Another point that may escape US readers is that in many countries, garage-made PCs are market leaders because they are much cheaper than ready-made HP or Dell models - often less than half the price and with superior performance. There is a whole industry of small businesses that assemble such PCs for their friends, friends of friends and so on. While they could perfectly well do that with BGA board+CPU combos, the LGA model fits their business much better.
          goyta
          • Modders and geeks are very influential (in their minds)

            In reality they are people who have few social skills and are ill-fit for society. Most are addicted to online games or internet porn. To help prepare my son for his life and to avoid this type of twisted thinking, I got him a Mac.
            Dr Phil of Crap
          • "... people who have few social skills and are ill-fit for society."

            The quintessential description of a Mac user if I ever saw one. :-)
            IT_Fella
          • Best off with a bottle of white lightning

            By your logic, it's the only way to save his social skills.

            I take it from your comments, you were a modder and geek?
            Little Old Man
          • Now Tell me

            How does your comment apply to the argument at hand? Do you know what ad hominem means?
            Scatcatpdx
          • Modders and geeks are very influential

            I work for the largest Independent Software Vendor on the planet as a software engineer. I'm 63, have been married 42 years, have 2 daughters, both married, and have 2 grandsons. I own a home and 2 cars and, at this point, have no debt at all.

            There are 8 PCs in this house, 2 of them laptops, one from my company and one of my own. I didn't build either laptop from components (duh), though I've replaced hard drives and memory in both. I built the other 6 PCs from components. Some of them I've rebuilt (new motherboards, new processors, or both) more than once. There is exactly ONE game I find myself playing occasionally (every few weeks), but I don't consider that to be habitual.

            Apparently, I am the exception to ALL the rules Dr Phil of Crap laid down for us all. I enjoy getting MY combinations of motherboards, processors, memory, and various add-on cards set up just the way I want them. Taking that freedom (somewhat of a hobby, actually) away is NOT the choice I would make for myself. About half of those systems run INTEL processors, currently. Fewer will run INTEL in the future, if I have any choice. Phil - DON'T ASSUME you understand the world, just based on your son and / or your prejudices.
            lko2181
          • Can't cure stupid.

            All your comment is lies and BS. Enjoy being forever alone.
            Parafrost
          • It's a shame...

            Now, your son will be at a distinct disadvantage when he enters the real world and finds that he doesn't know how to use the computers used at every company he applies to. Oh wait, you're probably one of those hypocrites who says how superior Macs are, then runs Windows on his Mac, making it identical to a PC. Maybe your son will be ok after all. It depends on how badly indoctrinated to "The Cult" he becomes, I suppose. He might end up being another vocal recruiter to his cult and have everyone making fun of him behind his back, like I see so often in companies.

            By the way, I am one of those modders and geeks, in that I build most of my systems and upgrade components over time to gain longer usable life. Contrary to what "The Cult" has taught you, nearly every relative, friend, and acquaintance I have asks ME for advice before they buy computers. I believe that is the definition of direct influence. How many people have asked for your advice? (Sorry. The folks you force your unsolicited cult recruiting on, don't count.)

            Lest you think I too am narrow-minded and ignorant, I should point out that I actually own and use a Mac, in addition to my Windows machines, and a half dozen iOS devices. I use it for music composition and audio work. I just didn't drink the Kool-Aid like you obviously did. It's a tool. It's not magical. God didn't hand it down from on high. For most tasks, I prefer Windows 7. In my opinion, the user interface on OS X sucks when you have numerous applications open across several large displays. Windows 7 handles that type of usage much better.

            Anyway, good luck to your son. I hope he's able to overcome the disadvantages you've created for him by wholeheartedly joining a cult.
            BillDem
          • What disadvantage?

            Seeing that the majority of the Windows UI, from the standard desktop metaphor to the task bar, to Windows 7 search were directly copied off MacOS (and don't even bother bringing up the Alto, I was an Alto user, so you'll not get very far with that particular load) I fail to see how anyone is at any disadvantage not owning Windows machines. Certainly the average Windows user has little knowledge of most of the standard Windows-specific shortcuts, either, so not having familiarity with them as a Mac user will hardly be a serious disadvantage.
            Nor does running Windows apps on the Mac make Mac users hypocrites. Please show your logical analysis that running any Windows app (which is completely possible without running Windows at all) or even Windows itself, has ANY bearing on whether OSX is a superior user experience for that user, or has any bearing on their decision that Mac hardware better suits their needs.
            As to "making fun of [people] behind [their] back", what on earth kind of companies do you hang around (and why)? Apparently they are staffed by kindergarteners. I guess that makes the tenor of your comments understandable.
            As for this continued braying about how knowledgeable people are on account of how they BYO PC, this move to BGA could not come at a better time. The idea that being able to BYO PC makes one some sort of tech guru is absurd. A trained chimp with a Phillips-head screwdriver could assemble a PC. It is NOT an accomplishment. Come back to me when you can reflow the solder on a BGA chip mount, and I might begin to take you seriously.
            "In my opinion, the user interface on OS X sucks when you have numerous applications open across several large displays. Windows 7 handles that type of usage much better."
            You have GOT to be kidding me! Talk about swallowing the Kool Aid! MacOS has had superior multi-monitor support since before the idea even occurred to Redmond. Quite apart from its handling of monitor profiles and interconnect issues, claiming the UI in Win7 is superior just indicates a fundamental ignorance of basic UI concepts. You want to go on about how the single system-wide menu bar becomes an issue on multi-monitor setups? You want to claim that Windows' insistence on sticking a menu bar in every window is superior? Leaving aside the existence of (free) utilities that allow menu bars across all monitors, or even add menus to each window, the idea that Windows' approach, which violates Fitts's law every chance it gets, and as such is inherently slower, is "much better", is hardly a non-debatable point. And that debate does not have to rely on opinion, but can look at actual usage times of the interface in question. Again, you would do well to read up on Fitts's law, and its implications for menued UIs in general.
            .DeusExMachina.
          • Windows UI > OSX UI

            @.DeusExMachina.
            You are an incredibly rabid fanboy.
            It is abundantly clear that OSX is inferior to Windows in multiple-monitor setups. Stop making excuses about 3rd party software making it as good as the Windows UI. That's just a lame admission of OSX deficiencies.
            How the hell do Windows menus violate Fitts's law? The menu is attached to the window, thus it is ALWAYS closer to the relevant window than it is in OSX.
            In any case, why is Fitts's law the one metric by which the UI is measured? I'm not saying it isn't relevant; all I'm saying is you are arguing about using just one "LAW" to measure a UI.
            The Windows taskbar absolutely destroys the OSX dockbar. Without Windows, Apple would never have implemented, sorry, INNOVATED, the dockbar. The Mac dockbar has wasted dead space, occupies too much screen and lacks the intricate functionality of the Windows taskbar. I want to call it a poor man's taskbar, but Macs are not cheap.
            Anyway, why are we arguing about UI when it's a BGA thread?
            BTW, I reball BGAs (PS3, nVidia GPUs, including lots of expensive MacBooks)
            warboat
          • Who cares?

            I've used PCs; I've used Macs. Macs, IMNSHO, are harder to use. Every time I'm asked to help someone with their Mac, it's nothing but confusion. That doesn't make the Mac bad, but just because you find it easier to use doesn't mean everyone else does. Yes, there are flaws in Windows. There are flaws in every OS. You choose your poison.
            notsofast
          • Good Reply

            What if the CPU were soldered to a smaller board that was in turn mounted on some new form of motherboard?
            calfee20
          • CPU on a daughter-board

            Good idea, but it would be pointless. It would be cheaper to just not adopt BGA.

            FYI your idea has already been done. Look up "computer on a chip".
            MetaTrader Programming
          • Apple

            Good argument, BillDem.

            A smarter move would have been to get his son an Apple as well as a Windows computer. That way all bases are covered.

            As you said, computers are tools. It's harmful to his son's development to withhold the tools (Windows) he will require later in his life.
            MetaTrader Programming
          • Gotta say

            Your handle is perfect.
            fairportfan
          • umm

            You can game and watch porn on a Mac too, Einstein.
            All you've really accomplished is spending more money.
            pwn0tr0n
          • For better hardware with longer usage life

            .DeusExMachina.