PC makers do not make their own motherboards, with the exception of MSI, Asus, Samsung, Toshiba, and maybe a couple of others. Point is, everyone else, such as Dell and HP, DOES order the CPUs from Intel and stick them in a motherboard made by someone else. This way, they can pretend they have several different models available when it's the exact same motherboard with a slightly different CPU in it. These PC "makers" will no longer be ordering truckloads of CPUs.
"They already do this for the mobile market, laptops and all-in-one PCs like the iMac."

Yes. All of which are not meant to be upgraded. Desktops are. No one is going to post pics of their all-in-one bragging about what an awesome gaming machine they've got. As F$ said, it's the enthusiasts and gamers that drove everything to where it is now. Closer to home, every one of us here enjoys the hell out of upgrading. We talk about it every single day. Soldering CPUs to the motherboard takes this away from us, because now instead of scratching up $300 for a nice high-end mainstream CPU, we also have to spring for the nice high-end motherboard it's stuck to, greatly reducing the likelihood anyone would bother. The constant buying/selling/trading of core components has driven this market for many years. It won't be quite so constant anymore.
"I also don't see how this (necessarily) kills off 3rd-party mobo manufacturers, like MSI/ASUS/etc. They can build their boards with the CPUs soldered on, just like they do all the other chips. Sure, it means more SKUs, or less selection."

Fear of unsellable inventory. When you make your own motherboard, the amount of money you have tied up in it isn't very staggering. When you're also buying shelves full of Intel chips to keep on hand to solder to the boards (which they have never had to do before), your investment is now at least tripled.
Also, Intel itself. They make motherboards too. Why sell CPUs to anyone at all when they can solder them all on themselves? Then every other motherboard maker fights to stay alive, since they can only make motherboards for AMD's jokes of processors. With any luck, maybe AMD will support PCIe 3.0 by 2014, when this all goes into effect :/
"It looks like Apple may be going down the ARM path for its desktops, with 64-bit coming soon. So I wonder what other PC manufacturers or mobo manufacturers may be planning in that space. Crunchers with similar performance but lower power consumption would be welcome!"

No thanks. Switching to ARM means a loss of performance. A large loss. What Apple is saying is that they're hoping ARM will be "good enough" for desktops by 2017, similar to the way a Pentium I is still "good enough" to surf the internet with. In short, it'll never happen. The whole point of ARM is to allow for slimmer/thinner computers, which defeats the purpose of putting one in a desktop, the same way you won't find an Intel Atom on an ATX motherboard. And forget 2017; I bet Apple is no longer making desktops by 2015.

Anyway, as much better as ARM chips will be by 2017, how much better will x86 CPUs be as well, also using less and less power? In order for an ARM to do what an x86 CPU does, it would have to evolve to the point where it was no longer an ARM, other than in name. As they progress and as x86 CPUs progress, ARM chips will fade from the market and be absorbed by CPUs.
AMD is moving to make these 64-bit ARMs, as is Samsung. Samsung has actually already made big.LITTLE ARMs at 32-bit, containing two quad-core clusters, each running at different speeds. AMD just now coming into the ARM market is a mistake if they're hoping to do anything in the desktop market. ARMs will be great for servers, but only because you can pile thousands of them together cheaply with little heat to accomplish something. One chip by itself in a desktop, though... ugh.
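For anyone wondering what big.LITTLE actually means in practice, here's a quick toy sketch in Python: two quad-core clusters at different speeds, with work steered to whichever cluster fits the load. The frequencies and the 50% load threshold are made-up numbers for illustration only, not Samsung's actual specs or scheduling policy.

```python
# Toy model of a big.LITTLE design: two quad-core clusters running at
# different speeds, with tasks routed to the cluster that fits the load.
# All numbers here are hypothetical, purely for illustration.

BIG = {"cores": 4, "ghz": 1.6}      # fast, power-hungry cluster
LITTLE = {"cores": 4, "ghz": 1.2}   # slower, power-efficient cluster

def pick_cluster(load: float) -> str:
    """Route heavy loads (over 50%) to the big cluster, light ones to LITTLE."""
    return "big" if load > 0.5 else "LITTLE"

if __name__ == "__main__":
    for load in (0.1, 0.4, 0.8):
        print(f"load {load:.0%} -> {pick_cluster(load)} cluster")
```

The point of the split is that light work (idling, browsing) runs on the slow, efficient cores and only heavy work wakes the fast, hungry ones, which is exactly the kind of trick that matters for phones and servers but buys you little in a desktop.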