mdbechtel wrote:
I have consistently had excellent experiences with all 5 of my laptops, a Mac Pro, and my iPhones over the last 10 years, to the extent that I never saw the need to pay for AppleCare. I have also always upgraded hardware and repaired my laptops myself (e.g., refinishing heat sinks or replacing fans). My strategy has been to sell my older, still high-functioning machines while their value was high in order to fund the next machine. I need that for the software I run.
Sadly, this is increasingly difficult because Apple has made design choices that compromise repairability. Furthermore, the company appears actively opposed to giving its customers affordable means to repair and extend the life of their products, and I'm not referring to the iPhone controversy.
http://appleinsider.com/articles/17/05/18/fair-repair-act-proposal-in-new-york-under-fire-by-apple-lobbyists
I know it's frustrating. However, as an engineer in the electronics industry and as someone who has looked "under the hood" of electronics many times, what we're seeing now generally makes sense. A lot of the functions that used to come from separate chips are merged into a single package these days. There are a ton of design decisions made to increase general reliability over the expected useful life, to lower costs, and to make a more compact device, but where ultimate longevity may be sacrificed. They're taking the lessons they've learned making iPhones, iPads, and other small devices that generally can't be upgraded, and using them to make a ridiculously thin notebook computer by squeezing out every last bit of space that isn't needed.
These forums see lots of people coming in to diagnose issues where a connector of some kind failed, so going to permanent connections generally improves reliability. I remember with my 2007 white MacBook, sometimes the hard drive wouldn't operate, and I could trace it to the connector being separated. It was really easy to pull out the hard drive (through the battery compartment), but that connector was also a possible failure point. The manufacturer doesn't necessarily have an easy time with all these soldered components either. When they do a final system test (including a boundary scan test for connections), if one small component on a board fails, it typically makes sense to scrap the entire board, because it's too much work to remove a package and place a new one, not to mention that the rework might not be reliable. I've heard of "BGA rework", but it's typically reserved for very expensive prototype boards where getting a new board is difficult or there isn't enough time.
If you look at an old computer such as an Apple II, there are expansion cards, and the main board is populated with dual-inline chips (mostly TTL) in sockets that anyone with a chip puller could extract. Back then you might be able to diagnose a single chip failure and replace it with an equivalent part. However, that was also a source of failures, since there had to be a solder connection to the socket, and the socket itself was a possible failure point. My first computer was an Apple //c, and by then they had miniaturized a lot of the parts by combining them. Most of the parts were soldered directly to the board, which actually made it more reliable at the cost of ease of repair. Still, it could be repaired by someone handy with a soldering iron and a solder remover. It also didn't come with expansion slots (it used serial ports) or any factory-authorized means of expanding the RAM (but it had 128 KB), although some aftermarket companies sold RAM upgrades through a strange rig that sat on top of the board.
At this point the current MBP is an "appliance" that the manufacturer has designed so that it can't be tinkered with and, to some degree, can't be repaired. If you look at the board of a new Retina MacBook, most of the major components are in BGA packages that have to be heated in a reflow oven to get the solder balls to flow.
Retina MacBook 2017 Teardown - iFixit
I suppose the most frustrating thing for many is that the SSD can't be replaced, given that any SSD is bound to wear out. Well, physically it can, but getting it to work is another matter. As far as this 2017 MacBook example goes, it has two 128 GB NAND flash packages and a 128 MB flash package. I'm not sure exactly what the latter is used for, but my guess is that it holds firmware, possibly including the SSD controller firmware. Getting that firmware right is the tricky part, as the SSD controller keeps track of all sorts of things (such as wear leveling, self-diagnostics, etc.), and just substituting new flash packages isn't going to reset it to like-new status.
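To give a feel for the kind of bookkeeping I mean, here's a toy sketch (in Python) of how a flash translation layer might do simple wear leveling by remembering per-block erase counts and a logical-to-physical mapping. This is purely illustrative and nothing like real controller firmware, but it shows why the controller's accumulated state has to agree with the flash actually installed, and why swapping in fresh NAND packages doesn't hand the firmware a clean slate.

```python
# Toy wear-leveling sketch -- purely illustrative, not real SSD firmware.
# The controller accumulates per-block erase counts and a logical->physical map;
# that persistent state has to match the NAND packages actually installed.

class ToyFlashTranslationLayer:
    def __init__(self, num_blocks):
        self.erase_counts = [0] * num_blocks        # wear history kept by the controller
        self.mapping = {}                           # logical block -> physical block
        self.free_blocks = set(range(num_blocks))   # physical blocks not currently mapped

    def write(self, logical_block, data):
        """Write a logical block to the least-worn free physical block."""
        # Simple wear leveling: pick the free block with the lowest erase count.
        target = min(self.free_blocks, key=lambda b: self.erase_counts[b])
        self.free_blocks.remove(target)

        # If this logical block was mapped before, erase and recycle the old block.
        old = self.mapping.get(logical_block)
        if old is not None:
            self.erase_counts[old] += 1   # erase cycles are what wear flash out
            self.free_blocks.add(old)

        self.mapping[logical_block] = target
        # (A real controller would also program 'data' into the NAND here.)


# The mapping and erase counts are state the controller builds up over time.
ftl = ToyFlashTranslationLayer(num_blocks=8)
for i in range(20):
    ftl.write(logical_block=i % 4, data=b"...")
print(ftl.erase_counts)   # wear spread across blocks, tracked by the controller
```

Again, a real drive does far more (bad-block management, ECC, garbage collection), but the basic point stands: the firmware's view of the flash is stateful.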
I've said in other posts that perhaps in the future we'll see something like 3D XPoint from Intel/Micron. It kind of fills a niche between flash and RAM; they're suggesting that future systems might be able to use it in place of either, depending on the situation. It certainly could be used for what's currently hard drive/SSD storage. The big thing, though, is that its endurance is supposed to be orders of magnitude higher than even SLC flash, which it would have to be if they're calling it a RAM replacement for some purposes.
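Just to put "orders of magnitude" in rough perspective, here's a back-of-the-envelope calculation. The cycle counts below are my own ballpark assumptions for illustration, not spec-sheet numbers, and it ignores write amplification and over-provisioning entirely.

```python
# Back-of-the-envelope endurance comparison -- the cycle counts are rough
# assumptions for illustration only, not manufacturer specifications.

capacity_gb = 256   # e.g. the two 128 GB packages in the teardown

endurance_cycles = {
    "TLC flash (assumed ~1k P/E cycles)":    1_000,
    "SLC flash (assumed ~100k P/E cycles)":  100_000,
    "3D XPoint (assumed ~10M cycles)":       10_000_000,
}

for name, cycles in endurance_cycles.items():
    # Total data writable over the device's life, in TB (ignoring write amplification).
    total_writes_tb = capacity_gb * cycles / 1_000
    print(f"{name}: roughly {total_writes_tb:,.0f} TB of writes before wear-out")
```

Even with fuzzy numbers, a couple of orders of magnitude in cycle endurance turns "will eventually wear out" into "will outlast the rest of the machine."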