
MacBook 16-inch Fan Noise

We are testing two new 16-inch MacBooks before doing a rollout across our organization. Under low loads (about 25% CPU utilization), fan noise gets annoyingly loud. We're not doing anything GPU-related, just routine work such as: using web applications, debugging web pages, Microsoft Teams conferencing (audio/video) with a handful of people, Photos downloading from iCloud, and Mac Mail downloading a new mailbox from Exchange.


We DID NOT notice this on our 2015 MacBooks, and this might prevent us from continuing the 16-inch MacBook rollout in our organization.


Interested to hear others' experiences.


Tim

MacBook Pro 16", macOS 10.15

Posted on Nov 21, 2019 11:34 AM

4,224 replies

Aug 7, 2020 10:47 AM in response to ryunokokoro

That is exactly the wrong question. This is exactly why you aren't helping. There are people out there who could get the silent fans they desire by simply adjusting their refresh rate from 59Hz to 60Hz (or vice versa, depending). What do they give up? In this case, they actually gain in refresh rate and achieve their "silent fans" goal.

Please refocus your efforts on the correct questions.


This is an excellent point. The fact that you can drive an external display without overheating/fans blasting by changing the refresh rate slightly means his point of "wow the GPU is too powerful" doesn't make any sense.

Aug 7, 2020 10:51 AM in response to wealthandnecessity

wealthandnecessity wrote:

This is an excellent point. The fact that you can drive an external display without overheating/fans blasting by changing the refresh rate slightly means his point of "wow the GPU is too powerful" doesn't make any sense.


Not at all; by tweaking the refresh rate you're lowering the load on the GPU, perhaps at the cost of flicker and corruption as stated by AMD.


It's a cost/benefit calculation that the GPU's manufacturer made one way and some using software tools are making another.


It's also one you wouldn't have to make if you were using a GPU that was less powerful and generated less heat, or that used a different technology (the 5600M and HBM2) with the same result (less heat.)

Aug 7, 2020 11:03 AM in response to ryunokokoro

ryunokokoro wrote:

You're oversimplifying the point here by reducing the complexity down to "high clock speed". Telling people "well, your configuration has high clock speed because you're on a super powerful GPU" isn't helpful. Telling people "the GPU in your system has a quirk where your configuration (connection type + display settings) may force your GPU to run much faster than expected" is helpful.


Though it's not actually a "quirk"; rather, according to AMD, it's by design.


People are coming to this thread for help. Telling them "you're the one who opted for a super powerful GPU" won't help. Explaining the source of the issue and why they may be seeing unexpected results will give them a fighting chance to try some changes to their setup to alleviate the issue.


Which I and others here have done by showing our experimental data.


I think you will need to restate the reasons you've given, as your "defense" tends to fall back on "if you want a laptop that doesn't start up its fans with an external monitor, buy one without a discrete GPU".

If you're trying to help people, you shouldn't be on the defensive to begin with.


You characterize my answers as "defensive" but that's not my approach at all.


Rather, many here use the term "defective," which has a specific meaning that is not applicable here.


Instead, the machine is performing in a way other than they would prefer for the task and configuration at hand, and that is caused in part by the tools used to do the job. I have stated where I believed another selection would do the job at hand better.


That is exactly the wrong question. This is exactly why you aren't helping. There are people out there who could get the silent fans they desire by simply adjusting their refresh rate from 59Hz to 60Hz (or vice versa, depending). What do they give up? In this case, they actually gain in refresh rate and achieve their "silent fans" goal.


But at the potential cost of flicker and graphical corruption, at least according to the manufacturer of the GPU. That is also a "cost" that must be taken into account.


HBM uses lower clock speeds when compared with GDDR memory. It makes up for that speed difference by having a much wider memory bus width (2048bit vs 128bit [or lower]). Assuming that the same "certain configurations require faster memory access to handle" applies to the HBM-based architectures, then even if they run at a faster speed they require less power to do so and therefore generate less heat. That said, the issue could also have been handled in some other way. We just don't know (and likely never will).


Indeed, and advantages like this are why HBM2 was developed; the downside is, of course, higher cost.
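For a rough sense of the tradeoff, here is a back-of-envelope bandwidth comparison. The bus widths and per-pin data rates below are illustrative figures of the kind discussed above, not confirmed specs for the 5500M or 5600M:

```python
# Peak memory bandwidth = (bus width / 8 bits per byte) * per-pin data rate.
# Figures are illustrative, not confirmed specs for the 5500M/5600M.
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_width_bits / 8 * gbps_per_pin

# GDDR6 on a narrow 128-bit bus needs a high per-pin rate (and clock)...
gddr6 = bandwidth_gb_s(128, 14.0)    # 224 GB/s
# ...while HBM2's very wide 2048-bit bus matches it at a much lower clock.
hbm2 = bandwidth_gb_s(2048, 1.54)    # ~394 GB/s

print(f"GDDR6, 128-bit @ 14 Gbps/pin:   {gddr6:.0f} GB/s")
print(f"HBM2, 2048-bit @ 1.54 Gbps/pin: {hbm2:.0f} GB/s")
```

Lower clocks at equal or higher bandwidth generally mean less switching power, which is consistent with the "less heat" point made above.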

Aug 7, 2020 7:31 PM in response to DPJ

Thank you for acknowledging you don't have an issue and have no solution for the rest of us and leaving the conversation :)


I thought I posted this last night but it seems to have disappeared so will repeat myself -


Today I found out that my 2017 i5 8gb 13" MacBook Pro performs better than my 16 when connected to external monitors.


I finally cracked it due to the loud fan noise and switched to my old 13 and while it was far from perfect it was consistent, fans were quiet and everything rendered smoothly.


My 16, when powering external displays, often slows down and becomes unresponsive for periods at a time while doing dev work (Docker, a few Node processes, VS Code). This only happens when it's connected to displays; I didn't quite realise how dramatic it was, often writing it off as "well, it's dev work, glitchy tools and what have you". But the old 13 does not have this problem at all.


It is as we feared: the thermal issues the 16 has (i9, 5500M, 64 GB RAM as mine is spec'd out) cause severe performance degradation, making it perform worse than a much older, lower-tier device in the real world.


The 16 has seemingly been engineered to trip over itself.

Aug 8, 2020 4:39 PM in response to Dogcow-Moof

William Kucharski wrote:


wealthandnecessity wrote:

This is an excellent point. The fact that you can drive an external display without overheating/fans blasting by changing the refresh rate slightly means his point of "wow the GPU is too powerful" doesn't make any sense.

Not at all; by tweaking the refresh rate you're lowering the load on the GPU, perhaps at the cost of flicker and corruption as stated by AMD.

It's a cost/benefit calculation that the GPU's manufacturer made one way and some using software tools are making another.

It's also one you wouldn't have to make if you were using a GPU that was less powerful and generated less heat, or that used a different technology (the 5600M and HBM2) with the same result (less heat.)



Yes at all. You have to be joking. Going from 60hz -> 59hz lowers the load on the GPU such that the issue is completely fixed? Fans blasting and MBP 16" overheating to completely fixed. Wow.


You have to be kidding me.

Aug 9, 2020 12:40 PM in response to wealthandnecessity

Why?


The update frequency of the monitor has a direct impact on the speed tier at which the VRAM must be updated and the amount of load that puts on the GPU.


Imagine you're running on a treadmill, and there is a speed threshold that, if you exceed it, will cause the treadmill to speed up to twice as fast as that speed.


That's what we're talking about here; it may be as simple as the threshold between clock rates being somewhere around 59.97 Hz; slower than that, the clock can drop down a speed (and thus power-usage) tier.


This is a great example of the type of thing that goes on:


PC Perspective: Testing GPU Power Draw at Increased Refresh Rates Using the ASUS PG279Q


To quote the testing they were doing with a PC graphics card:


At 60Hz refresh rate, the monitor was drawing just 22.1 watts while the entire testing system was idling at 73.7 watts. (Note: the display was set to its post-calibration brightness of just 31.) Moving up to 100Hz and 120Hz saw very minor increases in power consumption from both the system and monitor.

But the jump to 144Hz is much more dramatic – idle system power jumps from 76 watts to almost 134 watts – an increase of 57 watts! Monitor power only increased by 1 watt at that transition though. At 165Hz we see another small increase, bringing the system power up to 137.8 watts.


Why?


When running the monitor at 60Hz, 100Hz and even 120Hz, the GPU clock speed sits comfortably at 135MHz. When we increase from 120Hz to 144Hz though, the GPU clock spikes to 885MHz and stays there, even at the Windows desktop. According to GPU-Z the GPU is running at approximately 30% of the maximum TDP.

Though details are sparse, it seems pretty obvious what is going on here. The pixel clock and the GPU clock are connected through the same domain and are not asynchronous. The GPU needs to maintain a certain pixel clock in order to support the required bandwidth of a particular refresh rate, and based on our testing, the idle clock speed of 135MHz doesn’t give the pixel clock enough throughput to power anything more than a 120Hz refresh rate.


The article goes on to indicate that in their test, a comparable AMD GPU did not suffer the issue the NVIDIA card under test did; it all depends upon the design of the GPU and the VRAM involved.
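The "speed tier" idea can be sketched numerically. The 20% blanking overhead and the tier threshold below are assumptions chosen purely to illustrate how a 1 Hz change could cross a clock-tier boundary; they are not measured values for any real GPU:

```python
# Rough pixel-clock estimate for a display mode. The 20% blanking overhead
# and the tier threshold are illustrative assumptions, not measured values.
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.2):
    """Approximate pixel clock in MHz, padding visible pixels for blanking."""
    return width * height * refresh_hz * blanking_overhead / 1e6

# Hypothetical threshold above which the driver raises the VRAM clock tier:
TIER_THRESHOLD_MHZ = 263.0

for hz in (59, 60):
    pc = pixel_clock_mhz(2560, 1440, hz)
    tier = "high" if pc > TIER_THRESHOLD_MHZ else "low"
    print(f"{hz} Hz: ~{pc:.1f} MHz pixel clock -> {tier} memory-clock tier")
```

With these made-up numbers, a 1440p display at 59 Hz lands just under the threshold while 60 Hz lands just over it, which is the kind of edge the refresh-rate tweaks reported in this thread would be exploiting.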

Aug 9, 2020 10:14 PM in response to Dogcow-Moof

That is both impractical and is not how modern Intel CPUs are designed.

They are literally designed to operate at a very high speed until they generate so much heat that their speed needs to be throttled; this is the concept of "turbo boost."


William, that is nonsense. You're mixing throttling with turbo boost. Turbo boost, which you've quoted, is what Intel does when it has spare thermal capacity. Notably, it increases the operating frequency faster than rated, until excess heat is generated. Throttling, on the other hand, is when the operating frequency is reduced below rated to avoid thermal shutdown / permanent damage. I, and others on this forum, have seen examples of throttling.


Throttling is a mechanism that Intel, as the chip designer, must include to handle the case where the system designer is unable to adequately move heat away from the chip to avoid a literal meltdown. A well designed system should be able to avoid throttling.
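The distinction between the two mechanisms can be sketched as a toy model; all the temperatures and clock speeds here are invented for illustration, not real Intel behavior:

```python
# Toy model of Turbo Boost vs. thermal throttling. All constants invented.
BASE_GHZ, BOOST_GHZ, THROTTLE_GHZ = 2.4, 5.0, 1.2
T_LIMIT_C = 100.0        # junction temperature limit
BOOST_HEADROOM_C = 15.0  # spare thermal capacity needed to boost

def clock_for_temp(temp_c: float) -> float:
    """Choose a clock: boost with thermal headroom, throttle at the limit."""
    if temp_c >= T_LIMIT_C:
        return THROTTLE_GHZ   # throttling: drop BELOW rated speed
    if temp_c < T_LIMIT_C - BOOST_HEADROOM_C:
        return BOOST_GHZ      # spare capacity: run ABOVE rated speed
    return BASE_GHZ           # near the limit: hold rated speed

print(clock_for_temp(70.0))   # plenty of headroom -> boosting
print(clock_for_temp(92.0))   # warm -> rated speed
print(clock_for_temp(101.0))  # at the limit -> throttled
```

The point of the model: boost and throttle are different regimes on either side of the rated clock, which is why conflating them muddies the argument.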


Apple, as the system designer, has let us down here. It is not acceptable that a system of this cost and by a brand with Apple's reputation can't get this right.


So I stand by my comment: evidence of poor system design may be either thermal shutdown or thermal throttling.


The fact that we have no examples of thermal shutdown is evidence that Intel have done a good job with their throttling protection mechanism, not that Apple have done a good job with their system design.

Aug 9, 2020 10:22 PM in response to ntompson

The fact that we have no examples of thermal shutdown is evidence that Intel have done a good job with their throttling protection mechanism, not that Apple have done a good job with their system design.

This is an excellent point. Apple can't just blame Intel for making hot CPUs that Apple then pushes hard to the point the CPU has to throttle to prevent thermal shutdown.


Although that's neither here nor there; the issue is to do with the AMD GPU, which is so poorly designed that it has to run its memory at the maximum clock rate out of fear that displays may flicker. This isn't an issue with NVIDIA GPUs or older AMD MacBook GPUs, and it is the reason for the fan noise and overheating of the laptop.


The net result of all this uncontrolled throttling, partially fueled by the GPU eating into the cooling capacity of the laptop, is worse real world performance than an i5 from 2017.

Aug 9, 2020 10:25 PM in response to ntompson

ntompson wrote:

William, that is nonsense. You're mixing throttling with turbo boost. Turbo boost, which you've quoted, is what Intel does when it has spare thermal capacity. Notably, it increases the operating frequency faster than rated, until excess heat is generated. Throttling, on the other hand, is when the operating frequency is reduced below rated to avoid thermal shutdown / permanent damage. I, and others on this forum, have seen examples of throttling.


Throttling is also not unusual in everyday operation.


More to the point, however, many here have reported improvement in their configurations by "turning off Turbo Boost."


Throttling is a mechanism that Intel, as the chip designer, must include to handle the case where the system designer is unable to adequately move heat away from the chip to avoid a literal meltdown. A well designed system should be able to avoid throttling.


Once again, that's not how systems are typically designed. Throttling should be avoided where possible, but it's not at all abnormal and certainly doesn't denote bad thermal design.


Apple, as the system designer, has let us down here. It is not acceptable that a system of this cost and by a brand with Apple's reputation can't get this right.


How are they not getting it right? Their machine is handling the situation as best it can - minimal fan noise is not a design parameter; the fans spin up to cool the system, and they do so well.


The fact that we have no examples of thermal shutdown is evidence that Intel have done a good job with their throttling protection mechanism, not that Apple have done a good job with their system design.


Actually, it denotes both, as even with throttling poor thermal design could cause over-temperature shutdown. I've mentioned before it used to be trivial to trigger this on older MacBooks (not Pros) simply by attaching an external monitor and firing up Photoshop.

Aug 9, 2020 10:45 PM in response to Dogcow-Moof

Once again, comparisons with older GPUs are immaterial here, as they were not driving GDDR6 VRAM to its highest clock speed.

If I wanted loud fans and poor thermal design with excuses like 'well the RAM module it uses is hot' I would buy a cheap Windows laptop 👍


Fear or experience? I suspect AMD has tested far more monitor combinations than you have.

You clearly don't have experience with AMD; they are not above signing off on terrible GPU architectures (with horrible drivers). As was pointed out, this exact issue was plaguing AMD's desktop GPUs from some years back. They simply manufacture an inferior product.


Apple pulled a sneaky one by using these defective parts knowing people who review laptops won't benchmark them when connected to an external display (seriously who expects this sort of defective behaviour from an Apple product?).


It's like a car that has to idle at red line unless it's using fuel from a specific company.


In your application. There are a wide variety of situations where performance on the MBP 16 is better, simply because older MacBook Pros couldn't do what the 16 can at all, or at the speed it can do it.

The only time the 16 can outperform a 2017 i5 macbook is when it's not driving an external display.


I get that in every situation where the laptop connects to an external monitor, performance may not be comparable to a 2017 i5 MacBook Pro, this is true for everyone.

ftfy 👍



Aug 9, 2020 10:57 PM in response to ahmedfromreservoir

ahmedfromreservoir wrote:

If I wanted loud fans and poor thermal design with excuses like 'well the RAM module it uses is hot' I would buy a cheap Windows laptop 👍


Or, as I mentioned previously, an expensive one.


That $2000 HP's fans fire up in the blink of an eye without an external monitor, and when I've mentioned it to other Windows-using friends they see it as normal. One has a Dell whose fans ramp up at login and stay there, all day, every day, just using Excel.


You clearly don't have experience with AMD; they are not above signing off on terrible GPU architectures (with horrible drivers). As was pointed out, this exact issue was plaguing AMD's desktop GPUs from some years back. They simply manufacture an inferior product.


That's your opinion, but that's what's in MacBook Pros. If you dislike AMD so much, you should avoid MacBook Pros until they use an external GPU from a different vendor. No one yet knows what they will do in their Apple silicon MBPs.


Apple pulled a sneaky one by using these defective parts knowing people who review laptops won't benchmark them when connected to an external display (seriously who expects this sort of defective behaviour from an Apple product?).


Yes, yes, they're "defective" despite doing everything they were claimed to do and meeting every spec. 🙄


It's like a car that has to idle at red line unless it's using fuel from a specific company.


Many owner's manuals specify octane ratings that users ignore to their detriment. Regardless, the argument is a bit specious here.


The only time the 16 can outperform a 2017 i5 macbook is when it's not driving an external display.


Really? How about editing multiple 4K video streams in real time? That's something an i5 can't do as well as the MBP 16, external monitor or not.


I get that in every situation where the laptop connects to an external monitor, performance may not be comparable to a 2017 i5 MacBook Pro, this is true for everyone.


Not at all true; I have no issue whatsoever in my use cases, while performance is better than that of a 2017 i5, so to say "every" is completely incorrect.

Aug 9, 2020 11:23 PM in response to Dogcow-Moof

I should stop acting surprised by your comments William, but... I am... surprised.


minimal fan noise is not a design parameter


I remind you of the Steve Jobs quote:


Jobs hated fans. Hated them


It's in Apple's DNA.


Screaming fans are not something that we expect from Apple (and I am authorised to speak on behalf of my own expectations, as well, I suspect, as those of many others). It just amazes me that you continue to suggest that we should be satisfied with poor thermal performance and loud fans.


Aug 9, 2020 11:31 PM in response to ntompson

ntompson wrote:


I should stop acting surprised by your comments William, but... I am... surprised.

minimal fan noise is not a design parameter

I remind you of the Steve Jobs quote:

Jobs hated fans. Hated them

It's in Apple's DNA.


He also hated styli, and Apple introduced the Pencil a few years ago.


The Verge: Here's why Apple made the stylus that Steve Jobs hated


He also may have hated fans, yet every Mac made during his tenure had at least one.


Remember the liquid cooled PowerMac G5 tower was also made while he was in charge.


Steve was also a huge advocate of skeuomorphism… enough said.


Screaming fans are not something that we expect from Apple (and I am authorised to speak on behalf of my own expectations, as well, I suspect, as those of many others). It just amazes me that you continue to suggest that we should be satisfied with poor thermal performance and loud fans.


Actually the fans ensure excellent thermal performance; poor thermal performance would be if they didn't use the fans and the machine reached thermal shutdown.


Note that Apple may not be any happier than you are about the situation, which may be part of what drove them to make the decision to switch to Apple silicon, the same way the high heat output and relatively poor performance of PowerPC chips led them to switch to Intel.

Aug 9, 2020 11:37 PM in response to Dogcow-Moof

It just amazes me that you continue to suggest that we should be satisfied with poor thermal performance and loud fans.



Especially when there are alternative, more powerful laptops in a similar chassis size which don't overheat when driving external monitors.


No excuses, no ********, just decent thermal performance.




Actually the fans ensure excellent thermal performance; poor thermal performance would be if they didn't use the fans and the machine reached thermal shutdown.

I would say not thermally shutting down is the bare-minimum "customers won't have legal grounds to sue us" performance.


I don't know what you think, but my definition of "excellent thermal performance" would at the least be "better real-world performance than a Core i5 from 2017" 😂

Aug 10, 2020 1:03 AM in response to Dogcow-Moof

Recently I was finally able to visit an Apple Store in Australia and showed them my MBP. The engineer basically said, "It is still in the 12-month warranty period. I would suggest you try and return your computer, stating that the loud fan noise is a deal-breaker." He said that they were aware of the problem, and that the fact Apple so quickly moved away from the setup which resulted in the problem (less than 6 months between the "fan problem" and the new system) suggested they know there's a problem and can't fix it - hence push hard for a return. His words. I have to go through the national Apple phone support to do that, as I didn't buy my MBP at that particular store. I bought my computer in Japan and it has a Japan configuration that I need. Even with that, I'm going to try and return it, though it is past the whatever day limit. I suggest everyone else do the same.


[Edited by Moderator]
