
Eye strain from LED backlighting in MacBook Pro

There is one relatively serious con of the new LED-backlit displays in the new MacBook Pros that doesn't seem to get much mention in the media. About a month ago I bought a new MacBook Pro to replace my standard white MacBook. One feature of the MacBook Pro that I was unaware of was the introduction of the LED-backlit display to replace the CCFL backlight.

Once I started using my new laptop for long periods of time, I noticed severe eye strain and minor symptoms similar to motion sickness. After 20 or 30 minutes of use, I felt like I had been looking at the screen all day. Much longer and I would get headaches. If I used the old white MacBook (with its CCFL display), I had no eye trouble at all. Moreover, I could detect a distinct flicker on the MacBook Pro display when I moved my eyes across it - especially over high-contrast areas of the screen. White text on a black background was virtually impossible for me to read without feeling sick to my stomach because of all the flickering I saw when moving my eyes over the text.

The strangest thing about all of this was that nobody else I showed the screen to could see these flickers I was seeing. I began to question my sanity until I did a little research. Discovering that the MacBook Pro introduced a new LED-backlit display started to shed some light (so to speak) on what might be going on. I had long known that I could see LED flicker in things like car taillights and Christmas lights that most of my friends could not see. I also knew that I could easily see the "rainbow effect" in DLP televisions that many other people don't see.

My research into LED technology turned up the fact that it is a bit of a technological challenge to dim an LED. Varying the voltage generally doesn't work, as LEDs are essentially designed to be either on or off at a fixed brightness. To work around this limitation, designers use a technique called pulse width modulation to mimic the appearance of lower-intensity light coming out of the LED. I don't claim to fully understand the concept, but it essentially involves rapidly switching the LED off and on many times per second. The dimmer the LED needs to appear, the more of each cycle it spends in the off state.
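To make that concrete, here's a rough sketch of the arithmetic as I understand it (purely an illustration on my part, not anything taken from Apple's hardware; the 200 Hz figure is just an assumed number for the example):

```python
# Rough illustration of PWM dimming (assumed numbers, not actual backlight specs).
PWM_FREQUENCY_HZ = 200                    # assumed flicker rate; real panels vary widely
period_ms = 1000 / PWM_FREQUENCY_HZ       # length of one on/off cycle in milliseconds

def pwm_times(brightness_percent):
    """Return (on_ms, off_ms) per cycle for a requested brightness level."""
    duty_cycle = brightness_percent / 100.0
    on_ms = period_ms * duty_cycle
    off_ms = period_ms - on_ms
    return on_ms, off_ms

for level in (100, 75, 50, 25):
    on_ms, off_ms = pwm_times(level)
    print(f"{level:3d}% brightness: on {on_ms:.2f} ms, off {off_ms:.2f} ms per {period_ms:.1f} ms cycle")
```

Note that at 100% brightness the off time drops to zero, which is exactly why the full-brightness workaround I describe below gets rid of the flicker for me.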

Because this all happens so very quickly, the human brain does not interpret the flickers as flickers, but rather as simply dimmer light. For most people, that is. Some people (myself included) are much more sensitive to these flickers. From what I can tell, the concept is called the "flicker fusion threshold" and is the frequency at which something that is actually flickering is interpreted by the human brain as being continuously lit. While the vast majority of people have a threshold that doesn't allow them to see the flicker in dimmed LEDs, some people have a higher threshold that causes them to see the flickering in things like LED car tail lights and, unfortunately, LED-backlit displays - leading to this terrible eye strain.

The solution? I now keep my screen turned up to full brightness to eliminate the need for the flicker-inducing pulse width modulation. The screen is very bright, but there are no more flickers, and I love my MacBook Pro too much to exchange it for a plain MacBook with CCFL backlighting (which will also supposedly be switching to LED backlighting in 2009 anyway). The staff at my local Apple store was of course more than helpful and was willing to let me exchange my glossy screen for matte even though I was beyond the 14-day return period. I knew that wasn't the problem, though, as my old MacBook had a glossy display. I've decided to stick with my full-brightness solution. Sitting in a brightly lit room tends to help alleviate how blinding the full brightness of the screen can be. In a dimly lit room I guess I just wear sunglasses. Either way, the extreme brightness is worlds better than the sickening flicker I saw at a lower brightness setting.

I would caution anybody considering buying a product with an LED-backlit display to pay careful attention to make sure you don't have this same sensitivity. Turn the screen brightness down, find a high-contrast area of the screen, and quickly move your eyes back and forth over the screen. If you can detect the flicker, you may end up with this same problem.

I have no idea what percentage of the population has this sensitivity. I imagine we will hear more about it as more and more displays start using this technology. Hopefully the Apple engineers will come up with a way to eliminate this flicker some of us can see.

Russ Martin

15-inch MacBook Pro, Mac OS X (10.5.4)

Posted on Aug 23, 2008 8:25 AM

2,489 replies

Aug 20, 2017 2:50 AM in response to waylord

Last month I started using my old Samsung monitor with my MacBook, connected by a simple HDMI-to-HDMI cable.

User uploaded file


MacBook Pro Retina 13.3-inch, early 2015

Intel Iris Graphics 6100


Samsung S22E360H

21.5-inch, LED backlight

resolution 1920 x 1080

PLS panel (just Samsung's own version of IPS, really)

hardware refresh rate 60 Hz max


It's been more than three weeks, and so far this monitor has not given me any eye strain.

Plain luck, maybe.


If the graphics driver contributed to the flicker, maybe this monitor filters it out?

I don't know.


Also note I installed the latest 10.12.6 updates before setting up my monitor.

So I don't know if that counts; maybe Apple silently improved some graphics feature that makes the flicker more subtle to our senses?

I have absolutely no idea at this point.


Another thing: this monitor has an adjustable response time.

I changed it to 4 ms, the lowest it can go, because it feels more natural than the factory default (I think 8 ms or 12 ms).

Not saying response time is a factor; I'm just covering every variable I can think of.

And besides, there's everything to gain and nothing to lose with a lightning-fast response time.


I guess all the newest monitors by Dell, Samsung, etc. should be good enough, spec-wise.

Try one out.

Good luck!

Sep 15, 2017 9:14 AM in response to waylord

UPDATE::


Despite the fact that my Samsung monitor has been okay, I got a chance to do an "HD vs. Iris" graphics eye-strain test like the one risvan77 described on page 161.

A colleague of mine recently bought a new 2017 MacBook Air (with an Intel HD 6000 GPU), and we put the two Macs side by side:

The left is her Air, the right is my Iris GPU MacBook Pro early 2015.

User uploaded file

User uploaded file

User uploaded file


I must admit that for no particular reason I almost fell in love with the Air's screen immediately out of the box, although the color appears "washed" and the viewing angle is bad (the Air seems to have a TN panel).

We then simply had the same webpages loaded on the two Macs to see if the "HD vs. Iris" myth is real!


IT IS REAL!

THE DIFFERENCE IS ALMOST INSTANT !!!

Both of us agree!!!!

The HD graphics Air feels right,

the Iris graphics Pro JUST feels wrong, ESPECIALLY when sitting next to its sibling!!!!!


Case closed!

The 10.12.6 update didn't fix the eye strain!!!


I will wait for High Sierra; if the strain caused by the ****** GPU drivers (now I can't think of anything else) does not disappear, I will immediately sell it for an Air (I love it already), because I can't bring a 21.5-inch monitor "on the go"!

Feb 10, 2018 11:55 AM in response to Keynode

Continued...

(Pics and quotes) From this 2017 review by Mehdi Azzabi

https://www.rtings.com/monitor/tests/motion/image-flicker


User uploaded file

Surprise!

1. BFI frequency can be independent of the PWM frequency, and it tends to be lower.

2. Even on flicker-free, PWM-free monitors, BFI can still be present.


Image flicker is a behavior commonly found on monitors where images shown on-screen will appear as a series of short duration impulses instead of staying on-screen constantly until they have to be replaced by the next image. Flicker has a large impact on the appearance of motion. Its appearance can either be an intentional method to improve motion clarity (usually referred to as black frame insertion or backlight strobing) or simply a side effect of the screen's brightness adjustment system (PWM Flicker).

Side effect....!

So aside from dimming, PWM can concurrently work as a means to cope with motion blur, taking full advantage of this "side effect"?

Is that why some devices intentionally use a low frequency band for PWM?

You figure it out!!!



Following are some of the dedicated methods for dealing with motion blur::

(Quotes) from this Wikipedia page

https://en.wikipedia.org/wiki/Display_motion_blur



Method :: Strobed backlights

Different manufacturers use many names for their strobed backlight technologies for reducing motion blur on sample-and-hold LCD displays. Generic names include black frame insertion and scanning backlight.

Reading through this, I keep asking myself why somebody thinks sample-and-hold must go.

In some cases it's bad, I can see that (games? movies?).


But there are too many scenarios out there in which sample-and-hold makes more sense::

Reading, drawing, texting, coding, writing, surfing, modeling, composing, idling, mind drifting, laughing at your screen...

Not much "motion" involved in any of those scenarios!


So toss that beautiful baby out with the bathwater??

Genius!!


Philips created Aptura, also known as ClearLCD, to strobe the backlight in order to reduce the sample time and thus the retinal blurring due to sample-and-hold.

The word "strobe" immediately sets off alarms!

What's the frequency of this strobing, then?

You figure it out!!


Samsung uses strobed backlighting as part of their "Clear Motion Rate" technology.[9] This was also called "LED Motion Plus" in some previous Samsung displays

Ditto!


BenQ developed SPD (Simulated Pulse Drive), also more commonly known as "black frame insertion", and claim that their images are as stable and clear as CRTs.[11][12] This is conceptually similar to a strobing backlight.

Stable? What does that mean?

CRTs have phosphor screens that soften the flicker, because the phosphor decays gradually between refreshes.

Is that decay being simulated too?


Sharp Corporation use a "scanning backlight" which rapidly flashes the backlight in a sequence from the top to the bottom of the screen, during every frame.

One scanning cycle every frame? So... 60 Hz?

Can I say bad?


nVidia has licensed a strobe backlight[15] technology called LightBoost to display manufacturers. This is normally used to reduce crosstalk during 3D Vision, which utilizes shutter glasses; however, it also eliminates motion blur due to its ability to keep pixel transitions in the dark between LCD refreshes. A 'hack' method or utility tool is needed to take advantage of LightBoost backlights for blur reduction benefits.

No idea on that one, more data needed about LightBoost.


BenQ later developed their own native "BenQ Blur Reduction" technology, integrated into several of their gaming monitors. This offers a strobe backlight which can be easily turned on and off by the user. There is no control over the strobe timing or strobe length for the user, although third party utilities have been produced for this purpose. Newer firmware for the BenQ Blur Reduction monitors allow direct user control over the strobe pulse (timing) and strobe length (persistence) directly from the Service Menu.

Stand up and applaud, folks! Common sense finally arrived at the door of BenQ, GIVING BACK TO US USERS THE MAGIC SWITCH TO TURN STROBES OFF!!!!


And interestingly, the monitor has its own firmware driver for strobes!?



Eizo have also introduced their 'Turbo 240' option, used so far on their Eizo Foris FG2421 gaming display. This allows the user to easily turn the strobe backlight on/off to reduce perceived motion blur.

Applause!! Common Sense is not exclusive!


LG introduced a similar 'Motion 240' option on their 24GM77 gaming monitor

ULMB is a technique provided alongside NVIDIA's G-sync technology, and linked to the G-sync monitor module. It is an alternative option to using G-sync (and cannot be used at the same time), offering the user instead an "Ultra Low Motion Blur" mode. This has been provided on various monitors already featuring G-sync (e.g. Asus ROG Swift PG278Q, Acer Predator XB270HU). For newer games with a higher demand for graphical power, G-Sync is preferable over ULMB.[17]

No idea...



To be continued....

Feb 20, 2018 8:40 AM in response to Keynode

I was trying to see whether PWM and BFI look indistinguishable from one another to the eye, or whether the eye can tell a difference between the two, and how big that difference would be, so I did a simple simulation!


The idea is to draw keyframe images and make them into a single GIF file. It would be like a slowed-down version of their actual frequencies, but more or less it can give you some taste of it. And here it is! (I've tested the GIFs on Safari, Chrome and Opera; I like Safari's way of rendering GIF animation best.)


NOTE: You may want to scale up if images appear too small on your screen. Flashes in these GIFs may cause eye strain though, so be careful!




Okay, PWM first!!

Because PWM is like a 2-phase full-screen strobe {the whole backlight quickly turned on and off}, I only need to draw 2 keyframes:


User uploaded file



...and make them into a GIF that quickly shifts between the two:


User uploaded file



As you can see:

1. The brightness of the area around those letters has now dropped by about 50%! In other words, yes, it dims!

2. Think that's eye strain? Read on!!
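If you want to recreate a rough version of that two-frame GIF yourself, a minimal Pillow sketch along these lines will do it (a simplification: plain text on two solid backgrounds and an arbitrary frame duration, not my exact keyframes):

```python
# Rough 2-frame "PWM" flicker GIF (illustrative only). Requires Pillow: pip install Pillow
from PIL import Image, ImageDraw

W, H = 400, 200
TEXT = "PWM"                            # placeholder text standing in for the original keyframes

def keyframe(background):
    img = Image.new("RGB", (W, H), background)
    ImageDraw.Draw(img).text((W // 2 - 15, H // 2 - 5), TEXT, fill=(0, 0, 0))
    return img

bright = keyframe((255, 255, 255))      # backlight "on" keyframe
dark = keyframe((128, 128, 128))        # backlight "off" keyframe (dimmed, not black, so the text stays visible)

# Alternate the two keyframes forever; duration is per frame in ms (slowed way down vs. real PWM).
bright.save("pwm_sim.gif", save_all=True, append_images=[dark], duration=100, loop=0)
```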





Let's move to BFI!!!

I started by creating a simple 5-phase strobe/scan pattern that takes 5 images based on the same template.

But this time only one fifth of the area is displayed (strobed) in each keyframe.

It takes one cycle to cover all areas, top to bottom!


User uploaded file



Animate that sequence in a GIF:


User uploaded file



I don't know about you but here's what I think:

1. It makes PWM flicker look like an eye massage!

2. It definitely disrupted my concentration.

3. It looks like a migraine trigger!!

4. I think flashes that come in an arranged sequence can do more harm than a plain flash.
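The scanning version is the same trick, just with one lit band per keyframe instead of the whole screen; here's the five-phase cycle sketched under the same simplifications:

```python
# Rough 5-phase "scanning backlight" / BFI GIF (illustrative only). Requires Pillow.
from PIL import Image, ImageDraw

W, H, PHASES = 400, 200, 5
band_h = H // PHASES

frames = []
for phase in range(PHASES):
    img = Image.new("RGB", (W, H), (0, 0, 0))      # everything dark...
    ImageDraw.Draw(img).rectangle(
        [0, phase * band_h, W, (phase + 1) * band_h],
        fill=(255, 255, 255),                      # ...except one strobed band per keyframe
    )
    frames.append(img)

# One full cycle sweeps the lit band from top to bottom, then repeats.
frames[0].save("bfi_sim.gif", save_all=True, append_images=frames[1:], duration=100, loop=0)
```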


(Sorry for the eyestrain, folks; I'll see what I can add in the coming days.)

Mar 7, 2018 7:20 PM in response to Keynode

Can DITHER alone bring about massive eye strain?

I had some quick fun with dither!

What is dither?

The basics are the same: dither is like "PWM" applied to colors.

{Suppose} a panel does not support a certain color [left]!

{Suppose} it can pull off a brighter tone like [1] and a darker tone like [2].

User uploaded file



Switch between [1] & [2] quickly {shown here as a slow-motion GIF}:

User uploaded file



That's a bad flicker!! The leap from color [1] to color [2] gives a large oscillation amplitude, which leads to a bad jittering of color.


So is dithering that bad?

maybe I just picked wrong colors to begin with!!!

In real world the panel must know better than the version of me in that test in how to sift more efficiently to avoid such a leap.


Knowing a 6 bit panel knows better than me, I can't sleep, I too must know, even that means I had to relearn some math


First I must know how colors are differentiated.

A pixel consists of 3 sub-pixels ==> Red, Green, Blue!!!

User uploaded file



On an 8-bit panel each channel has 2^8 = 256 shades, hence the values in RGB: (0, 1, 2, 3, ... 253, 254, 255).

Every combination of R, G, B values gives a unique color, for example rgb(230,187,132), a light brown.

User uploaded file


The total number of possible 8-bit colors would be:: 256^3 = 16,777,216 (hence the number 16.7M!)

But a 6-bit panel only has 2^6 = 64 shades per channel (below); that's a lot of shades lacking.

User uploaded file

(Color banding appears, for lack of shades to smooth the gradients.)



Enter dither!!

Between every two neighboring shades in that pic, three shades would have to be created to fill the gap.


Why three?

Because there are 63 gaps in 64 shades.

63 x 3 + 64 = 253, close enough to an 8-bit panel's 256!!

That gives 253^3 = 16,194,277 colors (hence the number 16.2M on dither-enabled 6-bit panels).
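(You can check the shade counts with a couple of lines if you don't trust my math:)

```python
# Shade counts: native 8-bit vs. native 6-bit vs. dither-extended 6-bit.
shades_8bit = 2 ** 8              # 256 shades per channel
shades_6bit = 2 ** 6              # 64 shades per channel
dithered = 63 * 3 + 64            # 3 synthetic shades in each of the 63 gaps -> 253

print(shades_8bit ** 3)           # 16777216  ("16.7M" colors)
print(dithered ** 3)              # 16194277  ("16.2M" colors on dither-enabled 6-bit panels)
```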


Enough math!!! Now let's "dither"!!! That will begin in the next post...

Mar 8, 2018 5:56 PM in response to Keynode

Because manufacturers have their own practices for pairing and calibrating shades, I'll do it the easy way and leave out unnecessary complications.


First let's see how to make new shades:

Map the 64 shades to RGB values by multiplying them by 4:


[0, 1, 2, 3, 4, .... 61, 62, 63] ==> [0, 4, 8, 12, 16, ... ... 244, 248, 252]

These would be displayed natively, no dithering required.


For the missing values ==> [1,2,3; 5,6,7; 9,10,11; ... ... 245,246,247; 249,250,251],

they will all be dithered in!!!


Let's see how to fill in the gap between shades [4] & [8], that is, [5, 6, 7]!!!

{Other triples follow the same rule.}


[5] is at the 1/4 point across, so 75% of the time the sub-pixel would be put on shade [4] and 25% of the time on [8]!

[6] is the midpoint, a 50/50!

[7] is at the 3/4 point, so 25% of [4] and 75% of [8]!


User uploaded file

The break points [MARK] would be precisely controlled, hence the term Frame Rate Control.

Each cycle would be four, or two, or even one frame long, if the response time can afford it.


Everything's ready now, time to start the dither simulation!

My first pick would be rgb(40, 125, 210)!


Values sent to each sub-pixel would be cycled in this sequence::


R ==> .... || 40 . 40 . 40 . 40 || 40 . 40 . 40 . 40 || ....


G ==> .... || 124 . 124 . 124 . 128 || 124 . 124 . 124 . 128 || ....


B ==> .... || 208 . 208 . 212 . 212 || 208 . 208 . 212 . 212 || ....
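Those sequences follow mechanically from the 75/25, 50/50, 25/75 rule above; here's a small sketch that reproduces them (a simplification of FRC, of course; real panels pair and order shades however their firmware likes):

```python
# Simplified FRC/dithering sequence for one sub-pixel over a 4-frame cycle (illustrative only).
def frc_sequence(value, cycle=4):
    """Approximate an 8-bit value on a 6-bit panel by alternating the two nearest native shades."""
    low = (value // 4) * 4                            # nearest native shade below (natives are multiples of 4)
    high = min(low + 4, 252)                          # nearest native shade above
    frames_high = round((value - low) / 4 * cycle)    # fraction of the cycle spent on the higher shade
    return [low] * (cycle - frames_high) + [high] * frames_high

for channel, value in zip("RGB", (40, 125, 210)):
    print(channel, frc_sequence(value))
# R [40, 40, 40, 40]         -- 40 is a native shade, no dithering needed
# G [124, 124, 124, 128]     -- 125 is 1/4 of the way up, so one frame in four on 128
# B [208, 208, 212, 212]     -- 210 is the midpoint, a 50/50 split
```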


That's like looping through 4 colors, extracted by column:

User uploaded file

Not much difference among them, I must say;

I have a bad feeling the flicker will be weak.


Animate it anyway!

User uploaded file


Flickers?? Yes, it's weak!!

That's a huge letdown!! I was expecting much more than that!

If you can even sense it, that's because I've slowed way, way down that already "sluggish" GIF!!!

Maybe I would rate it 3 out of 10: 2 for the flicker, 1 for the aftershock of disbelief.


First impressions? The spacing between adjacent shades is too small to produce a serious flicker. That's not to say the human eyestrain gene won't pick it up, though; just that compared with BFI and PWM it's definitely dwarfed (my opinion, of course)!

User uploaded file


Other colors at different positions on the RGB scale have been tried for dither too, but I'll not bore you with more GIFs!!

Seems unable to do enough harm?

Or maybe it harms in other ways likely overlooked; remember the [MARK]? I'll return to that later 👿

Mar 13, 2018 7:05 AM in response to K Shaffer

K Shaffer wrote:


..Since my last response to this thread was in 2016, thought to look in ~ as each reply here brings it back to my attention..

There may be a different percentage of users whose biology or physiology (correct words evade me) could be more sensitive to the effects/affects of a panel's tiny 'pixel square refresh' rates.

Just as some may have a marginal propensity toward a similar effect, at a much lesser degree, from these tiny flashing things... as do those with certain kinds of epilepsy. Not all go into a seizure state due to this.

..So that's a margin not unlike casting an untested net into wider waters..😐

Exactly!

And my thoughts on the stats of MBP users reacting to flickers: the two extremes will be at the two tapered ends; the in-betweens are where the real majority is.

In other words, more people will, to some degree, not consider their MBP display perfectly enjoyable.

We're talking about perfectly!! 😀😀😀

Too bad the normal distribution law must be obeyed, or so it seems!! 👿

User uploaded file

Apr 9, 2018 1:24 AM in response to stanislavd

Maybe my last posts on this matter.

I tried a 13.3-inch e-ink monitor with my Mac; for someone like me who searches and reads a lot, I've found my ultimate solution.

User uploaded file


User uploaded file

Yes, 0 backlight, 0 flicker, 0 nothing!

***** at watching videos, but I can't complain!


The reason? I'm scared! I mean scared!

From iristech.co/pwm-flicker/:


Monitors are like a light bulb. But instead of one light bulb, we have millions of them in the size of several inches.



In order to reduce their energy use and brightness, you need to turn them ON and OFF hundreds of times per second.



And this thing, this ON and OFF thing is called flicker.



Our brain is slow and we do not perceive this, but our eyes are fast and our iris starts to open and close like this

User uploaded file

Of course, the amplitude here is much bigger to show the effect, since the flicker rate of the monitor is much faster, but basically our eye starts to contract like a muscle.



You can test this by turning the light in your room ON and OFF quickly and taking a video or looking in the mirror.



The science behind this thing is that in dark we need more light and our pupil is dilated.



When there is a lot of light around us or there is lots of daylight our pupil is undilated.



This is how our eye controls the amount of light entering it.



You may be thinking why do the monitors need to turn ON and OFF and why they can’t just glow all the time, but it’s not that simple.



LED lights will use a lot of energy if they are constantly ON and they may also overheat. Same goes for other monitor types.



The bigger problem is actually that the lower the frequency of this flicker and the bigger the breaks between these 2 states, the more energy-efficient the monitor is.

For me that's the "one picture is better than 1,000 words" moment. I'm done, I opt out!

Apr 9, 2018 3:05 AM in response to Keynode

As for what dynamic PWM is, Google it:

User uploaded file

Very sophisticated now: not a single backlight piece, but a backlight array/matrix.

User uploaded file


Note the spatial/temporal control of these units on the fly, all programmable.

At one moment it could be a double-scan 3-phase,


User uploaded file User uploaded file


the next moment maybe an n-scan x-phase, depending on brightness/contrast/battery, etc.

(which may explain why there is less or more strain at different settings)

And that's just 1D!


And don't forget OLED, where every pixel can be turned on/off individually!

Translation: for an OLED screen with 2000 x 1500 pixel dimensions, that's 2000 x 1500 = 3,000,000, or 3 million micro-backlights!!!


There's no rule saying they can't be used as strobes for dimming or for anti-motion-blur purposes.

User uploaded file User uploaded file

That's a 2-phase, chessboard pattern.

On and off squares are interleaved with each other, so the flickers "blend in".

All patterns are programmable: where and when to be on/off.

In 2 years we might have display A.I. that decides for us what is best for our eyes to ingest or interpret.

I'd like to decide that myself, thank you.

Opt out!!

Nov 5, 2017 8:36 AM in response to Keynode

Keynode wrote:


10. Apple should seriously consider adding PWM and Dithering checkbox to the display settings in a future update, it's not that hard to do. Night Shift is a good move, it shows Apple's concern over the health impact of excessive blue light from LED display is real. So why stop there, Apple? You have enough cash, please continue doing what's right for the customer!! This Eye Strain thing is real!


* "PWM Interface" (or PWM Programming Interface) is just a made up term, some or all of the above may be proven wrong, I'm evolving.

You continue to address Apple in this 9-year-old thread. Apple still is not here. This is a user-to-user technical help community; it is not a medical advice community, nor a channel for questioning Apple.


If you want to contact Apple or provide Apple feedback you can use these links:

Contact - How to Contact Us - Apple

Product Feedback - Apple

Jun 22, 2018 6:51 AM in response to madieDee

MadieDee, after 166 pages the issue you are experiencing has not been resolved. You can post another 166 pages here and the issue will not be resolved. As said numerous times, this is a user-to-user technical help community. Apple is not here. To get Apple's attention to the issue you are experiencing you will have to contact Apple and provide feedback.

With respect to your health, I am glad you are a very healthy, strong, and powerful woman. Maybe writing 10 - 12 hours a day is starting to affect your eyes, but I'm not a doctor so I cannot say. If it were my eyes and I were experiencing headaches, I would most certainly see a doctor.

Nov 5, 2017 1:46 AM in response to Keynode

Here's a new episode from this flicker-hunting season::


A friend has an Acer notebook which has a PWM setting in the BIOS!!! Sounds too good to be true!! But that only came after he suffered a good six months of eye strain, until he bumped into a post that teaches how to unlock hidden advanced BIOS features on Acer notebooks. Once unlocked, the specific item is under the Integrated Graphics sub-menu; it's called Backlight Control, and it lets you select one of the following::


PWM inverted

PWM normal

GMBUS inverted

GMBUS normal


The default value is PWM inverted. He figured he'd better get rid of anything that has the cursed letters "PWM" in it, so he changed it to GMBUS inverted without checking what the heck that is... and poof! The eye strain is no more.


Now some theory ::


  1. Maybe I was wrong; I might have underestimated the PWM factor. It is very probable that most panel/motherboard OEMs have some kind of "PWM Interface"* embedded in BIOS/firmware. I believe this is universal to almost all laptop PCs and Macs, and it is actually an "exception" that Acer allows the user to gain such access to switch it off (through a "hack").
  2. If OEMs have this PWM Interface embedded in BIOS/firmware, the graphics vendor can write code targeting it.
  3. My friend's Acer notebook is listed on the notebookcheck site as PWM-free; obviously that doesn't mean the PWM Interface is not on board.
  4. As long as the Interface stays open, it doesn't matter whether the panel is PWM-free or has a fixed high PWM frequency, because software can easily take over and take control.
  5. The Intel graphics driver, and/or the Intel power-management driver, should have full privileges to this PWM Interface; Intel drivers may deem it "handy" to use PWM for dynamic control of brightness and panel power if they detect such an interface, completely ignoring other methods.
  6. The PWM frequency is probably dynamically adjusted via the driver too: when you watch a full-screen movie, the driver "sees" to it that one frequency band is used; when you edit text or browse web pages (static content), another frequency band is used.
  7. Because the driver can communicate with the LED backlight through the PWM Interface, a driver update can completely change how the lighting behaves if it deems such a change convenient or energy-efficient.
  8. An external monitor has an independent power cord, so power saving is not a concern, and it also has its own brightness mechanism. All of this should be recognizable and read by the graphics driver when the monitor is plugged in, and the driver should then decide it's not its business to force PWM on it. (That said, I don't know if there is watertight evidence that the driver absolutely can NOT force PWM on external displays.)
  9. With that said, maybe it can be concluded that if a monitor is eye-straining there must be a dithering source, unless the monitor itself uses a very bad PWM for dimming (I think that's a rarity).
  10. Apple should seriously consider adding PWM and Dithering checkbox to the display settings in a future update, it's not that hard to do. Night Shift is a good move, it shows Apple's concern over the health impact of excessive blue light from LED display is real. So why stop there, Apple? You have enough cash, please continue doing what's right for the customer!! This Eye Strain thing is real!


* "PWM Interface" (or PWM Programming Interface) is just a made up term, some or all of the above may be proven wrong, I'm evolving.

Oct 16, 2017 4:33 PM in response to RMartin111

I will post this weekly, until someone from Apple responds. Bad news, people.


I own a 15" MacBook Pro 2015. From the beginning, my eyes hurt immensely. The black colours were too sharp and there was no way of making them softer (I tried everything, from using special apps to recalibrating the screen).

After a month, I decided to buy a matte screen glare protector from Moshi. It helped a little, but my eyes still hurt.


After another month I noticed that there were soft white spots on the screen (a monitor defect). I was still under warranty, so I decided to get my screen replaced. The Apple service center confirmed the problem with the white spots and replaced my screen.


And guess what? My eyes didn't hurt any more. The new screen is much softer to the eye. The blacks are not that sharp. No eye pain, no headaches.


My guess is that Apple uses screens from several suppliers. Each supplier provides a different screen quality. And I was lucky to get my screen replaced with one from a good supplier.


This is a crazy lottery, people. Some people will get lucky and have proper screens right from the start. Some will get a worse screen and have problems.


Apple will not admit this, because this defect covers at least 30% to 50% of all released MacBooks. Replacing that many screens would make Apple go bankrupt.


My specs:

Macbook serial number: C0*******8WP

New screen model: 0000A02F

I think my old (bad) screen model was: 0000A02E


If you have headaches, you can look up your screen model number under System Preferences -> Displays -> Color -> Color LCD -> Open Profile -> scroll to the bottom to 'mmod' and look at 'Model' below.

User uploaded file


<Personal Information Edited by Host>
