Latest reply: Oct 1, 2015 12:04 AM by harrision_1234
  • tfouto Level 1 Level 1 (0 points)



    If the white screen hurts the most, then it should have nothing to do with dithering or OpenGL 3-D graphics. It seems the backlight is the main culprit.


    The best test would be to open the monitor and just stare at the backlight for a few minutes with no pixels in between, and see whether the pain appears or not. Of course, it's not an easy or cheap thing to do.

  • mvanier Level 1 Level 1 (0 points)

    tfouto: I hope you're right.  It does seem odd that dithering would be the main problem in this case, as most of the things I've read say that dithering is most apparent on dark backgrounds (making dark colors look more grainy).  On Linux with Intel graphics you can theoretically control the PWM frequency of a monitor.  I'm going to try this and report back.  Some people have managed to make otherwise bad laptops usable by greatly increasing the PWM frequency (e.g. from 200 Hz to 1200 Hz).  I wonder why the higher frequencies aren't routinely used.
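As a rough illustration of what "controlling the PWM frequency" involves: backlight controllers typically derive the PWM frequency by dividing a fixed reference clock by a programmable period value, so raising the frequency means shrinking that period. The clock rate and divider below are illustrative assumptions, not actual Intel register documentation.

```python
# Sketch of how a backlight PWM frequency is typically derived:
# a fixed reference clock divided by a programmable period value.
# REFERENCE_CLOCK_HZ and DIVIDER are assumed values for illustration.

REFERENCE_CLOCK_HZ = 24_000_000  # assumed fixed input clock
DIVIDER = 128                    # assumed fixed pre-divider

def pwm_frequency(period_register):
    """PWM frequency that results from a given period register value."""
    return REFERENCE_CLOCK_HZ / (DIVIDER * period_register)

def period_for_frequency(target_hz):
    """Period register value needed to approximate a target PWM frequency."""
    return round(REFERENCE_CLOCK_HZ / (DIVIDER * target_hz))

# Going from ~200 Hz to ~1200 Hz means shrinking the period
# register by the same factor of six:
p_slow = period_for_frequency(200)    # ~938
p_fast = period_for_frequency(1200)   # ~156
```

Under these assumptions, the sixfold frequency increase people report (200 Hz to 1200 Hz) is just a sixfold reduction of the period value, which is why it can sometimes be done with a register write and no hardware changes.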

  • dmendel Level 1 Level 1 (0 points)

    While PWM at lower frequencies may certainly cause problems for many people, that cannot be the source of the problem on Apple displays: Apple displays do not use PWM to control brightness. So something else is going on. One reason I suspected dithering as a culprit (or one piece of the puzzle) is that when I looked at the new 27" iMac display I detected a kind of subtle shimmering that made it very hard to focus on the screen, even at low brightness and with f.lux. From what I have read (and I am no expert and may be completely wrong), dithering involves alternating between two nearby colors to approximate a color the panel cannot display directly; this process might produce a kind of shimmer or flicker that some people are sensitive to.
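The alternation dmendel describes is usually called temporal dithering, or frame rate control (FRC). A minimal sketch of the idea, assuming a panel with only 6-bit levels approximating an 8-bit target by flipping between the two nearest levels across frames:

```python
# Temporal dithering (FRC) sketch: a 6-bit panel approximates an
# 8-bit shade by alternating between the two nearest 6-bit levels
# over successive frames. Illustrative only.

def dither_frames(target_8bit, num_frames):
    """Return per-frame 6-bit levels whose average approximates target_8bit."""
    exact = target_8bit / 255 * 63       # map 8-bit target onto the 6-bit scale
    lo, hi = int(exact), min(int(exact) + 1, 63)
    frac = exact - lo                    # fraction of frames spent on `hi`
    frames, acc = [], 0.0
    for _ in range(num_frames):
        acc += frac
        if acc >= 1.0:                   # error accumulation picks hi vs lo
            frames.append(hi)
            acc -= 1.0
        else:
            frames.append(lo)
    return frames

frames = dither_frames(target_8bit=200, num_frames=60)
avg = sum(frames) / len(frames)
# The frame average lands between the two 6-bit levels, recreating the
# in-between shade -- at the cost of frame-to-frame flicker.
```

The per-frame flipping between `lo` and `hi` is exactly the kind of high-frequency shimmer some people report being sensitive to, even though the time-averaged color is correct.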

  • mvanier Level 1 Level 1 (0 points)

    dmendel: While dithering could certainly be a part of the problem, I am not 100% convinced that Apple displays don't use PWM or some variation of it.  The exception is the iPhone displays, which to my eyes look rock-solid (at least my iPhone 4s does) and cause very little eyestrain except perhaps due to some excess blue light.  It is true that Apple displays don't appear to use conventional PWM, where the entire display flicks on and off at high speed.  However, they could be doing something cleverer, e.g. at 50% brightness having half the LEDs off and half on at any given time, perhaps with randomized periods to avoid movement artifacts.  I look at Apple LED displays and they do not look solid at all to my eyes.  They don't show obvious strobing with the pencil test, though, so again, naive PWM seems to be out.  I think, though, that if Apple had perfected a current-controlled backlight which never flickered, it would be all over their ad copy, just like it is for Dell, BenQ, and Eizo, who have all recently come out with monitors with current-controlled backlights.

  • Scott98981 Level 1 Level 1 (0 points)

    Has anyone tried the Kindle HDX screen with quantum dots? As I understand it, it's backlit by a WLED, but the spectrum is shaped by red and green quantum dots to better cover the color space.

  • tfouto Level 1 Level 1 (0 points)

    I have this theory about high-frequency PWM.


    Maybe these monitors use such a high PWM frequency that we can't measure it. Even conventional photodiodes/phototransistors aren't able to measure such high frequencies. We would need high-speed photodiodes...


    If we had high-frequency PWM monitors, the problem wouldn't be the traditional flickering, but the fact that the maximum brightness of monitors these days is increasing to insanely high levels, along with higher saturation/contrast. So, for example, 30% brightness means the monitor is at 100% brightness for 30% of the time. That's really intense for the eyes, even if it's only 30% of the time.


    What would happen if we could stare at the sun, blinking at 20 kHz, and manage to keep our eyes closed 90% of the time? We would perceive 10% of the brightness, but the sunlight would still enter at full strength 10% of the time. I am sure our eyes would be damaged within a few minutes, or at most hours...


    I also think that the narrow and unbalanced spectrum of this light is not really healthy for our eyes. It's not just the high blue-light energy; it's really the non-linear nature of the spectrum. And combining this with ever-higher brightness doesn't help at all...
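The duty-cycle arithmetic behind tfouto's theory can be sketched in a few lines: with PWM dimming, "30% brightness" means full peak intensity for 30% of each cycle, whereas a current-controlled backlight runs continuously at 30% intensity. Both deliver the same average, but very different peak exposure.

```python
# Sketch of the duty-cycle argument: PWM dimming and current-control
# can produce the same average brightness with very different peaks.

def pwm_average(peak_intensity, duty_cycle):
    """Average perceived brightness of a PWM-dimmed backlight."""
    return peak_intensity * duty_cycle

pwm = pwm_average(peak_intensity=1.0, duty_cycle=0.30)   # pulsed: peak 1.0
dc  = pwm_average(peak_intensity=0.30, duty_cycle=1.0)   # continuous: peak 0.3
# pwm == dc == 0.30, yet the PWM display's peak is over 3x higher.
```

Whether that difference in peak exposure matters physiologically is exactly what the thread is debating; the math only shows that an identical average hides very different waveforms.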

  • SimonStokes Level 1 Level 1 (0 points)

    I  just ordered a new Mac Mini and a Benq GW2450HM (same specs as the one noted above). Should have everything set up and tested  within the next 10 days.


    Definitely looking forward to hearing the results of this! Looks promising, although I'm still unsure whether my migraines are caused by the LEDs or the PWM. Fingers crossed it's the PWM..

  • mvanier Level 1 Level 1 (0 points)

    @Scott: If the Kindle HDX uses quantum dots, that's great news!  This is a really good new technology that I've been hoping would hit the market soon.  Quantum dot displays don't use WLEDs.  Instead, you have a standard blue LED at 450 nm (usually) and quantum dots to generate the green and red components.  It should yield a much better color spectrum than WLEDs, pretty similar to BGr-LEDs actually.  Of course, there is still the 800-pound gorilla of how dimming is done.

  • mvanier Level 1 Level 1 (0 points)

    tfouto, that's a very interesting theory.  It reminds me of a guitar amplifier I once had.  It was only 22 watts but sounded as loud as most 100 watt amplifiers (a Mesa Boogie Studio 22, if anyone cares).  It turns out that what they were doing is making the attack (initial volume) extremely intense to trick the ear into thinking that the sound was larger than the wattage would suggest.  It hurt my ears at first but I eventually got used to it.  I've heard some people say that the Apple displays also hurt their eyes at first but they got used to it too.  Certainly PWM at > 1000 Hz shouldn't be perceptible to most people unless there is a beat pattern going on.


    As for the unbalanced spectrum, it may be annoying and distasteful, but I don't think it's harmful.  Office fluorescent lights have a horribly unbalanced spectrum, but most people tolerate them well (at least, ever since the ballast frequency was increased from 60 Hz to >20 kHz!).  I do think that the ultimate pixel would be capable of generating any wavelength or combination of wavelengths whatsoever, but I doubt I'll live to see that.

  • peter_watt Level 3 Level 3 (905 points)

    "What would happen if we could stare at the sun, blinking at 20 kHz, and manage to keep our eyes closed 90% of the time? We would perceive 10% of the brightness, but the sunlight would still enter at full strength 10% of the time. I am sure our eyes would be damaged within a few minutes, or at most hours..."


    On that theory, shutter speed on a digital camera would be irrelevant, and the whole concept of square-wave attenuation would be brought into question. Maybe that is your point when applied to eyes.

    In support of the theory, my son's model railway (railroad) had a speed controller I built using square-wave attenuation. The bursts of 100% voltage, like little hammer blows, gave smoother control at low speed compared to voltage-attenuated store-bought regulators.

    That, however, was overcoming friction, which by definition has a threshold coefficient, whereas organic receptors tend to react smoothly and have natural recovery mechanisms. On a macro level, exposing skin to sunlight for one minute in ten would not lead to cumulative damage, however long you sat out in the sun.


    My qualifications are engineering, not medical, so purely speculative.

  • mvanier Level 1 Level 1 (0 points)

    Just a quick note to point out an interesting reference: According to this paper, under some circumstances the eye can detect flicker at frequencies as high as 2 kHz (2,000 Hz), or in some cases even higher.  Above 3 kHz there is pretty much no detection.  That gives us an idea of how high PWM frequencies would need to be to completely eliminate flicker detection.


    On that note, here's another link which describes a hardware modification to increase the PWM frequency of a TV to 2.2 kHz: acklight_PWM_Frequency_of_Your_TV.  The interesting thing about this article is the disclaimers.  Apparently, high PWM frequencies make the TV much hotter, and presumably use much more power, which may explain why laptop manufacturers persist in using such low PWM frequencies on their panels despite all the associated disadvantages.  It seems like current control is the only good way out.

  • kvoth Level 1 Level 1 (0 points)

    I upped my prism again and I'm feeling way better.


    Everyone here needs to be measured for strabismus/prism. It is critical that you 'relax' your eyes during this test or you will be mis-measured.


    Relaxing your eyes is, at first, hard to do. When you catch yourself zoning out, staring at one thing but not really looking at it, you are most likely relaxing your eyes. Practice it by looking at a pencil up close: when you relax your eyes, you will see more than one pencil. Then take a video camera and record yourself doing this -- you'll see how your eyes point normally.


    Good luck, guys. I hope some of you listen.

  • peter_watt Level 3 Level 3 (905 points)

    Take care, I suggested going to the optician early on in the thread and was called a troll.

  • milocricket Level 1 Level 1 (15 points)

    Definitely a backlight problem; I get eyestrain too.  I always keep brightness at 4 bars.  My friends have Retina displays, and I can use those at full brightness with no eyestrain.  It would be nice if I could basically trade in my LED-backlit display for a Retina one.

  • Jessiah1 Level 1 Level 1 (0 points)

    Lol, I am sure you're not a mythical creature. Working with an optician or neuro-ophthalmologist is only common sense in relation to the issue everyone has here; if we had no eyes, we would certainly not have any issue!