I looked into this a while back. Some LED light bulbs flicker and some don't; the cheaper ones tend to flicker. The CREE 60-watt equivalent (about $15) flickers, which is a shame since the light it puts out is fairly yellowish and should otherwise be easy to tolerate. The Philips L-prize bulb (very expensive, award-winning) doesn't flicker at all, but the light was too harsh for me to tolerate. I don't think it's an energy-savings issue at all; LED bulbs are already very efficient energy-wise. I think it's that the circuitry to smooth out the rectified waveform (which is what causes the flickering) is (a) expensive, and (b) somewhat bulky, so cramming it into something the size of a light bulb is a serious engineering feat. Serious engineering feats cost money. For now, I think the compact fluorescents are much better than the LED bulbs, and are likely to remain so for many years. I still prefer good old incandescents, of course.
The reason for the flickering has nothing to do with dimming, by the way. Zero, nada, zip. It's because the bulbs are powered by AC, which alternates at 60 Hz. Passing this through a bridge rectifier (a few power diodes, not a big deal) "rectifies" it into a very bumpy waveform that flickers from zero intensity to full intensity at 120 Hz, which some of us can detect and which causes massive eyestrain among those sensitive to it as the iris repeatedly expands and contracts with the 120 Hz cycle. Reducing the flicker to undetectable levels requires smoothing circuitry, which generally includes a big capacitor (bulky, expensive). This is the hard/expensive part, and if nobody complains, the industry won't bother to do it. It's just like the situation with PWM. If you complain enough, they will fix it. They already know how.
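To make the 120 Hz point concrete, here's a minimal numerical sketch. The sample rate, the time constant, and the peak-hold "capacitor" model are all illustrative assumptions, not a real circuit design:

```python
import numpy as np

fs = 100_000                     # samples per second (illustrative)
t = np.arange(fs) / fs           # one second of time
rectified = np.abs(np.sin(2 * np.pi * 60 * t))  # full-wave rectified 60 Hz mains

# Dominant ripple frequency: with a 1-second window, FFT bin k == k Hz.
spectrum = np.abs(np.fft.rfft(rectified))
ripple_hz = int(np.argmax(spectrum[1:]) + 1)   # skip the DC bin
print(ripple_hz)  # 120 -> the light dips to zero 120 times a second

# Crude capacitor model: the output charges up instantly but only decays
# slowly (time constant tau) between peaks. Just a peak-hold sketch of
# what the smoothing cap buys you, not real component values.
tau = 0.05
decay = np.exp(-1.0 / (fs * tau))
out = np.empty_like(rectified)
out[0] = rectified[0]
for i in range(1, fs):
    out[i] = max(rectified[i], out[i - 1] * decay)

print(out[fs // 2:].min() > 0.5)  # True: the "light" no longer dips to zero
```

Drop the capacitor term (tau toward zero) and the minimum goes right back to zero, which is exactly the flicker the smoothing circuitry exists to kill.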
I'm certain that the industry knows all about the issues with LED bulb flicker; they put out these inferior bulbs because they know that most people can't tell the difference. However, with the current price of LED bulbs, it will be a long long time until they become commonly adopted, if ever. The best that we can do is complain loudly and write scathing reviews of flickering bulbs on review sites, so that other people understand the issue.
What you say makes sense. I am no lighting engineer, but from everything I have read about LED bulbs it sounds like most do flicker, and I can confirm this about many of them (though of course not all). I have seen several slow-motion videos showing them flickering; perhaps it is the power supply. So what you're saying is the Philips bulb has technology in it that keeps it from flickering? I notice the L-prize bulb is more expensive than other bulbs that look exactly the same, and it's the cheaper ones I have seen flickering in slow-motion video. Just to add a short list of LED lights I can confirm flicker through video:
-Eyebrow lights on Audis and other cars
-Tail lights on most cars
-Many of the CFL and LED replacement bulbs
-Grocery store freezer lights
-LED track lighting
I will finish by saying I have not found a single LED light source that does not drive me towards a migraine within minutes. I can tolerate CFLs and most fluorescent lighting with anti-glare coated glasses. My theory there is that the flicker from fluorescent is not as bad because the glowing gas and phosphor have some persistence and cannot flicker all the way down to 0% light, whereas an LED is purely electronic and leaves no "ghost image" of light behind, making the flicker much worse. The other factor is the blue light, which I find piercing; however, if it were just the blue light, then fire flicker and yellow fluorescent lights should not be such an issue.
I can only share what I know. I built a website with many links on it about flicker and spectrum, and I have done endless reading. I would say that I am one of the most sensitive people I have talked to, so if you have a theory you would like me to try, I would be happy to oblige.
One other input I have is that an ophthalmologist recently told me my eyes "focus hard", another clue that my eyes cannot focus on the light, leading me to believe the flicker is creating an unstable image for me.
Yes, what I'm saying is that the Philips L-prize bulb has specific circuitry to prevent it from flickering. Hence the high price (at least, that's part of it; they also include red LEDs to improve the color balance). As I said, the industry is aware of this problem. I saw a web advertisement some time back from a German company (can't find it now, sorry) who specifically put this in their ads; something like "most LED lights flicker [high-speed video of flickering LED lights] but our LED lights do not flicker! [high-speed video of their LED lights, which don't flicker]". The catch was that their LED lights were not light _bulbs_; they were more like light sheets or panels. The problem is having to cram everything into a bulb that fits a conventional light socket, which was never designed with LEDs in mind. The thing to realize is that LEDs are direct-current devices; alternating current is what causes the problem. If homes were wired to convert AC to DC before it reached the light bulbs (don't hold your breath on that happening any time soon), then the flicker from LED lights would go away.
LED light bulbs have other problems, too. LED circuitry is very sensitive to heat, so the bulbs generally need big heat sinks that emit a lot of heat; very ironic for "energy-efficient" bulbs. You can burn your hand if you touch one! This is also why very few LED bulbs can be installed in closed fixtures. One that came out recently claimed it could be installed in a closed fixture thanks to a spiffy high-tech cooling system based on silicone (if memory serves), but then (you guessed it) the reviews said that it flickered horribly.
Bottom line: LED light bulbs are a very immature technology. They are very expensive, most of them flicker, most of them give off crappy if not outright glary light, they get hot, and they can't be used everywhere. They make compact fluorescents look very good in comparison. There are other technologies coming down the road, and compact fluorescents are also improving, so LED light bulbs will either have to dramatically improve or they will not survive in the market. Either way, now is not the time to buy them.
Computers: 2008 Mac Pro and MacBook Pro with CCFL screens. I use external screens almost exclusively, even on the laptop: a 2007-ish CCFL-backlit Apple 23" Cinema Display and a brand-new Dell U2410 24" display (both highly recommended). I can't stand LED-backlit displays, like most of us here. As for phones, I use an iPhone 4S. It hurt my eyes at first but I adjusted. No PWM as far as I can see, but a bit high on the blues. My old iPhone 3GS was much easier on the eyes. As I've said before, everyone who is suffering from LED-backlight problems should get a CCFL-backlit monitor while they still can. There are new monitor technologies that may improve the situation (like quantum-dot LEDs), but for now all there is are those horrible "white" LEDs that have way too much blue. Also, I strongly recommend using f.lux on any Mac; it makes a huge difference and it's free.
I've seen the U2711 and it's beautiful. It should be easy enough on the eyes, but (a) it's crazy expensive (about $1000) and (b) some people have complained that the antiglare coating is thick enough to make text appear "sparkly". Personally I find 24" more than large enough. I'd go to a nearby store and check it out for myself if I were you.
I assumed that because there was supposedly no dimming on the bulbs there would be no flicker; I guess not. I really feel your pain. My uveitis is exacerbated terribly by all forms of LED light, and it actually threatens my vision.
I wonder how, besides the PWM and blue frequencies, LED lights differ from normal lights and CCFL-backlit screens.
Thanks all for the info
All the best
Good info Mvanier, and it confirms most of what my unscientific brain has come to understand about LED lights. Unfortunately, I believe LED lights are here to stay. The technology is changing, and businesses are being pushed by government to adopt them. In my neighboring state of Vermont there are hardly any businesses left that have not been visited by the power company and pressured to change over to LEDs. A lot of the new track-lighting-style ones I see give off yellowish, even incandescent-looking light, yet they still destroy my eyes and brain immediately. Would you agree with my theory that the energy savings of LED lighting over time is a wash once you factor in recycling the technology? Also, running this hot all the time, I highly doubt these lights will last the predicted 20+ years they are claiming. From what I have read, these LED lights have plenty of toxins and metals in them to go with all the circuitry it takes to run them, which sounds like a lot more waste than an incandescent bulb. I say it's a catch-22: the CLEAN energy savings fail overall.
I use an old CCFL Dell monitor with Windows XP. I had an iPhone 4 that did not bother me without my anti-glare glasses for 10 minutes, which I recently replaced with another iPhone under the theory that it would be OK. It was not, and now I cannot use my new phone for the web, and texts are almost unbearable, making me more sensitive the more texts I get. My issue is like a bucket of water that I can't let overflow in any 24-hour period: once I add too much, I get a migraine, and looking at my phone or anything LED for 10 minutes is way too much.
Does anyone know how to turn off temporal dithering on OS X?
All I could find was this, which is old and does not work:
I then went to get IORegistryExplorer from Apple's developer support tools site, but that didn't work. So I installed DSDTSE, which has a registry viewer. In the viewer, I couldn't find any keys related to temporal dithering.
It would be very interesting to turn it off and see how things feel. I am trying to dual boot Ubuntu on my mac because I think I'll have better luck fiddling with dithering on that.
I've been reading a bit about dithering and it seems that it could explain why some external monitors are fine. Scroll down to dithering here:
8-Bit vs. 6-Bit
Color depth was previously referred to by the total number of colors the screen can render, but for LCD panels the number of levels that each color can render is used instead. This can make things difficult to understand, so to demonstrate, we will look at the mathematics. For example, 24-bit or true color is comprised of three colors, each with 8 bits of color. Mathematically, this is represented as:
- 2^8 x 2^8 x 2^8 = 256 x 256 x 256 = 16,777,216
High-speed LCD monitors typically reduce the number of bits for each color to 6 instead of the standard 8. This 6-bit color will generate far fewer colors than 8-bit as we see when we do the math:
- 2^6 x 2^6 x 2^6 = 64 x 64 x 64 = 262,144
This is far fewer than a true-color display, to the point that it would be noticeable to the human eye. To get around this problem, manufacturers employ a technique referred to as dithering. This is an effect where nearby pixels use slightly varying shades of color that trick the human eye into perceiving the desired color even though it isn't truly that color. A color newspaper photo is a good way to see this effect in practice. (In print the effect is called half-tones.) By using this technique, manufacturers claim to achieve a color depth close to that of true-color displays.
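The arithmetic in that quote, plus the temporal variant of the trick, can be sketched in a few lines of Python (a toy model; real panel controllers are proprietary and certainly more sophisticated):

```python
# Colors renderable at a given per-channel bit depth.
def palette_size(bits_per_channel):
    return (2 ** bits_per_channel) ** 3

print(palette_size(8))  # 16777216 (true color)
print(palette_size(6))  # 262144

# Toy temporal dithering: a panel fakes an in-between level by
# alternating the two nearest levels it *can* show, frame by frame,
# carrying the rounding error forward.
def temporal_dither(target, n_frames):
    lo = int(target)
    shown, err = [], 0.0
    for _ in range(n_frames):
        err += target - lo
        if err >= 1.0:
            shown.append(lo + 1)  # overshoot frame
            err -= 1.0
        else:
            shown.append(lo)      # undershoot frame
    return shown

frames = temporal_dither(100.25, 8)
print(frames)                     # [100, 100, 100, 101, 100, 100, 100, 101]
print(sum(frames) / len(frames))  # 100.25 -> the eye averages, the pixel flickers
```

That frame-to-frame bouncing between 100 and 101 is exactly the "fizzing pixels" effect discussed later in this thread.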
The Dell U2410 that everyone says is so great appears to be a 12-bit monitor which supports 1 billion colors. http://www.dell.com/ed/business/p/dell-u2410/pd
I believe that the MacBook Air (which I have) is a 6-bit display. I'm not sure about other laptops or phones. I am now going to start researching which laptops/phones have higher-bit displays. Also, as some have mentioned here, I think it's possible that the software changes the dithering technique, so different OSes, or different monitors, may dither differently, explaining why some devices/OSes feel better than others.
An interesting quote from one of the links above:
This is the biggest problem for individuals looking to purchase an LCD monitor. Most manufacturers do not list the color depth of their display, and even fewer list the actual per-color depth. If the manufacturer lists the color as 16.7 million colors, it should be assumed that the display is 8-bit per color. If the colors are listed as 16.2 million or 16 million, consumers should assume that it uses a 6-bit per-color depth. If no color depth is listed, it should be assumed that monitors of 12 ms or faster are 6-bit and that 20 ms and slower panels are 8-bit.
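That rule of thumb is mechanical enough to write down directly. The thresholds below just encode the quote's heuristic; they say nothing definitive about any particular panel:

```python
def guess_bits_per_channel(advertised_colors):
    """Infer per-channel depth from the marketing color count,
    following the rule of thumb quoted above."""
    if advertised_colors >= 16_700_000:
        return 8   # "16.7 million colors" -> true 8-bit per channel
    if advertised_colors >= 16_000_000:
        return 6   # "16.2 million" / "16 million" -> 6-bit plus dithering
    return None    # not listed: fall back to the response-time heuristic

print(guess_bits_per_channel(16_700_000))  # 8
print(guess_bits_per_channel(16_200_000))  # 6
```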
I have already tried to turn off temporal dithering under Linux, but although according to the system logs my changes were active, I couldn't feel any improvement.
For my desktop computer, where I have an Nvidia card, I had to adjust nv50_crtc_set_dither in drivers/gpu/drm/nouveau/nv50_display.c. Please note that depending on your GPU it may be in a different location.
I changed from DYNAMIC2x2 to STATIC2x2 but couldn't feel or see any difference.
I tried as well on a notebook with an Intel GPU; there you have to look, for example, for haswell_set_pipeconf in drivers/gpu/drm/i915/intel_display.c. Again, it may be a different function depending on your GPU.
However, for Intel GPUs it seems spatial dithering is already enabled (PIPECONF_DITHER_TYPE_SP), so I didn't need to change anything. But unfortunately the display was still terrible for my eyes.
So although I started this with great hope, I kind of abandoned the project. Maybe I missed something and you'll have better luck...
How can you tell that the solution described at psychtoolbox.org for Mac OS X is not working? Could you describe the problem you encountered? Also, if you have an Nvidia GPU in your Mac, you usually also have an embedded Intel GPU. Did you enable the Nvidia GPU before trying?
That's good to hear others have tried it.
My guess is that if dithering were turned off, it would be very noticeable to the eye, like the images shown here: http://www.smallhd.com/products/ac7/oled-vs-lcd-on-camera-field-monitors.html Did you try taking any video of the monitor? Using the gradient here http://vizfact.com/wp-content/uploads/2012/01/Black-and-White-Desktop-Wallpaper-Circular.png and videoing it will clearly show dithering.
I called the Apple store yesterday to ask whether the MBP Retina has a 6- or 8-bit screen. The guy didn't know and asked around. Someone there swore up and down that it was 8-bit, but I couldn't get a solid answer as to whether it was 8-bit hardware or a 6-bit screen with dithering to fake 8-bit. I took a side-by-side video with my MBA from 2011 and posted it here:
Edit: To be clearer, my understanding is that nearly all laptops are 6-bit hardware, but display 8-bit color via dithering.
Make sure you watch it in HD. The dithering is just as bad on the MBPR as it is on the MBA. The dithering effect (dancing pixels) is much less noticeable with the video on youtube. On my high quality video locally, it's very apparent.
My current theories, which are completely speculative:
- The issue is temporal dithering
- I am not finding any screen flicker on the MBPR, MBA, or iPhone 5
- Both hardware and software affect temporal dithering
- The bits of color a screen can display determine whether and how much dithering is necessary.
- Even if dithering is used, the screen is still the one displaying it, so two 6-bit monitors are not exactly the same.
- Software can possibly change the dithering technique, which could explain why different OSes affect us differently. It could also explain why software updates affect dithering.
- The Dell monitor mentioned in this thread is considered "great" because it is a 12-bit monitor capable of displaying 1 billion colors (as opposed to a 6-bit laptop panel's 262,144)
My next steps:
1) Scour the internet to find whether MBPR is 6bit or 8bit.
2) Find and buy a laptop that is 8bit (or higher? I don't know if higher exists.) I've heard Sony may have something.
3) Dual boot ubuntu and attempt to turn off dithering.
A few new things this morning:
1) I have taken video and pictures of the MBP Retina with a Nikon D3100. I have not been able to find PWM. For the pictures, I tried the technique mentioned here, using a black screen with a white vertical line, then a horizontal line. For the videos, I tried all different shutter speeds. Still no PWM.
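One way to make that camera test quantitative is to reduce each video frame to a single mean-brightness number and look for a spectral peak at the flicker frequency. A synthetic sketch follows; the brightness trace is invented for illustration, and the frame-extraction step (e.g. with OpenCV) is omitted:

```python
import numpy as np

# Stand-in for a high-speed capture: one brightness value per frame
# at 1000 fps, with a 120 Hz ripple on a steady level (made-up values).
fps = 1000
t = np.arange(fps) / fps
brightness = 0.8 + 0.1 * np.sin(2 * np.pi * 120 * t)

# Remove the DC level, then find the spectral peak.
# With a 1-second window, FFT bin k corresponds to k Hz.
spectrum = np.abs(np.fft.rfft(brightness - brightness.mean()))
flicker_hz = int(np.argmax(spectrum))
print(flicker_hz)  # 120 -> rectified-mains flicker; ~0 would mean steady light
```

Note the catch with ordinary 30 or 60 fps video: 120 Hz is above the Nyquist limit there, so the ripple aliases down to a lower (or zero) apparent frequency. That is why slow-motion/high-speed capture reveals flicker that normal-speed playback hides.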
2) I called Sony to inquire about a laptop with an 8-bit screen. I was transferred through three levels of tech support and no one could answer.
3) I found some interesting links.
In fact, exceeding the sRGB gamut can be detrimental to image quality. Most notebook displays are not even 8-bit native (the IPS panel in the Retina MacBook Pro is, however) and you need greater than 8-bit precision when going beyond the sRGB gamut, or the individual steps (gradation) start to become too separated, and banding is introduced into the image. So you are expanding the gamut of the display, while compromising the quality of all sRGB/BT.709 content.
The key metric here is bits per channel. Proper pro displays need at least 8 bits per channel, which is why 6-bit-only TN panels are out. For gaming, I’m not so sure 8- and 10-bit panels are critical. Yes, 6-bit displays end up using dithering (that’s bouncing a pixel rapidly between two colour states to mimic a third). I get very upset by bad dithering (visible as fizzing pixels), but done well it’s pretty much invisible to the end user. It’s worth noting that lots of the latest budget-priced IPS screens are in fact 6-bit, where IPS has traditionally been 8 and 10-bit.
The Retina display reduces glare up to 75 percent while maintaining incredible color and quality. In fact, it has a 29 percent higher contrast ratio than a standard MacBook Pro display. Blacks are blacker. Whites are whiter. And everything in between is rich and vibrant. IPS technology gives you a wide, 178-degree view of everything on the screen, so you’ll see the difference at practically any angle. And you’re going to love what you see.
It's very hard to find how many bits per channel a laptop screen has, because it's not typically advertised nor do the sales/tech reps know the answer.
I'm beginning to wonder whether the best way to find the information is to see how much of the AdobeRGB or sRGB color space a monitor can display. For instance, the Dell U2410 can display 96% of the AdobeRGB color space, which is quite impressive.
According to this, the MBPR covers 67.3% and the early 2011 MBP hi res matte covers 74.7% of AdobeRGB.
EDIT: I just found a great resource that lists specifics on the screens. Now I just need to find out how to get the screen module number in OSX.