2388 Views 7 Replies Latest reply: Jun 1, 2008 6:33 PM by The Looby
Does the Apple miniDVI-to-DVI adapter, going from DVI-D to DVI-D
or DVI-D to HDMI, result in a loss of quality ...?
Nope. Digital signals don't work that way. They pretty much either
work perfectly or fail completely. There's no "gradual deterioration"
of video quality; a cable/adapter either works ...or it doesn't.
should of wrote:
There is always some signal loss; test it.
Yep, cables/connectors will always cause some loss in signal strength,
but cables and connectors are unavoidable (unless you can suggest a
magical method for connecting to an external monitor without them).
However, in the digital domain, signal strength has ZERO EFFECT on
data quality. If the 1's and 0's reach their destination in recognizable
form, the cable +works perfectly+. If they don't, the cable +fails totally+.
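That pass/fail behavior is easy to see in a toy model. The sketch below is illustrative only (it is not real TMDS decoding): a digital receiver slices each incoming voltage against a threshold, so moderate attenuation leaves the recovered bits bit-for-bit identical, while severe attenuation makes the link fail outright.

```python
# Toy model of a digital link: attenuation has zero effect on the
# data until it pushes levels past the receiver's decision threshold.

def transmit(bits, attenuation):
    # Map bits to idealized voltage levels, then attenuate them.
    return [(1.0 if b else 0.0) * attenuation for b in bits]

def receive(levels, threshold=0.5):
    # The receiver only asks: is the level above the threshold?
    return [1 if v > threshold else 0 for v in levels]

bits = [1, 0, 1, 1, 0, 0, 1]

# 20% signal loss through a cable/adapter: data still perfect.
assert receive(transmit(bits, 0.8)) == bits

# Catastrophic loss: the link fails completely, not "gradually".
assert receive(transmit(bits, 0.4)) != bits
```

Real receivers add noise margins and error detection on top of this, but the principle is the same: below the threshold of failure, the bits arrive unchanged.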
BTW, there's no "signal conversion" in going from MiniDVI to DVI or
HDMI; the "adapter" is entirely passive and simply provides electrical
connections to the appropriate connector pins.
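In other words, the adapter is nothing but a wiring table from one connector shape to another. The net names below are illustrative (generic TMDS/DDC labels, not the actual Mini-DVI pinout), but they show the idea: every signal passes straight through, with no re-encoding or conversion anywhere.

```python
# Illustrative only -- net names are generic, not Apple's real pinout.
# A passive adapter is just net-to-net wiring between two connectors.
MINI_DVI_TO_DVI = {
    "TMDS_DATA0+": "TMDS_DATA0+",  # same differential pair, new connector
    "TMDS_DATA0-": "TMDS_DATA0-",
    "TMDS_CLOCK+": "TMDS_CLOCK+",
    "TMDS_CLOCK-": "TMDS_CLOCK-",
    "DDC_CLOCK":   "DDC_CLOCK",    # monitor-identification bus, also passive
    "DDC_DATA":    "DDC_DATA",
}

# Every net passes straight through unchanged -- nothing is converted.
assert all(src == dst for src, dst in MINI_DVI_TO_DVI.items())
```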
...in the binary world, the only grading system is "pass/fail",
"Do, or do not. There is no try."
should of wrote:
My assertion was that there is a greater chance of errors with adapters.
My assertion is that there is ZERO chance of errors without a
MiniDVI-to-<whatever> adapter -- because NO ONE makes
cables that plug directly into a MiniDVI. If you don't make a
connection, it can never fail.
So yes, there's a "greater chance of errors" -- compared to
an entirely imaginary cable that never existed. (But only if
you're running so close to the ragged edge that an extra
10 milliohms of contact resistance will break the camel's back.)