HDCP - Is it hardware or software?
Here is my question: if my TV's DVI input is HDCP-compliant and I purchase an HDMI to DVI cable, is my connection still HDCP compatible? I'm wondering whether it's a pin or configuration in the actual hardware that makes a connection HDCP, or whether it's an encoding thing done on the circuit board. If it is software encoding, then it seems that DVI to HDMI should be bullet-proof. (I know there are DVI-I, DVI-D, and DVI-A specs, but I don't know whether that matters as long as I can see the picture and hear the sound.)
It just seems to me that the connection should work if everything is HDCP-compliant and not work at all if it isn't; I don't know why it would be intermittent.
So, is it a hardware specification, meaning you have to search for HDMI to DVI cables that ACTUALLY SAY they maintain HDCP continuity? If so, I can't find any such cables.
If it is software or firmware, it seems to me there would be less emphasis on the hardware connection type and more on the decoding circuitry. But everyone refers to HDMI as inherently HDCP, as if it were a property of the hardware connection itself.
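From what I've read, the HDCP part is really a handshake done by the chips at each end, and the cable just carries the wires it travels over. Here is a rough Python sketch of the HDCP 1.x key agreement as I understand it (toy numbers, made-up master key, not a real implementation), just to illustrate that the "HDCP" lives in the silicon/firmware logic, not in the connector:

import random

N_BITS = 40          # KSV width: 40 bits, exactly 20 of them set
KEY_MOD = 2 ** 56    # device private keys are 56-bit values

def make_ksv(rng):
    """Pick a valid KSV: 40 bits with exactly 20 ones."""
    bits = [1] * 20 + [0] * 20
    rng.shuffle(bits)
    return bits

def make_master_matrix(rng):
    """Symmetric 40x40 matrix of 56-bit secrets (stand-in for the licensing authority's master key)."""
    m = [[0] * N_BITS for _ in range(N_BITS)]
    for i in range(N_BITS):
        for j in range(i, N_BITS):
            m[i][j] = m[j][i] = rng.randrange(KEY_MOD)
    return m

def device_keys(master, ksv):
    """A device's 40 private keys: rows of the master matrix summed over its own KSV bits."""
    return [sum(master[i][j] for j in range(N_BITS) if ksv[j]) % KEY_MOD
            for i in range(N_BITS)]

def shared_key(my_keys, their_ksv):
    """Km: add (mod 2^56) your private keys at the positions where the peer's KSV has a 1."""
    return sum(my_keys[i] for i in range(N_BITS) if their_ksv[i]) % KEY_MOD

rng = random.Random(2008)
master = make_master_matrix(rng)

tv_ksv, src_ksv = make_ksv(rng), make_ksv(rng)
tv_keys = device_keys(master, tv_ksv)
src_keys = device_keys(master, src_ksv)

# Both ends derive the same secret without ever sending it over the wire;
# if either end botches (or keeps redoing) this handshake, the picture drops.
assert shared_key(tv_keys, src_ksv) == shared_key(src_keys, tv_ksv)
print("shared key agreed:", hex(shared_key(tv_keys, src_ksv)))

If that picture is right, a passive HDMI-to-DVI cable can't "carry" or "break" HDCP by itself; the handshake either succeeds or fails between the two devices, which might also explain the intermittent behavior I'm seeing.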
Last thing: does the DVI type (type A, type D, type I, etc.) matter to HDCP? I plugged my cable in, and it fit and worked, but not without some glitches like the ones I described.
intel Mac, Mac OS X (10.5.2)