I'm not too sure I agree with that.
With which point? The quality being equivalent?
It largely depends upon the TV you're using, as some of the lower-end HDTVs have skimped on their component input designs in favour of HDMI. Further, some of the higher-end TVs that will do 1080i->1080p de-interlacing will only do this on the HDMI inputs, and not the component ones.
Note that this may not be true for all source devices either (many HD DVD and Blu-ray players provide poor-quality component output as well), but as far as the TV is concerned, the output quality seems to be pretty much on par using either method. This is based on a fair amount of testing that I and several others have done, using several different TVs.
There is no visible distinction on my 32" HP LCD TV (720p) between HDMI and component input. I simply use HDMI because it requires one cable instead of five.
On my Toshiba 62" DLP TV (1080p upconversion), there is a noticeable difference between the two inputs with photos and higher-end HD content, but it's obviously indistinguishable for source content that's only 640x480 or less to begin with (which is 95% of my content right now). With the Toshiba, I attribute the gap to the component inputs not doing the 1080i->1080p de-interlacing conversion that occurs on the HDMI inputs.
Note as well that you do have to use good-quality component cables that are properly shielded, since the temptation is to cop out and use normal RCA audio/video cables. It will make a visible difference if the cables aren't up to par.