Is de-interlaced footage always worse than interlaced footage when watched on a TV? Or am I doing something wrong?
I am working with FCP X, using interlaced material (1080i) shot with a Panasonic TM900 camera.
So the original footage is AVCHD at 17 Mbps.
Typically I apply the following workflow to export the film:
- Export from the FCP X timeline without any conversion as a master file (i.e. ProRes 422)
- Process through Compressor 4 to an H.264 QuickTime movie with an average bitrate of 20 Mbps, keeping the original resolution and frame rate (1080, 25 fps):
a) with all frame controls switched off -> results in a 1080i movie
b) with frame controls switched on for de-interlacing, using the "better" or "best" filter and output fields set to "progressive" -> results in a 1080p movie (a rough command-line equivalent of both variants is sketched below)
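For reference, the two variants can be roughly reproduced on the command line. This is only a sketch, assuming ffmpeg with libx264 is installed; the file names are hypothetical, and ffmpeg's yadif filter stands in for Compressor's frame controls, which are a different implementation:

```python
import subprocess

# Hypothetical file name: the ProRes 422 master exported from FCP X.
MASTER = "master_prores.mov"

# Case a) -- keep the material interlaced (roughly: frame controls off):
# encode 1080i25 as interlaced-aware H.264 at ~20 Mbps.
subprocess.run([
    "ffmpeg", "-i", MASTER,
    "-c:v", "libx264", "-b:v", "20M",
    "-flags", "+ilme+ildct",          # interlaced motion estimation / DCT
    "case_a_1080i.mp4",
], check=True)

# Case b) -- deinterlace to 25p before encoding (one progressive frame
# per field pair, comparable to a "progressive" frame-controls output).
subprocess.run([
    "ffmpeg", "-i", MASTER,
    "-vf", "yadif=0",                 # deinterlace, keep 25 fps
    "-c:v", "libx264", "-b:v", "20M",
    "case_b_1080p.mp4",
], check=True)
```

Comparing these two outputs against the Compressor ones on the same TV could help show whether the artefacts come from the deinterlacing step itself or from the particular encoder settings.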
In case a), when watching the film on the media player of a Samsung LCD TV, the output quality is extremely good, with no visible difference from the original AVCHD file (which I watch by connecting the camera to the TV via HDMI).
But: in case b), all scenes with fast movement are much worse in quality (played with the same media player as well as with an Apple TV 3). Moving objects are no longer sharp, and there are "block artefacts" around fast-moving objects. The same happens in scenes with fast zooms. The quality is visibly worse compared to the original AVCHD.
The effect does not change when I use the "best" filter setting.
I am now wondering:
Is a de-interlaced 1080p film by definition worse in quality than the 1080i film when watched on a TV?
This sounds strange to me, as I understand that in the end my Samsung TV also does some kind of "on-the-fly" deinterlacing, since the LCD screen is of course progressive.
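To make the field arithmetic concrete: 1080i at 25 fps carries 50 fields per second, each with 540 lines. Below is a minimal numpy sketch (purely illustrative; the naive line-doubling and field-averaging are assumptions, not what Compressor or the TV actually implement) of the difference between bob-deinterlacing to 50p, which keeps every motion sample, and merging to 25p, which halves them:

```python
import numpy as np

# One stored 1080i25 frame interleaves two fields captured 1/50 s apart:
# even rows = top field (time t), odd rows = bottom field (t + 1/50 s).
frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)

top_field = frame[0::2]      # 540 lines, earlier moment in time
bottom_field = frame[1::2]   # 540 lines, later moment in time

def bob(field: np.ndarray) -> np.ndarray:
    """Naively line-double a 540-line field to a full 1080-line frame."""
    return np.repeat(field, 2, axis=0)

# "Bob" deinterlacing, roughly what a TV can do on the fly:
# 50 progressive frames per second, every motion sample survives.
frames_50p = [bob(top_field), bob(bottom_field)]

# Deinterlacing to 25p instead must collapse each field pair into ONE
# frame (here crudely, by averaging), so half the motion samples are
# gone before the H.264 encoder ever sees the material.
merged = (top_field.astype(np.uint16) + bottom_field.astype(np.uint16)) // 2
frame_25p = bob(merged.astype(np.uint8))
```

If a 25p deinterlace discards half the temporal samples like this, that alone might explain why fast motion looks softer than with the TV's own on-the-fly deinterlacing, though block artefacts could also point at the encoder.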
Or am I doing something wrong?
Happy for any advice and help!
iMac, Mac OS X (10.7)