
We've been shooting streetscapes from a car, HDV 60i. A local lab guy suggested we look at our footage on an NTSC external monitor. He said this would give us the best assessment of our footage. When we do that there is lots of moire-ing. When we see this footage on the iMac we're editing on, it looks fine. The NTSC external monitor is working with the interlaced footage; as I understand it, both our iMac's monitor and new HDTVs are progressive scan and need footage that's either natively progressive or de-interlaced. When we see our footage on our computer's monitor, I'm assuming the video card has de-interlaced it. So why would we be learning anything of use by looking at the material on an NTSC monitor? Wouldn't the fact that it looks fine on our editing monitor mean it will be fine for broadcast?


Thanks.


John

Final Cut Pro 7

Posted on Sep 26, 2015 4:00 PM


8 replies
Question marked as Best reply

Sep 27, 2015 5:34 PM in response to Canada John

"A local lab guy suggested we look at our footage on an NTSC external monitor. He said this would give us the best assessment of our footage. When we do that there is lots of moire-ing."


How are you looking at this on an external monitor? Camera directly connected? Or connected to the computer? If so...how?


"So why would we be learning anything of use by looking at the material on an NTSC monitor?"


Computer displays and TVs/broadcast monitors are very different. Not only do they use a different color space, but the computer display also isn't designed to play back interlaced footage properly...and you shot interlaced. So if you intend what you shot to air, you need to judge the quality on a monitor that plays it back in the proper color space and displays the interlacing. The computer display will differ in color and won't show you the full quality...deinterlacing throws away half the vertical resolution.
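
If you want to see that in the most literal terms, here's a tiny Python/numpy sketch of the crudest deinterlace there is (a "bob": keep one field, line-double it back to full height). This is just an illustration, not what FCP or any particular video card actually does:

    import numpy as np

    def bob_deinterlace(frame, keep_upper=True):
        # Keep every other line (one field), then line-double it back
        # to full height. Half the vertical detail is simply gone.
        field = frame[0::2] if keep_upper else frame[1::2]
        return np.repeat(field, 2, axis=0)

    # A 1080-line interlaced frame has 540 lines per field.
    frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
    progressive = bob_deinterlace(frame)
    print(progressive.shape)  # (1080, 1920), but only 540 distinct lines

Smarter deinterlacers blend or interpolate instead of line-doubling, but they're all estimating lines that were never captured for that instant in time.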


"Wouldn't the fact that it looks fine on our editing monitor mean it willl be fine for broadcast?"


Nope. Furthest thing from the truth.

Sep 27, 2015 8:31 AM in response to Shane Ross

Thanks for the reply.


I had an old NTSC monitor (TV) connected to my computer as an external monitor via FireWire and a Sony DSR-11 deck. I haven't used it for years, but the lab guy, who knew my old set-up, said to use it for our test.


I appreciate your comments about colour, but at this point we're more concerned about the moire-ing we see on this SD NTSC monitor.


Must admit, too, that I'm not at all clear about how de-interlacing works or when it's taking place. I've been reading as much as I can about it over the past few days, and what I've been saying is based on my best understanding of what I've read.


I know that in FCP, if you set the Canvas at anything other than 100%, it only shows one field of the interlacing, and that's obvious to the eye. But when we view our material at full screen, it seems to look fine. I don't get the impression I'm only viewing one field, but maybe I am.


It's material I've been reading in the last few days that suggested the video card was de-interlacing the material before it got to the computer screen, allowing the progressively scanning monitor on my iMac to show it. That seemed to make some intuitive sense. Still, equally intuitively, I've always felt that de-interlaced material should lose quality, and visibly so.


Sent to the NTSC external monitor, I knew that we were seeing the material presented interlacedly (if that's a word). That, by definition, is the only way an NTSC monitor/TV can do it, right? It's in this form that we now see all sorts of moire-ing, specifically on the sides of houses, which often have lots of horizontal siding in this part of the world.


But though we noticed this, I began to wonder why it would matter, since the film will be broadcast digitally and be seen on digital TVs. It's that thought that brought me back to my progressively scanning iMac screen. If it shows our material without moire-ing, why would a digital broadcast to progressively scanning HD TVs show any?


So if an external monitor is to be our test, wouldn't that mean an HDTV? And I guess that brings me to the question: what is done to interlaced footage to broadcast it via digital networks to digital TVs?


Thanks once more.


John

Sep 27, 2015 5:35 PM in response to Canada John

Couple of things:

You are confusing the display card in your computer (which connects the computer to a computer display) with a video card, which converts the computer signal to a video signal for a TV monitor.


To properly see your video as it will look on a TV monitor, you need a video card, such as:

https://www.aja.com/en/family/io

and a good TV monitor. If you are very serious about this, you can spend thousands on a broadcast-quality monitor:

http://www.flandersscientific.com/index/

or just go to Costco and get a good-quality TV monitor.


Any HD TV set sold in the US will support 1080i; that is what most broadcasters are transmitting. We deliver 1080i commercials and programming all the time.


Your test via the DSR-11, while well intentioned, is not going to show you much. Did you down-rez your HD content to SD DV so that it will play on the DSR-11?

HD 1080i is interlaced upper-field dominant; DV is always lower-field dominant. If they are mismatched during the conversion, you will see interlacing artifacts.
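
Here's a toy Python illustration of why that mismatch hurts - purely schematic, not the actual DSR-11 conversion. In upper-field-first material the upper field of each frame is the earlier snapshot in time; treat it as lower-field-first and the fields play back out of temporal order:

    # Two upper-field-first frames: each field is a snapshot at a
    # distinct field-time t0..t3.
    capture_order = ["upper@t0", "lower@t1", "upper@t2", "lower@t3"]

    # Displayed with the correct (upper-first) dominance:
    print(capture_order)   # t0, t1, t2, t3 -> smooth motion

    # Displayed as if the material were lower-field-first:
    swapped = ["lower@t1", "upper@t0", "lower@t3", "upper@t2"]
    print(swapped)         # t1, t0, t3, t2 -> two steps forward,
                           # one step back, every field

Every moving edge ends up jittering forward and back once per field, which reads as tearing and judder on an interlaced display.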

Did you connect the monitor via an S-Video cable, as opposed to the composite video tap?

I wouldn't expect to get a good look at my video if I had to down-convert it to DV 720x480 and view it on an old NTSC monitor. Moire and color crawl can result from all of this.
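
The moire part is easy to demonstrate with a short numpy sketch (illustrative only - real scalers are more sophisticated). Fine horizontal detail like siding, dropped from 1080 lines to 480 without filtering, beats against the new sampling grid; low-pass it first and the beat goes away:

    import numpy as np

    # Fine horizontal "siding": alternating light/dark lines, 1080 rows.
    siding = np.zeros((1080, 720), np.uint8)
    siding[0::2] = 255

    # Naive 1080 -> 480 downconvert: keep ~every 2.25th line, no filtering.
    rows = np.linspace(0, 1079, 480).astype(int)
    naive = siding[rows]   # irregular sampling of a 2-line pattern
                           # beats against the grid -> visible moire

    # Averaging adjacent lines first (a crude low-pass) removes the beat.
    lp = ((siding[:-1].astype(np.uint16) + siding[1:]) // 2).astype(np.uint8)
    smooth = lp[np.clip(rows, 0, lp.shape[0] - 1)]

    # naive flips between 0 and 255 in an uneven rhythm; smooth is flat
    # gray, which is what a proper scaler should give you.
    print(np.unique(naive), np.unique(smooth))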


There is nothing wrong with interlaced footage - it just needs to be handled correctly through the entire post process, all the way to delivery. You won't gain anything by deinterlacing footage without a legitimate reason.


You need to be able to monitor your content in the form that it will ultimately be displayed to the audience.

If your work only appears on the Web, you can probably get away with just using the computer display.

But if it is going to be broadcast, then you need the correct monitoring hardware and workflow to give yourself confidence as to what you are actually delivering.


MtD

Sep 27, 2015 5:51 PM in response to Meg The Dog

Thanks for the reply.


Guess I have to get an HD TV to act as my external monitor, and that makes sense. I'd raised some of your concerns with the lab guy when he suggested we test our material using the Sony DSR-11 deck and NTSC TV, but he seemed to think it would work.


I've checked my computer and, if I read it correctly, my video card is an ATI Radeon HD 5750. The computer has a DVI output, so I suppose I could get the cable that would connect it to an HD TV monitor.


Thanks again.


John

Sep 27, 2015 7:48 PM in response to Canada John

Your footage is HD...and as pointed out, the field dominance is different from SD. When you output via FireWire through the DSR-11 to an SD TV...you are flipping the field order...and looking at your footage as SD...not HD. The lab guy was right when it came to SD, and very specifically, DV. But you are shooting HD, so that won't work...not if you want to really see what you shot. You need an HD monitor and, as mentioned, an HD I/O card.

