1080i or 720p

Hi,

I've recently purchased the ATV, and it's connected to my Samsung HDTV via HDMI. I've got a 720p TV that also supports 1080i, but I want to know which resolution setting will give the best performance.

Any advice?

PC, Windows XP

Posted on Jan 31, 2008 11:38 AM

26 replies

Jan 31, 2008 11:52 AM in response to The Cat

This came up recently; take a look here:

http://discussions.apple.com/message.jspa?messageID=6479307

It's not entirely the same topic but the discussion is good.

1080i splits the vertical resolution across fields, alternating the odd and even rows on each pass... people can notice a visible difference in quality, whether it's fuzziness, less contrast... whatever.

I would say most people would prefer a progressive source and signal, as it renders all the rows on each frame. Obviously, 1080p is 1080 lines of vertical resolution per frame; 1080i delivers half that on each pass.

Since half of 1080 is 540... most say 720p is better because it displays all 720 lines at once. While 1080i may provide better horizontal resolution... the effective vertical resolution leaves something to be desired for many. I will say it's subjective.

I'll also add that the link talks about interlaced-to-progressive upconversion. I would add to it that if your source is progressive, keep it that way.
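Just to visualize the difference (a minimal, hypothetical Python sketch; the function name is mine, not anything from the ATV or the linked thread):

```python
# Illustrative only: how many lines reach the screen in a single pass
# for a progressive frame vs. an interlaced field (simplified model).

def lines_per_pass(vertical_lines, interlaced):
    """Lines delivered per frame (progressive) or per field (interlaced)."""
    return vertical_lines // 2 if interlaced else vertical_lines

print("720p frame :", lines_per_pass(720, interlaced=False))   # 720 lines, all rows
print("1080i field:", lines_per_pass(1080, interlaced=True))   # 540 lines, odd or even rows only
```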

Jan 31, 2008 11:55 AM in response to The Cat

It depends on a lot of things, and in the end it really comes down to which you perceive as the better picture. Since you're asking, I assume you don't see a difference.

FWIW I have mine set at 720p for two reasons. First, virtually all (if not all) of my source video is progressive. Second, the pixel count on my TV is neither 1280 x 720 nor 1920 x 1080, so the TV will scale the signal whatever I choose; better that the source (when I'm using 720p content - it doesn't matter otherwise) is scaled once rather than twice. But I guess I don't see any difference either.

Jan 31, 2008 1:19 PM in response to artmovement

artmovement wrote:
It does not depend. 720p > 1080i.

http://alvyray.com/DigitalTV/Naming_Proposal.htm

cheers,


Firstly, this argument isn't relevant because they are comparing a source with 720 lines of resolution against one with 1080; we are talking about comparing the same source scaled at differing ratios.

Secondly, it isn't relevant because they are comparing 1080i at 60 fields per second (or half frames, if you will) against 720p at 60 full frames per second. In our case 1080i still has 60 fields per second, but the 720p we are discussing has 30 full frames per second.

And thirdly, the article's arithmetic doesn't hold up for us anyway. In 1/30th of a second, 720/60p displays 1,843,200 pixels (1280 x 720 x 2) across two full frames, whereas 1080i displays 2,073,600 pixels (1920 x 1080) across two half frames.
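To spell that arithmetic out (a quick hypothetical Python check, assuming 60 full frames per second for the 720p broadcast case and 60 fields per second for 1080i):

```python
# Pixels delivered in a 1/30 s window (just re-checking the figures above).

w720, h720   = 1280, 720    # 720p frame
w1080, h1080 = 1920, 1080   # 1080i frame; each field carries half the rows

pixels_720p60  = w720 * h720 * 2             # two full frames  -> 1,843,200
pixels_1080i60 = w1080 * (h1080 // 2) * 2    # two half frames  -> 2,073,600

print(f"720/60p in 1/30 s: {pixels_720p60:,} pixels")
print(f"1080i60 in 1/30 s: {pixels_1080i60:,} pixels")
```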

Feb 1, 2008 11:47 AM in response to The Cat

So, all good comments here, but I would chime in:

What is the native resolution of your TV?
I ASSUME your TV can only display ONE resolution, period - that's the case for pretty much all of them. That does not mean it cannot TAKE all inputs; a 1080p TV for example will certainly accept 1080i, 720p, 480p and so on. However, it needs to convert (upconvert most likely) the signal.

You say you have a Samsung. Is that an HLR-xxxx?
I.e. do you know the native resolution (spec) of the TV? Is it 1280x720 or 1920x1080 (i or p being irrelevant in THIS case)?

The decision you have to make, really, is where the conversion should occur, and which device gives you the BEST conversion.
The ATV's native HD output is 1280x720. So you get the best picture possible by leaving it at that IF your TV is natively 720p as well.
If your TV is natively 1080i, then trust your eyes: WHERE do you want the conversion to 1080i to occur? In the TV? The ATV? Maybe a receiver you have it connected through?
Some devices, based on their video converters, will do a slightly better upconversion than others.

Cheers,
Dan

Feb 1, 2008 12:19 PM in response to dosers

dosers wrote:
The decision you have to make, really, is where the conversion should occur, and which device gives you the BEST conversion.
The ATV's native HD output is 1280x720. So you get the best picture possible by leaving it at that IF your TV is natively 720p as well.
If your TV is natively 1080i, then trust your eyes: WHERE do you want the conversion to 1080i to occur? In the TV? The ATV? Maybe a receiver you have it connected through?
Some devices, based on their video converters, will do a slightly better upconversion than others.

Cheers,
Dan


True, but also remember your 1080 TV may actually have, say, 1366 x 768 pixels and is going to scale the source whatever you throw at it. In that case, to avoid double conversions, one would just let the TV do all the scaling.

Feb 2, 2008 12:23 PM in response to The Cat

Hi there,
So, 100Hz is fairly popular in Europe with EDTV resolutions. I do believe the TV actually has a setting for it under its resolutions. 100Hz is basically a field-doubling method (still a multiple of the original AC line frequency).

There is no 'direct' correlation to the resolution. It just means your picture refresh rate is 100Hz (versus regular PAL at 50), giving a more stable, flicker-free picture. The same advice as for resolution applies: you want to match as closely as possible to minimize the amount of conversion (and frame-rate conversion can get complicated, as you don't want any frames inserted, doubled, or dropped - a problem with NTSC/PAL conversion).

To make it more complicated 🙂, refresh rate is NOT necessarily the same as frame rate (a film projector, for example, shows 24 frames a second but would likely have a refresh rate of 48Hz or higher for stability).
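A toy illustration of that distinction (hypothetical Python; it just assumes the display repeats each frame a whole number of times):

```python
# Refresh rate vs. frame rate: the refresh is usually a whole multiple of the
# source frame rate, so frames are simply repeated rather than altered.

frame_rate = 24  # film source, frames per second

for refresh_hz in (48, 72, 96):
    repeats = refresh_hz // frame_rate
    print(f"{refresh_hz} Hz refresh: each 24 fps frame is shown {repeats} times")
```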

Soo... I would, again, use whatever looks best to you. It's common for LCDs to have an off-standard resolution (that is, one not matching the HD broadcast resolutions). Your TV will have to convert either way - up OR down. In your case, I would think 720p is probably best as well, as you'll stay in the non-interlaced domain. In the end, if there is no simple rule to apply (i.e. fewest conversions), then whatever looks best to you is best 🙂

Best,
Dan

Feb 2, 2008 12:41 PM in response to The Cat

In my opinion, since you have a 720p TV, the best option is to select 720p output from the ATV. Assuming you are watching 720p video and output 1080i from the ATV, the ATV up-converts the video to 1080; the TV then down-converts it back to 720 so it can display it. That is two extra steps of digital processing, which is likely to add digital artifacts to the video. Keep in mind that your TV must convert any incoming video that is not 720 lines to 720 before it can display it.
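A rough way to picture the two signal paths (hypothetical Python; the figures are just the 720/1080 numbers discussed in this thread):

```python
# Count the extra processing steps a 720p source goes through on each path
# before it reaches a 720-line panel (simplified model, ignores deinterlacing).

def count_conversions(source_lines, path):
    """Each change in line count along the signal path is one extra step."""
    steps, current = 0, source_lines
    for stage in path:
        if stage != current:
            steps += 1
            current = stage
    return steps

# ATV set to 720p: 720 source -> ATV outputs 720 -> TV displays 720
print("720p output :", count_conversions(720, [720, 720]), "extra conversions")

# ATV set to 1080i: 720 source -> ATV upscales to 1080 -> TV downscales to 720
print("1080i output:", count_conversions(720, [1080, 720]), "extra conversions")
```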

