Well, video played back on a Mac or PC is almost always going to have a minor frame skip now and then. Playback isn't locked to the display's refresh the way it should be in a home video product.
When playing a video on a Mac or PC you might also get video tearing. If you could slow down time you'd see the top portion of a video frame drawn to the screen, then the screen refresh, then the bottom portion of that frame drawn along with the top part of the next frame. In other words, a snapshot in time would show half of one frame and half of another on the screen at once. The tear can migrate up or down the picture over time because the frame rate of the movie is slightly different from the refresh rate of your monitor.
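Just to put a number on that "migrates over time" part, here's a tiny back-of-the-envelope sketch (my own illustration with assumed rates, not anything quicktime actually does) of where a tear line would land on successive frames when the player's update rate and the monitor's scanout are off by that ~0.1%:

```python
# Rough sketch with assumed numbers: a player blitting new images at
# 59.94 per second while the monitor scans out at a true 60 Hz.
update_hz  = 60000 / 1001    # 59.94... new images per second (assumed)
refresh_hz = 60.0            # true 60 Hz monitor scanout (assumed)

for n in range(4):
    t = n / update_hz                     # when the player swaps in image n
    scan_pos = (t * refresh_hz) % 1.0     # how far down the screen the scanout is
    print(f"image {n}: tear at {scan_pos * 100:.2f}% down the screen")
# The tear creeps ~0.1% per frame, so it takes ~1000 frames (~17 seconds)
# to crawl the full height of the picture.
```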
I don't think quicktime does this though. I suspect it would instead skip or drop a frame as described next...
This also shouldn't happen on a home video device like the apple tv, because playback of the movie should be synced to the refreshing of the screen. The player waits until just after a refresh finishes (between refreshes) to load the next frame of video to the screen, and it should get that done before the next refresh starts. If a frame isn't ready in time, or takes too long to load, it has to wait until the next refresh is done, and by then it's either running a frame late or it has to skip or drop a frame. This is what we're seeing: DROPPED FRAMES.
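Here's roughly what I mean, in code form - just my sketch of the general vsync-locked approach (with a hypothetical decode callback and a stand-in present() function), not anything I know about how the apple tv is actually written:

```python
import time

REFRESH = 1.0 / (60000 / 1001)     # one 59.94 Hz refresh period (assumed rate)

def present(image):
    """Stand-in for whatever actually scans the image out to the display."""
    pass

def play(frames, decode):
    # Sketch of vsync-locked playback: aim to have each frame decoded before
    # its refresh; if it isn't ready, this sketch lets it slip to a later
    # refresh (a real player that must stay locked to audio would throw the
    # frame away instead - that's a dropped frame).
    next_vblank = time.monotonic() + REFRESH
    for frame in frames:
        image = decode(frame)                  # may be slow: disk, buffer, codec
        late_by = time.monotonic() - next_vblank
        if late_by > 0:
            # Missed the window; skip ahead to the next refresh we can make.
            next_vblank += (int(late_by / REFRESH) + 1) * REFRESH
        time.sleep(max(0.0, next_vblank - time.monotonic()))
        present(image)
        next_vblank += REFRESH
```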
I'm going to beat a horse until it's good and dead here.. with some technical info:
I just checked and my computer monitor is currently at 60hz. Ok, this is going to be confusing, but computer monitors run at 60hz, 75hz and other rates, and a computer monitor's '60hz' is a true 60, while TVs refresh at 59.94hz. TVs will often say they're 60 in a status menu, but that's a generalization. Some newer TVs can display many different sync rates including true 60hz - and my Dell monitor is capable of syncing to video at 59.94hz when fed via an HDMI input. But true 60 is rare in the video/tv world. If a TV is being fed HDMI from a BD player or apple tv it's 59.94 no matter whether it says '720/60' or not. And.. to confuse things even further, from a BD player it could be 23.976fps (and say 24). But I'll get back to my point...
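For anyone who wants the exact numbers behind all the 59.94/23.976 shorthand, they're just the nominal rates divided by 1.001:

```python
from fractions import Fraction

# The "NTSC" rates are the nominal rates divided by 1.001 (i.e. x * 1000/1001).
for nominal in (24, 30, 60):
    exact = Fraction(nominal * 1000, 1001)
    print(f"{nominal} -> {float(exact):.5f} ({exact})")
# 24 -> 23.97602 (24000/1001)
# 30 -> 29.97003 (30000/1001)
# 60 -> 59.94006 (60000/1001)
```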
Oh, and a quick side note for those in the UK.. all modern TVs will play HD at 59.94 even in the UK, which is why in Blu-ray authoring we don't have to convert 60i material to PAL and author a different version for that market (like you do with DVDs). If we don't region code, we can release the same BD there as here in the USA. Therefore the apple tv can output the same 720/60 in all markets. HDMI is a worldwide standard.
If I play the Sky High movie I purchased from iTunes using quicktime on my mac, it will try to play it at its native 23.976fps. I've not seen the video tearing thing in quicktime, and I assume qt syncs to the refreshing of the video card - in my current case true 60hz. If quicktime does the traditional thing it would introduce a 2:3 pulldown: play the first frame for two refreshes, the second frame for three, then back to two and three, and repeat. That 2:3:2:3 pattern turns 23.976fps into 59.94fps. However, my computer monitor is refreshing at true 60hz, which means there's a .1% difference in rates. Eventually it has to break the cadence to stay in sync - but with film material it never has to skip a source frame. Doing a little math, it takes about 1000 refreshes (roughly 400 film frames) for that .1% difference to add up to one full refresh, so once every ~16.7 seconds it would need to hold a frame for one extra refresh - something like 2:3:2:3:3:3 instead of the usual 2:3:2:3:2:3. Honestly, even as a video professional I don't think you would notice this because every one of the original frames still plays at least twice.. no single frame gets completely dropped. Our eyes have been used to seeing this cadence forever, as this is how 24fps movies were converted to play on our old TVs.
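Here's that back-of-the-envelope math written out (my own numbers and assumptions, nobody's actual player code):

```python
from fractions import Fraction

film    = Fraction(24000, 1001)   # 23.976... fps film source
monitor = Fraction(60)            # true 60 Hz computer monitor

# A strict 2:3 cadence shows each film frame for 2.5 refreshes on average,
# which only fills 59.94 of the monitor's 60 refreshes each second.
surplus_per_sec = monitor - film * Fraction(5, 2)   # ~0.05994 refreshes/sec

print(float(1 / surplus_per_sec))   # ~16.68 s between cadence breaks
print(float(monitor / film))        # 2.5025 refreshes needed per film frame
```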
If the software isn't smart enough it MIGHT do something like a 2:3:1:3:2:3 - and THAT you might notice. Chances are qt just repeats each frame as many times as needed until it's time to play the next one, in which case you'd probably get slightly odd cadences, which might or might not be more noticeable. I really haven't examined this closely enough to know what they're doing - I don't watch movies on my computer. BUT I hope that apple tv, which supposedly plays everything at 60 (I believe 59.94 and not true 60), is programmed to turn 23.976 into 59.94 using the traditional 2:3 pulldown.
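And just to check my own guess about the "repeat each frame until the next one is due" approach, here's a quick simulation of the cadence that would actually fall out of it on a true 60hz monitor (again, my assumption about the method, not anything I know about qt's internals):

```python
from fractions import Fraction
import math

film    = Fraction(24000, 1001)   # 23.976... fps film
monitor = Fraction(60)            # true 60 Hz monitor

# Refresh number on which each film frame would first appear if you simply
# show every frame at the first refresh at or after its due time.
first = [math.ceil(n * monitor / film) for n in range(1442)]
cadence = [b - a for a, b in zip(first, first[1:])]   # repeats per frame

print(cadence[:12])           # mostly alternating 3s and 2s
print(sorted(set(cadence)))   # [2, 3] -- every frame still shown 2 or 3 times
# The cadence break shows up as a back-to-back 3,3 roughly every ~16.7 seconds:
print(sum(1 for a, b in zip(cadence, cadence[1:]) if a == b == 3))
```

Either way, nothing in there ever shows an original frame fewer than two times, which is why I don't think you'd spot it.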
Now, if the movie is stored on a slow computer hard drive, or if it takes too long to decode a frame from the compressed stream, the computer can't get frames ready fast enough to play them all - and you get a dropped frame. For film-sourced material that means one or more of the original 23.976 frames in that second doesn't get played at all. *This is what I think we are seeing in the new apple tv.*
A couple of important things. In my description above I was talking about playback on a computer with a monitor refreshing at true 60hz. Like I said, I believe the apple TV is not true 60hz like a computer but 59.94, so for 23.976 material it should be able to stick to a strict 2:3 cadence without ever needing to break it.
The other important thing is that dropped frames are more likely to be visible on 30fps progressive material (actually 29.97) than on 24fps movie/film material, because the cadence is only 2:2:2:2. And they're even more likely on real tv stuff that has two fields per frame: there are 59.94 discretely different fields per second to fit into 59.94 output frames, so the cadence is 1:1.. if any data is delivered late or any frame is decoded late, you see a genuinely dropped frame.
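To put numbers on that headroom idea, here's how many output frames each source image gets at a 59.94 output - the less slack, the more a late frame hurts:

```python
from fractions import Fraction

output = Fraction(60000, 1001)   # 59.94 Hz output rate

sources = {
    "film 23.976p": Fraction(24000, 1001),
    "video 29.97p": Fraction(30000, 1001),
    "interlaced 59.94 fields/sec": Fraction(60000, 1001),
}
for name, rate in sources.items():
    print(name, float(output / rate))   # output frames per source image
# film gets 2.5, 29.97p gets 2.0, and fields get exactly 1.0 --
# with fields there is zero slack, so anything late is a visibly lost frame.
```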
Couple of points I'm making here with this overly long explanation:
1. It doesn't matter how video plays on a computer - a home video product like apple tv should play tv shows and movies without dropping any frames.
2. In Sky High, which is 23.976, I noticed fewer dropped frames than I did on the tv show, which was probably 59.94.
3. IF the apple tv WERE true 60fps, and you were playing a 59.94fps tv show on it, it would have to break cadence (show a duplicate frame) once every ~16.7 seconds. I don't think that's what's going on, because there's no reason for apple to make the apple tv deliver true 60fps. No home video products work that way.
4. I suspect that source frames are being dropped - meaning that something is preventing them from being presented in time. That's either a problem decoding frames fast enough or pulling data out of the buffer fast enough, and it could be caused by a background task. Hopefully it's not that the hardware simply isn't fast enough. (There's a little toy model of this idea right after this list.)
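Here's the toy model I mentioned in point 4 - completely made-up decode times, just to show the mechanism of "a frame wasn't ready inside its 59.94hz slot, so it never gets shown":

```python
import random

FRAME_BUDGET = 1001 / 60000      # seconds available per frame at 59.94 fps
random.seed(1)

played = dropped = 0
backlog = 0.0                    # how far behind schedule the decoder is
for _ in range(60 * 59):         # roughly a minute of 59.94 fps video
    # Made-up decode times: usually comfortable, occasionally a slow one
    # (disk stall, busy background task, a hard-to-decode picture...).
    decode = random.gauss(0.012, 0.002)
    if random.random() < 0.002:
        decode += 0.03
    backlog = max(0.0, backlog + decode - FRAME_BUDGET)
    if backlog >= FRAME_BUDGET:  # a whole slot was lost: a source frame is dropped
        dropped += 1
        backlog -= FRAME_BUDGET
    else:
        played += 1

print(played, dropped)           # a few real drops out of ~3500 frames here
```

If the real box is doing anything like this - decode or I/O occasionally blowing past its 1/59.94-second budget - you'd get exactly the kind of intermittent skips people are describing.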
Ok.. I think the horse is sufficiently beaten and dead.