Matt Gahs wrote:
Maybe I can help clear up some confusion (while also saying I'm having the same "bounce back to menu" problem).
• I have never had the "Cannot play this movie" error.
• There are certain titles for me that, when I try to play them, it takes three to four times longer to load them (black screen with white pinwheel) than other shows. Normal shows take about ~15 seconds, these troublesome shows take ~45 seconds or more.
(snipped list of shows)
• I am on 2Mbps DSL, hard ethernet from the modem to an AirPort Extreme, from there hard ethernet to the Apple TV. No wireless devices are enabled when this issue occurs.
A 2Mbps connection is very slow. The issue you're experiencing makes sense.
There are two different issues at play here.
First, the producer of the video content gets to determine what quality level to use for video compression. A lot of people get hung up on how "hi def" something is, usually meaning whether the video is 480p, 720p/i, or 1080p/i. Unfortunately it's not that simple. A movie recorded at 720p but with very little compression (i.e. a high bitrate) can look better than a movie recorded at 1080p but with very heavy compression (i.e. a low bitrate). The 1080p video will only look better if its bitrate is high enough to support the extra resolution.
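To put some rough numbers on that (these bitrates are made up for illustration, not taken from any real encode): you can think of bitrate divided by pixels-per-second as the "quality budget" the encoder gets to spend on each pixel.

```python
# Toy comparison: a 720p stream at a high bitrate vs. a 1080p stream
# at a low bitrate. Numbers are hypothetical, purely for illustration.

def bits_per_pixel(width, height, fps, bitrate_bps):
    """Average bits the encoder can spend on each pixel of each frame."""
    return bitrate_bps / (width * height * fps)

# 720p at 6 Mbps vs. 1080p at 3 Mbps, both at 24 frames per second
bpp_720 = bits_per_pixel(1280, 720, 24, 6_000_000)
bpp_1080 = bits_per_pixel(1920, 1080, 24, 3_000_000)

print(round(bpp_720, 3))   # ~0.271 bits per pixel
print(round(bpp_1080, 3))  # ~0.060 bits per pixel
```

Even though the 1080p stream has over twice the pixels, the 720p stream here gets roughly four times the bits per pixel, which is why it can end up looking better.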
Second, there is no uniform speed for playing movies. Movies have variable bitrates.
If you were to use a digital camera to take two pictures of different subjects without changing any camera settings (same resolution, same compression level, saved as JPEG), you'd find that the two pictures have different file sizes on disk... even though the resolution of the camera didn't change. The file size depends on the complexity of the image and how well it compresses.
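You can see the same effect with any general-purpose compressor. A quick sketch using Python's zlib (standing in for JPEG, which is obviously a different algorithm, but the principle is the same): two "images" of identical raw size, one flat like a clear sky, one noisy like a detailed crowd scene.

```python
import random
import zlib

random.seed(0)

size = 100_000  # pretend each image is 100 KB of raw pixel data
flat_image = bytes(size)                                    # all zeros
noisy_image = bytes(random.randrange(256) for _ in range(size))

flat_compressed = zlib.compress(flat_image)
noisy_compressed = zlib.compress(noisy_image)

print(len(flat_compressed))   # tiny -- the flat image compresses extremely well
print(len(noisy_compressed))  # about the same as the original; noise doesn't compress
```

Same "resolution" in, wildly different sizes out, purely because of content complexity.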
Movies aren't still pictures, so they employ a few other tricks to reduce bandwidth and take advantage of the fact that two frames of video, when compared side by side, are often nearly identical. So rather than transmitting a full frame of video, it's only necessary to transmit the differences between frames. You could overlay a grid of squares on the frame (like a checkerboard). If the contents of the first square in the grid (say the upper-left corner) is identical to the same square in the 2nd frame of video, then you don't really need to transmit that square again. In reality it's a bit more complex than this, but you get the idea.
If the scene is still (the camera isn't panning, zooming, or changing focus, and very little in the scene is moving), then the differences from frame to frame will be very small, and that segment of video will compress extremely well. But if the opposite is true and just about everything across the scene is changing, then in comparison that scene won't compress nearly as well and will require a higher bitrate, and thus more bandwidth from your network.
Basically the "bitrate" of a movie will vary during playback. I suspect what you're encountering is that many (most?) of your movies & TV shows compress well enough that the Apple TV doesn't have to buffer very long before it's able to start playing. But just occasionally you hit upon some content that's either recorded at a substantially higher quality or simply demands more bandwidth because of the way it was filmed.
The fact that it takes 45 seconds or longer for some content, but only 15 seconds for other content, is not necessarily any cause for concern.