I may be wrong, but doesn't the Mac mini share its graphics with the CPU rather than using a dedicated card? Plus, the normal output from FCP is only a representation until you either use an output card (Kona, etc.) hooked to a monitor or print to tape. This may explain the difference you're seeing.
Either way, quality control off a computer monitor is not a good way to go! Calibrated broadcast monitors (not TVs) are the only way to judge your final output.
"Gee... why does this really high-end ProRes HQ footage look like crud on the cheapest, lowest-end Mac on the market, but look good and normal on the professional machines?"
Asked and answered.
Wonder why the Mac mini isn't recommended for FCP now? Especially when working with RED, for Pete's sake? I mean, really...
As far as the shared video card goes, you definitely have a point, except that the MacBook Pro has dual graphics modes. By default it uses the shared graphics system, like the Mac mini; only when you go to the Energy Saver settings and turn on the dedicated graphics card can you use the dedicated graphics engine. To that point, I have monitored the exact same footage from Final Cut Pro on a MacBook Pro using the shared-memory graphics, and it did not exhibit the issue.
With that said, I totally see what you're saying about proper monitoring requiring a broadcast monitor and an I/O device.
But I'm just curious what exactly is causing my particular problem, so I know what to avoid and what I can get away with, so to speak.
*When I look at the footage on my latest-model 2.26GHz Mac mini, off a fast FireWire 800 G-Technology drive, on a Dell ST2210 21.5-inch monitor, the Canon 7D footage looks like it's low res. When I QC the footage on a Mac Pro with either Apple monitors or a Sony broadcast monitor, monitoring through a Kona, from the same drive, the footage looks really good.*
I think the answer is in your post, don't you?
I understand the obviousness of this question. But in the past, when using a lower-end machine, I have noticed things like dropped frames and jerky video. I have never seen video bump down to a lower-res version on the fly like that, so I was curious whether this was something Final Cut was doing on purpose to compensate for the lack of horsepower, or whether something was wrong with the hardware outside the box.
I checked my RT settings, and there was no change when I toggled them back and forth.
My other concern is that I'm going to be shopping for a MacBook Pro soon, and they are similarly specced to some Mac minis, so I wanted to be sure what caused this issue. That's all. If it's FCP doing it purposefully, then great; but if not, or if by chance it's my Dell monitor, then that's a different issue. So that's why I'm tapping the well of knowledge.
FCP lowers the resolution of the video to maintain proper frame rates during playback. If it STILL can't keep up at the lower resolution, you get the dropped-frames warning. You can play it back, but at lower resolution. So the Mini is an offline cutting machine only. The fact that it works with ProRes HQ at all is a wonder.
Ah. That makes sense. So it's offline cutting. OK. As far as output goes, though: if I finish an edit and output it at full HD ProRes, will the resulting QuickTime file be at the full resolution that I could properly QC on a "real" machine, or will it spit out a low-res version because I was in offline mode?
What feature is this called where Final Cut automatically brings the video down to a lower res to compensate for the lack of CPU? I'm trying to gain some perspective on how the software makes this automatic decision to play back in low res. I see options like Offline RT and Safe RT, but those seem to be manual settings, and I find no mention of the automatic behavior anywhere in the manual.
Also... it seems that my output file does carry over the low-res effect when I play it back in QT, even on capable hardware. I output a sequence cut in ProRes 422 (HQ) to an H.264 HD file with a resolution of 1280 x 720, a bitrate of 5000, etc. And I see pixelation in the dark areas of the footage, just as I did in the sequence in Final Cut Pro.
So I'm still a bit confused. I get that monitoring from a Mac Mini is not ideal, but when I look at the resulting H.264 file on another more "capable" machine, it still shows the low quality.
So my biggest worry is: if it's playing in low res due to lack of horsepower, then does that mean it's carrying over into the resulting files, or is something else causing this? And I'm fairly certain I have the settings correct.
What feature is this called where Final Cut automatically brings the video down to a lower res to compensate for the lack of CPU
It's a built-in feature of MANY NLEs, Avid included, because what you see on the computer monitor is only for reference... for you to see what you have. But if you want the specific feature name, it is called UNLIMITED RT, and it is in the drop-down menu... as well as in the MANUAL.
Also... it seems that my output file does carry over the low-res effect when I play it back in QT, even on capable hardware. I output a sequence cut in ProRes 422 (HQ) to an H.264 HD file
Meaning that you didn't export a FULL RESOLUTION file. You exported as H.264, resulting in a compressed image. If you want a full-resolution file, export as a QuickTime Movie, Self-Contained, do not recompress. That is full res.
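If you want to double-check that an export really kept its codec and frame size, the free ffmpeg/ffprobe command-line tools can report both. To be clear, these are not part of FCP, and this sketch assumes they are installed and on your PATH; it stands in a generated one-second ProRes HQ clip for a real export:

```python
# Verify an exported file's codec and frame size with ffprobe.
# Assumes the free ffmpeg/ffprobe tools are installed (not an FCP feature).
import json
import subprocess

def video_stream_info(path: str) -> dict:
    """Return codec name, width, and height of the first video stream."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,width,height",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"][0]

# Stand-in for a real FCP export: generate a 1-second 1080p ProRes HQ clip.
subprocess.run(
    ["ffmpeg", "-y", "-v", "error", "-f", "lavfi",
     "-i", "testsrc2=size=1920x1080:rate=24:duration=1",
     "-c:v", "prores_ks", "-profile:v", "3", "test_export.mov"],
    check=True,
)
print(video_stream_info("test_export.mov"))
```

On a real export, skip the generation step and point `video_stream_info` at your file; a full-resolution self-contained export should report `prores` at 1920 x 1080, while a recompressed export will show `h264` instead.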
I get that monitoring from a Mac Mini is not ideal, but when I look at the resulting H.264 file on another more "capable" machine, it still shows the low quality.
Because you compressed the image. You didn't export a full resolution file.
So my biggest worry is: if it's playing in low res due to lack of horsepower, then does that mean it's carrying over into the resulting files, or is something else causing this?
You are causing this when you export to a compressed file type. You don't have the settings right.
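If it helps, here's a rough bits-per-pixel sanity check on that "bitrate 5000" figure. This assumes the setting means 5,000 kbps (an assumption on my part), and the idea that blockiness in dark, noisy areas tends to appear somewhere below roughly 0.2 bits per pixel is a rule of thumb, not an official spec:

```python
# Rough sanity check: how many encoded bits does each pixel get per frame?
# Assumes the "bitrate 5000" export setting means 5,000 kbps.

def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    """Average encoded bits available per pixel per frame."""
    return bitrate_bps / (width * height * fps)

for fps in (24, 30):
    bpp = bits_per_pixel(5_000_000, 1280, 720, fps)
    print(f"1280x720 @ {fps} fps, 5,000 kbps -> {bpp:.3f} bits/pixel")
```

At that rate, H.264 has very little headroom for the sensor noise the 7D picks up in low light, which is consistent with blocky shadows; a higher bitrate, or a self-contained full-resolution export compressed later, gives the encoder room to work.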
Thanks Shane for the prompt answers. I appreciate it.
But here's where my confusion originates from, and I'm sorry I didn't mention this earlier: I'm looking at the Canon 7D footage both straight from the card, which is high-res 1920 x 1080 H.264, and after conversion to ProRes 422 (HQ), in QuickTime, and I still see the same issues. So this is viewing the video both in Final Cut Pro and in QuickTime by itself. These auto low-res features happening in FCP I understand, but QT does not have this feature, so my question is: why am I seeing this issue directly in QuickTime itself, with just that footage? I do not see this happening in any of the native RED footage, or in RED footage converted to ProRes, shot in similar light.
I'm starting to feel like this might be an issue with my Dell monitor being improperly calibrated and possibly accentuating some of the noise commonly caused by low-light settings on the 7D, given that it records in H.264. Perhaps if I monitored with a proper NTSC monitor, I'd have more accurate results anyway? It's just that I'm so used to working with beefy machines, and this particular project happened to be on my home machine; I've never run into these issues before.
Is monitoring on a computer monitor just a complete wash?