My recommended frame rate in the American market is 30 (HD or higher) or 60 (for 720). In a "PAL" market, 25 or 50. I suppose you can use 24 if it can be accepted (without alterations), but I simply do not buy into the fantasy that 24fps is more "cinematic"... Cinematic effect is the realm of the videographer's understanding of the scenes being shot and not the frame rate of the capture. It has to do with motion blur (shutter speed) and lighting (aperture + shutter speed + film speed, what I knew as ASA, now ISO). Cinematic is captured, not made. Therefore, any frame rate can be "cinematic". [People are experimenting with frame rates up to 120 these days.]
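To make the capture-settings point concrete, here's a quick sketch of the common "180-degree shutter" rule of thumb (my example; the post doesn't name the rule): for film-like motion blur, expose each frame for half the frame interval, at whatever frame rate you happen to shoot.

```python
# 180-degree shutter rule of thumb (illustrative, not from the post):
# expose each frame for half the frame interval, i.e. 1 / (2 * fps).
# The motion-blur "look" follows the shutter, not a magic 24fps.
def shutter_speed_180(fps: float) -> float:
    """Exposure time in seconds per frame under a 180-degree shutter."""
    return 1.0 / (2.0 * fps)

for fps in (24, 25, 30, 50, 60, 120):
    print(f"{fps:>3} fps -> 1/{2 * fps} s exposure")
```

At 24fps that's the classic 1/48 s; at 60fps it's 1/120 s, and the blur per frame scales the same way.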
The first rule is always *ask* the host station all the details of the format they *prefer*. They (stations) are not all equal (more affluent stations will have better equipment). If the quality of your materials matters, then by all means do the work (including the encoding) yourself. All they really want is a file they can plug into their programming schedule.
It's not really an exception... If you *value* the appearance of your media, do everything you can to keep OTHER PEOPLE from re-encoding it because they won't care.
My "rule" is basically a wake-up call. STOP ACCEPTING THAT 29.97 (or the others) IS CONTEMPORARY. It's not. It's ARCHAIC. After July 13th of this year, every station in the U.S. MUST broadcast digital or go dark. [However, I think they've figured out a workaround to this mandate: streaming is not broadcast. Broadcast is OTA (over-the-air). So low-power stations are making deals with local cable TV providers and continuing to use their old "broadcast" hardware.]
My "rule" is based solely on choice. If you have the choice, do NOT choose fractional. The reasons are many, a few of which are: 23.98, 29.97 and 59.94 are not really those values. The true rates are 24000/1001, 30000/1001 and 60000/1001, whose decimal expansions repeat forever, so they can never be stored exactly as decimals or as binary floating-point numbers. That thoroughly messes with timecode (timecode is not time!). And the *need* for fractional frame rates DOES NOT EXIST ANYMORE!!! It was an analog hack for an analog problem (keeping the NTSC color subcarrier from interfering with the audio carrier). These *problems* do not need to be addressed in a digital environment.
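You can see the "infinitely repeating" problem for yourself in a few lines (my Python sketch, not from the post):

```python
from fractions import Fraction

# "29.97" is shorthand for 30000/1001 fps. Its decimal expansion
# repeats forever (29.970029970029...), so the decimal 29.97 is a
# DIFFERENT number, and a binary float can't hold the true value either.
ntsc = Fraction(30000, 1001)

print(float(ntsc))                # ~29.97002997..., already rounded
print(ntsc == Fraction("29.97"))  # False: 29.97 is exactly 2997/100

# Treat the rate as exactly 29.97 and you're off by real frames:
per_hour_error = ntsc * 3600 - Fraction("29.97") * 3600
print(float(per_hour_error))      # ~0.108 frames of drift per hour
```

The only way to do this math exactly is to keep the rate as the ratio 30000/1001 the whole way through, which is exactly what timecode arithmetic on "29.97" material never quite does.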
This guy does a really good job explaining how fractional frame rates came about:
https://www.youtube.com/watch?v=3GJUM6pCpew
I started waking up to fractional frame rates when I wanted to be able to create my own timecode/timing plugins. You can't for fractional rates in Motion; the logic is not there (as I said: timecode is NOT time in fractional-frame-rate video). For every other frame rate, I can create frame-accurate clocks (I'm confident enough to say it: any frame rate). Before that, I would use 29.97 because I just thought that's what "everybody still uses" (how "IBM PC" of me). What difference did it make? It's just 30fps slowed down by 0.1%. Wait... What? How does that work? Is that something the software has to do now, "physically" slow down how fast the video plays? How does a computer handle a factor like 0.999000999000999... (that's 1000/1001, the exact slowdown)? Impossible / Not very well. It never really syncs with timecode. That's when I started looking into it.
I invite you to do your own research and formulate your own conclusions. My conclusion is: Fractional Frame Rates need to GO.