Since when are all values interpreted, read, tapped, understood, and experienced as crotchet (quarter-note) values?
I think the underlying reason for this has to do with the way MIDI works. A standard MIDI file can contain a set-tempo meta-event, which is used to communicate tempo information. The set-tempo meta-event defines tempo in microseconds per quarter note (see the Standard MIDI Files specification). This means that quarter notes are the basis for defining tempo, at least as far as standard MIDI files are concerned. So I think Logic does it this way because MIDI does it this way, and Logic wants to stay compatible when importing and exporting standard MIDI files.
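To illustrate, here's a minimal Python sketch (the function name is my own, not anything from the spec) showing how the three-byte set-tempo payload converts to the familiar quarter-note BPM:

```python
# Minimal sketch: decoding a MIDI set-tempo meta-event (FF 51 03 tt tt tt).
# The three payload bytes form a big-endian integer giving microseconds per
# quarter note; BPM is derived from that, not stored directly.

def decode_set_tempo(payload: bytes) -> float:
    """Return tempo in quarter-note BPM from a 3-byte set-tempo payload."""
    if len(payload) != 3:
        raise ValueError("set-tempo payload must be exactly 3 bytes")
    microseconds_per_quarter = int.from_bytes(payload, byteorder="big")
    return 60_000_000 / microseconds_per_quarter  # 60 seconds in microseconds

# Example: 0x07 0xA1 0x20 = 500000 us per quarter note = 120 BPM.
print(decode_set_tempo(bytes([0x07, 0xA1, 0x20])))  # -> 120.0
```

Note that there's no field anywhere in the event for a beat unit other than the quarter note, which is exactly why the tempo display question comes up in the first place.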
Logic does provide a way to see the 'correct' tempo in the Score, as siderealxxx pointed out above. This is explained (albeit superficially) in the manual (PDF, pp. 886-887): "The tempo indicators in the Transport bar and the Tempo List always refer to quarter notes, even if a time signature with another denominator is used. As such, the displayed tempo differs, depending on the symbol being used."
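To make that concrete, here's a small sketch (a hypothetical helper, not Logic's actual code) of the arithmetic behind the differing display: the quarter-note tempo is simply rescaled by the length of whatever symbol is shown.

```python
# Minimal sketch: converting a quarter-note BPM into the equivalent tempo
# for another beat symbol. Durations are in quarter-note units,
# e.g. eighth = 0.5, dotted quarter = 1.5.

def displayed_tempo(quarter_bpm: float, beat_length_in_quarters: float) -> float:
    """Tempo shown when the beat unit lasts `beat_length_in_quarters` quarters."""
    return quarter_bpm / beat_length_in_quarters

# 6/8 at 120 quarter-note BPM: the dotted-quarter pulse reads as 80.
print(displayed_tempo(120, 1.5))  # -> 80.0
# The same tempo counted in eighth notes reads as 240.
print(displayed_tempo(120, 0.5))  # -> 240.0
```

So the Transport and the Score aren't disagreeing about the actual speed of the music; they're just counting it in different units.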