The thing to keep straight is the difference between variable bitrate (where the number reported is an average of all the rates actually used across the track) and constant bitrate (where one rate is used from start to finish, regardless of how complex any part of the piece is).
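To make that concrete, here's a minimal sketch (the per-second rates are made up for illustration; real encoders work on much finer-grained frames) of where the reported VBR number comes from versus a CBR rate:

```
vbr_rates = [96, 320, 256, 64, 288]   # kbps chosen per second, varying with complexity
cbr_rate = 256                         # kbps used for every second, no matter what

# The number iTunes reports for a VBR file is (roughly) this average:
reported_vbr = sum(vbr_rates) / len(vbr_rates)
print(f"VBR reported rate: {reported_vbr:.0f} kbps")  # -> 205 kbps
print(f"CBR rate:          {cbr_rate} kbps")          # -> 256 kbps, start to finish
```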
If it is indeed a minimum rate as part of general variable bitrate encoding, then with "highest quality" selected, probably none of the track would actually end up encoded at 16k. You're leaving the decision to iTunes, with your bitrate setting acting only as the lowest quality you're willing to risk. In practice, probably nothing would be encoded at that rate even if you left it set, unless large sections of the track are a pure-frequency hum.
Conversely, it wouldn't make sense to select low quality and then force iTunes to encode at a minimum of 256k. I haven't experimented, so I can't say which quality setting encodes at which rates, but low quality probably encodes at something well below 256k, so forcing a 256k floor would defeat the space savings you chose the lower quality for, since iTunes would always have to use 256k.
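A rough way to picture both cases, assuming the floor simply overrides the encoder's choice whenever that choice would be lower (the per-section rates here are invented, not measured from iTunes):

```
def effective_rate(encoder_choice_kbps: int, floor_kbps: int) -> int:
    """The floor only matters when the encoder would have gone below it."""
    return max(encoder_choice_kbps, floor_kbps)

# "Highest quality" with a 16k floor: the encoder rarely dips that low.
print(effective_rate(224, 16))   # -> 224 (floor never used)

# Low quality with a 256k floor: the floor overrides the encoder everywhere,
# so you pay 256k while still getting low-quality encoding decisions.
print(effective_rate(96, 256))   # -> 256 (floor always used)
```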
The main disadvantage of setting too high a minimum is that it effectively forces a constant bitrate, at your minimum, in any section that would otherwise fall below it. Say you set the minimum to 256k and had two really complex 30-second pieces of music with 2 minutes of silence in between. The music gets encoded at an average rate of, say, 278k, but you'd be forcing iTunes to encode the silence at 256k, where 16k could handle it quite adequately, and you'd end up with a larger file than necessary.
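Running the arithmetic on that example, using the rates stated above:

```
def size_kb(seconds: float, kbps: float) -> float:
    """Size in kilobytes: kilobits/sec * seconds / 8 bits per byte."""
    return seconds * kbps / 8

music = size_kb(60, 278)            # two 30s complex sections at ~278k
silence_forced = size_kb(120, 256)  # silence held at the 256k floor
silence_free = size_kb(120, 16)     # silence left to drop to 16k

print(f"With 256k floor: {music + silence_forced:,.0f} KB")  # ~5,925 KB
print(f"Without floor:   {music + silence_free:,.0f} KB")    # ~2,325 KB
```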
If it were me, I'd either experiment with some settings (report back here if you do!) or set the minimum to 128k (since that's "FM radio quality"): the size penalty isn't unreasonable if the floor is briefly higher than necessary for a short section.