My best guess is that Firefox and Chrome use a more aggressive caching strategy.
Historically, Apple has written Safari to adhere closely to the standards and protocols set forth by the W3C, and the protocol says that a cached page should be checked against the server's version when it's requested again, so the user is seeing an accurate version of the page.
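That revalidation check is what HTTP calls a conditional GET: the browser resends the cached page's validators (ETag, Last-Modified) and the server replies 304 Not Modified if the cache is still good, or 200 with a fresh body if it isn't. Here's a minimal sketch of that logic in Python; the helper names are mine for illustration, not a real browser API:

```python
def build_conditional_headers(cached):
    """Turn a cached response's validators into conditional request headers."""
    headers = {}
    if cached.get("etag"):
        headers["If-None-Match"] = cached["etag"]
    if cached.get("last_modified"):
        headers["If-Modified-Since"] = cached["last_modified"]
    return headers

def resolve(cached, status, fresh_body=None):
    """Decide what to show: 304 means reuse the cache, 200 means replace it."""
    if status == 304:
        return cached["body"]      # server confirmed the cached copy is current
    cached["body"] = fresh_body    # otherwise store and show the new version
    return fresh_body

# Example: a page cached with an ETag gets revalidated on the next visit.
cache = {"etag": '"abc123"', "last_modified": None, "body": "<html>old</html>"}
headers = build_conditional_headers(cache)   # {'If-None-Match': '"abc123"'}
page = resolve(cache, 304)                   # server said unchanged, serve cache
```

The trade-off in the answer above boils down to how often a browser bothers to make that round trip at all, versus serving straight from cache without asking.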
Web devs will tell you that Safari is a breath of fresh air to code for, because it follows the rules and is usually among the first to implement upcoming standards that haven't yet been officially ratified by the W3C (early support for color profiles, HTML5, etc.).
Chrome and Firefox may trade off protocol recommendations for more persistent caching, which usually (but not always) works well for the end user; occasionally, though, that strategy shows up as a bug and becomes a major headache for web developers. Internet Explorer is the shining example of this. We web devs can code a site that works and displays perfectly in Safari, Chrome, and all the rest, and then we have to recode it with hacks to work in IE, which doesn't adhere to standards, and, to a lesser extent, Firefox. As a user, have you ever hit a site that only works in IE? Yeah, frustrating. That's because the dev coded it specifically for IE, which writes its own standards and doesn't conform.
If it helps, just recognize that Safari is doing what it's supposed to be doing. The problem is in Apple's implementation of the animation. Software development is often a trade-off between what's ideal and what can be done within external constraints. But yeah, I agree: Apple should find a way around this dilemma.