Yes, this issue seems counter-intuitive until you think about how it works. I am a systems programmer, but since I do not have access to the internals of OS X, I can only make an educated guess as to the root of this "problem."
There are two distinct parts to AirPlay: audio/video streaming and mirroring. When our iDevices send audio, video, or photos to the Apple TV, the information goes out as a stream of data that the Apple TV decodes in a way that preserves temporal integrity. Think of it the same way you receive a Netflix movie or a streaming song: it's exactly the same concept, and we all know Netflix or Spotify would be useless if it were choppy or laggy. This is an efficient means of transporting data, hence it just works.
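To make the streaming side concrete, here is a minimal sketch in Swift, assuming AVFoundation and a hypothetical stream URL. The point is only that the sending app hands the system an already-encoded stream; the heavy decoding happens on the Apple TV rather than on the sender.

```swift
import AVFoundation

// Non-mirroring AirPlay, from an app's point of view: hand over an
// already-encoded stream and let the Apple TV do the decoding.
// (The URL is a hypothetical placeholder.)
let url = URL(string: "https://example.com/movie.m3u8")!
let player = AVPlayer(url: url)

// Allow this video to be sent to an AirPlay device as a stream,
// rather than as a mirrored picture of the screen.
player.allowsExternalPlayback = true

player.play()
```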
Mirroring involves the extra step that slows things down: capturing your screen and audio, which then has to be processed and packaged for transport locally on our MacBooks before it can be sent over the air to the Apple TV. That capture-and-encode step is necessary but incredibly processor-intensive, and it is the difference between using AirPlay on our iDevices vs. our more powerful computers.

So why does mirroring work perfectly from our iDevices? Because Apple uses a trick that recognizes audio or audio/video streams from our iDevices and temporarily "switches" to the non-mirroring form of AirPlay. Have you ever noticed that when "mirroring" from an iPhone or iPad, as soon as you start a video, the TV screen gracefully fades out and then back in quickly to a full-screen version of the video? To confirm this hypothesis, try mirroring a Safari page from an iPad vs. mirroring a Safari page from OS X. From the iPad, the Apple TV shows only the video, full-screen (non-mirroring); from the computer, you see the web page and the video laid out exactly as on your computer screen (mirroring). This way, the iPhone and iPad skip that extra capture step and everything stays smooth.
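Here is a hedged sketch of that hand-off as an app would express it on iOS, again assuming AVFoundation (the URL is a hypothetical placeholder). An app can ask the system to drop out of mirroring and route its video as a plain AirPlay stream, which is exactly the graceful fade to full screen described above.

```swift
import AVFoundation

// The "switch" from mirroring to streaming, from an app's point of view.
// (The URL is a hypothetical placeholder.)
let player = AVPlayer(url: URL(string: "https://example.com/clip.mp4")!)

// Permit AirPlay streaming for this player at all.
player.allowsExternalPlayback = true

// When the external screen is active because the device is being mirrored,
// prefer sending this video as a direct AirPlay stream over sending a
// mirrored copy of the whole screen.
player.usesExternalPlaybackWhileExternalScreenIsActive = true

player.play()
```

As far as I can tell, that second flag exists only on iOS, which fits the observation that our Macs don't get this automatic switch.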
So why doesn't Apple employ the same technique on our expensive, powerful computers? I think it's because mirroring from a computer was designed to be more of a "second screen" for our PCs, in which case we would be truly stuck if our Apple TVs could not mirror a webpage or application whenever video or audio was present, whereas that doesn't matter for iPads and iPhones. As annoying as this is, I fear it is working as designed, and the problem will only get worse as CPUs move to mobile architectures to save energy, unless Apple works on optimizing the efficiency of its AirPlay algorithms.
To address the post that mentioned this all works much more smoothly over a wired connection: mirroring is simply a far less efficient pipeline than streaming pre-encoded audio/video. It took roughly 30 years of software optimization to learn how to stream audio/video efficiently (non-mirroring), but even if the capture-and-encode step (mirroring) on our computers were perfectly efficient, there is still far more data to send over the network to the Apple TV because of how the data must be packaged to truly mirror a screen, and a wired connection simply has more headroom for that extra data.
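A back-of-the-envelope comparison makes the gap clear; the resolution, frame rate, and bitrate below are assumptions for illustration, not measurements.

```swift
// Rough numbers only: resolution, frame rate, and bitrate are assumptions.
let width = 1920.0, height = 1080.0   // assumed screen/video resolution
let bytesPerPixel = 4.0               // 32-bit BGRA frame buffer
let fps = 30.0

// Raw pixel data the mirroring path must capture and compress every second,
// before a single bit reaches the network.
let rawCaptureMbps = width * height * bytesPerPixel * 8 * fps / 1_000_000
print("Mirroring: ~\(Int(rawCaptureMbps)) Mbit/s of raw frames to encode in real time")
// prints roughly 1990 Mbit/s

// Typical size of a movie that was compressed once, offline, and is simply
// streamed when not mirroring.
let encodedStreamMbps = 5.0
print("Streaming: ~\(Int(encodedStreamMbps)) Mbit/s of already-encoded video")
```

A real-time encoder shrinks that raw capture dramatically before it hits the network, but the sender still pays the capture-and-encode cost on every frame, and the resulting mirrored stream is generally bulkier and less forgiving of Wi-Fi hiccups than a movie that was compressed once, offline. That is why a wire helps mirroring so much more than it helps plain streaming.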