cdhw wrote:
The key reason Yosemite and El Capitan don't run well inside a VM is that they are designed to use the GPU very heavily. Yosemite went a long way in this direction, and El Capitan pushed it further. Your benchmark is really measuring the VM's ability to share the GPU, not the OS's performance in any domain it was designed to operate in. This is not something that will affect a typical end user.
Hello cdhw,
Yes, I know. It is not a benchmark, but a test environment. Obviously my virtualized, unaccelerated graphics system isn't going to perform well. But 10.6-10.9 work fine; 10.10 is really slow. After I went from 10.10 to 10.11 on the host OS, my 10.10 guest OS was too slow to even be worth the disk space. A virtual environment dramatically exposes even the slightest inefficiencies.
IME (based on clusters of real machines and real users), as long as a machine either (a) has an i3 processor or later and at least 4 GB of RAM, or (b) has at least a Core 2 Duo and 8 GB of RAM, Yosemite and El Capitan provide a very good user experience.
Of course I also test with real machines. I saw right away that my old mechanical hard disks weren't going to cut it anymore and upgraded to an SSD. So it is important to remember that my "unusable" virtual environment is also running from an SSD. I went from VMs running fine on a spinning hard drive with 10.9 to unusable with an SSD in 10.11.
But going beyond that, I have seen how older machines perform with 10.10+. Aside from the horribly long boot time, they do work fine with 4 GB of RAM, as long as you don't exceed it. Before 10.10, the RAM requirements were much lower, and when you started swapping, the machine merely slowed down. Since 10.10, 4 GB is an absolute minimum, and once you start swapping to a spinning hard drive, the machine stops cold. I have seen this with my own eyes. A 3 GHz Mac with 4 GB of RAM was locked up worse than my old 200 MHz NT machine circa 1999 when the AV kicked in.
I used to develop large satellite-imagery ingest systems for the US government. My software ran on multiple 20+ core Linux servers with 10 Gigabit Ethernet and 27 TB just for cache. I would do development and functionality tests on my 2010 MacBook Pro with 4 GB of RAM. Obviously I couldn't handle massive 1 TB datasets, but I could work with more modest ones. This was a .gov machine, so it ran 10.6.8 for a long, long time. Sometimes I would slip up and attempt to test a dataset from a big commercial imaging satellite. That made my Mac really slow. But it was nothing compared to a more powerful iMac running 10.10 with nothing more than Evernote and Chrome open.
So yeah, I have other data points. 🙂