Goodie. I get to put on my high-performance computing hat.
Great information in the graph for deciding between disk and SSD, which is important and was not part of my analysis.
Before I get going too deeply I'll try to bring this down to earth.
Simple factors for a SWAG:
How long do your runs take now?
In how much real and virtual memory?
Are you willing to increase the RAM so it covers the current virtual memory size, to avoid paging to disk or SSD?
A factor for a more accurate measurement with a relatively easy benchmark:
What is the run time difference currently if they fit in RAM vs. virtual memory?
Then add this consideration:
How much are the applications expected to grow if you have a more hefty machine?
And the bottom line:
Will the faster expected execution speed be worth the higher price?
Then there are harder questions (CPU speed, memory bandwidth, disk vs. SSD) for which you may need to benchmark on the actual target hardware. These other factors probably won't matter in the end, given the large RAM and SSD tipping-point differences.
Keeping the data in RAM is likely to have a bigger impact on speed than any other factor, but as with most HPC applications, whether the cost justifies the benefit is a case of "it depends."
========= ========= ========= =========
Options and impact:
(1) $1800: 4 GB RAM on disk?
(2) $2200: 8 GB RAM on SSD?
(3) $2800: 16 GB RAM on SSD?
Given that Mathematica uses several GB just for the code, the 4 GB option is likely to be too small. So that narrows it to a $600 decision on whether you need more than 8 GB of RAM. It also elevates the SSD decision.
Ignore the processor speed differences, even the Turbo Boost. I am not familiar with the details of the 3.7 GHz Turbo Boost, but it sounds sexy. At most it would yield a 50% boost, and probably not that much.
How much time do you want to spend making this decision? You can SWAG it or run relatively simple benchmarks.
SWAG: How much memory will your application need? I would base my decision on the RAM size and ignore the other differences. Luckily this analysis is probably the simplest, since one can roughly calculate the gross memory usage from expected problem sizes and estimate whether it will exceed 8 GB. Will you run in less than 8 GB most of the time, and can you wait for the exceptional runs that are larger? Those are the tipping points.
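As a sketch of that gross-memory SWAG (in Python rather than Mathematica, with made-up problem sizes and a hypothetical assumption that the solver holds a few working copies of a dense double-precision matrix):

```python
# Rough memory estimate for the SWAG: bytes needed by the main working
# arrays at a few candidate problem sizes. The sizes and the number of
# working copies are illustrative assumptions -- substitute your own.

def estimate_gb(n, copies=3, bytes_per_element=8):
    """GB needed for `copies` dense n x n double-precision matrices.
    Solvers often keep 2-4 working copies; 3 is an assumed default."""
    return copies * n * n * bytes_per_element / 1e9

for n in (10_000, 20_000, 40_000):
    print(f"n = {n:>6}: {estimate_gb(n):5.1f} GB")
# n = 20,000 already lands near 10 GB -- past the 8 GB tipping point.
```

A back-of-the-envelope table like this is usually enough to tell you which side of the 8 GB line your typical runs fall on.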
Deeper analysis (and probably not too hard to do): One can benchmark on existing 4 or 8 GB systems by artificially growing the application from fitting in RAM to overflowing into virtual paging. For example, calculate something that uses double the memory and should take twice as long (double the calculations), then cut the longer run time in half to compensate for the extra calculations. Then double the speed estimate for using SSD instead of disk (but only in the 4 GB to 8 GB comparison). It is best to do this on algorithms similar to the target applications, since data access patterns matter for virtual memory performance. (Careful: some calculations increase exponentially in time while space increases linearly, and vice versa.)
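A minimal sketch of that benchmark (Python/NumPy standing in for the real application; the working-set sizes are placeholders you should grow past your actual RAM):

```python
# Time one pass of arithmetic over doubling working-set sizes. While
# the data fits in RAM, seconds-per-GB stays roughly flat; once it
# overflows into virtual memory, seconds-per-GB jumps sharply. That
# jump is the paging penalty you are trying to price.
import time
import numpy as np

def time_pass(gb):
    """Touch `gb` gigabytes of doubles once; return elapsed seconds."""
    n = int(gb * 1e9 / 8)          # 8 bytes per double
    a = np.ones(n)                  # allocate and fault in the pages
    t0 = time.perf_counter()
    a *= 1.000001                   # one pass of arithmetic over the data
    return time.perf_counter() - t0

# Extend this list past your RAM size (e.g. 4, 8, 16) to see the cliff.
for gb in (0.25, 0.5, 1):
    t = time_pass(gb)
    print(f"{gb:5.2f} GB: {t:6.3f} s  ({t / gb:6.3f} s/GB)")
```

Because each pass does work proportional to the data size, dividing by GB normalizes out the "double the calculations" effect the paragraph above describes, leaving only the memory-system slowdown.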
If you have real-world applications you can grow and shrink, so much the better.
And of course an SSD-based Mac is faster for normal operations that do I/O. Boot and shutdown times are significantly faster. I hope never to go back to a disk.
Summary:
More is better but are you willing to pay for it?
$400 more for 8 GB and an SSD?
Most likely worth it.
An additional $600 for 16 GB?
Depends.
Does the extra speed really matter in the end?
P.S. I have another option to consider in my next response that may make the decision even easier.