(sorry, the "Edit Reply" doesn't want to work tonight)
... the last 30-100 MB downloaded from various websites (exempting video files, DMGs, and maybe PDFs)? Ten or fifteen years ago I'd have said 200-300 *kilobytes*, but that was before Amazon, Google Analytics, "Fyre", Facebook, and God-knows-who-else was polluting web pages with various JavaScript, CSS-up-the-wazoo, and whatever else clutters and clogs up the web these days.
(Of course, I know that - or the information it allows companies to "extract" from what we all do in our browsing habits and such - is what probably *pays* for a lot of the internet in one way or another)
I'd think such caching would also save a LOT of bandwidth - at least if you multiply it by the 20 or 30 million people browsing the web every day (which is probably an underestimate). And I'm sure a lot of whatever is on "bits.wikipedia.com" could be cached? At least all the rarely-changing CSS and JavaScript files?
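A back-of-envelope sketch of what that multiplication works out to - every number here is just my guess, not a measurement:

```python
# Rough estimate of bandwidth that aggressive caching of rarely-changing
# CSS/JS assets could save. All inputs below are assumptions, not data.

users_per_day = 25_000_000    # guess: middle of the 20-30 million figure above
cacheable_mb_per_user = 50    # guess: middle of the 30-100 MB range, assuming
                              # most of it is re-downloadable static assets
cache_hit_rate = 0.8          # guess: fraction that a cache could serve locally

# MB/day saved, converted to terabytes (1 TB = 1,000,000 MB here)
saved_tb_per_day = users_per_day * cacheable_mb_per_user * cache_hit_rate / 1_000_000
print(f"~{saved_tb_per_day:,.0f} TB/day saved")  # ~1,000 TB/day under these guesses
```

Even if those guesses are off by an order of magnitude in either direction, it's a lot of traffic.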
I also almost never have JavaScript on unless I actually *need* it: Wikipedia (for the citation tool, or if I want the De-animator extension to stop a pretty but ****-ever-rotating ball-and-stick model of a protein), Facebook (for anything), YouTube if I want to see the preview pictures - though I use Firefox for that anyway because I can't find a reasonable downloader for Safari - or various journal websites that seem to require JS.
I should start a Facebook group called "JavaScript Is The New Flash" (ironically, since FB requires it).
When will someone figure out how to make "JavaScript on a chip"? Literally - *down to the metal* (plus some kind of appropriate very-high-speed, semi-separate memory bus for the HTML modification that JS usually does). It would seem that Google might like that. Don't Chromebooks/Android basically *run* on HTML, JS, and Java?
(Yes I know, my "hacker self-education" really stopped somewhere around 1998-2001..)
Jim