This morning's breakfast experiment is a rough estimate of how the cost and capacity of computing power have changed over the last few years, and of where that trend will leave us in ten years.
In summary: hardware is already superpowered. The gap between hardware and software is already huge. It's only going to get worse. What is to be done about the parlous state of software?
Rough googling suggests a 512MB flash card cost about $60 in 2005, which is around $68 in 2012 dollars. A commodity 32GB card costs about $25 today: sixty-four times the capacity at roughly a third of the price, or about 174 times the capacity per dollar over seven years. Assuming exponential decline in price, or equivalently exponential growth in capacity at constant cost, we can expect capacity per dollar to roughly double year on year.
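A quick sketch of that arithmetic, using the figures above (the prices and dates are the rough googled values, not authoritative data):

```python
# Back-of-envelope check of the flash storage numbers.
# Assumed inputs from the text: 512MB for ~$68 (2005 price, inflation-
# adjusted to 2012 dollars) versus 32GB for ~$25 in 2012.
capacity_ratio = 32 * 1024 / 512          # 64x more capacity
price_ratio = 68 / 25                     # ~2.7x cheaper
years = 2012 - 2005

growth = capacity_ratio * price_ratio     # ~174x capacity per dollar
annual = growth ** (1 / years)            # annualised growth factor
print(round(annual, 2))                   # ~2.09, i.e. doubling yearly
```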
More rough googling suggests CPU capacity (measured in GFLOPS) is increasing at roughly a factor of 1.6 per year. GPUs are improving more quickly, approximately doubling in speed each year.
Wikipedia informs me that DRAM cost 5c per bit in 1971 and 20µc per bit in 1999. Again assuming exponential scaling, that 250,000-fold drop over 28 years gives approximately a 1.56× increase in capacity per dollar year on year.
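The same calculation for the DRAM figures (the two prices are the Wikipedia numbers quoted above):

```python
# DRAM price per bit: 5 cents in 1971, 20 microcents in 1999.
price_1971 = 5.0                 # cents per bit
price_1999 = 20e-6               # cents per bit
years = 1999 - 1971              # 28 years

ratio = price_1971 / price_1999  # 250,000x cheaper per bit
annual = ratio ** (1 / years)    # annualised improvement factor
print(round(annual, 2))          # ~1.56
```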
In five years' time, expect to be working with machines that are ten times as fast, with ten times as much RAM and thirty-two times as much secondary storage as today's machines.
In ten years' time, expect machines one hundred times as fast, with one hundred times the RAM and one thousand times the storage.
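Those multipliers follow from compounding the annual rates estimated above (1.6× for CPU speed, 1.56× for RAM, 2× for storage; the labels are mine):

```python
# Compound each estimated annual growth rate over 5- and 10-year horizons.
rates = {
    "CPU speed": 1.6,
    "RAM per dollar": 1.56,
    "storage per dollar": 2.0,
}

for horizon in (5, 10):
    for name, rate in rates.items():
        # e.g. CPU: 1.6**5 ~ 10x in five years, 1.6**10 ~ 110x in ten
        print(f"{horizon} years, {name}: {rate ** horizon:.0f}x")
```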
I'm not even sure how to measure progress in software. But my impression is that it isn't keeping up its end of the bargain. Perhaps we're seeing linear improvement, at best.
I think a big part of the problem is that our ambition hasn't grown to match our capacity. We haven't kept our expectations of software in line with the abilities of our new hardware.