The new Top500 list of supercomputers is out, and the tone is downbeat. Turnover is at a record low and performance growth has slowed: since 2008 the #500 machine has been improving by about 55% year-over-year, down from roughly 90% before that. There was barely any change in the top ten, and the Chinese Tianhe-2 retains top billing.
This slowdown in computing performance growth seems to be widespread, as the current paradigm of CMOS transistors with electrical interconnects is getting harder and harder to push further. I’m of two minds on this at the moment: On the one hand, even a minor drop in performance growth now will dramatically affect outcomes generations and centuries down the line, thanks to the magic of compound interest. On the other hand, the less computing overhang there is when we finally crack general AI, the less immediate mischief it can cause if we fuck up its programming too hard. On balance, my mood is morose, and it doesn’t seem like things will be getting much better in the near term (though long term, there’s still plenty of room at the bottom). Algorithmic improvement is another story, happily.
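To see why compound interest makes even a modest slowdown matter so much, here is a minimal sketch comparing cumulative growth at the two annual rates quoted for the #500 machine (the ~90%/yr and ~55%/yr figures; the decade-long horizon is just an illustrative choice):

```python
def cumulative_growth(annual_rate: float, years: int) -> float:
    """Total growth factor after `years` of compounding at `annual_rate`."""
    return (1.0 + annual_rate) ** years

years = 10
fast = cumulative_growth(0.90, years)  # ~90%/yr, the pre-2008 pace
slow = cumulative_growth(0.55, years)  # ~55%/yr, the post-2008 pace

print(f"After {years} years at 90%/yr: {fast:,.0f}x")
print(f"After {years} years at 55%/yr: {slow:,.0f}x")
print(f"Gap between the two trajectories: {fast / slow:,.1f}x")
```

After just ten years the faster trajectory is already several times ahead, and the gap itself compounds every additional year, which is why a seemingly small change in the annual rate dominates over long horizons.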
Further, what exactly are these supercomputers actually doing? The big six applications are supposedly medical imaging, defense, oil and gas exploration, pharmaceutical research, 3D rendering for movies, and finance. For government work, I’d imagine the focus is on modeling nuclear warheads and on codebreaking.