Is Parallel Computing the Next Big Thing?
by Ken Orr

I was reading an article today that claimed that parallel computing is the next big thing [1]. I heard a similar refrain in an interview with someone from Intel. The premise behind this idea is that the first age of Moore's Law is ending and the hardware guys are beginning to panic. For reasons of power consumption and heat, the traditional ways of building faster, cheaper CPUs and computers are approaching some difficult barriers.

The most important tool available to the computer hardware engineer over the last four decades has been miniaturization. As folks at places like Intel and AMD try to pack more and more functions on smaller and smaller chips, and pack more and more of these chips into smaller and smaller packages, getting rid of the heat becomes more difficult.
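To make the "parallel computing" idea concrete, here is a minimal sketch (my illustration, not from the article) of the shift the hardware vendors are pushing: the same CPU-bound job run on one core, then spread across several worker processes. The task and function names are hypothetical; the point is only that the work divides across cores instead of demanding a faster single core.

```python
from multiprocessing import Pool

def count_primes(limit):
    """Naive CPU-bound work: count the primes below limit."""
    count = 0
    for n in range(2, limit):
        # n is prime if no d in [2, sqrt(n)] divides it evenly
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    jobs = [20_000, 20_000, 20_000, 20_000]
    # Serial: one core grinds through every job in turn.
    serial = [count_primes(x) for x in jobs]
    # Parallel: the same jobs spread across four worker processes,
    # i.e., across cores -- the answers are identical, only the
    # hardware utilization changes.
    with Pool(processes=4) as pool:
        parallel = pool.map(count_primes, jobs)
    assert serial == parallel
```

The catch, of course, is that the programmer now has to find work that divides this cleanly, which is exactly why the shift matters to software people and not just chip designers.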