I've noticed something of an irony recently. It seems that a UK supercomputer, used to crunch the scientific numbers and calculations from climate stations and weather centers around the world, draws 1.2 megawatts of electricity to operate.
The irony is that the power plant supplying it ends up putting out 12,000 tons of CO2 to keep this monolith running. Still, with 15,000 gigabytes of system memory, I doubt power consumption was the first thing on the builders' minds. I wonder how much the computer itself cost compared to its electric bills.
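Out of curiosity, the electric-bill question can be roughed out from the 1.2 MW figure alone. The sketch below assumes round-the-clock operation and a hypothetical tariff of £0.10 per kWh (neither is stated in the original), and treats the 12,000-ton CO2 figure as an annual total to back out an implied emissions intensity:

```python
# Back-of-the-envelope estimate; tariff and annual CO2 assumption are guesses.
POWER_MW = 1.2            # stated draw of the supercomputer
HOURS_PER_YEAR = 24 * 365  # assumes it runs continuously
TARIFF_GBP_PER_KWH = 0.10  # hypothetical electricity price
CO2_TONS_PER_YEAR = 12_000  # assumes the stated 12,000 tons is an annual figure

# Annual energy consumption in kilowatt-hours
annual_kwh = POWER_MW * 1000 * HOURS_PER_YEAR

# Annual electricity cost at the assumed tariff
annual_cost_gbp = annual_kwh * TARIFF_GBP_PER_KWH

# Implied grid emissions intensity (kg CO2 per kWh)
kg_co2_per_kwh = CO2_TONS_PER_YEAR * 1000 / annual_kwh

print(f"Energy: {annual_kwh:,.0f} kWh/year")
print(f"Cost:   £{annual_cost_gbp:,.0f}/year")
print(f"CO2:    {kg_co2_per_kwh:.2f} kg/kWh")
```

At those assumptions the machine would use roughly 10.5 million kWh a year, costing on the order of £1 million annually, so over a multi-year lifetime the electric bills could plausibly rival the purchase price.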