IBM has built a new sort of supercomputer that is not only more energy-efficient than supercomputers cooled traditionally with air-conditioning, but whose excess heat can be reused to warm a building. Water carries off the heat through tubes and small capillaries that bring the liquid very close to the chips, with the coolant running at about 60 degrees Celsius. IBM says that the new design, which it calls “Aquasar,” will reduce overall energy consumption by 40 percent and cut carbon-dioxide emissions by some 30 tons. The heat-recovery side of the system will only reduce heating costs a little, but it has some very promising future applications.
I still own two CRT monitors, and those are enough to heat my study to an uncomfortable degree in the dead of winter. I’d hate to think of what an entire supercomputer would do to the temperature (assuming it could fit in the room).
I work with a supercomputer, one in the top 10 of the Top 500 list.
If the AC in the room that houses it fails, the building will catch on fire from the sheer heat produced.
My computer heats this little room up to 90+ F. A/C has little or no effect.
I sweat a lot.
The nice part is it does help reduce heating costs in the Winter.
With permission from my manager, I have occasionally brought spare machines into work that run distributed-computing projects at 100% CPU, and placed them in offices that get too cold for their inhabitants ( often caused by misplaced thermostats )…
So, this certainly isn’t a new concept to me.
As for the power usage… if it’s generating heat where it’s needed, I don’t see an issue.
This works well at home in the winter as well.
Edit: oops.
Really???
60C is only about 10C below thermal max on most CPUs….
With a T-delta of 10C on a CPU drawing around 45W, the cooler would need a thermal resistance of no more than 0.22 C/W just to keep the chip from going thermal ( easily doable – my water-cooling setup achieves 0.08 ). My guess is that something near my figure has been reached in a lower profile ( a copper 1U water block is all I see in the pics, anyway ) – which is still rather impressive, but not revolutionary or anything.
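( For anyone who wants to check that figure, here’s a quick sketch of the arithmetic; the 45 W draw and the 10 C delta are my assumptions above, not IBM’s published specs: )

```python
# Required cooler thermal resistance: theta = delta_T / power (C per watt).
# Figures are the assumptions from the comment above, not IBM's specs.

power_w = 45.0    # assumed CPU power draw, in watts
delta_t_c = 10.0  # headroom between 60 C coolant and a ~70 C T-max

theta_max = delta_t_c / power_w
print(f"cooler must achieve <= {theta_max:.2f} C/W")  # ~0.22 C/W
```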
Problem is that allowing the CPUs to run so close to their T-max is a **BAD** idea. As such, I’d expect any I.T. department wanting to prevent untimely failures ( which SHOULD mean any I.T. department ) to cap the temps at 45 or 50 Celsius.
In the end, all we see is 1U water cooling with a more efficient water block – not a bad thing, certainly, but nothing deserving of the hype. All this CO2 and “GREEN” nonsense from Gore has been outright proven false by NASA and numerous research findings, which indicate that CO2 causes cooling – not warming – and that man’s entire CO2 contribution is outmatched by a single volcanic eruption the size of Mt St Helens ( and that CO2 just helps plants thrive ).
Ahem…. anyway… given the real world applications of this, I certainly would not want my multi-million dollar supercomputer running so close to the edge of its thermal envelope – unless I were trying to get failures to occur while the warranty was still valid or to test a thermal-related bug.
All this, of course, is merely tripe should it turn out that I.B.M. managed to cool a CPU with a thermal resistance of <= -0.01 C/W ( sub-ambient cooling ), or to utilize a CPU whose preferred operational temperature is in excess of 60C and whose T-max is closer to 90-100C [ this is HIGHLY unlikely, but it would be ***AWESOME*** ].
Oh well, my current coolant temperature is 30.5C, and my coolant T-max is 43.7C. With a 45W processor and a thermal resistance of 0.08 C/W, that puts my CPU at around 34.1 C. My T-max alarm is VERY conservative, but it marks the point where I need to worry about ambient temperatures for my Radeon X850 ( the best ATI card supported by BeOS ).
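( And the same formula run forward, to check my own numbers; the 45 W draw is an assumption, as before: )

```python
# Predicted CPU temperature: T_cpu = T_coolant + power * theta.

t_coolant_c = 30.5  # measured coolant temperature, C
theta = 0.08        # my water block's thermal resistance, C/W
power_w = 45.0      # assumed processor power draw, W

t_cpu_c = t_coolant_c + power_w * theta
print(f"estimated CPU temperature: {t_cpu_c:.1f} C")  # ~34.1 C
```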
Am I rambling?? D[[O][ID]] I have a point???
Hmm…
–The loon
[citation needed]
Solar Cycle causes heat:
http://www.infiniteunknown.net/2009/06/07/nasa-study-solar-cycle-no…
http://www.solarcycle24.com/ ( monitor it)
CO2 causes cooling:
http://www.eurekalert.org/pub_releases/2009-02/pu-pgc022609.php
Actually, we ARE cooling:
http://digg.com/environment/Temperature_Monitors_Report_Massive_Glo…
http://www.foxnews.com/story/0,2933,333328,00.html
Heat causes higher CO2 levels, which in turn cause lower temperatures ( though not to the same extent that the sun supplies heat ).
Also, did you ever wonder why heat would be trapped IN without being trapped OUT as well?? That was my first clue that Gore didn’t think things through well enough. You can rarely have one without the other: if you are keeping heat in, you are also keeping it out ( it is called insulation ).
Now, however, having done a bit more research specifically on volcanic CO2 emissions, I see that I am wrong in regards to the amount of CO2 in those gases.
In any event:
CO2 stymies atmospheric heat loss by 32 W/m2
What about water vapor? 75 W/m2
Granted, the sum of the parts is not the whole story, but I don’t hear anyone screaming about water-vapor emissions, which are FAR more dangerous ( according to Gore’s view, anyway – greater temperature capacitance and lower heat loss ).
The actual real-world data also suggests that CO2 cools more than warms the earth, but the effect is largely due to water vapor displacement, which is a transient effect.
In the end, however, nature has outpaced man’s CO2 and water-vapor production over the whole history of mankind, and where correlation meets causality the order is: heat first, CO2 second.
How much CO2 is man-made ??
We are adding about 20 billion metric tons annually, meaning we have, cumulatively, added more than a trillion metric tons in 50 years. This has had some effect, certainly, right??
Every year animals exhale more than 50 billion metric tons of CO2. The ocean emits about 90 bmt yearly, but absorbs slightly more. Plants release some, but absorb more into the soil or into their own cells ( until it is eventually released ).
Rotting organic material produces 50 bmt.
There are many other natural CO2 emitters that outdo man.
Volcanoes were a surprisingly small contributor – 230 million metric tons… good to know
So, man makes about 4% of all CO2…
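( If anyone wants to check the arithmetic on that 4% figure, here’s a rough tally of the numbers I quoted above. I never quoted a plant-respiration figure, so the one in the sketch is an assumed placeholder, not a sourced number: )

```python
# Rough annual CO2 fluxes in billions of metric tons (bmt), taken from
# the figures quoted above. The plant-respiration figure is an assumed
# placeholder, NOT a sourced number.
fluxes_bmt = {
    "man-made": 20.0,
    "animal respiration": 50.0,
    "ocean emission": 90.0,
    "rotting organic material": 50.0,
    "volcanoes": 0.23,
    "plant respiration (assumed)": 220.0,
}

total_bmt = sum(fluxes_bmt.values())
share_pct = 100.0 * fluxes_bmt["man-made"] / total_bmt
print(f"man-made share: {share_pct:.1f}% of {total_bmt:.0f} bmt/yr")  # ~4.6%
```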
In any event, thank you for prompting me to do more thorough research – I’ll not use the volcano example again – but that doesn’t change the effects CO2 has ( faster-growing plants and somewhat more fluid temperatures, which isn’t exactly good, but isn’t *warming* ), and it doesn’t change the fact that man’s contribution is relatively minor, overall.
Besides, I wonder what our water-vapor pollution levels would be? Granted, that would have the effect of stabilizing temperatures more than anything ( CO2 should destabilize due to higher thermal conductivity – though water vapor is measured in percent, whereas CO2 levels are measured in parts per million… ).
–The loon – always learning
EDIT: somehow linked to someone’s comment… heh….
oops.. CO2 isn’t more conductive, it is more capacitive – but my main point remains valid.
Thousands of times more water vapor…
or was I right?? hrm… stupid memory….
Did you actually read any of those links?
The numbers in the NASA study account for about 14% of the global rise in temperatures, based on the figures cited in Wikipedia’s global-warming article. Also, that article cites DailyTech, not the actual Goddard study. That’s quality investigative reporting right there.
From the third link: CO2 does not lower temperatures. The page says in a couple of places that the drop in temperature was caused by a reduction in CO2 – quite the opposite of your statement that CO2 lowers temperatures. Again, did you actually read the article?
As for the fourth link, well, to say that a 150-year trend has ended because of one cold winter is stupid. By the way, in my neck of the woods, snowfall was actually quite a bit lower than normal for the third year in a row, further worsening my region’s latest drought conditions.
And as for Fox News, well, they are such a worthless news organization, especially when it comes to science. They would report that the sky was purple if the Republican party said it was so, and claim that anybody who says the sky is blue is only supporting terrorists with treasonous, anti-American statements.
Well, water (the number one greenhouse gas in our atmosphere) is transparent to visible light, but it is NOT as transparent to infrared light.
This means that when light from the Sun arrives at Earth, visible light (which is the range where the Sun emits most intensely) passes straight through to the surface. There’s also infrared light in there that DOES get absorbed, bounce off, etc… but most of the energy is in visible light.
Visible light that reaches the surface mostly gets absorbed, heating the surface, which then re-radiates in the infrared. Again, water is not as transparent to infrared light, so the atmosphere doesn’t let that energy escape back into space.
It has to do with blackbody radiation: the Sun’s surface is about 5,780 K, so its peak emission is in the visible range; the Earth’s surface is about 280 K, so its peak emission is in the infrared range.
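( If you want to put numbers on that, Wien’s displacement law – peak wavelength = b / T – does the trick. A quick sketch, using the temperatures above; again, this is from memory: )

```python
# Wien's displacement law: peak emission wavelength = b / T,
# where b ~ 2.898e-3 m*K. This is why the Sun peaks in the visible
# band while the Earth's surface peaks in the thermal infrared.

B_WIEN_M_K = 2.898e-3  # Wien's displacement constant, m*K

for body, temp_k in [("Sun", 5780.0), ("Earth's surface", 280.0)]:
    peak_um = B_WIEN_M_K / temp_k * 1e6  # metres -> micrometres
    print(f"{body} at {temp_k:.0f} K: peak emission at {peak_um:.2f} um")
# Sun: ~0.50 um (green, visible); Earth: ~10.35 um (infrared)
```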
I apologize if any of the above is slightly wrong, I’m working from memory and didn’t have time to do any actual calculations.
All else aside then… do we need to tax water vapor emissions?
I mean, we measure water vapor as a percentage, whereas CO2 is measured in parts per million. CO2 cools the upper atmosphere and causes a slight increase in temperature at the surface ( the planet warms faster AND cools faster ) – I guess I need to make this distinction in my discussions ( I don’t normally discuss this topic ).
Also, my real meaning seems to have been lost: all I meant to say was that man-made global warming and man’s CO2 emissions are political tools – and they are.
Not that the end results are terrible – we needed the change – but I don’t like the lies. The warming is real, the CO2 rise is real, and man adds to this effect – but man’s overall effect on temperature is almost zero, at least via CO2.
Water vapor absolutely renders the CO2 figures meaningless until we see concentrations in parts per thousand. Granted, seeing the rising trend early on is helpful – but it is not the cause of the warming cycle. It is still the sun, regardless of the poor conclusion from NASA ( I look at data; conclusions are normally purchased or demanded [ besides, dissenting views are not permitted in science anymore ] – the study should properly have said that other factors contribute to the warming observed, not claimed WHAT those factors may have been ).
Solar output increases result in CO2 output increases and, most importantly, higher humidity. I don’t need data for that; it is common sense. An increase from 300 ppm to 390 ppm of CO2 is non-trivial considering the base, but it has a VERY minimal effect on temperatures. I wish I could find data for historical humidity levels…
It is vital to remember that correlation does not equal causality. Higher temps and humidity would cause a change in plant & animal respiration, decay, and ocean CO2 absorption rates. A higher temperature prevents water vapor from condensing as quickly as normal, thus trapping more ‘stuff’ in the air; thinner & wetter air is more difficult for animals to breathe, so breathing quickens ( almost 55 bmt of CO2 production just from breathing, versus man’s total 20 bmt output ) – and bacteria love it.
In time, the solar output will decrease and the rains will dry out the air a little, and levels will adjust, but the temperature decrease will occur first. All this is a political game for the power elite – a tool to consolidate more power into the hands of only a few ( a bad idea, regardless of intentions ).
Besides, we have had ice ages with super-high CO2 levels: the Ordovician is alleged to have had 4000 ppm of CO2, and the more recent Permian ice age had about 500-600 ppm. What is true, however, is that – in both cases – CO2 levels declined rapidly as part of, or in response to, the cooling.
Correlation does not show causality.
I’m certainly no expert in the field, but I know enough about statistical modeling to know why these scientists believe what they do. They aren’t dumb; they just believe the trends revealed through normalization are enough to “reject the null.” Mathematically speaking, they are. That doesn’t mean you have the right answer – you need the correct inputs first ( MUCH debate about which rises/falls first [ CO2 or temps ] is still occurring amongst ALL scientists in the field ). At the same time, they manage to forget about absolutes; everything becomes relative and correlative, based upon a narrow window of time.
Then, they have the prevailing winds against them should they desire to research the “effect of water vapor on global warming.” No one cares; those with the power to issue grants can’t do anything about water vapor… so no funding == no “scientific” research.
Granted, in the last few thousand years CO2 levels have been very low – caused by whatever, and are back on the rise – slightly.
One nice site: http://www.longrangeweather.com/
They accurately predicted the fall in temps from the agreed average, though I don’t see them making that claim…
On another note, an observant individual may have noticed that the media stopped calling “global warming” as such, and went to “climate change.”
Back to the article, however: I’ve had the idea a hundred million times to use the excess heat from machines to warm a space – it is a big DUH thing. But 60C coolant requires either that hardware designed for higher temperatures be employed, or it means I.B.M. is trying to get CPUs to die a bit sooner – conveniently once they’re outside of warranty.
70C T-max doesn’t go well with a 60C ambient. Period.
–The loon [ all over the map ]
– I’m not even gonna re-check my post, I’m going to bed.
What an idea!!
I will go right out and buy this supercomputer to heat my building… in the middle of summer!!!
IBM would do better to stay quiet than to tell us these kinds of stories…
At least they could have announced it in winter, not now!
Well… depends on where you live
Maybe they announced it for Australians