28 Comments
JoeyJoJo123 - Tuesday, January 17, 2017
Looks neat, seems impractical for actual data centers.
Billy Tallis - Tuesday, January 17, 2017
It is fairly impractical as demoed here, because the 3M Novec fluid is way too expensive. But that's why 3M partnered with Gigabyte for this: they can build whole servers with non-standard form factors, optimized to maximize density and minimize the empty volume that needs to be filled with the coolant.
They make a fairly convincing claim that immersion cooling leads to much lower component failure rates, due to everything in the system being kept well below unsafe temperatures and not experiencing large temperature swings. That should at least partially offset the maintainability challenges of immersion cooling.
I would like to see a rack-scale demo of this, both to see how they would remove the heat generated by so many servers when the vapor is only ~45-50°C, and how they handle power delivery to such a dense cluster.
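
For a sense of scale on that question, here is a back-of-envelope Python sketch of the condenser side. Every number in it is an illustrative assumption (a 40 kW rack, 50 °C vapor, 30 °C facility water warming to 40 °C), not anything Gigabyte or 3M have published:

import math

q_load = 40_000.0          # assumed rack heat load, W
cp_water = 4186.0          # specific heat of water, J/(kg*K)
t_vapor = 50.0             # assumed condensing temperature, deg C
t_in, t_out = 30.0, 40.0   # assumed cooling-water inlet/outlet, deg C

m_dot = q_load / (cp_water * (t_out - t_in))        # required water flow, kg/s
dt_in, dt_out = t_vapor - t_in, t_vapor - t_out
lmtd = (dt_in - dt_out) / math.log(dt_in / dt_out)  # log-mean temperature difference, K
ua = q_load / lmtd                                  # required condenser conductance, W/K

print(f"water flow: {m_dot:.2f} kg/s (~{m_dot * 60:.0f} L/min)")
print(f"LMTD: {lmtd:.1f} K -> condenser UA: {ua / 1000:.1f} kW/K")

Even with a generous 10 K water-side rise, one rack needs roughly a liter of water per second and a condenser conductance near 3 kW/K; the small vapor-to-water temperature difference is exactly the crux.
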
wumpus - Tuesday, January 17, 2017
I'd assume you would have a water/"3M fluid" heat exchanger. I'd also wonder how effective it would be to have plastic inserts that would reduce the volume of expensive fluid needed and maintain flow. The other question would be what fluid would be flowing through the heat exchangers. I'd guess something like highly diluted anti-freeze (to prevent fouling), then further heat-exchanged with water pulled from a river/lake.
Personally, I'd rather skip all these steps and just use "all in one" units (with lengthened tubes and water/water heat exchangers instead of water/air), but there's a lot of heat that you will miss. Containing leaks, or making them irrelevant (as this fluid does), will be the key.
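
The insert idea is easy to put rough numbers on. In this sketch the tank volume, displaced volume, and fluid price are all placeholders (real pricing isn't public), so only the shape of the saving is meaningful:

tank_volume_l = 60.0       # assumed open-bath tank volume, liters
hardware_volume_l = 15.0   # assumed volume displaced by the servers, liters
insert_fraction = 0.5      # assumed share of the free volume filled by plastic inserts
price_per_liter = 300.0    # placeholder fluid price, $/liter

free_volume_l = tank_volume_l - hardware_volume_l
saved_l = free_volume_l * insert_fraction
print(f"fluid saved: {saved_l:.0f} L -> ${saved_l * price_per_liter:,.0f} per tank")

With those placeholders it's thousands of dollars per tank, which is why inserts, or the tightly packed non-standard form factors Billy mentioned, matter.
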
petuma - Friday, February 3, 2017
You can see the technology at 40 MW facility scale here: https://www.youtube.com/watch?v=t8dj1LYw50g
29a - Monday, June 18, 2018
The video is blocked in the US; you'll need to use a VPN to watch it.
Samus - Tuesday, January 17, 2017
Potentially too unreliable/unproven for a data center application, as well. A supercomputer, or something else experimental, perhaps...
LordOfTheBoired - Monday, January 30, 2017
Hey, if immersion cooling was good enough for Cray... In fairness, Cray didn't use boiling coolant. And that boiling is actually ingenious, as it ensures all the hardware is maintained at or below the boiling point of the bath, and that coolant circulates across all hot components pumplessly.
koaschten - Wednesday, January 18, 2017
Well, not if you start thinking 90° turned, like hanging servers vertically into an aquarium: http://www.grcooling.com/carnotjet/
Guspaz - Tuesday, January 17, 2017
IIRC, OVH has built some of the largest datacenters in the world, and done so without using any air conditioning. They have datacenter-wide watercooling loops that are cooled with big fans.
Ej24 - Tuesday, January 17, 2017
I think that's probably more like most industrial-scale air conditioners, which use a giant evaporative chiller tower to cool water in a loop that cycles through radiators through which air is pushed. Most large businesses use them as they're much more efficient at large scale than heat pumps utilizing refrigerant. For example, I work at a hospital that has 4 large chiller towers to provide chilled water for air conditioning. I imagine a datacenter is the same.
Meteor2 - Wednesday, January 18, 2017
Yes, large-scale 'air conditioning' is really chilling water down and then pumping that to air/water heat exchangers where you need cooling. Water phase-change is indeed cheaper at the largest scales than heat pumps, though at my place we have both (diversity of back-ups).
You'd just connect these liquid baths to heat exchangers fed with chilled water. If you're clever you stick them in a cold town, then use heat pumps to put the heat into the buildings (rough numbers on that below).
This thing looks just like the liquid-cooled computer in the spaceships in the film Sunshine!
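
The heat-pump leg of that pencils out well, because the lift from a ~45 °C bath to hydronic heating temperatures is small. A sketch, with both temperatures assumed purely for illustration:

t_source_c = 45.0   # assumed temperature of the recovered heat, deg C
t_sink_c = 60.0     # assumed building heating supply temperature, deg C
t_source_k = t_source_c + 273.15
t_sink_k = t_sink_c + 273.15

cop_carnot = t_sink_k / (t_sink_k - t_source_k)   # ideal (Carnot) heating COP
cop_realistic = 0.5 * cop_carnot                  # crude half-of-Carnot rule of thumb
print(f"Carnot COP: {cop_carnot:.0f}, realistic ballpark: {cop_realistic:.0f}")

A small temperature lift makes for a very high COP, so almost all of the delivered heat would be "free" server heat rather than compressor work.
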
Meteor2 - Wednesday, January 18, 2017
...and you do still need to pump chilled water into water blocks in the racks. The radiator (most visible in the fourth picture) will have had chilled water circulating through it.
boeush - Tuesday, January 17, 2017
Now take that vapor and run it through a tiny turbine to turn the heat back into electricity... :>
TekBoi - Tuesday, January 17, 2017
And that, my friend, is the Rankine cycle.
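
It is, although at these temperatures the thermodynamics are brutal. A quick Carnot bound, assuming ~50 °C vapor and a 25 °C ambient (both temperatures are assumptions):

t_hot_k = 50.0 + 273.15    # assumed vapor temperature, K
t_cold_k = 25.0 + 273.15   # assumed ambient heat-sink temperature, K

eta_carnot = 1.0 - t_cold_k / t_hot_k   # efficiency ceiling for any heat engine
print(f"Carnot limit: {eta_carnot:.1%}")   # ~7.7%

A real low-pressure turbine would capture only a fraction of that ~7.7% ceiling, so you'd recover at most a percent or two of the rack's power.
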
ingwe - Tuesday, January 17, 2017
I really enjoyed this comment... unlike the class that I finished in December on heat engines.
BrokenCrayons - Tuesday, January 17, 2017
I wonder what's in Novec, and if frequent exposure to it or its vapors over a long period of time could cause unintended health effects. It could put the data center workforce at risk for problems that we'd only learn about years later. Aside from added maintenance difficulties, that'd be a cause for concern. Okay, sure, most administration is done remotely, but someone has to go into the racks to work on stuff. Ick if that person ends up growing an extra arm or something.
Billy Tallis - Tuesday, January 17, 2017
3M publishes safety data for Novec. It's mostly large-ish hydrocarbon molecules with the occasional halogen group. You wouldn't want to be around it if it was burning, but under more ordinary conditions it's only a problem in very large quantities or with prolonged exposure to large amounts. By most measures, the most dangerous ingredient seems to be the isopropyl alcohol, but there are certainly other components that can be harmful.
Given the expense of the fluids, the most cost-effective handling and storage measures should easily provide sufficient containment to keep exposure levels safe.
BrokenCrayons - Tuesday, January 17, 2017
Ah, thanks! That sounds fairly mundane.
29a - Monday, June 18, 2018
MSDS: https://multimedia.3m.com/mws/mediawebserver?mwsId...
29a - Monday, June 18, 2018
Wrong MSDS, here's the right one: http://multimedia.3m.com/mws/mediawebserver?66666U...
Meteor2 - Wednesday, January 18, 2017
So what was this thing extracting in total? I make it just over 2 kW coming out of that 2U rack, assuming efficient power supplies. Rejecting that much heat without fans is rather good.
petuma - Friday, February 3, 2017
Power in the overclocked state was 3.3 kW.
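
That figure makes the pumpless circulation easy to quantify. A sketch using approximate datasheet properties for FC-72 (latent heat near 88 kJ/kg, liquid density near 1.68 kg/L; treat both as round numbers):

q_watts = 3300.0    # heat load from the comment above, W
h_fg = 88_000.0     # approx. latent heat of vaporization of FC-72, J/kg
rho_liquid = 1.68   # approx. liquid density of FC-72, kg/L

m_dot_kg_s = q_watts / h_fg   # evaporation rate, kg/s
liters_per_hour = m_dot_kg_s * 3600.0 / rho_liquid
print(f"boil-off: {m_dot_kg_s * 3600.0:.0f} kg/h (~{liters_per_hour:.0f} L of liquid per hour)")

Roughly 80 liters of liquid cycles through the vapor phase every hour, all of it condensing on the coil and raining back down with no pump involved.
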
Anato - Wednesday, January 18, 2017
Now sysadmins and maintainers need to take a diving course to access the servers :-)
zodiacfml - Thursday, January 19, 2017
Seems great for cooling all components, and I have read somewhere that best practice keeps CPU loads in the data center pretty low.
Nottheface - Monday, January 23, 2017
I don't understand how this removes the need to get the heat from the servers out of the building: "So the point in all this is more efficient cooling – no need for massive air conditioning units in a data center, no need to pump chilled water into water blocks."
Same amount of heat generated by the servers = same general need to remove it from the building. Whether that method is air conditioning, liquid cooling, etc., it ends up looking fairly similar if you consider the entire process.
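
Right: the same joules leave the building either way; what changes is the overhead energy spent moving them, which is what PUE captures. An illustrative comparison (both PUE values are assumptions, not measurements):

it_load_kw = 1000.0   # assumed IT load, kW
scenarios = {"air-cooled (PUE 1.6)": 1.6, "immersion (PUE 1.15)": 1.15}

for name, pue in scenarios.items():
    overhead_kw = it_load_kw * (pue - 1.0)   # cooling + power-distribution overhead, kW
    print(f"{name}: {overhead_kw:.0f} kW spent moving {it_load_kw:.0f} kW of heat")

Same megawatt of server heat rejected in both cases; the claimed win is cutting the hundreds of kilowatts of fans, chillers, and pumps needed to do it.
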
petuma - Friday, February 3, 2017
BTW - For the curious, the fluid used in the demo at CES is not Novec 72DA. Fluorinert FC-72, or perfluorohexane, was used. It also runs on Novec 7100, the fluid preferred by Bitcoin companies like Bitfury.
- demo co-creator
petuma - Thursday, November 2, 2017
A colleague just reminded me to correct a misstatement in the above article. Since I built the tank with Adachi-San of Gigabyte at my lab in St. Paul, I feel qualified to do this. The fluid is not Novec 72DA. Novec 72DA is a reasonably aggressive azeotropic solvent that might have damaged the computer. Systems like this have been run in pure Novec hydrofluoroethers like Novec 7100 (popular in Bitcoin) without the dichloroethylene and IPA. Here it was in fact running in Fluorinert FC-72.