Intel Embraces Oil Immersion Cooling For Servers

Intel projects 20 percent annual growth for the technical computing market from 2011 to 2016. Will the appetite for ever more powerful computing clusters push users to new cooling technologies, such as submerging servers in liquid coolant? If so, Intel will be ready: its interest in alternative cooling designs is driven directly by those growth projections for the high-performance computing sector.

The Intel Embraces Oil Immersion Cooling For Servers Slashdot posting reports that Intel has just concluded a year-long test in which it immersed servers in an oil bath, and has found the technology to be highly efficient and safe for servers. The chipmaker is now working on reference designs, heat sinks and boards that are optimized for immersion cooling.

The Intel Gives Oil-Based Cooling Thumbs Up article reports that Intel has given its seal of approval to dunking a server full of electronic components in a bath of nonconductive dielectric oil, finding that it may be an ideal cooling solution. This approach can lower the PUE to an eye-catching 1.02. After a year’s immersion in the oil bath, all the hardware involved in the test (microprocessors, I/O chips, connectors, hard drives, and server housing) withstood the oil just fine.
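
For context, PUE (power usage effectiveness) is simply total facility power divided by the power delivered to the IT equipment, so a PUE of 1.02 means just 2 percent overhead for cooling and power distribution. Here is a minimal sketch of the arithmetic, using illustrative figures rather than numbers from Intel’s test:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Illustrative numbers only: a 1,000 kW IT load with 20 kW of cooling and
# power-distribution overhead yields the PUE reported for the oil bath.
print(pue(1020.0, 1000.0))  # 1.02

# A typical air-cooled data center runs far higher:
print(pue(1800.0, 1000.0))  # 1.8
```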

Of course, this “new” way of cooling a server isn’t exactly novel. Power substations, for example, have used oil cooling to remove heat from transformers for ages. Dunking heat-generating microprocessors and graphics cards in an oil bath has served as an alternative cooling solution for PC overclockers for many years, and early supercomputers also used liquid cooling.

The Data Center Knowledge article Intel Embraces Submerging Servers in Oil reports that Intel is optimizing its technology for servers immersed in oil, an approach that may soon see broader adoption in the high-performance computing (HPC) sector. Mineral oil has been used for immersion cooling because it is not hazardous and transfers heat almost as well as water, yet doesn’t conduct an electric charge. Mike Patterson, senior power and thermal architect at Intel, says that immersion cooling can change the way data centers are designed and operated; it can even eliminate the need for chillers and raised floors.

Austin-based Green Revolution Cooling says its liquid-filled enclosures can cool high-density server installations for a fraction of the cost of air cooling in traditional data centers. The company says its approach can produce large savings on infrastructure, allowing users to operate servers without a raised floor, computer room air conditioning (CRAC) units or chillers. Fluid temperature is maintained by a pump with a heat exchanger using a standard water loop.

In essence, the company’s product, called CarnotJet, houses servers in a specialized coolant oil that absorbs the heat they give off; the oil is then sent to a radiator, where it is cooled before being recycled back into the server housing. Each 13U rack can handle between 6 kW and 8 kW of heat, depending on whether the heat pulled away from the servers is exchanged via a traditional radiator or a water loop. The company claims it can install 100 kW or more of compute in each 42U rack.
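
As a rough sanity check on those per-rack numbers, the sensible-heat relation q = ṁ·c_p·ΔT gives the oil flow needed to carry away 8 kW at a modest temperature rise. The oil properties and the 10°C rise below are generic assumptions, not Green Revolution Cooling specifications:

```python
# Rough coolant-flow estimate for an 8 kW immersion rack.
CP_OIL = 1670.0   # specific heat of mineral oil, J/(kg*K) -- typical value
RHO_OIL = 850.0   # density of mineral oil, kg/m^3 -- typical value

heat_load_w = 8000.0  # per-rack heat load quoted in the article
delta_t_k = 10.0      # assumed oil temperature rise across the rack

mass_flow_kg_s = heat_load_w / (CP_OIL * delta_t_k)
volume_flow_lpm = mass_flow_kg_s / RHO_OIL * 1000.0 * 60.0

print(f"{mass_flow_kg_s:.2f} kg/s, about {volume_flow_lpm:.0f} L/min")
# ~0.48 kg/s, roughly 34 L/min -- well within reach of a small pump
```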

The downside is that mineral oil-style coolants can be messy to maintain: a little mineral oil spreads a long way. If you plan to minimize the needed hardware maintenance and keep spare clothes on hand when working with the servers, the messiness might not be a very big issue compared to the potential gains.

5 Comments

  1. Tomi Engdahl says:

    How to cool a PC with toilet water
    http://www.extremetech.com/extreme/124677-how-to-cool-a-pc-with-toilet-water

    Hot on the heels of news that Google uses toilet water to cool one of its data centers, it has emerged that an enterprising hardware hacker had the same idea some seven years ago. As you will see in the following pictures, though, Jeff Gagnon’s computer is much more than a toilet-cooled rig — it’s a case mod tour de force.

    Then there’s the CPU waterblock, which has been handmade from a lump of copper and, I presume, an arc welder or a soldering iron. But where does that tubing go, I hear you ask? Where’s the water reservoir, the pump, the radiator?

    Well, it just so happens that there’s a toilet on the other side of the wall

  2. Tomi Engdahl says:

    Cool technology: Submerged blade servers escape the heat
    http://www.theregister.co.uk/2014/12/12/blade_cooling/

    Keeping servers cool is a challenge, even in a purpose-built data centre.

    Outdoor equipment in Canada often lives in horrible little metal boxes ironically called “sheds” that bear no resemblance to any structure so spacious.

    They are basically a full depth 12U 19in rack bolted to the side of some steel monstrosity made up of nightmares and solar absorption. Inside the box sit 4U of server, 4U of networking and 4U of heating, ventilation and air conditioning.

    With the chiller going flat out, the temperature doesn’t drop below 60°C during the hot days, and stays around 50°C for about four months of the year.

    At first glance it would seem that the obvious solution to these problems is to replace the horrible little shed things with something better. That is far easier said than done.

    For reasons involving bureaucrats (and, I am convinced, demonic pacts) getting external enclosures certified for use with telecoms equipment, on oil pipelines or with various other utilities is way harder than it should be.

    Networking along 6,000 kilometres of oil pipeline is spotty at best so cloud computing is not an option,

    So, if the ovens we kindly refer to as shelters are not likely to get any cooler, we need servers that can handle higher temperatures.

    The first answer that comes to mind is from LiquidCool Solutions, which does not manufacture IT equipment but licenses patents

    LiquidCool claims to be able to run servers “in ambient temperatures of 50°C or higher while maintaining core junction temperatures 30°C cooler than fan/air based cooling”. That has got my attention, yes indeedy.

    LiquidCool pitches “harsh environments” as a major use case

    LiquidCool licenses patents to build fully enclosed submerged server technology. Put your server in a box of oil, seal it and pump the oil out to a radiator. It is a great concept, but I have a few issues.

    Assuming that there is absolutely no possible way for the metal dust to get into the oil, then a completely sealed, submerged computer might be a solution. Except for the part where swapping computer components out is a job likely to be given to Welder McMetaldust.

    In June, HP announced its Apollo 8000 system at the HP Discover conference.

    Contemplate this for a moment. HP’s “dry disconnect” technology offers the efficiency of liquid cooling, and also the ability to hot-swap nodes from a chassis.

    Project Apollo is aimed at the high-performance computing market. It is designed to get more computers into a smaller space while using less power than the competition.

  3. Tomi Engdahl says:

    Aquarium Computer
    http://eleccelerator.com/aquarium-computer/

    The process of building a PC is pretty boring; it’s just an exercise in picking out compatible parts for the right price. I decided to make it slightly more interesting by submerging the entire computer in a fish tank full of mineral oil.

    This isn’t an entirely new idea (it’s even patented, so nobody can sell a kit); many people have already done this. Mineral oil is non-conductive, so the electronics will work perfectly fine while submerged.

    I have to be really careful when choosing materials. I’ve noted that some people who build mineral oil submerged computers experience no problems with any materials, while other people report that PVC will swell or harden. The PVC swelling is the cause of capacitors popping off circuit boards, and flexible vinyl tubing (also PVC) can become hard. More detailed research indicates that capacitors on older motherboards have a rubber seal that fails, causing the hot electrolytic fluid inside to bulge; newer motherboards with “solid capacitors” should not fail. Vinyl tubing should not be operated at high temperatures. Rubber and neoprene must definitely be avoided, as they will fail in mineral oil. PLA plastic (used by my 3D printer) can’t be used, because PLA will warp even in hot water; ABS plastic (also used by my 3D printer) should be fine. Mineral oil is also used to remove many adhesives, so I obviously need to be careful with that as well.

  4. Tomi Engdahl says:

    Designing with liquid-immersion cooling systems
    http://www.csemag.com/single-article/designing-with-liquid-immersion-cooling-systems/245158e4ef137225b9ee9ccf2e54cd08.html?OCVALIDATE&ocid=101781

    Liquid cooling is an option in some data centers. Consider these best practices when looking at immersion cooling for your next data center project.

    In simple thermodynamic terms, heat transfer is the exchange of thermal energy from a system at a high temperature to one at lower temperature. In a data center, the information technology equipment (ITE) is the system at the higher temperature. The objective is to maintain the ITE at an acceptable temperature by transferring thermal energy in the most effective and efficient way, usually by expending the least amount of mechanical work.

    During steady-state operation, the rate at which the ITE generates thermal energy equals the rate at which that energy is transferred to the cooling medium flowing through its internal components. The flow rate requirement and the temperature envelope of the cooling medium are driven by the peak rate of thermal energy generated and the acceptable temperature internal to the ITE.
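
    In symbols, this steady-state balance is the standard sensible-heat relation (a textbook equation, not one quoted from the article):

    $$ q = \dot{m} \, c_p \, (T_{out} - T_{in}) $$

    where q is the rate of heat generation in the ITE, \dot{m} is the mass flow rate of the cooling medium, c_p is its specific heat, and T_{out} - T_{in} is the temperature rise of the medium across the equipment.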

    For data centers, air-cooling systems have been the de facto standard. From the perspective of the ITE, air cooling refers to the scenario where air must be supplied to the ITE for cooling. As the airflow requirement increases due to an increase in load, there is a corresponding increase in fan energy at two levels: the air distribution level (i.e., mechanical infrastructure such as air handling units, computer room air handlers, etc.) and the equipment level, because the ITE has integral fans for air circulation.

    Strategies including aisle containment, cabinet chimneys, and in-row cooling units help improve effectiveness and satisfactorily cool the equipment. However, the fact remains that air has inferior thermal properties and its abilities are getting stretched to the limit as cabinet loads continue to increase with time. For loads typically exceeding 15 kW/cabinet, alternative cooling strategies, such as liquid cooling, have become worthy of consideration.

    Liquid cooling refers to a scenario where liquid (or coolant) must be supplied to the ITE. An IT cabinet is considered to be liquid-cooled if liquid, such as water, dielectric fluid, mineral oil, or refrigerant, is circulated to and from the cabinet or cabinet-mounted equipment for cooling. Several configurations are possible, depending on the boundary being considered (i.e., external or internal to the cabinet). For the same heat-transfer rate, the flow rate requirement for a liquid and the energy consumed by the pump are typically much lower than the flow rate requirement for air and the energy consumed by the fan system. This is primarily because the specific volume of a liquid is significantly lower than that of air.
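
    To put a number on that specific-volume argument, the sketch below compares (with typical, assumed property values and an assumed 10 K temperature rise for both media) the volumetric flow needed to remove the same 15 kW cabinet load with air versus mineral oil:

```python
# Volumetric flow needed to remove 15 kW at a 10 K temperature rise,
# air vs. mineral oil. Property values are typical, assumed figures.
LOAD_W = 15000.0
DELTA_T_K = 10.0

media = {
    # name: (density in kg/m^3, specific heat in J/(kg*K))
    "air": (1.2, 1005.0),
    "mineral oil": (850.0, 1670.0),
}

for name, (rho, cp) in media.items():
    vol_flow_m3_s = LOAD_W / (rho * cp * DELTA_T_K)
    print(f"{name}: {vol_flow_m3_s:.4f} m^3/s")

# air:         ~1.2438 m^3/s (over 2,600 CFM of fan airflow)
# mineral oil: ~0.0011 m^3/s (about one litre per second)
```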

    For extreme load densities typically in excess of 50 to 75 kW/cabinet, the liquid should preferably be in direct contact with ITE internal components to transfer thermal energy effectively and maintain an acceptable internal temperature. This type of deployment is called liquid-immersion cooling and it is at the extreme end of the liquid cooling spectrum.

    The commercially available solutions can essentially be categorized into two configurations:

    1. Open/semi-open immersion. In this type of system, the ITE is immersed in a bath of liquid, such as dielectric fluid or mineral oil. The heat-transfer mechanism is vaporization, natural convection, forced convection, or a combination of vaporization and convection (see Figure 1).
    2. Sealed immersion. In this type of system, the ITE is sealed in liquid-tight enclosures and liquid, such as refrigerant, dielectric fluid, or mineral oil, is pumped through the enclosure. The heat-transfer mechanism is vaporization or forced convection, and the enclosure is typically under positive pressure.

    For both types of systems, thermal energy can be transferred to the ambient by means of fluid coolers (dry or evaporative) or a condenser. It can also be transferred to facility water (chilled water, low-temperature hot water, or condenser water) by means of a heat exchanger.

    The liquid properties impact major facets of the design and should be reviewed in detail.

    The right solution?

    When dealing with extremely dense cabinets, immersion cooling is worthy of consideration. It is suitable for deployments ranging from a few kilowatts to several megawatts. Due to improved heat-transfer performance as compared with an air-cooling system, liquid-supply temperatures higher than 100°F are feasible. Higher liquid temperatures increase the hours of economization, offer the potential for heat recovery, and in certain climates can eliminate the need for chillers completely. The elimination of internal ITE fans reduces energy consumption and noise. In addition, pump energy for circulating liquid is typically lower than fan energy.

    Despite the mechanical advantages, there are reasons for caution when deploying liquid-immersion cooling in data centers. The impact on infrastructure, such as structural, electrical, fire protection, and structured cabling, should be evaluated.

  5. Tomi Engdahl says:

    IEEE says zero hot air in Fujitsu liquid immersion cooling for data centers
    http://www.cablinginstall.com/articles/pt/2017/05/ieee-says-zero-hot-air-in-fujitsu-liquid-immersion-cooling-for-data-centers.html?cmpid=enl_cim_cimdatacenternewsletter_2017-05-23

    Given the prodigious heat generated by the trillions of transistors switching on and off 24 hours a day in data centers, air conditioning has become a major operating expense. Consequently, engineers have come up with several imaginative ways to ameliorate such costs, which can amount to a third or more of a data center’s operating costs.

    One favored method is to set up hot and cold aisles of moving air through a center to achieve maximum cooling efficiency. Meanwhile, Facebook has chosen to set up a data center in Lulea, northern Sweden, on the fringe of the Arctic Circle to take advantage of the natural cold conditions there; and Microsoft engineers have seriously proposed putting server farms under water.

    Fujitsu, on the other hand, is preparing to launch a less exotic solution: a liquid immersion cooling system it says will usher in a “next generation of ultra-dense data centers.”

    Fujitsu Liquid Immersion Not All Hot Air When It Comes to Cooling Data Centers
    http://spectrum.ieee.org/tech-talk/computing/hardware/fujitsu-liquid-immersion-not-all-hot-air-when-it-comes-to-cooling-data-centers

