Large-scale miners operating huge mining farms enjoy economies of scale and low energy costs. To stay profitable as a residential miner, efficient cooling is critical, in particular during a crypto winter when the value of coins is trending downward.
If our previous post on mining pods was the winter edition, then this is the summer edition of building larger data centers at home. Running servers in a residential living space in summer is hell, no matter how good your A/C is.
It took several months of trial and error and validation to arrive at this setup. Our servers are low-cost, used HP Z620 dual-CPU machines with 16 cores in total and two NVIDIA GTX GPUs each, with a preference for quality NVIDIA GTX 1070 Founders Edition cards.
Power Savings and Efficiency
We eliminate A/C cooling altogether and save about 40% of the energy cost; this is the rule-of-thumb figure often used to estimate the cost of cooling. It also means we can increase the scale of our mining operation by roughly 40% on the same electrical budget.
We further minimize the required air exchange rate by separating air streams of different temperatures: fresh intake air, CPU and GPU cooler exhaust, and PSU exhaust.
Each fan consumes about 200W and the mean power consumption of our servers is 437W, so with one pair of fans per 16 servers (see below) our PUE (Total Facility Power / IT Equipment Power) in summer is below 1.06. That compares well with Google’s reported best “PUE as low as 1.06.”
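For readers who want to check the numbers, the arithmetic behind that PUE figure is spelled out below. This is only a back-of-the-envelope sketch; the one-fan-pair-per-16-servers grouping is the minimum configuration described under Air quality below.

```python
# Back-of-the-envelope PUE for one group of 16 servers cooled by
# one 700 CFM inlet fan and one 700 CFM exhaust fan.
SERVER_POWER_W = 437           # mean per-server draw
FAN_POWER_W = 200              # per fan
NUM_SERVERS = 16
NUM_FANS = 2                   # one inlet + one exhaust

it_power = NUM_SERVERS * SERVER_POWER_W             # 6,992 W
facility_power = it_power + NUM_FANS * FAN_POWER_W  # 7,392 W
print(f"PUE = {facility_power / it_power:.3f}")     # -> PUE = 1.057
```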
Heat management
Silicon chips and most other computer hardware are designed to operate at elevated temperatures. Each server runs software to control GPU fans and power consumption limits, automatically capping operating temperatures at 80 degrees C for both GPUs and CPUs. This is well within the specifications of enterprise-grade servers and consumer-grade GPUs.
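The control loop itself is simple. Below is a minimal sketch of the idea, not our exact script: it assumes NVIDIA’s nvidia-smi tool is on the PATH and that the driver allows changing the power limit (which requires root); the 150 W and 120 W limits are illustrative values rather than our actual settings. Fan-curve tuning, which on Linux typically goes through nvidia-settings and an X session, is left out.

```python
import subprocess
import time

TEMP_CAP_C = 80          # temperature ceiling mentioned above
NORMAL_LIMIT_W = 150     # illustrative everyday power limit
REDUCED_LIMIT_W = 120    # illustrative fallback when a card runs hot

def gpu_temperatures():
    """Return a list of (gpu_index, temperature_C) tuples via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=index,temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True)
    return [tuple(int(field) for field in line.split(","))
            for line in out.strip().splitlines()]

def set_power_limit(gpu_index, watts):
    """Apply a per-GPU power cap (needs root privileges)."""
    subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
                   check=True)

while True:
    for gpu_index, temp_c in gpu_temperatures():
        # Back the power limit off as a card approaches the 80 C cap,
        # and restore it once the card has cooled down again.
        if temp_c >= TEMP_CAP_C:
            set_power_limit(gpu_index, REDUCED_LIMIT_W)
        elif temp_c <= TEMP_CAP_C - 10:
            set_power_limit(gpu_index, NORMAL_LIMIT_W)
    time.sleep(30)
```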
In hot summers in the southern US, ambient temperatures easily rise beyond 40 degrees C (over 100 degrees F). We therefore need to make sure that the air entering our servers stays as close as possible to this outdoor ambient temperature. The volume of fresh air must also meet the demand of the server fans running at maximum speed.
Second, the server exhaust air, which in our case is about 20 degrees C (about 36 degrees F) hotter than the inlet air, must be collected and evacuated from the server room. We separate the air used for GPU and CPU cooling from the power supply (PSU) cooling air, and recycle the clean and relatively cool PSU exhaust by mixing it with fresh inlet air.
Air quality
Dust is another major issue to consider. It is a good idea to filter all air pulled into the server room through large, high-quality A/C filters such as 20×25 MERV 11 filters, with two 700 CFM fans per 20×25 filter. Make sure the air pressure in the mining room stays balanced; to eliminate any residual imbalance, we use one extra grille with a filter but without a fan.
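As a quick sizing check (pure unit conversion on the numbers above, useful for comparing against the airflow ratings printed on the filter packaging), the face velocity across one 20×25 filter fed by two 700 CFM fans works out to roughly 400 feet per minute:

```python
# Face velocity across one 20x25 inch filter fed by two 700 CFM fans.
FILTER_AREA_IN2 = 20 * 25          # 500 square inches
AIRFLOW_CFM = 2 * 700              # two fans per filter

filter_area_ft2 = FILTER_AREA_IN2 / 144.0          # ~3.5 ft^2
face_velocity_fpm = AIRFLOW_CFM / filter_area_ft2
print(f"face velocity ~ {face_velocity_fpm:.0f} ft/min")  # ~400 ft/min (about 2 m/s)
```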
700 CFM fans built from metal parts and rated to operate at temperatures well above 140 degrees F, 8″ flexible duct, and 8″ plastic air inlet and outlet grilles and caps are all available at very reasonable prices, striking a good balance between air flow per unit and cost.
In this configuration, at least one exhaust fan and one inlet fan are needed for every 16 servers or so. That is the bare minimum. One pair of fans per 10 servers is strongly recommended.
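That rule of thumb follows from a simple heat balance: the air stream has to carry away the servers’ electrical power at the roughly 20 degrees C inlet-to-exhaust rise mentioned earlier. The sketch below uses standard properties of air and the 437 W per-server figure from above; the result lands just above 16 servers per fan in the ideal case, and real fans deliver less than their nameplate CFM once filters and ducts are in the way, which is why one pair per 10 servers is the safer choice.

```python
# How many 437 W servers can one 700 CFM exhaust fan serve
# at a 20 C inlet-to-exhaust temperature rise?
AIR_DENSITY_KG_M3 = 1.13     # warm air at roughly 40 C
CP_AIR_J_KG_K = 1005.0       # specific heat of air
CFM_TO_M3_S = 0.000471947    # 1 CFM in cubic meters per second

airflow_m3_s = 700 * CFM_TO_M3_S                   # ~0.33 m^3/s
mass_flow_kg_s = AIR_DENSITY_KG_M3 * airflow_m3_s  # ~0.37 kg/s
delta_t_c = 20.0
heat_removed_w = mass_flow_kg_s * CP_AIR_J_KG_K * delta_t_c   # ~7.5 kW
print(f"~{heat_removed_w / 437:.0f} servers per 700 CFM fan")  # ~17
```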
Location
A garage, or a partition of a garage, is a great place for a larger residential data center. The main breaker panel is typically located there, making it easy to add breakers and circuits, and the garage door may already have small windows in it, which can easily be replaced with air inlets and outlets. All of this is reversible.
After some experimentation, we found that a single circular hole, a strong metal mesh attached from the inside, a plastic exterior cap, and strong rigid ties to fasten the flex duct ends to the opening form a good and cost-effective solution. Foundation air vents can be a cost-effective way to do some additional air pressure balancing, at the cost of higher air flow resistance. Less visually pleasing but more effective are two 8″ holes per window opening with metal mesh but without a vent cover. The metal mesh will probably start rusting on the air inlets, but not on the exhausts.
An A/C is still a great option for cooling down the remaining space around the data-center booth. Painter’s plastic, supported by stacks of boxes, can help keep the warmer and colder air separated.
Reliability
Would we use this with the latest, expensive server hardware? Probably not. But with used, low-cost hardware it seems worth taking the risk. We might see higher failure rates of hard drives or other secondary components; however, so far, with outdoor temperatures in the 90s and 100s Fahrenheit, we have not seen any.
We will faithfully update this post in case new experience shows us that this was not a very good idea after all.
updated: 20180623; 20180629; 20190519
photo: ideanist M
Very interesting articles about building mining pods from old high-end workstations. The 6-pin to 8-pin adapters for the GeForce 1070 cards look like quality products. Where did you get them?
Cable Matters (2-Pack) 6-Pin PCIe to 8-Pin PCIe Adapter Power Cable – 4 Inches on Amazon