Facebook has unveiled the technology behind its new data center in Prineville, Oregon, including details of its custom servers, racks, and UPS systems. This is Facebook's first self-built data center.
Jonathan Heiliger, Facebook's vice president of technical operations, said the Prineville data center achieves a PUE of 1.07, which would make it one of the most energy-efficient data centers in the industry.
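PUE (power usage effectiveness) is simply total facility power divided by the power delivered to IT equipment, so a value of 1.07 means only about 7% of the incoming power is overhead. Here is a minimal sketch of the calculation; the wattages are made-up example numbers, not Facebook's actual figures.

```python
# Illustrative sketch: PUE is total facility power divided by the power
# delivered to IT equipment. The values below are example numbers only.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return the facility's power usage effectiveness."""
    return total_facility_kw / it_equipment_kw

# A PUE of 1.07 means only ~7% of incoming power is spent on overhead
# (cooling, distribution losses, lighting) rather than on the servers.
print(pue(total_facility_kw=10_700, it_equipment_kw=10_000))  # -> 1.07
```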
Cooling Design
Facebook adopted a two-story design that separates the servers from the cooling equipment, allowing the server floor space to be used as fully as possible. The cooling plant occupies the upper level of the data center, so cold air descends from above into the server space: the design exploits the natural tendency of cold air to sink as hot air rises, avoiding the need for ductwork and pressurized air handling.
Oregon's cool, dry climate was a key factor in Facebook's decision to build its data center in Prineville. "This is an ideal location for evaporative cooling," said Jay Park, Facebook's director of data center engineering. In the past 50 years, Prineville has never recorded a temperature above 105 degrees Fahrenheit (about 40.6 degrees Celsius).
Air enters the data center through a bank of intakes on the second floor, then passes through a mixing room, where cold outside air can be blended with the servers' waste heat to regulate its temperature. The air then moves through a series of air filters into a misting chamber, where a fine spray from small nozzles further controls temperature and humidity. It passes through another filter that absorbs the remaining mist, and a wall of fans then pushes it through openings in the floor into the server area below. The entire cooling system sits on the upper level and delivers air directly to the servers, with no ductwork.
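The misting stage works by evaporative cooling: evaporating water pulls the air's dry-bulb temperature down toward its wet-bulb temperature. A standard first-order model for this is sketched below; the temperatures and effectiveness value are illustrative assumptions, not measurements from the Prineville facility.

```python
# A minimal sketch of why misting cools the air. A common first-order model
# for an evaporative cooling stage is:
#   T_out = T_db - effectiveness * (T_db - T_wb)
# The inputs below are assumed example values, not Facebook data.

def evaporative_outlet_temp_f(dry_bulb_f: float, wet_bulb_f: float,
                              effectiveness: float = 0.9) -> float:
    """Estimate outlet air temperature of an evaporative cooling stage."""
    return dry_bulb_f - effectiveness * (dry_bulb_f - wet_bulb_f)

# Dry desert air has a large dry/wet-bulb spread, so it cools well:
# 95F dry bulb with a 65F wet bulb drops to about 68F.
print(f"{evaporative_outlet_temp_f(95.0, 65.0):.0f} F")  # -> 68 F
```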
"The beauty of this system is that we don't have any plumbing systems," Park said. The air goes directly into the data hall and presses into the entire data center. ”
Racks and Servers
The cold air then reaches the custom racks, which are deployed in groups of three, each rack holding 30 1.5U Facebook servers. The servers are also custom-built to avoid waste: the Intel and AMD motherboards inside have been stripped down to the essential components to save costs. The chassis is taller than a standard one, which means it can accommodate larger heat sinks and fans, which in turn means less ambient air is needed for cooling. "We've removed everything that isn't necessary," says Amir Michael, a Facebook hardware engineer. "The slightly taller chassis lets us use taller heat sinks, and instead of the fairly standard 40 mm fans we can use larger, more efficient 60 mm fans."
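A back-of-the-envelope calculation (not from the article) shows why the larger fans help: an axial fan's swept area grows with the square of its diameter, so a 60 mm fan can move the same volume of air as a 40 mm fan at a much lower blade speed, where it runs more efficiently and quietly. Frame sizes below are nominal and the hub is ignored.

```python
# Rough illustration: swept area of an axial fan scales with diameter
# squared, so a bigger fan moves the same airflow at lower speed.
# Nominal frame sizes; hub diameter ignored.

import math

def swept_area_mm2(diameter_mm: float) -> float:
    """Approximate swept area of an axial fan, ignoring the hub."""
    return math.pi * (diameter_mm / 2) ** 2

ratio = swept_area_mm2(60) / swept_area_mm2(40)
print(f"A 60 mm fan sweeps about {ratio:.2f}x the area of a 40 mm fan")
# -> A 60 mm fan sweeps about 2.25x the area of a 40 mm fan
```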
Cabling and power connections are located at the front of the servers, so Facebook's operations staff can service the machines from the cool aisle instead of standing behind them in the hot aisle, where temperatures can reach 100 degrees Fahrenheit (about 37.8 degrees Celsius).
UPS and Power Distribution
One of Facebook's particular focuses was power distribution. Traditional data centers route power through an uninterruptible power supply (UPS) that converts the incoming AC to DC and back to AC, losing energy at each conversion. "We've put a lot of effort into the efficiency of this power design," says Michael.
The Facebook data center uses custom power supplies that accept 277V AC instead of the usual 208V, which lets power reach the servers directly from the facility's distribution system without an extra step-down transformation stage, avoiding the associated losses. The power supplies are built to Facebook's design by Power-One, a California power-electronics manufacturer.
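The point of removing stages is that end-to-end efficiency is the product of the per-stage efficiencies, so every stage eliminated is loss avoided. The sketch below compares a conventional chain with a direct-feed chain; the per-stage efficiency figures are rough assumptions for illustration, not measurements of either design.

```python
# Illustrative sketch of why removing conversion stages matters. The stage
# efficiencies below are rough assumed figures, not measured values.

from math import prod

traditional_stages = {
    "step-down transformer":             0.98,
    "UPS double conversion (AC-DC-AC)":  0.92,
    "server PSU (AC -> DC)":             0.90,
}

direct_feed_stages = {
    "direct 277V AC feed":               1.00,  # no step-down stage
    "server PSU (AC -> DC)":             0.94,  # assumed high-efficiency PSU
}

for name, stages in [("traditional", traditional_stages),
                     ("direct-feed", direct_feed_stages)]:
    print(f"{name}: {prod(stages.values()):.1%} of utility power reaches the server")
# traditional: 81.1% ...   direct-feed: 94.0% ...
```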
What about the UPS system? Facebook originally planned to put batteries on the servers themselves, but ultimately installed in-row UPS units, with one UPS cabinet between each group of servers to provide backup power during outages. Each UPS unit holds 20 batteries, arranged in five strings of 48V DC batteries. Each server power supply accepts two feeds: 277V AC utility power and 48V DC from the UPS system.
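The arithmetic behind that battery layout is easy to check. Twenty batteries in five strings means four batteries per string; assuming common 12V battery blocks (an assumption, since the article doesn't state the block voltage), four in series gives exactly the 48V DC the power supplies expect.

```python
# Quick arithmetic behind the battery layout: 20 batteries per UPS in five
# 48V strings. Assuming 12V blocks (not stated in the article), it fits:

batteries_total = 20
strings = 5
block_voltage = 12.0  # assumed 12V battery blocks

per_string = batteries_total // strings       # 4 batteries per string
string_voltage = per_string * block_voltage   # 4 x 12V in series = 48V
print(per_string, string_voltage)             # -> 4 48.0
```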