Facebook's data centers worldwide, by the numbers and in pictures


With more than 1.32 billion users and counting, Facebook is arguably at the forefront of the burgeoning but still nascent social media world. But the Menlo Park, Calif.-headquartered company is also paving the way in an arena with which most of its global membership base might not be as familiar.

While Google, Microsoft, and Amazon Web Services clamor to host the cloud needs of their Silicon Valley neighbors and social media darlings from Pinterest to Pulse, Facebook has been building out its own datacenter footprint over the last few years.

Emphasizing inspirations from open source to energy efficiency, the world's largest social network shared its latest updates with ZDNet, revealing cost and power savings attributed to its cutting-edge datacenter designs.

Over the past three years, Facebook boasted, it has saved more than $1.2 billion by optimizing its full stack: the datacenter, hardware, and software.

Pictured above, Facebook's flagship datacenter building in Prineville, Oregon was constructed with 950 miles of wire and cable — touted to be equivalent to the distance between Boston and Indianapolis.

The Prineville facility is made out of 1,560 tons of steel, equal in weight to approximately 900 mid-size cars.

All in all, if you stood Prineville's Building 1 (with a footprint of roughly 332,930 sq. ft.) on its end, it would be as tall as an 81-story building.

Prineville was Facebook's first datacenter deployed using Open Compute Project designs. When it started serving traffic, Facebook said it was 38 percent more energy-efficient than its leased capacity at the time, lowering operational costs by up to 24 percent.
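For a sense of what those percentages translate to, here is a minimal back-of-envelope sketch in Python; the baseline energy use and operating cost below are hypothetical placeholders, not Facebook's numbers, and only the 38 percent and 24 percent figures come from the company's claims.

    # Back-of-envelope illustration of the efficiency figures quoted above.
    # Baseline values are assumptions, not Facebook data.
    BASELINE_ANNUAL_KWH = 100_000_000       # assumed annual energy use of comparable leased capacity
    BASELINE_ANNUAL_OPEX_USD = 10_000_000   # assumed annual operating cost of that capacity

    EFFICIENCY_GAIN = 0.38  # "38 percent more energy-efficient" (read as: same work on 1/1.38 of the energy)
    OPEX_REDUCTION = 0.24   # operational costs lowered "by up to 24 percent"

    energy_used_kwh = BASELINE_ANNUAL_KWH / (1 + EFFICIENCY_GAIN)
    opex_usd = BASELINE_ANNUAL_OPEX_USD * (1 - OPEX_REDUCTION)

    print(f"Energy: {energy_used_kwh:,.0f} kWh/yr "
          f"(saves {BASELINE_ANNUAL_KWH - energy_used_kwh:,.0f} kWh/yr vs the leased baseline)")
    print(f"Opex:   ${opex_usd:,.0f}/yr "
          f"(saves ${BASELINE_ANNUAL_OPEX_USD - opex_usd:,.0f}/yr vs the leased baseline)")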

The Facebook Altoona datacenter campus spans 202 acres, described as 42 acres larger than Disneyland.

Fun fact: If you had enough ping pong balls, Facebook estimated it could fit 6.4 billion of them in Altoona Data Center Building One.

Altoona 1 plans were first unveiled more than a year ago. Since then, more than 460 people have worked on the project, logging more than 435,000 hours in the ongoing construction of the 476,000-square-foot building.

Pending local council approval, Facebook plans to break ground on a second datacenter building designed to mirror the first, aptly named Altoona 2.

The rural 160-acre campus in Forest City, N.C., opened in 2012, taking the building blocks of the Open Compute Project to a new level.

Thanks to design efficiencies attributed to OCP, Facebook said it saved $1.2 billion in infrastructure costs, along with enough energy to power 40,000 homes for a year and the carbon equivalent of taking 50,000 cars off the road.
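To put those comparisons into raw units, a quick sketch (using assumed U.S. averages for per-home electricity use and per-car emissions, not figures published by Facebook) looks like this:

    # Translate the article's comparisons into raw units.
    # Per-home and per-car figures are assumed U.S. averages used only for scale.
    KWH_PER_HOME_PER_YEAR = 10_800       # assumed average household electricity use
    TONNES_CO2_PER_CAR_PER_YEAR = 4.6    # assumed average passenger-vehicle emissions

    homes = 40_000   # "enough energy to power 40,000 homes for a year"
    cars = 50_000    # "taking 50,000 cars off the road"

    energy_saved_gwh = homes * KWH_PER_HOME_PER_YEAR / 1_000_000
    co2_avoided_tonnes = cars * TONNES_CO2_PER_CAR_PER_YEAR

    print(f"Energy saved: ~{energy_saved_gwh:,.0f} GWh")            # roughly 432 GWh
    print(f"CO2 avoided:  ~{co2_avoided_tonnes:,.0f} metric tons")  # roughly 230,000 t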

To demonstrate how these energy and cost savings happen, Facebook explained that it reuses server heat, capturing a portion of the excess and using it to warm office space during colder months.
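The arithmetic behind the idea is straightforward, since nearly all of the electricity a server draws ends up as heat. A minimal sketch, assuming a hypothetical data hall load, capture fraction, and office heating demand (none of these are Facebook figures):

    # Rough sketch of the heat-reuse idea: servers reject essentially all of
    # their electrical draw as heat, so even a small captured slice goes far.
    IT_LOAD_KW = 10_000            # assumed server load in a data hall
    CAPTURE_FRACTION = 0.05        # assume a small slice of exhaust heat is reused
    OFFICE_DEMAND_W_PER_SQM = 50   # assumed office heating demand (W per square metre)

    captured_heat_kw = IT_LOAD_KW * CAPTURE_FRACTION
    heatable_area_sqm = captured_heat_kw * 1_000 / OFFICE_DEMAND_W_PER_SQM

    print(f"Captured heat: {captured_heat_kw:,.0f} kW, "
          f"enough for roughly {heatable_area_sqm:,.0f} square metres of office space")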

An evaporative cooling system evaporates water to cool the incoming air, as opposed to traditional chiller systems that rely on more energy-intensive equipment. The process is championed as highly energy efficient, and because outside air does much of the cooling, water consumption stays low.
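A minimal sketch of why evaporation does so much of the work, assuming a hypothetical heat load and the standard latent heat of vaporization of water:

    # Evaporative-cooling arithmetic: heat is carried away by evaporating water,
    # so the electricity bill is mostly fans and pumps rather than compressors.
    HEAT_LOAD_KW = 10_000          # assumed heat to remove on a warm day (kJ per second)
    LATENT_HEAT_KJ_PER_KG = 2_260  # energy absorbed per kilogram of water evaporated

    water_kg_per_s = HEAT_LOAD_KW / LATENT_HEAT_KJ_PER_KG
    water_litres_per_hour = water_kg_per_s * 3_600

    print(f"Evaporating ~{water_kg_per_s:.1f} kg/s of water "
          f"(~{water_litres_per_hour:,.0f} L/h) rejects {HEAT_LOAD_KW:,} kW of heat")

On cooler days the outside air alone does the job, which is how the approach keeps water consumption down over a full year.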

The Forest City center runs 100 percent on outdoor air, saving on the heating and cooling costs of power-hungry air handlers.

When in Sweden, Facebook is evidently doing as the Swedes do. Facebook's first facility abroad is (no joke) taking an Ikea-like approach with its own out-of-the-box, prefab datacenter blueprint.

Dubbed Rapid deployment datacenter design (RDDC), the guide takes modular and lean construction principles and applies them at the scale of a Facebook datacenter.

The RDDC design is based on two concepts. The first, a "chassis" approach, builds a structural frame before all the components, from lighting to cabling, are attached on an assembly line in a factory; the entire assembly is then driven to the building site on the back of a flatbed truck. The second packs the datacenter's walls into flat, easily transportable panels, Ikea-style, as Facebook describes below.

Facebook believes this will enable it to deploy two data halls in the time it previously took to deploy one while also cutting back greatly on the amount of material required for construction.

Facebook design engineer Marco Magarelli admitted in a blog post back in March that the RDDC design actually started out as a hack.

"Our previous datacenter designs have called for a high capacity roof structure that carries the weight of all our distribution and our cooling penthouse; this type of construction requires a lot of work on lifts and assembly on site," Magarelli wrote. "Instead, as Ikea has done by packing all the components of a bookcase efficiently into one flat box, we sought to develop a concept where the walls of a datacenter would be panelized and could fit into standard modules that would be easily transportable to a site."

Since Facebook started deploying its open hardware, the social network estimated it has saved enough energy to power more than 40,000 homes for a year.

Supported by the power of datacenters like these, Facebook noted it sees an average of six billion likes per day alone. Over the last 10 years, Facebook's datacenters have seen more than 400 billion photos shared and 7.8 trillion messages sent.

Facebook is currently testing the chassis approach at its second building under construction at the Luleå campus. Spanning about 125,000 sq. ft., it will be the first Facebook datacenter building to feature the RDDC design upon completion.