Independence for Facebook’s open source hardware project

These days, it is not uncommon for large web companies to share the technologies they have built internally with the outside world. In fact, it is now almost expected of them.

Usually, this involves publishing code. Google, for example, developed MapReduce, a technique for breaking large analytical jobs into smaller tasks that can be executed in parallel, and published a research paper describing it.
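The idea can be illustrated with a word-count sketch. This is a minimal, illustrative rendering of the MapReduce pattern, not Google's or Hadoop's actual API: a map step emits key-value pairs, a shuffle groups them by key, and a reduce step aggregates each group, with each group independent and therefore parallelisable.

```python
from collections import defaultdict

def map_phase(documents):
    """Map step: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def shuffle(pairs):
    """Shuffle step: group the emitted values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce step: aggregate each key's values independently."""
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data big jobs", "big clusters"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts)  # {'big': 3, 'data': 1, 'jobs': 1, 'clusters': 1}
```

In a real deployment the map and reduce steps run on many machines at once, which is what makes the technique attractive for large analytical workloads.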

An engineer at Yahoo! used that paper to develop Hadoop, a framework for building MapReduce applications. Yahoo! later submitted Hadoop to the Apache Software Foundation.

Social networking giant Facebook in turn developed Cassandra, a distributed storage system that is commonly used in conjunction with Hadoop, which is now also open source.

But in April 2011, Facebook also took the unusual step of “open sourcing” the hardware specifications for the servers it assembles for use in its newer data centre facilities.

Under the banner of the Open Compute Project (OCP), Facebook shared information ranging from motherboard designs and power supply specifications to battery cabinet plans and cooling mechanics.

These are the specifications that Facebook used to kit out its recently opened data centre in Prineville, Oregon, which according to the company consumes 38% less energy than its existing data centres, costs 24% less to run, and has a PUE (power usage effectiveness) ratio of 1.05. The same specifications will also be used in three data centres Facebook has planned in Sweden.
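PUE is the ratio of a facility's total energy draw to the energy consumed by its IT equipment alone, so a ratio of 1.05 means just 5% overhead for cooling, power distribution and other non-computing loads. The figures in this sketch are illustrative assumptions, not Facebook's published numbers:

```python
# PUE = total facility energy / IT equipment energy.
# Illustrative example: a facility drawing 1.05 MW in total
# to power 1.00 MW of IT load has a PUE of 1.05.
it_load_mw = 1.00      # assumed IT equipment load
total_load_mw = 1.05   # assumed total facility draw
pue = total_load_mw / it_load_mw
print(round(pue, 2))   # 1.05
```

For comparison, a conventional data centre of the era typically ran at a PUE well above 1.5, which is why the 1.05 figure attracted attention.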

By opening up these specifications, Facebook hopes to derive the same “open innovation” benefits as those associated with open source software – when other organisations adopt them, they will refine and improve them, leading to better performance in its own facilities.

“We want you to tell us where we didn’t get it right and suggest how we could improve,” wrote Jonathan Heiliger, Facebook’s VP of technical operations at the time. “Opening the technology means the community will make advances that we wouldn’t have discovered if we had kept it secret.”

In October 2011, Facebook stepped up its campaign to encourage those advances. At an event in New York, the company handed over control of the project to a new organisation named the Open Compute Foundation.

This organisation’s board of directors includes Sun Microsystems co-founder Andy Bechtolsheim, Goldman Sachs managing director Don Duet, Rackspace COO Mark Roenigk and Jason Waxman, Intel’s general manager of high-density computing. Intel, ASUS, Dell, Baidu, Mozilla, Rackspace and Goldman Sachs have all signed up to the Foundation.

For Facebook, the advantages of having all these companies working to enhance its data centre infrastructure are obvious.

“Clearly, Facebook is hoping to reduce total cost of ownership [for its data centres] by finding the right mix of computational power, power consumption and cooling costs,” explains Ray Valdes, who covers Facebook for analyst company Gartner. “The Open Compute Foundation is about getting other minds from other companies to apply themselves to these problems.”

But what is in it for the suppliers? According to Valdes, participating in the Foundation presents an opportunity for companies to expand their knowledge and to enhance their brand.

Katie Broderick, IDC’s data centre and server analyst, believes the Foundation will be more about collaboration and innovation than about producing commercial products. “I don’t think you’re going to see people building ‘Facebook servers’ anytime soon, but it will be interesting to see some of the ideas that come out of the project,” she says.

As for other data centre operators hoping to catch a glimpse of how Facebook supports the second most visited site on the Internet, the Open Compute Foundation will be of limited use. Facebook’s special sauce is in its software, and that is still a closely guarded secret.

“A lot of large scale cloud providers aren’t running virtualisation software, but rather virtualised input-output in a big grid,” says Broderick. “I’m not necessarily saying that’s what Facebook is running, because we don’t know.”

Right now, there are few organisations that require quite the same scalability of web infrastructure as Facebook. However, the same might have been said for Hadoop when it was first deployed by Yahoo! in 2008. Today, Hadoop is attracting huge interest as companies seek new ways to analyse data.

“If the Open Compute Project is to be a success, I would see it following a similar path as Hadoop,” says Gartner’s Valdes.

Pete Swabey
