Can Facebook Show How to Reduce the Growing Energy Use of the Internet?
Computer design and server farm location could dramatically reduce the energy required to run Facebook and the rest of the Internet, cutting greenhouse gas emissions, too
By David Biello (Source: Scientific American)
More than 750 million users, 532 million kilowatt-hours of energy consumption and the attendant 285,000 metric tons of carbon dioxide: those are Facebook’s numbers for 2011.
That means, as the social networking company wrote in an August 1 Facebook post (naturally) releasing the data on energy use, that “one person’s Facebook use for all of 2011 had roughly the same carbon footprint as one medium latte. Or three large bananas. Or a couple of glasses of wine.” That’s 269 grams of CO2 per “active user,” and another invisible impact of the computing cloud.
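The figures above can be sanity-checked with some back-of-envelope arithmetic. The sketch below uses only the numbers reported in this article; note that the per-user result comes out higher than Facebook’s 269-gram figure, which suggests the company averaged over a larger "active user" base than the 750 million cited here.

```python
# All inputs are figures quoted in the article.
energy_kwh = 532e6       # Facebook's 2011 electricity use, kilowatt-hours
co2_tonnes = 285_000     # reported emissions, metric tons of CO2
users = 750e6            # "more than 750 million users"

# Implied carbon intensity of the electricity mix, kg CO2 per kWh
intensity = (co2_tonnes * 1000) / energy_kwh
print(f"{intensity:.2f} kg CO2/kWh")        # ≈ 0.54

# Naive per-user footprint if divided evenly across 750 million users
grams_per_user = (co2_tonnes * 1e6) / users
print(f"{grams_per_user:.0f} g CO2/user")   # ≈ 380
```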
But that cloud has a very tangible physical impact. Although the individual number may sound small, when added up, Facebook’s—and the world’s—use of row after row of computer servers stored on racks in massive, refrigerated, windowless warehouses in places like Prineville, Ore., and Forest City, N.C., consumes a growing share of the globe’s energy. For example, to keep Amazon ever ready to take an order, rack after rack of computers in a data center is chilled below 21 degrees Celsius. There are now more than 500,000 data centers worldwide, hosting the bulk of the world’s more than 32 million individual servers. Server farms, according to data center expert Jonathan Koomey of Stanford University, now account for roughly 1.5 percent of global electricity use, or about 300 billion kilowatt-hours of electricity per year. Google’s data centers, for example, dwarf Facebook’s, using two billion kilowatt-hours per year as the world searches for the latest article on server energy use.
That makes the Internet a larger emitter of greenhouse gases—230 million metric tons—than all the countries of Scandinavia put together.
Internet companies, of course, are not looking for massive energy bills—or catastrophic climate change. Wringing the most energy efficiency out of such cloud computing has become an important part of a company like Facebook’s profitability—and cooling all those computers remains the single largest use of energy for these companies.
The only thing that has kept servers from sucking up ever more energy is a little-known corollary of Moore’s law: over the past 65 years, the number of computations that can be done per kilowatt-hour of electricity used has doubled every 1.6 years, according to Koomey’s research. But the small server rooms, or even closets, employed by smaller companies the world over typically lack the most efficient cooling and can use up to twice as much electricity per computation as the machines run by many large computing companies.
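The compounding effect of that doubling is easy to underestimate. A minimal sketch, using only the 1.6-year doubling period from Koomey’s research cited above:

```python
# Koomey's observation: computations per kWh double every 1.6 years.
DOUBLING_PERIOD_YEARS = 1.6

def efficiency_factor(years):
    """Multiple by which computations-per-kWh grew over `years`."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

# Over the 65 years in question, that compounds to a factor in the
# trillions -- roughly 1.7e12.
print(f"{efficiency_factor(65):.2e}")
```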
One idea to cut down on energy use in server rooms (as well as large server farms) is to simply raise the temperature such servers operate in. “Why are data centers cooled to 18 to 21 degrees Celsius? People are concerned about reliability,” says Charles Rego, Intel’s chief architect for high-density data centers and cloud infrastructure. But today’s servers can comfortably operate at as much as 27 degrees C, and Intel specifies that its chips must tolerate up to 35 degrees C without a loss in performance. “For every degree Celsius you move, it’s 4 to 5 percent energy savings,” he notes.
If server farm temperatures were raised by just 5 degrees C, globally, 1.7 million metric tons of CO2 emissions could be avoided at present levels of usage (as well as enough energy saved to power Taiwan for a month). That is about the amount of CO2 sequestered by 43 million trees growing for a decade—or roughly a new Nordic forest covering Scandinavia, according to Intel.
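How Rego’s rule of thumb adds up over a 5-degree raise can be sketched as follows. Whether the savings compound per degree or add linearly is an assumption here, not something the article specifies; the compounding version is shown, with a midpoint value of 4.5 percent per degree.

```python
# Hypothetical illustration of "4 to 5 percent energy savings" per
# degree Celsius, compounded over the temperature increase.
def savings_fraction(degrees, per_degree=0.045):
    """Fraction of cooling energy saved, compounding per degree."""
    return 1 - (1 - per_degree) ** degrees

# A 5-degree raise yields roughly a fifth of the cooling energy saved.
print(f"{savings_fraction(5):.1%}")   # ≈ 20.6%
```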
Another step is to change the layout of the microprocessors—each one heating up as it computes—inside a server. They can be arranged so that they are not lined up, which results in each preheating the other, but rather are spaced to allow for cooling airflow.
The architecture of the server buildings themselves can help, too. Facebook’s new server farm in Prineville, for example, cools itself completely with the surrounding air, which has itself been cooled through evaporation rather than an air-conditioning chiller employing ozone-destroying and greenhouse-exacerbating chlorofluorocarbons or a cooling tower. And data centers are also learning how to shrink and grow on demand—meaning more or fewer computers are on at any given time. “Most servers in the U.S. or the world are very underutilized,” notes senior engineer Pierre Delforge of the Natural Resources Defense Council, who has been helping to reduce this large source of electricity consumption. Many servers run at as little as 5 percent of capacity, or 15 to 20 percent at best, while running inefficient software code that, in some cases, was programmed 50 years ago.
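Why such low utilization is so wasteful can be illustrated with a simple power model. The idle and peak wattages below are hypothetical values chosen for illustration, not figures from the article; the key assumption, which is widely observed, is that an idle server still draws a large fraction of its peak power.

```python
# Hypothetical server: draws 150 W idle, 250 W at full load,
# with a linear ramp in between (an illustrative assumption).
IDLE_WATTS, PEAK_WATTS = 150.0, 250.0

def power_draw(utilization):
    """Power draw in watts at a given utilization (0.0 to 1.0)."""
    return IDLE_WATTS + (PEAK_WATTS - IDLE_WATTS) * utilization

# Twenty servers at 5 percent utilization vs. one server doing
# the same total work at full utilization.
scattered = 20 * power_draw(0.05)      # 3100.0 W
consolidated = power_draw(20 * 0.05)   # 250.0 W
print(scattered, consolidated)
```

Under these assumptions, consolidating the same workload onto one fully used machine cuts the power draw by more than a factor of ten, which is why on-demand shrinking and growing matters so much.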
It is not just power and money at stake. Data centers employ some 80 billion gallons of water for cooling annually, according to Intel. If the chipmaker can figure out how to operate at temperatures above 40 degrees C, “it gets rid of water,” Rego says.
In the end, though, simply building such server farms in places that are naturally cool and renewably powered—think Facebook opening such a warehouse in Luleå, Sweden, with its chilly air and abundant hydropower in 2014—may prove a like-able move. “When you can use cold air or, even better, cold water, you don’t have to make cold air or cold water through chillers and therefore save a significant amount of energy,” Delforge explains. “Data centers generate a lot of heat.” But with energy efficiency and proper siting, maybe all the hot air expressed on Web sites like Facebook can avoid exacerbating global warming.