Ken Patchett ran Google’s Asian data centers for more than a year and a half, and he says it’s “B.S.” that the company treats its computing facilities as trade secrets jealously guarded from the rest of the world.
He actually writes the letters in the air with his finger. The B. And then the S.
Web giants like Google and Amazon are notoriously secretive about what goes on inside the worldwide network of data centers that serve up their sweeping collection of web services. They call it a security measure, but clearly, they also see these facilities as some sort of competitive advantage their online rivals mustn’t lay eyes on. When he joined Google, Ken Patchett — like so many other Googlers — signed an agreement that barred him from discussing the company’s data centers for at least a year after his departure, and maybe two.
But after leaving Google to run Facebook’s new data center in the tiny Northwestern town of Prineville, Oregon, Patchett says the security argument “doesn’t make sense at all” — and that data center design is in no way a competitive advantage in the web game. “How servers work has nothing to do with the way your software works,” he says, “and the competitive advantage comes from manipulating your software.”
It’s hard to argue with him. He just spent the better part of the afternoon giving us a walking tour of Facebook’s newest data center — from the rows upon rows of extra-efficient machines that serve up the company’s social networking site to the “penthouse” that lets the company cool its facility with outside air rather than burn electricity on the mammoth water chillers traditionally used by the world’s data centers. When Facebook turned on its Prineville data center this past spring, it also “open sourced” the designs for the facility and its custom-built servers. Patchett is merely extending this willingness to share.
For Patchett, Facebook is trying to, well, make the world a better place — showing others how to build more efficient data centers and, in turn, put less of a burden on the environment. “The reason I came to Facebook is that they wanted to be open,” says Patchett.
“With some companies I’ve worked for, your dog had more access to you than your family did during the course of the day. Here [at Facebook], my children have seen this data center. My wife has seen this data center…. We’ve had some people say, ‘Can we build this data center?’ And we say, ‘Of course, you can. Do you want the blueprints?’”
‘The Tibet of North America’
Facebook built its data center in Prineville because it’s on the high desert. Patchett calls it “the Tibet of North America.” The town sits on a plateau about 2,800 feet above sea level, in the “rain shadow” of the Cascade Mountains, so the air is both cool and dry. Rather than use power-hungry water chillers to cool its servers, Patchett and company can pull the outside air into the facility and condition it as needed. If the air is too cold for the servers, they can heat it up — using hot air that has already come off the servers themselves — and if the outside air is too hot, they can cool it down with evaporated water.
In the summer, Prineville temperatures may reach 100 degrees Fahrenheit, but then they drop back down to the 40s in the evenings. Eric Klann, Prineville’s city engineer, whose family goes back six generations in central Oregon, says Facebook treats its data center much like the locals treat their homes. “Us country hicks have been doing this a long time,” says Klann, with tongue in cheek. “You open up your windows at night and shut them during the day.”
The added twist is that Facebook can also cool the air during those hot summer days.
All this is done in the data center’s penthouse — a space the size of an aircraft carrier, split into seven separate rooms. One room filters the air. Another mixes in hot air pumped up from the server room below. A third cools the air with atomized water. And so on. With the spinning fans and the never-ending rush of air, the penthouse is vaguely reminiscent of the room with the “fizzy lifting drinks” in Willy Wonka & the Chocolate Factory, where Charlie Bucket and Grandpa Joe float to the ceiling of Wonka’s funhouse. It’s an analogy Patchett is only too happy to encourage.
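Reduced to its essentials, the penthouse is a simple decision tree: take outside air, then heat it, cool it, or pass it through depending on the temperature. Here’s a minimal sketch of that logic — the setpoints and function names are hypothetical illustrations, not Facebook’s actual figures:

```python
# Illustrative sketch of the free-air cooling approach described above:
# condition outside air with server exhaust heat or evaporative cooling
# instead of running traditional water chillers.
# The setpoints below are hypothetical, chosen only for illustration.

SUPPLY_MIN_F = 65.0  # hypothetical coldest air we'd send to the servers
SUPPLY_MAX_F = 80.0  # hypothetical hottest air we'd send to the servers

def condition_air(outside_f: float) -> str:
    """Pick a conditioning step for one pass through the penthouse."""
    if outside_f < SUPPLY_MIN_F:
        # Too cold: mix in hot exhaust air pumped up from the server room.
        return "mix in server exhaust heat"
    if outside_f > SUPPLY_MAX_F:
        # Too hot: cool the air with atomized (evaporated) water.
        return "evaporative cooling"
    # In range: pass the filtered outside air straight through.
    return "pass through as-is"
```

On a 100-degree Prineville afternoon this sketch picks evaporative cooling; on a 40-degree evening it reaches for server exhaust heat — the same open-the-windows-at-night rhythm Klann describes, just automated.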
You might say that Facebook has applied the Willy Wonka ethos to data center design, rethinking even the smallest aspects of traditional facilities and building new gear from scratch where necessary. “It’s the small things that really matter,” Patchett says. The facility uses found water to run its toilets. An Ethernet-based lighting system automatically turns lights on and off as employees enter and leave areas of the data center. And the company has gone so far as to design its own servers.
Cade Metz is the editor of Wired Enterprise. Got a NEWS TIP related to this story -- or to anything else in the world of big tech? Please e-mail him: cade_metz at wired.com.