Data Centers Sprouting Up in Oregon
by Associated Press
Canadian Business, November 20, 2011
PORTLAND, Ore. -- Data centers arrived in Oregon five years ago, cloaked in mystery. There's no hiding them now.
From Umatilla to Prineville to Hillsboro, server farms are sprouting across the state. They are the physical manifestations of the cloud that hosts your free Gmail, movie streams and Facebook friends. And though not big employers -- computers do all the heavy lifting -- their technology and investment are nonetheless beginning to transform the rural communities where they operate.
This is Oregon's newest industry and, by some measures, may soon be among its biggest. Only chip-makers spend more on their Oregon facilities.
They're here for the cheap power -- a big data center can gobble up more electricity than a small town -- and the mild climate that keeps their hardworking computers cool. Above all, they're here for tax breaks that make Oregon a relative bargain for companies that can spend $1 billion or more on a single facility.
The biggest names on the Web -- Facebook, Google and Amazon -- are here, and others soon will be. Three Silicon Valley companies announced plans last month to build in Hillsboro, and at least four other builders, shrouding their identities behind code names such as "Maverick," are scouting sites in eastern and central Oregon.
"The assumption is that with the Internet, place no longer matters," said Andrew Blum, a Wired magazine correspondent who's just written a book on the Web's inner workings called "Tubes: A Journey to the Center of the Internet."
The reality, he said, is that location matters as much as ever. Innovations honed in The Dalles and Prineville are now being replicated from South Carolina to Sweden.
"These are the hot rods," Blum said. "These are the most efficient and in some ways the most technologically advanced data centers around."
Worldwide, companies will spend nearly $100 billion this year to equip these mammoth facilities, 12 percent more than in 2010, according to Gartner Inc. The Stamford, Conn.-based researcher forecasts that tab will climb another 25 percent by 2015 as more and more data moves online.
Behind the curtain
With annual energy bills in the millions and even tens of millions of dollars, data center operators are in a constant pursuit of more efficient designs to reduce consumption and contain costs. Initially, companies were very secretive about their Oregon facilities, hiding details of their technology to gain a competitive advantage and to avoid negative publicity over their energy use.
But as the technology has advanced, they've reversed field.
Facebook was the first to lift the veil.
"What the user sees, and what's happening behind the scenes, is dramatically different," said Thomas Furlong, Facebook's director of site operations, who oversaw development of the Palo Alto, Calif.-based company's first data center, in Prineville.
The high-desert city of 9,250 is a central, albeit invisible, node in Facebook's network of 800 million members.
The Prineville facility is in a long building perched atop the sunbaked bluff overlooking the tire-and-timber town. Inside, it's silent and dim as Angie Weatherman, 29, strolls through the racks tending to the computers. In a mild nod to aesthetics, the panels separating the servers are the same distinctive blue that borders every Facebook profile.
Just 55 people work at the facility, half of them security. So most of that precious data load is managed by engineers in California who monitor the servers, dispatching on-site technicians like Weatherman when they spot trouble.
A blustery wind swirls about Weatherman on her solitary walk, the gusts lifting her long brown hair as she turns to enter a narrow hallway. Fluorescent lights silently click on above, illuminating two racks of computer servers on each side. Weatherman used to manage computers for Les Schwab before the tire giant moved its headquarters to Bend. Now she proceeds to a rack of servers in need of maintenance.
Facebook has assumed a high profile in Prineville, said Weatherman, who works for a contractor Facebook hired to maintain its servers. In a town with few options for techies, she said, the company's provided a welcome bridge following Les Schwab's exit.
"It's a great opportunity," she said. "I know I was really excited."
Packed into a building the length of a battleship, tens of thousands of computers hold the intimacy of the information age: photos of new babies alongside status updates that capture heartbreak and triumph.
"For me, the Facebook data center is the repository of some of the most meaningful bits," said Blum, the Wired writer. "It's everything Facebook is on a good day, on its most emotional moments."
Each company's data center works differently, but Facebook's is distinctive in several ways.
It made a substantive departure in design, emphasizing simplicity instead of the complicated systems capable of handling vast amounts of information round-the-clock.
The company distributes its data across multiple sites, reducing its dependence on any one facility and enabling a less complicated, less expensive format at each site, an approach sketched below.
It opted for spare concrete floors, a low-tech cooling system and bare-bones computers that it custom-builds itself.
"Complexity creates waste," said Ken Patchett, who manages the Prineville facility. "The important thing about our data center is what's not here."
The cooling system is central to Facebook's approach. Computers generate enormous heat, especially when dozens are packed tightly together as they are inside data centers, and those temperatures can cause them to overheat and crash, taking a data center offline. Most data centers keep them cool with conventional air conditioning, which requires a lot of electricity and a lot of water.
But Facebook's system is something like an old-fashioned swamp cooler, and one uniquely suited to Prineville's climate.
"The cheapest cooling system you'll get is free air. Moving air is much easier than moving water," said Jay Park, Facebook's director of data center design.
Enormous fans suck in the city's dry desert air and -- in long corridors above the data center -- funnel it through misters that add moisture. Those droplets evaporate and the air cools as it moves through the data center, directed by furious winds blowing through the building.
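Why the dry desert air matters comes down to basic evaporative-cooling physics: misted air can be cooled only toward its wet-bulb temperature, and dry air has a low one. A rough sketch, using hypothetical temperatures and a typical cooler effectiveness rather than Facebook's actual figures or control logic, shows the effect:

```python
# Rough evaporative-cooling arithmetic -- illustrative only, not
# Facebook's control software. A direct evaporative cooler can bring
# air only partway toward its wet-bulb temperature.

def evaporative_outlet_temp(dry_bulb_c: float, wet_bulb_c: float,
                            effectiveness: float = 0.85) -> float:
    """Outlet air temperature after misting, in degrees Celsius.

    effectiveness is the fraction of the dry-bulb/wet-bulb gap the
    misters close; 0.8 to 0.9 is a common assumption for such systems.
    """
    return dry_bulb_c - effectiveness * (dry_bulb_c - wet_bulb_c)

# Hypothetical hot, dry high-desert afternoon: a wide wet-bulb gap.
print(evaporative_outlet_temp(35.0, 18.0))  # ~20.6 C, cool enough for servers
# The same heat in humid air barely cools at all.
print(evaporative_outlet_temp(35.0, 31.0))  # ~31.6 C
```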
The data center boasts another innovation: reducing misspent energy.
A conventional data center wastes as much as a quarter of its energy simply converting electricity from the power grid for use in the data center. But Park dreamed up a solution, inspiration striking as he lay in bed one night.
His design, sketched on a napkin, called for Facebook's custom servers to run at a higher voltage, drinking their power straight from the grid instead of converting the energy back and forth to power the servers and the cooling system.
By eliminating the power conversion, he said, the facility loses just 7.5 percent of its energy.
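The two loss figures in the article make for simple arithmetic. The short sketch below compares how much grid power actually reaches the computers under each loss rate; the 10-megawatt facility size is assumed purely to make the comparison concrete, not a reported Prineville number.

```python
# Back-of-the-envelope comparison of the conversion-loss figures cited
# in the article: up to 25 percent in a conventional data center versus
# 7.5 percent in Park's design. The facility size is a made-up example.

def power_reaching_servers(grid_kw: float, loss_fraction: float) -> float:
    """Power left for computing after conversion losses, in kilowatts."""
    return grid_kw * (1.0 - loss_fraction)

GRID_KW = 10_000  # assumed 10 MW grid draw, for illustration only

conventional = power_reaching_servers(GRID_KW, 0.25)   # 7,500 kW
streamlined = power_reaching_servers(GRID_KW, 0.075)   # 9,250 kW
print(f"Extra usable power: {streamlined - conventional:,.0f} kW")  # 1,750 kW
```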
"We've eliminated unnecessary components," said Park, whose midnight sketch hangs inside the data center. "All that stuff we don't need, we took it all out." Building from scratch, Facebook didn't know for sure that its new designs would perform.
"As engineers, we believe it's going to work," Furlong said. "As businesspeople, we build in all sorts of contingencies, backup plans."
"Meaning," Park rejoins with a grin, "he doesn't trust us."
Facebook imagined a variety of potential hazards, from central Oregon wildfires to volcanoes in the Cascades. Because of its distributed network, though, Facebook concluded it could withstand outages at Prineville or other individual sites.
But not if it doesn't see them coming.
"Unexpected is painful," Furlong said, grimacing. "The software systems can't really respond that quickly to a straight outage."
Massive diesel generators out back (again, colored Facebook blue inside their sheds) stand sentinel, prepared to chug for a day or two if the power goes offline.
"We have more ways to not lose data than you can imagine," Furlong said. "That, to us, is one of the inviolate things."
Prineville has become Facebook's template as it shifts its data from leased facilities to company-owned centers deployed around the globe. The Oregon design is being replicated at its second company-owned center in North Carolina, as well as at a third facility, announced last month, in Sweden.
Other companies are taking note, too. Apple engineers toured the Prineville center last summer, and Facebook is publicizing many of its innovations in hopes of bringing down operating costs and energy use across the industry.
That open approach, as much as the design innovations, is what distinguishes the company, according to Jon Koomey, a consulting professor at Stanford University who studies data centers and the environmental effects of technology.
Google and other big data center companies had already attained energy savings comparable to Facebook's, Koomey said. Given that energy savings are no longer a key competitive advantage for big data companies, he said Facebook and others can have a significant impact by sharing their discoveries with other organizations -- saving cost and energy across the board.
"Facebook is the first to really open the kimono," he said. "They've explained what they're doing with the explicit goal of changing current practice."