It’s a new world. At the push of a button, we can access all our old photos or visit any website at any time of the day or night, but all of that comes at a cost: it takes a lot of energy.
In fact, as far back as 2010, it was estimated that data centers in the US used about 76 billion kilowatt-hours of electricity. According to the DOE, about 3 percent of national energy consumption goes toward powering these data centers, and that amount is rapidly growing.
With mounting electricity bills and growing demand for faster processing and ever more server space, companies are looking for new ways to bring their costs down and their efficiency up.
Here are four ways data centers are raising their efficiency by going green.
Where Do Data Centers Get Their Energy?
Massive data centers are turning away from using fossil fuels, and it’s not for the good karma. These days, green energy is becoming the cheap and efficient option.
For one thing, companies that use their own energy sources, rather than relying on the city’s power grid, are insulated from outages and price changes. They will always have a reliable source of energy.
Some companies build power stations with traditional energy sources, like natural gas and oil. But the trend seems to be leaning toward renewable energy sources such as solar, wind, and hydropower.
In fact, many large companies are choosing their data center locations in areas where renewable sources are optimal.
Facebook’s new data center is being built in Ireland because, as Tom Furlong, Facebook’s VP of infrastructure, says, “our data center in Clonee will be powered by 100 percent renewable energy, thanks to Ireland’s robust wind resources.”
Going green also allows these companies to sell the power they don’t use back to the grid and make money. It also allows them to stay ahead of governments implementing carbon taxes.
But using green energy sources doesn’t solve the whole problem. How that power is used counts just as much.
What Are Smart Servers?
Did you know that an iPhone can consume more energy than a refrigerator?
Digital Power Group CEO Mark Mills claims it can: “although charging up a single tablet or smart phone requires a negligible amount of electricity, using either to watch an hour of video weekly consumes annually more electricity in the remote networks than two new refrigerators use in a year.”
One reason for this inefficiency is what are known as “comatose servers.” These are servers that haven’t delivered any information or services in six months or more, but that draw power just like any other server that might need to be accessed at a given moment.
In fact, it’s estimated that 30 percent of the world’s servers are unused or empty. According to a study by Stanford and the Anthesis Group, running those comatose servers costs about $30 billion a year.
That’s a lot of wasted energy, but we may soon be able to save some of it. Through Facebook’s Open Compute Project, and other efforts like it, we are on the verge of having “smart servers.” These servers would track how much energy they consume and how much their computing actually needs, and use that information to power down, even for short stretches of time.
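To make the idea concrete, here is a minimal sketch of that power-down logic. All names, thresholds, and the in-memory fleet are illustrative assumptions, not Facebook’s actual Open Compute code.

```python
# Hypothetical sketch of the "smart server" idea: power down any server
# that has been idle past a threshold. Names and thresholds are assumed.

from dataclasses import dataclass

@dataclass
class Server:
    name: str
    idle_hours: float    # hours since this server last served a request
    powered_on: bool = True

IDLE_THRESHOLD_HOURS = 1.0   # assumed policy: sleep after an hour idle

def apply_power_policy(fleet):
    """Power down every server idle longer than the threshold."""
    for server in fleet:
        if server.powered_on and server.idle_hours > IDLE_THRESHOLD_HOURS:
            server.powered_on = False   # in reality, a management API call

fleet = [Server("web-1", 0.2), Server("batch-7", 6.0), Server("db-3", 0.0)]
apply_power_policy(fleet)
print([s.name for s in fleet if not s.powered_on])   # the comatose server
```

A real system would wake servers back up on demand, which is the hard part; the sketch only shows the bookkeeping side of the decision.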
When Facebook first published its findings, its new servers were already 38 percent more energy-efficient to build and 24 percent less expensive to run than the previous generation.
What Is Free Cooling?
But powering the servers is only half the battle. In fact, up to 60 percent of a data center’s energy bill comes from running the cooling systems.
Servers generate heat as they run around the clock, and if they get too hot they are liable to overheat and shut down, which is the last thing a data center wants. So a lot of time and energy goes into systems that keep servers cool.
At first, data centers simply packed server rooms with air conditioners, but since the servers were running 24 hours a day, cooling costs went through the roof.
So Google went with a simpler option: free cooling, which is just what it sounds like. The company builds data centers in cold places and lets Mother Nature act as the cooling system.
With free cooling alone, Google has reduced its global cooling energy bill by 12 percent. “It’s not quite as simple as just opening the windows,” says Dan Costello, Google’s global data center operations officer, “but it’s pretty close.”
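The core decision behind free cooling can be sketched in a few lines: use outside air when it is cold enough, and fall back to mechanical chillers otherwise. The temperatures and function names here are assumptions for illustration, not Google’s actual control logic.

```python
# Illustrative "free cooling" (air-side economizer) decision. The target
# temperature and margin below are assumed values, not real setpoints.

SUPPLY_AIR_TARGET_C = 24.0   # assumed target temperature at server intake
ECONOMIZER_MARGIN_C = 4.0    # outside air must be this much cooler to help

def cooling_mode(outside_temp_c):
    """Pick a cooling mode based on the outside air temperature."""
    if outside_temp_c <= SUPPLY_AIR_TARGET_C - ECONOMIZER_MARGIN_C:
        return "free-cooling"   # just move outside air through the hall
    return "chillers"           # too warm outside: run mechanical cooling

for temp in (-5.0, 15.0, 28.0):
    print(temp, cooling_mode(temp))
```

The point of building in a cold climate is that the first branch is taken almost all year, so the chillers rarely run.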
Is It Possible to Close the Energy Loop?
Now we are even seeing the first data center so efficient that it actually cleans the environment.
The EcoDataCenter project in Sweden will be the first “negative carbon emission data center” in the world. That means the facility doesn’t emit any carbon at all; it’s actually cleaning carbon from the air.
How is that possible? Bengt Gustafsson, CEO of Falu Energi & Vatten explains, “We are connecting the data center to an already sustainable energy system and can make use of all the energy. Thereby we are building the very first climate positive data center in the world.”
Think about it this way: data centers are energy transformation devices. They take in electrical energy and put out heat. In fact, more than 98 percent of the electricity they draw is turned into heat, which can be captured and reused rather than thrown away.
By reusing that heat, data centers will be able to supply energy that they sell back to the community. In effect, they will become power stations that benefit the community twice.