When blood transfusions become shock therapy…
Computers are pretty advanced, we all know that. But as they become more and more advanced, they require more power to operate, and therein lies the problem. So researchers at IBM are working on a way to power and cool computers at the same time via what they call ‘electronic blood’.
If I could interject for a moment: HELL YEAH.
This is cool—like, really cool. The IBM research team is attempting to mimic the human brain and its complex system of blood vessels and apply the same idea to computers. The electronic blood is a fluid that gets electrically charged and then pumped to the computer’s processors, where it simultaneously cools and powers them.
IBM is working on this because of what the future holds: in a few years, we’ll have supercomputers able to do trillions of billions of calculations per second, and that kind of processing power takes a lot of energy—enough energy that we wouldn’t be able to supply it without ruining the world in one fell swoop. Today’s supercomputers generate so much heat that processors can’t be placed right next to each other for fear of melting into oblivion (making the machines a ridiculous size), so this electronic blood could—in theory—shrink a supercomputer down to the size of a desktop.
This could also slash the cooling capacity data centers need. With over 2% of the country’s energy usage going directly to data centers, it makes sense to explore electronic blood as an alternative way to cool and power data centers and the hardware within them.
But duuuuuuude, electronic blood? That’s some Terminator-level stuff right there. I’m not sure I want to live in a world where robots DON’T bleed electronic blood. The bad part is that we’re a few decades away from electronic blood being readily available, so the singularity is still a ways off.
For more information, contact Chris L.