I’m pleased to present part 1 of A day at The Data Center: Behind the scenes with James and Ivan.
Last week, I had the chance to catch up with our resident technician, Ivan Manchan, for a behind-the-scenes look at one of our Los Angeles data centers.
I’ve been writing about colocation and related services for nearly a year now and was happy to see firsthand how a world-class data center is set up and properly maintained. It’s no easy task, but Ivan takes great pride in his work and does a fantastic job proactively monitoring every aspect of the data center, ensuring our clients stay online and connected to their servers 24/7/365. As you would guess, the data center is quite noisy (thanks to all those fans and air conditioners), well maintained, and properly wired for optimal bandwidth.
Without further ado, let’s get started. Please excuse the video quality, as I shot this on an iPhone 4 and am not quite up to snuff on my video editing skills. Nevertheless, our video tour with Ivan demonstrates how server aisles are separated for effective cooling and how miles of delicate fiber optic cabling are safely routed alongside traditional copper cabling. Pay close attention to the yellow trays and blue wiring routed through the ceiling. It’s all about connections – literally.
Data center cooling and backup power are two things well worth stressing over, and with good reason: computer equipment generates a fair amount of heat, and when you have a room full of machines all giving off heat, you’d better have a well-designed cooling system to keep things under control.
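To get a feel for why cooling is such a big deal, here’s a back-of-envelope sketch using the standard sensible-heat relation (heat removed = mass flow × specific heat of air × temperature rise). The rack wattage and temperature figures below are my own illustrative assumptions, not this facility’s specs:

```python
# Rough airflow needed to carry away a rack's heat, via Q = m_dot * c_p * dT.
# Air properties are typical sea-level values; rack figures are illustrative.

AIR_DENSITY = 1.2          # kg/m^3, approximate density of room-temperature air
AIR_SPECIFIC_HEAT = 1005.0 # J/(kg*K), specific heat of air at constant pressure

def required_airflow_m3s(heat_watts: float, delta_t_c: float) -> float:
    """Volumetric airflow (m^3/s) needed to remove heat_watts of equipment
    heat while letting the air warm by delta_t_c degrees front to back."""
    mass_flow = heat_watts / (AIR_SPECIFIC_HEAT * delta_t_c)  # kg/s
    return mass_flow / AIR_DENSITY                            # m^3/s

# Example: a hypothetical 5 kW rack with a 10 C front-to-back temperature rise
flow = required_airflow_m3s(5000, 10)
print(f"{flow:.2f} m^3/s (~{flow * 2118.88:.0f} CFM)")
```

Even at these modest numbers, one rack needs hundreds of cubic feet of air per minute moving through it, which is why those fans are so loud and why keeping the cold supply air separated from the hot exhaust matters.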
Traditionally, dedicated servers are connected to a meet-me room and other important switch points from the back of the rack. Incidentally, the rear side of a server rack is also where the most heat is given off.
This particular facility uses what is called hot aisle/cold aisle containment. The fronts of the server racks face the cold aisle, and industrial-strength fans direct cold air from the AC units through it. This forces the hot air out the back of the racks, where it is contained in the hot aisle and pushed by the fans into a return air-intake system.
The hot air is pumped back through the AC system, cooled, and the whole process repeats. Pushing cold air through the front of the rack helps keep the heat contained in the hot aisle. Without this cooling system, servers would overheat and performance would suffer. On a side note, hot aisle/cold aisle containment differs from a raised-floor cooling system; raised-floor cooling takes a different approach, but we’ll save that for a later discussion. On to the backup power supply, otherwise known as a UPS (uninterruptible power supply). Check out the video below for a closer look:
Uninterruptible Power Supply
The UPS (uninterruptible power supply) is essentially a huge battery. Like Doc Brown said in Back to the Future, “This sucker’s electrical.” The UPS stores electricity and kicks in whenever a power outage is detected.
The UPS differs from a diesel generator in that it requires no fuel to function, but it serves much the same purpose: it adds redundancy if an outage does occur.
Thankfully, the UPS sees little to no action. Even when it does kick in, it does so with such speed and precision that network latency is hardly noticeable. The UPS takes up quite a bit of space inside our facility, but it’s pretty inconspicuous. Whatever you do, don’t flip that on-off switch!
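If you’re curious how long a bank of batteries like that can carry a facility, the math is simple: usable stored energy divided by the load it’s supporting. The capacity, load, and efficiency figures below are hypothetical round numbers for illustration, not our facility’s actual ratings:

```python
# Rough UPS runtime estimate: usable battery energy divided by the load.
# All figures here are illustrative assumptions, not real facility specs.

def ups_runtime_minutes(capacity_wh: float, load_w: float,
                        efficiency: float = 0.9) -> float:
    """Estimated runtime in minutes for a battery bank of capacity_wh
    watt-hours feeding a load of load_w watts, derated by inverter
    and conversion losses (efficiency)."""
    usable_wh = capacity_wh * efficiency
    return usable_wh / load_w * 60  # hours -> minutes

# Example: a hypothetical 100 kWh battery bank backing a 50 kW load
print(f"{ups_runtime_minutes(100_000, 50_000):.0f} minutes")
```

A UPS typically only needs to bridge the gap for a few minutes anyway: long enough for diesel generators to spin up and take over, which is why its role is speed of switchover rather than long runtime.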
Stay tuned for part II of A day at The Data Center: Behind the scenes with James and Ivan.