Data Center

Discoverable data centre infrastructure

David Cuthbertson of Square Mile Systems was kind enough to demonstrate his AssetGen software to Denis and me last week. Once the data has been entered into a CMDB like AssetGen, all sorts of very impressive reports can be generated very quickly. Implementing a CMDB involves a heavy up-front investment, because at least 50% of your infrastructure and its associated dependencies has to be entered by hand. The reason for that steep initial cost is that physical infrastructure in the data centre is invisible to auto-discovery software, so it cannot be discovered automatically in the way devices on the network can.
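As a rough illustration of why that manual effort pays off, here is a minimal sketch in Python (hypothetical asset names, nothing like AssetGen's actual schema) of the kind of dependency data a CMDB holds, and the impact report that falls out of it once the data is in:

# Minimal CMDB sketch: assets and "depends on" edges, entered by hand.
# Asset names are hypothetical; a real CMDB holds far more detail.
from collections import defaultdict

dependencies = {
    "www.example.com": ["web-server-01"],
    "web-server-01":   ["switch-a", "pdu-3"],
    "mail-server-01":  ["switch-a", "pdu-4"],
    "switch-a":        ["pdu-3"],
}

# Invert the graph: which assets rely on a given item?
dependents = defaultdict(list)
for asset, deps in dependencies.items():
    for dep in deps:
        dependents[dep].append(asset)

def impact(asset, seen=None):
    """Everything that directly or indirectly depends on `asset`."""
    seen = set() if seen is None else seen
    for parent in dependents.get(asset, []):
        if parent not in seen:
            seen.add(parent)
            impact(parent, seen)
    return seen

print(impact("pdu-3"))
# {'web-server-01', 'switch-a', 'www.example.com', 'mail-server-01'}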

Intel study shows no effect from using unconditioned air

Intel have carried out a limited pilot to find out how a data centre would perform without the usual environmental controls [PDF]. The top and bottom of it was that, over a nine-month test period, servers exposed to regular unconditioned air with only limited filtration performed as well as servers in a fully air-conditioned data centre. Does this mean you can switch off all of your air conditioners and circulate unconditioned air instead?

Compute upon a cloud

Interesting what Amazon is up to… first cloud storage, then cloud computing and now cloud databases. Is the art of data centre management going to be concentrated into a few massive data centres? We currently rent a single Sun box, running Linux oddly enough, in a data centre to run all of our websites and email. One of the downsides of renting a machine is the limited capacity of storage, CPU and bandwidth.
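For contrast, here is a minimal sketch of requesting compute on demand rather than renting a fixed box. It uses the modern boto3 library, the AMI ID is hypothetical, and it assumes AWS credentials and a region are already configured in the environment:

# Minimal sketch: launching an on-demand EC2 instance with boto3.
# The AMI ID is hypothetical; credentials are assumed to be configured.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical Linux image
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print("Launched", instance_id)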

Data centre heating effects

One of the side effects of the recent RackSpace outage in their Dallas/Fort Worth data centre has been finding out just how quickly their data centre heats up when the air conditioning fails. As they put it: "Our backup generators kicked in instantaneously, but the transfer to backup power triggered the chillers to stop cycling and then to begin cycling back up again, a process that would take on average 30 minutes. Those additional 30 minutes without chillers meant temperatures would rise to levels that could result in data loss and irreparably damage customers' servers and devices."
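The speed of that temperature rise is easy to ballpark. With the chillers off, almost all of the IT load goes into heating the room air, so dT/dt ≈ P / (m · c_p). The numbers below are purely illustrative, not Rackspace's actual figures:

# Back-of-envelope: how fast does a data hall heat up with chillers off?
# Illustrative assumptions only; ignores the thermal mass of servers,
# racks and the building, which slows the real rise considerably.
it_load_w   = 1_000_000   # 1 MW of IT load, nearly all dissipated as heat
hall_volume = 5_000       # m^3 of air in the hall
air_density = 1.2         # kg/m^3
cp_air      = 1005        # J/(kg*K), specific heat of air

air_mass = hall_volume * air_density                  # kg of air
rise_per_second = it_load_w / (air_mass * cp_air)     # K per second
print(f"{rise_per_second * 60:.1f} degrees C per minute")  # ~10 C/min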

Blog Action Day: The server power double whammy

Servers are getting faster and faster, consuming more and more power and producing more and more heat. Removing that heat from the data centre uses even more power. According to a November 2006 Gartner report, over 60% of total data centre power consumption is spent cooling the data centre environment. Making the cooling system less power-hungry would be the best bet. Unfortunately, significantly lowering the power consumption of air-conditioning units is very difficult.
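To put that 60% figure in concrete terms, here is a quick worked example. The numbers are illustrative, and it optimistically assumes everything that isn't cooling actually reaches the IT kit:

# Worked example of the Gartner figure: if cooling takes 60% of total
# data centre power, how much is left for the servers themselves?
total_kw   = 1000                   # power drawn from the grid (illustrative)
cooling_kw = 0.60 * total_kw        # spent on cooling per the report
it_kw      = total_kw - cooling_kw  # at best, what reaches the IT kit

print(f"Cooling: {cooling_kw:.0f} kW, IT load: {it_kw:.0f} kW")
print(f"Every 1 kW of server load costs roughly "
      f"{cooling_kw / it_kw:.1f} kW of cooling")   # about 1.5 kW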

IT run the servers, facilities run the air-con...

Facilities running the air-con in a data centre has to be one of the classic IT anti-patterns. You’ve got your nice shiny data centre, rows and rows of cabinets full to the brim with IT kit. The problem is, you don’t run the air conditioning, the facilities people do. So what, you say, the facilities people eat air-con units for breakfast. That’s probably true, but what happens when things go wrong? Are you going to be told about the failure in time to do something about it?
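One mitigation, whoever owns the chillers, is to watch the temperature yourself. Below is a minimal sketch of an independent alert for the IT team; the read_inlet_temp_c function and the threshold are hypothetical, and a real setup would poll SNMP sensors or the servers' own IPMI readings:

# Minimal sketch of an independent temperature alert for the IT team.
# read_inlet_temp_c() is a hypothetical placeholder.
import smtplib
import time
from email.message import EmailMessage

ALERT_THRESHOLD_C = 30   # illustrative inlet-temperature alarm point

def read_inlet_temp_c():
    """Placeholder for a real sensor read (SNMP, IPMI, vendor API)."""
    raise NotImplementedError

def send_alert(temp):
    msg = EmailMessage()
    msg["Subject"] = f"Data centre inlet temperature {temp:.1f} C"
    msg["From"] = "dc-monitor@example.com"
    msg["To"] = "it-oncall@example.com"
    msg.set_content("Cooling may have failed - check the air-con now.")
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

while True:
    temp = read_inlet_temp_c()
    if temp >= ALERT_THRESHOLD_C:
        send_alert(temp)
    time.sleep(60)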