On-demand architecture provisions Microsoft’s Technology Center

Microsoft’s Thames Valley MTC helps companies make decisions about IT strategy – from the applications they need for successful operations to how they roll them out enterprise-wide, as private cloud or hosted services. A high density data centre, architected using Schneider Electric hardware and software, is core to this service provision.

Wednesday, 29th May 2013, by Phil Alsop

Microsoft Technology Centers (MTCs) are collaborative environments that provide access to innovative technologies and world-class expertise, enabling companies to envision, design, and deploy solutions in 30 locations around the world.

“The role of the technical director is to look after the data centre and manage the strategy and equipment that goes in there. I’m also responsible for maintaining the relationships with the partners that support the MTC,” says Paul Appleby, Technical Director, Microsoft Technology Center, Thames Valley UK.

The MTCs are used to facilitate a variety of customer offerings. These include a one-day strategy briefing comprising mutual discovery and tailored product and technology drill-downs, and an Architecture Design Session (ADS), in which two days are spent focusing on the customer’s business objectives and aligning them with specific applications of Microsoft software. From this a solution can be developed based upon Microsoft and partner technologies, with the goal of enabling the customer to move forward with confidence.

“If a customer comes in for a Proof of Concept offering, we provide a Development Suite for the two- to three-week engagement. For this we’ll load up a dedicated room with high-end workstations together with all the software they need, and then connect them up to whatever resources they need in the data centre so they can implement their proposed solution and see how it scales,” said Paul Appleby.

About data centres and the MTC
The data centres which support the MTCs and provision the Development Suite environments vary in size globally, with Thames Valley sitting somewhere in the mid-range. After nearly a decade’s operation, the facility has recently been uprated from 40kW to 70kW to support higher levels of usage and to accommodate more power-hungry equipment.

The data centre is kept busy; up to three Development Suites can be running simultaneously at any time. In the course of a year, over 40 customers avail themselves of the facility and, with data centre tours included, literally thousands of interested CIOs, IT Directors, CEOs and other senior executives come through the doors of the MTC in Reading. In the last ten years dependence upon IT systems has grown dramatically, as has the number of servers and storage devices in use by modern enterprises. Hardly surprising, then, that the facility was due for an upgrade.

“At 40kW we were running out of capacity and faced the prospect of having to turn away equipment that we couldn’t power or cool. This was down to the physical constraints of the room, but obviously we’re providing a service to the customer and the last thing we wanted to do was inconvenience them. The objective of refurbishing the data centre was twofold: to increase the power into the room and to increase the physical space available for IT,” said Paul Appleby.

“There was very little we could do to the space. We made some small alterations and freed up a little floor space by straightening the back wall and removing a peripheral CRAC unit. We had to upgrade our UPS in order to protect the increased load, but the APC Symmetra PX96 we installed has the same footprint as the unit it replaced so that had no net effect upon the available space for equipment cabinets.
“One of the main changes we made was to the architecture of the physical infrastructure. Before the upgrade we were using a traditional cooling set-up, with perimeter air conditioning units around the edges of the room. To ensure resilience we had redundant air handling units supplying cool air to the IT load via a raised floor and grilles in front of the cabinets.

“With the new setup we’ve contained the hot aisles between the equipment cabinets, and we’ve also formed a second area of containment using the backs of a row of cabinets near the rear wall. We’ve introduced InRow cooling units to remove the hot air; the chilled water system uses a compressor installed on the roof of the building.

“In an ideal world our APC InfraStruxure solution would have comprised two rows of cabinets back to back with a single Hot Aisle Containment Solution (HACS), but the wedge shape of the room meant that we had to improvise to make the most of the available capacity. So we’ve added an extension of about ten inches to the rear of the third row of NetShelter cabinets, and the doors are lined with Perspex to create a narrow hot aisle.”

The Rack Air Containment solution (RAC) segregates hot and cool air streams in order to improve cooling predictability. It’s the first example of its kind from Schneider Electric and it has proved a very beneficial way of increasing power density in a limited space.

“It’s a novel solution, but we often deal with CIOs and they’re interested to see what we have in the data centre. Our primary focus is always going to be our software technologies, but the impact of the sorts of solutions which are developed could drive a requirement for higher density servers or virtualised hardware. People are curious to see how a company like Microsoft manages a small data centre like this to keep it efficient and resilient. It’s relevant to their situation.”

In fact, one of the strengths of the MTC data centre is that it probably reflects many of those customers’ own data centres. It’s not a homogeneous environment, and the server load includes equipment from all the main manufacturers - Dell, IBM, HP and even Oracle. The objective of the Development Suite is to approximate the sort of eclectic environment that many customers operate. The load is therefore “a real mishmash of IT and communications stuff,” said Appleby.

“We constructed the room with Hot Aisle Containment and Rack Air Containment because we were looking to improve the PUE of the data centre. These sorts of facilities are unique in Microsoft’s estate because their load is unpredictable and we need to stay flexible according to our customers’ needs - we always need capacity available on an as-needed basis. So we introduced containment of both racks and aisles, because we could see that would have an immediate effect on the efficiency of the cooling infrastructure. We’re also using StruxureWare for Data Centers DCIM software to manage the cooling equipment.”
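For context, PUE (Power Usage Effectiveness) is the ratio of total facility power to the power actually delivered to the IT load, so anything that cuts cooling overhead pulls the figure closer to the ideal of 1.0. The sketch below illustrates the arithmetic; the before-and-after figures are hypothetical, not measurements from the Thames Valley facility.

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT load power.
    1.0 is the theoretical ideal (every watt reaches the IT equipment)."""
    return total_facility_kw / it_load_kw

# Hypothetical before/after figures for a small room like the MTC's:
# containment trims the cooling overhead for the same 70kW IT load.
print(pue(total_facility_kw=133.0, it_load_kw=70.0))  # 1.9, uncontained
print(pue(total_facility_kw=105.0, it_load_kw=70.0))  # 1.5, with containment
```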

Because of the nature of the work the data centre does, the workload is variable, driven by whatever projects are running at the MTC. The DCIM software enables intelligent cooling, matching cooling output to the requirements of the IT load.

This helps keep the data centre efficient and ensures that energy isn’t wasted. In other circumstances equipment could simply be shut down outside working hours, but the data centre has to run 24x365 because customers often want remote access to fine-tune their implementations at any time of day or night, including weekends.
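As a rough illustration of what intelligent cooling means in practice, the sketch below scales cooling effort to the measured IT load in a simple control loop. The sensor and actuator names are hypothetical stand-ins; this is not StruxureWare’s actual API, just the shape of the idea.

```python
import random
import time

# Hypothetical stubs: a DCIM suite such as StruxureWare exposes this kind
# of telemetry, but these names are illustrative, not its real interface.
def read_it_load_kw() -> float:
    return random.uniform(10.0, 70.0)   # stand-in for a live power reading

def set_inrow_fan_speed(percent: float) -> None:
    print(f"InRow fan speed -> {percent:.0f}%")

ROOM_CAPACITY_KW = 70.0            # post-upgrade capacity of the MTC room
MIN_FAN, MAX_FAN = 20.0, 100.0     # keep some airflow even at idle

for _ in range(3):                 # a real controller would loop indefinitely
    fraction = min(read_it_load_kw() / ROOM_CAPACITY_KW, 1.0)
    # Scale cooling effort to the present IT load rather than running
    # the air handlers flat out around the clock.
    set_inrow_fan_speed(MIN_FAN + fraction * (MAX_FAN - MIN_FAN))
    time.sleep(1)                  # poll interval shortened for the demo
```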

The monitoring carried out by the software was also helpful in identifying an issue with the cooling system when the data centre was commissioned. “We were able to provide information to the plant engineers: the flow rates of coolant we’d been getting, so that the system could be adjusted and optimised. This has been an unexpected but useful by-product. However, on a day-to-day basis, StruxureWare for Data Centers ensures the health of the data centre so that we can maintain quality of service to MTC customers,” said Paul Appleby.
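Why coolant flow rates matter is straightforward thermodynamics: the heat a chilled-water loop can carry away is the product of flow rate, the specific heat of water and the temperature rise across the load. Below is a back-of-the-envelope check in the same vein as the commissioning work described above, with hypothetical figures rather than the MTC’s actual readings.

```python
WATER_CP = 4.186  # specific heat of water, kJ/(kg*K)

def heat_removed_kw(flow_l_per_s: float, delta_t_k: float) -> float:
    """kW of heat carried away by the loop; 1 litre of water ~ 1 kg."""
    return flow_l_per_s * WATER_CP * delta_t_k

# Hypothetical commissioning check: is the measured flow enough for the
# room's 70kW capacity at a 6K split between supply and return water?
print(heat_removed_kw(flow_l_per_s=2.8, delta_t_k=6.0))  # ~70.3kW
```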