Case Study

State-of-the-Arts Protection

Salt Lake Community College relies on APC's power and cooling technology to keep its Center for Arts and Media data center humming.

Casey Moore and his team chose a hot-aisle containment strategy for Salt Lake Community College’s new Center for Arts and Media data center, standardizing on APC equipment. Photography by Erik Ostling

Data center power and cooling likely weren’t top of mind when Salt Lake Community College (SLCC) began planning the new $45 million Center for Arts and Media on its South City campus, but they were a priority for Technology Director Casey Moore.

The brand-new, 130,000-square-foot Center for Arts and Media includes state-of-the-art television and film production studios, a sound stage, a screening room, gaming labs, web design labs and more, alongside banks of computers dedicated to video and audio production and editing. The facility, which officially opened in November 2013, vaulted SLCC to the forefront of digital media education, particularly in a region that has long served as a popular filming location.

SLCC designed and built a new data center to support and back up all of the new multimedia equipment. To ensure reliable and consistent availability of critical applications, Moore and his IT team installed a complete power and cooling solution from APC by Schneider Electric. 

“The old data center for the South City campus was about the size of a closet. Sometimes it was a very hot closet,” Moore says. “We wanted a power and cooling solution that we knew worked and that we could manage.”

SLCC’s South City campus is one of 10 college sites in or near Salt Lake City that serve 60,000 students annually. A decade of planning and three years of construction led up to the opening of the new arts and media center. Flexibility and the potential for growth were key components of the planners’ vision for the center, says Richard Scott, interim dean of the School of Arts, Communication and New Media.

“We’ve been able to build a remarkable facility, one that is far beyond what is usually available at a community college,” Scott says. “The focus is on emerging digital arts. We may not know now what the areas are that we’ll need to grow, or what technologies we’ll need to offer, but we wanted to make sure there was space to run them and an environment in which to run them reliably.”



We wanted a power and cooling solution that we knew worked and that we could manage.

Casey Moore, Technology Director, Salt Lake Community College


APC from the Start

The SLCC IT staff knew from the outset that they wanted to deploy an APC solution for power management and cooling in the new data center because of the team’s experience with the manufacturer at other campus locations. The team also chose to deploy APC’s hot-aisle containment cooling strategy, which had resulted in greater energy efficiency in other campus data centers, Moore says. 

“We had already started to standardize on the equipment and knew it worked,” he says. “We were looking to match those configurations using a full-blown solution.”

Moore consulted with representatives from CDW and APC to develop the specifics of the power and cooling solution. He then worked with CDW to put together the proposed budget for the system.

A Symmetra Power Array PX 500 kVA uninterruptible power supply is at the heart of the APC solution for the Center for Arts and Media data center. It provides a flexible range of power, matched to real-time load, and offers a modular architecture for easy scaling of both power and runtime. Each module provides redundant protection from outages with batteries and circuit bypass.

APC StruxureWare Data Center Expert software makes it possible for the IT staff to monitor and manage power consumption and temperature in the South City data center and all other APC installations on SLCC campuses from a single web-based interface, System Administrator John Madsen says.

“We can see what’s going on with the data center environment from anywhere,” he says. “We can get in and manage it from home if we need to.”
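For readers curious what that kind of remote visibility rests on, APC network management cards are commonly polled over SNMP. The sketch below is a minimal example of reading a UPS load and battery figure that way; it is not SLCC's actual tooling, and the host address and object identifiers are illustrative values that should be verified against the installed hardware and MIB version.

```python
# Minimal SNMP polling sketch (illustrative only, not SLCC's setup).
# Assumes an APC network management card reachable at UPS_HOST with a
# read-only "public" community string; the OIDs below follow APC's
# PowerNet MIB as commonly documented and should be verified on site.
from pysnmp.hlapi import (
    getCmd, SnmpEngine, CommunityData, UdpTransportTarget,
    ContextData, ObjectType, ObjectIdentity,
)

UPS_HOST = "192.0.2.10"  # placeholder address from the documentation range
OIDS = {
    "output_load_pct": "1.3.6.1.4.1.318.1.1.1.4.2.3.0",       # upsAdvOutputLoad (assumed)
    "battery_capacity_pct": "1.3.6.1.4.1.318.1.1.1.2.2.1.0",  # upsAdvBatteryCapacity (assumed)
}

def poll(host: str) -> dict:
    """Return current UPS readings as {metric_name: integer value}."""
    readings = {}
    for name, oid in OIDS.items():
        error_indication, error_status, _, var_binds = next(
            getCmd(
                SnmpEngine(),
                CommunityData("public", mpModel=1),  # SNMP v2c
                UdpTransportTarget((host, 161), timeout=2, retries=1),
                ContextData(),
                ObjectType(ObjectIdentity(oid)),
            )
        )
        if error_indication or error_status:
            raise RuntimeError(f"SNMP error for {name}: {error_indication or error_status}")
        readings[name] = int(var_binds[0][1])
    return readings

if __name__ == "__main__":
    data = poll(UPS_HOST)
    print(data)
    if data["output_load_pct"] > 80:
        print("ALERT: UPS load above 80 percent of installed capacity")
```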

Two APC InRow chilled-water coolers were installed in each of four equipment racks in the new data center to help maintain optimum equipment temperatures. The coolers offer built-in intelligence that automatically adjusts fan speeds and chilled-water flow to the heat load. The hot-aisle containment system isolates hot-air exhaust from servers, and directs it toward the cooling system, which operates more efficiently as a result, Madsen says.
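As a rough illustration of what that built-in intelligence amounts to, the sketch below models a simple proportional controller that raises fan speed and chilled-water valve position as return-air temperature climbs above a setpoint. It is a conceptual approximation, not APC's control logic, and the setpoint and gains are invented values.

```python
# Illustrative proportional control for an in-row cooler (not APC's firmware).
# Fan speed and chilled-water valve position ramp up as the return-air
# temperature rises above a setpoint; all constants are invented for the example.

SETPOINT_C = 35.0                  # target return-air temperature, hot-aisle side
FAN_MIN, VALVE_MIN = 30.0, 10.0    # floor values keep some air and water moving
FAN_GAIN, VALVE_GAIN = 8.0, 6.0    # percent of output per degree C of error

def clamp(value: float, low: float = 0.0, high: float = 100.0) -> float:
    return max(low, min(high, value))

def control_step(return_air_c: float) -> tuple[float, float]:
    """One control iteration: match cooling effort to the measured heat load."""
    error = return_air_c - SETPOINT_C
    fan_pct = clamp(FAN_MIN + FAN_GAIN * error, FAN_MIN)
    valve_pct = clamp(VALVE_MIN + VALVE_GAIN * error, VALVE_MIN)
    return fan_pct, valve_pct

# Example: a rising heat load pushes return temperature from 33 to 40 degrees C.
for temp in (33.0, 36.0, 40.0):
    fan, valve = control_step(temp)
    print(f"return {temp:4.1f} C -> fan {fan:5.1f}%, valve {valve:5.1f}%")
```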

For an additional layer of protection, SLCC purchased APC's modular power distribution units: hot-swappable circuit breakers, mounted on individual racks, that can be replaced without the services of an electrician or the need to power down.

Deployment Support

Given that it took place during a complex construction project, the APC power and cooling system implementation went smoothly, Moore says. CDW helped the team coordinate delivery and installation dates that fit the new facility's construction schedule. CDW also was instrumental in organizing early calls and meetings, bringing together college officials, APC representatives and contractors. APC sent engineers and technicians to help with the installation and worked with SLCC IT staff members and contractors to finalize power specifications for humidity control in the data center.

Most of the implementation challenges the team faced resulted from having to install the power and cooling system in an unfinished building, Moore says.

“Making sure the power was run to exactly where the equipment was going to end up was critical,” he says. “We were dealing with [wiring] coming up from the floor and water from above, which must be placed within reasonably small tolerances.”

The power and cooling equipment arrived in the spring of 2013, well ahead of the rest of the data center hardware, so the team didn't have to stand up the power and cooling architecture at the same time it was moving into the data center. Everything was up and running by July.

The new data center consolidates existing South City campus hardware and software with new equipment brought in to support the new Center for Arts and Media, including the network and routing core for the campus, security appliances and firewalls, five physical servers supporting 15 virtual servers, a 60-terabyte EMC Isilon storage array, and all the video management and switching equipment for the center.

Moore and Madsen estimate the data center is operating at only about 25 to 30 percent of its capacity. The equipment now in place draws only about 20 kVA — and SLCC has deployed only 250 kVA of the Symmetra Power Array’s 500 kVA.

“We built in a lot of room for growth and wanted a power and cooling system that could handle it,” Madsen says.
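On the electrical side, those numbers leave even more headroom than the overall capacity estimate suggests. A quick back-of-the-envelope calculation using only the figures quoted above:

```python
# Back-of-the-envelope headroom from the figures in the article.
installed_kva = 250    # Symmetra PX power modules deployed today
frame_max_kva = 500    # maximum the frame can scale to
current_draw_kva = 20  # approximate present load

print(f"Load vs. installed power:  {current_draw_kva / installed_kva:.0%}")  # ~8%
print(f"Load vs. frame maximum:    {current_draw_kva / frame_max_kva:.0%}")  # ~4%
print(f"Installed vs. frame max:   {installed_kva / frame_max_kva:.0%}")     # 50%
```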

Flexibility for the Future

Modular architecture also is helping Madsen and IT team members avoid capacity planning problems, which Forrester researcher Sophia Vargas says is a common pitfall for many organizations that implement power and cooling solutions.

“Modular systems remove the risk of over- or underprovisioning the power and cooling systems in the data center,” Vargas says. “When you can add capacity as needed, you can make sure the data center infrastructure is protected, but you can still run it efficiently.”

Any organization’s search for power and cooling technology should begin with a thorough understanding of the specific needs of the data center, as well as the organization, Vargas advises. “Too many fail to develop a realistic use case that covers capacity planning, scalability needs and the availability necessary for critical applications.”

Whenever financially possible, Vargas also recommends that organizations select full, integrated power and cooling solutions. When the solution is chosen carefully, its manufacturer can become a valuable strategic partner, further minimizing worries about service.

“Without the management software in an integrated system, you can’t automate controls and alerts as easily, and power and cooling management takes up a lot more of the IT staff’s time,” Vargas says.

Madsen cites ease of management as one of the main benefits of the APC system: “You want those alerts and the ability to see the entire power and cooling environment from anywhere. If I were giving advice about picking a system, I’d say to insist on that level of reporting and control.”

The ability to see all the APC equipment on all of SLCC’s campuses via a “single pane of glass” is one of the clearest advantages provided by the APC solution, Moore concurs.

By far, the greatest justification for SLCC’s investment in the new system is simply that “it works,” Moore says. Power and cooling is just one of several fundamental components of the new infrastructure that make the technology-rich arts and media center possible.

“Typically, power is not something we worry about, and that’s the way it should be,” Moore says.

43%

The percentage of cooling system energy cost savings that can be realized through a hot-aisle containment strategy over cold-aisle containment

Source: Schneider Electric Data Center Science Center, "Impact of Hot and Cold Aisle Containment on Data Center Temperature and Efficiency," November 2013

Green Bonus

The fear that the new facility — and its data center, specifically — would consume vast amounts of electricity raised alarms during the planning process, Dean Scott says. 

“At a certain point, power consumption started to shape a lot of the decisions we made,” Scott says. “The state of Utah, which gives us a lot of our money, said, ‘You just can’t have all that power.’ Casey and John have found the technologies that use electricity responsibly, cover our present needs and leave room for growth.”

Minimizing power usage was not the primary driver in the decision to standardize on APC equipment, but the overall efficiency of the system, which automatically matches power to the real-time load, is an added benefit, Moore says. 

Still, it is likely that controlling power consumption will become increasingly important in the future, as classes grow and the new media training center becomes a destination for students outside of the immediate Salt Lake region. 

“In the past, students had to carry their projects around on thumb drives and hope they could find machines with decent speeds and memory,” Madsen says. “We’re now able to provide multiple terabytes of space over a very fast network for them to do online editing and production.

“As the data center grows, we’ll probably want to manage the electricity and temperature more aggressively, and this new system gives us the means to do that,” Madsen says.

Early visitors and new students are amazed by what SLCC’s new Center for Arts and Media has to offer, Moore says, but it will be very different in five years.

“Much of the technology that’s there will be traded out or upgraded, and we will have added new equipment and programs,” Scott says. “That was the great challenge for the IT department — building a data center that supports future growth and changes to meet our students’ needs.”


Hot or Cold?

Gone are the days of data center managers cranking up the air conditioning and crossing their fingers that servers would stay cool enough to run smoothly.

Today’s state-of-the-art power and cooling systems deploy chilled-water coolers for individual server racks, controlling and separating the flow of hot and cold air in the room, usually through one of two basic strategies:

Hot-Aisle Containment: In this solution, equipment racks are positioned back to back, and the aisle between is enclosed and vented back to rack-mounted coolers and the air conditioning system intake. Data center temperatures stay cool.

Cold-Aisle Containment: In this solution, equipment is set up face to face across a covered aisle that is sealed on both ends, connected directly to the air conditioning output or to a vent in a raised floor that delivers cold air. Ambient temperature in the data center is warm.

Salt Lake Community College Technology Director Casey Moore and his team recently chose APC hot-aisle containment for a new data center on the college’s South City campus and also have deployed the configuration on other campuses. 

“Both allow for an overall higher efficiency than without them, but hot-aisle containment does better by driving up the return temperature into the in-row coolers,” Moore says. “In an emergency, there would be a larger volume of [cool] air in the room to feed the equipment than a cold-aisle containment system would allow.”
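Moore's point about return temperature follows from the sensible-heat relation Q = m_dot * c_p * delta_T: at a fixed airflow, a hotter return stream gives the coil a larger temperature difference to work across, so it removes more heat per pass. The short sketch below illustrates this with invented but physically plausible numbers.

```python
# Sensible heat removed by a cooling coil: Q = m_dot * c_p * (T_return - T_supply).
# Airflow and temperatures are invented, physically plausible values chosen only
# to show why a hotter contained return stream improves coil effectiveness.
AIR_DENSITY = 1.2   # kg/m^3, roughly at room conditions
AIR_CP = 1.006      # kJ/(kg*K), specific heat of air

def coil_kw(airflow_m3_s: float, t_return_c: float, t_supply_c: float) -> float:
    """Heat removed (kW) for a given airflow and return/supply temperatures."""
    mass_flow = airflow_m3_s * AIR_DENSITY   # kg/s
    return mass_flow * AIR_CP * (t_return_c - t_supply_c)

airflow = 2.0  # m^3/s through one in-row cooler (illustrative)
for t_return in (28.0, 35.0):  # diluted mixed-air return vs. contained hot-aisle return
    print(f"return {t_return} C -> {coil_kw(airflow, t_return, 18.0):.1f} kW removed")
```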

The decision whether to deploy one strategy over the other should be based on an organization’s specific needs, Forrester researcher Sophia Vargas advises. What matters most is finding an effective way to separate hot and cold air within the available data center space, she says.

“It’s a basic best practice to isolate the hot and the cold air, so you’re not diluting the cooling environment with exhaust,” Vargas says. “The next step up is to take measures to direct the two air flows to take better advantage of the cooling system.”

