MasterCard’s robust data centre: priceless

If a severe tornado touches down west of St. Louis, you might have trouble using your MasterCard.

The credit card company’s global technology operations centre, based in O’Fallon, Missouri, is strong enough to withstand a 160-mile-per-hour gust of wind, says its acting president, Rob Reeg, who hosted a group of Canadian journalists touring the facility last month.

MasterCard International processes about US$18 billion of transactions every year, at a rate of about 5.4 million transactions per hour through its GTO centre. The company says each transaction takes about 129 milliseconds. Although merchant and customer accounts are handled by their respective banks, MasterCard acts as a middleman, processing all of the transactions through its 25,000-square-foot data centre, which houses mainframes, servers and storage.
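In rough terms, the quoted figures work out as follows. This is a back-of-the-envelope calculation from the numbers in this article, not MasterCard’s own accounting:

```python
# Rough figures derived from the rates quoted above.
TX_PER_HOUR = 5_400_000   # ~5.4 million transactions per hour
LATENCY_S = 0.129         # ~129 milliseconds per transaction

tx_per_second = TX_PER_HOUR / 3600
print(f"{tx_per_second:.0f} transactions per second")  # 1500

# Little's law: average transactions in flight
# = arrival rate x time each one spends in the system.
in_flight = tx_per_second * LATENCY_S
print(f"roughly {in_flight:.0f} transactions in flight at any instant")
```

At 1,500 transactions a second and 129 milliseconds apiece, only a couple of hundred transactions are in flight at any moment, which gives a sense of why sub-second latency matters more than raw volume here.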

Company officials won’t disclose the number of employees working at the centre, nor will they reveal the bandwidth of their wide-area network or any other details, like databases. But they did reveal a few tidbits of information that could serve as a guide to IT departments supporting high-speed transaction processing.

The data centre is monitored 24 hours a day, seven days a week by a team of network engineers, network operators, operations technicians, systems analysts and shift supervisors.

Reeg said the centre is built not only to withstand high winds, but also to exceed the structural engineering standards the state of California requires for earthquake resistance.

The data centre is cooled with three cooling towers, said John Eubanks, the GTO’s senior business leader for data centre operations. “We can run with one but we have three, so it cuts down on the effort any one would have to work,” he said. “It’s just water being passed over grills and fans blowing cold air into the data centres.”

The GTO is connected to the local power utility at two separate sub-stations, but only needs one sub-station for its power. Eubanks said MasterCard has three generators at the station, and can use two at a time with the third one idling, so if the municipal power fails and one generator fails, MasterCard can bring the second generator online.
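The generator arrangement Eubanks describes is a classic N+1 redundancy scheme: two units carry the load while a third idles as a spare. A minimal sketch, with invented function and generator names (not MasterCard’s actual control logic):

```python
# Hypothetical sketch of N+1 power failover: prefer utility power,
# fall back to two of three generators, keeping one as a spare.
def select_power_sources(utility_ok, generators_ok):
    """Return the sources to draw from, preferring the utility feed.

    generators_ok is a list of booleans, one per generator.
    """
    if utility_ok:
        return ["utility"]
    healthy = [f"generator-{i}" for i, ok in enumerate(generators_ok) if ok]
    # Two generators must come online; any remaining healthy unit idles.
    if len(healthy) < 2:
        raise RuntimeError("insufficient generation capacity")
    return healthy[:2]

print(select_power_sources(True, [True, True, True]))    # ['utility']
print(select_power_sources(False, [True, False, True]))  # ['generator-0', 'generator-2']
```

The second call shows the scenario Eubanks raises: municipal power and one generator have both failed, yet two healthy units can still carry the load.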

The GTO is served by more than one carrier, and it has four separate connections to the wide-area network in case one stops working, Eubanks said. He would not say how much bandwidth the GTO’s connections have with the outside world. “I won’t go into specifics, but take OC and put numbers on the end of it and that will give you an idea,” he said. “They’re huge.”

The operators in the operations centre not only monitor the mainframes, network and servers, but also get a bird’s-eye view of the 1,200 end points on MasterCard’s wide-area network, which consists mainly of data centres at banks serving both cardholders and merchants.

Reeg said some banks have six end points, and most have two, so if one fails, MasterCard can re-route traffic to the other. The GTO constantly pings each end point to make sure it’s up and running. All end points are plotted onto a Google Earth display at the front of the operations centre, which is flanked by televisions showing live feeds of The Weather Channel and CNN, so MasterCard can find out if any of its supported banks might be affected by severe weather or political events.
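The monitoring behaviour described above amounts to a simple failover rule: prefer a bank’s primary end point, and fall back to a surviving one when it stops responding. A sketch under that assumption, with illustrative names and a simulated reachability check in place of a real ping:

```python
# Hypothetical sketch of end-point failover: route to the first
# reachable end point a bank has, or report that none respond.
def choose_route(bank_endpoints, is_reachable):
    """Return the first reachable end point for a bank, or None."""
    for ep in bank_endpoints:
        if is_reachable(ep):
            return ep
    return None

# Simulated reachability: pretend the bank's primary link is down.
down = {"bank-a-primary"}
def reachable(ep):
    return ep not in down

print(choose_route(["bank-a-primary", "bank-a-secondary"], reachable))
# bank-a-secondary
```

In the real system the reachability test would be the constant pinging Reeg describes; here it is stubbed out so the example runs on its own.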

“If we see there’s a hurricane and the weather service projects the path of the hurricane, we can see if it’s going to come close to a location where we have an end point,” Reeg said. “And if it does, we can work with the customer. We may want to switch all of their traffic to a different data centre that’s outside the path of the hurricane.”

This is important because MasterCard’s job is to ensure all components of a purchase are processed: authenticating the user, reconciling the purchase information with both the cardholder’s and the merchant’s accounts, and then paying the merchant, said Mike Manchisi, MasterCard’s group executive for strategic account management.

“When you’re talking about a real-time transaction, you have to ensure that it’s going to work and it’s going to work the first time,” he said. “We have built-in levels of redundancy to ensure that in the event that you can’t get a transaction from Point A to Point B, that there are other methods to be able to deliver that transaction.”

One way is doing a “stand-in,” whereby MasterCard approves a transaction on behalf of a cardholder’s bank, Reeg said. “If the transaction can’t get out at all on to the network, we have enough intelligence at the edge of the network in that little device that sits in the back shop to make an approval, or non-approval decision on behalf of the bank,” he said. “The worst thing to us is to have someone pull out a card and it does not go through and they go look for a different card. We always want our card to be at the front of the wallet.”
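Stand-in processing boils down to a local fallback decision when the issuing bank cannot be reached. A hypothetical sketch of that logic, with an invented approval threshold standing in for whatever rules a real bank would pre-set:

```python
# Hypothetical "stand-in" authorization: when the issuing bank is
# unreachable, the edge device decides on the bank's behalf using a
# pre-set rule. The 100.0 limit here is invented for illustration.
def authorize(amount, bank_reachable, ask_bank, standin_limit=100.0):
    if bank_reachable:
        return ask_bank(amount)     # normal path: the bank decides
    return amount <= standin_limit  # stand-in: the local rule decides

def always_yes(amount):
    return True

print(authorize(50.0, bank_reachable=False, ask_bank=always_yes))   # True
print(authorize(500.0, bank_reachable=False, ask_bank=always_yes))  # False
```

Small purchases go through even with the network down, which is the outcome Reeg is after: the card still works, so the customer never reaches for a competitor’s.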

To stay up and running, MasterCard’s data centre is wired with 200 miles of fibre optic cabling, connecting a couple of IBM z9 mainframes, plus a plethora of Sun Microsystems and IBM servers running the Solaris and AIX versions of Unix, respectively.

Although the company confirms it uses both IBM’s DB2 and Oracle databases on the back end, it will not disclose the versions. It also uses IBM’s Tivoli systems management software, plus applications from BMC, EMC and CA Inc. “We’re very big in virtualization,” Reeg said. “We like to take a server and carve it up into pieces, so you’re not buying as many boxes.”

He added the GTO staff handle “literally thousands of virtual servers.” Though it will not divulge details, MasterCard stores more than 2.5 petabytes of data on a variety of devices manufactured by EMC and Hitachi Data Systems, and some of the data is stored on tape, which is “pretty efficient for off-site storage,” Reeg said. “The tape technology now with virtual tape is pretty impressive,” Eubanks said. “It’s not like in the old days where you had data checks and those kinds of things.”

Reeg said all backup tapes, and anything else transported off-site, are encrypted. Eubanks said tape is now more efficient, for some applications, than other storage media. “A couple of years ago, you used to have 2 GB tape,” he said.

“Now you can take a 2-gig tape and with compression you can get 100 gig on a single tape.”
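As a quick sanity check, the figures in Eubanks’s quote imply an effective compression ratio of about 50:1:

```python
# Arithmetic behind the quote: 2 GB of native tape capacity holding
# 100 GB of compressed data implies a 50:1 effective ratio.
native_gb = 2
effective_gb = 100
ratio = effective_gb / native_gb
print(f"effective compression ratio {ratio:.0f}:1")  # 50:1
```

Ratios that high are plausible only for highly repetitive data such as logs and transaction records; typical mixed data compresses far less.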
