The growing flood of data that enterprises create and consume is doing more than giving rise to new storage technologies. It’s also changing who is responsible for storage within IT departments.
Demand for storage capacity has grown by 60 per cent per year and shows no signs of slowing down, according to research company IDC. New disclosure laws, which require more data to be preserved and retrievable, are also making storage management a bigger job.
Now, with network-attached storage, storage-area networks (SAN), virtualization and other technologies shifting information and processing around within enterprises, a variety of changes are happening in the storage administration ranks.
“With the sheer complexity of some companies’ information infrastructures, you wonder whether one person can really get their hands around it all,” says Pund-IT analyst Charles King. The job has grown beyond taking care of storage arrays, he says. “It’s really requiring storage administrators and executives, including CIOs, to think of it in a more holistic way.”
The turning point for some IT departments seems to be the shift to centralized storage. Late last year, the University of Pittsburgh set up its first SAN and started moving its data out of servers and into its network operations center, says Jinx Walton, director of IT. Until then, every time a group in the IT department set out to meet a need on campus, the university’s IT development team would assess how much storage was needed for the project and purchase it. The individual group would then manage that storage.
“Whoever was responsible for the project was responsible for the storage,” Walton says.
But that was inefficient. Buying storage for individual servers and investing in additional disks when the servers filled up was expensive and a distraction, she says. After centralizing most servers in the network operations center (NOC), the university began building a SAN there that was shared by all the project managers. Purchasing and management of storage shifted from the development realm to the NOC. It wasn’t easy at first, Walton says.
“Any time there’s any kind of change, there’s concern about it,” Walton says. But IT developers can now spend their time solving problems instead of handling storage, and they’re happy with the change, she says.
At a large transportation company in the Midwest, what had been a niche storage project under a small team has gone mainstream.
IT administrators set up SANs about five years ago for data warehouses while keeping host-based storage in the rest of the IT universe, says an IT executive who asked not to be named. At that point, the data warehouse team was responsible for the SANs. But recently the company expanded SANs to more of its IT systems in conjunction with adopting virtualization, and ownership of the SANs has shifted to the production services department that manages IT as a whole, he says.
Tucson Electric Power was an early adopter of virtualization and networked storage, both of which have made the utility’s IT operations far more efficient, says Chris Rima, supervisor of infrastructure systems. Two years ago, Tucson Electric hit a wall with data center growth.
“We’re a power company, and we didn’t have any more power coming in,” Rima says. Virtualization put off for two years the need to build a new, improved, more efficient data center, which Tucson Electric is now building.
Storage has been a big part of the company’s explosive IT growth. Data has doubled every year for the past three years, to 80TB at the end of last year. Among other things, Tucson Electric needs to store maps and high-resolution images of its coverage area so it can install power poles in the best locations, Rima says.
SANs and virtualization may have been the company’s two saviours, but they have also made the IT department’s work more complex. “They are so intrinsically linked, it’s unbelievable,” Rima says. This created a need for a new position: storage architect.
“Instead of somebody who’s solely doing administration work … the architect is somebody who takes a step back and says, ‘OK, how do I design the architecture to take advantage of virtualization, data protection [and other factors]?'” Rima says.
That person needs to understand both storage and virtualization, and specifically technology from NetApp and VMware, Rima says. Finding the right person outside the company would have been virtually impossible, he says. So Tucson Electric trained a storage administrator, who is due for the promotion soon.
Adventist Health, a hospital operator in Roseville, Calif., had a generalist handling its data center until it implemented a SAN and virtualization. It then hired a small team of specialists for each new technology, says Greg McGovern, Adventist’s CTO. Despite the supposed simplicity of centralized storage and processing, the company also is relying more on support from vendors, he adds.
“It looks simple on the surface. It looks complex when it stops working or slows down,” McGovern says. If a doctor in the field complains an application running over the network has slowed to a crawl, vendor support is often called in. “I think I’ve got WAN engineers who can handle it, but I need more assurance,” McGovern says.
Pund-IT’s King thinks many more companies will face these kinds of challenges as storage grows rapidly in importance as well as in terabytes.
“People are still trying to get their heads around how to do this,” King says. The falling price of storage equipment only makes things worse, he adds. “Anybody can afford enough data storage to get themselves in trouble.”