Two ongoing projects at U.S. federal agencies underscore the need, in both the public and private sectors, to build storage-area networks (SANs) as end users and customers demand ever-greater access to information.
This week, the U.S. Geological Survey (USGS) upgraded the SAN behind its Web site. The SAN, which is hosted by Microsoft Corp. and referred to as TerraServer, had its capacity bolstered from 12TB to 18TB to better serve an audience that launches more than 5 million imagery requests per day.
Each time the site is publicized on television or in print publications, the number of visitors skyrockets. On top of that, a new Web site was added to the TerraServer to give federal agencies exclusive use of the data to create presentations.
“I attribute most of the growth to adding the second Web site,” said Tom Barclay, TerraServer project manager at Microsoft. “The more data, the more the usage goes up.”
The USGS’s Web site, which can display more than 3 million satellite photos of the Earth, is most often used by the general public and the U.S. Department of Agriculture for natural resource and land management.
“We get a half [terabyte] to a [full] terabyte of images from the federal government every 30 days,” said Barclay, referring to JPEG images from space.
The TerraServer project is a joint venture between the federal government and various IT vendors including Microsoft, Compaq Computer Corp. and Advanced Digital Information Corp. in Redmond, Wash.
TerraServer uses three Microsoft SQL Server databases, four Compaq ProLiant 8500 servers, one Compaq Enterprise Storage Array 12000 and 16-port Brocade SilkWorm 2800 switches to store aerial and satellite images of the Earth and to provide the information publicly on the Internet. Veritas Software Corp. provides NetBackup software. ADIC just added a Scalar 1000 tape library, giving the SAN an additional 6TB of storage capacity.
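The figures above suggest how much headroom the upgrade buys: roughly 6TB of added capacity against an ingest rate of a half to a full terabyte every 30 days. A back-of-the-envelope sketch (the numbers come from the article; the variable names are illustrative, not from TerraServer itself):

```python
# Rough headroom estimate using figures stated in the article.
# Names are illustrative; rates are the article's reported ranges.

old_capacity_tb = 12.0            # SAN capacity before the upgrade
new_capacity_tb = 18.0            # capacity after the 6TB tape-library addition
ingest_tb_per_month = (0.5, 1.0)  # "a half [terabyte] to a [full] terabyte ... every 30 days"

added_tb = new_capacity_tb - old_capacity_tb

# Months of headroom the added capacity buys at each ingest rate
headroom_months = [added_tb / rate for rate in ingest_tb_per_month]

print(f"Added capacity: {added_tb:.0f} TB")
print(f"Headroom: {headroom_months[1]:.0f} to {headroom_months[0]:.0f} months")
```

At the stated ingest rates, the 6TB addition covers roughly six months to a year of new imagery, which is consistent with the article's picture of recurring capacity upgrades.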
When the project first started in 1997, “we were basically exploring how large we could grow a single server,” said Barclay.
The project grew from that single server into a 25-foot-long computer with eight racks of equipment. Then, disk capacity topped out and the USGS wanted its data available via the Web around the clock.
The need for multiple servers in an active configuration – combined with the ability to move multiple terabytes of data from one server to another – “was the motivation to move to clustering,” he added.
“On the Internet, predicting user load becomes so much more challenging,” said Barclay. Anyone who’s building an application “has to configure on the high side for bandwidth, considering you can literally have every man, woman and child in the world standing outside your door,” he added.
In another example of SAN technology making federal inroads, the Federal Deposit Insurance Corp. is in the midst of a two-year server consolidation project aimed at pumping its data into two SANs so that it can upgrade its platform to Windows 2000 and make information available to its scores of field offices.
The FDIC, the federal entity that insures customer deposits at nearly 10,000 U.S. banks, was faced with replacing the hard drives in about 400 servers that it uses for internal operations. It had been adding servers to support data-intensive applications like Microsoft Exchange.
“At the same time, we were also looking at budget considerations. We wanted to consolidate servers and centralize them,” said Ann-Marie Haynie, senior computer specialist at the FDIC in Arlington, Virginia. “If we have the servers centralized, we can actually start clustering them more efficiently for redundancy and failover.”
Each SAN will be made up of a Hitachi Data Systems Corp. Lightning 9960 storage array with 6.6TB of capacity and 20 Compaq ProLiant 8500 servers connected to 12 Brocade SilkWorm 2800 switches. For clustering and server-side management and backup, the FDIC settled on Veritas' Volume Manager, NetBackup and Cluster Server software.
Under the Bush administration, federal agencies are being required to demonstrate a return on investment before funding is approved for any IT projects. The FDIC estimates that it will save US$13 million over the next five years simply by avoiding upgrades to its servers, according to Steve Laterra, the agency’s server software section chief.