When it comes to building networked storage, customers have myriad ways to go.
Users are adopting Fibre Channel-based storage-area networks (SANs) for data-intensive transactional applications such as databases or customer relationship management; they are installing NAS appliances to manage the masses of file-oriented data found in front-office applications; and they are investigating new technologies that transport storage data over the same Ethernet network, and under the same management infrastructure, as other types of traffic.
In a nutshell, SANs consist of host servers and SCSI or Fibre Channel storage devices connected by a Fibre Channel router or switch – they transport blocks of data between client users and the storage device. NAS devices connect to the Ethernet network and transport files across the network to customers. An upcoming standard, called iSCSI, lets network managers move blocks of storage data across a Gigabit Ethernet network.
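The block-versus-file distinction above can be sketched in a few lines of Python. The paths are hypothetical examples, and a real SAN client issues block reads through a Fibre Channel host bus adapter driver rather than an ordinary file descriptor, but the two access models look roughly like this:

```python
import os

def read_block(device_path: str, block_num: int, block_size: int = 512) -> bytes:
    """SAN-style access: address storage as raw, fixed-size blocks."""
    fd = os.open(device_path, os.O_RDONLY)
    try:
        os.lseek(fd, block_num * block_size, os.SEEK_SET)
        return os.read(fd, block_size)
    finally:
        os.close(fd)

def read_file(file_path: str) -> bytes:
    """NAS-style access: the appliance serves whole files; clients ask
    for them by name (e.g. over NFS or CIFS), not by block address."""
    with open(file_path, "rb") as f:
        return f.read()
```

The practical consequence is that a SAN looks like a local disk to the server (the server's own file system sits on top of the blocks), while a NAS appliance owns the file system and hands out files.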
The IT executive at a global same-day package delivery service with more than 60 operating locations chose a centralized SAN to pull together disparate servers and storage, and to give his network increased uptime and reliability.
“At Dynamex, we’re implementing a SAN to support our business-critical order entry and dispatch system,” says James Wicker, vice-president of information services for the Dallas company. Dynamex Inc.’s order entry system, called the Courier Order Processing (COPS) application, is the nerve centre for the package delivery requests the company needs to process and deliver each day.
Dynamex is migrating its Santa Cruz Operation (SCO) UnixWare-based COPS application to Oracle Corp. on Linux, running on three Intel-based PowerEdge 6450 servers. Each of the three servers, plus six others running Windows NT, 2000 or Sun Solaris, connects to a Dell PowerVault 530F SAN appliance. SCSI-based storage subsystems and tape libraries also attach to the PowerVault 530F to complete the SAN configuration. The PowerVault 530F connects to another Fibre Channel appliance located more than 10 miles away in Richardson, Texas. Data is replicated between the two sites.
“We replicate data [between sites] in real time as much as possible for disaster-recovery purposes,” Wicker says.
The SAN also lets Wicker monitor and redistribute the capacity of his storage resources, something he could not do when storage was attached directly to each server.
“In a direct-attached environment today, there are servers with overutilized storage and some that are underutilized,” Wicker says. “The SAN lets us balance the load or reallocate storage more dynamically. So if we have a very active server, we can reallocate storage to it, without ever taking the servers offline.”
Wicker chose Dell’s SAN because it supports the four operating environments Dynamex uses: SCO UnixWare, Windows NT, Windows 2000 and Solaris. The company has two terabytes of data now and expects to double that amount in the next two years.
In its migration to a SAN from direct-attached storage, Dynamex was able to consolidate 38 servers in different locations to nine that share storage resources. By bringing the storage under the same management umbrella, the company reduced the IT staff required to manage it. Additionally, the installation at Dynamex resulted in a fault-tolerant SAN that protects the company’s business – the information gathered for parcels Dynamex delivers.
“We generate a million dollars per day in revenue that can be split equally over the three servers [and storage] running our business applications,” Wicker says. “If you break that up by server, when a server goes down, it costs us 30 per cent of our business volume. That’s why we’re creating that type of fault tolerance with the SAN environment. We want 100 per cent uptime.”
A Look at NAS
Meanwhile, another company installed a number of NAS appliances to store the huge data files it uses in its gas exploration and discovery business.
At Anadarko Petroleum, an independent oil and gas exploration and distribution company in Houston, Mitch Williams, enterprise storage project manager, and Joan Dunn, manager of enterprise computing, have installed 14 Network Appliance file servers to hold the 40 terabytes of gas exploration data the company uses to discover oil reserves. The data Anadarko uses consists of files as large as 48GB that are retrieved by a geophysical exploration application from Landmark Graphics. Anadarko stores this data and the typical front-office word-processing and spreadsheet data on the Network Appliance file servers.
The type of data Anadarko uses dictated its storage choice. “We decided to go with a hybrid network consisting of a small SAN and network-attached storage,” Williams says. “We wanted a single architecture for our seismic exploration activities that could handle large data sets and read large blocks of data sequentially.” NAS is particularly well-suited to handling large files; Fibre Channel-based SANs work best for transactional data.
The company also has a 5-terabyte SAN that uses an EMC Symmetrix array connected to Sun and Hewlett-Packard Unix servers to store write-intensive database data. “We run extremely high-availability applications such as the gas marketing system, which can never go down, on the SAN,” Williams says.
While Williams chose NAS equipment in part based on price, he says implementing NAS was also a closer fit to his staff’s skill sets than a SAN would have been. “We had NAS experience with Auspex and Network Appliance file servers, and experience with Ethernet,” Williams says.
The iSCSI World
Finally, a communications infrastructure company is considering storage over IP (Internet SCSI, or iSCSI), which lets users transport SAN-style block storage data over the Ethernet network – a technology that converges NAS and SAN traffic onto the same network, but may not be readily available until next year.
IP storage, the transfer of SCSI data over Gigabit Ethernet, is still a technology in development – although a few users have experimented with it and are intrigued by the technology.
In an iSCSI configuration, a server contains a Gigabit Ethernet adapter that communicates with an iSCSI switch or router attached to the IP network. The switch is attached to a SCSI or Fibre Channel storage array. The iSCSI specification is currently an Internet Engineering Task Force draft, awaiting ratification. iSCSI relies on technology – Ethernet – that is already familiar and prevalent in the network infrastructure.
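As a rough illustration of what travels over that TCP connection, the sketch below builds the standard 10-byte SCSI READ(10) command descriptor block (CDB) that an iSCSI initiator wraps inside a command PDU and sends to the target. The PDU framing itself is omitted, and the example values are illustrative:

```python
import struct

def scsi_read10_cdb(lba: int, num_blocks: int) -> bytes:
    """Build a 10-byte SCSI READ(10) CDB. iSCSI carries CDBs like this
    inside its command PDUs, which travel over an ordinary TCP connection
    on the Ethernet network instead of over a Fibre Channel link."""
    return struct.pack(
        ">BBIBHB",
        0x28,        # READ(10) operation code
        0,           # flags
        lba,         # 4-byte logical block address (big-endian)
        0,           # group number
        num_blocks,  # 2-byte transfer length, in blocks
        0,           # control byte
    )
```

The point of the example is that the command itself is unchanged from direct-attached SCSI; only the transport underneath it moves from a dedicated storage interconnect to the familiar IP network.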
“iSCSI is attractive to us because we can now bring SAN and NAS data together over our IP network and offer it to broadband customers,” says Vikram Saksena, CTO for Narad Networks, a cable infrastructure company in Westford, Mass.
At Narad, Saksena is building a storage infrastructure at local cable facilities and using it to offer services such as video-on-demand or NAS for small and midsized businesses.
“As IP networks get faster and faster, iSCSI will become more popular because it allows us to run storage and other kinds of traffic on a common Ethernet network,” Saksena says.
Saksena says iSCSI is easier to manage and build than a Fibre Channel-based network, because Ethernet is a familiar technology. Rather than creating a separate management infrastructure for iSCSI, Saksena can apply the same principles of management to it that he does with his Ethernet network.
“With multiple levels of priority and quality of service, we can virtually segregate storage traffic from other types of TCP/IP traffic, so they don’t interfere with each other in a negative fashion,” Saksena says.
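One way such segregation can be expressed at an endpoint is by marking storage traffic with a DiffServ code point (DSCP), so switches can queue it separately from other TCP/IP flows. The sketch below is an assumption for illustration – the specific code point is a site policy choice, not something Saksena describes:

```python
import socket

# Hypothetical code point for storage traffic (AF31); the right value
# is a network-policy decision, chosen here only for illustration.
DSCP_STORAGE = 26

def marked_socket(dscp: int) -> socket.socket:
    """Create a TCP socket whose outgoing packets carry the given DSCP,
    letting QoS-aware switches prioritize or segregate the flow."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # The DSCP occupies the upper six bits of the IP TOS byte.
    s.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, dscp << 2)
    return s
```

Switch and router queues then map these markings to priority levels, which is the mechanism behind keeping storage and general-purpose traffic from interfering with each other on a shared Ethernet network.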