IBM airs ‘utility’ wares

FRAMINGHAM (10/14/2003) – IBM Corp. last week continued to fill in its utility computing puzzle by introducing a variety of products, services and partnerships all designed to let companies more quickly deploy and more efficiently manage data-centre resources.

The company is aiming to make it possible for corporate users to link server, storage and network gear into one big, harmonious package that will help them adapt more quickly to changing business demands. While this latest technology that IBM is rolling out as part of its US$10 billion initiative isn’t groundbreaking, it does represent incremental steps toward automating data-centre systems and virtualizing resources.

“It’s going to be a long cultural shift for IT organizations to bring all their systems together, and adding pieces along the way could make the most sense for cost-conscious enterprise companies,” says Glenn O’Donnell, a research director at Meta Group Inc. Visibility into the mainframe and automation across that platform and other distributed platforms could help enterprise network managers get a better handle on the storage capacity they have, and storage virtualization could provide the tools to bring all that capacity into one shared resource pool.

The company announced:

– The TotalStorage SAN File System – code-named Storage Tank and set to be available next month starting at $90,000 – a software and hardware package that will let customers allocate storage to particular applications and provision servers.

– A partnership with Cisco Systems Inc. to develop standards that would let software from IBM and potentially others discover configuration errors on Cisco gear, and automatically reconfigure and/or provision new gear to accommodate a spike in demand.

– The IBM Web Infrastructure Orchestration package, which unites the company’s WebSphere, DB2 database and Tivoli Storage Manager with its BladeCenter servers, and is controlled and managed by Tivoli Intelligent ThinkDynamic Orchestrator policy-based software. Orchestrator is embedded in the blade servers and can shift resources among packaged items as needed.
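The policy-based resource shifting the article describes can be illustrated with a minimal sketch. This is not IBM's Orchestrator API — the threshold, pool structure and rebalancing rule here are invented for the example — but it shows the general idea of reassigning spare blades to whichever application pool runs hot:

```python
# Illustrative sketch (not IBM's Tivoli Orchestrator API): a simple
# policy loop that assigns spare blades to whichever application pool
# exceeds its utilization threshold.

THRESHOLD = 0.80  # policy: rebalance when a pool runs above 80% busy

def rebalance(pools, spare_blades):
    """pools: dict of app name -> {'blades': int, 'utilization': float}"""
    # visit the busiest pools first
    for app, stats in sorted(pools.items(),
                             key=lambda kv: kv[1]['utilization'],
                             reverse=True):
        while spare_blades and stats['utilization'] > THRESHOLD:
            stats['blades'] += 1          # provision one spare blade
            spare_blades -= 1
            # assume load spreads evenly across the enlarged pool
            stats['utilization'] *= (stats['blades'] - 1) / stats['blades']
    return pools, spare_blades

pools = {'websphere': {'blades': 4, 'utilization': 0.95},
         'db2':       {'blades': 2, 'utilization': 0.40}}
pools, spares = rebalance(pools, spare_blades=2)
```

In this toy run the overloaded WebSphere pool absorbs one spare blade and drops back under the threshold; a real orchestrator would drive actual provisioning actions rather than update a dictionary.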

The plans rely heavily on IBM’s autonomic computing technologies, which are part of its on-demand initiative that proposes to make network, server, storage and other resources available on an as-needed basis.

Big Blue, along with Hewlett-Packard Co., Microsoft Corp., Sun Microsystems Inc. and others, in the past year announced utility computing initiatives promising to reduce costs, pool resources, increase efficiencies across data centres and better align critical business services with IT infrastructure, including legacy systems.

That’s what IBM has in mind with Storage Tank, which links previously isolated server and storage units so large amounts of data can be accessed, stored and managed. Once virtualized, this pool of data can be automatically allocated or migrated to other systems as needed. Data, no matter where it resides or what operating system it runs on, looks as if it is part of a local file system.

“The SAN File System is one of the basic foundation technologies for utility computing,” says Jamie Gruener, a senior analyst with The Yankee Group. “The bottom line looking ahead, and one of the core things any vendor has to do, is to be able to virtualize the file and volume-level information in a way that it can be taken advantage of by other management tools and applications.”

Storage Tank performs its virtualization by assigning each piece of data metadata that describes its content. That metadata is then spread across the network’s servers and stored so it can be consulted whenever the data needs to be retrieved. According to IBM, Storage Tank will be able to manage petabytes of data. One petabyte, the company says, is 50 times the size of the data in all the books in the U.S. Library of Congress.
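The separation of metadata from data that this describes can be sketched in a few lines. This is an illustrative model only, not Storage Tank's actual design; the class and field names are invented for the example:

```python
# Illustrative sketch (not Storage Tank's actual design): a metadata
# service maps file paths to the device and block range where the data
# lives, so clients see one local-looking namespace regardless of
# which storage unit actually holds the bytes.

class MetadataServer:
    def __init__(self):
        self.index = {}  # path -> (device, offset, length)

    def register(self, path, device, offset, length):
        self.index[path] = (device, offset, length)

    def locate(self, path):
        # a client resolves the path here, then reads the blocks
        # directly over the SAN -- data never flows through this server
        return self.index[path]

meta = MetadataServer()
meta.register('/experiments/run42.img', device='shark-03',
              offset=0, length=4096)
location = meta.locate('/experiments/run42.img')
```

The key property, as in any SAN file system of this style, is that the metadata server handles only lookups; bulk data moves directly between clients and storage devices.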

Francois Grey, OpenLab development officer for CERN in Geneva, is installing IBM’s SAN File System to manage the terabytes of digital images the organization’s Large Hadron Collider proton and ion accelerator will generate.

“Each time protons collide it leaves tracks in large detectors we put underground. We take an image of the tracks and store it digitally,” Grey says. “We produce these images at a phenomenal rate – 10 to 15 petabytes of data – which we need to analyze and store for a decade.”

CERN is building a data grid to store these images throughout the world. “We are testing Storage Tank to see if it can solve our problems of managing this information,” Grey says.

The TotalStorage SAN File System software works with IBM’s TotalStorage Enterprise Storage Server (code-named Shark), SAN Volume Controller and Integration Server. It supports IBM AIX as well as Windows 2000 and Windows 2000 Advanced Server. SAN File System consists of two IBM eServer xSeries dual-processor servers that function as metadata repositories and a console management system, plus storage-area network (SAN) management software.

The company also has announced enhancements to its IBM TotalStorage SAN Volume Controller software to let it work with disk arrays from Hitachi and HP and be implemented on Cisco’s MDS9000 Fibre Channel switches.

IBM also improved its IBM Tivoli Storage Resource Manager Version 1.3 to let users analyze the capacity allocated to disks, groups of disks and virtual disks managed by the IBM TotalStorage SAN Volume Controller, and their connections to host systems and file systems.

“IBM is setting up the infrastructure for managing the network better down the road,” Gruener says.

Meanwhile, IBM announced with Cisco that the companies would work to develop standards for exchanging information about problems across corporations.

“Basically, what IBM is working to do is, for example, develop intelligence and products that will make IBM software capable of automatically provisioning Cisco gear,” says Steve Wojtowecz, director of strategy for Tivoli Software in IBM’s Software Group.

IBM has submitted a Common Base Event specification, which is envisioned as the basis for standardized exchange of problem determination data via Web services, to the Organization for the Advancement of Structured Information Standards.
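The idea behind such a specification is that products from different vendors report problems as one normalized record, so a single management tool can correlate them. The sketch below illustrates that idea only; the field names are invented for the example and are not taken from the Common Base Event specification, which defines an XML format exchanged via Web services:

```python
# Hypothetical illustration of a normalized problem-determination
# record; field names are invented, not from the Common Base Event
# spec (which is XML-based, not JSON).

from dataclasses import dataclass, asdict
import json

@dataclass
class ProblemEvent:
    source: str      # which component reported the problem
    severity: str
    message: str
    timestamp: str

def to_wire(event):
    # serialize for exchange between management tools; the real
    # specification would carry this as XML over Web services
    return json.dumps(asdict(event))

evt = ProblemEvent(source='switch-09', severity='WARNING',
                   message='port misconfigured',
                   timestamp='2003-10-14T12:00:00Z')
wire = to_wire(evt)
```

With every product emitting the same record shape, the consuming tool needs one parser instead of one per vendor — which is the interoperability case IBM and Cisco are making.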
