Gazing into the network

Ask people to imagine what enterprise networks will look like in the future and they generally laugh. Most are too busy worrying about how they’re going to get through next week to wonder about the technology landscape down the road.

But those who don’t have to worry about keeping networks humming day to day can afford to kill some time with such musings. We managed to pin down a few people for their thoughts on five developments and issues that will shape the future: the bandwidth boom; wireless; hackers and IT disasters; outsourcing; and the breaking of Moore’s Law.

The bandwidth boom

Demand for dedicated bandwidth will increase at a compound annual growth rate of 32 per cent from 1998 through next year, according to consultancy Vertical Systems Group. Vendors of all stripes are rushing to meet the demand, whether by laying fibre, launching satellites or squeezing more bandwidth out of existing networks.
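To put that growth rate in perspective, a quick back-of-the-envelope compounding calculation, sketched in Python below, shows how fast 32 per cent a year adds up. The baseline figure and the four-year span are assumptions chosen for illustration, not numbers from Vertical Systems Group.

    # Illustrative only: compound an arbitrary baseline at the cited 32% CAGR.
    # The baseline of 100 units and the four-year span are assumptions.
    cagr = 0.32
    baseline = 100.0
    for years in range(1, 5):
        demand = baseline * (1 + cagr) ** years
        print(f"After year {years}: {demand:.0f}")
    # Four years of 32% compound growth roughly triples demand (about 3.0x).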

The bandwidth boom shows no sign of abating. Recently, Cable & Wireless and Alcatel announced they were building an all-IP transatlantic cable that would boost capacity by more than one-third over current cables. Lucent also announced that Time Warner was buying an optical network system that uses dense wave division multiplexing and new amplifier technology to double the capacity of existing fibre.

At the same time, satellites slated to be operational within the next three years will be capable of delivering IP traffic downstream at 100Mbps, with 2Mbps uplinks, offering yet another alternative for “last-mile” connectivity.

As broadband fully unfolds in the consumer market, “it will substantially redefine the way that enterprises work and the way that they are integrated into society,” says Sandy Fraser, director of AT&T Labs Research. “It is a much more dynamic world when communication paths rather than collocation substantiates the relationship between an enterprise and its employees.”

The result will be companies that are far more distributed. Russ McGuire, chief strategist with the consultancy TeleChoice (a highly distributed organization), expects applications that he uses routinely today to communicate with colleagues, such as instant messaging, will grow to include video and other capabilities that take advantage of high-speed connections. “It’s the long-promised, never-delivered collaborative computing paradigm, with people working on the same document interactively,” he says.

Widespread wireless

An increasing share of the bandwidth fuelling the boom will be delivered sans wires.

Wireless Internet access devices, after double- and triple-digit shipment growth through 2004, will eventually displace the PC as the preferred means of Internet access, according to market researcher Cahners In-Stat Group.

The speed of such links is likely to increase dramatically in coming years, given improvements in satellite communications and other developments.

“In 15 years we will see the difference between wireline and wireless disappear entirely as wireless connections become more broadband and more reliable,” says James Kobielus, an analyst with The Burton Group and a Network World (U.S.) columnist. He envisions salespeople surfing the Web from the back seat of a taxi for last-minute background data while on the way to visit a potential customer. He also expects advanced telemedicine applications, such as wireless-enabled monitors built into wristwatches that alert healthcare professionals to dangerous vital signs in at-risk patients.

Ken Hyers, senior analyst for mobile commerce at In-Stat, expects the availability of ultra-wideband services in five to seven years will enable easier access from portable devices to corporate databases, including full-colour images and video. Salespeople will be able to instantly check inventory while sitting with a customer. Insurance adjusters will collect data and video at the site of a disaster and transmit it back to the home office for faster processing.

“It’s going to be the enterprise that’s going to drive a lot of these early applications because they have the need and the resources to pay for it,” Hyers says.

Under lock and key

The downside to wireless and broadband to the home is that it makes life easier for hackers, who don’t need any help.

“Employee-owned PCs, Palms and cell phones being used to tie into work systems really changes the security model,” says John Pescatore, a research director at Gartner. Fixing it will mean embedding firewalls and other security tools into home network hardware components such as PC cards and, increasingly, hubs and routers.

Eugene Spafford, director of the Center for Education and Research in Information Assurance and Security (CERIAS) at Purdue University, was part of a “security visionary roundtable” assembled last September by CERIAS and Accenture (formerly Andersen Consulting). The participants drew up a list of the top 10 trends affecting security and issued a “call to action” paper outlining the steps required to reverse them, including improving software quality, investing in training and packaging basic security architectures.

“I seriously doubt the paper will have much effect,” Spafford says. “There are a lot of people who agree with it, but no one wants to be the first to slow the time to market or invest extra resources.”

It’s likely to take outside pressure from government or something along the lines of a consumer boycott before industry will spend the time and money to adequately address the issues raised, he says. He fully expects such pressure will come.

“The trend is one toward an increasingly complex and fragile infrastructure that’s being used for more critical purposes,” Spafford says. “At some point, the fragility is going to overtake the utility and that will bring a call to action.”

Others also worry that complexity is a breeding ground for disaster. The separate events that brought down various Microsoft sites a while back are a case in point. One was attributed to a technician’s mistake during a router configuration update; the other was a denial-of-service caused by a buffer-overflow attack. In other words, one stemmed from human error in managing a complex system, the other from a simple attack made possible by a complex system with known vulnerabilities.

Outsourcing everything

As a way of dealing with the complexity inherent in providing solid security, and of running networks in general, companies will continue to outsource.

A lack of qualified personnel is one driver behind the outsourcing trend, especially in specialized areas such as security. Schools aren’t churning out qualified professionals fast enough to meet demand, says Spafford, who teaches at one of the few schools in the country that offer an undergraduate degree program in computer security. He doesn’t see an end in sight, given that those who could help fix the problem – Ph.D.s and other qualified faculty – are being lured into industry with large salaries. Even undergraduates get hired before receiving a diploma, he says.

The movement toward outsourcing is “a very natural evolution,” says William Pulleyblank, director of exploratory server systems at IBM Research. “We outsource everything as technology reaches an appropriate state.”

He cites electric power generation, which is now completely outsourced. “There was a time when everybody had their own generator,” he says. “Computing power has become much more of a utility. Companies now want to focus on their core competence, something they can do more cost effectively than anyone else.”

Lending a technical helping hand

Terraspring, a start-up in Fremont, Calif., is launching a new breed of service provider that it calls the IT infrastructure provider.

“We can give you all the hardware you need, but we can do it in seconds or minutes, not days or weeks,” says Angela Tucci, a founder of Terraspring. The company has developed technology that lets it quickly corral a set of servers and install one or more applications from a predefined application library. During peak hours a collection of servers can be configured to handle e-mail; after hours, the same servers can be put to work on data crunching and reporting. Customers can also quickly add resources as needed, paying only for what they use when they use it.
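Terraspring has not published its provisioning interfaces, but the idea of swinging a shared pool of servers between workloads by time of day can be sketched roughly as follows. Everything here, from the function names to the application images, is invented for illustration.

    # Hypothetical sketch of time-of-day reallocation of a shared server pool.
    # None of these names or images come from Terraspring's actual system.
    PEAK_HOURS = range(8, 18)  # assumed business hours

    APP_LIBRARY = {
        "email": "mail-server-image",
        "reporting": "batch-analytics-image",
    }

    def assign_workload(hour):
        """Choose which predefined application the pool should run right now."""
        return "email" if hour in PEAK_HOURS else "reporting"

    def provision(pool, hour):
        """Point every server in the pool at the image for the current workload."""
        image = APP_LIBRARY[assign_workload(hour)]
        return {server: image for server in pool}

    pool = ["srv01", "srv02", "srv03"]
    print(provision(pool, hour=10))   # peak: the pool handles e-mail
    print(provision(pool, hour=22))   # off-peak: the same servers crunch reports

Billing by actual use would then amount to metering how long each server spends assigned to each customer’s workload.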

“We’re changing the way people buy and manage computing resources,” she says. “Don’t go through a long procurement cycle, don’t take the assets on your books, don’t spend a lot of time configuring the unique possibilities of your server infrastructure.”

E-commerce is another outsourcing driver, says Andrew Efstathiou, program manager at The Yankee Group consultancy.

“In a Web environment, as people are accessing enterprise systems from outside, the cost of being down is very large,” he says. “Having 100 per cent availability and rapid recoverability is a difficult thing to achieve, so firms are outsourcing to address that issue.”

Breaking Moore’s Law

Whether you outsource or not, you can bet that demand for computing power will continue to rise. You can also bet that vendors will step up to meet the demand, and in new ways that rewrite old axioms.

IBM Research is at work on a cellular computing architecture that puts memory and processing power on the same piece of silicon. The data needed to solve a problem is distributed among the chips. Each chip processes its part of the data, analyzes it and shares results with other processors, but only as necessary.

“There is no notion that any central part will ever look at all the data,” Pulleyblank says. That means “applications can be much more efficient.”
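IBM has not released code for the architecture, but the partitioned, “no central view” pattern Pulleyblank describes resembles the software sketch below, in which each worker holds only its own slice of the data and only small per-slice results are ever exchanged. The names and the trivial sum stand in for real analysis.

    # Loose software analogy for the cellular pattern: no process ever holds
    # all of the data; only small per-partition results are shared.
    data = list(range(1_000_000))            # stand-in for a large data set
    chunks = [data[i::4] for i in range(4)]  # partitioned across four "chips"

    def local_result(chunk):
        """Each chip analyzes only its own partition of the data."""
        return sum(chunk)                    # placeholder for real analysis

    partial_results = [local_result(chunk) for chunk in chunks]
    print(sum(partial_results))              # combined answer, no central copy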

An example of this cellular architecture is Blue Gene, the computer IBM is building to study how proteins fold into various shapes, some of which cause disease. With about 1 million processors, Blue Gene is more powerful than the next 500 largest computers in the world combined, Pulleyblank says.

Whereas Blue Gene is being built for a specific function, he sees more widespread appeal for IBM’s cellular computing architecture. It could be used to tackle any problem where huge data sets can be broken down and mapped to many processors. Examples include logistics applications, such as airline schedules and warehouse distribution programs.

A big advantage of the cellular architecture is scalability, Pulleyblank says. “If data gets larger and a million processors isn’t enough, maybe we need two million, we can effectively extend it up to that level,” he says.

“Moore’s Law says that if I wait 18 months, the machine will be about twice as fast,” Pulleyblank continues. “If I look at a problem like protein folding, I won’t get enough compute power to actually study these problems for about 10 or 11 years, and I don’t want to wait that long. This is a way that lets me jump up power much faster.”
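The arithmetic behind that 10-to-11-year figure is easy to check against the 18-month doubling rule. The 100-fold target used below is an assumption picked only to illustrate the scale of the gap.

    import math

    # Rough check: with performance doubling every 18 months, how long until
    # machines are about 100 times faster? (The 100x target is an assumption.)
    target_speedup = 100
    doublings_needed = math.log2(target_speedup)   # about 6.6 doublings
    years = doublings_needed * 1.5                 # 18 months per doubling
    print(f"Roughly {years:.0f} years")            # prints about 10 years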

A key to the cellular architecture is that it reduces the latency inherent in storing data away from the computing processor by putting the processor and memory on the same chip.
