Next-generation Net seen at crossroads

People use the telephone network for voice and some IP data; they use the Internet for data and some voice. There will be one network someday. But today, there’s considerable debate on what that next-generation network will be: an outgrowth of the public switched telephone network or the Internet.

That debate took center stage in Washington, D.C. this month at the Next Generation Networks 2005 conference, a gathering of industry players and pundits. “There’s considerable uncertainty right now on what the next-generation network will look like,” said Dave Passmore, conference chair and research director of Burton Group.

That uncertainty creates standoffs between standards organizations such as the International Telecommunication Union (ITU) and the IETF. The debate also has to take into account the effect of wireless, video, message-based routing and peer-to-peer applications on foundational IP routing structures that were developed when those technologies could scarcely be imagined.

The uncertainty prompts questions about whether the Internet should be rebuilt using a “clean slate.” But most critical of all, the uncertainty could reach into the pockets of carriers and their customers.

The direction of next-generation networks could determine how tightly customers are tied to carriers, and who will be forced to reinvent their business models to stay alive. “We need to acknowledge the fact that something is wrong with the business model of the public network,” said Tom Nolle, president of consultancy CIMI Corp.

Passmore added, “Providers want to take back control of the Internet. They feel it’s out of control.”

Some consider the IP Multimedia Subsystem (IMS) architecture, endorsed by the telecom-heavy ITU for next-generation networks, as the key for operators to regain that control. Conceived by the 3rd Generation Partnership Project (3GPP) — a collaboration of telecom standards bodies initially chartered to define specifications for third-generation (3G) mobile wireless systems — IMS essentially replaces the control infrastructure of the traditional circuit-switched telephone network, separating services from the underlying networks that carry them.

IMS enables services, such as text messaging, voice mail and file sharing, to reside on application servers anywhere and be delivered by multiple wired and wireless service providers.
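IMS is built on Session Initiation Protocol (SIP) signaling: call/session control functions in the operator’s network route SIP requests to the application servers that host those services, whatever access network the subscriber happens to be on. As a rough, hypothetical illustration of the kind of signaling involved — far simpler than real IMS registration and session setup, and not drawn from the article — the Python sketch below assembles a bare-bones SIP INVITE; the identities and addresses are placeholder documentation values.

```python
# Minimal sketch of the SIP signaling IMS relies on -- not a real IMS client.
# All identities and addresses are hypothetical documentation values.
import uuid


def build_invite(caller: str, callee: str) -> bytes:
    """Assemble a bare-bones SIP INVITE.

    Real IMS signaling carries many more headers (Route, P-Asserted-Identity,
    security-agreement headers) negotiated during registration with the
    operator's call/session control functions.
    """
    call_id = uuid.uuid4().hex
    lines = [
        f"INVITE sip:{callee} SIP/2.0",
        f"Via: SIP/2.0/UDP client.example.net;branch=z9hG4bK{call_id[:8]}",
        f"From: <sip:{caller}>;tag={call_id[:6]}",
        f"To: <sip:{callee}>",
        f"Call-ID: {call_id}@client.example.net",
        "CSeq: 1 INVITE",
        f"Contact: <sip:{caller};transport=udp>",
        "Content-Length: 0",
        "",  # blank line ends the headers; no message body in this sketch
        "",
    ]
    return "\r\n".join(lines).encode("ascii")


if __name__ == "__main__":
    invite = build_invite("alice@ims.example.net", "bob@ims.example.net")
    print(invite.decode("ascii"))
    # A real terminal would transmit this to its proxy in the operator network, e.g.:
    # import socket
    # socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(invite, ("192.0.2.10", 5060))
```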

Yet some are skeptical of IMS. Although it enables the migration of the PSTN to IP while maintaining telephony-borne features such as emergency services, wiretaps, call handoff and billing, critics such as the IETF say it gives carriers too much control over the customer experience.

“We see an attempt to completely control quality of service by IMS,” said Scott Brim, Cisco Systems’ senior consulting engineer, who spoke at the conference on behalf of the IETF. “The IETF is concerned about this.”

One large corporate user is not. “We’re looking specifically at what they mean by control,” said the user, a director of network architecture at a US$40 billion company, who asked to remain anonymous. “But I do think IMS is a good direction. I want to know what carriers are doing” technologically.

HEMMED IN?

IMS skeptics described the architecture as a “walled garden” around customers. Proponents said it provides a secure, reliable, high-quality service experience for customers.

The Internet, on the other hand, provides only “best-effort” QoS, is prone to security breaches, denial-of-service and other attacks, and is generally less reliable than the PSTN. This prompted discussion among academics and researchers at the event as to whether the Internet should be rebuilt from scratch. The most important reason to rethink the Internet is security, said David Clark, senior research scientist at the Massachusetts Institute of Technology.

“We are suffering a success disaster,” Clark said, referring to the popularity and ubiquitous use of the Internet. “The first question is, ‘Isn’t today’s network good enough?’ Which applications can you not build because of today’s Internet? We do not have framework, architecture or a set of rules. We need to focus on resistance to attack and resiliency in the face of attack.”

Reliability is right up there with security as a reason to re-architect the Internet, according to Larry Peterson, professor and chair of computer science at Princeton University. The industry norm for reliability is five nines — 99.999 percent reliable — but the Internet is “a long way from five nines,” he said.
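As an illustrative back-of-the-envelope sketch of what those figures mean in practice (not a calculation from the article): five nines of availability allows only about five minutes of downtime per year, while a three-nines service can be down for almost nine hours.

```python
# Back-of-the-envelope: downtime budget per year at a given availability level.
MINUTES_PER_YEAR = 365.25 * 24 * 60  # roughly 525,960 minutes

for label, availability in [("three nines", 0.999),
                            ("four nines", 0.9999),
                            ("five nines", 0.99999)]:
    downtime_minutes = (1 - availability) * MINUTES_PER_YEAR
    print(f"{label} ({availability:.3%}): "
          f"~{downtime_minutes:.1f} minutes of downtime per year")
```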
