Is a new Internet architecture needed?

I got a call from a reporter the other day. He wanted to talk about the denial-of-service (DoS) attacks on prominent Internet sites, including Yahoo Inc., CNN and eBay Inc. He did have some idea what was going on (not always the case when I get such a call), but he seemed to want me to say that the architecture of the Internet needed to be changed to deal with such attacks. I declined to do so.

It is true that the Internet architecture’s openness makes the kinds of attacks that we saw a couple of weeks ago easier to launch while, at the same time, making it harder to track down the perpetrators. But it is that same openness that created the economic engine that the Internet has become. We need to be very careful not to overreact to the extent of killing the features that have made the Internet successful.

Two different types of attacks were used in the recent incidents – SYN flooding and smurf attacks. I wrote about smurf attacks almost two years ago, and SYN attacks have been known about for quite a while. Attackers using these techniques depend on forging the source addresses of the packets they send in order to hide their tracks.
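To see why forged source addresses matter, consider how a smurf attack works: the attacker sends ICMP echo requests to a network's broadcast address with the victim's address forged as the source, so every host on that network replies to the victim. A minimal sketch of the amplification arithmetic follows; the prefix size, packet size and request rate are illustrative assumptions, not figures from the recent incidents.

```python
# Rough amplification arithmetic for a smurf attack (illustrative numbers only).
# The attacker sends ICMP echo requests to a broadcast address, forging the
# victim's address as the source; every responding host replies to the victim.

REQUEST_SIZE_BYTES = 100      # assumed size of one echo request
RESPONDING_HOSTS = 254        # hosts on an assumed /24 that answer broadcast pings
REQUESTS_PER_SECOND = 1_000   # assumed rate the attacker can sustain

attacker_bits_per_sec = REQUEST_SIZE_BYTES * REQUESTS_PER_SECOND * 8
victim_bits_per_sec = attacker_bits_per_sec * RESPONDING_HOSTS

print(f"Attacker sends  {attacker_bits_per_sec / 1e6:.1f} Mbit/s")
print(f"Victim receives {victim_bits_per_sec / 1e6:.1f} Mbit/s "
      f"({RESPONDING_HOSTS}x amplification)")
```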

RFC 2267 describes how network managers can help protect the Internet from people or compromised computers at their sites by ensuring that packets leaving those sites do not carry forged source addresses. This RFC was published two years ago as an Informational RFC and has just been approved for republication as a Best Current Practice (BCP) RFC, the category the IETF uses for documents describing the best current thinking on how to perform some function.
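In practice, the RFC 2267 recommendation amounts to a simple check at a site's border: if an outbound packet's source address does not belong to one of the site's own prefixes, drop it. A minimal sketch of that check, assuming a hypothetical site with the illustrative prefixes shown:

```python
from ipaddress import ip_address, ip_network

# Prefixes legitimately used inside the site (illustrative values, not from the column).
SITE_PREFIXES = [ip_network("192.0.2.0/24"), ip_network("198.51.100.0/24")]

def permit_outbound(source_ip: str) -> bool:
    """RFC 2267-style egress check: pass only packets whose source address
    belongs to the site; anything else is presumed forged and dropped."""
    addr = ip_address(source_ip)
    return any(addr in prefix for prefix in SITE_PREFIXES)

print(permit_outbound("192.0.2.17"))    # True  - legitimate local source
print(permit_outbound("203.0.113.5"))   # False - forged/foreign source, drop
```

In real networks this logic lives in router access lists or firewall rules at the site's edge rather than in application code, but the decision being made is the same.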

Filtering, as described in RFC 2267, is not a cure-all because not everyone does it, and it does not stop the attack itself. But it can make tracking easier. There are well-known ways that sites can protect themselves from the effects of SYN attacks and other ways to filter out some of the effects of smurf attacks. But we are now seeing calls for more drastic actions.
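One of those well-known defenses against SYN flooding is the SYN-cookie idea: instead of keeping state for every half-open connection, the server encodes the essentials of the handshake into the initial sequence number it sends back, and only allocates connection state if a valid ACK returns. A minimal sketch of the principle, with simplified hashing and field layout that are my assumptions rather than a real TCP implementation:

```python
import hmac, hashlib, time

SECRET = b"server-secret"  # assumed per-server secret

def make_cookie(client_ip: str, client_port: int, server_port: int) -> int:
    """Derive an initial sequence number from the connection identifiers and a
    coarse timestamp, so no per-connection state is stored before the ACK."""
    minute = int(time.time()) // 60
    msg = f"{client_ip}:{client_port}:{server_port}:{minute}".encode()
    return int.from_bytes(hmac.new(SECRET, msg, hashlib.sha256).digest()[:4], "big")

def ack_is_valid(client_ip: str, client_port: int, server_port: int,
                 acked_seq: int) -> bool:
    """Accept the final ACK only if it acknowledges a cookie we could have sent."""
    expected = (make_cookie(client_ip, client_port, server_port) + 1) & 0xFFFFFFFF
    return acked_seq == expected

cookie = make_cookie("203.0.113.5", 40000, 80)   # value sent in the SYN-ACK
print(ack_is_valid("203.0.113.5", 40000, 80, (cookie + 1) & 0xFFFFFFFF))  # True
```

Because a flood of spoofed SYNs never produces valid ACKs, the server spends no memory on them; the attacker's forged sources simply never complete the handshake.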

At first glance, one of the most attractive methods would be to require that all Internet traffic include authentication information so that sites would know to whom they are talking. The technology to do this exists. But this cure would be far worse than the disease, because the same authentication would mean a perfect record could be kept of the activities of all Internet users – not a pleasant prospect for anyone who is remotely concerned with individual privacy.

Let’s try to figure out how to address the problems raised by the attackers without requiring each of us to undress for governments and big business.
