Old bugs don’t die very easily

One might think that a vulnerability first described in 1985 would not be a factor in today’s Internet, especially if a good way to eliminate the vulnerability was published in 1996. But, sad to say, that is not the case.

Reports surfaced recently that Internet consulting company Guardent Inc. claims an old bug has risen from its supposed grave.

TCP, described in RFC 793 (www.ietf.org/rfc/rfc793.txt), is the basic reliable data delivery protocol used in the Internet. TCP uses a data sequence number as the basis of its reliable delivery process.

When your computer sends data to another computer on an IP network, it breaks the data stream into chunks, known as packets, for transmission. With each packet it includes a sequence number that reflects how many bytes of data it has sent so far in that particular conversation.

The destination host responds with an acknowledgement packet containing the sequence number of the next byte of data that it expects to see. The sender uses this acknowledgement to find out what data has made it to the destination node.
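To make the bookkeeping concrete, here is a minimal sketch in Python of how that byte counting works. The class names and numbers are made up for illustration, and this is not a real TCP stack, but the rule it follows is the one described above: sequence numbers count bytes sent, and each acknowledgement names the next byte the receiver expects.

```python
# Minimal sketch of TCP-style sequence/acknowledgement bookkeeping.
# Class names, ports and byte counts here are illustrative only.

class Sender:
    def __init__(self, initial_seq):
        self.seq = initial_seq          # sequence number of the next byte to send

    def send(self, data, chunk_size=4):
        """Split data into chunks and tag each with its sequence number."""
        packets = []
        for i in range(0, len(data), chunk_size):
            chunk = data[i:i + chunk_size]
            packets.append((self.seq, chunk))
            self.seq += len(chunk)      # sequence numbers advance by bytes sent
        return packets


class Receiver:
    def __init__(self, expected_seq):
        self.expected = expected_seq    # next byte we expect to see

    def receive(self, seq, chunk):
        """Return the acknowledgement: the sequence number of the next expected byte."""
        if seq == self.expected:
            self.expected += len(chunk)
        return self.expected


sender = Sender(initial_seq=1000)
receiver = Receiver(expected_seq=1000)

for seq, chunk in sender.send(b"hello world!"):
    ack = receiver.receive(seq, chunk)
    print(f"sent seq={seq} len={len(chunk)}  ->  ack={ack}")
```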

In many environments, trust relationships are defined between hosts – for example, between a file server and its clients. An attacker with the knowledge of what sequence numbers a file server will use and the ability to forge IP addresses can fool the server into thinking it is talking to a trusted client when it is talking to the attacker. Computers can be configured to try to make it difficult to guess what initial sequence number will be used in a conversation.
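The danger of a guessable initial sequence number (ISN) is easy to demonstrate. The toy generator below is not any real vendor's algorithm; the constant and names are invented. It hands out ISNs by adding a fixed step per connection, so one legitimate connection is enough to predict the next ISN, which is exactly the foothold a spoofing attacker needs. A randomized generator offers no such pattern.

```python
# Toy illustration of why predictable ISNs matter (hypothetical algorithm).

import os

STEP = 64_000            # hypothetical fixed increment per connection
_counter = 1_000_000

def naive_isn():
    """Predictable: each new connection's ISN is the previous one plus STEP."""
    global _counter
    _counter = (_counter + STEP) & 0xFFFFFFFF
    return _counter

def randomized_isn():
    """Unpredictable: a fresh 32-bit random value for each connection."""
    return int.from_bytes(os.urandom(4), "big")

observed = naive_isn()                        # attacker opens one real connection
predicted = (observed + STEP) & 0xFFFFFFFF    # and guesses the next ISN
actual = naive_isn()                          # the server's next ISN
print(predicted == actual)                    # True: the guess succeeds
print(randomized_isn(), randomized_isn())     # no pattern to exploit
```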

But there have been problems in coming up with a good way to make it hard to guess the initial sequence number. Robert Morris’s February 1985 paper (ftp://ftp.research.att.com/dist/internet_security/117.ps.Z) details the above attack and makes some suggestions on how to prevent it. A decade later, Steve Bellovin published a more detailed description and set of recommendations in RFC 1948 (www.ietf.org/rfc/rfc1948.txt).
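RFC 1948's core suggestion is to keep the traditional clock-driven sequence space but offset it per connection with a keyed hash of the connection's addresses and ports, so an attacker who watches the ISNs of his own connections learns nothing about the ISNs used for anyone else's. The sketch below is a rough Python rendering of that idea; the function and variable names are mine, and RFC 1948 specifies the approach (suggesting MD5 as the hash), not this exact code.

```python
# Rough sketch of the RFC 1948 approach to choosing initial sequence numbers:
# ISN = M + F(local address, local port, remote address, remote port, secret)

import hashlib
import os
import time

SECRET = os.urandom(16)   # per-boot secret key, never sent on the wire

def initial_sequence_number(local_ip, local_port, remote_ip, remote_port):
    # M: the classic clock-driven counter (roughly one tick per 4 microseconds)
    m = int(time.time() * 1_000_000 / 4) & 0xFFFFFFFF

    # F: a keyed hash over the connection identifiers
    material = f"{local_ip}:{local_port}:{remote_ip}:{remote_port}".encode()
    digest = hashlib.md5(material + SECRET).digest()
    offset = int.from_bytes(digest[:4], "big")

    return (m + offset) & 0xFFFFFFFF

print(initial_sequence_number("10.0.0.1", 12345, "10.0.0.2", 80))
```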

But system vendors are sometimes not all that good at fixing their software to avoid vulnerabilities, even when the vulnerabilities have been known for a long time (centuries in Internet time).

In the case of TCP, vendors generally tried to plug the security hole after a well-publicized attack in 1994. But they did not then add the additional protections that Bellovin described two years later in RFC 1948, because those were seen as too hard to implement.

But Guardent’s report indicates that avoiding the hard work just meant that the problem did not go away. It is a truism in the security area that good security is not easy. This example should be taken as just another reminder of that truth for anyone concerned with the security of his own network and systems.

Scott Bradner is a consultant with Harvard University’s University Information Systems. He can be reached at [email protected].
