Time to ‘throttle’ computer viruses

With Matthew Williamson

Matthew Williamson is a senior research scientist with the Biologically Inspired Complex Adaptive Systems group, part of Hewlett-Packard Labs in Bristol, England. Williamson's work, which started in robotics, is part of a growing trend in IT to investigate how biology can be emulated in computing. The difficulty of managing complex interactions between systems is the root cause of many problems plaguing everything from grid computing to security, and it is something nature is inherently good at handling. New computer antivirus solutions may come from imitating the human body's own antivirus responses. Williamson recently visited Toronto to present a paper, An epidemiological model of virus spread and cleanup, at the annual Virus Bulletin conference, and took some time to speak with ComputerWorld Canada department editor Chris Conrath.

What can living organisms teach us about computer security?

This is a starting point, but the trends in computing and in complexity are demanding a different approach, not just in security but also in managing computers. The old, simple metaphors people were using are breaking down, so there is a need for something new, and in a distributed, complex world the biologically inspired ideas are a really good fit.

There are different layers of granularity in what can be taken from the biological model. One is the idea that you could build intrusion detection systems deeply inspired by mechanisms in the biological world, such as detectors modelled on B and T cells (cells in the human immune system designed to hunt and destroy foreign antigens). That is one level. Another is to look at the metaphor: what is an immune system for a corporation? A third is to design systems around feedback loops and emergent properties. Those are three examples.

But one thing that comes out of biological systems is that they are never really running perfectly, nor are they completely broken, because if they were broken they would be dead. They are always in a sort of intermediate grey area, and they work very well in that grey area. Our computer systems, by contrast, are extremely binary: they are on or they are off, they are either working well or they are broken. As the complexity of everything gets bigger, making those binary decisions gets really hard. So I feel you have to embrace the fact that you can't be binary. We have to start thinking about living in this grey area.

Are people starting to embrace the idea of a non-binary/grey area IT?

I think it is getting more common. There is a general feeling in security that we are kind of losing the battle, and it is beginning to be grey and not black and white any more. But the dominant technologies and the dominant thinking are very binary, so the tools people are using are very binary. And there are some things you have to accept when you go grey that are not very palatable to people who think in a binary world. Accepting a rate of virus infections is not easy; people want their systems to be completely clean.

The virus throttling model, where did it come from?

I started at HP at about the time the Code Red and Nimda outbreaks were happening, so there was a lot of discussion about ways to deal with them. It grew out of thinking about things in a different way, thinking about this grey area. But to go down that road, people are going to have to accept that their systems are not going to be perfect.

Can you explain it in more detail?

When a machine is infected by a virus, it tries to contact many different machines very quickly, hundreds a second. A normal machine, by contrast, contacts the same machines over and over at a much lower rate. So you set a rate limit of, say, one new connection a second, and your normal traffic flows normally. It is like a bucket with a hole in it, and the hole lets out one connection per second. If you drip in one per second and one per second drops out, you won't have any water in the bucket. If you put in 400 per second, the bucket fills up with water, but still only one comes out. Of those 400, say five are legitimate traffic. In our first implementation those five would have been held up, but then we got cleverer about prioritizing.
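What Williamson describes is, in essence, a leaky-bucket rate limiter applied to connections to unfamiliar hosts. The following is a minimal sketch of that general technique in Python; it is not HP's implementation, and the class, method and parameter names are illustrative assumptions drawn from the interview.

```python
import time
from collections import deque

class ConnectionThrottle:
    """Leaky-bucket limiter for outbound connections to *new* hosts.

    Requests to unfamiliar hosts are queued (the "bucket") and
    released at a fixed rate. Normal traffic, which revisits the
    same few hosts, passes through almost untouched, while a
    fast-scanning worm quickly fills the queue.
    """

    def __init__(self, rate_per_second=1.0, capacity=400):
        self.interval = 1.0 / rate_per_second  # one release per second
        self.capacity = capacity               # bucket size
        self.queue = deque()                   # pending new-host requests
        self.recent_hosts = set()              # hosts we already talk to
        self.last_release = time.monotonic()

    def request(self, host):
        """Queue a connection; return False if the bucket overflows."""
        if host in self.recent_hosts:
            return True                        # familiar host: no delay
        if len(self.queue) >= self.capacity:
            return False                       # overflow: likely a worm
        self.queue.append(host)
        return True

    def release_due(self):
        """Let out at most one queued connection per elapsed interval."""
        now = time.monotonic()
        while self.queue and now - self.last_release >= self.interval:
            host = self.queue.popleft()
            self.recent_hosts.add(host)
            self.last_release += self.interval
            yield host
```

Calling request() on each outbound connection and draining release_due() on a timer would let roughly one new-host connection through per second; an infected machine generating hundreds of requests a second overflows the queue almost immediately, which is itself a useful detection signal.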

The cleverer version is based on looking at what is in the bucket and being more granular about prioritizing its contents. Say you are browsing the Web and you get the Slammer virus, which spreads via a single port. When you look at your bucket, you will see that 99 per cent of the traffic is Slammer-related. Throttling helps slow the spread, not by ridding the system of Slammer, but by reducing its impact. It can also prevent congestion on the network, because just a few servers infected with a particularly virulent worm can do a lot of damage to network connectivity by slowing down shared infrastructure.
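The prioritizing Williamson mentions can be sketched in the same spirit: look at what has piled up in the bucket and let the minority traffic, which is more likely legitimate, jump ahead of whatever destination port dominates the queue. Again this is an illustrative sketch under assumed names, not the actual product; Slammer did spread over a single UDP port (1434), which is what makes a dominant-port heuristic plausible here.

```python
from collections import Counter

def prioritize_queue(pending, dominance_threshold=0.9):
    """Reorder pending (host, port) connection requests so that
    requests to a single dominant port (e.g. the 99 per cent of
    queued traffic that is Slammer probing UDP 1434) drain last,
    letting the few legitimate requests out of the bucket first.
    """
    if not pending:
        return pending
    counts = Counter(port for _host, port in pending)
    top_port, top_count = counts.most_common(1)[0]
    if top_count / len(pending) < dominance_threshold:
        return pending  # no single port dominates; leave order alone
    legit = [req for req in pending if req[1] != top_port]
    suspect = [req for req in pending if req[1] == top_port]
    return legit + suspect
```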

Can we virus throttle today?

We are there already in terms of implementations (at HP Labs in Bristol, England), but it is one of those things that is down the line, and we are exploring commercial options.

Do you think the corporate world will accept the idea of an acceptable level of infection?

I think some people are beginning to talk like that, and I think they will have to; I don't think there is any other option. You are not going to win outright. Sometimes a problem isn't a problem so much as a fact to be coped with, and a lot of these things are becoming facts that we will have to work around and keep under control rather than get rid of. If we can do that, we can make our systems so they don't do this any more (Williamson draws a graph with occasional huge spikes when viruses hit). Instead they would show more frequent little blips. That has got to be cheaper.

Great technology often fails. The success of a technology is more often aligned to business practices than the technology itself. How does your work fit into HP overall?

To really make a technology take off, it has to be right in lots of different ways, of which the business side is one; you can't get away from that. I think that is a fact of life. At HP Labs we generate ideas, and the company then decides which ones it is going to pick up. Regardless, it is definitely a good ally to have on your side. You are much more likely to get traction with ideas when you have that sort of approval than you would in a much smaller place.

The downside, and I guess it isn't really a downside since it is the same in any corporation, is that you are at the mercy of what the business forces decide is important. Security is important for HP, as it is for most companies, so that puts us (in the Bristol lab) in a good place. But we do get actively involved in the business side because we would be the advocates for any technology. We build relationships with the parts of the company that might adopt these technologies and figure out the best ways to get them in.
