For the people who love to hate Microsoft, Christmas came early this year.

The company was using an automated agent to let kids talk to Santa and his elves. When one user managed to get an elf talking about oral sex, however, Microsoft quickly shut the service down, prompting jokes about Microsoft killing Santa. This may be the season for peace on Earth and goodwill towards men, but apparently not if they work for the world's largest software firm. That's a shame, because Microsoft may have done IT managers a favour by exposing a problem that is almost sure to haunt them in the years to come.

Like most such automated IM programs, the virtual agent was using artificial intelligence to create a realistic dialogue with its young users. It says something about the sophistication of the technology that the conversation could turn so naughty. Microsoft said it had tried to clean up the system's vocabulary, but it was probably wise not to let users goad the agent into even worse language. The whole point of this kind of software, after all, is that it tries to give you what you want. That makes it all the more prone to being warped into inappropriate behaviour.

You can laugh about what happened to Microsoft, but automated agents could soon become a common piece of the extended enterprise, and senior executives won't find it very funny if those agents are manipulated by users. Already we've seen agents deployed by schools as part of their distance education programs, and by businesses that use them to introduce products or services to Web site visitors. The emergence of rich Internet applications, and specifically the technological foundations that help create them, such as Adobe's AIR and Microsoft's Silverlight, means many more of them will likely pop up. Not all of them will have the intelligence of Microsoft's elves (or lack thereof), but they will undoubtedly continue to evolve as interactive tools.

Now imagine that, instead of trying to get an automated agent to swear, someone with the right kind of skills used one to dispense more sensitive information. With the right combination of questions, word choices or syntax, personal customer data, company financial information or even passwords could be ferreted out through an agent. How much is exposed will depend in part on how deeply the agent is connected to back-end systems, but the only point in creating those connections is to make agents more useful. As that happens, we will likely see the digital equivalent of a “horse whisperer”: someone who can hack a system simply by talking to its automated agent.
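One obvious defence against that kind of “horse whispering” is to put a filter between the agent's back-end connections and the user, scrubbing anything that looks sensitive out of a reply before it is sent. Here is a minimal sketch of the idea; the function name, patterns and redaction marker are illustrative assumptions, not any vendor's actual API, and a real deployment would need far more thorough rules:

```python
import re

# Illustrative guardrail: scrub an agent's outgoing reply before it reaches
# the user, so data pulled from back-end systems can't leak through a
# cleverly worded question. These patterns are examples, not an exhaustive
# or production-ready set.
SENSITIVE_PATTERNS = [
    re.compile(r"\b(?:\d[ -]?){13,16}\b"),       # card-number-like digit runs
    re.compile(r"password\s*[:=]\s*\S+", re.I),  # "password: ..." fragments
    re.compile(r"\b[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}\b"),  # e-mail addresses
]

def scrub_reply(reply: str) -> str:
    """Replace anything matching a sensitive pattern with a redaction marker."""
    for pattern in SENSITIVE_PATTERNS:
        reply = pattern.sub("[REDACTED]", reply)
    return reply

print(scrub_reply("Sure! The admin password: hunter2 is on file."))
# → Sure! The admin [REDACTED] is on file.
```

The design choice worth noting is that the filter sits on the agent's output, not its input: no matter how the question is phrased, the sensitive string never leaves the system.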

Before they start building RIAs and animated characters to walk customers through the process of applying for a mortgage or even signing up for an e-mail newsletter, IT managers should consider the fallout from this incident and explore the risks associated with advanced IM programs. That way they can head off similar problems before they happen. The real value in this particular horror story is what it can teach us, not just its punch line. Consider it a gift from Microsoft.
