Peer-to-peer computing: the next IT tsunami?

Peer-to-peer computing, which is all the rage lately among some IT cognoscenti and much of the media, is being compared to the ill-fated “push” craze a few years ago. Don’t make that mistake. Push was an interesting idea that never lived up to its hype. Peer-to-peer (P2P) is going to change things in a big way.

Don’t make the mistake, either, of thinking of P2P only in the Napster Inc. context. Yes, file sharing is an enormously valuable component of the P2P genre – though the entertainment industry’s paranoid reaction could damage the most promising computing architecture in years. But file sharing isn’t the only promising use.

The basic notion of P2P is that two computing devices (peers) share their information and brains with each other. With Napster and many other peer-to-peer technologies, the desktop machine contains a server, not just a client. It may be a miniserver designed to send out limited kinds of data, but it’s a server nonetheless.
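To make that concrete, here is a toy in-process sketch of the idea: every node carries both a "client" half and a "mini-server" half, so two machines can trade data directly with no central server in between. The names (`Peer`, `publish`, `fetch`) are invented for illustration and don't correspond to Napster or any real P2P protocol.

```python
# Toy model of a peer: each node is client AND mini-server.
# All names here are illustrative, not a real P2P API.

class Peer:
    def __init__(self, name):
        self.name = name
        self.files = {}  # data this peer is willing to serve

    def publish(self, filename, data):
        """Offer a file to the rest of the network."""
        self.files[filename] = data

    def serve(self, filename):
        """The mini-server half: answer another peer's request."""
        return self.files.get(filename)

    def fetch(self, filename, other_peer):
        """The client half: ask another peer directly."""
        return other_peer.serve(filename)

alice, bob = Peer("alice"), Peer("bob")
bob.publish("song.mp3", b"...audio bytes...")
data = alice.fetch("song.mp3", bob)  # straight from bob, no middleman
```

Note that `alice` never talks to a central machine; the transfer is strictly peer to peer, which is the whole architectural point.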

Servers can be peers. The domain name system, which lets computers find one another on the Internet, is in effect a group of servers peering with one another. Devices can be peers, and as we connect a billion devices to the Internet, we'll have no choice but to use P2P.

But P2P has also been defined to include some fascinating projects in distributed computing, which tap some of the power that typically sits idle on PCs around the world by breaking big number-crunching problems into small pieces. Many volunteer, non-profit projects are in the works, but several for-profit companies have sprung up to take advantage of this notion, too.
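The mechanics behind those projects can be sketched in a few lines: a coordinator divides a big job into independent chunks, each idle PC crunches one chunk, and the partial answers are combined. The chunking scheme and the sample workload (summing squares) are invented here for illustration; real projects distribute far heavier computations.

```python
# Toy sketch of distributed number-crunching: split the job,
# let each "peer" handle a chunk, then combine the results.
# The workload and chunking scheme are illustrative only.

def split_job(n, chunks):
    """Divide the range 1..n into roughly equal sub-ranges."""
    step = n // chunks
    bounds = [(i * step + 1, (i + 1) * step) for i in range(chunks)]
    bounds[-1] = (bounds[-1][0], n)  # last chunk absorbs any remainder
    return bounds

def crunch(lo, hi):
    """The work one idle PC would do: here, summing squares."""
    return sum(k * k for k in range(lo, hi + 1))

# The coordinator hands out chunks; each peer returns a partial result.
partials = [crunch(lo, hi) for lo, hi in split_job(1000, 4)]
total = sum(partials)
```

Because the chunks are independent, they can run on thousands of machines at once, and the combined total matches what a single machine would have computed.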

Another P2P use that has gotten lost in the Napster noise is what we might call the “read-and-write Web.” The Web has been turned principally into a read-only medium, but it wasn’t designed that way. New products allow people to write on-line from inside Web browsers, saving their work directly on the Web.

The real excitement, though, is in what's yet to come. At a recent meeting of P2P leaders in San Francisco, a representative from Intel, which is moving into P2P in a big way, offered some intriguing suggestions. For example, he said, peering computers might watch one another's backs from a security standpoint. Or a company might be able to distribute multimedia training materials more efficiently and cheaply if it didn't have to set up powerful servers all over the world. The possibilities are virtually endless.

IT people will be forgiven if they groan at this point. Client/server and network computing are hard enough. Now comes a whole new architecture. But the potential savings and utility are enormous. P2P is for real. We dismiss it at our peril.

Gillmor is a technology columnist at the San Jose Mercury News. Contact him at