The changing face of P2P computing

Even the purists can’t seem to agree on the state of peer-to-peer computing. Is it dead, dying, just starting out, solidly entrenched or a phoenix rising from the flames sparked by Napster’s demise?

Deciding that starts with agreeing on exactly what P2P is.

Defining peer-to-peer computing – the key to deciding if it is waxing or waning – is a bit like trying to grab hold of a cloud.

“It is the direct transfer of resources between computers,” said Bob Knighten, peer-to-peer evangelist with Intel Corp. in Beaverton, Ore. But, he added, the company deliberately keeps the definition broad – to the point that it could even include client/server.

“A lot of old wine is being put in new bottles,” he admitted.

Andrew Mahon, director of strategic marketing at Groove Networks Inc. in Beverly, Mass., defines P2P as two or more machines making a direct connection to each other. This could be for file transfer, general communication or even the leveraging of spare cycles, he said.

“But among customers, I am finding that they couldn’t care less what the definition is,” he added.

Adam Beberg, CTO of Mithral Communications and Design in Santa Clara, Calif., said true P2P started 30-plus years ago, came back, got some hype and is now dying. He defines P2P this way: “Peer-to-peer is basically a new term for distributed computing and client/server…everything that is peer-to-peer out there is client/server.”

“There is no new technology here, the thing that is new is the hype.”

The reason for the name change, according to Beberg, is “they couldn’t just call it client/server and hype it.”

The concept

The idea behind P2P computing is having computers connect directly without having to run through central servers or censors.

The technology industry and the media that follow it love anything new – new releases, new gadgets, new concepts and new ideas. Since P2P, in the truest sense, was not new, it needed some cool technology to bring it back into the forefront. That technology was Napster, which is not really P2P. Napster caught the attention of the general public because people could get something for nothing.

“Clearly peer-to-peer has had the greatest success as a technology…on the consumer side. On the business side there has been very little use of peer-to-peer,” Mahon explained.

“The reason Napster is so successful is not that it is peer-to-peer and it is not even that it allows people to be independent in what they use. It is successful because of the content that they are getting,” Mahon said.

According to Beberg, “basically the stuff people are willing to pay for is the stuff people don’t want to put outside the firewall, so you are left with the stuff like Napster.”

George Reese, senior architect at Imaginet, an e-services company in Minneapolis, Minn., said, “Napster is the poster child of what is wrong with peer-to-peer.” Since the company had the power to control access, it opened up a can of worms by turning a blind eye to obvious copyright infringement.

“Regardless of the legal issues, Napster was walking a fine ethical line,” he said.

The content sellers are not all that excited about the P2P idea, but they are not exactly the most forward-thinking industry out there, Mahon added. “I think [it] is going to be a while before those companies and the peer technology guys are able to hit some happy medium.”

But there is some true peer-to-peer technology taking hold in the business world today, according to Don Bryant, the director responsible for system solutions at xwave in St. John’s. “I think one of the only definitions of P2P today is instant messaging, where you see no server infrastructure in between,” he said. Just put in the IP address and use the network to connect.
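A minimal sketch of that direct-connection idea in Python, assuming two machines that can reach each other over the network; the peer address and port below are hypothetical:

```python
import socket

PEER_IP = "192.0.2.10"  # hypothetical address of the other peer
PEER_PORT = 9000        # hypothetical port both peers agree on

def listen():
    """One peer waits for a direct connection, no server in between."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("", PEER_PORT))
        srv.listen(1)
        conn, addr = srv.accept()
        with conn:
            print(f"Message from {addr[0]}: {conn.recv(1024).decode()}")

def send(message: str):
    """The other peer connects straight to the listener's IP address."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((PEER_IP, PEER_PORT))
        cli.sendall(message.encode())
```

One peer runs listen() while the other calls send(); no directory or relay sits in the middle, which is the property Bryant is pointing to.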

Part of a bigger solution

Knighten said peer-to-peer computing is really about the vastly different scale of resources made available by interconnecting computers, whether through cycle sharing, cluster computing or something else.

“There are problems we really couldn’t solve when we did them the previous way which now it looks like we might be able to solve,” he said.

The company has started a philanthropic peer-to-peer program that allows users to donate unused computer resources to aid in medical research, in essence creating a virtual supercomputer.

“Intel definitely is looking at this not as something where we think we are going to make money at peer-to-peer computing…rather we think this is a good thing for the computer industry,” he said.

“With this whole peer-to-peer phenomena, thinking of it as a separate thing in its own right is going to be a fairly short-lived phenomena,” he predicted.

Another P2P-style project under way is SETI@home, a scientific experiment that uses Internet-connected computers in the Search for Extraterrestrial Intelligence (SETI). Participants download a program that helps analyse radio telescope data. With almost three million participants, the project has done the equivalent of more than 600,000 years of work for a single CPU.
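The cycle-sharing pattern behind a project like SETI@home can be pictured roughly as follows; the sample data, the analyse() step and the chunk size are invented for illustration, not SETI@home’s actual protocol:

```python
from queue import Queue

def make_work_units(samples, chunk_size=4):
    """Split a large dataset into small, independent work units."""
    return [samples[i:i + chunk_size] for i in range(0, len(samples), chunk_size)]

def analyse(unit):
    """Stand-in for the signal analysis a participant's PC would run."""
    return max(unit)  # e.g. the strongest reading in this chunk

# Coordinator side: queue up the work units.
pending = Queue()
for unit in make_work_units([3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8]):
    pending.put(unit)

# Participant side: each machine drains work during idle cycles
# and sends its result back.
results = []
while not pending.empty():
    results.append(analyse(pending.get()))

print("Strongest signal seen:", max(results))
```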

Security, security, security

Regardless of how the world of peer-to-peer evolves, the biggest concern is security. You don’t open your door to just anybody, so why should you open your hard drive?

“If peer-to-peer doesn’t address [security], I think that I can come as close as possible to guaranteeing that it will never be successful in a business,” Mahon said. “It will probably be successful on the consumer side because people just don’t take precautions.”

Reese agreed. “Security has to be one of the first things a peer-to-peer developer thinks of…security has to be the first and foremost concern.”

Even cycle sharing has security issues.

Clients must trust that users are not going to crash their machine or steal something other than the cycles being offered, Knighten said. On the other side, the company collecting the information has to know it is getting legitimate computations.

Possible solutions include making the data chunks small enough that they have no individual value, increasing encryption and authentication, and replicating data so there are multiple copies to verify veracity, he added.
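A toy illustration of that replication idea: farm the same work unit out to several clients and accept a result only when enough independent copies agree. The quorum value below is an arbitrary assumption:

```python
from collections import Counter

def verify_by_replication(replica_results, quorum=2):
    """Accept a result only if at least `quorum` replicas agree on it."""
    value, count = Counter(replica_results).most_common(1)[0]
    if count >= quorum:
        return value
    raise ValueError("Replicas disagree; recompute the work unit.")

# Three clients computed the same unit; one returned a bogus answer.
print(verify_by_replication([42, 42, 99]))  # -> 42
```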

“Software applications is the area that needs the most work. The hardware is there right now.”

The P2P of tomorrow

“We are roughly in the same place that the Web was about a year and a half after Mosaic came on the scene,” Knighten said. “Imagination is the limitation.”

“You may see teenagers with a T3 connection running to the house and they are paying for that by leaving their PC running in the night time and allowing people to utilize the cycles,” Bryant said.

And just for fun he went a bit further.

“What we are seeing is a convergence of the ability to take the human brain and have it communicate with other human brains through a network to create, in theory at least, exponentially better thinking,” he added.

“In terms of businesses, the Borg model is not a bad one…but I am not sure it fits very well with our idea of socio-political views and how we should move forward with technology and computing,” he laughed.

“Take the model and take out the good pieces,” he said.

“Now we are on the cusp of understanding both the technology and how it can potentially help us.”
