All I want for Christmas is computing power

Amazon.com Inc. is making a big bet, but it’s not on selling books, CDs or holiday gifts. Instead, it wants to sell you all the processing power you can eat. Rather than competing with your local bookstore, it’s taking on the likes of IBM, Hewlett-Packard Co. and Sun Microsystems Inc.

Amazon’s recently released Elastic Compute Cloud (which it calls EC2, and which is still in beta) brings grid computing and utility computing to the masses for the first time: the ability to buy server power in the same way you now buy electricity or water.

In essence, you pay 10 cents per virtual server per hour, plus bandwidth costs, and you do with that power whatever you want. While it’s not quite as simple as turning on your water tap, it’s the same basic idea. You pay only for the processing power you use, and how much you use is entirely in your control.

IBM, HP and Sun already sell computing power on demand, but they sell primarily to large enterprises, and on a very big scale. Amazon, on the other hand, sells to small and medium-sized businesses, as well as to large enterprises, and does it via unique technology that builds on previously released Amazon middleware services.

Not everyone agrees that the same company that offers 40 percent off best-sellers should try to become a big-time IT provider. But Amazon has always believed that books were only an entrée into selling far more sophisticated goods and services. Can it succeed? We’ll take a look inside the technology, then talk to the Amazon executives in charge of the service, which may give some hint as to whether it will pay off.

How it works

Let’s start off with a look at what the system is, how it works, and a brief history of it. EC2 is not, in fact, the first of this type of service that Amazon has launched; it’s an outgrowth of an existing platform called Amazon Web Services. Back in March of 2006, Amazon released its Simple Storage Service (S3), online metered storage that costs 15 cents per gigabyte per month of storage used, plus 20 cents per gigabyte of data transferred. It uses standard REST and SOAP interfaces.
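To give a feel for the REST interface, here is a sketch of the request signing S3 launched with: a canonical string built from the request is signed with HMAC-SHA1 and base64-encoded into the Authorization header. The bucket, key, date and credentials below are invented for illustration, and real requests carry more headers than shown.

```python
import base64
import hashlib
import hmac

def sign_s3_request(secret_key, verb, resource, date, content_type=""):
    """Build the signature for a classic S3 REST call.

    S3's original scheme concatenates the request's verb, Content-MD5,
    Content-Type, date and resource path, signs the result with
    HMAC-SHA1, and base64-encodes the digest.
    """
    string_to_sign = "\n".join([verb, "", content_type, date, resource])
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

# Hypothetical credentials and object -- for illustration only.
signature = sign_s3_request(
    secret_key="EXAMPLE-SECRET",
    verb="PUT",
    resource="/my-bucket/backups/db.dump",
    date="Tue, 27 Mar 2007 19:36:42 +0000",
)
# The result goes into an "Authorization: AWS <access-key>:<signature>"
# header; an HMAC-SHA1 digest always base64-encodes to 28 characters.
print(signature)
```

The point of the scheme is that the secret key never travels over the wire; only the signature does, and S3 recomputes it on its side to authenticate the caller.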

In July of 2006, Amazon followed on with the Simple Queuing Service (SQS), a scalable hosted queue that stores messages as they travel between computers. It’s designed to let developers easily move data between distributed application components, while ensuring that messages aren’t lost.

It can be used to transfer messages even when individual components aren’t currently available; once a component comes back online, the queue delivers its waiting messages. Again, it’s a metered model: costs are 10 cents per 1,000 messages sent, plus 20 cents per gigabyte of data transferred. Like S3, it uses REST and SOAP interfaces.
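The store-and-forward behavior is the essence of the model. The toy class below is not the SQS API, just a minimal in-process illustration of the idea: a producer keeps enqueuing while the consumer is down, and nothing is lost.

```python
from collections import deque

class SimpleQueue:
    """Toy stand-in for a hosted queue like SQS: messages wait in the
    queue until a consumer is available to receive them."""
    def __init__(self):
        self._messages = deque()

    def send(self, body):
        self._messages.append(body)

    def receive(self):
        """Return the oldest waiting message, or None if the queue is empty."""
        return self._messages.popleft() if self._messages else None

# The producer runs even while the consumer component is down...
q = SimpleQueue()
q.send("order-1001")
q.send("order-1002")

# ...and once the consumer comes back up, nothing has been lost.
assert q.receive() == "order-1001"
assert q.receive() == "order-1002"
assert q.receive() is None
```

Decoupling the two sides this way is what lets distributed application components fail and recover independently.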

In both instances, the technology wasn’t developed from scratch. Instead, Amazon used its own internal infrastructure and technologies and made them available to developers.

EC2 continues in that tradition. Put in the simplest terms, Amazon rents out virtual servers, which it calls “instances,” from the grids in its data centers. Each instance has the approximate power of a server with a 1.7GHz Xeon processor, 1.75GB of RAM, a 160GB hard drive and a 250Mbit/sec. Internet connection that can burst to 1Gbit/sec.

You pay 10 cents per hour for each instance, plus 20 cents per gigabyte of data transfer. You can also combine it with S3 and pay 15 cents per gigabyte per month for storage. In the future, Amazon will likely roll out other tiers of instances, with more powerful instances costing more per hour.
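To make the metering concrete, here is the arithmetic for a hypothetical deployment; the rates are the published launch prices, but the workload numbers are invented for illustration.

```python
# Published rates at launch (U.S. dollars).
INSTANCE_PER_HOUR = 0.10   # per virtual server per hour
TRANSFER_PER_GB   = 0.20   # per gigabyte of data transferred
STORAGE_PER_GB_MO = 0.15   # S3 storage, per gigabyte per month

# A made-up workload: 3 instances running around the clock for a
# 30-day month, shipping 50GB of data and keeping 100GB in S3.
instances, hours = 3, 24 * 30
transfer_gb, stored_gb = 50, 100

compute  = instances * hours * INSTANCE_PER_HOUR   # 3 * 720 * 0.10 = 216.0
transfer = transfer_gb * TRANSFER_PER_GB           # 50 * 0.20 = 10.0
storage  = stored_gb * STORAGE_PER_GB_MO           # 100 * 0.15 = 15.0
print(f"Monthly bill: ${compute + transfer + storage:.2f}")  # Monthly bill: $241.00
```

If the workload drops, so does the bill; shut the instances down for half the month and the compute charge halves with them, which is exactly the contrast with fixed-capacity hosting.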

This is a big change from most hosted models, in which you typically pay based on a maximum or planned capacity, plus fees for added redundancy. In the Amazon model, you pay only for what you actually use.

In order to use the service, you create a server image (called an Amazon Machine Image, or AMI) based on an Amazon spec. Ultimately, the image will be able to run whatever operating system, applications, configuration, log-ins and security you want; at the moment, only the Linux kernel is supported. Amazon also offers prebuilt AMIs, so you don’t have to configure one from scratch.

To use EC2, you upload the AMI, then invoke it and use it via an Amazon API. That virtual server can do anything you want — power a database, speed downloads, power search or host a Web site, for example. You treat the virtual servers just as if they were your own servers.
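As a rough sketch, invoking an uploaded image through the API amounts to an authenticated HTTP request naming an action and an image ID. The parameter names below follow EC2’s query-style API, but treat the details as illustrative rather than authoritative: real requests also carry credentials and a signature, which are omitted here, and the image ID is made up.

```python
from urllib.parse import urlencode

def build_run_instances_request(image_id, count=1):
    """Assemble the query string for a RunInstances-style call that
    launches `count` copies of the given machine image.

    A real request would additionally be signed with the caller's
    AWS credentials.
    """
    params = {
        "Action": "RunInstances",
        "ImageId": image_id,
        "MinCount": count,
        "MaxCount": count,
    }
    return "https://ec2.amazonaws.com/?" + urlencode(params)

# Launch one copy of a hypothetical image.
url = build_run_instances_request("ami-12345678")
print(url)
```

The same request shape, with a different Action, covers the rest of the lifecycle: describing running instances, terminating them, and so on.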

Users can have multiple AMIs, and those AMIs can cooperate with one another, just as physical servers can. So, for example, you could build a three-tiered application with three different AMIs: one tier running Apache as the Web server, a second handling the application logic, and a third running the database.
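That three-tier arrangement can be sketched in miniature. Each class below stands in for what would run on a separate instance, with plain function calls where the real tiers would talk over the network; all the names and data here are invented.

```python
class DatabaseTier:
    """Stands in for the AMI running the database."""
    def __init__(self):
        self._products = {"sku-1": {"name": "Widget", "price": 9.99}}

    def lookup(self, sku):
        return self._products.get(sku)

class AppTier:
    """Stands in for the AMI holding the application logic."""
    def __init__(self, db):
        self._db = db

    def product_page(self, sku):
        product = self._db.lookup(sku)
        if product is None:
            return None
        return f"{product['name']} -- ${product['price']:.2f}"

class WebTier:
    """Stands in for the Apache front end."""
    def __init__(self, app):
        self._app = app

    def get(self, path):
        body = self._app.product_page(path.rsplit("/", 1)[-1])
        return (200, body) if body else (404, "not found")

web = WebTier(AppTier(DatabaseTier()))
print(web.get("/products/sku-1"))  # (200, 'Widget -- $9.99')
```

Because each tier is a separate instance, a tier that becomes a bottleneck can be scaled by launching more copies of just that AMI.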

While there are clear benefits for small business, larger enterprises have signed on, too. For example, Microsoft Corp. has used the service to speed up software downloads, and Linden Lab has used it to help handle downloads of its Second Life online virtual world.

Where EC2 is headed

One major question raised by EC2 has nothing to do with technology, and everything to do with business: Has Amazon made a blunder by venturing outside of its core competency? After all, selling the latest best-seller and holiday gifts is one thing; trying to be a major league IT provider is something else entirely.

But Amazon execs don’t see things that way. In fact, they maintain, EC2 and similar services are at the heart of the Amazon business plan.

“Amazon is fundamentally a technology company; we’ve spent more than one and a half billion dollars investing in technology and content,” says Adam Selipsky, vice president of product management and developer relations for Amazon Web Services. “We began by retailing books, but it was never in our business plan to stay with that.”

Selipsky says that Amazon’s first major move into expanding its platform beyond books and basic retail came in 2000, when the company opened its platform to third-party merchants, who were able to sell their products on Amazon.

In 2002, the third wave began, he says, when Amazon launched the Amazon E-Commerce Service, which lets developers create applications that hook into Amazon’s database, retrieve and display product information, and build customer shopping carts.

Out of that grew Amazon’s Web Services initiatives, including S3, SQS, and EC2.

“The Web Service initiatives let us pass on the engineering expertise we’ve acquired through the years, and the sometimes painful lessons we’ve learned building a Web-scale business,” Selipsky explains. He adds that Amazon will continue to add other services for developers and businesses, although he would not be specific about what future services might be launched.

What Amazon Cloud means for grid computing

EC2 is one of the more innovative uses of grid computing and middleware, but it is far from the only one, and will certainly not be the last. Grid computing has been hyped for several years, but to date it has not lived up to that hype.

Robert Rosenberg, president of the analyst firm Insight Research, has been tracking grid computing for at least four years, and says there has been some progress.

Jim Love, Chief Content Officer, IT World Canada