AWS launches generative AI tools

Generative artificial intelligence (AI) software such as ChatGPT has captured the imaginations of consumers and businesses alike, as they experiment with everything from search bots like Microsoft Bing Chat to embedded technology in existing customer support systems, such as Salesforce Einstein.

Today, Amazon Web Services (AWS) launched a series of tools to help its customers use the technology in their businesses.

“Over the last seven years, we played a key role in taking all the AI/ML experience that we have and democratizing it, making it accessible to anyone who wants to use it,” said Vasi Philomin, vice president and general manager, machine learning and AI at Amazon, during a media briefing.

“Now we want to take the same sort of democratizing approach to generative AI. And the way we’re going to do that is we’re going to take these generative AI technologies out of the realm of research and experiments and extend their availability far beyond just the handful of startups and large, well-funded tech companies.”

To that end, the company announced three initiatives to further the use of generative AI in businesses.

First, it announced Amazon Bedrock, a new service, now in limited preview, that makes foundation models (FMs) – very large models trained on massive amounts of data that can be used in a wide range of contexts – from AI21 Labs, Anthropic, Stability AI, and Amazon accessible via an API (application programming interface). Bedrock lets customers build their own applications on top of FMs through simple API calls instead of having to manage (and pay for) huge infrastructure, and gives them the ability to customize an FM to their own needs.
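Bedrock was in limited preview at the time of the announcement, so the details below are illustrative assumptions rather than a definitive recipe, but a minimal sketch using boto3’s bedrock-runtime client, with an assumed Titan text model ID and request schema, gives a sense of what building on an FM “through simple API calls” looks like:

```python
# Minimal sketch of invoking a foundation model through Bedrock's API.
# Assumes the boto3 "bedrock-runtime" client, an illustrative Titan text
# model ID, and an assumed request/response shape; schemas vary by model.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.invoke_model(
    modelId="amazon.titan-text-express-v1",  # illustrative model ID
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "inputText": "Summarize the key themes in this quarter's support tickets.",
        "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.5},
    }),
)

# Assumed response shape: a JSON document with generated text in "results".
result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```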

Amazon has also created its own FMs, dubbed Titan, and today announced the first two models in the family. Titan Text is a generative large language model (LLM) for tasks such as text summarization, text creation (blog posts, for example), classification, open-ended Q&A, and information extraction. Titan Embeddings translates text inputs (words, phrases, or possibly larger units of text) into numerical representations, known as embeddings, that capture the semantic meaning of the text. It can be used to improve search results and personalization.
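To see why that matters for search, consider a minimal sketch of embedding-based ranking. The embed() helper below is hypothetical, standing in for a call to an embeddings model such as Titan Embeddings; documents are ordered by the cosine similarity of their vectors to the query’s vector, so results that mean the same thing as the query rank highly even without shared keywords:

```python
# Sketch of embedding-based search: rank documents by how close their
# embedding vectors are to the query's vector (cosine similarity).
# embed() is a hypothetical helper standing in for a call to an
# embeddings model such as Titan Embeddings.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def search(query, documents, embed):
    query_vec = embed(query)
    scored = [(cosine_similarity(query_vec, embed(doc)), doc) for doc in documents]
    return sorted(scored, reverse=True)  # most semantically similar first
```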

The second announcement, AWS said, will enable companies building, customizing, and using FMs to do so more efficiently and economically: both Amazon EC2 Trn1n instances powered by AWS Trainium and Amazon EC2 Inf2 instances powered by AWS Inferentia2 are now generally available.

The company said that Trn1n instances double the network bandwidth of its Trn1 instances and are designed to deliver 20 per cent higher performance for large, network-intensive models. Trn1 instances on their own, it said, can deliver up to 50 per cent savings on training costs compared to other EC2 instances.

Amazon EC2 Inf2 instances powered by AWS Inferentia2 chips, AWS said, offer the highest performance, best energy efficiency, and lowest cost for running generative AI inference workloads at scale on AWS. Inf2 instances deliver up to 4x higher throughput and up to 10x lower latency than the prior generation, gains that the company said can translate into up to 40 per cent better inference price performance than any other EC2 instance.
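From a customer’s point of view, these are ordinary EC2 instance types. A rough sketch of requesting an Inf2 instance with boto3 follows; the AMI ID is a placeholder, and in practice an image with the AWS Neuron SDK installed would typically be used:

```python
# Sketch: requesting an Inferentia2-backed EC2 instance like any other
# instance type. The AMI ID is a placeholder; in practice a Deep Learning
# AMI with the AWS Neuron SDK would be chosen.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="inf2.xlarge",       # AWS Inferentia2-powered instance type
    MinCount=1,
    MaxCount=1,
)
```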

Finally, Amazon announced that Amazon CodeWhisperer, a generative AI tool that helps developers write better code more quickly, is now generally available and free for individual developers. It uses an FM under the hood to improve developer productivity by generating code suggestions in real time, based on developers’ natural-language comments and prior code, in their preferred integrated development environment (IDE) via the AWS Toolkit IDE extensions.
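The workflow is comment-driven: the developer writes a natural-language comment (and perhaps a function signature), and the tool proposes an implementation inline. The snippet below is a hypothetical mock-up of that interaction, not output captured from CodeWhisperer:

```python
# Hypothetical mock-up of comment-driven completion: the developer writes
# the comment and signature, and the assistant proposes a body like this.
# (Illustrative example, not actual CodeWhisperer output.)

# Parse an Apache access log line and return the client IP and status code.
def parse_access_log_line(line: str) -> tuple[str, int]:
    parts = line.split()
    client_ip = parts[0]
    status_code = int(parts[8])
    return client_ip, status_code
```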

CodeWhisperer has been in preview since last year. The generally available version adds 10 new programming languages to the original list of Python, Java, JavaScript, TypeScript and C#.

“During the preview, we ran a productivity challenge, and participants who used CodeWhisperer completed tasks 57 per cent faster, on average, and were 27 per cent more likely to complete them successfully than those who didn’t use CodeWhisperer,” AWS said in its announcement. “This is a giant leap forward in developer productivity, and we believe this is only the beginning.”

Lynn Greiner
Lynn Greiner has been interpreting tech for businesses for over 20 years and has worked in the industry as well as writing about it, giving her a unique perspective into the issues companies face. She has both IT credentials and a business degree.
