Amazon invests in massive AI model

Amazon is investing heavily in training a massive language model codenamed “Olympus,” with 2 trillion parameters, according to two people familiar with the matter. That would make it one of the largest AI models currently under development, surpassing OpenAI’s GPT-4, which is reported to have about one trillion parameters.

The project is led by Rohit Prasad, former head of Alexa and now head scientist of artificial general intelligence (AGI) at Amazon. Prasad has brought together researchers from Alexa AI and the Amazon science team to train the model, uniting AI efforts across the company.

Amazon has already trained smaller language models such as Titan and partnered with AI model startups such as Anthropic and AI21 Labs to offer their models to Amazon Web Services (AWS) users. However, Amazon believes that having its own homegrown models could make its offerings more attractive to enterprise clients on AWS, who want access to top-performing models.

Amazon’s investment in large language models (LLMs) is part of a broader race among tech giants to develop AI capabilities. OpenAI and Alphabet are also investing heavily, with each company trying to build the most advanced AI models.

This piece is based in part on reporting by Reuters.

IT World Canada Staff

