
Amazon invests in massive AI model

Amazon is investing heavily in training a massive language model codenamed “Olympus,” with 2 trillion parameters, according to two people familiar with the matter. That would make it one of the largest AI models currently in development, roughly double the size of OpenAI’s GPT-4, which is reported to have about one trillion parameters.

The project is led by Rohit Prasad, former head of Alexa and now head scientist of artificial general intelligence (AGI) at Amazon. Prasad has brought together researchers from Alexa AI and the Amazon science team to train the model, uniting AI efforts across the company.

Amazon has already trained smaller language models such as Titan and has partnered with AI model startups such as Anthropic and AI21 Labs to offer their models to Amazon Web Services (AWS) customers. However, Amazon believes that having its own homegrown models could make its offerings more attractive to enterprise clients on AWS who want access to top-performing models.

Amazon’s investment in large language models is part of a broader race among tech giants to develop AI capabilities. OpenAI and Alphabet are also investing heavily in AI, with each company vying to build the most advanced models.

The sources for this piece include a Reuters report.