Amazon Web Services (AMZN) isn't putting all its chips on one AI model. Instead, it aims to win by keeping friends close and potential rivals closer.
While other hyperscalers such as Microsoft (MSFT), Google (GOOG, GOOGL), and Meta (META) compete on large language models, AWS hopes to offer customers whatever models they want. The cloud giant is positioning itself not just as an artificial intelligence platform to train models but as a marketplace to sell them.
"There's not going to be one model that's going to rule them all," AWS CEO Matt Garman told Yahoo Finance at the Goldman Sachs 2024 Communacopia and Technology Conference. "The best model today may not be the best model tomorrow. … If you have a platform that has a bunch of models available, it's actually relatively easy to take advantage of new ones or add new capabilities from other providers as they come along."
Amazon CEO Andy Jassy mentioned this strategy on Amazon’s latest earnings call. Jassy noted that Amazon's Bedrock service has the "broadest selection of LLMs" and offers a variety of leading foundational models to AWS customers, who can then build their own AI applications using Anthropic’s Claude 3 models, Meta’s Llama 3 models, and Amazon's Titan models.
The latest example of this strategy emerged on Monday when AWS and Oracle announced a partnership despite 15 years of competition between the two cloud providers.
Cooperation, not direct competition, is how AWS plans to diversify its revenue streams and monetize AI. AWS is projected to bring in $105 billion in revenue this year — roughly 17% of Amazon's total revenue, according to estimates.
AWS not 'rushing in' to chatbots
When asked about the perception that Amazon is falling behind on AI — particularly when compared to Microsoft — Garman was quick to mention that for Microsoft, "it’s not their own technology. … OpenAI was listed as a competitor of theirs, which is an interesting dynamic for them. ... We like to partner, not necessarily compete."
In other words, AWS’s perceived sluggishness in AI adoption is a strategic feature, not a bug.
"We felt that it was more important to actually build a baseline platform that customers could really go build real enterprise value into their applications on as opposed to rushing in and quickly getting out chatbot technology that ... looks cool and allows people to write haikus," Garman said. "That's not really the eventual place that the value is going to come from this technology."