This week, Amazon (AMZN) Web Services, or AWS, kicked off re:Invent 2023 — its biggest conference of the year — with a show of force.
After a rock band banged out a rendition of "My Hero" by Foo Fighters on Tuesday morning, AWS CEO Adam Selipsky unveiled a deluge of announcements primarily related to AI. Star partners, including Nvidia (NVDA) CEO Jensen Huang and Anthropic CEO Dario Amodei, dropped by for cameos.
The surprise appearance of Nvidia's Huang was especially notable, as he joined Selipsky on stage to announce an expansion of the AWS-Nvidia partnership. The expanded alliance spans a number of initiatives, including what the companies call "the first cloud AI supercomputer," powered by Nvidia's Grace Hopper Superchip and AWS's UltraClusters. The cluster computing system gives customers access to thousands of powerful GPUs.
AWS will also be the first to host Nvidia's DGX Cloud, an AI-training-as-a-service platform capable of training generative AI and large language models (LLMs) beyond one trillion parameters.
Huang nodded to the companies' longtime relationship, telling the audience that "AWS was the world's first cloud provider to really recognize the importance of GPU-accelerated computing."
AWS and Selipsky emphasized the company's partnership with OpenAI competitor Anthropic, which makes the LLM Claude. In September, Amazon announced plans to invest up to $4 billion in Anthropic, whose backers also include Google (GOOG, GOOGL), Salesforce (CRM) Ventures, and Zoom (ZM) Ventures.
Selipsky took a swing at Microsoft (MSFT)-backed OpenAI and its recent boardroom melodrama, referencing the events of the "past 10 days" and saying that "you don't want a cloud provider that's beholden primarily to one model provider."
Amodei, who co-founded Anthropic, came on stage to discuss AI's applications in medicine, law, and finance, including a new partnership with Pfizer (PFE).
"We've seen a lot of organic adoption," said Amodei. "One example is in the biomedical space ... AI can advance medical science and bring about life-saving drugs around the world."
Additionally, AWS revealed a number of processors and chips focused on optimizing compute power and energy efficiency, including the Graviton4 and Trainium2. The latter is geared towards training large foundation models, or FMs, and LLMs faster while reducing cost.
The company's AI announcements snowballed from there. AWS, for example, introduced Guardrails, a product focused on AI safety that is now integrated into Bedrock, the company's platform for building generative AI apps.