Amazon and Anthropic have announced a sweeping expansion of their strategic partnership, deepening collaboration across artificial intelligence infrastructure, cloud computing, and advanced model development in a deal set to reshape the global AI landscape.
Under the new agreement, Anthropic will spend more than $100 billion over the next decade on Amazon Web Services (AWS) technologies, including current and next-generation custom silicon such as Trainium AI accelerators and Graviton processors. The move signals a long-term commitment to AWS as the backbone for training and deploying Anthropic’s rapidly growing family of Claude AI models.
The partnership also significantly scales compute capacity, with plans to deliver up to 5 gigawatts of power for AI workloads. This includes the rollout of new Trainium3 infrastructure expected later this year, alongside expanded AI inference capabilities across Asia and Europe to meet surging global demand.
Since the initial collaboration began in 2023, more than 100,000 customers have deployed Anthropic’s Claude models on AWS, making it one of the most widely used AI model families on Amazon Bedrock. Both companies are also advancing Project Rainier, a massive AI compute cluster designed to support large-scale training and deployment, featuring hundreds of thousands of custom AI chips.
As part of the expanded deal, Amazon will invest an additional $5 billion in Anthropic, with up to $20 billion more tied to future performance milestones, building on its previous $8 billion commitment.
Andy Jassy, CEO of Amazon, said the company’s custom AI silicon is driving strong demand due to its performance and cost efficiency, noting that Anthropic’s decision to rely on AWS Trainium chips for the next decade reflects the progress of their collaboration.
Anthropic CEO Dario Amodei said the partnership will help the company scale infrastructure to meet growing demand for Claude, while continuing to push the boundaries of AI research and deployment.
The collaboration extends beyond infrastructure. Anthropic is working closely with Amazon’s chip design arm, Annapurna Labs, to shape future generations of AI processors. Meanwhile, AWS customers will gain seamless access to Claude models directly through their existing AWS accounts, with integrated billing, security, and deployment options.
The companies also highlighted rising enterprise adoption. Ride-hailing firm Lyft is using Claude on Amazon Bedrock to automate customer support, cutting resolution times by 87 percent, while pharmaceutical giant Pfizer is leveraging the models to accelerate research workflows and reduce operational costs.
With the expanded agreement, both Amazon and Anthropic reaffirmed their goal of making AWS the primary platform for large-scale AI development, positioning the alliance at the forefront of the global race to deliver next-generation artificial intelligence systems.