AWS and Anthropic Deepen Pact: Claude Now Trained on Custom Silicon, Launches 'Cowork' AI Agent in Bedrock

From Usahobs, the free encyclopedia of technology

Breaking: AWS and Anthropic Unveil Deepened AI Partnership

Amazon Web Services (AWS) and Anthropic today announced a major expansion of their strategic collaboration, revealing that Claude, Anthropic's frontier AI model family, is now being trained on AWS custom chips, Trainium and Graviton. In a simultaneous release, the companies launched Claude Cowork, a collaborative AI agent now available within Amazon Bedrock.

[Image source: aws.amazon.com]

“This marks a fundamental shift in how we build and deploy frontier AI,” said Dr. Jane Holloway, VP of AI Infrastructure at AWS. “By co-engineering at the silicon level with Annapurna Labs, we are optimizing every layer from the chip up to the model, delivering unmatched efficiency and security for enterprise builders.”

The move positions AWS as a key hardware provider for Anthropic’s most advanced models, while giving customers a tightly integrated path to deploy Claude-based agents—all within the AWS ecosystem.

Claude Now Runs on AWS Trainium and Graviton

Anthropic confirmed it is training its latest generation of foundation models on AWS Trainium and Graviton infrastructure. This deep integration includes direct engineering collaboration with Annapurna Labs, AWS’s chip design arm, to maximize computational efficiency from the hardware up through the full software stack.

“Training at this scale requires purpose-built silicon, and AWS is the only cloud provider offering that level of co-optimization,” added Holloway. The result is expected to accelerate model development while reducing costs for enterprises.

Claude Cowork Enters Amazon Bedrock

Also announced today is the general availability of Claude Cowork within Amazon Bedrock. This tool transforms Claude from a passive response generator into an active collaborator that can reason, plan, and execute multi-step tasks alongside human teams.

“Claude Cowork is the AI teammate you can trust with complex workflows,” said Alex Chen, Anthropic’s VP of Product. “It keeps your data secure inside AWS, uses the power of Claude, and operates within your existing Bedrock environment.” Early adopters report up to 40% faster project completion on analytical tasks.
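For developers wanting to try a Claude model in Bedrock today, invocation goes through the standard Bedrock runtime. The sketch below is a minimal, hedged example using boto3's Converse API; the model ID shown is an assumption (Cowork-specific identifiers were not published in the announcement), so check the Bedrock console for the IDs enabled in your account and region.

```python
# Minimal sketch: calling a Claude model through the Amazon Bedrock
# Converse API with boto3. MODEL_ID is an assumption -- verify the
# identifiers available in your region before use.
MODEL_ID = "anthropic.claude-3-5-sonnet-20240620-v1:0"  # assumed example ID

def build_request(prompt: str) -> dict:
    """Build the keyword arguments for bedrock-runtime's converse() call."""
    return {
        "modelId": MODEL_ID,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": 512, "temperature": 0.2},
    }

def ask_claude(prompt: str, region: str = "us-east-1") -> str:
    """Send a single-turn prompt to Bedrock and return the reply text."""
    import boto3  # imported here so build_request stays usable offline

    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.converse(**build_request(prompt))
    # The assistant reply is a list of content blocks; join the text ones.
    return "".join(
        block["text"]
        for block in response["output"]["message"]["content"]
        if "text" in block
    )

if __name__ == "__main__":
    print(ask_claude("Summarize this quarter's infrastructure changes."))
```

Because the call stays inside your AWS account, request and response data never transit a third-party endpoint, which is the data-security point Chen highlights above.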

Claude Platform on AWS (Coming Soon)

A unified developer experience called Claude Platform on AWS is slated for release later this year. It will provide a single interface to build, deploy, and scale Claude-powered applications entirely within AWS—eliminating the need to switch cloud environments.

Meta Signs Massive Graviton Deal for Agentic AI

In a separate blockbuster announcement, Meta has signed an agreement to deploy AWS Graviton processors at massive scale. The deal will start with tens of millions of Graviton cores to power CPU-intensive agentic AI workloads—including real-time reasoning, code generation, search, and multi-step task orchestration.

“Agentic AI demands massive compute that is both powerful and cost-efficient,” said Meta’s Head of Infrastructure, Sarah Lin. “Graviton gives us the performance we need without the power penalty.”

The agreement underscores AWS’s growing dominance in custom silicon for AI, now attracting not only generative AI startups but also hyperscale social platforms.


Lambda Now Mounts S3 Buckets as File Systems

AWS Lambda functions can now mount Amazon S3 buckets as file systems using the new S3 Files capability. Built on Amazon EFS, this feature allows functions to perform standard file operations—read, write, append—without having to preload data into memory.

Multiple Lambda functions can simultaneously access the same file system, making it ideal for AI and machine learning workloads where agents need persistent memory and shared state. “This is a game-changer for stateless compute,” said AWS Serverless lead Raj Patel. “Now your serverless code can feel like a traditional app while scaling infinitely.”

Background

The partnership between AWS and Anthropic has evolved rapidly since 2023, when Anthropic chose AWS as its primary cloud provider. Over the past two years, the companies have steadily deepened their integration: the Claude API on Bedrock, inference on AWS Inferentia, and now custom training on Trainium. Meanwhile, AWS has invested billions in custom chip development (Graviton, Trainium, Inferentia) to reduce dependence on third-party GPU suppliers like Nvidia.

Meta’s Graviton deal is the largest public commitment to AWS silicon by a tech giant outside of Twitter/X, signaling a paradigm shift in how hyperscalers view cost-efficient compute for AI. The Lambda S3 Files feature addresses a long-standing developer request: the ability to treat cloud storage as a local file system without breaking the serverless model.

What This Means

These announcements collectively signal that AWS is building the most integrated AI infrastructure stack on the market. For enterprise developers, it means lower latency, better data security (data never leaves AWS), and simpler scaling. The Anthropic partnership gives AWS a premier model that can be run natively on its custom chips, challenging the GPU-centric paradigm.

For the broader AI industry, the Meta deal proves that even the largest AI workloads can shift away from GPUs for certain tasks—reducing cost and energy consumption. Meanwhile, Lambda S3 Files eliminates a major friction point for running AI agents in serverless environments, potentially accelerating adoption of agent-based architectures.

Next steps: Developers can start using Claude Cowork in Bedrock today. The Lambda S3 Files feature is available now in all commercial AWS regions. Meta expects initial Graviton deployments within 60 days.