FrontierNews.ai

Claude on AWS Now Offers Day-One Access to New AI Features Without the Feature Lag

Anthropic has made its Claude Platform generally available on AWS, giving enterprise teams direct access to the full Claude API with all native features shipping on the same day they launch on Anthropic's platform. This removes a major friction point for AWS customers who previously had to wait weeks for new Claude capabilities to arrive through Amazon Bedrock, AWS's managed foundation model service.

What's the Difference Between Claude on AWS and Claude on Bedrock?

AWS customers now have two distinct options for building with Claude, and the choice depends on your data residency and feature needs. Claude Platform on AWS is operated directly by Anthropic, meaning data is processed outside the AWS boundary, but you get immediate access to all native Claude API features from day one. Claude on Amazon Bedrock, by contrast, is operated by AWS itself, keeps your data within AWS infrastructure, and is better suited for organizations with strict regional data residency requirements.

The practical difference matters most for teams building with cutting-edge Claude capabilities. Under the old Bedrock model, new tools like code execution and the Model Context Protocol (MCP) connector took weeks to arrive after launch. With Claude Platform on AWS, that lag disappears entirely.

What New Features Can AWS Teams Access Right Now?

The Claude Platform on AWS includes a comprehensive set of tools for building autonomous AI agents and data analysis workflows:

  • Claude Managed Agents (beta): Build and deploy AI agents at scale without managing separate orchestration infrastructure.
  • Code Execution: Run Python code directly within API calls to generate visualizations, analyze datasets, and process information without spinning up separate compute instances.
  • MCP Connector (beta): Connect Claude to any remote Model Context Protocol server without writing custom client code, enabling integration with existing databases and internal tools.
  • Web Search and Web Fetch: Access real-time data from the internet within Claude conversations.
  • Files API (beta): Upload and reference documents across conversations for document analysis and processing.
  • Skills (beta): Teach Claude best practices for consistent results across repeated tasks.
  • Advisor Strategy (beta): Have agents consult an advisor model for better decision-making in complex scenarios.
  • Prompt Caching: Reduce costs and latency on repeated context by caching frequently used information.
  • Batch Processing: Handle high-volume asynchronous workloads efficiently.
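To make one item above concrete, here is a minimal sketch of what a prompt-caching request body might look like. The model id and system-prompt text are placeholders, and the `cache_control` field shape is an assumption based on the Anthropic Messages API's documented convention; check the current API reference for exact values.

```python
# Sketch: a Messages API payload that marks a large, reused system
# prompt as cacheable so repeated calls can reuse it instead of
# reprocessing the same context every time.

def build_cached_request(system_text: str, question: str) -> dict:
    """Build a request body that caches the system prompt."""
    return {
        "model": "claude-sonnet-4-5",  # placeholder model id
        "max_tokens": 1024,
        "system": [
            {
                "type": "text",
                "text": system_text,
                # Marks this block for caching; later requests sharing
                # the same prefix can hit the cache.
                "cache_control": {"type": "ephemeral"},
            }
        ],
        "messages": [{"role": "user", "content": question}],
    }

payload = build_cached_request(
    "You are a contract-review assistant. <full policy text here>",
    "Summarize clause 4.",
)
```

Because the cached prefix is billed at a reduced rate on cache hits, this pattern pays off most when the same long context (a policy document, a codebase summary) is queried repeatedly.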

The platform also includes access to Claude's latest models: Claude Opus 4.7, Claude Sonnet 4.6, and Claude Haiku 4.5.

How Does Authentication and Billing Work?

One of the biggest operational advantages is that Claude Platform on AWS integrates directly with AWS Identity and Access Management (IAM). Teams use their existing AWS credentials and IAM policies, eliminating the need to manage separate API keys or learn new permission systems. Audit logging flows directly into AWS CloudTrail, and billing appears on a single AWS invoice that fully retires against existing AWS commitments.

This integration means that for teams already deep in the AWS ecosystem, adding Claude feels like a native AWS service rather than bolting on an external vendor.

How to Deploy Claude Agents on AWS

  • Enable the Service: Go to the Claude Platform on AWS page and enable the service in your AWS account.
  • Check Existing Commitments: If you have existing Bedrock private offers, contact your Anthropic or AWS account executive before migrating to avoid losing discounts.
  • Test Code Execution: Try the code execution tool with a sample dataset to understand how Python runs within API calls.
  • Build Your First Agent: Set up a Managed Agent for a simple internal workflow, such as summarizing support tickets or analyzing customer feedback.
  • Connect to Existing Services: Use the MCP connector documentation to wire Claude into your existing MCP servers without writing custom integration code.
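As a sketch of the last step above, this is roughly what an MCP-connector request body might look like. The server URL and server name are hypothetical, the model id is a placeholder, and the `mcp_servers` parameter shape is an assumption drawn from the API's MCP connector beta; consult the connector documentation for the exact fields and required beta header.

```python
# Sketch: attaching a remote MCP server to a Messages API request via
# the MCP connector, so Claude can call that server's tools without any
# custom client code on your side.

def build_mcp_request(server_url: str, user_prompt: str) -> dict:
    """Build a request body that exposes one remote MCP server to Claude."""
    return {
        "model": "claude-sonnet-4-5",  # placeholder model id
        "max_tokens": 1024,
        "mcp_servers": [
            {
                "type": "url",
                "url": server_url,
                "name": "internal-tools",  # hypothetical server name
            }
        ],
        "messages": [{"role": "user", "content": user_prompt}],
    }

request = build_mcp_request(
    "https://mcp.example.internal/sse",  # hypothetical MCP endpoint
    "List open support tickets tagged 'billing'.",
)
```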

What Are Real Teams Saying About the New Platform?

Early adopters have highlighted the operational simplicity and feature parity as the biggest wins. Jonathan Echavarria, Principal Research Scientist at a customer organization, noted the impact on his team's workflow:

"Claude Platform on AWS helped simplify how we access Claude, improved the experience for key users like our Claude Code engineers, and gave us a practical path to integrate further frontier AI capabilities into our cybersecurity and engineering workflows, while staying within our existing cloud operating model."

Tomas Oliva, AI Platform Engineer at OpenRouter, emphasized the consistency and feature velocity:

"Using Claude Platform on AWS gives OpenRouter and our users direct access to the latest and greatest features of the native Claude API. It has delivered consistent performance on uptime, latency, and throughput."

Support quality also emerged as a differentiator. Avinash Vishwakarma, Chief Architect at another customer, remarked:

"Support has been one of the best parts, and it felt like one team across both companies, not two separate relationships."

Why Does This Matter for Enterprise AI Development?

The feature lag between Bedrock and the native Claude API has been a persistent pain point for AWS-first organizations. Teams building production AI agents often found themselves choosing between staying within AWS infrastructure and getting access to cutting-edge Claude capabilities. Claude Platform on AWS eliminates that tradeoff.

For developers building agentic systems, the combination of Managed Agents, code execution, and the MCP connector means you can deploy production-grade autonomous agents without standing up separate orchestration infrastructure. The code execution feature is particularly powerful: instead of spinning up a Lambda function or EC2 instance to run data analysis, you can attach the code execution tool to a request and have Claude write and run the Python itself, returning results in the same API call.
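A minimal sketch of what such a request body might look like follows. The model id and the dated tool-type version string are assumptions based on the Anthropic API's versioning convention; check the current API reference for the exact values.

```python
# Sketch: a Messages API request body that enables the code execution
# tool, letting Claude write and run Python server-side to analyze data
# attached to or described in the prompt.

def build_code_exec_request(prompt: str) -> dict:
    """Build a request that turns on the code execution tool."""
    return {
        "model": "claude-sonnet-4-5",  # placeholder model id
        "max_tokens": 2048,
        "tools": [
            # Dated tool-type string follows Anthropic's versioning style;
            # the exact date suffix may differ from this assumption.
            {"type": "code_execution_20250522", "name": "code_execution"}
        ],
        "messages": [{"role": "user", "content": prompt}],
    }

req = build_code_exec_request(
    "Compute summary statistics for the attached sales CSV."
)
```

The response then interleaves Claude's text with tool-result blocks containing the executed code's output, so no separate compute needs to be provisioned on your side.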

The platform is available today in most AWS commercial regions and supports both global and U.S. inference geographies.