Running Claude Cowork in Amazon Bedrock
AWS published the official guide to running Claude Cowork on Amazon Bedrock — reuse the same Bedrock infrastructure you built for Claude Code to bring Claude to every knowledge worker, inside your AWS perimeter.
This article is a summary based on official documentation.
Overview
On April 21, 2026, AWS published the official deployment model for running Claude Cowork in Amazon Bedrock.
Organizations have been using Claude Code on Bedrock to boost developer productivity. Claude Cowork reuses the same Bedrock infrastructure to extend Claude to every knowledge worker — all inside the company’s AWS environment. Prompts, files, and completions stay inside the AWS perimeter, and billing runs through AWS consumption with no Anthropic seat licensing.
References
- AWS blog: From developer desks to the whole organization: Running Claude Cowork in Amazon Bedrock
- Claude docs: Use Claude Cowork with third-party platforms
- Claude Desktop download: claude.com/download
Key features
- Reuse Claude Code’s Bedrock infrastructure for Cowork — Organizations that already configured Bedrock for Claude Code (IAM, VPC endpoints, CloudWatch, CloudTrail) can reuse that infrastructure for Claude Cowork. Expanding AI adoption to non-developer knowledge workers — product managers, operations managers, finance analysts, research teams — doesn’t require a second build-out.
- Amazon Bedrock Inference Profile routing — Inference is configured via MDM (Jamf, Microsoft Intune, Group Policy) by pushing the model ID and an Amazon Bedrock Inference Profile. Choose in-Region, geo cross-Region, or global cross-Region profiles to match regional constraints and capacity needs.
- AWS-native security, observability, and audit — Authentication uses AWS IAM or Amazon Bedrock API keys, network isolation uses VPC endpoints, usage metrics export via OpenTelemetry to Amazon CloudWatch, and audit logs land in AWS CloudTrail. Consolidated AWS billing with granular cost attribution is supported.
- Data stays inside the AWS boundary — Per the AWS post, “Amazon Bedrock does not store prompts, files, tool inputs and outputs, or model responses, and does not use them to train foundation models.” The Claude Desktop agent loop, file tools, and MCP servers run locally on the user’s device. Only three outbound paths exist: Bedrock inference, allowlisted MCP servers, and aggregate Anthropic telemetry.
- Full Claude Desktop capabilities for knowledge-worker tasks — Projects, artifacts, memory, file upload/export, remote connectors, skills, plugins, and MCP servers all work. Example scenarios from the AWS post:
  - Product manager synthesizing customer meeting notes into product briefs
  - Operations manager consolidating scattered documentation into an SOP
  - Finance analyst transforming raw data into a formatted monthly review
  - Research team compiling findings from multiple sources into a single report
- Consumption-based AWS pricing, no seat licensing — Billing runs through the organization’s existing AWS agreement based on token consumption; there is no Anthropic seat licensing. Org-level cost modeling is simpler as a result.
Notes
- Other platforms are also supported — per the Claude docs, Google Cloud Vertex AI, Azure AI Foundry, and LLM gateways exposing `/v1/messages` work as well. This AWS announcement details the Bedrock path specifically.
- Not feature parity with Claude Enterprise — the Chat tab, project sharing, Dispatch, mobile, voice mode, Computer Use, and the plugin marketplace are not included. Most features that require Anthropic-hosted inference are out of scope.
- VDI support is limited — AWS WorkSpaces, AppStream 2.0, and Citrix DaaS are not supported; VDI deployments require nested virtualization.
- Telemetry can be disabled — default aggregate telemetry (token counts, model ID, error codes, anonymous device identifier) contains no prompts or credentials and can be turned off at the organization level.
- macOS and Windows only, in AWS Regions that support Claude models — download Claude Desktop from claude.com/download.
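As a concrete illustration of the IAM-based authentication described under Key features, an identity policy granting Cowork users Bedrock inference might look like the following sketch. The two actions are the standard Bedrock invocation permissions; the account ID (`111122223333`) and the wildcard resource ARNs are placeholders to adapt to your own models and inference profiles:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowClaudeInference",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": [
        "arn:aws:bedrock:*::foundation-model/anthropic.*",
        "arn:aws:bedrock:*:111122223333:inference-profile/*"
      ]
    }
  ]
}
```

Scoping `Resource` to specific model and profile ARNs (rather than wildcards) tightens the perimeter further and keeps CloudTrail audit entries easy to attribute.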
Frequently Asked Questions
What is running Claude Cowork in Bedrock?
A deployment model AWS published on 2026-04-21. Reuse the Amazon Bedrock infrastructure (IAM, VPC, CloudWatch, CloudTrail) you already configured for Claude Code to extend Claude Cowork to every knowledge worker — data stays inside your AWS perimeter.
How is it billed?
On AWS consumption (token usage) under your existing AWS contracts and billing — no per-user Anthropic license.
Are platforms other than Bedrock supported?
Per Claude's official docs, Google Cloud Vertex AI, Azure AI Foundry, and any LLM gateway that exposes `/v1/messages` are also supported. This announcement details the Bedrock path.
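For context, a gateway "exposing `/v1/messages`" means it accepts the Anthropic Messages API request shape. A minimal sketch of that payload — the model ID is illustrative, and the gateway URL and auth headers are whatever your gateway requires:

```python
import json

# Minimal Anthropic Messages API request body: the shape any
# /v1/messages-compatible gateway must accept. POST this JSON to
# https://<your-gateway>/v1/messages with your gateway's auth headers.
payload = {
    "model": "claude-sonnet-4-20250514",  # illustrative model ID
    "max_tokens": 1024,
    "messages": [
        {"role": "user", "content": "Summarize these meeting notes."}
    ],
}
body = json.dumps(payload)
print(body)
```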
What's missing compared to Claude Enterprise?
Chat tab, project sharing, Dispatch, mobile, voice mode, Computer Use, and the plugin marketplace are not provided — most features that depend on Anthropic-hosted inference are excluded.
Where are the official docs?
AWS blog: aws.amazon.com/blogs/machine-learning/from-developer-desks-to-the-whole-organization-running-claude-cowork-in-amazon-bedrock/