Iterate faster with Amazon Bedrock AgentCore Runtime direct code deployment
Amazon Bedrock AgentCore is an agentic platform for building, deploying, and operating effective agents securely at scale. Amazon Bedrock AgentCore Runtime is a fully managed capability of AgentCore that provides low-latency, serverless environments for deploying agents and tools. It provides session isolation, supports multiple agent frameworks (including popular open-source frameworks), and handles multimodal workloads and long-running agents.
AgentCore Runtime supports container-based deployments, where the container definition is provided in a Dockerfile and the agent is built as a container image. Customers with existing container build and deploy pipelines benefit from this method because agent deployment can be integrated into those pipelines.
Today, AgentCore Runtime has launched a second method to deploy agents: direct code deployment (for Python). Agent code and its dependencies can be packaged as a zip archive, eliminating the need for a Dockerfile and Amazon ECR dependencies. This makes it straightforward for developers to prototype and iterate faster. This method is a good fit for customers who would rather not need Docker expertise or manage container infrastructure when deploying agents.
In this post, we’ll demonstrate how to use direct code deployment (for Python).
Introducing AgentCore Runtime direct code deployment
With the container deployment method, developers create a Dockerfile, build ARM-compatible containers, manage Amazon ECR repositories, and upload a new container image for every code change. This works well where container DevOps pipelines have already been established to automate deployments.
However, customers looking for fully managed deployments can benefit from direct code deployment, which can significantly reduce deployment time and improve developer productivity. Direct code deployment provides a secure and scalable path from rapidly prototyping agent capabilities to running production workloads at scale.
We’ll discuss the strengths of each deployment option to help you choose the right approach for your use case.

With direct code deployment, developers create a zip archive of their code and dependencies, upload it to Amazon S3, and configure the bucket in the agent configuration. When using the AgentCore starter toolkit, the toolkit handles dependency detection, packaging, and upload, which greatly simplifies the developer experience. Direct code deployment is also supported using the API.
Let’s compare the deployment steps at a high level between the two methods:
Container-based deployment
The container-based deployment method involves the following steps:
- Create a Dockerfile that defines the agent and its dependencies
- Build an ARM-compatible container image
- Create an Amazon ECR repository and push the image to it
- Deploy to AgentCore Runtime
Direct code deployment
The direct code deployment method involves the following steps:
- Package your code and dependencies into a zip archive
- Upload it to S3
- Configure the bucket in agent configuration
- Deploy to AgentCore Runtime
How to use direct code deployment
Let’s illustrate how direct code deployment works with an agent created with the Strands Agents SDK, using the AgentCore starter toolkit to deploy the agent.
Prerequisites
Before you begin, make sure you have the following:
- Python 3.10, 3.11, 3.12, or 3.13
- Your preferred Python package manager installed. In this post, we use the uv package manager.
- An AWS account for creating and deploying agents
- Amazon Bedrock model access to Anthropic Claude Sonnet 4
Step 1: Initialize your project
Set up a new Python project using the uv package manager, then navigate into the project directory:
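The following is a minimal sketch of this step using uv (the project name my_agent is a placeholder; use any name you like):

```bash
# Create a new Python project and move into its directory
uv init my_agent
cd my_agent
```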
Step 2: Add the dependencies for the project
Install the required Bedrock AgentCore libraries and development tools for your project. In this example, dependencies are added to the pyproject.toml file; alternatively, they can be specified in a requirements.txt file:
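A hedged example using uv is shown below; the package names (bedrock-agentcore for the SDK, bedrock-agentcore-starter-toolkit for the CLI, and strands-agents for the Strands Agents SDK) reflect their published PyPI names, so verify them against the current documentation:

```bash
# Add the AgentCore SDK, the starter toolkit CLI, and the Strands Agents SDK;
# uv records these dependencies in pyproject.toml
uv add bedrock-agentcore bedrock-agentcore-starter-toolkit strands-agents
```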
Step 3: Create an agent.py file
Create the main agent implementation file that defines your AI agent’s behavior:
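The following is a minimal sketch of agent.py based on the documented BedrockAgentCoreApp and Strands Agents interfaces; the get_weather tool is a hypothetical stand-in so the agent can answer the weather prompt we test with later:

```python
from bedrock_agentcore.runtime import BedrockAgentCoreApp
from strands import Agent, tool

app = BedrockAgentCoreApp()


@tool
def get_weather(city: str) -> str:
    """Return a canned weather report for the given city (illustrative only)."""
    return f"It is sunny and 72F in {city}."


# Uses the Strands default Amazon Bedrock model; make sure you have model access
# as described in the prerequisites.
agent = Agent(tools=[get_weather])


@app.entrypoint
def invoke(payload):
    """AgentCore Runtime entrypoint: read the prompt from the payload and return the agent's reply."""
    user_message = payload.get("prompt", "Hello")
    result = agent(user_message)
    return {"result": result.message}


if __name__ == "__main__":
    app.run()
```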
Step 4: Deploy to AgentCore Runtime
Configure and deploy your agent to the AgentCore Runtime environment:
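Start by pointing the starter toolkit at your entrypoint file. This is a hedged example of the configure step; the exact flags may vary by toolkit version:

```bash
# Configure the deployment for the agent defined in agent.py
agentcore configure --entrypoint agent.py
```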
This launches an interactive session where you specify the S3 bucket to upload the zip deployment package to and choose a deployment configuration type (as shown in the following configuration). To opt for direct code deployment, choose option 1 – Code Zip.
Deployment Configuration
Select deployment type:
- Code Zip (recommended) – Simple, serverless, no Docker required
- Container – For custom runtimes or complex dependencies
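After the configuration completes, launch the deployment (command name as used by the starter toolkit CLI):

```bash
# Package, upload, and deploy the agent to AgentCore Runtime
agentcore launch
```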
The launch command creates a zip deployment package, uploads it to the specified S3 bucket, and launches the agent in the AgentCore Runtime environment, making it ready to receive and process requests.
To test the solution, let’s prompt the agent to see how the weather is:
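A hypothetical test invocation is shown below; the payload shape matches the entrypoint defined in agent.py:

```bash
# Send a test prompt to the deployed agent
agentcore invoke '{"prompt": "How is the weather in Seattle today?"}'
```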
The first deployment takes approximately 30 seconds to complete, but subsequent updates to the agent benefit from the streamlined direct code deployment process and should take less than half the time, supporting faster iteration cycles during development.
When to choose direct code instead of container-based deployment
Let’s compare direct code and container-based deployment across a few dimensions to help you choose the option that’s right for you:
- Deployment process: Direct code deployment deploys agents as zip files with no Docker, ECR, or CodeBuild required. Container-based deployment uses Docker and ECR with full Dockerfile control.
- Deployment time: Although there is little difference for the first deployment of an agent, subsequent updates are significantly faster with direct code deployment (from an average of about 30 seconds for containers to about 10 seconds for direct code deployment).
- Artifact storage: Direct code deployment stores zip packages in an S3 bucket. Container-based deployment stores Docker images in Amazon ECR. Direct code deployment incurs storage costs at standard S3 storage rates (starting February 27, 2026) because artifacts are stored in the service account. Container-based deployment incurs Amazon ECR charges in your account.
- Customization: Direct code deployment supports custom dependencies through zip-based packaging, while container-based deployment defines dependencies in a Dockerfile.
- Package size: Direct code deployment limits the package size to 250 MB, whereas container-based packages can be up to 2 GB in size.
- Language support: Direct code deployment currently supports Python 3.10, 3.11, 3.12, and 3.13. Container-based deployment supports many languages and runtimes.
Our general guidance is:
Container-based deployment is the right choice when your package exceeds 250 MB, you have existing container CI/CD pipelines, or you need highly specialized dependencies and custom packaging requirements. Choose containers if you require multi-language support, custom system dependencies, or direct control over artifact storage and versioning in your account.
Direct code deployment is the right choice when your package is under 250 MB, you use Python 3.10-3.13 with common frameworks like LangGraph, Strands, or CrewAI, and you need rapid prototyping with fast iteration cycles. Choose direct code if your build process is straightforward without complex dependencies, and you want to avoid the Docker, ECR, and CodeBuild setup.
A hybrid approach works well for many teams: use direct code deployment for rapid prototyping and experimentation, where fast iteration and simple setup accelerate development, then graduate to containers for production when package size, multi-language requirements, or specialized build processes demand it.
Conclusion
Amazon Bedrock AgentCore direct code deployment makes iterative agent development cycles even faster, while still providing enterprise-grade security and scale. Developers can now rapidly prototype and iterate by deploying their code directly, without having to create a container. To get started with Amazon Bedrock AgentCore direct code deployment, visit the AWS documentation.
About the authors
Chaitra Mathur is a GenAI Specialist Solutions Architect at AWS. She works with customers across industries in building scalable generative AI platforms and operationalizing them. Throughout her career, she has shared her expertise at numerous conferences and has authored several blogs in the Machine Learning and Generative AI domains.
Qingwei Li is a Machine Learning Specialist at Amazon Web Services. He received his Ph.D. in Operations Research after he broke his advisor’s research grant account and failed to deliver the Nobel Prize he promised. Currently he helps customers in the financial service and insurance industry build machine learning solutions on AWS. In his spare time, he likes reading and teaching.
Kosti Vasilakakis is a Principal PM at AWS on the Agentic AI team, where he has led the design and development of several Bedrock AgentCore services from the ground up, including Runtime, Browser, Code Interpreter, and Identity. He previously worked on Amazon SageMaker since its early days, launching AI/ML capabilities now used by thousands of companies worldwide. Earlier in his career, Kosti was a data scientist. Outside of work, he builds personal productivity automations, plays tennis, and enjoys life with his wife and kids.