AWS Cloud Financial Management
5 ways to use Kiro and Amazon Q to optimize your Infrastructure
It’s Friday morning. You’re expecting an easy day when suddenly—ding—a budget alert hits your inbox. Not only have you been notified, but so has your manager and the FinOps team. Your relaxed Friday just disappeared.
Sound familiar? This scenario happens more often than it should. With the Kiro CLI or the Amazon Q Developer IDE plugin, AWS's generative AI-powered assistants, you can prevent these panic-inducing moments while saving significant money. Here are five powerful ways to use AI to optimize your AWS infrastructure, drawn from the re:Invent 2025 talk Optimize AWS Costs: Developer Tools and Techniques.
1. Find optimization opportunities with MCP (Model Context Protocol) servers
What it does: MCP servers enable Kiro to interact directly with AWS cost management tools such as AWS Cost Optimization Hub, AWS Cost Explorer, and the AWS Billing and Cost Management APIs.
How to use it: Instead of manually navigating the AWS Console, ask Kiro directly from the Kiro IDE or from your terminal using the Kiro CLI:
"Get me compute optimizer recommendations for my account"
Kiro will:
- Connect to Cost Optimization Hub via MCP
- Pull all optimization recommendations
- Display them right in your terminal or IDE
- Show you savings opportunities for instances, auto-scaling groups, and more
Setup tip: Configure MCP servers in your Kiro settings file by adding entries for:
- Cost Explorer
- Pricing
- CloudFormation (for your IaC deployments)
- Billing and Cost Management
Real-world impact: What used to take five+ minutes navigating consoles now takes seconds in your development environment.
Here is an example mcp.json file that can help with your cost optimization:
```json
{
  "mcpServers": {
    "awslabs.cost-explorer-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.cost-explorer-mcp-server@latest"],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR",
        "AWS_PROFILE": "<PROFILE>"
      },
      "timeout": 120000,
      "disabled": false
    },
    "awslabs.aws-pricing-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.aws-pricing-mcp-server@latest"],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR",
        "AWS_PROFILE": "<PROFILE>",
        "AWS_REGION": "us-east-1"
      },
      "disabled": false,
      "autoApprove": []
    },
    "awslabs.billing-cost-management-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.billing-cost-management-mcp-server@latest"],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR",
        "AWS_PROFILE": "<PROFILE>",
        "AWS_REGION": "us-east-1"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}
```
2. Automatically apply recommendations to your Infrastructure as Code
What it does: Kiro doesn’t just identify optimization opportunities—it can automatically update your infrastructure as code files.
The workflow:
- Ask Kiro to find optimization opportunities (like switching to Graviton instances)
- Give it your infrastructure folder path
- Tell it exactly what to change and where to deploy
Example prompt:
"Go through my infrastructure at /path/to/templates, find the instance that can be optimized, change it to the recommended instance type,
and give me the CLI command to deploy it"
What happens:
- Kiro searches through your infrastructure files
- Finds the specific resource flagged for optimization
- Updates the instance type (e.g., from t3.large to t3.medium or ARM-based Graviton)
- Provides you the exact deployment command
Time saved: A task that would take 10-15 minutes (finding the file, making changes, and testing) now takes about two minutes.
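To make the change concrete, here is a minimal sketch of the kind of edit Kiro makes, assuming a CloudFormation template with a single EC2 instance (the resource name, AMI parameter, and stack name are all illustrative):

```yaml
# Hypothetical snippet from template.yaml after Kiro's edit
Resources:
  WebServer:                    # illustrative resource name
    Type: AWS::EC2::Instance
    Properties:
      InstanceType: t4g.medium  # was t3.large; t4g is ARM-based Graviton
      ImageId: !Ref Arm64AmiId  # Graviton instances require an arm64 AMI
```

Note that moving to Graviton also means switching to an arm64 AMI, which Kiro should flag for you. It would then suggest a deployment command along the lines of aws cloudformation deploy --template-file template.yaml --stack-name my-stack.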
3. Optimize infrastructure files with one command
What it does: The /optimize command in Amazon Q Developer analyzes your entire infrastructure file and suggests comprehensive optimizations.
How to use it:
- Open a large infrastructure file (AWS CloudFormation, Terraform, etc.)
- Select the code
- Right-click → “Send to Amazon Q” → “Optimize”
Examples of what it finds:
- Storage tier optimizations (Amazon Elastic Block Store gp2 → gp3, S3 lifecycle policies)
- Networking improvements (VPC Flow Logs, NAT Gateway consolidation)
- Compute right-sizing opportunities
- Auto-scaling configurations
- Amazon CloudWatch Logs retention policies
- Database optimization (Amazon Relational Database Service right-sizing, AWS Graviton migrations)
Pro tip: The optimize command works on application code too! Use it on Python, JavaScript, or any language to find performance and cost inefficiencies in your code logic.
Example output:
Recommendations:
- Move EBS volumes from GP2 to GP3: ~$450/month savings
- Implement S3 Intelligent-Tiering: ~$780/month savings
- Optimize VPC Flow Logs retention: ~$120/month savings
- Convert EC2 instances to Graviton: ~$1,200/month savings
Total potential monthly savings: $2,550
4. Create cost-optimized Service Control Policies (SCPs) from natural language and validate your infrastructure against them
What it does: Kiro converts your cost governance rules written in plain English into proper AWS Service Control Policies.
The problem: SCPs are powerful but complex to write. You want to enforce cost best practices (like “only use GP3 volumes” or “deploy only in specific regions”), but writing the JSON policy syntax is tedious.
The solution: Describe your rules in plain language:
"Create an SCP with these rules:
- Use GP3 volumes instead of GP2
- Deny NAT Gateway creation outside the main VPC
- Enforce tagging on all resources
- Only allow specific EC2 instance types
- Restrict deployments to eu-west-1 and us-east-1"
Kiro will generate the complete SCP policy JSON for you.
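As a sketch, the generated policy for two of those rules might look like the following (the Sids are illustrative; ec2:VolumeType and aws:RequestedRegion are the relevant condition keys, and the NotAction list exempting global services is a common pattern you should tune to your own account before deploying):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyGp2Volumes",
      "Effect": "Deny",
      "Action": ["ec2:CreateVolume"],
      "Resource": "*",
      "Condition": {
        "StringEquals": { "ec2:VolumeType": "gp2" }
      }
    },
    {
      "Sid": "DenyOtherRegions",
      "Effect": "Deny",
      "NotAction": ["iam:*", "organizations:*", "sts:*"],
      "Resource": "*",
      "Condition": {
        "StringNotEquals": { "aws:RequestedRegion": ["eu-west-1", "us-east-1"] }
      }
    }
  ]
}
```

Always test SCPs in a sandbox organizational unit first; a Deny statement applies to every account under the OU where the policy is attached.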
Bonus feature: save conversations. If you can't finish the task, save your conversation and reload it later:
Save conversation as: scp-project.txt
Then later:
Load: scp-project.txt
"Summarize what we discussed about SCPs"
Validation before deployment: Before applying the SCP, ask:
"Review my infrastructure code based on this SCP
and tell me if I'll get any violations"
Kiro will scan your entire codebase and flag any resources that would be blocked by the new policy, saving you from deployment failures.
Auto-fix violations: ask "Fix all the SCP violations you found" and Kiro will update your files to comply with the policy.
5. Embed cost optimization into your development workflow with context/rules
What it does: Amazon Q Developer IDE plugin "rules" and Kiro CLI "context" files let you teach the AI your organization's best practices so it automatically builds consistently optimized infrastructure.
The problem: Every time you ask AI to create an AWS Lambda function or Amazon Elastic Compute Cloud instance, you have to remind it: “Use Graviton,” “Add CloudWatch log retention,” “Use GP3 volumes,” etc. This gets repetitive.
The solution: Create a context file that the AI applies to every request.
Setup:
- Create a rule file: optimization-rules.md
- Write your standards in plain English:
# My Infrastructure Rules
## Lambda Functions
- Always use ARM64 architecture (Graviton)
- Attach a CloudWatch Log Group with 7-day retention
- Use the latest Python runtime (3.12+)
## EC2 Instances
- Prefer Graviton instances when available
- Use GP3 volumes by default
- Always add proper tags: Environment, Owner, CostCenter
## Storage
- Use S3 Intelligent-Tiering for infrequently accessed data
- Set lifecycle policies on all S3 buckets
The AI will now follow these rules automatically.
Before context:
"Create a test Lambda function in YAML"
Result:
- Uses x86_64 (Intel) architecture
- No log group
- Python 3.9
After context:
"Create a test Lambda function in YAML"
Result:
- Uses ARM64 (Graviton) ✓
- Includes CloudWatch Log Group with 7-day retention ✓
- Uses Python 3.12 ✓
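As a sketch, the rules-compliant output might look something like this CloudFormation (the resource names, the role reference, and the inline handler are illustrative):

```yaml
Resources:
  TestFunction:
    Type: AWS::Lambda::Function
    Properties:
      FunctionName: test-function         # illustrative name
      Runtime: python3.12                 # latest Python per the rules
      Architectures: [arm64]              # Graviton per the rules
      Handler: index.handler
      Role: !GetAtt TestFunctionRole.Arn  # execution role omitted for brevity
      Code:
        ZipFile: |
          def handler(event, context):
              return {"statusCode": 200}
  TestFunctionLogGroup:
    Type: AWS::Logs::LogGroup
    Properties:
      LogGroupName: !Sub "/aws/lambda/${TestFunction}"
      RetentionInDays: 7                  # 7-day retention per the rules
```

Defining the log group explicitly matters: without it, Lambda creates one on first invocation with retention set to "never expire", which is exactly the cost leak the rules file is designed to prevent.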
Where it works:
- Amazon Q Developer
- Kiro CLI
- Syncs between both environments automatically
Why it matters: This file ensures every new resource you build is optimized from the moment it's deployed, with no need to go back and fix cost issues later.
Key Takeaways
- Work smarter, not harder: Let AI do the heavy lifting—finding resources, updating code, and generating policies
- Optimize the low-hanging fruit forever: Use rules and SCPs to bake optimization into your workflow
- Save conversations and context: AI can remember your preferences and continue where you left off
Stop getting those panic-inducing budget alerts. Start optimizing with AI today.
Want to learn more? Check out “The Keys to AWS Optimization” on Twitch and YouTube for regular cost optimization tips and techniques. Or see a version of this blog as a re:Invent talk on YouTube.