AWS DevOps & Developer Productivity Blog
Category: AWS CodePipeline
Using AWS CodePipeline for deploying container images to AWS Lambda Functions
AWS Lambda launched support for packaging and deploying functions as container images at re:Invent 2020. In the post Working with Lambda layers and extensions in container images, we demonstrated packaging Lambda functions with layers while using container images. This post will teach you how to use AWS CodePipeline to deploy Docker images for a microservices architecture involving […]
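For reference, the deploy stage of such a pipeline ultimately points the function at a newly pushed image in Amazon ECR. Below is a minimal sketch using the AWS SDK for Python (boto3); the function name and image URI are placeholders, not values from the post.

# Minimal sketch: point an existing container-image Lambda function at a newly
# pushed image tag in Amazon ECR. The function name and image URI below are
# hypothetical placeholders.
import boto3

lambda_client = boto3.client("lambda")

lambda_client.update_function_code(
    FunctionName="orders-service",
    ImageUri="123456789012.dkr.ecr.us-east-1.amazonaws.com/orders-service:1.0.2",
)

# Block until the new image is active before the pipeline moves on.
lambda_client.get_waiter("function_updated").wait(FunctionName="orders-service")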
CICD on Serverless Applications using AWS CodeArtifact
Developing and deploying applications rapidly to users requires a working pipeline that accepts the user code (usually via a Git repository). AWS CodeArtifact was announced in 2020. It’s a secure and scalable artifact management product that easily integrates with other AWS products and services. CodeArtifact allows you to publish, store, and view packages, list package […]
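As a quick illustration of that package workflow, the sketch below lists the packages held in a CodeArtifact repository and fetches an authorization token for client tools; the domain and repository names are assumed placeholders.

# Sketch: list the packages stored in a CodeArtifact repository and fetch an
# auth token for pip/npm clients. Domain and repository names are placeholders.
import boto3

codeartifact = boto3.client("codeartifact")

token = codeartifact.get_authorization_token(domain="my-domain")["authorizationToken"]

packages = codeartifact.list_packages(
    domain="my-domain",
    repository="my-repo",
    format="pypi",  # restrict the listing to Python packages
)
for pkg in packages["packages"]:
    print(pkg["package"], pkg["format"])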
Deploy data lake ETL jobs using CDK Pipelines
This post is co-written with Isaiah Grant, Cloud Consultant at 2nd Watch. Many organizations are building data lakes on AWS, which provides the most secure, scalable, comprehensive, and cost-effective portfolio of services. Like any application development project, a data lake project must answer a fundamental question: “What is the DevOps strategy?” Defining a DevOps strategy for […]
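For orientation, a CDK Pipelines skeleton in Python looks roughly like the following; the repository, branch, and connection ARN are hypothetical placeholders, and the data lake ETL stacks themselves would be wrapped in a Stage and added to the pipeline.

# Minimal CDK Pipelines skeleton (AWS CDK v2, Python). Repository, branch, and
# connection ARN are hypothetical placeholders; ETL stacks would be added as
# stages with pipeline.add_stage(...).
from aws_cdk import Stack, pipelines
from constructs import Construct

class EtlPipelineStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        pipelines.CodePipeline(
            self,
            "DataLakeEtlPipeline",
            synth=pipelines.ShellStep(
                "Synth",
                input=pipelines.CodePipelineSource.connection(
                    "my-org/data-lake-etl",  # placeholder repository
                    "main",
                    connection_arn="arn:aws:codestar-connections:...",  # placeholder
                ),
                commands=[
                    "pip install -r requirements.txt",
                    "npm install -g aws-cdk",
                    "cdk synth",
                ],
            ),
        )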
Blue/Green deployment with AWS Developer tools on Amazon EC2 using Amazon EFS to host application source code
Many organizations building modern applications require a shared and persistent storage layer for hosting and deploying data-intensive enterprise applications, such as content management systems, media and entertainment workloads, and distributed applications like machine learning training. These applications demand a centralized file share that scales to petabytes without disrupting running applications and remains concurrently accessible from potentially […]
Use the Snyk CLI to scan Python packages using AWS CodeCommit, AWS CodePipeline, and AWS CodeBuild
Learn how to scan Python packages for security vulnerabilities using AWS Developer tools and Snyk
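The scan step in such a pipeline typically shells out to the Snyk CLI from a CodeBuild phase. A rough Python equivalent of that step is sketched below, assuming the CLI is installed in the build image, SNYK_TOKEN is exported, and the project uses a requirements.txt manifest.

# Rough sketch of the scan step. The --file and --package-manager values are
# assumptions about the project layout, not taken from the post.
import subprocess
import sys

def run_snyk_scan(manifest: str = "requirements.txt") -> int:
    """Run `snyk test` against a pip manifest and return its exit code."""
    result = subprocess.run(
        ["snyk", "test", f"--file={manifest}", "--package-manager=pip"],
        check=False,
    )
    return result.returncode

if __name__ == "__main__":
    # A non-zero exit code fails the build phase and stops the pipeline.
    sys.exit(run_snyk_scan())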
Enforcing AWS CloudFormation scanning in CI/CD Pipelines at scale using Trend Micro Cloud One Conformity
Integrating AWS CloudFormation template scanning into CI/CD pipelines is a great way to catch security violations before application deployment. However, implementing and enforcing this in a multi-team, multi-account environment can present challenges, especially when the scanning tools used require external API access. This blog will discuss those challenges and offer a solution […]
Continuous Compliance Workflow for Infrastructure as Code: Part 2
In the first post of this series, we introduced a continuous compliance workflow in which an enterprise security and compliance team can release guardrails in a continuous integration, continuous deployment (CI/CD) fashion in your organization. In this post, we focus on the technical implementation of the continuous compliance workflow. We demonstrate how to use AWS […]
Building a CI/CD pipeline to update AWS CloudFormation StackSets
AWS CloudFormation StackSets extend the functionality of CloudFormation stacks by enabling you to create, update, or delete one or more stacks across multiple accounts. As a developer working in a large enterprise or for a group that supports multiple AWS accounts, you may often find yourself challenged with updating AWS CloudFormation StackSets. If you’re […]
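The update step of such a pipeline ultimately reduces to StackSets API calls. A hedged sketch with boto3 follows; the stack set name and template URL are placeholders.

# Sketch: update a stack set and check on the resulting operation.
# Stack set name and template URL are hypothetical placeholders.
import boto3

cfn = boto3.client("cloudformation")

op = cfn.update_stack_set(
    StackSetName="guardrails-stackset",
    TemplateURL="https://s3.amazonaws.com/my-bucket/template.yaml",
    Capabilities=["CAPABILITY_NAMED_IAM"],
)
print("Started operation:", op["OperationId"])

# Optionally poll the operation until it finishes.
status = cfn.describe_stack_set_operation(
    StackSetName="guardrails-stackset",
    OperationId=op["OperationId"],
)["StackSetOperation"]["Status"]
print("Current status:", status)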
Building an end-to-end Kubernetes-based DevSecOps software factory on AWS
A DevSecOps software factory implementation can vary significantly depending on the application, infrastructure, architecture, and the services and tools used. In a previous post, I provided an end-to-end DevSecOps pipeline for a three-tier web application deployed with AWS Elastic Beanstalk. The pipeline used cloud-native services along with a few open-source security tools. This solution is similar, […]
Integrate GitHub monorepo with AWS CodePipeline to run project-specific CI/CD pipelines
Learn how to automatically trigger project-specific pipelines for users of GitHub monorepos. Currently, if a customer uses GitHub as a version control system with a single repository that contains multiple folders, each for a different project, a change to any file triggers the pipeline for the whole repository rather than for the appropriate project. This blog shows how to trigger the appropriate pipeline based on the project folder in which the file changed.
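One common way to implement this is sketched below under assumed names: a Lambda function receives the GitHub push webhook, maps the changed file paths to a top-level project folder, and starts only the pipeline registered for that folder.

# Sketch of a webhook-handling Lambda: map changed file paths from a GitHub
# push event to a project folder and start only that project's pipeline.
# The folder-to-pipeline mapping and pipeline names are hypothetical.
import json
import boto3

codepipeline = boto3.client("codepipeline")

# Hypothetical mapping of top-level monorepo folders to pipelines.
PIPELINES = {
    "service-a": "service-a-pipeline",
    "service-b": "service-b-pipeline",
}

def handler(event, context):
    payload = json.loads(event["body"])  # GitHub push webhook body
    changed = set()
    for commit in payload.get("commits", []):
        for path in commit.get("added", []) + commit.get("modified", []) + commit.get("removed", []):
            changed.add(path.split("/")[0])  # top-level project folder

    started = []
    for folder in changed:
        pipeline = PIPELINES.get(folder)
        if pipeline:
            codepipeline.start_pipeline_execution(name=pipeline)
            started.append(pipeline)

    return {"statusCode": 200, "body": json.dumps({"started": started})}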