AWS DevOps & Developer Productivity Blog
Category: Learning Levels
CI/CD on Serverless Applications using AWS CodeArtifact
Developing and deploying applications rapidly to users requires a working pipeline that accepts user code (usually via a Git repository). AWS CodeArtifact, announced in 2020, is a secure and scalable artifact management service that integrates easily with other AWS products and services. CodeArtifact allows you to publish, store, and view packages, list package […]
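As a quick illustration of the kind of repository interaction the post covers, the following minimal sketch uses boto3 to fetch a CodeArtifact authorization token and list the packages in a repository. The domain, repository, and account values are placeholders, not values from the post.

```python
# Minimal sketch: querying an AWS CodeArtifact repository with boto3.
# The domain, repository, and account ID below are placeholders.
import boto3

codeartifact = boto3.client("codeartifact", region_name="us-east-1")

# Fetch a short-lived token that package managers (pip, twine, npm) can use
# to authenticate against the CodeArtifact repository.
token = codeartifact.get_authorization_token(
    domain="my-domain",
    domainOwner="111122223333",
    durationSeconds=900,
)["authorizationToken"]
print("Token length:", len(token))

# List the packages currently stored in the repository.
paginator = codeartifact.get_paginator("list_packages")
for page in paginator.paginate(
    domain="my-domain",
    domainOwner="111122223333",
    repository="my-repo",
):
    for package in page["packages"]:
        print(package["format"], package["package"])
```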
Use the Snyk CLI to scan Python packages using AWS CodeCommit, AWS CodePipeline, and AWS CodeBuild
Learn how to scan Python packages for security vulnerabilities using AWS Developer Tools and Snyk.
Building a centralized Amazon CodeGuru Profiler dashboard for multi-account scenarios
This post shows you how to configure CodeGuru Profiler to collect multiple applications’ profiling data into a central account and review the applications’ performance data on one dashboard.
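For context on where the dashboard's data originates, here is a minimal, assumed sketch of starting the CodeGuru Profiler Python agent against a profiling group. The group name and region are placeholders, and collecting data into a central account as the post describes also requires the appropriate IAM permissions.

```python
# Minimal sketch: reporting profiling data to an Amazon CodeGuru Profiler
# profiling group from a Python application. The group name and region are
# placeholders; a cross-account setup additionally needs IAM permissions
# allowing the agent to submit data to the central account's profiling group.
from codeguru_profiler_agent import Profiler

Profiler(
    profiling_group_name="my-central-profiling-group",
    region_name="us-east-1",
).start()

# ... the application runs as usual; the agent samples it in the background.
```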
Enforcing AWS CloudFormation scanning in CI/CD Pipelines at scale using Trend Micro Cloud One Conformity
Integrating AWS CloudFormation template scanning into CI/CD pipelines is a great way to catch security violations before application deployment. However, implementing and enforcing this in a multi-team, multi-account environment can present some challenges, especially when the scanning tools used require external API access. This post discusses those challenges and offers a solution […]
Building an end-to-end Kubernetes-based DevSecOps software factory on AWS
A DevSecOps software factory implementation can vary significantly depending on the application, infrastructure, architecture, and the services and tools used. In a previous post, I provided an end-to-end DevSecOps pipeline for a three-tier web application deployed with AWS Elastic Beanstalk. The pipeline used cloud-native services along with a few open-source security tools. This solution is similar, […]
Hackathons with AWS Cloud9: Collaboration simplified for your next big idea
Many organizations host ideation events to innovate and prototype new ideas faster. These events usually run for a short duration and involve collaboration between members of participating teams. By the end of the event, a successful demonstration of a working prototype is expected, and the winner or next steps are determined. Therefore, it’s important […]
Choosing a Well-Architected CI/CD approach: Open-source software and AWS Services
Take a Well-Architected approach to make an informed decision when choosing to implement CI/CD using open-source tools on AWS services, using managed AWS services, or a combination of both.
We will look at key considerations for evaluating open-source software and AWS services from the perspectives of two example companies: a startup and a mature company. These give you two very different points of view to compare against your own organization. To keep the investigation focused, we use Continuous Integration (CI) and Continuous Delivery (CD) capabilities as its target.
In our next two blog posts, we will follow two AWS customers, Iponweb and BigHat Biosciences, as they share their CI/CD journeys, their perspectives, the decisions they made, and why.
To end the series, we will explore an example reference architecture showing the benefits AWS provides regardless of your emphasis on open-source tools or managed AWS services.
Building an ARM64 Rust development environment using AWS Graviton2 and AWS CDK
2020 was the year that Arm chips made headlines by moving from largely mobile form factors into the cloud, thanks to AWS Graviton2, which offers up to 40% better price performance over comparable current-generation x86-based Amazon Elastic Compute Cloud (Amazon EC2) and Amazon Relational Database Service (Amazon RDS) instances. We speak […]
Integrate GitHub monorepo with AWS CodePipeline to run project-specific CI/CD pipelines
Learn how to automatically trigger project-specific pipelines for GitHub monorepo users. Currently, if a customer uses GitHub as their version control system with a single repository that contains multiple folders, each for a different project, a change to any file triggers the pipeline for the entire repository rather than for the affected project alone. This post shows how to automatically trigger the appropriate pipeline based on the project folder in which a file changes.
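The general pattern can be sketched as follows: a small function (for example, invoked by a webhook) inspects which top-level folder changed and starts the matching pipeline. This is an illustrative assumption rather than the post's exact implementation; the folder-to-pipeline mapping and pipeline names are placeholders.

```python
# Illustrative sketch (not the post's exact implementation): start the
# CodePipeline that corresponds to the monorepo folder whose files changed.
import boto3

# Placeholder mapping from top-level project folders to pipeline names.
FOLDER_TO_PIPELINE = {
    "service-a": "service-a-pipeline",
    "service-b": "service-b-pipeline",
}

codepipeline = boto3.client("codepipeline")

def trigger_pipelines(changed_files):
    """Start each pipeline whose project folder contains a changed file."""
    folders = {path.split("/", 1)[0] for path in changed_files if "/" in path}
    for folder in folders:
        pipeline = FOLDER_TO_PIPELINE.get(folder)
        if pipeline:
            codepipeline.start_pipeline_execution(name=pipeline)

# Example: file paths as they might appear in a GitHub push event payload.
trigger_pipelines(["service-a/app.py", "service-a/requirements.txt"])
```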
How SOMA Global deploys their application with a dynamic multi-account pipeline
In April 2020, SOMA Global, a leading provider of Public Safety as a Service (PSAAS™), set out to update its computer-aided dispatch (CAD) platform to increase reliability to 99.999%, an industry first. SOMA Global adopted an account-based approach for tenant isolation to meet Criminal Justice Information Services (CJIS) regulations. The development and operations team […]