AWS Partner Network (APN) Blog
Tag: AWS CodeCommit
Build and Deploy a Secure Container Image with AWS and Snyk
Learn how to build a Java application in a Docker container and push the container image to Amazon ECR, orchestrated by AWS CodePipeline. We’ll use Snyk to scan your code and the resulting container image, and display the results in both Snyk and Amazon ECR. We’ll also show you how Amazon Inspector utilizes Snyk open source to provide insight into your software security vulnerabilities. All of this functionality is available from the AWS Management Console.
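The post’s exact pipeline isn’t reproduced here, but a build stage like the one described is typically expressed as a CodeBuild buildspec. The following minimal sketch assumes a hypothetical repository name (`my-java-app`), a placeholder account ID, and a `SNYK_TOKEN` already configured for the Snyk CLI; none of these values come from the post.

```yaml
version: 0.2

phases:
  pre_build:
    commands:
      # Authenticate Docker to the target ECR registry (account ID is a placeholder)
      - aws ecr get-login-password --region $AWS_DEFAULT_REGION | docker login --username AWS --password-stdin 123456789012.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com
  build:
    commands:
      # Build the Java application's container image
      - docker build -t my-java-app:latest .
      # Scan the built image with the Snyk CLI; fail the build on high-severity issues
      - snyk container test my-java-app:latest --severity-threshold=high
  post_build:
    commands:
      # Tag and push the scanned image to Amazon ECR
      - docker tag my-java-app:latest 123456789012.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/my-java-app:latest
      - docker push 123456789012.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/my-java-app:latest
```

Placing the scan between build and push means a vulnerable image never reaches the registry, which is the gating behavior a pipeline like this is after.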
Developing Data-Driven IoT Business Models for Sustainability with Storm Reply
Firms spend substantial effort to identify and collect quality data streams from different sources. However, identifying and interpreting energy, water, or gas usage patterns and consumption types is sometimes insufficient. In this post, you’ll learn how Storm Reply, which combines industry knowledge with expertise in developing data analytics platforms on AWS, can help customers design, develop, and maintain secure serverless IoT big data platforms with a focus on sustainability.
Accelerating Genomics and High Performance Computing on AWS with Relevance Lab Research Gateway Solution
Running genomics and high performance computing (HPC) workloads is complicated. To address this challenge, Relevance Lab developed Research Gateway, a solution that delivers secure and scalable research without customers having to do the heavy lifting. This post provides an overview of the solution architecture and a standard genomic research workflow, including a walkthrough of how to access Research Gateway, provision the products required for genomic sequencing analysis, and run the analysis.
Implementing Data Analytics in Industrial Machines with Quantiphi and AWS-Native Solutions
To improve the fault detection process, it’s crucial to monitor production systems and collect performance data in real time using smart sensors and IoT devices. This post highlights the critical aspects of Quantiphi’s serverless, fully managed, streaming ETL pipeline, along with the benefits of the centralized lakehouse (data lake + data warehouse) solution built on AWS. Learn how the solution helped a U.S.-based manufacturing company make better decisions and improve their customers’ production efficiency.
Parentsmile Launches First Family Care SaaS Platform on AWS with Support from ZERO12
Looking for qualified support for a parent is a hard task and often a leap in the dark. Parentsmile is a unique reservation platform that integrates healthcare, training, educational, and all-encompassing psychophysical well-being services. This post demonstrates how ZERO12 built Parentsmile’s SaaS platform, exploring the main infrastructure built on Amazon Elastic Container Service (Amazon ECS), the continuous integration and continuous delivery (CI/CD) process, and the asynchronous workflow for payments and reminders.
How to Migrate Legacy Applications Using AWS App2Container
DXC Technology offers a suite of application services to migrate a customer’s application portfolio from legacy infrastructure to a container management service in the most efficient manner. These services are bundled under DXC’s Applications Containerization as a Service (aCaaS) offering and enable organizations to leverage the seismic shift toward containerized application hosting. Learn how some of the challenges encountered during legacy migration can be successfully mitigated using AWS App2Container.
Automating Signature Recognition Using Capgemini MLOps Pipeline on AWS
Recognizing a user’s signature is an essential step in banking and legal transactions, and typically relies on human verification. Learn how Capgemini uses machine learning on AWS to build ML models that verify signatures from different user channels, including web and mobile apps. This ensures organizations can meet the required standards, recognize user identity, and assess whether further verification is needed.
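The post doesn’t share Capgemini’s model, but the core verification step in many signature-recognition pipelines is comparing an embedding of a submitted signature against stored reference embeddings. This is a minimal sketch of that comparison using plain cosine similarity; the function names and threshold are illustrative, not from the post.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def verify_signature(candidate, references, threshold=0.9):
    """Accept if the candidate embedding is close enough to any reference.

    Returns (accepted, best_score) so a caller can route borderline
    scores to further (e.g., human) verification.
    """
    best = max(cosine_similarity(candidate, ref) for ref in references)
    return best >= threshold, best
```

In a real pipeline the embeddings would come from a trained model (for example, a Siamese network), and scores just under the threshold would trigger the additional verification the post mentions.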
Cognizant AWS DataHyperloop: A Continuous Data Journey Towards Developmental Agility and Faster Data Delivery
The concept of DataOps was born with the goal of solving issues prevalent in old, complex, and monolithic architectures and of optimizing data pipeline architectures. To meet this need, Cognizant and AWS jointly built the DataHyperloop solution, which provides a real-time view of DataOps and demonstrates automation of continuous integration, delivery, testing, and monitoring of data assets moving across the data lifecycle on AWS.
Accelerating AWS Adoption Using Servian’s Cloud Foundation Solution
Servian Cloud Foundation helps organizations build an automated, secure, compliant, and fully customizable account and infrastructure foundation on AWS. It’s a well-architected, opinionated blueprint built from Servian’s expertise and experience helping customers address security, compliance, DevOps, and infrastructure-as-code (IaC) adoption challenges. Servian Cloud Foundation significantly reduces time to adoption, letting customers build solutions at speed and scale while decreasing risk and putting consistent best-practice processes in place.
Managing the Evolution of an Amazon Redshift Data Warehouse Using a Declarative Deployment Pipeline
Enterprise data warehouses are complex and consist of database objects that need to be modified to reflect the changing needs of business, data analytics, and machine learning teams. In this post, learn about an approach to managing the evolution of enterprise-scale data warehouses, based on the experience of Deloitte’s Data and AI global practice teams. The declarative tool developed by Deloitte can automatically generate DDL statements to align Amazon Redshift’s state with an approved baseline configuration.
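Deloitte’s tool itself isn’t published in this excerpt, but the core idea of a declarative deployment pipeline can be sketched: compare a table’s desired (baseline) column definitions against its current state and emit the DDL statements needed to converge them. All names and the drop-then-add strategy below are illustrative assumptions, not details from the post.

```python
def diff_table(table, current, desired):
    """Generate Redshift-style DDL to move `current` columns to `desired`.

    `current` and `desired` map column name -> type string.
    Returns a list of ALTER TABLE statements. Type changes are emitted
    as a drop followed by an add, since Redshift supports only limited
    in-place column type changes (a real tool would preserve data).
    """
    stmts = []
    for col, col_type in desired.items():
        if col not in current:
            stmts.append(f"ALTER TABLE {table} ADD COLUMN {col} {col_type};")
        elif current[col] != col_type:
            stmts.append(f"ALTER TABLE {table} DROP COLUMN {col};")
            stmts.append(f"ALTER TABLE {table} ADD COLUMN {col} {col_type};")
    for col in current:
        if col not in desired:
            stmts.append(f"ALTER TABLE {table} DROP COLUMN {col};")
    return stmts
```

Running such a diff against an approved baseline in a pipeline is what makes the deployment declarative: the repository states what the warehouse should look like, and the generated DDL is only the means of getting there.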