FAQs
General
What is a modern application?
How are companies building modern applications?
What’s unique or better about building modern applications on AWS?
Architecture
Where do you draw the boundaries of a microservice?
Should I migrate to the cloud or break the monolith first?
How can modern architectures enable improved security?
Not only do microservices yield smaller systems that can be owned and operated by small, autonomous teams, but the technologies available to build them also provide new opportunities for automation, scaling, and security.
AWS customer Travelex is a prime example of how modern architecture can improve security. Travelex is trusted globally for currency exchange, with a presence in 130 countries. They wanted to move away from their monolithic, on-premises data center model to release faster than their existing pace of 8 times per year and to improve security. Although they had decades of experience complying with financial regulations, seeking approval for a cloud workload was new to them.
Their design, with its improved security, allowed them to gain approval and deploy a microservices architecture using Docker and Amazon Elastic Container Service, along with a security controls framework that includes AWS Key Management Service, Amazon VPC, and AWS WAF (web application firewall). They created automated, auditable, and tamper-proof deployments. To reduce the blast radius of any compromise, they designed a process to destroy every container after 24 hours and redeploy it with new security certificates, minimizing the impact of sensitive configurations being lost or stolen. They are now able to deploy hundreds of times per week, as opposed to 8 times per year under the old low-frequency model, and the automation they implemented has improved their overall security posture.
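The 24-hour recycling pattern described above can be approximated with a scheduled job that forces each ECS service to replace its running tasks. The sketch below is a hypothetical illustration, not Travelex's actual tooling; the cluster and service names are placeholders, and in practice the job would be invoked on a schedule (for example, by an Amazon EventBridge rule).

    # Hypothetical sketch: recycle an ECS service's containers by forcing a new
    # deployment, so every task is replaced with a freshly started container.
    # Cluster and service names are placeholders.
    import boto3

    ecs = boto3.client("ecs")

    def recycle_service(cluster: str, service: str) -> None:
        response = ecs.update_service(
            cluster=cluster,
            service=service,
            forceNewDeployment=True,  # restart tasks without changing the task definition
        )
        print("Started deployment:", response["service"]["deployments"][0]["id"])

    recycle_service("example-cluster", "example-service")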
Culture/Organization
How do I structure my teams in order to enable ownership and autonomy?
At Amazon, we have two-pizza teams, which get their name from their size: small enough to be fed by two pizzas, typically 5-10 people. These teams have complete ownership of and autonomy over their applications and all the skills necessary to deliver, so handoffs, cross-team communication, and dependencies are minimized. More and more, organizations are adopting agile and DevOps practices along with their move to the cloud. While these practices and technologies can certainly provide value independently, organizations that want to truly unlock agility and maximize value delivery combine them with the two-pizza-team model to maximize speed and autonomy. At the end of the day, this sort of transformation requires different behaviors and different structures to make two-pizza teams effective; as a result, typical shared-services models tend to shrink in favor of teams that can own and operate what they build.
For example, Cox Automotive embarked on a transformation of their people, process, and technology by implementing the Scaled Agile Framework (SAFe) across the entire enterprise, going all-in with AWS, and evolving to a “You build it, you run it” environment. Cox Automotive built out small teams, typically 8-10 people, that used Scrum, Kanban, or other agile methodologies to drive ownership and autonomy, very similar to Amazon’s two-pizza teams. Implementing a coordinated delivery, operations, and technology strategy allowed Cox Automotive to unify IT and the business by creating a team of product, engineering, architecture, and business leaders at the upper levels of SAFe, resulting in a more transparent and collaborative environment. This model connected priorities across teams and gave them context as they worked to solve customer problems.
How do I retrain my teams?
What are the foundational elements necessary to shift from a transitional architecture and organizational structure to a more strategic position?
What are the cultural shifts that have to happen before we can meaningfully adopt a microservices architecture?
Software Delivery
How do I start introducing CI/CD for my team?
What are the keys to speeding up value delivery?
Operational model
How do I decide when to use Lambda and/or containers?
More and more customers are choosing AWS Lambda and containers to build modern applications with maximum agility. Your choice depends largely on the complexity of your workload, its typical task runtime, and its invocation pattern. Containers are the most popular option for packaging code and a great tool for modernizing legacy applications because they offer excellent portability and flexibility over application settings. With Lambda, you get the most simplicity: the only code you write is business logic.
Containers provide a standard way to package code, configurations, and dependencies into a single object so you can run it anywhere, scale quickly, and set CPU and memory utilization granularly. In fact, the majority of container workloads run on AWS. To run containers, you need compute, which can be Amazon EC2 or AWS Fargate; Fargate lets you run containers without managing servers. You also need an orchestration service, such as Amazon ECS or Amazon EKS.
Lambda runs code in response to events or triggers, such as a new file added to S3 or a new table entry in DynamoDB, or directly through calls to its APIs. Lambda supports 115 event triggers, more than anyone else in the market. Customers just upload their code, and Lambda takes care of everything required to run and scale it with high availability. They pay only for the compute time they consume; there is no charge when code is not running. Customers can run code for virtually any type of application or backend service, all without provisioning or managing servers.
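To make the event-driven model concrete, here is a minimal Lambda handler in Python that reacts to new objects added to an S3 bucket. This is an illustrative sketch: the function and bucket are assumed to already be wired together through an S3 event notification, and the handler simply logs each new object.

    # Minimal sketch of a Lambda function triggered by S3 "object created" events.
    # Only business logic lives here; provisioning and scaling are handled by Lambda.
    import json
    import urllib.parse

    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
            size = s3.head_object(Bucket=bucket, Key=key)["ContentLength"]
            print(f"New object s3://{bucket}/{key} ({size} bytes)")
        return {"statusCode": 200, "body": json.dumps("processed")}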
Which container orchestration service should I choose?
This depends on your current expertise and your preference for operational ease versus control over application settings. To minimize operations, we suggest AWS Fargate, the only compute engine that allows you to run your containers without having to manage servers. With Fargate, you build your container image, define how and where you want it to run, and pay only for the resources you use. This eliminates the need to choose the right instance type; to secure, patch, and upgrade instances; or to scale your cluster.
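As a concrete example of "define how and where you want it to run," the sketch below launches a single Fargate task with boto3. The cluster name, task definition, subnet, and security group are placeholders for resources you would have registered beforehand.

    # Minimal sketch: run one container task on AWS Fargate. All identifiers
    # below are placeholders for existing resources.
    import boto3

    ecs = boto3.client("ecs")

    response = ecs.run_task(
        cluster="example-cluster",
        launchType="FARGATE",
        taskDefinition="web-app:1",  # an existing task definition revision
        count=1,
        networkConfiguration={
            "awsvpcConfiguration": {
                "subnets": ["subnet-0123456789abcdef0"],
                "securityGroups": ["sg-0123456789abcdef0"],
                "assignPublicIp": "ENABLED",
            }
        },
    )
    print("Started task:", response["tasks"][0]["taskArn"])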
Amazon ECS is the best place to run your containers if you’re familiar with AWS constructs and intend to primarily use AWS tools and services as part of your container infrastructure. Because we built ECS from the ground up and have complete control of its roadmap, we are able to quickly and natively integrate with AWS services such as IAM, CloudWatch, Auto Scaling, Secrets Manager, and Elastic Load Balancing, while providing you with a familiar experience for deploying and scaling your containers.
If you prefer to run Kubernetes, we recommend Amazon EKS, the most reliable way to run Kubernetes in the cloud. EKS is reliable because it runs across multiple AWS Availability Zones, giving you built-in redundancy and resiliency. We also make sure EKS customers have the latest security patches: we take fixes from the most recent version and apply them to older versions that are no longer supported upstream, to prevent reliability and availability issues in your EKS applications.
When do I choose serverless technologies over managing it myself?
Will using serverless technologies affect my multi-cloud strategy?
Security
How does security improve with modern applications?
How do security practices and options change with the cloud and modern applications?
In modern applications, security is code, just as infrastructure is code. The ability to automate the inclusion of security controls and the monitoring of the infrastructure is a game changer for many organizations, allowing them to design and build self-healing infrastructures. Additionally, with the use of serverless technologies, the attack surface of applications is greatly reduced, since any potentially vulnerable code is only running when needed.
For example, Verizon was migrating an additional 1,000 business-critical applications and databases to AWS and needed a way to scale their security validation processes. They needed to put direct access to cloud resources in the hands of thousands of developers, not just infrastructure teams. Verizon also knew they needed to unleash their developers and let them spend more time delivering new value for customers rather than waiting on hardware and security checks. So they developed an automated security validation process that allowed teams to develop new applications on their own timelines. In stage one, the process analyzes infrastructure templates against basic configuration rules before they are used to deploy, checking for encryption, logging, and access-rights configuration. In stage two, it deploys the validated template into a test environment for live validation and uses AWS Config to audit the deployed system’s configuration. Finally, in stage three, it runs a third-party vulnerability assessment to check for missing OS patches and other vulnerabilities. Once complete, it digitally signs the results, and the automated deployment pipeline checks whether the application is approved for production, deploying only those that have passed this validation process.
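Conceptually, the stage-one check is a policy test applied to an infrastructure template before anything is deployed. The sketch below is a deliberately simplified, hypothetical illustration of that idea (it is not Verizon's tooling): it loads a CloudFormation-style template as JSON and flags S3 buckets that lack encryption or access logging.

    # Hypothetical, simplified "stage one" pre-deployment check: flag S3 buckets
    # in a CloudFormation-style template that lack encryption or access logging.
    import json
    import sys

    def check_template(template: dict) -> list:
        findings = []
        for name, resource in template.get("Resources", {}).items():
            if resource.get("Type") != "AWS::S3::Bucket":
                continue
            props = resource.get("Properties", {})
            if "BucketEncryption" not in props:
                findings.append(f"{name}: no BucketEncryption configured")
            if "LoggingConfiguration" not in props:
                findings.append(f"{name}: no access logging configured")
        return findings

    if __name__ == "__main__":
        with open(sys.argv[1]) as f:
            issues = check_template(json.load(f))
        for issue in issues:
            print("FAIL:", issue)
        sys.exit(1 if issues else 0)

A real pipeline would cover many more rule types (IAM policies, network access, logging destinations) and would record a signed result for the later stages to verify, as described above.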
Verizon’s automated security validation process is an example of how customers are using automation and the AWS cloud to increase business agility by creating autonomy within their teams, allowing them to safely move quickly.
What am I actually securing in a serverless environment? Do my tools and processes change?
Your focus is twofold: (1) securing the application code you develop and run, in accordance with best practices (e.g., the OWASP Top 10); and (2) securing the infrastructure you control, in line with cloud best practices focused on identity, detective controls, infrastructure security, data protection, and incident response.
How do I implement my existing security policies in this new world?
Data management
How do I evaluate which database is the best fit for purpose?
Customers tell us they want to build scalable, high-performing, and functional applications that meet specific performance and business requirements. When choosing a database for an application, customers should take into account these requirements as well as the data model and data access patterns.
In order to meet these diverse customer needs, we offer a host of purpose-built database services:
• Amazon RDS for fully managed relational databases, and Amazon Aurora for commercial-grade relational databases with an ever-improving feature set such as Amazon Aurora Serverless
• Amazon DynamoDB, a key-value and document database that delivers single-digit millisecond performance at any scale (a minimal usage sketch follows this list)
• Amazon Neptune for graph databases
• Amazon DocumentDB, a fully-managed document database that supports MongoDB workloads
• Amazon Timestream, a time-series database service for Internet of Things (IoT) and operational applications
• Amazon Quantum Ledger Database (QLDB), a purpose-built ledger database
• Amazon Aurora Global Database, which spans multiple AWS Regions while replicating writes with a typical latency of less than one second
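As a small illustration of the key-value model mentioned for Amazon DynamoDB, the sketch below writes an item and reads it back by key using boto3. The table name ("Orders") and its partition key ("order_id") are assumptions for the example and are presumed to exist already.

    # Minimal sketch of key-value access with DynamoDB. The "Orders" table with
    # partition key "order_id" is assumed to exist.
    import boto3

    table = boto3.resource("dynamodb").Table("Orders")

    # Write an item ...
    table.put_item(Item={"order_id": "1001", "status": "shipped", "total_cents": 2599})

    # ... and read it back by key.
    item = table.get_item(Key={"order_id": "1001"})["Item"]
    print(item["status"])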
I have a legacy database and a long term licensing agreement. How do I get started on the process of migrating to a more modern database?
Database Freedom is a unique program designed to assist qualifying customers in migrating from traditional database engines to cloud-native ones on AWS. Database Freedom supports migrations to Amazon Aurora (a MySQL- and PostgreSQL-compatible relational database built for the cloud), Amazon RDS for PostgreSQL, MySQL, and MariaDB, Amazon Redshift, Amazon DynamoDB, Amazon EMR, Amazon Kinesis, Amazon Neptune, Amazon QLDB, Amazon Timestream, and Amazon DocumentDB. Additionally, the AWS Schema Conversion Tool and AWS Database Migration Service can help customers migrate their databases to these services quickly and securely.
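Once the schema has been converted with the AWS Schema Conversion Tool and a DMS replication task has been configured, the migration itself can be driven programmatically. The sketch below starts an existing DMS replication task with boto3; the task ARN is a placeholder for a task whose source endpoint, target endpoint, and table mappings were defined beforehand.

    # Minimal sketch: start an already-configured AWS DMS replication task.
    # The ARN is a placeholder.
    import boto3

    dms = boto3.client("dms")

    response = dms.start_replication_task(
        ReplicationTaskArn="arn:aws:dms:us-east-1:123456789012:task:EXAMPLE",
        StartReplicationTaskType="start-replication",
    )
    print("Task status:", response["ReplicationTask"]["Status"])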
We offer qualifying customers advice on application architecture, migration strategies, program management, and employee training customized for their technology landscape and migration goals. We also support proof-of-concepts to demonstrate the feasibility of a migration.
We also assist qualifying customers in migrating to AWS through our AWS Professional Services team and our network of Database Freedom Partners. These teams and organizations specialize in a range of database technologies and bring a wealth of experience acquired by migrating thousands of databases, applications, and data warehouses to AWS. We also offer service credits to qualifying customers to minimize the financial impact of the migration.
We have helped customers such as 3M, Verizon, Capital One, Intuit, Ryanair and Amazon.com achieve database freedom.
Find out more about Database Freedom and contact us through the Database Freedom page.