Amazon MWAA best practices for managing Python dependencies
Data engineers and data scientists use Amazon Managed Workflows for Apache Airflow (Amazon MWAA) as a central orchestration platform for running data pipelines and machine learning (ML) workloads. To support these pipelines, they often require additional Python packages, such as Apache Airflow Providers. For example, a pipeline may require the Snowflake provider package for interacting with a Snowflake warehouse, or the Kubernetes provider package for provisioning Kubernetes workloads. As a result, they need to manage these Python dependencies efficiently and reliably, ensuring compatibility with one another and with the base Apache Airflow installation.
Python includes the tool pip to handle package installations. To install a package, you add its name to a special file named requirements.txt. The pip install command instructs pip to read the contents of your requirements file, determine dependencies, and install the packages. Amazon MWAA runs the pip install command using this requirements.txt file during initial environment startup and subsequent updates. For more information, see How it works.
Creating a reproducible and stable requirements file is key to reducing pip installation and DAG errors. A pinned set of requirements also provides consistency across nodes in an Amazon MWAA environment. This matters most during worker auto scaling, when additional worker nodes are provisioned; if those nodes resolved different dependency versions, tasks could behave inconsistently or fail. This strategy also promotes consistency across different Amazon MWAA environments, such as dev, qa, and prod.
This post describes best practices for managing your requirements file in your Amazon MWAA environment. It defines the steps needed to determine your required packages and package versions, create and verify your requirements.txt file with package versions, and package your dependencies.
Best practices
The following sections describe the best practices for managing Python dependencies.
Specify package versions in the requirements.txt file
When creating a Python requirements.txt file, you can specify just the package name, or the package name and a specific version. Adding a package without version information instructs the pip installer to download and install the latest available version, subject to compatibility with other installed packages and any constraints. The package version selected during environment creation may differ from the version selected during a later auto scaling event. This version change can create package conflicts leading to pip install errors. Even if the updated package installs properly, code changes in the package can affect task behavior, leading to inconsistencies in output. To avoid these risks, it's best practice to add the version number to each package in your requirements.txt file.
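For example, the difference between an unpinned and a pinned entry looks like this (the version shown is the one this walkthrough resolves later):

```
# Unpinned: pip may resolve a different version on each install
apache-airflow-providers-snowflake

# Pinned: every scheduler, worker, and webserver node installs the same version
apache-airflow-providers-snowflake==5.2.1
```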
Use the constraints file for your Apache Airflow version
A constraints file contains the packages, with versions, verified to be compatible with your Apache Airflow version. This file adds an additional validation layer to prevent package conflicts. Because the constraints file plays such an important role in preventing conflicts, beginning with Apache Airflow v2.7.2 on Amazon MWAA, your requirements file must include a --constraint statement. If a --constraint statement is not supplied, Amazon MWAA will specify a compatible constraints file for you.
Constraint files are available for each Airflow version and Python version combination. The URLs have the following form:
https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt
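For example, with Apache Airflow v2.8.1 and Python 3.11, the versions used later in this walkthrough, the URL resolves to:

```
https://raw.githubusercontent.com/apache/airflow/constraints-2.8.1/constraints-3.11.txt
```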
The official Apache Airflow constraints are guidelines, and if your workflows require newer versions of a provider package, you may need to modify your constraints file and include it in your DAG folder. When doing so, the best practices outlined in this post become even more important to guard against package conflicts.
Create a .zip archive of all dependencies
Creating a .zip file containing the packages in your requirements file and specifying this as the package repository source makes sure the exact same wheel files are used during your initial environment setup and subsequent node configurations. The pip installer will use these local files for installation rather than connecting to the external PyPI repository.
Test the requirements.txt file and dependency .zip file
Testing your requirements file before release to production is key to avoiding installation and DAG errors. Testing both locally, with the MWAA local runner, and in a dev or staging Amazon MWAA environment, are best practices before deploying to production. You can use continuous integration and delivery (CI/CD) deployment strategies to perform the requirements and package installation testing, as described in Automating a DAG deployment with Amazon Managed Workflows for Apache Airflow.
Solution overview
This solution uses the MWAA local runner, an open source utility that replicates an Amazon MWAA environment locally. You use the local runner to build and validate your requirements file, and package the dependencies. In this example, you install the snowflake and dbt-cloud provider packages. You then use the MWAA local runner and a constraints file to determine the exact version of each package compatible with Apache Airflow. With this information, you then update the requirements file, pinning each package to a version, and retest the installation. When you have a successful installation, you package your dependencies and test in a non-production Amazon MWAA environment.
We use MWAA local runner v2.8.1 in this post, which covers the following steps:
- Download and build the MWAA local runner.
- Create and test a requirements file with package versions.
- Package dependencies.
- Deploy the requirements file and dependencies to a non-production Amazon MWAA environment.
Prerequisites
For this walkthrough, you should have the following prerequisites:
- An AWS account in an AWS Region where Amazon MWAA is supported
- Access to Amazon MWAA and Amazon Simple Storage Service (Amazon S3)
- Docker Desktop
Set up the MWAA local runner
First, you download the MWAA local runner version matching your target Amazon MWAA environment, and then you build the image.
Complete the following steps to configure the local runner:
- Clone the MWAA local runner repository with the following command:
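The original command was elided from this copy; the following clones the public aws-mwaa-local-runner repository, checking out the v2.8.1 branch to match this walkthrough (the branch name assumes the repository's version-branch naming convention):

```
# Clone the local runner at the branch matching your Amazon MWAA version
git clone https://github.com/aws/aws-mwaa-local-runner.git -b v2.8.1
cd aws-mwaa-local-runner
```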
- With Docker running, build the container with the following command:
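The build step uses the helper script shipped with the local runner:

```
# Build the Docker image for the local Airflow environment
./mwaa-local-env build-image
```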
Create and test a requirements file with package versions
Building a versioned requirements file makes sure all Amazon MWAA components have the same package versions installed. To determine the compatible versions for each package, you start with a constraints file and an un-versioned requirements file, allowing pip to resolve the dependencies. Then you create your versioned requirements file from pip’s installation output.
The following diagram illustrates this workflow.
To build an initial requirements file, complete the following steps:
- In your MWAA local runner directory, open requirements/requirements.txt in your preferred editor.
The default requirements file will look similar to the following:
- Replace the existing packages with the following package list:
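The exact file from the original post was not preserved; a sketch of the un-versioned starting point, using the two providers from this walkthrough and the constraints file for Apache Airflow v2.8.1 on Python 3.11, would be:

```
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.8.1/constraints-3.11.txt"
apache-airflow-providers-snowflake
apache-airflow-providers-dbt-cloud
```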
- Save requirements.txt.
- In a terminal, run the following command to generate the pip install output:
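The local runner exposes this check as its test-requirements command:

```
# Runs pip install against requirements/requirements.txt inside the local runner image
./mwaa-local-env test-requirements
```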
test-requirements runs pip install, which handles resolving the compatible package versions. Using a constraints file makes sure the selected packages are compatible with your Airflow version. The output will look similar to the following:
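The original output was not preserved in this copy; an illustrative excerpt of pip's standard summary line, with the versions this walkthrough resolves (your full output will list many more transitive dependencies), would end like this:

```
Successfully installed apache-airflow-providers-dbt-cloud-3.5.1 apache-airflow-providers-snowflake-5.2.1 ...
```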
The message beginning with Successfully installed is the output of interest. This shows which dependencies, and their specific versions, pip installed. You use this list to create your final versioned requirements file.
Your output will also contain Requirement already satisfied messages for packages already available in the base Amazon MWAA environment. You do not add these packages to your requirements.txt file.
- Update the requirements file with the list of versioned packages from the test-requirements command. The updated file will look similar to the following code:
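A sketch of the pinned file, using the provider versions confirmed later in this post:

```
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.8.1/constraints-3.11.txt"
apache-airflow-providers-snowflake==5.2.1
apache-airflow-providers-dbt-cloud==3.5.1
```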
Next, you test the updated requirements file to confirm no conflicts exist.
- Rerun the test-requirements command:
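As before:

```
./mwaa-local-env test-requirements
```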
A successful test will not produce any errors. If you encounter dependency conflicts, return to the previous step and update the requirements file with additional packages, or package versions, based on pip’s output.
Package dependencies
If your Amazon MWAA environment has a private webserver, you must package your dependencies into a .zip file, upload the file to your S3 bucket, and specify the package location in your Amazon MWAA instance configuration. Because a private webserver can’t access the PyPI repository through the internet, pip will install the dependencies from the .zip file.
If you’re using a public webserver configuration, you also benefit from a static .zip file, which makes sure the package information remains unchanged until it is explicitly rebuilt.
This process uses the versioned requirements file created in the previous section and the package-requirements feature in the MWAA local runner.
To package your dependencies, complete the following steps:
- In a terminal, navigate to the directory where you installed the local runner.
- Download the constraints file for your Python version and your version of Apache Airflow and place it in the plugins directory. For this post, we use Python 3.11 and Apache Airflow v2.8.1:
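The original command was elided; a curl invocation such as the following accomplishes this (the local filename is an assumption):

```
# Download the Airflow v2.8.1 / Python 3.11 constraints file into plugins/
curl -o plugins/constraints-3.11.txt \
  https://raw.githubusercontent.com/apache/airflow/constraints-2.8.1/constraints-3.11.txt
```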
- In your requirements file, update the constraints URL to the local downloaded file.
The --constraint statement instructs pip to compare the package versions in your requirements.txt file to the allowed versions in the constraints file. Downloading a specific constraints file to your plugins directory enables you to control the constraint file location and contents.
The updated requirements file will look like the following code:
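A sketch, assuming the local filename used above and the container path referenced later in this section:

```
--constraint "/usr/local/airflow/plugins/constraints-3.11.txt"
apache-airflow-providers-snowflake==5.2.1
apache-airflow-providers-dbt-cloud==3.5.1
```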
- Run the following command to create the .zip file:
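Assuming package-requirements is invoked like the other mwaa-local-env commands:

```
# Download wheels for all requirements and zip them with a rewritten requirements file
./mwaa-local-env package-requirements
```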
package-requirements creates an updated requirements file named packaged_requirements.txt and zips all dependencies into plugins.zip. The updated requirements file looks like the following code:
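A sketch of the rewritten file, based on the description that follows (the --no-index flag is an assumption; it is commonly paired with --find-links to keep pip fully offline):

```
--find-links /usr/local/airflow/plugins
--no-index
--constraint "/usr/local/airflow/plugins/constraints-3.11.txt"
apache-airflow-providers-snowflake==5.2.1
apache-airflow-providers-dbt-cloud==3.5.1
```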
Note the reference to the local constraints file and the plugins directory. The --find-links statement instructs pip to install packages from /usr/local/airflow/plugins rather than the public PyPI repository.
Deploy the requirements file
After you achieve an error-free requirements installation and package your dependencies, you’re ready to deploy the assets to a non-production Amazon MWAA environment. Even when verifying and testing requirements with the MWAA local runner, it’s best practice to deploy and test the changes in a non-prod Amazon MWAA environment before deploying to production. For more information about creating a CI/CD pipeline to test changes, refer to Deploying to Amazon Managed Workflows for Apache Airflow.
To deploy your changes, complete the following steps:
- Upload your requirements.txt file and plugins.zip file to your Amazon MWAA environment's S3 bucket.
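For example, with the AWS CLI (the bucket name is a placeholder):

```
aws s3 cp requirements.txt s3://your-mwaa-bucket/requirements.txt
aws s3 cp plugins.zip s3://your-mwaa-bucket/plugins.zip
```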
For instructions on specifying a requirements.txt version, refer to Specifying the requirements.txt version on the Amazon MWAA console. For instructions on specifying a plugins.zip file, refer to Installing custom plugins on your environment.
The Amazon MWAA environment will update and install the packages in your plugins.zip file.
After the update is complete, verify the provider package installation in the Apache Airflow UI.
- Access the Apache Airflow UI in Amazon MWAA.
- From the Apache Airflow menu bar, choose Admin, then Providers.
The list of providers, and their versions, is shown in a table. In this example, the page reflects the installation of apache-airflow-providers-dbt-cloud version 3.5.1 and apache-airflow-providers-snowflake version 5.2.1. This list contains only the provider packages installed, not all supporting Python packages. Provider packages that are part of the base Apache Airflow installation will also appear in the list. The following image is an example of the package list; note the apache-airflow-providers-dbt-cloud and apache-airflow-providers-snowflake packages and their versions.
To verify all package installations, view the results in Amazon CloudWatch Logs. Amazon MWAA creates a log stream for the requirements installation and the stream contains the pip install output. For instructions, refer to Viewing logs for your requirements.txt.
A successful installation results in the following message:
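The message itself was not preserved in this copy; it is pip's standard summary line, the same form seen during local testing:

```
Successfully installed apache-airflow-providers-dbt-cloud-3.5.1 apache-airflow-providers-snowflake-5.2.1 ...
```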
If you encounter any installation errors, identify the conflicting packages, update the requirements file, rerun the local runner test, repackage the plugins, and deploy the updated files.
Clean up
If you created an Amazon MWAA environment specifically for this post, delete the environment and S3 objects to avoid incurring additional charges.
Conclusion
In this post, we discussed several best practices for managing Python dependencies in Amazon MWAA and how to use the MWAA local runner to implement these practices. These best practices reduce DAG and pip installation errors in your Amazon MWAA environment. For additional details and code examples on Amazon MWAA, visit the Amazon MWAA User Guide and the Amazon MWAA examples GitHub repo.
Apache, Apache Airflow, and Airflow are either registered trademarks or trademarks of the Apache Software Foundation in the United States and/or other countries.
About the Author
Mike Ellis is a Technical Account Manager at AWS and an Amazon MWAA specialist. In addition to assisting customers with Amazon MWAA, he contributes to the Apache Airflow open source project.