Overview
Apache Airflow (or simply Airflow) is a platform to programmatically author, schedule, and monitor workflows.
When workflows are defined as code, they become more maintainable, versionable, testable, and collaborative.
Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command line utilities make performing complex surgeries on DAGs a snap. The rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed.
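For example, the Airflow command line interface can list, trigger, and test-run DAGs directly from the shell; the example_dag and example_task IDs below are placeholders for your own workflows:
# list the DAGs the scheduler currently knows about
$ airflow dags list
# queue a run of a DAG
$ airflow dags trigger example_dag
# run a single task in isolation for a given date, without recording state
$ airflow tasks test example_dag example_task 2024-01-01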
Highlights
- Data Pipelines
- Pipelines as Code
- Detailed Logging
Details
Pricing
EC2 instance type | Cost/hour
---|---
m5.large (recommended) | $0.028
t3.micro (AWS Free Tier eligible) | $0.003
i3en.24xlarge | $3.00
c5d.12xlarge | $0.72
m5a.2xlarge | $0.101
c5d.9xlarge | $0.54
g4dn.4xlarge | $0.361
c7i-flex.xlarge | $0.053
r7iz.16xlarge | $1.488
c5.12xlarge | $0.636
Vendor refund policy
The instance can be terminated at any time to stop incurring charges.
Delivery details
64-bit (x86) Amazon Machine Image (AMI)
An AMI is a virtual image that provides the information required to launch an instance. Amazon EC2 (Elastic Compute Cloud) instances are virtual servers on which you can run your applications and workloads, offering varying combinations of CPU, memory, storage, and networking resources. You can launch as many instances from as many different AMIs as you need.
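For instance, a new instance of this product can be launched from the AMI with the AWS CLI; the AMI ID and key-pair name below are placeholders to substitute with your own values (m5.large is the recommended type from the pricing table):
$ aws ec2 run-instances \
    --image-id ami-0123456789abcdef0 \
    --instance-type m5.large \
    --key-name my-key-pair \
    --count 1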
Version release notes
- DAG manual fail/succeed changes on task teardown
- tracebacks on SIGTERM
- deprecate conf from Task context
Upgrade Steps
You may upgrade your instance using the following command from the shell:
$ sudo dnf update apache-airflow
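To preview the pending package update and confirm the result afterwards, something like the following can be used (assuming the package name apache-airflow used above and that the airflow binary is on the PATH):
# list any pending update for the Airflow package
$ dnf check-update apache-airflow
# confirm the installed version after upgrading
$ airflow version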
Additional details
Usage instructions
Launch Airflow, then log on to the web console:
- username: admin
- password: the EC2 instance ID (it can be looked up as shown in the sketch below)
Then go to Admin/DAGs Code Editor and write your first workflow.
http://docs.last-bastion.net/marketplace/products/datascience/airflow.html#quickstart
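Because the web-console password is the instance ID, it can be read from the EC2 instance metadata service on the host itself; the IMDSv2 token step shown here may or may not be required, depending on how the instance was launched:
# obtain an IMDSv2 session token, then query the instance ID
$ TOKEN=$(curl -s -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 300")
$ curl -s -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/instance-id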
Status Checks
Your product (and all supporting processes) is monitored and restarted via our monit suite. You can verify everything is 'OK' with the following:
$ monit summary
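The standard monit subcommands can also be used to inspect and restart processes; the service names reported by monit summary are whatever this image registers, so the 'all' target below is used as the generic example:
# detailed status for every monitored process
$ monit status
# restart every monitored process (or name a specific service instead of 'all')
$ monit restart all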
Sensitive Information
Apache Airflow contains no cryptographically stored data and does not perform any encryption at rest.
A local X.509 PKI may be utilised to manage network-based TLS. All certificates and keys are stored locally in /etc/pki; you may regenerate any of them at any time, and the product will operate transparently on any consistent set of configured keys.
Apache Airflow ships with openssl, step-cli and other popular tools to facilitate certificate regeneration.
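As a sketch of regenerating a key and self-signed certificate with the bundled openssl, the following can be used; the output paths and subject name are placeholders, so point them at whatever locations under /etc/pki your configuration actually references:
# create a new 2048-bit key and a one-year self-signed certificate
$ sudo openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
    -subj "/CN=airflow.internal.example" \
    -keyout /etc/pki/tls/private/airflow.key \
    -out /etc/pki/tls/certs/airflow.crt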
Documentation
Visit http://docs.last-bastion.net/marketplace/products/datascience/airflow.html
SSH Connectivity
The SSH user for your instance is ec2-user. You can connect by using the private key of the key pair you launched the instance with:
$ ssh -i </path/to/private-key.pem> ec2-user@<EC2_Instance_Public_DNS>
Support
Vendor support
We take support very seriously.
Your system is provisioned with an entitlement to our Yum repositories for software patches and in-place operating system and product upgrades.
You may raise support requests at https://entitlements.last-bastion.net/support/form. Visit https://docs.last-bastion.net/support/index.html for more information on how to get help.
AWS infrastructure support
AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.