
Overview
Databricks at AWS re:Invent 2024

Product video
Get started today with up to $400 in usage credits during your 14-day free trial. The trial ends when the credits are consumed or the 14-day period expires, whichever comes first. After your trial ends, you will be automatically enrolled in a Databricks pay-as-you-go plan using the payment method associated with your AWS Marketplace account, paying only for what you use; you can cancel anytime. You can view the full per-product rates for Databricks Units (DBUs) at https://www.databricks.com/product/pricing
The Databricks Data Intelligence Platform allows your entire organization to use data and AI. It's built on a lakehouse to provide an open, unified foundation for all your data and governance, and it's powered by a Data Intelligence Engine that speaks the language of your organization so anyone can access the data and insights they need.
The Data Intelligence Platform simplifies your modern data stack by eliminating the data silos that traditionally separate and complicate data engineering, analytics, BI, data science, and machine learning. Databricks is built on open source and open standards to maximize flexibility, and the platform's common approach to data management, security, and governance helps you operate more efficiently and innovate faster across all analytics use cases.
Reach out to sales@databricks.com to get specialized configurations and pricing for Databricks on AWS Marketplace on a contract basis.
**Technical Support:** For help setting up your account, connecting to data, or exploring the platform, please reach out to awsmp-onboarding-help@databricks.com
Highlights
- Simple: Databricks provides a simplified data architecture by unifying data, analytics and AI workloads on one common platform running on Amazon S3.
- Open: Built on top of the world's most successful open source data projects, the Lakehouse Platform unifies your data ecosystem with open standards and formats.
- Collaborative: With native collaboration capabilities, the Databricks Lakehouse Platform unifies data teams to collaborate across the entire data and AI workflow.
Details
Introducing multi-product solutions
You can now purchase comprehensive solutions tailored to use cases and industries.
Pricing
Free trial
| Dimension | Cost/unit |
|---|---|
| Databricks Consumption Units | $1.00 |
Vendor refund policy
No refunds
Custom pricing options
Legal
Vendor terms and conditions
Content disclaimer
Delivery details
Software as a Service (SaaS)
SaaS delivers cloud-based software applications directly to customers over the internet. You can access these applications through a subscription model. You will pay recurring monthly usage fees through your AWS bill, while AWS handles deployment and infrastructure management, ensuring scalability, reliability, and seamless integration with other AWS services.
Resources
Support
Vendor support
Please reach out to sales@databricks.com with any questions or for options on contract or pricing terms.
Technical Support: For help setting up your account, connecting to data, or exploring the platform, please reach out to awsmp-onboarding-help@databricks.com
AWS infrastructure support
AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced technical support engineers. The service helps customers of all sizes and technical abilities successfully use the products and features provided by Amazon Web Services.


Standard contract
Customer reviews
Databricks’ Unified Platform: Fast SQL, Streamlined Pipelines, and Context-Aware AI
Lakeflow Pipelines (formerly DLT) makes it straightforward to build medallion-architecture pipelines, and the Photon engine delivers real performance gains on SQL workloads without requiring any code changes. Recent additions like Genie Code and background agents also show they're serious about agentic AI: it doesn't feel like a bolt-on copilot, because it can actually understand your data context through Unity Catalog. Serverless compute has been another big quality-of-life improvement, since I no longer have to wait for cluster spin-up when I just want to run quick, ad hoc queries.
Unity Catalog is powerful, but the initial setup and the migration from legacy HMS can be painful, particularly for large orgs with years of existing Hive metastore objects. The documentation is generally good, yet it sometimes lags behind new feature releases. On top of that, the workspace UI can feel sluggish at times, especially when you’re working with a large number of assets.
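For readers unfamiliar with the medallion pattern this review mentions, here is a minimal, illustrative sketch in the Delta Live Tables Python API (not the reviewer's actual code; the storage path, table names, and columns are hypothetical placeholders):

```python
# Minimal medallion-style pipeline sketch using the Delta Live Tables Python API.
# Runs inside a Databricks pipeline where `spark` is predefined; the S3 path,
# table names, and columns below are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Bronze: raw claims files ingested as-is via Auto Loader.")
def claims_bronze():
    return (
        spark.readStream.format("cloudFiles")   # Auto Loader incremental ingestion
        .option("cloudFiles.format", "json")
        .load("s3://my-bucket/claims/raw/")     # hypothetical landing path
    )

@dlt.table(comment="Silver: typed, de-duplicated claims ready for analytics.")
def claims_silver():
    return (
        dlt.read_stream("claims_bronze")
        .withColumn("claim_date", F.to_date("claim_date"))
        .dropDuplicates(["claim_id"])
    )
```

Because Photon and serverless operate at the compute layer, a pipeline like this would pick up their performance benefits without code changes, consistent with the reviewer's observation.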
Databricks Unifies Data Engineering, Science, and Analytics Exceptionally Well
Unified Data Engineering, Analytics, and ML on a Scalable Databricks Platform
Its integration with Spark and Delta Lake is another big plus, making it both scalable and dependable when working with large datasets.
Databricks Brings Spark, Delta, and ML Together with Effortless Auto-Scaling
Genie Code and Inline Assistant Dramatically Boosted My Debugging Productivity
After migrating the platform to Databricks, we were able to substantially improve the data pipeline architecture. We implemented streaming along with optimized ETL pipelines, reducing the data refresh cycle to about 30 minutes. We also created a dedicated view that retains data from the previous run, so downstream systems always have a consistent dataset available while the next pipeline execution is still in progress.
Before, we struggled with delayed refresh cycles and a limited ability to meet near real-time data needs in our Redshift-based architecture. After moving to Databricks, we enabled faster ETL processing and improved near real-time data availability.
As a result, we reduced ETL refresh time to roughly 30 minutes and enabled near real-time access for downstream tools like Jasper and Sisense. Reliability also improved because the stable view continues to serve the previous run’s data during pipeline updates. Finally, the overall architecture became simpler by consolidating processing and analytics capabilities within Databricks.
Overall, Databricks helped us build a more scalable and efficient near real-time data processing platform, significantly improving the timeliness and reliability of analytics for the claims-processing workflow.
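For readers wondering how the "stable view" pattern this review describes might be implemented, one minimal sketch is to write each run to its own Delta table and then atomically repoint a consumer-facing view. It assumes a Databricks notebook where `spark` is predefined; all database, table, and view names are hypothetical.

```python
# Sketch of the "stable view" pattern: downstream tools query the view, which
# always resolves to the last fully completed run, never a half-written one.
def publish_run(df, run_id: str) -> None:
    run_table = f"claims.refresh_run_{run_id}"   # hypothetical naming scheme
    # Materialize this run into its own Delta table.
    df.write.format("delta").mode("overwrite").saveAsTable(run_table)
    # CREATE OR REPLACE VIEW is a single metastore operation, so readers see
    # either the previous run's table or the new one, never a mix of both.
    spark.sql(
        f"CREATE OR REPLACE VIEW claims.stable_claims AS SELECT * FROM {run_table}"
    )
```

A scheduled job could call publish_run at the end of each ETL cycle and then drop run tables older than the previous one.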