AWS Clean Rooms now supports configurable Spark properties for PySpark

Posted on: Apr 17, 2026

AWS Clean Rooms now supports configurable Spark properties for PySpark jobs, giving customers the ability to tune workloads to their performance and scale requirements. With this launch, customers can customize Spark settings such as memory overhead, task concurrency, and network timeouts for each analysis that uses PySpark, the Python API for Apache Spark. For example, a pharmaceutical research company collaborating with healthcare organizations on real-world clinical trial data can tune memory settings for large-scale workloads to improve performance and optimize costs.
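The sketch below illustrates the kinds of settings involved. The property names are standard Apache Spark configuration keys corresponding to the tuning areas named above; the exact mechanism for attaching them to a Clean Rooms PySpark job is not shown in this announcement, so the dictionary is an illustrative assumption rather than the documented API shape.

```python
# Minimal sketch: standard Apache Spark properties a PySpark analysis
# might tune. How these are supplied to an AWS Clean Rooms job is an
# assumption for illustration, not the documented request format.
spark_properties = {
    # Memory overhead: extra off-heap memory reserved per executor;
    # often raised for PySpark, since Python worker memory lives
    # outside the JVM heap.
    "spark.executor.memoryOverhead": "4g",
    # Task concurrency: cores per executor bounds how many tasks
    # run in parallel on each executor.
    "spark.executor.cores": "4",
    # Default number of partitions (and thus tasks) for wide
    # operations such as joins and aggregations.
    "spark.default.parallelism": "200",
    # Network timeout: raise for long-running stages so slow shuffle
    # fetches are not treated as executor failures.
    "spark.network.timeout": "600s",
}
```

Raising executor memory overhead is a common first adjustment for memory-intensive PySpark workloads, because Python worker processes consume memory outside the JVM heap that the default overhead allowance may not cover.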

AWS Clean Rooms helps companies and their partners easily analyze and collaborate on their collective datasets without revealing or copying one another’s underlying data. For more information about the AWS Regions where AWS Clean Rooms is available, see the AWS Regions table. To learn more about collaborating with AWS Clean Rooms, visit AWS Clean Rooms.