AWS Public Sector Blog

Tag: AWS CodeCommit


Modern data engineering in higher ed: Doing DataOps atop a data lake on AWS

Modern data engineering covers several key components of building a modern data lake. Most databases and data warehouses do not lend themselves well to a DevOps model, and DataOps grew out of the frustration of trying to build scalable, reusable data pipelines in an automated fashion. DataOps applies DevOps principles on top of data lakes to help build automated solutions in a more agile manner: users process data on the data lake to curate and collect the transformed data for downstream processing. One reason DevOps was hard to apply to databases is that testing was hard to automate on such systems. At the California State University Chancellor's Office (CSUCO), we took a different approach by keeping most of our logic in a programming framework that lets us build a testable platform. Learn how to apply DataOps in ten steps.
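To illustrate why testability matters here, the following is a minimal sketch of the kind of logic a DataOps approach favors: a transformation kept in a plain Python function so it can be unit tested in CI, independently of any database or data lake. The function name, column names, and curation rule are hypothetical examples, not CSUCO's actual pipeline code.

def curate_enrollment_records(records):
    """Drop incomplete rows and normalize campus codes (hypothetical rule)."""
    curated = []
    for row in records:
        # Skip rows missing required fields
        if not row.get("student_id") or not row.get("campus"):
            continue
        curated.append({
            "student_id": row["student_id"],
            "campus": row["campus"].strip().upper(),
            "term": row.get("term", "UNKNOWN"),
        })
    return curated

def test_curate_enrollment_records():
    # pytest-style unit test: the logic runs anywhere, no warehouse needed
    raw = [
        {"student_id": "123", "campus": " pomona ", "term": "FA24"},
        {"student_id": "", "campus": "chico"},  # incomplete row is dropped
    ]
    assert curate_enrollment_records(raw) == [
        {"student_id": "123", "campus": "POMONA", "term": "FA24"}
    ]

Because the transformation is a pure function, the same test can run automatically on every commit, which is the property that makes DevOps practices workable on top of a data lake.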

Read More

T-Digital shares lessons learned about flexibility, agility, and cost savings using AWS

T-Digital, a division of Tshwane University of Technology Enterprise Holding (TUTEH) in South Africa, built TRes, a digital platform for students living in student housing and for accommodation providers. TRes connects students seeking accommodation with verified and approved property owners, helping those owners fully allocate their residences while alleviating administrative burden. With help from AWS Professional Services, T-Digital gained flexibility and agility and realized cost savings.

Read More