AWS Public Sector Blog

Tag: analytics

New Performance Dashboard on AWS makes delivering open, responsive government simple

Data is at the heart of showing citizens how public services are working, and it enables the public sector to improve policy and operational delivery. Citizens expect accessible and useful services, and the public sector aims to demonstrate success through data. To build trust in this relationship and promote accountability, public sector organizations need to communicate the data-driven performance of the services they provide. To help address these challenges, AWS is releasing Performance Dashboard on AWS, a new open source solution that helps you measure and share what’s important in one place and at minimal cost, and you can have the solution up and running in a matter of minutes.

NUS Urban Analytics Lab scales research globally with AWS

The Urban Analytics Lab at the National University of Singapore (NUS) spearheads research in geospatial data analysis and 3D city modelling. The lab’s work underpins the development of smart cities and provides scientists, architects, urban planners, and real estate developers with data insights. These insights help parties make informed decisions about projects ranging from energy modelling to urban farming. To meet rising global demand for its data analytics and planning tools, Urban Analytics Lab turned to Amazon Web Services (AWS).

Accelerating nonprofit and education sector impact through data insights with Salesforce and AWS

Nonprofits and education institutions of all sizes rely on large amounts of data to serve their stakeholders, programs, and governance. For many organizations, the first step in a technology transformation is centralizing data that is siloed across a variety of mission-critical systems. In support of these goals, Salesforce.org and Amazon Web Services (AWS) are working together to help nonprofits and education institutions derive actionable insights from their data.

Sharing SAS data with Athena and ODBC

If you share data with other researchers, especially researchers using a different tool, you can quickly run into version issues and not know which file is the most current. Rather than sending data files everywhere, AWS offers a simple way to store your data in one central location so that you can read your data into SAS and still share it with colleagues. In this blog post, I explain how to export your data, store it in AWS, and query the data using SAS.
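The post itself does the querying from SAS over the Amazon Athena ODBC driver. As a rough sketch of the same pattern outside of SAS, the Python example below uploads an exported CSV file to Amazon S3 and runs a query against it with Athena through boto3. The bucket, database, table, and file names are placeholders, and the example assumes an Athena table has already been defined over the uploaded data.

```python
import time

import boto3

# Placeholder names for this sketch; substitute your own bucket, database, and table.
BUCKET = "my-research-data-bucket"
DATABASE = "research_db"
RESULTS = f"s3://{BUCKET}/athena-results/"

# 1. Store the exported data file in one central Amazon S3 location.
s3 = boto3.client("s3")
s3.upload_file("study_results.csv", BUCKET, "shared/study_results.csv")

# 2. Query the shared data with Athena (this assumes an Athena table named
#    study_results has already been defined over the uploaded file).
athena = boto3.client("athena")
started = athena.start_query_execution(
    QueryString="SELECT * FROM study_results LIMIT 10",
    QueryExecutionContext={"Database": DATABASE},
    ResultConfiguration={"OutputLocation": RESULTS},
)
query_id = started["QueryExecutionId"]

# 3. Wait for the query to finish, then print the result rows.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```

Colleagues can then point SAS, or any other ODBC-capable tool, at the same Athena table instead of passing data files around.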

Combating illicit activity by tracking flight data via the cloud

Many organizations including the intelligence community, security organizations, law enforcement, regulatory bodies, news organizations, and non-governmental organizations work together to disrupt transnational crime networks. Their missions include combating illicit trade; disrupting human, animal, and narcotics trafficking; detecting money laundering; and exposing political corruption. This community needs rapid analysis of large, diverse streams of information about air transportation networks, because air transportation is the fastest way to conduct illicit trade internationally. The nonprofit Center for Advanced Defense Studies (C4ADS) built the Icarus Flights application to meet this need. By building on AWS using managed cloud services, C4ADS spends less time and energy managing infrastructure, which frees them to focus on building innovative analytics and alerting services that their user community needs.

Modern data engineering in higher ed: Doing DataOps atop a data lake on AWS

Modern data engineering covers several key components of building a modern data lake. Most databases and data warehouses do not lend themselves well to a DevOps model. DataOps grew out of the frustration of trying to build scalable, reusable data pipelines in an automated fashion; it applies DevOps principles on top of data lakes to help build automated solutions in a more agile manner. With DataOps, users apply data processing principles on the data lake to curate and collect the transformed data for downstream processing. One reason DevOps has been hard to apply to databases is that testing is hard to automate on such systems. At the California State University Chancellor’s Office (CSUCO), we took a different approach, keeping most of our logic in a programming framework that allows us to build a testable platform. Learn how to apply DataOps in ten steps.
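One payoff of keeping transformation logic in a programming framework rather than inside the database is that it can be unit tested like any other code. The sketch below is a hypothetical illustration of that idea, not CSUCO's implementation: a small pandas transformation that curates raw enrollment records, paired with a pytest-style test. The column names and rules are invented for the example.

```python
import pandas as pd


def standardize_enrollment(df: pd.DataFrame) -> pd.DataFrame:
    """Curate raw enrollment records for downstream processing."""
    out = df.copy()
    out["campus"] = out["campus"].str.strip().str.upper()  # normalize campus names
    out["term"] = out["term"].astype("int64")               # type the term code
    # Drop rows that are missing a student identifier.
    return out.dropna(subset=["student_id"]).reset_index(drop=True)


def test_standardize_enrollment():
    raw = pd.DataFrame(
        {
            "student_id": ["a1", None, "b2"],
            "campus": [" fullerton ", "chico", "sacramento "],
            "term": ["2208", "2208", "2212"],
        }
    )
    curated = standardize_enrollment(raw)
    assert len(curated) == 2                                 # row with null ID removed
    assert list(curated["campus"]) == ["FULLERTON", "SACRAMENTO"]
    assert str(curated["term"].dtype) == "int64"             # typed for downstream use
```

Tests like this can run in a continuous integration pipeline on every change, which is the automation that is hard to achieve when the logic lives inside the database or warehouse itself.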

Enabling warfighters and intelligence mission success

In a world where data is produced and handled at unprecedented speed and scale, effective methods to securely store, analyze, and interpret that data are more important than ever. As agencies within the U.S. Department of Defense and Intelligence Community adopt the cloud, they can bring new capabilities closer to the tactical edge and accelerate their digital transformation. Agencies can leverage new technologies such as artificial intelligence (AI), machine learning (ML), and data analytics to free up time and resources so warfighters and analysts can focus on mission-critical tasks.

Addressing emergencies and disruptions to create business continuity

While disruptive events are challenging for any organization, sudden and large-scale incidents such as natural disasters, IT outages, pandemics, and cyberattacks can expose critical gaps in technology, culture, and organizational resiliency. Even smaller, unexpected events, such as water damage to a critical facility or an electrical outage, can hurt your organization if there is no long-term resiliency plan in place. These events can have significant consequences for your employees, stakeholders, and mission, and can result in long-term financial losses, lost productivity, loss of life, a deterioration of trust with citizens and customers, and lasting reputational damage.

Using a data-driven approach and machine learning to coach at the collegiate level

The University of Illinois Urbana-Champaign (UIUC) believes that technology is a powerful tool for driving results and innovation on campus. Its chief information officer, Mark Henderson, developed a task force, called the Data and Technology Innovation Lab, to identify department challenges and task individuals with building innovative solutions using technology. One area where UIUC identified an opportunity was sports analytics using machine learning (ML). Learn how UIUC, inspired by what it was seeing in professional sports, is using data to shift its approach to coaching football.

Pivoting and scaling with AWS: Three EdTechs share their journey to support education

The impact of COVID-19 has K12 and higher education institutions working hard to prepare for a return to learning that will be anything but typical. The 2020-2021 academic year will include various teaching and learning modalities, including virtual, hybrid, and face-to-face, and most institutions expect to shift between them throughout the year. Globally, EdTechs are working with AWS to accelerate features and solutions to better support students and educators in teaching and learning, physical and mental wellness, and health and safety.