AWS HPC Blog

Training forecasters to warn of severe hazardous weather on AWS

This post was authored by Zac Flamig, Tech Lead for the Amazon Sustainability Data Initiative (AWS/ASDI), and Dale Morris, Senior Research Associate (University of Oklahoma/CIMMS and NOAA).

Training users on how to use high performance computing resources — and the data that comes out as a result of those analyses — is an essential function of most research organizations. Having a robust, scalable, and easy-to-use platform for on-site and remote training is becoming a requirement for creating a community around your research mission. A great example of this comes from the NOAA National Weather Service Warning Decision Training Division (WDTD), which develops and delivers training on the integrated elements of the hazardous weather warning process within a National Weather Service (NWS) forecast office. In collaboration with the University of Oklahoma’s Cooperative Institute for Mesoscale Meteorological Studies (OU/CIMMS), WDTD conducts its flagship course, the Radar and Applications Course (RAC), for forecasters issuing warnings for flash floods, severe thunderstorms, and tornadoes. Trainees learn the warning process, the science and application of conceptual models, and technical aspects of analyzing radar and other weather data in the Advanced Weather Interactive Processing System (AWIPS).

As part of the Amazon Sustainability Data Initiative, we invited Dale Morris, senior researcher at OU/CIMMS, to share how they use Amazon Web Services (AWS) to train their users to assess risk and issue warnings to a community that may be affected by hazardous weather.

Tell us about your role and how you are involved with the RAC.

I work for the University of Oklahoma, and through our Cooperative Institute for Mesoscale Meteorological Studies, I’ve supported the NOAA National Weather Service Warning Decision Training Division for about 15 years. I lead the Weather Event Simulator (WES) project and serve as an instructor in WDTD’s flagship course that trains all new NWS forecasters how to issue warnings for severe thunderstorms, tornadoes, and flash flooding. The WES is like a flight simulator, but for weather forecasters who work through past weather events that are played back just like they actually happened. With the WES, forecasters practice issuing warnings and forecasts with the exact same tools they use operationally.

What is the importance of the RAC and why do the forecasters want to participate in it?

The RAC puts forecasters through challenging weather situations as a first step in becoming ready to issue real warnings. Completion of the RAC is typically required for any NWS forecaster before being deemed ready to issue official real-time warnings to protect lives and property for the American public.

What is the structure of the RAC? Is this the first year you have done entirely virtual instruction?

After a period of online distance (asynchronous) learning in RAC, the WES is used to drive a few online synchronous webinars, plus it is the foundation of a week-long workshop. This workshop uses the WES in a variety of ways to suit the instructional needs of the trainees, ranging from individual student exercises (like a single-player video game for weather data analysis) to full-on multi-player simulations. In multi-player simulations, a forecaster team works through an event together, simulating the actual work environment of an operational NWS forecast office. In this full simulation mode, an instructor typically guides a team where each forecaster has a different warning responsibility. Responsibilities can include specific areas that may dynamically change through the event or specific weather hazards like flash flooding. The WES allows the forecaster team to collaborate on warning decisions and adjust workloads as the event demands it.

The WES runs alongside the normal software that forecasters use to do their daily jobs, the Advanced Weather Interactive Processing System (AWIPS). WES gives AWIPS the ability to run in a “displaced real time” mode by controlling data availability and the clock. WES manages up to five simultaneous instances of AWIPS, each of which can contain a single weather event. Since RAC is job-centered training, it’s critically important that we train forecasters in a fashion that is as operationally relevant as possible, and with a variety of weather events.
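To make “displaced real time” concrete, here is a minimal Python sketch (not WES code; the case date, names, and logic are purely illustrative) of how a simulator can map the wall clock onto an archived event’s timeline and gate data availability accordingly:

```python
from datetime import datetime, timezone

# Illustrative only: WES itself is not written this way.
CASE_START = datetime(2011, 5, 24, 20, 0, tzinfo=timezone.utc)  # archived event start (example)
SESSION_START = datetime.now(timezone.utc)                      # when the simulation began

def simulation_time(now: datetime) -> datetime:
    """Map the wall clock onto the archived event's timeline."""
    return CASE_START + (now - SESSION_START)

def is_available(product_valid_time: datetime, now: datetime) -> bool:
    """Release a data product only once the displaced clock reaches it."""
    return product_valid_time <= simulation_time(now)
```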

The NWS decided to deliver RAC entirely online (“RAC-in-the-Cloud” or RITC) this year due to the COVID-19 pandemic. RAC normally employs a blended instructional approach: online instruction followed by the week-long workshop in our facilities at the National Weather Center in Norman, Oklahoma. This isn’t the first time the course has gone entirely online. Many years ago, an earlier version went from a 3-week residence course to an online-only course with no workshop. An evaluation back then revealed that the hands-on workshop portion was critically important to the success of the course, so a 3–4 day workshop was reinstated, underscoring the importance of what we’re doing in RITC today. For the past several years, the workshop has run 5 days to give forecasters the opportunity to complete more simulations.

Figure 1. A map of the USA showing the locations of RITC participants and instructors for the Spring 2021 RITC Workshops.

How were you able to move AWIPS to the cloud so quickly? Have you been experimenting with this previously?

Besides running WES in our lab facilities in Norman, OK, each NWS forecast office also has a simulator. To better support each of these forecast offices, we have been working toward a cloud-based version of the WES. We connected with AWS through the NOAA Big Data Program. In my role as a university employee, I was awarded some Earth on AWS grants to prototype deploying both AWIPS and the WES, along with other associated software, in Amazon Elastic Compute Cloud (Amazon EC2). These proofs of concept enabled me to help socialize the benefits of using the cloud for training purposes among various NWS staff. I also found other folks within the NWS who were doing similar things in AWS, but for other purposes. Once NWS procured their own AWS accounts for a training prototype, we quickly redeployed AWIPS and WES to new Amazon EC2 instances, and our team started to formulate workflows that could support the RITC workshops. These workflows included identifying how many Amazon EC2 instances, and how many AWIPS and WES instances, were needed to support instructors coaching trainees who would interact with AWIPS during the weather events used in the lab exercises. This helped enable effective instruction while overcoming certain technical constraints inherent in AWIPS and WES.
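As a rough illustration of what that prototyping could look like, the sketch below uses boto3 to launch a single GPU-backed server of the class discussed later in this post. The AMI ID, key pair, and security group are placeholders, not the project’s actual configuration:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Placeholder values throughout: the AMI, key pair, and security group are
# hypothetical, not the project's real configuration.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",            # AMI with AWIPS + WES preinstalled (hypothetical)
    InstanceType="g3.4xlarge",                  # GPU instance class used for the WES servers
    MinCount=1,
    MaxCount=1,
    KeyName="wes-training-key",                 # hypothetical key pair
    SecurityGroupIds=["sg-0123456789abcdef0"],  # should allow NICE DCV traffic (TCP 8443)
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "wes-server-01"}],
    }],
)
print("Launched", response["Instances"][0]["InstanceId"])
```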

Can you give us a brief technical overview of how AWIPS integrates with AWS services? Which services are you using?

As I previously mentioned, we are trying to replicate a unique in-residence training workshop. The normal workshop experience features instructors looking over the shoulders of forecasters working through their exercises and events. The instructors work one-on-one with individual forecasters or with an entire team of three forecasters. To replicate the usual NWS workstation configuration in our training lab, each workstation has three monitors (two large wide-screen monitors plus a smaller screen for additional instructional information and product generation). Typical warning operations require two large monitors full of radar and other weather data to help the forecasters maintain an efficient workflow and situation awareness. Lab workstations are physically and logically arranged in groups of three so forecasters can work together. During a typical week we use the WES in different ways to facilitate different types of instruction. Forecasters work through three events individually in a time-paused mode to dive deep into data analysis and warning issuance. Ramping up to more difficult exercises, each group of forecasters completes one of ten “mini-scenarios” where each team briefs their warning decisions to the entire class. Finally, the teams complete four full multi-player displaced real-time simulation events.

Replicating the technical side of the RAC workshop in the cloud involved determining the number of Amazon EC2 instances that we would need based on the number of students and available instructors. It also included deploying the archived weather events to the particular instances using a combination of Amazon S3 buckets and Amazon Elastic Block Store (EBS) volumes. The instructional piece of this was particularly challenging because the instructors (mainly but not exclusively working from home) required the ability to view all the forecasters’ screens for the group. They also needed to manage the simulation controller, video conferences, student worksheets, instructional guides, and chatrooms.
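The data-staging step might look something like the following sketch, which copies one archived weather case from an S3 bucket onto the EBS volume mounted on a WES server. The bucket, prefix, and mount point are hypothetical examples, not the team’s actual layout:

```python
import boto3
from pathlib import Path

s3 = boto3.client("s3")

BUCKET = "example-wes-case-archive"             # hypothetical bucket
CASE_PREFIX = "cases/2011-05-24-okc-tornado/"   # hypothetical case prefix
DEST = Path("/data/wes")                        # example mount point of the EBS data volume

# Walk the case prefix and mirror every object onto the local EBS volume.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET, Prefix=CASE_PREFIX):
    for obj in page.get("Contents", []):
        target = DEST / obj["Key"]
        target.parent.mkdir(parents=True, exist_ok=True)
        s3.download_file(BUCKET, obj["Key"], str(target))
```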

The AWIPS display software renders OpenGL graphics directly using GPUs. Fortunately, AWS offered the NICE DCV software, which renders this graphical output from virtual GPUs while providing each user with a Linux desktop just as if they were sitting in front of the workstation in their forecast office. While there is a NICE DCV software client, it can also run inside a web browser, which is attractive in our situation because there are no additional software requirements for each forecaster-trainee beyond what they would normally have on their office or home computers. NICE DCV also provides the ability to spread the Linux desktop across two monitors, so we can replicate the two large monitors we normally have in our lab. Finally, NICE DCV allows multiple logins to the same display session (interactive screen sharing), so our instructors can view the forecasters’ screens to monitor their data analysis and warning decision methodologies. The main constraint from the trainee perspective is having at least two large monitors and enough bandwidth.
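For readers unfamiliar with NICE DCV, the sketch below shows roughly how virtual sessions can be created on a Linux server with the standard `dcv` command-line tool, one session per trainee. The usernames and session-naming scheme are illustrative, not the workshop’s actual setup:

```python
import subprocess

# Hypothetical roster; each trainee gets their own virtual session (and thus
# their own Linux desktop) on the shared WES server.
TRAINEES = ["forecaster1", "forecaster2"]

for user in TRAINEES:
    subprocess.run(
        ["dcv", "create-session", "--type=virtual", "--owner", user, f"wes-{user}"],
        check=True,
    )

# Instructors can then join a trainee's session from a browser at
# https://<server>:8443/ to watch the shared screen.
subprocess.run(["dcv", "list-sessions"], check=True)
```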

We found that a single instructor could manage a group of two forecasters (two AWIPS screens each) plus the instructor’s own WES screen to control the simulation and view the forecasters’ warning polygons and text. The instructors could also manage viewing worksheet windows for each student, a video conference window, and an internal chat room before becoming overwhelmed, which would potentially detract from delivering an effective teaching experience. The first time I instructed, I found I was managing 10 windows; in later scenarios I required 6–7 windows, depending on the design of each simulation or exercise. One of our staff developed a Chrome browser extension that helps instructors identify which NICE DCV window belongs to which forecaster. The extension can also intercept mouse input so instructors can move over a student’s window without interfering with student workflow, or disable this feature to demonstrate something to the student via the mouse.

Figure 2. The instructor’s view of two student workstation desktops.

Given instructional, staffing, budgetary, and AWIPS software limitations, we are running eight WES “server” instances in Amazon EC2: g3.4xlarge machines, each with a 1,500-GB EBS volume that contains our weather scenario data. This configuration supports eight students and four instructors simultaneously. Starting in April, we doubled the number of students and instructors by using an additional set of much smaller display clients that attach to the eight main servers. Based on a 55-GB AMI, we can quickly spin up these eight additional “disposable” visualization EC2 instances (g4dn.2xlarge), and they exist only when we need them to support trainees.
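A minimal sketch of that spin-up/tear-down pattern with boto3 follows; the AMI ID and tag values are placeholders:

```python
import boto3

ec2 = boto3.client("ec2")

# Launch the eight display clients from the visualization AMI (placeholder ID).
resp = ec2.run_instances(
    ImageId="ami-0fedcba9876543210",   # hypothetical 55-GB visualization-client AMI
    InstanceType="g4dn.2xlarge",
    MinCount=8,
    MaxCount=8,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Role", "Value": "ritc-display-client"}],
    }],
)
client_ids = [i["InstanceId"] for i in resp["Instances"]]

# ...run the workshop session...

# Terminate the clients afterwards so they only exist while needed.
ec2.terminate_instances(InstanceIds=client_ids)
```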

Because of the AWS deployment, we can use the WES software in ways beyond its initial design specifications. Additional CPU and memory in the server instances, compared to our normal workstation hardware, allowed us to add an AWIPS instance to each of our WES servers. Normally WES runs on a single standalone workstation, which restricts it to running only one simulation at a time. However, since NICE DCV gives us multiple simultaneous, accessible Linux desktop sessions, we can actually run multiple simulations at the same time with different users; this becomes a requirement for some of our exercise sessions when we move to cohorts of 16. This also gives us additional flexibility in case-data management (reducing workshop session set-up time). This setup even allowed us to drive an interactive webinar that featured two different weather events, while also conducting a workshop session with eight forecasters at the same time.

What do you think the future holds for being able to use tools like AWIPS over environmental data in the cloud?

Even with this online deployment, the RITC workshop is not expected to be as effective as the residence versions. For various logistical and instructional reasons, we had to cut time from most of our simulation activities and eliminate some activities. Of course, the interpersonal social aspects of the residence workshop are also diminished at a distance. So while RITC is an acceptable alternative during the COVID pandemic, it is not foreseen as an ultimate replacement for the residence version. That said, we have gained valuable experience that will be useful in expanding cloud-based weather-event simulations in RAC and across the NWS in other learning opportunities where they make the most sense.

I mentioned earlier that we have been working on a cloud-based solution for each NWS office to use their WES. The hardware architecture and security posture of the operational AWIPS systems place constraints on the WES, requiring each office to archive datasets independently to use with its WES workstation. In addition, the initial AWIPS archived data is not colocated with the WES, which necessitates some data management tasks. And as datasets from the WSR-88D radars, the GOES-16 and GOES-17 satellites, and forecast models have grown over the years, data management tasks have consumed more of local NWS staff’s time. We are in the process of developing a centralized archive and weather-event storage repository that should streamline these tasks. This will allow local training officers to focus on science and training rather than slinging data around. Just having the simulator colocated with the data archives is huge for the WES program. The cloud also enables offices to more easily share simulations with each other. The environmental data archives that already exist in Amazon S3 also represent an opportunity to supplement existing cases and build new ones.
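For example, the WSR-88D (NEXRAD Level II) archive is already available as a public bucket on Amazon S3 through the NOAA Big Data Program, so a case builder can list and pull radar volumes directly. The bucket name and key layout below are real; the radar site and date are just an example:

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# The NEXRAD Level II archive is public, so no AWS credentials are required.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

resp = s3.list_objects_v2(
    Bucket="noaa-nexrad-level2",
    Prefix="2021/05/03/KTLX/",   # year/month/day/radar-site
)
for obj in resp.get("Contents", [])[:5]:
    print(obj["Key"], obj["Size"])
```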

Each NWS office also has only one WES workstation, which means only one forecaster can use the WES in their office in a single-player mode, as contrasted with the multi-player functions that the RAC implementation provides. Cloud technology will allow more forecasters from a single office to complete more simulations in a shorter amount of time. We anticipate being able to design simulations where multiple forecasters from the same office can work through an event together so they can practice teamwork skills, in addition to hardcore data analysis. We also intend to design additional multi-player simulations where forecasters from neighboring offices can participate at the same time.

A sister agency, the NOAA National Severe Storms Laboratory (NSSL), physically colocated with us in the National Weather Center, is using similar cloud-based technology to conduct virtual testbed activities with forecasters spread across the country. Some of their testbed experiments are using the WES and AWIPS in AWS with the NICE DCV to test new prototypes of experimental forecast and warning tools that forecasters may be able to use operationally in the coming years.

I have been involved in training/education/outreach and data visualization in meteorology for nearly three decades. An occurrence in a recent RITC workshop was not lost on me. One of the forecaster groups (two forecasters working together in the same instances all week long) was split between Juneau, Alaska, and Pittsburgh. Their instructors were based in the Norman area; Tulsa, Oklahoma; Plano, Texas; and Omaha, Nebraska. Doing this type of remote real-time collaboration within AWIPS was thought to have been impossible just a few years ago. So the future for training NWS forecasters in advancements in science and technology and in teamwork to help protect our nation’s population and property is very bright.

To learn more about the Amazon Sustainability Data Initiative, visit our webpage.

About the authors

Dale Morris is a senior researcher at OU/CIMMS in Norman, Oklahoma. He received his BS and MS in Meteorology from the University of Oklahoma. He has been working in the weather enterprise in Norman for over 30 years.

Angel Pizarro

Angel is a Principal Developer Advocate for HPC and scientific computing. His background is in bioinformatics application development and building system architectures for scalable computing in genomics and other high throughput life science domains.

Zac Flamig

Zac Flamig, PhD, is the Tech Lead for the Amazon Sustainability Data Initiative (ASDI), helping data producers and consumers engage with each other with a focus on environmental sustainability. Zac aids customers in optimizing their data for sharing in the cloud by applying best practices developed by the community of data users. He has a background in meteorology and hydrology, having spent over a decade building applications to better predict flash floods both in the U.S. and abroad.