SPRING PROGRAM 2001
Science Support Area
16 April - 8 June 2001
Paul R. Janish and Steven J. Weiss
Storm Prediction Center
John S. Kain, and Michael E. Baldwin
National Severe Storms Laboratory
I. Historical Perspective
Co-location of the Storm Prediction Center (SPC) with the National Severe Storms Laboratory (NSSL) and other agencies in the Norman, OK Weather Center has facilitated considerable interaction and collaboration on a variety of experimental forecast and other operationally relevant research programs. A wide cross section of local and visiting forecasters and researchers has participated in these programs over the past several years. These include forecasting support for field programs, establishing the SPC winter weather mesoscale discussion product, evaluating operational and experimental NWP, and integrating new observational data, objective analyses, and display tools into forecast operations. A key goal of these programs is to improve forecasts of hydrological and meteorological phenomena by speeding up the transfer of new technology and research ideas into forecast operations at the SPC, and sharing new techniques, skills, and results of applied research more freely. Typical issues addressed in these exercises include, but are not limited to: data overload concerns in operations, testing and evaluation of new analysis or predictive (NWP) models, better understanding of operational forecast problems, development and evaluation of diagnostic conceptual models, and new product development and display strategies.
During the springs of 2000 and 2001, these collaborative programs will
focus on critical SPC operational products, including the short-term
predictability of severe and non-severe thunderstorms and the potential
impact on operational convective watch lead time. This document provides
an overview of the logistical, personnel, planning, and verification issues
involved in the program for the coming year.
II. Program Motivation, Goals and Objectives
Over the last 20 years verification statistics have shown a gradual decrease in watch lead time. One reason is the large increase in severe events, especially marginally severe reports, that are compiled by WFOs each year. The increase in the severe event database has been influenced by a number of factors, including improvements in spotter networks, the deployment of the WSR-88D radars and a resultant increase in detection capability of severe storms, and incentives associated with the national warning verification program that encourage WFOs to obtain ground-truth for suspected severe storms. However, other internal SPC factors have also contributed to a decrease in lead time. Beginning in the 1980s, forecasters became more reliant on new sources of real-time observational data, particularly from satellite and radar, to monitor the life cycle of thunderstorms. Most notable was the discovery that forecasters could often wait until they saw signs of convective initiation before issuing a watch. This new operational methodology resulted in more accurate placement of watches in time and space, but it also changed the character of the convective watch from a pure forecast product to a hybrid nowcast/forecast product.
Over the last two decades, SPC has been a recognized leader in the use of interactive computer workstations for operational forecasting of short-term hazardous weather. Given our primary mission of mesoscale forecast responsibility, it is not only prudent but necessary to place a strong emphasis on diagnostic analysis using real-time observational data. However, owing to insufficient sampling of the mesoscale environment (especially when the distribution of water vapor is considered) coupled with limited scientific knowledge of important mesoscale and storm-scale processes, considerable uncertainty still exists in the short-term prediction of convection. As a result, it is in our best interests to more fully explore the potential use of operational and experimental mesoscale model guidance to see if there is information from the models that can help us more confidently predict when and where convection will develop a few hours in advance. This approach allows us to examine important issues related to mesoscale model performance, use of new model-based prediction systems, and information transfer from models to forecasters, that can be directly related to forecaster decision making and potential improvements in watch lead time.
The goal of the Spring Program is to facilitate
collaboration and interaction between SPC forecasters and leading scientists
in the field of meteorology to advance operationally relevant research
and improve forecasts. During Spring Program 2001, the primary objective
is to explore ways to improve short-term forecasts of convective initiation
and evolution, directly leading to increased projection time in SPC severe
local storm watches (where projection time is typically defined as the
time period between watch issuance and the time of the first severe report
in the watch). A final report on findings from the Spring Program will
be available by the end of the summer.
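The projection-time metric defined above can be illustrated with a short calculation. This is a minimal sketch, not SPC verification code; the function name, timestamp format, and example times are all hypothetical:

```python
from datetime import datetime

def projection_time_minutes(watch_issued: str, first_report: str) -> float:
    """Minutes between watch issuance and the first severe report in the
    watch (a negative value would mean the report preceded issuance)."""
    fmt = "%Y-%m-%d %H:%M"  # assumed UTC timestamps
    issued = datetime.strptime(watch_issued, fmt)
    report = datetime.strptime(first_report, fmt)
    return (report - issued).total_seconds() / 60.0

# Hypothetical watch: issued 1840Z, first severe report at 1955Z
print(projection_time_minutes("2001-05-16 18:40", "2001-05-16 19:55"))  # 75.0
```

Larger values of this quantity, aggregated over many watches, would correspond to the increased lead time the program aims for.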
III. Program Thrust Areas
Spring Program 2001 will have seven (7) research thrust areas:
IV. Spring Program Web Site

A full description of all program objectives, types of model output, forecast products, evaluation and verification forms, the daily weather summary, and other related links is available at the Spring Program web site:
This web site is under development and
will be fully operational by 1 May 2001. The site is intended to support
real time operations as well as additional research and reference after
the conclusion of the program.
V. Dates and Participants
Spring Program 2001 will run M-F from 16
April through 8 June 2001. Full time participants will work shifts of one
week with part-time visiting scientists participating on a 2-3 day basis
(schedule permitting). Program operations will be conducted in the Science
Support Area (SSA) located adjacent to the SPC Operations area. The full
time forecast team will consist of three to four forecasters and/or scientists
to complete daily forecasts and participate in evaluation/ verification
exercises. Staffing will include one SPC forecaster, one NSSL scientist
and one or two visiting scientists from NCEP/EMC, FSL, WFO/OUN, and Iowa
State University. Visiting participants are invited to present a
seminar to the Norman Weather Center. Interested persons should contact
Paul Janish or Jack Kain at their discretion. A brief training
session will be provided to all participants prior to their first scheduled
shift. A full schedule of participants is provided in Attachment A.
VI. Daily Operations Schedule
SPC, NSSL, and visiting staff will create forecast products, conduct evaluation exercises, and participate in a daily map discussion in the Science Support Area from 8 am-4 pm Monday through Thursday. Operations on Fridays will run from 8 am-2 pm and will serve to verify the previous day's forecast as well as document findings by the forecast team during the prior week. No official forecasts will be created on Friday (see daily schedule below).
Participants are encouraged to perform evaluation exercises collaboratively. Participants may eat lunch while completing exercises or at their discretion any time during the day. An outline of the daily schedule for activities during the Spring Program is as follows:
Monday - Thursday:
8:00 am - 10:00 am: Subjective verification of previous day's forecast and model parameters.
10:00 am - 12:30 pm: Preparation of Forecast #1 (valid 1-4 pm; due 12:00 pm) and evaluation forms.
12:30 pm - 1:30 pm: Prepare and lead daily Map Discussion.
1:30 pm - 3:30 pm: Preparation of Forecast #2 (valid 4-7 pm; due 3:00 pm) and evaluation forms.
3:30 pm - 4:00 pm: Summarize activities, archive data, and wrap-up.
Friday:
8:00 am - 10:30 am: Subjective verification of previous day's forecast and model parameters.
10:30 am - 11:30 am: NSSL Weekly Seminar.
11:30 am - 12:30 pm: Compile weekly summary of events, issues, comments, etc.
12:30 pm - 1:30 pm: Prepare and lead daily Map Discussion.
1:30 pm - 2:00 pm: Complete any remaining data archiving and wrap-up.
VII. Forecast Product
A forecast component will be included in the program this year, consisting of mesoscale-sized convective forecast products valid for short time periods. The intent is to examine the ability of forecasters to issue scheduled short-term convective forecasts (initiation of general and severe convection) with up to a 4-hour lead time.
The forecasts will consist of two graphical products and a short written discussion explaining the rationale of the forecast, with emphasis on the role of the model guidance in the decision-making process. In order to limit the size of the geographic area the forecasts will be valid for, the experimental products will focus on severe risk area(s) delineated in the 13Z SPC Day-1 Outlook. Separate "confidence" forecasts will be made for: 1) the occurrence of thunderstorms, and 2) the occurrence of severe thunderstorms, within the risk area. These events will be verified by CG lightning strike data and severe storm reports, respectively.
If more than one severe risk area is included in the 13Z outlook, the forecasters will choose one risk area to concentrate on. (The forecast team will fill out a "Daily Forecast Area" form to provide general information regarding the synoptic and mesoscale conditions expected in the forecast area - Attachment B.) Since we are most interested in the timing and location of the initiation of convection and severe storms, rather than the continuation of existing convection and severe storms, these considerations will affect the choice of outlook areas. Areas of nocturnal convection should also be avoided, as these events will most likely take place outside of our forecast period. There will be a choice of up to three contours (Low, Medium, High) representing discrete levels of forecaster confidence in convective initiation or development of severe convection during a 3-h period, as described below. For severe convection, this level of confidence also represents the confidence that a "watch" will be issued by the forecast team during the forecast period.
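As one illustration of how a confidence contour could later be checked against CG lightning strike data, a simple point-in-polygon test suffices. This is a hedged sketch only; the contour vertices, strike locations, and the idea of counting strikes inside the contour are assumptions for illustration, not the program's actual verification procedure:

```python
def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: True if (lon, lat) falls inside the polygon,
    given as a list of (lon, lat) vertices in order."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edge crossings of a ray extending east from the point
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Hypothetical Medium-confidence contour and CG strikes for one 3-h period
medium_contour = [(-98.0, 34.0), (-96.0, 34.0), (-96.0, 36.0), (-98.0, 36.0)]
strikes = [(-97.2, 35.1), (-99.5, 33.0)]
hits = sum(point_in_polygon(lon, lat, medium_contour) for lon, lat in strikes)
print(hits)  # 1 strike verified inside the contour
```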
Experimental forecasts will be issued twice daily and are valid for specific time periods.
Issue Time              Valid Period
1700Z (12:00 PM CDT)    18-21Z (1-4 pm CDT)
2000Z (3:00 PM CDT)     21-00Z (4-7 pm CDT)
After each forecast package is issued,
the forecast team will complete a multiple choice evaluation form, questionnaire,
and log that will be used to document the usefulness of various sources
of model information and displays in the forecast decision-making process.
In order to complete evaluation forms in a timely manner, part of the forecast
team should begin completing them while the forecast discussion is being
written. Examples of the forecast product and evaluation forms are provided
in Attachment C.
VIII. Verification Exercises and Daily Map Discussion
From 8:00 am - 10:00 am daily, the forecast team will have the opportunity to subjectively evaluate forecasts made the previous day as well as evaluate parameters from specific models. Since no forecasts are made on Friday, participants will spend the first two hours on Monday morning on orientation and familiarization, or get an early start on creating the first-period forecast product. Verification will be made by comparing severe reports and C-G lightning displays with forecasts made the previous day using N-AWIPS. A variety of verification data and displays (models, satellite/radar image data, soundings, observed data, severe reports, etc.) will also be available on the Spring Program web site. An objective verification of probability forecasts will also be performed at the conclusion of the program.
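An objective verification of probability forecasts could, for example, use a Brier score. The sketch below is an assumption about one plausible approach; the mapping of Low/Medium/High confidence levels to probabilities is hypothetical and not specified in this plan:

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between forecast probabilities (0-1) and
    observed outcomes (1 = event occurred, 0 = it did not). Lower is better."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical mapping of confidence categories to probabilities
conf_prob = {"Low": 0.25, "Medium": 0.50, "High": 0.75}

# Four hypothetical forecast periods and whether the event verified
forecasts = [conf_prob[c] for c in ["High", "Low", "Medium", "High"]]
outcomes = [1, 0, 1, 1]  # e.g. CG lightning observed in the contour, or not

print(round(brier_score(forecasts, outcomes), 4))  # 0.1094
```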
The forecast team will complete a web based
verification/evaluation form intended to solicit specific information regarding
the quality of the forecast, utility of model data and what data had the
highest impact on the forecast. Completion of this form should be done
in a collaborative fashion with each participant filling out the form on
a rotating basis. (An example of the verification form is available
in Attachment D). Findings during the verification exercises
will be presented in the first half (15 minutes) of the daily map discussion
at 1:00 pm CDT. The remaining 15 minutes will be allocated to an open discussion
related to the first period forecast (valid 18-21Z) and other short range
convective forecast issues. The map discussion is scheduled to end promptly
at 1:30 pm, to allow forecasters adequate time to prepare for the second
forecast later in the afternoon. The forecast team is asked to document
specific comments made in map discussion pertaining to convective forecast
issues and record the briefings for later review.
IX. Forecaster/Participant Duties and Responsibilities
All new participants will be scheduled with a program coordinator for training prior to or on the morning of their first scheduled shift. However, to become familiar with program goals and objectives, all participants are asked to read the operations plan prior to their first day in the SSA.
The forecast team will be made up of three members on most days, with a visiting scientist acting as a 4th participant on select days (see schedule, Attachment A). The most critical task for all participants is the timely creation and issuance of the forecast product. Completion of evaluation forms and documentation of key scientific findings and theories are also important, but should not drive the forecast product or delay issuance.
Participants in the Spring Program are
responsible for the following activities while on shift:
While it is recommended the entire forecast team work together and interact on forecast issuance and verification, a suggested breakout of specific duties is as follows:
Forecaster A - SPC Representative who should lead the forecast team during daily operations. This forecaster is responsible for facilitating the forecast process and discussion, creating forecast graphics, and writing the forecast discussion. Their primary work area will be the SPC N-AWIPS workstation in the northwest corner of the SSA. Forecaster A should lead the map discussion on the first day of operations, but that responsibility should be shared among other participants as they become more familiar with systems/displays later in the week.
Forecaster B - NSSL Representative who is primarily responsible for providing insight into specific performance of the models, adding insight to the forecast process via use of model and/or observational data, and assisting in completing the forecast evaluation forms (with Forecaster C) while the forecast discussion is being written. This forecaster is also responsible for documenting comments made during the map discussion. Their primary work area will be the NSSL HP Workstation (N-AWIPS) located on the south wall of the SSA.
Forecaster C - Visiting Scientists should provide insight into that part of the forecast process with which they are most familiar. These participants should focus on their areas of expertise as it pertains to issuance of the forecast product or verification. Their primary work area will be the Linux/Windows PC located in the northeast corner of the SSA. This will allow them access to Internet (for non-standard displays) and N-AWIPS. This forecaster will work with Forecaster B to complete forecast evaluation forms while the forecast discussions are being written. They are also responsible for ensuring all products are saved to the daily forecast folder (below).
Forecaster D - These visiting scientists or forecasters are invited to participate in
the forecast discussion and provide insight as applicable. They are encouraged
to perform analysis of observed data or work with the forecast team as
applicable. This participant's primary work area will be the PC located
on the southwest corner of the SSA.
Participants are asked to follow the checklist (Attachment E) to ensure that all of the data listed below are archived in the daily folder or saved electronically:
Copy of ALL SPC Day 1 convective outlooks
Copy of SPC mesoscale discussions in SLGT risk areas 06Z-22Z.
Copy of Day 1 convective outlooks with lightning/severe reports overlaid (Janish).
Severe weather reports (text and graphics - flysheet, Janish).
Severe Reports / VIL for 18Z-21Z, 21Z-00Z, and 00Z-03Z.
Completed evaluation/verification forms.
Analyzed charts and other relevant information in the daily folder.
A checklist of duties is provided in Attachment E.
Additional instructions are provided in the Science Support Area.
X. Experimental Displays and Model Data
In order to incorporate new analysis displays and NWP model data into the forecast process, several non-operational data sets will be available for use during the Spring Program.
It is hoped that, through this proof-of-concept methodology, data sets and analysis tools which provide useful information during the Spring Program will be more efficiently integrated into the SPC operational data flow and workstations.
NWP model data which will be available to forecasters participating in the Spring Program includes the following (model run resolution / model display grid):
20km/80km Operational Eta
Model (00Z and 12Z)
20km/40km Operational Eta Model (00Z and 12Z)
20km/20km Operational Eta Model (00Z and 12Z)
10km/10km Experimental Nested Eta Model (00Z and 12Z; as available for forecast area)
20km/20km Experimental EtaKF Model (00Z and 12Z)
20km/40km Experimental EtaKF Model (00Z and 12Z)
40km/40km Operational RUC Model (12Z, 15Z, and 18Z)
20 km/20km Experimental RUC Model (12Z, 15Z, and 18Z)
Short Range (ETA/RSM) EMC Ensembles (SREF - 00Z only)
NSSL/MM5 Short Range Ensemble (00Z only)
Mesoscale Short Range Ensemble (MM5, Eta, EtaKF, RUC20; 00Z only)
Cloud Model Ensemble Predictions (Elmore - 00Z only)
WRF Model (Kain - 00Z only)
* Models labeled "Experimental" are experimental
data not typically available to SPC forecasters *
* All model data will be available via N-AWIPS workstations or Internet *
In addition to NWP data, several experimental
analysis displays will be available for Spring Program participants to
use. These include the ability to create point forecast sounding loops,
web based sounding comparison and difference computations for PFC and observed
soundings, and displays of 1-D output for the Kain-Fritsch and Betts-Miller-Janjic
convective parameterization schemes.
XI. Operations Center
Hardware and Software
Spring Program forecast and evaluation
exercises will take place in the Science Support Area (SSA), immediately
adjacent to the SPC operational forecast area. Equipment available to Spring
Program participants includes:
XII. Data Archive
In addition to hard copy archives saved in the daily folder, the following data will be archived to 8mm tape on a daily basis from 15 April through 8 June 2001. Archives will be in GEMPAK format unless otherwise stipulated. These data may be restored at the request of program participants pending disk space availability. Note: model data below are listed by model run resolution.
All Gridded Model Data:
20km/80km Oper ETA, 20km/40km Oper ETA, 20km/20km ETA3H_PCPN
10km/10km Exper Nested ETA model (when available)
20km/40km and 20km/20km ETAKF,
40km/40km and 20km/20km RUC2,
SREF runs (individual and ensemble data)
WRF (when available)
80km Operational Eta (basic fields for briefing purposes)
All Point Forecast Data:
ETA, ETAKF (others?)
SFCOA, sfcwxdataloop (metafiles)
NWS Text Products:
Surface Obs, Oklahoma Mesonet, Upper Air Obs, VAD/Profiler
All U.S. Mosaic Radar (BREF, CREF, ECHO, VILS, RAIN)
(This includes hourly accumulated and 24h accumulated values)
(Individual site radar data as necessary)
2km Visible and 4km WV/IR Satellite Imagery
Satellite Derived Data:
Lightning, Severe Reports
Special thanks and appreciation is extended
to all participants and staff for assisting in Spring Program preparations/planning,
programming and data flow issues. Without the combined efforts of many
SPC and NSSL staff, the Spring Program could not be conducted. In particular,
special thanks to Mike Kay (SPC) and Greg Carbin (SPC) for their work on
web page development, evaluation forms and archive; John Hart (SPC) for
software support and development; Phillip Bothwell (SPC) and Gregg Grosshans
(SPC) for providing access to model and verification data; Dave Stensrud
(NSSL) for experimental MM5 data access; Kim Elmore (NSSL) for providing
experimental cloud model ensemble data; Jay Liang (SPC), Gary Petroski
(SPC), Doug Rhue (SPC), Steve Fletcher (NSSL) and Brett Morrow (NSSL) for
assistance in configuring hardware/software in the Science Support Area
and Charlie Crisp (NSSL) for his expert meteorological analysis and contributions
to the web page. We further wish to recognize the full support of SPC and
NSSL management, and the enthusiasm of participants from the Forecast Systems
Laboratory (FSL), the Environmental Modeling Center (NCEP/EMC), the National
Weather Service Forecast Office in Norman, OK, and Iowa State University,
who provided the motivation for making such an undertaking a positive
experience for everyone.
Attachment A - Participant Schedule
Attachment B - Daily Forecast Area (Synopsis) Form
Attachment C - Forecast Product Example / Forecast Evaluation
Attachment D - Verification Forms
Attachment E - Daily Checklist