HWRF  trunk@4391
scripts Namespace Reference

High-level logic of the HWRF Python scripts.

Detailed Description

High-level logic of the HWRF Python scripts.

Scripting Layer

The scripts package contains the high-level logic of the HWRF system. These are command-line executable scripts that can be run manually or by the workflow layer (such as the wrappers, Rocoto or the J-Jobs via ecFlow). Their job is to pass control on to the hwrf and pom packages, which do the actual work of running the HWRF system.

Generally, these are all simple Python scripts which load the hwrf_expt module, import some of its objects and tell them to execute part of the workflow. All of these scripts will log an initial message and a final success or failure message to the main log stream, the jlogfile.

Most of the Python scripts in this directory follow a standard pattern (a minimal sketch appears after this list):

  1. Initialize the produtil.setup and hwrf_expt Python modules.
  2. Log a message to the produtil.log.jlogger saying that the script is starting.
  3. Import one or more hwrf.hwrftask.HWRFTask objects and call their run() methods.
  4. Log a message to the produtil.log.jlogger saying the script has succeeded or failed.
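
For concreteness, here is a minimal sketch of that pattern. The script name exhwrf_example and the hwrf_expt.forecast task object are hypothetical stand-ins; produtil.setup.setup() and hwrf_expt.init_module() are the initialization entry points these scripts use.

    #! /usr/bin/env python
    # Minimal sketch of the standard script pattern described above.
    # "exhwrf_example" and the hwrf_expt.forecast task are hypothetical.
    import sys
    import produtil.setup
    from produtil.log import jlogger
    import hwrf_expt

    def main():
        hwrf_expt.init_module()                     # step 1 (continued)
        jlogger.info('exhwrf_example is starting')  # step 2
        hwrf_expt.forecast.run()                    # step 3: run an HWRFTask
        jlogger.info('exhwrf_example completed')    # step 4 (success)

    if __name__ == '__main__':
        try:
            produtil.setup.setup()                  # step 1
            main()
        except Exception as e:
            jlogger.critical('exhwrf_example failed: ' + str(e),
                             exc_info=True)         # step 4 (failure)
            sys.exit(2)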

The actual logic of how to run the HWRF system is in the hwrf package, and the connections between the various HWRF components are made in the hwrf_expt module and its various helper modules (such as hwrf.init and hwrf.hwrfsystem).

In some cases, the Scripting Layer has to do significant extra processing to glue everything together. For example, exhwrf_output has to copy many tasks' outputs to the COM directory or elsewhere. exhwrf_post has to run several tasks at the same time, which it does with a sophisticated loop (a simplified sketch follows this paragraph). exhwrf_products has a similar problem, which it solves by running multiple scripts at the same time in different MPI ranks.
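
To make the looping approach concrete, here is a simplified sketch, not the actual exhwrf_post code: it assumes each task object exposes a run() method and a boolean completed attribute (illustrative names, not the exact HWRF task interface).

    import time

    def run_until_done(tasks, logger, sleep_time=30):
        # Give every incomplete task a chance to process whatever
        # forecast output exists so far, then sleep and try again
        # until all tasks report completion.
        incomplete = list(tasks)
        while incomplete:
            for task in incomplete:
                try:
                    task.run()
                except Exception:
                    logger.warning('task failed; will retry next pass',
                                   exc_info=True)
            incomplete = [t for t in incomplete if not t.completed]
            if incomplete:
                time.sleep(sleep_time)  # wait for more forecast output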

Scripting Layer Contents

This section describes the relationships between the various scripts and references useful pages of documentation. The names in parentheses are hwrf and pom package documentation pages.

HWRF Launching Script

There is one script that must be run first:

exhwrf_launch — sets up the workflow for a particular forecast cycle

That launcher job creates the directory structure, database files and other critical initial files needed by the HWRF system. It will also run sanity checks to see if the system will be able to run the requested configuration.

Data Pulling/Pushing Scripts

These scripts pull data from tape or over the internet to serve as input to the HWRF, or they push the data to tape for storage. They are needed because the input for a full-scale test of the HWRF system generally cannot fit on disk: the data from four years of GFS, GDAS and GFS ENKF runs is far too large. Instead, it is kept on tape and pulled to disk as needed.

exhwrf_input — pulls data for input (hwrf.input) to HWRF

exhwrf_bufrprep — turns data tanks into bufr files for exhwrf_gsi's consumption (hwrf.bufrprep, hwrf.gsi)

exhwrf_para_archive — pushes data to tape after HWRF has finished

exhwrf_wrfout_archive — special extra archiving job for native wrfout files

Initialization Scripts

After the parent model and observational data have been pulled from tape (or wherever), they must be processed to create initial and boundary conditions. Here are the initialization jobs, in the approximate order in which they appear in the workflow:

exhwrf_ocean_init — generates ocean initial and boundary conditions (hwrf.mpipomtc, pom) from data pulled by the exhwrf_input script.

exhwrf_init — spectral processing (hwrf.prep), interpolation and creation of wrfanl, ghost, wrfinput, wrfbdy and other input files (hwrf.fcsttask) from inputs obtained by the exhwrf_input script. The high-level description of this workflow is created in the hwrf.init module.

exhwrf_relocate — takes input from the exhwrf_init job, and relocates the vortex, resizing it and changing its intensity if needed (hwrf.relocate)

exhwrf_gsi — runs the GSI data assimilation system (hwrf.gsi) on the relocated vortex

exhwrf_merge — merges (hwrf.relocate) the relocated vortex and GSI output to create the final input to WRF

Forecast and Post-Processing

After the initial and boundary conditions are available, the forecast starts, and its output must be converted to forms usable by forecasters.

exhwrf_gsi_post — post-processes (hwrf.gsipost, hwrf.hwrfsystem) the inputs and outputs of GSI (hwrf.gsi) to create lat-lon GRIB2 files suitable for study. This is for examining the effect of data assimilation on the input conditions to the forecast.

exhwrf_forecast — runs the full-length forecast, either with (hwrf.mpipomtc.WRFCoupledPOM) or without (hwrf.fcsttask.WRFAtmos) ocean coupling. Takes inputs from the exhwrf_ocean_init, exhwrf_init, exhwrf_relocate and exhwrf_merge jobs.

exhwrf_unpost — deletes the output of exhwrf_post and exhwrf_products, and some of the output of exhwrf_output, allowing the post-processing to be redone.

exhwrf_post — runs the Unified Post Processor (hwrf.post, hwrf.hwrfsystem) on the output of the exhwrf_forecast to create native grid GRIB files.

exhwrf_products — runs GRIB regridding utilities (hwrf.gribtask, hwrf.regrib) on the output of the exhwrf_post to create lat-lon GRIB2 output files suitable for use by forecasters. Also runs the GFDL vortex tracker (hwrf.tracker) to create a hurricane track file.

exhwrf_output — delivers the output of the exhwrf_products and exhwrf_forecast to their destination. In the operational HWRF, this job also emails the NOAA Senior Duty Meteorologist the AFOS file, which contains the HWRF track.

Data Assimilation Ensemble

The HWRF system is capable of running a 6 hr forecast ensemble based on the output of the GFS ENKF. The output of this ensemble is fed into the next HWRF cycle's exhwrf_gsi, where hwrf.gsi uses it to generate forecast error covariances.

exhwrf_ensda_pre — determines if the ensemble should be run

exhwrf_ensda — runs one member of the forecast ensemble (hwrf.ensda)

exhwrf_ensda_output — checks to see if the exhwrf_ensda scripts all completed

Namespaces

 exhwrf_bufrprep
 Runs the hwrf.bufrprep.Bufrprep task to convert data tanks to bufr files for input to the hwrf.gsi tasks.
 
 exhwrf_check_init
 Checks the initialization jobs' output and database entries to see if the initialization appears to have succeeded.
 
 exhwrf_ensda
 Runs one member of the HWRF data assimilation ensemble.
 
 exhwrf_ensda_output
 This script exists only for dependency tracking with Rocoto.
 
 exhwrf_ensda_pre
 Determines whether the ENSDA needs to be run for this cycle and writes the ENSDA flag file.
 
 exhwrf_forecast
 Runs the HWRF forecast based on provided input data, producing native output suitable for the HWRF post processing suite.
 
 exhwrf_gsi
 Runs the GSI data assimilation system scripts in hwrf.gsi on one of the HWRF resolutions.
 
 exhwrf_gsi_post
 Runs the HWRF post-processing on the inputs and outputs to the GSI.
 
 exhwrf_hycomab_archive
 Archives *.a and *.b files output by the hycom component of the coupled HWRF forecast job.
 
 exhwrf_init
 Runs the spectral, initial and boundary processing and generates WRF input files for use by other scripts.
 
 exhwrf_input
 Pulls input data from tape or over the network.
 
 exhwrf_launch
 Creates the initial HWRF directory structure for executing a single HWRF cycle.
 
 exhwrf_merge
 Merges the exhwrf_relocate and exhwrf_gsi outputs to create the final input to the exhwrf_forecast.
 
 exhwrf_ocean_init
 Runs the ocean initialization for the chosen ocean model and sets the ocean status file accordingly.
 
 exhwrf_output
 Delivers data to the COM directory or elsewhere.
 
 exhwrf_para_archive
 Generates an archive file from HWRF COM directory outputs.
 
 exhwrf_post
 Runs the Unified Post Processor on the output of the exhwrf_forecast script.
 
 exhwrf_products
 Runs regribbing operations on the output of exhwrf_post, and runs the GFDL vortex tracker on the regribbed output.
 
 exhwrf_relocate
 Relocates, resizes and otherwise modifies the vortex from the parent model and prior HWRF cycle's forecast.
 
 exhwrf_unpost
 This script marks all post-processor products as having not been created.
 
 exhwrf_wrfout_archive
 Archives native wrfout files to tape via the "htar" command.