HWRF  trunk@4391
HWRF Experiment Configuration

This page discusses HWRF configurations in the trunk that are tested and known to work, including the primary (default) configuration. We discuss how to tell the scripts to run these configurations, and any other information one would need.

Data Availability

Although the HWRF can run many configurations, and knows the individual input requirements of each, you must have the needed input data. You may not have access to the data if you don't have an HPSS account, or don't have rstprod access. Worse, depending on the time period, the data may not even exist. Generally, on WCOSS, Zeus, Theia and Jet, it is possible to run any storm in the past ten days, so near-real-time runs should be possible on those machines.

However, for times older than ten days ago, your scripts will have to access the NOAA mass storage to pull archives of input data. Some retrospective datasets are available on Zeus, Jet or WCOSS, but generally not a complete set. The HWRF scripts know how to obtain this data automatically if you can access it. Unfortunately, some of the archives contain NOAA Restricted Data, and require rstprod access to obtain. You do not need any of the restricted data, but you do need some non-restricted data stored in the restricted archives. If you do not have rstprod access, you can request it, or ask someone who does have access to retrieve the data for you.

2016 Pre-Implementation HWRF

This is the default configuration for the HWRF. As we approach the 2016 HWRF upgrade, the trunk will slowly evolve towards the 2016 HWRF implementation. If you provide no arguments to the launcher job, this is the configuration you will get. However, due to the prelaunch system, one additional conf file will be loaded depending on the basin:

Basin   Ocean   Data Assim  Ensemble  Vertical  Model Top  Extra .conf File
N.Atl   3D POM  Always      TDR Only  61 level  2 mbar     hwrf_AL.conf (empty)
NE.Pac  3D POM  TDR Only    TDR Only  61 level  2 mbar     hwrf_EP.conf
NC.Pac  3D POM  TDR Only    TDR Only  61 level  2 mbar     hwrf_CP.conf
NW.Pac  3D POM  Never       Never     43 level  50 mbar    hwrf_other_basins.conf
N.Ind   3D POM  Never       Never     43 level  50 mbar    hwrf_other_basins.conf
S.Pac   3D POM  Never       Never     43 level  50 mbar    hwrf_other_basins.conf
S.Ind   3D POM  Never       Never     43 level  50 mbar    hwrf_other_basins.conf
S.Atl   None    Never       Never     43 level  50 mbar    hwrf_other_basins.conf
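The per-basin selection in the table above amounts to a simple mapping from the basin identifier to an extra conf file. A minimal sketch of that logic, assuming the standard two-letter basin codes (AL, EP, CP, etc.); this is a hypothetical illustration, not the actual prelaunch code:

```python
# Hypothetical sketch of the per-basin extra conf file selection shown in
# the table above. This is NOT the actual HWRF prelaunch implementation.
BASIN_CONF = {
    "AL": "hwrf_AL.conf",   # North Atlantic (file is empty)
    "EP": "hwrf_EP.conf",   # Northeast Pacific
    "CP": "hwrf_CP.conf",   # North Central Pacific
}

def extra_conf(basin):
    """Return the extra conf file loaded for a two-letter basin code."""
    return BASIN_CONF.get(basin, "hwrf_other_basins.conf")

print(extra_conf("AL"))  # hwrf_AL.conf
print(extra_conf("WP"))  # hwrf_other_basins.conf
```

All basins outside the North Atlantic and Pacific hurricane regions fall through to the shared hwrf_other_basins.conf.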

2015 Operational HWRF

The trunk now runs something close to the 2016 baseline. To enable the 2015 configuration, you need to provide the parm/hwrf_2015.conf file. This will not match the 2015 operational configuration exactly, since the trunk adds a number of bug fixes and source code changes. In addition, some of the configuration for hwrf_2015.conf comes from the other parm/hwrf_2015*.conf files, loaded per basin:

Basin   Ocean   Data Assim  Ensemble  Vertical  Model Top  Extra .conf File
N.Atl   3D POM  Always      TDR Only  61 level  2 mbar     None
NE.Pac  3D POM  TDR Only    TDR Only  61 level  2 mbar     hwrf_2015_EP.conf
NC.Pac  None    Never       Never     61 level  2 mbar     hwrf_2015_CP.conf
NW.Pac  None    Never       Never     43 level  50 mbar    hwrf_2015_other_basins.conf
N.Ind   None    Never       Never     43 level  50 mbar    hwrf_2015_other_basins.conf
S.Pac   None    Never       Never     43 level  50 mbar    hwrf_2015_other_basins.conf
S.Ind   None    Never       Never     43 level  50 mbar    hwrf_2015_other_basins.conf
S.Atl   None    Never       Never     43 level  50 mbar    hwrf_2015_other_basins.conf

Full System in All Basins

It is possible to run the full HWRF system in all basins: the 61 level vertical structure, GSI data assimilation, POM ocean coupling, and a 40 member ensemble. In other words, you can run the North Atlantic configuration in all basins. This is done by sending these extra arguments to the scripts.exhwrf_launch or run_hwrf programs:

prelaunch.basin_overrides=no
config.ensda_when=always

The first option turns off the per-basin overrides. The second ensures that ENSDA is always run. It does so by telling the hwrf.ensda.ensda_pre_object_for() to return an hwrf.ensda.AlwaysRunENSDA object.
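Each launcher argument of this form is a section.option=value configuration override: everything before the first dot names the conf section, and the rest names the option within it. A minimal sketch of how such strings can be split apart, as a hypothetical illustration only (not the actual HWRF launcher code):

```python
# Hypothetical illustration of the section.option=value override syntax
# accepted by the launcher. This is NOT the actual HWRF parser.
def parse_override(arg):
    """Split "section.option=value" into (section, option, value)."""
    key, _, value = arg.partition("=")
    section, _, option = key.partition(".")  # split on the FIRST dot only
    return section, option, value

overrides = {}
for arg in ["prelaunch.basin_overrides=no", "config.ensda_when=always"]:
    section, option, value = parse_override(arg)
    overrides.setdefault(section, {})[option] = value

print(overrides)
```

Splitting on the first dot only matters because option names themselves may contain dots, as in the namelist overrides shown later on this page (e.g. physics.cu_physics within a namelist section).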

2015 27:9:3 HWRF Configuration

It is possible to run the trunk at the 27:9:3 km resolution used in 2013 and 2014. The rest of the system will be the same as the 2015 operational HWRF, but with the non-hydrostatic bug fix enabled. To do so, provide this file to the launcher:

../parm/hwrf_3km.conf

This also changes the files used for the per-basin overrides:

Basin   Ocean   Data Assim  Ensemble  Vertical  Model Top  Extra .conf File
N.Atl   3D POM  Always      TDR Only  61 level  2 mbar     None
NE.Pac  3D POM  TDR Only    TDR Only  61 level  2 mbar     hwrf_3km_EP.conf
NC.Pac  None    Never       Never     61 level  2 mbar     hwrf_3km_CP.conf
NW.Pac  None    Never       Never     43 level  50 mbar    hwrf_3km_other_basins.conf
N.Ind   None    Never       Never     43 level  50 mbar    hwrf_3km_other_basins.conf
S.Pac   None    Never       Never     43 level  50 mbar    hwrf_3km_other_basins.conf
S.Ind   None    Never       Never     43 level  50 mbar    hwrf_3km_other_basins.conf
S.Atl   None    Never       Never     43 level  50 mbar    hwrf_3km_other_basins.conf

27:9:3 Full System in All Basins

As with the default 18:6:2 km HWRF, the 3km configuration can be run with the full configuration by disabling the per-basin and per-TDR settings. Do so by adding the same two options:

../parm/hwrf_3km.conf
prelaunch.basin_overrides=no
config.ensda_when=always

Lower Vertical Structure

Studies have shown that the 61 level vertical structure used by HWRF results in weaker storms in both the GFDL and HWRF models. Unfortunately, this vertical structure is required in order to use GSI: satellite radiance assimilation needs stratospheric and mesospheric model data, since the spectral bands used have significant absorption in those layers of the atmosphere. Still, the HWRF can run with a lower model top and 43 vertical levels if the GSI is disabled. To do so:

../parm/hwrf_43lev.conf
config.run_gsi=no
config.run_ensemble_da=no

The 3km configuration can be used too, resulting in a model similar to the 2013 HWRF, but with no data assimilation and new physics:

../parm/hwrf_43lev.conf
../parm/hwrf_3km.conf
config.run_gsi=no
config.run_ensemble_da=no
Note
When running the 43 level, 3km model, you must load the hwrf_43lev.conf first, so that the hwrf_3km.conf overrides the model timesteps. Otherwise, the 33.75 second timestep will be used, and several other things will break.
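The order dependence comes from standard last-read-wins semantics: when several conf files set the same option, the file read last takes effect. A small sketch with Python's configparser, using made-up section names and values (not the real contents of hwrf_43lev.conf or hwrf_3km.conf):

```python
import configparser

# Illustrative stand-ins for two conf files that both set a timestep.
# The section name and values here are made up for the demonstration;
# they are not the real HWRF settings.
conf_43lev = "[namelist_moad]\ndomains.dt = 33.75\n"
conf_3km   = "[namelist_moad]\ndomains.dt = 45\n"

parser = configparser.ConfigParser()
parser.read_string(conf_43lev)   # loaded first
parser.read_string(conf_3km)     # loaded second: overrides the first

print(parser["namelist_moad"]["domains.dt"])  # the last file read wins
```

Loading the files in the opposite order would leave the 43lev timestep in effect, which is exactly the failure the note above warns about.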

GEFS-Based HWRF Ensemble

For most of the life of HWRF, there has been a GEFS-based HWRF ensemble run by Zhan Zhang. The configuration this year uses the 27:9:3 km HWRF with the 43 level vertical structure.

Todo:
You can only run the GEFS-Based HWRF Ensemble on the NOAA Jet computer because the scripts only know how to locate GEFS on Jet filesystems. The parm/hwrf_input.conf needs to be updated to know how to go to FTPPRD, HPSS or WCOSS /com disk areas.

Turning on the GEFS-Based HWRF Ensemble requires three config files:

../parm/hwrf_43lev.conf
../parm/hwrf_3km.conf
../parm/hwrf_ensemble.conf
config.run_gsi=no
config.run_ensemble_da=no

See the HWRF Ensembles page for details on this configuration option.

Different Forecast Length

You can run the HWRF with different forecast lengths. The default is 126 hours. That allows the HWRF to finish six hours late and still provide a five day forecast, which is how it works in operations. To change the forecast length, add this to the launcher:

config.forecast_length=72

Any forecast length that is a multiple of six hours, from 12 up to 126, should work. Running forecasts longer than 126 hours may also work, but has not been tested; it will require access to HPSS and will have to pull significant extra data. The known limit at present is 192 hours, the time after which the GFS resolution changes and prep_hybrid can no longer process it. You may be able to run up to 384 hours if you disable the use of spectral files entirely.
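A quick sanity check for the constraint described above, i.e. a six-hourly length within the tested 12 to 126 hour range. This is a hypothetical helper for illustration, not part of the HWRF scripts:

```python
# Hypothetical validity check for config.forecast_length; not HWRF code.
def forecast_length_ok(hours, max_tested=126):
    """True if hours is a multiple of 6 in the tested 12..max_tested range."""
    return hours % 6 == 0 and 12 <= hours <= max_tested

print([h for h in (72, 126, 127, 192) if forecast_length_ok(h)])  # [72, 126]
```

Longer lengths such as 192 are not rejected by the system itself; they are simply untested, so the helper's upper bound reflects the documented testing, not a hard limit.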

Changing Physics Schemes

Several changes to the physics schemes are easy to make on the command line, and can be combined with any of the other configurations described above.

Scale-Aware SAS In All Domains

This is a new scale-aware version of the Simplified Arakawa-Schubert (SAS) scheme, presently available as cu_physics=86:

namelist_moad.physics.cu_physics=86
namelist_inner.physics.cu_physics=86

GFS PBL with EDMF

A newer version of the GFS PBL is proposed for upgrade in the 2016 GFS. A modified version is available in the HWRF:

namelist_moad.physics.bl_pbl_physics=93

SAS In All Domains

Convection is usually turned off in the inner domain. To turn on SAS in the inner domain:

namelist_inner.physics.cu_physics=84

Thompson Microphysics

Turning on the Thompson microphysics scheme is straightforward:

namelist_moad.physics.mp_physics=8

To also turn on Thompson microphysics in the innermost domain:

namelist_moad.physics.mp_physics=8
namelist_inner.physics.mp_physics=8

Physics Every Timestep

To run physics schemes (except radiation) every timestep in all three domains for the 18:6:2 km 61 level configuration:

namelist_moad.physics.nphs=1
namelist_moad.physics.ncnvc=1
namelist_outer.physics.nphs=1
namelist_outer.physics.ncnvc=1
namelist_outer.physics.movemin=42
namelist_inner.physics.movemin=84

For the 18:6:2 43 level configuration:

namelist_moad.physics.nphs=1
namelist_moad.physics.ncnvc=1
namelist_outer.physics.nphs=1
namelist_outer.physics.ncnvc=1
namelist_outer.physics.movemin=48
namelist_inner.physics.movemin=96

For the 27:9:3 configuration (any vertical structure):

namelist_moad.physics.nphs=1
namelist_moad.physics.ncnvc=1
namelist_outer.physics.nphs=1
namelist_outer.physics.ncnvc=1
namelist_outer.physics.movemin=36
namelist_inner.physics.movemin=72

Disabling System Components

Disabling GSI

Disabling the GSI, and running with only vortex initialization, is straightforward:

config.run_gsi=no
config.run_ensemble_da=no

This dramatically reduces the required input data from 200 GB per cycle to only 80 GB per cycle.

Disabling the DA Ensemble

The HWRF uses a 40 member, 6 hour forecast ensemble, initialized from the GFS ENKF, to provide forecast error covariances. This requires the GFS ENKF analysis for members 01 through 40. You can disable this ensemble and use the GFS ENKF directly:

config.run_ensemble_da=no

Disabling Ocean Coupling

The HWRF can run without ocean coupling. It already does so by default in most basins, but it is possible to disable it entirely in all basins:

config.run_ocean=no

This option is compatible with all other configurations listed on this page.

Disabling Initialization

The HWRF can run without any initialization at all, simply integrating the parent model (GFS or GEFS) vortex. To do so, add these options to the launcher:

config.run_relocation=no
config.run_gsi=no
config.run_ensemble_da=no

Running Without Spectral Files

The massive input requirements of the HWRF system come from the GFS, GDAS and GFS ENKF spectral files, which are huge and do not compress well. Most of the system knows how to run without spectral data. However, due to various limitations, the GSI can only run with spectral data enabled; hence, disabling spectral data also requires disabling GSI. Note that the NCEP FTP server has all of the data needed to run when GSI and spectral boundary conditions are disabled. One cannot run off of the NCEP FTP if either GSI or spectral boundary conditions are enabled.

Note
The 3km configuration can also run with the two sets of options below. The GEFS-based HWRF ensemble already runs entirely without spectral data.

There are two ways to disable spectral input: use GRIB input for the boundary conditions and spectral input for the initial conditions, or disable spectral input entirely.

To disable spectral input entirely, send these options to the launcher:

config.run_gsi=no
config.run_ensemble_da=no
config.use_spectral=no

This "no spectral, no DA" option reduces the data requirements of the HWRF to around 30 GB per cycle. However, there is a slight track degradation.

These options disable spectral data for the boundary conditions, but keep it for the initial state:

config.run_gsi=no
config.run_ensemble_da=no
config.spectral_bdy=no

This configuration requires about 33 GB per cycle of input data, and results in a slight track degradation.

Bug:
It should be possible to run GSI when config.spectral_bdy=no, but it is not. This is because we have no way of tricking real_nmm into using the GDAS 9hr forecast metgrid file for hours 9 and 12. That prevents the scripts.exhwrf_init job from running from GDAS GRIB files.