This page describes the HWRF trunk configurations that are tested and known to work, including the primary (default) configuration. For each one, we explain how to tell the scripts to run it, along with any other information you will need.
Although the HWRF can run many configurations and knows the individual input requirements of each, you must still have the needed input data. You may not have access to the data if you do not have an HPSS account or rstprod access. Worse, depending on the time period, the data may not exist at all. Generally, on WCOSS, Zeus, Theia and Jet, it is possible to run any storm from the past ten days, so near-real-time runs should be possible on those machines.
However, for times older than ten days ago, your scripts will have to access the NOAA mass storage (HPSS) to pull archives containing the input data. Some retrospective datasets are available on Zeus, Jet or WCOSS, but generally not a complete set. The HWRF scripts know how to obtain this data automatically if you can access it. Unfortunately, some of the archives contain NOAA Restricted Data and require rstprod access to read. You do not need any of the restricted data itself, but you do need some non-restricted data stored within the restricted archives. If you do not have rstprod access, you can request it, or ask someone who does have access to retrieve the data for you.
This is the default configuration for the HWRF. As we get closer to the 2016 HWRF upgrade, the trunk will slowly evolve towards the 2016 HWRF implementation. If you provide no arguments to the launcher job, this is the configuration you will get (see the example after the table below). However, due to the prelaunch system, one additional file will be loaded depending on the basin:
Basin | Ocean | Data Assim | Ensemble | Vertical | Model Top | Extra .conf File |
---|---|---|---|---|---|---|
N.Atl | 3D POM | Always | TDR Only | 61 level | 2 mbar | hwrf_AL.conf (empty) |
NE.Pac | 3D POM | TDR Only | TDR Only | 61 level | 2 mbar | hwrf_EP.conf |
NC.Pac | 3D POM | TDR Only | TDR Only | 61 level | 2 mbar | hwrf_CP.conf |
NW.Pac | 3D POM | Never | Never | 43 level | 50 mbar | hwrf_other_basins.conf |
N.Ind | 3D POM | Never | Never | 43 level | 50 mbar | hwrf_other_basins.conf |
S.Pac | 3D POM | Never | Never | 43 level | 50 mbar | hwrf_other_basins.conf |
S.Ind | 3D POM | Never | Never | 43 level | 50 mbar | hwrf_other_basins.conf |
S.Atl | None | Never | Never | 43 level | 50 mbar | hwrf_other_basins.conf |
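For illustration only: assuming the launcher is driven through the run_hwrf.py wrapper with the usual pattern of cycles, storm ID, and case root, followed by any conf files and options (the year, storm, and case root below are hypothetical placeholders; check your site's usual invocation), the default configuration needs nothing appended:

./run_hwrf.py 2015 11L HISTORY

The prelaunch system then loads the basin-specific file from the table above on its own; you do not pass it yourself.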
The trunk now runs something close to the 2016 baseline. To enable the 2015 configuration instead, provide the parm/hwrf_2015.conf file to the launcher (as shown after the table below). This will not match the 2015 operational configuration exactly, since the trunk adds a number of bug fixes and source code changes. In addition, part of the 2015 configuration is supplied by basin-specific parm/hwrf_2015*.conf files loaded by the prelaunch system:
Basin | Ocean | Data Assim | Ensemble | Vertical | Model Top | Extra .conf File |
---|---|---|---|---|---|---|
N.Atl | 3D POM | Always | TDR Only | 61 level | 2 mbar | None |
NE.Pac | 3D POM | TDR Only | TDR Only | 61 level | 2 mbar | hwrf_2015_EP.conf |
NC.Pac | None | Never | Never | 61 level | 2 mbar | hwrf_2015_CP.conf |
NW.Pac | None | Never | Never | 43 level | 50 mbar | hwrf_2015_other_basins.conf |
N.Ind | None | Never | Never | 43 level | 50 mbar | hwrf_2015_other_basins.conf |
S.Pac | None | Never | Never | 43 level | 50 mbar | hwrf_2015_other_basins.conf |
S.Ind | None | Never | Never | 43 level | 50 mbar | hwrf_2015_other_basins.conf |
S.Atl | None | Never | Never | 43 level | 50 mbar | hwrf_2015_other_basins.conf |
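For example, following the same relative path convention as the other examples on this page, you would add just this file to the end of your launcher arguments:

../parm/hwrf_2015.conf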
It is possible to run the full HWRF system in all basins: the 61 level vertical structure, GSI data assimilation, POM ocean coupling, and the 40 member ensemble. In other words, you can run the North Atlantic configuration in every basin. This is done by sending these extra arguments to the scripts.exhwrf_launch or run_hwrf programs:
prelaunch.basin_overrides=no config.ensda_when=always
The first option turns off the per-basin overrides. The second ensures that ENSDA is always run; it does so by telling hwrf.ensda.ensda_pre_object_for() to return an hwrf.ensda.AlwaysRunENSDA object.
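Appended to the illustrative (and hypothetical) run_hwrf.py command shown earlier, the full invocation would look like:

./run_hwrf.py 2015 11L HISTORY prelaunch.basin_overrides=no config.ensda_when=always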
It is possible to run the trunk at 27:9:3 km resolution, the resolution used in 2013 and 2014. The rest of the system will be the same as the 2015 operational HWRF, but with the non-hydrostatic bug fix enabled. To do so, pass this file to the launcher:
../parm/hwrf_3km.conf
This also changes the files used for the per-basin overrides:
Basin | Ocean | Data Assim | Ensemble | Vertical | Model Top | Extra .conf File |
---|---|---|---|---|---|---|
N.Atl | 3D POM | Always | TDR Only | 61 level | 2 mbar | None |
NE.Pac | 3D POM | TDR Only | TDR Only | 61 level | 2 mbar | hwrf_3km_EP.conf |
NC.Pac | None | Never | Never | 61 level | 2 mbar | hwrf_3km_CP.conf |
NW.Pac | None | Never | Never | 43 level | 50 mbar | hwrf_3km_other_basins.conf |
N.Ind | None | Never | Never | 43 level | 50 mbar | hwrf_3km_other_basins.conf |
S.Pac | None | Never | Never | 43 level | 50 mbar | hwrf_3km_other_basins.conf |
S.Ind | None | Never | Never | 43 level | 50 mbar | hwrf_3km_other_basins.conf |
S.Atl | None | Never | Never | 43 level | 50 mbar | hwrf_3km_other_basins.conf |
As with the default 18:6:2 km HWRF, the 3km configuration can be run with the full system in all basins by disabling the per-basin and per-TDR settings. Do so by adding the same two options:
../parm/hwrf_3km.conf prelaunch.basin_overrides=no config.ensda_when=always
Studies have shown that the 61 level vertical structure used by HWRF results in weaker storms in both the GFDL and HWRF models. Unfortunately, this vertical structure is required in order to use GSI: satellite radiance assimilation needs stratospheric and mesospheric model data, since the spectral bands used have significant absorption in those layers of the atmosphere. Still, the HWRF can run with a lower model top and 43 vertical levels if the GSI is disabled. To do so:
../parm/hwrf_43lev.conf config.run_gsi=no config.run_ensemble_da=no
The 3km configuration can be used too, resulting in a model similar to 2013 but with no DA and new physics:
../parm/hwrf_43lev.conf ../parm/hwrf_3km.conf config.run_gsi=no config.run_ensemble_da=no
For most of the life of HWRF, there has been a GEFS-based HWRF ensemble run by Zhan Zhang. The configuration this year uses the 27:9:3 km HWRF with the 43 level vertical structure.
Turning on the GEFS-Based HWRF Ensemble requires three config files:
../parm/hwrf_43lev.conf ../parm/hwrf_3km.conf ../parm/hwrf_ensemble.conf config.run_gsi=no config.run_ensemble_da=no
See the HWRF Ensembles page for details on this configuration option.
You can run the HWRF with different forecast lengths. The default is 126 hours, which allows the HWRF to finish six hours late and still provide a five day forecast, as it does in operations. To change the forecast length, add this to the launcher:
config.forecast_length=72
Any forecast length that is a multiple of six hours, from 12 up to 126, should work. Forecasts longer than 126 hours may work as well, but have not been tested; they require access to HPSS and will pull significantly more input data. The known limit at present is 192 hours, the point at which the GFS resolution changes and prep_hybrid can no longer process it. You may be able to run up to 384 hours if you disable the use of spectral files entirely.
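For example, one plausible combination (illustrative, not itself one of the tested configurations on this page) is a 72 hour forecast with data assimilation disabled, using the GSI and DA ensemble options described later on this page:

config.forecast_length=72 config.run_gsi=no config.run_ensemble_da=no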
Some changes to the physics schemes are easy to make on the command line. It is possible to turn these on along with all of the other configurations above.
This is a new scale-aware version of the Simplified Arakawa-Schubert (SAS) scheme, presently available as cu_physics=86:
namelist_moad.physics.cu_physics=86 namelist_inner.physics.cu_physics=86
A newer version of the GFS PBL scheme is proposed for the 2016 GFS upgrade. A modified version is available in the HWRF:
namelist_moad.physics.bl_pbl_physics=93
Convection is usually turned off in the inner domain. To turn on SAS in the inner domain:
namelist_inner.physics.cu_physics=84
Turning on the Thompson microphysics scheme is straightforward:
moad_namelist.physics.mp_physics=8
To also turn on Thompson microphysics in the innermost domain:
moad_namelist.physics.mp_physics=8 namelist_inner.physics.mp_physics=8
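These physics overrides can also be combined with one another by concatenating the options. For example, to request both the scale-aware SAS and Thompson microphysics on the same domains as in the individual examples above (an illustrative combination, not one of the tested configurations):

namelist_moad.physics.cu_physics=86 namelist_inner.physics.cu_physics=86 moad_namelist.physics.mp_physics=8 namelist_inner.physics.mp_physics=8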
To run physics schemes (except radiation) every timestep in all three domains for the 18:6:2 km 61 level configuration:
moad.physics.nphs=1 moad.physics.ncnvc=1 namelist_outer.physics.nphs=1 namelist_outer.physics.ncnvc=1 namelist_outer.physics.movemin=42 namelist_inner.physics.movemin=84
For the 18:6:2 km 43 level configuration:
moad.physics.nphs=1 moad.physics.ncnvc=1 namelist_outer.physics.nphs=1 namelist_outer.physics.ncnvc=1 namelist_outer.physics.movemin=48 namelist_inner.physics.movemin=96
For the 27:9:3 km configuration (any vertical structure):
moad.physics.nphs=1 moad.physics.ncnvc=1 namelist_outer.physics.nphs=1 namelist_outer.physics.ncnvc=1 namelist_outer.physics.movemin=36 namelist_inner.physics.movemin=72
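As noted above, these physics changes can be combined with the other configurations on this page. For example, a 27:9:3 km run with every-timestep physics would pass:

../parm/hwrf_3km.conf moad.physics.nphs=1 moad.physics.ncnvc=1 namelist_outer.physics.nphs=1 namelist_outer.physics.ncnvc=1 namelist_outer.physics.movemin=36 namelist_inner.physics.movemin=72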
Disabling the GSI, and running with only vortex initialization, is straightforward:
config.run_gsi=no config.run_ensemble_da=no
This dramatically reduces the required input data from 200 GB per cycle to only 80 GB per cycle.
The HWRF uses a 40 member, six-hour forecast ensemble, driven by the GFS ENKF, to provide forecast error covariances. This requires the GFS ENKF analyses for members 01 through 40. You can disable this ensemble and use the GFS ENKF directly:
config.run_ensemble_da=no
The HWRF can run without ocean coupling. It already does so by default in most basins, but it is possible to disable the ocean entirely, in all basins:
config.run_ocean=no
This option is compatible with all other configurations listed on this page.
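For example, an uncoupled 27:9:3 km run with no data assimilation (an illustrative combination) would add:

../parm/hwrf_3km.conf config.run_ocean=no config.run_gsi=no config.run_ensemble_da=no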
The HWRF can run without any initialization at all, simply integrating the parent model (GFS or GEFS) vortex. To do so, add these options to the launcher:
config.run_relocation=no config.run_gsi=no config.run_ensemble_da=no
The massive input requirements of the HWRF system come from the GFS, GDAS and GFS ENKF spectral files, which are huge and do not compress well. Most of the system knows how to run without spectral data. However, due to various limitations, GSI can only be run with spectral data enabled; hence, disabling spectral data also requires disabling GSI. Note that the NCEP FTP server has all of the data needed to run when both GSI and spectral boundary conditions are disabled. One cannot run off of the NCEP FTP server if either GSI or spectral boundary conditions are enabled.
There are two options for reducing spectral input: you can use GRIB input for the boundary conditions while keeping spectral input for the initial conditions, or you can disable spectral input entirely.
To disable spectral input entirely, send these options to the launcher:
config.run_gsi=no config.run_ensemble_da=no config.use_spectral=no
This "no spectral, no DA" option reduces the data requirements of the HWRF to around 30 GB per cycle. However, there is a slight track degradation.
These options disable spectral data for the boundary conditions but keep it for the initial state:
config.run_gsi=no config.run_ensemble_da=no config.spectral_bdy=no
This configuration requires about 33 GB of input data per cycle, and it too results in a slight track degradation.
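Since disabling ocean coupling is compatible with everything else on this page, either spectral option can also be combined with it for the smallest possible input requirements. For example, an uncoupled "no spectral, no DA" run (an illustrative combination) would pass:

config.run_gsi=no config.run_ensemble_da=no config.use_spectral=no config.run_ocean=no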