HWRF
trunk@4391
This page documents how to modify the HWRF parm/system.conf file. Note that the file does not exist in the repository. Users must copy a template and modify it to suit their run environment. The file selects input sources, chooses run areas, and sets archiving options.
The following system.conf templates are available:
File | System
---|---
system.conf.jet | NOAA Jet cluster
system.conf.nco | NOAA WCOSS, NCEP Central Operations only
system.conf.wcoss2 | NOAA WCOSS, all other WCOSS users
system.conf.theia | NOAA Theia
system.conf.zeus | NOAA Zeus
system.conf.yellowstone | NCAR Yellowstone
Copy one of those to parm/system.conf and edit it as needed.
Here is an example from system.conf.jet:

```
[dir]
CDNOSCRUB=/pan2/projects/{disk_project}/{ENV[USER]}/noscrub
CDSCRUB=/pan2/projects/{disk_project}/{ENV[USER]}/pytmp
CDSAVE=/pan2/projects/{disk_project}/{ENV[USER]}
syndat=/lfs3/projects/hwrf-data/hwrf-input/SYNDAT-PLUS
```
There are a few special variables in [dir] that you must set correctly:

- CDSAVE — this must be the directory above the directory where you checked out the HWRF.
- CDSCRUB — this is the location in which HWRF temporary files will reside while you run the HWRF.
- CDNOSCRUB — this is where the HWRF copies certain small files after each cycle completes (for example, the track and HTCF files).
- syndat — directory that contains syndat_tcvitals.$year files with tcvitals. These files are used to determine which storms to run for which cycles.
- utilexec — used to override paths to non-HWRF NCEP programs such as cnvgrib. The default, {HOMEhwrf}/nwport/util/exec, is fine for most users. On WCOSS, we override this to use the actual /nwprod/util/exec operational executables.

On all platforms, these variables are set correctly for EMC HWRF group members. Anyone else must change the paths.
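As an illustration of the paths a user outside the EMC HWRF group would need to change, here is a hypothetical [dir] section for a Jet user charging to a project named myproj (the project name is an assumption; substitute your own disk allocation):

```
[dir]
; myproj is a hypothetical project allocation; replace with yours
CDSAVE=/pan2/projects/myproj/{ENV[USER]}
CDSCRUB=/pan2/projects/myproj/{ENV[USER]}/pytmp
CDNOSCRUB=/pan2/projects/myproj/{ENV[USER]}/noscrub
```

Note that CDSAVE must still sit one level above your HWRF checkout, regardless of the project area you choose.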
The scripts.exhwrf_input job uses the [hwrfdata] section to decide where to put files it pulls from tape or over the network. By default, this data is placed inside each cycle's scrub area, and will be deleted after the cycle completes. If you intend to run the same case many times for different configurations, it is best to store this data in a non-scrubbed location. That way, on subsequent runs the input job will find the data already on disk, will not pull anything, and will finish almost immediately. Several users can share the same directory for this, provided the permissions on the directory are set correctly.
The input directory is changed by adding or modifying these lines:

```
[hwrfdata]
inputroot=/path/to/input/area
```
The HWRF system will never scrub the /path/to/input/area directory if it is outside of {com} and {WORKhwrf}. That means the data will be available for any number of cases you want to rerun. However, it also means that you must delete the area yourself if you run out of space.
The HWRF system knows how to make two archives: the COM archive, and the WRFOUT archive. The COM archive contains everything from the HWRF COM directory, which includes all operational and diagnostic outputs. The WRFOUT archive contains all native WRF output files. By default, the WRFOUT archive is not created due to its large size.
The COM archive path is set by the archive option in the [config] section:

```
[config]
archive=hpss:/NCEPDEV/emc-hwrf/1year/{ENV[USER]}/{SUBEXPT}/{out_prefix}.tar
```
The NCEPDEV/emc-hwrf directory component is correct for EMC HWRF group members, but anyone else will need to change it. You can also change the archive method or disable archiving entirely.
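As a sketch of those two alternatives (the none value and the disk: method prefix follow common HWRF archiving conventions, but verify them against your HWRF version's documentation before relying on them):

```
[config]
; disable archiving entirely:
archive=none

; or write the archive to local disk instead of HPSS
; (the path below is a placeholder):
;archive=disk:/some/local/path/{SUBEXPT}/{out_prefix}.tar
```

Only one archive= line should be active at a time; comment out the alternatives.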
The WRFOUT archive is specified in the [archive] section:

```
[archive]
wrfout=hpss:/1year/NCEPDEV/emc-hwrf/{ENV[USER]}/{SUBEXPT}/wrfout/{out_prefix}.tar
```
The bulk of the input file configuration is done in hwrf_input.conf; basic configuration is available in system.conf. Users should not need to edit hwrf_input.conf unless they are adding new parent models (ECMWF, CFS, UKMET, etc.), porting to a new cluster, or adding new inputs from GFS (e.g., sfluxgrib files). Generally, the settings in system.conf are sufficient. There are two relevant options in the [config] section:
```
[config]
input_sources=jet_sources_{GFSVER}
fcst_catalog=jet_fcst_{GFSVER}
```
The [config] section input_sources option specifies the name of the section in hwrf_input.conf to use for input sources. That section is passed to an hwrf.input.InputSource object. During the scripts.exhwrf_input job, the hwrf.input.InputSource will search the input sources in priority order for data to satisfy all input needs. This is only done when doing a retrospective run (the "HISTORY" case). During a real-time forecast, the [config] fcst_catalog option is used instead.
The default setting should be appropriate for all platforms.
When running a real-time forecast, the HWRF assumes the data is already on disk from some other workflow, such as real-time rsyncs. On WCOSS, the output directory from the GFS model is accessed directly. One specifies this input using the [config] section fcst_catalog option. It refers to the section in hwrf_input.conf that will be used by an hwrf.input.DataCatalog to find parent model outputs and other input data.
The default setting should be appropriate for all existing platforms.