job_settings


This file includes the main settings for the simulation and is called at the beginning of each script in the job chain.

Each entry below gives the name of an environment variable followed by its description.

SPDIR

Absolute parent path of the SPICE directory. SPDIR occurs only in job_settings. It does not need to be set if you define the individual paths yourself.

EXPID

Job identification (experiment ID)

Main directory settings

PFDIR

Parent directory where the scripts for the experiment are stored. PFDIR should be on a disk partition that is backed up regularly.

WORKDIR

NONE of the files created under this directory will be deleted at the end of the simulation. The disk partition should have sufficient free disk space.

SCRATCHDIR

ALL files created under this directory will be deleted at the end of the job chain. This disk partition is intended only for temporary file storage.

SRCDIR

Directory of supplementary programs

DATADIR

Directory holding supplementary data (grid descriptions, etc.)

ARCHIVE_OUTDIR

Directory where the results should be archived. "Results" are the output of the ICON output streams defined in the namelist block output_nml.

RESDIR

Directory where the restart files will be stored

INIDIR

Directory where some initial data of the simulation will be stored
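
For illustration only, the directory variables could be composed from SPDIR and EXPID as in the following sketch. All paths are placeholders and the actual layout depends on your system; take the exact defaults from the delivered job_settings template.

# Hypothetical sketch of the directory settings (all paths are placeholders)
SPDIR=/work/my_account/spice             # parent path of the SPICE directory
EXPID=my_experiment_01                   # job identification
PFDIR=${SPDIR}/chain                     # experiment scripts, on a backed-up partition
WORKDIR=/work/my_account/${EXPID}        # permanent output, nothing is deleted here
SCRATCHDIR=/scratch/my_account/${EXPID}  # temporary files, deleted at the end of the job chain
ARCHIVE_OUTDIR=${WORKDIR}/arch           # archived results
RESDIR=${WORKDIR}/restarts               # restart files
INIDIR=${WORKDIR}/ini                    # initial data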

Information on the time period of the simulation

YDATE_START_ISO

Start date of simulation in ISO8601 format: YYYY-MM-DDTHH:MM:SS

YDATE_STOP_ISO

End date of simulation in ISO8601 format: YYYY-MM-DDTHH:MM:SSZ

INC_DATE

Time increment for the ICON-CLM job. The standard is one month (01:00:00). You may want to use a higher frequency, e.g. if a monthly increment exceeds the batch job limits of your computing system.
Valid values are 01:00:00, 00:01:00, 00:02:00, ..., 00:27:00

ITYPE_CALENDAR

Calendar settings
0 = proleptic Gregorian
1 = 360-day year
2 = 365-day year
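
A minimal sketch of the time period settings with placeholder dates, written in the formats documented above; the INC_DATE value corresponds to the standard monthly increment.

# Hypothetical example of the simulation period settings
YDATE_START_ISO=1979-01-01T00:00:00    # start date (ISO8601)
YDATE_STOP_ISO=1980-01-01T00:00:00Z    # end date (ISO8601)
INC_DATE=01:00:00                      # one month per ICON-CLM job
ITYPE_CALENDAR=0                       # proleptic Gregorian calendar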

Email, notification and account settings

EMAIL_ADDRESS

Replace this with your email address, or leave the environment variable empty to avoid receiving emails from your batch system.

NOTIFICATION_ADDRESS

SPICE scripts send notifications to this address if it is set. If mailx is used, this is your email address.

NOTIFICATION_SCRIPT

The script holding the notification instructions, including the full absolute path to the script. It is called as follows:

${NOTIFICATION_SCRIPT} ${NOTIFICATION_ADDRESS} subject message-file

The following scripts are available under ${SPDIR}/src/notifications:
mailx.sh - for notifications with mailx
mattermost.sh - for notifications via a Mattermost hook
pushover.sh - for notifications through Pushover https://pushover.net/

You can also define your own notification script using your favorite notification system.
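
As a minimal sketch, a custom notification script only has to follow the calling convention shown above (address, subject, message file). The mailx call below is just one possible backend; replace it with whatever notification system you prefer.

#!/bin/bash
# Hypothetical notification script, called as:
#   ${NOTIFICATION_SCRIPT} ${NOTIFICATION_ADDRESS} subject message-file
address=$1
subject=$2
message_file=$3
if [ -n "${address}" ]; then
  # send the content of the message file as the mail body
  mailx -s "${subject}" "${address}" < "${message_file}"
fi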

PROJECT_ACCOUNT

On the computer systems at DKRZ you need to set your project account ID. If your computing system does not have this feature, you can leave it empty.

Directory and binary path settings for utilities

CFU

The cfu (climate Fortran utilities) program, including the absolute path. It is part of SPICE.

UTILS_BINDIR

Absolute path to the directory of utility binaries (ccaf2icaf and correct_cf). These are part of SPICE

CDO

cdo program including the absolute path.

NCO_BINDIR

Absolute path to the directory of NCO utility binaries (e.g. ncks, ncrcat, ncatted ...)

NC_BINDIR

Absolute path to the directory of netCDF standard binaries (e.g. ncdump, nccopy)

PIGZ

pigz program including the absolute path in case your computing system supports parallel gzip.

PYTHON

python interpreter including the absolute path
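
For orientation, the utility settings could look like the sketch below; all paths are placeholders that depend on your installation. Note the distinction between variables that point to a single binary (CFU, CDO, PIGZ, PYTHON) and those that point to a directory of binaries (UTILS_BINDIR, NCO_BINDIR, NC_BINDIR).

# Hypothetical sketch of the utility settings (all paths are placeholders)
CFU=${SPDIR}/src/cfu/cfu              # single binary, part of SPICE
UTILS_BINDIR=${SPDIR}/src/utils       # directory with ccaf2icaf and correct_cf
CDO=/usr/bin/cdo                      # single binary
NCO_BINDIR=/usr/bin                   # directory with ncks, ncrcat, ncatted, ...
NC_BINDIR=/usr/bin                    # directory with ncdump, nccopy
PIGZ=/usr/bin/pigz                    # single binary (parallel gzip)
PYTHON=/usr/bin/python3               # single binary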

Special script settings

ITYPE_CONV2ICON

Choose whether to run conv2icon (=1) or use already existing conv2icon output (=0)

ITYPE_ICON_ARCH_PAR

Submit icon.job.sh from arch.job.sh directly after checking the ICON output (=1, i.e. the ICON simulation and the archiving job run in parallel after the file check) or at the end of arch.job.sh (=0, i.e. they run sequentially).

ITYPE_COMPRESS_POST

Compression type of output in post-processing
0 = no compression
1 = internal compression (compression in netCDF file, requires netCDF Library with HDF-lib and z-lib)
2 = external compression (compression with gzip, requires gzip version >=1.6 to be installed)
3 = external compression (compression with pigz, requires pigz to be installed), set TASKS_POST  below for the -p option accordingly

ITYPE_COMPRESS_ARCH

Compression type of output in archiving
0 = no compression
1 = internal compression (compression in netCDF file, requires netCDF Library with HDF-lib and z-lib)
2 = external compression (compression with gzip, requires gzip version >=1.6 to be installed)
3 = external compression (compression with pigz, requires pigz to be installed), set TASKS_ARCH below for the -p option accordingly
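
To make the external compression options concrete, a hedged sketch of the commands behind options 2 and 3 for a single file is shown below; the actual invocations in post.job.sh and arch.job.sh may differ in detail.

# ITYPE_COMPRESS_ARCH=2: serial compression with gzip (requires gzip >= 1.6)
gzip output_file.nc

# ITYPE_COMPRESS_ARCH=3: parallel compression with pigz; TASKS_ARCH feeds the -p option
pigz -p ${TASKS_ARCH} output_file.nc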

ITYPE_TS

How timeseries are combined
0 = no timeseries
1 = monthly
2 = yearly (in a format that can directly be used as input for the CLM-Community CMOR-Tool)
3 = 1 & 2 (this doubles the post output !!)

ITYPE_SAMOVAR

Choose the SAMOVAR check for the original ICON output
0 = no SAMOVAR check
1 = short SAMOVAR check (just the output of the last day of the month in each "outNN" directory)
2 = long SAMOVAR check (all output files will be checked)

SAMOVAR_EXE

Name of SAMOVAR executable including full path

SAMOVAR_SH

Name of the SAMOVAR shell script including the path. Default: ${SPDIR}/src/samovar/samovar.sh

SAMOVAR_LIST

File in csv format including the maximal value ranges of the output quantities of original ICON output. Default: ${SPDIR}/src/samovar/samovar.csv

ITYPE_SAMOVAR_TS

Choose the SAMOVAR check for the time series
0 = no SAMOVAR check
1 = SAMOVAR check

SAMOVAR_LIST_TS

File in csv format including the maximal value ranges of the output quantities in the time series file. Quantity names are defined as in mapping_to_cosmo.csv. Default: ${SPDIR}/src/samovar/samovar_ts.csv

Initial and boundary data input

GCM_DATADIR

Directory of the initial and boundary data

GCM_PREFIX

prefix of the data (e.g., caf or cas)

HINCBOUND

boundary data temporal increment in hours

ICON_INPUT_OPTIONAL

Optional additional ICON input fields, e.g. 'QC,Qi' for ERAInterim or 'QC,QI,QR,QS' for ERA5

SMI_DEFAULT

In case the coarse grid data do not contain the soil moisture index (SMI), a default value can be set here. The environment variable has to be exported in order to be read in the pre-processing by ccaf2icaf. The setting has no effect on ICON, ERAInterim, ERA5 reanalysis input since they include values for SMI.
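
Because SMI_DEFAULT must be exported to be visible to ccaf2icaf, a setting in job_settings would look like the following line; the value 0.5 is purely illustrative.

export SMI_DEFAULT=0.5   # hypothetical default soil moisture index for coarse data without SMI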

GCM_REMAP

The default remapping of the coarse grid data in the conv2icon.job.sh script. CDO is used for the remapping, so the name must be a valid CDO remapping operator.

Only remapping operators that conserve the dependencies of the fields in the vertical structure should be used here (e.g. remapnn, remaplaf).
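
As a sketch of how GCM_REMAP is used: the value is a CDO operator name, so a setting such as remapnn would end up in a CDO call roughly like the one below. File names are placeholders, and target_grid.nc stands for the grid description onto which conv2icon.job.sh remaps.

GCM_REMAP=remapnn
# inside conv2icon.job.sh the operator is handed to CDO, roughly:
${CDO} ${GCM_REMAP},target_grid.nc coarse_input.nc remapped_output.nc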

Grid description, climatological and other supplemental data

INI_BASEDIR

Directory where the grid description, climatological and other supplemental data are stored

LAM_GRID

Grid description of the limited area dynamic grid (can be generated with the ICON Grid Generator + ExtPar Web Frontend)

PARENT_GRID

Grid description of the parent grid (can be generated with the ICON Grid Generator + ExtPar Web Frontend)

ECRADDIR

Directory containing supplemental files for the ECRAD radiation scheme. Can point to the directory in the ICON distribution externals/ecrad/data 

GHG_FILENAME

File name of greenhouse gas concentrations. Needed by the radiation scheme. Possible data files are available under ${INI_BASEDIR}/greenhouse_gases 

EXTPAR

External parameters on ICON grid (can be generated with the ICON Grid Generator + ExtPar Web Frontend)

OUTPUT_MAPPING_FILE

Mapping of ICON parameter names to those given in the mapping file. These names have to be consistent with those in correct_cf.f90, functions.sh and post.job.sh

TARGET_GRID

External data file holding information on the domain in rotated coordinates. This information is needed in the post-processing for interpolation from ICON grid to rotated lat/lon grid

ICON specific settings

BINARY_ICON

Full path and name of ICON executable

DTIME

Time step in seconds (if not defined it will be calculated in icon.job.sh)

ZML_SOIL

Values of soil levels

HOUT_INC

HOUT_INC is an array. Each index holds the output frequency of one output stream given in the ICON namelist
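
As a hypothetical sketch only, HOUT_INC could be filled as a bash array with one output interval per stream; the value format shown here (ISO 8601 durations such as PT1H) is an assumption, so take the exact syntax from the job_settings template delivered with SPICE.

# Hypothetical sketch: output frequency per output stream (value format is an assumption)
HOUT_INC[1]=PT1H    # output stream 1: hourly output
HOUT_INC[2]=PT6H    # output stream 2: 6-hourly output
HOUT_INC[3]=P1D     # output stream 3: daily output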

PRECIP_INTERVAL

Interval for accumulating precipitation in ISO8601 format (must be consistent with HOUT_INC)

RUNOFF_INTERVAL

Interval for accumulating runoff in ISO8601 format (must be consistent with HOUT_INC)

SUNSHINE_INTERVAL

Interval for accumulating the sunshine duration, following the WMO definition: "the threshold for bright sunshine is 120 W m-2 in a plane perpendicular to the direct solar beam"

MAXT_INTERVAL

Interval for the min/max of the 2m-temperature in ISO8601 format (must be consistent with HOUT_INC)

GUST_INTERVAL

Interval for maximum of wind gust

OPERATION

OPERATION is an array. Each index holds the operation for one output stream given in the ICON namelist. The operation parameter affects all variables defined in an output stream. The variables must be instantaneous variables. Depending on the name of the operation ("mean", "max", or "min"), the variables are averaged over the output interval, or the maximum or minimum over the output interval is calculated, respectively. No value, expressed as "", means no operation is performed.
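
Following the same array convention, an OPERATION sketch could look like the lines below; stream numbers and values are purely illustrative.

# Hypothetical sketch: one operation per output stream
OPERATION[1]=""      # no operation, instantaneous values are written
OPERATION[2]="mean"  # average over the output interval
OPERATION[3]="max"   # maximum over the output interval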

NESTING_STREAM

Output stream to save quantities as input for further downscaling. Put the number of the output stream here. The chosen output stream is saved as is. A value of 0 means no output stream is saved as input for further downscaling. This output stream cannot be used in post.job.sh for building time series.

PLEVS

List of pressure levels. Must be the same as, or a subset of, the p_levels in the ICON namelist, BUT given in hPa instead of Pa!
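
A small example of the hPa/Pa correspondence: if the ICON namelist defines p_levels = 20000, 50000, 85000 (in Pa), the matching PLEVS values are 200, 500, 850 (in hPa). The list syntax below is an assumption; follow the delivered job_settings template.

PLEVS="200,500,850"   # hypothetical subset of the namelist p_levels, in hPa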

ZLEVS

List of altitude levels (height above sea level). Must be the same as or a subset of the h_levels in the ICON namelist.

HLEV_STREAM

Output stream that contains 4D prognostic quantities plus the height on full level z_mc. Put the number of the output stream here. In case HLEV_STREAM=0 no vertical interpolation will be performed.

HLEVS

Array holding the height levels above ground in meters

Global attribute settings in the ICON netCDF output

GA_INSTITUTION

Name of the institution where the simulation has been performed

GA_TITLE

Title of the simulation

GA_PROJECT_ID

Project ID

GA_REALIZATION

Realization number of the simulation

GA_CONVENTIONS

The conventions the ICON output uses (e.g. "CF-1.4")

GA_CONVENTIONSURL

Website address of the conventions description

GA_CONTACT

Contact address in case of questions on the simulation

GA_ICON_CLM_VERSION

Stable ICON-CLM version used for the simulation

Parallelization settings ⚠️ Not all variables are available on each computing system ⚠️

PARTITION_PREP

Select a specific node type/partition for prep.job.sh. This can differ from computing system to computing system. At DKRZ it can be compute or shared.

TASKS_PREP

Number of parallel tasks in prep.job.sh

CPUS_PER_TASK_PREP

Number of CPUs per task for prep.job.sh

TIME_PREP

requested time in batch queue for prep.job.sh

PARTITION_CONV2ICON

Select a specific node type/partition for conv2icon.job.sh. This can differ from computing system to computing system. At DKRZ it can be compute or shared.

TASKS_CONV2ICON

Number of parallel tasks in conv2icon.job.sh

OMP_THREADS_CONV2ICON

Number of OpenMP threads (used in CDO commands) in conv2icon.job.sh

NODES_CONV2ICON

Number of nodes requested for conv2icon.job.sh ⚠️ conv2icon.job.sh can be run only on 1 node presently ⚠️

TIME_CONV2ICON

requested time in batch queue for conv2icon.job.sh

PARTITION_ARCH

Select a specific node type/partition for arch.job.sh. This can differ from computing system to computing system. At DKRZ it can be compute or shared.

TASKS_ARCH

Number of parallel tasks in arch.job.sh

CPUS_PER_TASK_ARCH

Number of CPUs per task for arch.job.sh

TIME_ARCH

requested time in batch queue for arch.job.sh

PARTITION_POST

Select a specific node type/partition for post.job.sh. This can differ from computing system to computing system. At DKRZ it can be compute or shared.

TASKS_POST

Number of parallel tasks in post.job.sh

OMP_THREADS_POST

Number of OpenMP threads (used in CDO commands) in post.job.sh

TIME_POST

requested time in batch queue for post.job.sh

TIME_POST_YEARLY

requested time in batch queue for post.job.sh when yearly time series are computed

PARTITION_ICON

Select a specific node type/partition for icon.job.sh. This can differ from computing system to computing system. At DKRZ it can be compute or shared.

NP_ICON

Number of processors for ICON

NODES_ICON

Number of nodes for ICON

NUM_THREAD_ICON

Number of OpenMP threads: choose one of the following values: 1 (without hyperthreading), 2, 3 (without hyperthreading), 4, 6, 8 (compute only), 9 (compute2 only)

HT_ICON 

Hyperthreading: 0 = off, 1 = on

TIME_ICON

requested time in batch queue for ICON simulation

NUM_IO_PROCS

num_io_procs: 1 or 2, specified in the namelist parallel_nml (icon.job.sh)

NUM_RESTART_PROCS

num_restart_procs: number of processors for restart (on DWD NEC)

NUM_PREFETCH_PROC

num_prefetch_proc: number of processors for LBC prefetching (on DWD NEC)
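
As a closing illustration, the ICON parallelization block could be filled like the sketch below; partition names, node and processor counts, and time limits are placeholders that depend entirely on your computing system.

# Hypothetical sketch of the ICON parallelization settings (all values are placeholders)
PARTITION_ICON=compute    # node type/partition for icon.job.sh
NODES_ICON=4              # number of nodes for ICON
NP_ICON=512               # number of processors for ICON
NUM_THREAD_ICON=1         # OpenMP threads, here without hyperthreading
HT_ICON=0                 # hyperthreading off
TIME_ICON=08:00:00        # requested time in the batch queue
NUM_IO_PROCS=1            # num_io_procs in namelist parallel_nml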