Scripts and Job Flow


Scripts

To use the computer facilities efficiently, the job chain is not carried out in a single script but is divided into several scripts, each performing one of the main sub-tasks of the chain. The main settings for the job are kept in the file job_settings, which is read at the beginning of each script.
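As a sketch, the include step at the top of each job script could look like this (the helper name load_settings and the error handling are illustrative assumptions; the actual scripts may simply source the file directly):

```shell
# Minimal sketch of reading the central settings file.
# load_settings is a hypothetical helper name, not taken from the scripts.
load_settings() {
  local settings_file="$1"
  if [ -f "$settings_file" ]; then
    # shellcheck disable=SC1090
    . "$settings_file"      # makes EXPID, SCRATCHDIR, etc. available to the script
  else
    echo "ERROR: $settings_file not found" >&2
    return 1
  fi
}
```

Each job script would then call something like load_settings job_settings before doing any other work.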



subchain

Purpose

This main script calls the job scripts in the directory scripts. subchain submits the following sub-tasks to the compute server. The script reads its settings from the file job_settings.


Contents

The subchain script contains several sections (called actions) which are executed depending on the subchain [action] call. Calling subchain without an argument prints a list of valid actions. The actions start and clean are meant to be called interactively. The other actions are called internally by the subchain script; only when restarting after an error may they need to be called interactively.
The subchain script first checks whether the file date.log exists. During the simulation this file holds the current date of the ICON runs. On the first call of subchain the file does not yet exist; it is then created, together with the directory structure of the job.
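The first-call logic can be sketched as follows (function name, subdirectory names, and the start-date argument are illustrative, not taken from the actual script):

```shell
# Sketch of the date.log check in subchain: if date.log does not exist yet,
# this is the first call, so create it (together with a directory tree) and
# seed it with the start date; otherwise just read the current date from it.
init_or_read_date() {
  local expdir="$1" startdate="$2"
  if [ ! -f "${expdir}/date.log" ]; then
    mkdir -p "${expdir}/output" "${expdir}/joblogs"   # hypothetical subdirectories
    echo "$startdate" > "${expdir}/date.log"
  fi
  cat "${expdir}/date.log"
}
```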
Available actions:

  • start
    This is the very first call to start the simulation. The file date.log and the directory structure of the job will be created.

  • prep [YYYYMM]
    prepares some environment variables and submits prep.job.sh for the month taken from the file date.log or optionally by the argument YYYYMM.

  • conv2icon [YYYYMM]
    prepares some environment variables and submits the conv2icon job conv2icon.job.sh for the month taken from the file date.log or optionally by the argument YYYYMM.

  • icon [noprep]
    prepares some environment variables and submits the ICON job icon.job.sh for the month taken from the file date.log. It also calls subchain prep ${NEXT_DATE}, which starts a pre-processing job for the next month, unless this is suppressed by calling this action with the argument noprep.

  • arch [YYYYMM]
    prepares some environment variables and submits the archiving job arch.job.sh for the month taken from the file date.log or optionally by the argument YYYYMM.

  • post [YYYYMM] [skip]
    prepares some environment variables and submits the post-processing job post.job.sh for the month taken from the file date.log or optionally by the argument YYYYMM.
    If a third argument (skip) is given, only the yearly concatenation is performed in post.job.sh. This only makes sense for MM=01.

  • clean
    deletes all files and directories created during the model simulation for this experiment. Use this action if you want to restart the simulation from scratch. Note, however, that it does not revert changes that have been made to the scripts.

  • create
    this action creates just the directory structure for your experiment. It is only needed in the rare case that the directory structure has been corrupted.
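The next-month date that the icon action passes on to subchain prep can be computed from the current YYYYMM; with GNU date the increment might be sketched as:

```shell
# Sketch: compute the YYYYMM of the month following a given YYYYMM,
# using GNU date's relative-date arithmetic (non-GNU systems would
# need explicit year/month arithmetic instead).
next_month() {
  local yyyymm="$1"
  date -d "${yyyymm}01 +1 month" +%Y%m
}
```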

prep.job.sh

Purpose

All necessary pre-processing is performed by this script, e.g. copying boundary data from the archive and unpacking them. The script can be adapted to copy from tape or via ftp or scp.


Contents

The script includes the job_settings file.
If ITYPE_CONV2ICON == 1
Prepares the coarse grid initial and boundary data (e.g. from a reanalysis, a GCM, or a coarser ICON run) for use in the conv2icon job for a specific month. This script has to be adjusted by the user, depending on how and where the coarse grid data are stored.
The caf and cas data files have the same format as those used by the INT2LM program, which interpolates forcing data to COSMO-CLM input. However, ICON currently accepts only input files in the IFS netCDF format, so the caf or cas files first have to be converted by the extra program ccaf2icaf. The converted files are stored in the directory ${SCRATCHDIR}/${EXPID}/output/prep/YYYY_MM. At the end of this job a conv2icon job is initiated by subchain conv2icon ${CURRENT_DATE}.

If the coarse grid data are internally compressed netCDF files, decompress them in this template by calling nccopy -k 2 infile outfile; compressed input data may cause trouble with the cdo calls in conv2icon.
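A dry-run sketch of that decompression loop is shown below; it only prints the nccopy commands it would run, and the _nc4 output suffix is an illustrative choice:

```shell
# Print the nccopy decompression command for every netCDF file in a
# month directory (dry run; replace "echo" with the real call to execute).
list_decompress_cmds() {
  local dir="$1" f
  for f in "$dir"/*.nc; do
    [ -e "$f" ] || continue                   # skip if the glob matched nothing
    echo "nccopy -k 2 $f ${f%.nc}_nc4.nc"     # -k 2 writes classic 64-bit-offset netCDF
  done
}
```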


If ITYPE_CONV2ICON /= 1
Assumes that the output files of a conv2icon run already exist. These data are copied to ${SCRATCHDIR}/${EXPID}/output/conv2icon/YYYY_MM. At the end of this job an ICON job is initiated by subchain icon.

conv2icon.job.sh

Purpose

This script prepares the input data for the subsequent ICON simulation: the coarse grid boundary meteorological fields are remapped to the model grid.


Contents

The script includes the job_settings file. It contains several cdo remap calls that remap the coarse grid data to the ICON grid. At the end of this job an ICON job is initiated by subchain icon.
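One such call has the general shape sketched below; the interpolation operator (remapdis), the grid description file, and the file names are assumptions for illustration, not the script's actual settings:

```shell
# Build one cdo remap command of the kind used in conv2icon.job.sh
# (operator, grid file, and output format are illustrative assumptions).
build_remap_cmd() {
  local gridfile="$1" infile="$2" outfile="$3"
  echo "cdo -f nc remapdis,${gridfile} ${infile} ${outfile}"
}
```

The real script chains several such calls, typically one per group of fields; pre-computing interpolation weights once and reusing them is a common optimization.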

icon.job.sh

Purpose

This script contains the namelist parameters for the model grid, as well as model options for dynamics, physics, I/O, diagnostics, etc. Parameters that are not explicitly set here take their default values during the simulation.


Contents

The script includes the job_settings file. It contains the namelists for ICON. After the input namelist files have been created, the ICON executable is started with the system-specific MPI call. At the end of this job an archiving job is initiated by subchain arch if the complete current month has been simulated. If ICON is run in chunks shorter than a month and the end of the month has not yet been reached, the next ICON chunk is initiated by subchain icon. The user may change the namelist settings and the MPI call.
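In outline, the script writes the namelist files and then launches the executable. The fragment below is a sketch: &run_nml and dtime are real ICON namelist names, but the value and the launcher line are illustrative and machine-specific, not a working ICON configuration:

```shell
# Sketch: write a minimal ICON namelist fragment, then launch the model.
# The parameter value is illustrative, not a complete ICON setup.
write_namelist() {
  local nmlfile="$1"
  cat > "$nmlfile" <<'EOF'
&run_nml
 dtime = 120.     ! model time step in seconds (illustrative value)
/
EOF
}

# System-specific MPI launch (illustrative):
#   write_namelist NAMELIST_ICON
#   mpirun -np ${NPROCS} ./icon
```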

arch.job.sh

Purpose

Archiving of the results is performed in this script, e.g. compressing the output and combining output directories into tar files. In the standard script, archiving is done on hard disk; however, the script can be adapted to archive on tape or via ftp or scp.

Contents

The script includes the job_settings and functions.sh files. Before archiving, the netCDF attributes of the ICON output files are modified to comply with the CF conventions. In addition, the variable names and standard_names are set as given in the file mapping_to_cosmo.csv.
Optionally a sanity check can be performed by the SAMOVAR script.

The resulting files are optionally compressed, then tarred and copied to the archive. At the end of this job a post-processing job is initiated by subchain post.
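The pack-and-copy step might be sketched like this (directory layout and file naming are illustrative assumptions):

```shell
# Sketch of the archiving step: pack one month's output directory into a
# gzip-compressed tar file inside the archive directory.
archive_month() {
  local srcdir="$1" archdir="$2" yyyymm="$3"
  mkdir -p "$archdir"
  # -C keeps the archive paths relative to the parent of the month directory
  tar -czf "${archdir}/output_${yyyymm}.tar.gz" \
      -C "$(dirname "$srcdir")" "$(basename "$srcdir")"
}
```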

post.job.sh

Purpose

Any post-processing needed is performed here, and the post-processing results are archived here as well. Post-processing may take a lot of time, depending on its complexity, which would slow down the whole chain; it is therefore run in parallel to the rest of the chain.

Contents

The script includes the job_settings and functions.sh files. The following post-processing steps are performed by default; the user can adjust the script to their own needs:

  • the time series functions in functions.sh are called for user selected quantities. The resulting time series are stored under ${WORKDIR}/${EXPID}/post/YYYY_MM.

  • depending on the setting of the environment variable ITYPE_COMPRESS_POST (to be set in the file job_settings) a compression is applied.

  • If the environment variable ITYPE_TS in job_settings is set to 2 the time series are stored in yearly files instead of monthly ones. These files can be used as input for the cmorizing scripts developed in the CLM-Community.

functions.sh

Functions

| Function | Call | Description |
|---|---|---|
| cdocor | | corrections applied to an incoming file using CDO commands |
| iconcor | | corrections and modifications to the original ICON output file format to make the files CF-netCDF compatible |
| cell_methods_time | | if cell_methods exists, the value of the time variable is replaced by the middle of the time interval |
| remap2rot | | remaps time series from the ICON grid to the rotated grid |
| timeseries | timeseries varname outNN remapnn | builds time series files varname_ts.nc in post/YYYY_MM |
| timeseriesp | timeseriesp varname outNN PLEVS[@] remapnn | as timeseries, but on selected pressure levels |
| timeseriesz | timeseriesz varname outNN ZLEVS[@] remapnn | as timeseries, but on selected altitude levels |
| timeseriesa | timeseriesa varname (or timeseries varname) | calculates additional quantities from time series files already created by timeseries |
| timeseriesap | timeseriesap varname PLEVS[@] (or timeseriesp varname PLEVS[@]) | calculates additional quantities from time series files already created by timeseriesp |
| timeseriesaz | timeseriesaz varname ZLEVS[@] (or timeseriesz varname ZLEVS[@]) | calculates additional quantities from time series files already created by timeseriesz |

List of possible additional quantities

Single level fields:

| Name | Description | Necessary time series |
|---|---|---|
| ASOD_S | Total downward shortwave radiation at the surface, averaged over the output time interval | ASODIFD_S, ASODIRD_S |
| ASODIRD_S | Solar direct downward flux at the surface, averaged over the output time interval | ASOB_S, ASODIFU_S, ASODIFD_S |
| ASOU_T | Solar upward radiation at the top of the atmosphere, averaged over the output time interval | ASOD_T, ASOB_T |
| ATHD_S | Total downward longwave radiation at the surface, averaged over the output time interval | ATHB_S, ATHU_S |
| DD_10M | Wind direction at 10 m height | U_10M, V_10M |
| DTR_2M | Diurnal temperature range | TMAX_2M, TMIN_2M |
| FIELDCAP | Field capacity, pore volume and wilting point | SOILTYP |
| FR_SNOW | Snow fraction: FR_SNOW=0 where W_SNOW < 0.0000005; else FR_SNOW=W_SNOW/0.015, then MIN(FR_SNOW,1.), MAX(FR_SNOW,0.01) | W_SNOW |
| PREC_CON | Convective precipitation | RAIN_CON, SNOW_CON |
| PVAP_2M | Water vapour partial pressure at 2 m | PS, QV_2M |
| RUNOFF_T | Total runoff | RUNOFF_S, RUNOFF_G |
| TOT_SNOW | Total snowfall | SNOW_GSP, SNOW_CON |
| TQW | Vertically integrated cloud condensed water | TQC, TQI |
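The FR_SNOW rule above can be written out point-wise; the awk sketch below reproduces it for a single W_SNOW value (in the chain itself this would be a field-wise cdo expression over the W_SNOW time series):

```shell
# Point-wise FR_SNOW rule from the table above, for one W_SNOW value.
fr_snow() {
  awk -v w="$1" 'BEGIN {
    if (w < 0.0000005) {
      f = 0
    } else {
      f = w / 0.015
      if (f > 1)    f = 1      # MIN(FR_SNOW, 1.)
      if (f < 0.01) f = 0.01   # MAX(FR_SNOW, 0.01)
    }
    printf "%g\n", f
  }'
}
```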

Multi level fields (pressure or height)

| Name | Description | Necessary time series |
|---|---|---|
| DD | Wind direction | U, V on pressure or height levels |
| SP | Wind speed | U, V on pressure or height levels |

In this script the monthly time series files for a quantity are combined into yearly files. These files have a format that can be used as input for the CCLM2CMOR tool.
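The concatenation step has roughly the following shape; cdo mergetime is a natural fit for joining monthly files, but the directory layout and file names here are illustrative assumptions (dry run: the command is printed, not executed):

```shell
# Build the cdo command that merges the monthly time-series files of one
# year into a single yearly file (dry run: the command is only printed).
build_yearly_cmd() {
  local postdir="$1" year="$2" varname="$3"
  echo "cdo mergetime ${postdir}/${year}_??/${varname}_ts.nc ${postdir}/${varname}_ts_${year}.nc"
}
```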

Jobflow
