Scripts and Job Flow

Scripts

To use the computer facilities efficiently, the job chain is not carried out in a single script but is divided into several scripts, each performing one of the main sub-tasks of the chain. The main settings for the job are kept in the file job_settings, which is read at the beginning of each script.



subchain

Purpose

This main script calls the job scripts in the directory scripts: subchain submits the individual sub-tasks to the compute server. The script reads its settings from the file job_settings.


Contents

The subchain script contains several sections (called actions) which are executed depending on the subchain [action] call. If you call subchain without any argument, you will get a list of the valid actions. The actions start and clean are called interactively. The other actions are called internally by the subchain script; they only need to be called interactively to restart the chain after an error.
The subchain script first checks whether the file date.log exists. During the simulation this file holds the current date of the ICON runs. When subchain is called for the first time, this file does not yet exist; it is then created together with the directory structure of the job.
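The date kept in date.log is a YYYYMM stamp that is advanced month by month as the chain progresses. As a minimal sketch (the helper function is hypothetical and not part of the actual subchain code), such a month increment with year rollover can be done in plain bash:

```shell
#!/bin/bash
# Hypothetical sketch: advance a YYYYMM stamp (as kept in date.log)
# to the next month, handling the December -> January rollover.
next_month() {
  local ym=$1
  # 10# forces base-10 so that months 08 and 09 are not read as octal
  local yyyy=$((10#${ym:0:4})) mm=$((10#${ym:4:2}))
  mm=$((mm + 1))
  if [ "$mm" -gt 12 ]; then
    mm=1
    yyyy=$((yyyy + 1))
  fi
  printf '%04d%02d\n' "$yyyy" "$mm"
}

next_month 202112   # prints 202201
```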
Available actions:

  • start
    This is the very first call to start the simulation. The file date.log and the directory structure of the job will be created.

  • prep [YYYYMM]
    prepares some environment variables and submits prep.job.sh for the month taken from the file date.log or, optionally, given by the argument YYYYMM.

  • conv2icon [YYYYMM]
    prepares some environment variables and submits the conv2icon job conv2icon.job.sh for the month taken from the file date.log or, optionally, given by the argument YYYYMM.

  • icon [noprep]
    prepares some environment variables and submits the ICON job icon.job.sh for the month taken from the file date.log. It also calls subchain prep ${NEXT_DATE}, which starts a pre-processing job for the next month, unless this is suppressed by calling the action with the argument noprep.

  • arch [YYYYMM]
    prepares some environment variables and submits the archiving job arch.job.sh for the month taken from the file date.log or optionally by the argument YYYYMM.

  • post [YYYYMM] [skip]
    prepares some environment variables and submits the post-processing job post.job.sh for the month taken from the file date.log or optionally by the argument YYYYMM.
    If a third argument (skip) is given, only the yearly concatenation is performed in post.job.sh. This only makes sense for MM=01.

  • clean
    deletes all files and directories created during the model simulation for this experiment. You can use this action if you want to start the simulation from scratch. Note, however, that it does not revert changes that have been made in the scripts.

  • create
    This action creates just the directory structure for your experiment. It is only needed in rare cases, e.g. when the directory structure has become corrupted.
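To illustrate the action interface described above, here is a deliberately stripped-down, hypothetical sketch of how such an [action] dispatcher can look in shell; the real subchain script is considerably more involved (it reads job_settings, handles date.log, and submits batch jobs):

```shell
#!/bin/bash
# Hypothetical sketch of an action dispatcher like subchain's.
# Calling it without an argument lists the valid actions.
ACTIONS="start prep conv2icon icon arch post clean create"

subchain_dispatch() {
  local action=${1:-}
  case " $ACTIONS " in
    *" $action "*)
      # The real script would branch into the respective section here
      # and submit the corresponding job to the compute server.
      echo "executing action: $action"
      ;;
    *)
      echo "valid actions: $ACTIONS"
      return 1
      ;;
  esac
}
```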

prep.job.sh

Purpose

All necessary pre-processing is performed by this script, e.g. copying boundary data from the archive and unpacking them. The script can be changed to copy the data from tape or via ftp or scp.


Contents

The script includes the job_settings file.
If ITYPE_CONV2ICON == 1
Prepares the coarse grid initial or boundary data (e.g. from a reanalysis, a GCM, or a coarser ICON run) for use in the conv2icon job for a specific month. This script has to be adjusted by the user depending on how and where the coarse grid data are stored.
The caf or cas data files have the same format as those used by the INT2LM program, which interpolates forcing data to COSMO-CLM input. However, ICON currently only accepts input files in the IFS netCDF format. Therefore the caf or cas data files have to be converted by an extra program, ccaf2icaf. The converted files are stored in the directory ${SCRATCHDIR}/${EXPID}/output/prep/YYYY_MM. At the end of this job a conv2icon job is initiated by subchain conv2icon ${CURRENT_DATE}.

If the coarse grid data are internally compressed netCDF files, perform a decompression in this template by calling nccopy -k 2 infile outfile, because compressed input data may cause trouble with the cdo calls in conv2icon.
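A hedged sketch of such a decompression step (the wrapper function and the NCCOPY override are illustrative additions, not part of the shipped template; nccopy -k 2 rewrites the file in the uncompressed 64-bit offset format):

```shell
#!/bin/bash
# Illustrative wrapper around nccopy for decompressing internally
# compressed netCDF input before the cdo calls in conv2icon.
# NCCOPY can be overridden (e.g. for testing); it defaults to nccopy.
decompress_nc() {
  local infile=$1 outfile=$2
  ${NCCOPY:-nccopy} -k 2 "$infile" "$outfile"
}

# Typical use inside the prep template (paths are placeholders):
# for f in "${SCRATCHDIR}/${EXPID}"/output/prep/*.nc; do
#   decompress_nc "$f" "${f%.nc}_uncomp.nc"
# done
```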


If ITYPE_CONV2ICON /= 1
Assumes that the output files of a conv2icon run already exist. These data are copied to ${SCRATCHDIR}/${EXPID}/output/conv2icon/YYYY_MM. At the end of this job an ICON job is initiated by subchain icon.

conv2icon.job.sh

Purpose

This script prepares the input data for the subsequent ICON simulation: the coarse grid boundary meteorological fields are remapped to the model grid.


Contents

The script includes the job_settings file. It contains several cdo remap calls to remap the coarse grid data to the ICON grid. At the end of this job an ICON job is initiated by subchain icon.
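The exact remap calls depend on the grids and variables involved; as a purely hypothetical illustration (the wrapper function, file names, and the choice of the remapdis operator are assumptions, not the actual conv2icon.job.sh contents), a cdo remapping step might look like:

```shell
#!/bin/bash
# Hypothetical illustration of remapping coarse grid data onto the
# ICON grid with cdo. Grid file and data file names are placeholders.
# CDO can be overridden (e.g. for testing); it defaults to cdo.
remap_to_icon() {
  local gridfile=$1 infile=$2 outfile=$3
  # remapdis: distance-weighted interpolation; the actual script may
  # use a different remap operator depending on the variable.
  ${CDO:-cdo} remapdis,"$gridfile" "$infile" "$outfile"
}
```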

icon.job.sh

Purpose

This script contains the namelist parameters for the model grid, as well as model options for dynamics, physics, I/O, diagnostics, etc. Parameters that are not explicitly set here take their default values during the simulation.


Contents

The script includes the job_settings file. It contains the namelists for ICON. After creating the input namelist files, the ICON executable is started with the system-specific MPI call. At the end of this job an archiving job is initiated by subchain arch if the whole current month has been run by ICON. If ICON is run in chunks shorter than a month and the end of the month has not yet been reached, the next ICON chunk is initiated by subchain icon. The user may change the namelist settings and the MPI call.
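To give a flavour of what such namelist input looks like, here is a heavily abridged, hypothetical fragment (the group and parameter names follow ICON's Fortran namelist conventions, but the values are placeholders and this is not a working configuration):

```fortran
! Hypothetical, abridged ICON namelist fragment; values are placeholders.
&run_nml
 dtime  = 150.             ! model time step in seconds
 ltimer = .TRUE.           ! enable timer output
/
&io_nml
 dt_checkpoint = 86400.    ! checkpoint (restart) interval in seconds
/
```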

arch.job.sh

Purpose

Archiving of the results is performed in this script, e.g. compressing the output files and combining output directories into tar files. In the standard script, archiving is performed on hard disk; however, the script can be changed to archive to tape or via ftp or scp.

Contents

The script includes the job_settings and function.sh files. Before archiving, the netCDF attributes of the ICON output files are modified to comply with the CF conventions. In addition, the variable names and standard_names are set as given in the file mapping_to_cosmo.csv.
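Such attribute modifications are typically done with NCO's ncatted; the following is only a sketch (the wrapper function, the variable name tas, the attribute values, and the NCATTED override are illustrative assumptions, not the actual arch.job.sh code):

```shell
#!/bin/bash
# Illustrative sketch: set CF-style attributes on an ICON output file
# with NCO's ncatted. Variable and attribute values are placeholders.
# NCATTED can be overridden (e.g. for testing); it defaults to ncatted.
set_cf_attrs() {
  local file=$1
  # ncatted -a attribute,variable,mode,type,value
  # mode o = overwrite (create if absent), type c = character string
  ${NCATTED:-ncatted} -a standard_name,tas,o,c,"air_temperature" \
                      -a units,tas,o,c,"K" "$file"
}
```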
Optionally a sanity check can be performed by the SAMOVAR script.

The resulting files are optionally compressed, then tarred and copied to the archive. At the end of this job a post-processing job is initiated by subchain post.
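A minimal sketch of this tar-and-copy step (the function, directory layout, and archive name are assumptions for illustration, not the exact names used by arch.job.sh):

```shell
#!/bin/bash
# Illustrative sketch of the archiving step: tar and gzip one month of
# output and place the result in the archive directory.
# Paths and the YYYYMM tag are placeholders.
archive_month() {
  local srcdir=$1 archdir=$2 tag=$3
  mkdir -p "$archdir"
  # -C changes into srcdir so the tar file holds relative paths only
  tar -czf "$archdir/${tag}.tar.gz" -C "$srcdir" .
}
```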

Includes

 

Jobflow