Install

Software Requirements

  • Fortran 90/95 and C compiler

  • MPI or OpenMP libraries

  • netCDF4 library, linked with the HDF5 and zip libraries and extended by the Fortran netCDF package (the netCDF4 package comes with the programs ncdump and nccopy)

  • UNIX utilities: make, ksh, uname, sed, awk, wget, etc.

  • For post-processing: Climate Data Operators (CDO) and netCDF Operators (NCO)

  • An ICON binary

SPICE uses the netCDF I/O of ICON-CLM.
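You can quickly check whether the required tools and libraries are visible in your environment. Module names and load commands are site-specific; the checks below only assume the standard command-line tools shipped with the respective packages:

$ nc-config --version     # netCDF C library
$ nf-config --version     # netCDF Fortran interface
$ cdo -V                  # Climate Data Operators
$ ncks --version          # netCDF Operators (NCO)
$ which ncdump nccopy make ksh sed awk wget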

Create an ICON Binary

The ICON source code does not come with SPICE because it is not distributed by the CLM-Community. Copy the latest ICON source code from the ICON download page to your working directory. If you do not yet have access, i.e. you do not hold a license yet, go to the License page first.

How to adapt and compile ICON on your computing system is not part of the SPICE documentation. Please consult the ICON documentation on how to build a binary.

On Levante at DKRZ, run the following commands for ICON release 2.6.6 to create an ICON binary:

$ xz -d icon-2.6.6.tar.xz
$ tar -xf icon-2.6.6.tar
$ cd icon-2.6.6

Create your build directory and compile ICON:

$ mkdir build
$ cd build
$ ../config/clm/levante.intel-2021.5.0_ecrad_2.6.5_optimfpe
$ LANG=en_US.utf8
$ make -j 8
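If the build completes successfully, the ICON executable is typically placed in the bin subdirectory of the build directory. This location is an assumption about the standard ICON build layout; adjust the check if your setup differs:

$ ls bin/icon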

Install SPICE

 

1 Get the source code

Go to https://www.clm-community.eu/wiki/wg-suptech/icon-clm/spice/ and download the latest release, i.e. tag (2.2 is used as an example in the following), as a tarball. Copy spice-v2.2.tar.gz to your computing system and proceed as follows:

$ tar -xvf spice-v2.2.tar.gz
$ cd spice-v2.2
$ SPDIR=$PWD   # used as a shortcut in the following



2 Get supplementary data 

2.1 Get the example constant and external data files

Getting and unpacking these files creates a directory rcm holding the necessary data to run the ICON-CLM test experiments.

3 Configure SPICE  and run the test examples

Call the script config.sh with the settings for your platform, i.e. at DKRZ Levante, at DWD NEC, at CSCS Daint, or at IMS Cyclone.

This creates two directories containing the basic scripts: ${SPDIR}/chain/gcm2icon/sp001 and ${SPDIR}/chain/icon2icon/sp002.

Run the test examples

There are two tests: one for testing ICON with GCM or reanalysis data as initial and boundary conditions (${SPDIR}/chain/gcm2icon/sp001) and one for testing ICON with coarse-grid ICON data as initial and boundary conditions (${SPDIR}/chain/icon2icon/sp002). Note that sp001 creates the necessary input data for sp002.

Before you start an experiment, check the environment variables in the job_settings files of sp001 and sp002 and adapt them to your needs.



If you are not running the tests at DKRZ:

Adapt the input directory of the ERA-Interim data: GCM_DATADIR=/pool/data/CLMcom/CCLM/reanalyses/ERAInterim

and probably the de-tar part in prep.job.sh.
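For example, a minimal sketch of how you might set the path (assuming job_settings is a plain shell file with one assignment per variable, as the line above suggests, and that GNU sed is available; the target path is a placeholder):

$ cd ${SPDIR}/chain/gcm2icon/sp001
$ sed -i 's|^GCM_DATADIR=.*|GCM_DATADIR=/your/path/to/ERAInterim|' job_settings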

Now you should be ready to start the first experiment:

This experiment is a two-month simulation: 50 km resolution / Europe / driven by ERA-Interim.

After successful completion start the second one:

This experiment is a two-month simulation: 3 km resolution / region around Hamburg / driven by the ICON output of sp001.



3.1  Create the supplemental programs

a. Create the Fortran library libcsv

This library is used for reading CSV data files. The original source code can be found on GitHub at https://github.com/jacobwilliams/fortran-csv-module.

Choose a Fopts file in the directory LOCAL and copy it to the base directory of libcsv (here we choose Fopts.dkrz-levante as an example):

Adapt the Fopts file to your system and type:

After successful compilation you will find libcsv in ${SPDIR}/src/fortran-csv-lib/lib.
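For orientation, a minimal build sketch, assuming that LOCAL is located inside the SPICE source tree and that a plain make in the library's base directory builds it; names other than those given above are examples or placeholders:

$ cd ${SPDIR}/src/fortran-csv-lib              # base directory of libcsv
$ cp <path to LOCAL>/Fopts.dkrz-levante Fopts  # pick the Fopts file matching your system
$ # edit Fopts: compiler, flags, library paths
$ make                                         # assumed build command; the library ends up in lib/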

b. Create the cfu executable

The climate fortran utilities contain several functions needed in the runtime environment.

Choose a Fopts file in the directory LOCAL and copy it to the base directory of cfu (here we choose Fopts.dkrz-levante as an example):

Adapt the Fopts file to your system and type:

After successful compilation you will find the cfu executable in ${SPDIR}/src/cfu/bin.

c. Create additional conversion programs

These programs are used to convert COSMO-CLM caf-files into ICON-CLM compatible caf-files (ccaf2icaf) and to correct the netCDF output of ICON-CLM (correct_cf).

Choose a Fopts file in the directory LOCAL and copy it to the base directory of the utility programs (here we choose Fopts.dkrz-levante as an example):

Adapt the Fopts file to your system and type:

After successful compilation you will find the executables ccaf2icaf and correct_cf in ${SPDIR}/src/utils/bin.
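A quick way to verify that all three supplemental builds succeeded (the directories and executable names are those given in the steps above; the exact libcsv file name may differ on your system):

$ ls ${SPDIR}/src/fortran-csv-lib/lib
$ ls ${SPDIR}/src/cfu/bin/cfu
$ ls ${SPDIR}/src/utils/bin/ccaf2icaf ${SPDIR}/src/utils/bin/correct_cf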

3.2  Configure SPICE on your computing system

If you do not intend to run ICON-CLM at DKRZ or DWD, you have to make some adaptations. First, find out which batch system comes closest to the one on your system: DKRZ uses SLURM (i.e. SBATCH commands) and DWD uses the Portable Batch System (i.e. PBS commands). Let us suppose in the following that your computing system uses SLURM and that you therefore use "dkrz" as a template.

a. Change into the configure_scripts directory:

b. Adapt the dkrz part in the system_settings.tmpl file to the settings of your system.

c. Run the config.sh script:

This creates two directories containing the basic scripts: ${SPDIR}/chain/gcm2icon/sp001 and ${SPDIR}/chain/icon2icon/sp002.

Now comes the hardest part: you have to dive into the scripts in these directories and adapt them to your system (e.g. modify batch commands, program calls, etc.). Start by adapting the experiment ${SPDIR}/chain/gcm2icon/sp001; a sketch of the kind of batch header you will typically have to adjust is shown below.
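The scripts contain batch headers of roughly the following form. This is a generic SLURM sketch, not the exact header used by SPICE, and all values (job name, partition, account, nodes, wall time) are site-specific examples:

#!/bin/bash
#SBATCH --job-name=sp001_icon      # example job name
#SBATCH --partition=compute        # partition/queue name differs between systems
#SBATCH --account=<your_account>   # project account used for accounting
#SBATCH --nodes=4                  # number of nodes, depends on your domain and machine
#SBATCH --time=08:00:00            # wall-clock limit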

Run the test examples

There are two tests: one for testing ICON with GCM or reanalysis data as initial and boundary conditions (${SPDIR}/chain/gcm2icon/sp001) and one for testing ICON with coarse-grid ICON data as initial and boundary conditions (${SPDIR}/chain/icon2icon/sp002). Note that sp001 creates the necessary input data for sp002.

sp001 is a two-month simulation: 50 km resolution / Europe / driven by ERA-Interim.

sp002 is a two-month simulation: 3 km resolution / region around Hamburg / driven by the ICON output of sp001.

Before you start an experiment, check the environment variables in the job_settings files and adapt them to your needs.

Adapt the input directory of the ERA-Interim data (GCM_DATADIR)

and probably the de-tar part in prep.job.sh.

Now you should be ready to start the first experiment:

In SPICE the scripts are called in the order prep.job.sh, conv2icon.job.sh, icon.job.sh, arch.job.sh, post.job.sh. If the chain crashes in one of these scripts, you do not have to rerun the scripts that already completed successfully; after making your corrections you can restart the chain at the failed script.



After successful completion of the sp001 experiment, adapt the scripts in sp002 and start that experiment.