...
Software Requirements

  • Fortran 90/95 and C compiler

  • MPI or OpenMP libraries

  • netCDF-4 library linked with the HDF5 and zlib libraries and extended by the Fortran netCDF package (the netCDF-4 package comes with the programs ncdump and nccopy)

  • UNIX utilities: make, ksh, uname, sed, awk, wget, etc.

  • For post-processing: Climate Data Operators (CDO) and netCDF Operators (NCO)
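Before building anything, it can save time to verify that the required command-line tools are actually on your PATH. A minimal sketch (the tool list follows the requirements above; adjust it to your installation):

```shell
# Report which of the required command-line tools are available on this system.
for tool in make ksh uname sed awk wget ncdump nccopy cdo; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found:   $tool"
  else
    echo "MISSING: $tool"
  fi
done
```

Any line marked MISSING points to a package you still need to install or load via your module system.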

...

Info

CCLM-SP uses the netCDF I/O of COSMO-CLM.

The starter package is written for the HLRE-4 system Levante at DKRZ. Additional changes need to be applied for usage on other machines. At DKRZ the Simple Linux Utility for Resource Management (SLURM) is installed. Please use OpenMPI by setting:

Code Block
languagebash
module load intel-oneapi-compilers/2022.0.1-gcc-11.2.0
module load openmpi/4.1.2-intel-2021.5.0

export OMPI_MCA_pml=cm
export OMPI_MCA_mtl=mxm
export MXM_RDMA_PORTS=mlx5_0:1

...


Unpack and Configure

1. Copy the starter package from RedC Downloads.
2. Unpack the starter package:

Code Block
languagebash
$ tar -xzvf cclm-sp-5.0.tgz
$ mv cclm-sp-5.0 yourpath/cclm-sp

yourpath/cclm-sp is named SPDIR from here on.
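Since all later commands refer to this directory, it helps to define the shortcut once per shell session; a small sketch (yourpath is a placeholder for your actual unpack location):

```shell
# Define the SPDIR shortcut used throughout the remaining steps.
# "yourpath" is a placeholder; replace it with the real parent directory.
export SPDIR=yourpath/cclm-sp
echo "SPDIR is set to: $SPDIR"
```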

3. Copy and unpack the supplementary data files for testing the starter package (the program wget needs to be installed on your system):

Code Block
languagebash
$ cd $SPDIR/data
$ ./get_sp_ext.sh

4. Change to the directory $SPDIR/configure_scripts
5. Adjust the settings in the file system_settings to your computer system
6. Type the following command to create a default test experiment:

Code Block
languagebash
$ ./config.sh

This first compiles the necessary cfu program and the fortran-csv-lib, and then creates the test experiments ${SPDIR}/chain/gcm2cclm/sp001 and ${SPDIR}/chain/cclm2cclm/sp002.
Or, if you want to create a default experiment including add-ons:

Code Block
languagebash
$ ./config.sh -a addon1,addon2,…


Compile

The source code for INT2LM, CCLM, and the auxiliary program packages CFU and fortran-csv-lib need to be compiled for running a simulation with the regional climate model. INT2LM and CCLM need to be compiled by yourself. CFU and fortran-csv-lib are automatically compiled when you run the config.sh script.

Compiler options are stored in the file Fopts for each program. You need to set the appropriate compiler options for your computer system.

Compile INT2LM

Code Block
languagebash
$ cd $SPDIR/src/int2lm


Open the Fopts script with any text editor and change the Fortran options according to your computer platform. Compile INT2LM:

Code Block
languagebash
$ make


In case of problems, try module unload python3 before the make command (conda might have installed its own Fortran and broken the Intel Fortran environment). If compilation is successful, an executable $SPDIR/src/int2lm/bin/int2lm.exe is created.

You may try to perform a parallel make by typing:

Code Block
languagebash
$ make -j N


this results in a much faster compilation. Replace N by the number of requested processors.
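If you do not want to pick N by hand, it can be derived from the machine itself; a sketch assuming GNU coreutils' nproc is available:

```shell
# Build the parallel-make command line from the number of available cores.
# nproc is part of GNU coreutils; on other systems use e.g. getconf _NPROCESSORS_ONLN.
N=$(nproc)
echo "make -j ${N}"
```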

Compile CCLM

Code Block
languagebash
$ cd $SPDIR/src/cclm

Open the Fopts script with any text editor and change the Fortran options according to your computer platform. The Makefile might be edited for setting the right MACHINE (e.g., levante.atos.local). Compile CCLM:

Code Block
languagebash
$ make

If compilation is successful an executable $SPDIR/src/cclm/bin/cclm.exe is created.

You may try to perform a parallel make by typing:

Code Block
languagebash
$ make -j N

this results in a much faster compilation. Replace N by the number of requested processors.

Run the test examples

There are two tests, one for testing CCLM with GCM or reanalysis data as initial and boundary conditions (${SPDIR}/chain/gcm2cclm/sp001) and one for testing CCLM with coarse grid CCLM data as initial and boundary conditions (${SPDIR}/chain/cclm2cclm/sp002). Actually, sp001 creates the necessary input data for sp002.

...

Before you start the experiments, look for the following environment variables in the job_settings files of sp001 and sp002 and adapt them to your needs:

Code Block
languagebash
PROJECT_ACCOUNT=  # your project account
EMAIL_ADDRESS=    # your email address if you want to get information when your job crashes or finishes

If you are not running the tests at DKRZ, adopt the input directory of the ERAInterim data

Code Block
languagebash
GCM_DATADIR=/pool/data/CCLM/reanalyses/ERAInterim

and probably the de-tar part in prep.job.sh.
Now you should be ready to start the first experiment:

Code Block
languagebash
$ cd ${SPDIR}/chain/gcm2cclm/sp001
$ ./subchain start

This experiment is a two-month simulation, 50 km / Europe / driven by ERAInterim.

The scripts are called in the order prep.job.sh, int2lm.job.sh, cclm.job.sh, arch.job.sh, post.job.sh. If your job crashes in one of the scripts you do not have to run all the successful scripts again, but can restart the failed script, after you have made the corrections, by submitting the appropriate command from the following list:

Code Block
languagebash
$ ./subchain prep
$ ./subchain cclm noprep
$ ./subchain arch
$ ./subchain post

After successful completion of the sp001 experiment adapt the scripts in sp002 and start this experiment:

Code Block
languagebash
$ cd ${SPDIR}/chain/cclm2cclm/sp002
$ ./subchain start

This experiment is a two-month simulation, 2 km resolution, around Hamburg, driven by the results from the sp001 experiment.