Telemac Notes

There are currently modules for two Telemac versions. You can search for the names of the available modules with the "module spider" command:


[abol@katahdin ~]$ module spider telemac


-----------------------------------------------------------------------------------------------------------------

telemac:

-----------------------------------------------------------------------------------------------------------------

Versions:

telemac/v7

telemac/v8p2r0


-----------------------------------------------------------------------------------------------------------------

For detailed information about a specific "telemac" module (including how to load the modules) use the module's full name.

For example:


$ module spider telemac/v8p2r0

------------------------------------------------------------------------------------------------------------------


The version that seems to work best is v7. To get more information about this module, use the "module show" command:


[abol@katahdin ~]$ module show telemac/v7

-------------------------------------------------------------------------------------------------------------

/opt/ohpc/pub/modulefiles/telemac/v7:

-------------------------------------------------------------------------------------------------------------

setenv("HOMETEL","/opt/ohpc/pub/telemac/v7")

prepend_path("PATH","/opt/ohpc/pub/telemac/v7/scripts/python27")

prepend_path("SOURCEFILE","/opt/ohpc/pub/telemac/v7/scripts/python27")

prepend_path("SYSTELCFG","/opt/ohpc/pub/telemac/v7/configs/systel.cis-centos.cfg")

setenv("USETELCFG","cluster")

prepend_path("PYTHONUNBUFFERED","'true'")

prepend_path("PYTHONPATH","/opt/ohpc/pub/telemac/v7/scripts/python27")

prepend_path("LD_LIBRARY_PATH","/opt/ohpc/pub/telemac/v7/builds/cluster/lib")

prepend_path("PYTHONPATH","/opt/ohpc/pub/telemac/v7/builds/cluster/lib")

setenv("METISHOME","/opt/ohpc/pub/parmetis/4.0.3")

prepend_path("LD_LIBRARY_PATH","/opt/ohpc/pub/parmetis/4.0.3/lib")

unload("mvapich2")

load("mvapich2-intel/intel-2.2","anaconda2")


A few things to note from this output (a quick way to verify them follows the list):

  • the Telemac software expects the obsolete Python 2.7

  • /opt/ohpc/pub/telemac/v7/scripts/python27 is prepended to the PATH variable

  • the Telemac configuration file is: /opt/ohpc/pub/telemac/v7/configs/systel.cis-centos.cfg

  • specific versions of MVAPICH2 (the MPI distributed-parallel libraries), METIS and Anaconda (Python) are loaded automatically

  • the MVAPICH2 and Anaconda modules that are loaded may in turn load other modules
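
A quick way to sanity-check the environment after loading the module is to echo a couple of the variables the module sets (the names come from the "module show" output above):


module load telemac/v7

echo $HOMETEL        # should print /opt/ohpc/pub/telemac/v7

echo $USETELCFG      # should print cluster

python --version     # should report a Python 2.7.x from anaconda2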

As an example, when you first login to Katahdin, you will have a set of default modules loaded:


[abol@katahdin ~]$ module list


Currently Loaded Modules:

1) autotools 2) prun/1.3 3) gnu8/8.3.0 4) mvapich2/2.3.2 5) ohpc



Loading the Telemac module should provide all of the dependencies needed to run Telemac; you do not need to set up Python or MVAPICH2 yourself.

To load the module run:


module load telemac/v7

Then, if you check which modules are loaded, you will see:


[abol@katahdin ~]$ module list


Currently Loaded Modules:

1) autotools 3) ohpc 5) intel/2017.1.132 7) gnu8/8.3.0 9) telemac/v7

2) prun/1.3 4) legacy/1.0 6) mvapich2-intel/intel-2.2 8) anaconda2/5.3.1


The telemac/v7 module is there, but the anaconda2/5.3.1 module has also been added. You can also see that the MVAPICH2 module has changed from mvapich2/2.3.2 to mvapich2-intel/intel-2.2. In addition, the intel/2017.1.132 compiler module has been added, as well as a module called legacy/1.0. The gnu8/8.3.0 module is still loaded because the Intel compilers require it.
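
Since the compiler and MPI stack get swapped, it can be worth confirming which toolchain is active after loading the module, for example:


which mpifort       # should resolve to the mvapich2-intel installation

mpifort -v          # should report MVAPICH2 2.2 and ifort 17.0.1, matching the build output further below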

Telemac differs from most programs run on the cluster in that you submit a job by running a Telemac command directly, rather than writing and submitting a job script yourself. The general Telemac commands for running models are:


  • telemac1d.py

  • telemac2d.py

  • telemac3d.py

A typical command to submit a job would be something like:


telemac2d.py t2d_gouttedo.cas --nctile=10 --ncnode=6 --walltime="10:00:00" --queue="haswell" --jobname="2d-test"


To find out what options there are for these programs run:


[abol@katahdin ~]$ telemac2d.py --help



Loading Options and Configurations

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


Usage: telemac3d.py [options]
use -h for more help.

Options:
  -h, --help            show this help message and exit
  -c CONFIGNAME, --configname=CONFIGNAME
                        specify configuration name, default is randomly found
                        in the configuration file
  -f CONFIGFILE, --configfile=CONFIGFILE
                        specify configuration file, default is systel.cfg
  -r ROOTDIR, --rootdir=ROOTDIR
                        specify the root, default is taken from config file
  -s, --sortiefile      specify whether there is a sortie file, default is no
  -t, --tmpdirectory    specify whether the temporary directory is removed,
                        default is yes
  -x, --compileonly     specify whether to only create an executable but not
                        run, default is no
  -w WDIR, --workdirectory=WDIR
                        specify whether to re-run within a defined
                        subdirectory
  --nozip               specify whether to zip the extra sortie file if
                        simulation in parallel
  --jobname=JOBNAME     specify a jobname for HPC queue tracking
  --queue=HPC_QUEUE     specify a queue for HPC queue tracking
  --walltime=WALLTIME   specify a walltime for HPC queue tracking
  --email=EMAIL         specify an e-mail adress to warn when HPC job is
                        finished
  --hosts=HOSTS         specify the list of hosts available for parallel
                        mode, ';' delimited
  --ncsize=NCSIZE       the number of processors forced in parallel mode
  --nctile=NCTILE       the number of core per node. ncsize/nctile is the
                        number of compute nodes
  --ncnode=NCNODE       the number of of nodes. ncsize = ncnode*nctile is the
                        total number of compute nodes
  --sequential          if present, imposes that multiple CAS files are
                        launched one after the other
  --mpi                 make sure the mpi command is executed, ignoring any
                        hpc command
  --split               will only do the trace (and the split in parallel) if
                        option there
  --merge               will only do the output copying (and recollection in
                        parallel) if option there
  --run                 will only run the simulation if option there
  --use-link            Will use link instead of copy in the temporary folder
                        (Unix system only)


In setting up how these parameters are translated to SLURM, I found that the descriptions are misleading. Here is what I ended up with in the configuration file:


mpi_hosts: katahdin

mpi_cmdexec: srun <exename>

#

par_cmdexec: <config>/partel < PARTEL.PAR >> <partel.log>

#

hpc_stdin: #!/bin/bash -l

#SBATCH --output=<sortiefile>

#SBATCH --error=<exename>.err

#SBATCH --job-name=<jobname>

#SBATCH --partition=<queue>

#SBATCH --nodes=<nctile>

#SBATCH --ntasks-per-node=<ncnode>

#SBATCH --mincpus=<ncnode>

#SBATCH --time=<walltime>

#SBATCH --mem=59gb


module load telemac/v7 anaconda2

<mpi_cmdexec>

exit


I found that the descriptions in --help for --ncnode and --nctile are backwards. That is, --nctile actually refers to the number of nodes to allocate and --ncnode refers to the number of tasks/cores per node to run. So in the example command above:


telemac2d.py t2d_gouttedo.cas --nctile=10 --ncnode=6 --walltime="10:00:00" --queue="haswell" --jobname="2d-test"


it will allocate 10 nodes with 6 tasks/cores per node, for a total of 60 processes running the job.
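
With the hpc_stdin configuration above, those options map onto SBATCH directives roughly like this (a sketch using the values from the example command):


--nctile=10   ->   #SBATCH --nodes=10

--ncnode=6    ->   #SBATCH --ntasks-per-node=6

                   (10 nodes x 6 tasks per node = 60 MPI tasks in total)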

When this command is run, several things happen:


  1. a new directory is created based on the name of the input file and the current date/time

  2. a new program is compiled in that new directory based on the settings of the input file

  3. a job script is created

  4. the job is submitted

Here is an example:

[abol@katahdin gouttedo]$ telemac2d.py /home/abol/telemac/gouttedo/t2d_gouttedo.cas --nctile=2 --ncnode=3 --walltime=10:00 --queue=haswell --jobname=telemac-test



Loading Options and Configurations

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


[Telemac ASCII-art banner]

... parsing configuration file: /opt/ohpc/pub/telemac/v7/configs/systel.cis-centos.cfg



Running your CAS file for:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


+> configuration: cluster

+> parallel mode on the HPC queue, using mpiexec within the queue.

| In that case, the file partitioning and assembly are done by the

| python on the main node.

| (the merge would have to be done manually)

| The only difference with hydru is the presence of the key

| hpc_cmdexec. Of course, you also need the key hpc_stdin.

+> root: /opt/ohpc/pub/telemac/v7



~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~



... reading the main module dictionary


... processing the main CAS file(s)

+> running in English


... handling temporary directories


... checking coupling between codes


... checking parallelisation


... first pass at copying all input files

copying: t2d_gouttedo.f /home/abol/telemac/gouttedo/t2d_gouttedo.cas_2022-09-14-11h46min55s/t2dfort.f

copying: geo_gouttedo.cli /home/abol/telemac/gouttedo/t2d_gouttedo.cas_2022-09-14-11h46min55s/T2DCLI

copying: geo_gouttedo.slf /home/abol/telemac/gouttedo/t2d_gouttedo.cas_2022-09-14-11h46min55s/T2DGEO

re-copying: /home/abol/telemac/gouttedo/t2d_gouttedo.cas_2022-09-14-11h46min55s/T2DCAS

copying: telemac2d.dico /home/abol/telemac/gouttedo/t2d_gouttedo.cas_2022-09-14-11h46min55s/T2DDICO

... checking the executable

mpifort for MVAPICH2 version 2.2

ifort: warning #10315: specifying -lm before files may supersede the Intel(R) math library and affect performance

ifort version 17.0.1

/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/bin/intel64/fortcom -mGLOB_em64t=TRUE -mP1OPT_version=17.0-intel64 -mGLOB_diag_file=t2dfort.diag -mGLOB_long_size_64 -mGLOB_routine_pointer_size_64 -mGLOB_source_language=GLOB_SOURCE_LANGUAGE_F90 -mP2OPT_static_promotion -mP1OPT_print_version=FALSE -mCG_use_gas_got_workaround=F -mP2OPT_align_option_used=TRUE -mGLOB_gcc_version=830 "-mGLOB_options_string=-I/opt/ohpc/pub/mvapich2/2.2/intel-2017.1.132/include -I/opt/ohpc/pub/mvapich2/2.2/intel-2017.1.132/include -convert big_endian -lpthread -v -lm -o t2d_gouttedo -L/opt/ohpc/pub/mvapich2/2.2/intel-2017.1.132/lib -lmpifort -Wl,-rpath -Wl,/opt/ohpc/pub/mvapich2/2.2/intel-2017.1.132/lib -Wl,--enable-new-dtags -lmpi" -mGLOB_cxx_limited_range=FALSE -mCG_extend_parms=FALSE -mGLOB_compiler_bin_directory=/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/bin/intel64 -mGLOB_as_output_backup_file_name=/tmp/ifortKY2l8Was_.s -mGLOB_dashboard_use_source_name -mIPOPT_activate -mGLOB_product_id_code=0x22006d8f -mP3OPT_use_mspp_call_convention -mP2OPT_subs_out_of_bound=FALSE -mP2OPT_disam_type_based_disam=2 -mGLOB_ansi_alias -mPGOPTI_value_profile_use=T -mGLOB_opt_report_use_source_name -mP2OPT_il0_array_sections=TRUE -mGLOB_offload_mode=1 -mP2OPT_offload_unique_var_string=ifort1737642580wqVHKX -mP2OPT_hlo -mP2OPT_hpo_rtt_control=0 -mIPOPT_args_in_regs=0 -mP2OPT_disam_assume_nonstd_intent_in=FALSE -mGLOB_imf_mapping_library=/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/bin/intel64/libiml_attr.so -mPGOPTI_gen_threadsafe_level=0 -mIPOPT_link -mIPOPT_ipo_activate -mIPOPT_mo_activate -mIPOPT_source_files_list=/tmp/ifortslisldll71 -mIPOPT_mo_global_data -mIPOPT_link_script_file=/tmp/ifortscriptQGEOMF "-mIPOPT_cmdline_link="/lib/../lib64/crt1.o" "/lib/../lib64/crti.o" "/opt/ohpc/pub/compiler/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0/crtbegin.o" "--eh-frame-hdr" "--build-id" "-dynamic-linker" "/lib64/ld-linux-x86-64.so.2" "-m" "elf_x86_64" "-L/opt/ohpc/pub/mvapich2/2.2/intel-2017.1.132/lib" "-o" "t2d_gouttedo" "/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64_lin/for_main.o" "-L/opt/ohpc/pub/mvapich2/2.2/intel-2017.1.132/lib" "-L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/tbb/lib/intel64/cc4.1.0_libc2.4_kernel2.6.16.21" "-L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64" "-L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64" "-L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/ipp/lib/intel64" "-L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64_lin" "-L/opt/ohpc/pub/compiler/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0/" "-L/opt/ohpc/pub/compiler/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0/../../../../lib64" "-L/opt/ohpc/pub/compiler/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0/../../../../lib64/" "-L/lib/../lib64" "-L/lib/../lib64/" "-L/usr/lib/../lib64" "-L/usr/lib/../lib64/" "-L/opt/ohpc/pub/mvapich2/2.2/intel-2017.1.132/lib/" "-L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64/" "-L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64/" 
"-L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/ipp/lib/intel64/" "-L/opt/ohpc/pub/compiler/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0/../../../" "-L/lib64" "-L/lib/" "-L/usr/lib64" "-L/usr/lib" "-lpthread" "-L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64_lin" "-Bstatic" "-limf" "-Bdynamic" "-lm" "t2dfort.o" "/opt/ohpc/pub/telemac/v7/builds/cluster/lib/telemac2d/homere_telemac2d.a" "/opt/ohpc/pub/telemac/v7/builds/cluster/lib/waqtel/homere_telemac2d.a" "/opt/ohpc/pub/telemac/v7/builds/cluster/lib/tomawac/homere_telemac2d.a" "/opt/ohpc/pub/telemac/v7/builds/cluster/lib/sisyphe/homere_telemac2d.a" "/opt/ohpc/pub/telemac/v7/builds/cluster/lib/stbtel/homere_telemac2d.a" "/opt/ohpc/pub/telemac/v7/builds/cluster/lib/utils/ad/homere_telemac2d.a" "/opt/ohpc/pub/telemac/v7/builds/cluster/lib/nestor/homere_telemac2d.a" "/opt/ohpc/pub/telemac/v7/builds/cluster/lib/utils/bief/homere_telemac2d.a" "/opt/ohpc/pub/telemac/v7/builds/cluster/lib/utils/hermes/homere_telemac2d.a" "/opt/ohpc/pub/telemac/v7/builds/cluster/lib/utils/parallel/homere_telemac2d.a" "/opt/ohpc/pub/telemac/v7/builds/cluster/lib/utils/damocles/homere_telemac2d.a" "/opt/ohpc/pub/telemac/v7/builds/cluster/lib/utils/special/homere_telemac2d.a" "/opt/ohpc/pub/mvapich2/2.2/intel-2017.1.132/lib/libmpich.so" "/opt/ohpc/pub/metis/5.1.0-intel/lib/libmetis.a" "-lmpifort" "-rpath" "/opt/ohpc/pub/mvapich2/2.2/intel-2017.1.132/lib" "--enable-new-dtags" "-lmpi" "-Bdynamic" "-Bstatic" "-lifport" "-lifcoremt" "-limf" "-lsvml" "-Bdynamic" "-lm" "-Bstatic" "-lipgo" "-lirc" "-Bdynamic" "-lpthread" "-Bstatic" "-lsvml" "-Bdynamic" "-lc" "-lgcc" "-lgcc_s" "-Bstatic" "-lirc_s" "-Bdynamic" "-ldl" "-lc" "/opt/ohpc/pub/compiler/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0/crtend.o" "/lib/../lib64/crtn.o"" -mIPOPT_il_in_obj -mIPOPT_ipo_activate_warn=FALSE -mIPOPT_obj_output_file_name=/tmp/ipo_iforthSWC6b.o -mIPOPT_whole_archive_fixup_file_name=/tmp/ifortwarch5FvTbS -mGLOB_linker_version=2.27 -mGLOB_driver_tempfile_name=/tmp/iforttempfileCcoVSv -mP3OPT_asm_target=P3OPT_ASM_TARGET_GAS -mGLOB_async_unwind_tables=TRUE -mGLOB_obj_output_file=/tmp/ipo_iforthSWC6b.o -mGLOB_source_dialect=GLOB_SOURCE_DIALECT_NONE -mP1OPT_source_file_name=ipo_out.f -mP2OPT_symtab_type_copy=true t2dfort.o -mIPOPT_object_files=T -mIPOPT_assembly_files=/tmp/ifortalisvOFvrt -mIPOPT_generated_tempfiles=/tmp/ifortelisynAX66 -mIPOPT_embedded_object_base_name=/tmp/iforteobj5zPpMK -mIPOPT_cmdline_link_new_name=/tmp/ifortllisieqSro

ld /lib/../lib64/crt1.o /lib/../lib64/crti.o /opt/ohpc/pub/compiler/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0/crtbegin.o --eh-frame-hdr --build-id -dynamic-linker /lib64/ld-linux-x86-64.so.2 -m elf_x86_64 -L/opt/ohpc/pub/mvapich2/2.2/intel-2017.1.132/lib -o t2d_gouttedo /opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64_lin/for_main.o -L/opt/ohpc/pub/mvapich2/2.2/intel-2017.1.132/lib -L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/tbb/lib/intel64/cc4.1.0_libc2.4_kernel2.6.16.21 -L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64 -L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64 -L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/ipp/lib/intel64 -L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64_lin -L/opt/ohpc/pub/compiler/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0/ -L/opt/ohpc/pub/compiler/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0/../../../../lib64 -L/opt/ohpc/pub/compiler/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0/../../../../lib64/ -L/lib/../lib64 -L/lib/../lib64/ -L/usr/lib/../lib64 -L/usr/lib/../lib64/ -L/opt/ohpc/pub/mvapich2/2.2/intel-2017.1.132/lib/ -L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64/ -L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64/ -L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/ipp/lib/intel64/ -L/opt/ohpc/pub/compiler/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0/../../../ -L/lib64 -L/lib/ -L/usr/lib64 -L/usr/lib -lpthread -L/opt/ohpc/pub/intel/parallel_studio_xe_2017/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64_lin -Bstatic -limf -Bdynamic -lm t2dfort.o /opt/ohpc/pub/telemac/v7/builds/cluster/lib/telemac2d/homere_telemac2d.a /opt/ohpc/pub/telemac/v7/builds/cluster/lib/waqtel/homere_telemac2d.a /opt/ohpc/pub/telemac/v7/builds/cluster/lib/tomawac/homere_telemac2d.a /opt/ohpc/pub/telemac/v7/builds/cluster/lib/sisyphe/homere_telemac2d.a /opt/ohpc/pub/telemac/v7/builds/cluster/lib/stbtel/homere_telemac2d.a /opt/ohpc/pub/telemac/v7/builds/cluster/lib/utils/ad/homere_telemac2d.a /opt/ohpc/pub/telemac/v7/builds/cluster/lib/nestor/homere_telemac2d.a /opt/ohpc/pub/telemac/v7/builds/cluster/lib/utils/bief/homere_telemac2d.a /opt/ohpc/pub/telemac/v7/builds/cluster/lib/utils/hermes/homere_telemac2d.a /opt/ohpc/pub/telemac/v7/builds/cluster/lib/utils/parallel/homere_telemac2d.a /opt/ohpc/pub/telemac/v7/builds/cluster/lib/utils/damocles/homere_telemac2d.a /opt/ohpc/pub/telemac/v7/builds/cluster/lib/utils/special/homere_telemac2d.a /opt/ohpc/pub/mvapich2/2.2/intel-2017.1.132/lib/libmpich.so /opt/ohpc/pub/metis/5.1.0-intel/lib/libmetis.a -lmpifort -rpath /opt/ohpc/pub/mvapich2/2.2/intel-2017.1.132/lib --enable-new-dtags -lmpi -Bdynamic -Bstatic -lifport -lifcoremt -limf -lsvml -Bdynamic -lm -Bstatic -lipgo -lirc -Bdynamic -lpthread -Bstatic -lsvml -Bdynamic -lc -lgcc -lgcc_s -Bstatic -lirc_s -Bdynamic -ldl -lc /opt/ohpc/pub/compiler/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0/crtend.o /lib/../lib64/crtn.o

created: t2d_gouttedo

re-copying: t2d_gouttedo /home/abol/telemac/gouttedo/t2d_gouttedo.cas_2022-09-14-11h46min55s/out_t2d_gouttedo


... modifying run command to MPI instruction

... modifying run command to PARTEL instruction


... partitioning base files (geo, conlim, sections, zones and weirs)

+> /opt/ohpc/pub/telemac/v7/builds/cluster/bin/partel < PARTEL.PAR >> partel_T2DGEO.log

0


... splitting / copying other input files


... handling sortie file(s)



Running your simulation(s) :

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~



... modifying run command to HPC instruction

[Telemac ASCII-art banner]

Submitted batch job 962653

... Your simulation (t2d_gouttedo.cas) has been launched through the queue.


+> You need to wait for completion before re-collecting files using the option --merge




My work is done
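
Since the job was submitted through SLURM (batch job 962653 above), you can check on it with the usual SLURM tools, for example:


squeue -u $USER     # list your queued and running jobs

sacct -j 962653     # accounting/status for this particular job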


The directory that gets created contains the copied input files, the compiled executable, the partitioned input files (one set per MPI task), the generated job script (HPC_STDIN) and the per-task result and log files:


[abol@katahdin t2d_gouttedo.cas_2022-09-14-11h46min55s]$ ls -alrt

total 12358

drwx------ 1 abol abol 21 Sep 14 11:46 ..

-rw-rw-r-- 1 abol abol 5242 Sep 14 11:46 t2dfort.f

-rw-rw-r-- 1 abol abol 19296 Sep 14 11:46 T2DCLI

-rw-rw-r-- 1 abol abol 181988 Sep 14 11:46 T2DGEO

-rw-rw-r-- 1 abol abol 2745 Sep 14 11:46 T2DCAS

-rw-rw-r-- 1 abol abol 207550 Sep 14 11:46 T2DDICO

-rw-rw-r-- 1 abol abol 4 Sep 14 11:46 CONFIG

-rw-rw-r-- 1 abol abol 8848 Sep 14 11:46 t2dfort.o

-rwxrwxr-x 1 abol abol 10424008 Sep 14 11:46 out_t2d_gouttedo

-rw-rw-r-- 1 abol abol 53 Sep 14 11:46 MPI_HOSTFILE

-rw-rw-r-- 1 abol abol 44 Sep 14 11:46 PARTEL.PAR

-rw-rw-r-- 1 abol abol 74 Sep 14 11:46 PARAL

-rw-rw-r-- 1 abol abol 14163 Sep 14 11:46 T2DPAR00005-00000

-rw-rw-r-- 1 abol abol 11946 Sep 14 11:46 T2DPAR00005-00002

-rw-rw-r-- 1 abol abol 18450 Sep 14 11:46 T2DPAR00005-00001

-rw-rw-r-- 1 abol abol 13343 Sep 14 11:46 T2DPAR00005-00003

-rw-rw-r-- 1 abol abol 16942 Sep 14 11:46 T2DPAR00005-00004

-rw-rw-r-- 1 abol abol 12617 Sep 14 11:46 T2DPAR00005-00005

-rw-rw-r-- 1 abol abol 31248 Sep 14 11:46 T2DGEO00005-00000

-rw-rw-r-- 1 abol abol 8944 Sep 14 11:46 T2DCLI00005-00000

-rw-rw-r-- 1 abol abol 31140 Sep 14 11:46 T2DGEO00005-00001

-rw-rw-r-- 1 abol abol 5408 Sep 14 11:46 T2DCLI00005-00001

-rw-rw-r-- 1 abol abol 31188 Sep 14 11:46 T2DGEO00005-00003

-rw-rw-r-- 1 abol abol 30632 Sep 14 11:46 T2DGEO00005-00002

-rw-rw-r-- 1 abol abol 8944 Sep 14 11:46 T2DCLI00005-00003

-rw-rw-r-- 1 abol abol 14560 Sep 14 11:46 T2DCLI00005-00002

-rw-rw-r-- 1 abol abol 31328 Sep 14 11:46 T2DGEO00005-00004

-rw-rw-r-- 1 abol abol 4992 Sep 14 11:46 T2DCLI00005-00004

-rw-rw-r-- 1 abol abol 31344 Sep 14 11:46 T2DGEO00005-00005

-rw-rw-r-- 1 abol abol 14352 Sep 14 11:46 T2DCLI00005-00005

-rw-rw-r-- 1 abol abol 3027 Sep 14 11:46 partel_T2DGEO.log

-rwxr-xr-x 1 abol abol 377 Sep 14 11:46 HPC_STDIN

-rw-rw-r-- 1 abol abol 10638 Sep 14 11:46 hpc-job.sortie

drwxrwxr-x 1 abol abol 44 Sep 14 11:47 .

-rw-rw-r-- 1 abol abol 228144 Sep 14 11:47 T2DRES00005-00002

-rw-rw-r-- 1 abol abol 233364 Sep 14 11:47 T2DRES00005-00001

-rw-rw-r-- 1 abol abol 232728 Sep 14 11:47 T2DRES00005-00000

-rw-rw-r-- 1 abol abol 233568 Sep 14 11:47 T2DRES00005-00005

-rw-rw-r-- 1 abol abol 234048 Sep 14 11:47 T2DRES00005-00004

-rw-rw-r-- 1 abol abol 231924 Sep 14 11:47 T2DRES00005-00003

-rw-rw-r-- 1 abol abol 10524 Sep 14 11:47 PE00005-00002.LOG

-rw-rw-r-- 1 abol abol 10524 Sep 14 11:47 PE00005-00001.LOG

-rw-rw-r-- 1 abol abol 10524 Sep 14 11:47 PE00005-00005.LOG

-rw-rw-r-- 1 abol abol 10524 Sep 14 11:47 PE00005-00004.LOG

-rw-rw-r-- 1 abol abol 10524 Sep 14 11:47 PE00005-00003.LOG

-rw-rw-r-- 1 abol abol 12 Sep 14 11:47 out_t2d_gouttedo.err


And the SLURM script that was generated (called HPC_STDIN) looks like:


#!/bin/bash -l

#SBATCH --output=hpc-job.sortie

#SBATCH --error=out_t2d_gouttedo.err

#SBATCH --job-name=telemac-test

#SBATCH --partition=haswell

#SBATCH --nodes=2

#SBATCH --ntasks-per-node=3

#SBATCH --mincpus=3

#SBATCH --time=10:00

#SBATCH --mem=59gb

module load telemac/v7 anaconda2

srun /home/abol/telemac/gouttedo/t2d_gouttedo.cas_2022-09-14-11h46min55s/out_t2d_gouttedo

exit
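
Once the job has finished, the distributed result files need to be re-collected using the --merge option, as the submission output above notes. A sketch of what that command might look like for this run (the work-directory name will differ for each run; the -w option is described in the --help output above):


telemac2d.py /home/abol/telemac/gouttedo/t2d_gouttedo.cas --merge -w t2d_gouttedo.cas_2022-09-14-11h46min55s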