General use of the TELEMAC system
This page describes the general use of the TELEMAC system in Geographical Sciences.
TELEMAC-2D, SISYPHE, ESTEL-2D and ESTEL-3D are available. More modules could be added if necessary. Just ask.
Linux
The TELEMAC system is installed centrally on "dylan", which runs the Linux operating system (CentOS). You will need to log in to dylan and use Linux commands to run TELEMAC jobs, so it helps to practice a bit in a Linux environment first. The Pragmatic Programming course might be a good place for this. Ask the scientific computing officer for pointers if you need them, and get some training if required.
Environment set-up
Configuring the environment to use TELEMAC is easy, as you simply have to source central configuration files. Add the following lines to your .bashrc configuration file, then log out and back in again.
# Location of the TELEMAC system
SYSTEL90=/home/telemac
export SYSTEL90
source $SYSTEL90/intel_env
source $SYSTEL90/config/systel_env
You should then be able to "see" the Fortran compiler and the programs of the TELEMAC system, for instance:
$ which telemac2d
/home/telemac/bin/telemac2d
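You can check the Fortran compiler the same way. The line below assumes the compiler is Intel's ifort, which the intel_env file suggests but this page does not state explicitly; it should print the compiler's location rather than an error:
$ which ifort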
Note that if you log in to another machine (i.e. not dylan) you might get an error message about "/home/telemac" not existing or a file not being found. This is normal: it does not exist on the other machine. Live with it, or adapt your .bashrc so that the files are sourced only on dylan.
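A minimal sketch of such a guard, assuming dylan's short hostname is literally "dylan" (adjust the test if it is not):
# In .bashrc: source the TELEMAC environment only on dylan
if [ "$(hostname -s)" = "dylan" ]; then
    SYSTEL90=/home/telemac
    export SYSTEL90
    source $SYSTEL90/intel_env
    source $SYSTEL90/config/systel_env
fi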
Test
TELEMAC-2D includes some test cases. Copy one into your filespace and run it:
$ cp -r /home/telemac/telemac2d/tel2d_v5p8/test.gb/hydraulic_jump .
$ cd hydraulic_jump
$ telemac2d cas.txt
If this works, you have a well-configured environment. Now go and do some real work with your own files.
A note about ASCII and binary files
Parallel jobs
The TELEMAC system is configured to run in parallel mode if requested by the user. This is actually a very simple thing to do and highly encouraged if you use large meshes and run long simulations. However, a few extra initial steps are required.
TELEMAC uses MPI for parallel operations. MPI requires a secret word in a hidden configuration file. Simply type the following instructions to create it. Note that "somethingsecret" below should contain no spaces.
$ cd
$ touch .mpd.conf
$ chmod 600 .mpd.conf
$ echo "MPD_SECRETWORD=somethingsecret" > .mpd.conf
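You can verify that the file ended up with owner-only permissions (the mode column should read -rw-------, which is what the chmod 600 above sets; the MPD daemon is fussy about this):
$ ls -l ~/.mpd.conf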
Run the software in scalar mode once to get an idea of the job duration, for instance:
$ cp -r /home/telemac/telemac2d/tel2d_v5p8/test.gb/cavity .
$ cd cavity/
$ telemac2d cas.txt
The example above should run in about 55s on dylan. Now edit cas.txt so that the line about the number of processors looks like:
PARALLEL PROCESSORS = 8
Note that dylan has 8 processor cores, so the system is configured to run with 8 processors as a maximum. Put 0 to run in scalar mode. "1" runs in parallel mode but with one processor only, so "0" and "1" should give the same results despite using two different libraries.
Before you can run TELEMAC in parallel, you need to start the MPI daemon. This needs to be done once per login.
$ mpd &
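If you want to confirm the daemon is up before launching a job, mpdtrace (part of the same MPD tool set, assuming an MPICH2-style installation) should print the machine name:
$ mpdtrace
dylan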
Then you can run telemac2d again:
$ telemac2d cas.txt
It should run again, faster, maybe 30 seconds or so. It is not a lot faster because this is a silly example and splitting the mesh into 8 subdomains accounts for a large part of the computation time.
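If you want to measure the difference yourself rather than watch the clock, the standard shell time builtin works for both the scalar and parallel runs:
$ time telemac2d cas.txt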
Before you log out, it is a good idea to kill the MPI daemon:
$ mpdallexit