Compile XBeach parallel version for use on Linux/cluster

  1. On your Windows PC, start Exceed > Exceed XDMCP Broadcast (or the icon on your desktop).
  2. Choose 'devux.wldelft.nl'.
  3. Under Sessions, choose 'GNOME'.
  4. Use your Deltares user name and password to log in.
  5. Start a Terminal session (Applications > System Tools > Terminal).
  6. Make a directory "checkouts":
    mkdir ~/checkouts
  7. Checkout the latest and greatest version of XBeach (enter your Deltares password if asked for):
    svn co https://repos.deltares.nl/repos/XBeach/trunk ~/checkouts/XBeach
    If you already have a local working copy and only want to update it, run (from inside ~/checkouts/XBeach):
    svn update
  8. Go to the XBeach directory:
    cd ~/checkouts/XBeach
  9. Run
    FC=ifort ./configure
  10. Run
    make 
  11. Make sure version 10 of the Intel Fortran compiler is used (instead of version 8):
    . /opt/intel/fc/10/bin/ifortvars.sh
  12. Remove the build output of the serial version, so stale object files cannot interfere with the parallel build:
    make clean
  13. Compile the parallel version:
    PATH=/opt/mpich2-1.0.7/bin:$PATH USEMPI=yes ./configure && make
  14. (optional) Copy the executable to your personal bin-folder:
    cd ~/bin
    cp ~/checkouts/XBeach/xbeach.mpi .
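The whole procedure above can be condensed into a single terminal session. This is just a restatement of steps 6-14 under the same assumptions (Intel Fortran 10 in /opt/intel/fc/10, MPICH2 in /opt/mpich2-1.0.7); adjust the paths if your environment differs:

```shell
# Check out (or update) the XBeach source
mkdir -p ~/checkouts
svn co https://repos.deltares.nl/repos/XBeach/trunk ~/checkouts/XBeach
cd ~/checkouts/XBeach

# Serial build with the Intel Fortran compiler
FC=ifort ./configure
make

# Switch to Intel Fortran 10, clean up, and rebuild with MPI support
. /opt/intel/fc/10/bin/ifortvars.sh
make clean
PATH=/opt/mpich2-1.0.7/bin:$PATH USEMPI=yes ./configure && make

# Optional: copy the executable to your personal bin-folder
mkdir -p ~/bin
cp xbeach.mpi ~/bin
```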

Compiled version

Compiled Linux versions of XBeach (both the MPI and serial version) can be found in Arend's bulletin box:

/BULLETIN/pool_ad/xbeach_linux/   # or
/u/pool_ad/BULLETIN/xbeach_linux/

Run XBeach parallel version on h3 cluster

To run a parallel XBeach job on the h3 cluster (from now on 'h3'), you need 3 things:

  1. A parallel version of XBeach somewhere on the u-drive (preferably in /u/username/bin)
  2. A job (shell) file you feed to the cluster
  3. A directory on your part of the u-drive with the simulation-data (params.txt, bathy.dep, etc)

Logging on to cluster

Windows
The easiest way to log on to h3 is the program PuTTY, which can be found on the desktop of your Deltares PC. The first time you connect to h3, you need to supply some basic parameters, which you can save for later use (with 'Save'). In the dialog box, under Host Name, fill in h3.wldelft.nl; you don't need to touch the other options (leave the Protocol set to SSH). Optionally, save the session as e.g. 'h3'. The first time you connect, you will probably see a message about the server's host key; click Yes to accept it. Log in with your Deltares user name and password.

Linux
If you want to connect from e.g. the development server (devux) to h3, you can connect from a terminal session. Type

ssh h3

to connect to h3. Your user name is passed along automatically, so you only need to enter your password.
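If you connect from a machine where your local account name differs from your Deltares account, you can state the user name explicitly (the name below is a placeholder; substitute your own):

```shell
ssh your_deltares_name@h3.wldelft.nl
```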

Obtain latest version of XBeach executable

There are 2 ways to obtain the latest (or any) version of the parallel XBeach executable:

  1. Compile it yourself (see the compilation instructions at the top of this page)
  2. Copy it from Arend Pool's BULLETIN:
    cd ~/bin
    cp /u/pool_ad/BULLETIN/xbeach_linux/current/xbeach.mpi .

Obtain the XBeach MPI job file

There are also 2 ways to obtain the job file to run xbeach.mpi on h3:

  1. Copy it from Arend Pool's BULLETIN:
    mkdir ~/simulations    # can be skipped if directory already exists
    cd ~/simulations
    cp /u/pool_ad/BULLETIN/xbeach_linux/xbeach.sh .
    In the above instructions, it is assumed you place the job file in the directory ~/simulations. If you want to place it somewhere else, feel free to do so and change the commands accordingly.
  2. Create the shell file yourself in a location you prefer. The file should contain the following code:
    #!/bin/sh
    
    . /opt/sge/InitSGE
    export PATH="/opt/mpich2/bin:$PATH"
    echo "numslots: $DELTAQ_NumSlots"
    echo "nodes: $DELTAQ_NodeList"
    echo $DELTAQ_NodeList | tr ' ' '\n' | sed 's/.wldelft.nl//' > machines
    echo "Machines file:"
    cat machines
    mpdboot -1 -n $DELTAQ_NumSlots -f machines
    mpirun -np $DELTAQ_NumSlots ~/bin/xbeach.mpi
    mpdallexit
    
    The second-to-last line (the mpirun call) contains the path to xbeach.mpi; edit it if you have placed the executable somewhere else.
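To see what the job file does with the node list, you can run the machines-file pipeline by hand, with a made-up two-node value for $DELTAQ_NodeList (the host names below are examples, not real cluster nodes):

```shell
# Simulate the value the grid engine would put in DELTAQ_NodeList
DELTAQ_NodeList="h3n01.wldelft.nl h3n02.wldelft.nl"

# Same pipeline as in the job file: one host per line, domain suffix stripped
echo $DELTAQ_NodeList | tr ' ' '\n' | sed 's/.wldelft.nl//' > machines
cat machines
```

The resulting machines file contains one short host name per line (here: h3n01 and h3n02); mpdboot reads it to start one MPD daemon per node.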

Run the parallel job

Make sure you have placed your simulation (directory) somewhere on a shared location (u-drive or p-drive) and go ('cd') there (for example):

cd /u/username/simulations/simulation1   # or
cd /p/project/simulations/simulation1

Finally, submit your job to the grid engine (h3) with the following command:

qsub -pe spread N /path-to-job-file/xbeach.sh

where N is the number of slots (parallel processes) you request from the grid engine and /path-to-job-file/ is the directory containing xbeach.sh.
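For example, a submission with 4 slots and the job file in ~/simulations (both the slot count and the path are assumptions; use your own values), followed by the standard SGE commands to inspect or cancel the job:

```shell
qsub -pe spread 4 ~/simulations/xbeach.sh

qstat          # inspect the state of your queued/running jobs
qdel <job-id>  # remove a job from the queue if needed
```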

More info

The SGE User Guide can be found at the ICT department space of this wiki.
