Creating and running an application on the NGS

http://www.grid-support.ac.uk

http://www.ngs.ac.uk

http://www.nesc.ac.uk/ http://www.pparc.ac.uk/ http://www.eu-egee.org/

Outline

• A “User interface” machine and our set-up today

• Commands to be used

• Practical:

– Port code and data from desktop/UI to the NGS compute nodes

– Compile and run code

– Invoke your application from the UI machine

• Summary


Goals

• You’ve got your proxy on a UI machine and want to know how to port and run an application

• Er.. What’s a UI machine?


The “UI” machine

• The user's interface to the grid

– Where you upload your certificate for your session

– Where you create proxy certificates

– Where you can run the various commands


Our setup

[Diagram: tutorial room machines connect by ssh to the UI machine, which reaches the core NGS nodes (grid-data.rl.ac.uk) over the Internet.]

Secure file copy

[Diagram: gsiscp copies files from the UI to an NGS compute node, using the proxy certificate for authentication and authorisation (AA).]

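The copy step above can be sketched as a command. This is a dry-run sketch: the `run` wrapper prints each command instead of executing it, so it can be checked without a grid connection; drop the wrapper to run it for real from a UI machine. The host, port, account name and file names are illustrative assumptions, not NGS-specified values.

```shell
# Dry-run sketch: 'run' echoes commands rather than executing them.
run() { echo "$@"; }

# Copy source code and input data from the UI to an NGS compute node.
# The proxy certificate authenticates the transfer (no password prompt).
# Host, port 2222 and account ngs0255 are illustrative assumptions.
run gsiscp -P 2222 myprog.c input.dat ngs0255@grid-compute.leeds.ac.uk:
```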

Open shell on NGS CN

[Diagram: gsissh opens a shell from the UI on an NGS compute node; code and data reside on both.]

• Can be an X Windows client

• Compile, edit, recompile, build

• SHORT interactive runs are OK (sequential)

• Totalview debugger

Establishing contact

• Connecting to the NGS compute node

• gsissh via port 2222

• Modify environment on the NGS account (note which node you use and keep using it, or repeat commands on other nodes)

– .bashrc

• Used when you connect with gsissh

• Used by Globus


• gsissh -X -p 2222 grid-compute.leeds.ac.uk

• -X: enables forwarding of X11 connections

• -p: sets port through which gsissh works

• [can set -X and -p 2222 as defaults in config files]
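One way those defaults might be set, assuming your gsissh build honours OpenSSH-style per-user configuration (gsissh is OpenSSH-based, but check your installation's documentation); the hostname is illustrative:

```
# ~/.ssh/config (sketch, not verified against a specific gsissh build)
Host grid-compute.leeds.ac.uk
    Port 2222
    ForwardX11 yes
```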


Run jobs from the UI

[Diagram: executables run on the NGS compute node; code and data sit on both the UI and the compute node.]

• globus-job-run

• Or globus-job-submit / globus-job-get-output

• Can pass files with these commands: e.g. parameters for a job.

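The batch route above can be sketched as a command sequence. Again a dry-run sketch: `run` prints each command instead of executing it. Hostname and job are illustrative; in a real session globus-job-submit prints a job-contact URL, which the later commands take as their argument (shown here as a placeholder, not filled in).

```shell
# Dry-run sketch: 'run' echoes commands rather than executing them.
run() { echo "$@"; }

# Submit to the batch queue; in real use this prints a job-contact URL.
run globus-job-submit grid-data.rl.ac.uk/jobmanager-pbs ./test

# Placeholder for the contact URL printed by the submit step.
JOB='<job-contact-URL>'

# Poll the job state, then fetch its stdout once it has finished.
run globus-job-status "$JOB"
run globus-job-get-output "$JOB"
```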

PRACTICAL 2


The “UI” machine

• The user's interface to the grid

– Where you upload your certificate for your session

– Where you create proxy certificates

– Where you can run the various commands

• That was the good news.

• The bad news: install your own (currently!)

– May be your desktop with additional software (GT2) – if you run a UNIX derivative

– Note: 31 March – a 1-day course on this.


globus-job-run

[Diagram: the job request goes from the UI to the Globus gatekeeper on the head node. The gatekeeper reads the gridmapfile to map the user's ID from their proxy certificate to a local account, then forks a process to run the command. The job runs on the head node. The information system (I.S.) publishes node details.]

globus-job-submit

[Diagram: the job request goes from the UI to the Globus gatekeeper on the head node. The gatekeeper reads the gridmapfile to map the user's ID from their proxy certificate to a local account, then places the job in the batch queue (PBSPro). The job runs on a cluster node.]

Questions -1

• “How do I know which compute node to use?”

– As more nodes join the NGS, use the Information Service (next talk)

– You should find that the Core nodes of the NGS all run the same software

– In general you are likely to use the same machine

• Is my NGS Compute Node account shared across all machines?

– NO – you must synchronise your accounts on different machines yourself. Your account names may be different on each machine. Use GridFTP (from the portal) or gsi-scp.

– You can hold files in the SRB (Storage Resource Broker – see tomorrow) and read/write these from any compute node.


Questions -2

• “Should I stage an executable?” (stage = send it to a compute node from my desktop/UI)

– Only if the UI produces a binary-compatible file!

– Safer to

• Check it compiles locally

• Copy to a compute node

• Compile it there
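The three safer steps above, as a dry-run sketch: `run` prints each command instead of executing it, so the sequence can be read and checked without a grid connection. The compiler, file names, account, host and port are illustrative assumptions.

```shell
# Dry-run sketch: 'run' echoes commands rather than executing them.
run() { echo "$@"; }

# 1. Check the code compiles locally on the UI.
run gcc -Wall -o test test.c

# 2. Copy the *source* (not the binary) to a compute node.
run gsiscp -P 2222 test.c ngs0255@grid-compute.leeds.ac.uk:

# 3. Compile it there, over gsissh, so the binary matches that node.
run gsissh -p 2222 grid-compute.leeds.ac.uk gcc -Wall -o test test.c
```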


Further information

• VDT documentation http://www.cs.wisc.edu/vdt/documentation.html

• NGS user pages http://www.ngs.ac.uk/users/userguide.html


Extras for people who like MPI

(outside scope of today)


.bashrc example, for MPI jobs

[ngs0255@grid-compute ngs0255]$ cat .bashrc
# .bashrc
# Source global definitions
if [ -f /etc/bashrc ]; then
        . /etc/bashrc
fi
module add clusteruser
module remove mpich-gm
module remove intel-compiler
module remove intel-math
module add mpich-gm/1.2.5..10-gnu

Setting the environment to use MPI with the GNU compiler and the batch queue manager. Can then compile with mpicc. See user documentation on the NGS website.

Check loaded modules

[ngs0255@grid-compute ngs0255]$ module list
Currently Loaded Modulefiles:
  1) /dot                 6) /gm/2.0.8
  2) /null                7) /globus/2.4.3
  3) /modules             8) /clusteruser
  4) /cluster-tools/0.9   9) /mpich-gm/1.2.5..10-gnu
  5) /pbs/5.3.3

globus-job-run

Examples

• globus-job-run grid-data.man.ac.uk /bin/date

• globus-job-run grid-data.rl.ac.uk ./test

• globus-job-run grid-compute.leeds.ac.uk/jobmanager-pbs \
    -np 8 -x '(jobtype=mpi)(environment=(NGSMODULES clusteruser))' ./MPI/test

• globus-job-run grid-data.rl.ac.uk -s ./test

globus-job-submit

Examples

• globus-job-submit grid-data.rl.ac.uk/jobmanager-pbs ./test

• globus-job-submit grid-compute.leeds.ac.uk/jobmanager-pbs \
    -x '(jobtype=mpi)(directory=/home/bob/mpi) \
    (environment=(NGSMODULES clusteruser)) \
    (count=8)' ./mpi_program

• globus-job-submit grid-data.man.ac.uk/jobmanager-pbs -s ./test

• globus-job-submit grid-compute.leeds.ac.uk/jobmanager-pbs -np 2 \
    -x '(jobtype=mpi)(directory=/home/data01_b/ngs0255) \
    (environment=(NGSMODULES clusteruser))' ./M3