Resources for e-Science Bids at NeSC and EPCC
Dr. Dave Berry, Research Manager
www.nesc.ac.uk
19th November 2003

The "e-" in "e-Science"
- E-Science bids must include "Grid stretch": new technology enabling new science
- Collaboration is key: joint applications
- We need funding too!
- e-Science knowledge and links
- Advice and help with proposals (few staff, many proposals)
- Shared understanding: application scientists must understand the Grid, and Grid researchers must understand the application

Open Grid Services Architecture
- Applications for X research: simulation, analysis & integration technology for X
- OGSA services: Brokering, Integration, VOs, Transactions, Reservation, Discovery, Replication, Accounting, Workflow, Queueing, Registry, Data Access, CMM/WSDM, Provisioning, Authorisation, Execution, WS-Agreement
- OGSI: interface to Grid infrastructure
- Distributed compute, data & storage resources

What exists now (roughly)...
- Applications for X research
- Services: Workflow, Registry, Data Access, Authorisation, Execution, WS-Agreement
- OGSI: interface to Grid infrastructure
- Distributed compute, data & storage resources

NeSC in the UK (map of centres)
- Centres: Edinburgh, Glasgow, Newcastle, Belfast, Daresbury Lab, Manchester, Cambridge, Hinxton, Oxford, Cardiff, RAL, London, Southampton
- Bodies and activities: Globus Alliance, National e-Science Centre, HPC(x), Directors' Forum, Engineering Task Force, Grid Support Centre, Architecture Task Force, Database Task Force, OGSA-DAI, GGF DAIS-WG, e-Science Institute

NeSC Core Grid Research
- Research leaders and research projects
- UK e-Science Grid: Engineering Task Force, OGSA Test Grid
- Grid projects: BRIDGES, IBM Early Evaluation, edikt (www.edikt.org)

EPCC
- HPC and Grid expertise: HPCx, Globus Alliance
- Professional software development
- Successful Grid projects:
  - OGSA-DAI (databases, with IBM and Oracle)
  - SunDCG (scheduling, with Sun and Globus)
  - MS.NETGrid (OGSI, with Microsoft)
  - GridWeaver (configuration, with HP & Informatics)
  - PGPGrid (imaging, with Pepper's Ghost Productions)
  - FirstDIG (data mining, with FirstBus plc)

Hardware
- HPCx (EPCC/CLRC): UK National Service (portion of CPU time for Grid research); AIX, 1280 CPUs, 1.28TB RAM, 18TB disk; guaranteed levels of service
- Gilmour (NeSC): main development machine; Linux, 4 CPUs, 7GB RAM, 1TB disk
- BlueDwarf (NeSC/Informatics): large-memory computing; AIX, 16 CPUs, 128GB RAM, 2.1TB+ disk
- Lomond (EPCC): HPC resource (portion of CPU time for Grid research); Solaris, 52 CPUs, 52GB RAM; guaranteed levels of service

Other Resources
- Access Grid: two nodes, at EPCC and NeSC
- e-Science Institute: research visitors, events programme
- NeSCForge: project repository with CVS, issue & task tracking, forums, etc.
- GridNet: funds travel to standardisation meetings (e.g. GGF)

Web Site
- National e-Science Centre: http://www.nesc.ac.uk/
  - Mission, background, foundation, locations, staff, resources, projects
  - Register interest, mailing lists, NeSCForge
  - Regional associations and collaborations
  - News and notices
- Presentations and lectures: http://www.nesc.ac.uk/presentations/
- National e-Science Institute: http://www.nesc.ac.uk/esi/
  - Mission, events (future and past), event registration, Visitor Programme
- UK e-Science map and index of centres: http://www.nesc.ac.uk/centres/
- Technical papers: http://www.nesc.ac.uk/technical_papers/
- Index of >100 projects: http://www.nesc.ac.uk/projects/
- Task Forces: http://www.nesc.ac.uk/teams/
- Careers: http://www.nesc.ac.uk/career/
- General information: glossary, bibliography, who's who