OGSA Test Grid
Dave Berry, Research Manager
NeSC Review, 18th March 2003

Project Aims
To investigate deployment of OGSA
Service-based Grids
Initially GT3
Moving to GT4 technology preview
Focus on generic issues
Deploy sample applications
Produce information for OMII/GOC/ETF
£20,000 value to NeSC
0.5 FTE for one year
Project Members
Wolfgang Emmerich (UCL) – PI
Paul Brebner – Project Manager
Dave Berry (NeSC)
Oliver Malham – RA
Steven Newhouse (LeSC)
David McBride – RA
Paul Watson (NEReSC)
Savas Parastatidis – Local Manager
Jake Wu – RA
Sister Project (UK)
One of two OGSA Test Grids
Sister project differs slightly in focus
Slower move to WSRF
More emphasis on interoperability testing
Led by Mark Baker (Portsmouth)
Two projects keep in touch
Sister Project (NeSC)
IBM Early Evaluation Project
Test IBM’s OGSI/WSRF releases
Feedback to IBM
Also test Globus releases
Feedback to IBM and Globus
Overlap with OGSA Test Grid
Common testing
Aim to produce test application(s)
£100,000 value to NeSC
1.0 FTE for two years
OGSA Test Grid Status
Project started Dec. 15th
GT3 installed at all sites
Newcastle and UCL had problems with GT3.2 alpha
Fixed after upgrade to GT3.2 beta
Accounts and Certificates installed
Testing basic connectivity
Problems with firewalls, port configurations, etc.
Reverting to remote logins to resolve networking issues
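The basic connectivity checks above can be sketched as a simple probe script. This is an illustrative sketch only: the host/port pairs are placeholders, not the project's real site list, and a real check would use the sites' actual service ports.

```python
# Minimal TCP connectivity probe for spotting firewall/port problems.
# Hosts and ports below are hypothetical stand-ins for the testbed sites.
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical site list -- the real hostnames would come from project config.
sites = [("127.0.0.1", 22)]
for host, port in sites:
    status = "open" if port_open(host, port) else "blocked (check firewall/ports)"
    print(f"{host}:{port} {status}")
```

A probe like this distinguishes "service down" from "port filtered" faster than attempting a full Grid-service call at each site.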
Testing plans
Scripts
Based on GITS from ETF Grid
New performance tests
Central database for results
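The "central database for results" idea can be illustrated with a minimal sketch. The schema and site/test names here are invented for illustration; the project's actual results store is not specified in the slides.

```python
# Sketch: record a timed test result in a central results database.
# Uses an in-memory SQLite store as a stand-in for the shared database.
import sqlite3
import time

def record_result(conn, site, test_name, duration, passed):
    """Insert one test outcome into the shared results table."""
    conn.execute(
        "INSERT INTO results (site, test, duration_s, passed, ts) VALUES (?, ?, ?, ?, ?)",
        (site, test_name, duration, int(passed), time.time()),
    )
    conn.commit()

conn = sqlite3.connect(":memory:")  # stand-in for the central database
conn.execute(
    "CREATE TABLE results (site TEXT, test TEXT, duration_s REAL, passed INTEGER, ts REAL)"
)

start = time.perf_counter()
ok = True  # the outcome of a GITS-style check would go here
record_result(conn, "NeSC", "ping-service", time.perf_counter() - start, ok)

rows = conn.execute("SELECT site, test, passed FROM results").fetchall()
print(rows)
```

Collecting per-site timings in one table is what makes cross-site performance comparisons possible later.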
Application: e-Materials
Simulate growth of crystals
Two services co-ordinated by BPEL
Java and Fortran
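The two-service BPEL coordination can be sketched as a sequential workflow. The service names and payloads below are hypothetical stand-ins, not the real e-Materials interfaces; BPEL would express the same chain as a sequence of two invoke activities.

```python
# Illustrative orchestration of two services in sequence, mirroring what
# the BPEL workflow does for e-Materials. All names are stand-ins.

def setup_service(params):
    """Stand-in for a service that prepares a crystal-growth run."""
    return {"config": params, "ready": True}

def simulation_service(job):
    """Stand-in for a service wrapping the (Fortran) simulation code."""
    assert job["ready"], "setup must complete before simulation"
    return {"steps": job["config"]["steps"], "status": "done"}

def workflow(params):
    """Sequential composition: output of the first service feeds the second."""
    job = setup_service(params)
    return simulation_service(job)

result = workflow({"steps": 100})
print(result["status"])
```

The point of the orchestration layer is that neither service needs to know about the other; only the workflow does.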
Deployment services
How do we deploy components on remote machines?
Investigate remote deployment mechanisms
Working with HP (SmartFrog)
Three possible scenarios…
Scenario 1: Manual Installation
1. Administrators install application on server
2. Client runs remote application
Library consistency checked manually
Deployment tested manually
Security checked manually
Scenario 2: Separate Services
1. Deployment service installs application from Repository
2. Client runs remote application
Library consistency automatically enforced
Automatic test scripts to check deployment
Security by trusting installer
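The Scenario 2 flow can be sketched in a few lines: a deployment service pulls an application from a repository, enforces library consistency, and runs an automated smoke test before any client calls it. The repository contents, library names, and version numbers here are all illustrative.

```python
# Sketch of Scenario 2: automated install from a repository with
# library checks and a deployment smoke test. All names are stand-ins.

REPOSITORY = {
    "e-materials": {"code": lambda x: x * 2, "requires": {"libsim": "1.2"}},
}
INSTALLED_LIBS = {"libsim": "1.2"}  # what the target host provides

def deploy(app_name):
    """Install an application, enforcing its library requirements."""
    app = REPOSITORY[app_name]
    # Library consistency is enforced automatically at install time.
    for lib, version in app["requires"].items():
        if INSTALLED_LIBS.get(lib) != version:
            raise RuntimeError(f"missing {lib} {version} on target host")
    # Automated test script runs before the service is advertised.
    assert app["code"](2) == 4, "deployment smoke test failed"
    return app["code"]

service = deploy("e-materials")  # step 1: deployment service installs
print(service(21))               # step 2: client runs remote application
```

The contrast with Scenario 1 is that the version check and the smoke test are part of the deployment machinery rather than an administrator's checklist.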
Scenario 3: Remote execution service
1. Client sends application to remote server
2. Server runs application and returns result
Library consistency automatically enforced
Automatic test scripts check entire process
Security by trusting client
Possibly requires stronger security (e.g. proof-carrying code?)
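The Scenario 3 trust model can be sketched as follows: the server executes code supplied by the client and returns a result code, so it must first decide whether to trust the submission. The trust check here is a deliberate placeholder for a real mechanism (certificates, or proof-carrying code as the slide suggests), and the client identifiers are invented.

```python
# Sketch of Scenario 3: server runs client-supplied code behind a trust
# check and reports a result code. The trust list is a placeholder for a
# real security mechanism such as certificate-based authorisation.

TRUSTED_CLIENTS = {"client-A"}

def remote_execute(client_id, program, args):
    """Run a client-submitted program if the client is trusted."""
    if client_id not in TRUSTED_CLIENTS:
        return {"result_code": 1, "output": None}  # rejected: untrusted client
    try:
        return {"result_code": 0, "output": program(*args)}
    except Exception:
        return {"result_code": 2, "output": None}  # program raised an error

print(remote_execute("client-A", lambda a, b: a + b, (2, 3)))  # accepted
print(remote_execute("client-B", lambda a, b: a + b, (2, 3)))  # rejected
```

This is where the scenario's security burden shows up: in Scenarios 1 and 2 the server trusts an installer, whereas here it must evaluate arbitrary code from each client.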
IBM Early Evaluation Status
Project started July
Initial learning period
Updated GT3 training materials
Sample application: SAT-Trac
C++ application
First port to Grid Services
Various problems encountered and reported at 6-month point
Second attempt more successful
Some residual security problems
Questions?