ACTIVITY AND MANAGEMENT REPORT
Final Report, 1 May 2008 to 30 April 2010

Document Filename: BGII-DNA1.5-KTH-Activity-and-Management-Report-Y1v0.1
Activity: NA1
Partner(s): KTH
Lead Partner: KTH
Document classification: PUBLIC

Abstract: Activity and management report for the BalticGrid-II project. The report covers the full project duration, 24 months, 1 May 2008 to 30 April 2010.

BGII-DNA1.9-KTH-Activity-andManagement-Report-FINALv0.1.doc
INTERNAL
Page 1 of 74
Document review and moderation

                              Name  Partner  Date        Signature
Released for moderation to:   PMB   All      14/04/2009
Approved for delivery by:     PMB   All      30/04/2009
Document Log

Version  Date        Summary of changes                                              Author
0.4      29/04/2009  Merge of document with Activity and Management Report,          Åke Edlund
                     to form the DNA1.5 document
0.5      29/04/2009  Update of VU data, and BNTU presentations at EGEE UF            Algimantas Juozapavicius, Ihar Miklashevich
0.6      30/04/2009  Added project metric in NA1, harmonizing the text               Åke Edlund, All
0.7      30/04/2009  Financial data added, typos fixed and general info added        All
1.0      30/04/2009  Final released version                                          Åke Edlund
1.1      28/05/2009  Added Publishable Summary                                       Åke Edlund
Contents
1. PUBLISHABLE EXECUTIVE SUMMARY – YEAR ONE ......................................................................... 4
2. ACTIVITIES PROGRESS DURING THE FIRST TWELVE MONTHS .................................................. 8
2.1. NA1 – MANAGEMENT OF THE PROJECT ................................................................................................................8
2.2. NA2 – EDUCATION, TRAINING, DISSEMINATION AND OUTREACH ................................................................ 11
2.3. NA3 – APPLICATION IDENTIFICATION AND SUPPORT ..................................................................................... 19
2.4. NA4 – POLICY AND STANDARDS DEVELOPMENT ............................................................................................. 22
2.5. SA1 – GRID OPERATIONS...................................................................................................................................... 23
2.6. SA2 – NETWORK RESOURCE PROVISIONING ..................................................................................................... 27
2.7. SA3 – APPLICATION INTEGRATION AND SUPPORT .......................................................................................... 33
2.8. JRA1 – ENHANCED APPLICATION SERVICES ON SUSTAINABLE E-INFRASTRUCTURE .................................. 37
3. JUSTIFICATION OF MAJOR COST ITEMS AND RESOURCES ......................................................... 47
3.1. WORK PERFORMED BY EACH PARTNER DURING THE PERIOD ......................................................................... 47
3.1.1. KTH....................................................................................................................................................................... 47
3.1.2. EENET .................................................................................................................................................................. 48
3.1.3. KBFI...................................................................................................................................................................... 49
3.1.4. IMCS UL............................................................................................................................................................... 50
3.1.5. IFJ PAN ................................................................................................................................................................ 51
3.1.6. PSNC..................................................................................................................................................................... 52
3.1.7. VU.......................................................................................................................................................................... 53
3.1.8. RTU ....................................................................................................................................................................... 54
3.1.9. ITPA...................................................................................................................................................................... 55
3.1.10. CERN ................................................................................................................................................................. 57
3.1.11. NICH BNTU ..................................................................................................................................................... 58
3.1.12. UIIP .................................................................................................................................................................... 59
3.1.13. VGTU ................................................................................................................................................................. 60
3.2. BUDGETED COSTS AND ACTUAL COSTS .............................................................................................................. 62
3.2.1. BUDGETED PERSON MONTHS AND ACTUAL PERSON MONTHS .................................................................. 63
3.3. REBUDGETING ......................................................................................................................................................... 65
4. ANNEX ........................................................................................................................................................... 66
4.1. USE OF FOREGROUND AND DISSEMINATION ACTIVITIES DURING THE FIRST YEAR OF THE PROJECT ....... 66
4.2. REBUDGETING ......................................................................................................................................................... 73
4.3. LIST OF ACRONYMS ................................................................................................................................................. 74
1. PUBLISHABLE EXECUTIVE SUMMARY – YEAR ONE
The Baltic Grid Second Phase (BalticGrid-II) project is designed to increase the impact, adoption and reach of the Baltic States e-Infrastructure created by BalticGrid-I, and to further improve the support of its services and users.
This has been achieved by extending the Baltic Grid infrastructure to Belarus; by initiating efforts on the interoperation of the gLite-based infrastructure with UNICORE- and ARC-based Grid resources in the region; by identifying and addressing the specific needs of new scientific communities such as nano-science and the engineering sciences; and by establishing new Grid services for linguistic research, Baltic Sea environmental research, data mining tools for communication modeling, and bioinformatics.
The goal of the second phase is to ensure that the Baltic States e-Infrastructure will be fully interoperable with the pan-European e-Infrastructures established by EGEE, EGEE-associated projects, and the planned EGI, aiming at a sustained e-Infrastructure in the Baltic region.
The present Baltic Grid e-Infrastructure of 26 clusters in five countries has more than doubled since the end of BalticGrid-I, and is envisaged to grow further, in both the capacity and the capability of its computing resources.
The consortium is composed of 13 leading institutions in seven countries: seven institutions in Estonia, Latvia and Lithuania, two in Belarus, two in Poland, and one each in Sweden and Switzerland.
The overall vision is to support and stimulate scientists and services in the Baltic region in conveniently accessing critical networked resources both within Europe and beyond, and thereby to enable the formation of effective research collaborations.
Table 1 – BalticGrid Partners

KTH (Coordinator) – Kungliga Tekniska Högskolan, Stockholm, Sweden
The Royal Institute of Technology (Kungliga Tekniska Högskolan), KTH, is responsible for one-third of Sweden's capacity for engineering studies and technical research at post-secondary level. The acting partner from KTH in this project is PDC, the Centre for Parallel Computers. As coordinator, KTH leads the NA1 activity and contributes to NA2, NA4, SA1, SA2 and SA3.

EENet – Estonian Educational and Research Network, Tartu, Estonia
EENet is the Estonian NREN. The mission of EENet is to provide a high-quality national network infrastructure for the Estonian research and educational communities. EENet leads the SA3 activity and contributes to NA2, NA3, NA4, SA1 and SA2.

KBFI – Keemilise ja Bioloogilise Füüsika Instituut, Tallinn, Estonia
KBFI (named NICPB in BG-I) is an autonomous public research institution with about 140 employees. Its main research areas are chemical physics, biology and biotechnology, particle and nuclear physics, and computational physics and chemistry. KBFI is a collaborator of the Compact Muon Solenoid experiment at CERN. KBFI leads the SA1 activity and contributes to NA2, NA3, SA2 and SA3.

IMCS UL – Institute of Mathematics and Computer Science, University of Latvia, Riga, Latvia
IMCS UL is the leading research institution for mathematics and computer science in Latvia. The institute also operates the national research and education network LATNET. IMCS UL leads the SA2 activity and contributes to NA2, NA3, NA4, SA1 and SA3.

IFJ PAN – Instytut Fizyki Jadrowej im. Henryka Niewodniczanskiego, Polskiej Akademii Nauk, Kraków, Poland
The Henryk Niewodniczanski Institute of Nuclear Physics of the Polish Academy of Sciences is one of the largest research institutions in Poland. It employs about 200 researchers, mainly working on high-energy and elementary particle physics, the physics of nuclear structure and of nuclear reaction mechanisms, studies of the structure, interactions and properties of condensed matter, and applications of nuclear methods in geophysics, radiochemistry, medicine, biology, environmental physics and materials engineering. IFJ PAN leads the NA2 activity and contributes to SA1 and SA3.

PSNC – Poznan Supercomputing and Networking Center, Poznan, Poland
PSNC is a leading HPC centre in Poland, as well as a Systems and Network Security Center and an R&D centre for new-generation networks, Grids and portals. PSNC is the operator of the Polish National Research and Education Network PIONIER and of the POZMAN optical city network. PSNC leads the JRA1 activity and contributes to the NA2 and SA3 activities.

VU – Vilnius University, Vilnius, Lithuania (http://www.vu.lt)
Vilnius University, the leading university in Lithuania, is involved in fundamental and applied research in mathematics, physics, chemistry, biology and many other fields. The faculties of Vilnius University involved in this project are the Faculty of Mathematics and Informatics, the Faculty of Chemistry, the Faculty of Physics, and the Institute of Material and Applied Sciences. VU leads the NA3 activity and contributes to the NA2, NA4, SA2, SA3 and JRA1 activities.

RTU – Riga Technical University, Riga, Latvia
Riga Technical University (RTU) is the oldest and at present the second largest institution of higher education in Latvia, offering advanced study programs in engineering, technology, natural sciences, architecture and business administration. RTU leads the NA4 activity and contributes to NA2, NA3, SA2 and SA3.

ITPA – Vilnius University Institute of Theoretical Physics and Astronomy, Lithuania
ITPA is engaged in fundamental physics and astronomy research, such as theoretical particle physics calculations, high-precision spectroscopy of complex atoms and ions involving several open electron shells, the structure of, and processes within, molecules and their complexes, and the modelling of processes in stars. ITPA contributes to the NA2, NA3, NA4, SA2 and SA3 activities.

CERN – The European Organization for Nuclear Research, Geneva, Switzerland
CERN, the European Organisation for Nuclear Research, funded by 20 European nations, is currently engaged in constructing a new particle accelerator, the Large Hadron Collider (LHC), which will be the most powerful machine of its type in the world, providing research facilities for several thousand High Energy Physics (HEP) researchers. CERN contributes to the SA1 and SA3 activities.

NICH BNTU – Research Division of the Belarusian National Technical University
The Belarusian National Technical University (BNTU) is the leading institution in higher technical education of the Republic of Belarus, offering advanced educational programs in engineering, technology, natural sciences, architecture and business administration.

UIIP NASB – United Institute of Informatics Problems of the National Academy of Sciences of Belarus
The UIIP NASB is the leading Belarusian institution carrying out fundamental and applied research in the fields of information technology, computer science, applied mathematics, computer-aided design and related fields.

VGTU – Vilnius Gediminas Technical University
VGTU is the third largest university in Lithuania. It includes 8 faculties, the Aviation Institute, 10 research institutes and 19 research laboratories. VGTU was the first university in Lithuania to work with HPC: an IBM SP in 1998, a Beowulf cluster in 2002, the Lithuanian record for the best parallel performance on the "Vilkas" cluster in 2005, and a Grid cluster in 2006.
During the second half of the first year a focused effort was made to involve industry, resulting in the Baltic Grid Innovation Lab (BGi), which aims at educating early-stage startups in the use of Baltic Grid resources, mainly through a cloud interface, the Baltic Cloud. Baltic Cloud and BGi are subprojects within Baltic Grid. Specialized courses on cloud computing (using Baltic Grid resources) and on startup know-how were initiated during the first year. The goal is to increase the competitiveness of SMEs in the region, and to build a network of early-stage startups in the Baltic States and Belarus.

A series of events has been held introducing Grid technologies and their usage to the scientific community in the Baltic States and Belarus: Grid Open Days to attract a general audience, tutorials giving scientists and students hands-on experience, and seminar series to stimulate interest in Grids.
The coordinator of Baltic Grid is Åke Edlund (KTH), phone +46 8 790 6312, edlund@pdc.kth.se.
2. ACTIVITIES PROGRESS DURING THE FIRST TWELVE MONTHS
2.1. NA1 – MANAGEMENT OF THE PROJECT
Objectives of the Activity
The objectives of the NA1 activity are the overall project management and reporting to the EU, including the daily management of project activities, resource allocation, monitoring, conflict resolution and corrective actions.
Progress Towards Objectives
Deliverables in the first year of the project were all on time, and the activities are evolving according to plan. Cross-national groups on NGIs and cloud computing have been initiated and have evolved successfully. All-hands meetings took place during the reporting period: the kick-off in Vilnius in May, followed by the first BalticGrid meeting ever held in Belarus (Minsk, late October) and a meeting in Riga in May 2009. The introduction of the new activity SA3 and its collaboration with NA3, and of the new partners UIIP, BNTU and VGTU, went without complications.
During the first year BG-II has strengthened its collaboration with EGEE, formalizing an MoU with specific tasks and deadlines. In parallel, the project has signed MoUs with UNICORE and NorduGrid (ARC) to ensure close collaboration with these middlewares. In addition, the Memorandum of Usage between the BalticGrid-II and BELIEF-II projects was signed by Åke Edlund and the BELIEF-II project coordinator in October 2008; the document defines the ways of cooperation between the two projects.
New industry engagement initiative – the Baltic Grid Innovation Lab (BGi)
Background
The need to engage industry in using the BG-II resources, in order to better learn how to build a competitive business, has been pointed out a number of times, both internally in the project and from outside, e.g. at the last two reviews. This is a common challenge for all grid-related projects, and one in which little progress has been seen so far. In late 2008 NA1 initiated discussions on participation in an EU SME project call with a mid-December deadline, but given the low interest from industry the decision was made not to pursue this and to look at alternatives.
The Initiative
A specific initiative to engage industry in the region was launched at the very beginning of January 2009: the BalticGrid Innovation Lab (BGi). The BGi was established to build a network of BG-II-aware smaller companies, as a bottom-up approach. The top-down approach, i.e. trying to engage larger or mid-size companies to use grid technology, has not shown success to date.
Figure 1 – BGi in relation to BG-II
The driving motor of the BGi initiative is a series of courses for final-year students and early-stage startups in the region. These courses teach users how to use cloud computing to migrate their current IT infrastructure, and how to use cloud computing to create new services for their users and customers. The course is bundled with basic innovation management knowledge. Alumni of these courses will form a network of SMEs in the region: a network of companies that know the value of cloud computing for their business.
To support this course, and its users' usage of the BG-II resources, the Baltic Cloud is being established. The Baltic Cloud is a resource interface for BGi users and for research groups in the region who prefer to access the BG-II resources through cloud interfaces rather than grid interfaces. The Baltic Cloud is built on top of BG-II using open-source software. During the first three months, cloud instances were rolled out in all Baltic States as well as in Belarus. A number of presentations were given on this effort (see NA2), and the first version of the course was prepared, to be given at Tartu University in May 2009.

The BGi and Baltic Cloud initiative has attracted favourable attention from other grid initiatives, e.g. EGEE-III.
Deviations from Plan
National support for NGIs is uneven in the region. On one hand there is LitGrid, the well-established NGI in Lithuania; LitGrid was one of the bidders to host EGI.org itself, a good sign of maturity. On the other hand, in Estonia and Latvia there is no decision at the political level to support NGIs. In the Estonian case this is especially remarkable, considering the strong and repeated support from our External Advisory Committee, led by Professor Eike Jessen. For Belarus, too, it is not yet clear whether an NGI will be in place for the launch of EGI, a launch that coincides with the end of BG-II. BG-II is collaborating with EGEE and related projects to find ways to ease the transition from EGEE and BG-II to EGI.
Project metric

The project metrics below monitor the progress of the project towards achieving the BalticGrid-II objectives.

Deliverables acceptance by the EU, reported in the Quarterly Reports' NA1 section:

Deliverables                               Q1  Q2  Q3  Q4  Q5  Q6  Q7  Q8
Number of due deliverables                  6   8   6   9   3   6   4   9
Indicator: PMB approved, and sent to
the EC on time                              6   8   6   9

Acceptance by the EU: the acceptance of Q1-Q4 is known after the first review by the Commission; the acceptance of Q5-Q8 is known after the final review.
Milestones acceptance by the Project Management Board (PMB), reported in the Quarterly Reports' NA1 section:

Milestones                                 Q1  Q2  Q3  Q4  Q5  Q6  Q7  Q8
Number of due milestones                    0   3   4   3   1   2   3   2
Indicator: PMB approved on time             0   3   4   3

Acceptance by the PMB: the acceptance of Q1-Q4 is known after the first review.
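The on-time record in these tables is a matter of simple arithmetic. The sketch below is an illustration only, not part of the original report: the quarterly counts are copied from the two tables above, and the `ontime_rate` helper is a hypothetical reading aid, not project tooling.

```python
# Quarterly counts copied from the deliverables and milestones tables above (Q1-Q4).
due_deliverables = [6, 8, 6, 9]      # deliverables due per quarter
ontime_deliverables = [6, 8, 6, 9]   # PMB approved and sent to the EC on time

due_milestones = [0, 3, 4, 3]        # milestones due per quarter
ontime_milestones = [0, 3, 4, 3]     # PMB approved on time

def ontime_rate(due, ontime):
    """Fraction of items handled on time over the given quarters."""
    total = sum(due)
    return sum(ontime) / total if total else 1.0

print(f"Deliverables on time (Q1-Q4): {ontime_rate(due_deliverables, ontime_deliverables):.0%}")
print(f"Milestones on time (Q1-Q4):   {ontime_rate(due_milestones, ontime_milestones):.0%}")
```

Both rates come out at 100% for the first year, consistent with the statement under "Progress Towards Objectives" that all first-year deliverables were on time.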
Id      Deliverable / Milestone title                                   Lead partner  Original delivery date(*)  Revised delivery date(*)  Status(**)
DNA1.1  Quality and gender action plan                                  KTH           3                          3                         PMB approved
DNA1.2  Quarterly progress report                                       KTH           3                          3                         PMB approved
DNA1.3  Quarterly progress report                                       KTH           6                          6                         PMB approved
DNA1.4  Quarterly progress report                                       KTH           9                          9                         PMB approved
DNA1.5  Yearly activity (progress) and management (financial) reports   KTH           12                         12                        PMB approved

(*) Dates are expressed in project months (1 to 24).
(**) Status = Not started – In preparation – Pending internal review – PMB approved
(***) Nature = R (Report), P (Prototype), D (Demonstrator), O (Other); Deliverable id for a milestone attached to a deliverable.
2.2. NA2 – EDUCATION, TRAINING, DISSEMINATION AND OUTREACH
Objectives of the Activity
The NA2 team acquired significant experience in performing various tasks in the field of ETDO (Education, Training, Dissemination and Outreach) activities during the BalticGrid project. It has therefore been possible to continue these activities in BalticGrid-II with greater knowledge and more capability.
The main objective of NA2 is to ensure expertise sharing and active dissemination of the benefits of Grid-empowered e-Infrastructures (especially of BalticGrid-II), and to carry out the education necessary to build active and successful user communities in the Baltic States and Belarus. An important task in BalticGrid-II is therefore to extend the audiences targeted by the ETDO activities to potential users and researchers from new scientific communities in the Baltic States and Belarus. This allows the distribution of information on the Project, and the spreading of knowledge about Grid technology, to be unified across all Partners' countries. At the same time, the dissemination of the Project's results and achievements has continued to a greater extent outside the Baltic States, in Europe and at other events, whenever the Project representatives have had such an opportunity.
NA2 collaborated with other Grid projects (mainly for the development of education and training
material) and with existing national and regional organisations (especially with National Grid
Initiatives in Partners’ countries) to have a broader scope of dissemination activities. NA2 also
disseminated the achievements of Special Interest Groups (SIGs) already started in the BalticGrid
project and developed in the framework of BalticGrid-II.
The goal of the collaborations was to ensure the effective development of a broad range of materials, and that potential users in all relevant disciplines were properly informed about, and assisted in, using BalticGrid-II and the resources available through it. Through the conferences, summer school and other events organized by BalticGrid-II (or planned for the second year of the Project), application users and developers have opportunities to familiarize themselves with Grid technologies and the achievements of BalticGrid-II. These users can also establish initial contacts with Grid experts, as face-to-face meetings with them (e.g. developers of gLite, Gridcom, MD) are possible during the above-mentioned events.
The activities of NA2 are structured to:
i) propagate Grid services and application success stories to new research communities in academia and industry,
ii) provide training for the formation of staff skilled in Grids (Grid administrators),
iii) provide training for users at various levels: for beginners (especially users from new communities) and, at a more advanced level, mostly for application developers.
The dissemination efforts carried out cooperatively among all partners under IFJ PAN leadership will
result in the delivery of rich and high-quality information.
The NA2 activity is split into two tasks: “Dissemination and Outreach” and “Education and Training”.
Each task focuses on a different aspect of the activities, which are being done in close cooperation
with all Project partners, and especially with the Project management.
Task 1: Dissemination and Outreach
The primary aim of this task is sharing information on the benefits of Grids, especially the
BalticGrid-II project, to broad communities of potential users in academia and the public sector
with special emphasis on communities in the Baltic States and Belarus. This task is also assuring
communication of Project progress among partners. The dissemination is being managed coherently
with the deployment and operation of the Project infrastructure (SA1) to assure that dissemination
and education properly address current and near term offerings of BalticGrid-II. The dissemination
activity presents the BalticGrid-II offer to particular user groups and research disciplines. The
requirements and interests of said groups will be identified through specially elaborated forms,
available from the BalticGrid-II website and also sent to any interested parties.
Task 2: Education and Training
This task handles all matters related to imparting general knowledge of, and practical skills in, Grid technologies to users, in order to ensure their success. The task also ensures expertise sharing between partners, to improve the skills and understanding of the teams that work with the BalticGrid-II infrastructure. The objectives of this task are achieved through the organisation of seminars, tutorials and a Grid summer school covering various topics of Grid computing and e-science. The events are conducted by the more experienced Project partners or by invited external speakers.
Progress Towards Objectives
All Project events organized and deliverables completed during the first 12 months of the Project are described below.
Conferences
BalticGrid-II Grid Open Day, Vilnius, Lithuania, 13 May 2008
The Grid Open Day in Vilnius was organized for a wide academic audience, to show the benefits of grid usage. Speakers from the BalticGrid-II project introduced the grid infrastructure, shared experience of deploying the grid, and presented the future of grid technologies. Presentations about HPC solutions were given by representatives of IBM, Sun Microsystems and the Lithuanian IT company BGM.
BalticGrid-II Kick-off Meeting, Vilnius, Lithuania, 14 – 15 May 2008
The BalticGrid-II Kick-off meeting was held in Vilnius on the premises of the Lithuanian Academy of Sciences, and was organized by VGTU. The aim of the meeting was to bring all Partners together for the first time and to explore the new Project activities for broadening and improving the BalticGrid-II infrastructure.

Participants from the three Baltic States, Poland, Switzerland, Sweden and Belarus attended the meeting. Most were partners of the BalticGrid-II project, but people from various Lithuanian academic institutions participated in the event as well. The Kick-off meeting took two days (14 - 15 May). The first day started with presentations by the Activity Leaders (ALs), who introduced the main tasks, milestones, deliverables and responsibilities of the Partners. Parallel sessions (for working in small groups) were organized on the second day.
The Second International Conference "Supercomputer systems and applications" (SSA’2008),
Minsk, Belarus, 27 – 29 October 2008
The aim of the SSA'2008 conference was to exchange experience in the field of fundamental and applied research on supercomputer systems development, grid technologies, contemporary technologies of parallel computation, and new technologies for the allocation of computational resources. The conference proceedings contained 61 papers by authors from around the world, 23 of them authored or co-authored by UIIP NASB scientists. The majority of the SSA'2008 papers were devoted to problems in implementing the main components of high information technologies, such as access to supercomputer computational resources using grid technology and the selection and application of security technologies, as well as to scientific and practical results obtained within the Russia-Belarus program "SKIF-GRID".
BalticGrid-II Grid Open Day, Minsk, Belarus, 29 October 2008
The Grid Open Day was a free-entrance event, mainly targeted at all those interested in information technologies and distributed data processing, users in all scientific and technological areas, and the general public.

Grid professionals from the Baltic States, Belarus, Finland, Poland, Russia and Sweden demonstrated to the wide audience what grid computing is, grid and HPC computing initiatives in the EU, Russia and Belarus, the state of the art, middleware and tools, and grid applications.
1st BalticGrid All-Hands Meeting, Minsk, Belarus, 29 – 30 October 2008
The aim of the meeting was to assess and discuss, together with all Partners, the BalticGrid-II project results achieved in the first two quarters, and to detail the further Project activities. In total, 75 participants from 7 countries (the Baltic States, Poland, Sweden, Switzerland and Belarus) took part in the meeting.
67th conference of the University of Latvia, ICT section, Riga, Latvia, 16 February 2009
In the framework of the annual conference of the University of Latvia, the ICT section was organized by IMCS UL. It featured 8 presentations, including one about the BalticGrid-II project achievements and another about Xen virtualization.
Workshops
The Eighth Cracow Grid Workshop (CGW'08), Cracow, Poland, 13 – 15 October 2008
CGW'08, as the Central European Grid Consortium event, provided an opportunity for an overview of research in the main European and national grid projects. This year's Workshop especially addressed the issues of National and European Grid Initiatives. On the first day of the Workshop, presentations of several NGIs, as well as of EGI-DS, were given. The speakers included: P. Aerts, N. Geddes, J. Gomes, J. Kitowski, D. Kranzlmueller, J. Marco, M. Mazzucato, A. Read, U. Schwiegelshohn, and G. Wormser.
The keynote speakers of the event included: Mário Campolargo (EU IST Brussels, Belgium), Denis
Caromel (University of Nice-Sophia Antipolis, INRIA, CNRS, IUF, France), Ewa Deelman,
(Information Sciences Institute and Computer Science Department, University of Southern California,
USA), Fabrizio Gagliardi (Microsoft Research, Switzerland). The invited speaker was Krzysztof
Góźdź (Hewlett Packard).
About 70 contributed papers on the above topics were presented during the three-day event. The workshop also included a tutorial organized by the Institute of Computer Science AGH and ACC Cyfronet AGH, titled "Environment for Collaborative e-Science Applications".
Three posters concerning the BG-II topics were presented at this event by Alexander Nedzved from
UIIP NASB: “Belarusian National Grid-Initiative”, “Setting up SKIF-UNICORE Experimental Grid
Section” and “Establishing SKIF-gLite Grid Infrastructure”.
Z. Mosurska, M. Turala and H. Palka from IFJ PAN were heavily involved in the preparation of the
event, in particular in the organization of "EGI-NGI day" (Monday, 13 October 2008) and related
discussions. They also participated in the remaining two days of the workshop (14 - 15 October 2008).
SKIF-Grid Program Workshop, Minsk, Belarus, 28 October 2008
The aim of the SKIF-Grid Program Workshop was to share experience in the field of:
 applied research on supercomputer systems development,
 grid-technologies,
 contemporary technologies of parallel computations,
 new technologies of computation resources allocation, providing efficient access to
supercomputer computational resources using grid-technology,
 selection and application of technologies providing security,
 scientific and practical results obtained within Russia-Belarus program "SKIF-GRID".
BG-II infrastructure for computational meteorology workshop, Tallinn, Estonia, 3 December 2008
Modern scientific and applied meteorology is getting to consume more and more computational
power, more complicated data storage and distribution models. There are two main purposes behind
the need: (i) more accurate numerical modeling of weather for scientific and prediction bases and (ii)
data flow from satellites and terrestrial monitors are growing exponentially and it needs CPU-intensive
automated processing and storage. In Estonia, EMHI is the responsible organization for official
weather forecast, applied meteorology and hydrology in national level. Also, EMHI is carrying on the
scientific research program in collaboration with the local universities. In the workshop, the BG-II
partner KBFI (represented by Mario Kadastik and Ilja Livenson) introduced the BG-II computational
and data storage infrastructure to the scientists from EMHI. The usability of the BG-II infrastructure
for some concrete applications at EMHI was discussed.
Tutorials
BG-II tutorial for Grid beginners, Minsk, Belarus, 31 October – 1 November 2008
The purpose of the tutorial was to introduce grid beginners to the principles of grid computing and to Grid security principles and practices, with exercises on grid-enabling applications. The tutorial required rudimentary knowledge of Linux and Linux system administration, networking, and basic programming principles.
The first day covered the following topics: introduction to grid computing, effective application enabling on the gLite-based grid, and MPI in the BalticGrid-II environment. The second day covered: principles of Grid computing, exercises on application grid-enabling, and grid security principles and practices. The tutorial was conducted by lecturers and trainers from BG-II Partners' institutions (Bartek Palak - PSNC, Rolandas Naujikas - VU, Tomasz Szepieniec - IFJ PAN, Oleg Tchij, Siarhei Salamianka and Yury Ziamtsou - UIIP NASB).
Tutorial – How to start using Grid, Riga, Latvia, 12 November 2008
Guntis Barzdins from IMCS UL gave a tutorial for first-semester students of the Computer Science Master programme on how to start using the Grid. These students later obtained certificates and used grids for their homework.
Tutorial for new grid users at VU, Vilnius, Lithuania, 12 November 2008
The following topics were presented during this tutorial: an introduction to the principles of grid computing, a gLite tutorial, a tutorial on Java on the Grid, Gridcom, and g-Eclipse. The tutorial was conducted by lecturers and trainers from VU: Rolandas Naujikas, Eduardas Kutka, Linas Butenas and Danila Piatov.
Tutorial: Parallel calculation in the network of Grid, Panevezys, Lithuania, 11 December 2008
This event was organized for grid users at Panevezio College of Higher Education. The tutorial was delivered by Eduardas Kutka from VU and consisted of two parts: “How to use the LitGrid network?” and “Calculation in the LitGrid network”.
Grid tutorial for beginners, Riga, Latvia, 22 December 2008
The tutorial provided initial practical training in Grid and gLite middleware basics for beginners with little prior familiarity with grid technology. Participants were introduced to the basic gLite commands for creating a grid proxy certificate, using the grid information services, submitting computational jobs to the Grid and retrieving their output. Trainees were also shown how to submit computational jobs written in the MATLAB programming language. The tutorial was conducted by Janis Kulins and Lauris Cikovskis.
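For readers unfamiliar with these commands, the sequence taught in such gLite beginners' tutorials can be sketched as follows; the VO name (balticgrid) and file names are illustrative assumptions, not taken from the tutorial materials:

```shell
# Illustrative gLite command sequence (VO name and file names are assumed).

# 1. Create a short-lived VOMS proxy from the user's grid certificate.
voms-proxy-init --voms balticgrid

# 2. Describe the job in a JDL (Job Description Language) file.
cat > hello.jdl <<'EOF'
Executable    = "/bin/hostname";
StdOutput     = "hello.out";
StdError      = "hello.err";
OutputSandbox = {"hello.out", "hello.err"};
EOF

# 3. Submit the job (-a: automatic proxy delegation; -o: save the job ID).
glite-wms-job-submit -a -o job.id hello.jdl

# 4. Poll the job's state until it reaches Done.
glite-wms-job-status -i job.id

# 5. Retrieve the output sandbox once the job has finished.
glite-wms-job-output -i job.id
```

The same pattern (proxy, describe, submit, poll, fetch) applies to the MATLAB jobs mentioned above; only the Executable and sandbox entries change.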
Seminars
Title | Date and place
Complex Systems Simulation of HIV | 15.05.2008, Krakow, Poland
Grid technologies in Latvia | 23.05.2008, Riga, Latvia
“GRID Systems” seminar course | May – June 2008, Tartu, Estonia
“Basics of Cluster and Grid Computing” seminar course | May – June 2008, Tartu, Estonia
Principles of HIV Drug Resistance | 5.06.2008, Krakow, Poland
Latvian Grid seminar “Grid network and its usage” | 13.06.2008, Riga, Latvia
Malleability, Migration and Replication for Adaptive Distributed Computing over Dynamic Environments | 19.06.2008, Krakow, Poland
Parallel computing with MATLAB | 28.08.2008, Riga, Latvia
Advanced Data Mining and Integration on the Grid: Models and Languages | 9.10.2008, Krakow, Poland
LitGrid and BalticGrid projects: infrastructure, services, applications | 15.10.2008, Vilnius, Lithuania
LitGrid and BalticGrid projects: infrastructure, services, applications | 20.10.2008, Vilnius, Lithuania
Massive Non Natural Proteins Structure Prediction Using Grid Technologies | 30.10.2008, Krakow, Poland
Grid infrastructure in Europe – EGI (EENET's weekly seminar) | 20.11.2008, Tartu, Estonia
Seminar for potential grid users from VGTU | 21.11.2008, Vilnius, Lithuania
Introduction about IMCS UL activities | 24.11.2008, Riga, Latvia
On development of e-Infrastructures in European Union | 28.11.2008, Vilnius, Lithuania
ETICS - Software Engineering Infrastructure and Quality Process | 11.12.2008, Krakow, Poland
BalticGrid-II project overview (EENET's winter seminar 2008) | 11.12.2008, Tartu, Estonia
Seminar on GÉANT2/3 and BG-II potential collaboration | 21.01.2009, Riga, Latvia
Basics of Cluster and Grid Computing course | February – June 2009, Tartu, Estonia
OGF25/EGEE User Forum in Catania summary | 11.03.2009, Riga, Latvia
Grid application for science | 19.03.2009, Riga, Latvia
Grid application in medicine | 8.04.2009, Riga, Latvia
The Principles and Semantics of Transactional Memory | 16.04.2009, Krakow, Poland
Cloud Computing - pay-as-you-go computing explained | 21.04.2009, Stockholm, Sweden
Exhibitions
EGEE’08 Conference and Exhibition, Istanbul, Turkey, 22-26 September 2008
Zofia Mosurska, Robert Pajak, Bartek Palak and Dana Ludviga managed the BalticGrid-II booth during EGEE’08. The following materials were presented at the booth: the BG-II movie introducing the Project, its Partners, the pilot applications and other applications of interest to the Project; demos with examples of BG-II applications running in the framework of the Migrating Desktop; a set of brochures; a poster; and pens.
Bartek Palak and Dana Ludviga also managed the Project demo stand (titled “Using the BalticGrid-II infrastructure: SemTi-Kamols - a linguistic analyzer of Latvian language”) on 22-23 September 2008.
ICT’08 conference, Lyon, France, 25-27 November 2008
The BELIEF-II project members managed the common “BELIEF-II: Global e-Infrastructure” booth during the ICT’08 conference, where, among other materials, the BalticGrid-II materials (including the movie) were presented.
EGEE User Forum and OGF25, Catania, Italy, 2-6 March 2009
The general Project stand at the conference exhibition was managed by Zofia Mosurska and Robert Pajak. The following materials were presented at the stand: a general poster, a set of technical brochures, pens, small calendars, and the BG-II movie introducing the Project objectives, partners and several applications being gridified in the framework of the Project.
The demo stand “The Synthetic spectra modeling under GRIDCOM interface” was maintained by Grazina Tautvaisiene and Sarunas Mikolaitis, who explained to visitors the importance of the SYNTSPEC application, gridified and integrated with the GRIDCOM interface, and presented the application results.
The demo stand “CoPS - The Complex Comparison of Protein Structures supported by grid” was managed by Bartek Palak and Edgars Znots, who demonstrated the CoPS application, gridified and integrated with the Migrating Desktop graphical portal.
A description of all NA2 activities carried out in the first year of the Project is available in the DNA2.3 Dissemination Report (P12).
Table 2 – Deliverables List for NA2

Del. No. | Deliverable Title | Date Due | Accepted for release by PMB | Lead contractor
DNA2.1 | Project website | 30/06/2008 | 30/06/2008 | IFJ PAN
DNA2.2 | Dissemination Roadmap | 31/08/2008 | 31/08/2008 | IFJ PAN
DNA2.3 | Dissemination Report | 30/04/2009 | 30/04/2009 | IFJ PAN
Table 3 – Milestones List for NA2

Milestone No. | Milestone Title | Comment | Date Due | Actual/Forecast delivery date | Lead Contractor
MNA2.1 | Dissemination Roadmap ready | The deliverable has been prepared on time | 31/08/2008 | 31/08/2008 | IFJ PAN
Deviations from Plan
All planned deliverables for the first year were completed, the milestone was met, and the expected results were achieved.
2.3. NA3 – APPLICATION IDENTIFICATION AND SUPPORT
Objectives of the Activity
The NA3 activity “Application Identification and Collaboration” in the BG-II project has the important task of validating and promoting the efficient usage of the BG-II infrastructure, of adapting this infrastructure to the research ecosystem of the Baltic States and Belarus, and of developing ways for applications to collaborate within the BalticGrid-II infrastructure as well as within the wider European Grid ecosystem.
According to these tasks, a roadmap has been produced based on a primary analysis of suitable applications suggested by the partners. The content of the applications, as well as the corresponding software used to implement them, is introduced. To achieve the objectives defined for the Project, both the application identification process and the application collaboration process are detailed and clarified.
The application identification process is carried out by creating a supportive environment for applications in synchronization with the SA3 activity, by creating testbeds for launching applications together with the SA1 activity, by establishing parameters for the analysis of jobs running in test and in production modes, and by establishing SIG portals and the Gridcom and Migrating Desktop interfaces for applications. A time schedule for these activities is foreseen.
Application collaboration is another important task of the NA3 activity. It is planned to fulfil BG-II objectives such as developing infrastructure and cases for interdisciplinary research, addressing the specific needs of scientific communities, and designing and developing suitable grid-based services. The overarching and most essential requirement for such infrastructure and services is the capability to attract new users. The description and the timed list of activities and processes planned for application collaboration are presented.
The results of the foreseen activities will be presented mainly in analytical summaries as well as in the DNA3.2-6 deliverables. The summaries will address the needs of research communities and key research groups of the Baltic States and Belarus, evaluate the BalticGrid-II infrastructure from the applications' and users' point of view, summarize interaction with the pan-European e-Infrastructures established by EGEE and EGEE-associated projects, summarize the effective utilization of grid resources enabling users to share large quantities of data and computing results, and make suggestions for further development of the Grid infrastructure for applications.
Progress Towards Objectives
Measures and metrics of applications
Measures and metrics for application identification cover various aspects of applications: a) the matching of the content of an application (job) to a research area, b) the level of compliance of the supportive environment (provided by the SA3 activity) with the application run, c) the need for a testbed to launch and test the application, together with technical parameters, d) technical attributes for production mode. Numerous pilot and advanced applications are analysed to evaluate them on the basis of the measures
and metrics suggested. The applications come from research areas considered among the most important and dynamic in the Baltic States and Belarus.
The supportive environment for applications is developed by SA3 and delivered by support staff: support specialists, pilot application contacts, and activity leaders. Their functions cover grid-enabling of applications; grid-specific support (helping to run batch jobs and MPI jobs, create workflows, visualize the output and manage the data); integration of scientific applications into the grid infrastructure; and provision of a variety of helper tools for users, such as batch job managers, GUIs, VO management software and the like. Applications are provided with a testbed (if needed), which can also be created dynamically (on the fly). Special emphasis is given to the management of VOs and their tags for publication.
The applications under analysis cover all but two of the research areas foreseen in the BG-II project (High Energy Physics and Operational Baltic Sea Ecosystem Modeling). The HEP research area is very well integrated internationally, uniting thousands of scientists all over the world, and Baltic Sea modelling is currently under design study, involving advanced models like HIRLAM and HIROMB.
The results of the analysis of the applications presented establish a solid foundation for their design and deployment. They represent the content of the research areas at a confident level and provide a good basis for evaluating the BG-II infrastructure from the applications' and users' point of view. They also give a solid background for developing mutual collaboration among scientists of the Baltic States and Belarus.
Gridcom and SIG portals
Important processes, measures and tools in the context of identifying applications within the BalticGrid-II project are those for application development, deployment and support. Among these measures are SIG portals, high quality user interfaces with the functionality best fitted to the chosen applications, and testbeds.
The Special Interest Groups (SIGs) are designed to meet the various demands of applications and researcher groups, bringing together people working on the design, evaluation, and implementation of jobs in their areas of research. SIGs provide an international, interdisciplinary forum for the exchange of ideas and distribute research tasks such as computing, data exchange and communication services.
Gridcom is a simple grid interface for complex applications. It automatically splits input data into
intervals, generates and submits as many jobs as needed, collects, merges and visualizes the results. It
is able to upload large files to a storage element. Gridcom is web-based and can be accessed from any
computer, even from a cell phone.
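The split / submit / merge behaviour described above can be illustrated with a small sketch (hypothetical code, not Gridcom's implementation): an input range is cut into intervals, one job is generated per interval, and the partial results are merged.

```python
# Illustrative sketch of the split / submit / merge pattern used by tools
# like Gridcom (all names here are assumptions, not Gridcom's actual API).

def split_range(start, end, n_jobs):
    """Split [start, end) into n_jobs near-equal intervals."""
    step, rem = divmod(end - start, n_jobs)
    intervals, lo = [], start
    for i in range(n_jobs):
        hi = lo + step + (1 if i < rem else 0)  # spread the remainder
        intervals.append((lo, hi))
        lo = hi
    return intervals

def run_job(interval):
    """Stand-in for one grid job: here, sum the integers in the interval."""
    lo, hi = interval
    return sum(range(lo, hi))

def merge(results):
    """Merge the partial results returned by each job."""
    return sum(results)

if __name__ == "__main__":
    jobs = split_range(0, 1000, 7)          # 7 jobs over inputs 0..999
    total = merge(run_job(iv) for iv in jobs)
    print(total)                            # same answer as one big job: 499500
```

In the real system each `run_job` call would be a separate job submitted to the grid; the point is that splitting must tile the input exactly so the merged result equals a single-job run.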
The Migrating Desktop is mature software that has been developed over a number of years. It now ensures compatibility with the BalticGrid testbed (in terms of job submission and the running and monitoring of process states) and has been enhanced to follow the development of grid standards. It is installed as one of the BalticGrid-II core services (http://desktop.balticgrid.org/) and is successfully used by applications implemented or “gridified” within the BG-II project that are progressively deployed within the BalticGrid infrastructure.
Testbeds for applications are needed to test them and migrate them to production mode, as well as to adapt them to the newest middleware versions. The testbed may be defined dynamically: a) using GOCDB, nodes can be moved from site to site and new nodes added; b) installation of components may require CE installation and/or site BDII installation, WN installation, or an entire reconfiguration. Next steps in
managing testbeds for applications may include migration from testbed to production mode, as well as MPI-start installation and configuration.
Research sectors and collaboration of applications
The BG-II project needs high quality, grid-related, research-oriented applications, along with the associated knowledge and software. BG-II also has to be envisaged as an e-Science channel, and has to join the European Marketplace for Grid technologies and e-Science.
The set of perspectives, parameters and attributes needed to identify opportunities for application collaboration is presented; these parameters and attributes form a suitable basis for the development of the collaboration framework, which is also presented there.
The potential functionality for application collaboration is also described and structured. This functionality covers the research activities of grid users in the programmes of national academies of science and science funding agencies, in international research programmes, and in other internationally spanning research activities. In addition, it includes organizational and technical means for the promotion of scientific collaborations, technical/material facilities, and the various software packages used in the applications mentioned above as well as prospective ones. The functionality will be provided through suitable VOs and their management in different Grids, through grid communities at external high-level scientific events (international scientific conferences, EGEE User Forums, etc.), and through analytical summaries presented at suitable workshops.
Id | Deliverable name | Activity No | Lead partner | Original Delivery date(*) | Revised delivery date(*) | Status (**)
DNA3.1 | Application Identification and Collaboration Roadmap | NA3 | VU | 3 | 3 | PMB approved
DNA3.2 | Analysis, supportive environment and testing of applications | NA3 | VU | 9 | 9 | PMB approved
DNA3.3 | Design and deployment of SIG portals, Migrating Desktop, testbeds for application tests and launching | NA3 | VU | 10 | 10 | PMB approved
DNA3.4 | Collaboration structure and functionality for scientific/academic communities in Baltic States and Belarus | NA3 | VU | 12 | 12 | PMB approved

(*) Dates are expressed in project months (1 to 24).
(**) Status = Not started – In preparation – Pending internal review – PMB approved
(***) Nature = R = Report, P = Prototype, D = Demonstrator, O = Other; Deliverable id: for Milestone attached to a deliverable

Deviations from Plan
All planned deliverables for the reporting period were completed and the expected results achieved.
2.4. NA4 – POLICY AND STANDARDS DEVELOPMENT
Objectives of the Activity
The overall aim of the Policy and Standards activity is to ensure sustainable and coordinated development of the Grid infrastructure in the Baltic States and Belarus. The activity continues the efforts started in the BalticGrid project on coherent policy and standards development, to foster further development of the e-Infrastructure in the region, strengthen cooperation and human networking activities, and provide interoperability and effective functioning of the BalticGrid infrastructure within the European Grid infrastructure. The BalticGrid-II project must also concentrate on the development of infrastructure for applications that are important for BalticGrid.
The activity aims to: develop a long-term strategy for coherent e-Infrastructure development in the Baltic States and Belarus; develop and implement standards and standardized procedures for grid operation; and participate in the development of technology and operation standards to maintain interoperability and user-friendly interfaces for Grid users.
It further aims to: develop an awareness-raising and user-support policy for spreading Grid usage and efficient exploitation of e-Infrastructures; provide an analysis of existing good practice and develop a policy and action plan for fostering cooperation, human networking and coordination of efforts in the development of worldwide e-Infrastructures; develop e-Science policy principles to ensure effective involvement of the scientists of the Baltic States in the European Research Area; and foster cooperation between BalticGrid-II and Grid activities in the rest of the world.
The activity contributes actively to policy and standards development and implementation through participation in OGF, EGEE, Concertation Meetings, EU GridPMA, e-IRG, etc. PSNC participated in OGF through the RISGE research group (Remote Instrumentation Services in Grid Environment).
Progress Towards Objectives
The Policy and Standards activity coordination team comprises 8 participants representing all involved countries. During the reporting period the team has actively participated in OGF, EGEE, FP Concertation Workshops, the EGEE User Forum, e-IRG and EU GridPMA. The outcome of this participation is described in the NA4 activity Quarterly Reports and in deliverables DNA4.1 and DNA4.2. The team also supported the development of the BalticCloud subproject.
Deliverables list for NA4 for the reporting period

Del. No. | Deliverable Title | Date Due | Accepted for release by PMB | Lead contractor
DNA4.1 | Analysis Document on Cooperation and Good Practice | 30/11/2008 | 28/11/2008 | RTU
DNA4.2 | Progress Report on Awareness Raising | 30/04/2009 | 30/04/2009 | RTU
Deviations from Plan
All planned deliverables for the reporting period were completed and the expected results achieved.
2.5. SA1 – GRID OPERATIONS
Objectives of the Activity
The main objective of the Grid operations activity is to maintain and expand the Grid infrastructure in the Baltic States and to extend the current infrastructure to Belarus. In addition, the objective of SA1 is to migrate from project-based management and support of the infrastructure towards a sustainable infrastructure that is funded, managed and supported by National Grid Initiatives (NGIs).
To support the infrastructure and the user community behind it, the activity can be seen as a combination of four major tasks: “Central services”, “Maintenance of support systems”, “Testbed” and “Additional middleware support and interoperability”. The first-year objectives and actions for these tasks are given below.
Progress Towards Objectives
Task 1 – Central Grid services
The objective of this task is to provide the user base as well as the participating computing centres with the central services needed to operate a Grid. Such services include Certification Authority (CA) management as well as the actual Grid services such as the workload management server, the information system, centralised testing facilities, etc. During the first year the project continued to manage the central services that were already in place at the end of the previous project, and added new services to ensure that each country in the region would, in the worst-case scenario, be self-sustaining in terms of networked services. This way a disruption in a national network connection would not immediately hamper the datacentres inside that country, and users could still continue to submit jobs and retrieve results. In addition to the existing framework in the Baltics, the first year also saw the project expand to Belarus, which at the start of the project did not have any Grid-related services. By the end of the first year all central services were present in all member countries and functioning at production quality. This work has been summarised in deliverable DSA1.2 “Central services in operation in all countries”.
The Certificate Authority service needed a new CA in Belarus, as the BalticGrid CA policy only allowed handing out certificates to members from the Baltic States. The process of creating a new CA is difficult and time-consuming, as every CA used in Grids has to be authorized and endorsed by the EUGridPMA body. By the end of the first year the Belarus CA had received the endorsement and is now officially accredited.
Task 2 – Maintenance of user and site support system
Operating a Grid always also involves supporting site administrators and users. In cooperation with the SA3 activity, SA1 maintains a trouble ticketing system that allows both users and administrators to contact the support personnel no matter what the problem is. The system was taken over from the previous project and expanded as needed.
To provide prompt support for any opened trouble ticket, the support organization was redesigned to include First Line Support (FLS) personnel from each member institute. In addition, the FLS is
sectioned into countries to further facilitate migration to NGIs and EGI at the end of the project. The task of the FLS personnel is to solve the tickets relevant to their country/site and to bring any non-standard issue to the operations meeting. The Operations Director (OD), who resides at NICPB, chairs the weekly operations meetings using the EVO (http://evo.caltech.edu/) videoconferencing system. During the meetings both the project targets and the current situation in Grid support are discussed, and decisions are made.
Task 3 – Testbed operations
The Testbed was conceived in order to keep the Grid at production quality while still providing the latest versions of Grid tools, services and operating systems, and to allow testing of new major releases of user software. Since updates to any layer in the complex framework of the Grid can cause unexpected problems, any major upgrade should be tested in the local region before it is rolled out to most of the sites. This requires a separate testing infrastructure; however, such an infrastructure would not see constant usage, as major upgrades only come once or twice per year. The activity management therefore decided to instead create a framework for migrating one or two sites from the production state to a testbed state, with the possibility both to test them and to let users submit jobs to these sites even though they are not in production. This way a site can migrate to the testbed, perform the upgrade or new software installation and be tested thoroughly before it is migrated back to production. At the same time, common Grid jobs are not accidentally submitted to such a site, allowing the rest of the Grid infrastructure to remain at production quality. This infrastructure has yet to be exercised for a major upgrade, as none have been released during the first year of the project; however, a major upgrade is expected during the next stage, as a general migration from Scientific Linux 4 to Scientific Linux 5 is foreseen in the next half-year.
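The production/testbed migration cycle just described can be modelled with a small sketch (a toy illustration under assumed names, not project code): a site leaves production, is upgraded and tested while ordinary jobs avoid it, then returns to production.

```python
# Toy model of the production <-> testbed site migration (names assumed).

PRODUCTION, TESTBED = "production", "testbed"

class Site:
    def __init__(self, name):
        self.name = name
        self.state = PRODUCTION
        self.software = "SL4"

    def to_testbed(self):
        self.state = TESTBED          # ordinary jobs no longer match this site

    def upgrade(self, version):
        # upgrades are only allowed while the site is out of production
        assert self.state == TESTBED, "upgrade only in testbed state"
        self.software = version

    def to_production(self):
        # in reality this step is gated on the upgrade passing its tests
        self.state = PRODUCTION

def schedulable(sites):
    """Sites that ordinary (non-test) grid jobs may be submitted to."""
    return [s.name for s in sites if s.state == PRODUCTION]

sites = [Site("site-a"), Site("site-b"), Site("site-c")]
sites[1].to_testbed()
sites[1].upgrade("SL5")               # e.g. the Scientific Linux 4 -> 5 move
print(schedulable(sites))             # ['site-a', 'site-c']
sites[1].to_production()
print(schedulable(sites))             # ['site-a', 'site-b', 'site-c']
```

The key property is that the schedulable set shrinks while a site is being upgraded, so the rest of the infrastructure stays at production quality.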
Task 4 – Interoperability with other middlewares
The Baltic region is dominated by gLite usage, since the BalticGrid and BalticGrid-II projects chose it as the main middleware. However, there are user communities using both ARC and Unicore based computing centres for various reasons. Part of the BalticGrid-II project is to find a way to unify these resource centres so that users from all communities can benefit from additional resources without having to migrate to another middleware. During the first year an investigation was made into the current and upcoming situation, with special regard to these three middlewares, and it was found that a number of technologies either already exist or are coming in the next few releases that should make it easier for Grid jobs to cross middleware boundaries, and hence allow each user to use his or her favourite tools while making full use of all the resources in the region. This work has been summarised in the deliverable DSA1.4 “Report on additional Grid middleware support”.
Deviations from Plan
A deviation from the original plans was found to be necessary, as the original objectives included setting up central services for SLA guarantee technologies, which at the start of the project foresaw the installation of Tycoon central services. However, very early in the project it was found that Tycoon as a product would no longer be sustainable in the framework envisaged when the proposal and the subsequent contract were written. A new method was therefore sought to provide SLA-based scheduling inside the BalticGrid-II project; the findings are described in DSA1.3 “SLA mechanism central services operational”.
Usage targets
BGII-DNA1.9-KTH-Activity-andManagement-Report-FINALv0.1.doc
INTERNAL
Page 24 of 74
ACTIVITY AND MANAGEMENT REPORT
Final Report, 1 May 2008 to 30 April 2010
The measurable targets for SA1 are:
 Normalized CPU hours usage increases by 25% per six-month period. The baseline was set by the first six months, in which usage was 1,561,250 normalized hours. To meet the 25% increase, the second six-month period (until the end of year one) needed to exceed 1,951,563 normalized hours. During the second six-month period the BalticGrid infrastructure was used for 2,104,488 normalized hours (as of 27 April; this number may still increase slightly), an increase of 34.7%, almost a full 10% more than originally required. This result is even more significant considering that the first six-month period itself represented an increase of 123% over the six months preceding the project start.
 Number of active certificates in the BalticGrid CA increases by 10% every year. The number of active certificates in the BalticGrid CA at the start of the project was 329; at the time of writing this report it was 393, an increase of 19.5%.
 Contacts established with ARC middleware maintainers and users by the end of year 1. Good contacts have been established and a Memorandum of Understanding (MoU) has been signed by representatives of the BalticGrid-II and KnowARC projects.
 Contacts established with Unicore middleware maintainers and users by the end of year 1. As
with ARC the contacts have been established and an MoU has been signed between
BalticGrid-II and Unicore projects.
After the first year of the BalticGrid-II project, the SA1 activity has reached all the targets that were
set and has exceeded expectations in terms of computing hours utilized. This is despite the fact that
the first six-month period was itself an increase of 123% over the last six months of the previous
BalticGrid project, which, as noted in the second quarterly report, made the 25% growth target for
the second period very difficult to reach.
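The two numeric targets above can be checked with a few lines of arithmetic (a quick sketch using only the figures quoted in this section; rounding may differ slightly from the percentages quoted):

```python
# CPU-hour target: +25% per six-month period over the baseline.
baseline_hours = 1_561_250          # first six months (normalized hours)
target_hours = baseline_hours * 1.25
second_period = 2_104_488           # second six months (as of 27 April)
growth = (second_period - baseline_hours) / baseline_hours

print(f"target  : {target_hours:,.0f} normalized hours")
print(f"achieved: {second_period:,} ({growth:.1%} growth)")

# Certificate target: +10% per year in the BalticGrid CA.
cert_growth = (393 - 329) / 329
print(f"certificates: {cert_growth:.1%} growth")
```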
Table 10 – Deliverables list for SA1

Del. No.  Deliverable Title                               Date Due   Accepted for release by PMB  Lead contractor
DSA1.1    Setup of Belarusian CA                          1/08/2008  30/07/2008                   KBFI
DSA1.2    Central services in operation in all countries  1/11/2008  31/10/2008                   KBFI
DSA1.3    SLA mechanism central services operational      1/1/2009   19/12/2008                   KBFI
DSA1.4    Report on additional Grid middleware support    1/5/2009   29/04/2009                   KBFI
Table 11 – Milestones list for SA1

MSA1.1  Central services in all countries running
        Comment: All countries have at least one instance of every central service
        Date due: 1/11/2008   Actual/forecast delivery: September 2008   Lead contractor: KBFI

MSA1.2  Tycoon central services operational
        Comment: Tycoon replaced with other tools, for details see DSA1.3
        Date due: 1/2/2009    Actual/forecast delivery: services in place end of 2008   Lead contractor: KBFI

MSA1.3  Interoperability options of middlewares understood
        Comment: Summarised in DSA1.4
        Date due: 1/5/2009    Actual/forecast delivery: 1/5/2009   Lead contractor: KBFI
2.6. SA2 – NETWORK RESOURCE PROVISIONING
Objectives of the Activity
The objectives of the SA2 activity are:
 Establish a production-level Central Network Coordination Centre as a coordinating body, as
well as National Network Coordination Centres, for effective network monitoring, user support
and notification in respect of network resource provisioning;
 Expand the network established by the BalticGrid project to new partners in Belarus; update
existing Service Level Agreements, and conclude new agreements with network operators in the
new partner countries;
 Carry out an extended risk analysis, establish BalticGrid Incident Response groups within
each National Network Coordination Centre, and develop a security incident report and
response procedure;
 Co-operate with EGEE and GEANT monitoring and networking activities;
 Fully utilise high-speed Gigabit and light-path networks, and test and deploy ‘high-speed’ TCP
and scalable TCP technology in order to improve network resource utilisation;
 Provide security incident response for incidents occurring in the BalticGrid network or
related to the Grid clusters, solve security incidents while informing involved parties and
maintaining confidentiality, and publish publicly available statistics on incident response.
The overall strategy of SA2 is to ensure reliable network connectivity for Grid infrastructure in the
Baltic countries and Belarus, as well as to ensure optimal network performance for large file transfers
and interactive traffic associated with Grid.
In order to reach the objectives, work on the following tasks was performed during the first year of
the project:
 Task 1 – Expanding BalticGrid infrastructure to new partners
 Task 2 – Establishing and operation of CNCC, NNCCs, and BG-IRTs
 Task 3 – Policy development
 Task 4 – Network monitoring and cooperation with other projects
 Task 5 – Deployment of high-speed TCP and scalable TCP.
Progress Towards Objectives
In order to understand and formalise network requirements for the new partners in Belarus, the status
of their network connectivity was discussed and captured, and material to help gather information
about applications and their demands on network resources was distributed. During the second
reporting period the Belarusian Grid network scheme was added to the BalticGrid network
monitoring portal http://gridimon.balticgrid.org. Belarusian application requirements collected by
the NA3 activity were studied, and conclusions regarding network requirements for these applications
were drawn.
Demarcation points were identified, and a Service Level Agreement between the BalticGrid-II project
and the Belarusian research and education network BASNET was drafted, negotiated, and signed.
Service Level Agreements with EENet, SigmaNet and LITNET were likewise signed for the duration
of the BalticGrid-II project.
The work performed in Task 1 is described in the deliverable DSA2.3.
Work has been carried out to establish the Central Network Coordination Centre (CNCC) as a
coordinating body, and National Network Coordination Centres (NNCC) whose main objective is to
provide network resource support for local clusters and users. The CNCC at IMCS UL has been
established. NNCCs in Estonia, Latvia, Lithuania and Belarus were founded, and an incident handling
and response team (BG-IRT) was appointed in each NNCC.
DSA2.1 describes the establishment of CNCC and NNCC in more detail.
Changes in the Lithuanian Grid map were negotiated. The installation of the Ganglia monitoring tool
on a dedicated server was completed by the CNCC, and other updates of the monitoring portal were
made to keep its data current.
A procedure for incident handling and response was developed and described in DSA2.5, and sample
security incidents were used to test it.
In order to facilitate collaboration between the BG-IRT teams, the AIRT software for tracking
security incidents was installed and the incident handling and response procedure was tested with it.
Authentication data (username, password) was created for each member of the BG-IRTs, and a short
instruction on AIRT usage was prepared.
An extended risk analysis of the BalticGrid network was performed to identify possible attack vectors
and sources. Internal and external risks that could jeopardize BalticGrid network performance
were evaluated, and the risks were classified into four categories: network continuity risks,
Grid/cluster risks, network incidents, and potential risks of data centres.
On the basis of the risk analysis, the BalticGrid-II Acceptable Use Policy (AUP), Security Policy (SP)
and Incident Handling and Response Policy (IHP) were developed. All these policies should be seen
as part of a larger ecosystem: the Grid is built on the GÉANT2 network, GÉANT2 has strict rules
and policies that may not be violated, and upon these rules the NRENs build their own AUPs. The
BalticGrid-II policies therefore take into account the provisions of the policies that underpin the
grid infrastructure in Europe in general, and in the Baltic countries and Belarus in particular (see the
figure below).
[Figure: policy hierarchy – the GÉANT2 policy at the base, the NRENs’ AUPs built upon it, then the
BalticGrid AUP together with the EGEE AUP and SLAs, and the BalticGrid SP and BalticGrid IHP
on top.]
The first tests of perfSONAR monitoring software and its application to BalticGrid network
monitoring were started.
IMCS UL has started collaboration with DANTE as a representative of the GÉANT2 project on
testing commercial solutions for Anomaly Detection to identify which tool is more suitable for setting
up a "Network Security Service" in the GÉANT2 network. Information acquired from DANTE has
been analysed and compared with data collected from the local network sensors.
Possibilities of cooperation were discussed between BalticGrid-II and a DANTE representative during
a security meeting in Riga. The GN3 project started in April 2009, and it opens several opportunities
for collaboration between the two projects, including further testing of perfSONAR software, anomaly
detection solutions, etc.
BalticGrid-II representatives also met with EGEE security staff to discuss the development of the Grid
Sec collaboration team and collaboration between projects, NRENs and CERTs from various
countries.
Negotiations with the leaders of the network- and security-related GN3 activities continue, in order to
plan and foster cooperation between BalticGrid-II SA2 and GN3.
The BalticGrid-II SA2 group shares the opinion of CERN experts that appropriately configured
systems should have no problem reaching close to full 1 Gbps throughput on a 1 Gbps link with
rather short RTTs such as 35 ms. It is necessary to check the profile of each server, e.g. the network
interface model and parameters; the disk model, firmware revision and RAID controller, if any; the
TCP stack parameters, etc. It is planned to install the MonALISA service on the end nodes to monitor
the TCP configuration, traffic, losses, retransmissions, etc. The testing methodology includes the
establishment of a test laboratory to simulate a real-life environment.
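As a rough sanity check of the 1 Gbps / 35 ms figure above, the bandwidth-delay product gives the amount of in-flight data a TCP connection must be able to buffer to keep such a link full (a minimal sketch using only the numbers quoted in this section):

```python
def bdp_bytes(link_bps: float, rtt_s: float) -> float:
    """Bandwidth-delay product in bytes for a given link speed and RTT."""
    return link_bps / 8 * rtt_s

link = 1e9   # 1 Gbps link, as discussed in the text
rtt = 0.035  # 35 ms RTT, the "rather short" RTT quoted above

# A TCP send/receive buffer at least this large is needed to sustain
# full link speed; a smaller buffer caps throughput at roughly
# buffer_size / RTT bytes per second.
print(f"BDP at 1 Gbps / 35 ms: {bdp_bytes(link, rtt):,.0f} bytes")
```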
During the first year of the project there were three “breakthroughs”, which already at this stage
allow us to state with confidence that the project team has identified the cause of the low TCP
performance, and has proposed a solution to this problem. The three breakthroughs are the following:
 Creation of a reliable gigabit test-lab for TCP performance measurement at variable RTT
delays. This turned out to be a non-trivial task, because many off-the-shelf delay simulation
techniques either produce results inconsistent with real networks or cannot delay gigabit
traffic flows without distortion. An example of a non-conforming approach is the use of the
Linux traffic control ('tc') tools, which were attempted initially. A working gigabit traffic
delaying solution (the breakthrough) was found in the FreeBSD firewall ('ipfw') configured as
a Layer 2 bridge rather than a Layer 3 router (the Linux approach). The test results achieved
with this solution were compared with, and positively matched against, real international
gigabit network tests. This achievement made it possible to reliably test a large number of
TCP implementations and configurations in a lab environment.
 Thorough tests in the test-lab linked the TCP performance variations to the different
versions of the Linux kernel installed on different BalticGrid-II clusters. By changing only
the Linux kernel version it was possible to re-create in the lab both the low TCP performance
characteristic of most BalticGrid-II clusters and the high TCP performance observed on some
BalticGrid-II nodes.
 It was discovered that the default TCP stack of Scientific Linux has very small RX/TX buffer
sizes, which play a crucial role in TCP performance. It was traced down to the point that as
few as four TCP stack configuration lines inserted in a system configuration file resolve the
TCP performance bottleneck up to gigabit speeds. It was later confirmed that the KTH
clusters, which achieved exceptionally good TCP performance despite using a low-performing
Linux kernel version, have almost identical lines for improved TCP performance; these lines
had been inserted a while back and were largely forgotten by the general community. The
following four lines must be put in the /etc/sysctl.conf file to improve TCP performance:
net/core/rmem_max = 8738000
net/core/wmem_max = 8738000
net/ipv4/tcp_rmem = 8192 4369000 8738000
net/ipv4/tcp_wmem = 8192 4369000 8738000
These lines set the TCP RX/TX buffer sizes, and subsequent tests on other BGII nodes
with various Linux kernel versions confirmed that the same four TCP configuration lines
indeed resolve the TCP performance bottleneck up to gigabit speeds.
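The Layer 2 delay bridge from the first breakthrough might be configured roughly as follows on FreeBSD (a hedged sketch, not the project's recorded setup; the interface names em0/em1 and the 35 ms delay are illustrative):

```shell
# Illustrative FreeBSD dummynet delay bridge (run as root).
# Bridge two interfaces at Layer 2, then let ipfw/dummynet
# shape and delay the traffic crossing the bridge.
kldload if_bridge dummynet
ifconfig bridge0 create
ifconfig bridge0 addm em0 addm em1 up

# Pass bridged frames through the ipfw ruleset.
sysctl net.link.bridge.ipfw=1

# A pipe with 1 Gbit/s bandwidth and 35 ms delay, applied to all IP traffic.
ipfw pipe 1 config bw 1Gbit/s delay 35ms
ipfw add 100 pipe 1 ip from any to any
```

This is a configuration fragment rather than a runnable script; the exact module names and ipfw syntax should be checked against the FreeBSD version in use.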
The graph below shows TCP bandwidth for various Linux kernels and distributions depending
on network latency (RTT) and TCP send/receive buffer sizes. As can be seen, a default
Scientific Linux 4 installation underperforms as soon as network latency exceeds
1-2 ms. Typical network latency between BalticGrid sites is 8-35 ms.
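Applying and verifying the four settings above can be sketched as follows (assumes a Linux host and root privileges; `sysctl -p` reloads /etc/sysctl.conf):

```shell
# Apply the buffer settings at runtime without editing files:
sysctl -w net.core.rmem_max=8738000
sysctl -w net.core.wmem_max=8738000
sysctl -w net.ipv4.tcp_rmem="8192 4369000 8738000"
sysctl -w net.ipv4.tcp_wmem="8192 4369000 8738000"

# Or persist them: append the four lines shown in the text to
# /etc/sysctl.conf and reload:
sysctl -p

# Verify the values actually in effect:
sysctl net.core.rmem_max net.ipv4.tcp_rmem
```

This is a configuration fragment; runtime changes made with `sysctl -w` are lost on reboot unless also written to /etc/sysctl.conf.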
BGII-DNA1.9-KTH-Activity-andManagement-Report-FINALv0.1.doc
INTERNAL
Page 30 of 74
ACTIVITY AND MANAGEMENT REPORT
Final Report, 1 May 2008 to 30 April 2010
Deviations from Plan
There were no deviations from the plan in the first 12 months of the project.
However, it is anticipated that the economic crisis may have a large impact on international network
connectivity in the partner countries. For example, the future of GÉANT connectivity in Latvia was
still unclear at the time of writing this document. If the needed funding is not obtained, it will not be
possible to use GÉANT for accessing Grid resources; in that case alternative solutions, such as
commercial services, will be adopted.
Table 12 – Deliverables list for SA2

Del. No.  Deliverable Title                                                                Date Due    Accepted for release by PMB  Lead contractor
DSA2.1    Report on the establishment and operation of CNCC and NNCCs, and BalticGrid IRT  19/08/2008  25/08/2008                   IMCS UL
DSA2.2    Grid Acceptable Use Policy                                                       16/09/2008  19/09/2008                   IMCS UL
DSA2.3    Report on the expansion of BalticGrid infrastructure                             24/10/2008  26/10/2008                   IMCS UL
DSA2.4    Security Policy                                                                  16/12/2008  17/12/2008                   IMCS UL
DSA2.5    Incident Handling and Response Policy                                            19/01/2009  02/02/2009                   IMCS UL
DSA2.6    Interim report on network performance                                            24/04/2009  –                            IMCS UL
Table 13 – Milestones list for SA2

MSA2.1  Incident handling and security policies developed
        Comment: DSA2.4 and DSA2.5 ready and submitted
        Date due: 31/01/2009   Actual/forecast delivery: 31/01/2009   Lead contractor: IMCS UL

MSA2.2  First deployment of high-speed TCP
        Comment: DSA2.8 ready and submitted
        Date due: 31/01/2010   Actual/forecast delivery: 31/01/2010   Lead contractor: IMCS UL
2.7. SA3 – APPLICATION INTEGRATION AND SUPPORT
Objectives of the Activity
The main objective of this activity is to support scientists using the BalticGrid infrastructure and to
foster the use of the Grid by new research communities. The applications needed and used by the
scientific communities in the Baltic States and Belarus are to be integrated with the BalticGrid
infrastructure. It is of great help to BalticGrid users if advanced tools developed by JRA1 and by
other projects are deployed on the BalticGrid infrastructure. In parallel with providing support,
application integration and tools, documentation in the form of user guides, usage manuals and
Frequently Asked Questions will be created.
Compared to the BalticGrid project, this activity was added in response to an identified need for more
attention to user support.
The objectives of this activity are to:
 Provide Grid-specific support for scientists in the Baltic States and Belarus, e.g. helping to run
batch jobs and MPI jobs, create workflows, visualize output, and manage data
 Integrate scientific applications needed by users in the Baltic States and Belarus with the
BalticGrid infrastructure
 Provide BalticGrid-II users with a variety of helper tools such as batch job managers, GUIs, and
VO management software
The main objective of SA3 during the first project year was to kick-start this new activity in the
project. Every BalticGrid-II partner took part, which involved identifying individual partners’
SA3 personnel and planning their training where needed. It was also important to build up the
grid user support and application support system so that it can be sustained after the BalticGrid-II
project and help the NGIs.
Progress Towards Objectives
There are three major tasks in the SA3 activity. The main work has been done in the following tasks:
Task 1 – Support for scientists on using the Grid
The most important part of this task was to inspect the BG-I user support procedures and improve the
quality of the support service for Grid users. All BG-II partners from the Baltic States and Belarus
had to appoint their support specialists, and the specialists had the opportunity to participate in
training to prepare them for solving users’ problems. In close cooperation with the SA1 activity, the
BalticGrid-II First Line Support Policy was worked out.
The SA3 supporters actively support their local grid users. Several channels, such as e-mail, instant
messengers and face-to-face meetings, are used to help scientists with their grid problems. More
complicated problems are tracked with the BalticGrid-II Trouble Ticket system (Request Tracker).
Fig. 1 Diagram of support for researchers within BalticGrid-II
Users’ Forum – A simple and easy-to-use user documentation portal (http://support.balticgrid.org)
has been published; EENet hosts this portal. Contributing and reviewing manuals is work in progress.
The portal already contains a number of user guides and texts with BalticGrid-specific information
valuable for grid users.
The Forum is based on MediaWiki software and is a significant group-work facility for BalticGrid-II
supporters. It also contains internal information for SA3 people and provides an environment for
writing, contributing and reviewing user guides and documents.
Task 2 – Application integration
Similarly to Task 1, the BG-II partners had to appoint their application support persons; in many
cases these coincide with the user support persons. As it was difficult to find good training on
application porting during the first project year, the supporters had to work more with real
applications, and an application grid-enabling workshop will be organized during the BalticGrid-II
2nd All-Hands-Meeting in Riga.
Pilot applications for integration with the BalticGrid infrastructure were chosen by the NA3 activity
together with SA3:
 NWChem – by BNTU
 Sem Ti-Kamols – by IMCS UL
 CORPLT – by VU
Application supporters have been working with the pilot applications and also with several other
applications pointed out by BG users. The applications are already installed on some BalticGrid sites and
the users who were involved in the integration process are testing their applications. More detailed
information about application support will be available in DSA3.2 “1st report on Gridification of
applications”.
Collaboration has been initiated with the EGEE Application Porting group to share know-how and
experience.
Task 3 – General helper tools for users
A BalticGrid central gLite user interface service was installed. This is important for Baltic Grid users
whose home institute does not provide a user interface service for them, and it solves the gLite user
interface problem for new users from new institutions. It was planned to extend the functionality of
the BalticGrid central gLite user interface service (https://ui.balticgrid.org/) with tools such as the
GridWay Metascheduler or Ganga, but there have so far been problems with conflicts between
software packages. The research has been done, and the tools can be added as soon as the conflicts
are resolved.
To also provide Baltic grid users with a graphical grid user interface, the Migrating Desktop and Grid
Commander, developed by the BG-II JRA activity, have been deployed. At the time of writing this
report the services are in the SA3-internal testing phase.
Deviations from Plan
There is a delay with the application supporters’ “application porting” training. There were problems
with finding suitable training or workshops on porting applications to the grid for the SA3 supporters.
The workshop therefore had to be organized by SA3 itself and, to economize on event costs, it will be
held in conjunction with the BalticGrid-II 2nd All-Hands-Meeting, since most of the supporters are
participating in that event.
Usage targets
The measures foreseen for the application integration and support activity have been met on time:
 Support personnel trained for milestone MSA3.1: at least 2 per Baltic State and Belarus, in
total at least 8.
 Application codes integrated: at PM12 at least 3.
Table 14 – Deliverables list for SA3

Del. No.  Deliverable Title                                     Date Due    Accepted for release by PMB  Lead contractor
DSA3.1    Report on support specialist employment and training  30/04/2009  30/04/2009                   EENET
DSA3.2    1st report on Gridification of applications           01/05/2009  31/05/2009                   EENET

Note: DSA3.3 was moved to PM13, as communicated to and approved by the Project Officer.
Table 15 – Milestones list for SA3

MSA3.1  Support specialist employed and trained
        Comment: The milestone is described in DSA3.1; all partners from the Baltic States and
        Belarus have nominated support persons, and the supporters attended training
        Date due: 31/03/2009   Actual/forecast delivery: –   Lead contractor: EENET
2.8. JRA1 – ENHANCED APPLICATION SERVICES ON SUSTAINABLE E-INFRASTRUCTURE
Objectives of the Activity
The JRA aims at the development of advanced services that help grid users in their everyday work,
tools that support user cooperation, and visualization of computing results in the grid environment.
The main objectives of the activity are as follows:
 Extending the e-Infrastructure with advanced services necessary from the user,
application and administrator point of view – the GUI level
 User-level support for advanced Grid data management
 Adaptation of mechanisms used for definition and management of task flow
 Ensuring a proper way of further development by following the existing interface and service
standards
 Support of the education process organized by NA2
 Providing graphical data visualization tools for scientific computing in the Grid
 Development of Gridcom – a web-based innovative groupware for grid technology
The activity’s work focuses on enhancements of three main components – the Migrating Desktop
Platform, Gridcom and visualization tools – to support the everyday work of BalticGrid-II users:
 Migrating Desktop Platform – this intuitive interface to project resources is enhanced with
support for advanced data and complex application management, as well as features
enabling interoperation between various computing infrastructures
 Gridcom – the mechanisms for cooperation of user groups, and the applications, of this
web-based innovative groupware for grid technology are extended
 Tools for visualization of specific computing results – development of special
mechanisms for visualization of application results in the distributed grid environment
Progress Towards Objectives
At the beginning of the project, a very important task was to establish an efficient activity team,
defining effective internal collaboration and communication rules.
The implementation work in the first period focused on improvements to components deployed within
BalticGrid (first phase) to follow the development of the Grid standards. Deployment of the existing
visualisation tools was the basis for a performance analysis of their execution in the grid environment.
The outcome of these studies is described in the deliverable “Performance analysis of sequential and
parallel visualization in GRID environment” (DJRA1.1).
Discussions on the work planned within the activity and its relations to other project activities were
the basis for the next report, “Design phase, interoperability and definition of cooperation with other
activities: SA1, NA3, SA3 and plan for education trainings in co-operation with NA2” (DJRA1.2).
The document, based on experience gained during the BalticGrid-I project as well as discussions
(internal and with
representatives of other activities), set out the required directions of development as well as plans for
interoperability and cooperation with the other project activities.
The implementation work resulted in the preparation of two releases: “First prototype – system
integration; includes Migrating Desktop Framework extensions, user-level access to Grid data
structures, services for user-level support of Grid data management, existing visualization tools and
Gridcom” (DJRA1.3), and “Second prototype – includes mechanisms used for definition of task flow,
developed visualization tools and Gridcom enhancements” (DJRA1.4).
The last stage of the activity’s work was devoted to the preparation of the final release, focusing also
on preparation of documentation, user guides and other materials essential for the support of
BalticGrid-II users.
Development of Migrating Desktop
 Advanced data management
To provide users with support for advanced data management, the Migrating Desktop graphical file
manager (Grid Commander) was delivered as an independent network application. Providing this tool
as an independent, lightweight application that can be loaded from any location is useful for users
who do not require the other functionality of MD but need an intuitive graphical file manager. The
necessary changes included: restructuring of the Migrating Desktop architecture (the application was
divided into several cooperating and independent components constituting the Migrating Desktop and
Grid Commander lower layers), implementation of a mechanism for easy integration of various file
systems, support for the data access mechanisms and protocols used within the Grid infrastructure, as
well as deployment of Grid Commander as a web application, accessible from any network location.
Fig. 1 Grid Commander – intuitive file manager
 Support of advanced job types
The possibility of simultaneous submission of jobs differing only in the values of their input
parameters (so-called “parametric jobs”), and of compound applications (based on the flow of results
between separate jobs), was recognized as one of the essential features required by users during the
BalticGrid-I work. Implementation of gLite parametric job handling required significant changes in
the job definition, submission and tracking mechanisms.
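A gLite parametric job of the kind described above is declared in JDL roughly as follows (an illustrative sketch; the executable and file names are hypothetical, and the exact attribute semantics should be checked against the gLite WMS JDL guide):

```
[
  JobType        = "Parametric";
  Executable     = "simulate.sh";       // hypothetical user script
  Arguments      = "_PARAM_";           // replaced by each parameter value
  Parameters     = 100;                 // parameter range end
  ParameterStart = 0;                   // first value
  ParameterStep  = 5;                   // step between values; one job per value
  InputSandbox   = {"simulate.sh"};
  OutputSandbox  = {"out_PARAM_.txt"};
]
```

The WMS expands the `_PARAM_` placeholder, generating and tracking one sub-job per parameter value.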
Developing a workflow mechanism “from scratch” would significantly exceed the JRA activity’s
resources. Keeping in mind the requirements of the BalticGrid project, and after a detailed evaluation
of existing tools, the Kepler workflow orchestration system was chosen. Kepler is claimed to be one
of the most robust and flexible tools for designing, executing, evolving, archiving and sharing
scientific workflows. Especially important from the BalticGrid users’ point of view are the Kepler
extensions developed within the EU Euforia project, which integrated this platform with gLite and
UNICORE and provide Kepler as an alternative client of the Migrating Desktop server.
Fig. 2. Kepler – scientific workflows orchestration system
 Integration of Migrating Desktop and visualization tools
The VizLitG visualization system was integrated into the Migrating Desktop platform. This tool can
be used for visualization of HDF5 files directly from the Migrating Desktop graphical client.
 Evaluation of the Migrating Desktop Framework
Assessment of the components developed within the activity was a very important task in the last
stage of the project: to increase stability, the JRA tools were evaluated and the bugs found were
corrected. Review of the code and its optimization led to performance improvements.
 Deployment
The production version of the Migrating Desktop (and Grid Commander, which is essentially a
specialized GUI to the RAS file management services) was installed as one of the BalticGrid core
services on the desktop.balticgrid.org host at an early stage of the project. The Migrating Desktop is
widely used to run BalticGrid applications.
Development of visualization tools
 Grid visualization tool VisPartDEM
The VisPartDEM software developed here is specialized for distributed visualization of large particle
systems simulated by the Discrete Element Method. Advanced algorithms based on surface extraction
and the generation of Voronoi diagrams were developed in order to obtain a geometric representation
of propagating cracks (Fig. 3). Advanced filters for visualization of propagating cracks were
implemented in the VisPartDEM software. The simplest algorithm maps initial defects to the
generated triangles (tetrahedra); the spatial representation of cracks can then be visualized as cell
attributes. The advanced algorithms employ global or local Voronoi diagrams in order to obtain the
contact surfaces of neighbouring particles.
Fig. 3 Visualization of propagating cracks
Assessment of VisPartDEM’s flexibility revealed that the e-service architecture was not very
convenient for tool users or for site administrators. A light version of VisPartDEM was therefore
developed in order to reduce installation effort and to redesign the software as a grid visualization
tool (Fig. 4). The client software, including the GUI and the Remote Viewer, is downloaded using
Java Web Start technology. The simplified GUI hides from the user the unnecessary details of the
heterogeneous grid infrastructure. The VisPartDEM client, implemented as a Java application,
connects to any UI by means of the JSch library. The traditional gLite commands for user
authentication and authorization, job submission and monitoring are wrapped in the Java
programming language. The considered visualization pipelines, JDL files and shell scripts for
running the visualization engine are generated automatically in order to submit the job to the grid.
Finally, the parallel visualization engine of VisPartDEM runs on the working nodes of any computing
element, while the compressed video stream is efficiently transferred through the network and
displayed on the client by the Remote Viewer.
Fig. 4 Architecture of VisPartDEM Light
An XML interface for remote data was developed in order to provide grid users with automatic data
management and interactive dataset selection. Usually, a large HDF5 file containing the data is
located in a remote storage element, while a small XML file containing metadata on the data structure
can be stored in any convenient location. GLSL shaders supported by VTK are implemented in
VisPartDEM in order to improve visualization performance and to exploit the increasing parallelism
provided by graphics processors.
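The metadata-plus-data split described above can be illustrated with a minimal sketch (the element and attribute names below are hypothetical, not VisPartDEM’s actual schema): a small XML file describes the datasets inside a large remote HDF5 file, so a client can offer dataset selection without first transferring the data itself.

```python
import xml.etree.ElementTree as ET

# Hypothetical metadata for a remote HDF5 file; in the real system the
# small metadata file can live anywhere convenient, while the large
# HDF5 file stays in a Storage Element.
METADATA = """
<hdf5file url="srm://se.example.org/particles.h5">
  <dataset name="/particles/positions" shape="1000000x3" type="float64"/>
  <dataset name="/particles/velocities" shape="1000000x3" type="float64"/>
</hdf5file>
"""

def list_datasets(xml_text: str) -> list[str]:
    """Return the dataset paths advertised by the metadata file."""
    root = ET.fromstring(xml_text)
    return [d.get("name") for d in root.findall("dataset")]

print(list_datasets(METADATA))
```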
 Grid visualization e-service VizLitG
The grid visualization e-service VizLitG is designed for convenient access to, and interactive
visualization of, remote data files located in Storage Elements. VizLitG was adapted to the needs of
BalticGrid users and employed for visualization of scientific results (Fig. 5) computed in BalticGrid.
Fig. 5 Visualization of breaking wave phenomena in the dam break problem
The client-server architecture of the e-service VizLitG is based on widely recognized web standards
implemented in the GlassFish application server and Java EE 6. The visualization server runs on a
special User Interface for Graphics (UIG); thus natural access to grid resources is available. The
lightweight client, implemented as a Java application, consists of a GUI and a Viewer. Metadata, the
visualization
network and its parameters are transferred by the JAX-WS runtime. The remote user has the full interactivity level provided by VTK widgets and the GVid software. Data files located in the SE can be accessed by traditional means such as LFC (LCG) or GridFTP, available in the gLite distribution. An additional application server runs on the experimental SE in order to provide special data services able to process and transfer selected parts of datasets.
Performance analysis revealed that the initial client-server architecture of VizLitG, delivering geometry to the client for local rendering, is not suitable for the pilot application of poly-dispersed particle systems. In order to be employed for the visualization of pilot applications, VizLitG was redesigned to support image delivery. A high interactivity level was provided for remote users by implementing the GVid software as a video-streaming module. The interactivity of the e-service was enhanced by implementing VTK widgets. The Message Authentication over SSL mechanism of GlassFish is employed for security reasons.
• Visualization software ParaView
ParaView is a very popular open-source software package for the visualization of large datasets. It was adapted for deployment in BalticGrid-II in order to provide grid users with enhanced application services aimed at the distributed visualization of large datasets. ParaView is very useful for BalticGrid users performing large distributed computations of actual industrial problems such as oil filters (Fig. 6), sediment transport, compacting and hopper discharge.
A fully interactive user communication mode was implemented, tested and deployed on BalticGrid. A grid user can start an interactive visualization session employing the full power of the GUI and highly interactive widgets. The interactive GUI was implemented in the grid environment by using the client-server communication mode named reverse connection.

Fig. 6. Velocity field in the industrial oil filter 6HP26
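In the reverse-connection mode the usual roles are swapped: the ParaView client waits for a connection, and the pvserver processes started on the worker nodes connect back to it, which lets the server run despite the connectivity restrictions of a grid site. Schematically (host name and port are illustrative):

```
# On the user's machine: start the ParaView client and configure a
# "Client / Server (reverse connection)" server, then wait.

# In the grid job, on the worker nodes: start the parallel server and
# let it connect back to the waiting client.
mpirun -np 8 pvserver --reverse-connection --client-host=ui.example.org --server-port=11111
```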
A special-purpose parallel reader was developed for unstructured datasets stored in a predefined HDF5 format. It is adapted to the nature of the pilot applications, which decompose the solution domain into sub-domains, each assigned to a processor, and which ensure load balancing. The developed reader for partitioned unstructured HDF5 files demonstrated good parallel performance in a real grid environment based on distributed- or shared-memory systems.
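The core of such a partitioned read is the mapping of sub-domain pieces to reader processes. A minimal sketch in Java (a hypothetical helper for illustration; the actual reader is a parallel VTK/ParaView component):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: decide which partitions of an unstructured,
// pre-partitioned HDF5 file a given parallel reader process should load.
public class PartitionAssigner {

    /** Distributes `pieces` sub-domain files over `procs` processes in
     *  round-robin order and returns the piece indices owned by `rank`.
     *  Round-robin keeps the per-process load within one piece of equal. */
    public static List<Integer> piecesForRank(int pieces, int procs, int rank) {
        List<Integer> owned = new ArrayList<>();
        for (int p = rank; p < pieces; p += procs) {
            owned.add(p);
        }
        return owned;
    }

    public static void main(String[] args) {
        // Example: 10 sub-domains read by 4 parallel processes.
        for (int r = 0; r < 4; r++) {
            System.out.println("rank " + r + " reads pieces " + piecesForRank(10, 4, r));
        }
    }
}
```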
GPU (direct) rendering was enabled for the visualization of large datasets on multi-core architectures. It is worth noting that the gLite JDL capabilities are not flexible enough for running parallel MPI jobs on multi-core nodes equipped with a single GPU. The ParaView server was configured so as to obtain correct images and avoid artefacts. GPU rendering on multi-core architectures significantly reduced the visualization time. The performed speed-up analysis revealed that the deployed ParaView software is well suited for the distributed visualization of the considered datasets.
Development of Gridcom
Gridcom is a simple grid interface for complex applications. It splits input data into intervals;
generates and submits as many jobs as needed; scatters parametric jobs into simple jobs (Fig. 7);
resubmits aborted jobs; and collects, merges and visualizes the results. All these functions
are performed automatically, including transparent upload of large files to storage elements.
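The interval-splitting step can be sketched as follows (a hypothetical Java helper for illustration, not actual Gridcom source):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the parameter-scatter step: split an inclusive
// integer parameter range into per-job intervals of near-equal size.
public class IntervalSplitter {

    /** Splits [first, last] into at most `jobs` intervals, returned as
     *  {start, end} pairs; any remainder is spread over the first intervals. */
    public static List<int[]> split(int first, int last, int jobs) {
        List<int[]> intervals = new ArrayList<>();
        int total = last - first + 1;
        int base = total / jobs;
        int extra = total % jobs;
        int start = first;
        for (int j = 0; j < jobs && start <= last; j++) {
            int len = base + (j < extra ? 1 : 0); // first `extra` intervals get one more
            intervals.add(new int[]{start, start + len - 1});
            start += len;
        }
        return intervals;
    }

    public static void main(String[] args) {
        // Example: scatter the parameter range 1..10 over 3 jobs.
        for (int[] iv : split(1, 10, 3)) {
            System.out.println(iv[0] + ".." + iv[1]);
        }
    }
}
```

Each resulting interval would then become one simple job in the generated parametric set.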
Fig. 7. Parametric application execution via Gridcom
• User access control and user management interface
Every Gridcom instance can now have many read-only users. They can read application and grid log files, and study and download application results, but cannot launch applications on the grid. This is useful for sharing results and for troubleshooting, as administrators can look up any required log file themselves. The new user management interface allows managing Gridcom users and changing passwords. Gridcom thus no longer violates the BalticGrid certificate policies, because only the certificate owner has access to the grid resources; other users have access only to the results, which are stored locally on the Gridcom server.
• HTTPS support
Gridcom security has been improved by supporting the HTTPS protocol for all communication between the user and Gridcom. In addition, a BalticGrid SSL certificate for the sig.balticgrid.org server has been obtained and installed.
• Gridcom recovery after system restart
Every running Gridcom application now stores its state in files, and Gridcom automatically relaunches all applications after a system restart. As a result, system restarts (due to security updates or failures, for instance) are unnoticeable to users.
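The checkpointing described above can be sketched as follows (the file location, format and keys are hypothetical, not Gridcom's actual state layout):

```java
import java.io.*;
import java.util.Properties;

// Hypothetical sketch: each running application writes its state to a file,
// and after a server restart the state is read back so the application can
// be relaunched transparently.
public class AppState {

    /** Checkpoints the application's identity and phase to a state file. */
    public static void save(File f, String jobId, String phase) {
        Properties p = new Properties();
        p.setProperty("jobId", jobId);
        p.setProperty("phase", phase);
        try (OutputStream out = new FileOutputStream(f)) {
            p.store(out, "Gridcom application checkpoint");
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    /** Restores {jobId, phase} from a state file written by save(). */
    public static String[] load(File f) {
        Properties p = new Properties();
        try (InputStream in = new FileInputStream(f)) {
            p.load(in);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return new String[]{p.getProperty("jobId"), p.getProperty("phase")};
    }

    public static void main(String[] args) {
        File f = new File(System.getProperty("java.io.tmpdir"), "gridcom-demo.state");
        save(f, "job-42", "RUNNING");
        String[] restored = load(f);
        System.out.println(restored[0] + " / " + restored[1]);
    }
}
```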
• Gridcom optimization
Internal Gridcom optimization includes many improvements since the first phase of the project. Gridcom administrative commands have been developed to simplify the maintenance of the Gridcom server. The system load on hardware resources has been significantly decreased.
Cooperation with project activities
The Joint Research Activity work was closely related to tasks performed within the other activities of the project, so interoperability and cooperation between activities was a key issue. JRA cooperated closely with SA3, providing the necessary user support regarding JRA components, and actively participated in the NA2 educational and dissemination activities. JRA also supported the NA3 activities (tools developed within JRA are commonly used in the everyday work of application users).
• NA2 “Education, Training, Dissemination and Outreach”
The JRA activity was heavily involved in the dissemination and education process organised by the NA2 activity. The tutorials during summer schools and local workshops prepared by NA2 were a good opportunity for application users and developers to familiarize themselves with the functionality (services) delivered to the research community by JRA. Several demos and talks presented during various meetings and conferences allowed the results to be widely disseminated in the scientific community and showed their usefulness from the user's point of view.
In greater detail, the actions taken consisted of:
- Live demo “Using the BalticGrid-II infrastructure: SemTi-Kamols - a linguistic analyser of the Latvian language” was prepared in close cooperation with the NA2 and NA3 activities (the SemTi-Kamols application developer team). The demo of the application, integrated within the Migrating Desktop framework, was shown in the demo sessions of the EGEE’08 conference (Istanbul, 2008)
- Demo “CoPS - The Complex Comparison of Protein Structures supported by grid” prepared and shown during the EGEE UF’09 conference (Catania, 2009)
- “The Migrating Desktop as a framework for grid applications” presented at the EGEE UF’09 conference (Catania, 2009)
- On-line and recorded demo “Migrating Desktop - Intuitive Interface to Grid Resources”, presenting the functionality of the Migrating Desktop, prepared and shown during EGEE’09, Barcelona.
- BalticGrid-II Summer School (Moletai, LT) – an activity representative gave the Summer School students a lecture, “Enhanced Application Services on Sustainable e-Infrastructure”, describing the activity's work and the components developed.
- A JRA representative was invited as a lecturer to the Summer School organized by the EU project DORII in St Stefan, Austria. The conducted tutorial, “Migrating Desktop - intuitive graphical interface to Grid resources”, presented the BalticGrid project together with “hands-on” Migrating Desktop exercises.
- Preparation of local web pages describing components developed within the activity
- Dissemination of knowledge about the BalticGrid-II project during local seminars
- Interactive demonstration “Visualization e-services and tools for grid build by gLite” has been prepared and presented at EGEE’09, Barcelona.
- Visualization tools were promoted among grid users from different application areas during the All-Hands Meetings of the BalticGrid project. Interactive demonstrations and showcases have been prepared and presented for the users.
- Technical brochure “Grid visualization tool VisPartDEM” has been prepared, placed on the
web and distributed among potential users.
- Demonstration “VisPartDEM: distributed visualization tool for particle systems simulated by
the Discrete Element Method” on software features and usage has been prepared. Video
material of demonstration can be downloaded from the web page of VisPartDEM.
- Technical brochure “Grid visualization e-service VizLitG” has been prepared, placed on the web and distributed.
- The poster “Grid Visualization e-Service VizLitG” has been prepared and presented at the BalticGrid stand during EGEE’09, Barcelona.
- Technical brochure “ParaView on BalticGrid” concerning the usage of the open-source visualization software ParaView in BalticGrid-II has been prepared and placed on the web.
- The poster “ParaView on BalticGrid-II” has been prepared and presented at the BalticGrid stand during EGEE’09, Barcelona.
- Presentation “Grid visualization e-service VizLitG” at the 13th International Workshop on New Approaches to High-Tech: Nano Design, Technology, Computer Simulations, Vilnius, 22–26 June 2009.
- Presentation “Computation and Visualization of Poly-Dispersed Particle Systems on gLite Grid” at the 1st International Conference on Parallel, Distributed and Grid Computing for Engineering, Pécs, Hungary, 6–8 April 2009.
- Invited presentation “Scientific computations, graphical environments and visualization in grid environment” at the XVIIth seminar of the Lithuanian Association of Computational Mechanics, 15 April 2009.
• NA3 “Application Identification and Support”
Tools developed within JRA support the everyday work of NA3 application users, providing an intuitive graphical interface to the project resources and enabling effective collaboration between scientists. Conversely, user expectations and requirements provided the driving force for the JRA work.
Pilot applications for the developed visualization software were selected in close collaboration with the NA3 activity. Attention was focused on applications that had no integrated visualization software and that produce large result files located in remote storage elements. Thus, the visualization tool VisPartDEM was developed for the particle technology applications that belong to the BalticGrid-II application group named Engineering Modelling Tasks. The visualized examples involving large numbers of particles are compacting, hopper discharge and crack propagation.
The ParaView software is well suited for the parallel visualization of large datasets that are common in Computational Fluid Dynamics. The visualized applications of dam break flows and sediment and pollution transport also belong to the BalticGrid-II application group named Engineering Modelling Tasks.
• SA3 “Application Integration and Support”
The services developed within JRA foster the new research communities supported by SA3.
Work done in cooperation with the SA3 activity includes:
• Support for the integration of BG-II applications with components developed within JRA;
• Preparation of manuals concerning the JRA tools. The JRA tools documentation, user guides, demos and other materials essential for the support of BalticGrid-II users have been continuously prepared and updated;
• Publishing of training materials on BalticGrid as well as on local web pages;

Fig. 8. The JRA Brochure

• Preparation of the JRA technical brochure to let potential users get familiar with JRA. A Migrating Desktop tutorial showing the basics of practical Migrating
Desktop usage and a user guide were prepared and published on the Migrating Desktop wiki.
Presentations and publications
• R. Kačianauskas, A. Maknickas, A. Kačeniauskas, D. Markauskas, R. Balevičius. Parallel discrete element simulation of poly-dispersed granular material. Advances in Engineering Software, Elsevier, ISSN 0965-9978, 41(1), 2010, 52–63.
• A. Kačeniauskas, R. Kačianauskas, A. Maknickas, D. Markauskas. Computation and Visualization of Poly-Dispersed Particle Systems on gLite Grid. Proc. of the 1st International Conference on Parallel, Distributed and Grid Computing for Engineering (Eds. B.H.V. Topping and P. Iványi), ISBN 978-1-905088-28-7, Civil-Comp Press, 2009, p. 1–18.
• A. Kačeniauskas, R. Pacevič, T. Katkevičius, A. Bugajev. Grid visualization e-service VizLitG. CD-ROM Proc. of the 13th International Workshop on New Approaches to High-Tech: Nano Design, Technology, Computer Simulations (Eds. A. Melker, V. Nelayev, J. Tamuliene), Vilnius, Vol. 13, 2009, 92–98.
• A. Kačeniauskas, R. Pacevič, T. Katkevičius. Dam break flow simulation on grid. CD-ROM Proc. of the 13th International Workshop on New Approaches to High-Tech: Nano Design, Technology, Computer Simulations (Eds. A. Melker, V. Nelayev, J. Tamuliene), Vilnius, Vol. 13, 2009, 85–91.
• A. Kačeniauskas. Mass conservation issues of moving interface flows modelled by the interface capturing approach. Mechanika, ISSN 1392-1207, 2008, 69(1), p. 35–41.
Deviations from Plan
No significant issues or deviations from plan were encountered.
Table 14 – Deliverables for JRA1

Del. No. | Deliverable Title | Date Due | Accepted for release by PMB | Lead contractor
DJRA1.1 | Performance analysis of sequential and parallel visualization in GRID environment | 31/08/2008 | 31/08/2008 | VGTU
DJRA1.2 | Design phase, interoperability and definition of cooperation with other activities: SA1, NA3, SA3 and plan for education trainings in co-operation with NA2 | 30/10/2008 | 30/10/2008 | PSNC
DJRA1.3 | First prototype – system integration: includes Migrating Desktop Framework extensions, user-level access to Grid data structures (services for user-level support for Grid data management released), existing visualization tools and Gridcom | 30/01/2009 | 30/01/2009 | PSNC
DJRA1.4 | Second prototype – including mechanisms used for definition of task flow, developed visualization tools and Gridcom enhancements | 31/10/2009 | 31/10/2009 | PSNC
DJRA1.5 | Final report | 31/03/2010 | – | PSNC
Table 15 – Milestones for JRA1

Milestone No. | Milestone Title | Comment | Date Due | Actual/Forecast delivery date | Lead Contractor
MJRA1.1 | Design phase closed | The milestone described in DJRA1.2 | 30/10/2008 | 30/10/2008 | PSNC
MJRA1.2 | First prototype ready | The milestone described in DJRA1.3 | 30/01/2009 | 30/01/2009 | PSNC
MJRA1.3 | Second prototype ready | The milestone described in DJRA1.4 | 31/10/2009 | 31/10/2009 | PSNC
MJRA1.4 | Final version of services | The milestone described in DJRA1.5 | 31/03/2010 | 30/04/2010 | PSNC
3. JUSTIFICATION OF MAJOR COST ITEMS AND RESOURCES
3.1. WORK PERFORMED BY EACH PARTNER DURING THE PERIOD
The work performed is summarised for each partner and activity. Generic tasks, such as participation in recurrent activities and management meetings, are not mentioned, nor is self- and partner-related administration. Key personnel and activity leaders have all actively participated in the work and responsibilities of the Project Management Board. This includes duties such as participation in weekly meetings, reviewing of deliverables and other produced material, and participation in the organisation of events. Partner representatives have all participated in the Executive Board meetings and work, taking responsibility for the overall steering of the project.
3.1.1. KTH
NA1: Coordinator. Initiating and leading the project organisation. Collecting and compiling monthly time reports. Collecting and compiling quarterly partner internal cost statements. Organising the weekly Project Management Board meetings. Supporting partners concerning financial and organisational issues. Organising the External Advisory Committee and its first project assessment. Coordinating the Gender Action Plan (DNA1.1). Initiating and coordinating the BalticGrid Innovation Lab as well as the BalticCloud.
NA2: Co-organising BalticGrid events. Contributing to the BalticGrid website. Presenting
BalticGrid at international seminars, workshops and conferences. Reviewing the deliverable
documents.
NA3: BalticGrid Innovation Lab work, developing a course specifically aiming at SMEs.
NA4: Participating in OGF, ECRI, eIRG and EU concertation events. Reviewing the deliverable
documents.
SA1:
Assessing and certifying sites to become part of the EGEE grid. Continued analysis of resource allocation, deprecation of Tycoon, and analysis of existing (but unconfigured) advance-reservation, preemptable systems. Running ROC and ROD operations, including escalation of tickets to SA1 management. Planning the transition of operations to the EGI.
SA2:
Application support synchronised with the initial and production SA3 Users' Forum. Reviewing documents and network plans.
SA3:
Initial deployment of the BalticGrid Users' Forum. Imported, fixed and modernised historical documents. Organised contributors per section. Introduced a first-line support sequence. Continuing work on the re-deployed production Forum.
3.1.2. EENET
NA2:
Contributing to the “Dissemination Report” and to the NA2 deliverable DNA2.2. Reviewing the BalticGrid-II project poster. Development of the website informing users about the BalticGrid-II project on the Estonian Grid web page. Participation in the kick-off meeting in Vilnius and the 1st All-Hands Meeting in Minsk. Representing the project at the EGEE'08 Conference and the EGEE User Forum. Contributing to the BalticGrid users brochure.
NA3:
Contributing to and reviewing DNA3.1, DNA3.2, DNA3.3 and DNA3.4. Filling in application description forms for NA3 about the DALTON, MOSES and OpenFOAM applications and porting the applications to the grid in collaboration with scientists from Tartu University. Participation in the NA3 sessions at the BalticGrid-II kick-off meeting in Vilnius. Meetings with several scientific groups at the University of Tartu: linguists, material scientists and the distributed systems group. Participation in the NA3 sessions at the BalticGrid-II 1st All-Hands Meeting in Minsk, where Kalle Keskrand also made a short presentation.
NA4:
Participating in EUGridPMA meetings and representing the BalticGrid CA; reviewing the Latvian Grid Certification Authority, the Belarusian Grid Certification Authority and their policy documents. Participating in the OGF25 meeting in Catania, Italy. Reading and commenting on OGF Certificate Authority Operations working group documents. Contributing to the report DNA4.1.
SA1:
Operation of core grid services, e.g. VOMS, BDII, WMS and MyProxy. Operating the BalticGrid Certification Authority (CA). Updating and fixing bugs in the grid CA management software package Camager. Updating and fixing bugs in the BalticGrid resource monitor (http://infosite.balticgrid.org). Supporting BalticGrid resource centres with gLite installations. Contributing to deliverables DSA1.1, DSA1.2 and DSA1.3.
SA2:
Monitoring, testing and configuring the grid central services' network. Participating in the network performance tests. Creating a mailing list for the BalticGrid Incident Response Team. Participating in the "Technical Network Liaison Committee" sessions at EGEE'08. Building the network for new hardware for BalticGrid central services. Contributing to and reviewing DSA2.1 and DSA2.5.
SA3:
Activity leader. Planning and coordinating the SA3 team work, organizing the weekly SA3 EVO meetings and participating in the weekly PMB EVO meetings. Supporting grid users. Participation in the BalticGrid-II kick-off meeting in Vilnius and in the BalticGrid-II 1st All-Hands Meeting in Minsk: general SA3 presentation and chairing of SA3 sessions. Choosing and preparing training for SA3 support specialists. Participation in the EGEE "Training the Trainers" event at CERN.
Applications support – working with the MOSES, OpenFOAM and Dalton packages to integrate them on the grid. Compiling the parallel version of the Dalton package. Testing the OpenFOAM package after installation.
Initiating and contributing to the BalticGrid Users' Forum web page (http://support.balticgrid.org) and a central gLite user interface machine for BalticGrid users. Testing applications for the BalticGrid central gLite user interface service (Ganga and GridWay). The documentation web page for the grid CA management software (Camager) was updated. Writing user support system procedures and reconfiguring the RT ticketing system. Testing Migrating Desktop and Grid Commander installations for BalticGrid users.
3.1.3. KBFI
NA2:
During the first year of the BGII project, KBFI gave 14 presentations at different conferences and AHMs and 3 lectures, had 4 PhD defences related to the project, produced 11 scientific and general-public publications and held 2 workshops. KBFI prepared the technical brochure "BalticGrid Infrastructure" and a special web site for a spin-off of the BGII project, the BalticCloud project.
NA3:
Provided support and coordinated CMS activities. Also supported general BalticGrid users with resources and support. Contributed to deliverables DNA3.1, DNA3.2 and DNA3.4.
SA1:
The Operations Director resides at KBFI. Leading the work on operational support and the users' helpdesk. Took part in the rotational support watch. Operated the testbed. Supported other sites with gLite middleware installations and maintenance. Arranged SA1 meetings and presented SA1 and operations results at BalticGrid meetings. Worked as a Registration Authority. Coordinated the communication with other middleware projects. Created the first cloud-computing instance in Northern Europe to initiate the BalticCloud initiative. Actively contributed to the BG Innovation Lab and to courses to be taught at TU. Contributed to all SA1 deliverables and performed internal reviews of most SA2 deliverables.
SA2:
Performed work on network throughput tests and consulted on the testing environment. One of the first sites to join the central Ganglia monitoring. Performed internal reviews of SA2 deliverables.
SA3:
Provided expert-level user support. Helped a number of scientific groups with possible future gridification of their applications.
3.1.4. IMCS UL
NA2:
Contributing to NA2 deliverables and the BalticGrid-II brochure. Development of the website informing users about the BalticGrid-II project in the Latvian language. Participation in the kick-off meeting in Vilnius and the 1st All-Hands Meeting in Minsk. Representing the project at the EGEE User Forum in September 2008. The 2nd All-Hands Meeting was organised in Riga in cooperation with RTU. Participation in the Grid sections during the conference of the University of Latvia in 2009. Discussing collaboration between BalticGrid-II and GN2/GN3.
NA3:
Preparation of a publication about applications on BalticGrid-II, preparation for the EGEE’08 conference, development of a presentation of the BG-II linguistic application SemTi-Kamols, and study of the possibility of using grid computing for computer science students during their laboratory work. Presentation of applications during the 1st AHM and at the project stand at the EGEE User Forum. Organising meetings with users and user groups from various research institutions of Latvia. Evaluation of possibilities to grid-enable the NS-2 network simulation software. Identification of and contact with OpenFOAM users.
NA4:
Condor for submitting and managing grid jobs, as well as its interoperability with gLite, was studied. The JSDL job description language was studied. Grid CA policy documents were studied, a presentation was prepared, and there was participation in EUGridPMA. Introduction to the de facto standard for grid interoperability (Globus Toolkit) and various grid middlewares (gLite, UNICORE, VDT); participation in the EGEE UF and an NGI meeting.
SA1:
Maintenance of IMCSUL and IMCSUL-INF sites, deployment of 64-bit SLC 4.6 WNs,
following the SA1 mailing list, user support, BalticGrid CA RA functions.
SA2:
Activity leader. Producing the reports “Report on the establishment and operation of
CNCC and NNCCs, and BalticGrid IRT” (DSA2.1), “Grid Acceptable Use Policy”
(DSA2.2), “Report on the expansion of BalticGrid infrastructure” (DSA2.3), “Security
Policy” (DSA2.4), “Incident Handling and Response Policy” (DSA2.5), “Interim report on
network performance” (DSA2.6). Periodic updates of the BalticGrid network monitoring
system, installation and support of Ganglia monitoring software. TCP performance testing
laboratory set up and tests with various configuration parameters run. The reasons behind
the poor network performance identified.
SA3:
Support of ANSYS, protein analysis and Matlab users. The ESSM (CoPS) application was grid-enabled and calculations performed. Work with other potential user groups was initiated.
3.1.5. IFJ PAN
NA2:
Planning and coordinating the NA2 team work, participating in the weekly PMB video-meetings, maintenance of the BG-II web site and web usage statistics. Help in organizing the kick-off meeting in Vilnius and the 1st All-Hands Meeting in Minsk; co-organizing two Grid Open Days (in Vilnius and Minsk). Elaboration of the NA2 yearly dissemination reports and of the Project Website, Dissemination Roadmap and Dissemination Report deliverables. Participation in relevant exhibitions with Project stands (EGEE conference in Istanbul, EGEE UF in Catania, ICT’08 in Lyon), and participation in workshops and conferences such as the Cracow Grid Workshop in 2008 (co-organized) and OGF. Organization of grid seminars (7) and help in the organization of one tutorial, as well as support in the organization of the BalticGrid-II Summer School (in Vilnius) and preparation of the program of the event. Elaboration of Project dissemination materials such as the general brochure (two issues), general poster, project movie (two issues), and the NA2 presentations given at the BG-II kick-off meeting and the 1st AHM; elaborating feedback and event evaluation forms, BG-II pens, calendars and business cards.
Gathering information on the Activities' progress and their achievements; contacts with application developers, gathering materials on BG-II applications for the BG-II movie.
Cooperation with the BELIEF project with regard to the inclusion of BG-II materials into BELIEF's Digital Library.
SA1:
IFJ PAN continued providing infrastructure support for hands-on tutorials in the Project; the resource monitoring was improved, which helped make the tutorial during the AHM meeting in Minsk successful. IFJ PAN SA1 staff contributed to the development of support structures in the SA1 activity, which led to the establishment of relevant procedures and the so-called "1st line support" group. IFJ PAN also provided manpower for the Polish 1st-line-support team. We took part in the internal review of the DSA1.3 deliverable and continued operational support for the IFJ-PAN-BG site.
SA3:
IFJ PAN is responsible for the coordination of the application support activity in SA3 as well as for the provision of 2nd-level support (support for supporters) in that field. The responsible person is Tomasz Szepieniec, who was actively involved in all Project meetings, SA3 phone conferences and the organization of a special SA3 workshop on application adaptation. Work done includes the following activities:
- selecting pilot applications for highest priority on grid enabling,
- work towards collection of best practices,
- direct support on grid enabling of NWChem package (which was
documented on a poster presented at 3rd EGEE User Forum).
3.1.6. PSNC
NA2:
Preparing materials and conducting BalticGrid-II (as well as local and external) workshops
and tutorials.
- A gLite tutorial at The First BalticGrid-II All-Hands Meeting (Minsk, Belarus, 2008)
- A lecture and tutorial at BalticGrid-II Summer School (Moletai, Lithuania, 2009)
- Invited talk and tutorial at the DORII project Summer School (St Stefan, Austria, 2009)
- Organisation of local workshops and tutorials
Presentation of demos at several conferences: EGEE’08 (Istanbul, 2008), EGEE UF’09 (Catania, 2009) and EGEE’09 (Barcelona, Spain, 2009). Talk “The Migrating Desktop as a framework for grid applications” presented at the EGEE UF’09 conference (Catania, Italy, 2009). Participation in NA2 internal meetings. Update of the local BalticGrid-II web pages. Contributing to the BalticGrid-II brochure. Dissemination of knowledge about the BalticGrid-II project during local seminars. Organisation and hosting of the Third BalticGrid-II All-Hands Meeting (Poznan, Poland, 2009).
SA1:
Cluster configuration and maintenance (performed continuously during the reported period). Maintenance of the plgrid Certificate Authority. Participation in activity internal meetings. Describing the procedures of "SA1 first line support". Setting up and monitoring gLite facilities for the tutorial “gLite for beginners” (Minsk). Active support of the BalticCloud initiative. Preparation of the site contact list. Review of the DJRA1.3 deliverable. User support for Polish users of BalticGrid-II. Configuration of test machines for JRA purposes.
SA3:
Participation in activity teleconferences. Installation of Migrating Desktop as a BalticGrid-II core service. Supporting MD users. Integration of pilot applications. Contribution to the SA3 brochure. Evaluation of Migrating Desktop and Grid Commander.
NA3:
Contributing to the deliverable “Design and deployment of SIG portals, Migrating Desktop, testbeds for application tests and launching” (DNA2.3).
JRA1:
Coordination of the activity. Enhancement of the project components deployed within the NA3 activity of BalticGrid (first phase) to make them compatible with the Grid standards of the project testbed. Participation in the preparation of the report “Performance analysis of sequential and parallel visualization in GRID environment” (DJRA1.1)
Preparation of demos presenting Migrating Desktop:
- "Using the BalticGrid-II infrastructure: SemTi-Kamols - a linguistic analyser of
Latvian language" – presented at the EGEE’08. (Istanbul, Turkey, 2008)
- “CoPS - The Complex Comparison of Protein Structures supported by grid” - shown at
the EGEE UF’09 conference (Catania, Italy, 2009)
- "Migrating Desktop - Intuitive Interface to Grid Resources” presented at EGEE’09,
(Barcelona, Spain, 2009).
Formulation of descriptions of works planned within activity and relations to other project
activities delivered as report: “Design phase, interoperability and definition of cooperation
with other activities: SA1, NA3, SA3 and plan for education trainings in co-operation with
NA2” (DJRA1.2)
Development of the Migrating Desktop Framework following conclusions described in
DJRA1.2 (see chapter 2.8 for a more detailed description). The most important features
include:
- Providing user-level support for advanced Grid data management
- Adaptation of mechanisms used for the definition and management of task flow
Preparation of the Migrating Desktop releases and reports describing the activity's product
prototypes:
- The first prototype: “System integration, includes Migrating Desktop Framework
extensions, user-level access to Grid data structures - services for user-level support
for Grid data management released, existing visualization tools and Gridcom”
(DJRA1.3)
- “Second prototype – including mechanisms used for definition of task flow,
developed visualization tools and Gridcom enhancements” (DJRA1.4)
Preparation of the final release, and report showing activity works and achievements:
“Final report” (DJRA1.5). Creation of the local web pages describing components
developed within the activity. Contribution to the article “BalticGrid-II on the Way to
Interoperability” submitted to the Journal of Grid Computing. Creation of the JRA technical
brochure. Participation in project AHM meetings: Vilnius (May 2008), Minsk (October
2008), Riga (May 2009), Poznan (December 2009). Taking part in PMB teleconferences
and face-to-face meetings.
3.1.7. VU
NA2: Preparation of BG-II kick-off meeting in Vilnius, participation in it. Preparation for AHM in
Minsk, participation in it. Various internal meetings and local seminars to discuss problems
and tasks of BG-II. Participation in EGEE 08. Development and implementation of NA3
questionnaires. Negotiations with new Grid users. VU BG web page updating. Report on
monthly events.
NA3: Development and gridification of NEURO applications; installation and testing of Octave for
NEURO applications. Analysis of applications foreseen in the BG-II project, as well as checking
their status and gridification, and tracking the deployment of applications of many partners. New
potential user identification. Preparation of the DNA3.1, DNA3.2, DNA3.3 and DNA3.4 deliverables.
NA4: NA4 participants team meeting at the Vilnius AHM. BG PMB EVO meetings. Prepared an article
on analysis of grid interoperability; participation in grid interoperability and policy events at
EGEE-08. Participation in the preparation of DNA4.1.
SA1:
Testing memory-intensive grid jobs with shared resources, especially on the clusters grid9.mif.vu.lt
and grid6.mif.vu.lt. Weekly SA1 EVO meetings. Weekly reports to the CIC portal.
Participation in EGEE-08 in Istanbul. Cluster software upgrade. Central grid services support.
Management of the myProxy service. Nagios monitoring tool deployment and use in the BG
infrastructure. WMS and BDII upgrading to new SLC and gLite versions. Participation in
CloudGrid meeting in Tallinn. Hardware and virtual environment preparation for the first BG
CloudGrid instance installation at the RTU site.
SA2:
WMS data recovery from a broken RAID. Participation in the AHM, Minsk. Grid cluster testing.
Network intrusion detection system testing. Central service maintenance problem solving. New
kernel compilation and installation on worker nodes. Preparing CERN/CERT security
recommendations for installation.
SA3:
Support of applications. MATLAB integration scenarios in the BalticGrid infrastructure.
Participation in the GridKa summer school. Ganglia monitoring system installation. Job
submission script preparation for the NAMD application. Grid presentations in many local
seminars, at the Lithuanian Academy of Sciences and other institutions. Analysis of new
users' applications for compatibility with the Grid.
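A job-submission script of the kind mentioned above is typically a small wrapper that writes a gLite JDL job description and hands it to the workload management system. The sketch below is illustrative only: the executable name, file names, and resource requirement are assumptions, not taken from the project's actual scripts.

```shell
#!/bin/sh
# Hypothetical sketch of a gLite job-submission helper for an
# application such as NAMD; all names below are illustrative.

JDL=namd.jdl

# Write a minimal JDL job description.
cat > "$JDL" <<'EOF'
Type = "Job";
JobType = "Normal";
Executable = "run_namd.sh";
Arguments = "input.namd";
StdOutput = "namd.out";
StdError = "namd.err";
InputSandbox = {"run_namd.sh", "input.namd"};
OutputSandbox = {"namd.out", "namd.err"};
Requirements = other.GlueCEPolicyMaxCPUTime > 720;
EOF

# Submit only if the gLite WMS client is actually installed.
if command -v glite-wms-job-submit >/dev/null 2>&1; then
    glite-wms-job-submit -a "$JDL"
else
    echo "glite-wms-job-submit not available; JDL written to $JDL"
fi
```

In practice such wrappers were also the place to stage application input into the sandbox and to pin jobs to sites with the application pre-installed.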
JRA:
Enhancement of project components deployed within the BalticGrid (first phase) NA3 activity
to be compatible with the Grid standards of the project testbed.
Descriptions of works planned for Gridcom within the activity, and relations to other project
activities, presented in the report: “Design phase, interoperability and definition of cooperation
with other activities: SA1, NA3, SA3 and plan for education trainings in co-operation with
NA2” (DJRA1.1, DJRA1.2).
Development of advanced job management support within Gridcom: restructuring of the
Gridcom architecture as a more universal grid user application. Implementation of support
for the SFTP protocol and Unicore data access mechanisms.
3.1.8. RTU
NA2: Participation in the BG-II kick-off meeting in Vilnius. RTU BG internal meeting. Participation
in an IMCS seminar. Summary of RTU bachelor and master theses concerning Grid. Prepared
the DNA2.2 review. Participation in EGEE-08. Held internal meetings on EGEE-08 results.
Report to the NA2 questionnaire. Negotiations with new Grid users. RTU BG web page updating.
Report on monthly events.
NA3: Choice of optimal Matlab applications for signal processing tasks. Solving Matlab gridification
problems. New potential user identification. Involvement of a new user community from molecular
chemistry and identification of their requirements. Negotiations with the Latvian Biomedical
Research and Study Center.
NA4: NA4 participants team meeting at the Vilnius AHM. Participated in OGF 23 and the FP5
Concertation Meeting in Barcelona. Prepared Q1-Q4 reports. BG PMB EVO meetings.
Prepared an article on Grid interoperability for the Journal of Grid Computing. Participation in
Grid interoperability and policy events at EGEE-08 and OGF/EGEE events in Sardinia.
Prepared DNA4.1 – Analysis Document on Cooperation and Good Practice. Participation in the
Cloudscape workshop in Brussels. Participation in the e-IRG workshop in Paris. Prepared
DNA4.2 and the 1st year report on NA4.
SA1:
Testing memory-intensive Grid jobs with shared resources. GGUS ticket solving. VUS
software problem solving. Weekly SA1 EVO meetings. Weekly reports to the CIC portal.
Participation in EGEE-08 in Istanbul. Cluster software upgrade. Central grid services support.
Added the MyProxy service. EGEE Nagios monitoring tool testing in the BG infrastructure. WMS
and BDII upgrading to new SLC and gLite versions. BG-II project presentation at the INGRID
conference in Sardinia. Participation in the CloudGrid meeting in Tallinn. Hardware and virtual
environment preparation for the first BG CloudGrid instance installation at the RTU site.
SA2:
Bandwidth testing between the RTU and IMCS UL clusters. WMS data recovery from a broken
RAID. Participation in the AHM and the Informatics Conference in Minsk. Testing stability and
bandwidth of the interconnection between the RTU and LUMII Grid clusters. Network intrusion
detection system testing. New network monitoring system (Zabbix) installation. Central service
maintenance problem solving. New kernel compilation and installation on worker nodes.
Preparing CERN/CERT security recommendations for installation.
SA3:
Negotiations with MathWorks about Matlab purchase. Analysis of MATLAB integration scenarios
in the BalticGrid infrastructure. Signing of an agreement on a suitable licensing model. Matlab
integration in the Grid environment and testing at the RTU site. OPNET testing in the Grid
environment. Participation in the GridKa summer school. Ganglia monitoring system installation.
Job submission script preparation for the NAMD application. Grid presentation in a seminar at the
Latvian Biomedical Research and Study Center. Analysis of new users' application compatibility
with the Grid.
3.1.9. ITPA
NA2: Preparation of BG-II kick-off meeting in Vilnius, participation in it. Preparations for AHM in
Minsk and Riga. Preparation of brochures (ITER and SYNTSPEC). Preparation of demo on
SYNTSPEC application. Various internal meetings and local seminars to discuss problems
and tasks of BG-II. Participation in EGEE 09 User Forum. Development and implementation
of NA3 questionnaires. Negotiations with new Grid users. ITPA VU BG web page updating.
Report on monthly activities.
NA3: Development and gridification of ITER and other applications. Analysis of applications
foreseen in the BG-II project, as well as checking their status and gridification, and tracking the
deployment of applications of many partners. New potential user identification. Preparation of
deliverables.
NA4: NA4 participants team meeting in Vilnius AHM. Participation in the deliverable preparation.
Prepared suggestions for European grid interoperability, participation in grid interoperability
and policy event in Lyon.
SA1:
Testing memory intensive grid jobs with shared resources on clusters Dangus, Spekras and
Atom. Weekly SA1 EVO meetings. Weekly reports to CIC portal. Cluster software upgrade.
Central grid services support. Management of the myProxy service. Nagios monitoring tool
deployment and use in the BG infrastructure.
SA2:
WMS data recovery from a broken RAID. Participation in the AHM, Minsk. Grid cluster testing.
Network intrusion detection system testing. Central service maintenance problem solving. New
kernel compilation and installation on worker nodes. Preparing CERN/CERT security
recommendations for installation.
SA3:
Support of applications. MATLAB integration scenarios in the BalticGrid infrastructure.
Participation in training events. GridCom interface implementation. Grid presentations in many
local seminars, at the Lithuanian Academy of Sciences and other institutions. Analysis of new
users' applications for compatibility with the Grid. Support of Virtual Organisations.
3.1.10. CERN
SA1:
Provide administrator support to the BalticGrid-II grid sites
o Participate in ticket watch rota
o Ongoing site monitoring and service updates
Provide administrator support to the VGTU cluster (EGEE production certified site)
o Provide support for the new on-site cluster administrator
o Quattor setup
o Services installed and monitored:
 lcg-CE / BDII / MON / APEL / GRIDICE / SRM SE / WMS / Top-BDII
 Nagios
 ~100 cores, 100GB storage
o Ongoing site monitoring and service updates
Participation in regular SA1 EVO meetings, typically once per week
Contribution and review of the following SA1 documents:
o DSA1.2 Central Services in Operation in All Countries
o DSA1.4 Report on Additional Grid Middleware Support
Other documents: contribution and review of:
o Deliverables
 DNA1.1 Quality and Gender Action Plan
 DNA2.2 Dissemination Roadmap
o MoU
 MoU with EGEE-III
 MoU with ARC
o Other documents
 Impact assessment case study document
 BalticGrid-II Users Forum Brochure
Other CERN-related activities declared under SA1:
o Provide CERN input to EC and internal reports:
 3 Quarterly Reports (Q1: May-Jun-Jul 08; Q2: Aug-Sep-Oct 08; Q3: Nov-Dec-Jan 09)
 4 Internal Cost Claims (Q1, Q2, Q3, Q4: Feb-Mar-Apr 09)
 1 Annual report (DNA1.5 Yearly Activity and Management Reports)
 Monthly timesheets
o Liaison activities between BalticGrid-II and EGEE, and with other similar e-Infrastructure
projects
o Participation in the discussion of project sustainability after BalticGrid-II
o CERN-wide dissemination of BalticGrid-II activities
o Participation in the following BalticGrid-II events:
 BalticGrid-II Kickoff, Vilnius, 13-15 May 08
 BalticGrid-II All Hands Meeting, Minsk, 29-30 Oct 08
 Preparation for the BalticGrid-II All Hands Meeting, Riga, May 09
 Regular PMB meetings (EVO, face-to-face)
 EB meetings (co-located with Vilnius and Minsk All Hands Meetings)
o Participation in other relevant events
 6th e-Infrastructure Concertation Meeting, Lyon, Nov 08
o Oversee CERN technical and administrative activities in BalticGrid-II that are not
mentioned above
 Hiring of personnel
 Handling of official documents: Grant Agreement, Consortium Agreement
 Internal review of spending; monitor/forecast spending
SA3:
Participation in regular SA3 EVO meetings, typically once per week
Provide administrator support to the users of the VGTU cluster
o Support installation of VGTU users’ applications
o Support / monitor job submission
o MPI support
o g-Eclipse testing
o VOs supported: BalticGrid / Litgrid / CMS / GAMESS
Review and input to the BalticGrid-II Users’ Forum
http://support.balticgrid.org/wiki/index.php/Main_Page
3.1.11. NICH BNTU
NA2: Participation in the BG-II kick-off meeting in Vilnius. Participation in the 1st BalticGrid-II All
Hands Meeting in Minsk. Presentations at the EGEE conferences in Istanbul and Catania. Filling
out NA2 questionnaires every month, describing the following dissemination activities performed:
14 scientific papers published; 5 scientific presentations given; 1 web site created.
NA3:
Identification of quantum mechanical and molecular dynamics software (NWChem and
NAMD) and testing it in the grid environment together with Krakow. Negotiations with research
institutes in Minsk (Institute of Physics, Scientific-Production Center on Material Sciences,
Belarusian State University of Informatics and Radioelectronics, Belarusian State University)
about using this software in the grid environment. Negotiations with the Technological Centre
(MIEET) (Zelenograd, Russia) about the possibility of quantum computations of applied problems
based on NWChem in the grid environment. Joint investigation (BNTU – ITPA VU) of carbon
nanostructures in the NWChem package. Identification of engineering software packages
(ANSYS, LS-DYNA, OpenFOAM). Negotiations about implementation of the grid into
educational processes with the departments of “Micro- and Nanotechnics” and “Intelligent
Technics” of BNTU, and “Theoretical and Applied Mechanics” of Belarusian State University.
Identification of a linguistic application (Corpus Albarutenicum). Preparatory work for filling the
semantic database; at present the database contains approximately 150,000 semantic objects.
Identification of geophysics (analysis of seismic signals) as a prospective area for developing an
applied grid package in the conditions of Belarus. Contributing to all delivery reports during the project.
NA4: Participation in EGEE Policy Board meetings. Regular participation in the EGEE Discussion List
on policy issues for grid perspectives and evolution. Participation in the development of the EGEE
Blueprint and the EGI establishment.
SA1:
Obtained full registration of the BNTU unit in GRID2-FR. The gLite services were deployed in
test mode for a single node in the local network of BNTU.
SA2:
Development and re-configuration of the BNTU e-infrastructure for integration into BalticGrid.
Transition from 100 Mbps to 1 Gbps between the working node of the grid segment and users.
Optimisation and expansion of the guaranteed throughput capacity of the connection to the
backbone segment of BASNET, up to 100 Mbps. The possibility of connecting the local network
of BNTU to the backbone segment of BASNET with a throughput capacity of up to 1-2 Gbps at
the hardware level was realised. Installed and configured software for local network monitoring
(Cisco network management). Network monitoring is performed in real time, with a restore time
after a network failure of under 1 minute. A reserve network connection for the working node was realised.
SA3:
Support of the functionality of NWChem in the grid for the CYFRONET cluster. Expanding grid
abilities and practices to agents of industrial and scientific establishments and involving them
into the BalticGrid infrastructure. Negotiations with the Belarusian HEP community about the
possibility of using the BalticGrid infrastructure for LHC data calculation. Communication with
Dubna (JINR) about collaboration in HEP. Negotiations with the LS-DYNA representative in
Russia/Belarus about the grid licensing problem. Participation in SA3 weekly EVO meetings.
Contributing to all delivery reports during the project.
3.1.12. UIIP
NA2: Participation in the BG-II kick-off meeting in Vilnius. Organizing the 1st BalticGrid-II All Hands
Meeting in Minsk. Filling out NA2 questionnaires every month, describing the following
dissemination activities performed: 1 brochure and 10 posters elaborated; 2 conferences, 2
workshops and 1 tutorial organized; 31 scientific papers published; 9 newspaper articles
published; 1 radio interview and 3 TV programs performed; 1 press conference arranged; 36
presentations given; 1 web site created. Contributing to DNA2.2 "Dissemination Roadmap"
and DNA2.3 "Dissemination Report".
NA3: Identification of bioinformatics software (molecular dynamics packages GROMACS and
AMBER) and testing it in grid environment. Negotiations with research institutes in Minsk
(Institute of Epidemiology and Microbiology, Institute of Bio-organic Chemistry, Belarusian
State University) about using this software in the grid environment. Identification of engineering
software packages (UGS NX, ANSYS and LS-DYNA) and testing them in the grid. Involvement and
requirement identification of a new user community from the Institute of Heat and Mass
Transfer. Contributing to DNA3.1 "Application Identification and Collaboration Roadmap",
DNA3.2 "Analysis, supportive environment and testing of applications", DNA3.3 "Design and
deployment of SIG portals, Migrating Desktop, testbeds for application tests and launching",
and DNA3.4 "Collaboration structure and functionality for scientific/academic communities in
Baltic States and Belarus".
NA4: Participation in 14th EUGridPMA Meeting in Lisbon, PT. Regular participation in
EUGridPMA Discussion List on policy issues for grid certification authorities. Participation in
NA4 team meeting in Minsk at the 1st BalticGrid-II AHM. Contributing to DNA4.1 "Analysis
document on cooperation and good practice" and DNA4.2 "Progress report on awareness
raising".
SA1:
Central services VOMS, TopBDII, LFC and WMS are running in Belarus. The gLite site BY-UIIP has
been in production since May 2008. Provision of computing and storage resources to the BalticGrid
infrastructure. BY-UIIP site computing power has doubled since the start of the project. BY-UIIP
site storage capacity has increased to 7 TB. OS upgrade to SL4.7 and regular gLite
updates. gLite bug reports to developers. Running a functional software repository for
Scientific Linux and gLite. GGUS ticket solving. Wiki content accumulation at wiki.grid.by
for gLite deployers and site administrators in Belarus. Setting up the Belarusian Grid CA and
preparation of DSA1.1 “Setup of Belarusian CA”. Obtaining full accreditation for the CA
from EUGridPMA. Contributing to DSA1.2 "Central services in operation in all countries"
and DSA1.3 "SLA mechanism central services operational". Participation in weekly SA1
EVO meetings and CE ROC meetings. Installation of a Eucalyptus-driven BalticCloud instance.
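An accredited national Grid CA of the kind described above issues user and host certificates against signing requests. The OpenSSL sketch below shows what such a request looks like; the distinguished-name fields are invented for illustration and do not reflect the Belarusian CA's actual namespace or policies.

```shell
#!/bin/sh
# Hypothetical sketch: generate a key pair and a certificate signing
# request (CSR) of the kind submitted to a Grid certification authority.
# The subject DN below is illustrative only.

if command -v openssl >/dev/null 2>&1; then
    # 2048-bit RSA key; no passphrase here for brevity (real Grid user
    # keys should be passphrase-protected).
    openssl genrsa -out userkey.pem 2048

    # CSR with a Grid-style distinguished name.
    openssl req -new -key userkey.pem -out usercert_request.pem \
        -subj "/DC=org/DC=example-grid/O=users/CN=Jane Researcher"

    # Show what will be sent to the CA operators.
    openssl req -in usercert_request.pem -noout -subject
else
    echo "openssl not available"
fi
```

The CA operators verify the requester's identity according to their published CP/CPS before signing, which is the substance of the EUGridPMA accreditation mentioned above.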
SA2:
Creating and operating the National Network Control Center and the Incident Response Team.
Negotiations with Polish PTTs and the Belarusian PTT Beltelecom regarding the expansion of
the channel to GEANT up to 622 Mbps – 1 Gbps. Integration of the grid-cluster component
into the existing BASNET network infrastructure at a speed of 1 Gbps. Optimization and
expansion of the throughput capacity of the BASNET backbone segment up to 1 Gbps.
Installation and configuration of monitoring software (Nagios, Cacti). Designing a graphic
presentation of the grid segment within the structure of BASNET for the Gridimon network
monitoring portal. Risk analysis for the BASNET network to identify possible threats and types of
attacks. Participation in the development of the Grid AUP, Security Policy, and Incident Handling
and Response Policy. Contributing to all SA2 deliverables (DSA2.1–DSA2.6).
SA3:
Testing support portal and reviewing articles on user forum at the support portal. Preparing
and giving lessons for local grid users during BalticGrid-II Autumn Tutorial for Grid
Beginners. Expounding gLite abilities and practices to agents of industrial and scientific
establishments and involving them into the BalticGrid infrastructure. Establishing and
supporting common applications for gLite testbed in virtual environment as part of national
grid infrastructure. Participation in SA3 weekly EVO meetings. Contributing to DSA3.1
"Report on support specialist employment and training".
3.1.13. VGTU
NA2:
Organized the BalticGrid-II kick-off meeting and Grid Open Day in Vilnius, with invited
speakers from CSC (Finland), IBM, Sun and Intel. Participation in the 1st BalticGrid All-Hands
Meeting in Minsk. Participation in EGEE-08 in Istanbul and the EGEE UF'09/OGF-25 meeting
in Catania. Contribution to preparing materials for the BalticGrid-II movie
and the JRA1 brochure. Published one scientific paper and one popular-science paper,
prepared and defended an MSc thesis, gave 7 presentations at different locations about the
BalticGrid-II project and applications, and organized a seminar for new grid users from VGTU.
Participation in EGI PB meetings in Prague and Catania. VGTU local BG-II web site
updating. Report on monthly events.
NA3:
Gridification and deployment of the following engineering software: the CFD package
FEMTOOL (developed at VGTU), the particle dynamics software DEM_MAT (developed at
VGTU), and ANSYS (commercial software). Identification and analysis of distributed
computing possibilities of an optimization problem solver for structural mechanics based on
genetic algorithms. Provided support for engineering applications as a resource contributor,
along with personalized support. Contributing to deliverables DNA3.1 and DNA3.2.
SA1:
Installation of central BG-II services, i.e. WMS and top-BDII servers; installation of the CREAM
CE grid cluster cream.grid.vgtu.lt and the testbed test.ce.vgtu.lt for engineering applications
and visualization tools. Installing the UI gui.grid.vgtu.lt for testing visualization tools and the UI
ui.grid.vgtu.lt for grid users. Upgrading the VGTU grid cluster ce.grid.vgtu.lt. Nagios
monitoring tool deployment and use in the BG infrastructure. GGUS ticket solving.
Participation in the BalticCloud and BGi meeting in Tallinn. Installing and testing cloud
instance (IaaS) features based on Eucalyptus, Enomalism and OpenNebula. Deploying a cloud
service (PaaS) based on AppScale. Participation in weekly SA1 EVO meetings.
SA3:
Preparation of job submission scripts and SGM deployment scripts for the FEMTOOL and
DEM_MAT applications. A new VO, ANSYS, was established for ANSYS software users;
preparation of job submission scripts for ANSYS. Preparation of HDF5-format output of the
FEMTOOL and DEM_MAT applications for visualization tool testing. Participation in
weekly SA3 EVO meetings.
JRA1:
Investigation of existing visualization software and development of new visualization tools as
enhanced application services according to local user needs is the main goal of the JRA1
activity in VGTU. Performance analysis of different visualization tools based on the
Visualization Toolkit (VTK) was accomplished. Sequential and parallel visualization of actual
benchmarks based on pilot applications was performed. The results formed the basis for
DJRA1.1. The specialized grid visualization tool VisPartDEM was developed for distributed
visualization of large particle systems simulated by the discrete element method. Advanced
algorithms based on surface extraction and generation of Voronoi diagrams were implemented
in order to obtain a geometric representation of propagating cracks. The grid visualization
e-service VizLitG, designed for convenient access and interactive visualization of remote data
files located in Storage Elements, was adapted to the needs of pilot applications and employed
for visualization of scientific results computed in BalticGrid. The popular open-source software
ParaView was adapted for deployment in BalticGrid-II in order to provide local grid users with
enhanced application services aimed at distributed visualization of large datasets. The
developed software was deployed on the BalticGrid-II infrastructure. The results formed the
basis for DJRA1.2, DJRA1.3 and DJRA1.4. Flexibility assessment and efficiency analysis of
the developed and deployed software was performed. Results were contributed to DJRA1.5.
3.2. BUDGETED COSTS AND ACTUAL COSTS
The budgeted costs and actual costs for all partners are presented in Table 1. The numbers cover
only Q1-Q3, i.e. up to 31 January 2009. Some of these numbers may still be subject to minor changes.
Note: when comparing % of budget, please note the differences in total contribution between partners.
Table 1. Estimated eligible costs (EUR), Q1+Q2+Q3, and % of budget per partner.

Org. short name   Cost item                  Q1+Q2+Q3 (EUR)   % of budget
KTH               Direct cost                   108866.93        106.16%
                  of which subcontracting               0          0.00%
                  Indirect costs                 54768.78        102.03%
                  Total EU contribution         163635.71        104.74%
EENET             Direct cost                    75999.61         83.85%
                  of which subcontracting               0          0.00%
                  Indirect costs               19713.8286         92.63%
                  Total EU contribution        95713.4386         85.52%
KBFI              Direct cost                    75588.98        100.43%
                  of which subcontracting               0          0.00%
                  Indirect costs                 38042.86         99.19%
                  Total EU contribution         113631.84        100.01%
IMCS UL           Direct cost                    35515.54         50.45%
                  of which subcontracting               0          0.00%
                  Indirect costs               16517.7121         50.59%
                  Total EU contribution        52033.2521         50.50%
IFJ PAN           Direct cost                    50607.43         85.75%
                  of which subcontracting               0          0.00%
                  Indirect costs                 16033.47         94.77%
                  Total EU contribution           66640.9         87.76%
PSNC              Direct cost                   42292.555         73.03%
                  of which subcontracting               0          0.00%
                  Indirect costs                 24588.74         73.26%
                  Total EU contribution         66881.295         88.14%
VU                Direct cost                   83085.625        123.97%
                  of which subcontracting               0          0.00%
                  Indirect costs               26806.0175         91.55%
                  Total EU contribution       109891.6425        117.83%
RTU               Direct cost                 45623.16256         80.19%
                  of which subcontracting               0          0.00%
                  Indirect costs              6414.886767         29.34%
                  Total EU contribution       52038.04933         66.07%
ITPA              Direct cost                    55156.46         99.90%
                  of which subcontracting               0          0.00%
                  Indirect costs               20961.4108        100.52%
                  Total EU contribution        76117.8708        100.07%
CERN              Direct cost                    43976.65        101.40%
                  of which subcontracting               0          0.00%
                  Indirect costs                 26385.99        101.40%
                  Total EU contribution          70362.64        101.40%
NICH BNTU         Direct cost                    43431.89        119.18%
                  of which subcontracting               0          0.00%
                  Indirect costs               13854.2695        103.40%
                  Total EU contribution        57286.1595        114.94%
UIIP NASB         Direct cost                    43136.88        100.83%
                  of which subcontracting               0          0.00%
                  Indirect costs               19807.8616        104.20%
                  Total EU contribution        62944.7416        101.87%
VGTU              Direct cost                   36823.605         84.85%
                  of which subcontracting               0          0.00%
                  Indirect costs                20193.175        105.54%
                  Total EU contribution          57016.78         97.50%
3.2.1. BUDGETED PERSON MONTHS AND ACTUAL PERSON MONTHS
Budgeted person months and actual person months for the period 1 May 2008 to 31 March 2009 are
presented in the table below.
Note: when comparing % of budget, please note the differences in total contribution between partners.
Used and planned PMs for the period 1 May 2008 to 31 March 2009.
Used PM and Planned PM per activity, summed over all partners:

Activity   Used PM   Planned PM   Used as % of planned
NA1            7.2          9.2          78
NA2           23.4         20.3         115
NA3           31.3         31.3         100
NA4            9.8          9.0         109
SA1           79.3         68.9         115
SA2           24.4         20.7         118
SA3           43.9         41.8         105
JRA           13.1         14.5          90
Total        233.4        216.4         108

[The per-partner PM breakdown for KTH, EENet, KBFI, IMCS UL, IFJ PAN, PSNC, VU, RTU,
ITPA, CERN, NICH BNTU, UIIP NASB and VGTU could not be recovered from the source
extraction.]

Total Used PM vs Planned PM, first year of project up till March (PM11).
Note: when comparing % of budget, please note the differences in total contribution between partners.
3.3. REBUDGETING
One of the partners in the consortium (RTU) had used the wrong cost model in its calculations,
resulting in a funding surplus totalling 35,990 EUR. This surplus was re-budgeted within the
consortium – prepared by the PMB and accepted by all partners through the project Executive Board
(EB). See Appendix 3.2 for more details.
4. ANNEX
4.1. USE OF FOREGROUND AND DISSEMINATION ACTIVITIES DURING THE FIRST
YEAR OF THE PROJECT
The text below is an update of the initial plan in Annex I for the use and dissemination of foreground.
The knowledge of the Partners, and the information gathered and experience gained during the
BalticGrid project, have been used to support the dissemination and outreach tasks in the
BalticGrid-II project.
A number of Project activities aim at disseminating Project results. The following actions were
taken by NA2 to achieve this goal during the first 12 months of the Project:
 Promoting Project results, achievements and other information aimed at the general public
through the main Project web site,
o The BG-II web site was prepared just after the Project start; information on Project
results and achievements is placed, among others, in the ‘News’, ‘Deliverables’ and
‘Related’ sections of this web site
 Spreading information on upcoming and past events – those organized in the framework of
the Project and other related ones:
o through the BG-II main web site and Partner local ones
o through the presentations at various Grid events
o during local meetings and discussions
 (Co-)organizing public events such as scientific Grid conferences and workshops, Grid Open
Days (accompanying Project conferences organized in the Partners’ countries), etc.
o Grid Open Days in Vilnius and Minsk
o Kick-off meeting in Vilnius and 1st BG-II AHM in Minsk
o The SSA'2008 conference in Minsk
o SKIF-Grid Program Workshop in Minsk
o Cracow Grid Workshop
o workshop on the BG-II infrastructure for computational meteorology in Tallinn
o 67th conference of the University of Latvia in Riga
 Organizing training events – to spread the knowledge to a wide community from the Baltic
States and Belarus:
o in October-November 2008 – in Minsk (with 30 participants)
o in November 2008 – in Riga (20) and in Vilnius (16)
o in December 2008 – in Panevezys (20) and in Riga (2)
• Organizing seminars on Grid research, intended for groups of researchers from various fields. Examples of Grid seminars delivered in the Baltic Countries:
o lectures (courses) for MSc students concerning various aspects of Grid computing
o seminars presenting Grid computing and Cloud computing, as well as the BalticGrid-II project, to academic communities and to potential users of BG-II products (together with advice on how to get a certificate and start using the Grid)
o seminars for teachers of secondary schools with information on HPC systems, the BalticGrid-II and LitGrid projects, and on how to become a Grid user
o seminars for pupils on Grid clusters and IT activities at Partner institutions
• Publishing dissemination materials, such as presentations, posters and brochures – a number of promotional materials were published and/or distributed:
o at various Grid events (Grid Open Days, workshops, conferences)
o at the Project exhibition stand and demo booths during the EGEE’08 conference, EGEE UF and ICT’08 events
o online – most of these materials are available in the ‘Downloads’ section of the BG-II main web site.
These materials included:
o BG-II fact sheet, general Project brochure (two issues) and several technical ones
o general Project poster and several technical ones
o customer feedback form
o event evaluation form
o three demos of BG-II applications
o BG-II movie (two issues)
o promos: a pen, a calendar and a business card
• Publishing scientific papers on results of using the BalticGrid-II infrastructure
o Many papers were published in conference proceedings and scientific journals – information on them is available in the ‘Scientific Articles’ section of the BG-II main web site and below (Event date / journal publication date; Name of conference / workshop / journal of submission; Authors; Title):
- May 2008, Proceedings of the Scientific-practical Conference “Sea and coast research-2008”, Klaipeda, 2008, pp. 26–27, I. Dailidiene, P. Zemlys, Modeling of pollution drifts in the Baltic Sea near the Lithuanian seacoast
- 13.05.2008, Proc. of the 12th International Conference on Complex Information Protection, Yaroslavl (Russia), pp. 10–13, 2008, S. Ablameyko, S. Abramov, U. Anishchanka, A. Krishtofik, A. Moskovskiy, O. Tchij, Building the Belarusian-Russian Grid-network pilot segment
- 13.05.2008, Proc. of the 12th International Conference on Complex Information Protection, Yaroslavl (Russia), pp. 14–23, 2008, S. Ablameyko, S. Abramov, U. Anishchanka, A. Krishtofik, Integration of Resources for the “SKIF-Grid” Program
- 13.05.2008, Proc. of the 12th International Conference on Complex Information Protection, Yaroslavl (Russia), pp. 27–30, 2008, U. Anishchanka, A. Krishtofik, O. Tchij, Building grid infrastructure “SKIF” in the Republic of Belarus
- 13.05.2008, Proc. of the 12th International Conference on Complex Information Protection, Yaroslavl (Russia), pp. 114–116, 2008, A. Krishtofik, U. Anishchanka, Security Provision of the National Grid-network
- 4.06.2008, Computer Physics Communications, A. Deveikis, A. Juodagalvis, A computer program for two-particle generalized coefficients of fractional parentage
- 23.06.2008, Proceedings of the International workshop NDTCS'2008, J. Tamulienė, G. Badenes, R. Vaišnoras, M. Leonas Balevičius, L. Rasteniene, Study of the single Con (n = 6, 8, 10, 12, 14) nanoparticles
- July 2008, Rīgas Tehniskās Universitātes Zinātniskie Raksti, Būvzinātne, Sērija 2, Sējums 9, Rīga, RTU 2008, 13 lpp., K. Kalniņš, G. Jēkabsons, R. Beitlers, O. Ozoliņš, Optimal design of fiberglass panels with physical validation
- July 2008, Proceedings of the 26th International Congress of the Aeronautical Sciences, K. Kalnins, C. Bisagni, R. Rikards, E. Eglitis, P. Cordisco, A. Chate, Metamodels for the optimization of damage-tolerant composite structures
- September 2008, Lithuanian Journal of Physics, A. Kynienė, V. Jonauskas, S. Kučas, R. Karazija, On the existence of dipole satellites in the region of M2,3-L2,3 non-dipole emission lines for transition elements
- September 2008, 1st international scientific conference “Nanostructured materials – 2008: Belarus-Ukraine-Russia” NANO-2008, 22–25 April 2008 – additional volume, V. V. Barkaline, A. S. Krasko, P. A. Zhuchek, A. S. Chashynski, S. V. Grigorjev and P. Goranov, Receiving the solution of carbon nanotubes by pulsed electrical discharge in water and the problem of its chemical functionalization
- 27.09.2008, Scientific services in Internet: solution of big problems. Proceedings of the all-Russia conference (22–27 September 2008), Novorossiysk, Moscow: MSU Publishing House, 2008, 468 pp., pp. 101–105, V. V. Barkaline, S. V. Medvedev, V. V. Nelayev, P. A. Sluchak, S. N. Yurkevich, Hierarchical system of the physical processes and material properties on the base of supercomputer configuration K1000
- October 2008, Proceedings of the 3rd International workshop “Intelligent Technologies in Logistics and Mechatronics systems”, J. Tamulienė, R. Vaišnoras, G. Badenes, L. M. Balevičius, Co6Om (m = 1 … 7) Nanoparticles Geometric and Electronic Structure Changes Within Increasing of Oxygen Number
- 15-17.10.2008, VIII International seminar “Methodological aspects of scanning microscopy – 2008”, V. V. Barkaline, A. S. Chashynski, P. A. Zhuchek, S. A. Chizhik, MD modeling of a sounder on the base of CNT
- 27.10.2008, Journal of Physics B: Atomic, Molecular and Optical Physics, V. Jonauskas, S. Kučas, and R. Karazija, The essential role of many-electron Auger transitions in the cascades following the photoionization of 3p and 3d shells of Kr
- 3-5.11.2008, II Congress of Belarusian physicists, Ihar A. Miklashevich, Development of GRID technologies in borders of BalticGRID II project
- 5.11.2008, Proceedings of the VII International conference “Development of informatization and systematization of scientific and technical information”, S. V. Ablameyko, U. V. Anishchanka, A. M. Krishtofik, Creation of grid infrastructure in the Republic of Belarus
- 5.11.2008, Proceedings of the VII International conference “Development of informatization and systematization of scientific and technical information”, S. V. Ablameyko, U. V. Anishchanka, A. M. Krishtofik, Development of grid experimental zone
- 5.11.2008, Proceedings of the VII International conference “Development of informatization and systematization of scientific and technical information”, S. V. Ablameyko, S. M. Abramov, U. V. Anishchanka, A. M. Krishtofik, A. A. Moscovskiy, “SKIF-Polygon” – the first step on the way to the creation of a uniform information field of the Union state
- 5.11.2008, Proceedings of the VII International conference “Development of informatization and systematization of scientific and technical information”, U. V. Anishchanka, S. A. Aneichyk, Status and development prospects of the academic network BASNET
- 6.11.2008, Proceedings of the 12th International Electronic Conference on Synthetic Organic Chemistry (ECSOC-12), A. Vektariene, G. Vektaris, J. Svoboda, Theoretical study of the mechanism of thieno[3,2-b]benzofuran bromination
- 3-5.12.2008, RUSNANOTECH 2008, International forum for nanotechnology, 3–5 December 2008, A. S. Basaev, V. A. Labunov, V. V. Barkaline, I. A. Taratyn, A. M. Tagachenkov, Carbon nanotubes ordered arrays based nanoelectromechanical systems for ultra high frequency and sensor devices
- 16.01.2009, Dynamic and Technological Problems of Mechanics of Continuum and Structure, XV International Symposium dedicated to Anatoly G. Gorshkov, Yaropolets, Moscow, February 16–20, 2009, V. Barkaline, P. Zhutchak, Hierarchical approximation to the mechanical properties of the avia-parts modelling
- 20.01.2009, Seventh International Conference on Composite Science and Technology, January 20–22, 2009, I. A. Miklashevich, Discrete Crack Propagation and Composites Delamination
- 6-7.02.2009, Application of the computational mechanics methods to engineering, science, education, XXXIX Republic seminar, Minsk, Belarus, I. A. Miklashevich, A. S. Chashynski, Ya. V. Douhaya, Grid technology in resource-intensive computational problems
- 6-7.02.2009, Application of the computational mechanics methods to engineering, science, education, XXXIX Republic seminar, Minsk, Belarus, V. V. Barkaline, P. A. Zhuchak, and I. A. Miklashevich, Modeling of the aviation components with the help of the LS-DYNA package on the SKIF-K1000 cluster
- March 2009, Proceedings of the 4th International Conference on Availability, Reliability and Security (ARES/CISIS 2009), Fukuoka, Japan, March 2009, M. Ahsant, E. Tavalera Gonzalez, J. Basney, Security Credential Mapping in Grids
- March 2009, Proceedings of the 4th International Conference on Availability, Reliability and Security (ARES/CISIS 2009), Fukuoka, Japan, March 2009, M. Ahsant, J. Basney, Workflows in Dynamic and Restricted Delegation
- April 2009, Proceedings of the 1st International Conference on Parallel, Distributed and Grid Computing for Engineering (Eds. B.H.V. Topping and P. Iványi), ISBN 978-1-905088-28-7, Civil-Comp Press, 2009, pp. 1–18, A. Kačeniauskas, R. Kačianauskas, A. Maknickas and D. Markauskas, Computation and Visualization of Poly-Dispersed Particle Systems on gLite Grid
• Attracting the attention of the local media in the Partners’ countries by announcing Project achievements and highlights
o Several newspaper articles, radio interviews and TV programs concerning BalticGrid-II were prepared and published – information on them is available in the ‘Press and Media’ section of the BG-II main web site and below:
- “Belarus – in Grid-society”, newspaper “Vedu” (“Knowledge”), May 2008,
- “BASNET is accepted for TERENA”, newspaper “Vedu” (“Knowledge”), June 2008,
- “VGTU – BalticGrid-II projekto partneris” (“VGTU – a partner of the BalticGrid-II project”), newspaper “Gedimino Universitetas”, July 2008,
- “SKIF spreads its nets”, newspaper “Union assembly”, July 2008,
- “A <grid> for supercomputer”, newspaper “Vedu” (“Knowledge”), August 2008,
- “Science is becoming more and more international…”, part of “European Grid Initiative: towards a sustainable long-term European grid infrastructure” – the September 2008 issue of the “Grid Talk, Grid Briefings” brochure,
- “Virtual tour: BalticGrid-II” – a video interview with Ake Edlund (PD), Grid Talk web site, September 2008,
- “Achieving Global Science: Grid-powered solutions from EGEE and Collaborating Projects” (the joint publication of EGEE and Collaborating Projects), EGEE booklet (distributed online and during the EGEE’08 conference, Istanbul, Turkey), September 2008,
- “Academic networks and digital divide”, magazine “Sakaru pasaule” (“Connection world”), October 2008,
- “Pasaulis – tai superkompiuteris” (“The world is a supercomputer”), journal “Naujoji komunikacija” (“New communication”), October 2008,
- “An interview on BalticGrid-II” (a radio interview with Ake Edlund, in English), Radio Belarus, October 2008,
- “An interview on BalticGrid-II” (a radio interview with Robert Pajak, NA2 leader, in Polish), Radio Belarus (Polish division), October 2008,
- “Supercomputer technologies will be available to users of the regional centres of Belarus”, “BELTA” – National Mass Media Information Agency electronic newspaper, November 2008,
- “A step into the future of IT”, newspaper “Vedu” (“Knowledge”), November 2008,
- “BELIEF at ICT2008 International Village Global e-Infrastructure” – a press release prepared by the BELIEF-II project members to promote the joint stand of BG-II and several other EU e-Infrastructure projects at the International Village at ICT 2008 in Lyon, published on the BELIEF-II and BG-II web sites, November 2008,
- “Supercomputer technologies for the masses”, newspaper “Vedu” (“Knowledge”), November 2008,
- “Internet is gridifying” (“Interneti tulevik on voreline”), popular science magazine “Tarkade Klubi”, December 2008,
- “First two quarters of BalticGrid-II project”, EENet's newsletter No 7, December 2008,
- “Negali būti!” (“It can’t be true!”) – a TV program shown on “LTR”, the Lithuanian national radio and television channel, December 2008,
- “Union” – a TV magazine shown on “ONT”, a Belarusian TV channel, January 2009,
- “BalticGrid-II” (an interview with Henryk Palka from IFJ PAN), “Sukcesy Małopolski w 7. Programie Ramowym” (“The achievements of the Malopolska voivodeship in the 7th FP”) – a publication elaborated by the Technology Transfer Centre of the Krakow University, January 2009,
- “An interview on grid computing” (a radio interview with Anatoliy Krishtofik from UIIP NASB), Belarusian Radio, February 2009,
- “An interview on grid technologies in Belarus for national weekly news broadcast” (a TV interview with Anatoliy Krishtofik from UIIP NASB), “ONT”, a Belarusian TV channel, February 2009,
- “Comparing protein structure using a computer grid” – a video interview with Edgars Znots (IMCS UL), Grid Talk web site, March 2009,
- “See inside the stars” – a video interview with Sarunas Mikolaitis (ITPA VU), Grid Talk web site, March 2009,
- “A section about the signing of the MoU between the BG-II and EGEE projects”, part of the March 2009 edition of the EGEE director's letter (distributed to the all-EGEE-members mailing list), March 2009,
- “Kaip gali kompiuteris būti debesyse?” (“How can a computer be in the clouds?”), Journal of Vilnius University “SPECTRUM”, Vol. 10, March 2009,
- “An interview about RTU participation in BalticGrid-II” (a TV interview with Ilmars Slaidins and Lauris Cikovskis), “LTV7”, a Latvian TV channel, March 2009,
- “TV reportage on supercomputer and grid technologies in Belarus” (a TV program with participation of Viktor Evtuhov, Sergey Abramov, Uladzimir Anishchanka, Anatoly Krishtofik), “ONT”, a Belarusian TV channel, March 2009,
- The press conference organized to cover the state and prospects of grid and supercomputer technologies in Belarus and abroad; the following media representatives participated in this event: newspaper “Vedy”, newspaper “7 Days”, “BelTA” Agency, TV magazine “Soyuz”, radio station “Belarus”, newspaper “Republic”, the ONT TV channel, the National TV channel and the Metropolitan TV channel, March 2009.
- “Integration makes leaders”, newspaper “Vedu” (“Knowledge”), April 2009,
- “To create something unbuyable”, newspaper “Republic”, April 2009.
o The audiences were also contacted through:
- seminars and tutorials,
- presentations,
- brochures, posters and movies,
- All-Hands meetings, Grid Open Days, workshops and other meetings and discussions.
To help reach the Project’s target audiences and to support the NA2 team in achieving the dissemination objectives, the dissemination efforts need to be evaluated. The following measures are used for this evaluation:
Measure name | Value achieved in the first year of the Project
Number of events organized by the Project (Grid Open Days, conferences and workshops) | 9
Number of participants in events organized within the framework of the Project | 630
Number of presentations on the Project results given at various Grid events | 66
Number of page views of the Project web site | 218,852 (June 2008 – April 2009)
Number of published articles | 20
Number of scientific papers | 29
A description of all NA2 activities performed in the first year of the Project is available in the DNA2.3 Dissemination Report (P12).
4.2. REBUDGETING
One of the partners in the consortium (RTU) had used the wrong cost model in its calculations, resulting in a surplus of funding. This surplus was re-budgeted within the consortium – prepared by the PMB and accepted by all partners through the project Executive Board (EB). A more detailed distribution at activity level is being made. The motivation for the rebudgeting is attached below.
Re-budgeting of BG-II
March 23, 2009
Aake Edlund
PD of BG-II, on behalf of the project partners
Background
As discussed at the AHM meeting in Minsk in October 2008, a re-budgeting of BG-II was to be decided. The PD was to give a suggestion to the EB chair, for further approval by the partners.
The total amount for redistribution is 35,990 EUR, mainly aimed at an additional task, something that will also be of use after the end of the BG-II project.
Additional task – the BalticGrid Innovation Lab (BGi), including the establishment of the Baltic Cloud.
The BalticGrid Innovation Lab (BGi) is a program for early stage companies in our region, based on a hands-on
course in grid and cloud computing. This course is prepared and run by BG-II, on BG-II resources (through the
Baltic Cloud) as well as on donated resources from other parties.
The aim of the course is to help early-stage high-tech Internet-based companies to try their services on new platforms, resulting in early proofs of concept and later exploitation of grid and cloud in the region. On top of the course we will build a network of innovative companies in the region. We will also pursue the usage of the Baltic Cloud by researchers in our BalticGrid network.
The budget for the establishment of the BGi is:
i) 30,000 EUR to establish and run the BG Innovation Lab (BGi). This will involve all partners and activities in the long run (in the short run mostly SA3 and NA3). 1,000 EUR of this amount will be used by NA2 for marketing BGi and the Baltic Cloud.
In addition we propose to allocate:
ii) 5,000 EUR to NA2, and
iii) 990 EUR to NA1 in support of EAB travel expenses.
Country | Partner | Task | Amount (EUR)
EST | KBFI | BGi [1] | 6,000
EST | KBFI | BGi coordination [2] | 5,000
LT | VGTU | BGi [3] | 6,000
LV | RTU | BGi [4] | 6,000
BY | UIIP NASB | BGi [5] | 6,000
SE | KTH | NA1, EAC costs | 990
PL | IFJ PAN | NA2 [6] | 6,000
SUM | | | 35,990
AVAILABLE | | | 35,990
[1, 3, 4 and 5] Setting up a cloud instance, connecting it with the Baltic Cloud, and preparing and running the 2-day BGi course.
[2] Creating the BGi course. Pioneering cloud installations and sharing the results with all partners. Coordinating the Baltic Cloud during BG-II; this coordination is then rotated between the Baltic States and Belarus.
[6] Thanks to this additional money BG-II, through NA2, will be able to participate (2 persons) with a BG-II stand in at least one additional EGEE event, probably the last one (EGEE UF) in spring next year. 1,000 EUR of this amount will be used by NA2 for marketing BGi and the Baltic Cloud.
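As a sanity check, the redistribution in the table above can be verified against the available surplus. The short Python sketch below is purely illustrative (the labels are informal partner/task pairs taken from the table, not project tooling); it confirms that the per-partner amounts sum exactly to the 35,990 EUR available.

```python
# Cross-check of the BG-II re-budget allocations listed in the table above.
# Labels are informal "Partner (Task)" strings for illustration only.
allocations = {
    "KBFI (BGi)": 6000,
    "KBFI (BGi coordination)": 5000,
    "VGTU (BGi)": 6000,
    "RTU (BGi)": 6000,
    "UIIP NASB (BGi)": 6000,
    "KTH (NA1, EAC costs)": 990,
    "IFJ PAN (NA2)": 6000,
}

available = 35990  # surplus available for redistribution (EUR)
total = sum(allocations.values())

# The redistributed sum must exactly match the available surplus.
assert total == available, f"mismatch: {total} != {available}"
print(f"Total redistributed: {total} EUR")  # prints "Total redistributed: 35990 EUR"
```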
4.3. LIST OF ACRONYMS
AHM – All-Hands Meeting
BG – BalticGrid
EENet – Estonian Educational and Research Network, Tartu, Estonia
EGEE – Enabling Grids for E-sciencE, EU project
HEP – High-Energy Physics
ICFA – International Committee for Future Accelerators
IMCS UL – Institute of Mathematics and Computer Science, University of Latvia, Riga, Latvia
IFJ PAN – Institute of Nuclear Physics, Polish Academy of Sciences, Kraków, Poland
ITPA – Vilnius University Institute of Theoretical Physics and Astronomy, Lithuania
KTH – Royal Institute of Technology, Stockholm, Sweden
LHC – Large Hadron Collider
LHCb – Large Hadron Collider beauty experiment
NA1 – Networking Activity 1: Management of I3
NA2 – Networking Activity 2: Education, Training, Dissemination and Outreach
NA3 – Networking Activity 3: Application Identification and Support
NA4 – Networking Activity 4: Policy and Standards Development
NICPB – National Institute of Chemical Physics and Biophysics, Tallinn, Estonia
NREN – National Research and Education Network
PMB – Project Management Board
PSNC – Poznan Supercomputing and Networking Center, Poznan, Poland
RTU – Riga Technical University, Riga, Latvia
SA1 – Specific Service Activity 1: Grid Operations
SA2 – Specific Service Activity 2: Network Resource Provisioning
SICS – Swedish Institute of Computer Science
VU – Vilnius University, Vilnius, Lithuania
WLCG – Worldwide LHC Computing Grid