EMI INFSO-RI-261611
TSA2.2 - Quality Assurance Process Definition and Monitoring
Define standards-compliant software engineering process and guidelines
Continual activity of monitoring its application
TSA2.3 - Metrics Definition and Reporting
Definition, collection and reporting of software quality metrics
Reports information on the status of the software to take corrective actions
TSA2.4 - Tools and Repositories, Maintenance and Integration
Definition and maintenance of tools required to support QA process
Supporting activity to software providers to integrate external tools
Repositories for the EMI software packages, tests, build and reports
TSA2.5 - QA Implementation Review and Support
Review activities of the QA, test and certification implementations.
Sample reviews of test plans, compliance, porting guidelines, documentation, etc.
Supporting the Product Teams in implementation of tests and use of testing tools
TSA2.6 - Testbeds Setup, Maintenance and Coordination
Setup and maintenance of distributed testbeds for continuous integration testing
Coordination and provision of larger-scale testbeds from collaborating providers
22-24/11/2010 EMI AHM Prague - SA2 Overview - A.Aimar
Deliverables

DSA2.1 - Quality Assurance Plan
  Author: M. Alandes (CERN)
  Reviewers: G. Fiameni (SA1/CINECA), A. Ceccanti (JRA1/INFN)
  Status: Completed

DSA2.2 - QA Tools Documentation
  Authors: L. Dini (CERN), R. Rocha (SA1/CERN)
  Reviewers: TBD
  Status: In Preparation

DSA2.3 - Periodic QA Reports
  Author: M. Alandes (CERN)
  Reviewers: C. Cacciari (SA1/CINECA), M. Riedel (JRA1/JUELICH)
  Status: Completed

DSA2.4 - Continuous Integration and Certification Testbeds
  Author: D. Dongiovanni (INFN)
  Reviewers: Morris Riedel (JRA1/JUELICH), Balazs Konya (TD/LU), Oliver Keeble (SA1/CERN)
  Status: Under Review
Milestones

MSA2.1 - Software development tools and software repositories in place
  Author: L. Dini (CERN)
  Status: Completed

MSA2.2 - Continuous integration and certification testbeds in place
  Author: D. Dongiovanni (INFN)
  Status: Under Review

MSA2.3 - Large-scale acceptance certification testbeds are in place
  Author: D. Dongiovanni (INFN)
  Status: Under Review
TSA2.2 - Quality Assurance Process Definition and Monitoring
Definition of the SQAP (03.09.2010) http://cdsweb.cern.ch/record/1277599?ln=en
Contains the definition of the documentation, processes and responsibilities relevant to the
SW lifecycle.
QA periodic report (18.10.2010) http://cdsweb.cern.ch/record/1277600?ln=en
Reports about the status of the guidelines and the project documentation relevant to the SW lifecycle.
Guidelines to support JRA1 and SA1 in the different stages of the SW lifecycle have been defined.
They are included as satellite documents of the SQAP and will be integrated in future versions of the
SQAP:
● Configuration and Integration
● Packaging
● Releasing
● Change Management
● Metrics Generation
● Certification and Testing
TSA2.3 - Metrics Definition and Reporting
Process Metrics (Bug-tracking related)
Related to priority, severity, open/closed bugs
Quality in Use Metrics (Bug-tracking/EGI)
3rd level GGUS related metrics (KPIs for SA1)
Product Metrics (Software related)
Unit tests, supported platforms, bug density
Static Analysis Metrics (Static analysers)
Cyclomatic Complexity, Code commenting, etc.
Optional Metrics (Valgrind/Helgrind)
Thresholds in many cases don’t make sense yet!
Why?
Some product teams start off with good statistics, others with poor ones
Static metrics are well supported only for some programming languages
We expect improving trends...
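The process metrics above (open/closed bugs grouped by priority and severity) reduce to simple counting over exported tracker records. A minimal sketch, assuming hypothetical field names rather than the actual EMI bug-tracking schema:

```python
from collections import Counter

# Hypothetical bug records as exported from a tracker; the field names
# ("priority", "status") are illustrative, not the real EMI schema.
bugs = [
    {"id": 1, "priority": "high", "status": "open"},
    {"id": 2, "priority": "high", "status": "closed"},
    {"id": 3, "priority": "low", "status": "closed"},
    {"id": 4, "priority": "medium", "status": "open"},
]

def process_metrics(records):
    """Count bugs by status, and by (priority, status) pairs."""
    by_status = Counter(b["status"] for b in records)
    by_priority = Counter((b["priority"], b["status"]) for b in records)
    return by_status, by_priority

by_status, by_priority = process_metrics(bugs)
print(by_status["open"])  # → 2
```

Because thresholds are not yet meaningful, such counts would be reported per period so that trends, rather than absolute values, can be compared.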
TSA2.3 - Metrics Definition and Reporting
Why bug-tracking info in XML?
Different bug-tracking tools:
  ARC → Bugzilla
  dCache → RT/Review Board
  gLite → Savannah
  UNICORE → Sourceforge
We want a tool that uses all bug-trackers
We need a common interface
Achievements:
Bug Listing XSD, Bug Mapping XSD, XML examples and Python sample scripts defined for extracting bug-tracking data
Validated UNICORE and gLite XML listings
Prototype ETICS plots/statistics for process and static analyser metrics
Pseudo code made available to SA2.4 to produce metric calculations
Pending:
2 of 4 GGUS 3rd-level user incident metric reports are still to be produced
Awaiting dCache-BugListing.xml and ARC-BugListing.xml for combined bug-tracking metrics
Metrics may need to evolve over time to track interoperation & reusability
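The common-interface idea is that each tracker's export is normalised into one XML vocabulary that downstream metric scripts can consume regardless of origin. A minimal sketch of such a consumer, using an invented element layout (the real Bug Listing XSD defines its own names):

```python
import xml.etree.ElementTree as ET

# Illustrative bug-listing document; element and attribute names here are
# assumptions for the example, not the actual Bug Listing XSD.
LISTING = """\
<bugListing middleware="gLite">
  <bug id="101" priority="high" status="open"/>
  <bug id="102" priority="low" status="closed"/>
</bugListing>
"""

def parse_listing(xml_text):
    """Return (middleware name, list of bug dicts) from a listing document."""
    root = ET.fromstring(xml_text)
    bugs = [dict(bug.attrib) for bug in root.findall("bug")]
    return root.get("middleware"), bugs

middleware, bugs = parse_listing(LISTING)
print(middleware, len(bugs))  # → gLite 2
```

Once every tracker (Bugzilla, RT, Savannah, Sourceforge) is mapped onto such a listing, the same metric scripts run unchanged over all four.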
TSA2.4 - Tools and Repositories, Maintenance and Integration
Achievements
• Common tools and processes agreed
• ETICS running with no downtime
• 4 ETICS releases in 6 months with many improvements for EMI
• Build infrastructure working 24/7
  – Currently 4 EPEL WN and 3 builds daily
• JIRA, Savannah and the EMI Repository are reliable services
• Integration and Configuration support to EMT, SA1 and PTs
  – Configuration and Integration task forces
  – Configuration cloner and cleaner
  – Externals installed and added to ETICS
  – Integration and Configuration Guidelines
• Plugins released/fixed as planned
• Report generator work in progress
• No complaints received
22/11/2010 EMI All Hands Meeting 2010 - November 22-24 Prague
TSA2.4 - Tools and Repositories, Maintenance and Integration
Issues
• EMI start-up required more resources than allocated in the DoW
• Resources will reduce to normal over time, to just keep the services stable
• Virtual infrastructure not 100% reliable
  – Free and old VMware used
  – Condor
  – Many problems with old platforms

Next steps
• New ETICS client to be released
  – Default OS version dependencies, bugfixes
• Metrics support
  – Plugins and Report Generator
• Infrastructure reliability
  – Move to new virtualization technology
  – New monitoring system under construction
• Performance, job queues and cancelling
• More PTs involved in nightly tests?
TSA2.6 - Testbeds Setup, Maintenance and Coordination
https://twiki.cern.ch/twiki/bin/view/EMI/TestBed#Testbed_HOWTO
HW Resources:
  ARC: 28 instances covering the production version and some RC versions
  gLite: 37 instances covering the production version and some RC versions
  UNICORE: 5 instances covering the production version, no RC
  dCache: DESY volunteers for SA2.6 (thanks!)
Monitoring: Nagios checks only that HW is up and reachable; no service probes yet
Testers VO: testers.eu-emi.eu (subscribe to it)
Support: GGUS, with an EMI-Testbed Support Unit (best effort within 48 hours: SA2.6 has just 2-3 FTE across 5 sites)
Documentation: https://twiki.cern.ch/twiki/bin/view/EMI/TestBed
Testbed News/Warning
Testbed Overview
Basic HOWTO
Resources Description: EMI components coverage table, inventory of instances with logbooks
TSA2.6 – Testbeds setup, Maintenance and Coordination
Large Scale Acceptance Testbed
To distinguish it from the EGI Staged Rollout, we should define specific test cases:
How large is large enough?
How do we want to stress the service?
Which particular combination of services do we need to test?
Proposed implementation (work in progress): a specific testbed set up according to a given test definition, involving users strongly motivated to test the product, to the point of providing HW and collaborative effort. How do we find them?
– Friendly NGI close to PTs?
TSA2.6 - Testbeds Setup, Maintenance and Coordination
Cross Middleware Integration:
● Common Authentication Framework for all Services
● Resource discovery/publishing across middlewares
HW Dimensioning proportional to:
  N° Releases * (RC+PROD) * N° Services * N° Platforms * (some redundancy factor)
Future Issues:
Automation of testing (tests as a post-build step..) => automation of testbed setup