
All Architecture Evaluation is not the Same
Lessons Learned from more than 50 Architecture Evaluations in Industry
Matthias Naab
May 2, 2013
SATURN 2013, Minneapolis
Fraunhofer IESE
Architecture Evaluation is a Mature Discipline …
… but only a few Practical Examples Published
Fraunhofer IESE
> 50 Architecture Evaluations
2004 - 2013
Goals of this Talk
Complement literature
Share experiences with other practitioners
Give empirical evidence
Countries of the Companies Involved
Industries Involved
Systems under Evaluation: Code Base
Size: 10 KLoC – 10 MLoC
Languages: Java, C, C++, C#, Delphi, Fortran, Cobol, Gen
Age: 0 – 30 years
Initiation of Architecture Evaluations
Initiator of Architecture Evaluation
Initiator in the same company: Top management, Development management, Development team, Method support group
Initiator in another company: Current customer, Potential customer, Disappointed customer, Cautious customer
Typical Goals and Evaluation Questions
How adequate are the solutions for my requirements?
What is the impact of changing to a paradigm such as SOA, Cloud, OO, …?
How different are two architectures?
How feasible is the migration path?
How can our system be modularized?
How compliant is our solution to a reference architecture?
How adequate is our architecture as a basis for our future product portfolio?
How can we improve performance, reuse, maintainability, …?
Which framework / platform / technology fits our needs best?
Initial Situation of Architecture Evaluation
Two dimensions: project type (evaluation only vs. evaluation & improvement) and criticality (project "on plan" vs. project "out of hand")
Project "on plan": Risk Management (evaluation only, ~20 projects), Quality Management (evaluation & improvement, ~15 projects)
Project "out of hand": Emergency (~5 projects), Clash (~5 projects), Rescue (Evolution vs. Revolution, ~5 projects)
Setup of Architecture Evaluation Projects
Organizational Constellation of Evaluations
[Number of people involved]
Initiating organization: 2 (1-6)
Organization(s) with the product under evaluation (often several): 3-15 each
Effort Spent on the Evaluation
[Person days]
Initiating organization: 1-10
Organization(s) with the product under evaluation: 2-60 each
Overall evaluation: 4-200
Factors Driving Effort for Architecture Evaluation
The overall effort is driven by: the number of stakeholders, the criticality of the situation, organizational complexity, the required confidence, system size and complexity, the need for fast results, and the evaluation questions
Overview of the Architecture Evaluation Approach
Inputs: stakeholder concerns and scenarios; architecture knowledge, models, and documents; the implementation/system (source code, code metrics)
Assessments: concern elicitation and check; solution adequacy assessment; documentation assessment; compliance / distance assessment; code quality assessment
The assessment results are interpreted and rated (a minimal sketch of a compliance check follows below)
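The compliance / distance assessment essentially compares the dependencies found in the code with the dependencies the intended architecture allows. The talk does not prescribe any tooling for this; the following is only a minimal sketch of the idea in Python, with hypothetical module names and rules, and it assumes the code-level dependencies have already been extracted by some static-analysis tool.

# Minimal sketch of an architecture compliance check (hypothetical example).
# Intended architecture: which module may use which other modules.
ALLOWED = {
    "ui":          {"application"},
    "application": {"domain"},
    "domain":      set(),
    "persistence": {"domain"},
}

# Dependencies extracted from the code: (from_module, to_module, occurrences).
EXTRACTED = [
    ("ui", "application", 120),
    ("ui", "persistence", 7),      # violation: UI bypasses the application layer
    ("application", "domain", 300),
    ("domain", "persistence", 2),  # violation: domain should stay technology-free
]

def check_compliance(allowed, extracted):
    """Split the extracted dependencies into compliant and incompliant ones."""
    compliant, violations = [], []
    for source, target, count in extracted:
        if target in allowed.get(source, set()):
            compliant.append((source, target, count))
        else:
            violations.append((source, target, count))
    return compliant, violations

if __name__ == "__main__":
    compliant, violations = check_compliance(ALLOWED, EXTRACTED)
    total = sum(c for _, _, c in compliant + violations)
    for source, target, count in violations:
        print(f"violation: {source} -> {target} ({count}x)")
    print(f"incompliant relationships: {sum(c for _, _, c in violations)} of {total}")

A distance assessment would additionally quantify how far the implemented structure has drifted from the intended one, e.g. by relating the number of violating relationships to all relationships.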
Outcome of Architecture Evaluations
Findings: Requirements that are Often Neglected
Runtime quality attributes: typically known; quantification partially missing; often addressed well
Devtime quality attributes: often not explicitly known; often hard to quantify; often not addressed well
Operation quality attributes: typically not explicitly known; quantification partially missing; often not addressed well
Findings: Adequacy of Architectures under Evaluation
[33 solution adequacy assessments; groups of 6, 11, and 16 systems]
Architecture typically not thoroughly defined
Architecture often not fully adequate any more (older systems): often Emergency or Rescue projects
Architecture thoroughly defined and maintained: often Risk Management or Quality Management projects
Findings: Aspects that are "Over-Elaborated"
Over-elaborated (technical architecture): specification of general architectural styles; selection of technologies (OSGi, ESB, …)
Neglected (business architecture): definition of concrete components, or guidelines on how to define them; mapping of concrete functionality to technologies
Findings: Architecture Documentation
Architectural requirements and architecture: often not available as documentation, but knowledge often very good
Implementation: missing uniformity, lack of compliance, quality problems
Reconstruction is essential as a basis for evaluation (a minimal sketch follows below)
D. Rost, M. Naab: Architecture Documentation for Developers: A Survey, ECSA 2013
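Reconstruction can start very simply: lift the dependencies out of the code and aggregate them to the level of architectural modules. The sketch below is not the tooling behind these evaluations; it is a hypothetical Python illustration that derives package-level dependencies from Java import statements (the source path and the aggregation depth are made up).

# Hypothetical sketch: reconstruct package-level dependencies from Java sources.
# Real evaluations rely on proper static analysis; this only illustrates the idea.
import re
from collections import defaultdict
from pathlib import Path

PACKAGE_RE = re.compile(r"^\s*package\s+([\w.]+)\s*;", re.MULTILINE)
IMPORT_RE = re.compile(r"^\s*import\s+(?:static\s+)?([\w.]+)\s*;", re.MULTILINE)

def reconstruct(source_root, depth=2):
    """Map each package prefix (cut at `depth`) to the prefixes it imports."""
    deps = defaultdict(set)
    for java_file in Path(source_root).rglob("*.java"):
        text = java_file.read_text(encoding="utf-8", errors="ignore")
        package = PACKAGE_RE.search(text)
        if not package:
            continue
        source = ".".join(package.group(1).split(".")[:depth])
        for imported in IMPORT_RE.findall(text):   # wildcard imports are skipped
            target = ".".join(imported.split(".")[:depth])
            if target != source:
                deps[source].add(target)
    return deps

if __name__ == "__main__":
    for source, targets in sorted(reconstruct("./src").items()):
        print(source, "->", ", ".join(sorted(targets)))

The output of such a reconstruction is exactly the kind of extracted dependency list that a compliance check (see the earlier sketch) consumes.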
Findings: Architecture Compliance
[26 compliance checks; groups of 9, 7, and 10 systems, ranging from chaotic implementations to typically very high overall quality]
Strong adverse impact on the quality of the systems
Impact often perceived by users and developers
High cost for rework
Interpretation of Evaluation Results
Stakeholders partially try to influence the interpretation for their goals
No standard interpretation possible
Interpretation has to consider the evaluation questions and many context factors
Architecture evaluation is often not fully objective and quantitative
Even quantitative data (e.g. the number of incompliant relationships) is often hard to interpret (see the toy example below)
Tool-based reverse engineering often leads to nice but useless visualizations
Representation of results for management is challenging (→ actions?)
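One reason the numbers are hard to read: a raw count of incompliant relationships says little until it is related to the size and coupling of the system and to its history. The figures below are purely hypothetical and only illustrate why the same kind of metric can be harmless in one system and alarming in another.

# Purely hypothetical numbers: the same metric can warrant opposite conclusions.
systems = [
    # name, KLoC, total relationships, incompliant relationships, age in years
    ("legacy billing system", 2000, 50_000, 900, 25),
    ("new mobile backend",     100,  2_000, 300,  1),
]

for name, kloc, total, incompliant, age in systems:
    share = incompliant / total        # share of all code-level relationships
    density = incompliant / kloc       # violations per KLoC
    print(f"{name} ({kloc} KLoC, {age} years): {incompliant} incompliant "
          f"relationships = {share:.1%} of all, {density:.2f} per KLoC")

# 900 violations (1.8%) in a 25-year-old 2 MLoC system may be tolerable, while
# 300 violations (15%) in a one-year-old 100 KLoC system points to eroding discipline.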
Follow-Up of Architecture Evaluations
Which Problems can be Fixed, which can't?
Problems that often can be fixed: missing documentation; missing support for several new scenarios; a lower degree of incompliance in the code
Problems that often can't be fixed: a high degree of incompliance in the code; missing thoroughness in the definition of the initial architecture; a strong degree of degeneration of the architecture over time; missing commitment in the fixing phase
Actions Taken after the Architecture Evaluations
Project stopped
New project for the future architecture
Project for removing architecture violations
Project for improvement of the architecture
Selection of one of the candidate systems / technologies
Initiative for improvement of architecture capabilities
None
Key Takeaways of this Talk
Evaluate your Architecture – early and regularly!
Effort and method strongly depend on goals and context
Interpretation of results is challenging
Thorough and continuous architecting is key to success
Dr. Matthias Naab
Phone: +49 631 6800 2249
matthias.naab@iese.fraunhofer.de
architecture@iese.fraunhofer.de
architecture.iese.fraunhofer.de