SE310
Analysis and Design of Software
Lecture 12 – Implementation (C++ vs.
Java) and SQA Basics
April 2, 2015
Sam Siewert
Minute Paper
Key Advantages of OOP Libraries – E.g. Computer
Vision, Graphics and Interactive GUIs
List 3 Key Advantages
Look at OpenCV and Decide if Advantages are Realized
– C++ OOP: http://docs.opencv.org/modules/core/doc/basic_structures.html
– C Procedural with ADTs: http://docs.opencv.org/modules/core/doc/old_basic_structures.html
Write-Up Follow-On Thursday on Realization of Advantages
Key Takeaway Points
• Everyone in the team should follow the same
coding standards.
• Test-driven development, pair programming,
and code review improve the quality of the
code.
• Classes should be implemented in dependency order to reduce the need for test stubs.
OOP Implementation
Numerous OOP Languages (OOPLs): C++, Java, Ada95, C#, Ruby, Smalltalk
Defining Features of an OOP Language [Minimal]:
1. Encapsulation (data hiding)
2. Inheritance
3. Dynamic method binding
(see the minimal C++ sketch of these three features below)
Choose One? Use More than One? – “it depends”.
Industry – E.g. Digital Media – Often Uses C/C++ for Lower Layers, Java for Upper Layers (Digital Cable, Netflix, etc.)
– Java is Machine Independent Due to the JVM (Java Virtual Machine)
– C++ is Compiled (Often More Efficient for a Given Architecture)
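A minimal C++ sketch of the three defining features (illustrative only; the class names are invented for this example):

#include <iostream>

class Shape {                          // encapsulation: data is hidden behind methods
public:
    explicit Shape(double x) : x_(x) {}
    virtual ~Shape() = default;
    virtual void print() const { std::cout << "Shape at " << x_ << "\n"; }
protected:
    double x_;                         // not visible to users of the class
};

class Box : public Shape {             // inheritance: Box reuses and extends Shape
public:
    explicit Box(double x) : Shape(x) {}
    void print() const override { std::cout << "Box at " << x_ << "\n"; }
};

int main() {
    Shape* s = new Box(1.0);
    s->print();                        // dynamic method binding: Box::print() chosen at run time
    delete s;
    return 0;
}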
Java vs. C++ Feature Level
(http://en.wikipedia.org/wiki/Comparison_of_Java_and_C++)
Java
1. Embedded and Network
Computing (Internet of Things)
2. Compiled to Byte-Codes for the JVM and Interpreted or JIT-Compiled, not Compiled to Machine Code (WORA)
3. Portable to any JVM (Java Virtual
Machine)
4. Allows Overloading, but No
Operator Overloading
5. Array Bounds Checking
6. No unsigned
7. Type sizes/limits same on all
platforms
8. Automatic garbage collection, nondeterministic finalize() method
9. Strongly typed
10. No function pointers
C++
1. Extension to C Procedural
Programming Language
2. Compiled to Object Code (Machine Code or Assembly, WOCA)
3. Re-Compile for Each Machine Architecture and OS
4. Allows Overloading Including
Operator Overloading
5. No Built-in Array Bounds Checking
6. Unsigned and signed arithmetic
7. Type sizes/limits can be platform
specific (16 vs. 32 vs. 64 bit)
8. Explicit memory management
(new, delete) or scope based auto
9. Type-casting and overrides
10. Function pointers
Java vs. C++ Syntax
(http://en.wikipedia.org/wiki/Java_syntax, http://en.wikipedia.org/wiki/C++_syntax)
Java
1. Everything is a class and main() is just a method; one class per file is the convention; class browsers are common
2. Java always uses reference semantics; objects are not values; types are strong
3. Create a reference, allocate with new, clone to copy an object
4. System.out.println(x);
5. myClass Obj2 = Obj1;
6. Obj2.x = 5;
7. Obj2.method(); Obj1.method();
8. break, continue, labels

C++
1. Classes are specified anywhere (prefer header files), implemented anywhere (prefer *.cpp), and used in the main() program
2. Objects are variables (values) with value semantics (e.g. pass by value); use a pointer or reference otherwise
3. Declare and construct objects; copy like a struct
4. cout << x << endl;
5. myClass *Obj2 = &Obj1;
6. Obj2->x = 5;
7. Obj2->method(); Obj1.method();
8. goto label or code generation, break, continue
(see the C++ sketch of value vs. pointer semantics below)
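A minimal C++ sketch (illustrative; myClass and its members are assumed for this example) contrasting value semantics with pointer aliasing, the closest C++ analogue of Java's reference semantics above:

#include <iostream>

class myClass {
public:
    int x = 0;
    void method() const { std::cout << "x = " << x << "\n"; }
};

int main() {
    myClass Obj1;               // an object is a value (e.g. on the stack)
    myClass Copy = Obj1;        // value semantics: member-wise copy, like a struct
    Copy.x = 7;                 // does not change Obj1.x

    myClass *Obj2 = &Obj1;      // a pointer gives Java-like aliasing
    Obj2->x = 5;                // changes Obj1.x through the alias
    Obj2->method();             // prints x = 5
    Obj1.method();              // prints x = 5 (same object)
    Copy.method();              // prints x = 7 (independent copy)
    return 0;
}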
Java vs. C++ Semantics
(http://en.wikipedia.org/wiki/Comparison_of_Java_and_C++)
Java
1. Derive from only ONE class
2. Clear distinction between Java interfaces and classes
3. Generics
4. Built-in support for multi-threading and concurrency
5. Supports reflective programming – object modification at runtime
6. Anonymous references not allowed; strongly typed
7. References only
8. Minimum code is a class

C++
1. Multiple inheritance
2. Abstract classes with virtual and pure virtual functions
3. Templates
4. Threading via libraries (e.g. pthreads)
5. Does not directly support reflective programming
6. Use of void* for pointers to anonymous types (addresses)
7. Pass by reference and by value
8. Minimum code is a function
(see the C++ sketch of pure virtual functions and multiple inheritance below)
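A small C++ sketch (illustrative; the class names are assumptions of this example) of items 1–2 in the C++ column: an abstract class with a pure virtual function, and multiple inheritance:

#include <iostream>

class Printable {                       // abstract class
public:
    virtual ~Printable() = default;
    virtual void print() const = 0;     // pure virtual: derived classes must override
};

class Serializable {
public:
    virtual ~Serializable() = default;
    virtual void save() const { std::cout << "saved\n"; }
};

class Report : public Printable, public Serializable {   // multiple inheritance
public:
    void print() const override { std::cout << "report\n"; }
};

int main() {
    Report r;
    r.print();   // interface inherited from Printable
    r.save();    // behavior inherited from Serializable
    return 0;
}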
Test Driven Development
[Figure: TDD process flow, rendered here as steps]
1. Prepare for TDD.
2. Write/edit tests for the features to be implemented next.
3. Run the new tests; they should all fail (a new test that passes before the production code is written signals a problem, so stop and investigate).
4. Write/edit production code and run all tests; repeat until all tests pass.
5. If test coverage is not yet accomplished, add tests to increase coverage and return to step 4.
6. Refactor the code.
7. If more features remain to be implemented, return to step 2; otherwise the cycle ends.
(see the minimal test-first sketch below)
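A minimal test-first sketch in C++ (illustrative; no particular unit-test framework is implied, so plain assert() stands in for a test library, and the stack functions are invented for this example):

#include <cassert>

// Step 2: the test is written first and fails until the production code exists and works.
int stack_push(int *stack, int *top, int value);
int stack_top(const int *stack, int top);

void test_push_places_value_on_top() {
    int stack[8];
    int top = -1;
    stack_push(stack, &top, 42);
    assert(stack_top(stack, top) == 42);   // expected behavior, stated before the code
}

// Step 4: just enough production code to make the test pass.
int stack_push(int *stack, int *top, int value) {
    stack[++(*top)] = value;
    return *top;
}

int stack_top(const int *stack, int top) {
    return stack[top];
}

int main() {
    test_push_places_value_on_top();       // run all tests; then add coverage and refactor
    return 0;
}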
Merits of Test Driven Development
• TDD helps the team understand and improve
requirements.
• TDD produces high-quality code.
• TDD improves program structure and
readability through refactoring.
• TDD makes it easier to debug.
What is Software Quality Assurance?
• SQA is a set of activities to ensure that the software product and/or process conforms to established requirements and standards.
• The activities include
– Verification: Are we building the product right?
– Validation: Are we building the right product?
– Technical reviews
– Testing: attempts to uncover program errors.
Cost of Quality
• Cost of quality consists of
– Costs of preventing software failures
• Costs to implement an SQA framework (one time)
• Costs to perform SQA activities (ongoing)
• Costs to monitor and improve the framework (ongoing)
• Costs of needed equipment (machines and tools)
– Costs of failure
• Costs to analyze and remove failures
• Costs of lost productivity (developer and customer)
• Costs of liability, lawsuits, and increased insurance premiums
• Costs associated with damage to the company’s reputation
Cost of Quality
[Figure: cost plotted against increasing quality, contrasting failure cost and prevention cost.]
Software engineering considers the cost to accomplish something: quality and cost trade off.
Costs of Quality
[Figure: relative error-correction cost by phase (log scale) – roughly 1 for requirements and design, 10 for coding and development test, 100 for system test, and 1000 for field operation.]
Software Quality Attributes
• Reliability (adequacy: correctness,
completeness, consistency; robustness)
• Testability and Maintainability
(understandability: modularity, conciseness,
preciseness, unambiguity, readability;
measurability: assessability, quantifiability)
• Usability
• Efficiency
• Portability
• etc.
Quality Measurements and Metrics
• Software measurements are objective,
quantitative assessments of software attributes.
• Metrics are standard measurements.
• Software metrics are standard measurements
of software attributes.
• Software quality metrics are standard
measurements of software quality.
• Class discussion: why do we need software
metrics?
Software Quality Metrics
• Requirements metrics
• Design metrics
• Implementation metrics
• System metrics
• Object-oriented software quality metrics
Requirements Metrics
Requirements Unambiguity Q1 = (# of uniquely interpreted requirements) / (# of requirements)

Requirements Completeness Q2 = (# of unique functions) / (# of combinations of states and stimuli)
Requirements Metrics
Requirements Correctness Q3 = (# of validated correct requirements) / (# of requirements)

Requirements Consistency Q4 = (# of non-conflicting requirements) / (# of requirements)
Design Metric Fan-In
• Number of incoming messages or control
flows into a module.
• Measures the dependencies on the module.
• High fan-in signifies a “god module” or “god
class.”
• The module may have been assigned too many
responsibilities.
• It may indicate low cohesion.
Design Quality Metrics: Fan-In
[Figure: structure chart in which modules M1–M13 all depend on (send messages to) DBMgr, illustrating high fan-in.]
Design Quality Metrics Fan-Out
• Number of outgoing messages of a module.
• Measures the dependencies of this module on
other modules.
• High fan-out means it will be difficult to reuse
the module because the other modules must
also be reused.
Quality Metrics: Fan-Out
[Figure: structure chart in which GUI sends messages to modules M1–M13, illustrating high fan-out.]
Modularity
• Modularity – Measured by cohesion and
coupling metrics.
• Modularity = (a * Cohesion + b * Coupling) / (a + b), where a and b are weights on cohesion and coupling.
Cohesion
• Each module implements one and only one
functionality.
• Ranges from the lowest coincidental cohesion
to the highest functional cohesion.
• High cohesion enhances understanding, reuse
and maintainability.
Coupling
• An interface mechanism used to relate
modules.
• It measures the degree of dependency.
• It ranges from low data coupling to high
content coupling.
• High coupling increases uncertainty of run-time effects and makes the modules difficult to test, reuse, maintain, and change.
Module Design Complexity mdc
• The number of integration tests required to
integrate a module with its subordinate
modules.
• It is the number of decisions to call a
subordinate module (d) plus one.
[Figure: structure chart rooted at M0 with subordinate modules M1–M7; the calls to subordinates are governed by d = 3 decisions, so module design complexity mdc = d + 1 = 4.]
Design Complexity S0
• Number of subtrees in the structure chart with
module M as the root.
• S0(leaf) = 1
• S0(M) = mdc(M) + S0(child1 of M) + ... + S0(childn of M)
[Figure: example structure chart with eight modules – root M0 (mdc = 2) calls M1, M2, and M3, each with S0 = 3; the leaves M4–M7 each have S0 = 1; hence S0(M0) = 2 + 3 + 3 + 3 = 11.]
Integration Complexity
• Minimal number of integration test cases
required to integrate the modules of a structure
chart.
• Integration complexity S1 = S0 – n + 1, where
n is the number of modules in the structure
chart.
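As a quick worked check using the example structure chart from the previous slide (S0 = 11, n = 8 modules M0–M7): S1 = S0 - n + 1 = 11 - 8 + 1 = 4, i.e. four integration test cases are needed.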
Implementation Metrics
• Defects/KLOC
• Pages of documentation/KLOC
• Number of comment lines/LOC
• Cyclomatic complexity
– equals the number of binary predicates + 1
– measures the difficulty of comprehending a function/module
– measures the number of test cases needed to cover all independent paths (basis paths)
(see the small C++ example below)
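A small illustrative C++ function (not the module from the textbook slide): it contains three binary predicates, so its cyclomatic complexity is 3 + 1 = 4, and four test cases cover its basis paths:

#include <cassert>

// Binary predicates: (!calibrated), (value < low), (value > high)  -> 3
// Cyclomatic complexity = 3 + 1 = 4
int classify(int value, int low, int high, bool calibrated) {
    if (!calibrated)        // predicate 1
        return -1;          // cannot classify an uncalibrated reading
    if (value < low)        // predicate 2
        return 0;           // below range
    if (value > high)       // predicate 3
        return 2;           // above range
    return 1;               // in range
}

int main() {
    // One test case per basis path (4 in total).
    assert(classify(5, 0, 10, false) == -1);
    assert(classify(-1, 0, 10, true) == 0);
    assert(classify(11, 0, 10, true) == 2);
    assert(classify(5, 0, 10, true) == 1);
    return 0;
}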
What is the cyclomatic complexity of this module?
[The module's flow graph from the original slide is not reproduced here.]
Object-Oriented Quality Metrics
• Weighted Methods per Class (WMC)
• Depth of Inheritance Tree (DIT)
• Number of Children (NOC)
• Coupling Between Object Classes (CBO)
• Response for a Class (RFC)
• Lack of Cohesion in Methods (LCOM)
Weighted Methods per Class
• WMC(C) = Cm1 + Cm2 + ··· + Cmn, where Cmi is the complexity metric of method mi of class C.
• Significance: Time and effort required to
understand, test, and maintain class C
increases exponentially with WMC.
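As a quick illustration (hypothetical numbers): a class with three methods whose complexities are 2, 3, and 1 has WMC = 2 + 3 + 1 = 6.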
Depth of Inheritance Tree
• Distance from a derived class to the root class
in the inheritance hierarchy
• Measures
• the degree of reuse through inheritance
• the difficulty of predicting the behavior of a class
• costs associated with regression testing, due to the change impact of a parent class on descendant classes
High DIT Means Hard to Predict Behavior
• All three classes include print()
• It is difficult to determine which print() is used:

[Figure: class hierarchy rooted at Shape, with subclasses Box and Square; each class defines print().]

public static void main (...) {
    Shape p; ….
    p.print(); // which print()?
    …
}
[Figure: inheritance hierarchy rooted at ClassA (which defines m()), with subclasses ClassB, ClassC, ClassD, and ClassE; ClassC and ClassE override m().]
Change to a parent class requires retesting of subclasses.
Number of Children
• NOC(C)
= | { C’ : C’ is an immediate child of C }|
• The dependencies of child classes on class C increase proportionally with NOC.
• An increase in dependencies increases the change impact and behavior impact of C on its child classes.
• These make the program more difficult to understand, test, and maintain.
Coupling between Object Classes
• CBO(C) = |{ C’ : C depends on C’ }|
• The higher the CBO for class C, the more difficult it is to understand, test, maintain, and reuse class C.
Response for a Class
• RFC (C) = |{ m : m is a method of C, or m is
called by a method of C }|
• The higher the RFC, the more difficult it is to understand, test, maintain, and reuse the class, due to its higher dependence on other classes.
Lack of Cohesion in Methods
• LCOM(C) = n * (n-1) / 2 - 2 * |{ (mi, mj) : mi and mj share an attribute of C }|, where n is the number of methods of C.
• LCOM measures the number of pairs of methods of C that do not share a common attribute.
• Class exercise:
– Is it possible to derive a metric called “cohesion of
methods of a class?”
– If so, what would be the formula?
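A small illustrative C++ class (invented for this example) with the LCOM arithmetic from the formula above worked out in the comments:

// n = 3 methods; attributes a and b.
class Sensor {
    int a = 0;
    int b = 0;
public:
    void setA(int v) { a = v; }        // uses a
    int  getA() const { return a; }    // uses a
    int  getB() const { return b; }    // uses b
};
// Method pairs: (setA,getA), (setA,getB), (getA,getB)  -> n*(n-1)/2 = 3
// Pairs sharing an attribute: only (setA,getA) share a -> 1
// LCOM(Sensor) = 3 - 2*1 = 1

int main() {
    Sensor s;
    s.setA(3);
    return (s.getA() == 3 && s.getB() == 0) ? 0 : 1;   // trivial use so the example compiles and runs
}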
Reliability and Availability
• Mean time to failure (MTTF)
• Mean time to repair (MTTR)
• Mean time between failures (MTBF) = MTTF + MTTR
• Availability = MTTF / MTBF x 100%
[Figure: timeline of failure events f1–f4 and repair completions r1–r4, showing the TTF intervals (repair to next failure), the TTR intervals (failure to repair), and the TBF between successive failures.]
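A minimal C++ sketch (illustrative figures, not from the slides) of the availability arithmetic above:

#include <cstdio>

int main() {
    double mttf = 1000.0;                        // mean time to failure, hours (example value)
    double mttr = 2.0;                           // mean time to repair, hours (example value)
    double mtbf = mttf + mttr;                   // MTBF = MTTF + MTTR (per the slide)
    double availability = (mttf / mtbf) * 100.0; // Availability = MTTF/MTBF x 100%
    printf("MTBF = %.1f hours, availability = %.3f%%\n", mtbf, availability);
    return 0;
}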
Negative Tests - HA Design Criteria
5 9’s – Out of Service No More than 5 Minutes per Year [Up 99.999% of the Time]
Availability = MTTF / [MTTF + MTTR]
MTTF ≈ MTBF (when recovery time is small)
E.g. 0.99999 ≈ (365.25 x 60 x 24 minutes) / [(365.25 x 60 x 24) + 5 minutes]
MTBF = Mean Time Between Failures, MTTR = Mean Time to Recovery
See: “Big iron lessons, Part 2: Reliability and availability: What’s the difference?” and “Apply RAS architecture lessons to the autonomic Self-CHOP roadmap”
Availability
Time System is Available for Use vs. Time System is Unavailable for Use
Ratio of Time Available over Total Time (Both Available and Not Available)
Some Variation in Definitions of MTTF, MTBF, MTTR:
– MTBF = MTTF + MTTR [in book]
– MTTF is Simply Time Between Failures (Observed in Accelerated Lifetime / Stress Tests)
– MTBF Often Stated – Does it Really Include MTTR or Not?
– Typically Recovery Time is Small, So Negligible; But if Long, the Difference is More Important
– Just Be Clear on Definitions [Can Vary]
% of Time System is Available for Use Each Year (or Other Arbitrary Period of Time)
[Figure: timeline – Failure #1 is followed by a recovery period (MTTR) and then an available period (MTTF) until Failure #2; MTBF spans from one failure to the next.]
Usefulness of Quality Metrics
1. Definition and use of indicators.
2. Directing valuable resources to critical areas.
3. Quantitative comparison of similar projects and systems.
4. Quantitative assessment of improvement, including process improvement.
5. Quantitative assessment of technology.