FEATURE ARTICLE
Sixty Years of Software Development
Life Cycle Models
Ralf Kneuper
International University of Applied Sciences (IUBH)
Sixty years ago, in 1956, the first explicit representation of a software
development life cycle model was presented by Herbert Benington. Since
then, software development life cycle models have come a long way, and the
current article provides an overview of that development.
Introduction
This paper describes the development of software development life cycle models (SDLCMs)
since the first such model was published in
1956. Such models form a central part of software engineering since they provide a structure for the various software development
activities to be performed within a project.
Even if a project does not explicitly use any
such model, it will structure its activities
in some form and thus will implicitly use an
SDLCM.
Initial thoughts about the sequence of
steps to be used in software development
started more or less from the beginning, keeping in mind that “software development” initially consisted solely of coding.
As programming became more complex,
more structure was needed for the development effort, to form a basis for project management and to support planning and communication because now teams were needed
to develop software rather than individuals.
Gradually, this led to the SDLCMs discussed
here.
Regarding terminology, the terms “software (development) process” and “software
(development) life cycle” describe two closely
related but not identical concepts. While software development processes describe the
details of the individual development steps,
their input and output and how to combine
them, an SDLCM describes the combination
and temporal relationship of the various processes. However, because both terms describe
very similar and closely related concepts, a
strict distinction between the two will not be
made in this paper.
Purpose of SDLCMs
Initially, the main purpose of SDLCMs was
to provide a structure for software development, providing a framework for software
development tasks and methods. This helps
to break down this increasingly complex task
into smaller subtasks to help plan and monitor work, to support cooperation and communication between the different people and
groups involved, and to ensure quality of the
result. Starting in the 1980s, the models were
additionally used to improve and automate
(parts of) the development process1(p.76) which
required more detailed descriptions or models
of the software processes involved.
This changing purpose, as well as the
changing technical and organizational environment and the growth of experience, implied
that the contents and structure of the models
also needed to change over time. As a result,
the history of SDLCMs can be split into different phases following Barry W. Boehm’s structuring of the history of software engineering.2
To describe these phases (each lasting roughly
a decade), Boehm used the Hegelian concept of
thesis (this is the way work is done) and antithesis (opposition to the thesis builds up), leading to synthesis, where a middle way is found.
However, progress regarding SDLCMs was very
limited until the 1970s, and the initial decades
are therefore combined into one section in this
paper.
Influences
Apart from their changing purpose over time,
one of the main influences on SDLCMs was the
hardware and software technology available
at different times, such as the software engineering tools (compilers, development environments, and so forth). For example, publications in the 1950s talk about “computer
system development” rather than “software
development.” During that time, software was
not usually developed separately from the
hardware it was to run on but in a joint effort.
Standard, off-the-shelf hardware gradually became available in the 1960s after IBM had standardized computer interfaces with the IBM System/360 computer family.3 Closely related
is the fact that computing time (and memory) initially was very expensive compared to
manpower. This led to completely different
tradeoffs between analyzing code and testing
it compared to today: recompiling and retesting code after each minor change was simply
too expensive.
Other major factors that influenced SDLCMs to varying degrees at different times were:
»» Application type and size, leading, for
example, to differences in importance of
time-to-market versus quality and reliability.
»» Management philosophies in other branches of industry, for example, restrictive management versus self-organization; lean management; planning and management versus agility and self-organization.
»» Qualification of developers4: on the one
hand, more highly qualified developers
need fewer rules or guidance on how to
develop. On the other hand, these same
developers tend to put more emphasis on
a structured development approach.
Early Life Cycle Models
Initially, the main question was how to develop
software at all, and which steps would be necessary to do so.
The first SDLCMs were, therefore, sequential models with different variations of the
analysis-design-implementation-test sequence.
The differences (apart from minor topics like
splitting one phase into multiple phases, for
example, in testing) mainly concerned the
number of feedback loops that were explicitly included in the models and the extent to
which the models went beyond development to
include the entire product life cycle (e.g., installation, maintenance, obsolescence).
Starting to Code
One of the first electronic computers was the
“Electronic Numerical Integrator and Computer” (ENIAC). As the name indicates, it was
used mainly for mathematical calculations,
particularly integration, to compute ballistic
tables. The requirements for programs were
usually straightforward, and the challenge was
to translate the known algorithm into a program to be run on the computer.
Because initially the ENIAC was not a
stored-program computer, it had to be programmed by adapting its cabling.5(p.44) Programming consisted of the following developmental steps: step 1, determine on paper what the program was to do; step 2, manipulate the switches
and cables to enter the program; and step 3,
verify and debug.
In their series of reports on coding,6 Herman H. Goldstine and John von Neumann of the Institute for Advanced Study describe the initial step in more detail, covering mainly numerical programming tasks such as the integration of various functions. These reports arose from their work on the then-new Electronic Discrete Variable Automatic Computer (EDVAC), which, like the ENIAC from 1948 onward,7 stored its programs in memory and did not need the manipulation of switches and cables for programming.8 The reports introduced a flow diagram language and then, for a number of mathematical problems, presented flow diagram solutions and derived the appropriate code.
Benington’s Program
Production Life Cycle
As the systems to be developed became more
complex, just coding and testing and debugging
were no longer sufficient. The requirements
were no longer defined from the start but had to
be clarified first, and moving from requirements
to code was no longer possible in one step.
One of the best-known examples of such a complex system in the 1950s was the Semi-Automatic Ground Environment (SAGE) project, which created a real-time air-defense system and was, according to Boehm, the "most ambitious information processing project of the 1950's."9
The work performed in SAGE asked for
a systematic structure of the tasks involved,
leading to the first explicit representation of a
software development life cycle, as presented
by Herbert D. Benington in 1956 at a symposium on advanced programming methods
(since republished).10 In his paper, Benington,
at the time an engineer in the SAGE project,
presented best practices for design and engineering, including the life cycle model shown in
Figure 1. This model was derived from hardware
engineering, with most of the developers coming from a hardware engineering background.
The main difference between this model
and previous work was that Benington explicitly modeled the software development life
cycle, defining the life cycle phases and their
sequence, whereas before, they were performed but not explicitly described. Of course,
this was largely due to the fact that the life cycle
had become too complex to be implemented
without such an explicit model.
A summary of the main phases of Benington’s model shows that he used a different terminology than that in use today, but the basic
steps are already quite similar to later (sequential) models:
»» Operational plan contains (in today’s terminology) usage scenarios and stakeholder
requirements.
»» Machine specifications specify the hardware to be developed.
»» Operational specifications contain the
detailed product requirements, describing
the “transfer function” (input-output relationship) of the system.
»» Program specifications contain the design
of the software, specifying subprograms
and tables.
»» Coding specifications are roughly equivalent
to today’s programs in high-level languages.
»» Coding refers to the transfer of coding specification into machine code, today usually
performed by a compiler.
»» Parameter testing tests the individual
sub-programs based on the coding specifications.
»» Assembly testing involves gradually integrating or assembling the system and testing it based on the operational and program specifications.
»» Shakedown is the test of the complete system in its operational environment, ensuring that it is ready for operation.
»» System evaluation refers to the evaluation
of the system once it is in production.
Although this model describes a linear life
cycle, Benington mentions in the preface written later11 that a prototype was used to get a better understanding of the system.
A close look at the dashed “testing” arrows
in Figure 1 shows that this model already
included the basic idea of the later V-shaped
model (described later in this paper) of testing
the system against the specification documents
in the order of increasing levels of abstraction,
even though it did not use the V-shaped presentation yet.
July–September 2017
43
FEATURE ARTICLE
FIGURE 1. Program production life cycle as described by Benington11 (phases: operational plan, machine specification, operational specification, program specifications, coding specifications, coding, parameter testing, assembly testing, shakedown, and system evaluation, linked by "design" arrows and dashed "testing" arrows).

Hosier's Program Development Approach
W.A. Hosier, working at the Systems Engineering and Management Operation, Sylvania Electronic Systems Division, in 1961, described the development of real-time systems, mentioning, for example, the SAGE project that Benington had talked about 5 years earlier.13
That paper focuses on individual activities and what would today be called best practices. Although it presents the sequence of main activities in a flow chart, those activities described are on a fairly detailed level and do not provide a structure of the development life cycle comparable to that of, for example, Benington's program production life cycle.
Hosier's "development flow chart" also includes feedback loops, but surprisingly, only with reference to the hardware part of the system.

Rosove's Sequence of Development Stages
Experience from many projects showed that
a strict linear life cycle was not sufficient and
that some form of feedback cycle was needed,
as discussed, for example, at the 1968 NATO
conference.14(pp.20–21) In his “sequence of
development stages” published 1 year earlier,
Perry E. Rosove therefore explicitly included
feedback loops to previous phases in order to
indicate that “knowledge acquired and decisions made in later phases are fed back to earlier phases.”15(p.18) This life cycle model already
looks very similar to what later became the
“waterfall model,” describing the sequence of
the five development stages shown in Figure 2.
Rosove’s book was based on his work at the
Education Systems group of the System Development Corporation, one of the first software
companies in the world, which had also been
responsible for the SAGE project. In the late
1950s and early 1960s, System Development
Corporation had trained more programmers
than any other company in the world, resulting
in a major influence on the approaches used to
develop software in the industry.17(p.142)
Furthermore, Rosove’s life cycle model
included the operations phase and the feedback from actual experience with the system,
which then may lead to new requirements and
eventually a new cycle of development.
Zurcher and Randell’s
Iterative Multilevel Modeling
Although the models discussed so far did
include feedback loops to some extent, they
did not allow for genuine iterative, incremental
development with planned iterations to gradually build up the system. There was some use
of iterative software development at the time,18
but F.W. Zurcher and Brian Randell, then of the
Thomas J. Watson Research Center, in 1968,
were the first to explicitly describe iterative
development.19 Their approach involved starting with an abstract model of the system and
gradually adding detail (stepwise refinement),
until it was eventually expressed in terms of code.
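To make the idea of stepwise refinement concrete, the following sketch (in Python, using an invented example problem; nothing here is taken from Zurcher and Randell's paper) starts from an abstract statement of the task and replaces each abstract step with more concrete detail until only executable code remains.

# Level 0: abstract model of the task -- "report the k most frequent words".
def report_top_words(text, k):
    counts = count_words(text)       # abstract step, refined below
    return most_frequent(counts, k)  # abstract step, refined below

# Level 1: each abstract step is expressed in terms of more concrete operations.
def count_words(text):
    counts = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return counts

# Level 2: the final refinement is ordinary executable code.
def most_frequent(counts, k):
    return sorted(counts, key=counts.get, reverse=True)[:k]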
Royce’s Waterfall Model
Despite the similar models published earlier,
Winston W. Royce’s description of a software
development life cycle, published in 1970,20
is often referred to as the main reference on
waterfall models. This is unfortunate because,
apart from the fact that Benington and Rosove
had described similar life cycles, Royce did
not actually recommend a waterfall approach
but explicitly addressed feedback loops in his
paper, stating that a linear waterfall in general
was not sufficient. According to his son, Royce
“was always a proponent of iterative, incremental, evolutionary development.”21
The term "waterfall model" was also not actually introduced by Royce; rather, it came into use some years later, when Bell and Thayer used it in referring to Royce's paper.22 It became well
known following publication of Boehm’s book.23
At the time, Royce worked for TRW, Inc.
and was “mostly concerned with the development of software packages for spacecraft mission planning, commanding, and post-flight
analysis.”24
Although Royce is best known for the
waterfall model, he actually presented and
analyzed a number of different SDLCMs, starting with a very simple life cycle consisting
only of the two phases “analysis” and “coding,”
moving on to the now famous waterfall model,
which he described in three variations: without feedback loops; with feedback loops only
going back one step; and finally allowing bigger loops, as “the design iterations are never
confined to the successive steps.”25
After that, he further extended the model
in various ways, for example, by recommending that developers "do the job twice" (Figure 3), integrating the
creation of a prototype into the life cycle. This is
different from prototyping as described by Benington, where it just “happened”; Royce deliberately included prototyping in this model.
Iterative and Incremental Development
Simple forms of iterative and incremental
development and other work involving the use
of a prototype and/or feedback loops were used
more or less from the start.27
FIGURE 2. Rosove's sequence of development stages:15 requirements, design, production, installation, and operations over time, with feedback to earlier stages.
However, the terms “iterative” and “incremental” development are not clearly defined,
allowing a number of very different interpretations. Early interpretations included:
»» creating a prototype or preliminary version (Benington, Royce);
»» stepwise refinement of an initial abstract
model of the system (Zurcher and Randell);
»» stepping back one or more phases in order
to correct any earlier results as necessary
(Rosove, Royce); and
»» short time-boxed iterations as used in
agile development.
According to Gerald M. Weinberg, he was
working at IBM Federal Systems Division on project Mercury in 1957, using an approach very similar to XP,28 with half-day time-boxed iterations as
well as test-first development and so forth.
With the exception of project Mercury
and related work at Federal Systems Division,
none of these interpretations involved frequent, short iterations as used in current agile
development, which were fairly difficult to implement given the technological environment of the time.

FIGURE 3. "Do job twice": Royce's waterfall model with prototyping25 (the phases shown include preliminary program design, preliminary design, analysis, program design, coding, testing, usage, and operations).
Summary
Until the 1970s, software development usually followed (or at least tried to follow) a
plan-driven, requirements-driven, sequential
approach, as was well known from engineering. Because experience showed that stepping
back was sometimes necessary, there was a
search for the best way to include feedback
loops in an SDLCM. However, such feedback
loops were mostly considered a concession to
the insufficient capability of software development methods, not an approach to be deliberately planned.
To some extent, this linear approach was
also pushed by customers who expected to
know in advance what they would get as a result
of a project and how much it was going to
cost. If this was not successful, one tried to
improve the requirements analysis methods
rather than accepting that changes during
a running project were to be expected. This
thinking was, for example, reflected in standards such as the Department of Defense standard DoD-Std-2167 (cf. Larman and Basili29).
The number of different software development methods was growing strongly at
this time, but that had little influence on the
SDLCMs used. One can distinguish different
approaches to programming: on the one hand, the "pragmatic, can-do approach" represented by Grace Hopper, and on the other hand, the theory-based, scientific approach represented by Edsger W. Dijkstra.30 Nevertheless, the life cycle models used by the two approaches did not essentially differ; the difference was in
the importance of the different phases of the
model and the tasks to be performed within
these phases. To achieve high quality, the former focused on the testing and debugging
phases, whereas the latter focused on the
earlier phases, trying to prove programs correct
with respect to a formal specification, ideally
making testing, and particularly debugging,
superfluous.
The 1980s and Early 1990s:
The Rise of Software Processes
Starting in the late 1970s, the basic building
blocks of software development such as the
structured development methods (e.g., structured analysis and structured design31) had
become available, and the main challenges
now were the improvement of productivity and
scalability.32 To achieve those, the next step
was to define a life cycle in which to integrate
these methods, and the phrase “system life
cycle” almost became a fad or buzzword. Generally these were still sequential models, with
or without feedback loops to previous phases
but with no genuine iterations.
In industry generally, the management of
processes became an important topic, shown
for example by the release of the first version
of the ISO 9000 series in 1987. This trend was
also reflected in software development. Until
then, one usually tried to ensure the quality of
products mainly by testing the final result, but
gradually the focus moved to ensuring product
quality by using suitable development processes.
In IT, this led to models such as CMM,33,34
IT Infrastructure Library (ITIL),35 and the German software process V-Model.36 The growing
interest in the topic also resulted in the first
International Software Process Workshop in
1984,37 the series of European Workshops on
Software Process Technology, which started
in 1991,38 and the journal Software Process:
Improvement and Practice, which was started in
1995. The first book about software processes
known to the author was published (in German) by Gerhard Chroust,39 then an assistant professor at the University of Linz, Austria, and working at the IBM research lab in Vienna.
The increased emphasis on software processes in particular showed itself in the many
software process models that were created and
published by various companies and other
organizations for internal use. Compared to
earlier models, there were two major new developments: following the general trend toward
detailed process definitions and as a step
toward tool support, software process models
went beyond the life cycle structure and additionally defined details of the processes such as
activities, work products, and roles. The second
new development was that models now tended
to cover all project activities, including topics
such as project and configuration management
as well as core development tasks. Although
published some years later, an example of this
second extension is presented in Figure 4.
IBM produced two of the earliest such models, VIDOC in 1985 and COMMAND in 1986.41
Later, these formed the basis for IBM’s ADPS
process model,42 a comprehensive model covering development, quality management, and
project management. Soon after that, many
similar models appeared by other companies.43
The research and development of software
processes during this time focused mainly on
process modeling, tailoring, and tool support.
The growing complexity of the SDLCMs and
the underlying processes as well as the desire
to enforce compliance to these models led to
the demand for suitable tools and environments. This triggered research and development of various Software Engineering Environments,44,45 Process-Centered Software
Engineering Environments,46 and Integrated
Project Support Environments.47
Published work on software process models mostly concentrated on large, complex systems, often consisting of hard- and software
and/or in a defense environment. Therefore,
large, “heavy-weight” processes were taken
for granted, describing the development steps
in detail and sometimes trying to enforce the
defined process using suitable tools and environments. Leon J. Osterweil’s famous paper
“Software Processes Are Software Too,”48
describes this approach quite well as “programming the process.” In that paper, Osterweil,
then professor of computer science at the University of Colorado at Boulder, discussed and
made explicit the duality between the development of software and the development of software processes.
FIGURE 4. The two-dimensional structure of the Rational Unified Process (RUP),40 showing the workflows (business modeling, requirements, analysis and design, implementation, test, deployment, configuration and change management, project management, environment) against the phases inception, elaboration, construction, and transition and their iterations.

The V-shaped Model
The V-shaped model was an enhancement of the linear life cycle, using a V-shaped graphical
representation to add emphasis to the systematic verification and validation of results.
It was first described by Boehm to show “the
context of verification and validation activities
throughout the software life-cycle.”49
In the V-shaped model, the life cycle is
split into two branches: the left-hand branch,
called “decomposition and definition,”50 contains the requirements, analysis, and design
tasks, leading to coding at the bottom of the V.
In the right-hand branch, the system is integrated, tested, and verified. These phases each
refer to one of the requirements, analysis and
design phases, and their task is to verify that
the developed products correctly implement
the result of the relevant left-hand phase.
The validation activities are performed
mainly in the upper part of the V, that is, the
very early and very late development phases,
whereas the lower part of the V contains the
verification activities.51
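As an illustration only (the phase names below are generic and not taken from any of the cited models), the essence of the V-shaped model can be captured as a pairing of each left-hand specification or design level with the right-hand test level that verifies against it:

# Hypothetical, simplified pairing of decomposition phases (left branch of the V)
# with the test phases (right branch) that verify against their results.
V_MODEL_PAIRS = {
    "stakeholder requirements": "acceptance test",   # validation at the top of the V
    "system requirements":      "system test",
    "architectural design":     "integration test",
    "module design":            "unit test",         # verification at the bottom of the V
}

def verification_basis(test_level):
    """Return the left-hand work product that a given test level is checked against."""
    for spec, test in V_MODEL_PAIRS.items():
        if test == test_level:
            return spec
    raise KeyError("unknown test level: " + test_level)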
This direct relationship between constructive and analytic phases was already implicitly
contained in Benington’s program production
life cycle (see Figure 1, dotted lines). Later, this
was explicitly represented as a “V,”52 and the
concept was extended, for example, in the Vee
chart,53 and in the Life Cycle Model,54 an early
version of the German V-Model standard.
Formal Methods and Cleanroom Software Engineering
Continuing Dijkstra's theory-based approach
mentioned above, formal development methods were widely discussed in research in the
1980s and 1990s.55 The idea was to specify
requirements in a formal language with clearly
defined semantics and then prove, in the mathematical sense, that the implementation satisfied the specification. Of course, this relies on a
sequential life cycle, with requirements identified early on. Because such a proof additionally involves considerable effort, formal methods were occasionally used in the development of systems with very high reliability requirements but rarely elsewhere.56
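As a minimal illustration of this idea (an invented textbook-style example, not taken from the cited references), the requirement that a program compute the sum of the first n natural numbers can be written as a Hoare triple and the implementation proved against it:

\{\, n \ge 0 \,\}\quad s := 0;\ i := 0;\ \mathbf{while}\ i < n\ \mathbf{do}\ i := i + 1;\ s := s + i\ \mathbf{od}\quad \{\, s = n(n+1)/2 \,\}

The proof shows that the loop invariant s = i(i+1)/2 \land i \le n holds initially, is preserved by every iteration, and, together with the exit condition i = n, implies the postcondition.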
Closely related is Cleanroom software
engineering, which combines formal development methods with an emphasis on iterative
development and with statistics-based testing,
with the goals of reducing the number of bugs built into a system and increasing the percentage of bugs found in testing.57,58 Again, this
approach became quite well known but was
rarely used outside the development of systems with very high reliability requirements.
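The statistics-based testing part of Cleanroom can be sketched as follows (a simplified illustration, not taken from the cited Cleanroom references; the operations and their usage shares are invented): test cases are drawn at random from an operational usage profile, so that the observed failure rate supports a statistical reliability estimate for the expected usage.

import random

# Hypothetical operational profile: how often each kind of use is expected.
USAGE_PROFILE = {"query": 0.70, "update": 0.25, "admin": 0.05}

def run_operation(operation):
    """Stand-in for executing one sampled usage scenario against the system under test."""
    return True  # replace with a real call; return False on failure

def statistical_usage_test(n_cases=1000):
    operations = list(USAGE_PROFILE)
    weights = list(USAGE_PROFILE.values())
    failures = 0
    for _ in range(n_cases):
        operation = random.choices(operations, weights=weights, k=1)[0]
        if not run_operation(operation):
            failures += 1
    return failures / n_cases  # observed failure rate under the assumed usage profile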
Summary
Until the late 1980s, iteration was mostly considered a necessary evil, needed to correct bugs, rather than a deliberate part of the life cycle. As a result, "life cycle" was considered
synonymous with “linear life cycle,”59 which
led to publications such as “Life-Cycle Concept
Considered Harmful.”60 As this publication
shows, the “antithesis” (following Boehm’s terminology) was starting even though not widely
accepted yet.
There was plenty of discussion of and
research into SDLCMs and software processes,
mostly concentrating on going into more detail
of the processes, integrating the development
methods and defining product templates and
roles, and building on that to support the processes using tools. As a result, the overall life
cycle models did not change much but gained a
much stronger foundation.
1990s and Early 2000s:
Growing Importance of Light
Weight and Agile Processes
Some of the main developments in software
engineering during the 1990s were the growing importance of time-to-market of software,
partly due to the emergence of the World Wide
Web, and the resulting need to better adapt to
unclear and changing requirements.61
In order to deal with these changes in the
environment, the use of iterative life cycles
grew further, initially still focusing on coding
and testing, while requirements analysis was
usually done up-front, before starting the cycle.
For example, daily (or nightly) builds,
where an iteration includes some design,
mainly implementation, integration, and testing, were becoming fairly popular, as described,
for example, for Microsoft.62(chap.5)
Gradually, new models appeared that did
explicitly support frequent iterations beyond
coding and testing. The first model to explicitly
include requirements analysis in the iterations
was Boehm’s spiral model published in 1988.63
A few years later, rapid application development was published by Martin.64
Furthermore, object-oriented development (beyond object-oriented programming)
started to gain momentum in industry, leading
to a need for new development methods, software processes, and life cycle models.
Rational Unified Process (RUP)
Development of the Rational Unified Process
(RUP)65,66 was triggered mainly by the growing
importance of object-oriented development,
which initially led to a number of different
notations and diagrams for analysis and design.
RUP was a commercial product describing
an approach to software development, published by the Rational Company in 1998, and
became widely used in the following years.
One of the main sources of RUP was the
Objectory Process developed by Ivar Jacobson,
based on his experience at Ericsson. His company, Objectory, was acquired in 1995 by Rational, at the time the driving force behind UML,
where Grady Booch and James Rumbaugh
were working on a unified method to support
object-oriented development, in particular
the use of the new UML (for more information
about the history of RUP, see The Unified Software Development Process67).
The authors of RUP tried to combine the
concept of iterative development, delivering
visible results in short cycles, with the fact that
there is a natural progression across the entire
development life cycle from requirements to
test. To achieve this combination, RUP has a
two-dimensional structure consisting of workflows which describe the content of the work to
be done, and of phases and iterations showing
when to perform this work (Figure 4). This presentation visualizes the fact that a workflow
such as “Requirements” is performed in all
phases of the project, with varying intensity.
With this two-dimensional structure and
its strong focus on iterative development, RUP
incorporates a number of agile ideas. On the
other hand, it certainly is not a light-weight,
agile process but contains strong guidance
on the processes and methods to be used. The
authors of RUP therefore state that the belief
that software development "should be organized around the skills of highly trained individuals" with little "guidance in policy and procedure" is "badly mistaken."68
Early Agile Thinking
The “antithesis” of the very structured, sequential thinking in software development and its
limitations led to the agile ideas that started
to grow from about the mid-1980s onward and
went beyond the use of iterative and incremental development. In his famous paper published in 1987, Brooks69 described incremental
development and prototyping as measures to
reduce complexity. DeMarco and Lister,70 in
their well-known book Peopleware, published
in the same year, went further and warned
against “method madness,” pointing out that
software developers, as a form of knowledge
workers, need management methods different
from that for workers in industrial production.
Again, this was a trend not specific to software development but one that applied to knowledge workers in other industries as well, calling for a new company culture based on a common vision or goal
and allowing the (knowledge) workers a lot of
freedom about how to perform their work. This
resulted, for example, in the bestseller Thriving
on Chaos by Peters71 and the rise of the then-new
topic of knowledge management, for example
with the publication72 in which Takeuchi and
Nonaka first introduced Scrum as a method for
(general) development. This latter concept was
later taken up by Schwaber and Sutherland and
adapted to software development.73
Similarly in industrial production, the
concept of “lean production” had become quite
important, and many ideas introduced by lean
production were later taken up by agile software development methods.
Light Weight and Agile Processes
The movement for iterative development and
fewer restrictions on development processes
grew stronger during the 1990s, leading to
“light-weight” processes. Many new development methodologies were introduced, such as
Scrum,74 Extreme Programming,75 Dynamic
Systems Development Method, the Crystal family of methods, and Feature Driven Development.
In various forms and with different
emphases, these methods extended the concept of iterative and incremental development and added principles such as intensive communication within the project, fast feedback, and self-organizing teams with few external rules.
Initially, these methods were called “lightweight” as opposed to the fairly detailed
“heavy-weight” processes. With the Agile Manifesto in 2001, the new term “agile” was introduced and immediately widely accepted.76,77
Summary
The common values and principles of agile
methods, as summarized in the Agile Manifesto,
helped them to gain considerable importance
and acceptance, leading to many fundamental and sometimes very emotional discussions
about the adequacy of classical, plan-driven
versus agile development.
Boehm and Turner,78 on the other hand,
argued that the two approaches do not necessarily contradict each other and that in many cases a combination of both is the best solution.
To some extent, this can be seen as a conflict between management, which tries to
keep control of the projects and processes, and
developers, who derive job satisfaction from
self-development and dislike those very controls but try to manage the complexity through
self-organization within a wide framework.79
Process models mostly take the management
view, describing expectations of how the work
is to be performed (which is not necessarily the
way the process is performed in reality).
Software Processes Today
Today, we have a great variety of software processes and life cycles, both in theory and in
practice. Many projects try to identify their
individual combination of plan-driven and agile
methods; for example, agile projects “scale” their
work when they grow in size and complexity
by introducing an overhead, using methods at
least very similar to plan-driven methods (even
though this wording would usually be disputed
strongly) to coordinate small agile teams. On
the other hand, projects that would have little
chance of success with a purely agile approach, such as the development of large technical systems including hardware and software, introduce individual agile practices into their work
and move from project durations measured in
years to, for example, annual releases.
New types of life cycle models include
Kanban and DevOps. Kanban80 was originally
introduced in industrial production but now
has been adapted to software development and
aims to reduce processing times by restricting the number of tasks worked on in parallel.
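The central Kanban mechanism can be sketched as follows (an illustrative simplification; the stages and limits are invented, not taken from the cited literature):

# Hypothetical sketch of the core Kanban mechanism: a work-in-progress (WIP)
# limit per stage; work is only pulled into a stage while it has free capacity,
# which keeps the number of tasks in parallel small.
WIP_LIMITS = {"analysis": 2, "development": 3, "test": 2}
board = {stage: [] for stage in WIP_LIMITS}

def pull(task, stage):
    """Pull a task into a stage only if that stage's WIP limit is not yet reached."""
    if len(board[stage]) < WIP_LIMITS[stage]:
        board[stage].append(task)
        return True
    return False  # the task waits upstream until capacity frees up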
DevOps81 focuses on the notoriously difficult
interaction between development and operations. It aims to bring newly developed software functionality into productive use more
or less immediately, without sacrificing the
production quality achieved by the classical
lengthy test and roll-out phases once development considers the software “finished.”
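In the same illustrative spirit (the stage functions below are hypothetical placeholders, not taken from the cited DevOps article), the DevOps aim of releasing continuously without sacrificing quality amounts to an automated pipeline whose quality gates replace the classical lengthy test and roll-out phase:

def build():
    return True  # stand-in: compile and package the change

def run_unit_tests():
    return True  # stand-in: run the automated unit test suite

def run_integration_tests():
    return True  # stand-in: run automated integration and acceptance tests

def deploy_to_production():
    pass  # stand-in: roll the change out to the production environment

def pipeline():
    """Run every change through the quality gates; deploy immediately if all pass."""
    for gate in (build, run_unit_tests, run_integration_tests):
        if not gate():
            return False  # a failed gate stops the release
    deploy_to_production()
    return True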
Another new challenge that software processes face today is the distributed project
teams that come with globalization and with
open-source development, making cooperation and the coordination of project work
increasingly difficult.
Look Back on the Main Trends
If we look at the overall history of SDLCMs, we
can see the following major trends:
»» Early years: try to understand what needs
to be done.
»» 1970s: improved understanding of the
basic development steps, leading to a focus
on development methods, structured
development etc.; this usually resulted
in sequential SDLCMs and strongly controlled development activities.
»» 1980s: growing focus on processes,
strong control, tool support, inclusion
of supporting processes, etc. as a framework for the development methods used;
this however led to a counter-movement
arguing for self-organization, iterative
life cycles, and so forth.
»» 1990s and early 2000s: the counter-movement grows in importance, leading to a
rise of agile methodologies and the parallel plan-driven and agile cultures; tool
support is no longer process-based but
gives the developer the freedom to select
which task to perform when.
»» Today: focus on “scaling agile”; growing
understanding that both cultures have
their pros and cons, and resulting efforts
to bring the best of both cultures together;
also more emphasis on the interface from
development to operations.
Several of these major trends did not start
in software development but started outside
and were then taken up in the software industry, in particular the strong process orientation
in the 1980s, followed by the antithesis of
self-organization, lean thinking, and knowledge management concepts.
Looking at the development of large-scale systems, the life cycle used to develop large-scale software systems
develops slowly and has changed little since
the 1960s.82 The changes here are found on a
lower level, in the development methods used,
the level of detail of process descriptions, and
to some extent in the more frequent releases,
going through the full cycle in a shorter timeframe.
Acknowledgments
The author would like to thank Gerhard
Chroust for his support, in particular regarding the work done at IBM in the 1980s.
References and Notes
1. B. Curtis, M.I. Kellner, and J. Over, "Process Modeling," Commun. ACM, vol. 35, no. 9, 1992, pp. 75–90.
2. B.W. Boehm, “A View of 20th and 21st Century Software Engineering,” Int’l Conf. on
Software Eng., 2006, pp. 12–29.
3. G.M. Amdahl, G.A. Blaauw, and F.P.
Brooks, “Architecture of the IBM System/360,” IBM J. of Research and Development, vol. 8, pp. 87–101, 1964.
4. N. Ensmenger, “Software as History
Embodied,” Annals of the History of Computing, vol. 31, no. 1, pp. 88–91, 2009.
5. G. O’Regan, A Brief History of Computing,
Springer, 2012.
6. H.H. Goldstine and J. von Neumann,
“Planning and Coding of Problems for
an Electronic Computing Instrument,”
tech. report, Part II, vols. 1–3, Institute for Advanced Study, 1947. Available at: https://archive.org/download/planningcodingof0103inst/planningcodingof0103inst.pdf.
7. T. Haigh, M. Priestley, and C. Rope, “Engineering ‘the Miracle of the ENIAC’: Implementing the Modern Code Paradigm,”
Annals of the History of Computing, vol. 36,
no. 2, pp. 41–59, 2014.
8. Goldstine and von Neumann, “Planning
and Coding of Problems for an Electronic
Computing Instrument.”
9. Boehm, “A View of 20th and 21st Century
Software Engineering.”
10. H.D. Benington, “Production of Large
Computer Programs,” Annals of the History
of Computing, vol. 5, no. 4, pp. 350–361,
1983.
11. Benington, “Production of Large Computer Programs.”
12. Benington, “Production of Large Computer Programs.”
13. W. Hosier, “Pitfalls and Safeguards in
Real-Time Digital Systems with Emphasis
on Programming,” Engineering Management, IRE Transactions, vol. EM-8, no. 2,
pp. 99–115, 1961.
14. Software Engineering: Report on a Conference Sponsored by the NATO Science Committee, Garmisch, Germany, 7th to 11th October 1968, NATO Science Committee, tech. report, 1969. Available at: http://www.scrummanager.net/files/nato1968e.pdf.
15. P.E. Rosove, Developing Computer-Based Information Systems, John Wiley and Sons, 1967.
16. Rosove, Developing Computer-Based Information Systems.
17. N. Ensmenger and W. Aspray, "Software as Labor Process," History of Computing: Software Issues, U. Hashagen, R. Keil-Slawik, and A.L. Norberg, eds., Springer-Verlag, 2002, pp. 139–166.
18. C. Larman and V.R. Basili, "Iterative and Incremental Development: A Brief History," IEEE Computer, vol. 36, no. 6, pp. 47–56, 2003.
19. F.W. Zurcher and B. Randell, "Iterative Multi-Level Modeling—A Methodology for Computer System Design," Proc. IFIP Congress 68, IEEE CS Press, pp. 138–142, 1968.
20. W.W. Royce, "Managing the Development of Large Software Systems," Proc. IEEE Wescon, pp. 1–9, 1970.
21. Larman and Basili, "Iterative and Incremental Development: A Brief History."
22. T.E. Bell and T.A. Thayer, "Software Requirements: Are They Really a Problem?" Proc. 2nd Int'l Conf. on Software Eng., pp. 61–68, 1976.
23. B.W. Boehm, Software Engineering Economics, Prentice-Hall, 1981.
24. Royce, "Managing the Development of Large Software Systems."
25. Royce, "Managing the Development of Large Software Systems."
26. Royce, "Managing the Development of Large Software Systems."
27. Larman and Basili, "Iterative and Incremental Development: A Brief History."
28. Larman and Basili, "Iterative and Incremental Development: A Brief History," p. 47f.
29. Larman and Basili, "Iterative and Incremental Development: A Brief History," p. 52f.
30. S. Payette, "Hopper and Dijkstra: Crisis, Revolution and the Future of Programming," Annals of the History of Computing, pp. 654–673, 2014.
31. Boehm, "A View of 20th and 21st Century Software Engineering," §2.3.
32. Boehm, “A View of 20th and 21st Century
Software Engineering,” §2.4.
33. W.S. Humphrey, Managing the Software
Process, Addison-Wesley Professional,
1989.
34. M.C. Paulk et al. “Capability Maturity
Model SM for Software, version 1.1,” tech.
report CMU/SEI-93-TR-024, Software Engineering Inst., Carnegie Mellon Univ., 1993.
35. Central Computer and Telecommunications Agency (CCTA), Service Level Management, HMSO, 1989.
36. H. Hummel, “The Life Cycle Methodology
for Software Production and the Related
Experience,” Approving Software Products,
Proc. the IFIP WG 5.4 Working Conference,
W. Ehrenberger, ed., North-Holland, 1990.
37. C. Potts, Proc. a Software Process Workshop,
February 1984, IEEE Computer Society,
1984.
38. A. Fuggetta, R. Conradi, and V. Ambriola,
eds., First European Workshop on Software
Process Modeling: CEFRIEL, Milan, Italy,
30–31 May 1991, Associazione Italiana
per l’Informatica ed il Calcolo Automatico, Working Group on Software Engineering, 1991, cf. http://trove.nla.gov.au
/work/22457426.
39. G. Chroust, Modelle der Software-Entwicklung, R. Oldenbourg Verlag München
Wien, 1992.
40. P. Kruchten, The Rational Unified Process:
An Introduction, Addison-Wesley, 1998.
41. IBM Corp., “Command—Vorgehensmodell für professionelle Entwicklung und
Wartung von Anwendungen,” tech. report
Form GT 12-3255-1, 1986.
42. G. Chroust, “Application Development
Project Support (ADPS)—An Environment
for Industrial Application Development,”
ACM Software Engineering Notes, vol. 14,
no. 5, pp. 83–104, 1989.
43. G. Merbeth, “MAESTRO-II—das integrierte CASE-System von Softlab,” CASE—
Systeme und Werkzeuge, B-I Wissenschaftsverlag, 1992, pp. 215–232.
44. H. Huenke, ed., “Software Engineering
Environments,” Proc. Lahnstein, Germany,
1980.
45. I. Sommerville, ed., Software Engineering
Environments, P. Peregrinus, 1986.
46. V. Gruhn, “Process-Centered Software
Engineering Environments. A Brief History and Future Challenges,” Annals of
Software Engineering, pp. 363–382, 2002.
47. B.C. Warboys, “The IPSE 2.5 Project: Process
Modelling as a Basis for a Support Environment,” Proc. the First International Conference
on System Development Support Environments
and Factories, Pitman, 1990, pp. 59–74. Available at: http://apt.cs.manchester.ac.uk/ftp/pub/IPG/bw89.pdf.
48. L. Osterweil, “Software Processes are Software Too,” Proc. the Ninth International Conference on Software Engineering, 1987.
49. B.W. Boehm, “Guidelines for Verifying
and Validating Software Requirements
and Design Specifications,” Euro IFIP 79,
P.A. Samet, ed., North-Holland Publishing
Company, 1979, pp. 711–719.
50. K. Forsberg and H. Mooz, “The Relationship of System Engineering to the Project
Cycle,” Proc. the First Annual Symposium
of National Council on System Engineering,
1991.
51. Osterweil, “Software Processes are Software Too.”
52. Boehm, “Guidelines for Verifying and Validating Software Requirements and Design
Specifications.”
53. Forsberg and Mooz, “The Relationship of
System Engineering to the Project Cycle.”
54. Hummel, “The Life Cycle Methodology
for Software Production and the Related
Experience.”
55. C.B. Jones et al. A Formal Development Support System, Springer-Verlag, 1991.
56. R. Kneuper, “Limits of Formal Methods,”
Formal Aspects of Computing, vol. 9, no. 4,
pp. 379–394, 1997.
57. H.D. Mills, M. Dyer, and R.C. Linger,
“Cleanroom Software Engineering,” IEEE
Software, vol. 4, no. 5, pp. 19–25, 1987.
58. R.C. Linger and C.J. Trammell, “Cleanroom Software Engineering. Reference
Model Version 1.0,” tech. report CMU/
SEI-96-TR-022, Software Engineering
Inst., Carnegie Mellon Univ., 1996.
59. Larman and Basili, "Iterative and Incremental Development: A Brief History," p. 51.
60. D. McCracken and M. Jackson, "Life-Cycle Concept Considered Harmful," ACM Software Eng. Notes, vol. 7, no. 2, pp. 29–32, 1982.
61. Boehm, "A View of 20th and 21st Century Software Engineering," p. 18.
62. M.A. Cusumano and R.W. Selby, Microsoft Secrets, Touchstone, 1995.
63. B.W. Boehm, "A Spiral Model of Software Development and Enhancement," IEEE Computer, vol. 21, no. 5, pp. 61–72, 1988.
64. J. Martin, Rapid Application Development, Macmillan, 1991.
65. Kruchten, The Rational Unified Process: An Introduction.
66. I. Jacobson, G. Booch, and J. Rumbaugh, The Unified Software Development Process, Addison-Wesley, 1999.
67. Jacobson, Booch, and Rumbaugh, The Unified Software Development Process.
68. Jacobson, Booch, and Rumbaugh, The Unified Software Development Process, preface.
69. F.P. Brooks Jr., "No Silver Bullet—Essence and Accidents of Software Engineering," IEEE Computer, vol. 20, no. 4, 1987.
70. T. DeMarco and T. Lister, Peopleware: Productive Projects and Teams, Dorset House Publishing, 1987.
71. T. Peters, Thriving on Chaos: Handbook for a Management Revolution, Alfred A. Knopf, 1987.
72. H. Takeuchi and I. Nonaka, "The New New Product Development Game," Harvard Business Rev., Jan. 1986.
73. K. Schwaber, "SCRUM Development Process," OOPSLA '95 Workshop Proc., Springer, 1997. Available at: http://jeffsutherland.org/oopsla/schwaber.html.
74. Schwaber, "SCRUM Development Process."
75. K. Beck, Extreme Programming Explained—Embrace Change, Addison-Wesley, 1999.
76. Agile Alliance, Manifesto for Agile Software Development, 2001. Available at: http://agilemanifesto.org/.
77. J. Highsmith, "History: The Agile Manifesto," 2001. Available at: http://agilemanifesto.org/history.html.
78. B.W. Boehm and R. Turner, Balancing Agility and Discipline: A Guide for the Perplexed, 1st ed., Addison-Wesley Professional, 2003.
79. Ensmenger and Aspray, "Software as Labor Process," p. 150.
80. M.O. Ahmad, J. Markkula, and M. Oivo, “Kanban in Software Development: A Systematic
Literature Review,” 39th EUROMICRO Conference on Software Engineering and Advanced
Applications (SEAA), 2013, pp. 9–16.
81. M. Loukides, “What is DevOps?” Radar, 7
June 2012. Available at: http://radar.oreilly.com/2012/06/what-is-devops.html.
82. D.A. Cook, “CrossTalk and Software—
Past, Present and Future. A Twenty-Five
Year Perspective,” CrossTalk, vol. 24, no. 6,
pp. 4–7, 2013.
Ralf Kneuper earned a Diploma in Mathematics at the University of Bonn, Germany (1985), and then a PhD in Computer Science at the University of Manchester, United Kingdom (1989). He has worked with various companies on software quality assurance, quality management, and software processes. Currently, he is a consultant on quality management and process improvement and professor of business informatics at the International University of Applied Sciences (IUBH). He has published extensively on Capability Maturity Model Integration (CMMI), process improvement, and process quality. Contact him at ralf@kneuper.de.