Methodologies for IS development

COM332 – SA3
RAD Overview
DSDM
• RAD developed as a reaction to the
problems of traditional development
– Problems of long development lead times
– Problems associated with changing and
evolving requirements during the
development process
• An Overview of RAD
– A complete methodology
– Covering systems development
• From business requirements to ongoing development (maintenance)
– A ‘tool kit’ methodology
– Adaptable to different projects
– Can utilise a wide range of techniques & tools
– Places emphasis on user involvement & responsibility throughout development
– Properties of ‘rapid’ must be defined
– Delivered in anything from 2–6 months
– When a project is too large, split into increments
• Each increment implemented separately, with frequent delivery of working parts of the system
– Aim of development is to be cost effective
– User quality & speed of delivery are paramount
– Quality of systems must be maintained
– Need effective project management
– Need appropriate, up-to-date:
o documentation
o testing
o quality assurance
o requirements specifications
o designs
o maintainability
o reuse
• A RAD approach is more applicable to many
organisations for the following reasons:
– business operates in an increasingly competitive world marketplace
• right systems at right time provide competitive edge
– business organisations are dynamic & evolving
• requirements may change as system being built
– IT now viewed as cost centre as opposed to a resource
• Systems delivered early can start saving or earning money
sooner
– Systems operate in the social & political environment of
organisation
• system jointly developed by users then is more likely to be
accepted
• Characteristics of RAD
– Incremental development
– Timeboxing
– The Pareto principle
– MoSCoW rules
– JAD workshops
– Prototyping
– Sponsor and Champion
– Toolsets
• Incremental development
– Important element of RAD philosophy
• Not all systems requirements can be identified and
specified in advance
• Some requirements will only emerge when the users see
and experience the system
• Requirements are never seen as complete but evolve
and change over time with changing circumstances
• Why waste time?
– RAD starts with a high-level, imprecise list of requirements
• Refined and changed during the process
– RAD identifies the easy, obvious requirements and, in conjunction with the 80/20 rule, uses these as the starting point, recognising that future iterations or timeboxes can handle the evolving requirements.
• Timeboxing
• An agreed period of time in which a task (stage) is to be completed
• Sometimes cost or functionality has to be
compromised to meet time deadline
• Deadline for each timebox is fixed & cannot be
moved
• Project is broken down into series of timeboxes
• Each results in delivery of working functionality
• Users can receive their portion of new system when
required
• Timeboxing leads to frequent phased delivery of products
to end users
– Feasibility study and requirements
gathering
• System then broken down into timeboxes with
each functional area occupying a timebox of its
own
• JRP workshop decides which phases are
critical to launch a system – rest of system has
to be in place not later than 1 month after
launch
• If functionality is not ready for the release, it will go ahead without the missing element
• Functional prototyping stage will therefore
include its own prioritisation to recognise what
is critical & what can be safely deferred to later
delivery of software
• The timebox can only work satisfactorily if two conditions are met:
– Development team includes a user representative
who is on hand to make & criticise design
decisions & test development as it proceeds
– Timebox assumes that time is fixed & functionality
is variable
– This is culture change for most users who
expect functionality to be fixed & time (within
reason) to be variable
• Development team must include a user who recognises
this & can explain the concept to other users
• Development conducted using a prototyping tool to
support models & convert these models to full working
systems
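The fixed-time, variable-functionality rule can be sketched in code (a minimal illustration only; the feature names, priorities and effort figures below are invented):

```python
# Timeboxing sketch: the deadline (capacity) is fixed; the feature set
# is variable. Anything that does not fit is deferred to a later
# timebox rather than the deadline being moved.

def plan_timebox(features, capacity_days):
    """Fill a timebox in priority order; return (included, deferred).

    features: list of (name, priority, effort_days) tuples, where a
    lower priority number means more important.
    """
    included, deferred = [], []
    remaining = capacity_days
    for name, priority, effort in sorted(features, key=lambda f: f[1]):
        if effort <= remaining:
            included.append(name)
            remaining -= effort
        else:
            deferred.append(name)  # picked up by a later timebox
    return included, deferred

# Hypothetical backlog for a 20-day timebox
backlog = [
    ("order entry", 1, 8),
    ("login", 1, 5),
    ("reports", 2, 10),      # too big for what's left: deferred
    ("audit trail", 3, 4),
]
included, deferred = plan_timebox(backlog, capacity_days=20)
```

Note that the deadline never changes: the lowest-priority features that do not fit are dropped from this delivery, which is exactly the culture change the slides describe.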
• The Pareto principle
– 80/20 rule
– Around 80% of the system requirements can be
delivered with around 20% of the effort needed to
complete 100% of the requirements.
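The 80/20 claim can be made concrete with a toy calculation (the requirements and their value/effort figures are invented for illustration):

```python
# Pareto sketch: pick requirements in order of value per unit of
# effort, and see how much value a small slice of the effort delivers.

requirements = [
    # (name, business value, effort) -- hypothetical figures
    ("core workflow", 50, 10),
    ("search", 20, 5),
    ("basic reports", 10, 5),
    ("exotic edge cases", 15, 60),
    ("full audit history", 5, 20),
]

total_value = sum(v for _, v, _ in requirements)
total_effort = sum(e for _, _, e in requirements)

budget = 0.2 * total_effort  # spend only ~20% of the total effort
spent = gained = 0
for name, value, effort in sorted(requirements, key=lambda r: r[2] / r[1]):
    if spent + effort > budget:
        break
    spent, gained = spent + effort, gained + value

# With these invented figures: 80% of the value for 20% of the effort
```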
• MoSCoW rules
– The Must haves
• without these features the project is not viable
– The Should haves
• To gain maximum benefit these features will be
delivered, but the project’s success does not rely on
them
– The Could haves
• If time and resource allow these features could be
delivered
– The Won’t haves
• These features will not be delivered
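A minimal sketch of applying the MoSCoW classification when scoping a delivery (the requirement names are invented; only the four category labels come from the slides):

```python
from collections import defaultdict

# MoSCoW sketch: bucket requirements by priority. The project is only
# viable if every Must-have is delivered; Could-haves are the first to
# be dropped when a timebox deadline is at risk.

MOSCOW = ("must", "should", "could", "wont")

def classify(requirements):
    """Group (name, priority) pairs into the four MoSCoW buckets."""
    buckets = defaultdict(list)
    for name, priority in requirements:
        if priority not in MOSCOW:
            raise ValueError(f"unknown MoSCoW priority: {priority!r}")
        buckets[priority].append(name)
    return buckets

reqs = [
    ("take orders", "must"),
    ("print invoices", "must"),
    ("email receipts", "should"),
    ("theme picker", "could"),
    ("legacy data import", "wont"),  # explicitly out of scope
]
buckets = classify(reqs)
minimum_viable_scope = buckets["must"]
```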
• JAD workshops
– Key users, client, developers and scribe produce
system scope & business requirements under
direction of a facilitator
– Takes place away from business environment
– Must come up with business requirements fully
documented (3-5 working days)
– Facilitator has to be skilled in running workshop
(ensure all issues covered)
– Need to obtain a common purpose
– Facilitator must ensure that client & users are fully
involved & developers do not dominate workshop
• This can eliminate months of analysis in traditional development
– The key element is that developers
must have powerful software
development tools to rapidly develop
systems to sufficient software quality
• Structure of RAD projects
– Reduced time scales for deriving business
requirements including use of JAD workshops
– Iterative development in conjunction with users
involving prototyping & frequent delivery of
working products
– Business requirements analysis & systems analysis undertaken in a fundamentally different way
– After feasibility study & appropriate research &
preparation into application a Joint Application
Design (JAD) workshop held
DSDM
• Despite the proposed advantages, the RAD approach has been criticised as “quick but dirty”
– Some shortcuts are taken for the sake of speed
• In 1994 a group of systems developers interested in RAD formed a consortium
• They believed RAD should not only be rapid
but also disciplined and high quality
(Avison & Fitzgerald, 2003)
• The consortium defines RAD as a project delivery framework that actually works.
• It aids the development and delivery of
business solutions in tight timescales and
fixed budgets.
– First published in 1995
– Second version in the same year
– Version 3 in 1997
– In 2000, eDSDM – a tailored version of DSDM for organisations and projects undertaking e-business initiatives
– Version 4 introduced in 2001, described as a framework for business-centred development
– Most recent version is 4.2
• DSDM
– A framework, not a methodology
• Details of how things are to be done, and what the various products will contain, are left to the organisation or the individual to decide
• Does not prescribe tools & techniques
• Serves as a high-level project management framework for RAD
• There are nine principles for DSDM, all critical to project success
– Active user involvement
– DSDM teams must be empowered to make
decisions
– Focus on frequent delivery of products
– Fitness for business purpose is the essential criterion for acceptance of deliverables
– Iterative & incremental development necessary to
converge on an accurate business solution
– Changes during development are reversible
– Requirements are baselined at a high level
– Testing integrated throughout the life cycle
– A collaborative & co-operative approach between
all stakeholders is essential
• DSDM defines five phases for the
development
• Feasibility study and business study are
carried out sequentially and prior to other
phases
• The sequence and overlapping for the rest of
the phases are left to the organisation
– Feasibility study
• Includes the normal feasibility elements, the cost/benefit
case etc.
• Also concerned with determining whether DSDM is the
right approach
– Business study
• Quick and high level
• Gaining an understanding of the business
– Functional model iteration
• High level functions and information requirements from
business study are refined.
• Standard analysis models are produced, followed by the development of a prototype and then the software
– System design and build iteration
• System is built ready for delivery to the users including
the minimum usable subset of requirements.
– Implementation
• Cut over from the existing system or the environment to
the new. Includes training, development and completion
of the user manuals and documentation
• A project review document is produced which assesses whether all the requirements have been met or whether further iterations or timeboxes are required.
Classifying and comparing
methodologies
• Avison & Fitzgerald’s categories (2003)
– Process-oriented methodologies
• Examples: STRADIS, YSM (Yourdon), JS
– Blended methodologies
• Examples: SSADM, Merise, IE, Welti ERP
– Object-oriented methodologies
• Examples: OOA, RUP
– Rapid development methodologies
• Examples: JMRAD (James Martin), DSDM, XP, WiSDM
– People-oriented methodologies
• Examples: ETHICS, KADS, CommonKADS
– Organisational-oriented methodologies
• Examples: SSM, ISAC, PI, PRINCE, Renaissance
– Frameworks
• Examples: Multiview, SODA, CMM, Euromethod
• Bansler’s “Three theoretical schools” (1989)
– Systems-theoretical
• Knowledge interest: Profit maximisation
• Organisation: Cybernetic system
• Labour force: System components
• Labour relations: Common interest
– (characterised by ISAC)
– Socio-technical
• Knowledge interest: Job satisfaction (participation)
• Organisation: Socio-technical system
• Labour force: Individuals
• Labour relations: Common interest
– (characterised by ETHICS)
– Critical
• Knowledge interest: Industrial democracy
• Organisation: Framework for conflicts
• Labour force: Groups
• Labour relations: Opposing interests
– (characterised by historical examples of the “Scandinavian approaches”)
Comparing Methodologies
• Two main reasons for comparing
methodologies
– Academic reason – better understanding of
the nature of the methodologies
– Practical reason – choosing a methodology, part of one, or a number of methodologies for a particular application
• A few of the ‘ideal type’ criteria for assessing methodologies:
– Rules: methodology should provide formal guidelines to cover
phases, tasks, deliverables etc
– Total coverage: methodology should cover the entire system
development process
– Understanding the information resource: effective utilisation of data
available and processes which need to make use of the data
– Ongoing relevance: methodology should be capable of being
extended so that new techniques and tools can be incorporated
– Automated development aids: where possible use software tools to
enhance productivity
– Consideration of user goals and objectives: to satisfy the users
– Participation: encourage participation through simplicity and ability
to facilitate good communication
– Relevance to practitioner: appropriate to the practitioner using it, in terms of level of knowledge, experience with computers, and social and communication skills
– Relevance to application: must be appropriate to the type of the
system being developed
• Comparison frameworks
– Bjorn-Andersen (1984) checklist includes criteria
relating to values and society
– Jayaratna’s comparative framework (NIMSAD)
(1994)
• Jayaratna’s framework is based on the models and the epistemology of systems thinking.
• The evaluation has three elements
• The problem situation or methodology context
• The intended problem-solver or methodology user
• The problem-solving process - the methodology itself
• the problem situation
– Because:
• the effectiveness of systems is measured against their contribution to information users in the organisation
• developers need to interact with members of the organisation
• this is the area where methodology users are introduced to the problem as seen by the client
• and where interpersonal relationships are formed
– Ask how the methodology helps
• the intended problem-solver
– Ask:
• why does the intended problem-solver select some elements of the action world as relevant/significant/useful, and regard others as irrelevant/insignificant/useless?
• how do they select or abstract elements of the action world?
• what are the implications of this selection?
• the problem-solving process
– Phase 1 Problem formulation
• Stage 1 understanding the situation of concern
• Stage 2 performing the diagnosis (where are we now?)
• Stage 3 defining the prognosis outline (where do we
want to be and why?)
• Stage 4 defining problems
• Stage 5 deriving notional systems
– Phase 2 Solution design
• Stage 6 performing conceptual/logical design
• Stage 7 performing physical design
– Phase 3 Design implementation
• Stage 8 implementing the designs
• One feature of this framework is that it recommends evaluation be conducted at three stages:
– before intervention- before a methodology
is adopted
– during intervention - during use
– after intervention – assessment of success
• The best methodology cannot be found in isolation; the search for an appropriate methodology should be done in the context of the problem being addressed, the application, and the organisation and its culture.
• Practical problems in comparing methodologies
– Methodologies are not stable, they are continuing to develop
and evolve
– Commercial reason – the documents are not always
published or readily available to the public
– The practice of a methodology is sometimes significantly different from what is prescribed by the documentation
– Consultants or developers using the methodology interpret the concepts in different ways
• Framework
– Comparing methodologies is a very difficult task.
– This framework is not supposed to be fully comprehensive.
– This framework has seven basic elements:
1. Philosophy
a. Paradigm
b. Objectives
c. Domain
d. Target
2. Model
3. Techniques and tools
4. Scope
5. Outputs
6. Practice
a. Background
b. User base
c. Participants
7. Product
• Philosophy
– Philosophy distinguishes a methodology from a method.
– The choice of the area covered by the methodology, the systems, data or people orientation, and the bias towards a purely IT-based solution are made on the basis of the philosophy of the methodology.
– Philosophy is the principle, or set of principles, that underlies the methodology.
– There are four factors to the philosophy:
• Paradigm
• Objectives
• Domain
• Target
• Paradigm
– “A specific way of thinking about problems, encompassing a set of achievements which are acknowledged as the foundation of further practice” (Kuhn, 1962)
– Science paradigm – characterised by hard scientific development; consists of:
• Reductionism
• Repeatability
• Refutation
– Systems paradigm – characterised by the holistic approach:
• Human activity systems
• Characterised by emergent properties (the whole is greater than the sum of the parts)
• Objectives
– One direct clue to a methodology’s philosophy is its stated objective or objectives, for example:
• Build a computerised information system
• Discover whether there is a need for an information system
• Determine the boundaries of the area of interest
• Domain
– The domain of situations that the methodology addresses.
• Target
– The applicability of the methodology.
– Some are specifically targeted at particular problems while others are general purpose.
• Model
– The model is the basis of the methodology’s view of the world.
– An abstraction and representation of the important factors of the information system or organisation.
– The model works at a number of different levels:
• As a means of communication – pictorial, iconic or schematic
• As a way of capturing the essence of the problem or design
• As a representation which provides an insight into the problem area
• Techniques and tools
– Identifies the techniques and tools used in a methodology.
• Scope
– Scope is an indication of the stages of the systems development life cycle which the methodology covers.
• Outputs
– What the methodology produces in terms of deliverables.
• Practice
– Methodology background:
• Commercial
• Academic
– The user base:
• The participants in the methodology and their skill levels
• Product
– What the purchasers actually get for their money:
• Software
• Written documentation
• Training
• Telephone help line
• Consultancy