What is research?

Research Methods

Types of Methods
– Software Methods
– Scientific Methods
– Requirements Elicitation
Software Development Methodologies / Models

Development Methodologies

Traditional Waterfall Model
Systems Development Life Cycle (SDLC)
Structured Systems Analysis and Design
Rapid Applications Development (RAD)
Spiral Model
Agile Methodologies
Development Methodologies (1/2)

Agile software development
Agile Unified Process (AUP)
Open Unified Process
Best practice
Cathedral and the Bazaar, Open source
Constructionist design methodology (CDM)
Cowboy coding
Design by Use (DBU)
Design-driven development (D3)
Don't repeat yourself (DRY) or Once and Only Once (O3)
Dynamic Systems Development Method (DSDM)
Extreme Programming (XP)
Development Methodologies (2/2)

Iterative and incremental development
KISS principle (Keep It Simple, Stupid)
MIT approach
Quick-and-dirty
Rational Unified Process (RUP)
Scrum (management)
Spiral model
Software Scouting
Test-driven development (TDD)
Unified Process
Waterfall model
Worse is better (New Jersey style)
Extreme Programming (XP)
You Ain't Gonna Need It (YAGNI)
The Waterfall Model

Whatever means of software acquisition you choose, all the stages of the development life cycle are followed. However, there are some differences in what happens at each stage, depending on whether you opt for bespoke development, off-the-shelf purchase or end-user development.
Drawbacks of SDLC

Sequential nature of life cycle
– Bureaucratic, long-winded, expensive
– Minor changes can cause problems
– Cost of correcting errors
– Misunderstandings/omissions may not come to light until the user acceptance test stage – maybe too late to make significant changes
– Change may be needed after sign-off by the user
Drawbacks of SDLC

User Dissatisfaction
– Early sign-off
– Incorrect functionality
– Incomplete functionality
– User friendliness
– Bugs
– Lack of participation
Drawbacks of SDLC

Applications backlog
– Visible
– Invisible

Failure to meet needs of management
– Strategic/tactical potential ignored
– Unambitious systems design
Drawbacks of SDLC

Problems with documentation
– User acceptance
– Restrictive
– Slow

Maintenance workload

Inflexibility
– Unable to cope with a rapidly changing business climate
‘V’ Model

[Diagram: the V-model life cycle. The descending (left) leg runs Project Initiation → Requirements specification → Detailed requirements specification → Architectural software design → Detailed software design → Module design, with QA checks applied at the design stages. At the bottom, Code & unit test produces debugged modules. The ascending (right) leg runs Software integration & test (integrated software) → System integration & test (verified system) → Acceptance → Product evolution / phase-out, with each test stage verified against the specification produced at the same level on the left.]
SSADM

Only covers part of the system development process, i.e. analysis and design.
It emphasises the importance of the correct determination of systems requirements.
SSADM Stages

Feasibility Study
– Stage 0 – Feasibility

Requirements Analysis
– Stage 1 – Investigation of current requirements
– Stage 2 – Business Systems Options

Requirements Specification
– Stage 3 – Definition of Requirements
SSADM Stages

Logical System Specification
– Stage 4 – Technical System Options
– Stage 5 – Logical Design

Physical Design
– Stage 6 – Physical Design
Rapid Applications Development (RAD)

A method of developing information systems which uses prototyping to achieve user involvement and faster development compared to traditional methodologies such as SSADM.
A prototype is a preliminary version of part, or a framework of all, of an information system which can be reviewed by end-users. Prototyping is an iterative process in which users suggest modifications before further prototypes and the final information system are built.
The Spiral Model

Developed by Boehm (1988).
An iterative systems development model in which the stages of analysis, design, code and review repeat as new features for the system are identified.
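The repeating analyse–design–code–review cycle described above can be sketched as a loop. This is only an illustration: the feature names, the string-based "phases" and the cycle budget are invented here, not part of any formal spiral-model definition.

```python
# Hypothetical sketch of the spiral model's repeating cycle.
# Phase steps are stand-ins (strings); a real cycle would also include
# the risk analysis that Boehm's model places at the start of each loop.

def spiral_development(initial_features, max_cycles=4):
    """Run analyse -> design -> code -> review cycles until no
    new features remain or the cycle budget is spent."""
    backlog = list(initial_features)
    delivered = []
    for cycle in range(1, max_cycles + 1):
        if not backlog:
            break
        feature = backlog.pop(0)                    # analyse: pick the next feature
        design = f"design of {feature}"             # design
        build = f"implementation of {design}"       # code
        delivered.append(build)                     # review: accept the increment
        # a real review might add newly identified features
        # to the backlog here (not modelled in this sketch)
    return delivered

print(spiral_development(["login", "search"]))
```

The point of the sketch is the control flow: development does not end after one pass, it loops until the identified features are exhausted.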
The Capability Maturity Model for Software Development

A 5-stage model for judging the maturity of the software processes of an organisation and for identifying the key practices that are required to increase the maturity of these processes.
Many large specialist organisations (e.g. NASA) have achieved the higher levels.
Many smaller companies have processes that are at stage 1 or 2.
Dynamic Systems Development Methodology (DSDM)

A methodology that describes how RAD can be approached.
The focus of this approach is on delivering the business functionality on time.
Testing is integrated throughout the life cycle and not treated as a separate activity.
For further information refer to:
Scientific Research Methodologies / Models
Quantitative & Qualitative

QUANTITATIVE – argumentation based on numbers and calculations
QUALITATIVE – argumentation not based on numbers and calculations
Main distinctions seen between quantitative and qualitative ‘paradigms’
– The conventional and constructivist belief systems (adapted from Guba and Lincoln 1989)

Fundamental questions → Conventional beliefs / Constructivist beliefs:
– What is there that can be known? (ONTOLOGY): REALISM / RELATIVISM
– What is the relationship of the knower to the known (or knowable)? (EPISTEMOLOGY): OBJECTIVIST / SUBJECTIVIST
– What are the ways of finding out knowledge? (METHODOLOGY): INTERVENTIONIST / HERMENEUTIC
Main distinctions seen between quantitative and qualitative ‘paradigms’
– Common dichotomies in the methodological literature

qualitative                  quantitative
* bad                        good
descriptive                  predictive
empiricism                   rationalism
atheoretical                 theoretical
subjective                   objective
inductive                    deductive
participant observation      survey techniques
anthropology                 sociology
naturalism                   anti-naturalism
art                          science
hermeneutics                 positivism
aristotelian                 galilean
teleological                 causal
finalistic                   mechanistic
understanding                explanation
Verstehen                    Erklären
phenomenological             logical positivism
Basic research methods

Quantitative research (e.g. survey)
Qualitative research (e.g. face-to-face interviews; focus groups; site visits)
Case studies
Participatory research
Quantitative research

Involves information or data in the form of numbers.
Allows us to measure or to quantify things.
Respondents don’t necessarily give numbers as answers – the answers are analysed as numbers.
A good example of quantitative research is the survey.
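To illustrate how verbal answers end up being analysed as numbers, here is a minimal sketch. The question scale, the coding and the responses are invented for the example; this is the common Likert-scale coding technique, not a prescribed method from this deck.

```python
# Invented example: coding Likert-scale survey answers as numbers.
# Respondents answer in words; the analysis converts each word to a score.
scale = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
         "agree": 4, "strongly agree": 5}

responses = ["agree", "strongly agree", "neutral", "agree", "disagree"]

scores = [scale[r] for r in responses]    # words -> numbers
mean_score = sum(scores) / len(scores)    # quantify the overall attitude

print(scores, mean_score)                 # -> [4, 5, 3, 4, 2] 3.6
```

The respondents never said "4" or "3.6"; those numbers are imposed by the analysis, which is exactly the distinction the slide draws.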
Surveys

Think clearly about your questions (you need to constrain the answers as much as possible).
Make sure the results will answer your research question.
The Internet can be used for conducting surveys if you need to cover a wide geographic reach.
Qualitative research

Helps us flesh out the story and develop a deeper understanding of a topic.
Often contrasted with quantitative research.
Together they give us the ‘bigger picture’.
Good examples of qualitative research are face-to-face interviews, focus groups and site visits.
Face-to-face interviews

You must prepare questions in advance.
It is a good idea to record your interviews.
Interviews take up time, so plan for an hour or less (roughly 10 questions).
Stick to your questions, but be flexible if relevant or interesting issues arise during the interview.
Focus groups

Focus groups take time to arrange, so prepare in advance (use an intermediary to help you if you can).
Who will be in your focus group? (e.g. age, gender)
Size of focus group (8–10 is typical).
Consider whether or not to have separate focus groups for different ages or genders (e.g. when discussing sex and sexuality).
Site visits and observation

Site visits involve visiting an organization, community project, etc.
Consider using a guide.
Observation is when you visit a location and observe what is going on, drawing your own conclusions.
Both help to make your research more relevant and concrete.
Case studies

A method of capturing and presenting concrete details of real or fictional situations in a structured way.
Good for comparative analysis.
Participatory research

Allows the community being researched to participate in the research process (e.g. developing the research question; choosing the methodology; analysing the results).
A good way to ensure the research does not simply reinforce the prejudices and presumptions of the researcher.
Good for raising awareness in the community and for developing appropriate action plans.
Planning your research: Key questions

What do you want to know?
How do you find out what you want to know?
Where can you get the information?
Who do you need to ask?
When does your research need to be done?
Why? (Getting the answer)
Step 1: What?

What do I want to know?
When developing your research question, keep in mind:
– who your research is for;
– what decisions your research will inform;
– what kind of information is needed to inform those decisions.
Conduct a local information scan.
Take another look at your research question.
Step 2: How? Where? Who?

How do I find out what I want to know?
Where can I get the information I need?
Who do I need to ask?
Choose your methodology:
– quantitative (numbers) information
– qualitative (in-depth, explanatory) information
– case studies
– site visits or observation
– participatory research
Step 3: When?

When do all the different parts of the research need to be done?
List all your research work areas.
Map them against a timeline.
Develop a work plan.
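Mapping work areas against a timeline can be as simple as a small table. The work areas and week numbers below are invented to illustrate the idea of turning the list into an ordered work plan:

```python
# Invented example: a research work plan mapping tasks to weeks.
work_plan = [
    ("literature review",    1, 3),    # (work area, start week, end week)
    ("design questionnaire", 2, 4),
    ("conduct survey",       5, 8),
    ("analyse results",      9, 10),
    ("write up",             11, 12),
]

# Print the plan in timeline order (earliest start first).
for task, start, end in sorted(work_plan, key=lambda t: t[1]):
    print(f"weeks {start:2d}-{end:2d}: {task}")
```

Even this crude form makes overlaps and gaps in the schedule visible, which is the point of mapping work areas against a timeline before starting.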
Step 4: Why? Getting the answer

Collect your data.
Keep returning to your research question.
Organize your research results to answer the question.
Keep in mind who you are doing the research for.
Focus on what the research results do tell you.
Be creative, methodical and meticulous.
Requirements Elicitation

Requirements Elicitation

Information to elicit:
– Description of the problem domain
– List of problems/opportunities requiring solution (the requirements)
– Any client-imposed constraints upon the system
Requirements Elicitation

Requirements Elicitation Techniques:
– Background Reading
– Hard data collection
– Interviews
– Questionnaires
– Group Techniques
– Participant Observation
– Ethnomethodology
– Knowledge Elicitation Techniques
Sources of Information

– Clients (actual and potential)
– Users of systems (actual and potential)
– Domain Experts
– Pre-existing system (within the problem domain)
– Other relevant products
– Documents
– Technical standards and legislation
Challenges of Elicitation (1/2)

• Thin spread of domain knowledge
– The knowledge might be distributed across many sources. It is rarely available in an explicit form (i.e. not written down).
– There will be conflicts between knowledge from different sources.
• Tacit knowledge (the “say–do” problem)
– People find it hard to describe knowledge they regularly use.
Challenges of Elicitation (2/2)

• Limited Observability
– The problem owners might be too busy coping with the current system.
– The presence of an observer may change the problem, e.g. the Probe Effect, the Hawthorne Effect.
• Bias
– People may not be free to tell you what you need to know.
– People may not want to tell you what you need to know.
  – The outcome will affect them, so they may try to influence you (hidden agendas).
EXERCISE

Quantitative Methods           Qualitative Methods
Positivistic, Empirical        Interpretive
Experimental, Falsification    Construction of Reality
Correlational                  Ethnographic
Surveys                        Case Studies
Structured Interview           Unstructured Interview
Postal Questionnaires          Participant Observation
Tests of Performance           Diary Keeping
Attitude Intervention          Narratives
Nomothetic                     Ideographic, Hermeneutic