Introduction to Design Research:
a Methodological Background for
Scientific Work
Elena Paslaru Bontas
Semantic Web PhD Network Berlin Brandenburg
30.09.2005
Outline

- Motivation
- Types of research
- Design Research Basics
- Evaluation in Design Research
- Conclusion
Motivation

Motivation for research:
- pure research: enhance understanding of phenomena
- instrumentalist research: a problem needs a solution
- applied research: a solution needs application fields

Motivation for research methodology:
- control the research process (qualitatively)
- validate research results
- compare research approaches
- respect the rules of good scientific practice
Research: A Definition

Research:
- an activity that contributes to the understanding of a phenomenon [Kuhn, 1962; Lakatos, 1978]
  - phenomenon: a set of behaviors of some entity (or entities) that a research community finds interesting
  - understanding: knowledge that allows prediction of the behavior of some aspect of the phenomenon
  - the activities considered appropriate for producing understanding (knowledge) are the research methods and techniques of a research community
- paradigmatic vs. multi-paradigmatic communities (agreement on the phenomena of interest and the research methods)
Scientific Disciplines

Types of research [Simon, 1996]:
- natural sciences: phenomena occurring in the world (nature or society)
- design sciences ~ sciences of the artificial:
  - all or part of the phenomena may be created artificially
  - study artificial objects or phenomena designed to meet certain goals
- social sciences: structural-level processes of a social system and their impact on social processes and social organization
- behavioural sciences: the decision processes and communication strategies within and between organisms in a social system
[Figure, after Owen, 1997: the design sciences, and the Semantic Web (CS) within them, positioned along the dimensions of phenomena and activities]
Design research basics

- Process model
- Artifact types:
  - the result of the research work
  - the content of the research approach
- Artifact structure
- Evaluation:
  - evaluation criteria
  - evaluation approach
Process model

Design research follows a problem-solving paradigm: it seeks to create innovations that define the ideas, practices, technical capabilities, and products through which the analysis, design, implementation, and use of information systems can be effectively and efficiently accomplished [Tsichritzis 1997; Denning 1997].
Design research process

[Figure, after Takeda, 1990: the design research cycle, with its process steps (awareness of problem, suggestion, development, evaluation, conclusion), the knowledge flows between them (circumscription, operation and goal knowledge), and the underlying logical formalism (abduction, deduction)]
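Read as control flow, the cycle is an iterative loop in which evaluation results are fed back into the awareness of the problem (circumscription). The Python sketch below is only an invented illustration of that loop, under the assumption of a fixed number of iterations; none of the names come from the slides.

```python
from dataclasses import dataclass, field

@dataclass
class DesignCycle:
    """Toy model of the design research cycle (after Takeda, 1990)."""
    problem: str
    knowledge: list = field(default_factory=list)  # operation and goal knowledge

    def iterate(self, rounds: int = 3) -> str:
        for n in range(1, rounds + 1):
            suggestion = f"tentative design for '{self.problem}'"   # suggestion (abduction)
            artifact = f"artifact developed from: {suggestion}"     # development (deduction)
            result = f"round {n}: evaluation of [{artifact}]"       # evaluation
            self.knowledge.append(result)                           # circumscription feeds back
        return "conclusion: record the accumulated knowledge"       # conclusion

cycle = DesignCycle(problem="ontology reuse on the Semantic Web")
print(cycle.iterate())
print(cycle.knowledge)
```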
Artifacts

- are not exempt from natural laws or behavioral theories
- rely on existing "kernel theories" that are applied, tested, modified, and extended through the experience, creativity, intuition, and problem-solving capabilities of the researcher [Walls et al. 1992; Markus et al. 2002]
Design research outputs [March & Smith, 1995]

- Constructs: the conceptual vocabulary of a problem/solution domain
- Methods: algorithms and practices to perform a specific task
- Models: sets of propositions or statements expressing relationships among constructs; abstractions and representations
- Instantiations: the realization of constructs, models, and methods in a working system; implemented and prototype systems
- Better theories: emerge from artifact construction
Design research outputs (2)

[Figure, after Purao, 2002: the outputs arranged by level of abstraction, from the artifact as situated implementation (instantiations), via knowledge as operational principles (constructs, models, methods), to emergent theory about the embedded phenomena (better theories)]
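As an invented illustration only (none of the concrete items below appear in the slides), the four artifact types of [March & Smith, 1995] might map onto an ontology-engineering project roughly as follows:

```python
# Hypothetical mapping of March & Smith's output types onto a Semantic Web project;
# every concrete example here is an assumption made for this sketch.
design_research_outputs = {
    "constructs":     ["class", "property", "individual"],                 # conceptual vocabulary
    "models":         ["an OWL ontology relating the constructs"],         # propositions over constructs
    "methods":        ["an ontology-matching algorithm"],                  # practices for a specific task
    "instantiations": ["a prototype annotation tool using the ontology"],  # working system
}

for output_type, examples in design_research_outputs.items():
    print(f"{output_type}: {', '.join(examples)}")
```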
Examples

- Open up a new area
- Provide a unifying framework
- Resolve a long-standing question
- Thoroughly explore an area
- Contradict existing knowledge
- Experimentally validate a theory
- Produce an ambitious system
- Provide empirical data
- Derive superior algorithms
- Develop new methodology
- Develop a new tool
- Produce a negative result
Artifact structure

Structure of the artifact:
- the information space the artifact spans
- the basis for deducing all required information about the artifact
- determines the configurational characteristics necessary to enable the evaluation of the artifact
Evaluation criteria

Evaluation criteria:
- the dimensions of the information space that are relevant for determining the utility of the artifact
- can differ depending on the purpose of the evaluation
Evaluation approach

Evaluation approach:
- the procedure for practically testing an artifact
- defines all roles concerned with the assessment and the way the evaluation is handled
- its result is a decision, based on the available information, whether or not the artifact meets the evaluation criteria
Evaluation approach (2)

Quantitative evaluation:
- originally developed in the natural sciences to study natural phenomena
- approaches:
  - survey methods
  - laboratory experiments
  - formal methods (e.g. econometrics)
  - numerical methods (e.g. mathematical modeling)
Evaluation approach (3)

Qualitative evaluation:
- developed in the social sciences to enable researchers to study social and cultural phenomena
- approaches:
  - action research
  - case study research
  - ethnography
  - grounded theory
- qualitative data sources:
  - observation and participant observation (fieldwork)
  - interviews and questionnaires
  - documents and texts
  - the researcher's impressions and reactions
Constructs

Structure:
- meta-model of the vocabulary

Evaluation criteria:
- construct deficit
- construct overload
- construct redundancy
- construct excess

Evaluation approach:
- ontological analysis
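The four criteria above can be checked mechanically once a mapping between domain concepts and vocabulary constructs is written down. The sketch below is an invented illustration of such a check; the mapping, concept names, and construct names are assumptions, not content of the slides.

```python
# Hedged sketch: detecting the four construct deficiencies from a mapping
# concept -> set of constructs; all example names are hypothetical.
def ontological_analysis(mapping: dict, constructs: list) -> dict:
    usage = {}                                            # construct -> concepts it represents
    for concept, used in mapping.items():
        for construct in used:
            usage.setdefault(construct, []).append(concept)
    return {
        "construct deficit":    [c for c, used in mapping.items() if not used],       # concept without construct
        "construct redundancy": [c for c, used in mapping.items() if len(used) > 1],  # several constructs per concept
        "construct overload":   [c for c, cs in usage.items() if len(cs) > 1],        # one construct, several concepts
        "construct excess":     [c for c in constructs if c not in usage],            # construct without concept
    }

result = ontological_analysis(
    mapping={"Person": {"Class"}, "Marriage": {"Class", "Relation"}, "Time": set()},
    constructs=["Class", "Relation", "Annotation"],
)
print(result)
```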
Methods

Structure:
- process-based meta-model
- intended applications
- conditions of applicability
- products and results of the method application
- reference to constructs

Evaluation criteria:
- appropriateness
- completeness
- consistency

Evaluation approach:
- laboratory research
- field inquiries
- surveys
- case studies
- action research
- practice descriptions
- interpretative research
Models

Structure:
- domain
  - scope, purpose
  - syntax and semantics
  - terminology
  - intended application

Evaluation criteria:
- syntactical correctness
- completeness
- clarity
- flexibility
- simplicity
- applicability
- implementability

Evaluation approach:
- validation
- integrity checking
- sampling using selective matching of data to actual external phenomena or a trusted surrogate
- integration tests
- risk and cost analysis
- user surveys
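As one invented reading of "integrity checking" (not taken from the slides): verify that every relationship stated in a model refers only to declared constructs. All names below are hypothetical.

```python
# Hedged sketch: a toy integrity check of a model against its declared constructs.
constructs = {"Person", "Publication", "Conference"}          # hypothetical vocabulary
model = [                                                     # hypothetical model statements
    ("Person", "authorOf", "Publication"),
    ("Publication", "presentedAt", "Conference"),
    ("Person", "attends", "Workshop"),                        # "Workshop" is not declared
]

violations = [stmt for stmt in model
              if stmt[0] not in constructs or stmt[2] not in constructs]

print("integrity violations:", violations)
```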
Instantiations

Structure:
- executable code
- implementation in a programming language
- reference to a design model
- reference to a requirement specification
- reference to the documentation
- reference to quality management documents
- reference to configuration management documents
- reference to project management documents

Evaluation criteria:
- functionality
- usability
- reliability
- performance
- supportability

Evaluation approach:
- inspection
- testing
- code analysis
- verification
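As a minimal illustration of "testing" as an evaluation approach for an instantiation (the function under test and the test cases are invented for this sketch and are not part of the slides):

```python
import unittest

def merge_labels(labels_a, labels_b):
    """Toy instantiation: merge two label lists without duplicates, preserving order."""
    seen, merged = set(), []
    for label in labels_a + labels_b:
        if label not in seen:
            seen.add(label)
            merged.append(label)
    return merged

class TestMergeLabels(unittest.TestCase):
    def test_removes_duplicates(self):
        self.assertEqual(merge_labels(["Person", "Agent"], ["Agent", "Event"]),
                         ["Person", "Agent", "Event"])

    def test_handles_empty_input(self):
        self.assertEqual(merge_labels([], []), [])

if __name__ == "__main__":
    unittest.main()
```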
Conclusion

Good research results require a careful design of the research methodology and considerable evaluation effort.
References

- "DFG Rules of Good Scientific Practice", available at www.dfg.de, last seen September 2005
- Tsichritzis, D. "The Dynamics of Innovation," in Beyond Calculation: The Next Fifty Years of Computing, Copernicus, 1997, pp. 259-265
- Denning, P.J. "A New Social Contract for Research," Communications of the ACM (40:2), February 1997, pp. 132-134
- Simon, H.A. The Sciences of the Artificial, 3rd Edition, MIT Press, Cambridge, MA, 1996
- Markus, M.L., Majchrzak, A., and Gasser, L. "A Design Theory for Systems that Support Emergent Knowledge Processes," MIS Quarterly (26:3), September 2002, pp. 179-212
- Walls, J.G., Widmeyer, G.R., and El Sawy, O.A. "Building an Information System Design Theory for Vigilant EIS," Information Systems Research (3:1), March 1992, pp. 36-59
- Kuhn, T.S. The Structure of Scientific Revolutions, 3rd Edition, University of Chicago Press, 1996
- March, S.T. and Smith, G. "Design and Natural Science Research on Information Technology," Decision Support Systems (15:4), December 1995, pp. 251-266
- Lakatos, I. The Methodology of Scientific Research Programmes, John Worrall and Gregory Currie, Eds., Cambridge University Press, Cambridge, 1978
- Wikipedia, available at www.wikipedia.org, last seen September 2005
- Purao, S. "Design Research in the Technology of Information Systems: Truth or Dare," GSU Department of CIS Working Paper, Atlanta, 2002
Thank you for your attention
Good luck with your PhD
paslaru@inf.fu-berlin.de