Chapter 3
Designing Research Concepts,
Hypotheses, and Measurement
Research Design
Must create a Research Design
Must start with a research question
Questions are composed of concepts
Stages of Research
1) Developing Concepts
2) Operationalization
3) Selection of Research Method(s)
4) Sampling Strategy
5) Data Collection ‘Plan’
6) Analyses
7) Results and Writing
Also need to consider budget issues
Operationalization
In survey research it is critical to understand how to
move from ideas to concepts to variables – the process
of operationalization.
Concepts
Concept (p.35): an idea, a general mental formulation
summarizing specific occurrences
A label we put on a phenomenon, a matter, a “thing” that
enables us to link separate observations, make
generalizations, communicate and inherit ideas.
Concepts can be concrete, abstract, tangible or intangible.
Concrete: Height, Major
Abstract: Happiness, Love
Transferring Concepts into
something Measurable
Variable:
A representation of a concept in its variation of
degree, variety, or occurrence.
A characteristic of a thing that can assume
varying degrees or values.
Fixed meaning = constant
Most variables are truly variable = multiple categories or
values
Example: Concept and Variable
Concept:
Political participation
Variables:
Voted or not
How many times a person has voted
What party a person votes for
How to be measured?
Conceptualization:
The process of conceptualization includes coming to
some agreement about the meaning of the concept.
In practice, you often move back and forth between
loose ideas of what you are trying to study and
searching for a word that best describes it.
Sometimes you have to “make up” a name to
encompass your concept.
Conceptualization
As you flesh out the pieces or aspects of a concept,
you begin to see the dimensions: the terms that
define subgroups of a concept.
With each dimension, you must decide on
indicators – signs of the presence or absence of
that dimension.
Dimensions are usually concepts themselves.
Operationalizing Choices
You must operationalize: process of converting
concepts into measurable terms
The process of creating a definition(s) for a concept
that can be observed and measured
The development of specific research procedures that will
result in empirical observations
SES is defined as a combination of income and education
and I will measure each by…
The development of questions (or characteristics of data in
qualitative work) that will indicate a concept
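The SES example above can be sketched in code. This is a minimal, hypothetical illustration of operationalizing a concept as a measurable index; the function name, income cap, and weighting are illustrative assumptions, not an established SES formula.

```python
# Hypothetical sketch: operationalizing the concept "SES" as a
# combination of two measured variables, income and education.
# The cutoffs and equal weighting below are illustrative only.

def ses_score(income, years_education):
    """Combine income and education into a simple 0-1 SES index."""
    income_part = min(income / 100_000, 1.0)         # assumed $100k cap
    education_part = min(years_education / 20, 1.0)  # assumed 20-year cap
    return (income_part + education_part) / 2

# A respondent with $50k income and 16 years of schooling:
print(ses_score(50_000, 16))
```

A real study would justify the components, cutoffs, and weights theoretically before collecting data.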
Variable Attribute Choices
Variable attributes need to be exhaustive and
exclusive
Represent full range of possible variation
Degree of Precision
selection depends on your research interest
Is it better to include too much or too little?
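The "exhaustive and exclusive" requirement above can be checked mechanically. A minimal sketch, using made-up age categories: every response must fit exactly one category (exclusive), and no response may be left without a category (exhaustive).

```python
# Illustrative age categories for a survey variable.
# Ranges must not overlap (exclusive) and together must cover
# every plausible respondent age (exhaustive).
categories = {
    "18-29": range(18, 30),
    "30-49": range(30, 50),
    "50+":   range(50, 120),
}

def code_age(age):
    """Return the single category an age falls into."""
    matches = [label for label, span in categories.items() if age in span]
    assert len(matches) == 1, f"age {age} fits {len(matches)} categories"
    return matches[0]

assert code_age(29) == "18-29"
assert code_age(30) == "30-49"   # boundaries do not overlap: exclusive
assert code_age(85) == "50+"     # open-ended top category: exhaustive
```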
Variables
The dependent variable is the variable that the
researcher measures; it is called a dependent variable
because it depends upon (is caused by) the
independent variable.
The independent variable is the one that the
researcher manipulates.
Example: If you are studying the effects of a new educational program
on student achievement, the program is the independent variable and
your measures of achievement are the dependent ones.
Variables
Qualitative Variable: Composed of categories which are
not comparable in terms of magnitude
Quantitative Variable: Can be ordered with respect to
magnitude on some dimension
Continuous Variable: A quantitative variable, which can be
measured with an arbitrary degree of precision. Any two
points on a scale of a continuous variable have an infinite
number of values in between. It is generally measured.
Discrete Variable: A quantitative variable where values can
differ only by well-defined steps with no intermediate values
possible. It is generally counted.
Level of Measurement
Nominal
Ordinal
Interval
Ratio
Nominal Measures
Only offer a name or a label for a variable
There is no ranking
They are not numerically related
Gender; Race
Ordinal Measures
Variables with attributes that can be rank ordered
Can say one response is more or less than another
Distance between attributes does not have meaning
lower class, middle and upper class
Note: Scales and indexes are ordinal measures, but
conventions for analysis allow us to assume equidistance
between attributes (if it makes logical sense); treat them like
“interval” measures; and subject them to statistical tests
Interval Measures
Distance separating attributes has meaning and is
standardized (equidistant)
“0” value does not mean a variable is not present
An ACT test score of 50 vs. 100
does not mean the person is twice as smart
Ratio Measures
Attributes of a variable have a “true zero point” that
means something
Waist measures and Biceps measures
Allows one to create ratios
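The four levels of measurement above can be summarized by which operations are meaningful at each level. A small sketch with illustrative data (the example values echo the slides: class rankings, test scores, waist measures):

```python
# Which arithmetic is meaningful at each level of measurement?

nominal = ["female", "male", "female"]   # labels only: no ranking
ordinal = ["lower", "middle", "upper"]   # ranked, but gaps undefined
interval = [50, 100]                     # differences meaningful, no true zero
ratio = [30.0, 60.0]                     # true zero: ratios meaningful

# Nominal: only counting and comparing labels makes sense.
assert nominal.count("female") == 2

# Ordinal: rank order makes sense, distances do not.
rank = {"lower": 1, "middle": 2, "upper": 3}
assert rank["upper"] > rank["lower"]

# Interval: the difference 100 - 50 is meaningful,
# but 100 is NOT "twice" 50 (no true zero).
assert interval[1] - interval[0] == 50

# Ratio: 60 really is twice 30.
assert ratio[1] / ratio[0] == 2.0
```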
Hypotheses
Hypotheses: (pg. 36) Untested statements
that specify a relationship between 2 or more
variables.
Example: Milk Drinkers Make Better Lovers
Characteristics of a Hypothesis
States a relationship between two or more variables
Is stated affirmatively (not as a question)
Can be tested with empirical evidence
Most useful when it makes a comparison
States how multiple variables are related
Theory or underlying logic of the relationship makes sense
Hypotheses should be clearly stated at the
beginning of a study.
You do not have to have a hypothesis to conduct
research; general research questions can suffice.
Positive and Negative (Inverse)
Relationships
Positive: as values of independent variable
increase, the values of the dependent variable
increase
Negative: as values of independent variable
increase, the values of the dependent variable
decrease (or vice versa)
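The sign of a simple covariance captures this distinction. A minimal sketch with made-up data (the variables and values are illustrative):

```python
# Positive vs. negative (inverse) relationships, read off the sign
# of the covariance between two variables. Data are hypothetical.

def corr_sign(xs, ys):
    """Return +1, -1, or 0 for the direction of the relationship."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return 1 if cov > 0 else -1 if cov < 0 else 0

study_hours = [1, 2, 3, 4, 5]
grades      = [60, 65, 72, 80, 88]   # rises with hours: positive
absences    = [10, 8, 6, 3, 1]       # falls with hours: negative

assert corr_sign(study_hours, grades) == 1
assert corr_sign(study_hours, absences) == -1
```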
Two-directional Hypotheses
More general expression of a hypothesis
Usually default in stat packages
Suggests that groups are different or concepts
related, but without specifying the exact direction
of the difference
Example: Men and women trust UK security differently.
One-directional hypotheses
More specific expression of a hypothesis
Specifies the precise direction of the
relationship between the dependent and
independent variables.
Example: Women have greater trust in UK
security compared to men.
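The trust example above can be sketched as a simple significance test. This is an illustrative z-test on hypothetical trust ratings (the data are invented; real stat packages use a t-test, and as the slides note, default to the two-directional form). It shows the standard relationship: when the effect lies in the hypothesized direction, the one-directional p-value is half the two-directional one.

```python
# Hypothetical trust-in-security ratings for women and men.
from statistics import NormalDist, mean, stdev
from math import sqrt

women = [7, 8, 6, 9, 8, 7, 8]
men   = [6, 5, 7, 6, 5, 6, 7]

# Simple z statistic (illustrative sketch, not a full t-test).
se = sqrt(stdev(women) ** 2 / len(women) + stdev(men) ** 2 / len(men))
z = (mean(women) - mean(men)) / se

# One-directional H1: women trust MORE than men.
p_one_sided = 1 - NormalDist().cdf(z)
# Two-directional H1: the groups simply DIFFER.
p_two_sided = 2 * min(p_one_sided, 1 - p_one_sided)

# Effect is in the hypothesized direction, so the one-directional
# p-value is exactly half the two-directional p-value.
assert abs(p_two_sided - 2 * p_one_sided) < 1e-12
```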
Determining Quality of Measurement
Accuracy and Consistency in Measurement
Validity is accuracy
Reliability is consistency
Reliability
Definition -- The extent to which the same
research technique applied again to the same
object (subject) will give you the same result
Reliability does not ensure accuracy:
a measure can be reliable but inaccurate
(invalid) because of bias in the measure or in
data collector/coder
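The reliable-but-invalid case can be simulated. A minimal sketch, assuming a measuring instrument with a constant bias (all numbers are invented): repeated readings cluster tightly (reliable) yet all miss the true value (invalid).

```python
# Simulating a reliable but biased (invalid) measure: a scale that
# always reads 5 kg heavy. Values are illustrative.
import random

random.seed(42)
true_weight = 70.0
bias = 5.0   # constant instrument bias

readings = [true_weight + bias + random.gauss(0, 0.1) for _ in range(100)]

mean_reading = sum(readings) / len(readings)
spread = max(readings) - min(readings)

assert spread < 1.0                           # tight spread: reliable
assert abs(mean_reading - true_weight) > 4.0  # far from truth: invalid
```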
Validity
Definition -- The extent to which our
measure reflects what we think or want it
to be measuring
Face Validity
Face validity: the measure seems to be related to
what we are interested in finding out even if it does
not fully encompass the concept
concept = intellectual capacity
measure = grades (high face validity)
measure = # of close friends (low face validity)
Criterion Validity
Criterion validity (predictive validity): the measure
is predictive of some external criterion
Criterion = Success in College
Measure = ACT scores (high criterion validity?)
Construct Validity
Construct Validity: the measure is logically related
to another variable as we conceptualized it to be
construct = happiness
measure = financial stability
if not related to happiness, low construct validity
Content Validity
Content Validity: how much a measure covers a
range of meanings; did you cover the full range of
dimensions related to a concept
Example: You think that you are measuring
prejudice, but you only ask questions about race;
what about sex, religion, etc.?
Methodological Approaches,
Reliability and Validity
Qualitative research methods lend themselves to
high validity and lower reliability.
Quantitative research methods lend themselves to
lower validity and higher reliability.