CHAPTER 2: METHODS
How we do research in psychology…
Important Terms:
- Basic vs applied
- Basic research seeks to get more information/knowledge
- Applied research is seeking to solve a problem in the real world
- Often the two will help each other/work together
- Reciprocal determinism = I can influence you while you influence me (you are constantly
influencing and being influenced)
- Naive realism = assuming the world is exactly how we observe it (e.g., how some people
believe the world is flat because it looks flat)
- Skepticism = opposite of naive realism; we need to question and test our perceptions of
the world (conspiracy theorists are people who go too far on this scale)
- Taken to an extreme, either naive realism or skepticism could lead to delusion
- Peer review = in order for your research to be put into the world, it must go through the
process of peer review (where educated people determine if your research is good and
valid and can be released into the world)
Scientific Method
- Observe a phenomenon
- Develop a theory (an explanation for the phenomenon)
- Needs to be a testable theory
- Formulate a hypothesis
- More restrained/specific than the theory
- Design a study
- Sometimes you may need multiple studies
- Run the experiment and collect data
- Analyze data and draw conclusions on your theory and hypothesis
- Report data and revise theory
Roadblocks to Scientific Thinking
- Hindsight bias = happens when you know the outcome and that makes you believe you
knew the outcome from the start
- You're watching a game and your fav team loses, and you convince yourself that
you always knew they were going to lose
- Overconfidence = we naturally think and feel we are better than we actually are
- Tetlock (1998 and 2005) collected 20,000 different predictions from experts on
what they thought would happen in the world: only about 40% of the
predictions were right, yet the experts were about 80% confident they were correct
- Perceiving patterns where there aren’t any
- Superstitious fans for a sports team
- Seeing faces in nature where there really aren't any
- Issues with correlation and causation
Elements of Research
- Hypothesis = a testable statement derived from your research question
- Independent variable (cause) = variables that are under the control of the experimenter;
these are the things that researchers think will cause a change
- There are levels of change that you can look for in independent variables
- dependent variable (effect) = the variable that you think will be changed
- Operational definition = taking an abstract concept and creating a concrete, testable
definition of the concept
- Happiness → smiles, laughs, hormones (dopamine levels), watching brainwaves
during activity
- Productivity → see how much work someone gets done in certain amount of time
- Memory → the simon game with the colors to see how many they get right, how
many words they can remember
Types of Research
- Experiment: the experimenter must be able to manipulate the independent variable
(change and control it) and must have a way of randomly assigning subjects to conditions
- quasi experiment = an experiment where at least one independent variable cannot be
manipulated or randomly assigned (e.g., age)
- Descriptive designs = similar to a quasi experiment, but used when there isn't
enough knowledge in the field to design a specific experiment
- Case studies
- naturalistic observation
- Qualitative and quantitative research
- Cross sectional research = comparing a cross section of the population at a single point
in time to look for differences (type of quasi experiment)
- naturalistic observation = where you unobtrusively observe your participants (type of
quasi experiment)
- Survey = asking people to answer questions to obtain data
- Between subjects = looking for changes between subjects (only exposed to one level of
the independent variable)
- Within subjects = look for change within ONE subject (expose your subject to every level
of the independent variable)
Important Research Considerations
- Reliability = you will get the same score/response on something across multiple time
periods or over repeated testing
- Validity = are you measuring the concept that you think you’re measuring (in order to be
valid, your testing concept also has to be reliable)
- Internal validity = amount of control an experimenter has over their study; the
amount of confidence an experimenter has that their independent variable
caused the change in their dependent variable
- External validity = how similar your phenomenon looks to the real world
- Sample vs. population = the sample is the people/subjects participating in the study;
population is the larger group that your sample is coming from
- You can only apply the results of your study to the population that your sample is
from
- Issues with samples in the past: WEIRD (Western, Educated,
Industrialized, Rich, Democratic) samples, so we cannot make generalizations
to the world
- Self selection = because your participants are choosing to participate in the
study, it makes them different from those who did not choose to participate
- Bias → because a study has similar people (college students) there is a general
bias
Statistical Considerations
- Descriptive stats = describe what our data looks like
- Mean (average)
- Median (middle)
- Mode (most often)
- Inferential stats = tells us if there is a statistical difference in the data (gives more
context)
- T test: compares two groups and tells if there is a significant difference
between those two groups
- F stat: used when there is more than two groups
- P value: 0.05 or less = significant difference in the data;
more than 0.05 = not a significant difference
- Correlation strength = the further from 0 the stronger the correlation
- Correlation direction = positive (up) and negative (down) correlations
Ethics
- Informed consent = participants have to know the risks they are agreeing to
- Respect for persons = you cannot coerce people into participating in your research
(physically, emotionally, or financially)
- Beneficence = inform the participants of the costs and benefits of the research
- Benefits to the participants can be indirect (such as gaining knowledge)
- Confidentiality = there must be no way of tying participant data to the participant's identity;
you cannot expose their identity
- Fairness = all potential benefits have to be distributed across all participants (equal
opportunity)
- Debriefing = you have finished with the study and are reminding the participant of what
they did during the study
- You can lie to the participants only with theoretical justification (the study requires
the deception), and you can never lie to them about any potential risks
- Ex: lying about the type of study because you need to stage a crime
- During the debriefing is when you come clean and reveal the deception,
explaining why it was important
- Tuskegee (Jones, 1993) = researchers wanted to better understand syphilis, so they
studied poor Black men who already had the disease and, without the participants'
knowledge, withheld both their diagnosis and treatment. They followed them from
1932-1972, never revealing that they'd been infected. 128 died, 48 passed it to their
wives, and 19 children were born with it.
- This experiment was what caused the science community to realize that we
needed a set of ethics in order to control experiments
- Animal research = regulations for animal research are much stricter compared to
human research because animals cannot give or withdraw consent (lack of communication)
Problems with Research
- Misconduct = an intentional ethical violation being committed
- Plagiarism = taking someone's ideas/words without giving proper credit
- Can extend to research ideas
- Falsification = changing your data in order to back up your theory or hypothesis
- Fabrication = making up your data
- Experimenter expectancy = the experimenter's behavior accidentally influences their
subjects
- The subjects unconsciously change their behavior, creating false support for the
hypothesis (bias)
- Demand characteristics = subtle unconscious behaviors that work to
communicate the expectations to the participant (EX: Clever Hans, the horse
who could do math)
- Blinding = in a single blind study the participant doesn't know which experimental
group they are in; in a double blind study neither the participants nor the experimenter
knows which group the participant is in
- This prevents anyone's expectations from shaping the participants' behavior
- Hawthorne Effect = people act differently when they know they are being watched (best
combated by naturalistic observation)
- Social desirability bias = we want to be accepted into society, so we answer in a way that
we feel will make us accepted (participants say what they think the experimenters want
to hear rather than the actual answer)
- Bogus pipeline = participants are led to believe the researcher can detect whether
they are lying about their answers, which encourages honest responses
- confound/third variable problem = there is a variable (outside of the independent and
dependent variable) that is not being examined that is causing the results
- Li (1975) collected information on households in Taiwan and found that the
strongest correlation was that households with more appliances were more likely
to use birth control (which makes no sense on its own; the confounding
variable was most likely household income)
Psychology Best Practices
- Experiment = have control over the independent variable which allows you to determine
causation
- Manipulated variable
- Random assignment
- Double blind study to avoid any experimenter effects
Chapter 2 Notes:
- Stanford Prison Study = a human experiment that studied if people would respond in
extreme ways when placed into certain situations
- There were some serious issues with the study, as the line between acting and reality
blurred and many participants experienced severe emotional trauma
- Science entails the collection of data and evidence in real world situations to see if they
support our own theories and ideas
- Rationalism = the view that using logic and reason is the way to understand how the
world works
- Logic works as well; however, it only shows us how the world should work
- Sensory experiences, and how they are interpreted, vary from person to person
- People also tend to generalize from the situation they are in and apply it to
other similar situations
- Science includes: physical science (the study of the world of things = stars, waves, atoms),
biological science (plants and animals), and social science (psychology)
Scientific thinking = a process using the cognitive skills required to generate, test, and
revise theories
- Keep belief and evidence distinct
- Make testable claims
- Evaluate the strengths and weaknesses of the evidence
- Try to disconfirm your ideas after they have been confirmed
- Have your belief follow the best evidence
- Doubt is the foundation of science
- Judge and evaluate the strengths and weaknesses of evidence
- Try to disconfirm the idea or theory that you had just confirmed (to further test its
strength)
- Findings only become trusted when they can be tested over and over and get the
same results = replication
- If evidence goes against an idea, move your belief toward skepticism; if the evidence
supports the idea, move toward belief
- Maintain a balance between openness and skepticism in order to find new ideas and theories
- Single blind studies = studies in which participants do not know the experimental
condition to which they have been assigned
- Double blind studies = studies in which neither the participants nor the researchers know
who has been assigned to the experimental or control group
- Experimenter expectancy effects = a result that occurs when the behavior of the
participants is influenced by the experimenter's knowledge of who is in the control group
and the experimental group
- Demand characteristics = subtle cues given by the experimenter to the participants as to
how they should behave in the role
- Self fulfilling prophecy = a statement that affects events to cause the prediction to
become true
- Longitudinal designs = research that includes observations of the same people over
time, ranging from months to decades
- Twin adoption studies = research into hereditary influence on twins who were raised
apart and who were raised together
- Gene by environment research = a method of studying heritability by comparing genetic
markers
- Meta analysis = a research technique for combining all research results on one question
and drawing conclusions
- Effect size = a measure of the strength of the relationship between two variables or the
extent of an experimental effect
- Social desirability = the tendency toward favorable self-presentation that could lead to
inaccurate self reports
- Behavioral measures = measures based on systematic observation of people's actions,
either in their normal or lab setting
- Physiological measures = measures of bodily responses used to determine changes in
psychological state
- Institutional review boards (IRBs) = organizations that evaluate research proposals to
make sure research involving human subjects does not cause undue stress or harm
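As a worked example of the effect size term above, here is a sketch of Cohen's d, one common standardized effect-size measure for the difference between two group means (the groups and scores below are hypothetical, made up for illustration):

```python
import statistics
from math import sqrt

def cohens_d(group1, group2):
    """Cohen's d: the difference between two group means in pooled-SD units."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = statistics.mean(group1), statistics.mean(group2)
    v1, v2 = statistics.variance(group1), statistics.variance(group2)  # sample variances
    pooled_sd = sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical memory-test scores for a treatment vs. a control group
treatment = [14, 15, 13, 16, 15]
control = [11, 12, 10, 13, 12]
d = cohens_d(treatment, control)
print(round(d, 2))  # rough convention: ~0.2 small, ~0.5 medium, ~0.8 large
```

Unlike a p value, which only says whether a difference is statistically significant, the effect size says how big that difference is.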