Craft of Research – Day 2
Marie Vendettuoli
SPIRE-EIT 2011
June 15, 2011
Reminders
• Problem paragraph due today: see email from
Stephen Gilbert
• IRB application draft due next week (6/22)
• Reading for Journal Club updated
• Volunteers needed for next week’s Journal
Club
Review: Craft of Research Quiz
#2: Moore ISD
#4: WorldComp
#6: Touchsense
#7: ETP
Review: Literature Management
Review: IRB Responses
“…research has to be specific and planned out, which the IRB is
ensuring”
“I found the IRB training to be a little unnecessary”
“…why they want you to go through this training: not everyone
thinks rationally”
“I feel that they may slow down scientific process”
“with all the hoops researchers have to jump through, I’m
surprised we’ve had so many advances in science”
“I found a lot of information … to be common sense”
A detour:
How often do you use each media type with another (multitask)?
Response options: Most of the time / Some of the time / Little / Never
Media types: print; TV; YouTube, Hulu, Netflix; music; non-music audio; video & computer games; phone calls; IM; SMS; email; web; business productivity
A detour: multitasking
Ignore the blue
rectangles. Have the
red ones changed
orientation?
HMM = heavy media multitaskers
LMM = light media multitaskers
Ready?
What happened?
What is a Research question?
This is the specific matter you are trying to
address with your project formulated as
a question.
• Don’t overpromise
• Define vague terms, e.g. what does
“improve” really mean?
Chronic media multitasking is quickly becoming ubiquitous,
although processing multiple incoming streams of
information is considered a challenge for human cognition.
A series of experiments addressed whether there are
systematic differences in information processing styles
between chronically heavy and light media multitaskers. A
trait media multitasking index was developed to identify
groups of heavy and light media multitaskers. These two
groups were then compared along established cognitive
control dimensions. Results showed that heavy media
multitaskers are more susceptible to interference from
irrelevant environmental stimuli and from irrelevant
representations in memory. This led to the surprising result
that heavy media multitaskers performed worse on a test of
task-switching ability, likely due to reduced ability to filter out
interference from the irrelevant task set. These results
demonstrate that media multitasking, a rapidly growing
societal trend, is associated with a distinct approach to
fundamental information processing.
Results
Why is this research relevant?
Implications
What they did
Does chronic media multitasking affect memory and/or task-switching ability?
Exercise: Write a Research Question
Remembering often requires the selection of goal-relevant memories in the face of competition
from irrelevant memories. Although there is a cost of selecting target memories over competing
memories (increased forgetting of the competing memories), here we report neural evidence for
the adaptive benefits of forgetting—namely, reduced demands on cognitive control during future
acts of remembering. Functional magnetic resonance imaging during selective retrieval showed
that repeated retrieval of target memories was accompanied by dynamic reductions in the
engagement of functionally coupled cognitive control mechanisms that detect (anterior cingulate
cortex) and resolve (dorsolateral and ventrolateral prefrontal cortex) mnemonic competition.
Strikingly, regression analyses revealed that this prefrontal disengagement tracked the extent to
which competing memories were forgotten; greater forgetting of competing memories was
associated with a greater decline in demands on prefrontal cortex during target remembering.
These findings indicate that, although forgetting can be frustrating, memory might be adaptive
because forgetting confers neural processing benefits.
Exercise: Write a Research Question
Linking high-throughput experimental data with biological networks is a key step for understanding complex
biological systems. Currently, visualization tools for large metabolic networks often result in a dense web of
connections that is difficult to interpret biologically. The MetNetGE application organizes and visualizes biological
networks in a meaningful way to improve performance and biological interpretability. MetNetGE is an interactive
visualization tool based on the Google Earth platform. MetNetGE features novel visualization techniques for
pathway and ontology information display. Instead of simply showing hundreds of pathways in a complex graph,
MetNetGE gives an overview of the network using the hierarchical pathway ontology using a novel layout, called
the Enhanced Radial Space-Filling (ERSF) approach that allows the network to be summarized compactly. The non-tree edges in the pathway or gene ontology, which represent pathways or genes that belong to multiple
categories, are linked using orbital connections in a third dimension. Biologists can easily identify highly activated
pathways or gene ontology categories by mapping of summary experiment statistics such as coefficient of
variation and overrepresentation values onto the visualization. After identifying such pathways, biologists can
focus on the corresponding region to explore detailed pathway structure and experimental data in an aligned 3D
tiered layout. In this paper, the use of MetNetGE is illustrated with pathway diagrams and data from E. coli and
Arabidopsis. MetNetGE is a visualization tool that organizes biological networks according to a hierarchical
ontology structure. The ERSF technique assigns attributes in 3D space, such as color, height, and transparency, to
any ontological structure. For hierarchical data, the novel ERSF layout enables the user to identify pathways or
categories that are differentially regulated in particular experiments. MetNetGE also displays complex biological
pathways in an aligned 3D tiered layout for exploration.
• How can your lit review support formulating a
quality Research Question?
• How does a strong Research Question direct
your lit review?
Tools for finding articles
Ophir E, Nass C, Wagner AD (2009) Cognitive control in media multitaskers. PNAS 106: 15583–15587
• lib.iastate.edu
• scholar.google.com
• ISI Web of Knowledge
• ACM Digital Library
• IEEE eXplore
Following the Citation Trail
1. Start with an article
2. Look at references for interesting articles
3. Look for references that cite your article
4. Repeat for new articles
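The steps above amount to a breadth-first walk over the citation graph. As a minimal sketch (the article IDs and both graphs are hypothetical, standing in for data you would gather from a citation index):

```python
from collections import deque

# Toy citation graph (hypothetical article IDs):
# article -> articles it references
references = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D"],
    "D": [],
}
# Reverse direction: article -> articles that cite it
cited_by = {
    "B": ["A"],
    "C": ["A"],
    "D": ["B", "C"],
}

def citation_trail(start):
    """Breadth-first walk following both references and citing articles."""
    seen, queue = {start}, deque([start])
    while queue:
        article = queue.popleft()
        for neighbor in references.get(article, []) + cited_by.get(article, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

print(sorted(citation_trail("A")))  # ['A', 'B', 'C', 'D']
```

In practice you would prune at each step, keeping only articles relevant to your research question rather than exhausting the graph.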
Lit review to Organize Paper
For each article in your collection, on a piece of
paper:
• Draw a box for each sub-topic discussed by an
author.
• Label each box with its sub-topic and author.
• If multiple papers discuss the same sub-topic,
they can share a box.
These boxes are the topics in your paper.
Organizing by Lit Review: Example
What makes the perfect cake? (nom)
• Cake Flavors: Greenspan, Dorie (2006)
• Oven type: Kasper, L.R. (2010)
• Frosting Choices: Perelman, Deb (2012) (forthcoming)
• Decorating: Dudley, Angie (2010)
• Chiffon Cakes: Greenspan, Dorie (2006); Child, Julia (1996)
• Vegan and gluten-free options: Swanson, Heidi (2011)
• Assembly of layer cakes: Child, Julia (1996)
• Butter cakes: Perelman, Deb (2012) (forthcoming); Beranbaum, R.L. (2009); Greenspan, Dorie (2006); Child, Julia (1996)
• Refrigerator cakes: Stewart, Martha (2007)
• Storage & Transportation: CIA (2001); Dudley, Angie (2010)
Questions to answer
• Why do this research? Why is your question
relevant?
• What are the important theories in your
specialty?
• What are other researchers doing? How is
your work different?
• What are the limits of other researchers'
work?
Looking ahead: Research methods
Qualitative | Quantitative
Focus groups, interviews, reviews | Surveys, measured response (e.g. eye-tracking)
Inductive process to formulate theory | Deductive reasoning
Few cases, in-depth | Larger sample size
Subjective descriptions | Objective measures
Unstructured response | Fixed response options
Non-numerical data | Numerical data
No statistics | Statistical tests for analysis
Quality depends on skill of researcher | Reliability depends on instrument or device
Time: less planning, more analysis | Time: more planning, less analysis
Less generalizable | More generalizable
Source: HCI 521
Types of data
Type | Description | Example | Evaluate using…
Nominal | Unique in some way | Names | Counts, frequencies, description
Binary | Takes on one of two values | Male/Female; Dead/Alive; Right/Left | Counts, frequencies, description
Ordinal | Ranked, but differences vary between rankings | Top 100 lists | Counts, frequencies, description
Continuous | Numerical, no absolute zero | System usability score | As above + mean, median, and other summary statistics
Source: HCI 521
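The data type constrains the analysis: nominal data only supports counts and frequencies, while continuous data also supports means, medians, and other summary statistics. A minimal sketch with hypothetical data (the media preferences and SUS scores below are made up for illustration):

```python
from collections import Counter
from statistics import mean, median

# Nominal data (e.g., each participant's preferred media type):
# only counts and frequencies are meaningful.
preferences = ["TV", "web", "TV", "email", "web", "TV"]
counts = Counter(preferences)
print(counts.most_common())  # [('TV', 3), ('web', 2), ('email', 1)]

# Continuous data (e.g., hypothetical SUS scores):
# counts still apply, plus mean, median, and other summary statistics.
sus_scores = [72.5, 85.0, 67.5, 90.0, 77.5]
print(mean(sus_scores), median(sus_scores))  # 78.5 77.5
```

Computing a mean of nominal labels would be meaningless, which is why deciding the data type early shapes the whole analysis plan.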
Usability Testing: Metrics
Satisfaction
• Ease of use
• SUS (System Usability Scale)
• Ease of learning
• Enjoyment
• Usefulness
• Expectations
Performance
• Time to completion
• Success at task
• Errors
• Efficiency
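The performance metrics above can be computed directly from logged trials. A minimal sketch, assuming a hypothetical per-trial record of completion time, success, and error count:

```python
from statistics import mean

# Hypothetical task trials: (seconds to complete, succeeded?, error count)
trials = [
    (42.0, True, 0),
    (58.5, True, 2),
    (90.0, False, 5),
    (47.5, True, 1),
]

success_rate = sum(1 for _, ok, _ in trials if ok) / len(trials)
mean_time = mean(t for t, _, _ in trials)
total_errors = sum(e for _, _, e in trials)

print(f"success rate: {success_rate:.0%}")  # 75%
print(f"mean time: {mean_time:.1f} s")      # 59.5 s
print(f"total errors: {total_errors}")      # 8
```

Unlike the satisfaction metrics, none of these require self-reporting; they fall out of instrumenting the task itself.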
When should you consider what metrics to
capture or what analysis process you will use?
How does the type of data you collect affect the
type of analysis that may be performed?
Are there ways to capture satisfaction without
self-reporting? Why might this be useful?
Homework (due by next Wed)
1. As a team: Post your problem paragraph (if not
already done). Categorize this with your team name.
2. Individually: Respond to two teams with the research
question that you think their problem paragraph
suggests.
3. Individually: Post two (2) references that will be part
of your lit review. Cite as in your reference list.
a) Describe how you came across this reference.
b) Summarize the research in ~50 words/article.
c) Do not duplicate references posted by fellow interns.
d) Categorize these posts as “Craft of Research”.
4. Sign up for your Journal Club time slot.
a) Todd and Sharrod will lead this week’s discussion