INF397C Understanding Research
Fall 2011
Unique # 28700
Wednesdays 9-12
UTA 1.208
Professor Diane E. Bailey
debailey@ischool.utexas.edu, UTA 5.438
Office Hours: email is best; otherwise, see me directly after my 397C sessions (please don’t leave an office voicemail)
Professor Randolph G. Bias
rbias@ischool.utexas.edu, UTA 5.424, cell 512-657-3924
Office Hours: Thursdays, 11:00 – 12:00, and by appointment. (Especially by appointment!)
TA: Virginia Luehrsen
luehrsen@mail.utexas.edu
Teaching Intern: Jeannine Finn
jefinn@ischool.utexas.edu
OVERVIEW
Every day you make decisions. You decide to take IH-35, rather than MoPac, to drive to school because you
think it will provide you a quicker, safer, and/or happier trip. You base this decision on some data you have
collected from your previous experience, or from information people have told you, or from information
gleaned from a map, or from radio and TV reports. Or maybe you just have a feeling.
During that drive to school, and likely before, and certainly after, you will hear or read many, many claims.
- “Crest makes your teeth brighter.”
- “Our candidate will improve Austin traffic.”
- “Taking this course will help you be a better information scientist.”
- “I like you.”
- “This is a better way to design your web site.”
Unprepared information scientists and professionals – indeed, unprepared citizens – are forced to consider
the torrent of claims they hear every day and either accept or reject them on faith. Prepared
scientists/professionals/citizens can, instead, consider the methods used to gather and analyze the information
on which the claims are based, and evaluate for themselves the likely goodness of the claims.
In one of the required textbooks for this course, the author Vincent Dethier asserts that, “An experiment is
[personkind’s] way of asking Nature a question.” As an information scientist, you will read many, many
answers that information scientists and other scholars have inferred from questions they have asked of
Nature and of humans. To help you evaluate and understand those answers, we will address quantitative and
qualitative research methods, as well as a number of distinct approaches that information scientists commonly
undertake, including rhetorical analysis, historical analysis, design research, and computational research.
Overall, this course is designed to help you develop skills and awareness for understanding research in
information studies. Expect a course flavored by an awareness of, and an appreciation for, various ways to
conduct research. Expect assignments that will provide you with a chance to demonstrate that you
understand the basics of these various ways of research. Expect some lecture, some discussion, and some
hands-on in-class exercises. Expect to be surprised by how interesting (and painless) this stuff can be,
regardless of how math phobic or narrative intolerant you may be. Expect to come out of the course being
able to evaluate whether a piece of research you read about was appropriately designed and well conducted.
Note that our fundamental goal is NOT to empower you to conduct your own research, but rather to
prepare you well to be critical consumers of research in your academic and professional careers.
LEARNING OUTCOMES
This class is designed to arm you with a scientist’s skepticism and a scientist’s tools to understand and
evaluate research. Hence, the student who successfully completes this course will, at a general level:
 Recognize authors’ philosophical stances towards research
 Understand research design, and know how to evaluate the appropriateness of designs
 Understand the difference between, and the relative benefits of, quantitative and qualitative research
 Be aware of the primary research designs and methods employed in information studies research
 Be better able to discern the quality or soundness of research
Specifically, a student who successfully completes this course will:
 Recognize when hypotheses, propositions, or research questions are appropriate
 Understand descriptive statistics, and know how to represent a collection of numbers
 Understand inferential statistics and hypothesis testing
 Appreciate the strengths, weaknesses, and validity concerns of a variety of research methods
 Become familiar with research conducted by iSchool faculty
COURSE POLICIES
Attendance and Participation
You are expected to attend each week’s class session and to have completed the reading and any assignments
so that you can actively engage in discussions. You are also expected to work diligently and cooperatively on
in-class exercises. Poor attendance and participation will lower your grade; good attendance and participation may improve it.
Grading
See end of syllabus for descriptions of the assignments in this course.
HW #           Assignment                            Percentage of Grade    Due Date
HW #1          Paper Parts                           6                      8/31
HW #2          Quantitative I                        12                     9/14
HW #3          Quantitative II                       12                     9/28
HW #4          Evaluating Paired Papers              9                      10/12
HW #5          Qualitative I                         12                     10/19
HW #6          Qualitative II                        12                     11/2
HW #7          Quantitative III                      12                     11/16
Comprehensive  Evaluating Related Research Papers    25                     11/30
Total                                                100
Submission of On-Time and Late Work
All written assignments should be submitted in hard copy on the date shown. HW #4, which serves as the basis
for an in-class exercise, cannot be late (i.e., late submissions will earn zero points). For all other assignments, email
submission will incur a 5% penalty, and you will lose 10% of your grade for work submitted by noon on
Thursday and another 10% per day for each additional day late. Late work, and only late work, must
be submitted by email.
University of Texas Honor Code
The core values of The University of Texas at Austin are learning, discovery, freedom, leadership, individual
opportunity, and responsibility. Each member of the university is expected to uphold these values through
integrity, honesty, trust, fairness, and respect toward peers and community. Source:
http://www.utexas.edu/welcome/mission.html
Documented Disability Statement
Any student with a documented disability who requires academic accommodations should contact Services
for Students with Disabilities (SSD) at (512) 471-6259 (voice) or 1-866-329-3986 (video phone). Faculty are
not required to provide accommodations without an official accommodation letter from SSD.



 Please notify us as quickly as possible if the material being presented in class is not accessible (e.g., instructional videos need captioning, course packets are not readable for proper alternative text conversion, etc.).
 Please notify us as early in the semester as possible if disability-related accommodations for field trips are required. Advance notice will permit the arrangement of accommodations on the given day (e.g., transportation, site accessibility, etc.).
 Contact Services for Students with Disabilities at 471-6259 (voice) or 1-866-329-3986 (video phone) or reference SSD’s website for more disability-related information: http://www.utexas.edu/diversity/ddce/ssd/for_cstudents.php
Tools
- Calculator. You’ll need one, but just the simplest of ones.
- Math skills. You’ll need them, but just the simplest ones.
Cheating
Don’t. Dire consequences.
DIGITIZED READINGS FOR THIS COURSE
(retrieve them via the library’s electronic databases or Google Scholar)
1. Aspray, William. 1985. “The Scientific Conceptualization of Information: A Survey.” Annals of the History of Computing, 7(2): 117-140.
2. Barker, L. 2009. “Student and Faculty Perceptions of Undergraduate Research Experiences in Computing.” ACM Transactions on Computing Education, 9(1): 5:1-28.
3. Barley, Stephen R. 1990. “Images of Imaging: Notes on Doing Longitudinal Field Work.” Organization Science, 1(3): 220-247.
4. Boeije, Hennie. 2002. “A Purposeful Approach to the Constant Comparative Method in the Analysis of Qualitative Interviews.” Quality & Quantity, 36: 391-409.
5. Clement, Tanya E. 2008. “‘A Thing Not Beginning and Not Ending’: Using Digital Tools to Distant-Read Gertrude Stein’s The Making of Americans.” Literary and Linguistic Computing, 23(3): 361-381.
6. Cohen, Kris R. 2005. “What Does the Photoblog Want?” Media, Culture, & Society, 27(6): 883-901.
7. Creswell, John W. and Miller, Dana L. 2000. “Determining Validity in Qualitative Inquiry.” Theory into Practice, 39(3): 124-130.
8. Dillon, A., Kleinman, L., Choi, G., and Bias, R. 2006. “Visual Search and Reading Tasks Using ClearType and Regular Displays: Two Experiments.” Proceedings of CHI 2006, New York: ACM Press, 503-511.
9. Doty, Philip and Erdelez, Sanda. 2002. “Information Micro-Practices in Texas Rural Courts: Methods and Issues for E-Government.” Government Information Quarterly, 19(4): 369-387.
10. Eisenhardt, Kathleen M. 1989. “Building Theories from Case Study Research.” The Academy of Management Review, 14(4): 532-550.
11. Ensmenger, Nathan L. 2003. “Letting the ‘Computer Boys’ Take Over: Technology and the Politics of Organizational Transformation.” International Review of Social History, 48: 153-180.
12. Erickson, Ingrid. 2010. “Geography and Community: New Forms of Interaction Among People and Places.” American Behavioral Scientist, 53(8): 1194-1207.
13. Feinberg, Melanie. 2010. “Two Kinds of Evidence: How Information Systems Form Rhetorical Arguments.” Journal of Documentation, 66(4): 491-512.
14. Galloway, Patricia. 2011. “Personal Computers, Microhistory, and Shared Authority: Documenting the Inventor-Early Adopter Dialectic.” IEEE Annals of the History of Computing, 33(2): 60-74.
15. Golbeck, Jennifer, Koepfler, Jes, and Emmerling, Beth. 2011. “An Experimental Study of Social Tagging Behavior and Image Content.” Journal of the American Society for Information Science and Technology, 62(9): 1750-1760.
16. Gracy, David B. II. 2011. “‘Is There Counsel in Those Curtains?’ Research Agendas for the Times.” The Library Quarterly, 81(3): 277-295.
17. Hannay, J., MacLeod, C., Singer, J., Langtangen, H., Pfahl, D., and Wilson, G. 2009. “How Do Scientists Develop and Use Scientific Software?” Proceedings of the 2009 ICSE Workshop on Software Engineering for Computational Science and Engineering, IEEE Computer Society, 1-8.
18. Harmon, Glynn. 2008. “Remembering William Goffman: Mathematical Information Science Pioneer.” Information Processing & Management, 44: 1634-1647.
19. Hartel, Jenna. 2010. “Managing Documents at Home for Serious Leisure: A Case Study of the Hobby of Gourmet Cooking.” Journal of Documentation, 66(6): 847-874.
20. Karadkar, Unmil P., Francisco-Revilla, Luis, Furuta, Richard, Hsieh, Haowei, and Shipman, Frank M. 2000. “Evolution of the Walden’s Paths Authoring Tools.” Proceedings of WebNet 2000 - World Conference on the WWW and Internet, San Antonio, TX. New York: ACM Press, 299-304.
21. Kortum, P., Bias, R. G., Knott, B. A., and Bushey, R. G. 2008. “The Effect of Choice and Announcement Duration on the Estimation of Telephone Hold Time.” International Journal of Technology and Human Interaction, 4: 29-53.
22. Kumar, Abhimanu and Lease, Matthew. 2011. “Learning to Rank from a Noisy Crowd.” Proceedings of SIGIR ’11, Beijing, China.
23. LeCompte, Margaret D. and Goetz, Judith Preissle. 1982. “Problems of Reliability and Validity in Ethnographic Research.” Review of Educational Research, 52: 31-60.
24. Leydon, Geraldine M., Boulton, Mary, Moynihan, Clare, Jones, Alison, Mossman, Jean, Boudioni, Markella, and McPherson, Klim. 2000. “Cancer Patients’ Information Needs and Information Seeking Behaviour: In Depth Interview Study.” British Medical Journal, 320(7239): 909-913.
25. Longo, Daniel R., Schubert, Shari L., Wright, Barbara A., LeMaster, Joseph, Williams, Casey D., and Clore, John N. 2010. “Health Information Seeking, Receipt, and Use in Diabetes Self-Management.” Annals of Family Medicine, 8: 334-340.
26. Lukenbill, Bill and Immroth, Barbara. 2009. “School and Public Youth Librarians as Health Information Gatekeepers: Research from the Lower Rio Grande Valley of Texas.” School Library Media Research, 12.
27. MacCoun, R. J. 1998. “Biases in the Interpretation and Use of Research Results.” Annual Review of Psychology, 49: 259-287.
28. Marwick, Alice E. and boyd, danah. 2011. “I Tweet Honestly, I Tweet Passionately: Twitter Users, Context Collapse, and the Imagined Audience.” New Media & Society, 13(1): 114-133.
29. Maxwell, Joseph A. 1992. “Understanding and Validity in Qualitative Research.” Harvard Educational Review, 62(3): 279-300.
30. Morgan, David L. 1996. “Focus Groups.” Annual Review of Sociology, 22: 129-152.
31. Pavelka. 1994. “Treating Manuscripts from the William Faulkner Collection.” The Book & Paper Group Annual, 13.
32. Ramos, Kathleen, Linscheid, Robin, and Schafer, Sean. 2003. “Real-time Information-seeking Behavior of Residency Physicians.” Family Medicine, 35(4): 257-260.
33. Roy, Loriene, Bolfing, Trina, and Brzozowski, Bonnie. 2010. “Computer Classes for Job Seekers: LIS Students Team with Public Librarians to Extend Public Services.” Public Library Quarterly, 29(3): 193-209.
34. Sanchez, C. A. and Wiley, J. 2009. “To Scroll or Not to Scroll: Scrolling, Working Memory Capacity, and Comprehending Complex Texts.” Human Factors: The Journal of the Human Factors and Ergonomics Society, 51(5): 730-738.
35. Sapp, M. and Gillan, D. J. 2004. “Length and Area Estimation with Visual and Tactile Stimuli.” Proceedings of the Human Factors and Ergonomics Society 48th Annual Meeting, Santa Monica, CA: Human Factors and Ergonomics Society, 1875-1879.
36. Shankar, Kalpana. 2007. “Order from Chaos: The Poetics and Pragmatics of Scientific Recordkeeping.” Journal of the American Society for Information Science and Technology, 58(10): 1457-1466.
37. Trace, Ciaran B. 2008. “Resistance and the Underlife: Informal Written Literacies and Their Relationship to Human Information Behavior.” Journal of the American Society for Information Science and Technology, 59(10): 1540-1554.
38. Westbrook, Lynn. 2009. “Information Myths and Intimate Partner Violence: Sources, Contexts, and Consequences.” Journal of the American Society for Information Science and Technology, 60(4): 826-836.
39. Winget, Megan. 2008. “Annotations on Musical Scores by Performing Musicians: Collaborative Models, Interactive Methods, and Music Digital Library Tool Development.” Journal of the American Society for Information Science and Technology, 59(12): 1878-1897.
40. Zhang, Yan. 2010. “Dimensions and Elements of People’s Mental Models of an Information-Rich Web Space.” Journal of the American Society for Information Science and Technology, 61(11): 2206-2218.
DIGITAL MATERIALS ON BLACKBOARD
1. Bailey, Diane E., Leonardi, Paul M., and Barley, Stephen R. Forthcoming. “The Lure of the Virtual.” Organization Science, Special Issue on Digital Innovation.
2. Best, J. 2001. “Thinking about Social Statistics: The Critical Approach.” In Damned lies and statistics: Untangling numbers from the media, politicians, and activists (pp. 160-171). Berkeley, CA: University of California.
3. Cronin, B. 1992. “When is a Problem a Research Problem?” In Leigh Stewart Estabrook (Ed.), Applying research to practice: How to use data collection and research to improve library management decision making (pp. 117-132). Urbana-Champaign, IL: University of Illinois, Graduate School of Library and Information Science.
4. Howison, James and Crowston, Kevin. Under review. “Collaboration Through Superposition: How the IT Artifact as an Object of Collaboration Affords Technical Interdependence Without Organizational Interdependence.”
PHYSICAL MATERIALS ON RESERVE
1. Dethier, V. G. 1989. To know a fly. Boston: McGraw-Hill. (This is out of print. Four copies are on two-hour loan from the reserves file drawer in the iSchool IT Lab.)
PHYSICAL MATERIALS YOU MUST ACQUIRE, THEIR PRICE AND SOURCE
1. Huff, Darrell. 1993. How to lie with statistics. New York: W. W. Norton and Company. Cost: $6.35. Source: Amazon or other online booksellers; maybe Half-Price Books.
2. Hinton, Perry R. 2001. Statistics explained: A guide for social science students. New York: Routledge. (Either 1st or 2nd edition.) Cost: $32.50. Source: Amazon or other online booksellers.
Total cost of course materials: ~$40
WEEKLY CLASS SCHEDULE
(Readings are listed below by short citation; see the tables above for full citations.)

Week 1 (8/24)
Topic: Introduction
− Course mechanics and aim
− Dialogue between Drs. Bias and Bailey: What makes a good research study?
− Parts of a research paper
− How to (closely, carefully) read papers in this course
− How to (comprehensively, conscientiously) complete HW in this course
− Tips for finding, storing, annotating, tagging, and retrieving research articles
Instructors: Bailey, Bias, Luehrsen, Finn
Read/do prior to class: nothing
Due in class: nothing

Week 2 (8/31)
Area: Quantitative
Topic: Scientific Method
− Operationalizing variables
− Hypothesis testing
− Sampling
− Independent and dependent variables
Instructor: Bias
Read/do prior to class:
− Cronin. 1992. “When is a Problem a Research Problem?”
Due in class: HW #1

Week 3 (9/7)
Area: Quantitative
Topic: Experiments
− Hypothesis testing (revisited)
− Controls, confounds, counterbalancing
− The ethics of studying humans
− Within-, between-subject designs
− Reliability and validity
− Ceiling and floor effects
Instructor: Bias
Read/do prior to class:
− Dethier (1989)
− Dillon et al. 2006. “Visual Search and Reading Tasks Using ClearType and Regular Displays: Two Experiments.”
− One more online resource, TBD
Due in class: nothing

Week 4 (9/14)
Area: Quantitative
Topic: Representing Data: Descriptive Statistics
− Collecting some data
− Frequency distributions
− Representing data
− Measures of central tendency
Instructor: Bias
Read/do prior to class:
− Best. 2001. “Thinking about Social Statistics: The Critical Approach.”
− Hinton, Ch. 1-5
Due in class: HW #2

Week 5 (9/21)
Area: Quantitative
Topic: More Descriptive Statistics
− More “why?” and “how?”
− Frequency distributions (revisited)
− Representing data (revisited)
− Measures of central tendency (revisited)
− Measures of spread
− z scores
Instructor: Bias
Read/do prior to class:
− MacCoun. 1998. “Biases in the Interpretation and Use of Research Results.”
Due in class: nothing

Week 6 (9/28)
Area: Qualitative
Topic: Types of Qualitative Research
− Philosophical underpinnings (e.g., positivist, interpretivist, criticalist)
− Deconstructing articles for stance
Instructor: Bailey
Read/do prior to class:
− Lukenbill & Immroth. 2009. “School and Public Youth Librarians …”
− Westbrook. 2009. “Information Myths and Intimate Partner Violence…”
− Trace. 2008. “Resistance and the Underlife…”
− Ensmenger. 2003. “Letting the ‘Computer Boys’ Take Over…”
Due in class: HW #3

Week 7 (10/5)
Area: Qualitative
Topic: Qualitative Methods
− Approaches (e.g., ethnography, case study, grounded theory, historiography)
− Data collection (e.g., interviews, observation, texts, visual materials)
− Data analysis (e.g., discourse analysis, memos, coding, content/text analysis)
Instructor: Bailey
Read/do prior to class:
− Boeije. 2002. “A Purposeful Approach to the Constant Comparative Method…”
− Morgan. 1996. “Focus Groups.”
− Barley. 1990. “Images of Imaging…”
− Eisenhardt. 1989. “Building Theories from Case Study Research.”
Due in class: nothing

Week 8 (10/12)
Topic: Learning Your Way Around Research
− Honing your evaluative skills
− The varied role of research in the master’s program
− Getting started on your comprehensive assignment (if you haven’t already)
Instructors: Luehrsen, Finn
Read/do prior to class:
− Quantitative set:
o Sapp & Gillan. 2004. “Length and Area Estimation with Visual and Tactile Stimuli.”
o Sanchez & Wiley. 2009. “To Scroll or Not to Scroll…”
o Kortum et al. 2008. “The Effect of Choice and Announcement Duration on the Estimation of Telephone Hold Time.”
− Qualitative set:
o Erickson. 2010. “Geography and Community…”
o Marwick & boyd. 2011. “I Tweet Honestly, I Tweet Passionately…”
o Cohen. 2005. “What Does the Photoblog Want?”
Due in class: HW #4

Week 9 (10/19)
Area: Qualitative
Topic: Validity of Design & Conclusions
− Design fit
− Triangulation
− Data collection safeguards
− Data analysis safeguards
Instructor: Bailey
Read/do prior to class:
− Creswell & Miller. 2000. “Determining Validity in Qualitative Inquiry.”
− Maxwell. 1992. “Understanding and Validity in Qualitative Research.”
− LeCompte & Goetz. 1982. “Problems of Reliability and Validity…”
Due in class: HW #5

Week 10 (10/26)
Area: Qualitative
Topic: Qualitative Research in the iSchool
− More examples of approaches and methods from recent iSchool faculty research
− Focus on how design, methods, and conclusions work together to facilitate inquiry
− New directions:
o Autoethnography
o Science & art combined
Instructor: Bailey
Read/do prior to class:
− Howison & Crowston. Under review. “Collaboration Through Superposition…”
− Roy, Bolfing, & Brzozowski. 2010. “Computer Classes for Job Seekers…”
− Barker. 2009. “Student and Faculty Perceptions of Undergraduate Research Experiences…”
− Winget. 2008. “Annotations on Musical Scores by Performing Musicians…”
− Doty & Erdelez. 2002. “Information Micro-Practices in Texas Rural Courts…”
− Bailey, Leonardi, and Barley. Forthcoming. “The Lure of the Virtual.”
Due in class: nothing

Week 11 (11/2)
Area: Quantitative
Topic: Surveying; also Inferential Statistics
− Standard error of the mean
− Confidence intervals
− t tests
− Statistical significance
Instructor: Bias
Read/do prior to class:
− Hannay et al. 2009. “How Do Scientists Develop and Use Scientific Software?”
− Hinton, Ch. 6-11, 13, 14, 19
− Perhaps one more citation, TBD
Due in class: HW #6

Week 12 (11/9)
Area: Quantitative
Topic: Inferential Statistics (cont’d)
− Chi-square
− Correlation
− Conducting an experiment and a t-test
Instructor: Bias
Read/do prior to class: none listed
Due in class: nothing

Week 13 (11/16)
Area: Humanities
Topic: Distinct Approaches
− Rhetorical analysis
− Historical analysis
Instructor: Bailey
Read/do prior to class:
− Gracy. 2011. “‘Is There Counsel in Those Curtains?’…”
− Galloway. 2011. “Personal Computers, Microhistory, and Shared Authority…”
− Feinberg. 2010. “Two Kinds of Evidence…”
− Harmon. 2008. “Remembering William Goffman…”
− Aspray. 1985. “The Scientific Conceptualization of Information…”
Due in class: HW #7

Week 14 (11/23)
Area: Design & Computational
Topic: Distinct Approaches (cont’d)
− Conservation research
− Design research
− Computational research
o Computer science
o Digital humanities
Instructors: Bailey, Bias, Luehrsen, Finn
Read/do prior to class:
− Pavelka. 1994. “Treating Manuscripts from the William Faulkner Collection.”
− Kumar & Lease. 2011. “Learning to Rank from a Noisy Crowd.”
− Zhang. 2010. “Dimensions and Elements of People’s Mental Models…”
− Clement. 2008. “‘A Thing Not Beginning and Not Ending’…”
− Karadkar, Francisco-Revilla, et al. 2004. “Metadocuments Supporting Digital Library Information Discovery.”
Due in class: nothing

Week 15 (11/30)
Topic: Wrapping Up
− Why mixed method approaches sometimes work best
− When to temper the critic
− Going forward:
o Reading research for your coursework and your professional work
o Learning to do research
Instructors: Bailey, Bias, Luehrsen, Finn
Read/do prior to class: nothing
Due in class: Comprehensive Assignment
ASSIGNMENTS
The point of this course is to build your skills in understanding the research that you read and hear
about. For this reason, you are to complete every assignment in this course independently from your
fellow students. We developed these assignments as alternatives to mid-terms and a final exam,
which you would have completed on your own, so please view them in the same light. Because we
recognize that some students learn better when the context is more social and group oriented, we
encourage you to form study groups to discuss the readings assigned for in-class sessions, to review
the material that we cover in class, and even to share strategies for how to approach the assignments.
But you must complete the assignments on your own. That way, your grades will reflect how well
you understand the material and will provide an indication to you of what material you have grasped
and what material you need to revisit. Our goal is to help you gain the skills necessary to understand
research – skills that will help you tremendously not only in this master’s program, but also in your
professional career. With this goal in mind, we want to see you do well on these assignments.
Therefore, if you have difficulties understanding the instructions or the material covered by any
assignment, talk to us or the TA, preferably well in advance of the due date.
HW #1: Paper Parts (6%; due 8/31)
Closely, carefully read the two papers listed below, one quantitative and one qualitative.
1) For each paper, identify the items listed below. If you insert quotes from the article as part of
your answer, include page numbers.
a. The problem statement
b. The gap in the literature that the paper seeks to fill (briefly include what the literature
has studied, so that what is missing is clear)
c. The literature on which the study draws (describe in a few sentences)
d. The research questions or hypotheses of the current study
e. The type of investigation undertaken (e.g., experiment, case study, survey)
f. The sample (types and number of subjects/informants/data)
g. Data collection method(s)
h. Data analysis method(s)
i. Main findings (describe in a few sentences)
j. Conclusions (describe in a few sentences)
2) By the time you finished reading each paper, you no doubt had formed some opinion about the
paper’s quality in terms of writing (Was the exposition clear? Was the reading enjoyable?),
importance (Did this topic seem worth the effort of the author to write it and you to read it?),
and soundness (Did you believe the conclusions based on the author’s description of how the
research was conducted?). Discuss how each paper shaped your opinion of its quality of writing,
importance, and soundness. Give specific examples of material or details within each paper that
strengthened or weakened your opinion of it.
Quantitative Paper
Golbeck, Jennifer, Koepfler, Jes, & Emmerling, Beth. 2011. “An experimental study of social
tagging behavior and image content.” Journal of the American Society for Information Science and Technology,
62 (9), 1750-1760.
Qualitative Paper
Shankar, Kalpana. 2007. “Order from Chaos: The Poetics and Pragmatics of Scientific
Recordkeeping.” Journal of the American Society for Information Science and Technology, 58(10): 1457-1466.
HW #2: Designing an Experiment (12%; due 9/14)
Design an experiment. Specify what your research question is. Do not write a full methods section.
Rather, specify your null and alternative hypotheses. Specify what your independent and dependent
variables are, and how they are operationally defined. Specify what controls and counterbalancing
you’d employ to avoid confounds. Whom will you study, as your test participants? How will you
sample them? (I might envision all of the above as a spreadsheet.) Design an experiment that, if you
really did have the time and the money to carry it out, would likely get you an answer from “Nature.”
Now, spend some time being reflective. Write a paragraph on which parts of this were hard for you
and which parts were easy. Why do you think so?
Pick something that interests you. No, it doesn't have to be one that we've talked about in class. If
you wish, you can send me your research question, and I'll tell you if I think it sounds like a good
one.
Objective? Think of this as your first (?) experimental design. You get to have fun imagining the
research, without actually carrying out the work. Go for it. Design a study from which you'd like to
know the results.
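A purely hypothetical illustration (your topic can be anything; this one is just to show the pieces fitting
together): suppose the research question is “Does typeface affect on-screen reading speed?” The null
hypothesis (H0) would be that mean reading speed is the same for a serif and a sans-serif typeface; the
alternative hypothesis (H1) would be that the two means differ. The independent variable is typeface
(serif vs. sans serif), operationally defined as two specific fonts shown at the same size; the dependent
variable is reading speed, operationally defined as words read per minute on a standardized passage
(with a comprehension check so participants can’t just skim). Controls would hold the screen, lighting,
and passages constant, and counterbalancing the order in which each participant sees the two typefaces
in a within-subjects design would guard against practice and fatigue confounds. You might sample, say,
graduate students recruited from across campus, randomly assigned to one of the two presentation orders.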
HW #3: Descriptive Statistics Exercise (12%; handed out 9/14, due 9/28)

HW #4: Evaluating Papers (9%; due 10/12)
The six papers listed for class reading on 10/12 are grouped in two sets (one quantitative, one
qualitative) of three. Closely, carefully read each set and then order the papers within each set from
strongest to weakest according to your assessment of the research quality. You may judge quality in
terms of writing, importance, and soundness, or you may define and justify other reasonable aspects
of quality and then rate the papers according to them. Write an essay in which you provide a
convincing justification for your quality assessments of the papers within each set (first discuss the
quantitative set, then the qualitative set), including short quotes of relevant passages as necessary to
support your arguments. In addition to serving as a HW submission, your arguments in this essay will
inform your in-class discussion activity that day. For this reason, this HW cannot be late.
HW #5: Qualitative I (12%; due 10/19)
Part 1. Closely, carefully read the three papers listed below, which employ a variety of qualitative
methods to investigate information seeking behavior among physicians or patients.

Leydon, Geraldine M., Boulton, Mary, Moynihan, Clare, Jones, Alison, Mossman, Jean, Boudioni, Markella, and McPherson, Klim. 2000. “Cancer Patients’ Information Needs and Information Seeking Behaviour: In Depth Interview Study.” British Medical Journal, 320(7239): 909-913.
 Longo, Daniel R., Schubert, Shari L., Wright, Barbara A., LeMaster, Joseph, Williams, Casey D., and Clore, John N. 2010. “Health Information Seeking, Receipt, and Use in Diabetes Self-Management.” Annals of Family Medicine, 8: 334-340.
 Ramos, Kathleen, Linscheid, Robin, and Schafer, Sean. 2003. “Real-time Information-seeking Behavior of Residency Physicians.” Family Medicine, 35(4): 257-260.
Answer the following questions:
(1) None of the three papers identifies its philosophical stance. Discuss what stance you think
each paper took (positivist, interpretivist, or criticalist) and what your clues were. For example,
look for research questions versus hypotheses, and consider the kinds of data the authors
collected, how they collected the data, and how they analyzed the data. Were they searching
for universal truths or situated understanding? Your answers may not be clear cut (I
specifically chose these papers for that reason); thus, don’t feel pressured to stake a claim for
one stance over another. Rather, if ambiguity exists, discuss all the evidence in favor of
each stance, and then give your overall conclusion.
(2) In what ways and to what extent was each paper aimed at extending theory or, more
generally, a body of literature?
(3) In what ways and to what extent was each paper aimed at improving practice?
(4) Were the methods of each paper written up in a manner such that a different researcher
might reasonably replicate the data collection and analysis techniques? If not, what
information appeared to be missing?
(5) What was beneficial about the use of the chosen research method(s) in terms of the findings
and knowledge it generated in each paper?
(6) To what extent could the researchers have gathered the same information as found in these
studies using non-qualitative methods, such as surveys or experiments? Explain.
Part 2. For each of the interview questions below (listed in the order the researcher intends to ask
them), evaluate how good the interview question would be if the research question were, “In the face
of increasingly virtual librarianship, how intimate and personal are librarians’ relationships to the
public?”, the approach were ethnographic (observations in public libraries combined with
interviews), and the informants for this interview protocol were library staff members. Provide
reasoning for your assessment. If the question can be significantly improved, write your improved
version; if it should be deleted, say why.
(1) Let’s get started. How old are you?
(2) Do you like your job?
(3) How long have you worked at this library?
(4) Six months ago your library started a new virtual chat reference service. I’d like you to think back to a typical day right before the service was launched and tell me how that day went. Then I’d like you to think about a day right after the service was begun and tell me how that day went. For both days, I’d like you to detail for me what kinds of questions you received.
(5) I noticed a flyer for a gardening class here at the library. The nice thing about a class like that is personal contact with patrons, but isn’t that a weird service for a library to offer?
(6) How many times per day do library staff members talk to patrons face to face?
(7) Describe for me what, in your position, your interaction is like with the public. (Probe if response is vague: I mean in terms of things like tasks you do that put you in personal contact with patrons. Probe if response is short and evaluative (e.g., “Good”): Can you give me a recent example that might illustrate?)
(8) I imagine that you are pretty worried that all the new technologies and automation and digital search mean that libraries will cut their staff. We see that a lot in our study.
(9) What kinds of changes in terms of the library’s interaction with the public have you seen over time here at the library? Follow-up for each change:
a. What prompted that change?
b. How did it make you feel?
c. What was the patrons’ response like?
(10) What do you think patrons expect out of their library today? How is that different from five
years ago?
Part 3. Closely, carefully read the following ethnographic study, which employs a grounded theory
approach to data analysis. Whereas quantitative studies in a positivist tradition typically begin with
theory, which they then test through data, qualitative studies in interpretivist or criticalist traditions, to
the extent that they care for theory at all, tend to begin from data and build to theory. Yet, this article
has a rather extensive literature review and a conceptual framework in the front end, much like one
might see in a hypothesis-testing study. Discuss how the author used grounded theory to elicit
concepts from her data and how she tied these new concepts (personal culinary libraries, mother
lode, zone, etc.) to the theory she discussed in the beginning of the paper. Why was an ethnographic
approach, and grounded theory in particular, an apt choice for this study?

Hartel, Jenna. 2010. “Managing Documents at Home for Serious Leisure: A Case Study of the Hobby of Gourmet Cooking.” Journal of Documentation, 66(6): 847-874.

HW #6: Qualitative II (12%; due 11/2)
Part 1. Consider a researcher who intends to conduct a qualitative study of identity construction in
social media by investigating the motivations of reviewers on Yelp. She plans to triangulate her data
collection by interviewing a sample of Yelp users who are active and prominent reviewers,
conducting a content analysis of a selection of reviews from each of these reviewers, and interviewing
a number of the “friends” (Yelp term) of each reviewer in her sample. (If you are unfamiliar with
Yelp, you might want to gain a bit of understanding by logging on, reading some reviews, and
looking at the profiles of a few reviewers.)
(1) Discuss how the researcher’s planned triangulation will aid the validity of her findings.
(2) Discuss at least two other techniques that this researcher might gainfully employ to help
establish the validity of her findings, explaining in detail how she could use them in the
specific context of her inquiry.
(3) Suppose the researcher determines that Yelp reviewers have three primary motivations in
constructing their identities on Yelp (not every reviewer exhibits all of them, and some
exhibit only one): (a) establishing the persona of someone who knows cool, hip places, (b)
establishing the persona of a witty, talented writer, and (c) establishing the persona of a
discriminating food and restaurant critic. What evidence would you look for in the write-up
of this study that would encourage you to accept these three motivations as primary
motivations (as opposed, for example, to some other motivations you might have been
expecting to arise, such as establishing centrality in a social network of hipsters, or
fewer/more motivations than the three motivations listed)?
Part 2. Pick any two of the qualitative papers by iSchool faculty that we read for class 9/28 and
10/26. READ THEM (CLOSELY, CAREFULLY) A SECOND TIME, this time taking particular
notes with respect to validity. Write an essay in which you discuss how the authors of these papers
wrote up their research in ways that lent a sense of validity to their conclusions. If the authors
spurned the very idea of validity, provide evidence of the authors’ position and discuss whether,
nonetheless, you feel confidence in the findings. Draw upon examples from each paper to support
your arguments, drawing out similarities and differences across the two papers.
HW #7: Inferential Statistics Exercise (12%; handed out 11/2, due 11/16)
Comprehensive Assignment: Evaluating Related Research Papers (25%; due 11/30)
On your own, find and read (closely, carefully, but you know this by now!) eight research papers in
English on a single, common topic in information studies. Example topics include privacy concerns
in medical informatics, information seeking behaviors of older adults, impact of information literacy
interventions in public libraries, web user interface for physically challenged adults, and the role of
communicative artifacts in scientific work. Make the topic narrow enough so that the papers speak to
one another. A reasonable approach may be to find one paper you like, find other papers that cite it
or papers that it cites, and then continue snowballing in a similar manner across papers until you
have eight papers total that you are keen to read. At most, four of the eight papers can be mixed
methods or can lie in the distinct approaches of humanities, design, and computational research. The
remainder should represent an even or nearly even (off by one only) split between quantitative and
qualitative papers. Do not include any papers by our iSchool faculty or any papers that we read in
this course. Group the papers by quantitative, qualitative, mixed, and distinct approach papers, and
then answer the following questions:
(1) What research approach and methods did the authors employ in their investigation?
(2) What were the research questions or hypotheses? Identify the page on which they first
became apparent. Did the authors stay true to these questions/hypotheses throughout the
balance of their paper? If not, discuss where they digressed.
(3) How well did the methods suit the questions or hypotheses? Use a three-anchor rating scale
(very well, reasonably well, not well) and provide a clear rationale for your assessment.
(4) For quantitative, qualitative, or mixed methods papers: What was the sample size? Briefly
describe the informants or subjects (e.g., undergraduate students, individuals with physical
impairments, academic librarians). For all other papers: Describe the data upon which the
paper drew.
(5) Were the study’s findings and conclusions believable? Use a four-anchor rating scale (very
much so, reasonably so, not exactly, not at all) and provide a clear rationale for your
assessment.
(6) Did the study’s findings and conclusions strike you as important? Why or why not?
(7) Use a three-anchor rating scale (high, moderate, low) to rate each paper in terms of the
overall quality of the research conducted and provide a clear rationale for your assessment.
Next, create a table to summarize your responses to questions 1-7 above, with a row for each paper
and separate columns for each question or parts thereof. Fill in the information for each paper, and
then answer these remaining questions:
(8) In your opinion, which paper among the eight was strongest, and why?
(9) What did you learn substantively from these papers as a set? In other words, what did you
learn about the topic they jointly addressed? To what extent and in what ways did they
extend the literature on this topic?
(10) From the perspective of understanding research, what did you learn from your reading and
evaluation of these papers as a set?
Finally, provide full citations to the eight papers in a single, consistent citation format.