Recent Library Practitioner Research: A Methodological Analysis and
Critique
Submitted by
Dr. Charles R. Hildreth and Ms. Selena Aytac
Palmer School of Library and Information Science
Long Island University/C.W. Post Campus
720 Northern Blvd.
Brookville, NY 11548
516-299-2178
(Send all electronic correspondence to hildreth@liu.edu)
1-(2/15/2016)
ABSTRACT
Recent Library Practitioner Research: A Methodological Analysis and Critique
Every few years Library and Information Science (LIS) scholars report on the state of
research in librarianship in general and practitioner research, in particular. These reviews,
some more extensive in journal coverage than others, focus on topics addressed,
methodologies used, and quality of the research. At the January 2003 ALISE meeting, we
presented the findings of our review of the 1998-2002 published librarianship research [1].
We decided to update that analytical review for two reasons: to look at more recent literature
(2003-2005), and to use refined methodological criteria to evaluate these studies. From a
purposive sample of 23 LIS journals we mined 401 research articles, 206 of which were
randomly selected for this in-depth analysis. A checklist of 35 factors was applied in our
analysis of these research articles. These project and report factors include authorship, topic,
location and setting, type of research, data collection methods, type of data analysis, statistical
techniques, and components and quality of the report itself. The data collected were
statistically analyzed to produce summary data and to provide an opportunity to discover any
significant associations among the many variables. Such analysis supports comparisons of
practitioner and academic scholar research. The descriptive data have enabled us to document
the status quo in recent practitioner research. These findings are used to explore recent
patterns or trends in library practitioner research and they provide a basis for comparisons
with earlier reported findings and assessments.
Recent Library Practitioner Research: A Methodological Analysis and
Critique
INTRODUCTION
Every few years Library and Information Science (LIS) scholars report on the state of
research in librarianship in general, and practitioner research, in particular. These reviews,
some more extensive in journal coverage than others, typically focus on topics addressed,
methodologies used, and quality of the research. At the January 2003 ALISE meeting, we
presented the findings of our review of the 1998-2002 published library research, with special
focus on research conducted by library practitioners [1]. We decided to update that analytical
review for two reasons: to look at more recent literature (2003-2005), and to use refined
methodological criteria to evaluate these studies. Consistent with the aims of most earlier
reviews, we designed a study that would permit us to discover and document the status quo of
recently published library research. In the review reported here we focus primarily on
research methodology issues and the quality of the published reports of both practitioner and
academic scholar research. By “academic scholar” we mean those individuals who are
members of teaching and research faculties at colleges and universities, typically in graduate
degree programs in library or information science. Topics of research were also included in
this review and analysis. Topics found in the research literature were coded and documented
using a concise taxonomy of broad categories. In a future study we plan to employ a more
detailed, finely-grained classification of research topics that will more accurately represent the
number and diversity of research topics we encountered in this review of the research
literature.
The data recorded for more than 30 characteristics of the 2003-2005 published research
studies in our sample have enabled us to identify patterns and trends in research approaches,
strategies, and use of a variety of methodologies. Our findings permit comparisons with
earlier studies and assessments of the accuracy of past predictions regarding the direction of
future library research. As the title of this document indicates, we were most interested in
documenting and analyzing recent library practitioner research. For purposes of comparative
analysis we also looked at a representative sample of research conducted and published by
academic scholars during the same period, 2003-2005.
LITERATURE REVIEW
If there is a common theme in past reviews of the library and information science (LIS)
research literature it is this: library-related research is substandard, although developments
over the last few decades of the twentieth century indicate that improvements are continually
being made. As summarized by Powell and Connaway, “Those who have assessed the
previous research of librarians have been of a consensus that the quantity and quality have left
something to be desired.”[2] No doubt mindful of Hernon and Schwartz’s “call for a higher
standard for the conduct of research and the publication of research results,”[3] Powell and
Connaway, focusing on recent library research in the first chapter of their 2004 textbook,
Basic Research Methods for Librarians, present a credible case that there has been marked
improvement in library research in recent years. The areas of improvement they highlight
include better conceptualizations of topics and questions to be investigated, and more rigorous
application of a variety of research methods and statistical analysis techniques. The research
“standard of quality” referenced in reviews like these reflects the reviewer’s views on what
constitutes sound “scientific inquiry.” It is not always clear if a research orientation is
primarily positivist or post-positivist. One’s standard of “scientific inquiry” may be no more
than the elementary, conventional version of the five-step “scientific method” in which one
proceeds from formulation of problem, through testing, to results. Hernon and Schwartz more
felicitously describe the research process as “an inquiry process that includes components for
reflective inquiry, research design, methodology, data collection and analysis, and the
communication of findings.”[4] Unlike Hernon and Schwartz, reviewers of library research
rarely reveal their own philosophical or methodological orientations and biases. They may be
positivists or post-positivist-constructivists, experimentalists or ethnographers. They may
prefer a qualitative methodological approach over quantitative methods, or vice versa. These
orientations and preferences may well influence a reviewer’s assessment of “good” research
and the type of library research area most in need of improvement. We think it is safe to say
that critical reviews published before the 1980s reflect a positivist point of view, with its
preference for quantitative research methods.
In his 1967 invited editorial, Ennis, senior researcher at the National Opinion Research
Center, observes that “library research is non-cumulative, fragmentary, generally weak, and
relentlessly oriented to immediate practice….Most of these studies have been critically
reviewed recently and the conclusions of the review are that there is little hard and useful
knowledge gained. Samples were too small or poorly drawn, study designs were
underconceptualized, measuring instruments and techniques were primitive.”[5] Commenting
on the library literature of the 1970s, Grotzinger bemoans the lack of progress in library
research and fears that librarians will continue to be “not only reluctant but resistant to the
achievement of an understanding of the nature of scientific inquiry.”[6] Grotzinger attributes
this resistance to the failure of LIS educators and their curricula to instill in library
professionals a knowledge-based respect for the true “meaning of research” and scientific
inquiry. The author makes it clear that competence in quantitative methods and statistics is the
knowledge to be gained and applied to library research questions.
In their 1980 textbook, Research Methods in Librarianship: Techniques and Interpretations,
Busha and Harter state that “Unfortunately, a large proportion of librarianship’s research has
been uneven in quality and demonstrably weak methodologically; therefore, too much of it
has only approximated scientific inquiry.”[7] The authors’ positivist view of “scientific
inquiry” is made clear immediately after this passage. Experimental research is the “key
instrumentation of science,” to be used for the testing of hypotheses. Furthermore,
establishing the validity and reliability of measurement instruments is of paramount
importance to these authors.
Beginning in the 1990s, reviewers of library research demonstrate a sympathetic
understanding of non-positivist, qualitative research approaches. The methodologies of the
social sciences have begun to exert a greater influence on LIS scholars than the long-dominant
quantitative methodologies of the natural sciences. This appreciation of the diverse
methodologies applied insightfully and successfully in the social sciences can be seen in their
assessments of library research. This influence is evident in the construction of more inclusive
taxonomies of research methods, and in new and revised standards of “good” research.
Discussions of good research now emphasize the unequaled benefits of qualitative research
and methodological “triangulation,” the complementary use of more than a single type of
method. The emphasis on “quality” research shifts somewhat from maximizing the validity
and reliability of data collection procedures, to personal immersion in the context of study,
and insightful analysis of that context and the verbal or textual data extracted from that
context. From the early 1990s to the present day, reviewers of library-related research
increasingly place their analyses in the context of the social sciences and the qualitative
methodologies that have gained favor in those disciplines. In her 1991 review of the quantity
and quality of LIS research (“more but not better”), Van House turns to research in the social
sciences to shed light on the deficiencies in LIS research, and for guidance on ways to
improve this research, with special focus on research methodologies. She asks of LIS
research, “How good is it, according to the standards of social science research?”[8] She
concludes that library-related research “suffers from serious, but not fatal, methodological and
content problems.” The causes of this state of affairs are best understood by looking at the
quality of research in the social sciences and how that quality is achieved and evaluated by
social science researchers. Oddly, when Van House discusses reasons for the methodological
shortcomings of LIS research, she looks inwardly to the profession and discipline of library
science. The “culprits” identified by Van House include the pressure for academic library
practitioners to “publish or perish,” and the inadequate training in research methods provided
by the graduate library schools. She recommends that practitioners not conduct research
unless they are well-trained and can address significant problems of interest to a wider
audience in related disciplines. Until this comes about, Van House recommends that
practitioner research be published primarily in “practitioner-oriented journals,” not scholarly
journals.
An enthusiastic proponent of qualitative research, Raya Fidel documents the growing interest
in and increased use of qualitative methods in information retrieval research.[9] Her review of
the LIS research literature is confined to research in information retrieval, but she pleads her
case for the expanded use of qualitative methods in all areas of LIS research. She offers three
explanations for the increased use of qualitative methods: “the failure of quantitative methods
to produce what was expected of them; the move toward a user-centered approach; and the
growing interest in qualitative methods in other areas in the social sciences.”[10] Fidel clearly
believes the path to improved, more relevant library research goes through the methodological
territory of the social sciences.
In his examination of recent trends in LIS research at the end of the twentieth century, Powell
tells us he “reviewed the research literature of the social sciences, including library and
information science.”[11] He reports on the many research methods identified and “deemed to
be relevant to library and information science research.” Special attention is given to
qualitative methods used in the social sciences. In successive editions of his well-respected
textbook on library research methods, Powell has consistently documented the shortcomings
of library research. In this “Trends” paper, Powell encourages researchers to expand their
methodological repertoire to include qualitative methods used in the social sciences. His
optimism about improvements in future library research is accompanied by his prediction that
the use of qualitative methods and “triangulated” research designs will increase.
In 2002, McKechnie and her colleagues conducted a methodological analysis of human
information behavior research published between the years 1993 and 2000.[12] The aim of
this review was to document recent methodological practices and identify any methods trends.
This is one of the few reviews of LIS research to consider separately research by practitioners
and research by academic scholars (generally, members of LIS graduate teaching faculties).
The authors discovered the unfortunate fact that less than four percent of first authors of
research reports in this area were practitioner researchers. McKechnie et al. report that the
majority of the human information behavior studies were primarily qualitative and used two
or more research methods. Interviews were identified as the method of choice, followed by
survey questionnaires, observation, and content analysis.
Wishing to focus on practitioner researchers in academic libraries, Watson-Boone examines
24 practitioner-authored research articles published in the Journal of Academic Librarianship
between the years of 1985 and 1995.[13] After an illuminating context-based discussion of
practitioners’ diverse motives for conducting research, she applies a six-method taxonomy to
document methods used in their research. “Survey research” was the method of choice in one-
half of the studies (12), followed by “action research” (5, 20.8%). Only one experiment was
found, and one study was classified as “evaluative.” Watson-Boone’s review suffers not only
from its small sample of practitioner research, but also from its short, confusing list of
research methods. She equates types of research with specific data collection techniques.
Applied research, for example, may well use survey research methods for data collection. In
short, she confuses research orientations or motive-based research strategies and approaches
with specific data collection methods (e.g., questionnaires, interviews, observations, etc.). In
our study we have attempted to avoid this confusion by making clear distinctions between
methodological orientation, research types and strategies on the one hand, and data collection
and data analysis methods, on the other hand.
Inspired by the emergence of the “evidence-based librarianship movement” (EBL),
Koufogiannakis and her associates conducted an extensive review of 807 research articles
mined from 91 LIS journals for the year 2001. Their content analysis of these articles focused
on topics researched, methods used, authorship characteristics, journal source, and indexing
and abstracting services. The authors also attempted to “identify what if any correlation exists
between the research method used and the subject (‘domain’) of research.”[14] After
providing an extensive literature review of previous LIS research study reviews,
Koufogiannakis et al. report that few of these studies have attempted to demonstrate
correlations between variables studied, especially correlations between subject areas
researched and research methods employed. In pursuit of this objective, Koufogiannakis et al.
create a six-item subject domain taxonomy and a 15-item methods (“study type”) taxonomy.
Their descriptive data analysis revealed that the subject domain most researched was
“information access and retrieval” (38.9%), followed by “collections” (24%), and
“management” (16.7%). Research methods identified in the 15-item methods classification
ranged from the general type (“descriptive”) to the specific (computer logging and analysis).
The highest proportion of the research articles was classified as “descriptive” studies,
followed in frequency by “comparative” studies. Warning that their classification of research
methods may need further refinement, the authors display counts of articles by subject domain
that reflect the use of one or another study type. Thus, we are presented with suggested
correlations between subjects and study types. Unfortunately, there is no attempt to use
statistical techniques such as chi-square tests for independence to determine if any of these
associations between subject domains and study type (method) are statistically significant.
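As an illustration of the kind of analysis the authors find missing, a chi-square test of independence on a subject-by-study-type contingency table might be sketched as follows. The counts and category labels below are invented purely for illustration, not drawn from Koufogiannakis et al.

```python
# A minimal, self-contained sketch of a chi-square test of independence.
# All counts below are hypothetical, for illustration only.

def chi_square_independence(table):
    """Return (chi2, df) for a contingency table of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            chi2 += (obs - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return chi2, df

# Hypothetical subject-domain (rows) by study-type (columns) counts:
observed = [
    [45, 30, 12],   # information access & retrieval
    [25, 10, 8],    # collections
    [15, 12, 5],    # management
]

chi2, df = chi_square_independence(observed)
critical_05 = 9.488  # chi-square critical value for df=4, alpha=0.05
print(f"chi2={chi2:.3f}, df={df}, significant={chi2 > critical_05}")
```

If the computed statistic exceeds the critical value for the table's degrees of freedom, the hypothesis that subject domain and study type are independent would be rejected at the 0.05 level.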
In many ways our current study updates the review of the 1998-2002 library practitioner
research literature presented by Hildreth.[15] That methodological study looked at a random
sample of 100 research articles selected from 23 LIS journals. The purposive sample of
journals from which these articles were mined was designed to include a majority of
practitioner journals. A formal checklist was used to document both study characteristics
(e.g., topic, research type, methods, statistics, etc.) and “quality” report characteristics (e.g.,
presence of clear research question, literature review, evidence-based discussion of findings,
discussion of study limitations, etc.). The descriptive analysis revealed that survey
questionnaires topped the list of most-used data collection methods, and that many studies
used two or more data collection methods, suggesting an increase in methodological pluralism
in LIS research. Departing from earlier reviews, Hildreth attempted to distinguish between
overarching “research approaches” based on researcher orientation, motivation, and study aim
or strategy (e.g., descriptive, exploratory), and data collection methods or data analysis
techniques. These separate aspects of research study design, methods, and analyses have not
been sufficiently distinguished and coded in earlier review studies. Applying these
distinctions, Hildreth reported that only a low percentage of the studies could be classified as
explorative in aim (25%). Only 29 percent of the studies used qualitative analysis in whole or
part. Few of the studies employed correlational or inferential statistical analyses. Hildreth did
not use statistical analysis in his 2003 study to determine the presence of significant
relationships between variables such as topic and method, or authorship (practitioner or
academic scholar) and research type. In the sections that follow we report on our review of the
2003-2005 LIS research literature. The findings include discovery of statistically significant
correlations between key variables of interest.
RESEARCH DESIGN AND METHOD
A purposive sample of 23 LIS journals was selected for this analysis of the recent research
literature in librarianship (see Appendix A). This selection of journals is weighted toward
practitioner research. Several key journals that are primarily publishing venues for academic
scholars are included in this sample for purposes of comparison. Every issue of these 23
journals published between 2003 and 2005 was examined by the authors of this report to identify
all of the articles we deemed to be reports of original empirical research studies. The criteria
we applied in this selection of research articles are based on Hernon’s definition of research as
“an inquiry process that has as its aim, the discovery or creation of knowledge, or theory
building; testing, confirmation, revision, refutation of knowledge and theory; and/or
investigation of a problem for local decision making.”[16] Hernon lists five activities that
comprise such research activity: reflective activity, including identification of a problem and
research questions; adoption of appropriate procedures in the research design and choice of
methodologies; collection of data; data analysis; and presentation of findings and
recommendations for future study.
We judged a published article a good candidate for our review if it reported on original
empirical research, research that addressed one or more specific research questions,
systematically collected and analyzed relevant data, and presented and discussed the
data-based findings of the research. Thus, excluded from this review were literature reviews,
methodology papers, historical studies, and analytical and opinion pieces. This stringent
identification and selection process yielded 401 research articles, 206 of which were
randomly selected for in-depth analysis. Appendix A displays the number of articles initially
selected from each of the 23 journals, and, in parentheses, the number of articles from each
journal included in the randomized sample. Each of the 401 candidate articles was given a
unique number identifier, and, journal by journal, a random number generator program was
used to select approximately one-half the articles for inclusion in our review and analysis
sample.
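The journal-by-journal randomized selection described above can be sketched in a few lines of code. The journal names, candidate counts, and seed here are hypothetical, chosen only to show the mechanics of drawing roughly half of each journal's candidate articles at random.

```python
import random

# Each candidate article gets a unique identifier; then, journal by
# journal, about half of the candidates are drawn at random.
# Journal names and ID ranges below are hypothetical.
candidates = {
    "Journal A": list(range(1, 70)),    # 69 candidate article IDs
    "Journal B": list(range(70, 118)),  # 48 candidate article IDs
}

rng = random.Random(2005)  # fixed seed so the draw is reproducible
sample = {
    journal: sorted(rng.sample(ids, k=round(len(ids) / 2)))
    for journal, ids in candidates.items()
}

for journal, ids in sample.items():
    print(journal, len(ids), "of", len(candidates[journal]))
```

Sampling within each journal (rather than from the pooled 401 articles) keeps each journal's share of the final sample proportional to its share of the candidate pool.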
From a set of ten general variables (e.g., topic, setting, etc., See Appendix B), a coded
checklist of 35 factors was developed, tested, and refined to support analysis and
documentation of the characteristics of each of the 206 research articles. The final checklist of
analysis factors emerged inductively from the actual literature. The variables were
divided into two broad categories: study variables and report variables. The study variables
focused on authorship, topic, setting, and methodological factors. The report variables focused
on the report itself, its components, organization, and overall quality of composition. The
definitions of each of these variables and how they were applied in the article analyses will be
discussed in the presentation of findings which follows. The 35-item article checklists were
used independently by each of the authors of this paper. There were few differences in our
separate analyses, and these differences were easily reconciled upon further reading,
rereading, and fine-tuning of our factors. The data recorded on each article checklist were
coded and entered into a database for statistical analysis. This analysis produced summary
descriptive data on key methodological factors and provided a basis for analysis of significant
correlations between key variables, including comparisons of practitioner and academic
scholar research. (Hereafter, academic scholar research or authorship rubrics will be shortened
to “academic,” with “scholar” assumed, and this will be differentiated from library
“practitioner.”)
Research questions addressed in this study include:
1. With regard to the key methodological variables, what do the data show?
2. What is the quality of recent research study reports?
3. In terms of methodology, has there been any change in patterns of use?
4. Have the predictions of earlier reviewers regarding methodological trends been confirmed?
5. How do practitioner and academic researchers compare on methods used and report
quality?
6. What, if any, methodological aspect of this research literature most distinguishes
practitioner research and academic research?
7. What significant categorical associations might emerge from these data (e.g., authorship
and type or method of research)?
FINDINGS
Research-oriented Journals
Appendix A identifies those journals that have yielded the greatest number of research articles
for this study. Six of the 23 journals account for more than one-half of the research articles
initially reviewed for inclusion in the study. They are led by the Journal of Academic
Librarianship (69), the Journal of Documentation (48), and Library & Information Science
Research (33). On a per issue basis, the leading publishers of research articles among this
group of 23 journals are the Canadian Journal of Information & Library Science, the Journal
of Academic Librarianship, Journal of Documentation, Library & Information Science
Research, Library Resources and Technical Services, and Research Strategies. Findings on
the authorship variable will be presented next, but we have ascertained that six journals in our
purposive sample are predominantly academic scholar in authorship, and ten journals are
predominantly practitioner in authorship. Taken together, 82.4% of research articles selected
from the academic journals were authored only by academic scholars (i.e., non-practitioners).
Of the articles selected from the ten journals classified as practitioner journals, 74% were
authored only by practitioners. Contrary to our expectations, academic scholar
journals did not yield more of the research articles for this study than the practitioner journals.
Three of the top publishers of research articles are predominantly practitioner journals, two
are predominantly academic scholar journals, and one has a nearly equal amount of
practitioner-authored articles, academic scholar articles, and mixed authorship articles (at
least one practitioner and one academic). Additional findings from a comparative analysis of
practitioner journals and academic journals will be published in a separate paper.
Authorship
Three factors related to the research article authorship variable were examined: number of
authors per article; relative frequencies of articles having only academic authors, those having
only practitioner authors, and those with mixed authorship; and frequency of each type of
author by journal. While the range of authors per article was 1-7, the mean number of authors
per article was 1.86 and the median was 2. There was very little mixed authorship found in this
sample of 206 research articles (20, 9.71%). As indicated previously, six of the 23 journals are
predominantly academic scholar in authorship, and ten are predominantly practitioner in
authorship. None of the 23 journals yielded a significant amount of mixed authorship articles.
Topic Areas
The primary focus of this study is on research methodology, not topic coverage. We did use a
14-item list of topics to record the topics of research in each article (See Table 1, included
after the appendices). This proved difficult for a number of reasons. Some articles were
encoded as having three or more topics, and it was not always clear which topic was primary,
which secondary, and so forth. Our taxonomy needs to be expanded and refined to represent
the many topics being addressed in library research. In light of these shortcomings, we
collapsed our 14-item list to four categories (library operations and services, use and user
studies, LIS education, and miscellaneous others). This short list of topic areas is used for
preliminary analysis of associations of method variables with topic areas researched (See
Table 2). The most surprising topical finding is the small amount of research having anything
to do with LIS education (9, 2.39%). Practitioners more frequently investigate library
operations, collections and services, and academics more frequently conduct use and user
studies (this difference is statistically significant, even when including mixed authorship:
χ²=17.454, df=6, p=0.008, See Table 3). Both groups also investigate many additional topics.
Study Location and Setting
The study “location” in each article was coded as “specific site” or “site-independent.” The
former refers to “in-house” studies and the latter was used to record studies that investigated
phenomena of interest at more than a single institutional site and studies that investigated
phenomena that are truly site-independent, such as Web search engine use or performance.
These site-independent studies account for 59.71% of the total (See Table 4). Academic
scholars conduct significantly more site-independent studies than practitioners (χ²=5.570,
df=1, p=0.019). The study domain variable “setting” was encoded as university, public
library, special library, and other. The majority of the studies were coded as “university” (115,
55.83%). A significant number of the studies (55, 26.7%) were not library domain studies
(See Table 5). We were intrigued to discover that academic scholars conducted most of the
research activity in non-university library settings. On the other hand, practitioners conducted
nearly two-thirds of the research set in the university environment (73 cases, 63.5%). Adding
studies with mixed authorship raises this total to 81 cases (70.4%).
Methodology
We now turn to the primary focus of this study, methodology. Five variables were identified
for this analysis: type of research by aim or strategy, sampling procedure and sample drawn,
data collection method, data analysis approach, and statistical analysis employed.
Type of Research
Several classifications of research types have become familiar to readers of the research
literature. The best known is the classification of research as “basic or pure,” “applied,” and
“action research.” A simpler approach distinguishes “primary” research from “secondary”
research. More recently there has been the thought-provoking debate about the virtues of
“qualitative” research and “quantitative” research. We will use this distinction in our
characterization of the data analysis approaches used in research studies. We find most useful
and discriminating the classification of research types described by Schutt in his book titled,
Investigating the Social World: The Process and Practice of Research [17]. Schutt describes
four types of research, based on the overarching aim, purpose, and intentions of the researcher
evident in the conceptualization and design of the research. The four types of research are
descriptive, exploratory, explanatory, and evaluative:
Descriptive research aims to discover and document the “facts,” the status quo. The data
acquired is usually quantitative data, such as counts, percentages and ratios, data from which
means and medians can be calculated. Systematic observations on a well-defined and
measurable variable and survey questionnaires are typical methods used.
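Using only the standard library, the kind of summary data typical of descriptive research (counts, means, medians, percentages) can be computed as in this small sketch; the variable and its values are invented for illustration.

```python
import statistics

# Illustrative descriptive summary of a hypothetical variable, e.g.
# number of authors per article in a small sample (values invented):
authors_per_article = [1, 1, 2, 2, 2, 3, 1, 4, 2, 1]

print("n =", len(authors_per_article))
print("mean =", statistics.mean(authors_per_article))
print("median =", statistics.median(authors_per_article))

# Percentage of cases with more than one author:
share_multi = sum(a > 1 for a in authors_per_article) / len(authors_per_article)
print(f"share with multiple authors = {share_multi:.0%}")
```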
Exploratory research aims to “find out what’s going on” in a specific situation by immersion
in that situation, prior to and perhaps in place of taking any measurements of designated
variables. The researcher seeks to see “reality” through the subjects’ eyes, seeking meaning
as the subjects themselves understand it. Less structured than the other types of research,
exploratory studies typically yield qualitative data, such as textual or verbal data. Methods
include participant
observation, open-ended interviews and focus groups. Exploratory research usually employs
qualitative research methods, and qualitative methods may also be used in other types of
studies.
Explanatory research, often considered in the past the paramount type of research,
aims to answer the “Why?” question by discovering and establishing
(confirming) the actual cause of some phenomenon or state of affairs. This research typically
tests hypotheses through the application of rigorous protocols for observing and measuring
interactions between variables. The data is quantitative and is analyzed using inferential
statistics. Experiments, clinical trials, and even descriptive surveys are used in explanatory
research.
Evaluative research is a kind of “action” research that brings science and scientific methods
to the evaluation of specific programs, policies, or techniques (e.g., outreach programs, web
sites). This evaluation may be “formative” (used as a program is being implemented) or
“summative” (used after a program or policy has been in place for a period of time, to assess
its impact and outcomes). Both quantitative and qualitative research and data may be used in
such studies.
Each study in our review was coded as one or more of these four types of research. Table 6
shows the number of studies in each category and separately reports the number of studies
whose aims reflect two or more research approaches (78, 37.9%). Most of the studies are
descriptive in aim and strategy. One-third of the studies are exploratory, either solely or as
one of several aims reflected in the design of the research. Few studies (12.62%) are
explanatory. We anticipated a larger percentage of evaluative studies, especially by
practitioners, as they are the ones closest to programs and policies that often beg for
objective assessments of impacts and outcomes.
Practitioners conducted only slightly more evaluative research than academics.
Practitioners conducted a larger percentage of the descriptive research than academics, and
this difference is significant (χ²=4.836, df=1, p=0.028). Academics conducted significantly
more exploratory research than practitioners (χ²=3.881, df=1, p=0.049). There was too little
explanatory research to determine significant relationships between authors of each type. The
same must be said for evaluative research. 37.9 percent of the studies reflect two or more
research approaches. The authorship of these multi-purpose studies is nearly equally
practitioner and academic.
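The practitioner-versus-academic comparisons above are chi-square tests of independence on 2×2 author-by-research-type tables (df=1). As an illustration of the computation only, here is a minimal Python sketch; the helper function and the observed cell counts are our own hypothetical placeholders, not the study's actual frequencies.

```python
# Chi-square test of independence for a 2x2 contingency table (df = 1),
# computed from first principles. The observed counts below are
# hypothetical placeholders, not the study's actual cell frequencies.

def chi_square_2x2(table):
    """Return the chi-square statistic for a 2x2 table of counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            # Expected count under the independence hypothesis
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Rows: practitioner, academic; columns: descriptive, not descriptive
observed = [[80, 17], [60, 29]]
print(round(chi_square_2x2(observed), 3))
```

The resulting statistic is compared against the chi-square distribution with one degree of freedom to obtain a p-value; in practice a library routine such as `scipy.stats.chi2_contingency` performs both steps.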
Samples and Data Collection Methods
In addition to recording methods used for data collection, we looked carefully at the
descriptions of study participants and samples used in these studies. If samples were used we
tried to determine if the sample was a randomized, probability sample, or a non-probability
sample (e.g., purposive, convenience, quota, etc.). Most authors described the participants and
samples used in their studies. From these descriptions we were able to determine that only
17.7% (31) of the studies that used samples used randomized sampling procedures to obtain
probability samples. There was little difference between academics and practitioners on the
use of randomized samples (18.31% and 16.5%).
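The distinction at issue is easy to make concrete: a randomized, probability sample gives every member of the sampling frame an equal, known chance of selection. A minimal Python sketch, using a hypothetical frame of 401 article IDs that mirrors the proportions of this study's own article selection:

```python
import random

# Hypothetical sampling frame: IDs for 401 mined articles.
frame = list(range(1, 402))

random.seed(42)  # fixed seed so the draw is reproducible
# Simple random sample without replacement: each article has an
# equal probability of selection, unlike a convenience or
# purposive sample.
sample = random.sample(frame, 206)

print(len(sample), len(set(sample)))
```

Because `random.sample` draws without replacement, all 206 selected IDs are distinct, and findings from the sample can be generalized to the frame with known sampling error.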
Eleven data collection methods were identified in these studies (see Table 7). Many of the
studies employed more than a single data collection method, for example, survey
questionnaires and focus group interviews (multiple methods frequency = 84, 40.8%).
Practitioner researchers appear to prefer survey questionnaires, observational methods, and, to
a lesser degree, content analysis. Academic researchers use survey questionnaires in one-quarter
of their studies. More frequently than practitioners, academics use content analysis,
structured and semi-structured interviews, and bibliometric methods. Academic researchers
employed multiple data collection methods significantly more often than practitioners
(χ²=5.183, df=1, p=0.023).
Data Analysis Method, Quantitative or Qualitative
The research studies in our sample were coded as quantitative, qualitative, or “triangulated”
(used both methods) with regard to the type of data analysis used. Quantitative analysis is
used when the data collected is categorical (counts) or some other form of numerical data.
Qualitative analysis is used when the data collected is verbal, pictorial, or textual. Following
Swygart-Hobaugh [18], we coded a study as quantitative if it used descriptive statistics
(frequency counts, percentages, means, etc.) or inferential statistics in tests of significance
(chi square tests, regression analysis, ANOVA, etc). We coded studies as qualitative if they
employed, for example, participant observation, in-depth interviews, focus groups, or content
analysis. These methods typically yield verbal and textual data. Table 8 displays the
frequencies for each type of data analysis. Unmixed, qualitative-only studies make up only
16.99 percent of the studies. However, an additional 29.61 percent of the studies included
some qualitative data analysis, and this brings the total percentage of studies employing
qualitative analysis to nearly 47 percent. Academic researchers conduct more qualitative
research than practitioners, but the difference is not statistically significant.
Statistical Analysis Techniques Used
Statistical analyses employed in these studies were encoded as descriptive, correlational,
inferential (including regression analysis), or some combination of these techniques. Table 9
displays the frequencies of these types of statistical analyses in the studies reviewed. Studies
having no statistical analyses were excluded from these counts. The frequent use of
descriptive statistics was anticipated. The low level of chi square and correlation coefficient
analysis was not anticipated. The data for conducting such analyses was often present.
Academic researchers did conduct more of this type of statistical analysis (χ²=9.558, df=1,
p=0.002), and their studies more frequently employed two or more types of statistical analysis
(χ²=9.180, df=1, p=0.003).
Research Report Variables
Each of the published research reports in this study was analyzed and evaluated for content
and presentation. Ten factors were identified for this purpose (See Table 10). With the
exception of “Visual Data Presentation,” these are binary (Yes/No) observations. The factor
was present or not. To avoid subjective assessments in the initial analysis, we did not make
judgements as to quality of description. One possible exception to this could be the first
factor, “Is the report well-organized with appropriate sections easily identified?” The quality
of the presentations varied greatly. If anything, we erred in the direction of being too liberal in
our assessments of content presentation. The data seem to support this. Almost all reports
were judged “Yes” on the first six report factors. Some research questions and aims were
present but poorly stated, and many literature reviews were token attempts to cite something
relevant. In some cases, descriptions of samples and data collection procedures were not
sufficient to support replication of the studies. However, if the factor was described, it was
marked as “Yes” for present. A future study could use a more refined assessment scale that
would be applied to published reports by a panel of expert judges. This would improve the
validity of such assessment attempts. Having said this, our humble opinion is that research
reports in librarianship have improved considerably in recent years. Improvements have been
made in content, organization, and readability. Our data do indicate there is room for
improvement in report organization on the part of practitioners. There has been noticeable
improvement in the discussions of data-supported findings. Especially evident is the increased
use of visual data formats such as tables, figures, and charts. Twenty to thirty years ago, it
was rare to encounter such visual displays of data.
We turn now to three report factors we group together in a category we call the “Humility
Factor.” This characteristic is sadly absent in a majority of research reports. As Table 10
indicates, not even one-third of the reports addressed the critical issues of validity and
reliability. Furthermore, nearly two-thirds of the published reports made no mention of study
limitations or shortcomings. Recommendations for needed, pertinent research on the topic
investigated are found in only slightly more than one-half of the reports. Comparing
practitioners and academics on the Humility Factor, we found no difference. Editors and
reviewers must share some of the responsibility for this.
DISCUSSION
From our sample of 206 articles from 23 LIS journals, it is clear that there is a considerable
amount of systematic, empirical research being conducted by practitioners. 97 (47.1%) of the
articles are practitioner-only in authorship, 89 (43.2%) are by academics and 20 (9.71%) have
mixed authorship. The data indicate significant collaboration among colleagues in the same
profession, but collaboration between academics and practitioners is rare. Only 9.71% of the
studies had mixed-type authorship. This low level of collaboration is also reflected at the
journal level. Journals tend to be predominantly practitioner or predominantly academic.
Perhaps the recommendations of Van House, that practitioners publish in practitioner
journals, have borne fruit. Measured by having a large majority of authors of one type or
another, six journals can be classified as academic in authorship and ten journals can be
classified as practitioner in authorship. We have indicated these journal types in Appendix A.
The six top journals that produce the greatest number of research articles are one-half
practitioner and one-half academic. These findings are consistent with the findings of the
Schloegl and Petschnig study of 38 LIS journals. Based on a survey of editors, they found that
authorship distributes nearly equally between academics (‘scientists’) and practitioners. They
conclude that “a clear distinction can be made between journals with predominantly
publishing scholars and those with a higher share of writing practitioners.”[19] Our study
found no difference between academic journals and practitioner journals in the percentage of
published articles having mixed authorship.
Many and diverse topics are addressed in these research studies. It seems no economical
taxonomy of research topics can adequately document the variety of topics being addressed.
A list of unique topics addressed in our sample studies would exceed 25 in number. Our
“other” topics category is well-populated. New topics appear (e.g., Web searching or
usability, chat reference, digital repositories), and interest in earlier topics wanes (library
history, integrated library systems). When topics are collapsed into the four general areas,
library operations and services, use and user studies, LIS education, and other, the data show a
significant difference in topic areas researched by practitioners and academics. Practitioners
more frequently investigate aspects of library operations, collections and services. Academics
more frequently investigate use and user problems and issues. This difference is significant
(χ²=10.835, df=1, p=0.001). With regard to location and setting of research, academics
conduct most of the public library-related research and practitioners conduct nearly all of the
academic library-related research.
We believe we have contributed to the literature on library research methods by making
distinctions between research type or approach (based on intention, purpose or strategy), data
collection methods, and types of data analysis. Studies may be exploratory in aim, but
include both qualitative and quantitative data collection and analysis methods. Studies may be
descriptive in aim, yet include qualitative elements. We believe the type of research to pursue
depends as much on the intention and purpose of the researcher as on the nature of the data
considered relevant for this pursuit. It is often said that the data should dictate the method to
be used. We say that the type of research the researcher wishes to pursue to satisfy a particular
research objective (exploratory, descriptive, explanatory, evaluative) as seen by that
researcher, should determine the choice of data collection method and type of data analyses.
In effect, we are saying the researcher is primary. We have found that most exploratory
research is conducted by academics. Exploring reasons why practitioners conduct little
exploratory research is a much-needed research study.
When samples were described in these studies, we found two disturbing trends. Only 31
(17.7%) reported the use of randomized, probability samples. Without such samples, no
claims can be made about the representativeness of the sample or application of findings to
the larger population. Thus, most library-related research has no external validity and the
findings can only be viewed as preliminary and tentative. Follow-up studies that address these
limitations are simply not found in the literature. Academics fare no better on this factor
than practitioners. In addition, reported survey response rates were very low (seldom more
than 30%), and rarely was there any analysis of the constituents of the non-response group
and how this might affect the value of the sample findings. Join these findings with the
infrequent discussions of validity issues and study limitations, and we have identified a major
area of concern and needed improvement in library-related studies.
Survey questionnaires and interviews continue to be the most used methods for data collection
(38.3%), followed in frequency by content analysis (20.4%) and semi-structured interviews,
including focus groups (10.2%). A variety of methods were used in the remaining studies.
Practitioners employed proportionally more survey questionnaires and observational
techniques than academic researchers. The latter employed proportionally more survey
interviews, semi-structured interviews, and content analyses than practitioners. However,
none of these differences were statistically significant. Less than one-half of the studies
employed two or more data collection techniques (40.8%). Academic researchers used
multiple methods significantly more often than practitioners (χ²=5.183, df=1, p=0.023).
We characterized the type of data analysis used in the studies as quantitative, qualitative, or
both. Only 17 percent of the studies were solely qualitative, but an additional 30 percent
combined qualitative and quantitative methods. Thus, nearly one-half of the studies had a
qualitative component. As for the use of statistical techniques, academics employ
correlational and combined techniques far more often than practitioners (χ²=11.28, df=2,
p=0.004 and χ²=12.11, df=2, p=0.003).
With regard to the quality of published reports, compared to 10-20 years earlier, there has
been undeniable progress. Journal editors and reviewers deserve a large amount of the credit
for these improvements in the published accounts. The majority of the reports are detailed,
comprehensive, and well-organized. The greater use of tables and graphic displays is
welcome. The news is not all good. Discussions of validity and reliability, study limitations,
and future research are unacceptably low. More attention must be given to these factors by
researchers, reviewers, and editors.
CONCLUSION
We have identified significant differences between practitioner research and academic
research. What most distinguishes these researchers and their research studies? As it turns out,
not a great deal. There is little difference in the quality and organization of their published
reports. With regard to topics investigated, practitioners conduct more library-specific studies
and academics conduct more use and user studies. This topical gap is not wide, and there is
cross-over, if little collaboration between academics and practitioners. A closer look at the
frequencies for “information seeking” and “database searching” combined reveals that
academics account for 59.1 percent of these studies, practitioners account for 27.3 percent,
and the remainder (13.64 percent) have mixed authorship. Academics appear to be more
sophisticated in the use of methods of data collection and analysis, but this gap may be
closing. At this time, academics conduct more qualitative studies than practitioners, and more
often use multiple methods of data collection and statistical analysis.
Nothing in our review of the 2003-2005 research literature brings into question earlier
predictions by Fidel, Powell, and others. Our review confirms that qualitative research
methods are being used to investigate a variety of research topics. However, the use of
qualitative methods may have leveled off, and the low use of qualitative methods by
practitioners should be a concern of LIS educators and researchers. Whatever type of library
or information center they are employed in, practitioners are closest to the “lebenswelt,” the
world as perceived and engaged in by those who we so often study. Practitioners, to a greater
or lesser degree, are participants in this world, yet participant observation and other
qualitative methods have a low use in practitioner studies. Do we know why this is the case?
Can it be attributed to lack of time, or lack of training in qualitative methods? Would there be
more qualitative research conducted if there were more collaboration between academics and
practitioners? How can such collaboration be encouraged and supported? Who should take the
lead? It is precisely this kind of collaboration that is encouraged by Simon and Schlichting in
their report on a successful collaboration between public librarians and university professors
in the design and administration of a user survey.[20] This case is also interesting because the
professors who contributed research expertise to the project were from journalism and
sociology departments, not LIS faculties.
We suspect that every published researcher suffers from the “nagging doubts syndrome” at
the close of a study. Have we missed anything? Did we have a good sample for our purposes?
Are our calculations accurate? Should we have approached this question in an entirely
different manner? Such reflections are an essential component of the research adventure. We
would like to see more evidence of such reflections in the library research literature. We are
aware of the limitations of our study. The selection of journals for our sample frame was
purposive, as our aim was to include several top academic journals and a broad selection of
practitioner journals. Another set of journals might well yield a different distribution of
research articles. Furthermore, our 51.4 percent sample of articles (206 of 401), although
selected randomly, might not be representative of the whole group. We believe there is a high
probability that they are representative of the research published in these 23 journals during
this period. We do have doubts about our list of data collection methods. This group emerged
inductively from the reports, but our named categories may benefit from greater delineation.
This might be an appropriate project for a Delphi study by methods experts. The problem is
similar but more complex with regard to research topic naming and classification. This is a
slippery, ever-changing terrain. LIS researchers are branching out, topically speaking, into
many new domains. We refer this task to the knowledge organization experts in our
profession. In the meantime we think it is best to document the topics as they appear in the
literature.
Several areas for future research have been mentioned in this report. We will not attempt to
prioritize them, but a few tug hardest on our coat-tails. We would like to see more qualitative,
exploratory research on practitioner research (or lack thereof) to investigate barriers to
qualitative research in this domain. We would like to see more qualitative, exploratory
research on the low level of collaboration between academic scholars and library
practitioners. We know that it is low, but we do not know why it is low and how it might be
increased. Are there meaningful comparisons to be discovered in the health sciences and
professions? Additional problem areas that could benefit from qualitative research need to be
identified. Lastly, would it not be interesting to learn of any developments in LIS graduate
curricula in response to study findings like those reported here?
REFERENCES
1. Hildreth, Charles R. "How Are They Going About It? A Comparison of Research Methods
Used by LIS Academic and Practitioner Researchers." Peer-reviewed and juried paper
presented on January 22, 2003 at annual meeting of the Association for Library and
Information Science Education (ALISE), January 21-24, 2003, Philadelphia, PA. (Available
at: http://myweb.cwpost.liu.edu/childret/practitioner-res.ppt)
2. Powell, R. R. & Connaway, L.S. (2004). Basic research methods for librarians. 4th ed.
Libraries Unlimited, Westport, Connecticut, and London.
3. Hernon, P. & Schwartz, C. (1999). Editorial: LIS Research – Multiple Stakeholders.
Library & Information Science Research. 21(4), 423-427.
4. Hernon, P. & Schwartz, 423.
5. Ennis, P. H. (1967). Commitment to Research. Wilson Library Bulletin, 41, 899-901.
6. Grotzinger, L. (1981). Methodology of Library Science Inquiry – Past and Present. In
C.Busha (Ed.), A Library Science Research Reader and Bibliographic Guide (pp.38-50).
Littleton, CO: Libraries Unlimited.
7. Busha, C. H. & Harter, S. P. (1980). Research Methods in Librarianship: Techniques &
Interpretation. Academic Press, Inc.
8. Van House, N. A. (1991). Assessing the Quantity, Quality, and Impact of LIS Research.
Library & Information Science Research, Norwood, NJ: Ablex, 85-100.
9. Fidel, R. (1993). Qualitative Methods in Information Retrieval Research. Library &
Information Science Research. 15, 219-247.
10. Fidel, R., (1993) 233.
11. Powell, R. R. (1999). Recent Trends in Research: A Methodological Essay. Library &
Information Science Research, 21(1), 91-119.
12. McKechnie, L. E.F. & Baker, L. & Greenwood, M. & Julien, H. (2002). Research Method
Trends in Human Information Behaviour Literature. Paper presented at: ISIC-2002:
Information Seeking in Context. The Fourth International Conference on Information Needs,
Seeking and Use in Different Contexts. Lisbon, Portugal, September 11 – 13.
13. Watson-Boone, R. (2000). Academic librarians as practitioner-researchers. The Journal of
Academic Librarianship. 26(2), 85-93.
14. Koufogiannakis, D. & Slater, L. & Crumley, E. (2004). A content analysis of librarianship
research. Journal of Information Science. 30(3), 227-239.
15. Hildreth, C. (2003).
16. Hernon, P. (1991). The Elusive Nature of Research in LIS. In C. McClure and P. Hernon
(Eds.), Library and Information Science Research: Perspectives and Strategies for
Improvement (pp.3-14). Norwood, NJ: Ablex.
17. Schutt, R. (2006). Investigating the Social World: The Process and Practice of Research,
5th ed. (13-17). Sage Publications.
18. Swygart-Hobaugh, A. J. (2004). A Citation Analysis of the Quantitative/Qualitative
Methods Debate’s Reflection in Sociology Research: Implications for Library Collection
Development. Library Collections, Acquisitions, & Technical Services, 28, 180-195.
19. Schloegl, C. & Petschnig, W. (2005). Library and Information Science Journals: An
Editor Survey. Library Collections, Acquisitions, & Technical Services, 29, 4-32.
20. Simon, J. & Schlichting, K. (2003). The College Connection: Using Academic Support to
Conduct Public Library Surveys. Public Libraries, November/December, 375-378.
APPENDIX A:
Journal Titles for 2003-2005 Study of Research Methods:
TITLE                                                  ARTICLES MINED   ARTICLES SELECTED
Behavioral & Social Sciences Librarian @                      7                 4
Canadian Journal of Information & Library Science @          18                 9
Cataloging & Classification Quarterly                        16                 8
Collection Management                                         6                 3
College & Research Libraries                                  4                 2
Information Technology and Libraries %                       21                11
Journal of Academic Librarianship %                          69                35
Journal of Documentation @                                   48                24
Journal of Librarianship & Information Science               21                11
Journal of Library Administration %                          10                 5
Library & Information Science Research @                     33                17
Library Collections, Acquisitions & Tech. Services %         18                 9
Library Resources and Technical Services %                   23                12
Library Trends @                                             16                 8
Public Libraries                                             10                 5
Public Library Quarterly @                                   12                 6
Public Services Quarterly %                                   3                 3
Reference & User Services Quarterly                          17                 9
Reference Librarian                                           8                 4
Reference Services Review %                                  12                 6
Research Strategies %                                         8                 4
Serials Review %                                             10                 5
Technical Services Quarterly %                               11                 6
TOTALS                                                      401               206
Journals in bold type had two or more research articles per issue.
@ - Predominantly academic journals
% - Predominantly practitioner journals
APPENDIX B:
Study and Report Variables
STUDY VARIABLES
1. Journal Title
2. Article Year of Publication
3. Number of Authors
4. Practitioner or Academic Author
5. Topic-1
6. Topic-2
7. Location of Study (specific site or site-independent)
8. Setting/Context of Research (Academic, Public, Special Lib., or Misc.)
Research Approach:
9. Descriptive
10. Explanatory
11. Exploratory
12. Evaluative
13. Multiple/Combined Approaches
Sample & Research Methods:
14. Sample Type
15. Method-1
16. Method-2
17. Method-3
18. Multiple Methods Used (2 or more)
Data Analysis Type:
19. Quantitative
20. Qualitative
21. Both
Statistical Analysis Employed:
22. Descriptive
23. Correlation (variable associations)
24. Inferential (significance tests, linear regression)
25. Multiple Analyses Used
26. Not Applicable to this study
REPORT VARIABLES
27. Well-organized?
28. Research Question/Problem/Aim Stated at Beginning?
29. Literature Review?
30. Data Collection Explained?
31. Data-supported Presentation of Results?
32. Visual Presentation of Data Findings?
33. Validity/Reliability Issues Addressed?
34. Limitations of Study Mentioned?
35. Future Research Recommendations Discussed?
Table 1.
Research Topics

TOPIC                                   Frequency
Admin/Mgmt/Ops                          36 (9.55)
Systems/IT                              33 (8.75)
Collections                             40 (10.61)
Services                                45 (11.94)
Users (needs, preferences, behavior)    49 (13.00)
Info. Seeking/use                       42 (11.14)
Database searching/IR                   24 (6.37)
Catalogs/indexes, etc.                  26 (6.90)
Publishing                              17 (4.51)
Citation Analysis                       14 (3.71)
Cooperation/Consortia                    4 (1.06)
LIS Education                            9 (2.39)
User Education/Info. Literacy           24 (6.37)
Other                                   14 (3.71)
Table 2.
Topic Categories

CATEGORY                  Frequency
Library Ops. & Services   158 (41.91)
Use & User Studies        139 (36.87)
LIS Education               9 (2.39)
Other                      71 (18.83)
Table 3.
Contingency Table: Topic Category by Author Type
(cell entries: frequency, with row percentage in parentheses; TOTAL column shows
percentage of grand total)

              Lib. Ops.    Use/User     LIS Educ.   Other        TOTAL
Mixed AU       13 (35.1)    19 (51.4)    0 (0.0)     5 (13.5)     37 (9.8)
Academic       54 (32.9)    69 (42.1)    6 (3.7)    35 (21.3)    164 (43.5)
Practitioner   91 (51.7)    51 (29.0)    3 (1.7)    31 (17.6)    176 (46.7)
TOTAL         158 (41.9)   139 (36.9)    9 (2.4)    71 (18.8)    377 (100.0)

Statistic     DF    Value     p-value
Chi-Square     6    17.454    0.008
Table 4.
Research Location Type Frequencies

Specific Site       83 (40.29)
Site-independent   123 (59.71)
Table 5.
Research Setting Type Frequencies

University        115 (55.83)
Public Library     28 (13.59)
Special Library     8 (3.88)
Misc. other        55 (26.70)
Table 6.
Research Type Frequencies

Descriptive   158 (76.70)
Explanatory    26 (12.62)
Exploratory    68 (33.01)
Evaluative     36 (17.48)
Multiple       78 (37.86)
Table 7.
Data Collection Method Frequencies By Authorship

Data Collection              Mixed        Academic     Practitioner
Method                       Authorship   Author(s)    Authors        TOTAL
Survey Questionnaire         12 (12.6)*   35 (36.8)    48 (50.5)        95
Survey Interview              4 (16.0)    14 (56.0)     7 (28.0)        25
Observation                   0 (0.0)      7 (38.9)    11 (61.1)        18
Semi-structured Interview     4 (12.5)    16 (50.0)    12 (37.5)        32
Experiment                    4 (21.1)     8 (42.1)     7 (36.8)        19
Computer Log Analysis         4 (21.1)     8 (42.1)     7 (36.8)        19
Bibliometric/Citation Anal.   1 (5.6)     11 (61.1)     6 (33.3)        18
Content Analysis              4 (6.3)     35 (54.7)    25 (39.1)        64
Case Study                    1 (10.0)     4 (40.0)     5 (50.0)        10
Delphi                        0 (0.0)      0 (0.0)      1 (100.0)        1
Others                        0 (0.0)      6 (46.2)     7 (53.8)        13
TOTAL                        34           144          136             314
*Row Percentages
Table 8.
Type of Data Analysis Frequencies

Quantitative   110 (53.40)
Qualitative     35 (16.99)
Combined        61 (29.61)
Table 9.
Statistical Analysis Frequencies

Descriptive      171 (83.01)
Correlational     44 (21.36)
Inferential       32 (15.53)
Combination       57 (27.67)
Not Applicable    36 (17.48)
Table 10.
Study Report Variable Frequencies

Variable                   Yes            No
Well-organized             162 (78.64)     44 (21.36)
Research Quest./Aim        183 (88.83)     23 (11.17)
Literature Review          172 (83.50)     34 (16.50)
Participants/Sample        183 (88.83)     23 (11.17)
Data Collection            199 (96.60)      7 (3.40)
Data-supported Findings    191 (92.72)     15 (7.28)
Validity or Reliability     56 (27.18)    150 (72.82)
Study Limitations           72 (34.95)    134 (65.05)
Future Research            108 (52.43)     98 (47.57)
Visual Data – Tabular       92 (44.66)
Visual Data – Graphic       23 (11.17)
Visual Data – Both          48 (23.30)
Visual Data – None          16 (7.77)
Visual Data – NA            27 (13.11)