Scientific Integrity and Careful Data Management, EUR, May 21st

Teaching Research Ethics or Learning in Practice?
Preventing Fraud in Science
DIES NATALIS LECTURE ISS
THE HAGUE, 9 OCTOBER 2014
KEES SCHUYT, PHD, LL.M.
SOCIOLOGY PROFESSOR EMERITUS, UNIVERSITY OF AMSTERDAM;
CHAIR, NATIONAL OFFICE OF RESEARCH INTEGRITY (2006-2015)
Two phenomena, five topics
• Scientific integrity (what it is and isn’t)
• Data management (good and bad practices)
Five topics:
1. What do we want to prevent?
2. Good and bad practices
3. Why does it happen? - Tentative explanations
4. What is to be done? - Rules or principles
5. Educating, learning, mentoring
1. What do we want to prevent?
• History of fraud in science (Baltimore case (1986-1996) as turning point; US Office of Research Integrity, 1994)
• Broad and Wade (1983); Van Kolfschooten (1996, 2012); Grant (2008)
• Levelt report on the Stapel case (2011/2012)
• What can we learn from incidents (outliers)? (teamwork; the system is not watertight: good data management)
Scientific integrity
• Integrity is a self-chosen commitment to professional values (B. Williams 1973)
• Resnik: “striving to follow the highest standards of evidence and reasoning in the quest to obtain knowledge and to avoid ignorance” (The Ethics of Science, 1998)
• Integrity is context-bound, e.g. fabulation in novels vs fabulation in science; leading values in science (Merton 1949)
• Codes of Conduct: NL 2005/2012; ESF 2010
Violations
Violations of the game rules of science:
FFP:
• fabrication or fabulation
• falsification
• plagiarism
Difference between F and P?
2. Good and bad practices
• Questionable research practices (trimming, cooking, pimping, sloppiness, careless data management, not archiving)
• Drawing the line (raw data, co-authorship, impolite behaviour)
Trimming and cooking (Babbage 1830)
• Trimming: “consists of clipping off little bits here and there from those observations which differ most in excess of the mean, and in sticking them on to those which are too small”
• Cooking: “to give ordinary observations the appearance and character of those of the highest degree of accuracy. One of its numerous processes is to make multitudes of observations, and out of these to select only those which agree, or very nearly agree”
Metaphorically: “if a hundred observations are made, the cook must be very unlucky if he cannot pick out fifteen or twenty which will do for serving up” (Charles Babbage, Reflections on the Decline of Science in England and Some of Its Causes, 1830; 1989 edition edited by Hyman)
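As a minimal illustrative sketch (not from the lecture) of why this matters: keeping only the fifteen or twenty observations that “agree” shifts the estimate toward a hoped-for value and makes it look far more precise than the full data warrant.

```python
# Illustrative only: simulate Babbage's "cooking" by keeping just the
# observations that agree with a hoped-for value.
import random
import statistics

random.seed(1)

true_value = 10.0
observations = [random.gauss(true_value, 2.0) for _ in range(100)]

# Honest report: all 100 observations.
print("honest mean %.2f, sd %.2f, n=%d" % (
    statistics.mean(observations), statistics.stdev(observations), len(observations)))

# "Cooked" report: the 20 observations closest to the hoped-for value.
hoped_for = 11.0
cooked = sorted(observations, key=lambda x: abs(x - hoped_for))[:20]
print("cooked mean %.2f, sd %.2f, n=%d" % (
    statistics.mean(cooked), statistics.stdev(cooked), len(cooked)))
```

The cooked result looks tighter and closer to the hoped-for value even though the underlying data are identical; only archiving the full set of raw observations makes the selection visible.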
Four main distinctions:
• honest vs dishonest, fraudulent
• good vs bad practices
• controversies vs dishonest research
• game rules vs goal rules
Data management
The scientific research cycle:
• Three strong control points: grants, peer review, the scientific community
• Two weak points: the primary research process and data archiving (see the sketch below)
• Wide variation between disciplines: is everything okay?
• Bad to good practices: working alone vs teamwork
• Scale of research: international data gathering; protocols
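As one concrete form of the data-archiving discipline meant here, a minimal sketch (an illustration added for this text, not the lecture’s prescription): record a checksum manifest of the raw data files at the end of the primary process, so later analyses can be checked against unaltered originals.

```python
# Illustrative sketch: freeze raw data files into a SHA-256 manifest so the
# archived primary data can later be verified as unaltered.
import hashlib
import json
from pathlib import Path

def build_manifest(raw_dir: str, manifest_path: str = "manifest.json") -> dict:
    """Write {relative file path: sha256 hex digest} for every file under raw_dir."""
    root = Path(raw_dir)
    manifest = {
        str(path.relative_to(root)): hashlib.sha256(path.read_bytes()).hexdigest()
        for path in sorted(root.rglob("*")) if path.is_file()
    }
    Path(manifest_path).write_text(json.dumps(manifest, indent=2, sort_keys=True))
    return manifest

if __name__ == "__main__":
    # "raw_data" is an assumed example directory holding the original observations.
    build_manifest("raw_data")
```

Re-running the same hashing later and comparing against the stored manifest is enough to detect silently edited or missing raw files.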
Variations in data and in data gathering
• Experimental design data (lab)
• Stem cells, MRI scan data
• Mathematical data, logical analysis
• Survey data (paper and pencil)
• Public data (time series, economic data, population figures, official statistics)
• Historical data (archives)
• Anthropological field observations
• Simulations
3. Why does it happen?
• Three main explanations:
  o Publication pressure: from whom to whom?
  o Sloppy science
  o Pressure from contract research
• An alternative tentative explanatory scheme: misplaced ambition, loose mentoring, ignored early signals, poor peer review, no institutional response
Contract research
• What is the problem? Köbben 1995: scientific independence; pressure from above (Yes, Minister); conflicts of interest
• Research biases? Biomedical research; Roozendaal
• Patents, secrecy, firms’ data not public
• Remedies: “good fences make good neighbours” (R. Frost), applied to contracts
• Research codes, guidance committees, the highly prestigious research group (HPRG)
• Conclusion: be an HPRG: high integrity, high skills, independence
4. What is to be done?
• Learn from best practices across disciplines
• Peer pressure before peer review; a data manager and/or statistical counselling; open discussions to stay alert (not too often!)
• A scientific pledge or oath-taking!?
• Lowering publication pressure? (causality!)
• Teaching ethics in science, integrated into data-management courses
5. Educating, learning, mentoring
• The six-pack:
  a. learning rules, discussing ethics
  b. training research skills (e.g. advanced statistics, philosophy of science)
  c. good mentoring (becoming a good scientist)
  d. oath-taking (!?)
  e. online learning, the dilemma game
  f. reading Being a Scientist
• Select your own best combination
Gift to all PhD students:
Thank you very much indeed
for your attention