Quality of Research Links

A Self-Guided Tour for the Educational Practitioner
Produced by Dana Hall, OCDSB
Some Friendly First Thoughts
To copy from one is plagiarism… to copy
from many is research! --Comedian Steven Wright
We’ve all had to research something at one
point
…it wasn’t always fun
…sometimes it was hard to get started
... sometimes it seemed like there was too
much information
…but we all got through it and maybe even
learned a thing or two!
The difference between then and now is that
instead of trying to please a teacher or
professor, we want to improve something in our
classrooms and schools.
We want to make sure that the work we do for
“our kids” is of excellent quality--they deserve
no less!
Research Is
The primary goal of research is to establish theoretical
structures by means of which observable phenomena
can be described, explained, predicted or controlled.
 (Humphries, 2000)
Ultimately, all research (even purely theoretical
research) leads us to the point where we can improve
something. Even if we only read the research of others,
it is still assumed that improvement of something is
our goal.
For principals and teachers, research topics can
spring out of everyday events and routines.
The researcher has to shape the general topic
into a specific hypothesis that can be verified.
OK…I have a lot of ideas…where
do I start?
There are a few things that need to be
sorted through in the beginning stages:
Background Work
Generating a Specific Hypothesis
Developing a Method
The Research Project: An Overview
Planning the Educational Research Project
Formulate hypothesis
Observations of real events
Background Research
Methodological Review
Project Design
Carry Out Data Collection
Seek Verification of Hypothesis
Final Report
Validity Checks
Honest Interpretation
For Further Work
Remember the Science Lab?
A good research project is much like a good
science report.
It may begin with a moment of inspiration, a
question or curiosity, or there may be a specific
problem given to you to explore. Regardless,
the researcher has to shape the general topic
into a specific hypothesis that can be verified.
In order to create that specific hypothesis, the
researcher needs to do some background work.
Background work usually includes reviewing a
variety of literature
 a) to prevent repetition of already known facts;
 b) to build on the work of others; and
 c) to help the researcher develop an effective
methodology
Sources of Literature
Literature can be formal (empirical) or informal
(non-empirical).
Informal literature tends to be non-academic. As such,
it may need to be used judiciously, but this does not
prevent it from being valuable. Magazines and
newspaper articles, for example, often help establish
context. They may help the researcher clarify his or her
own thinking. Specific data cited in informal sources
may be questionable, however.
The Peer-Reviewed Journal is the hallmark of
formal literature.
Simply put, peer-reviewed literature has been
examined and scrutinized by other experts in the
field for validity, significance, originality and
clarity.
http://www.senseaboutscience.org.uk/PDF/ShortPeerReviewGuide.pdf
Contents of the next group of slides
Explaining validity, significance, originality and clarity
Segue: A more detailed look at Validity and Reliability
Why Validity and Reliability are Important
The Different Types of Research
Getting Back to Developing the Hypothesis
Validity means that the results are credible and the most
appropriate methodology was used. (see next slide for an
elaboration or skip if desired)
Significance: the work needs to be important. Some areas of research target
very small populations. That doesn’t mean the information isn’t
generalizable to other situations or can’t provide useful background
information for other projects. But some work is just plain
meaningless!
Originality: a researcher must give proper credit to others for their work. He
or she must be able to prove that his or her interpretations of
others’ work are accurate. Background research or replication of
data must be separated from the researcher’s own original work.
Clarity: the work presented must be clear enough that it will not be
misinterpreted.
Segue: Validity and Reliability
In statistics, there are two important concepts called validity and
reliability.
Prior to 1985, validity was understood to mean that the test actually
measured what it claimed to measure. This definition was the one taught
to many people who are currently employed in education, and it is still
somewhat useful. But in 1985, The American Psychological Association
(APA), The American Educational Research Association and the National
Council on Measurement in Education defined validity as “the
appropriateness, meaningfulness, and usefulness of the specific
inferences made from test scores…The inferences regarding specific uses
of a test are validated, not the test itself.” (in Crowl, 1996, p. 110).
In other words, the researcher creates a design for a specific group and
for a specific purpose. His or her results, if valid, permit inferences to be
made about that group. (in Crowl, 1996, p. 102).
Reliability means that the method used to measure
something (e.g., a test) is consistent.
 If you gave a group of people a test, erased their
memories and gave them the same test, they would
get the same results-- every time. (in Crowl, 1996, p. 102).
There are many ways to test for reliability.
Caution:
The novice researcher often creates his or her own
customized test. When conducting research, there is
the risk that this test will not provide a consistent
measure.
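The "erased memories, same test, same results" idea on the previous slide is, in practice, estimated statistically: give the same group the same test twice and correlate the two sets of scores (test-retest reliability). A minimal sketch in Python, using entirely hypothetical scores invented for illustration:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two score lists.
    Used here as a simple test-retest reliability estimate."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores for one group on two administrations of the same test
first  = [62, 75, 80, 55, 90, 68]
second = [60, 78, 82, 53, 91, 70]

# A value close to 1.0 suggests the test measures consistently;
# a low value is the novice researcher's warning sign
print(round(pearson_r(first, second), 3))
```

This is only one of the many ways to test for reliability mentioned above (others, such as internal-consistency measures, look at a single administration), but it makes the "consistency" idea concrete.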
Why is this important?
Consider a math test that has a lot of word
problems. Under certain circumstances, this test
may not be a valid means for assessing math
skills, particularly among students with language
difficulties. The test itself might be reliable,
however. (in Crowl, 1996, pp. 102-03).
Any test that allows the researcher to draw
appropriate inferences (i.e., it is valid) must
logically be consistent, and therefore reliable.
(in Crowl, 1996, pp. 102-03).
“OK, OK, but we’re talking about a
grade 5 class….I mean, really…”
Most people willingly accept the need for thorough
testing when it comes to health care or new medicines.
Unfortunately, there is a tendency among teachers to
“downplay” new research findings as “just another
new initiative to worry about…”
Teachers often report that Ministry or board research is
too theoretical and doesn’t apply to authentic
classroom settings. Educational experts, in turn, doubt
teachers’ abilities to do research properly. (Lagemann, 2000)
Solution: Know Thine Own
Knowledge
Caveat:
The more we know about research, the more we
are prepared to engage in it for ourselves, and
the better we are at scrutinizing the work of
others.
Types of Research
(McMillan, 2000, pp. 9-14)
Research is divided into three broad types:
Quantitative
Qualitative
Mixed Methods
Each has certain advantages and disadvantages.
Each requires a different skill set on the part of
the researcher.
It is also important to know what the findings
are to be used for.
Another way of classifying research is based
on the four functions of research:
Basic
Applied
Action
Evaluation
Quantitative
Emphasizes numbers, measurements, deductive
logic, control, experimentation: “the hard facts”
Often seen in clinical or laboratory situations
If non-experimental, then it tends to focus on
describing existing patterns and relationships, or
on making comparisons.
If experimental, then the researcher can control
one or more variables and then draw conclusions
about what happened.
Qualitative
Emphasizes natural settings, the importance of “point-of-view” and long-term concepts or models.
Intention is to observe and record things as realistically
as possible.
Phenomenology tries to understand the essence of
something; ethnography describes a cultural or
sociological process; grounded theory builds theory out of
the data collected in an existing environment; and case
studies focus in-depth on a particular situation with a
very specific context.
Mixed Methods Research
With increasing acceptance of qualitative
methods, it would seem a natural next-step for
qualitative and quantitative methods to be
combined.
Mixed methods were somewhat slow to catch
on, partly because the researcher needs
expertise in both areas.
Education and Educational Psychology have
led the way in mixed methods research.
There are at least 10 advantages to using a
mixed method. (see Zazie et al., 2004, pp. 9-13)
1. Triangulation: In the same way that navigators and surveyors
use the intersect of two different lines to pinpoint a location,
researchers can use both quantitative and qualitative methods.
Each method may bring different strengths and weaknesses, but
if they yield the same results, then the researcher can feel
confident that he or she has “pinpointed” accurately.
2. Pick-Me-Up Justification: If a study of one type provides
unexpected or difficult to explain results, then data from the
opposing viewpoint might offer a plausible explanation or,
equally importantly, it might confirm that the project is flawed.
A standardized questionnaire followed by more in-depth
interviews is a common way to mix the methods. If the
questionnaire yields surprising results, the interview may
explain why, or it may confirm that the questionnaire was poorly
designed.
3. Using one method as a “pilot” study to help refine the actual
study.
4. Mixing methods allows one to study the same phenomena but at
different levels. For example, there might be a social/personal
split, micro/macro, etc.
5. Re-assessment of existing assumptions. Traditionally, a focus on
quantitative research sometimes ran the risk of ignoring gender
and race. Mixed methods forces the researcher to take a deeper
look at the theories that have shaped or that arise from the work.
6. Better interdisciplinary communication
7. Better communication between academics, practitioners and
children and the community.
8. “Exploiting” methods. Each method can inform the other.
Quantitative data, for example, often needs to be explained
qualitatively.
9. Mixed methods help convince a hostile or reluctant audience. As
in triangulation, when two different methods point to the same
conclusion, it is easier to convince the skeptics.
10. Improvements in theory: If two or more theories exist for the
same phenomenon, then obviously the problem hasn’t been fully
solved. More (and different) research needs to be done.
When to Use What Where
Quantitative: The clinical side of education.
Looking for Causes
Some examples:
studies involving cognition and memory
 the effects of Ritalin
visual perception and textbook design
When to Use What Where
Qualitative: The social side of education.
Studies on interpersonal communication; school
culture and climate; students’ responses to class
dynamics, etc.
4 Functions of Research
Basic Research develops, refines and advances theories.
Applied Research builds on theory to improve practice or
solve a problem, often within the educational field. A
psychological study on learning may lead to applied research on
curriculum restructuring.
Action Research is a specific type of applied research that
focuses on a specific classroom or school problem.
Evaluation research is used to make judgements during the
decision-making process (e.g., which reading programme is best
for our school).
OK…What were we talking about
again?
Understanding the types of methods and
the purpose of research is essential for:
performing research
judging its quality
Consider the following situation:
A teacher in Napanee picks up a magazine article and
reads about a teacher in New York who has been doing
something in her classroom that she claims is having
wonderful results. The teacher in Napanee wonders
whether she could do something similar and decides to
try it out. She gets some mixed results. She is seeing
some improvement, in some areas, but not consistently.
As part of her plan for professional growth, she
decides to engage in action research and study this
problem more formally. She writes up her report a year
later and the ALCDSB publishes it on their website. It
might be a great project, but it might be flawed. What
things need to be considered in judging the quality of
her work?
Ethnography: Did the teacher begin with false assumptions?
Could the NY and Napanee schools be so different culturally,
economically, sociologically, organizationally, etc., as to make
the New York report non-generalizable?
Was the teacher in New York honest? Was the article published
as a human interest story, or was there really a “project” in
place?
Assuming there was an effort to study this formally, and
assuming the teacher had only good intentions, is it possible she
“saw what she wanted to see”?
Assuming there was a method of evaluating her
success, was it the right method and did she
have sufficient knowledge to use it?
Assuming she had carefully considered her
methodology, did she make an effort to interpret
her results in light of existing theory?
Stop…You’re Scaring Me Off!!!!
Isn’t it enough that a teacher saw potential in
something, tried it, observed some improvement
and then looked for a way to improve it
some more? If we get so “hung up” on all this
research theory, we’ll never get any research
going!
Yes…and No...
Experts agree that educational change takes
about three years to implement and that during
those three years, change can be difficult and
create tension. (Fullan, 2000a)
If the purpose of doing and/or reading research
is to improve, then there is a need to analyze
information carefully
Our teacher in Napanee is not wrong for wanting to try
something new, and she is to be commended for trying to
follow it through for the long term.
But when it comes to engaging in research, or sharing research,
a bit of homework in the first place can save time in the long
run; can increase the quality and usefulness of the material; can
reduce the difficulties associated with implementation; and, most
importantly, it ethically reduces the risks for the students.
What if our hypothetical teacher’s good intentions had failed?
Would the students possibly be missing out on something
important? All change involves risk, but there is a moral
obligation to minimize risk for the sake of the children’s
learning.
The Good News
Teachers already collect data, all the time, without
recognizing that it has value outside of the classroom.
Teachers often mix methods when it comes to
evaluating students. Let’s hear it for Triangulation!
With increasing technology, teachers use a variety of
information-gathering tools such as digital pictures,
videos, electronic journals, etc.
Really, it is a matter of a little work now with a big
payoff later!
Getting Back To Hypothesis
Generation
Combining Interest with Feasibility
See Slideshow Notes re: references
Start with a topic or a general area that is distinct, such as
giftedness or Early French Immersion.
Make sure that you have a genuine interest in the topic.
Limit your selection of topics to ones that, given your
resources, etc., could actually be investigated.
Focus on your participants. Who do you have access to? And
remember, educational research is not limited to classrooms.
Parents, administration, colleges and universities, federations,
etc., all play a role in the educational process.
Gradually Narrowing a Topic
Don’t just jump into a specific research
question.
Explore existing research while the project is still
at the general-topic stage. Although having many
possibilities is sometimes confusing, it is better to be
knowledgeable about the topic as a whole.
General Topic: Student attitudes
Possible narrower topics related to "Student attitudes":
a. Students' attitudes toward school
b. Students' attitudes toward an academic subject
c. Students' attitudes toward disabled classmates
Possible narrower topics related to "Student attitudes toward school":
a. Ninth graders' attitudes toward school
b. Community college students' attitudes toward school
c. Graduate students' attitudes toward school
Possible narrower topics related to "Ninth graders' attitudes toward school":
a. Ninth graders' attitudes toward school as a function of type of school
b. Ninth graders' attitudes toward school as a function of gender
c. Ninth graders' attitudes toward school as a function of grade point average
Possible questions related to "Ninth graders' attitudes toward school as a function of type
of school":
a. Do ninth graders attending a junior high school have different attitudes toward school
than ninth graders attending a senior high school?
b. Do ninth graders attending a vocational high school have different
attitudes toward school than ninth graders attending an academic
high school?
c. Do ninth graders attending a parochial school have different
attitudes toward school than ninth graders attending a public
school?
(cited verbatim from Crowl, 1996, p. 29)
Generating a Researchable Topic
from a Published Study
Quality Research often concludes with a section
on “For Further Research”
Sometimes changing the population of existing
research yields a new study: (grade 9 to grade 6)
Sometimes changing the variables of existing
research yields a new study (e.g., attitudes to
self-concept)
Pseudohypotheses and Hypotheses
Pseudohypotheses stem from value judgements
such as Balanced Literacy makes better readers
or that it is good for students to be read to.
Pseudohypotheses may make sense intuitively,
but cannot be tested empirically. Historically,
many “common sense” ideas have been found
to be completely untrue.
Hypotheses are usually stated in the form of a
prediction.
A hypothesis should stem from the literature review,
regardless of the researcher’s personal view. For
example, a teacher believes A is better than B, even
though the research says B is better. The hypothesis
should predict B will do better regardless of the
researcher’s interest.
Hypotheses use clearly defined variables. Other
researchers may agree or disagree with the definitions,
but at least they are clear.
Pseudohypothesis: Integrating hands-on technology within
science lessons is good for the students.
Why? It’s an opinion and it lacks focus!
Appropriate for different learning styles?
Career preparation?
Reduced gender-role stereotyping?
Which students?
Be Specific!
Hypothesis: Intermediate students who receive science
instruction with hands-on technologies will exhibit better career
preparation skills according to the [name of index] and/or will
exhibit significantly less gender-role stereotyping behaviour than
students who are taught only the academic material.
A Brief Word About Sampling
Researcher strives to have an unbiased representation
of the population
Simple random sampling
each member of the population has an equal chance of being
selected. (Crowl, 1996, p. 9)
Cluster sampling
is random sampling that progressively narrows down the
subjects (e.g., from districts to middle schools to schools
with EFI, etc.).
(Crowl, 1996, p. 97)
Sampling cont’d
Systematic sampling
Looking at numbers
e.g., there are 200 possible subjects…
50 are needed
200/50 = 4
pick every 4th student
Stratified sampling
when different populations vary drastically in size
(Crowl, 1996, p. 97)
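The sampling arithmetic above (200 possible subjects, 50 needed, so pick every 4th) can be sketched in a few lines of Python. This is an illustration only; the subject IDs and the "EFI"/"regular" strata are hypothetical, invented to show the three techniques side by side:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

population = list(range(1, 201))   # 200 hypothetical subject IDs
needed = 50

# Simple random sampling: each member has an equal chance of selection
simple = random.sample(population, needed)

# Systematic sampling: 200 / 50 = 4, so take every 4th subject
interval = len(population) // needed
systematic = population[::interval]           # IDs 1, 5, 9, ... 197

# Stratified sampling: when subgroups vary drastically in size,
# sample each one in proportion to its share of the population
strata = {"EFI": list(range(1, 41)),          # 40 subjects (hypothetical)
          "regular": list(range(41, 201))}    # 160 subjects (hypothetical)
stratified = []
for members in strata.values():
    k = round(needed * len(members) / len(population))
    stratified += random.sample(members, k)

print(len(simple), len(systematic), len(stratified))  # 50 50 50
```

Note how the stratified draw keeps the 40:160 proportion intact (10 EFI subjects, 40 regular), whereas a simple random draw could under- or over-represent the small subgroup by chance.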
How Do I Know If I Have Enough
Participants?
There is no definitive minimum number!
Best guideline is that the greater the sample
size, the more credible the data collected.
Education is full of exceptions
e.g., Special Education classes, by nature, tend to
be smaller, so a smaller sample size is the norm.
(Crowl, 1996, p. 97)
Sample Size
Group Comparisons…
Aim for a minimum of 15 per group
Correlational studies
Aim for at least 30 participants
Crowl, 1996, p.97
Summing Up
The Research Project: An Overview
Planning the Educational Research Project
Formulate hypothesis
Observations of real events
Background Research
Methodological Review
Project Design
Carry Out Data Collection
Seek Verification of Hypothesis
Final Report
Validity Checks
Honest Interpretation
For Further Work
Criteria for Judging Quality of
Research
excerpted from The University of North Carolina:
http://www.serve.org/EdResearch/criteria.php
Quality of conception
Is there a Theoretical Base?
Are the research question(s) specific and clear?
Can the research questions be investigated?
How well did the research investigate and answer
these questions?
The research base
 Is there a research base for the project under consideration?
We often assume that the advice of experts is based on the research that they and others have conducted. Very
often, though, no data are involved; the authors are simply giving their opinions and positions, or citing other
experts, in a kind of endless loop, giving the appearance that there is a substantial body of empirical findings.
 What is the quality of the studies making up the research base?
Even though many studies may have investigated topics like the one that interests you, the quality of the
evidence is still open to review and questioning. Poorly designed studies can result in unjustified claims of
effectiveness that might not stand up to a more rigorous research method.
 How appropriate is the design? Did A cause B, or is there simply a pattern worth looking into further?
The design of a study refers to built-in comparisons among intervention conditions. In experimental and quasi-experimental studies, design refers to comparison of outcomes (e.g., achievement scores) of experimental and
one or more control groups. Without the appropriate design a study cannot answer causal questions.
 Are intervention conditions clearly defined and documented?
Completeness of description is important for several reasons.
(1) A detailed understanding of an intervention (as well as control conditions!), can help you to form your own
judgment about the meaning of research findings.
(2) It can tell you whether the intervention is a viable, practical option for the schools you are concerned with.
(3) In order for other researchers to replicate a study a detailed description is required.
Quoted from The University of North Carolina:
http://www.serve.org/EdResearch/criteria.php
Was the sample size appropriate relative to the
strength of the claim?
Did the researcher describe the participants?
How were participants identified and recruited
and assigned to ‘groups’?
Were ethnography and demography factors?
Were they reported?
Are results overgeneralized?
Are the data analysis methods appropriate?
Instrumentation and measurement.
Have validity and reliability been achieved?
 Replication
Are the major findings replicated across a number of studies? No one study ever
settles an issue definitively. A research finding, in order to be of practical value,
should be repeated in a variety of demographic settings, with different student and
teacher populations. Every replication helps to lower the likelihood of the findings
arising by chance, and to raise the credibility of instructional decisions based on the
finding.
Quoted from The University of North Carolina:
http://www.serve.org/EdResearch/criteria.php
 Additional considerations
 Can judgments about the meaningfulness, validity, and reliability of the study be
made easily from the information presented?
 Are the similarities and differences between the study findings and findings from
similar studies discussed?
 Are the limitations and alternative explanations for the findings discussed?
A Final, Encouraging Thought...
An Excellent Example of Why We
Have to Ensure Quality
 The following American reference outlines the importance of a scientific
approach to education. From a teaching point of view, it is surprising just
how far the legislative and financial implications of research can run. The
article's abstract is quoted verbatim and the full paper can be found at
http://www.ncrel.org/sdrs/areas/issues/envrnmnt/go/go900.htm
“ISSUE: The No Child Left Behind Act requires
educational programs and practices to be based on
scientifically based research. The federal policy
impacts practicing educators in the curriculum areas of
reading, mathematics, and science...
It also impacts instructional strategies, professional
development, parent involvement, and all federally
funded programs. The intent of these requirements is
for teachers and administrators to improve their
schools based on scientific knowledge as well as
professional wisdom.
"The charisma of a speaker or the attachment of an
educational leader to an unproven innovation drives
staff development in far too many schools. Staff
development in these situations is often subject to the
fad du jour and does not live up to its promise of
improved teaching and higher student achievement.
Consequently, it is essential that teachers and
administrators become informed consumers of
educational research when selecting both the content
and professional learning processes of staff
development efforts." (NSDC, 2004)
http://www.nsdc.org/standards/researchbased.cfm
Resource List
Text References
Boyden, J., & Ennew, J. (1997). Children in focus: A manual for participatory research
with children. Stockholm: Lennart Reinius/Agneta Gunnarsson.
Christensen, P. & James, A. (Eds.). (2000) Research with children: Perspectives and
practices. London: Falmer Press.
Crowl, T. K. (1996). Fundamentals of educational research (2nd ed.).
Boston: McGraw-Hill.
Fullan, M. (2000a). Change forces. The sequel. Philadelphia: Falmer Press.
Green, J.L., Camilli, G., & Elmore, P. B. (Eds.). (2006). Handbook of complementary
methods in education research. Mahwah, New Jersey: Lawrence Erlbaum Associates/AERA, Publishers.
Humphries, B. (Ed.). (2000). Research in social care and social welfare: Issues and
debates for practice. London: Jessica Kingsley.
Kozma, R. B., (Ed.). (2003) Technology, innovation and educational change: A global
perspective. International Society for Technology in Education.
Lagemann, E.C. (2000). An elusive science: The troubling history of education research.
Chicago: University of Chicago Press in Green, J.L., Camilli, G., & Elmore, P. B.
(Eds.). (2006). Handbook of complementary methods in education research. Mahwah, New Jersey: Lawrence
Erlbaum Associates/AERA, Publishers.
Texts cont’d
McMillan, J. H. (2000). Educational research: Fundamentals for the consumer
(3rd ed.). New York: Addison Wesley Longman, Inc.
Williams, D, Howell, S.L., Hricko, M. (2006). Online assessment, measurement and
evaluation: Emerging practices. Hershey, PA: Information Science Publishing.
Zazie, T., Nerlich, B., & McKeown, S. (2004). Introduction, Part 1: Theoretical and
historical foundations in Zazie, T., Nerlich, B., McKeown, S., & Clarke, D. (Eds.). (2004). Mixing methods in
psychology: The integration of qualitative and quantitative methods in theory and practice. Hove: Psychology Press.
Electronic References
Margolin, J., & Buchler, B. Critical Issue: Using Scientifically Based
Research to Guide Educational Decisions
Retrieved August 19, 2006 at
http://www.ncrel.org/sdrs/areas/issues/envrnmnt/go/go900.htm
Serve Centre, University of North Carolina: Educational research:
What are the criteria for judging the quality of research
Retrieved August 21, 2006 from
http://www.serve.org/EdResearch/criteria.php
The Peer Review Process: Making Sense of Science
Retrieved August 21, 2006 at
http://www.senseaboutscience.org.uk/PDF/ShortPeerReviewGuide.pdf
Electronic Resources
A list of useful sites will be followed by screen images of
some of the more relevant sites with active hyperlinks:
General Resources
The ERIC index of journals/CIJE
http://www.eric.ed.gov/ERICWebPortal/Home.portal?_nfpb=true&_pageLabel=JournalPage
Accessing ERIC through the CSA
http://www.csa.com/factsheets/eric-set-c.php
ERIC and the CSA via Queen’s University
http://library.queensu.ca/webedu/guides/howto/ericsa.htm
The Canadian Journal of Education
The leading, bilingual journal of educational scholarship
in Canada
http://www.csse.ca/CJE/General.htm
Current Issues in Education (Arizona State College of Education)
http://cie.asu.edu/
University of the State of New York & The New York State Dept.
of Education
On-line Resources
http://usny.nysed.gov/teachers/genres.html
Phi Delta Kappa
An organization devoted to professional education
(access to both peer-reviewed and non reviewed literature)
http://www.pdkintl.org/
Action Research Links
ARI: Action Research International
Action research international is a refereed on-line journal of
action research. It is sponsored by Southern Cross University
http://www.scu.edu.au/schools/gcm/ar/arp/arphome.html
The On-Line Conference on Community Organizing and
Development (COMM-ORG)
COMM-ORG was founded in 1995 to link academics and activists, and theory and practice,
toward the goal of improving community organizing and its related crafts. The project is
supported by the University of Wisconsin and Economic Development.
http://comm-org.wisc.edu/research.htm
Center of Applied Linguistics: Online Resources:
http://www.cal.org/resources/digest/0308donato.html
Quality of Research Links
The Handedness Institute at Indiana University contains scholarly
guides as well as content on quality of research
http://handedness.org/help/researchguide.html
The Peer Review Process: Making Sense of Science
http://www.senseaboutscience.org.uk/PDF/ShortPeerReviewGuide.pdf
Resources for Methods in Evaluation and Social Research
free online resources
http://gsociology.icaap.org/methods/qual.htm
Serve Centre, UNC: The criteria for judging the quality of research
http://www.serve.org/EdResearch/criteria.php
Style Guides and Referencing
American Psychological Association (APA)
APA Guidelines can be obtained online at the official APA website.
They can also be ordered (for a cost) and/or downloaded
http://www.apastyle.org/
http://www.apa.org/journals/authors/guide.html#refer
Several other academic institutions post APA guidelines as well,
such as Purdue University:
http://owl.english.purdue.edu/owl/resource/560/01/
CIJE: Current Index to Journals in
Education
http://www.eric.ed.gov/ERICWebPortal/Home.portal?_nfpb=true&_pageLabel=JournalPage
ERIC via CSA
http://www.csa.com/factsheets/eric-set-c.php
http://library.queensu.ca/webedu/guides/howto/ericsa.htm
Canadian Journal of Education
http://www.csse.ca/CJE/General.htm
Current Issues in Education
-A peer-reviewed journal from the Arizona State
College of Education
http://cie.asu.edu/
Professional Organizations: Phi
Delta Kappa
http://www.pdkintl.org/
http://usny.nysed.gov/teachers/genres.html
http://www.scu.edu.au/schools/gcm/ar/arp/arphome.html
http://handedness.org/help/researchguide.html
The Peer-Review Process
http://www.senseaboutscience.org.uk/PDF/ShortPeerReviewGuide.pdf
APA Referencing
http://www.apastyle.org/
APA Referencing
http://www.apa.org/journals/authors/guide.html
APA Referencing
http://owl.english.purdue.edu/owl/resource/560/01/