2000-2001 - Shoreline Community College

Shoreline Community College Self Study 2001-02
DOCUMENT CODE: 2._____
DOCUMENT DESCRIPTION:
Shoreline's Annual Outcomes Assessment
Report--2000-01 contains descriptions of 16 faculty assessment development projects funded
with the State Outcomes Assessment allocation through the grant award process conducted by
the Institutional Effectiveness Committee. Extensive appendices contain examples of project
output. This document is also available at
http://intranet.shore.ctc.edu/intranetiear/assessment_report.htm
IEAR 8/01
Shoreline Community College
ANNUAL OUTCOMES ASSESSMENT REPORT—2000-01
Assessment Liaison/s: Jim James
Phone: 206-546-6949
Email: jdjames@ctc.edu
A. Highlights of Major Assessment Activities/Project
For the second year, over two thirds of Shoreline's Assessment allocation was dedicated to
faculty assessment development efforts through an RFP process coordinated by the Institutional
Effectiveness Committee. Remaining funds supported the Office of Institutional Effectiveness,
Assessment and Research.
For 2000-01, funds were awarded to 16 faculty projects representing 5 of the 6 Shoreline
academic divisions:

• World Languages Assessment, Amelia Acosta, Humanities, aacosta@ctc.edu
• Portfolio Assessment of Outcomes in Visual Arts, Bruce Amstutz, Humanities, bamstutz@ctc.edu
• Student Retention/Completion and Student Interest, Ken Campbell, Mark Hankins, Automotive & Manufacturing Technologies, kcampbel@ctc.edu, mhankins@ctc.edu
• College Literature Retreat, Du-Valle Daniel, Neal Vasishth, Humanities, ddaniel@ctc.edu, nvasisht@ctc.edu
• Critical Thinking Assessment of Nursing Students, Janice Ellis, Health Occupations and P.E., jellis@ctc.edu (this project was not completed due to personnel turnover)
• Admissions Process Analysis-Nursing, Janice Ellis, Health Occupations and P.E., jellis@ctc.edu (this project was not completed due to personnel turnover)
• Reading Outcomes and Assessment-II, Dutch Henry, Humanities, ghenry@ctc.edu
• Pre-English 101 Writing Portfolio Standards, Pearl Klein, Humanities, pklein@ctc.edu
• Developmental ESL Assessment, Bruce McCutcheon, Humanities, bmccutch@ctc.edu
• English 101 Composition Study, Gary Parks, Humanities, gparks@ctc.edu
• Outcomes and Assessment in Macroeconomics, Tim Payne, Intra-American Studies & Social Sciences, tpayne@ctc.edu
• Outcomes and Assessment in Microeconomics, Tim Payne, Intra-American Studies & Social Sciences, tpayne@ctc.edu
• Math Learning Center Evaluation, Margaret Anne Rogers, Science & Mathematics, mrogers@ctc.edu
• College Readiness Conference, Pam Dusenberry, Humanities, pdusenbe@ctc.edu
• CNC Technologies Workplace Assessment Review, Jody Black, Automotive & Manufacturing Technologies, jblac@ctc.edu
• Assessment of Social Functioning as a Learning Outcome in Education Courses, Tasleem Qaasim, Intra-American Studies & Social Sciences, tqaasim@ctc.edu
B. Examples of Educational Improvements Made/Actions Taken
• General
The number of assessment development projects at Shoreline has grown from 12 in
1999-2000 to 16 in 2000-01, with several faculty participating for the first time. It is
anticipated that the grant process will become more competitive in the coming year.
• World Languages Assessment
o Participating in this project increased and improved communication among full-time
faculty and between full-time and associate faculty.
o The World Language faculty now have a greater understanding of the similarities and
differences both in grading practices/needs for each language and in the abilities
that students exhibit at each level of each language.
o As a result of this grant, our program now has a relatively standardized rubric for the
assessment of take-home compositions.
o The wide variety of composition grading practices that existed will no longer
affect student grades if students change instructors during their French, Japanese, or
Spanish educational experience. It is hoped that this standardization will also
decrease the possibility of grade inflation in our courses.
o Additional project detail is contained in Appendix A.
• Portfolio Assessment of Outcomes in Visual Arts
o Graphic Design Sequence:
Introduction of a first year portfolio review, redefinition of the existing exit portfolio,
establishment of basic typographic and design principles to be taught throughout the
graphic design sequence, implementation of a first year portfolio evaluation process
to start Fall 2001. Direct collaboration with tenured and associate faculty responsible
for the delivery of content in the various courses.
o 3D Art and Design Sequences:
Classroom activities benefited by clarifying the content of the respective course work.
Direct collaboration with tenured and associate faculty responsible for the delivery of
content in the various courses. Budget request process benefited by clearly
establishing the kind and frequency of classroom/studio activities for each of the 3-D
classes offered. The assessment tool will assist in the development of other 3-D
related courses. The application of the portfolio assessment tools will assist in
articulating transfer credit from one institution of higher learning to another.
o Photography Sequence:
Students evaluate in written form their peers' photographic assignments, using a
set standard of critical assessments to determine successful completion of the
requirements for visual dialogue. Students evaluate through oral dialogue their
impressions of their peers' photographic work as to the successful visual construction
of individual photographic assignments. Students evaluate through a written
statement their own visual efforts to communicate subject matter through
composition construction. Master course outlines were reviewed and updated.
o Additional project detail is contained in Appendix B.
• Automotive Student Retention/Completion and Student Interest
o The automotive faculty involved with this project developed a database in Microsoft
Access for the compilation and analysis of student information. Along with standard
student demographic information, the heart of this project established a basis for
tracking student career interest as assessed by the Kuder Career Assessment Test. We
want to determine if any correlation exists between long-term employment in the
automotive industry and degree of mechanical interest as measured by the Kuder,
when students begin the automotive training program. This information should
improve student recruitment, advising, and ultimately, retention of our program
graduates within the industry.
o As the project developed, we learned that the database would also prove to be a
valuable tool for tracking student progress in general education class completion.
Some automotive students need encouragement to complete the academic
requirements for the Associate of Applied Arts and Science Degree. This database
will help the automotive faculty provide appropriate advising to assist the student
toward timely completion of the degree.
o Additionally, we have built the database to track where our students are coming from
(i.e. high school, military, worker retraining, etc.) and which automotive dealerships
are hiring them.
o Additional project detail is contained in Appendix C.
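The report describes the Access database only in outline. As an illustration of the kind of tracking structure it outlines, here is a minimal sketch in Python with SQLite standing in for Access; all table names, column names, and sample records are hypothetical, not taken from the actual project.

```python
import sqlite3

# In-memory stand-in for the Access database described in the report.
# Every identifier below is a hypothetical illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE students (
    student_id        INTEGER PRIMARY KEY,
    name              TEXT,
    origin            TEXT,  -- e.g. 'high school', 'military', 'worker retraining'
    kuder_mech        REAL,  -- Kuder mechanical-interest score at program entry
    sponsoring_dealer TEXT
);
CREATE TABLE gen_ed_progress (
    student_id INTEGER REFERENCES students(student_id),
    course     TEXT,
    completed  INTEGER  -- 1 = completed, 0 = not yet
);
""")

conn.execute(
    "INSERT INTO students VALUES (1, 'A. Example', 'high school', 72.5, 'Northgate Motors')"
)
conn.executemany(
    "INSERT INTO gen_ed_progress VALUES (?, ?, ?)",
    [(1, 'ENGL 101', 1), (1, 'MATH 070', 0)],
)

# Advising query: which students still have incomplete gen-ed requirements?
rows = conn.execute("""
    SELECT s.name, g.course
    FROM students s JOIN gen_ed_progress g USING (student_id)
    WHERE g.completed = 0
""").fetchall()
print(rows)  # [('A. Example', 'MATH 070')]
```

The same tables would also support the correlation question the faculty pose: joining entry `kuder_mech` scores against later employment records collected over the coming years.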
• College Literature Retreat
o At our retreat, we created a value statement for our program.
o We developed an initial list of new courses to teach.
o We refocused our curriculum in order to be less Euro-centric and more global.
o We created a draft of common outcomes for literature MCOs. This was done within
the context of the newly formed General Education Outcomes.
o We created a tentative two-year, quarter-by-quarter schedule of our literature
course offerings that would optimize course variety as well as help students meet prerequisites at four-year institutions.
o Additional project detail is contained in Appendix D.
• Reading Outcomes and Assessment-II
o Development of sample reading assessments for use in English 080, 090, and 100
o Development of a reading rubric for use in evaluating student work in response to
reading assessments in English 080, 090, and 100
o Collection and faculty evaluation of sample student work which represents three
levels of achievement (exceeds standards, meets standards, does not meet standards)
in response to reading assessments
o Development of a list of sample reading activities for classroom use
o Development of a list of sample reading assessment ideas
o Development of a list of sample readings for use in Developmental English courses
o Additional project detail is contained in Appendix E.
• Pre-English 101 Writing Portfolio Standards
o Faculty in English and ESL, both associate and full-time, met in Winter and Spring
quarters to discuss issues regarding placement, assessment, and advancement for
students entering English 101 from both English 100 and ESL 100 classes.
o One meeting was held to discuss “Outcomes Assessment at Alverno College.” The
project manager had a unique opportunity to visit Alverno and gather information on
their distinctive methods of evaluation; the purpose of this meeting was to discuss
ways in which SCC faculty could apply these methods to our English and ESL
courses. The future of this project was also discussed, and a need for the project to
continue next year was expressed.
o Faculty participating in these discussions indicated a strong need for them to continue,
as a way of improving our service to students preparing for college-level work.
o Additional project detail is contained in Appendix F.
• Developmental ESL Assessment
o The development of reading assessment tools for Academic ESL courses is an
ongoing project. The tests have not been implemented officially, so they have not
made a program-wide contribution.
o At the end of winter quarter 2001 ESL Reading Assessment Committee members
field tested assessment tools which had been developed during the quarter. Four
assessment tools were developed (one for level 5, one for level 6 and two for level 7).
All but one of the tools were field tested, some in more than one section of a level.
o For the committee members who developed the tools, field tested them, and debriefed
about them at the beginning of spring 2001, the experience raised awareness of issues
around the testing of reading and the difficulty in interpreting scores.
o The project has set the ESL program on a course toward putting uniform reading
assessment in place.
o Additional project detail is contained in Appendix G.
• English 101 Composition Study
o The composition study allowed English faculty to “norm” papers with each other and
compare evaluation methods using a common assessment rubric.
o Because assignments were attached to the essays submitted, the composition study
allowed English faculty to view each other's assignments and gain an indication of the
various kinds of assessment used for the skills outlined in the writing rubric.
o Because of some of the results revealed in this ongoing project related to ESL
students, the study fostered discussion and interaction between ESL instructors and
Developmental English faculty. These faculty held a common meeting to discuss
evaluation issues and to evaluate sample papers.
o The study has fostered department discussion on the possible need for a grammar
class to accommodate the mechanics issues revealed by the data. A grammar class is
already on the books, and its rejuvenation, possibly in an online format, is being
examined.
o Additional project detail is contained in Appendix H.
• Outcomes and Assessment in Macroeconomics and Microeconomics
o These two projects were different in that each produced a completed list of learning
outcomes unique to each subject. These two projects were similar in that they both
required a collaborative effort from already-busy Economics faculty to evaluate
assessment tools created last year.
o I am pleased to tell your committee that both projects, “Quantitative Reasoning
Outcomes and Assessment in Microeconomics” and “Analytical Reasoning Outcomes
and Assessment in Macroeconomics” were completed successfully, with
accomplishments greater than anticipated.
o Additional project detail is contained in Appendix I.
• Math Learning Center Evaluation
o There were two primary goals of the questionnaire:
--Discover ways we can improve the services of the Math Learning Center
--Learn whether there are ways we can better describe and promote the services of
the Center.
To a good extent, the message of the questionnaire was "stay the course".
SERVICES OF MLC
o Almost all of the surveyed students said the current MLC hours are convenient.
Fewer than 10% of the respondents suggested additional times, and there was no
agreement among them as to when the times should be.
o Only three of the infrequent users did not feel well treated. Only five of the more
frequent users felt they received the services they needed only "sometimes".
o Usable suggestions:
--Keep well enough staffed so that learning assistants have more time.
--Obtain student editions of textbooks.
--Work harder with learning assistants to be sure they are imparting accurate
information.
--Raise the language standards for hiring ESL learning assistants.
--Better explain why instructors sometimes do not want us to assist with group
projects.
o There is no prevalent negative stereotype of those who use the MLC.
o Several students who had never used the MLC perceived it as a place where students
go when they need help.
o Usable suggestion:
--Let students know that many use the MLC as a place to work alone or with
classmates.
o Additional project detail is contained in Appendix J.
• College Readiness Conference
o At the Fall College Readiness Retreat, Dutch Henry, Sean Rody and Pam Dusenberry
worked on global assessment rubrics for reading, writing, study skills and student
responsibility for the end of a developmental program. These rubrics can be used to
assess students' readiness for college work in each area at the end of their final
developmental courses. The Developmental English faculty at Shoreline use the
rubrics to ensure that students are developing abilities and skills deemed important by
developmental educators from across Washington State. The rubrics are appended to
this document.
o At the Winter Retreat, the group of developmental educators created rubrics for
specific assignments in their classes that assess the college readiness outcomes.
Shoreline faculty presented their work on an English 100 Reading Rubric from last
year's outcomes project and developed a rubric for a reading process assignment used
in Developmental English classes at Shoreline; faculty are now using the rubric to
help students develop reading skills and find it quite helpful for giving detailed
feedback to students about their reading. The assignment and rubric are appended
here.
o Additional project detail is contained in Appendix K.
• CNC Technologies Workplace Assessment Review
o All of the students were eager to participate in the study and volunteered their
time to answer all the questions. The educational improvements made during the
course of the study grew out of a greater awareness of students' needs. For
example, students felt they needed deeper on-the-job training in specific areas,
such as extended lab floor time and more advanced projects to plan and program.
As a group we tailored the second quarter toward these requests, and a positive
result was evident: the students feel they have a voice in their education. We
also held more shop floor machine tool demonstrations and practiced longer on
the shop floor, utilizing the skills taught in the classroom. In addition, the
schedules of the three separate classes are now organized to create smaller class
sizes on the shop floor at any one time, which was a major concern of most of
the students.
o Additional project detail is contained in Appendix L.
• Assessment of Social Functioning as a Learning Outcome in Education Courses
o Preparation for this assessment required a search of the literature to develop
resources on classroom methods for improving social functioning and
measuring the result as a learning outcome. Journals and periodicals were
examined to extract lessons for classroom techniques to enhance student
social functioning. During the courses students demonstrated improved
competencies in social functioning through their insight on environmental
factors that affect learning, and through their interaction with one another.
Reflection papers and classroom assignments were used to document this
improvement. The literature research provided information on several
research efforts that evaluated social functioning in community college
programs. This information was used as a basis for determining learning
outcome goals that would provide evidence of improvement in this core
competency.
o Additional project detail is contained in Appendix M.
C. Current Issues/Concerns
• General
Although considerable progress has been made in attracting new faculty
participants in assessment development, there remains a great deal of work to be done to
complete the Shoreline Strategic Plan requirement for a well-defined assessment
procedure for each course in the College Catalog. A campus-wide examination of the
status of assessment in each course, to be conducted during 2000-01 as a part of the
Institutional Self-study, should assist in providing a map for completing this requirement.
Additional challenges to be addressed are assessments related to transfer degrees and
general education requirements.
• World Languages Assessment
o In several of our language programs there are no full-time faculty. Both Chinese and
German are taught exclusively by associate faculty and French is covered by an
associate faculty member and one 2/3 pro rata faculty member. Fortunately, our
French professor was able to participate in this project, but Chinese and German
representatives were not. Finding available personnel and funding to adjust the
standardized rubric to these languages’ needs is a concern.
• Portfolio Assessment of Outcomes in Visual Arts
o Graphic Design Sequence:
Implementation of first year portfolio review and enforcing the review as a
requirement for advancement to second year design and graduation. Can we use the
review process to hold back under-achievers or suggest that they not continue in the
program? Who will be on the review committee? How will this process work to place
students coming from other institutions? Can students resubmit portfolios right away
or must they wait until the next quarterly review? Are reviews held every quarter or
only each Spring quarter?
o 3D Art and Design Sequences:
3D projects are necessarily large and the submission of portfolio projects for
assessment would require a substantial amount of storage and studio space. Portfolio
assessment should include 3-Dimensional presentation of work, but in the case of
transfer students, the work is often represented as a 2-dimensional display. Resolution
of this issue is pending further discussion by the faculty. A structure of pre-requisites
for the progression of courses in the 3-D art and design sequence is not in place and,
therefore, progression in the sequence cannot be controlled by assessment.
o Photography Sequence:
It is hoped that improved assessment will result in improved student communication
skills, provide a meaningful measure of student progress toward educational goals,
and establish clear goals for the educational material covered in each class.
• Student Retention/Completion and Student Interest
o Since this project is ongoing, it is difficult to make any conclusions at this time. All
incoming and current students in the automotive program were asked to take the
Kuder Assessment test. Their interest scores were entered into the database, and over
the next few years, we will track the students to see if they are still working as
technicians in their original sponsoring dealerships.
o Microsoft Access is a powerful and complicated program. We found that mastering
its functions went beyond the scope of this project. Future development of this
database may require the expertise of someone with specific Access database training.
• College Literature Retreat
o Our English program is revising its MCOs in the midst of the formulation of new
General Education Outcomes and the impending accreditation process.
o Due to the recent hiring of new faculty, we need to clarify outcomes in literature
courses and clarify our teaching and educational backgrounds in regard to literature.
We also need to clarify the future direction of the department in regard to its literary
curriculum.
o In anticipation of working with our new Division Dean, the transition will be
easier if we as a program have a more clearly defined identity and set of goals.
• Reading Outcomes and Assessment-II
o English faculty are concerned about the reading skills and abilities needed across
campus and the connections between the assessments used in English courses and
their applicability to other curriculum.
o English faculty are concerned about the reading skills and abilities of ESL students
and the challenges they face in reading textbooks and other materials in content area
courses.
o English faculty are interested in exploring the possibility of providing students with
information regarding the reading challenges of reading intensive courses across
campus (i.e. information in the class schedule that would identify certain courses as
“reading intensive,” reading test score prerequisites for specific courses).
o As project coordinator, I face a significant challenge in sharing program level
standards and recommendations with an ever-changing and time-restricted group of
part-time faculty.
• Pre-English 101 Writing Portfolio Standards
o ESL and Developmental English faculty, both working toward the same goal of
preparing students for college-level English coursework, have had no formalized
interaction regarding their programs (which this project provided). Under the
circumstances, it makes little sense for both programs to proceed autonomously.
o Placement emerged as a major concern. Students can elect to take the ASSET when
the ESL placement test might be more useful. The ASSET as we apply it has no
bottom score for placement in English 080, the lowest level of Developmental
English we offer; were the same students who placed into English 080 to take the
ESL placement test instead, they might be, appropriately, placed into a lower level of
ESL. This intake problem may have a profound effect on the students’ ultimate entry
into English 101, as Developmental English teachers tend to focus more holistically
on reading and writing fluency, where a student with ESL issues would benefit more
from structured grammar study, which is generally more of a focus in ESL courses.
o There is some level of suspicion that operates between ESL and English faculty; the
level of blaming and finger-pointing that arose in the first meeting shows that many
individuals are not prepared to work as if we share common goals. This is
discouraging.
• Developmental ESL Assessment
o One issue which teachers of academic ESL face is the difficulty in assessing students
in different courses in a consistent way. The hope is that the development of standard
reading assessment tools which can be used by all teachers of the same level may help
to remedy the situation.
o Another issue, perhaps more challenging, will be to persuade teachers who did not
participate on the committee that the tests are valid and reliable and that the “passing
scores” are fair.
o Another problem in using standard assessment tools throughout the program will be
the security of the tools. The tools will become useless if teachers neglect security.
o The tools have been developed. However, the committee cannot endorse the tools or
the suggested “passing scores” until the tools have been field-tested on a larger
sample of students. That field testing is continuing during spring 2001.
o Even when the committee has field-tested the tools, there may be resistance to using
standard tests. Teachers may disagree with the content areas included in the readings,
may dispute the appropriateness of the reading level of the material, or may argue that
multiple choice tests are not valid or are harmful to students. The committee reached
consensus on these issues, but not all ESL faculty members were included on the
committee. It is not in keeping with the traditions of the ESL program or the concept
of academic freedom to require the use of any particular standard final exams.
Therefore, it may be that the tools will never be more than “recommended” for use.
The best case scenario is that the entire program enthusiastically embraces the idea of
standard assessment tools and the particular ones developed by our committee.
• English 101 Composition Study
o With 60 percent of the English instruction being delivered by part-time faculty, we
face a significant challenge in making assessment methods more consistent in the
department and in broadening the discussion to include all.
o Our department is so large and diverse that it takes a long time to steer toward change.
We cannot be compared to a business model in which a manager makes a top-down
decision and things happen right away. Therefore, change happens more slowly than
is sometimes expected.
o Writing by its nature is difficult to assess because of its complexity and because of the
variety of perspectives that exist, even in our field, about the nature and purpose of
writing and writing instruction.
o We need to develop better reading assessments for transfer composition.
o The college needs better support for basic data tasks such as scanning, acquiring basic
results from scanned forms, etc.
• Math Learning Center Evaluation
o The members of the Math Department are currently working on achieving unity as
they reconsider the Master Course Outlines in preparation for the upcoming
accreditation. We feel that we have dropped our standards over the last few years and
we want to raise them back up again. The primary problems are 1) what should be
covered in each course and 2) what we should do about checking and enforcing
prerequisites.
• College Readiness Conference
o A quite ticklish problem is how to assess student achievement of general education
outcomes in consistent ways across the curriculum.
o Another issue or concern is in the consistency of assessment of students' reading and
composition abilities in English 101 (Freshman Composition). The majority of these
classes are taught by part-time instructors; a process is needed for ensuring that all
instructors of English 101 are judging students with general consistency.
• CNC Technologies Workplace Assessment Review
o Within the duration of the survey we learned that most students felt a two-year
machinist program would benefit them. One of the main reasons is the amount of
material covered in the current 3- to 4-quarter certificate program. Students are
expected to be competent in math through basic trigonometry, when most are
only marginally proficient in, or exposed to, basic math skills, although there
are some exceptions. The same problem occurs in the computer data entry and
program downloading portion of the class. The data collected make it clear that
there is demand for a two-year program and that there are students who would
participate if the program existed. The Advisory Committee has been requesting
this program for the last few years as a way to better prepare machinists with the
practical hands-on knowledge needed to work in their shops. Mathematics, as
well as computer skills, should be taken through the Math department and related
departments, freeing the instructor's classroom time so the machinery can be
utilized for longer periods of the day, to the benefit of hands-on training. The
machinist instructors could then focus on blueprint reading, shop theory, and
programming skills. This ties in with the next point: most of our students are
unemployed and seeking retraining. This is a growing trend, and the industry is
in a lull right now due to the lack of outsourcing by the Boeing Company. There
are not many well-paying jobs for green machinists, and most employers want 5
years' experience, which is a tall order after 3 quarters of machinist training.
There is simply too much material to be covered in too short a time, with too
little practical experience.
• Assessment of Social Functioning as a Learning Outcome in Education Courses
o We faced a significant challenge in developing curricula that could educate
our students about the impact of cultural factors on the increasingly diverse
population of future students they plan to teach. Although textbooks on
becoming a teacher are plentiful, textbooks on the realities of the broad
differences among school systems are not. We established key results areas
for social functioning based on learning outcomes cited in some of the
researched literature. We were challenged to develop classroom exercises
and assignments that would require periodic reflection on how environmental
factors affect students' learning outcomes. It was often difficult for students
to understand the effects of environmental influences on learning, and that
the populations they would serve in the future were likely to be significantly
more diverse than their current educational experiences. Student attendance
and participation proved to be crucial factors in assessing this competency.
D. Budget Summary

CATEGORY                                      AMOUNT SPENT   NOTES/COMMENTS
                                              (NA = not applicable)
Salaries and benefits:
  1) assessment liaison                       (see below)
  2) institutional researcher                 $17,500        also serves as liaison
  3) clerical support                         $2,000         Training OA3 in scanner
                                                             operation and scan form
                                                             development
  4) other (please specify)                   N/A
  5) total salaries/benefits                  $19,500
Assessment project costs (faculty
stipends/reassigned time, mini-grants,
instrument costs, scoring costs, etc.)        $33,075
Professional development costs (travel,
consultants, workshops, resource
materials, etc.)                              $3,496
Support costs (supplies, printing,
postage, etc.)                                $140
Other: (optional)                             NA
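The budget figures above can be cross-checked with simple arithmetic; the following sketch uses the amounts from the summary, and the grand total is computed here rather than stated in the report.

```python
# Amounts from the budget summary, in dollars.
salaries = {
    "institutional researcher": 17_500,
    "clerical support": 2_000,
}
total_salaries = sum(salaries.values())
print(total_salaries)  # 19500, matching line 5) total salaries/benefits

other_costs = {
    "assessment projects": 33_075,
    "professional development": 3_496,
    "support": 140,
}
grand_total = total_salaries + sum(other_costs.values())
print(grand_total)  # 56211 (computed here; not stated in the report)
```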
APPENDICES
APPENDIX A
World Languages Assessment
Project Manager & e-mail address: Amelia Acosta, aacosta@ctc.edu
Project Overview:
The purpose of this project was to standardize the assessment of oral and written work among the three languages
and all levels taught by the participants. Three professors participated: Amelia Acosta (Spanish), Patricia Cooke-Tassone
(French), and Mayumi Steinmetz (Japanese). We held four meetings on campus, but also completed work on our own. To
address the assessment of written work, each participant presented how she currently assesses compositions and the
rationale behind using this method. Comments and feedback from the other two were shared and discussed. We chose one
of the presented rubrics because it is useful for all languages and the grading system is easy to implement, as well as
covering all necessary aspects of written assessment. The other two participants created versions of this rubric for their
languages and experimented with them during Spring Quarter. We have created a standardized grading system for take-home compositions that satisfies each language's needs.
Each participant also presented the rubric she uses to assess oral ability. We discovered that each of us uses
different student tasks (oral interviews, skits/oral performances, or some combination of both) to assess speaking ability
during, or at the end of, the quarter. Also, different grading methods are used (ranging from global grading practices to very
specific scoring in various categories). These different systems, however, result in basically the same grade distributions;
the same number of students receive A’s, etc. with each method. We did discover, however, that across different languages
our students do not have the same oral abilities after finishing the same level. For example, after the first year series of three
courses, Spanish and French students are expected to carry on relatively complex conversations using most existing verb
forms. Students of Japanese will not have oral abilities quite so advanced after the first year, since much time is spent on
learning written Japanese symbols (kanji). As there does not appear to be as much grading disparity between languages in
oral assessment as there is in written assessment, as the participants have such varied preferences for oral tasks (interview
vs. performance), and as students across languages will not have the same speaking abilities, we decided not to create one
standardized rubric to be used in all language courses. Each language program will create its own rubrics. Making this
decision was very difficult for us, since we began the project assuming that a standardized oral rubric would be the result.
Materials:
Included with this report are samples of the written assessment rubric from each participating language. While
very similar, each is slightly different in the specific grammar and vocabulary emphasized. All are presented in a chart
format. The left column lists the particular items being graded. The other columns describe what qualifications a particular
composition must meet to receive the score at the top of that column. The Japanese, Spanish and French rubrics are
included in this document.
Sample Japanese Rubric
J211, Chapter 1 Writing Assessment Rubric

Content
すばらしい!(9-10): Provides accurate, relevant information with details. Detailed information includes the description of hometown and its climate, advice about clothing, and activities that can be done.
いいですね。(7-8): Provides accurate and/or relevant information with some details. Some information regarding the description of hometown and its climate, advice about clothing, and/or activities may be missing or not detailed.
もうちょっと(5-6): Information may not be accurate and/or relevant. Information may not include the description of hometown and its climate, advice about clothing, and/or activities that can be done.

Organization
すばらしい!(9-10): Organized, definite transition in topics, making the reading flow smoothly.
いいですね。(7-8): Fairly organized, good transition in topics.
もうちょっと(5-6): Nothing in order, appears thrown together, no transition.

Language control (Ch. 1 sentence structures: ーている、く/になる/する、そうです/かもしれない/でしょう/かな/かしら、ので)
すばらしい!(9-10): Express information clearly and precisely using the targeted structures. Correct use of particles, tense, and conjugation.
いいですね。(7-8): Express information clearly using the targeted structures. Some errors in particles, tense, and/or conjugation.
もうちょっと(5-6): Express information but lacks fluency. Errors distract from comprehension.

Vocabulary from Chapter
すばらしい!(9-10): Has vocabulary to accomplish all task requirements.
いいですね。(7-8): Has vocabulary to accomplish the majority of task requirements.
もうちょっと(5-6): Has some but does not have vocabulary to accomplish several task requirements.

Characters (Kanji, Katakana, & Hiragana)
すばらしい!(9-10): Used learned kanji, katakana & hiragana correctly throughout the writing. (More than 90% correctness)
いいですね。(7-8): Used the majority of learned kanji, katakana & hiragana with some errors. (More than 70% correctness)
もうちょっと(5-6): Used some learned kanji, katakana & hiragana with some errors. Not consistent throughout the writing. (Less than 70% correctness)
Sample Spanish Rubric
Chapter 3, Self-description

Content
100-90%: Provides accurate, relevant information with details. Detailed description includes: 1) identifying info (e.g. name, nationality, etc.), 2) descriptives, 3) likes/dislikes, 4) daily activities.
89-70%: Provides accurate, relevant information with some details. Some details regarding: 1) identifying info (e.g. name, nationality, etc.), 2) descriptives, 3) likes/dislikes, 4) daily activities.
69-50%: Information may not be accurate or relevant. Composition may not include: 1) identifying info (e.g. name, nationality, etc.), 2) descriptives, 3) likes/dislikes, 4) daily activities.

Organization
100-90%: Well organized, definite transition in topics, making the reading flow smoothly. Appropriate introduction and conclusion.
89-70%: Fairly organized, good transition in topics. Introduction and conclusion may not flow or may be insufficient.
69-50%: Appears thrown together, no transitions. No introduction and/or conclusion.

Language Control
100-90%: Expresses information clearly and precisely using the targeted structures. Correct spelling (including accents/tildes) and usage of the following grammatical structures: 1) adjectives, 2) numbers, 3) present tense, 4) articles, 5) ser and estar.
89-70%: Expresses information using the targeted structures. Some errors in spelling (including accents/tildes) and usage of the following grammatical structures: 1) adjectives, 2) numbers, 3) present tense, 4) articles, 5) ser and estar.
69-50%: Expresses information, but lacks fluency. Errors in spelling (including accents/tildes) and usage of the following grammatical structures distract from, or impede, comprehension: 1) adjectives, 2) numbers, 3) present tense, 4) articles, 5) ser and estar.

Vocabulary: prelim. - ch. 3
100-90%: Demonstrates mastery of the vocabulary needed to accomplish required tasks.
89-70%: Demonstrates mastery of the majority of vocabulary needed to accomplish required tasks.
69-50%: Does not demonstrate mastery of vocabulary needed to accomplish several tasks.

Creativity, Risk-taking
100-90%: Very creative, takes many grammatical or structural risks.
89-70%: Creative, takes some grammatical or structural risks.
69-50%: Not creative, takes no risks.
Sample French Rubric

NOTE: Rough draft and final copy must be typed and double-spaced. There is no minimum length, although your essay must be long enough to fulfill all requirements. Please circle the 150th word. Rough draft will be due on Tuesday, Feb. 6; final copy will be due 2 days after you receive your rough draft with feedback from me.

Chapitre 2 Writing Assessment Rubric
Subject (choose 1):
a. Autoportrait (Write about yourself.)
b. Une personne importante (Write about someone else.)

Content
Highest: Provides accurate, relevant information with details. Detailed description of person includes: identifying info (e.g. name, nationality, marital status, domicile, etc.), descriptive info (short, tall, etc.), likes and dislikes, and daily activities.
Middle: Provides accurate and/or relevant information with some details. Some details regarding identifying info (e.g. name, nationality, marital status, domicile, etc.), descriptive info (short, tall, etc.), likes and dislikes, and daily activities may be missing or insufficiently detailed.
Lowest: Information may not be accurate and/or relevant. Composition may not include identifying info (e.g. name, nationality, marital status, domicile, etc.), descriptive info (short, tall, etc.), likes and dislikes, or daily activities.

Organization
Highest: Well organized, definite transition in topics, making the reading flow smoothly. Appropriate introduction and conclusion.
Middle: Fairly organized, good transition in topics. Introduction and conclusion may not flow or may be insufficient.
Lowest: Appears thrown together, no transition. No introduction and/or conclusion.

Language Control (Ch. 2 structures: adjectives, definite articles, -er verbs, être, adverbs (e.g. bien, assez, un peu, etc.))
Highest: Expresses information clearly and precisely using the targeted structures. Correct use of verb conjugation, agreement, spelling (including accents), etc. (EXTRA: various types of yes/no questions, if you can incorporate them into your composition.)
Middle: Expresses information clearly using the targeted structures. Some errors in verb conjugation, agreement, spelling (including accents), etc.
Lowest: Expresses information but lacks fluency. Errors distract from comprehension.

Vocabulary from chapters Prél.-2
Highest: Has vocabulary to accomplish all task requirements.
Middle: Has vocabulary to accomplish the majority of task requirements.
Lowest: Has some but does not include and/or correctly use the vocabulary needed to accomplish several task requirements.

Other: Picture
Highest: Includes relevant photo and/or detailed drawing to illustrate composition.
Middle: Includes simple drawing to illustrate composition.
Lowest: No illustration is included.
<<Return>>
APPENDIX B
Project Title: Portfolio assessment of outcomes in Visual Art and Design
Project Manager: Bruce Amstutz bamstutz@ctc.edu
Part 1
Graphic Design Sequence:
All four members of the graphic design faculty (Mike Larson, Beth Halfacre, Sigrid Cannon and Bob Hutchinson) met for a
total of 15 hours each to discuss general program issues including grading policies, exit portfolio assessment requirements,
curriculum consistency, overlapping concept areas etc. The main focus of the group was on portfolio reviews–their function
and structure.
The graphic design faculty rarely meets more than once a year. These meetings were a great opportunity to discuss and
update the design curriculum. We addressed a variety of items, including integration of design theory with computer skills (how much and when); students' needs and expectations; the state of the graphic design industry and how well our students are meeting its entry-level requirements; and design programs at other institutions.
We had much success in evaluating the six-quarter design series with the intent of rewriting the master course outlines. Our
main focus of attention, however, was on the development of a new first-year portfolio review. We decided that the first
review should concentrate on concepts rather than polished solutions. We agreed, wholeheartedly, that students weren’t
exploring visual ideas in enough depth to go beyond the predictable and the hackneyed. The group felt that the quality of
finished work would be much improved if students knew that their concept sketchbooks and research materials were going
to be used as review materials in the portfolios.
We agreed that implementing the review process and enforcing the resulting recommendations would
be challenging. We need to establish a review structure that includes options for those students whose
concept design work is not meeting our standards.
3D Art and Design Sequences:
Professor Mike Larson, in consultation with other associate faculty instructors teaching in 3D areas, spent 45 hours to
review the master course outlines for courses in the 3D Art and Design sequences and to rethink the relationships among the class
activities, learning outcomes and assessment activities in these courses.
The specific purpose of the project was to establish a common tool for evaluating student performance course by course for
3-D work. The content and outcomes for each course were defined, a consistent form of portfolio assessment was developed
for each course, and consensus was established from faculty responsible for teaching in the 3-D sequence of courses.
Particular challenges included problems of storage when actual 3-D projects are presented for portfolio assessment and the
difficulty of assessing 3-D work when the portfolio consists of 2-D representations in the form of slides, prints or electronic
media.
Photography Sequence:
Originally associate faculty photography instructor Don Metke was to work together with professor
Chris Simons to review and update the outcomes in the photography sequence of courses and to
develop portfolio assessment tools for them. Due to a family emergency Don was not able to
participate and Chris Simons invested 20 hours to complete this project himself.
As class size has increased in the photography program, communication and feedback between instructor and students have diminished. The project focused in large measure on identifying appropriate critique methodologies to assist critical
dialogue in the class. In order to assess levels of understanding and learning of one hundred and sixty students, a
multilayered written and oral critique package was developed for the photography courses.
The development of a language and process of assessment of student work, and of photographic images in general, assisted
both in the communication and dialogue in the class and in the consolidation of this language into a hardcopy assessment
tool. The assessment of student work is clarified, and students learn to better assess each other’s work and photographic
images in general.
The challenge faced by both the instructor and the students was to identify and utilize appropriate verbal language to assess
photographic images, as the written or spoken word itself is limited when applied to the visual language of the image. The
development of a list of terms and concepts useful for discussing visual composition and imagery provides a somewhat
unstable and incomplete but nevertheless usable vehicle for describing the encounter with the visual.
Part 2
Summary of documents
Hardcopy documents are available for review from the project coordinator, Bruce Amstutz
bamstutz@ctc.edu
Graphic Design Sequence:
 Summary of first year and graduation portfolio requirements
 Portfolio assessment tool – first year
 Portfolio assessment tool – graduation
 Summary of outcomes, competencies and activities in graphic design curriculum
 Supplementary material detailing outcomes, competencies and activities in graphic design curriculum
3-D Art and Design Sequences:
 Course description, concept development and practicum sheets for Art 110 (3-D design); Art 253, 254 and 255 (3-D design and materials sequence); and Art 272, 273 and 274 (sculpture sequence)
 Portfolio assessment tools for Art 110, 272, 273, 274, 253, 254 and 255.
Photography Sequences:
 Photography portfolio assessment tool
 Oral critique assessment tool
 Peer assessment tool
 Example of student self-assessment
 Composition vocabulary for discussion of visual images
<<Return>>
APPENDIX C
Project Title: Student Retention/Completion and Student Interest
Project Managers: Ken Campbell, Mark Hankins; kcampbel@ctc.edu, mhankins@ctc.edu
Overview:
The purpose of this project was to create a database of student information for the automotive training department at
Shoreline Community College. This information will be used for improved recruitment, advising and retention of students
who enter the automotive programs.
The project began with research into an appropriate interest inventory test for current and incoming students. It was
determined that the Kuder Career Assessment test would be a valid and appropriate measure of student interest.
One hundred tests were purchased, and currently enrolled students were encouraged to participate. Since the Kuder is available online, it was ultimately up to each student to complete the test. All students were given an access code and password; 24 of 55 students actually completed the test for our database. For a more consistent database, we are considering requiring all incoming students to take the test as part of their application procedure.
Two automotive faculty members worked with the close assistance of Technical Support Services to develop a Microsoft
Access database. After the database was built, the two faculty members input student information from college transcripts
and the Kuder Career Assessment website.
Learning how to use Microsoft Access turned out to be the most time-consuming challenge of this project. While there is much more to learn about database development, the current version has already proven even more useful than originally anticipated.
Our initial purpose in establishing this database was to see if students who remain with their sponsoring dealership tend to
have a high “mechanical interest” rating on the Kuder. However, we also found the database to be a valuable tool for
assessing where students come from and where they go after they graduate from our program.
We also use the database to track each student’s completion of the academic requirements for the Associate of Applied Arts
and Sciences Degree.
It is assumed that a student who scores a high mechanical interest will be more likely to remain in the automotive repair industry than a student with a lower mechanical interest score. We also assumed that students applying to an automotive repair program would have a very high mechanical interest. However, we found that only 58% of the students taking the Kuder test had "mechanical" scored as a top first or second priority interest.
Since we can only create the database from current students, it will take some years to determine if this 58% have a higher
likelihood of being retained within the automotive repair industry. However, there is one interesting correlation:
approximately 60% of the students who have completed the Honda PACT program over the last ten years at Shoreline are
still working in the industry. The question arises, if given the opportunity, would these students have scored a high degree of
mechanical interest?
As noted above, the database is proving to be a powerful tool in ways other than tracking student interest scores. We will
utilize the database to manage these other kinds of information:
 Student demographic information
 General education course completion
 Prior experience and referral information
 Post-graduate information
There is still much more to learn in terms of database construction and this will be an ongoing project. The information
compiled through this assessment project is an excellent beginning to a powerful management tool that faculty and program
directors will use to more effectively manage the recruitment and advising of students who are interested in the automotive
repair industry. Through this effective management, we will also better serve industry by providing graduates who are more
informed about their chosen career and who have a greater likelihood of long-term commitment to the industry.
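The report describes the database only in general terms. As a rough illustration of how the kinds of information listed above could be organized, here is a minimal sketch in Python's sqlite3 rather than Microsoft Access; all table and column names are hypothetical, not taken from the actual database.

```python
import sqlite3

# Hypothetical sketch of a student-tracking schema like the one described
# above, built in SQLite. Names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE students (
    student_id  INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    hometown    TEXT,               -- demographic information
    referral    TEXT,               -- prior experience / referral information
    gen_ed_done INTEGER DEFAULT 0,  -- general education requirements met?
    post_grad   TEXT                -- post-graduate placement, if known
);
CREATE TABLE kuder_results (
    student_id    INTEGER REFERENCES students(student_id),
    interest_area TEXT,             -- e.g. 'mechanical', 'artistic'
    priority      INTEGER           -- 1 = strongest interest
);
""")

cur.execute("INSERT INTO students (student_id, name) VALUES (1, 'A. Student')")
cur.execute("INSERT INTO students (student_id, name) VALUES (2, 'B. Student')")
cur.executemany(
    "INSERT INTO kuder_results VALUES (?, ?, ?)",
    [(1, "mechanical", 1), (1, "scientific", 2),
     (2, "artistic", 1), (2, "mechanical", 3)],
)

# The report's key question: which students ranked 'mechanical' as a
# first- or second-priority interest?
cur.execute("""
    SELECT s.name
    FROM students s
    JOIN kuder_results k ON k.student_id = s.student_id
    WHERE k.interest_area = 'mechanical' AND k.priority <= 2
""")
mechanical_students = [row[0] for row in cur.fetchall()]
print(mechanical_students)
```

In a layout like this, the retention question becomes a simple join between the student table and the interest table, and the other tracked items (general education completion, referrals, post-graduate information) are ordinary columns that can be filtered the same way.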
Materials:
Attached to this report is the database created in Microsoft Access. This database can also be accessed on the Rizzo ‘S’
drive through the Shoreline Community College network.
<<Return>>
APPENDIX D
English Department Retreat
Project Manager & e-mail address:
Du-Valle Daniel, ddaniel@ctc.edu
Neal Vasishth, nvasisht@ctc.edu
Purpose: This project served as an opportunity for the English Department at Shoreline CC to define and focus its literature and creative writing offerings. Our faculty engaged in a constructive and creative dialogue to meet the following objectives:
1. Define ourselves as a department, by agreeing to common goals and values.
2. Identify areas of literature and creative writing that we believe should be offered, but are not
currently being offered.
3. Identify areas of literature and creative writing that are being offered but
we no longer believe fit into our overall vision for our department.
4. Assess the strengths and weaknesses of our present full-time faculty in terms of educational background and interests, in regard to teaching literature.
5. Create a document which identifies shared outcomes, to be used as guidelines for MCO
preparations.
6. Determine what courses we want to teach, how we wish to teach them
(i.e. by periods, or thematically, etc.), and how often each course should
be offered within a one or two year period.
7. Clarify the criteria for teaching assignments for literature classes.
All eleven members of the English full-time faculty attended the retreat. Each participant received a
copy of a detailed agenda which is attached in our report. The agenda lists and explains the goals of
each session. In some sessions, we worked in small groups, and in others we worked as a whole.
Although we didn’t accomplish everything, the retreat was very successful. Most of the goals, at least
in draft form, were accomplished. The main challenge was simply that we had many participants with
many opinions and perspectives. Making decisions about changes in the department and making sure
everyone’s voice was heard was not easy, but working through that proved to be very beneficial. One
of the greatest successes of the retreat was the dialogue that we as a department engaged in. As a result
of that dialogue, we produced a shared value statement, decided to re-orient our curriculum so that our course offerings reflect a more global world-view, and drafted outcomes that would apply to all literature courses.
Appendix 2a
SHORELINE COMMUNITY COLLEGE
ENGLISH DEPARTMENT’S
LITERATURE RETREAT
MAY 18 & 19, 2001
Participants: Paul Cerda, Du-Valle Daniel, Pam Dusenberry, Shalin Hai-Jew, Ed Harkness, Dutch Henry, Kathie Hunt, Pearl Klein, Gary Parks, Grace Rhodes, Sean Rody, Neal Vasishth

Location: Trinity Lutheran College, 4221 228th AVE SE, Issaquah, WA 98029, 425-961-5572
Emergency Messages: 425-392-0400

Agenda:
Friday, May 18, 2001
Check-in Time
12:00-2:00
Session #1
Defining Ourselves
2:00-5:00 p.m.
In this session, we will look at our present offerings, enrollment, past and present, and at area
colleges’ and universities’ expectations of our transfer students. We will then explore how we want to
design ourselves as an English department, within the parameters of our academic requirements.
Dinner/Cafeteria Style
5:45
Session #2
Assessing Ourselves
7:00-9:00
In this session, we will just get to know each other better. We will talk about our backgrounds,
previous teaching experiences, interests, etc. We will discuss different approaches to analyzing
literature, using a common piece of literature as a basis for our discussion. This session will have a
casual format, with snacks, and can lead directly into a social time. All participants are asked to
bring a snack to share (optional).
Saturday, May 19, 2001
Breakfast/ Continental Breakfast
8:00
Session #3
Revising the MCO’s
9:00-12:00
In this session, we will discuss MCO changes, in relationship to the new General Education Outcomes
and based on our discussions in sessions 1 and 2. We will attempt to create a model MCO that can be used, with a little modification, for future MCO revisions.
Lunch/Cafeteria
12:15
Session #4
Discussing Department Interests and Concerns
1:00-3:00
Several issues that affect our English Department will be discussed in this session. We will begin with
issues specifically related to literature and creative writing courses, but we may eventually move into
more general areas of interest. Issues such as workload assignments, summer quarter loads,
associate faculty teaching loads, creating an annual schedule, etc., will be discussed. For this session,
if you have specific concerns that you would like to bring up, please feel free to do so at the
appropriate time.
Special Note:
Meals and lodging are already paid for; there should be no out-of-pocket costs. All participants will be paid $35.00 per hour, for up to 10 hours, for participating in the retreat.
A map and directions are attached to this agenda. If you have any questions, please call Du-Valle or Neal.
Appendix 2b
Value Statement
As an English department, we have agreed upon the following values, which will guide us in establishing our individual as well as departmental goals.
1. We believe that Literature is a unique and valuable discipline that enriches our minds as well as our hearts.
2. It is important that we, as a department, ensure that our offerings reflect a diverse group of authors, representative of the many different cultures of the world in which we live.
3. We intend to include authors that have been traditionally included in the literary "canon," as well as authors that have been excluded from the "canon," and in doing so, we wish to help students learn to value and appreciate all these authors as valuable contributors to our literary canon.
4. We believe that authors should be presented to students within their cultural, historical and literary context.
5. We value our students learning how to analyze, evaluate and critique literature using multiple approaches, including, but not limited to, the formalist approach, the new criticism approach, and the psychological, biographical, historical and economic approaches.
6. It is our desire that students succeed in both the academic world and the world outside of academia. With this in mind, we wish to encourage our students to be curious, to ask questions and to be bold enough to offer answers.
7. We believe that our students should not only read and analyze literature, but that they should experience it and allow themselves to be transformed by it. Students should actively engage in creating literature, and their imagination should be stimulated and developed by the study of literature.
8. We value the quality of the members of our faculty, and the trust that we have for one another, which allows us to balance our academic freedom with our commitment to upholding our shared values.
Appendix 2c
LIST OF CLASSES TO BE CONSIDERED FOR THE FUTURE:
1. Contemporary Women's Literature
2. World Mythology
3. Spoken Word Class
4. Best Sellers
5. Caribbean Literature
6. Regional Latin American Literature
7. Colonial Literature
8. African-American Renaissance
9. African, African-American and Caribbean Literature
10. Goddesses
11. Pacific Islander Literature
12. Minority Literature
13. Contemporary Poetry
14. Feminist Literature
15. Contemporary Fiction
16. Science Fiction
17. Asian-American Literature
18. Chinese-American Literature
19. Children's Literature
Appendix 2d
Literature Outcomes Brainstorm (May 19, 2001)
COURSE OUTCOMES AND COMPETENCIES
COMMUNICATION:
 Evaluate their own writing as demonstrated through revision.
 Explicate literary works in writing.
 Make connections between texts (as demonstrated through discussion and writing).
 Show deep understanding of text (as demonstrated through discussion and writing).
 Summarize.
 Summarize the plot or basic situation/scenario of a work of literature. Identify main characters and events.
 Use a general understanding of social, political and economic forces to explain/analyze literary works.
 Analyze literature using historical context.
 Write expository papers analyzing works of literature using appropriate conventions. Write a logical and sound analytical paper on a literary work while applying Toulmin logic.
 Explain literary devices and apply literary elements.
 Use the elements of literature to analyze stories, poems and plays.
 Talk in discussions about literature.
 Orally present an analysis of a piece of literature.
 Present findings about a piece of literature in a speech using various supplemental materials.
MULTICULTURAL UNDERSTANDING:
(These were moved to "Global Awareness.")
INFORMATION LITERACY:
 Use computers for writing and research.
 Differentiate between subjectivity and objectivity.
 Research appropriately to bolster the understanding of a text.
GENERAL INTELLECTUAL ABILITIES:
 Identify characteristics of genres of literature.
 Show understanding of the literary portrayal of the basic human condition, that of loneliness, as demonstrated through discussion and writing.
 Describe genres of literature.
 Identify specific literary devices and analyze how an author uses them in a text.
 Develop a thesis/interpretation of a literary text and support it, based on specific evidence from the text.
 Analyze literature using a variety of critical perspectives.
 Use theoretical approaches to the analysis of literature.
 Develop and articulate an individualized criterion for evaluating the relative worth of a piece of literature, incorporating (but not limited to) values and criteria discussed in the course.
 Extrapolate and analyze the subtext from a piece of literature.
 Gain universal and individual meaning from international literature.
 Identify the various genres of literature and the elements that are particular to each.
 Identify the formal conventions of a piece of literature and articulate how these conventions help create meaning.
 Identify and articulate specific authors, literary movements, literary traditions and genres of literature.
 Write formal (analysis) or informal (reflective journal) responses.
 Apply a variety of literary approaches to the analysis of literature.
 Discuss the societal, political, cultural and historical variables that influence the making of the traditional canon.
 Analyze and articulate in writing and discussion the literary elements (e.g. theme, setting, point of view, character, and others).
 Evaluate pieces of literature based upon agreed-upon standards of quality.
 Read new literary works with an open mind.
 Demonstrate understanding of several literary theories through discussions and writing.
 Place literary texts in cultural/historical context.
 Explain in writing the function of literary study, for example whether literature has inherent artistic value.
 Articulate in writing or discussion the function of literature or the value of literary study.
 Students will compare their knowledge of different authors through writing and discussion.
 Appreciate the value of literature as a repository of values, culture and history.
 Compare and contrast texts, styles and literary devices from one text with those from another, from one author with another, and from one culture with another.
 Apply literature to individual lives and experiences, and vice versa.
 Show understanding of the political uses of literature as demonstrated through discussions and writing.
 Articulate their own meanings of the world in which they live.
GLOBAL AWARENESS:
(Initially “Multicultural Awareness”)
 Articulate points of view about a text different from one's own.
 Apply new strategies/approaches to coping with life which have been learned from literature.
 Identify the context and history of a literary work and their influences on it.
 Identify cultural biases of writers.
 Identify cultural biases of readers in a time period.
 Articulate views on the literary canon regarding the values of traditionally taught works versus ignored, under-represented or marginalized texts and writers.
 Apply gender and cultural concepts in real-world and literary situations.
 Students will display their appreciation for the literary productions of the world through articulation in discussion and writing.
 Students will apply literary works to their lives and the diverse situations they constantly face.
 Go see a play or poetry/fiction reading.
 Extrapolate the religious and faith underpinnings in literary works.
 Articulate the relationship between a piece of literature and its historical, cultural and literary context.
 Describe the ways that social context is reflected in literature.
 Decode texts for cultural messages and meanings.
 Understand diverse cultural experiences through stories.
 Explain in writing and discussion a multiplicity of perhaps opposing literary interpretations.
 Compare elements in one or more literary works (e.g. theme, setting, character, but also cultural or historical contexts).
 Students will utilize their understanding of the cultural, national and philosophical contexts in which literary works are created.
 Show understanding of connections between literature and imagined community identities (communities created in literary works).
SPECIFIC COURSE OUTCOMES:
 Students will be inspired by the literary works to create their own works.
 Expose students to new authors.
 Show a personal connection to literature through discussion and writing.
 Students will articulate their understanding of forms, themes and genres in writing and discussion.
 Make connections between literature and one's own life, culture or social conditions.
 Evaluate own approaches to coping with life based on other models presented in literature.
 Evaluate their preconceptions as demonstrated through discussion.
 Identify emotion associated with particular works of literature.
F. Appendix 2e
DRAFT OF GENERAL EDUCATION REQUIREMENTS OUTCOMES
COMMUNICATION
Students will read, write, speak in, and listen to college-level English. Effective communication incorporates
awareness of the social nature of communication and the effects of ethnicity, age, culture, gender, sexual orientation
and ability on sending and receiving oral, non-verbal, and written messages.
I. Listen to, understand, evaluate and respond to verbal and non-verbal messages.
II. Comprehend, analyze and evaluate information in a given text (such as a story, essay, poem, textbook, etc.).
III. Formulate and verbally express focused, coherent, and organized information, ideas, and opinions, with style and content appropriate for the purpose and audience, in a variety of communication contexts, such as one-on-one situations, small groups and classes.
IV. Formulate and express information, ideas and opinions in mechanically sound written forms that have a clear purpose, focus, thesis and organization; that are appropriate for their audience in content and style; and that support, clarify, and expand complex ideas with relevant details, examples and arguments.
V. Use supplemental materials (visual, auditory, etc.) to support verbal or written communication; comprehend and evaluate visual messages such as pictures, graphs, and other printed or electronic material.
VI. Assess themselves as communicators, based on the standards of clear and effective communication expressed or implied above, and make adjustments and improvements in their communication strategies.
MULTICULTURAL UNDERSTANDING
Students will demonstrate understanding of issues related to race, social class, gender, sexual orientation, disabilities
and culture and the role these issues play in the distribution of power and privilege in the United States.
I. Demonstrate awareness and knowledge of contemporary culture in the context of comparative US history as it relates to race, social class, gender, sexual orientation, disabilities and culture.
A. Compare histories of diverse peoples in the US.
B. Articulate concepts of culture and cultural identity.
II. Using awareness and knowledge about multiculturalism and various groups in the United States, identify issues of power and privilege that exist in all interactions.
A. Students will describe personal and institutional biases, emotional responses, behaviors, practices and language that impact individuals and groups.
B. Students will describe specific benefits and costs to individuals and groups directly related to race, social class, gender, sexual orientation, disability and culture.
III. Function effectively in multicultural settings.
A. Use appropriate communication strategies to work through differences.
B. Make judgments and decisions by considering as many points of view as possible.
C. Recognize individual power and privilege.
IV. Demonstrate awareness, knowledge, and skills in creating greater equity and social justice.
A. Identify the benefits of multicultural understanding for personal and global survival and growth.
B. Adapt constructively to situations in which race, social class, gender, sexual orientation, disability and culture affect people's experiences.
INFORMATION LITERACY
Students will access, use and evaluate information in a variety of formats, keeping in mind social, legal and ethical
issues surrounding information access in today's society.
I. Define and articulate a need for information.
A. Identify an information need.
B. Formulate questions based on the information need.
C. Identify key concepts and terms that describe the information need.
II. Locate, access and use information from a variety of sources.
A. Identify existing and emerging information resources.
B. Select and use the most appropriate tools and strategies for accessing needed information.
C. Use information to accomplish a specific purpose.
D. Apply information retrieval and selection skills and concepts to emerging technologies.
III. Identify the basic principles of how information is produced, stored, organized, transmitted and accessed.
A. Identify basic concepts of information organization, in online, print and other formats.
B. Identify the basic structural features of an information system and how they are used.
C. Evaluate the effect of emerging technologies on information production, storage, organization, transmission and access.
IV. Critically evaluate information and its sources.
A. Extract relevant information from a source.
B. Evaluate online and print sources for objectivity, authority, accuracy, and currency.
V. Use information, considering the economic, legal, ethical and social issues surrounding its access and use.
A. Identify the role of information in personal, professional and academic areas.
B. Discuss the changing nature and role of information and information access and privacy issues in society.
C. Use information ethically and legally, considering issues such as plagiarism and copyright.
GENERAL INTELLECTUAL ABILITIES
Students will think critically within a discipline, identify connections and relationships among disciplines, and use an
integrated approach to analyze new situations.
I. Think critically within a discipline:
A. Identify and express concepts, terms and facts related to a specific discipline.
B. Recognize how the values and biases in different disciplines can affect the ways in which data is analyzed.
C. Discuss issues and questions within a discipline.
D. Identify, interpret and evaluate pertinent data and previous experience to reach conclusions.
E. Evaluate decisions by analyzing outcomes and the impact of actions.
II. Identify connections and relationships among disciplines:
A. Identify similarities and differences in the ways in which data is collected and analyzed in different disciplines.
B. Discuss consequences of expressed and tacit assumptions in specific disciplines.
C. Collect and analyze relevant data for a specific issue.
D. Expand perspectives by identifying and evaluating connections and relationships among disciplines.
III. Use an integrated approach to problem solving in new and potentially ambiguous situations.
A. Identify information, skills, experience and abilities acquired in various academic and professional fields to facilitate problem solving in new circumstances.
B. Describe how one's own preconceptions, biases and values affect one's response to new and ambiguous situations.
C. Use various strategies of inquiry to reach conclusions in new and ambiguous situations.
D. Recognize, acknowledge and accept ambiguity.
GLOBAL AWARENESS
Students will demonstrate understanding and awareness of issues related to, and consequences of, the growing
global interdependence of diverse societies by integrating knowledge from multiple disciplines. Students will
describe how social, cultural, political, and economic values and norms interact.
I. Recognize the value and significance of artistic and religious expressions in various world cultures.
II. Articulate the values and beliefs that influence humans in seeking identity and meaning within their culture.
III. Describe the impact of global interdependence on local cultures including those within the United States.
IV. Identify the origin of events that have led to contemporary global conflict, competition, and cooperation.
V. Demonstrate awareness and knowledge of the economic forces that have led to the interdependence of national economies and the imbalance of distribution of wealth.
VI. Demonstrate knowledge of the impact of global interdependence on the natural world.
APPENDIX E
Project Title: Reading Outcomes and Assessment in Developmental English Courses, Part II
Project Manager & e-mail address: Dutch Henry, dhenry@ctc.edu
This project gathered approximately 15 faculty over six months to discuss, examine, and develop reading assessments,
sample student work, and a reading rubric for use in English 080, 090, and 100. We started by discussing the reading
activities and assessments we currently use in our classes. We developed a list of sample classroom activities and
assessments and identified the most common and most important reading skills students need to succeed in college-level
courses. We also discussed the types of texts used and sample texts that have succeeded with students.
Next, we gathered assessments and reading materials from faculty and met to discuss the outcomes measured by the
assessments and the criteria used to evaluate student performance. We met for more than six hours over one month to
examine assessments. This was more time-consuming than originally anticipated, but it proved to be the most important
part of the project for the faculty involved. Sharing and discussing assignments, and identifying exactly what is assessed
by particular assignments, is extremely valuable for instructors.
Sample student work sets were collected, reproduced, and distributed to participating faculty. Faculty read the assignments,
the readings, and the student work, and we then met to discuss student performance and whether it would meet exit-level
standards at the conclusion of English 080, 090, and 100. Examining student work is also extremely time-consuming, but it
is at the level of student work that faculty can concretely discuss and exemplify their expectations for passing student
performance in each of our Developmental English courses. Looking at a sample student essay, for example, and discussing
what it represents in terms of the reading skills and abilities students need to pass a course is extremely valuable for
individual instructors and for our program overall.
Finally, we used the assessments and sample student work to develop a Developmental English Reading Rubric for use in
Developmental English courses. The Rubric is still in the draft stage and needs further review and refinement, but it
represents an important beginning. It is attached in electronic form.
Project Materials
The vast majority of materials gathered and used in this project exist only in hard copy. Assessments, criteria sheets,
sample student work, and student work evaluation forms were submitted and completed on paper and are included
in the Developmental English Handbook (available for examination from the project coordinator).
Three documents are included below: the Developmental English Reading Rubric, the How We Teach Reading Brainstorm,
and the Student Work Evaluation Form.
Developmental English Reading Rubric

Criterion: Uses varied and appropriate skills, strategies, and processes to understand what is read
Does not meet standards:
 Begins to understand the components of the reading process
 Formulates basic, preliminary questions relating to reading
 Uses inappropriate strategies for understanding main idea, inferences, and vocabulary
Meets standards:
 Demonstrates an adequate understanding of the components of the reading process
 Formulates adequate questions relating to reading
 Demonstrates appropriate strategies for understanding main idea, inferences, and vocabulary
Exceeds standards:
 Demonstrates a … understanding of the … reading process
 Formulates … questions relating to reading
 Consistently … appropriate strategies for understanding main idea, inferences, and vocabulary

Criterion: …
Does not meet standards:
 Inadequately organizes, outlines, and evaluates what is read
 Cannot distinguish between major and minor ideas
 Inaccurately paraphrases what is read
Meets standards:
 Adequately organizes, outlines, and evaluates what is read
 Makes adequate distinctions between major and minor details
 Adequately paraphrases what is read
Exceeds standards:
 Consistently and accurately … what is read
 Makes insightful distinctions between major and minor details
 Completely … paraphrases what is read

Criterion: Reads and understands a variety of college entry-level texts
Does not meet standards:
 Inadequately comprehends texts
 Incorrectly or insufficiently analyzes texts
 Ineffectively synthesizes texts
Meets standards:
 Adequately comprehends texts
 Correctly analyzes texts
 Effectively synthesizes texts
Exceeds standards:
 Thoroughly comprehends texts
 Insightfully analyzes texts
 Creatively synthesizes texts

Criterion: Uses reading as a learning and research tool
Does not meet standards:
 Unable to identify and locate appropriate resources
 Cannot use reading as a tool for understanding resources
Meets standards:
 Locates and adequately evaluates resources
 Uses reading as a tool for understanding resources
Exceeds standards:
 Locates, evaluates, and effectively … resources
 Consistently … tool for understanding resources
Reading Outcomes Meeting
How do we teach reading?
 Assign readings
 How to annotate:
o Adler: reading on marking pages
o Read & mark it: photocopy readings and hand them in
o Copy a page on an overhead and annotate together
o Groups: Bullet a paragraph, marginal notes
o Visual cues: how to break a reading
 Summarize and critique/react and synthesize
 Identify facts and ideas
 Write down vocabulary: must gain 50 words by the end of the quarter
 Reading images and slogans for implied arguments
 Separate words and images
 Kinds of texts: factual texts, non-fiction texts, variety of texts, analyze longer pieces (10-20 pages)
 Author vs. persona in fiction
 Poetry: concrete images, emotion, observation, visual, metaphor, read critically, explore and expand
 Other kinds of writing: technical writing
 Process: before/during/after, model, write notes before you read, problem solving while you’re reading;
metacognitive aspect: problems such as boredom (there is no inherent boredom/interest; the reader creates it)
 Attitude: anti-reading
 Synthesize: discuss readings together, reading grid, same issue in 3 readings, then connect
 Background knowledge: share the experience of developing as a reader, connecting to their experience, finding a
link to something they’ve already done, brief discussion/lecture before all readings
 Analyze: pull elements out
Student Work Evaluation Form
Assignment Title_________________________________________________________

Outcomes Assessed (note briefly from MCOs using outline numbers):
080:_______________________________________________________________________________________________
__________________________________________________________________________________________________
___________________
090:_______________________________________________________________________________________________
__________________________________________________________________________________________________
___________________
100:_______________________________________________________________________________________________
__________________________________________________________________________________________________
___________________
Sample # 1
Exit Standards:
080:   Does not meet   Meets   Exceeds
090:   Does not meet   Meets   Exceeds
100:   Does not meet   Meets   Exceeds
Comments:_________________________________________________________________________________________
__________________________________________________________________________________________________
___________________________________________________________________________________________
Sample # 2
080:   Does not meet   Meets   Exceeds
090:   Does not meet   Meets   Exceeds
100:   Does not meet   Meets   Exceeds
Comments:_________________________________________________________________________________________
__________________________________________________________________________________________________
___________________________________________________________________________________________
Sample # 3
080:   Does not meet   Meets   Exceeds
090:   Does not meet   Meets   Exceeds
100:   Does not meet   Meets   Exceeds
Comments:_________________________________________________________________________________________
__________________________________________________________________________________________________
___________________________________________________________________________________________
APPENDIX F
Project Title: “Pre-English 101 Essay Writing Portfolio Standards”
Project Manager & e-mail address: Pearl Klein, pklein@ctc.edu
Project Overview
The purposes of this project were many, and included:
 promotion of greater academic writing agreement and consistency between composition
instructors;
 increased discussion on academic writing issues between college instructors;
 clarified expectations for ENG 100 and ESL 100 writing;
 setting of voluntary standards for college entry-level writing;
 justification and enforcement of placement policies for students in developmental English
and ESL writing classes;
 strengthened use of evaluative ENG 100, ESL 100 and ENG 101 rubrics; and
 consideration by the English program of whether to implement a portfolio reading system as
part of outcomes assessment and as a prerequisite for entry into English 101.
The first three of these were most clearly addressed; the more concrete outcome, a
recommendation for future action, was not.
Twenty faculty members, both associate and full-time, met twice, once per quarter; the group
included English and ESL teachers. This level of participation signaled a need for such
conversation and was a positive outcome of the project. Less positive was the commitment to
work other than meetings; the challenge of scheduling a time convenient for both
constituencies proved too much for the program manager.
The concrete measures for success have all yet to be achieved; in retrospect, many of these
seem to have come out of a desire to justify the core need of increased and improved
communication among faculty. Should the project be renewed, the following would still need to
be accomplished:
 A number of instructors will have collected portfolios (the contents of which have still
to be decided) in Fall 2000, Winter 2001, and Spring 2001;
 an evaluation/norming sheet will be developed in a meeting of instructors for their use;
 instructors will serve as “readers” and evaluate the portfolios of essays based on the
forms;
 instructors will meet quarterly to norm the essays;
 the essays will be collated and recorded in a binder with collated notes (to debut in Spring
2001);
 a “project documentation” binder will be kept about the whole process and for future
reference; and
 some initial conclusions may be made from the experience in an “executive summary”
report to the ADCs of English.
The responsibility for all this work left undone rests with the program manager and should in
no way reflect on the faculty, who are inspired to continue refining our outcomes procedures.
There may also have been some sense among faculty that they had to choose between overlapping
projects; certainly, if the English faculty had been more fully aware of the recent outcomes
work by the ESL faculty, any redundancy could have been avoided. This last point, however,
only underscores the need to continue improving communication.
APPENDIX G
Academic ESL Assessment
Project Manager & e-mail address: Bruce McCutcheon, bmccutch@ctc.edu
Overview
The original idea for a project to develop reading assessment tools for Shoreline Community College’s
ESL program began in discussions after the development of rubrics for ESL writing courses. Vince
Barnes and a few other teachers noted that while progress had been made toward standardizing criteria
used in evaluating student work in various ESL courses, little consensus had developed regarding the
appropriate reading outcomes expected in each course or consistent methods of assessing reading. As a
result of these discussions, Bruce McCutcheon wrote a proposal for funding to work on ESL reading
assessment.
The purpose of the project then was to develop a set of reading assessment tools which could be used
as exit exams for three levels of Shoreline Community College’s ESL program. Once the tests were
developed, the program could use the assessment tools as it saw fit: for the gathering of data on student
achievement and/or for the screening of weak readers from passing into a following level in the
program. In other words, the actual use of the tests was not determined before the project began and
has not been determined yet. The optimistic idea was that if reading tests were developed by the
faculty, the faculty would want to make use of them.
Participation on the ESL reading assessment committee was solicited from the ESL program at large,
from both full-time and associate faculty. Five full-time faculty members (Bruce McCutcheon, Vince
Barnes, Donna Biscay, Elizabeth Hanson, and Kristin Marra) joined the committee as did three
associate faculty members (Rosemary Dunne, Mary March, and Lauren Wilson). An orientation
meeting was set for the beginning of Winter Quarter of 2001. At the meeting the general format of the
assessments was agreed upon. The assessment for each level would be modeled on the ESL reading
placement test which had been developed several quarters earlier. Therefore, each test would consist
of five one-to-three-paragraph reading passages, each followed by five multiple choice questions.
Rough grade levels (based on American public school grade levels) for the reading material in each
level were also established in this meeting. ESL level 5 readings would target 7th and 8th grade level
material, ESL level 6 would target 9th and 10th grade level material, and ESL level 7 would target 11th
and 12th grade level material. The reading passages would consist mostly of academic text along with
some fiction and journalistic prose. The committee broke into three subcommittees, one for each ESL
level, and agreed to find suitable material and work together to develop test items.
During the rest of Winter Quarter the subcommittees did the research necessary to find material and
check its readability level using Microsoft Word’s grammar checking tool. The subcommittees also
worked together to develop test items which would assess different agreed upon reading skills. Then at
the end of the quarter the entire committee met again to look at the subcommittees’ progress and to
arrange for field testing of the assessment tools. The level 5 test was run in two sections of that course,
the level 6 test was run in two sections, and one of the two level 7 tests was run in one section.
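The grade-level screening step above can be approximated outside of Word. The sketch below is an illustration, not part of the original project: it computes the Flesch-Kincaid Grade Level, one of the readability statistics Word reports, using a crude syllable-counting heuristic, so its scores will only roughly track Word's.

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count runs of consecutive vowels,
    # discounting a common silent final "e".
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text: str) -> float:
    # Flesch-Kincaid Grade Level:
    # 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)

# Example: the opening of the Level 5 reading passage in the project materials.
sample = ("Eratosthenes was a Greek scientist who lived over 2,100 years ago. "
          "He lived most of his life in the Egyptian city of Alexandria.")
print(round(flesch_kincaid_grade(sample), 1))  # a grade level in the middle-school range
```

A subcommittee could run each candidate passage through such a function and keep only those whose grade falls in the target band (7th-8th grade for level 5, and so on), with Word's own statistics remaining the authoritative check.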
At the beginning of Spring Quarter the committee reconvened to debrief about the results of the tests.
Many of those who had administered the tests were somewhat disappointed by the results. This was
mainly due to the assumption of 75% as a passing score, based on the recommended passing score for
final exams in general. Field testers felt that too many students who had been determined on the basis
of other assessments during the course to be good or at least adequate readers had “failed” our newly
developed exit tests. However, simply lowering the passing score to 60% seemed to rectify the
problem. With this new cut-off score the good and adequate readers passed while the weak readers still
failed.
The committee is planning more field testing at the end of Spring Quarter. With a larger sample of test
scores, the committee will be able to check the reliability of the assessments and the workability of the
60% cut-off score for passing. Hopefully, good and adequate readers will continue to pass the
assessments while weak readers will continue to fail at this cut-off score. If the results for this quarter
give the committee more confidence in the assessment tools, then other associate faculty can be
brought into the process of field testing. Then, hopefully, a consensus among all of the ESL faculty
can be built, a consensus that these assessments should be officially instituted in the program.
What it means to “institute the assessments officially” is not clear. In other words, the final use of the
assessments in the program is not certain at this point. The faculty as a whole may embrace the
assessments as a step forward in improving the quality of the program. On the other hand, some
faculty may be alarmed by the idea of standard exit exams. The challenge now is that ESL teachers
may view any outside control of assessment as intrusive. That is currently the feeling many teachers
already have about newly state-mandated testing procedures for the lower levels of ESL. Some
teachers may disagree with the choice of readings, wording of questions, and skills assessed. Certainly,
some teachers may have a bias against the multiple-choice format, feeling that it is an inappropriate
way to assess a complex skill. These are all potentially difficult issues which lie ahead.
Therefore, while the funding of the project may be finished, the project is not. The ESL reading
assessment committee will continue to work towards the acceptance of these reading assessment tools,
the modification of them if necessary, and the improvement of reading assessment in our program for
many more quarters.
Project Materials
G. Level 5 reading test
Eratosthenes was a Greek scientist who lived over 2,100 years ago. He lived most of his life in
the Egyptian city of Alexandria. For a time, Eratosthenes was the director of Alexandria’s famous
library. He was also a poet, an athlete, a historian, a map maker, and an astronomer. Eratosthenes is
most important to us, however, because he was the first to measure the size of the earth and to prove
the earth’s surface is curved.
Eratosthenes read in a book that, at noon on summer solstice (June 21), the sun had been
directly overhead in a town called Syene, a city south of Alexandria. He knew that, at the same time,
the sun had not been directly overhead in Alexandria. He also knew that objects and people cast
predictable shadows if they are on a flat surface. The shadows in Syene were different from what he
would have expected from objects on a flat plane. Therefore, Eratosthenes decided that this difference
could be accounted for only if the earth’s surface were curved. The earth must be a sphere! How big
could it be?
1. Eratosthenes knew that the summer solstice sun was directly overhead in Syene because
a. he was a great map maker.
b. he was a Greek scientist.
c. he read about it in a book.
d. he lived in Syene.
2. Eratosthenes is important to us especially because
a. he lived over 2,100 years ago.
b. he was the first to measure the earth’s size.
c. he discovered the summer solstice.
d. he was the director of a famous library.
3. What is a “sphere”?
a. a scientist
b. a surface
c. a round object
d. a shadow
4. The word “they” in the second paragraph refers to…
a. objects and people
b. Syene and Alexandria
c. libraries in Alexandria
d. the sun and its shadows
5. What is NOT mentioned in this reading passage?
a. Eratosthenes’s many occupations
b. the city where Eratosthenes lived
c. a great discovery that Eratosthenes made
d. a famous book that Eratosthenes wrote
For many years I have been catching bluefish off the island where I live. These fish grow to be
three feet long and to weigh as much as eighteen pounds, but most of the ones I catch weigh about five
pounds. I have always wondered just what they looked like underwater when they were feeding. One
day last winter in Key West public library, I watched a film of some blues having their breakfast in a
huge tank. The movie had been made by some scientists who were studying the habits of these wild
and strong fish.
After all my years of fishing for blues and trying to see in my mind’s eye what must have been
happening under the surface—under the mirror of the sky—I was finally able, sitting there in a house
of books, to feel as if I was going right into the sea with six blues, entering into their watery home and
swimming with them.
[Adapted by John Hersey from his novel Blues. From Words on the Page, The World in Your Hands,
Book 2, by C. Lipkin & V. Solotaroff, Harper & Row, Publishers.]
6. What is the main point of this passage?
a. The man catches lots of big fish.
b. He likes to read about catching fish.
c. Watching a film helped him to feel the fish in the sea.
d. He likes to watch the blues eat their breakfast in a huge tank.
7. Which of the following is NOT true according to the passage?
a. The bluefish are wild and strong.
b. The Key West library has a huge fish tank.
c. Bluefish sometimes weigh eighteen pounds.
d. The movie was made by some scientists.
8. What does “they," line 3, refer to?
a. the years
b. the islands
c. the pounds
d. the fish
9. What is true according to the passage?
a. Bluefish can be three feet long.
b. Bluefish never weigh more than 3 pounds.
c. The writer hasn’t been fishing for a long time.
d. The writer only reads about bluefish.
10. What does the writer mean by “a house of books”?
a. Key West
b. a public library
c. a coffee house
d. the mirror of the sky
When Henry Ford drove down the price of the automobile, automobiles became more
popular. Equally responsible for the automobile's increasing popularity was a series of
important improvements such as the invention of a self-starter in 1912. Perhaps the most
important improvement was the introduction of the closed car.
The car-buying public discovered with delight that a closed car was something quite
different from the old "horseless carriage." It was a power-driven, storm-proof room on wheels
that could be locked and parked all day and all night in all weathers. You could use it to fetch
home the groceries, cool off on hot evenings, reach a job many miles away, or visit distant
friends. Young couples were quick to learn the privacy that this room on wheels gave them.
Also, of course, the car often became a source of family conflict: "No, Junior, you are not
taking the car tonight!"
Some people argued that the automobile destroyed the healthy habit of walking, weakened
the churchgoing habit, promoted envy, killed innocent people when driven by reckless and
drunken motorists, and provided criminals with a safe getaway. Nevertheless, the automobile
was here to stay. (Adaptation from “The American Revolution” from The Big
Change: America Transforms Itself 1900-1950 by Frederick Lewis Allen. Copyright 1952 by
Frederick Lewis Allen.)
11. What is the main point of the second paragraph?
a. The American public was delighted with the invention of the closed car
b. The increase in popularity of the car was due to Henry Ford manufacturing cheaper cars
c. The popularizing of the automobile had many disadvantages for society
d. The invention of the self-starter was the most popular improvement
12. According to the passage, which of the following is NOT true?
a. Families argued over car usage.
b. Everybody was happy with the increasing popularity of the car.
c. Americans cooled themselves off by going for a drive.
d. The closed car was used for many different purposes.
13. Which of the following best describes the feelings of young couples toward the closed car?
a. It was a good place to cool off.
b. It was useful for visiting distant friends.
c. It provided them with some alone time.
d. It caused a lot of family conflict.
14. What is the most likely meaning of the word “reckless” as used in the last paragraph?
a. drunken
b. friendly
c. careless
d. expensive
This woman's name is Juanita Maxwell. Sometimes she talks and acts in ways that are totally
different from her usual manner. At such times she believes that her name is Wanda and that Juanita is
an old friend of hers. Most experts feel that while true multiple personalities are rare, they do exist.
Juanita Maxwell was twenty-five years old when she was put on trial. She was charged with the
beating death of seventy-three-year-old Inez Kelly. Juanita, a married mother of two, had been working
as a maid at a hotel in Fort Myers, Florida, when the murder occurred. The state of Florida charged that
Juanita had killed Inez Kelly, a guest at the hotel, in a dispute over a fountain pen.
At her trial Juanita Maxwell was declared "not guilty by reason of insanity.” Was it all a
trick? Was Juanita Maxwell playing a clever game to get away with murder? Some people
thought so. Some thought that she was pretending to be sick and to have a second personality so
she could escape a prison sentence. But the judge did not think Juanita was trying to fool him.
He believed that she really was the victim of the character named Wanda, whom she could not
control. (Adapted from Phenomena, Melissa and Henry Billings, pgs. 93-94, Jamestown
Publishers 1984)
15. What is the main point of this passage?
a. Juanita Maxwell was found not guilty of killing a guest at the hotel.
b. A judge felt that Juanita Maxwell was not guilty because of her personality disorder.
c. Juanita Maxwell was playing a clever game to get away with a murder.
d. Juanita Maxwell could not control the behavior of her second personality.
16. Which of the following is true according to the passage?
a. Many people have multiple personality disorders.
b. Multiple personality disorder doesn’t exist.
c. Juanita Maxwell killed a maid named Wanda.
d. People with true personality disorder cannot control their multiple personalities.
17. In line three, the word “they” refers to...
a. experts
b. multiple personalities
c. Juanita and Wanda
d. friends of Juanita
18. Which of the following is NOT true according to the passage?
a. Wanda and Juanita are the same person.
b. Two people in a hotel argued about a pen.
c. Juanita and Inez worked together at the hotel.
d. The State of Florida charged Juanita with the murder.
19. Although it is not stated directly in the passage, which of the following can we guess, based on
the information in the reading?
a. Not everyone who seems to have a multiple personality disorder really has one.
b. Multiple personality disorders are more common in women than men.
c. Many judges excuse murders because of insanity.
d. Juanita was successful in tricking the judge.
Level 6 reading test
Reading Assessment Pilot Test
Directions: There are three short reading passages on this test. Read each passage and then answer
the following questions on your Scantron sheet. Please do not write on the test.
Reading 1
The social effects of alcoholism may be as bad as the medical problems. A drinking habit can drain a
family’s money reserves. Also, an alcoholic may be unable to hold a job. Many are put in jail as a
result of their actions. They may become angry when intoxicated and harm others. An alcoholic in a
family may lead to emotional problems for other family members. Some alcoholics are young people.
They may steal alcohol or steal money to buy alcohol. Driving while drinking causes many teenage
deaths. Malnutrition can be a more serious threat to the young alcoholic since a proper diet is
necessary for growth at this age.
1. In the above passage, the word “malnutrition” probably means what?
a. Losing all the money
b. Committing a crime
c. Eating badly
d. Drinking too much alcohol
2. Which of the following is not a social effect of alcoholism?
a. Damage that drinking does to your body
b. The loss of a job because of drinking
c. Family problems caused by alcoholism
d. Financial problems caused by drinking
3. Which of the following is probably true?
a. Alcoholism does not have any physical effects.
b. Women cannot become alcoholics.
c. A sixteen-year-old boy could be an alcoholic.
d. It is always safe to let small children drink alcohol.
Mark the following statements true or false.
4. Some alcoholics lose their jobs.
a. True
b. False
5. Alcoholism is one cause of theft.
a. True
b. False
6. We can suppose that the paragraph before this one was about which of the following?
a. the benefits of drinking alcohol
b. problems related to drugs
c. the medical problems related to alcoholism
d. the joys of alcoholism
7. In the second sentence the word “drain” probably means what?
a. fill
b. empty
c. scare
d. help
8. Which of the following is the best summary of this article?
a. Alcoholism can cause people to commit crimes.
b. Alcoholism is very common.
c. Being an alcoholic can cause considerable problems in one’s life.
d. Being an alcoholic is bad for teenagers.
Reading 2
1
What will schools of the future look like? Psychologist Wayne Holtzman offers an intriguing
answer. Holtzman believes schools should enhance children’s mental and physical health just as
actively as they encourage learning the “three R’s.”
2
To explore the possibilities, Holtzman and his colleagues have created several “full-service”
Schools for the Future. How are these schools different? First, they integrate a large array of health
services and human services into the schools themselves. In conventional schools, children and
families needing special help must travel all over their communities to get it. Many parents simply
never make the effort or find the appropriate agencies. In the School of the Future, everything is
available on campus. In this way, efforts to prevent mental health problems, drug abuse, school
dropouts, teen pregnancy, and gang activity are made an integral part of each school.
3
A second major feature of the School of the Future is a very high level of teacher, parent, and
community involvement. Parents, in particular, are encouraged to help set goals for the schools and to
identify their children’s needs. Many parents participate on campus as volunteers and they meet
frequently with teachers to monitor their children’s progress. In addition, the schools form links with
neighborhood groups, churches, recreation programs, and volunteer organizations. As a result, the
School of the Future becomes an active part of the surrounding community (Holtzman,1997).
4
Ideally, schools of the future will enrich and enhance the lives of children in ways that
traditional schools cannot. The School of the Future program has been highly successful in Texas.
Perhaps it could be in other communities too. (From Introduction to Psychology: Exploration and
Application by Dennis Coon. Brooks and Cole Publishing Company)
9. In paragraphs 1 and 4, the word “enhance” means:
a. learn
b. prevent
c. teach
d. improve
10. Which of the following is not listed as a feature of a School of the Future?
a. better teachers
b. health and human services
c. community involvement
d. programs to prevent teen pregnancy
11. Which of the following sentences is the best summary of this reading?
a. Schools of the Future are much better than conventional schools.
b. The School of the Future offers more integration of family, community and social services.
c. Psychologist Holtzman and his colleagues created a model School for the Future.
d. In the School of the Future, everything is available on campus.
Mark the sentences below True or False.
12. T or F Parental involvement is an important part of the Schools of the Future program.
a. True
b. False
13. T or F The School of the Future does not already exist; it is still just an idea.
a. True
b. False
14. T or F The article implies that if health and human services are far away, it can be difficult to find and get them.
a. True
b. False
15. "It" in line 5 of paragraph 2 refers to:
a. conventional schools
b. special help
c. health and human services
d. families
16. Which of the following best expresses how "full service schools" differ from regular schools?
a. Full service schools focus mainly on the "Three R's."
b. Full service schools have more problems with mental health, drug abuse, etc., than regular schools.
c. Full service schools help families travel all over their communities, whereas regular schools don't.
d. Full service schools provide many services such as health and human services that regular schools don't.
17. The main idea of paragraph 3 is that:
a. Parents volunteer more in Schools of the Future.
b. Schools of the Future have involvement from teachers, parents, and the community.
c. Schools of the Future are better than regular schools because they have more involvement.
d. Parents obviously prefer Schools of the Future because they have more involvement.
18. In paragraph 2, line 3 "array" probably means:
a. variety
b. limited number
c. unlimited number
d. arrival
19. We can suppose that which one of the following is true?
a. Most school reforms take place in Texas.
b. Wayne Holtzman is very popular in Texas.
c. Schools of the Future will solve America's social and health problems.
d. Many students in regular schools have social and health problems.
Reading 3
1. Between 1840 and 1860, nearly 300,000 Americans moved west. About two-thirds of these
went to California for gold. Another 50,000 went to Utah, and about 53,000 went to the Oregon
Country.
2. The trail west was crowded in some years. Only 875 crossed into Oregon in 1843, but that
year became known as the Great Migration because there were six times as many travelers as there had
been the year before.
3. Before gold was discovered near Sacramento, California, five people came to Oregon for
every one who turned south to California. After 1848 the proportion was reversed. In 1852, 10,000
newcomers arrived in Oregon. The next year there were 75,000.
4. Some historians believe that people came west because growing populations in heavily
settled areas squeezed them out. The Northwest provided great opportunities for people to organize
and build communities. Many made the trip for adventure, hoping to start a new life. Oregon attracted
generally stable emigrants who intended to stay and build. People looking for fortunes often went to
the California gold fields.
20. Which of the following is true according to the article?
a. Between 1840 and 1860 Oregon attracted many people because of the discovery of gold there.
b. Most people who traveled west from 1840 to 1860 went to California because gold was discovered there.
c. More people went to Oregon in 1843 than in any other year between 1840 and 1860.
d. People went to Utah because of the gold there.
Mark the sentences below True or False.
21. We can infer that gold was discovered near Sacramento in 1848.
a. True
b. False
22. Before gold was discovered in California, more people went to Oregon.
a. True
b. False
23. More people went to California than Utah and Oregon combined.
a. True
b. False
24. Which of the following is not given as a reason that people went West?
a. To look for gold
b. To farm
c. To expand the country
d. To escape overpopulation in other areas
25. Which of the following is true about differences between people going to California and people going to Oregon?
a. People going to Oregon were mostly looking for a way to make money fast. Those going to California were more responsible.
b. People going to California tended to be stronger than those going to Oregon.
c. People going to Oregon wanted to build communities and settle down; those going to California were looking for money.
d. Those going to California were interested in acting careers because of Hollywood. Those going to Oregon were more interested in starting communities looking for gold.
Level 7 reading test
Inanimate objects, like cars, computers and washing machines, are classified scientifically into three
major categories: those that don't work, those that break down, and those that get lost.
The goal of all inanimate objects is to resist man and ultimately to defeat him, and the three major
classifications are based on the method each object uses to achieve its purpose. As a general rule, any
object capable of breaking down at the moment when it is most needed will do so. The automobile is
typical of this category.
With the cunning typical of its breed, the automobile never breaks down while entering a filling
station with a large staff of idle mechanics. It waits until it reaches a downtown intersection in the
middle of rush hour, or until it is fully loaded with family and luggage on the Ohio turnpike. Thus, it
creates maximum misery, inconvenience, frustration and irritability among its human cargo, thereby
reducing its owner's life span. (Contemporary GED, 1994)
1. What is the main point of this reading passage?
a. Inanimate objects such as cars try to defeat humans.
b. Cars, computers and washing machines are inanimate objects.
c. A car is an inanimate object that breaks.
d. The problem with inanimate objects is that they get lost.
2.
a. animalistic
b. breakable
c. nonliving
d. scientific
3. How would you describe the tone of this paragraph?
a. amusing
b. serious
c. angry
d. mysterious
4.
a. The author has had mostly good experiences with cars.
b. Americans have more cars than people from other countries.
c. The Ohio turnpike is some kind of a road.
d. Most filling stations have large staffs.
5.
a. the Ohio turnpike
b. the car
c. the family
d. the luggage
Even though each broken marriage is unique, we can still find the common perils, the common
causes for marital despair. Each marriage has crisis points and each marriage tests endurance, the
capacity for both intimacy and change. Outside pressures such as job loss, illness, infertility, trouble
with a child, care of aging parents, and all the other plagues of life hit marriage the way hurricanes
blast our shores. Some marriages survive these storms, and others don't.
When we look at how we choose our partners and what expectations exist at the tender beginnings
of romance, some of the reasons for disaster become quite clear. We all select with unconscious
accuracy a mate who will recreate the emotional patterns of our first homes. One marriage
therapist explains,
mate who has qualities of our parents, who will help us rediscover both the psychological happiness
and miseries of our past lives. We may think we have found a man unlike Dad, but then he turns to
drink or drugs, or loses his job over and over again, or sits silently in front of the TV just the way Dad
did. A man may choose a woman who doesn't like kids just like his mother or who gambles away the
family savings just like his mother. (Contemporary GED, 1994)
6. What does the first paragraph do?
a. set a context and introduce the essay
b. provide examples of marriage problems
c. compare marriage to a storm
d. all of the above
7.
a. marriage therapists are experts in understanding relationships.
b. some men have serious problems with drugs and alcohol.
c. people choose a mate based on their early family relationships.
d. we all experience happiness and misery in our lives.
8.
a. events
b. problems
c. times
d. people
9.
a. our parents
b. our family roles
c. our mates
d. ourselves
10. What do you expect the next paragraph to talk about?
a. The problem with marrying a gambler.
b. How addictions can destroy a marriage.
c. Other reasons that marriages fail.
d. Advice for having a successful marriage.
The millions of people on the bottom levels of electronic hierarchies (computer networks that
connect managers with workers) are increasingly likely to spend their days in an isolated no-man's-land.
They are subservient to intelligent information systems that report their progress to unseen supervisors
far away. Because computers measure quantity better than quality, such systems tend to reward
employees who work faster more than those who work better. There's a sharp rise in loneliness and
disconnection as individuals blindly forge ahead. Service people on the telephone or at a cash register
curtly terminate attempts at idle conversation because their performance is being electronically
monitored. At one time they were judged on their ability to communicate with customers or
troubleshoot unexpected situations. However, today they
1994)
11. Th
a. Supervisors are far away from technical workers and they are unseen.
b. Technical workers aren't as friendly as before when they had more time to chat.
c. Computer networks connect people to each other across a no-man's-land.
d. Technology has changed these technical workers' lives in negative ways.
12.
a. superior to
b. isolated from
c. under the control of
d. supported by
13. According to this paragraph which of the following is true?
a. People don't have much time to socialize at work.
b. People who work harder get paid more.
c. People's communication skills are as important as ever.
d. Fewer people are getting jobs working with computers.
14. What is NOT mentioned in this paragraph?
a. How these technical workers feel at their jobs
b. The kinds of machines these people are working with
c. The ways that the supervisors evaluate the workers
d. The problem of layoffs among these technical workers
15.
a. performances
b. service people
c. supervisors
d. computers
The mos
it more and more as you go about trying to conduct your business. The other day I was at the
ite you out a ticket. The computer is the only one allowed to issue tickets for the
ive the computer the information about your trip, and then it tells us whether you can
it
you---
16. How would you best describe the tone of this passage?
a. serious
b. scary
c. funny
d. persuasive
17.
a. He wonders if the computer is working.
b. He feels sad that he can't buy a ticket.
c. He wants to know why everyone is in a bad mood.
d. He wants the woman to write him a ticket by hand.
18.
a. the airplane
b. the ticket
c. the information
d. the computer
19. What is true according to this passage?
a. The attendants don't know what to do.
b. The man will get on his flight in time.
c. The man is going to get his ticket soon.
d. The computer will start working soon.
20. How do you think the customer feels at the end of the story?
a. lonely
b. angry
c. sad
d. amazed
With a strength born of the decision that had just come to her in the middle of the night, Avey
Johnson forced the suitcase shut on the clothes piled high inside and slid the lock into place.
Taking care not to make a sound, she then eased the heavy bag off the couch onto the floor and,
straining under her breath, her tall figure bent almost in two, hauled it over the thick carpeting to
the door on the other side of the cabin.
From the moment she had awakened in a panic less than an hour ago and come to the reckless
decision, her mind had left to go and stand down at the embarkation door near the waterline five
decks below. While she swiftly criss-crossed the room on her bare feet, spiriting her belongings
out of the closet and drawers, her mind had leaped ahead to the time, later that morning, when the
for the day, she would step from the liner onto the waiting launch. For the last time. And without
so much as a backward glance.
Avey Johnson's two traveling companions, with whom she shared the large deluxe cabin,
were asleep in the bedroom on the other side of a divider with narrow shelf-like openings on top
and a long chest of drawers below, facing what was called the living room. Not to risk waking
them, she had left off the lamps in the living area, where she always volunteered to sleep each
cruise because it was more private.
(Explorations in World Literature
21.
a. secretly
b. angrily
c. hurriedly
d. happily
22. What did Avey do at the same time she made her decision to leave?
a. She thought about her reasons for leaving.
b. She imagined herself standing at the ship's exit.
c. She walked around her room saying good-bye.
d. She went down to the embarkation door.
23. How does Avey feel about leaving?
a. angry
b. sad
c. fine
d. guilty
24. Which of the following is NOT talked about in this passage?
a. How she packed her things and took care of her luggage.
b. A description of the place where they were all staying.
c. What she imagined would happen the next morning
d. What kind of relationship she had with her companions
25. This passage seems most similar to which kind of movie?
a. a horror movie
b. a comedy
c. a love story
d. a drama
Alternate Level 7 reading test
Passage One
They were both children, but something had made both in an odd way old. Mary was
fourteen and Ted eleven, but Ted wasn't strong and that sort of evened things up. They were the
children of a well-to-do Virginia farmer named John Grey in the Blue Ridge country in
small river running through it and high mountains in sight, to the north and south. Ted had some
kind of a heart disease, a lesion, something of the sort, the result of a severe attack of diphtheria
when he was a child of eight. He was thin and not strong but curiously alive. The doctor said he
might die at any moment, might just drop down dead. The fact had drawn him close to his sister
Mary. It had awakened a strong and determined maternalism in her.
The whole family, the neighbors on neighboring farms in the valley and even the other
children at the schoolhouse where they went to school recognized something as existing between
times together, but they are so serious. For such young children they are too serious. Still, I
done something to Mary. At fourteen she was both a child and a grown woman. The woman
Anderson, from The United States in Literature, Miller, et al., Scott Foresman, 1985)
1. What is the main point of this reading passage?
a. John Grey was a well-to-do farmer from Virginia.
b. A boy named Ted had a very serious health problem.
c. A young brother and sister seemed closely connected to each other.
d. Children in the country used to live on farms and go to schoolhouses.
2. We can suppose from this reading that...
a. Mary probably took care of Ted because of his health.
b. most of the people in
c. John Grey was already dead at the beginning of the story.
d. many neighbors avoided interacting with the Grey family.
3. ---) is most likely a ...
a. kind of animal.
b. type of soldier.
c. kind of tool.
d. type of disease.
4. --- refers to...
a. Southwestern Virginia.
b. the wide valley
c. the railroad
d. the small river
5. Which of the following words best describes the writer's attitude toward the two children?
a. fascination
b. anger
c. fear
d. disgust
6. Which of the following is true according to the reading passage?
a. Ted was older but weaker than his sister Mary.
b. Southwestern Virginia was an area inside of the Blue Ridge Country.
c. Ted seemed strangely alive for a child with such serious health problems.
d. Other children had refused to let Ted and Mary attend school.
Passage Two
Growing concern with social injustice and the gradual acceptance of realistic subject matter and
techniques opened the way for naturalistic fiction. The new theories of Darwin, Freud, and Marx
were suggesting that biology, psychology, and economics determine each individual's destiny.
Naturalistic novelists applied these theories to their presentation and interpretation of human
experience. Writers like Stephen Crane and Ambrose Bierce tended to depict life as grim, the
universe as cold and spiritless, and the individual as a hapless victim of heredity, society, and
natural forces. Naturalism, therefore, was in direct opposition to Romanticism and
Transcendentalism, which envisioned a holy and mystical presence in nature. Crane, along with
the novelists Frank Norris and Theodore Dreiser, exposed poverty, cruelty, corruption, and the
futility of war. (From The United States in Literature, Miller, et al., Scott Foresman, 1985, p.
231)
7. The main idea of this paragraph is that...
a. social injustice was closely related to some important new theories.
b. Darwin, Freud, and Marx were the major thinkers and scientists of their day.
c. naturalistic fiction was darker and more realistic than previous kinds of fiction.
d. Frank Norris and Theodore Dreiser were key novelists in the naturalist movement.
8. Which of the following is true according to this paragraph?
a. Naturalistic novelists argued against the ideas of Darwin, Freud, and Marx.
b. Ambrose Bierce believed that the individual was the master of his own fate.
c. Naturalism was very similar to Romanticism and Transcendentalism.
d. Crane, Norris, and Dreiser were interested in similar themes and issues.
9. ---) probably means to...
a. show
b. choose
c. avoid
d. remove
10. ----) refers to...
a. social injustice and realistic subject matter
b. Darwin, Freud, and Marx
c. biology, psychology, and economics
d. naturalistic novelists
11. Based on this paragraph, which of these two writers would you expect to have a similar writing style?
a. Darwin and Freud
b. Freud and Marx
c. Marx and Crane
d. Crane and Bierce
12. How would you best describe the tone of this paragraph?
a. serious
b. sarcastic
c. comical
d. informal
Passage Three
Science seeks to explain objects and events in the world and universe. Very often discoveries in
science are put to use in everyday life. Technology is the use of scientific principles to solve
practical problems. For example, studying the effect of unequal pressures on surfaces is science.
Then applying that knowledge to construct a practical device, such as a drinking straw or a
vacuum cleaner, is technology. Investigating forces is science, but using knowledge about forces
to construct a can opener or a construction crane is technology. People may from time to time
confuse these two terms, science and technology. However, keeping the fundamental distinction
in mind can help as we examine various questions and problems that humans attempt to solve
using science or technology. However, the distinction does not imply that technology is always
simple. In fact, technology can be very basic or extremely complex. (Adapted from Physical
Science, Carle, et al., D.C. Heath and Company, 1991, p. 15)
13. The main idea of this paragraph is that....
a. science has the power to explain how the world works.
b. technology has an ever-growing influence on our lives.
c. science and technology are related but not exactly the same.
d. technology is sometimes quite simple and sometimes quite complicated.
14. ---) is....
a. an invention
b. a machine
c. a force
d. a building
15. Which of the following is true according to this paragraph?
a. Little is known about the effect of unequal pressures on surfaces.
b. Technology is the practical application of scientific principles.
c. A person working in the field of technology investigates forces.
d. Important new technologies are usually very complex.
16. Based on this paragraph, which of the following can we suppose is true?
a. Can openers and construction cranes utilize some similar scientific principles.
b. Scientists and technicians often have problems working together.
c. Advanced technology has many bad effects on the natural environment.
d. Vacuum cleaners would not have been possible without drinking straws.
17. difference between...
a. unequal pressures and surfaces
b. drinking straws and vacuum cleaners
c. can openers and construction cranes
d. science and technology
18. Which of the following CANNOT be found in this paragraph?
a. an explanation of the purpose of science
b. names of well-known scientists
c. references to scientific principles
d. examples of practical devices
Passage Four
The first step in the study of motion is to describe the position of a moving object. Consider a
car on an east-west stre
have to specify its position relative to some particular point. Any well-known landmark can
serve as our reference point, or origin for measuring position. We then can state how far the car
is from the landmark and in which direction, east or west, and the description is complete. Thus,
for example, we say that the car is 5 km west of the center of town, or it is 3 km east of Sandy
know whether this means 5 km east or 5 km west.
Similarly, if you wish to describe the position of a point on a straight line that you have drawn,
you must specify some origin and state a direction from that origin. However, this time the
direction cannot be given as east or west, for the line may not run that way
directions. To get a description of direction along the line about which we all can agree, we shall
call the line on one side of the origin positive, on the other side negative; we can then specify
position on the line by using a positive or negative number which gives both the distance (in
some convenient units) and the direction of that point from the origin. (From Physics, Haber-Schaim, et al., D.C. Heath Company, 1986, p. 394)
19. The main point of this reading passage is that...
a. a special system must be used in order to describe the location of a point on a line.
b. cars are not difficult to find as long as they are moving east or west on a highway.
c. the center of town is not very far from Sandy River Bridge.
d.
20. Which of the following is true according to the reading passage?
a. kilometers are a more accurate measuring unit than feet or miles.
b. Sandy River is a place in the State of Washington.
c.
d. The distance from a point of origin is always expressed as a positive number.
21. ---) refers to...
a. the way of finding a point.
b. the problem of finding the origin.
c. left and right.
d. east and west.
22. The word specify (line ---) means....
a. say exactly
b. disagree with
c. hide with care
d. act creatively
23. What would you suppose this reading would explain if it continued?
a. Some specific ways that cars are manufactured
b. Some important landmarks besides the Sandy River Bridge
c. The history of the kilometer and its use throughout the world
d. The way in which the human brain locates sources of light or sound.
IEAR 8/01
46
APPENDIX H
English 101 Composition Study
Project Manager & e-mail address: Gary Parks, gparks@ctc.edu
Below is the most recent version of the Composition Study report. It dates from fall 99.
We have been waiting to complete the cycle of collecting Fall, Winter, and Spring data
again before updating the report in this ongoing project. This year’s assessment project
(the one reported here) completes the cycle, and we will seek funding and participation to
write a report analyzing the last round of data collected.
English Program Composition Self-Study of English 101
Writing Skills, 1997-99
In 1996-97, faculty in the Shoreline Community College English/Communications Department
decided to undertake a statistical study of end-of-quarter student writing in English 101, transfer-level composition. This study was piloted in spring quarter 97 (see Profiles of Student Writing:
Results of English 101 Pilot Study, Spring 1997). Based on a successful pilot, a baseline study
was conducted during fall, winter, and spring of 97-98. This report presents the findings of the
baseline study as well as analysis and recommendations made by the Composition Study
Committee (CSC) consisting of Dutch Henry, Ed Harkness, and Gary Parks.
The study design called for randomly selecting five students from each and every composition
section offered in fall, winter, and spring quarter 1997-98. With the instructor facilitating the
process, selected students were asked to submit a set of one in-class essay and one out-of-class
essay representing end-of-quarter work (see Pilot Study Instructions in Appendix B). Selected
students also filled out a survey that included demographics information and questions on
program effectiveness (see Appendix B).
To evaluate the essays, a scannable scoring form was created (see Appendix B) utilizing the
existing categories of rhetorical skills on the English 101 Writing Assessment Rubric:
development, organization, style/voice, mechanics (see Appendix B). Reading panels consisting
of six to eight faculty readers, both full and part-time, were convened each quarter. Participants
received a stipend for serving on the panel.
Each quarter, the reading panel held a meeting to coordinate the scoring project and, for norming
purposes, to evaluate sample essays using the scoring instrument. After the initial meeting,
scoring of the essays was done individually. Essays were then passed along to another reading
panel member for a second reading. The second readings were given “blind,” without the results
of the first readings available.
Results from the score sheets and demographic surveys were scanned into Excel files. Some
analysis of this information was done using SPSS software. These Excel files were also (after
the Office of Institutional Effectiveness and Assessment became involved) converted to Access
files for the “Questions for Further Analysis” queries (Appendix A).
Although this project was difficult due to its magnitude and complexity, the CSC recommends
that the original department plan for continuation of the study be followed. This plan called for
surveying one quarter per year after the initial baseline year (97-98). Therefore, the study was
conducted again in spring 99. Results of the spring 99 survey are being processed and are not
included in this report.
The report generally follows the sequence of information items collected in the demographic
survey and essay scoring forms, with analysis offered after the findings. The initial findings
available to the CSC were basic percentages and frequencies of responses. These findings were
thoroughly discussed by the CSC, so analysis and commentary have been offered. Queries which
needed technical expertise beyond the capabilities of the three English teachers on the CSC were
placed in the “Questions for Further Analysis” sections. With the assistance of Jim James,
Director of the Office of Institutional Effectiveness and Assessment, many of these queries were
addressed and any findings have been presented in the Questions for Further Analysis sections.
However, since the CSC has not been able to discuss these findings, little to no analysis of them
is offered. The English department faculty will discuss these at a department meeting.
Findings based on the “Questions for Further Inquiry” comprise Appendix A. Appendix B
includes demographic and scoring forms, the English 101 Writing Assessment Rubric, and other
documents related to the study. Appendix C includes sample essays collected in the study with
the results of the readers’ scoring of them. Appendix D is a collection of English 101 essay
assignments, a by-product of this study.
Although this report reflects many hours of work on the part of the CSC, it is not meant to be
seen as the definitive and final statement on what was learned from the composition study. It
should serve as a starting point for department level discussion, further inquiry, and instructional
improvement, and as a model for further investigation of student successes.
Any commentary or questions about the report can be addressed to the Assistant Division Chair
for English.
DEMOGRAPHICS INFORMATION RESULTS
1.
When was your last English or writing class at any institution (including high
school)?
 Last quarter
 Within the last year but not last quarter
 1-2 years ago
 2-5 years ago
 More than 5 years ago
Fall Quarter
Value Label        Frequency   Valid Percent   Cum. Percent
Last quarter           19           23.2            23.2
Last year              36           43.9            67.1
1-2 years ago          11           13.4            80.5
2-5 years ago           6            7.3            87.8
More than 5            10           12.2           100.0
Missing                 1
Total                  83          100.0
Valid cases: 82    Missing cases: 1

Winter Quarter
Value Label        Frequency   Valid Percent   Cum. Percent
Last quarter           26           37.1            37.1
Last year              15           21.4            58.6
1-2 years ago          10           14.3            72.9
2-5 years ago          11           15.7            88.6
More than 5             8           11.4           100.0
Missing                 1
Total                  71          100.0
Valid cases: 70    Missing cases: 1

Spring Quarter
Value Label        Frequency   Valid Percent   Cum. Percent
Last quarter           13           28.3            28.3
Last year              14           30.4            58.7
1-2 years ago           7           15.2            73.9
2-5 years ago           6           13.0            87.0
More than 5             6           13.0           100.0
Total                  46          100.0
Observations/Analysis
Fall quarter shows the lowest percentage of students coming in from an English or
writing class the previous quarter (perhaps due to confusion over whether to include
summer?). Winter shows the highest percentage, but a drop in those who had taken one
not in the last quarter but in the previous year. It appears that in any given quarter, the
percentage of students who have not taken English or writing for at least a year is 33% or
more. Also, the percentages of those who had not taken English or writing for more than
five years remained fairly consistent through the academic year, at a substantial 11 to
14%.
Questions for further analysis
How did students who came directly from an English course the previous quarter
perform on essay evaluations compared to those who last took an English course one or
more years ago?
In general, students who took English recently performed slightly worse than those who
had not taken English for a period of time (see Appendix A, page 1). This finding was
not discussed by the CSC, so no analysis of it is offered.
2. Which of the following non-transfer English classes (or their equivalents at other
colleges) have you completed? Mark all that apply.
 Eng 080, 081, or 091
 Eng 090, 091, or 092
 Eng 089 Reading Lab
 Eng 099 Writing Lab
 Eng 100 (sometimes transfers as elective)
4. Which of the following transfer level classes (or their equivalents at other colleges)
have you completed?
 Eng 102
 Eng 271
 Any creative writing class
 Any other English class
 Any "W" class
[Chart: Percentage of students in English 101 who had completed English 100: fall 18.1%, winter 36.6%, spring 43.5%]

              Fall (n = 83)       Winter (n = 68)      Spring (n = 46)
              number   percent    number   percent     number   percent
Eng 080          0        0          3       4.2          3       6.5
Eng 090          0        0          8      11.3          1       2.2
Eng 089          1       1.2         4       5.6          1       2.2
Eng 099          2       2.4         3       4.2          5      10.9
Eng 100         15      18.1        26      36.6         20      43.5
Eng 102          2       2.4         2       2.8          1       2.2
Eng 271          1       1.2         1       1.4          1       2.2
Cr. writing      2       2.4         2       2.8          1       2.2
Other Eng        9      10.8        16      22.5         10      21.7
Any "W"          3       3.6         8      11.3          1       2.2
Question #2 and #4 (results charted together)
Observations/Analysis
The percentage of students who have completed English 100 or its equivalent grows
significantly from fall to spring.
In fall quarter, no students (out of 83) reported having taken English 080 or 090.
However, in winter the percentages increase. In fact, in winter quarter, 11.3% of the 68
students reported having taken English 090. This low fall quarter percentage of previous
developmental students may indicate problems retaining these students from spring to
fall. The CSC cannot explain the high percentage of English 090 students in winter.
The number of students who had taken English 099 (Writing Lab) increases significantly
in spring (to 10.9%). This along with other factors may indicate a student population
with more entrenched writing problems in spring quarter than in other quarters.
Questions for Further Analysis
How did students from pre-transfer English courses do in terms of essay scoring?
For all classes besides English 100, the n’s were too low to address this question
(Appendix A, page 2). The following table shows English 100 performance compared to
the yearly average. These results have not been discussed by the CSC so no analysis is
offered.
           In Dev   In Org   In Sty   In Mech   Out Dev   Out Org   Out Sty   Out Mech
Eng 100     3.93     4.15     3.97     3.73      4.12      4.29      4.14      3.84
No 100      4.13     4.36     4.13     4.24      4.39      4.52      4.25      4.43
3.
Are you currently enrolled in either Eng 099 (writing lab) or Eng 089 (reading
lab)?
 Yes
 No
Fall Quarter
Value Label    Frequency   Valid Percent   Cum. Percent
Yes                 3            3.8             3.8
No                 77           96.3           100.0
Missing             3
Total              83          100.0
Valid cases: 80    Missing cases: 3

Winter Quarter
Value Label    Frequency   Valid Percent   Cum. Percent
Yes                 3            4.3             4.3
No                 67           95.7           100.0
Missing             1
Total              71          100.0
Valid cases: 70    Missing cases: 1

Spring Quarter
Value Label    Frequency   Valid Percent   Cum. Percent
Yes                 1            2.2             2.2
No                 45           97.8           100.0
Missing             1
Total              46          100.0
Valid cases: 45    Missing cases: 1
Observations/Analysis
Low percentages of students enroll for credit in the Reading/Writing Center while taking
English 101.
Note: This question should probably be revised to also determine what percentage of
students use the R/Wr Center on a drop-in basis.
Questions for Further Analysis
How did students who used the R/Wr Center perform in terms of essay scoring compared
to those who did not use the Center?
Using other information on the demographics sheet, what is the profile of a student using
the R/Wr Center in terms of credits earned, developmental classes taken, grade
expectations, native or non-native user of English, etc.
The number of students who reported using the center was too low to reach conclusions.
Performance and demographics are profiled beginning on page 4, Appendix A—again,
the numbers of respondents are low.
5.
How many college credits have you completed at all colleges, including
Shoreline? Mark one only.
 Less than 15 credits
 15-30 credits
 30-45 credits
 45-60 credits
 More than 60 credits
Fall Quarter
Value Label         Frequency   Valid Percent   Cum. Percent
Less than 15 cr         44           56.4            56.4
15-30 cr                18           23.1            79.5
30-45 cr                 6            7.7            87.2
45-60 cr                 4            5.1            92.3
More than 60 cr          6            7.7           100.0
Missing                  5
Total                   83          100.0
Valid cases: 78    Missing cases: 5

Winter Quarter
Value Label         Frequency   Valid Percent   Cum. Percent
Less than 15 cr         17           24.6            24.6
15-30 cr                21           30.4            55.1
30-45 cr                11           15.9            71.0
45-60 cr                 4            5.8            76.8
More than 60 cr         16           23.2           100.0
Missing                  2
Total                   71          100.0
Valid cases: 69    Missing cases: 2

Spring Quarter
Value Label         Frequency   Valid Percent   Cum. Percent
Less than 15 cr          5           10.9            10.9
15-30 cr                14           30.4            41.3
30-45 cr                 8           17.4            58.7
45-60 cr                10           21.7            80.4
More than 60 cr          9           19.6           100.0
Total                   46          100.0
Observations/Analysis
The percentage of students who indicate they have taken less than fifteen credits is
highest fall quarter (56.4%) and drops sharply by spring quarter (10.9%).
This huge percentage shift has implications for instructor expectations in regard to study
habits, grade expectations, and other aspects of “college culture.”
One might hypothesize a positive correlation between the number of credits earned and
student success in English 101, yet “Essay Scoring” figures indicate otherwise. Students
fare best in English 101 during winter quarter and worst during spring quarter when they
have earned the most college credits.
Questions for Further Analysis
Correlate student performance with credits earned.
Correlate credits earned with student grade expectations.
No significant correlation in either case—see page 8-9 of Appendix A.
6. Aside from school assignments, how much do you read each week?
 I don't read anything except school assignments
 Less than 2 hours per week outside my assignments
 About 2-5 hours per week outside my assignments
 About 5-10 hours per week outside my assignments
 More than 10 hours per week outside my assignments
Fall Quarter
Value Label            Frequency   Valid Percent   Cum. Percent
School                      8            9.9             9.9
Less than 2 hours          28           34.6            44.4
2-5 hours                  31           38.3            82.7
5-10 hours                 10           12.3            95.1
More than 10 hours          4            4.9           100.0
Missing                     2
Total                      83          100.0
Valid cases: 81    Missing cases: 2

Winter Quarter
Value Label            Frequency   Valid Percent   Cum. Percent
School                      8           11.6            11.6
Less than 2 hours          26           37.7            49.3
2-5 hours                  20           29.0            78.3
5-10 hours                  9           13.0            91.3
More than 10 hours          6            8.7           100.0
Missing                     2
Total                      71          100.0
Valid cases: 69    Missing cases: 2

Spring Quarter
Value Label            Frequency   Valid Percent   Cum. Percent
School                      2            4.3             4.3
Less than 2 hours          18           39.1            43.5
2-5 hours                  17           37.0            80.4
5-10 hours                  8           17.4            97.8
More than 10 hours          1            2.2           100.0
Total                      46          100.0
Observations/Analysis
From 44 to 49% of our students report reading two hours or less per week outside school
assignments.
From 17 to 20% indicate they read 5 or more hours a week aside from school assignments.
Questions for Further Analysis
Show student performance on essays broken down by categories of reading habit (or, at
minimum, compare the least and most frequent readers in terms of performance). This
information was not available.
Profile low and high readers in terms of other demographic information.
See beginning page 10 of Appendix A.
7. How did you place into this class? Please read all choices before responding.
 I took the previous class at Shoreline CC (English 100 or ESL 100/100B) and received a 2.0 or above.
 I took the previous class at another college and received a 2.0 or above.
 I was placed in the class by my Asset score.
 I took Asset and was placed in a different class but decided to try this one.
 I took Asset and was placed in a different class but was advised by a teacher or counselor to try this one.
 I did not take Asset and self-placed in this class.
 I did not take Asset and was advised to take this class.
 Other (please specify).
Fall Quarter
Value Label         Frequency   Valid Percent   Cum. Percent
SCC                     12           14.5            14.5
Other college            3            3.6            18.1
Asset                   61           73.5            91.6
Differ Asset             2            2.4            94.0
Advise Asset             1            1.2            95.2
Self place               2            2.4            97.6
Other                    2            2.4           100.0
Total                   83          100.0
Valid cases: 83    Missing cases: 0

Winter Quarter
Value Label         Frequency   Valid Percent   Cum. Percent
SCC                     18           27.3            27.3
Other college            3            4.5            31.8
Asset                   29           43.9            75.8
Differ Asset             1            1.5            77.3
Advise Asset             4            6.1            83.3
Self place               5            7.6            90.9
Advised No Asset         1            1.5            92.4
Other                    5            7.6           100.0
Missing                  5
Total                   71          100.0
Valid cases: 66    Missing cases: 5

Spring Quarter
Value Label         Frequency   Valid Percent   Cum. Percent
SCC                     22           47.8            47.8
Other college            1            2.2            50.0
Asset                   14           30.4            80.4
Advise Asset             2            4.3            84.8
Self place               3            6.5            91.3
Advised No Asset         1            2.2            93.5
Other                    3            6.5           100.0
Total                   46          100.0
Observations/Analysis
Radical shifts occur over the course of the year in terms of how students are
placed in English 101. In the fall, placement overwhelmingly results from Asset
test scores, with 73.5% of students indicating in Fall that they took Asset and were
recommended for 101. By Spring that number drops to 30.4%, and 47.8%
indicate they were placed in 101 via a previous class at Shoreline.
Spring quarter, then, becomes a key indicator in terms of our services to
previously under-prepared students in our Developmental English sequence.
If responses 1, 2, and 3 are viewed as departmentally "accepted" while
responses 4, 5, 6, and 7 (see the question itself for the numbering) are viewed as
"improper" placement methods (the CSC agreed to these terms), a significant shift
occurs in the balance between "accepted" and "improper" placement from fall
quarter compared to winter and spring. In fall, 92% of students use "accepted"
methods. In winter and spring the figures drop to 76% and 80% respectively. In
fall, only 6% of students are improperly placed in English 101; in winter this rises
to 17% and in spring it is 13%. This may be because in winter and spring students
have accumulated enough credits to register without an advisor.
Note: Decision zone placement is not included and should be; it may account for
some of the “other” placement.
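The accepted/improper percentages quoted above can be recomputed directly from the fall-quarter frequencies reported for question 7. A minimal sketch (the category assignments follow the CSC's accepted/improper convention; "Other" responses are left uncategorized):

```python
# Fall-quarter placement counts (question 7). Responses 1-3 are
# "accepted" placement methods; the improper-placement responses
# reported in fall are the remaining three non-"Other" categories.
accepted = 12 + 3 + 61   # prev. class at SCC, prev. class elsewhere, Asset
improper = 2 + 1 + 2     # different Asset placement, advised, self-placed
total = 83               # all fall respondents

accepted_pct = round(100 * accepted / total)
improper_pct = round(100 * improper / total)

print(accepted_pct, improper_pct)  # 92 6
```

These rounded figures match the 92% and 6% reported for fall in the analysis above.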
Questions for Further Analysis
Chart essay score performance according to acceptable or improper placement of
students.
Compare performance of students placed by Asset to students placed via English
100 or its equivalent.
Though the n’s are low for many categories, the chart on Appendix A, page 15
shows some interesting results. Students who took Asset and followed placement
as well as those who entered English 101 via the feeder class fared better than
those who took Asset and did not follow placement. However, students who did
not take Asset and self placed performed best of all categories.
8.
What grade do you expect to receive in this class?
 3.5 - 4.0
 2.9 - 3.4
 2.0 - 2.8
 1.2 - 1.9
 Below 1.2
Fall Quarter
Value Label    Frequency   Valid Percent   Cum. Percent
3.5 - 4.0          37           44.6            44.6
2.9 - 3.4          36           43.4            88.0
2.0 - 2.8          10           12.0           100.0
Total              83          100.0
Valid cases: 83    Missing cases: 0

Winter Quarter
Value Label    Frequency   Valid Percent   Cum. Percent
3.5 - 4.0          26           39.4            39.4
2.9 - 3.4          30           45.5            84.8
2.0 - 2.8           8           12.1            97.0
1.2 - 1.9           2            3.0           100.0
Missing             5
Total              71          100.0
Valid cases: 66    Missing cases: 5

Spring Quarter
Value Label    Frequency   Valid Percent   Cum. Percent
3.5 - 4.0          15           33.3            33.3
2.9 - 3.4          22           48.9            82.2
2.0 - 2.8           7           15.6            97.8
Below 1.2           1            2.2           100.0
Missing             1
Total              46          100.0
Valid cases: 45    Missing cases: 1
Observations/Analysis
In aggregate, significant discrepancies existed between what the students expected
to receive and the grade ranges indicated by essay readers. In fall, 44.6%
expected to receive 3.5-4.0, while only 18.1% did according to essay readers. In
winter, the discrepancy was still present, but somewhat smaller, from 36.6% to
21%. In spring, grades were lower and the students knew it, as 32.6% expected
3.5-4.0, while only 14.4% received it from readers. Consistently, though, 77-79%
of students expected 2.9-4.0, while only 36-58% received it.
During fall quarter, fully 100% of students expected to earn a decimal grade of 2.0 or
higher in English 101, while scorers rated 88.7% of fall essays at rubric level 4 (decimal
grade range of 2.0-2.8) or higher. For winter and spring quarters, students who expected a
grade of 2.0 or better remained high (97%; 97.8%), while essay readers rated winter
essays of rubric level 4 or higher at 86.2% and spring essays at 72.9%. The discrepancy
between student expectations for receiving a decimal grade of 2.0 or better and the
percentage of essays which readers scored at grade 2.0 or better was smallest during
winter quarter (10.8%) and greatest spring quarter (24.9%). Statistically, then, students in
English 101 fare best in winter quarter, both in terms of their lowered expectations and
their relatively high success rate. Spring quarter, the numbers indicate, appears fraught
with danger for many English 101 students, since scorers put the percentage of essays
falling at level 3 (decimal grade range of 1.3-1.9) or lower at 27.1%, over twice as high as
fall quarter essays at level 3 or lower (11.2%).
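The expectation-versus-score gaps quoted above follow directly from the reported percentages; a minimal sketch using the values stated in this section:

```python
# Percent of students expecting a decimal grade of 2.0 or better,
# vs. percent of essays scored at rubric level 4 (2.0-2.8) or higher,
# per quarter, as reported in this section.
expected = {"fall": 100.0, "winter": 97.0, "spring": 97.8}
scored = {"fall": 88.7, "winter": 86.2, "spring": 72.9}

gaps = {q: round(expected[q] - scored[q], 1) for q in expected}
print(gaps)  # {'fall': 11.3, 'winter': 10.8, 'spring': 24.9}
```

The computed gaps reproduce the smallest discrepancy in winter (10.8%) and the largest in spring (24.9%).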
Questions for Further Analysis
Is there a correlation between credits earned (#5) and course expectations (#8)? Students
with few credits might have higher, less realistic expectations than those students who
know the rigors of college-level work.
What is the statistical correlation between grade expectations and grades? Are there any
differences between good writers and poor writers in terms of this correlation?
This information was not available.
9. Please rate the effectiveness of the instruction you have received this quarter in the
skills listed below. Use the following scale for your responses.

No response (selection left empty) means "I don't understand the question."

1. Very ineffective or no instruction on this topic.
2. Somewhat effective instruction on this topic.
3. Adequate instruction on this topic.
4. Good instruction on this topic.
5. Very good/excellent instruction on this topic.

a. Developing ideas (using details, sticking to a point, supporting a position, etc.)
b. Organizing ideas (structuring an essay, introductions, conclusions, transitions, etc.)
c. Style/Voice (choosing words carefully, varying sentence structure, developing your own writing style, etc.)
d. Mechanics (how to use punctuation, sentence structure, mechanics of quotation, etc.)
e. Research writing (finding research sources, evaluating sources, integrating sources into writing, documenting methods, etc.)
f. Reading (comprehension, understanding the author's perspective and purpose, your own response to readings, application to personal experience)
g. Group dynamics (collaboration on a task, group problem solving, group discussion, respectful discussion, etc.)
Effectiveness of Types of Instruction, English 101
Expressed in Mean Score on 1-5 Scale, 5 being excellent.

                        FALL    WINTER   SPRING   YEAR
NUMBER OF RESPONSES       73      67       43      183
DEVELOPING IDEAS        4.06    4.16     4.04     4.09
ORGANIZING IDEAS        4.23    4.13     4.15     4.17
STYLE/VOICE             3.96    3.84     3.47     3.76
MECHANICS               3.78    3.46     3.70     3.65
RESEARCH WRITING        3.41    3.41     3.53     3.45
READING                 4.28    4.21     3.80     4.10
GROUP DYNAMICS          3.99    3.84     3.52     3.78
AVERAGE:                3.96    3.86     3.74     3.86
Observations/Analysis
Averaged through the year, student ratings ranked the areas of instruction as follows, from
highest to lowest:

Aspect of Instruction      Year Ave. on 1-5 scale
1. Organizing Ideas              (4.17)
2. Reading                       (4.10)
3. Developing Ideas              (4.09)
4. Group dynamics                (3.78)
5. Style/voice                   (3.76)
6. Mechanics                     (3.65)
7. Research Writing              (3.45)
The top ranking of organization may indicate that this skill was what students needed
most from college-transfer writing.
The high rankings of organization, development, and reading fit well with the emphasis
given to these skills in the MCO and in the minds and approaches of most composition
teachers. These rankings indicate that, according to the students, the course is fulfilling
key elements of its mission in “good” to “very good” fashion.
Student performance (in terms of essay scoring) matched the students’ rating of
instruction closely in organization and development. Over the entire year students
performed best in the category of organization and they also said organization was the
category in which they received the best instruction. Similarly, student essays showed
development to be the third highest category, and they also rated it as the third highest
category in terms of effectiveness.
Most ratings of the effectiveness of instruction held consistent throughout the year.
However, there were three notable categories in which students in spring quarter saw less
effectiveness in instruction. Student ratings in Style/Voice, Reading, and Group
Dynamics dropped 12.3%, 11.2%, and 11.8% respectively. It is interesting to note that
the percentages of the decreases are all very similar and that they all occurred in the same
quarter. This may be related to general differences in spring quarter students that we have
seen reflected in other data.
Even the lowest ranking, research writing, fell between “adequate” and “good.”
The low ranking of research writing can be explained by the fact that research writing is
not the focus of English 101, although it is supposed to be introduced in English 101.
This ranking may also, however, reflect a need to emphasize the research writing
introduction more strongly and/or more consistently throughout the department.
The lower ranking of mechanics, though probably not a surprise to English teachers,
might be difficult to explain to those outside the discipline since many see the purpose of
English 101 as teaching sentence and punctuation conformity.
The relatively low ranking for mechanics may indicate a need for increased instruction in
this area or else better placement, so that students with significant mechanics problems do
not enter English 101. (The department needs to discuss this issue.)
The relatively low ranking of style/voice instruction may indicate either lack of direct
instruction on this aspect, a perception of worse instruction in these aspects of writing, or
student confusion about the terms.
Questions for Further Analysis
Break down student perceptions according to various demographic categories such as
native or non-native speaker, credit accumulation, previous English classes taken, etc.
See Cross-tabulations beginning page 16, Appendix A.
Find statistical correlations between effectiveness of ratings and student performance as
rated in essay scoring.
No significant correlation was found. See chart page 49, Appendix A.
Compare high and low performing students in terms of their perceptions of effectiveness
of instruction.
This information was not available.
10. Is English your first language (the language used in the household in which you
grew up)?
 Yes
 No

Is English Your First Language? (percentage answering yes and no)

         Fall   Winter   Spring
Yes       86      75       70
No        14      25       30
Observations/Analysis
The shift in native/non-native balance through the year is significant. For example, a
class of 27 will have on the average 4 non-native speakers in fall, but 8 in spring, almost
one third of the class.
Non-native speakers may put off English 101 until spring, or the increase may be related
to their need to progress through the ESL classes.
The higher percentage of non-native speakers may contribute to the low spring ratings for
effectiveness of instruction in style/voice, reading, and group dynamics—skill areas for
which many non-native speakers have high needs.
Questions for Further Analysis
Create demographic profile of non-native speakers in terms of credits earned, use of
Writing/Reading Center, grade expectations, and so on.
See Cross-tabulations beginning page 50, Appendix A.
Compare performance in terms of essay scoring of organization, development,
mechanics, and style-voice of non-native speakers vs. native speakers.
The following chart shows that non-native speakers were scored slightly lower than
native speakers in all rhetorical aspects. The greatest gaps occurred in mechanics.
           In Org   In Dev   In Sty   In Mech   Out Org   Out Dev   Out Sty   Out Mech
Not ESL     4.43     4.21     4.31     4.30      4.60      4.45      4.49      4.24
ESL         4.29     4.00     3.98     3.63      4.40      4.22      4.40      3.87

11. What is your gender?
 Male
 Female
[Chart: Gender by percentage, male vs. female. Fall: 58.8 / 41.3; Winter: 54.3 / 45.7; Spring: 53.3 / 46.7]
Observations/Analysis
The gender imbalance is more striking in fall than in winter and spring.
Questions for Further Analysis
Is there a correlation between gender (#11) and expectations?
This information is not available.
Is there a correlation between gender and rating of instruction (#9)?
In every category of instruction, female students rated instructional quality lower than
male students, sometimes significantly lower. See beginning page 57 in Appendix A.
These findings have not been discussed by the CSC; therefore no analysis is offered.
Are women or men more likely to have taken developmental English classes and/or to
have used the Reading/Writing Center?
The numbers of respondents were too low to reach conclusions. English 100 may be the
exception: 56% of those who had taken English 100 were men while 44% were women.
12.
What is the employment status of your instructor?
 Part-time
 Full-time
 I don't know
Observations/Analysis
Most students don’t know their instructor’s employment status.
In spring, there is a significant jump in the percentage of students who do not know this
information. The CSC cannot explain this.
A low percentage of students are aware of having a full-time instructor.
Questions for Further Analysis
Compare percentages given in this question to actual staffing percentages. Prediction:
Most of the “don’t knows” would have to be part time according to staffing percentages.
Staffing percentages for 97-98 were approximately 66% part-time.
ESSAY SCORING RESULTS
Scale: 1 = 0.0, or level 1 on current English 101 rubric; 6 = excellent, or level 6 on rubric.
In-Class Essay / Out-of-Class Essay
Organization
Development
Style/Voice
Mechanics

ESSAY GRADE
What grade level would you assign to this collection?
 3.5 - 4.0
 2.9 - 3.4
 2.0 - 2.8
 1.3 - 1.9
 0.7 - 1.2
 0.0

ESSAY SCORING BY CATEGORIES:
Scores are expressed in the following rubric levels:
Level 6: 3.5 - 4.0
Level 5: 2.9 - 3.4
Level 4: 2.0 - 2.8
Level 3: 1.3 - 1.9
Level 2: 0.7 - 1.2
Level 1: 0.0
                          FALL    WINTER   SPRING   YEAR AVE
NUMBER OF SET READINGS:    162      155      117       434
IN-CLASS DEVELOPMENT      4.13     4.25     4.00      4.13
IN-CLASS ORGANIZATION     4.39     4.40     4.29      4.36
IN-CLASS MECHANICS        4.21     4.17     4.02      4.13
IN-CLASS STYLE            4.09     4.42     4.21      4.24
OUT-CLASS DEVELOPMENT     4.43     4.54     4.19      4.39
OUT-CLASS ORGANIZATION    4.64     4.57     4.34      4.52
OUT-CLASS MECHANICS       4.34     4.45     3.96      4.25
OUT-CLASS STYLE           4.33     4.65     4.31      4.43

Percent of essays at each rubric level:

            LEVEL 6   LEVEL 5   LEVEL 4   LEVEL 3   LEVEL 2   LEVEL 1
            3.5-4.0   2.9-3.4   2.0-2.8   1.3-1.9   0.7-1.2     0.0
FALL          18.1      30.6      40        10.6       0.6        0
WINTER        21        32.3      32.9      12.6       1.2        0
SPRING        14.4      22.9      35.6      25.4       1.7        0
YEAR AVE      17.83     28.60     36.17     16.20      1.17       0
Observations/Analysis
Students in winter quarter outperformed students from fall quarter in three of four
categories, while students in spring scored lower than in the previous quarters in all
categories. Interestingly, students in spring also had lower expectations about the grade
they were to receive.
In spring, essays in levels 4, 5, and 6 dropped 15.8% from winter. Meanwhile, essays in
levels 2 and 3 were up 15.7%. This shift may be related to changes in the characteristics
of students encountered in spring. Students in spring are more frequently non-native
speakers, and more typically did not place into English 101 by Asset but instead entered
through the feeder classes or through improper placement.
The average score for all in-class essays written for the year was 4.22 (based on the 1-6
levels of the rubric, with level six corresponding to a decimal grade range of 3.5-4.0),
suggesting that overall the in-class essays averaged at around a decimal grade of 2.3-2.5.
For out-of-class papers, the average score was just slightly higher at 4.4, corresponding to
a decimal grade range of about 2.4-2.6. In other words, the out-of-class papers on average
were rated only slightly better than the timed in-class papers, in which students had no
opportunity to revise. This slight difference, however, does not take into account the fact
that scorers may have shifted the weight of their criteria depending on whether they were
assessing an in-class or out-of-class paper.
Questions for Further Analysis
Compare student performance to actual grades given in English 101. The essay scoring
report indicates that the average grade given to writing in English 101 should be in the 2.5
range. Grades are probably higher—why? Here are some ideas to discuss and explore:
 Reading panel members may have rated these essays lower because of the lack of
knowledge of the student and the context. It is easier to grade an essay low out of
context.
 Other aspects of English 101 evaluation may have boosted the grade: reading
journals, homework, revision opportunities, extra credit, etc.
READER AGREEMENT
For each batch of essay sets collected, a reading panel was formed to score the sets. This panel
generally consisted of eight members. Some faculty members participated throughout the year,
while others rotated in and out. All full-timers were encouraged to participate to help achieve ongoing
department consistency. Through the course of fall, winter, and spring of 97-98, 17 English
faculty (7 full-time and 10 part-time) served on the reading panel.
Before scoring each quarter’s essays (see results above), faculty members held a norming
session in which they read, scored, and discussed sample essays. The study essays were then
distributed in batches, and panel members were given instructions on whom to forward the
batches for a second reading. Batches were not traded directly between readers but were
distributed in circular fashion. For example, Reader A would give her essays to Reader B,
Reader B would give hers to Reader C, Reader C would give his to Reader D, and so on.
Scoring sheets did not accompany the essays when they were forwarded, so the second readings
always occurred “blind” to the scoring of the first reading.
Below are percentages of inter-reader agreement in each rhetorical category for fall, winter, and
spring. For the purposes of this study, agreement is defined as two evaluators scoring the student
in either the same level or one level apart on the 1-6 scale established in the rubric.
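As a concrete illustration of this definition, the agreement percentage can be computed from two readers' score lists. This is a sketch only; the scores below are hypothetical, not drawn from the study data.

```python
def agreement_rate(first_readings, second_readings):
    """Percentage of score pairs in the same rubric level or one level
    apart on the 1-6 scale (the study's definition of agreement)."""
    pairs = list(zip(first_readings, second_readings))
    agreeing = sum(1 for a, b in pairs if abs(a - b) <= 1)
    return 100.0 * agreeing / len(pairs)

# Hypothetical scores from two readers for one rhetorical category
reader_a = [6, 5, 6, 5, 4, 3, 5, 2]
reader_b = [6, 6, 4, 5, 4, 5, 5, 2]
print(agreement_rate(reader_a, reader_b))  # → 75.0
```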
Percentage of Reader Agreement

                IC Org  IC Dev  IC Sty  IC Mech  OC Org  OC Dev  OC Sty  OC Mech  Quarter Avg
Fall             81.2    81.2    87.0    79.7     86.3    80.0    87.5    85.0      83.5
Winter           85.2    86.9    85.2    86.9     84.1    76.2    88.9    82.5      84.5
Spring           83.6    67.3    76.4    83.6     72.5    82.4    82.4    76.5      78.1
Category Avg     83.3    78.5    82.9    83.4     81.0    79.5    86.3    81.3      82.0
Observations/Analysis
Jim James, Director of Institutional Effectiveness and Research, reports that the goal for a
secondary-level writing assessment project he worked on in Oregon was 90% inter-reader
reliability. Given the differences between secondary composition instruction and community
college composition in terms of writing tasks, academic freedom issues, and staffing levels, the
82% average figure for the year is an admirable starting point.
The drop in reader agreement in spring quarter may reflect the fact that, with a high number of
panel members repeating from previous quarters, less emphasis was given to the norming session
before the essays were scored.
Lower than average percentages in both in-class and out-of-class development point to a
department need to discuss how to assess development in essays.
Jim James suggests that inter-reader reliability data may be sounder if the essays are not
traded in batches but rather re-mixed and dispersed randomly for a second reading.
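A minimal sketch of the random re-mix suggested above, assuming a simple essay-to-reader mapping (all reader names and essay IDs below are hypothetical): pool the scored essays and deal each one to a randomly chosen second reader other than its first reader.

```python
import random

def remix_for_second_reading(first_assignments, readers, seed=None):
    """Randomly assign each essay a second reader who is not its
    first reader. first_assignments maps essay id -> first reader."""
    rng = random.Random(seed)
    return {essay: rng.choice([r for r in readers if r != first_reader])
            for essay, first_reader in first_assignments.items()}

readers = ["A", "B", "C", "D"]
first = {f"essay{i}": readers[i % 4] for i in range(8)}
second = remix_for_second_reading(first, readers, seed=1)
# No essay returns to its first reader
print(all(second[e] != first[e] for e in first))  # → True
```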
OVERALL RECOMMENDATIONS:
The Department will discuss this report in fall 99 and include related recommendations in
the Program Review / Budget process. Some preliminary recommendations forwarded by the
Composition Study Committee include:
 Present the information contained in this report to English faculty and to others interested.
 Hold department discussions to clarify the nature of the “introduction to research” given in English 101.
 Get out information to English faculty on the significant difference in student profile from fall to spring. Create “profile” of a spring English 101 student.
 Hold department discussion and/or presentations on how to handle the mechanics issue in English 101.
 Hold department discussions on how to define and assess (and teach?) development.
 Continue to discuss and engage in professional development in regard to ESL issues.
 Continue the composition study if funding and department staffing allow.
Sample Essays
Over 400 essays were read and scored by two readers each for the Composition Study. From these essays, the
following have been selected as samples. Some are representative of student accomplishment at various levels;
others are included because they show interesting scoring profiles, such as great differences in scoring between the
two readers or far different levels of accomplishment in rhetorical categories.
Scores listed refer to the levels in the English 101 Essay Scoring Rubric for Organization, Development,
Style/Voice, and Mechanics for both In-Class and Out-of-Class Essays. Level 6 is the highest, at a 3.5-4.0 grade
range. The grade in the right column refers to the grade the reader would have given for the two-essay set as a whole.
See the Scoring Rubric for more detail.
These essays may be used for instructional purposes within the classroom.
[[what follows are print collections of essays. These were made available to English faculty]]
Good to Excellent Essays
These essays were generally scored in levels 5-6. See the English 101 Writing Assessment Rubric in Appendix B for
more detail on grade equivalencies and outcomes related to particular levels.
“I don’t attend a scholastically acclaimed four-year university” (in class) and “Uniforms in Public Schools”
            IC Org  IC Dev  IC Sty  IC Mech  OC Org  OC Dev  OC Sty  OC Mech  Grade for Set
Reader A       6       6       6       6       6       6       6       6      3.5-4.0
Reader B       6       6       6       6       6       6       6       6      3.5-4.0
“Blinding Words” (in-class) and “Sidewalks: To Sit or Not to Sit”
            IC Org  IC Dev  IC Sty  IC Mech  OC Org  OC Dev  OC Sty  OC Mech  Grade for Set
Reader A       5       5       6       6       5       5       5       6      3.5-4.0
Reader B       6       6       6       6       6       6       6       6      3.5-4.0
“I’ve Never Really liked hospitals” (in-class) and “Fine Lines”
            IC Org  IC Dev  IC Sty  IC Mech  OC Org  OC Dev  OC Sty  OC Mech  Grade for Set
Reader A       6       5       5       5       6       6       5       5      3.5-4.0
Reader B       6       6       6       6       6       6       6       6      3.5-4.0
“Go Back to Film School” (in-class) and “I’ll Show you ‘Chicken’!”
            IC Org  IC Dev  IC Sty  IC Mech  OC Org  OC Dev  OC Sty  OC Mech  Grade for Set
Reader A       6       5       5       5       6       5       6       5      2.9-3.4
Reader B       5       5       5       4       5       5       5       4      2.9-3.4
“Most classes we take in college are in a traditional setting” (in-class) and “The Unthinkable Titanic”
            IC Org  IC Dev  IC Sty  IC Mech  OC Org  OC Dev  OC Sty  OC Mech  Grade for Set
Reader A       6       5       5       4       6       5       5       3      2.9-3.4
Reader B       5       6       5       4       5       6       5       4      2.9-3.4
Middle-level Passing Essays
These essays were generally scored in levels 4-5. See the English 101 Writing Assessment Rubric in Appendix B for
more detail on grade equivalencies and outcomes related to particular levels.
“I think you can learn by doing ‘peer responses’” (in-class) and “The male gender comes in many forms”
            IC Org  IC Dev  IC Sty  IC Mech  OC Org  OC Dev  OC Sty  OC Mech  Grade for Set
Reader A       5       4       4       3       6       5       6       2      2.9-3.4
Reader B       2       2       3       3       5       6       5       4      2.0-2.8
“Minimalism Can Achieve Maximum Results” (in-class) and “Traditions”
            IC Org  IC Dev  IC Sty  IC Mech  OC Org  OC Dev  OC Sty  OC Mech  Grade for Set
Reader A       5       4       4       5       4       3       4       5      2.0-2.8
Reader B       4       4       4       5       5       4       5       5      2.9-3.4
“Both essays were written by the people who were from immigrant families” (in-class) and “Living Away from Home”
            IC Org  IC Dev  IC Sty  IC Mech  OC Org  OC Dev  OC Sty  OC Mech  Grade for Set
Reader A       4       3       4       4       5       4       4       4      2.0-2.8
Reader B       6       5       5       5       6       6       5       5      2.9-3.4
“My Desk” (in-class) and “Great Expectations”
            IC Org  IC Dev  IC Sty  IC Mech  OC Org  OC Dev  OC Sty  OC Mech  Grade for Set
Reader A       5       4       4       4       4       3       4       4      2.0-2.8
Reader B       4       4       4       4       4       4       4       4      2.0-2.8
“’Making the Grade’ is an article” (in-class) and “Rage Against Writing”
            IC Org  IC Dev  IC Sty  IC Mech  OC Org  OC Dev  OC Sty  OC Mech  Grade for Set
Reader A       4       4       4       2       4       4       4       1      2.0-2.8
Reader B       4       3       3       3       4       4       3       3      2.0-2.8
Non-Passing Essays
These essays were generally evaluated below the 2.0 mark needed for successful English 101 completion. See the
English 101 Writing Assessment Rubric in Appendix B for more detail on grade equivalencies and outcomes related
to particular levels.
“Throughout the quarter the class has read” (in-class) and “Today people of all ages”
            IC Org  IC Dev  IC Sty  IC Mech  OC Org  OC Dev  OC Sty  OC Mech  Grade for Set
Reader A       4       3       4       4       4       4       4       4      1.2-1.9
Reader B       5       3       5       4       3       3       4       3      1.2-1.9
“The problem with owning old cars” (in-class) and “As I gaze out my office window”
            IC Org  IC Dev  IC Sty  IC Mech  OC Org  OC Dev  OC Sty  OC Mech  Grade for Set
Reader A       2       2       2       2       4       5       4       4      2.0-2.8
Reader B       1       1       1       1       4       3       4       4      1.2-1.9
“Embryo Rights” (in-class) and “Defending Microsoft in the Workplace”
            IC Org  IC Dev  IC Sty  IC Mech  OC Org  OC Dev  OC Sty  OC Mech  Grade for Set
Reader A       2       3       3       2       3       2       3       3      1.2-1.9
Reader B       2       2       2       2       3       2       2       2      1.2-1.9
English 101 Essay Assignments
During the essay collection process, instructors were asked to submit essay assignments commonly used in
English 101; the assignments collected are contained in this section. Not all instructors submitted
assignments, so this collection is not comprehensive.
APPENDIX I
Assessment of Outcomes in Economics
2000-2001 Report
For the Institutional Effectiveness Committee
Shoreline Community College
Presented By Professor Tim Payne (Economics): Project Manager
Project Participants:
Bruce Christopherson
Bob Francis
Dongwa Hu
Jimmy Kelsey
Contents:
A. Economics Assessment Project Report
B. Agendas of project meetings
______________________________________________________________________________

A. Economics Assessment Project Report
During the 2000-2001 academic year two assessment projects were undertaken in the
Economics department, “Quantitative Reasoning Outcomes and Assessment in Microeconomics”
and “Analytical Reasoning Outcomes and Assessment in Macroeconomics”. These two projects,
both continuations of projects that began last year, involved coordinating full-time and part-time
Economics faculty members in collaboratively discussing quantitative and analytical reasoning
outcomes and assessing our teaching. Last year (1999-2000) our faculty members collaborated in
beginning the process of identifying specific quantitative reasoning outcomes in Microeconomics
and analytical reasoning outcomes in Macroeconomics. We also created a first-draft set of
exercises to assess those outcomes, which are now available for use by all Economics faculty
members. This year's projects are a continuation of what was started a year before, and
included the following goals:
 Creation of a complete list of specific quantitative reasoning outcomes for
Microeconomics and analytical reasoning outcomes for Macroeconomics,
 Review of the first draft packet of exercises for classroom assessment developed
last year. This involved performing the assessment exercises we created last year
in our classes, tabulating results, and asking the questions "What is working?",
"What tools need to be revised or improved?", "What tools are needed in addition
to those that we developed (based on outcomes we identify in the process
above)?", and "What tools are unnecessary?"
 Recommendations for future Economics department work on quantitative
reasoning outcomes and assessments
In pursuing the above goals, we exceeded the initial expectations of the original
project proposals with the following additional accomplishments:
 Incorporation of the lists of outcomes we created into three different Master
Course Outlines (MCOs),
 Creation of self-help tutorials and other study and learning aids for students, and
 Creation of a lively forum for discussing teaching techniques and approaches.
A more thorough discussion of each of these points follows.
1. Creation of a complete list of specific quantitative and analytical reasoning
outcomes
One important accomplishment of this year’s projects was the creation of complete lists of
learning outcomes for both microeconomics and macroeconomics. While our group began this
process last year, we had started with the easy part, making a kind of “wish list” of all of the
things one could teach in a microeconomics or macroeconomics class. But in doing this we
found that there were topics that not all of us think are important enough to be a required learning
outcome. What we needed to do as a group was to narrow down the list to topics we all
considered crucial without making such a long list that it would preclude in-depth coverage of
any topic. We needed to trim the fat. We determined the topics and content that should be
covered in microeconomics and macroeconomics, and developed and further clarified learning
outcomes. We then used the lists of outcomes we created to revise the MCOs for ECON 100,
200, and 201. This was a timely accomplishment since there was a departmental need to
collaborate in MCO revision to meet timelines necessary for accreditation review. The
discussions that led to our completed outcomes lists were crucial to the progress we made in
achieving other goals of the project.
2. Review of the first draft packet of exercises for classroom assessment developed last
year.
Through the process described above, we now had a good idea about the outcomes we
wished to assess. While last year’s projects involved creating packets of exercises for assessing
many outcomes, none of the instructors involved in the project had had much opportunity to
utilize the exercises in their classes. Thus we had little information about the effectiveness of the
assessment tools we had created. One goal of this year’s projects was to evaluate our assessment
tools.
In order to evaluate the effectiveness of the exercises we created last year, we each chose a
few exercises to use in our classes each quarter and asked specific questions. The questions
asked included the following:
 How much time did it take students to complete the exercises in class?
 Did the students understand the directions?
 How much time did it take afterward for class discussion of the results?
 Was the complexity of the exercise appropriate given limited class time?
 How much time was needed by the instructor to tabulate the results?
 How did students demonstrate their understanding of the subject? Were they "getting it"?
 Was the assessment process itself useful in improving student comprehension?
The process of evaluating our assessment exercises led to revision of several exercises,
mostly in the direction of simplifying and streamlining in an attempt to save time and alleviate
confusion. More work could be done in this area.
Another unanticipated accomplishment that came from this process was the idea of creating self-help tutorials, study aids, and other tools and exercises that instructors might provide for students
outside of the requirements of a class. Instructors are often challenged with the wide range of
student abilities at the beginning of a quarter, especially when it comes to the quantitative skills
required in Microeconomics. Consequently, we felt the need to develop additional study
resources that could be made available to students who could benefit from extra practice. The
Economics department is now committed to creating a web-based learning center that will
provide students with exercises created by Shoreline Economics faculty as well as links to
tutorial websites provided by publishers and other schools. Professor Dongwa Hu will
coordinate the creation of the web-learning center during the 2001-02 academic year.
3. Recommendations for future Economics department work on quantitative and
analytical reasoning outcomes and assessments
The projects undertaken over the last two years have been rewarding for faculty participants.
Our hope is that the most important result of these projects has also been achieved: an
enhancement in the learning of our students. However, that is a result that we have not yet tried
to quantify. More work could be done in measuring student learning, retention of concepts, and
ability to succeed in more advanced classes. Many of the classroom assessment exercises we
developed could be examined in more depth and refined, revised or replaced.
All participants agree that another area that could reap future rewards is continuing the
level of communication and understanding among full-time and part-time Economics faculty
members created by these projects. Not only are we all more informed about outcomes
assessment, but we were also able to create a consensus around many important issues in
teaching Economics. By openly discussing and listening to each other’s views and teaching
methods, we have become active learners as well as better educators. And perhaps the most
rewarding accomplishment of these projects was that in making time to meet with our colleagues
we created a clearinghouse for the discussion of teaching techniques and approaches. It seems
that in our day-to-day teaching lives we get little if any time to talk about teaching with
others who teach in the same field. These projects gave us that opportunity, and we are grateful.
Our discussions of course content and emphasis, the relevance of different subjects and
concepts, and different approaches to teaching them were always interesting and spirited. While
some subjects were straightforward and easy to agree upon, others provided the seeds for lively
debate, from Professor Christopherson’s view on traditional economies to Professor Francis’
admiration for colorless textbooks to Professor Hu’s view of elasticity of demand. While these
might sound mundane to a non-economist, they proved to be important steps in creating a
process of examining what and how we teach. And for all involved, there were insights and new
ideas about how to become more effective educators.
TO:    Bruce Christopherson, Bob Francis, Dongwa Hu, Jimmy Kelsey
FROM:  Tim Payne
DATE:  January 16, 2001
RE:    Outcomes Assessment Project for 2001
Everyone:
As I have discussed with each of you individually, we were fortunate to once again
receive some grant money for outcomes assessment in microeconomics and macroeconomics for
the year 2001. I foresee this year's projects as a continuation of what we started last year. I was
very pleased about the process we undertook last year and the results that we produced, and I am
excited about the possibilities of what we might create this year. I would like to invite you to our
first group meeting on Thursday January 18 at 3:00 pm in room 5331m (formerly 2225). A
tentative agenda is included below. I look forward to seeing you there.
Tim
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
AGENDA
1. Review of last year's project results
Tim
2. Overview of this year's project proposal
Tim
3. Suggestions of other project goals
All
4. Assignments for next time
All
5. Scheduling future meetings
All
IEAR, 8/01
82
Economics Assessment Project 2001
Tasks for this year's project:
1. Review last year's project
2. Complete lists of topics and outcomes for ECON 200 and 201
    Bring suggestions next time
3. Assess our assessment
    Choose a few exercises each to use in this quarter's classes
    Analyze results
       Time for students to complete
       Time taken afterward for discussion in class
    Answer questions like:
       How did students demonstrate their understanding of the concept (did they get it)?
       Was the assessment useful in improving student comprehension?
       Other questions?
    Other things we might want to know about the assessment process?
4. Create a mathematical aptitude pretest for ECON 200
    Why?
    How?
    What skills should we be testing?
5. Other tasks for the year
6. Scheduling future meetings
7. Other
TO:    Bruce Christopherson, Bob Francis, Dongwa Hu, Jimmy Kelsey
FROM:  Tim Payne
DATE:  January 30, 2001
RE:    Agenda for next meeting
Economics Assessment Project 2001
Agenda for Meeting on
February 1
3:00 pm in room 5331m (formerly room 2225)
1. Complete lists of topics and outcomes for ECON 200 and 201
 suggestions for additions and changes to list generated last year
2. Assess our assessment
 Discussion of exercises used in your classes during the last two weeks
 Assignment for next meeting:
 Choose a few exercises each to use in this quarter's classes
 Analyze results
 Time for students to complete
 Time taken afterward for discussion in class
 Answer questions like:
 How did students demonstrate their understanding of the concept (did they get it)?
 Was the assessment useful in improving student comprehension?
 Other questions?
 Other things we might want to know about the assessment process?
3. Create a mathematical aptitude pretest for ECON 200
 Why?
 How?
 What skills should we be testing?
4. Other tasks for the year
5. Scheduling future meetings
 Next meeting on February 15, 2001 (3:00 pm)
6. Other
TO:    Bruce Christopherson, Bob Francis, Dongwa Hu, Jimmy Kelsey
FROM:  Tim Payne
DATE:  Feb. 13, 2001
RE:    Agenda for next meeting
Economics Assessment Project 2001
Agenda for Meeting on
February 15
3:30 pm in room 5331m (formerly room 2225)
1. Finalizing lists of topics and outcomes for ECON 200 and 201
 suggestions for additions and changes to list generated last year
2. Assess our assessment
 Discussion of exercises used in your classes during the last four weeks
 Assignment for next meeting:
 Choose a few exercises each to use in this quarter's classes
 Analyze results
 Time for students to complete
 Time taken afterward for discussion in class
 Answer questions like:
 How did students demonstrate their understanding of the concept (did they get it)?
 Was the assessment useful in improving student comprehension?
 Other questions?
 Other things we might want to know about the assessment process?
3. Creating resources for quantitative reasoning and mathematical skills improvement
 Refining packet of exercises from #2 above
 Textbook exercises
 CD rom study guides, websites, etc.
 Math learning center
4. Other tasks for the year
 MCO revision
5. Scheduling future meetings
 Next meeting on March, 2001 (3:30 pm)
6. Other
TO:    Bruce Christopherson, Bob Francis, Dongwa Hu, Jimmy Kelsey
FROM:  Tim Payne
DATE:  Feb. 27, 2001
RE:    Agenda for next meeting
Economics Assessment Project 2001
Agenda for Meeting on
March 1
3:30 pm in room 5302 (large conference room)
1. Complete review and discussion of MCOs for ECON 201 and ECON 100
2. Finalizing lists of topics and outcomes for ECON 200 and 201
 suggestions for additions and changes to list generated last year
3. Assess our assessment
 Discussion of exercises used in your classes during the last six weeks
 Assignment for next meeting:
 Choose a few exercises each to use in this quarter's classes
 Analyze results
 Time for students to complete
 Time taken afterward for discussion in class
 Answer questions like:
 How did students demonstrate their understanding of the concept (did they get it)?
 Was the assessment useful in improving student comprehension?
 Other questions?
 Other things we might want to know about the assessment process?
4. Creating resources for quantitative reasoning and mathematical skills improvement
 Refining packet of exercises from #2 above
 Textbook exercises
 CD rom study guides, websites, etc.
 Math learning center
5. Scheduling future meetings
 Next meeting on March 29, 2001 (3:30 pm)
6. Other
TO:    Bruce Christopherson, Bob Francis, Dongwa Hu, Jimmy Kelsey
FROM:  Tim Payne
DATE:  April 16, 2001
RE:    Agenda for next meeting
Economics Assessment Project 2001
Agenda for Meeting on
Wednesday April 18
3:00 pm in room 5331m (formerly room 2225)
1. Finalizing lists of topics and outcomes for ECON 200 and 201
 suggestions for additions and changes to list generated last year
2. Assess our assessment
 Discussion of exercises used in your classes during the last four weeks
 Assignment for next meeting:
 Choose a few exercises each to use in this quarter's classes
 Analyze results
 Time for students to complete
 Time taken afterward for discussion in class
 Answer questions like:
 How did students demonstrate their understanding of the concept (did they get it)?
 Was the assessment useful in improving student comprehension?
 Other questions?
 Other things we might want to know about the assessment process?
3. Creating resources for quantitative reasoning and mathematical skills improvement
 Refining packet of exercises from #2 above
 Textbook exercises
 CD ROM study guides, websites, etc.
 Math learning center
4. Other tasks for the year
 MCO revision for ECON 100
5. Scheduling future meetings
6. Other
TO:    Bruce Christopherson, Bob Francis, Dongwa Hu, Jimmy Kelsey
FROM:  Tim Payne
DATE:  April 24, 2001
RE:    Agenda for next meeting
Economics Assessment Project 2001
Agenda for Meeting on
Wednesday April 25
3:30 pm in room 5331m (formerly room 2225)
1. MCO revision for ECON 100 --- final steps
2. Assess our assessment
 Discussion of exercises used in your classes during the last two quarters
 Assignment for next meeting:
 Choose a few exercises each to use in this quarter's classes
 Analyze results
 Time for students to complete
 Time taken afterward for discussion in class
 Answer questions like:
 How did students demonstrate their understanding of the concept (did they get it)?
 Was the assessment useful in improving student comprehension?
 Other questions?
 Other things we might want to know about the assessment process?
3. Creating resources for quantitative reasoning and mathematical skills improvement
 Refining packet of exercises
 Textbook exercises
 CD ROM study guides, websites, etc.
 Math learning center
4. Other tasks for the year
5. Scheduling future meetings
TO:    Bruce Christopherson, Bob Francis, Dongwa Hu, Jimmy Kelsey
FROM:  Tim Payne
DATE:  April 30, 2001
RE:    Agenda for next meeting
Economics Assessment Project 2001
Agenda for Meeting on
Wednesday May 2
3:00 pm in room 5331m (formerly room 2225)
1. MCO revision for ECON 100 --- final steps
2. Assess our assessment
 Discussion of exercises used in your classes during the last two quarters
 Assignment for next meeting:
 Choose a few exercises each to use in this quarter's classes
 Analyze results
 Time for students to complete
 Time taken afterward for discussion in class
 Answer questions like:
 How did students demonstrate their understanding of the concept (did they get it)?
 Was the assessment useful in improving student comprehension?
 Other questions?
 Other things we might want to know about the assessment process?
3. Creating resources for quantitative reasoning and mathematical skills improvement
 Refining packet of exercises
 Textbook exercises
 CD rom study guides, websites, etc.
 Math learning center
4. Other tasks for the year
5. Scheduling future meetings
TO:    Bruce Christopherson, Bob Francis, Dongwa Hu, Jimmy Kelsey
FROM:  Tim Payne
DATE:  May 28, 2001
RE:    Agenda for next meeting
Hey everyone. We have enough money left in the budget for one more meeting of about one
hour to wrap things up. The timing of this meeting is approximate…the Faculty Senate is
meeting tomorrow at 2:00-3:30 for its year-end all-faculty meeting (you should come). It is
scheduled to end at 3:30 but could run late.
Economics Assessment Project 2001
Agenda for Meeting on
Wednesday May 30
3:30 pm in room 5331m (formerly room 2225)
1. Report of progress in including Economics in Math Learning Center
2. Assess our assessment
 Discussion of exercises used in your classes during the last two quarters
 Assignment for next meeting:
 Choose a few exercises each to use in this quarter's classes
 Analyze results
 Time for students to complete
 Time taken afterward for discussion in class
 Answer questions like:
 How did students demonstrate their understanding of the concept (did they get it)?
 Was the assessment useful in improving student comprehension?
 Other questions?
 Other things we might want to know about the assessment process?
3. Summary of project accomplishments
APPENDIX J
Math Learning Center Evaluation
Project Manager & e-mail address: Margaret Anne Rogers, mrogers@ctc.edu
Overview
Every year the students using the Math Learning Center are asked to evaluate our efforts. Unfortunately we have not
heard from the students who do not use the Center. The goal of this project was to survey a cross section of students to
learn why students do and do not use the Math Learning Center. A survey was handed out and collected in four classes:
Math 70, Winter term 2001
Math 80, Spring term 2001
Math 110, Winter term 2001
Math 120, Spring term 2001
In each case the survey was completed by all students attending the class during the seventh week of the term. Each of
the four classes was a day class and none of the four instructors did anything special (such as posting sample tests in the
MLC) to encourage their students to use the Center. All students were asked to complete the first page of general
questions. The last question asked about the number of times they used the Math Learning Center that term. On the basis
of their responses, students self-identified into one of three groups:
Group I Students who had not used the Math Learning Center that term.
Group II Students who had used the MLC between 1 and 3 times that term.
Group III Students who had used the MLC more than 3 times that term.
A total of 72 surveys were collected and analyzed in the following report.
Significant conclusions are also included when appropriate.
Challenges:
During winter quarter, many students failed to follow instructions and filled in responses for more than one Group. It was
difficult to decide how to include these students since they did not have the chance to finish the complete questionnaire.
During spring quarter I handed out the questionnaire and made sure the instructions were clear.
In retrospect, it would have been wise to include an evening class.
Almost all of the report is in narrative form and so is quite lengthy. Given more time, the results could be summarized in
tables.
The information should be analyzed using a more statistical approach. In particular, the various
correlations noted and conclusions drawn should be checked for reliability.
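One simple version of such a reliability check is a confidence interval around a reported proportion. The sketch below uses the normal approximation and the Question 5 figure that 26 of 72 surveyed students never used the MLC; it is illustrative only, not the analysis the report calls for.

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Approximate 95% confidence interval for a survey proportion
    using the normal approximation."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

# 26 of 72 students reported never using the MLC that term
low, high = proportion_ci(26, 72)
print(f"{low:.2f} to {high:.2f}")  # → 0.25 to 0.47
```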
GENERAL QUESTIONS
Results from the four classes (a total of 72 students) are combined in the following. When one class or quarter seems to
vary significantly from the others, the distinction is noted.
Question 1: Convenience of MLC hours
Monday – Thursday, 8:00 – 5:45    Convenient: 58    Not convenient: 5
Friday, 8:00 – 2:30               Convenient: 45    Not convenient: 12
Sunday, 1:00 – 4:00               Convenient: 44    Not convenient: 14
Note that some students did not respond to each blank. No student listed all three times as "not convenient" and almost
all listed two or three times as "convenient".
Suggested additional hours:
Monday – Thursday, 5:45–7:00 (person listed all given times as convenient)
Monday – Thursday, 6:30–8:00 (person only listed Sunday as convenient)
Monday – Friday, 5:45-7:00 (person listed Friday and Sunday times as convenient)
Friday, 2:30-5:00 (person listed given times except earlier Friday as convenient)
Friday, 2:30 –8:00 (person listed all given times except earlier Friday as convenient)
Saturday, 9:00 – 1:00 (person listed all given times except Sunday as convenient)
Sunday, 12:00-1:00 and 4:00-6:00 (person listed all given times as convenient)
Conclusion: In general, the students surveyed found the present hours to be convenient. Seven students suggested
additional hours, but there was little consistency as to when they wished the additional times. Five of the seven students
suggesting additional times were available to use the Math Learning Center during most of our current hours.
Question 2: How are you doing in your math course?
Better than I had expected       14
As well as I had expected        23
Somewhat below expectations      30
Far below expectations            4
No response                       1
Math 120 students were the least satisfied with themselves.
- All four of the "far below expectations" came from Math 120.
- Math 120 was the only course which had no students doing better than expected.
In contrast, Math 80 students were the most satisfied.
- Five of the ten respondents were doing "better than expected".
Most of the students using the MLC came from the middle two groups.
- In the "better than expected" group, only three used the MLC three or more times whereas eight did not use it at all.
- In the "far below expectations" group, three did not use the MLC at all and only one came 1-3 times.
- Of the 53 students in the middle two groups ("as well as expected" and "somewhat below expectations") 16 students
used the MLC 1-3 times and 23 used it more than 3 times.
Conclusion: The MLC draws primarily from students whose class performance is either "as expected" or "somewhat
below expectations". (A previous study showed a direct correlation between the number of times the student used the
MLC and the grade earned in math.)
Question 3: Number of credits this term
Only this math course: 9
6 to 10 credits: 14
11 to 15 credits: 40
16 to 18 credits: 5
Over 18 credits: 4
Math 80 students were the only group which did not conform to this general curve.
- Nine of the eleven respondents took 10 or fewer credits.
- No student took over 15 credits.
Question 4: Number of math study hours each week
Below 3 hours: 27
3 to 6 hours: 36
6 to 10 hours: 5
More than 10 hours: 3
No response: 1
Math 80 students studied the most.
- All of the eleven respondents studied at least 3 hours a week.
- Two studied more than 10 hours a week.
Math 120 students studied the least.
- Thirteen of the twenty-two respondents studied fewer than 3 hours.
- None studied more than 10 hours.
Conclusion: Students don't spend much time studying. This is particularly apparent when one realizes that the survey was
completed by the more conscientious students – those who came to class that day.
Question 5: Number of times Math Learning Center was used this quarter
Never: 26
1-3 times: 20
More than 3 times: 26
Usage varied considerably by class.
- Approximately half of the Math 70 and Math 110 students used the MLC over 3 times.
- Only 13% of the Math 120 students used the MLC over 3 times.
Conclusion: 63% of the surveyed students used the MLC at least once. In general, about 55% of the eligible students use
the MLC at least once. Since only the more conscientious students would still be attending class on the seventh week, it
is not surprising that a disproportionate number use the MLC.
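The reported usage rates can be checked with a short calculation (a sketch in Python; the counts are the Question 5 tallies from this survey):

```python
# Question 5 tallies: how often the 72 surveyed students used the MLC this quarter.
counts = {"never": 26, "1-3 times": 20, "more than 3 times": 26}

total = sum(counts.values())                  # 72 students surveyed
used_at_least_once = total - counts["never"]  # 46 students

# 46/72 is about 63.9%, which the report gives as 63%.
pct = 100 * used_at_least_once / total
print(f"{used_at_least_once} of {total} students ({pct:.1f}%) used the MLC at least once")
```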
GROUP 1 Students who did not use the MLC that quarter
Results from the four classes (a total of 26 students) are combined in the following. When one class or quarter seems to
vary significantly from the others, the distinction is noted.
Question 1: Is this your first math course at Shoreline Community College?
Yes: 13
No: 13

If no, how many times did you use the MLC during your previous math course?
Never: 6    1-3 times: 3    Over 3 times: 3    No response: 1
Question 2: Where do you usually go for math help?
The reasons are listed in order of frequency of response.
Ask a friend or relative: 18
Work with classmates: 9
Seek help from my instructor: 6
Almost never need help: 6
Work alone, even when I am having difficulties: 4
Other: 0
Three of the students who worked alone, even when having difficulties, said that they also worked with friends or
classmates. The one student who only worked alone was in Math 120. The student found the MLC hours convenient,
was taking 16-18 credits, spent below 3 hours a week on math, rated his outside obligations as "minimal", and was doing
"far below expectations" in math.
Question 3: How do you rate your outside obligations?
Minimal: 6
Average: 13
Very time-consuming: 7
There did not seem to be any correlation between outside obligations and total number of credits taken, level of math
course, or success in the course.
Question 4: Reasons why you haven't used the MLC this term.
Twenty-five of the twenty-six students responded. Each response is listed below with only mild revisions for spelling and
grammar.
Math 70:
- Because I work and I don't make it there. (Student rated family/job as "minimal", MLC hours as convenient, and was
doing "as well as expected".)
- Because I work a lot. (Student rated family/job obligations as "average", MLC hours as convenient, and was doing "as
well as expected".)
- They did not have what I needed – a textbook without answers. (Student was doing "better than expected".)
- I can't make time for it in my schedule. (Student rated job/family as "average", MLC hours as convenient, and was
doing "somewhat below expectations".)
- I never had to go. I probably would in a higher math course. (Student was doing "better than expected".)
Math 80:
- I don't need to go. (Student is doing "as well as expected" and is only taking that one course.)
- Haven't needed to. I am maintaining a 4.0 GPA. (Student is doing "better than expected".)
- There is no need. (Student is doing "better than expected" and is only taking that one course.)
- No need. (Student is doing "better than expected" and is only taking that one course.)
- Don't want to. (Student is doing "as well as expected" and is only taking that one course.)
Math 110:
- I don't have time. I work and I don't live close by. (Student rates family/job as "more time-consuming than most", is
taking 11-15 credits, says the MLC hours are convenient, but is doing "somewhat below expectations" in math.)
- I haven't any math problems. (Student is doing "better than expected".)
- I have class at BHS right before this and then I rush to get here. Then I leave right after class to get to work. I don't
come on Sundays because I have church. (Student rates family/job as "more time-consuming than most", is only taking
that one course, and is doing "as well as expected".)
- I don't like it. (Student is doing "better than expected".)
- I have not needed it yet. (Student is doing "better than expected".)
- Because I don't have a lot of extra time and I sometimes don't like asking for help. (Student rates family/job as
"average", is only taking this course, but is doing "somewhat below expectations".)
Math 120:
- I don't have time. I have a class at 8:30 and my last class is at 11:30. After that I have to work right away. (Student
rates family/job as "more time-consuming than most", is taking 11-15 credits, but is doing "far below expectations".)
- I don't feel that I need it. I would rather learn from my professor. (Student is doing "as well as expected".)
- Unfamiliarity with who goes there. (Student is doing "somewhat below expectations".)
- I haven't had much homework that was required so the big load from my science class kept me busy. I just haven't
studied any for my math class. (Student is doing "as well as expected".)
- Because my teacher and classmates have all been able to clarify any troubles/misunderstandings I have had this quarter.
(Doing "somewhat below expectations".)
- If I went to the MLC then I'd have to work on math and I'd rather do something else. (Rates family/job as "minimal"
and is doing "far below expectations".)
- I didn't think I needed to use the MLC because I didn't know that I was doing so badly. (Student is doing "far below
expectations".)
- I am not sure it will help me. (Student is taking 11-15 credits, rates family/job as "average", and is doing "far below
expectations".)
- I usually go to work after school and carpool to and from school. (Student is taking 6-10 credits, rates family/job as
"more time-consuming than most", and is doing "somewhat below expectations.")
Conclusion: A great number of students claim they have the time and acknowledge they are not doing well, and yet they
do not take advantage of the MLC.
Question 5: What, if anything, would have made you use the MLC this term?
Nineteen of the twenty-six students responded. Each response is listed below with only mild revisions for spelling and
grammar.
Math 70:
- A harder math course.
- The need to make up a test or to receive extra credit.
- If they had a math text without the answers.
Math 80:
- Harder math so that I needed to come.
- Higher math class.
- Lower grades.
Math 110:
- Suggestions from other students.
- If I knew that I wasn't (overall) doing well. (Student is doing "far below expectations".)
- If it were a Monday, Wednesday or Friday and class was canceled and I had a project due.
- If I had needed help on some of the big homework projects.
- If I needed to borrow a calculator for a test.
- If I had a difficult assignment that needed to be turned in for credit.
- If the MLC offered sessions on certain things, such as functions, trig, etc.
- I might have to go to the MLC now because I am failing my math class. (Previously said did not use the MLC because
had to leave immediately after class for work.)
Math 120:
- If my instructor couldn't make me understand the information we were learning.
(Student is doing "somewhat below expectations".)
- If I were falling behind in class or didn't understand something.
- If I got better help.
- To better my grade in this math course.
- If I couldn't understand the teacher or had problems with my homework.
Conclusion: Of the students who use the MLC, many saw it as a place to work with friends or to work alone – knowing
that help or an answer book is nearby. The students who never used the MLC saw it primarily as a place for people to
receive help from a learning assistant.
Question 6: How do you think you would be doing in your math course if you had made more use of the MLC?
About the same: 12
Somewhat better: 11
Much better: 2
No answer: 1
The two students who responded "much better" were much alike. Both were in Math 70. Both said the MLC hours are
convenient, that they were taking 6-10 credits, and they were doing "somewhat below expectations". One studied 3-6
hours a week and rated job/family as "average" and the other studied below 3 hours a week and rated job/family as "more
time-consuming than most".
Conclusion: Slightly over half of the students felt they would be doing better if they had made more use of the Math
Learning Center. (A similar question for those who used the MLC – and hence knew more about it – showed a still
higher percentage who felt that using the MLC more would have improved their math performance.)
Question 7: What type of students do you think use the MLC?
Twenty-five of the twenty-six students responded. Each response is listed below with only mild revisions for spelling
and grammar.
Math 70:
- Responsible
- Smart people who have a driver's license and who live in Shoreline and have time.
- Maybe the students who want to get better at math.
- Students who are having trouble with their math, or just need a space to work.
- Any type.
Math 80:
- Unspecified.
- Don't know.
- People who wish to know more about math.
- People who need to.
- People who need help.
Math 110:
- People who need it.
- Anyone who struggles with math in general or some who don't understand some math concepts.
- Stupid.
- Various types.
- Hard-working students.
- People who have the time.
Math 120:
- Average.
- Those who need the help.
- Ones that need to do well in math.
- People who don't fully understand the concepts explained in class by their teacher.
- Students who care about learning the best they can; students who realize how easy it is to get help in the MLC. (This
student used the MLC on previous terms.)
- Studious.
- People who don't understand math and need help with it; also ones who are studying for help on tests, homework, etc.
- I think that they are hard workers and that they just like math a lot.
- Smart.
Conclusion: There don't seem to be any prevalent stereotypes of MLC users which need to be corrected or dispelled.
GROUP 2 Students who used the MLC one to three times that quarter
Results from the four classes (a total of 20 students) are combined in the following. When one class or quarter seems to
vary significantly from the others, the distinction is noted.
Question 1: Where do you usually go for math help?
Work with classmates: 10
Ask a friend or relative: 9
Work alone, even when I am having difficulties: 8
Seek help from my instructor: 7
Almost never need help: 4
Other: 0
These students are consistent with those in Group 3 (those who used the MLC more than 3 times) with one major
exception. 35% of Group 2 sought help from their instructors whereas only 12% in Group 3 went to their instructors.
Several possibilities come to mind:
1) The MLC hours are more convenient
2) They prefer the help they receive from the MLC, either because they want another explanation or else
because they feel more comfortable with someone who will not be grading them.
A total of 40% of the respondents said they "work alone, even when having difficulties" but they also listed at least one
other source of help. The alarming statistic is that two students (10% of the respondents) only listed "work alone".
These students were very similar. Both were in Math 120 and both said they were doing "somewhat below expectations"
in their math course. Both of these students were taking 11-15 credits, rated their various obligations as "minimal", and
claimed that the hours of the Math Learning Center were convenient.
Conclusion: The MLC seems to take over the role of instructor's office hours for many students.
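The 35% versus 12% comparison above can be reproduced from the group tallies (a sketch; the counts come from the "seek help from my instructor" responses reported for Groups 2 and 3 in this section):

```python
# "Seek help from my instructor" responses, by MLC-usage group.
group2 = {"respondents": 20, "instructor": 7}   # used the MLC 1-3 times
group3 = {"respondents": 26, "instructor": 3}   # used the MLC more than 3 times

pct2 = 100 * group2["instructor"] / group2["respondents"]  # exactly 35%
pct3 = 100 * group3["instructor"] / group3["respondents"]  # about 11.5%, reported as 12%
print(f"Group 2: {pct2:.0f}%  Group 3: {pct3:.0f}%")
```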
Question 2: Outside obligations?
Minimal: 5
Average: 10
Very time-consuming: 5
There did not seem to be any correlation between outside obligations and total number of credits taken, level of math
course, or success in the course.
Question 3: Reasons why used the Math Learning Center this term.
The reasons are listed in order of frequency of response.
To get a Learning Assistant's help on a math problem: 12
To work with classmates: 11
To use an answer book: 6
To study alone: 5
To pick up a handout or sample test: 4
To use a computer: 3
To take a test (left by instructor): 1
To get advice on which math course I should take: 1
To borrow a book or calculator: 1
Other (to download programs for TI-83): 1
Conclusion: The primary reason people use the MLC is to get help from a Learning Assistant. A significant number also
use the Center to study with classmates. The responses are quite similar to those who used the MLC more than three
times.
Question 4: What do you recall about your visit(s) to the Math Learning Center?
Felt well treated: 12
Did not think much about it, one way or the other: 5
Did not feel well treated: 3
Students who did not feel well treated were asked to explain. One did not answer, and the other two elaborated as
follows:
- The Learning Assistant was very polite and pleasant, however the language barrier caused a problem for our
communication. I needed a clear explanation, and when I didn't understand her she told me I just needed to do the
problem again. I feel that if English had been her first language, then I would have been able to understand her. (Math
110)
- I need more personal help like a tutor would give help. I feel like the MLC should give you 1 hour sessions of tutoring.
Two other students used the space intended for complaints to make the following comment:
- You need more helpers who know what they are talking about. (Math 120; did not think about treatment one way or
the other)
- I think the MLC is a great resource. As long as I'm at Shoreline CC taking math, I will always visit. Keep up the great
work and thank you. (Math 70; felt well treated)
Conclusion: In general students were pleased with their MLC experience. It does not seem that these students used the
MLC so infrequently because they were dissatisfied with it.
Question 5: How do you think you would be doing in your math course if you had made more use of the MLC?
About the same: 8
Somewhat better: 9
Much better: 2
No answer: 1
Over half of the students felt they would be doing better if they had made more use of the Math Learning Center.
The two students who responded "much better" were much alike. Both were in Math 120. Both said the MLC hours are
convenient and they felt "well treated". Both were doing below their expectations for themselves. One was taking only
that math course and thought his other obligations were "about average". The other was taking 6-10 credits and rated his
other obligations as "minimal".
Conclusion: Students recognize the value of the MLC even though they do not always take advantage of it.
GROUP 3 Students who used the MLC more than three times that quarter
Results from the four classes (a total of 26 students) are combined in the following. When one class or quarter seems to
vary significantly from the others, the distinction is noted.
Question 1: Is the MLC your primary source of math help outside of the classroom?
Yes: 19
No: 5
No response: 2
Other sources of math help?
Only use the MLC: 8
Ask a friend or relative: 12
Work with classmates outside the MLC: 8
Seek help from my instructor: 3
Other: 2 (Internet site, private tutor)
The Math Learning Center was the primary source of help for most of these students, particularly in the Spring term,
when only one student responded "no". (That one person had a private math tutor.)
Conclusion: Clearly the MLC is an important part of the lives of many students. Frequent users of the MLC seem to
make very little use of their instructor's office hours. This is particularly surprising since at least one of the four
instructors is renowned for being friendly and available.
Question 2: Outside obligations?
Minimal: 5
Average: 12
Very time-consuming: 7
No response: 2
The results ran opposite to the level of the class: the highest level, Math 120, perceived themselves as the least busy,
while the lowest level, Math 70, felt they were the busiest.
Those who rated their outside obligations as "more time-consuming than most other students" tended to have lighter
schedules. Two were currently enrolled in 11-15 credits, and the remaining five had 10 or fewer credits.
Conclusion: There is no correlation between students having many family and job obligations and their use of the MLC.
All three groups were almost identical in their ratings of their outside obligations. In each case, half regarded themselves
as "average"; a quarter said their obligations were "minimal"; a quarter said their obligations were "more time-consuming
than most other students".
Question 3: Reasons why used the Math Learning Center this term.
The reasons are listed in order of frequency of response.
To get a Learning Assistant's help on a math problem: 22
To work with classmates: 13
To study alone: 13
To pick up a handout or sample test: 9
To take a test (left by instructor): 4
To get advice on which math course I should take: 3
To borrow a book or calculator: 3
To use an answer book: 3
To use a computer: 0
Other: 0
The responses are quite similar to the students who used the MLC less often.
Conclusion: The primary reason people use the MLC is to get help from a Learning Assistant. A significant number also
use the Center to study with classmates or to work alone. Few students use the solutions books even though they often
are not available through the bookstore and, in some cases, we have complete (instructors') solutions guides. These
students are also not very interested in using the readily available computers.
Question 4: Did the MLC provide you with the services you needed?
Almost always: 20
Sometimes: 5
Not usually: 0
No response: 1
The five who responded "Sometimes" elaborated as follows:
- Sometimes it is just a little intimidating to ask for help (Math 70)
- There should be more time for the learning assistants to explain the problems (Math 80)
- I need more accurate answers when I get help (Math 110)
- Sometimes what I was told in the MLC is not the right answer (Math 120)
- My instructor told them not to help me (Math 120)*
* Various instructors ask us not to assist with take-home exams so that students will be more likely to work in groups or
think for themselves.
One student who said the MLC was "almost always" helpful added the comment,
"I have had a problem several times with ESL assistants who speak very poor English. I can't quite understand the book
and I need someone to explain the problem/concept in clear English."
Conclusion: Most students are quite satisfied with the Math Learning Center.
Question 5: How did coming to the MLC affect your math grade?
Raised my grade: 19
Had little or no effect: 6
Lowered my grade: 0
No answer: 1
Five of the six students who said the MLC had "little or no effect" also said that they were doing "somewhat below
expectations". The sixth was doing "as well as expected".
Four students said that coming to the MLC had raised their grade but they volunteered that they would have done even
better if they had come more.
Conclusion: The MLC has a significant positive impact on the students who avail themselves of its services.
Project Materials
Survey for Math Students
This survey is being conducted to learn how the Math Learning Center can best serve Shoreline math students. Please take a few
minutes to answer each question, and return the form to your instructor. Thanks for your help.
Margaret Rogers, Director
Math Learning Center
_________________________________________________________________
1. The Math Learning Center is open the following times. Please indicate which time(s) are convenient for you.

Monday – Thursday, 8:00 – 5:45    Yes O    No O
Friday, 8:00 – 2:30               Yes O    No O
Sunday, 1:00 – 4:00               Yes O    No O

If you replied No to all of the times above, which time(s) would you like us to add?
Day _______ and Time __________
Day _______ and Time __________
2. How are you doing in this math course? (Please check one response)
O Better than I had expected
O As well as I had expected
O Somewhat below my expectations for myself
O Far below my expectations for myself
3. How many credits are you taking this term?
O Only this math course
O 6 to 10 credits
O 11 to 15 credits
O 16 to 18 credits
O Over 18 credits
4. Approximately how many hours do you study for this math course each week?
O Below 3 hours
O Between 3 and 6 hours
O Between 6 and 10 hours
O More than 10 hours
5. Please estimate how often you have used the MLC since the start of this quarter.
Group 1:    O Never
Group 2:    O 1-3 times
Group 3:    O More than 3 times
Please answer the one page of questions for your group.
Group 1: Never used the Math Learning Center during this quarter
1. Is this your first math course at Shoreline Community College?
Yes O    No O

If No, how many times did you use the Math Learning Center during your previous math course?
O Never    O 1-3 times    O More than 3 times
2. Where do you usually go for help in Math? (check all that apply)
O Almost never need help
O Ask a friend or relative
O Work with classmates
O Seek help from my instructor
O Work alone, even when I am having difficulties
O Other (please explain) ______________________________________
3. Consider your various obligations (family, job, etc.) beyond school. How do you rate these obligations?
O Minimal
O About average when I compare myself to other students
O More time-consuming than most other students
4. Why haven't you made use of the MLC this term?
______________________________________________________________________________________________________
_____________________________________________________________________________________________
_________________________________________________________________
_________________________________________________________________
5. What, if anything, would have made you decide to use the MLC this term?
______________________________________________________________________________________________________
_____________________________________________________________________________________________
_________________________________________________________________
_________________________________________________________________
6. How do you think you would be doing in your math course if you had made use of the Math Learning Center?
O About the same
O Somewhat better
O Much better
7. What type of students do you think use the MLC? (You may prefer to answer with a collection of one word responses, such
as "smart, boring")
______________________________________________________________________________________________________
____________________________
Group 2: Used the Math Learning Center 1-3 times during this quarter
1. Where do you usually go for help in Math? (check all that apply)
O Almost never need help
O Ask a friend or relative
O Work with classmates
O Seek help from my instructor
O Work alone, even when I am having difficulties
O Other (please explain) ______________________________________
2. Consider your various obligations (family, job, etc.) beyond school. How do you rate these obligations?
O Minimal
O About average when I compare myself to other students
O More time-consuming than most other students
3. Please indicate all reasons why you used the Math Learning Center this term.
O To get a Learning Assistant's help on a math problem
O To get advice on which math course I should take
O To borrow a book or calculator
O To use an answer book
O To pick up a handout or sample test
O To work with classmates
O To use a computer
O To take a test
O To study alone
O Other (Please explain) ______________________________________
4. What do you recall about your visit(s) to the Math Learning Center?
O Felt well treated
O Did not think much about it, one way or the other
O Did not feel well treated
If you did not feel well treated, please explain what happened and how we could have better met your needs.
__________________________________________
______________________________________________________________________________________________________
____________________________
______________________________________________________________________________________________________
____________________________
_________________________________________________________________
_________________________________________________________________
5. How do you think you would be doing in your math course if you had made more use of the MLC?
O About the same
O Somewhat better
O Much better
Group 3: Used the Math Learning Center more than 3 times during this quarter
1. Is the Math Learning Center your primary source of math help outside of the classroom?    Yes O    No O

Where else do you go for help in math? (check any that apply)
O Ask a friend or relative
O Work with classmates outside the MLC
O Seek help from my instructor
O Other (please explain) ______________________________________
2. Consider your various obligations (family, job, etc.) beyond school. How do you rate these obligations?
O Minimal
O About average when I compare myself to other students
O More time-consuming than most other students
3. Please indicate all reasons why you used the Math Learning Center this term.
O To get a Learning Assistant's help on a math problem
O To get advice on which math course I should take
O To borrow a book or calculator
O To use an answer book
O To pick up a handout or sample test
O To work with classmates
O To use a computer
O To take a test
O To study alone
O Other (Please explain) ______________________________________
4. Did the MLC provide you with the services you needed?
O Almost always
O Sometimes
O Not usually
If you answered Sometimes or Not usually, please describe how we could better meet your needs.
___________________________________________________
______________________________________________________________________________________________________
____________________________
______________________________________________________________________________________________________
_____________________________________________________________________________________________
_________________________________________________________________
_________________________________________________________________
5. How did coming to the MLC affect your grade in your math course?
O Raised my grade
O Had little or no effect
O Lowered my grade
APPENDIX K
COLLEGE READINESS PROJECT
Project Manager & e-mail address: Pam Dusenberry, pdusenbe@ctc.edu
Participants
Dutch Henry
Sean Rody
Pam Dusenberry
Pearl Klein
Project Overview
The College Readiness Project is sponsored by the State Board for Community and
Technical Colleges. Its purposes are
1) to provide student learning outcomes for Developmental Education (or, what it means
for a student to be ready for college),
2) to provide a growing toolbox of assessments that measure college readiness in reading,
writing, study skills, math, speaking and listening, and student responsibility, and
3) to foster dialog among developmental educators state-wide around student work that
demonstrates achievement of the learning outcomes.
The Project's work has been important for Shoreline's Developmental English outcomes
efforts, both by providing well-researched outcomes and by providing models for outcomes
assessment projects. Pam Dusenberry has been Faculty Co-Chair of the Project since
1994; Dutch Henry has also presented workshops for the Project and has participated
regularly since 1995.
At the Fall 2000 College Readiness Retreat, Dutch Henry, Sean Rody and Pam
Dusenberry worked on global assessment rubrics for reading, writing, study skills and
student responsibility for the end of a developmental program. These rubrics can be used
to assess students' readiness for college work in each area at the end of their final
developmental courses. The Developmental English faculty at Shoreline use the rubrics to
ensure that students are developing abilities and skills deemed important by developmental
educators from across Washington State. The rubrics are appended to this document.
At the Winter 2001 Retreat, the state-wide group of developmental educators created
rubrics for specific assignments in their classes that assess the college readiness outcomes.
Shoreline faculty presented their work on an English 100 Reading Rubric from last year's
outcomes project and developed a rubric for a reading process assignment used in
Developmental English classes at Shoreline; faculty are now using the rubric to help
students develop reading skills and find it quite helpful for giving detailed feedback to
students about their reading. The assignment and rubric are appended here.
Reading Rubric

Reading 1 (Stated Objective: Varied, appropriate skills and strategies to understand what is read)

Beginning:
- Student does not understand the components of the reading process
- Student does not use appropriate strategies for understanding main idea, inferences, and vocabulary
- Student does not organize and evaluate what is read
- Student cannot distinguish between major and minor details
- Student cannot paraphrase author's ideas

Developing:
- Student demonstrates an understanding of the reading process but shows inconsistent use
- Student inconsistently uses appropriate strategies for understanding main idea, inferences, and vocabulary
- Student inconsistently organizes and partially evaluates what is read
- Student begins to distinguish between major and minor details
- Student inaccurately or incompletely paraphrases

Accomplished:
- Student demonstrates a good understanding of the reading process and uses it frequently
- Student frequently demonstrates appropriate strategies for understanding main idea, inferences, and vocabulary
- Student frequently organizes and adequately evaluates what is read
- Student makes adequate distinctions between major and minor details
- Student frequently demonstrates an understanding of the author's ideas by paraphrasing

Exemplary:
- Student completely understands the reading process and consistently uses it
- Student consistently demonstrates appropriate strategies for understanding main idea, inferences, and vocabulary
- Student consistently organizes and thoroughly evaluates what is read
- Student notices insightful distinctions between major and minor details
- Student accurately and consistently paraphrases

Reading 2 (Stated Objective: Read and understand a variety of college entry-level materials)

Beginning:
- Student does not read academic and/or technical texts
- Student does not independently read and comprehend general information
- Student does not read literature

Developing:
- Student reads academic and/or technical text and shows little comprehension
- Student reads and understands general information with assistance
- Student reads literature and shows little comprehension

Accomplished:
- Student reads academic and/or technical text and begins to understand it
- Student independently reads and understands general information
- Student reads literature and begins to understand it

Exemplary:
- Student reads and thoroughly understands entry-level text
- Student independently reads, understands, and uses general information
- Student reads and thoroughly understands literature

Reading 3 (Stated Objective: Student uses reading as a tool)

Beginning:
- Student is unable to identify and locate resources
- Student cannot use reading as a tool for managing technological resources

Developing:
- Student locates but is unable to evaluate resources
- With assistance, student uses reading as a tool for managing increasingly complex technological resources

Accomplished:
- Student searches out resources and analyzes/evaluates their use
- Student can use reading as a tool for managing increasingly complex technological resources

Exemplary:
- Student searches out resources independently and uses them effectively
- Student consistently uses reading as a tool for managing increasingly complex technological resources
Writing Rubric

Writing 1: The student understands and uses writing as a process.

Does not meet standards:
 does not use reading and/or forms of brainstorming to generate details
 does not or cannot create a coherent plan of major ideas and supporting points
 does not use revision as an important part of the process of writing

Meets standards:
 uses reading and/or forms of brainstorming to generate varied details and ideas
 usually creates a coherent plan of major ideas and supporting points to use as a guide in drafting
 understands the need for revision in crafting successful writing and uses it as a vital part of the process

Exceeds standards:
 uses a variety of strategies to generate varied details and ideas
 always creates a coherent, sophisticated plan of ideas and details to use as a guide for drafting
 uses several stages of revision to make changes that significantly improve the quality of the writing

Writing 2: The student writes clearly and effectively.

Does not meet standards:
 main idea is unfocused or missing
 organizes ideas in ways that do not develop and support a main idea
 develops ideas in overly simple or banal ways
 rarely uses specific details to support more general claims
 organizes so that relationships between ideas are not clear
 does not consistently use language that guides readers from idea to idea
 writing is not consistently clear and correct
 produces messy, illegible work
 does not use sentence construction suitable to the content

Meets standards:
 produces a focused main idea
 organizes information to develop and support a main idea
 demonstrates independent thinking in the development of ideas
 uses specific details to support more general claims
 usually organizes ideas in a coherent manner
 uses language that helps guide the reader from one idea to another
 generates clear, grammatically and mechanically correct prose
 produces neat, legible final copies
 uses sentence construction suitable to the content

Exceeds standards:
 creates original, focused main ideas
 organizes to develop and support ideas in a sophisticated or unusual way
 develops most ideas originally and thoughtfully
 consistently uses vivid, clear details to support general claims
 consistently organizes ideas coherently
 uses several strategies to make clear the relationships among ideas
 always writes clear, correct prose
 produces attractive final copies
 uses varied and creative sentence construction suitable to the content

Writing 3: The student writes in a variety of forms for different audiences and purposes.

Does not meet standards:
 chooses details and language inconsistent with purpose
 does not modify diction and formality in light of different audiences

Meets standards:
 usually chooses details and language consistent with a particular public or private purpose
 modifies diction and level of formality in light of different audiences

Exceeds standards:
 always chooses details and language that help accomplish a particular purpose
 modifies diction and level of formality to clearly tailor writing for different audiences

Writing 4: The student independently uses writing as a tool for learning in academic, personal, and career situations.

Does not meet standards:
 rarely uses writing tools to explore issues and discover connections
 uses formats that do not fit the writing purpose in academic, personal, and work-related communication

Meets standards:
 uses writing tools such as listing, brainstorming, mind-mapping, outlining, and journal writing to explore issues and discover connections in a variety of contexts
 recognizes the appropriate format to use in academic, personal, and work-related communication

Exceeds standards:
 uses a variety of writing tools to explore issues and discover connections in many contexts
 uses a variety of formats in academic, personal, and work-related communication

Writing 5: The student analyzes and evaluates the effectiveness of his or her own written work and that of others.

Does not meet standards:
 does not usually understand and apply criteria for effective writing
 provides vague or inaccurate feedback to peers on their writing
 makes revisions that do not clearly improve development or argument
 inconsistently corrects clarity and grammatical mistakes in his or her prose

Meets standards:
 recognizes, understands, and applies criteria for effective writing
 provides useful feedback to improve the writing of his or her peers
 makes revisions that improve development and effective argument
 makes revisions that improve the clarity and grammatical correctness of his or her prose

Exceeds standards:
 consistently applies criteria for effective writing
 always provides useful, creative feedback to help peers improve
 always makes extensive revisions that improve development and effective argument
 corrects clarity and grammar mistakes with little or no feedback

Writing 6: The student analyzes and evaluates his or her own growth as a writer.

Does not meet standards:
 appears not to understand the importance of writing
 is not consistently aware of his or her writing strengths and weaknesses
 does not gain confidence as writing ability improves

Meets standards:
 develops an understanding of the importance of writing
 demonstrates an awareness of his or her strengths and weaknesses as a writer
 demonstrates an increased confidence in his or her ability to communicate through writing

Exceeds standards:
 clearly understands the importance of writing
 uses an awareness of writing strengths and weaknesses to grow as a writer
 demonstrates improved confidence consistent with improved ability
Student Responsibility Criteria Score Sheet
Student Responsibility is worth 100 points toward the course point total.

S1 The student uses study skills effectively. (20 points possible)

20 points:
 consistently uses various strategies to study in different settings
 utilizes a complete understanding of learning styles to make good study decisions
 improves study weaknesses by planning ways to improve

15 points:
 identifies and usually uses a variety of study strategies in diverse settings
 uses understanding of learning styles to make decisions about how to study
 identifies study strengths and formulates plans for improvement

0 points:
 relies on one or two study strategies when others might work better
 does not consistently use self-understanding of learning styles to decide how to study
 is not familiar with own study strengths and weaknesses or does not form improvement plans

S2 The student makes informed choices as to when and what technologies and/or resources to use to gain information. (20 points possible)

20 points:
 utilizes campus resources fully to get help when needed
 consistently and resourcefully uses technological support
 creatively searches out and uses material from a wide variety of sources
 thoroughly and accurately evaluates all source materials

15 points:
 knows when and how to get help from a variety of campus resources
 uses appropriate technological support
 searches out and uses material from a variety of sources
 critically evaluates source materials

0 points:
 does not get help from campus resources when needed
 does not always use appropriate technological resources
 tends to rely on one or two types of sources
 does not or cannot evaluate source material

S3 The student works effectively with others. (15 points possible)

15 points:
 effectively takes a number of roles when working in groups
 consistently suspends judgment while processing information
 consistently seeks and offers feedback

10 points:
 understands and takes various roles within groups
 usually suspends judgment while information is being processed
 usually seeks and offers feedback

0 points:
 tends to take one role only in groups
 sometimes or often judges information
 rarely seeks or offers feedback

S4 The student understands and responds to behavioral expectations within a learning environment. (15 points possible)

15 points:
 always meets academic commitments
 makes good choices about when to study independently or with others
 always contributes positively to the classroom environment
 consistently follows oral and written directions

10 points:
 follows through on academic commitments
 works alone or in a group as appropriate to the learning task
 demonstrates appropriate classroom behavior
 follows directions, both written and oral

0 points:
 does not consistently follow through on academic commitments
 is too dependent on others or works too independently
 sometimes demonstrates inappropriate classroom behavior
 does not always follow oral or written directions

S5 The student addresses problems that interfere with the learning process. (15 points possible)

15 points:
 effectively and consistently uses a problem-solving process independently and cooperatively
 effectively and consistently attempts to solve life problems that interfere with learning
 effectively and consistently addresses problems in courses of study

10 points:
 understands and usually uses the problem-solving process both independently and cooperatively
 addresses life problems that interfere with the student's pursuit of learning
 addresses problems within a particular course of study

0 points:
 does not consistently use a process for solving problems alone or in groups
 does not always address life problems that interfere with learning
 does not consistently address problems within a particular course of study

S6 The student analyzes and evaluates his/her growth as a responsible learner. (15 points possible)

15 points:
 accurately evaluates learning strengths and weaknesses
 consistently behaves in ways that show a positive attitude toward learning
 consistently demonstrates appropriate increased confidence in learning ability

10 points:
 evaluates his/her strengths and weaknesses as a learner
 makes choices and demonstrates behaviors that indicate a positive attitude toward learning
 usually demonstrates an increased confidence in his/her ability to be a successful learner

0 points:
 does not always evaluate learning strengths and weaknesses
 does not usually make choices that indicate a positive attitude toward learning
 does not show increased confidence in learning ability when appropriate
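As a cross-check on the arithmetic, the six criteria's top point values (20, 20, 15, 15, 15, 15) do sum to the stated 100-point maximum. A minimal Python sketch of the score sheet's structure; the dictionary layout and function name are our own illustration, not part of the report:

```python
# Hypothetical encoding of the Student Responsibility score sheet.
# Each criterion offers three point levels: full credit, partial credit, none.
POINT_LEVELS = {
    "S1": (20, 15, 0),  # uses study skills effectively
    "S2": (20, 15, 0),  # makes informed technology/resource choices
    "S3": (15, 10, 0),  # works effectively with others
    "S4": (15, 10, 0),  # responds to behavioral expectations
    "S5": (15, 10, 0),  # addresses problems that interfere with learning
    "S6": (15, 10, 0),  # evaluates growth as a responsible learner
}

def total_score(ratings):
    """Sum points earned; `ratings` maps criterion -> level index (0 = top level)."""
    return sum(POINT_LEVELS[criterion][level] for criterion, level in ratings.items())

# Full marks on every criterion yields the stated 100-point maximum.
assert total_score({criterion: 0 for criterion in POINT_LEVELS}) == 100
```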
Reading Log Assignment
Learning Outcomes
You will develop a useful reading process for various kinds of college reading, including pre-reading
strategies, methods to improve comprehension while reading, and techniques to remember what you read.
Assignment
For every reading assigned in this class, complete the following steps.
Pre-read to get warmed up and ready to learn:
Preview: look over the entire reading, noting headings, chapters, the summary, questions at the end,
etc.
1. Write down what you anticipate will be in the reading.
2. Write down questions you have about the subject.
3. Write what you know about the subject.
Read to understand and to learn:
Keep in mind the author’s questions and your own.
4. Write down the main idea of each paragraph, section or page (depending on length of reading).
5. Write down answers to your questions.
6. List words you don’t know.
7. Write down any ideas that are interesting, confusing, noteworthy, etc.
8. Write down the main idea of the reading in one sentence.
Post-read to consolidate your learning:
9. Write down definitions of words you didn’t know.
10. Write a summary paragraph of the entire reading.
11. Write a response to the ideas and issues raised by the reading.
Reading Log Evaluation

Class Member __________________________________

Each criterion below is rated: Exceeds Standards (5 points), Meets Standards (2-4 points), Doesn't Meet Standards (0-1 point).

Prereading strategies are used effectively:
 Thoughtful anticipation of reading content
 Thoughtful questions about the reading topic
 Thorough list of what's known about the topic

Reading strategies are used effectively (for understanding text):
 Main ideas are noted
 Vocabulary is noted
 Interesting or confusing ideas are noted

Student frequently organizes and adequately evaluates what is read:
 Summary follows text structure; includes only the author's ideas
 Response shows thoughtful analysis

Student makes adequate distinctions between major and minor details:
 Reading notes report major ideas, not details
 Main idea sentence does not merely describe, and is neither too narrow nor too broad
 Summary includes all major ideas
 Summary includes no minor details

Student frequently demonstrates an understanding of the author's ideas by paraphrasing:
 Summary is accurate
 Summary is written in the student's own words
APPENDIX L
CNC Technologies Workplace Assessment Review
Project Manager: Jody Black, jblac@ctc.edu
1. The purpose of this project is to compile a database of information about students'
pre-training skills and their attainment of program objectives, along with demographic
information to be used in marketing the programs. Eighteen students took part in the
survey, which was administered via the intranet. Once I gathered everyone in the
computer lab and got them all signed on, there was a buzz of conversation. Most felt it
was nice to have an opportunity to voice concerns, ranging from the need for more
tooling and the program's hours of operation to a sense of pride that comes from
learning and gratitude to the staff for making a difference in their lives. In the future I
could develop more questions to target specific areas that need research, and I think the
project was a success. Thank you for the opportunity to improve and grow with
knowledge that will benefit the entire department and Shoreline.
2. The web page address is:
All of the questions can be found there; the percentages I calculated for each answer
appear below.
A. 88% gave their names
B. 88% gave home addresses
C. 77% gave their employer's address
D. 61% gave a home phone number
E. 27% gave permission to contact their employer
F. 94% male
G. 22% currently work in the manufacturing industry
   16% work outside manufacturing
   33% are full-time students
   27% remained anonymous
H. 55% are currently enrolled in the 186 Machinist Training beginning class
   22% are currently enrolled in the 196 Machinist Training advanced class
   22% remained anonymous
   11% are enrolled in the AAAS degree program
I. 38% planned to receive the 1-year Certificate of Completion
   33% planned for the AAAS degree
   5% chose to quit the program
   5% chose a career in Manufacturing Engineering
   5% undecided
J. 38% heard of the program through friends or teachers
   27% through career counselors
   16% through Shoreline CC orientation
   11% online
   5% through the newspaper
K. 38% chose machinist as their final career goal
   27% CNC machinist
   11% engineer
   5% CNC machine operator
   5% CNC machining instructor
   5% retail sales
   5% undecided
L. Suggested improvements were:
   27% more instructors
   38% a 2-year course
   22% more lab time and hands-on training
   33% additional lab hand tools and CNC machinery
   33% an increase in lab projects
   16% inadequate tooling
   11% no math comprehension
   11% smaller class sizes
M. 61% felt their needs were met in the classroom
   16% did not
   22% undecided
N. 94% would recommend the program to friends and family
   5% would not
N. 83% have a general understanding of the Mathematics section
   22% somewhat understood
   5% did not understand
O. 61% have a general understanding of the Blueprint section
   33% somewhat understood
   5% did not understand
P. 44% have a general understanding of Machine Tool Theory
   38% somewhat understood
   11% did not understand
   5% undecided
Q. 61% have a general understanding of Lab Machining
   27% somewhat understood
   5% did not understand
   5% undecided
R. 66% have a general understanding of Laboratory Theory
   27% did not understand
   5% undecided
S. 50% have a general understanding of programming CNC machines
   38% did not understand
   5% undecided
T. 44% liked the class's scheduled hours
   11% preferred 8-12:30 p.m.
   5% preferred 5-12 p.m.
   11% preferred 9-2:45
   22% preferred evenings and weekends
   5% questionable
U. 38% thought there was adequate and affordable parking
   38% did not
   16% were not sure
   5% take the bus
V. 55% would buy tool kits if they were offered
   11% would not
   33% said it depends on the price and quality of the tools
W. 11% would buy the kit if it cost $140.00-$200.00
   22% at $200.00-$300.00
   16% at $500.00
   50% said it would depend on the brand names and what the box contained
I also calculated the demographic information:
   22% Seattle
   5% Lynnwood
   5% Redmond
   5% Snohomish
   5% Shoreline
   5% Mukilteo
   22% Everett
   31% anonymous
50% of respondents are unemployed.
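A note on the arithmetic: with 18 respondents, the percentages above are consistent with truncating each count's share of the sample to a whole number (1 of 18 appears as 5%, 2 as 11%, 4 as 22%, and so on). A small sketch under that assumption; the helper name `pct` is our own:

```python
N_RESPONDENTS = 18  # sample size stated in the report

def pct(count, total=N_RESPONDENTS):
    """Integer percentage truncated toward zero, matching the report's figures."""
    return count * 100 // total

# Counts of 1, 2, 4, 16, and 17 respondents reproduce the 5%, 11%, 22%,
# 88%, and 94% figures that recur throughout the survey results.
shares = {count: pct(count) for count in (1, 2, 4, 16, 17)}
print(shares)  # {1: 5, 2: 11, 4: 22, 16: 88, 17: 94}
```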
APPENDIX M
Project Title:
ASSESSMENT OF SOCIAL FUNCTIONING AS A LEARNING
OUTCOME IN EDUCATION COURSES
Project Manager:
Tasleem T. Qaasim tqaasim@ctc.edu
APPENDIX 1 - FINAL REPORT FOR 2000-2001
This project evaluated current observable and measurable criteria that identify social functioning
as a learning outcome. We examined course outlines, instructional methods, student
performance and testing to determine their impact on the student competency of social
functioning. We will measure the desired learning outcome by student progress through
classroom assignments, group participation, and assessment. The principal assessment tasks will
be internships, group projects, classroom discussions, self-evaluations, and reflection papers. We
plan to measure student social functioning in the Education Core Options of Early Childhood
Education, Special Education, and Bilingual/Bicultural. Social functioning goals were identified
from researched articles and will be applied in the form of the above activities. Students will be
assessed in multiple areas of social functioning. They will be assessed in the obvious familiar
instructional methods of writing papers and in completing examinations on specific course
materials along with the not-so-obvious assessments of how well they apply this current learning
to actual classroom practice through group projects and community projects. Additionally, they
will be assessed on how they apply this new knowledge in their relationships with their peers.
The project examines the assessment of social functioning as a learning outcome in education
courses. We examine the development of assessment tools for measuring student interactions
with their classroom peers, and how feedback from these interactions can be combined with self-assessment to report on improved social functioning. Social functioning is a fundamental
element of the learning process that is reflected in the measured learning outcomes of group
projects and work performed in the classroom. Classroom assessment is most effective when it
consists of challenging tasks that elicit a higher order of thinking, an ongoing process that is
integrated with instruction, and visible expectations to students active in evaluating their own
work. It can be used to evaluate teaching along with student learning. (Educational Researcher
Vol. 29. No.7. Pg. 2.)
This project conducted a search of educational indices and journals to identify leading indicators
of student proficiencies in social functioning. Research identified the evaluation of leadership,
teamwork, cooperation, and community commitment as signifiers of social functioning. These
signifiers relate to core values of Shoreline's Strategic Plan. These competencies will be assessed
through quantitative and qualitative methods to document student learning. The assessment will
provide objective evidence of increased student proficiencies in social functioning. Conventional
class measurement techniques will be applied to students in the targeted courses. This project
will demonstrate our core values of excellence, student success, diversity and support.
Preparation for this assessment project required a detailed search of the literature to develop
resources and background source information on classroom methods for improving social
functioning and measuring the result as a learning outcome. A search was performed of available
literature relevant to this topic. The search yielded a number of papers and journal articles that
were relevant to the measurement or assessment of social functioning as a learning outcome.
These articles were examined to extract some lessons for application to specific classroom
assignments. As a result, selected instructional techniques will be utilized to enhance student
social functioning. Common learning outcome results were identified from the research to
document improved social functioning.
Various assessment tools are needed to assess learning goals and connect those outcomes with
results of classroom instruction. Performance tasks are needed so that rubrics can be applied to
developing students' ability to translate classroom instruction into real-world behavior. Data will
be gathered from observations, dialogue, reflective journals, projects, presentations, and self-evaluation. Students are not passive participants in this assessment process, and therefore it is
imperative that they view the assessment as part of their process of learning. A cultural
transformation is necessary to allow students to view assessment of their social functioning as a
source of insight and personal development, instead of part of a reward or punishment system
(Shepard). For assessment to play a useful role in assisting student learning, it must move
into the middle of the teaching and learning process instead of being postponed as only the end
point of instruction (Shepard).
Self-assessment is an essential component to complete assessment of student learning outcomes.
Self-assessment increases the students' responsibility for their own learning and makes the
relationship between the teacher and the student more collaborative. Klenowski (1995) found
that students participating in self-evaluation became more interested in the criteria and
substantive feedback about their grades. Wiggins (1995) asserts that involving students in the
analysis of their own work breeds ownership of the evaluation process.
Some of the researched literature included studies of social functioning at Head Start programs
and early childhood education programs. Other papers examined social functioning at the
community college level. Because our education program consists of a professional/technical
track and academic track, we must prepare future teachers with skills and abilities for all grade
levels. Articles from preschool education through college were used as the basis for determining
core competencies that would provide evidence of improvement in this important learning
outcome.
In our program, we educate students who may become future teachers of any grade level and any
ethnic group. It is essential that we develop their cultural sensitivity and understanding of how
environment and neighborhood conditions affect learning. Similarly, we must develop their
ability to work together in collaborative teams within current educational processes. The research
done for this project has indicated that a collaborative team approach to education is a growing
trend in the future of education.
The assessment tools will be used to examine and improve the instructional methods of our
program. We are committed to developing a community of learners where students naturally
seek feedback and critique their own work. In developing this community of learners, it is
reasonable to expect that the instructors will model the same commitment.
Characteristics of social functioning that will be assessed include:
 Leadership
 Cooperative Learning/Collaboration
 Respect for Diverse Opinions
 Teamwork.
The assessment will determine to what degree we accomplished our learning outcomes. There
are two methods of assessment that will be used in this project, qualitative assessment and
quantitative assessment. Assessment tools are:
 Checklists
 Rating Scales
 Skill Inventories
 Self-evaluations
 Narrative Reports
 Portfolios (self and peer critiques).
During the fall quarter of 2001-2002, we will administer the assessments to students under
another grant proposal. We will administer the following assessment instruments:
 Surveys
 Checklists
 Rating Scales
 Tests
 Skills Inventories.
An initial survey will be conducted to establish a baseline of student social functioning
characteristics. The observer will use a checklist that consists of key behavioral activities to
record background data. A rating scale will measure specified competencies such as enthusiasm,
consideration of others, or teamwork. These ratings often reflect broad impressions of perceived
behavior rather than actual behavior. Student performance on tests will be analyzed with respect
to demonstrated social functioning characteristics and abilities.
We will evaluate each identified characteristic of social functioning. For example, we will
evaluate leadership. How do leaders think and act? Leadership involves the identification,
acquisition, allocation, coordination, and use of social and cultural resources to establish the
conditions for the possibility of teaching and learning (Spillane, Halverson, and Diamond). We
will specifically assess leadership of students, in an educational environment, rather than
individual leadership since these students are continuing on as educators. Educational leadership
has been defined as:
 Constructing and selling an institutional vision
 Building trust
 Collaboration
 Personal development
 Innovative ideas.
Social functioning, as a learning outcome, will be assessed during group and individual tasks.
The difference between these two approaches involves different ways of focusing on learning
activities. However both methods provide accounts of learning that can occur in either groups or
in solitary activities. Group activities emphasize students' participation in socially organized
tasks that provide opportunities to learn alongside diverse learners. Improved social functioning
will help students participate in their
communities and work situations. Individual assignments will be studied to provide contrast on
how the student thinks and learns vs. how the student contributes and impacts group learning.
Cooperative learning involves students working in small groups or teams to help each other to
learn. The cooperative learning approach will consist of formal presentations, student practice
and coaching in learning teams, individual assessment of mastery, and public recognition of team
success. Cooperative learning fosters self-esteem, positive peer relationships, improved
academic achievement, and acceptance of diversity (Tedesco). The literature also shows that
students who work together develop social skills to become effective judges of healthy group
functioning in school, on the job, or at home.
Social functioning as a learning outcome should be challenging, attainable, and measurable.
Competencies that lead to social functioning will be identified and evaluated. To keep the
assessment feasible, we will limit the number of competencies measured; attempting to measure
too many can become confusing, frustrating, and time-consuming, and can confound study
results. Whether social functioning has been achieved will be determined by a general judgment
of whether students know, think, and can do most of what the instructor intends. Competencies
should communicate learning outcomes clearly to others. The competencies will be assessed
through collection of data, analysis, and reporting of results.
We face a significant challenge in developing curricula that educate our students on how to
understand the impact of cultural factors of an increasingly diverse population of future students
that they plan to teach. Studies were identified that quantified social learning factors for both
homogeneous and diverse groups. This research was valuable in developing teaching examples
and exercises for the affected education classes. The education faculty established key results
areas for social functioning based on learning outcomes from some of this literature research.
We will use these key results areas to assess the effectiveness of selected textbooks and syllabi
for those classes. We developed classroom exercises and assignments that require periodic
reflection on different environmental factors affecting learning outcomes of students. These
reflections better equip the students to enhance their level of social functioning, by evaluating an
array of circumstances affecting learning. We will require students to examine these effects as a
matter of similarities and contrasts.
This student assessment project was conducted during the spring quarter of 2000-2001. The
initial research and development of assessment tools was conducted during the spring quarter,
while the application of classroom techniques and assessments will be performed entirely in the
fall and winter quarters of 2001-2002 as part of another grant application. This two-phased
approach has enabled us to test some of the assessment assumptions before formal application to
entire classrooms.
This research and assessment should demonstrate that students apply social functioning as a
learning outcome to their current environments. Students will be asked to consistently
demonstrate that they can make the paradigm shift from their personal life and past acculturation
and embrace new concepts about how to relate to the emerging world around them. It will be
especially gratifying to have some students share with the class, towards the end of the
assessment period, how improved competencies in social functioning have enabled them to
challenge and reevaluate some inaccurate premises that had been handed down to them.
ANNUAL OUTCOMES ASSESSMENT REPORT 2000-01
Project Title:
ASSESSMENT OF SOCIAL FUNCTIONING AS A LEARNING
OUTCOME IN EDUCATION COURSES
Project Manager:
Tasleem T. Qaasim tqaasim@ctc.edu
APPENDIX 2
The project drew on researched articles and developed instruments to measure the attainment
and level of social functioning. Competencies determined from the early research were applied
as goals for the classes. Both qualitative and quantitative evaluations will be used to assess how
well the students have incorporated the learning outcome into their value system. The evaluation
is summarized in this final report, which can be used as objective evidence to document
Shoreline's application of self-assessment in the accreditation process.
The literature research for this project was based on current journals and periodicals published
since 1990. The following journal articles and periodicals were used as initial research to
develop classroom goals, assignments, and the assessment basis:
Roeber, Edward D. (1996). Guidelines for the Development and Management of Performance Assessments. Practical Assessment, Research & Evaluation, 5(7).
Frary, Robert B. (1996). Hints for Designing Effective Questionnaires. Practical Assessment, Research & Evaluation, 5(3).
Brualdi, Amy (1998). Implementing Performance Assessment in the Classroom. Practical Assessment, Research & Evaluation, 6(2).
Brualdi, Amy (1998). Classroom Questions. Practical Assessment, Research & Evaluation, 6(6).
Potts, Bonnie (1993). Improving the Quality of Student Notes. Practical Assessment, Research & Evaluation, 3(8).
Tedesco, Lucyann M. (1999). The Effects of Cooperative Learning and Self-Esteem: A Literature Review. Master's thesis, Dominican College of San Rafael.
Matthews, Donna J. (1999). Enhancing Learning Outcomes for Diversely Gifted Education. Journal of Secondary Gifted Education, 10(2), 57-68.
Murphy, Patricia, Ed. (1999). Learners, Learning & Assessment. Milton Keynes University, England.
Tur-Kaspa, Hana (1993). Reflections on Social Attributions of Students with Learning Disabilities. Exceptionality: A Research Journal, 4(4), 253-256.
Herman, Keith C.; Tucker, Carolyn M. (2000). Engagement in Learning and Academic Success among At-Risk Latino American Students. Journal of Research and Development in Education, 33(3), 129-136.
Holmes, Sharon L.; Ebbers, Larry H.; Robinson, Daniel C.; Mugenda, Abel G. (2001). Validating African American Students at Predominately White Institutions. Journal of College Student Retention, 2(1), 41-58.
Young, Clifford O., Sr.; Young, Laura Howzell (1999). Assessing Learning in Interactive Courses. Journal on Excellence in College Teaching, 10(1), 63-75.
Twelker, Paul A. (1999). Guidelines for Developing Departmental Assessment Plans. www.tiu.edu/psychology/Guidelines
Moskal, Barbara M. (2000). Scoring Rubrics: What, When and How? Practical Assessment, Research & Evaluation, 7(3).
Shepard, Lorrie (2000). The Role of Assessment in a Learning Culture. Educational Researcher, 29(7), 4-14.
Spillane, James P.; Halverson, Richard; Diamond, John B. (2001). Investigating School Leadership Practice: A Distributed Perspective. Educational Researcher, 30(3), 23-28.
Anderson, John R.; Greeno, James G.; Reder, Lynne M.; Simon, Herbert A. (2000). Perspectives on Learning, Thinking, and Activity. Educational Researcher, 29(4), 11-13.