SAN JOSE STATE UNIVERSITY
COLLEGE OF ENGINEERING
Engineering Programs Assessment
PERIOD COVERED
Spring 2006
Report completed by:
Date: 6/8/06
Ahmed Hambaba
Associate Dean; Graduate & Extended Studies
Ping Hsu
Associate Dean; Undergraduate Studies
Dean Signature:___________________
Date: 6/8/06
SECTION (A)
Engineering Degree Programs
_________________________________________________
Aviation and Technology Department
BS – Aviation
BS – Industrial Technology
MS – Quality Assurance
Chemical and Materials Engineering Department
BS*, MS – Chemical Engineering
BS*, MS – Materials Engineering
Civil and Environmental Engineering Department
BS*, MS – Civil Engineering
Computer Engineering Department
BS*, MS – Computer Engineering
BS, MS – Software Engineering
Electrical Engineering Department
BS*, MS – Electrical Engineering
General Engineering Department
BS, MS – Engineering
Industrial & Systems Engineering Department
BS*, MS – Industrial & Systems Engineering
MS – Human Factors/Ergonomics
Mechanical and Aerospace Engineering Department
BS*, MS – Mechanical Engineering
BS*, MS – Aerospace Engineering
* -- ABET accredited program.
SECTION (A)
Assessment Summary
_________________________________________________
Undergraduate Programs:
In June 2005, each of our eight ABET accredited engineering programs (Chemical,
Materials, Civil, Computer, Electrical, Industrial and Systems, Mechanical, and
Aerospace) completed a comprehensive report on its assessment and enhancement
process as part of the ABET accreditation evaluation requirements. The reports include
each program’s program educational objectives, learning outcomes, assessment process,
assessment results analysis, and improvement and enhancement activities over the last six
years. The ABET accreditation visit took place in October 2005. The period between
the visit and the spring semester of the following year is the due process period. During
this period, the programs address issues identified by the evaluators.
General Engineering is in the middle of a transition from three concentrations to a single interdisciplinary program. All three concentrations have recently been discontinued; the current General Engineering upper division students are all remaining students in those three concentrations. The interdisciplinary program, along with its learning outcomes and assessment process, will be formalized and announced in Fall 2006.
Graduate Programs:
Most of the graduate programs have assessed all their Program Outcomes for Spring
2005 – Spring 2006. The Program Outcomes were assessed in at least two (2) required
courses and/or the thesis/project. The assessment instruments were student surveys, exit surveys, course surveys, and project/thesis assessments. Data were collected from Fall 2005
to Spring 2006 by the facilitator, the program director and course instructors. A couple of
programs had teams to assess their project/thesis reports.
Findings were generally positive and most program outcomes were adequately met.
These findings were presented in their department retreats or faculty/curriculum
committee meetings. With these findings, actions for program improvement have been identified and will be implemented in subsequent semesters (Fall 2006 – Spring 2007).
Some of the identified actions that are planned to address the findings include
enhancements of specific courses and curriculum revisions. Part of the recommended
curriculum changes are better linkages between courses and project assignment/thesis or
more practical application of course concepts through industry involvement. Other
planned actions include lab support and maintenance, pedagogical improvements, and procedural changes that serve as checks and balances to ensure program integrity.
The departments will continue to collect data in Fall 2006 and Spring 2007. No department found a need to update or revise its Program Outcomes.
SECTION (B)
Undergraduate Programs
_________________________________________________
Aviation BS Program Outcome Assessment Summary
Please provide the following information that has (or will be) collected data on learning
outcomes this AY 05/06.
1. For this semester, identify which learning outcome(s) was (were) the focus of assessment. Please attach a copy of the department/program goals and Student Learning Objectives.

A1. Apply current knowledge and adapt to emerging applications in aviation
A3. Communicate effectively
A6. Understand the attributes and behavior of an aviation professional, career planning, and certification.

2. For each learning outcome identified in item 1, what direct (performance measures) and indirect (e.g. surveys) information/data were collected, how much, and by whom this Spring 06?
All students in the four options under the BS Aviation degree complete a capstone class
in their major, Avia 190. All the students in this class in Fall 2005 participated in a
student poster session and demonstrated their final projects. The faculty assessed the
students’ learning outcomes for these three items in Avia 190 (Senior Capstone Seminar)
through direct performance measures.
3. For data collected Fall 05, how and when were the data summarized, analyzed and discussed? (e.g. faculty retreat, assessment/curriculum committee, report to the faculty).
The assessment data from Spring and Fall 2005 were discussed. The faculty assigned the
department curriculum committee the task of analyzing the assessment data in detail. The
department curriculum committee made a report on the assessment results to the
department faculty as a whole at the May 5, 2006 faculty meeting.
4. What findings emerged from departmental analysis of the information/data collected in Fall 05?
This was the first competency assessment done with the Senior Capstone courses (Avia
190). There were only fourteen assessment forms completed for students who graduated
in Fall 2005. All of the fourteen assessment forms indicated that the students met the
three competencies assessed. The assessment form for Avia 190 is attached.
5. What actions are planned to address the findings? (e.g. curricular revision, pedagogical changes, student support services, resource management.)
As the data has been collected for only one cohort of graduating students (those
graduating Fall 2005), there is no curricular revision planned at this time. The faculty
decided that the department needs to develop additional competency assessment for the
aviation senior students.
6. Describe plans to collect data on additional learning outcomes next semester.
The program will continue to collect data pertaining to learning outcomes throughout the
Spring and Fall 2006 in Avia 190. The aviation faculty will get together to discuss the
content issues of Avia 190 versus creating an alternative model for capstone courses for
each option. Also, a subgroup of the faculty has developed a new Senior Capstone
Evaluation Sheet for faculty to use in the poster session in Spring 2006 to assess the
students’ posters (see attached).
7. Did your analysis result in revisiting/revising the Student Learning Outcomes? If the answer was yes, please explain and submit an updated version of the Student Learning Outcomes.
No
Industrial Technology BS Program Outcome
Assessment Summary
Please provide the following information that has (or will be) collected data on learning
outcomes this AY 05/06.
1. For this semester, identify which learning outcome(s) was (were) the focus of assessment. Please attach a copy of the department/program goals and Student Learning Objectives.
Core Student Learning Objectives for ALL BSIT Students
C1. Demonstrate strong communication, critical thinking and interpersonal skills
C4. Use skills in team development, dynamics, and management to work as team players
C5. Demonstrate ethical behavior and concern for colleagues, society, and the environment
C9. Demonstrate leadership skills for a technology professional
Concentration in Electronics & Computer Technology
For students in Electronics and Computer Technology, we allocated the
concentration’s competencies to four classes. Most of the competencies are assessed in
the capstone class, Tech 169 in the Spring semester.
Figure 1: Assessment of Competencies in the Electronics & Computer Technology Concentration

Competency | Assessed in what class
E1. Solve electronic circuit and electronic systems problems in analytical and creative ways. | Capstone class, Tech 169 (SP)
E2. Analyze and troubleshoot analog and digital communication techniques | Capstone class, Tech 169 (SP)
E3. Apply theories of computer-aided design and manufacturing of electronic systems: printed circuit boards (PCBs) and integrated circuits (ICs). | Capstone class, Tech 169 (SP)
E4. Use microprocessors and associated circuits in test simulations, system interfacing of processes. | Capstone class, Tech 169 (SP)
E5. Develop and implement software systems for control of electronic industrial processes. | Capstone class, Tech 169 (SP)
E6. Analyze the role of instrumentation and automation in the electronics industry. | Tech 115 (both Fall and Spring)
E7. Demonstrate skills in the control of electronics manufacturing processes, production scheduling and testing. | Capstone class, Tech 169 (SP)
E8. Apply telecommunications theory to industrial settings and problems | Tech 163 (SP)
E9. Manage a computer network | Tech 166 (SP)
E10. Design and analyze electronic circuits and systems using simulation and hands-on exercises | Capstone class, Tech 169 (SP)

For students in Manufacturing Systems, we allocated the concentration’s competencies to four classes. Most of the competencies are assessed in the capstone class, Tech 149.

Figure 2: Assessment of Competencies in the Manufacturing Systems Concentration

Competency | Assessed in what class
M1. Demonstrate skills in the planning and design of manufacturing processes. | Capstone class, Tech 149 (SP)
M2. Apply OSHA and NIOSH principles to facilities design and management | Tech 45 (SP)
M3. Design and plan industrial facilities | Tech 45 (SP)
M4. Select and operate computer numerical controlled and other machines | Capstone class, Tech 149 (SP)
M5. Describe the uses, advantages, and disadvantages of current and evolving manufacturing techniques including laser machining, electrical discharge machining, water jet and abrasive water jet machining, and rapid prototyping. | Capstone class, Tech 149 (SP)
M6. Select, analyze and use polymers, composite materials, and materials in the design of manufactured products. | Tech 140 & Tech 141 (both Fall and Spring); Capstone class, Tech 149 (SP)
M7. Apply the theory of computer-integrated manufacturing (CIM), including the computer-aided design/computer-aided manufacturing (CAD/CAM) interface, to industrial problems and settings. | Capstone class, Tech 149 (SP)
M8. Use the principles of production scheduling & planning in an industrial environment | Capstone class, Tech 149 (SP)
M9. Apply a knowledge of statics to manufacturing product design | Tech 140 & Tech 141 (both Fall and Spring); Capstone class, Tech 149 (SP)
M10. Demonstrate an understanding of materials management including Just-in-Time (JIT) and Materials Resource Planning (MRP) | Capstone class, Tech 149 (SP)
M11. Integrate design, manufacturing, and materials into the design and development of new products | Tech 140 & Tech 141 (both Fall and Spring); Capstone class, Tech 149 (SP)
M12. Apply the principles of Lean Manufacturing to manufacturing and soft systems | Capstone class, Tech 149 (SP)
2. For each learning outcome identified in item 1, what direct (performance measures) and indirect (e.g. surveys) information/data were collected, how much, and by whom this Spring 06?
In each concentration for the BSIT, several intermediary checkpoints were chosen to
assess student achievement of the competencies. In addition, there is a capstone course in
each concentration that is integral to our assessment of student achievement. In addition
to the capstone courses in each concentration, there is a new course that is required of all
students under the Fall 2003 curriculum, Tech 190, Senior Seminar in Technology. Of
the skills listed under the core and support courses, the faculty chose four that were
deemed to be critical by the curriculum advisory members: Demonstrate strong communication, critical thinking and interpersonal skills (C1); Use skills in team development, dynamics, and management to work as team players (C4); Demonstrate
ethical behavior and concern for colleagues, society, and the environment (C5); and
Demonstrate leadership skills for a technology professional (C9). These competencies
are assessed each semester in Tech 190 through direct measures. The competency
assessment form for Tech 190 is attached.
Since Fall 2003, the faculty teaching the courses in Figures 1 and 2 have been completing
assessment forms for each student in the class. All these competencies are assessed
through direct measures. At the end of the semester, these class assessment forms are
filed in the student’s department folder. When the student graduates, a staff person in the
department tabulates the individual course assessment forms and completes a
Concentration Competency form.
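The end-of-degree tabulation described above can be sketched roughly as follows. The form fields, the met/not-met encoding, and the roll-up rule are illustrative assumptions, not the department's actual instrument:

```python
# Hypothetical sketch of the tabulation a staff person performs at graduation:
# per-course competency assessment forms filed in the student's folder are
# rolled up into a single Concentration Competency summary.
# Field names and the "met" encoding are illustrative assumptions.

from collections import defaultdict

def tabulate_concentration_form(course_forms):
    """course_forms: list of dicts like
    {"course": "Tech 169", "competency": "E5", "met": True}."""
    results = defaultdict(list)
    for form in course_forms:
        results[form["competency"]].append(form["met"])
    # Assumed rule: a competency counts as met only if every course
    # assessing it marked it as met.
    return {comp: all(marks) for comp, marks in sorted(results.items())}

forms = [
    {"course": "Tech 169", "competency": "E1", "met": True},
    {"course": "Tech 115", "competency": "E6", "met": True},
    {"course": "Tech 169", "competency": "E5", "met": False},
]
print(tabulate_concentration_form(forms))
# prints {'E1': True, 'E5': False, 'E6': True}
```

A real roll-up rule might instead require only a majority of courses to mark a competency met; the all-courses rule here is simply one plausible reading of the process.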
3. For data collected Fall 05, how and when were the data summarized, analyzed and discussed? (e.g. faculty retreat, assessment/curriculum committee, report to the faculty).
The data were summarized by the department curriculum committee and discussed at a department faculty meeting.
4. What findings emerged from departmental analysis of the information/data collected in Fall 05?
For BSIT students under the Electronics & Computer Technology concentration (BSIT-ECT), seven students who graduated in Fall 2005 had completed competency forms. For BSIT students under the Manufacturing Systems concentration (BSIT-MS), nine students had completed competency forms. In addition, the department gave the NAIT Certification
Test to 47 students enrolled in Tech 149 and Tech 169 in Spring 2005. BSIT-MS did well
on the NAIT certification exam with twelve out of nineteen students passing this exam.
BSIT-ECT students did not do as well on this exam. A faculty review of the exam
indicates that it would be a better assessment tool for BSIT-MS students.
Based upon the preliminary data analyzed, most BSIT-ECT students are not meeting one
degree competency (E5. Develop and implement software systems for control of
electronic industrial processes). Most BSIT-MS students are not meeting one degree
competency (M5. Describe the uses, advantages, and disadvantages of current and
evolving manufacturing techniques including laser machining, electrical discharge
machining, water jet and abrasive water jet machining, and rapid prototyping).
5. What actions are planned to address the findings? (e.g. curricular revision, pedagogical changes, student support services, resource management.)
As the data has been collected for only one cohort of graduating students (those
graduating Fall 2005), there is no curricular revision planned at this time. The faculty
decided that the department needs to review the competency assessment sheets and
collect more data.
6. Describe plans to collect data on additional learning outcomes next semester.
The program will continue to collect data pertaining to all the learning outcomes for the
BSIT in Spring and Fall 2006 in Tech 190 and in the other eight courses indicated in
Figures 1 and 2. In addition, the department decided to give two different certification
exams to BSIT-MS students in Tech 149 in Spring 2006: the NAIT Certification Test and
the SME Certified Manufacturing Technologist Test. The faculty will evaluate the results
from both of these tests to determine whether either would serve as a good assessment
tool for the department.
In Spring 2006, the senior students in the Manufacturing Systems capstone class (Tech
149) are doing a poster session of their final projects. A subgroup of the faculty has
developed a new Senior Capstone Evaluation Sheet for faculty to use in the poster session
in Spring 2006 to assess the students’ posters (see attached).
7. Did your analysis result in revisiting/revising the Student Learning Outcomes? If the answer was yes, please explain and submit an updated version of the Student Learning Outcomes.
No
Chemical Engineering BS Program Outcome
Assessment Summary
1. The current set of program educational objectives is located on the department web site at:
http://www.engr.sjsu.edu/cme/Academic/program_objectives.html
The current set of program outcomes is located on the department web site at:
http://www.engr.sjsu.edu/cme/Academic/CE/ChEProgOutcomes.html
2. For this semester, identify which learning outcome(s) was (were) the focus of assessment.
All learning outcomes were assessed and documented in a Self-Study report which
was submitted to the Accreditation Board for Engineering and Technology (ABET) in
June 2005. The Department hosted an accreditation visit from the ABET
accreditation team in October 2005.
The Department conducts continuous
assessment as required by ABET, and during AY 2005-06 we are primarily focused
on implementing changes which were found to be necessary as a result of what we
learned preparing our Self-Study as well as what was suggested by the ABET visiting
team. This report documents the learning objectives, outcomes, assessment process,
and all assessment and enhancement activities up to Summer 2005. Table 1 lists all
the program outcomes assessed.
Table 1: Assessed Program Outcomes
1.1 Analyze systems through material and energy balances
1.2 Utilize physics and math to construct theoretical models
2.1 Develop a structured design of experiments and analyze results
2.2 Develop experimental procedures to collect pertinent data
3.1 Analyze, design and evaluate chemical reaction systems
3.2 Analyze, design and evaluate separation systems for specific
applications
3.3 Design materials handling systems for chemical processes.
3.4 Develop comprehensive process designs for engineering projects
3.5 Develop process flowsheet to produce a specified product
4.1 Function effectively as both team leader and team member in multidisciplinary and multi-cultural engineering.
4.2 Utilize knowledge of resources and contribution of other disciplines
to solving engineering problems.
5.1 Troubleshoot a unit operation process. Identify problems and
recommend corrective action
5.2 Evaluate the economics of engineering projects and unit operations
5.3 Design control systems for chemical engineering unit operations
5.4 Apply background in chemistry and material science to analyze and
optimize chemical systems
6.1 Apply engineering ethics and professionalism in conducting
projects
7.1 Deliver effective presentations of engineering activities in written
and oral formats
8.1 Quantify social impacts of engineering projects, especially with
respect to the environment and labor
8.2 Assess safety aspects for chemical processes
9.1 Collect data relevant for process comparisons from library
resources, including literature and patent searches and alternative
sources including electronic and personal
10.1 Compare and evaluate similar technical processes
10.2 Evaluate emerging and new technologies
11.1 Apply appropriate software for design and analysis of chemical
process systems
11.2 Design processes for system optimization for TQM
3. For each learning outcome identified in item 2, what direct (performance measures) and indirect (e.g. surveys) information/data were collected, how much, and by whom?
All learning outcomes were assessed and documented in the Self-Study report.
Table 2 lists the program outcomes and the methods used to assess each outcome.
Complete detail on the assessment process for the Department can be found in the
ChE and MatE Self-Study Reports.
Direct measures of student achievement of outcomes are primarily grades on
examinations and course projects. These measures are documented in the Course
Evaluation Form prepared by each faculty member at the end of the semester.
Indirect measures of student achievement include alumni surveys, graduating
senior surveys, and employer reviews of our Senior Project presentations.
Finally, the ABET review itself provides a significant comprehensive review of
all of our measures and examines their efficacy in measuring student
achievement.
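A minimal sketch of how direct and indirect evidence of this kind can be rolled up per outcome. The scale normalization matches the survey scales named in the report, but the equal direct/indirect weighting and the example numbers are illustrative assumptions, not the Department's documented procedure:

```python
# Illustrative roll-up of one outcome's evidence. Assumes direct evidence
# arrives as the fraction of students meeting a grade criterion, and
# indirect evidence as mean survey ratings on known scales (senior survey
# 1-4, alumni survey 1-6); both are normalized to 0-1 before combining.

def normalize(rating, scale_max, scale_min=1):
    """Map a survey mean onto the 0-1 interval."""
    return (rating - scale_min) / (scale_max - scale_min)

def outcome_attainment(direct_fraction, senior_mean, alumni_mean):
    indirect = (normalize(senior_mean, 4) + normalize(alumni_mean, 6)) / 2
    # Equal weighting of direct and indirect evidence (an assumption).
    return 0.5 * direct_fraction + 0.5 * indirect

score = outcome_attainment(direct_fraction=0.80, senior_mean=3.2, alumni_mean=4.8)
print(round(score, 3))
# prints 0.773
```

Normalizing first matters: averaging a 1-4 rating directly with a 1-6 rating would silently weight the wider scale more heavily.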
Table 2: Program Outcomes and Assessment Instruments

Every program outcome in Table 1 was assessed by the Senior Survey 04 (scale 1-4) and the Industrial Survey Alumni03 (scale 1-6), both in AY 03-04, and by Course Assessment in AY 02-03 in the ChE courses listed below. Outcomes marked X were also assessed at Student Conference Day in AY 04-05.

Outcome | Student Conference Day (AY 04-05) | Course Assessment courses (AY 02-03)
1.1 | | 160A, 160B
1.2 | | 160A, 158
2.1 | | 162
2.2 | | 162
3.1 | | 158
3.2 | | 160A, 160B
3.3 | | 160B
3.4 | X | 165
3.5 | X | 165
4.1 | | 165
4.2 | | 162L
5.1 | | 158, 160B
5.2 | X | 165
5.3 | | 185
5.4 | | 158, 160B
6.1 | | 161
7.1 | X | 162L, 165
8.1 | X | 165
8.2 | X | 162L
9.1 | | 162L, 165
10.1 | X | 158, 165
10.2 | | 158, 165
11.1 | X | 160A, 165, 185
11.2 | X | 165

4. How and when were the data summarized, analyzed and discussed? (e.g. faculty retreat, assessment/curriculum committee, report to the faculty).
Each Program Outcome is assigned to one faculty member who is responsible for collecting all of the data on the achievement of that outcome. At each summer retreat the outcomes are discussed and the evidence is reviewed. This process is documented extensively in the Self-Study Reports. Outcomes which are not being met to our satisfaction are given to a Task Force to determine how best to proceed; this includes determining whether the outcome is still appropriate for the major and, if so, how it can be better met. Course changes or curricular changes are then proposed and implemented.
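The triage step in this process (outcomes not met to our satisfaction are routed to a Task Force) can be sketched as follows; the numeric threshold and data shape are hypothetical assumptions, since the report does not state a quantitative cutoff:

```python
# Hypothetical sketch: split outcomes into those meeting a satisfaction
# threshold and those referred to a Task Force, as in the process above.

SATISFACTION_THRESHOLD = 0.70  # illustrative cutoff, not stated in the report

def triage_outcomes(attainment):
    """attainment: dict mapping outcome id -> attainment score in [0, 1]."""
    items = sorted(attainment.items())
    met = [outcome for outcome, score in items if score >= SATISFACTION_THRESHOLD]
    task_force = [outcome for outcome, score in items if score < SATISFACTION_THRESHOLD]
    return met, task_force

met, review = triage_outcomes({"2.1": 0.62, "3.3": 0.68, "7.1": 0.91})
print(review)  # outcomes referred to the Task Force
# prints ['2.1', '3.3']
```

In practice the judgment is qualitative and made at the summer retreat; a numeric flag like this would at most pre-sort the evidence for discussion.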
5. What findings emerged from departmental analysis of the information/data collected?
During the ABET cycle of 1999-2005, changes to the curriculum and to courses
have been made as a result of the Assessment Cycle we designed. Most of these
changes were made in order to more effectively meet the Program Outcomes. All
of the changes to the Program Outcomes themselves, as well as to the curriculum,
are documented in the Curriculum Logbook. Major curricular changes are
mentioned below under “Previous Changes to Curriculum”. Listed below are some of the key findings of our program assessment:

• Modification of program outcomes to the current version of 24 outcomes
• Need for more statistics in the program
• Improvement of unit operations laboratory courses

6. What actions were taken to address the findings? (e.g. curricular revision, pedagogical changes, student support services, resource management.)
Previous Changes to Curriculum.
Changes have been made during this ABET cycle as a result of continuous assessment.
These changes to the curriculum or to courses in order to either meet the outcomes better,
or to assess them more easily, have included:
• In 2002-03 we increased the statistics class (ChE162) from 1 unit to 3 units and
added applications to Design of Experiments (DOE) and Statistical Process
Control (SPC). In addition we cross-listed the course with Industrial and Systems
Engineering (ISE) and made arrangements for their faculty to teach the class.
Furthermore this class was identified as a “junior core class”. Students must
successfully complete all classes in the junior core before proceeding to the senior
level classes.
• In 2002-03 an additional unit was added to the unit operations laboratory class
ChE162L. The extra unit added a lecture component to this class to cover proper
practices of experimental design and collection and interpretation of data.
Future Changes to Curriculum
Changes have been identified during the preparation of this self-study that will need to be
implemented in the future. Specific changes to be made to the curriculum or to courses in
order to either meet the outcomes better, or to assess them more easily, include:
• Develop an implementation plan for strengthening the statistical aspect of
Outcome 2.1 and 2.2. This will probably include infusing more statistical
analysis into existing laboratory courses ChE161L and Che162L which will build
on the theory taught in ChE 162.
• Reevaluate outcome 3.3. Alumni did not rate this outcome very highly. Based
upon industry data, this outcome was useful, but not critical. Review of the
material collected to assess this outcome may result in a modification of the
outcome statement.
• College-wide discussion on whether or how to include a multidisciplinary
experience is ongoing; the possibility of interdisciplinary service projects or
honors seminar have been topics of discussion this year.
• Develop a plan to assess the ethical portion of Outcome 6.1 more formally. Assessment of ethics will be incorporated into the plant design course (ChE165).
• For Outcome 7.1, which is already met fairly thoroughly, improvements are planned using progressive writing assignments throughout the curriculum (see Outcome 7.1 discussion above).
• Outcome 11.2 needs to be revisited and clarified, so that the assessment data collected reflects the intent of the outcome. Alumni expressed confusion on how to interpret this outcome.

7. Describe plans to collect data on additional learning outcomes next semester. Include a calendar for collecting data on all SLO’s by end of Spring 07.
The assessment and improvement plan, including the calendar of activities, is
documented in the self-study report.
The assessment and improvement plan is
designed to coincide with the ABET 6-year accreditation cycle. To date, all program
outcomes have been assessed. The cycle for assessment activities is listed below in
Table 3. Data from the assessment activities are evaluated during the faculty summer
retreat.
Table 3: Schedule of Activities

Activities: Student Conference Day, Alumni Survey, Industry Survey, Sr. Survey, and Course Assessment, scheduled across Fall 2005, Spring 2006, Fall 2006, and Spring 2007. (The grid marking which activity occurs in which semester did not survive extraction; the full calendar is documented in the self-study report.)
Materials Engineering BS Program Outcome
Assessment Summary
1. The current set of program educational objectives is located on the department web site at:
http://www.engr.sjsu.edu/cme/Academic/program_objectives.html
The current set of program outcomes is located on the department web site at:
http://www.engr.sjsu.edu/cme/Academic/ME/MatEProgOutcomes.html
2. For this semester, identify which learning outcome(s) was (were) the focus of assessment.
All learning outcomes were assessed and documented in a Self-Study report which
was submitted to the Accreditation Board for Engineering and Technology (ABET) in
June 2005. The Department hosted an accreditation visit from the ABET
accreditation team in October 2005.
The Department conducts continuous
assessment as required by ABET, and during AY 2005-06 we are primarily focused
on implementing changes which were found to be necessary as a result of what we
learned preparing our Self-Study as well as what was suggested by the ABET visiting
team. This report documents the learning objectives, outcomes, assessment process,
and all assessment and enhancement activities up to Summer 2005. Table 1 lists all
the program outcomes assessed.
Table 1: Assessed Program Outcomes
1.1 Distinguish between and identify the microstructure of metals,
ceramics, polymers, liquid crystals and semiconductors.
1.2 Infer and predict materials properties based on knowledge of
materials structure.
1.3 Measure and identify the materials properties appropriate to a
specific application (e.g. mechanical, electrical, etc.)
1.4 Apply concepts of thermodynamics and kinetics in the process
design of materials system in order to produce desired structure and
properties.
2.1 Select appropriate characterization methods and interpret
experimental results of materials characterization tools.
2.2 Design an appropriate experiment to measure specific engineering
properties, using statistical procedures.
2.3 Analyze results of experiments using appropriate theoretical and
empirical models.
2.4 Utilize statistical design of experiments methodology
3.1 Describe specific processing techniques for synthesis and
modification of materials
3.2 Evaluate and select appropriate materials and processing methods
based on desired performance.
4.1 Demonstrate knowledge of resources and contribution of other
disciplines to solving engineering problems.
4.2 Function effectively as both team leader and team member in
accomplishing engineering team projects.
5.1 Assess needs, formulate problem statement, structure solutions and
identify role of materials engineering in solving real-world
problems.
6.1 Formulate and address ethical issues which arise in solving
engineering problems and in the workplace.
7.1 Make effective formal and informal presentations, in written and
oral formats appropriate to a specific audience.
7.2 Demonstrates effective interpersonal communication skills.
8.1 Demonstrates knowledge of environmental impacts, life cycle &
disposal issues for chemicals and materials processes.
9.1 Conduct a thorough information search, be resourceful in
uncovering information, and critically evaluate information.
9.2 Participates in and contributes to service, professional, educational
or civic organizations.
10.1 Demonstrates in at least one project the materials issues relevant to
current technological problems.
11.1 Demonstrates appropriate and safe use of characterization and
metrology tools
11.2 Demonstrates advanced proficiency in office software such as
spreadsheets, word processors, presentation software and search
engines.
11.3 Utilize specific statistical and mathematical software.
11.4 Utilize common materials data formats such as binary and ternary
phase diagrams, Ellingham, TTT, energy band diagrams, Ashby
diagrams.
3. For each learning outcome, identified in item 2, what direct (performance
measures) and indirect (e.g. surveys) information/data were collected, how much,
and by whom?
All learning outcomes were assessed and documented in the Self-Study report.
Table 2 lists the program outcomes and the methods used to assess each outcome.
Complete detail on the assessment process for the Department can be found in the
ChE and MatE Self-Study Reports.
Direct measures of student achievement of outcomes are primarily grades on
examinations and course projects. These measures are documented in the Course
Evaluation Form prepared by each faculty member at the end of the semester.
Indirect measures of student achievement include alumni surveys, graduating
senior surveys, and employer reviews of our Senior Project presentations.
Finally, the ABET review itself provides a significant comprehensive review of
all of our measures and examines their efficacy in measuring student
achievement.
Table 2: Program Outcomes and Assessment Methods

Every program outcome was assessed through the Industrial Survey (AY 03-04), the
Senior Exit Survey (AY 03-04), and the Alumni Survey (AY 02-03); course-level
assessment was carried out in the course(s) shown in parentheses after each outcome.
Outcomes 1.4 through 3.2 were additionally assessed in AY 04-05 (outcome 2.3 at
Student Conference Day 2005), and further AY 04-05 assessments were recorded
against the final outcomes.

1.1 Distinguish between and identify the microstructure of metals, ceramics,
polymers, liquid crystals and semiconductors. (141)
1.2 Infer and predict materials properties based on knowledge of materials
structure. (195, 153)
1.3 Measure and identify the materials properties appropriate to a specific
application (e.g. mechanical, electrical, etc.). (195, 153)
1.4 Apply concepts of thermodynamics and kinetics in the process design of
materials systems in order to produce desired structure and properties. (198)
2.1 Select appropriate characterization methods and interpret experimental
results of materials characterization tools. (141, 198)
2.2 Design an appropriate experiment to measure specific engineering
properties, using statistical procedures. (198)
2.3 Analyze results of experiments using appropriate theoretical and empirical
models. (198)
2.4 Utilize statistical design of experiments methodology. (155)
3.1 Describe specific processing techniques for synthesis and modification of
materials. (155)
3.2 Evaluate and select appropriate materials and processing methods based on
desired performance. (155)
4.1 Demonstrate knowledge of resources and contribution of other disciplines
to solving engineering problems. (198, 129)
4.2 Function effectively as both team leader and team member in
accomplishing engineering team projects. (129)
5.1 Assess needs, formulate problem statement, structure solutions and identify
role of materials engineering in solving real-world problems. (198)
6.1 Formulate and address ethical issues which arise in solving engineering
problems and in the workplace. (161)
7.1 Make effective formal and informal presentations, in written and oral
formats appropriate to a specific audience. (198)
7.2 Demonstrate effective interpersonal communication skills. (198)
8.1 Demonstrate knowledge of environmental impacts, life cycle and disposal
issues for chemicals and materials processes. (161, 198)
9.1 Conduct a thorough information search, be resourceful in uncovering
information, and critically evaluate information. (198)
9.2 Participate in and contribute to service, professional, educational or civic
organizations. (198)
10.1 Demonstrate in at least one project the materials issues relevant to current
technological problems. (198)
11.1 Demonstrate appropriate and safe use of characterization and metrology
tools. (141, 154)
11.2 Demonstrate advanced proficiency in office software such as spreadsheets,
word processors, presentation software and search engines. (198)
11.3 Utilize specific statistical and mathematical software. (162)
11.4 Utilize common materials data formats such as binary and ternary phase
diagrams, Ellingham, TTT, energy band diagrams, and Ashby diagrams. (198)

4. How and when were the data summarized, analyzed and discussed? (e.g. faculty
retreat, assessment/curriculum committee, report to the faculty).
Each Program Outcome is assigned to one faculty member who is responsible for
collecting all of the data on the achievement of that outcome. At each summer
retreat the outcomes are discussed and the evidence is reviewed. This process is
documented extensively in the Self-Study Reports. Outcomes that are not being
met to our satisfaction are given to a task force to determine how best to proceed;
this includes determining whether the outcome is still appropriate for the major
and, if so, how it can be better met. Course or curricular changes are then
proposed and implemented.
5. What findings emerged from departmental analysis of the information/data
collected?
During the ABET cycle of 1999-2005, changes to the curriculum and to
courses have been made as a result of the Assessment Cycle we designed. Most
of these changes were made in order to more effectively meet the Program
Outcomes. All of the changes to the Program Outcomes themselves, as well as to
the curriculum, are documented in the Curriculum Logbook. Major curricular
changes are mentioned below under “Previous Changes to Curriculum”. Listed
below are some of the key findings of our program assessment:

• Modification of the program outcomes to the current version of 24 outcomes
• Need for more statistics in the program
• Need to improve the design component in the Senior Design Project course
(MatE 198A/B)

6. What actions were taken to address the findings? (e.g. curricular revision,
pedagogical changes, student support services, resource management.)
During the ABET cycle of 1999-2005, changes to the curriculum and to courses have
been made as a result of the Assessment Cycle we designed. Most of these changes were
made in order to more effectively meet the Program Outcomes. All of the changes to the
Program Outcomes themselves, as well as to the curriculum, are documented in the
Curriculum Logbook. Major curricular changes are mentioned below under “Previous
Changes to Curriculum”.
In addition to changes already made, the process of assembling this Self-Study during the
AY 2004-05 has resulted in ideas and concerns for new changes and implementation to
be made in the future. These are mentioned below as “Future Changes to Curriculum.”
Previous Changes to Curriculum
Changes have been made during this ABET cycle as a result of continuous assessment.
These changes to the curriculum or to courses, made either to meet the outcomes better
or to assess them more easily, have included:
• In 2002-03 we increased the statistics class (ChE 162) from 1 unit to 3 units and
added applications of Design of Experiments (DOE) and Statistical Process
Control (SPC). In addition, we cross-listed the course with Industrial and Systems
Engineering (ISE) and arranged for their faculty to teach the class. Furthermore,
this class was made a prerequisite to MatE 155 (Materials Selection and Design).
• In 2004-05 the Senior Design Project course was modified to ensure that students
were required to meet multiple design milestones, including multiple realistic
design constraints and broader considerations. Previous changes to the course
made in 2002-04 added design content workshops but did not result in
consideration of these constraints.
Future Changes to Curriculum
Changes have been identified during the preparation of this self-study that will need to be
implemented in the future. Specific changes to be made to the curriculum or to courses in
order to either meet the outcomes better, or to assess them more easily, include:
• Formally add an independent lab project to MatE 152 and use it to assess
Outcome 1.4.
• Continue development of the Materials Characterization Center so that Outcomes
2.1 and 11.1 are met by every student through coursework (154 and 141) and the
senior project (198). SEM capabilities are now in place and funding is being
developed for new XRD equipment.
• Develop an implementation plan for strengthening the statistical aspect of
Outcome 2.2. This will probably include infusing more statistical analysis into
existing laboratory courses such as 153 and 195, building on the theory taught in
ChE 162.
• Develop a plan for a more formal teamwork experience (Outcome 4.2) within the
MatE curriculum. College-wide discussion on whether or how to include a
multidisciplinary experience is ongoing; interdisciplinary service projects and an
honors seminar have been topics of discussion this year.
• Develop a plan to assess Outcome 6.1 more formally; this may require adding
content to the MatE 198 course.
• For Outcome 7.1, which is already met fairly thoroughly, plan improvements
using progressive writing assignments throughout the curriculum (see the
Outcome 7.1 discussion above).
• Re-write Outcome 8.1 so that it is broader and more easily assessed in the Senior
Design Project. Simply using ABET criterion (h) as written may be the best
approach.
• Re-visit Outcome 11.3 and make it more specific, strengthening the curriculum as
necessary to achieve it.
• Re-visit Outcome 11.4 and make it more specific so that it is more easily
assessed.
• Consider writing an additional outcome under criterion (k), related to materials
processing techniques.

7. Describe plans to collect data on additional learning outcomes next semester.
Include a calendar for collecting data on all SLO's by end of Spring 07.
The assessment and improvement plan, including the calendar of activities, is
documented in the self-study report; it is designed to coincide with the ABET six-year
accreditation cycle. To date, all program outcomes have been assessed. The cycle of
assessment activities is listed below in Table 3. Data from the assessment activities are
evaluated during the faculty summer retreat.
Table 3: Schedule of Activities
[Table 3 marks, for each of Fall 2005, Spring 2006, Fall 2006 and Spring 2007, which
of the following activities are conducted: Student Conference Day, Alumni Survey,
Industry Survey, Senior Survey, and Course Assessment.]
Civil Engineering BS Program Outcome Assessment Summary
1. URL to current set of student learning outcomes on the UG studies web site.
http://www.engr.sjsu.edu/civil/home/modules.php?name=Content&pa=showpage&pid=38
2. For this semester, identify which learning outcome(s) was (were) the focus of
assessment.
The summary of all learning outcomes assessment for 1999 to 2005 was documented
in a self-study report submitted to the accreditation agency for engineering. This
report documents the program learning objectives and program outcomes assessment
process, and all assessment and enhancement activities up to Summer 2005. The
report was reviewed by the accreditation agency and a site visit was completed in
October 2005. The department is currently reviewing and responding to the
comments made by the accreditation agency.
In Fall 2005, a three-hour faculty retreat will be held on Friday, Dec. 9 to conduct an
assessment of two of our program outcomes:
1. Graduates have proficiency in and an ability to apply knowledge of
engineering, mathematics through differential equations, probability and
statistics, science including calculus-based physics and chemistry, and
general engineering.
6. Graduates have an understanding of professional and ethical
responsibility.
3. For each learning outcome, identified in item 2, what direct (performance
measures) and indirect (e.g. surveys) information/data were collected, how much,
and by whom?
General Information
EBI Survey Results – Spring 2005.
Indirect measure - collected by COE
Report from DAC for Spring 2005 Student Focus Group.
Direct measure – collected by the Department Advisory Council
Outcome 1
Summary of Performance on FE Exam.
Direct measure – collected by state
Summary of Performance on FE Mock Exam.
Direct measure – collected by department
Faculty Assessment of Learning Objectives in
Assessed Courses – Fall 2004.
Direct measure – collected by department
Sample Student Work from Primary Assessment Courses.
Direct measure – collected by department
Outcome 6
Summary of Performance on FE Exam.
Direct measure – collected by state
Summary of Performance on FE Mock Exam.
Direct measure – collected by department
Faculty Assessment of Learning Objectives in
Assessed Courses – Fall 2004.
Direct measure – collected by department
Sample Student Work from Primary Assessment Courses.
Direct measure – collected by department
4. How and when were the data summarized, analyzed and discussed? (e.g. faculty
retreat, assessment/curriculum committee, report to the faculty).
Data were collected by one faculty member into a package and distributed to all
full-time faculty members in early December. The retreat will be held on a Friday
morning with all full-time faculty members in the department. A summary report
will be compiled by the department's assessment coordinator and distributed to
the faculty early next year.
5. What findings emerged from departmental analysis of the information/data
collected?
Students need stronger math and science skills, both at the time they begin our
department courses and at the time they take their first statewide licensing exam.
Additionally, it was unclear from reading student reports if they were able to
accurately identify an experience during their internship related to ethics.
6. What actions were taken to address the findings? (e.g. curricular revision,
pedagogical changes, student support services, resource management.)
The following recommendations were made:
• Math prerequisites added for CEE numerical methods and statistics courses.
• Geology 1 removed as an option for the science elective.
• Report requested from the CE 131 instructor to clarify the extent of course
coverage addressing ethics and professional issues.
• Pursue the option of having a common final for CE 99.
• Choose an ethics handbook to be included in the required materials for CE 105.
7. Describe plans to collect data on additional learning outcomes next semester.
Include a calendar for collecting data on all SLO's by end of Spring 07.
The assessment and improvement plan is designed to coincide with the ABET 6-year
accreditation cycle. Data collection will continue for specific forms of data in Spring
2006. A long term plan of data collection is being discussed to address shortcomings
identified in the Fall 2005 accreditation review.
Computer Engineering BS Program Outcome Assessment Summary
1. URL to current set of student learning outcomes on the UG studies web site.
http://www.engr.sjsu.edu/cmpe/cmpe/cmpe_bs.php
2. For this semester, identify which learning outcome(s) was (were) the focus of
assessment.
All learning outcomes were assessed and documented in a self-study report. This
report documents the learning objective, outcome, assessment process, and all
assessment and enhancement activities up to Summer 2005.
Program Outcomes
The Computer Engineering program is designed to produce computer engineering
graduates who have attained:
(a) an ability to apply knowledge of mathematics, science, and engineering
(b) an ability to design and conduct experiments, as well as to analyze and interpret
data
(c) an ability to design a system, component, or process to meet desired needs within
realistic constraints such as economic, environmental, social, political, ethical,
health and safety, manufacturability, and sustainability
(d) an ability to function on multi-disciplinary teams
(e) an ability to identify, formulate, and solve engineering problems
(f) an understanding of professional and ethical responsibility
(g) an ability to communicate effectively
(h) the broad education necessary to understand the impact of engineering solutions
in a global, economic, environmental, and societal context
(i) a recognition of the need for, and an ability to engage in life-long learning
(j) a knowledge of contemporary issues
(k) an ability to use the techniques, skills, and modern engineering tools necessary for
engineering practice.
The support of the PEOs through the Program Outcomes

[A matrix in the self-study maps the Program Outcomes (a–k) to the Program
Educational Objectives they support:]

I. Demonstrated an understanding of the fundamental knowledge that is a
prerequisite for the practice of computer engineering, including its scientific
principles and the importance of rigorous analysis and creative design.
II. Demonstrated a practice-oriented, hands-on understanding of how to apply
theoretical concepts to real world problems.
III. Demonstrated clear communication skills, responsible teamwork, leadership, and
professional attitude and ethics.
IV. Successfully entered the engineering profession, and embarked on the process of
life-long learning in engineering or other professional areas.
Summary of Outcome Assessment (up to Summer 2005)

[A curriculum matrix in the self-study maps each required course and GE area (GE
Areas A, C, D, E, R, S, V and Z; MATH 30, 31, 32, 42, 133A, and 129A or 138 or
143C; PHYS 70 & 71 or PHYS 50, 51 & 52; CHEM 001A; ENGR 010 and 100W;
ME 020 and 109; EE 097, 098 and 101; ISE 130; CMPE 046, 101, 102, 110, 124,
125, 126, 127, 130, 131, 140, 142, 152, 195A and 195B; assessed courses identified
by an underscore) against the program outcomes a–k. Each cell records the content
level at which the course addresses the outcome:
1 – Knowledge, comprehension, or basic level
2 – Application, analysis, or intermediate level
3 – Synthesis, evaluation, or advanced level
A final row gives the overall level achieved for each outcome. GE Area B is covered
within the required Math, Phys & Chem courses; GE Areas R & Z are covered within
the required Engr 100W course.]

3. For each learning outcome, identified in item 2, what direct (performance
measures) and indirect (e.g. surveys) information/data were collected, how much,
and by whom?
All learning outcomes were assessed and documented in a self-study report. This report
documents the learning objective, outcome, assessment process, and all assessment and
enhancement activities up to Summer 2005.
The Program Outcome Assessment Cycle and the Course Learning Objective
Assessment Cycle

[Two flowcharts in the self-study depict the assessment cycles. Institutional mission
and goals and the ABET criteria drive the establishment and enhancement of Program
Educational Objectives, Program Outcomes, and Course Learning Objectives.
Achievement of each is assessed (documented in the Program Report, the Outcome
Report, and the Course Journal, respectively) using student course performance and
feedback, senior/junior feedback, student exit focus groups and surveys, and alumni,
employer, and Advisory Council feedback. The results feed back into course and
curriculum enhancement and program delivery.]
A course journal is constructed for all required courses and selected elective courses, to
serve two purposes:
• Course Improvement: By documenting a set of requirements for course content
encompassed in course learning objectives, evaluation indicators for these
objectives encompassed in learning performance indicators, and achievement
levels for each, it promotes consistency, quality and dependability for the course,
regardless of who is teaching it. Course assessment results for the achievement of
course learning objectives are used to identify where and how improvements can
be made to enhance student learning.
• Evaluation of Outcome Support: By documenting a set of requirements for
assessable components, and contributions to program outcomes and combined
with student work and feedback collected from the course, it is used as a source
for assessing student achievement with regard to program outcomes.
The course journal characterizes the contributions the course is intended to make to
student learning and progression towards the program outcomes. To assess the extent
to which the course delivery succeeds, student work is collected from a sampling of
students of all passing grade levels. Assessment of the student work together with the
course journal is used to evaluate and improve the course and also to evaluate and
improve the course’s contribution to Program outcomes as it is integrated into the
curriculum. A course journal consists of four sections: course description, course
modifications, course assessment, and assessment data.
Course Description

The course description includes the syllabus, an explanation of the significance of the
course learning objectives, and a course assessment matrix.
• Syllabus: describes the CLOs, the course content, and the course conduct.
• Significance of Course Learning Objectives: includes a table describing the
relationships of CLOs to POs. The relationships are represented as content
capability levels, in three categories:
Level 1 – Basic level: knowledge, comprehension
Level 2 – Intermediate level: application, analysis
Level 3 – Advanced level: synthesis, evaluation
• Course Assessment Matrix: describes the CLOs, Learning Performance Indicators
(LPIs), assessment methods, prerequisites, and content levels. LPIs are indicators
associated with a particular CLO that can be evaluated using designated
assessment methods to indicate student performance. All LPIs for a particular
CLO are used to make the assessment of achievement for that CLO. The
assessment methods are categorized as follows:
E – Exam and test
F – Stakeholder feedback
H – Homework assignment
J – Project
L – Laboratory assignment
P – Presentation
R – Report
S – Survey
X – Exit survey
O – Others
Course Modifications
Based on the actions recommended as a result of previous course assessments, the
modifications of a course are summarized in the following categories:
• Changes to course learning objectives
• Changes to course content
• Changes to laboratory
• Changes to textbook
• Changes to prerequisites
• Others
Course Assessment

After each semester, the content levels and assessment capability levels for each
course learning objective are summarized in a table. In addition, the following areas
are assessed for each course:
• Assessment of student learning capability levels: each "to be improved" learning
performance indicator consists of a description of data sources, analysis statements,
and action items with individual tracking identifications.
• Assessment of student performance: includes a summary, a comparison with overall
departmental student grades, a comparison with previous student grades, an analysis
of the trend, and a list of actions with tracking identifications.
• Assessment of student feedback: includes a summary, a comparison with the overall
departmental student survey, a comparison with previous student course feedback
for each course learning objective, an analysis of each to-be-improved course
learning objective, and a list of actions with tracking identifications.
• Assessment of other feedback.
• Summary of actions: includes an action id, description, implementation, and status
of each action related to the course.
Assessment Data
The data pertinent to the assessment of each course is included in this section. The
data is presented in the following sequence.
• Exams and tests
• Homework assignments
• Laboratory assignments
• Presentations
• Reports
• Projects
• Student feedback
• Stakeholder feedback
• Others
4. How and when were the data summarized, analyzed and discussed? (e.g. faculty
retreat, assessment/curriculum committee, report to the faculty).
The self-study report has an extensive record of activities associated with outcome
assessment and program enhancement. Typically, assessment data are analyzed off-line
by designated faculty members, and improvement actions are discussed in Industry
Advisory Board meetings and in faculty retreats and meetings.
[A table schedules the assessment mechanisms (assess program educational objectives;
assess program outcomes w.r.t. PEOs; assess curriculum w.r.t. POs; assess courses) and
the feedback mechanisms (employer feedback; alumni feedback; graduates' exit
feedback; student feedback; faculty retreat(1); Department Advisory Council
meeting(2); external (accreditation) review) across each semester from Fall 2003
through Fall 2010, marking the semesters in which each takes place.]
(1) Not all retreats are focused on the BS Computer Engineering program.
(2) Not all meetings of the Department Advisory Council are focused on the BS
Computer Engineering program.

5. What findings emerged from departmental analysis of the information/data
collected?

The findings are documented in the self-study report.
Program Outcome a
The understanding and application of mathematics and science are at the foundation of a
computer engineer’s conceptual world. The students are iteratively taken through a series
of math and science courses, and are repeatedly required to apply their understanding to
domain specific engineering tasks.
The external feedback (e.g., the employer survey rates the alumni at 3.3 with respect to
this outcome) and the student confidence level (significantly above the average of the
peer institutions, and showing an increase across the senior year) seem to confirm our
assessment that outcome a is well supported by the curriculum.
Program Outcome b
The computer engineering curriculum is designed to enable the graduate to gain the
outcome-b capabilities through both hardware and software laboratory/project
components.
Our program provides students an iterative and incremental learning experience in
designing, conducting and interpreting their laboratory/project assignments.
The external feedback (e.g., the employer survey rates the alumni at 3.3 with respect to
this outcome) as well as the students' confidence levels (steady through the junior and
senior surveys, and significantly above the average of the peer institutions) seem to
confirm our assessment that outcome b is well supported by the curriculum.
Program Outcome c
Understanding the principles of design, and being able to design, implement and verify a
system and its components using contemporary hardware and software design tools and
methodologies, are foundations of computer engineering. Students successively take a
series of design-oriented hardware and software courses during their undergraduate
study; for each new course, they are required to apply their knowledge of previous
coursework to understand the fundamental design principles. This method culminates in
a successful senior design project.
The employer survey rates the graduates at 2.8. This is a satisfactory rating, but it is
indicative of there being room for improvement. The employers’ experience is with
graduates some years after their graduation, and recent improvements in the senior design
courses are not likely to have propagated sufficiently far into the alumni population to
influence this rating yet.
The engineering exit study seems to indicate that the 2005 program graduates are
significantly more confident of their ability to perform design tasks than graduates from
the peer institutions are. This may be reflective of the improvements of the program that
we have recently engaged in.
Though outcome c is well supported by the curriculum as assessed, we are keeping an
eye on this outcome and will re-assess in particular how the recent changes affect our
graduates’ design capabilities.
Program Outcome d
The practical understanding of how to work in teams and how to work in a
multidisciplinary setting is central to a computer engineer’s professional career.
The students are repeatedly exposed to the challenges of understanding non-engineering
domains of knowledge, and the domains of other engineering disciplines. They are also
required to work in team settings with students from their own and from other disciplines
to achieve tasks and assigned objectives.
The students are repeatedly assigned tasks and projects in a team context, and the senior
capstone project is the culmination of this learning experience: the students form
teams, assign themselves a project and its objectives, plan the execution of the project,
and are held jointly and individually responsible for their contributions to the teamwork.
It is noteworthy that the students' confidence increases significantly across their last
year, as seen in the junior and senior surveys.
The employer survey rates the graduates at 3.3.
The engineering exit study seems to indicate that the 2005 program graduates are
significantly more confident of their ability to perform design tasks than graduates from
the peer institutions are.
We conclude that outcome d is well supported by the curriculum.
Program Outcome e
The computer engineering curriculum is designed to enable the graduate to gain the
outcome-e capabilities through the integration of lecture and lab components. With a
strong emphasis on laboratory work and on the confirmation of concepts learned in the
classroom by means of their immediate and practical application our program provides
students with an iterative and incremental learning experience that prepares them well for
a life of identifying, formulating and solving engineering problems.
The employer survey rates the graduates at 3.3, providing strong support for the
contention that our graduates are doing well in this outcome area.
The engineering exit study seems to indicate that the 2005 program graduates are
somewhat more confident of their ability to perform design tasks than graduates from the
peer institutions are.
The junior and senior surveys indicate an increasing sense of competence with respect to
outcome e in the students as they progress through their last year.
We conclude that outcome e is well supported by the curriculum.
Program Outcome f
Understanding and accepting professional ethics in school and at work are taught as a
recurring theme throughout the program, with particular courses addressing the topic
quite specifically.
The employer survey indicates a rating of 2.9, and the national exit study indicates that
our students are in line with the graduates of peer programs. The junior and senior
surveys indicate an increase in the students' awareness of ethical issues across their last
year.
We feel that outcome f is well supported by the curriculum.
Program Outcome g
The practical understanding of how to communicate appropriately and effectively is an
important career skill for a computer engineer.
The employer survey indicates a rating of 2.9, and the national exit study indicates that
our students are significantly more confident in the communication area than the
graduates of peer programs. The junior and senior surveys indicate a strong increase in
the students' perception of their own communication capabilities.
Since the employers’ experience is with students graduating across the last decade, we
feel that the recent graduates are likely to exceed the most recent (and satisfactory) rating,
since a number of improvements in the communication area have been implemented over
the last few semesters.
Ethics and professional responsibility was the main topic for the focus group for the
spring 2005 graduating class. The topic was probed extensively, and the students
demonstrated a good understanding of their obligations to themselves and to society,
as well as an ability to discuss the subtleties of ethics in a professional context.
Based on the curriculum assessment, the students’ feedback and that of alumni and
employers, we conclude that outcome g is well supported by the curriculum.
Program Outcome h
The computer engineering curriculum is designed to enable the graduate to achieve
outcome h through both general education courses and three upper division major
courses.
The employer survey indicates a rating of 2.8, and the national exit study indicates that
our students are in line with the graduates of peer programs. The junior and senior
surveys indicate an acceptable confidence level among the students.
Since the employers' experience is with students graduating across the last decade, we
feel that the recent graduates are likely to exceed the above (though satisfactory) rating,
since a number of improvements in this area have been implemented over the last few
semesters.
We conclude that outcome h is well supported by the curriculum.
Program Outcome i
The recognition of the need for continued professional development, and the ability to
sustain such development throughout a professional career, are essential for a computer
engineer.
The students are repeatedly provided with the tools for independent learning, and
increasingly exposed to the need for independent knowledge pursuit and career self-direction.
They are also repeatedly confronted with evidence of the rapid change in the IT field in
general and in computer engineering in particular.
The employer survey indicates a rating of 2.8, and the national exit study indicates that
our students are in line with (slightly above) the graduates of peer programs. The junior
and senior surveys indicate an adequate confidence level with respect to outcome i, and
their confidence seems to be confirmed by the fact that the alumni survey indicates that
53% of the respondents have enrolled in further study within two years of graduation.
We feel that the curriculum well supports outcome i.
Program Outcome j
Understanding the contemporary context in which the students work and live, and being
able to see the trends and directions of society and the profession, are valuable qualities
in a computer engineering professional.
The employer survey indicates a rating of 2.6, and the national exit study indicates that
our students are significantly more confident than the graduates of peer programs in this
area. The junior and senior surveys indicate an acceptable level of confidence across their
last year.
The employer survey rating is somewhat lower than we would like (though it is still in
the acceptable range), and is at some variance with the recent graduates' expressed
confidence in this area. The discrepancy may simply be a matter of the employers seeing
graduates of previous years, before the changes in the senior-year curriculum were
implemented.
Though outcome j is satisfactorily achieved, we will be watching it to assess the
improvements resulting from recent curriculum changes.
Program Outcome k
The computer engineering curriculum is designed to enable the graduate to achieve
outcome-k through a series of hardware/software laboratory, homework and project
components.
Our program provides students with an iterative and incremental learning experience in
conducting their laboratory/project assignments, using contemporary and professional
tools as appropriate.
The employer survey indicates a rating of 2.6, and the national exit study indicates that
our students are significantly more confident than the graduates of peer programs in this
area. The junior and senior surveys indicate a solid level of confidence across their last
year. The alumni survey provides evidence that 93% of the respondents are using
contemporary tools in their professional work.
6. What actions were taken to address the findings? (e.g. curricular revision,
pedagogical changes, student support services, resource management.)
All corrective or improvement actions are documented in the self-study report.
Changes That Have Been Implemented
Based on the improvement cycles depicted in the following figure and the university
procedures, the Computer Engineering Department has implemented a systematic process
for program curricular changes and course content changes. This figure illustrates the
process for curriculum and course changes.
Curriculum and Course Change Process
[Figure: flowchart of the curriculum and course change process: Assess curriculum/course; Propose curriculum/course change; Reviewed by Department Undergraduate Studies Committee; Reviewed by all faculty (when more than one course is impacted); Generate Curriculum Change Form; Reviewed by College Undergraduate Studies Committee; Reviewed by University Undergraduate Studies Committee; Implement change.]
Due to the university processes, the delay from the moment a curricular change decision
has been made by the department until it takes effect is at least one and usually two
semesters. Thus, the whole feedback loop is of some two years' duration when changes
affect the catalog.
Curricular Changes
Since fall 2001, the Computer Engineering program has improved its curriculum through
a series of curricular changes. A comprehensive list of changes is included in Appendix
III.B of the self-study report. The following highlights the key curricular changes since
fall 2001:
Fall 2001 Curricular Changes
 CMPE 101 (New requirement): Because CMPE 126 is one of the first courses that
transfer students take when matriculating into SJSU, there was a big difference in the
C++ background, knowledge, and programming skills that are covered in the
CMPE 046 prerequisite to CMPE 126. The need therefore arose to ensure that all
CMPE 126 students had the requisite C++ background and preparation. The CMPE
101 placement exam was initiated to investigate this need. Testing was conducted
that correlated students' performance on the CMPE 101 exam with their subsequent
performance in CMPE 126. A sufficient correlation with improvement in student
performance warranted the initiation of CMPE 101 as a new requirement in the
program.
 CMPE 124 (New prerequisite): The EE 101 placement exam requirement was
initiated in fall 2000 to ensure students had the requisite preparation in introductory
circuit analysis and design concepts. It was deemed that the content of the CMPE 124
course required this introductory knowledge of circuit analysis and design concepts to
improve student performance in the course; thus EE 101 was added as a prerequisite.
 EE 97 (New requirement): Students require a lab component to exercise and reinforce
concepts in circuit analysis and design.
 MATH 42 (New requirement): Based on the recommendation of the previous ABET
visit, the MATH 42 requirement was added.
 Technical electives (Unit requirement): In order to accommodate the new curriculum
requirements of CMPE 101, EE 097, and MATH 42, the number of units required for
the technical elective requirement was changed from 12 units to 9 units. (The
California State University system in effect does not allow the department to increase
the number of units in the program.)
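The correlation study behind the CMPE 101 requirement can be sketched as follows. This is an illustrative example only: the scores and grades below are fabricated, and the department's actual data and analysis tooling are not described in this report.

```python
# Hypothetical sketch of a placement-exam correlation study:
# does performance on a placement exam predict subsequent course performance?
from statistics import mean, pstdev

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    # Population covariance divided by the product of population std. deviations.
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

# Fabricated data: placement-exam scores vs. later course grade points.
placement = [55, 62, 70, 75, 80, 85, 90, 95]
course_grade = [1.7, 2.0, 2.3, 2.7, 3.0, 3.0, 3.3, 3.7]

r = pearson_r(placement, course_grade)
print(f"r = {r:.2f}")  # a strongly positive r would support the requirement
```

A sufficiently strong positive correlation, as described in the text, is the kind of evidence that would justify making the placement exam a program requirement.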
Fall 2003 Curricular Changes
 CMPE 110 (Replacement): CMPE 110 combined the necessary elements of EE 110
and EE 122 in one course. It retains most of EE 110 content, and emphasizes the
digital aspects of EE 122 in greater detail. Therefore the EE 110 and EE 122 required
courses were replaced by the new CMPE 110 course requirement. Students are
allowed to take EE 110 and EE 122 as technical elective courses to strengthen their
analog background.
 CMPE 102 (New requirement): The department determined that assembly language
programming was a necessary element for the undergraduate curriculum. In
conjunction with the replacement of EE 110 and EE 122 with the new CMPE 110
course a sufficient amount of unit hours was freed up to add CMPE 102 as a new
required course.
Spring 2004 Curricular Changes
 CMPE 140 (Prerequisite changes): CMPE 140 students showed a lack of basic logic
design skills, so the CMPE 125 prerequisite was added to compensate. Since CMPE
140 emphasizes the micro-architecture design of a scalar RISC CPU, CMPE 127 was
removed as a prerequisite restriction.
Fall 2004 Curricular Changes
 CMPE 124 (New prerequisite): The CMPE 124 course required introductory circuit
analysis and design (covered in EE 98 which is a prerequisite to EE 101) as well as
the laboratory component (EE 97) to cover materials not included in EE 98. EE 97
therefore was added as a new prerequisite.
Content Changes
The data gathered in the student surveys, the employer evaluation surveys, and other
sources are being used by the faculty to address those areas of the program where there is
particular room for improvement.
Most changes within a course are minor, involving didactics rather than changes in
course content. However, the capstone project sequence has seen some significant
content upgrades. Even though the student surveys and internship supervisor evaluations
were generally very positive, there were a few areas that the students and employers
indicated needed improvement.
The Computer Engineering program has changed the format of its senior capstone
sequence (CMPE 195A and B) to place a greater emphasis on communication skills,
life-long learning, multi-disciplinary exposure, and awareness of the global context in
which the graduates will find themselves:
1. CMPE 195A (Sr. Project I) has added a lecture series on topics such as teamwork,
research skills (life-long learning), and engineering ethics.
2. CMPE 195B (Sr. Project II) has added a multi-disciplinary lecture series on
leading edge technologies and the global nature of the engineering professions
(the lecturers are technology and corporate leaders from Silicon Valley).
3. CMPE 195B provides a tutor for written communication to the class, with weekly
tutoring sessions for students who need (or want) to improve their skills in written
communication.
4. All capstone project reports are reviewed by two faculty members.
5. All presentations are reviewed by at least two faculty members.
Other Changes
The feedback from students with respect to the quality of advising received from
non-academic staff has caused the college to evaluate ways to improve the process of
advising students on the non-academic aspects of their progress towards a degree.
Specifically, in spring 2005 the college established an undergraduate student advising
center, with the mission of providing exactly such support to the student population.
Feedback from students on the Exit Survey has also caused the department to evaluate
options for providing improved remote access to the computing infrastructure in the
department laboratories. We expect an additional laboratory to be made available for
remote access starting fall 2005, and with the addition of the university campus-wide
wireless network, we expect that the students’ satisfaction with their access will improve.
7. Describe plans to collect data on additional learning outcomes next semester.
Include a calendar for collecting data on all SLO’s by end of Spring 07.
The assessment and improvement plan, including the calendar of activities, is
documented in the self-study report. The assessment and improvement plan is
designed to coincide with the ABET 6-year accreditation cycle.
[Table: assessment and feedback calendar by semester, Fall 2003 (F03) through Fall 2010 (F10). Assessment mechanisms: assess program educational objectives; assess program outcomes w.r.t. PEOs; assess curriculum w.r.t. POs; assess courses. Feedback mechanisms: employer feedback; alumni feedback; graduates' exit feedback; student feedback; faculty retreat (note 3); Department Advisory Council meeting (note 4); external (accreditation) review.]
Electrical Engineering BS Program Outcome
Assessment Summary
3. Not all retreats are focused on the BS Computer Engineering program.
4. Not all meetings of the Department Advisory Council are focused on the BS Computer
Engineering program.
1. URL to current set of student learning outcomes on the UG studies web site.
http://www.engr.sjsu.edu/electrical/
2. For this semester, identify which learning outcome(s) was (were) the focus of
assessment.
Outcomes are assessed on a regular basis according to our assessment plan (see
question 7). All outcomes were assessed and documented in a report for our recent
ABET accreditation visit (Fall 2005). These learning outcomes are listed below:
a. an ability to apply knowledge of mathematics, science, and engineering
b. an ability to design and conduct experiments, as well as to analyze and interpret
data
c. an ability to design a system, component, or process to meet desired needs
d. an ability to function on multi-disciplinary teams
e. an ability to identify, formulate, and solve engineering problems
f. an understanding of professional and ethical responsibility
g. an ability to communicate effectively
h. the broad education necessary to understand the impact of engineering solutions
in a global and societal context
i. a recognition of the need for, and an ability to engage in life-long learning
j. a knowledge of contemporary issues
k. an ability to use the techniques, skills, and modern engineering tools necessary for
engineering practice.
l. specialization in one or more technical specialties that meet the needs of Silicon
Valley companies
m. knowledge of probability and statistics, including applications to electrical
engineering
n. knowledge of advanced mathematics, including differential and integral
equations, linear algebra, complex variables, and discrete mathematics
o. knowledge of basic sciences, computer science, and engineering sciences
necessary to analyze and design complex electrical and electronic devices,
software, and systems containing hardware and software components
3. For each learning outcome identified in item 2, what direct (performance measures)
and indirect (e.g. surveys) information/data were collected, how much, and by whom?
The assessment methods include course and program assessment matrices, course
evaluations, the EE110 (Network Analysis) placement examination,
junior/senior/alumni/employer surveys, a focus group report, a senior skill audit
examination, capstone project evaluation, and team evaluation.
Assessment Methods (Type; Indicators; Quantity; By whom):
∙ Course and Program Assessment Matrices: Faculty Documentation; Descriptive; All courses; Faculty
∙ Course Evaluation Forms: Student Report; Statistics; All courses; Instructors
∙ EE110 Placement Examination: Objective Scoring; Exam Score; > 100; Dept. Chair
∙ Junior Survey: Self Report; Statistics; 207; Asses. Coord.
∙ Focus Group Report: Observer; Narrative; quantity not stated; Dept. Chair
∙ Senior Skill Audit Exam: Objective Scoring; Exam Score; All seniors; Dept. Chair
∙ Capstone Project: Objective Scoring; Grade; 30; Project advisors
∙ Team Evaluation: Report; Statistics; 128; Asses. Coord.
∙ Senior Survey: Self Report; Statistics; 126; Asses. Coord.
∙ Alumni Survey: Self Report; Statistics; 40; Asses. Coord.
∙ Employer Survey: Employer Report; Statistics; 12; Asses. Coord.
4. How and when were the data summarized, analyzed and discussed? (e.g. faculty
retreat, assessment/curriculum committee, report to the faculty).
The data have been summarized by the assessment coordinator and distributed to the
entire faculty. The data were then analyzed and discussed as shown in the timetable
below:
∙ F01 (11/01) department retreat: discussion based on faculty inputs, results from
industry advisory board meetings, and assessment data from Fall 98 and SP 99 surveys.
Documents: Program Outcome champion presentations.
∙ S03 (04/03) department retreat: discussion based on assessment data from SP02 and
SP03 surveys. Documents: course evaluation results; student, alumni, and employer
evaluation results; all program criteria; all course descriptions and program objectives.
∙ F03 (9/03) department retreat: discussion based on assessment data from SP02 and
SP03 surveys and results from up-to-date skill audit exams. Documents: reports from
program area group leaders; course syllabi.
∙ S04 (04/09) department retreat: discussion based on course materials, the draft survey
questionnaire, previous enhancements, and program objectives. Documents: all survey
forms (draft); action items.
∙ F04 (11/19) department retreat: discussion based on the Program Educational
Objectives, learning objectives, program outcomes contributions, and discussion of past
curriculum changes. Documents: course reports; course maintenance forms and reports.
∙ S05 department retreat: discussion based on assessment data from SP04 and Fall 04
surveys and the status of course binders. Documents: course evaluation results; student,
alumni, and employer evaluation results; all program criteria; all course descriptions and
program objectives.
5. What findings emerged from departmental analysis of the information/data collected?
Quite a number of findings emerged from the data collected, and it would be lengthy to
describe them all here. Every finding was discussed, and actions have been taken to
improve the outcomes, as described in item 6 below.
6. What actions were taken to address the findings? (e.g. curricular revision,
pedagogical changes, student support services, resource management.)
During the 2001-2003 AE&E cycle, discussions about the program assessment results
have brought up some issues related to the current Program Educational Objectives.
Most faculty members believed that the current set of PEOs needed to be modified
for better understanding, easier assessment of its outcomes, and better responsiveness
to today's dynamic environment. The new PEOs were adopted during the spring 2004
semester and immediately used for the 2004-2006 AE&E cycle.
A major change in the program technical area is the addition of two course sequences
starting in spring 2004: a course sequence in Design, Fabrication, and Test of
Integrated Circuits, and another course sequence in Mixed Signal Design and Test.
Other program changes and suggestions for changes can be found on the Champion
Opinions on Program Outcome Assessment webpage.
There were many changes to courses and projects during this AE&E period.
First, there were major enhancements to laboratory equipment, laboratory manuals, and
course materials, thanks to the availability of California Workforce Incentive grants in
2002. Second, during spring 2004, all course learning objectives were reviewed
and modified by area faculty members, with final approval by the entire
faculty. Third, all course textbooks, topics, prerequisites, and lecture schedules
were discussed, together with the course series and assessment results, during the
spring 2005 ABET retreat. Recommendations were provided by the area
committees, and modifications are in progress (by the course coordinators). All course
modifications are now documented in the course logs, which can be found on the
Course Descriptions and Learning Objectives webpage.
Starting in spring 2004, student writing skills have been strengthened by changing the
process of writing the senior project report. The new process requires the students to
seek out writing consultants to review their project reports before submission. The
department also supports the IEEE student chapter in organizing weekly seminars on
current technology and business. The EE198A students are now required to submit a
business plan as part of the project proposal.
7. Describe plans to collect data on additional learning outcomes next semester. Include
a calendar for collecting data on all SLO’s by end of Spring 07.
There will be no data collection next semester since next semester is in the
enhancement period. Below is the chart showing assessment/enhancement calendar.
[Chart: assessment/enhancement calendar by semester, Spring 2004 through Fall 2008, indicating assessment (Ass.), evaluation (Eval.), and implementation (Impl.) phases.]
General Engineering BS Program Outcome Assessment
Summary
1. URL to current set of student learning outcomes on the UG studies web site.
The BS General Engineering program is currently in transition. Before Fall 2005,
there were three concentrations in the program: Microelectronics Process
Engineering (MPE), Environmental Health and Safety Engineering (EHSE), and
Software and Information Engineering (SIE). Nearly all undergraduate General
Engineering students were in one of these three concentrations. The program
also served as an unofficial ‘engineering undeclared’ major for lower division
engineering students. The MPE and EHSE concentrations were discontinued in Fall
2005, and SIE was moved to the Computer Engineering department in Spring 2005.
Most SIE students followed the program to the Computer Engineering department,
and all other BSGE students are in the pipeline for graduation.
The current plan for the BSGE program is to develop a multidisciplinary program in
Fall 2006. Learning objectives and an assessment plan will be developed at that
time. To position the BSGE curriculum for possible future ABET accreditation, the
ABET 11 outcomes will be used as the basis for development of the BSGE
program outcomes. The ABET 11 outcomes are as follows:
a. an ability to apply knowledge of mathematics, science, and engineering
b. an ability to design and conduct experiments, as well as to analyze and interpret data
c. an ability to design a system, component, or process to meet desired needs
d. an ability to function on multi-disciplinary teams
e. an ability to identify, formulate, and solve engineering problems
f. an understanding of professional and ethical responsibility
g. an ability to communicate effectively
h. the broad education necessary to understand the impact of engineering solutions in a
global and societal context
i. a recognition of the need for, and an ability to engage in life-long learning
j. a knowledge of contemporary issues
k. an ability to use the techniques, skills, and modern engineering tools necessary for
engineering practice.
2. For this semester, identify which learning outcome(s) was (were) the focus of
assessment.
The number of remaining BSGE students is insufficient for conducting any
meaningful assessment. Moreover, these students are in the pipeline for graduation
under the discontinued curriculum.
3. For each learning outcome identified in item 2, what direct (performance measures)
and indirect (e.g. surveys) information/data were collected, how much, and by whom?
All other BS engineering programs in the College of Engineering are ABET
accredited. The assessment tools and plan for BSGE will be similar to those used by
the ABET accredited programs. Examples of assessment tools are: surveys of
students, alumni, and employers; course journals; exit exams; senior project
presentations; etc.
4. How and when were the data summarized, analyzed and discussed? (e.g. faculty
retreat, assessment/curriculum committee, report to the faculty).
Examples of data analysis and evaluation activities are assessment committee
meetings and faculty retreats.
5. What findings emerged from departmental analysis of the information/data collected?
NA.
6. What actions were taken to address the findings? (e.g. curricular revision,
pedagogical changes, student support services, resource management.)
NA.
7. Describe plans to collect data on additional learning outcomes next semester.
Include a calendar for collecting data on all SLO's by end of Spring 07.
As mentioned above, the plan for the BSGE program is to develop a multidisciplinary
curriculum in Fall 2006. Learning objectives and an assessment plan will be
developed at that time. Some course-level assessment can be done starting Spring
2007. A complete program-level assessment will have to wait until students have
graduated under the new curriculum.
Industrial and Systems Engineering BS Program
Outcome Assessment Summary
1. URL to current set of student learning outcomes on the UG studies web site.
http://www.engr.sjsu.edu/ise/dept_files/iseobjs_flyer.pdf
2. For this semester, identify which learning outcome(s) was (were) the focus of
assessment.
All learning outcomes and program educational objectives were assessed and
documented in a self-study report. This report documents the learning objective,
outcome, assessment process, and all assessment and enhancement activities up to
Summer 2005. The outcomes and objectives are listed in Table 1 on the following
page.
3. For each learning outcome identified in item 2, what direct (performance measures)
and indirect (e.g. surveys) information/data were collected, how much, and by whom?
All learning outcomes were assessed and documented in a self-study report. This
report documents the learning objective, outcome, assessment process, and all
assessment and enhancement activities up to Summer 2005. Table 3 shows the
instrument of assessment for each objective. Table 4 shows the instrument of
assessment for the outcomes and the assessment process cycle.
4. How and when were the data summarized, analyzed and discussed? (e.g. faculty
retreat, assessment/curriculum committee, report to the faculty).
The self-study report has an extensive record of activities associated with
outcome assessment and program enhancement. Typically, assessment data are
analyzed off-line by designated faculty members, and the improvement actions are
discussed in industry advisory board meetings and at faculty retreats and meetings.
Table 1 – Program Outcomes and Relationship to Program Educational Objectives
[Table: maps each ISE program outcome (letter) to one or more program educational objectives (number). The outcome descriptions are:]
a. Graduates have an ability to apply knowledge of mathematics, science and industrial
and systems engineering.
b. Graduates have an ability to design and conduct experiments, as well as to analyze
and interpret data.
c. Graduates have an ability to design a system, component, or process to meet desired
needs within realistic constraints such as economic, environmental, social, political,
ethical, health and safety, manufacturability, and sustainability.
d. Graduates have an ability to function on multidisciplinary teams.
e. Graduates have an ability to identify, formulate and solve engineering problems.
f. Graduates have an understanding of professional and ethical responsibility.
g. Graduates have an ability to communicate effectively.
h. Graduates have the broad education necessary to understand the impact of
engineering solutions in a global, economic, environmental, and societal context.
i. Graduates have a recognition of the need for, and an ability to engage in, life-long
learning.
j. Graduates have the knowledge of contemporary issues.
k. Graduates have an ability to use the techniques, skills, and modern industrial and
systems engineering tools necessary for industrial and systems engineering practice.
The educational objective elements cited in the table are: describe the problem and
recognize its difficulty; collect and analyze appropriate data; develop and evaluate
potential solutions to the problem; participate effectively within an organization, be able
to identify problems in the ISE domain, and provide leadership as an ISE professional;
present results effectively; and be prepared to engage in life-long learning and continued
development in the field.
Table 2 – Educational Objectives
1. To prepare students to function effectively as an ISE professional in any industry,
government, or service organization designing or improving and implementing
efficient business processes.
2. To provide students with the ability to use methodologies and computational skills to
identify, formulate, and develop solutions for problems normally encountered in their
organizations.
3. To train students to collect, analyze, and interpret data efficiently and effectively to
solve systems analysis and engineering problems.
4. To prepare students to evaluate the impact of their proposed solutions to engineering
problems in the broader context of the organization or society.
5. To prepare students to effectively communicate using written, oral and electronic
media to articulate technical problems and their proposed solutions.
6. To help students recognize the need for life-long learning and growth within their
chosen profession and to be familiar with the strategies they may employ to
accomplish this.
Table 3 – Evaluation instruments for the objectives above include:
∙ Data from Career Center
∙ Alumni Survey
∙ Senior Student Exit Survey
∙ Input from Department Advisory Committee
∙ Ad-Hoc industrial review committees
∙ Number of students attending graduate school
∙ Student Evaluations of Faculty
∙ Performance in engineering projects
∙ Performance on design projects
∙ Internship Supervisor Questionnaires and Evaluation Letters
∙ Performance in communication
∙ Performance in engineering courses
∙ Number of alumni participating in professional activities (student clubs, professional
meetings, and student competitions)
∙ Performance of alumni on team projects
5. What findings emerged from departmental analysis of the information/data collected?
The findings are documented in the self-study report. The following table is an
index for some of the assessment results. Note that the figure and table
numbers refer to the ABET report and are not included in this summary report.
[Table: index of assessment data collection, showing each form/tool, when it is administered, what it provides, and the corresponding figures and tables in the ABET report.]
∙ Junior Outcomes Survey (same as Senior): administered at the end of the ISE 102
class, at the start of the ISE program; provides baselines for cohort analysis. (Form:
Figure 15; results: Tables 14-17, Figures 24-29.)
∙ Class Outcomes Survey: administered at the end of each ISE course; provides
information on how the class is perceived to meet outcomes. (Form: Figure 13; results:
Table 11, Figures 18-21.)
∙ Class Topics Survey: administered at the end of each ISE course; provides the
appropriateness and an estimate of the percentage learned about each topic. (Form:
Figure 14; results: Tables 12-13, Figures 22-23.)
∙ Teamwork Survey: administered at the time each team project report is submitted
during the course; nine items regarding self and each team member for each class team
project, feeding a database for teamwork feedback and outcome assessment. (Form:
Figure 16; results: Table 18, Figures 30-33.)
∙ Senior Exit Exam: conducted at the end of the ISE 19B Capstone Project class,
approximately 3 weeks prior to the end of the course; direct assessment of learning about
key concepts, with each question mapped to one or more program outcomes. (Form:
Table 10, Appendix III-H; results: Tables 19-21, Figure 34.)
∙ Capstone Project Outcome Assessment: conducted after the final oral presentation of
the Capstone Project report; review of the technical content of the Project Report relative
to Program Outcomes. (Form: Figure 17; preliminary results available at the visit.)
∙ Senior Outcomes Survey (same as Junior): conducted at the end of the ISE 19B
Capstone Project class; outgoing opinion of the appropriateness and learning relative to
each outcome, for cohort studies. (Form: Figure 15; results: Tables 14-17, Figures 24-29;
Figures 34-35, Appendix III-M.)
∙ DAC Exit Questionnaire: wide-ranging overview of courses, labs, faculty, staff, etc.
(Appendix III-I.)
6. What actions were taken to address the findings? (e.g. curricular revision,
pedagogical changes, student support services, resource management.)
The following figure shows the number of course modifications affecting
program outcomes in the current review period. A detailed description of these
course changes and other corrective or improvement actions is documented in
the ABET self-study report.
Semester Review: the figure tabulates, for each semester from Fall 1999 (F99) through Fall 2004 (F04), the number of course modifications affecting each of the following outcomes. [The per-semester counts did not survive extraction; they appear in the ABET self-study report.]
Outcome (a): Ability to apply knowledge of mathematics, science, and engineering.
Outcome (b): Ability to design and conduct experiments, as well as to analyze and interpret data.
Outcome (c): Ability to design a system, component, or process to meet desired needs.
Outcome (d): Ability to function on multi-disciplinary teams.
Outcome (e): Ability to identify, formulate, and solve engineering problems.
Outcome (f): Understanding of professional and ethical responsibility.
Outcome (g): Ability to communicate effectively.
Outcome (h): Understanding of the impact of engineering solutions in a global/societal context.
Outcome (i): Recognition of the need for, and an ability to engage in, life-long learning.
Outcome (j): Knowledge of contemporary issues.
Outcome (k): Ability to use the techniques, skills, and modern engineering tools necessary for engineering practice.
7.
Describe plans to collect data on additional learning outcomes next semester.
Include a calendar for collecting data on all SLOs by the end of Spring 07.
The assessment and improvement plan, including the calendar of activities, is
documented in the self-study report; the plan is designed to coincide with
ABET's six-year accreditation cycle.
Mechanical Engineering Program (BSME) Outcome
Assessment Summary
1.
URL to current set of student learning outcomes on the UG studies web site.
http://www.engr.sjsu.edu/mae/mae%20mission.php
2.
For this semester, identify which learning outcome(s) was (were) the focus of
assessment.
Outcomes are assessed on a regular basis according to our assessment plan (see
question 7). All outcomes were assessed and documented in a report for our recent
ABET accreditation visit (Fall 2005). These learning outcomes are as follows:
ME graduates are expected to have:
a.
An ability to apply knowledge of mathematics, science, and engineering to
solve mechanical engineering problems.
b.
An ability to design and conduct experiments, as well as to analyze and
interpret data.
c.
An ability to design a system, component, or process to meet desired
needs within realistic constraints such as economic, environmental, social,
political, ethical, health and safety, manufacturability and sustainability.
d.
An ability to function on multi-disciplinary teams.
e.
An ability to identify, formulate, and solve mechanical engineering
problems.
f.
An understanding of professional and ethical responsibility.
g.
An ability to communicate effectively.
h.
The broad education necessary to understand the impact of engineering
solutions in a global and societal context.
i.
A recognition of the need for, and an ability to engage in life-long
learning.
j.
A knowledge of contemporary issues.
k.
An ability to use the techniques, skills, and modern engineering tools
necessary for engineering practice.
All learning outcomes were assessed and documented in a self-study report submitted
to ABET in June 2005. The report can be found at:
http://www.engr.sjsu.edu/nikos/abet/aematrix.htm
3.
For each learning outcome, identified in item 2, what direct (performance
measures) and indirect (e.g. surveys) information/data were collected, how much,
and by whom?
Each outcome (a – k) is addressed in several courses of the ME curriculum. A
subset of these courses was chosen for a thorough assessment of each outcome, as
shown in the table below. With the exception of the three capstone courses
(ME157, ME182, ME190), this subset consists solely of required courses taken by
all ME students. E10 is the only lower division course included in this set.
Lower division courses typically prepare students at skill levels 1 or 2, while
upper division courses prepare students at skill levels 3, 4, 5 or 6 of Bloom’s
Taxonomy in the particular outcomes they address.
ME Program – Outcome Matrix
Courses assessed against outcomes 3a-3k: E10, E100W, ME101, ME106, ME111, ME113, ME114, ME120, ME147, ME154, ME157, ME182, and ME195A/B.
A = skill level 1 or 2 in Bloom's Taxonomy; B = skill level 3 or 4; C = skill level 5 or 6; a bullet marks skills that are relevant but not presently assessed.
[The individual cell entries of the matrix did not survive extraction; the full matrix appears in the ABET self-study report.]
The following flow chart describes the assessment process:
1. ABET and the program faculty define the Program Outcomes.
2. Outcome champions break down each outcome into elements and define outcome attributes for each element.
3. The program faculty define outcome indicators and performance targets, identify the courses that address each outcome, and select the courses to be assessed for it.
4. Outcome champions write student survey questions for each outcome based on its attributes; course coordinators generate a student survey that includes questions for each of the outcomes addressed in their course, and administer it.
5. Course coordinators collect and organize material from each of the selected courses (syllabus, student work, assignment/test scores for each student) and analyze the data.
6. If the performance targets are met, the outcome is satisfied; if not, outcome champions and course coordinators recommend and implement curriculum improvements as needed, and the cycle repeats.
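The decision loop in the flow chart can be sketched in a few lines of code. This is an illustrative model only, not part of the MAE assessment plan; the course names, the 70-point passing score, and the 70% performance target are invented for the example.

```python
# Hypothetical sketch of the assessment loop: for one program outcome,
# check each selected course's student scores against a performance
# target; any course that misses the target triggers a curriculum
# improvement recommendation. The passing score (70) and target share
# (70%) are assumptions made for this example.

def outcome_satisfied(scores, passing=70, target=0.70):
    """True if the share of students at or above the passing score
    meets the performance target."""
    return sum(1 for s in scores if s >= passing) / len(scores) >= target

def assess(course_scores, passing=70, target=0.70):
    """course_scores maps course name -> student scores for one outcome.
    Returns the list of courses needing improvement (an empty list
    means the outcome is satisfied)."""
    return [course for course, scores in course_scores.items()
            if not outcome_satisfied(scores, passing, target)]
```

For instance, `assess({"ME111": [82, 75, 91], "ME113": [60, 65, 88]})` flags ME113, since only one of its three scores reaches 70.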
4.
How and when were the data summarized, analyzed and discussed? (e.g. faculty
retreat, assessment/curriculum committee, report to the faculty).
The data was collected, summarized, and analyzed from Fall 2002 through Spring
2005. The findings were presented to the MAE faculty by the Assessment
Coordinator and discussed at several department meetings. They are also
presented and discussed in the ABET Self-Study Report, which can be found at:
http://www.engr.sjsu.edu/nikos/abet/aematrix.htm
5.
What findings emerged from departmental analysis of the information/data
collected?
Based on the data collected and analyzed, the BSME Program satisfies all of the
outcomes. More details on the findings regarding each outcome can be found in
the ABET Self-Study Report at http://www.engr.sjsu.edu/nikos/abet/aematrix.htm
6.
What actions were taken to address the findings? (e.g. curricular revision,
pedagogical changes, student support services, resource management.)
As a result of our outcomes assessment, several improvements have been
implemented to ensure that ME students acquire the highest possible level of the
skills defined under each outcome. These improvements are listed in the
following table:
Curriculum improvements (with the program outcomes they address) and the courses in which they have been implemented:
- Students design the experiments they will perform in the various laboratories (PO 3b): ME113 Thermodynamics, ME114 Heat Transfer, ME120 Experimental Methods.
- Students are taught team skills and are required to formally assess the performance of their teammates using specific criteria (PO 3d): ME120 Experimental Methods, ME195A&B Senior Design Project.
- Students identify, formulate, and solve open-ended problems (PO 3a, 3e): ME111 Fluid Mechanics, ME113 Thermodynamics, ME114 Heat Transfer.
- Students research, present, and discuss in class contemporary engineering applications and their impact in a global and societal context (PO 3h, 3j): ME111 Fluid Mechanics, ME113 Thermodynamics, ME114 Heat Transfer.
7.
Describe plans to collect data on additional learning outcomes next semester.
Include a calendar for collecting data on all SLOs by the end of Spring 2007.
Timetable for Program Outcomes Assessment
The timetable lays out, semester by semester from Fall 2005 through Spring 2010, which of outcomes 3a-3k are assessed, together with the semesters in which self-study reports are due and the ABET visit occurs. [The individual schedule marks did not survive extraction.]
ABET requires that each program have in place a plan for continuous assessment and
improvement. The MAE faculty has decided on the timetable shown above for assessing the 11
Program Outcomes. According to this table, outcomes 3a, 3f (Fall 2005), 3b and 3g (Spring 2006)
will be assessed in AY 2005-2006.
Aerospace Engineering Program (BSAE) Outcome
Assessment Summary
1.
URL to current set of student learning outcomes on the UG studies web site.
http://www.engr.sjsu.edu/mae/mae%20mission.php
2.
For this semester, identify which learning outcome(s) was (were) the focus of
assessment.
Outcomes are assessed on a regular basis according to our assessment plan (see
question 7). All outcomes were accessed and documented in a report for our
recent ABET accreditation visit (Fall 2005). These learning outcomes are as
follows:
AE graduates are expected to have:
a. An ability to apply knowledge of mathematics, science, and engineering to
solve aerospace engineering problems.
b. An ability to design and conduct experiments, as well as to analyze and
interpret data.
c. An ability to design a system, component, or process to meet desired
needs within realistic constraints such as economic, environmental, social,
political, ethical, health and safety, manufacturability and sustainability.
d. An ability to function on multi-disciplinary teams.
e. An ability to identify, formulate, and solve aerospace engineering
problems.
f. An understanding of professional and ethical responsibility.
g. An ability to communicate effectively.
h. The broad education necessary to understand the impact of engineering
solutions in a global and societal context.
i. A recognition of the need for, and an ability to engage in life-long
learning.
j. A knowledge of contemporary issues.
k. An ability to use the techniques, skills, and modern engineering tools
necessary for engineering practice.
All learning outcomes were assessed and documented in a self-study report submitted
to ABET in June 2005. The report can be found at:
http://www.engr.sjsu.edu/nikos/abet/aematrix.htm
3.
For each learning outcome, identified in item 2, what direct (performance
measures) and indirect (e.g. surveys) information/data were collected, how much,
and by whom?
Each outcome (a – k) is addressed in several courses of the AE curriculum. A
subset of these courses was chosen for a thorough assessment of each outcome, as
shown in the table below. This subset consists of ten (10) required courses and
one elective (ME114). E10 is the only lower division course included in this set.
Lower division courses typically prepare students at skill levels 1 or 2, while
upper division courses prepare students at skill levels 3, 4, 5 or 6 of Bloom’s
Taxonomy in the particular outcomes they address.
AE Program – Outcome Matrix
Courses assessed against outcomes 3a-3k: E10, ME101, ME111, ME113, ME120, AE162, AE164, AE167, AE170A/B, and the elective ME114.
A = skill level 1 or 2 in Bloom's Taxonomy; B = skill level 3 or 4; C = skill level 5 or 6; a bullet marks skills that are relevant but not presently assessed.
[The individual cell entries of the matrix did not survive extraction; the full matrix appears in the ABET self-study report.]
The following flow chart describes the assessment process:
1. ABET and the program faculty define the Program Outcomes.
2. Outcome champions break down each outcome into elements and define outcome attributes for each element.
3. The program faculty define outcome indicators and performance targets, identify the courses that address each outcome, and select the courses to be assessed for it.
4. Outcome champions write student survey questions for each outcome based on its attributes; course coordinators generate a student survey that includes questions for each of the outcomes addressed in their course, and administer it.
5. Course coordinators collect and organize material from each of the selected courses (syllabus, student work, assignment/test scores for each student) and analyze the data.
6. If the performance targets are met, the outcome is satisfied; if not, outcome champions and course coordinators recommend and implement curriculum improvements as needed, and the cycle repeats.
4.
How and when were the data summarized, analyzed and discussed? (e.g. faculty
retreat, assessment/curriculum committee, report to the faculty).
The data was collected, summarized, and analyzed from Fall 2002 through Spring
2005. The findings were presented to the MAE faculty by the Assessment
Coordinator and discussed at several department meetings. They are also
presented and discussed in the ABET Self-Study Report, which can be found at:
http://www.engr.sjsu.edu/nikos/abet/aematrix.htm
5.
What findings emerged from departmental analysis of the information/data
collected?
Based on the data collected and analyzed, the BSAE Program satisfies all of the
outcomes. More details on the findings regarding each outcome can be found in
the ABET Self-Study Report at http://www.engr.sjsu.edu/nikos/abet/aematrix.htm
6.
What actions were taken to address the findings? (e.g. curricular revision,
pedagogical changes, student support services, resource management.)
As a result of our outcomes assessment, several improvements have been
implemented to ensure that AE students acquire the highest possible level of the
skills defined under each outcome. These improvements are listed in the
following table:
Curriculum improvements (with the program outcomes they address) and the courses in which they have been implemented:
- Students design the experiments they will perform in the various laboratories (PO 3b): ME113 Thermodynamics, ME114 Heat Transfer, ME120 Experimental Methods.
- Students discuss economic, environmental, social, political, ethical, safety, liability, and manufacturability constraints in their design of aircraft/spacecraft (PO 3c): AE162 Aerodynamics, AE164 Compressible Flow, AE170A&B Aircraft/Spacecraft Design.
- Students are taught team skills and are required to formally assess the performance of their teammates using specific criteria (PO 3d): ME120 Experimental Methods, AE162 Aerodynamics, AE164 Compressible Flow, AE170A&B Aircraft/Spacecraft Design.
- Students identify, formulate, and solve open-ended problems, some of which involve integration of material from two or more courses (PO 3a, 3e): ME111 Fluid Mechanics, ME113 Thermodynamics, ME114 Heat Transfer, AE162 Aerodynamics, AE164 Compressible Flow, AE165 Flight Mechanics, AE167 Aerospace Propulsion, AE170A&B Aircraft/Spacecraft Design.
- Students research, present, and discuss in class safety, ethics, and liability issues in AE (PO 3f): [course list not recoverable from the extraction].
- Students research, present, and discuss in class contemporary engineering applications and their impact in a global and societal context (PO 3h, 3j): ME111 Fluid Mechanics, ME113 Thermodynamics, ME114 Heat Transfer, AE162 Aerodynamics, AE164 Compressible Flow, AE165 Flight Mechanics, AE167 Aerospace Propulsion.
7.
Describe plans to collect data on additional learning outcomes next semester. Include a
calendar for collecting data on all SLOs by the end of Spring 2007.
ABET requires each program to have in place a plan for continuous assessment and
improvement. The MAE faculty has decided on the following timetable for assessing the 11
Program Outcomes:
Timetable for Program Outcomes Assessment
The timetable lays out, semester by semester from Fall 2005 through Spring 2010, which of outcomes 3a-3k are assessed, together with the semesters in which self-study reports are due and the ABET visit occurs. [The individual schedule marks did not survive extraction.]
According to this table, outcomes 3a, 3f (Fall 2005), 3b and 3g (Spring 2006) will be
assessed in AY 2005-2006.
SECTION (B)
Graduate Programs
_____________________________________________________
Chemical & Materials Engineering MS Program Outcome Assessment
Summary
1.
For this semester, identify which Student Learning Outcome(s) was (were) the focus of
assessment.
For the Spring 2006 semester the following outcomes were assessed:
PO2: Be able to evaluate the impact of their work on society.
PO3: Be able to solve complex engineering problems and tasks, and use engineering and science principles to justify recommendations.
PO4: Develop life-long learning skills and be able to apply their engineering knowledge to critically evaluate pertinent literature and new technologies or systems.
PO5: Effectively communicate problems and solutions and develop resolutions.
PO6: Be cognizant of the ethical, economic and environmental implications of their work, as appropriate.
PO8: Deliver effective presentations of engineering activities in written and oral formats.
PO9: Ability to do professional work that would be of the quality that is acceptable for publication in a peer-reviewed journal.
PO10: Ability to use statistical design and analysis techniques in experimental work.
2.
For each Student Learning Outcome, identified in item 1, what direct (performance
measures) and indirect (e.g. surveys) information/data were collected, how much, and
by whom?
Outcomes 2, 3, 4, 5, 6, 8, and 10 are assessed by graduate research committees on two separate
occasions. Each student is individually assessed as part of their proposal defense in ChE/MatE 281
and again at the time of their final project/thesis defense. Program outcome 9 is assessed at the time
of the final thesis/project defense.
The committee, at minimum, consists of an SJSU faculty advisor and two reading committee
members. The faculty advisor must be a full-time tenure/tenure-track CME faculty member. The
reading committee members can be SJSU faculty or industrial collaborators with expertise in the
student's area of research. Two of the members must be SJSU faculty (regular or part-time).
The committee members were asked to evaluate the level at which the students achieved the
objectives listed in Table 1. The values listed in parentheses indicate the outcome(s) that each
objective addresses. Meeting the objective implies that the corresponding outcome is being met.
Table 1: Course Objectives
1. Delivered professional written report (PO5, PO8)
2. Delivered professional oral presentation (PO5, PO8)
3. Show how project relates to work in literature (PO2, PO4)
4. Defend procedures/results based on engineering principles (PO3, PO5)
5. Adequate statistical methods (PO10)
6. Aware of ethical/environmental impact (PO6)
7. Quality of thesis/project work is acceptable for publication in a peer-reviewed journal (PO9)
The committee rated each objective on the scale listed in Table 2.
Table 2: Rating Scale
1 = Unacceptable
2 = Weak (minor deficiencies)
3 = Acceptable
4 = Very good
5 = Excellent
During the academic year 2005/2006 the graduate program committee modified the survey given
during the ChE/MatE 281 proposal defense. The number of questions was reduced from seven to
five and a more detailed description of the rating scale was provided with the survey. These
changes were made to streamline the survey process and to improve the uniformity of the
assessments between each thesis/project committee. These changes were implemented during the
Spring 2006 semester. A rating of 3 on the new survey still indicates that the student is meeting the
objective.
Objectives are evaluated at the proposal defense and the final defense to measure the impact of the
research experience on the quality of the student’s education. If an objective is rated at less than a 3
at the final defense, the advisor does not sign off on the project or thesis until that objective is met
to the advisor’s satisfaction.
Between Fall 2004 and Spring 2006, 60 ChE/MatE 281 proposal defenses and 20 Project/Thesis
defenses were assessed.
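The rating workflow described above lends itself to a small summary routine. The sketch below is illustrative only: the objective names and ratings are invented, and the rule that an objective is "met" only when no committee member rates it below 3 is an assumption modeled on the sign-off policy described above.

```python
# Illustrative summary of committee ratings on the 1-5 scale of Table 2.
# An objective is flagged as not met if any committee member rates it
# below 3 ("Acceptable"), mirroring the advisor sign-off rule; this
# interpretation is an assumption for the example.

def summarize_ratings(ratings_by_objective, threshold=3):
    """ratings_by_objective maps objective -> list of committee ratings.
    Returns per-objective average/min/max and a 'met' flag."""
    summary = {}
    for objective, ratings in ratings_by_objective.items():
        summary[objective] = {
            "average": round(sum(ratings) / len(ratings), 2),
            "min": min(ratings),
            "max": max(ratings),
            "met": min(ratings) >= threshold,
        }
    return summary
```

For example, an objective rated [2, 3, 4] would be flagged as not met, since one rating falls below the "Acceptable" threshold.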
3.
How and when were the data summarized, analyzed and discussed? (e.g. faculty
retreat, assessment/curriculum committee, report to the faculty).
The data was presented at the Summer 2006 CME retreat. The raw data and a statistical summary
of the assessment were presented to the faculty during the retreat.
4.
What findings emerged from the departmental analysis of the information/data
collected?
Tables 3 and 4 summarize the results of the ChE/MatE 281 proposal assessment. The S2006
survey incorporated the question about statistical methods evaluation into the question about the
student's ability to defend their procedures on engineering principles. This combination eliminated
some of the difficulties of assessing a research project in which statistical analysis of the data is not
meaningful or practical. The question related to economic/environmental and ethical impact was
combined with the question related to the overall significance of the research. The graduate
committee felt that these two questions were somewhat redundant.
Table 5 summarizes the Final Thesis/Project defense assessment. This survey has not yet been
modified to reflect the changes made to the new S2006 proposal survey. The modification will be
done during the 2006/2007 academic year.
It should be noted that the 60 students evaluated in the ChE/MatE 281 course were not the same set
of students evaluated at the Final Project/Thesis Defense. There is some overlap between the two
groups, but a student-by-student analysis has not yet been done.
Table 3: ChE/MatE 281 Proposals Fall 2003 to Fall 2005
Table 4: ChE/MatE 281 Proposals Spring 2006
Table 5: ChE/MatE Final Thesis/Project Defense
[Tables 3-5 report the average, minimum, and maximum committee ratings for the objectives assessed: delivered professional written report (PO8); delivered professional oral presentation (PO8); show how the project relates to work in the literature (PO2, PO4); and defend procedures/results based on engineering principles (PO3, PO5). The numeric entries did not survive extraction.]
The data indicate that the project/thesis research experience is improving the students' ability to
meet the student learning outcomes. Data from the proposal defenses indicated that more than one
program outcome was not being met. By the time of the final defense, only one student learning
outcome was not met, and that was a data point from a single student. That student was required to
correct the deficiency before the advisor accepted the thesis/project.
5.
What actions were taken to address the findings? (e.g. curricular revision, pedagogical
changes, student support services, resource management.)
The assessment results indicate that the programs are meeting student learning outcomes 2, 3, 4, 5,
6, 8, 9 and 10. Additional data will be collected every semester to continually track the
performance of students completing the ChE/MatE281 course and students that successfully defend
their project/thesis.
6. Describe plans to collect data on additional learning outcomes next semester.
ChE/MatE 281 proposals and the final defenses will continue to be assessed for outcomes 2, 3, 4, 5,
6, 8, 9 and 10. The graduate committee will review the Final Thesis/Project defense survey and
modify the survey to correlate the data better with the data collected from the ChE/MatE 281
proposal defense survey data.
Additionally, a database will be created to track the following information from our current
graduate students:
o Conference presentations
o Publication of technical papers
o Awards
o Professional services (office bearer in professional societies)
o Community services
This student activities database will be used to assess program outcomes 4 and 9.
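As a sketch of what such a tracking database might hold, the record below mirrors the activity categories listed above; the field names and the mapping of categories to outcomes 4 and 9 are assumptions made for illustration, not the department's actual design.

```python
# Hypothetical record for the planned student-activities database.
# Categories follow the bulleted list above; linking presentations and
# professional services to PO4 and publications to PO9 is an
# illustrative choice, not the department's stated mapping.

from dataclasses import dataclass, field

@dataclass
class StudentActivities:
    name: str
    conference_presentations: list = field(default_factory=list)
    publications: list = field(default_factory=list)
    awards: list = field(default_factory=list)
    professional_services: list = field(default_factory=list)
    community_services: list = field(default_factory=list)

    def outcome_evidence(self):
        """Count of activities usable as evidence for PO4 and PO9."""
        return {
            "PO4": len(self.conference_presentations) + len(self.professional_services),
            "PO9": len(self.publications),
        }
```

A record with one publication and one conference talk would then contribute one piece of evidence toward each of PO4 and PO9.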
Finally, an alumni survey will be developed to collect information from our graduate alumni related
to program outcomes 1, 4, and 9. The table below shows the schedule of activities to be completed
by Spring 2007.
Table 5: Schedule of Activities
- ChE/MatE 281 and Defense Survey (every semester, F2005 through S2007): outcomes 2, 3, 4, 5, 6, 8, 9, 10
- Alumni Survey: outcomes 1, 4, 9
- Graduate Student Achievement Database: outcomes 4, 9
7. Did your analysis result in revisiting/revising the Student Learning Outcomes? If the
answer was yes, please explain and submit an updated version of the Student Learning
Outcomes.
This analysis did not require a modification of the existing student learning outcomes. The student
learning outcomes are listed below.
Student Learning Outcomes: Chemical and Materials Engineering
PO1: Become a team leader that is capable of working with various disciplines of engineering,
science and business.
PO2: Be able to evaluate the impact of their work on society.
PO3: Be able to solve complex engineering problems and tasks, and use engineering and science
principles to justify recommendations.
PO4: Develop life-long learning skills and be able to apply their engineering knowledge to
critically evaluate pertinent literature and new technologies or systems.
PO5: Effectively communicate problems and solutions and develop resolutions.
PO6: Be cognizant of the ethical, economic and environmental implications of their work, as
appropriate.
PO7: Be able to adapt and apply their engineering education to a variety of career paths.
PO8: Deliver effective presentations of engineering activities in written and oral formats.
PO9: Ability to do professional work that would be of the quality that is acceptable for publication
in a peer-reviewed journal.
PO10: Ability to use statistical design and analysis techniques in experimental work.
Computer Engineering MS Program Outcome Assessment
Summary
1. For this semester, identify which learning outcome(s) was (were) the focus of
assessment. Please attach a copy of the department/program goals and Student
Learning Objectives.
For Spring 2006, learning outcomes 1, 2, 3, 4 and 6 were the focus of assessment.
Learning outcomes are
(1) Be able to demonstrate an understanding of advanced knowledge of the practice of computer
or software engineering, from vision to analysis, design, validation and deployment.
(2) Be able to tackle complex engineering problems and tasks, using contemporary engineering
principles, methodologies and tools.
(3) Be able to demonstrate leadership and the ability to participate in teamwork in an
environment with different disciplines of engineering, science and business.
(4) Be aware of ethical, economic and environmental implications of their work, as appropriate.
(5) Be able to advance successfully in the engineering profession, and sustain a process of lifelong learning in engineering or other professional areas.
(6) Be able to communicate effectively, in both oral and written forms.
2. For each learning outcome, identified in item 1, what direct (performance measures)
and indirect (e.g. surveys) information/data were collected, how much, and by whom
this Spring 06?
Learning Outcome Coverage (Spring 2006)
- CMPE 200, CMPE 220, and CMPE 295A/B together cover outcomes 1, 2, 3, 4, and 6. [The per-course check marks did not survive extraction.]
Data Collection Mechanism
- CMPE 200: midterm exam, final exam, team project
- CMPE 220: midterm exam, final exam, team project
- CMPE 295A/B: master project report, project presentation, exit survey
All data were collected by course instructors.
3. For data collected Fall 05, how and when were the data summarized, analyzed and
discussed? (e.g. faculty retreat, assessment/curriculum committee, report to the
faculty).
In Fall 2005, CMPE 295A/B was assessed for graduate programs. During the department
retreat on 2/17/06, the data were summarized, analyzed and discussed.
4. What findings emerged from departmental analysis of the information/data collected
in Fall 05?
Course learning objective B.2 (the ability to communicate in the form of an oral project presentation
and a written project final report) was identified for enhancement.
5. What actions are planned to address the findings? (e.g. curricular revision,
pedagogical changes, student support services, resource management.)
Two action items were planned to be implemented in Fall 2006:
o Increase the number of student presentations in CMPE 295A
o Coordinate the course delivery processes and project assignments between CMPE 295A and
the writing-skills courses (ENGR 200W and CMPE 294).
6. Describe plans to collect data on additional learning outcomes next semester.
Learning Outcome Coverage (Fall 2006)
- CMPE 200, CMPE 220, CMPE 294, and CMPE 295A/B together cover outcomes 1 through 6. [The per-course check marks did not survive extraction.]
From Fall 2006, CMPE 294 data will be collected to cover Outcome 5.
7. Did your analysis result in revisiting/revising the student learning outcomes? If the
answer was yes, please explain and submit an updated version of the Student
Learning Outcomes.
The student learning outcomes were reviewed during the department retreat (2/17/06) and
Department Advisory Council meeting (5/5/06). No revisions were recommended.
Electrical Engineering MS Program Outcome Assessment
Summary
1. For this semester, identify which learning outcome(s) was (were) the focus of assessment.
Please attach a copy of the department/program goals and Student Learning Objectives.
This semester the focus of assessment was the following four Program Outcomes (POs):
1) Students will be able to base analysis, problem solving and design on core advanced EE
theory.
2) Students will be able to develop deeper understanding of an area of concentration in their
graduate programs.
3) Students will be able to apply modern tools for computations, simulations, analysis, and
design.
4) Students will be able to communicate engineering results effectively.
2. For each learning outcome, identified in item 1, what direct (performance measures) and
indirect (e.g. surveys) information/data were collected, how much, and by whom this
Spring 06?
A) For all four Program Outcomes listed in subsection 1, we collected data in 7 of our graduate
classes. This data was collected by the faculty members teaching the classes. A total of 141
students took part in the surveys. The results of the survey are presented in Appendix A.
B)
We developed 25 Course Learning Outcomes (CLOs) for one of our core courses, EE210.
These outcomes are listed in Appendix B. We collected data for these outcomes by
surveying the students taking EE210. A total of 35 students participated in the survey. The
results are listed in Appendix C.
3. For data collected Fall 05, how and when were the data summarized, analyzed and
discussed? (e.g. faculty retreat, assessment/curriculum committee, report to the faculty).
The POs and CLOs data were analyzed and discussed in the Department Graduate Committee
and in the Department Faculty Retreat on April 7, 2006. The survey data presentation to the
faculty was followed by a discussion of results. Recommendations to improve the program and
the core course EE210 were made.
4. What findings emerged from departmental analysis of the information/data collected in
Fall 05?
A summary of the student evaluations for the program outcomes (POs) indicates that all outcomes
are met adequately. Fewer than 10% of students disagree that PO #1, #2, and #4 are met by the
program. About 15% disagree that PO #3 is met by the program. Reasons for the lower
evaluation of PO #3 were discussed and determined to be insufficient lab support and
maintenance.
Course learning outcomes (CLOs) for the EE210 course as evaluated by the students indicate
that the course adequately meets all the outcomes. CLOs #21 and #23 were evaluated as only
minimally meeting the requirement. CLO #21 was judged to be an important outcome; CLO
#23, on the state-space technique, was considered relatively less important for the program.
5. What actions are planned to address the findings? (e.g. curricular revision, pedagogical
changes, student support services, resource management.)
It was recommended that lab support and better maintenance of the lab equipment be
considered when allocating department resources.
The EE210 course should place more emphasis on understanding different types of filters. This
is an important learning objective given the thrust of our program and, therefore, should
be adequately addressed.
6. Describe plans to collect data on additional learning outcomes next semester.
We plan to collect data for all POs again in Fall 2006. For course learning outcomes
(CLOs), we will collect data for EE221 and EE250 core courses during Fall 2006. We will
also collect data to evaluate project course EE297 in Fall 2006.
These data will be analyzed and discussed in Spring 2007. These evaluations will be
extended to include faculty, alumni and people from industry.
7. Did your analysis result in revisiting/revising the student learning outcomes? If the
answer was yes, please explain and submit an updated version of the Student Learning
Outcomes.
No.
Appendix A
PROGRAM OBJECTIVES (POs) SURVEY FOR MSEE STUDENTS
The following is a survey intended to determine how well this course is meeting the Outcome Objectives for the EE Program.
Please take a few moments to check your personal impressions and add evidence to support your evaluation. Please note this is
not an evaluation of the instructor, but it is an attempt to determine if the content of the course and the method of instruction
are supporting the EE Program Outcome Objectives.
COURSE NUMBER
INSTRUCTOR
SEMESTER AND YEAR OF COURSE
EE PROGRAM OUTCOME STATEMENT
1.0 I can base analysis, problem solving and
design on core advanced EE theory.
2.0 I am able to develop deeper
understanding of an area of concentration in the
graduate programs.
3.0 I am able to apply modern tools for
computations, simulations, analysis, and design.
4.0 I am able to communicate engineering
results effectively.
GRADUATE IMPRESSION (check one): Strongly Agree | Agree | Slightly Agree | Slightly Disagree | Disagree | Strongly Disagree | Not Applicable to This Degree Program
SURVEY DATE: FALL  SPRING  SUMMER  YEAR________
Example from professional activities to support rating,
connected to a specific EE course if possible.
Areas for improvement of the MSEE program:
1.0 What additional components would you
recommend adding to this course to better
prepare you for professional activities?
2.0 What changes would you recommend as to
the method of instruction that is used in this
course? (This can be in terms of general format:
lecture, seminar or lab or you can refer to
particular methods that in your opinion would
make this course more effective.)
3.0 What changes would you recommend as to
how this course is included in the program? (This
refers to common core, option core, or elective
status and possible prerequisite relationships.)
4.0 What new or related courses would you
recommend be added to the EE program to
meet developing needs?
5.0 Please add any additional comments that
would be relevant to improvement of the EE
program.
Thank you for taking the time to provide us with a response.
MSEE SURVEYS
FALL 2005
Summary of the survey data for POs
Total Respondents: 141 in 7 Courses (n = 25, 35, 28, 24, 10, 10, 9)

Response counts by course (Courses 1-7), with each category's overall percentage of the 141 respondents:

PO #1
  Strongly Agree       1   5  10   7   3   1   2   20.6%
  Agree               16  19  13  14   7   7   4   56.7%
  Slightly Agree       4   6   4   2   0   2   2   14.2%
  Slightly Disagree    0   2   1   0   0   0   0    2.1%
  Disagree             1   3   0   0   0   0   0    2.8%
  Strongly Disagree    0   0   0   0   0   0   1    0.7%
  Not Applicable       1   0   0   0   0   0   0    0.7%

PO #2
  Strongly Agree       3   4   6   9   3   1   2   19.9%
  Agree               15  16  18  12   7   7   3   55.3%
  Slightly Agree       5   6   3   3   0   2   3   15.6%
  Slightly Disagree    0   6   1   0   0   0   0    5.0%
  Disagree             1   3   0   0   0   0   0    2.8%
  Strongly Disagree    1   0   0   0   0   0   1    1.4%
  Not Applicable       0   0   0   0   0   0   0    0.0%

PO #3
  Strongly Agree       3   0   8   6   4   1   2   17.0%
  Agree               12  10  13  16   6   5   4   46.8%
  Slightly Agree       7   9   4   2   0   4   3   20.6%
  Slightly Disagree    1   5   2   0   0   0   0    5.7%
  Disagree             2   6   0   0   0   0   0    5.7%
  Strongly Disagree    0   3   1   0   0   0   0    2.8%
  Not Applicable       0   2   0   0   0   0   0    1.4%

PO #4
  Strongly Agree       2   3   8   6   3   1   2   17.7%
  Agree               12  16  15  16   4   6   2   50.4%
  Slightly Agree      10   7   4   2   2   3   4   22.7%
  Slightly Disagree    1   4   1   0   0   0   0    4.3%
  Disagree             0   4   0   0   0   0   0    2.8%
  Strongly Disagree    0   0   0   0   0   0   1    0.7%
  Not Applicable       0   1   0   0   0   0   0    0.7%
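The percentage column for each rating category is simply its total across the seven courses divided by the 141 respondents. A minimal Python sketch of that tally (the function name and list layout are ours, for illustration only):

```python
# Tally MSEE PO survey responses across courses and convert to percentages.
# The counts below are the Fall 2005 "Strongly Agree" responses for PO #1,
# one entry per surveyed course (7 courses, 141 respondents in total).
strongly_agree_po1 = [1, 5, 10, 7, 3, 1, 2]

TOTAL_RESPONDENTS = 141

def overall_percentage(counts, total=TOTAL_RESPONDENTS):
    """Percentage of all respondents who chose this category, to one decimal."""
    return round(100 * sum(counts) / total, 1)

print(overall_percentage(strongly_agree_po1))  # 20.6, matching the table
```

The same calculation reproduces every percentage in the table above (e.g., the PO #1 "Agree" counts 16, 19, 13, 14, 7, 7, 4 give 56.7).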
[Bar charts: response distribution (percentage vs. response category 1-7) for PO #1 and PO #2]
Appendix B
EE210 Course Evaluation Form
Section: ______ Instructor: ____________________
Fall 2005
To what extent did this course contribute to the achievement of the following knowledge, abilities, skills, and
educational goals?

Rating scale: Very significant / Significant / Moderate / Somewhat / Minimum contribution / No contribution / Not applicable

1. Ability to understand the definitions of linear, time-invariant, causal systems
2. Ability to analyze linear, time-invariant, causal systems in terms of the differential and difference equations with zero-state and zero-input responses
3. Ability to understand the concepts of impulse response and transfer functions in solving LTI problems
4. Ability to analyze linear, time-invariant systems in terms of convolution
5. Ability to understand the concepts of eigenvector and eigenvalue related to LTI systems
6. Ability to analyze linear, time-invariant systems in terms of eigenvectors and eigenvalues
7. Ability to understand the definition and usages of correlation functions
8. Ability to apply the usages of correlation functions in LTI systems for non-deterministic signals
9. Ability to understand the orthogonal properties related to the development of the Fourier Series
10. Ability to analyze linear, time-invariant systems in terms of the Fourier Series
11. Ability to understand the development of the Fourier Transform from the Fourier Series
12. Ability to analyze linear, time-invariant systems in terms of the Fourier Transform
13. Ability to find different forms of the Fourier Transform (FT, DTFT, DFT, FFT, etc.) for a signal
14. Ability to understand the development of the Laplace Transform from the Fourier Transform
15. Ability to analyze linear, time-invariant systems in terms of the Laplace Transform and its region of convergence
16. Ability to approximate solutions of an LTI problem based on pole-zero plots
17. Ability to define parameters for a sampling problem
18. Ability to understand the development of the Z Transform from the Laplace Transform, sampling, and the DTFT
19. Ability to analyze discrete-time, linear, time-invariant systems in terms of the Z-Transform
20. Ability to approximate solutions of a discrete-time problem based on the pole-zero plots of the Z-Transform
21. Ability to understand different types of passive and active, analog and digital filters (IIR and FIR)
22. Ability to design and analyze simple analog and digital filters
23. Ability to understand the concepts of the state-space variable technique
24. Ability to model continuous- and discrete-time systems by the state-space technique
25. Ability to solve state-space systems of equations
Your comments:
Appendix C
Summary of the CLOs for the EE210 Course
Fall 2005
Total Respondents: 35

CLO   Very Significant   Significant   Moderate   Somewhat/Minimum/None/NA
 1          17               13            5                  0
 2          16               13            4                  2
 3          16               15            4                  0
 4          14               16            2                  3
 5          14                7            9                  5
 6           8               11           10                  6
 7           8               11            7                  9
 8           7               12            7                  9
 9          11                9            5                 10
10          16               12            6                  1
11          17               14            1                  3
12          16               14            3                  2
13          16                9            7                  3
14          16               11            6                  2
15          15               17            2                  1
16          14               12            7                  2
17          11               10            5                  9
18          15               10            6                  4
19          15               12            6                  2
20          10               11           11                  3
21           4                7            6                 18
22           4                6            7                 18
23          12               13            6                  4
24          13                8            8                  6
25          14                9            9                  3
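As an illustration of how the Appendix C counts can be read, the sketch below (an illustrative calculation of ours, not the department's procedure) computes the share of respondents who rated a CLO "Significant" or better; CLO #21's low share is consistent with the finding that it was only minimally met:

```python
# Share of the 35 EE210 respondents rating a CLO "Very significant" or
# "Significant". A low share flags outcomes the course may be under-serving.
# Counts are taken from the Appendix C table; only three CLOs are shown here.
N = 35  # total respondents

clo_counts = {
    20: (10, 11),  # CLO #20: (Very Significant, Significant)
    21: (4, 7),    # CLO #21: filters outcome flagged in the findings
    22: (4, 6),    # CLO #22
}

for clo, (very_sig, sig) in sorted(clo_counts.items()):
    share = (very_sig + sig) / N
    print(f"CLO #{clo}: {share:.0%} rated Significant or higher")
```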
Industrial & Systems Engineering MS Program Outcome
Assessment Summary
1) For this semester, identify which learning outcome(s) was (were) the focus of
assessment. Please attach a copy of the department/program goals and Student
Learning Objectives.
All ISE graduate courses are offered only once a year. Our course assessments focus on the
following 9 learning outcomes for our MS courses. The student learning objectives vary by
course and are included in each course Green Sheet. (Note: ISE reviews all learning outcomes
by course each academic year; we conduct these reviews in a retreat-style setting, course by
course, across all outcomes.)
1. Ability to function effectively and provide leadership within an
organization.
2. Ability to form, facilitate, lead, coordinate and participate in teams
3. Understanding of organizational processes and behaviors
4. Knowledge of methodological and computational skills with which to
operate effectively
5. Ability to collect, analyze, and interpret data
6. Ability to approach unstructured problems and to synthesize and design
solutions for these problems
7. Ability to evaluate the impact of these solutions in the broader context
of the organization and society
8. Ability to effectively present and sell solutions in the form of written,
oral and electronic media
9. Ability to accomplish life-long growth within the field/profession of
ISE
Industrial & Systems Engineering Goals
The goal of the ISE program is to serve society with emphasis on the manufacturing and service
sectors by:
- providing undergraduate and graduate Industrial and Systems Engineering education
  that prepares students to effectively apply engineering knowledge to the evaluation,
  design, and operation of complex industrial, service, and governmental systems
  comprised of people, equipment, and supplies through the application of modeling,
  measurement, and economic methods.
- contributing to the enrichment of the profession and to the development of knowledge
  through faculty leadership, scholarship and professional practice.
- meeting the needs of working professionals for continuing education in the fields of
  operations research, advanced statistical methods, ergonomics and human factors,
  production planning and control, and related topics.
Specific components of the ISE program goals include:
- Basic Education
  "Graduate well-qualified industrial and systems engineering professionals with
  outstanding problem-formulation, problem-solving and managerial skills. Provide a
  strong basis for industrial and systems engineering education and establish a
  benchmark program in Silicon Valley."
- Professional Education
  "Prepare students, alumni, practitioners and managers for the practice of industrial
  and systems engineering for the solution of engineering and management problems."
- Research Education
  "Expand the frontiers of human knowledge, creating the educational base for the next
  generation of scholars. Provide a program in research education emphasizing quality
  in the preparation of students for careers in academe and research."
- Leadership in Education
  "Provide leadership in setting the direction, and in bringing about changes, in the
  field and in Silicon Valley."
In accordance with these goals, we have adopted the following descriptions of our perspectives
regarding the domain, approach and specialized knowledge of contemporary Industrial
Engineers:
- ISE Problem Domain
  ISE is a branch of engineering that engages in the study of how to describe, evaluate,
  design, modify, control, and improve the performance of complex systems, viewed
  over time and within their appropriate context. While relevant for engineering
  disciplines in general, ISE is unique in its identification of human beings as central
  decision makers in, and contributors to, the inherent complexity of such systems, but
  also as the primary targets and ultimate beneficiaries of their analysis and anticipated
  improvement.
- ISE Approach to Problem Solving
  The ISE approach to problems arising in the stated domain embraces a variety of
  fundamental skills derived from the mathematical, physical, and social sciences.
  Industrial & Systems Engineers approach problems by seeking measurable and thus
  objectively verifiable attributes of a system and, from these, attempt to build
  abstractions and employ quantitative methods in order to analyze the resultant
  models.
- ISE Specialized Knowledge
  The field of Industrial Engineering draws upon specialized knowledge in numerous
  subject areas including but not limited to: mathematics, optimization, probability and
  statistics, economics, computing, physiology, and psychology.
ISE Graduate Program Objectives
Within the context of these three descriptions, the mission statement, and the College of
Engineering Objectives, the six objectives of the graduate ISE Program (currently being
reviewed by the ISE DAC) are to enable MSISE graduates to perform in the following
manner 3 to 5 years after graduation:
1. Functioning effectively and providing leadership within an
organization as an Industrial and Systems Engineering professional,
including an ability to form, facilitate, lead, coordinate and
participate in teams as well as understand organizational process and
behavior.
2. Possessing the methodological and computational skills with which to operate
effectively within the ISE problem domain, through training in problem
representation, abstraction and validation.
3. Effectively collecting, analyzing, and interpreting data relevant to
problems arising in the ISE domain.
4. Competently approaching unstructured problems, synthesizing and
designing potential solutions and evaluating the impact of their
solutions in the broader context of the organization or society.
5. Effectively presenting and selling solutions and doing so in the context
of written, oral and electronic media.
6. Actively pursuing a professional life-long learning agenda and
evidencing an ability to accomplish life-long growth within the
field/profession of Industrial and Systems Engineering.
2) For each learning outcome, identified in item 1, what direct (performance measures)
and indirect (e.g. surveys) information/data were collected, how much, and by whom
this Spring 06?
Each outcome listed in Item 1 was assessed in each graduate course by means of a survey
administered to all students. A copy of the survey is attached (Attachment 1). The survey asks
students to evaluate the question:
“To what extent did THIS COURSE increase your:
1. ability to function effectively and provide leadership within an
organization.
2. ability to form, facilitate, lead, coordinate and participate on teams.
…
and so on for all 9 graduate program outcomes.
Response options for each outcome are: A. Exceptional, B. Significant, C. Moderate, D.
Somewhat, and E. None. The professor provides a mark sense form to each student for their
responses. The survey is administered during the final weeks of the semester, at the professor’s
discretion.
3) For data collected Fall 05, how and when were the data summarized, analyzed and
discussed? (e.g. faculty retreat, assessment/curriculum committee, report to the faculty).
The graduate program data collected in Fall 05 will be analyzed at a department retreat before
the end of this summer. (ISE graduate classes are offered only once per year, so the Fall 05
classes were not offered again in Spring 06.)
4) What findings emerged from departmental analysis of the information/data collected in
Fall 05?
Not yet available.
5) What actions are planned to address the findings? (e.g. curricular revision, pedagogical
changes, student support services, resource management.)
Curricular revision, pedagogical changes are typically the actions that emerge from these reviews
(based on our experience with the same process applied to our undergraduate programs).
6) Describe plans to collect data on additional learning outcomes next semester.
Our graduate learning outcomes (course related) are derived directly from our undergraduate
program objectives (program related), on the premise that graduate students ARE the 3-5-years-out
professionals that the undergraduate objectives describe. Since we don't plan to modify our
undergraduate objectives prior to our next ABET visit (we're focusing on developing processes
and measures to assess them), we do not anticipate additional learning outcomes in the next
semester.
7) Did your analysis result in revisiting/revising the student learning outcomes? If the
answer was yes, please explain and submit an updated version of the Student Learning
Outcomes.
No.
Attachment 1
ISE Graduate Program
Outcomes Survey
ISE Department
San Jose State University
Be sure to use a No. 2 pencil only and blacken the bubble on the scantron sheet completely.
Be sure to mark only one response per question.
GRADUATE PROGRAM OUTCOMES SURVEY
For each of the following line items 1-9, mark one bubble as appropriate on the scantron sheet and use
the following criteria for your response.

A = Exceptional   B = Significant   C = Moderate   D = Somewhat   E = None

Line Item
To what extent did THIS COURSE increase your:
1
ability to function effectively and provide leadership within an organization.
2
ability to form, facilitate, lead, coordinate and participate in teams
3
understanding of organizational processes and behaviors
4
knowledge of methodological and computational skills with which to operate effectively
5
ability to collect, analyze, and interpret data
6
ability to approach unstructured problems and to synthesize and design solutions for this problem
7
ability to evaluate the impact of these solutions in the broader context of the organization and society
8
ability to effectively present and sell solutions in the form of written, oral and electronic media
9
ability to accomplish life-long growth within the field/profession of ISE
17
How many hours a week do you work? If it's 40+ = A, 30+ = B, 20+ = C, 10+ = D, None = E
18
How many units are you taking this semester? If it's 18+ = A, 15-17 = B, 12-14 = C, 6-11 = D, and 0-5 = E
Email: _____________________________________________________________________________
Comments: _________________________________________________________________________
________________________________________________________________________________________
This data will be used by the program faculty to help to improve the program for future students. Thank you
for your assistance.
Turn page for additional questions
Human Factors/Ergonomics MS Program Outcome
Assessment Summary
1. For this semester, identify which learning outcome(s) was (were) the focus of
assessment. Please attach a copy of the department/program goals and Student
Learning Objectives.
The HF/E program is a new program in the College. The faculty of the program is
interdisciplinary and includes:
Tenure Track faculty:
Lou Freund, Program Director
Kevin Corker, Culminating Experience Director
Kevin Jordan
Emily Wughalter
John McClusky
James Kao
Adjunct faculty:
Anthony Andre
Abbas Moallem
Dave Krack
Although this program has successfully operated for more than 10 years in the College of
Graduate Studies, it was recently moved to the College of Engineering, specifically to the
Department of Industrial & Systems Engineering. The program has more than 25 applicants
annually from across the United States and internationally. Currently, about 60 students are
actively pursuing this degree, taking courses in ISE, Kinesiology, Industrial Design, and
Psychology.
To date, however, the faculty have not formulated program goals. A process to reach a
definition of program goals will be undertaken this summer.
2. For each learning outcome, identified in item 1, what direct (performance
measures) and indirect (e.g. surveys) information/data were collected, how
much, and by whom this Spring 06?
N/A
3. For data collected Fall 05, how and when were the data summarized, analyzed
and discussed? (e.g. faculty retreat, assessment/curriculum committee, report
to the faculty).
N/A
4. What findings emerged from departmental analysis of the information/data
collected in Fall 05?
N/A
5. What actions are planned to address the findings? (e.g. curricular revision,
pedagogical changes, student support services, resource management.)
N/A
6. Describe plans to collect data on additional learning outcomes next semester.
By next semester, we will collect data from students who are participating in the culminating
experience.
7. Did your analysis result in revisiting/revising the student learning outcomes? If
the answer was yes, please explain and submit an updated version of the
Student Learning Outcomes.
N/A
Quality Assurance MS Program Outcome Assessment
Summary
1. For this semester, identify which learning outcome(s) was (were) the focus of
assessment. Please attach a copy of the department/program goals and Student
Learning Objectives.
Program Outcomes
*3. Study, analyze, and make recommendations to improve quality-related problems.
*5. Be cognizant of the ethical, economic and environmental implications of their work.
*6. Be able to develop and deliver presentations in written and oral formats.
2.
For each learning outcome, identified in item 1, what direct (performance measures)
and indirect (e.g. surveys) information/data were collected, how much, and by whom
this Spring 06?
Three program outcomes were assessed in two classes (Tech 235 and Tech 239) during Fall 2005
by utilizing a student survey. The survey was given to all the students in both courses by the
professor. Twenty students completed and returned the surveys.
3.
For data collected Fall 05, how and when were the data summarized, analyzed and
discussed? (e.g. faculty retreat, assessment/curriculum committee, report to the
faculty).
The data was summarized by the department curriculum committee and discussed at a
department faculty meeting.
4.
What findings emerged from departmental analysis of the information/data
collected in Fall 05?
The survey results were generally positive; 76% of the responses were in the “Agree” or
“Strongly Agree” categories.
5.
What actions are planned to address the findings? (e.g. curricular revision,
pedagogical changes, student support services, resource management.)
The program will continue to collect data pertaining to MSQA learning outcomes throughout the
Spring and Fall 2006 in different courses.
6.
Describe plans to collect data on additional learning outcomes next semester.
The program will continue to collect data pertaining to MSQA learning outcomes throughout the
Spring and Fall 2006 in different courses. Two professors (Drs. Zargar and Backer) are going to
construct an assessment tool for the final project or thesis to be used by the department for
outcomes assessment. In Spring 2007, we are going to conduct a survey of graduates of the
MSQA program to gather more data.
7.
Did your analysis result in revisiting/revising the Student Learning Outcomes? If
the answer was yes, please explain and submit an updated version of the Student
Learning Outcomes.
No
Master of Science in Engineering Program Outcome
Summary
1) For this semester, identify which learning outcome(s) was (were) the focus of
assessment. Please attach a copy of the department/program goals and Student
Learning Objectives.
Learning Outcomes Assessed:
1.0 Work collaboratively with various disciplines of engineering, science and business.
2.0 Apply advanced theory and analysis for problem solving and synthesize and integrate
information in the engineering process.
3.0 Effectively communicate for problem analysis and solutions.
4.0 Apply contemporary tools for computation, simulation, analysis and design.
5.0 Deliver effective presentation of engineering activities in written and oral formats.
6.0 Be aware of the ethical, economic and environmental implications of engineering activities.
2) For each learning outcome, identified in item 1, what direct (performance measures)
and indirect (e.g. surveys) information/data were collected, how much, and by whom
this Spring 06?
Outcome metrics (See attached survey documents):
Student class surveys request student impression of success with regard to the outcome with
a request for anecdotal evidence to confirm. Data is collected for all ENGR courses in all
semesters.
A second portion of the survey requests student recommendations for improvement of the
program, which can include content revision and/or new courses.
3) For data collected Fall 05, how and when were the data summarized, analyzed and
discussed? (e.g. faculty retreat, assessment/curriculum committee, report to the
faculty).
Data Summarized:
The data from each survey have been summarized and returned to the faculty. Faculty are
expected to revise courses based on the responses.
Program data have been correlated and are to be discussed with faculty who teach and
administer in the program during a retreat. One of the primary objectives is to develop
linkages between the common core courses in the program and the final student projects or
theses. Some concepts in the courses should be employed in these culminating student
activities.
4) What findings emerged from departmental analysis of the information/data collected in
Fall 05?
Findings from Data Analysis (See attached preliminary evaluation):
The retreat is to be held during the summer. At this point, faculty may have made some
changes in course content based on the student responses in Fall 2005.
5) What actions are planned to address the findings? (e.g. curricular revision, pedagogical
changes, student support services, resource management.)
Anticipated Actions:
The objective of the retreat will be to revise the ENGR courses to provide the best linkages
between the common core, the culmination activities, and the other courses in the programs.
6) Describe plans to collect data on additional learning outcomes next semester.
Data Collection:
Surveys are currently being completed for Spring 2006, and these data may be available by the
time we conduct the retreat.
7) Did your analysis result in revisiting/revising the student learning outcomes? If the
answer was yes, please explain and submit an updated version of the Student Learning
Outcomes.
The following is a survey intended to determine how well this course is meeting the Outcome Objectives for the MSE Program. Please
take a few moments to check your personal impressions and add evidence to support your evaluation. Please note this is not an
evaluation of the instructor, but it is an attempt to determine if the content of the course and the method of instruction are supporting
the MSE Program Outcome Objectives.
COURSE NUMBER
INSTRUCTOR
SEMESTER AND YEAR OF COURSE
SURVEY DATE: FALL  SPRING  SUMMER  YEAR________
STUDENT IMPRESSION
MSE PROGRAM OUTCOME STATEMENT
1.0 I can work collaboratively with various disciplines of
engineering, science and business.
2.0 I can apply advanced theory and analysis for problem
solving and synthesize and integrate information in the
engineering process.
3.0 I can effectively communicate for problem analysis
and solutions.
4.0 I can apply contemporary tools for computation,
simulation, analysis and design.
5.0 I am able to deliver effective presentation of
engineering activities in written and oral formats.
6.0 I am aware of the ethical, economic and environmental
implication of my engineering activities.
Rating scale: Strongly Agree | Agree | Slightly Agree | Slightly Disagree | Disagree | Strongly Disagree | Not Applicable to This Course
Example from course activities to
support rating.
Areas for improvement of the MSE program:
1.0 What additional components would you recommend adding
to this course to better prepare you for professional activities?
2.0 What changes would you recommend as to the method of
instruction that is used in this course? (This can be in terms of
general format: lecture, seminar or lab or you can refer to
particular methods that in your opinion would make this course
more effective.)
3.0 What changes would you recommend as to how this course is
included in the program? (This refers to common core, option
core, or elective status and possible prerequisite relationships.)
4.0 What new or related courses would you recommend be
added to the MSE program to meet developing needs?
5.0 Please add any additional comments that would be relevant to
improvement of the MSE program.
Thank you for taking the time to provide us with a response.
Introduction
This is a summary of the results of the program evaluation surveys completed by students in the
Fall 2005 semester, intended to be used as an agenda for our discussions regarding the MSE
program. It is based on answers to questions 3, 4 and 5 in the student opinion survey:
3.0 What changes would you recommend as to how this course is included in the program?
(This refers to common core, option core, or elective status and possible prerequisite
relationships.)
4.0 What new or related courses would you recommend be added to the MSE program to
meet developing needs?
5.0 Please add any additional comments that would be relevant to improvement of the MSE
program.
The other questions in the survey targeted the specific courses and it is assumed that faculty will
review those and make the appropriate revisions.
General Observations
Students had general structural recommendations that we should consider, specifically the use of
industrial involvement to demonstrate application of course concepts and more student
presentations.
The recommended industrial component includes presentations and possible field trips. Industrial
presentations could be accomplished by having students recommend personnel from their firm
who could provide relevant material and having students arrange for field trips, in addition to the
traditional methods of recruiting external resources.
Many of our students come from programs that do not require developing communication skills
and particularly verbal presentations, so they consider these opportunities to be an important
program component.
Students also have some program content recommendations that should be considered as areas
for revision. Many students entering the project courses are surprised to discover we require
projects to be evaluated both technically and economically. They do not have a concern about the
technical component, but they do not have a sense of how to apply the economic aspects that
may have been included in previous courses. The students expressed this concern by
recommending more MBA-type courses in the program and the implied recommendation is to
get more engineering economics into the common core.
The results are summarized below and have been separated by class. The reason for the
separation is that students in Common Core are in the initial stages of the program and students
in the Projects classes should be in the latter stages. This relative tenure in the program could
influence their opinions.
Question 3.0
Student Positions:
A. ENGR 201 students:
a. Were of the opinion that the mathematical structure with the statistical and control
applications was effective for the rest of the program courses.
b. Some indicated that it might be more of an elective than a core component, so it
appears there is still some need to have links to the concentration courses.
B. ENGR 202 students
a. Most students stated Systems Engineering was valuable to MSE and
recommended keeping ENGR 202 as a requirement.
b. Others thought a course more specifically directed at the concentration would be
preferred.
C. ENGR 203 students
a. It was not clear that students appreciated the value of the course as a required
core.
b. Students appreciated the opportunity to make class presentations.
c. Students expressed an interest in quality assurance components and incorporation
of computer applications.
D. ENGR 236 students
a. Students were concerned about the use of this class as a concentration core for
some options.
b. They were interested in applications other than manufacturing companies.
E. First Semester Project/Thesis ENGR 281 and ENGR 295A
a. Students like the results of the course, but have some reservations about the dual
credit.
b. Students were concerned about having to schedule committee members according
to times set by the class coordinator.
Discussion Points:
A. Alternate components for the program:
a. Should we assume students have an adequate preparation in statistics and
probability to incorporate Design of Experiments into the program, specifically
into ENGR 201? Many students start the projects courses with multivariable
studies and they are not familiar with techniques like Taguchi methods to manage
these types of studies.
b. Should there be some discussion of specific systems engineering concepts as
applied to alternate industry types? It is not clear that these concepts are actually
being applied to the project courses and perhaps there is a way to integrate these
into the configuration of that course.
c. Would it be valuable to employ a computer lab in some of the courses? For
example, applications such as Microsoft Project could support the Project
Planning and Scheduling concepts, and there may be others that could be used for
the QA concepts.
d. The projects experience indicates many of the students have had limited exposure
to Engineering Economics, so could some basic components be incorporated in
the common core? The economic justification for projects can be based on a range
of opportunities, including new products, improved products, reduced
manufacturing costs, cost avoidance, and implementation of control standards
such as the ISO standards. The students have difficulty applying these factors for
economic justification of projects.
e. Should we include components for Management of Innovation with an
Entrepreneurship subcomponent? Would this include factors such as outsourcing,
supply chain, and other Operations Research topics?
B. Structural Questions
a. Do we have the correct sequence of courses and should there be additional cross
links or prerequisites?
b. We apparently still have some discrepancy between multiple-section courses, so
what is an effective way to ensure we include common components and common
emphases in the multiple sections? We had a student complete a Comprehensive
exam, and it was clear he took the course from a faculty member other than the
one who prepared the questions.
Question 4.0
Student Positions:
A. ENGR 201 students:
a. Recommended course in Advanced or Applied Statistics
b. Recommended course on Risk Assessment
c. More industrial participation to reflect current industry interests
d. Additional management components
B. ENGR 202 students
a. Similar to the ENGR 201 student recommendations
b. An alternate Systems Engineering course with more hands-on content
c. Want more flexibility in electives and probably fewer core courses
d. Some specific technical courses that would probably come from the other
programs
C. ENGR 203 students
a. Similar to the ENGR 201 student recommendations
b. Some students (probably Special Concentration) want the option to take more
than 3 courses in another program (EE back door).
c. Recommended courses in Project Management, Engineering presentations, and
decision making
d. Courses with more application and less theory
e. More seminar courses
D. ENGR 236 students
a. Courses targeting currently growing industries
b. Design courses, which seem to be similar to undergraduate simulation courses
c. Courses in demand planning and forecasting, supply chain, and MBA-oriented
management courses.
E. First Semester Project/Thesis ENGR 281 and ENGR 295A
a. Engineering Economics course
b. Project planning
c. Specific courses that are currently offered in other programs that could be
substituted for core courses.
Discussion Points:
A. Alternate components for the program:
a. Should we incorporate a greater number of industrial guest lecturers in the
core courses to discuss specific applications?
b. Do the recommendations for new courses require revising the common core
courses or can some of these recommendations be satisfied with courses from
other programs?
B. Structural Questions
a. Should we change the core structure to allow more electives? Can we do this and
still maintain the multi-disciplinary structure of the program? Would this make
management of the program more complex?
b. Should we be considering developing any new courses to support concentrations?
c. Should we spin off more concentration areas where the number of students in a
Special Concentration area is, or might become, adequate to justify this effort?
Question 5.0
Student Positions:
A. ENGR 201 students:
a. Comments are primarily with regard to structure and the desire to have a wider
range of electives.
B. ENGR 202 students
a. Similar to the ENGR 201 student recommendations
b. There was a suggestion to reduce the number of common core courses, for
example by combining ENGR 202 and ENGR 203 to make room for another
elective course in the program.
C. ENGR 203 students
a. Students claim a need for better advising
b. Students would like to have internship opportunities
c. More industry involvement in classes
D. ENGR 236 students
a. Change of time for start of courses
b. Better access to BUS courses
c. More industrial involvement
E. First Semester Project/Thesis ENGR 281 and ENGR 295A
a. More industrial involvement
b. Possible use of tours (field trips?) in the program
c. Students appreciate 200W
d. Students want more flexibility; Special Concentration students want to eliminate
the 3-course rule.
Discussion Points:
A. Alternate components for the program:
a. Is there some way to get more industrial contribution in the program? Could
students be a source of industrial speakers who could be recruited at the start of
each class?
B. Structural Questions
a. Should the Special Concentration criteria be revised without sacrificing the multidisciplinary nature of the program?
________________________________________________________________________
Civil and Environmental Engineering Department
________________________________________________________________________
Champion: Dr. Akthem Al-Manaseer/Dr. Ram Singh/Dr. Udeme Ndon
MS – Civil Engineering
1. For this semester (S 06), identify which Program Outcome(s) was (were) the focus of
assessment.
Two graduate courses were assessed in the Spring of 2006. The courses assessed were:
(1) CE 233- Construction Management, and (2) CE 243- Advanced Foundation
Engineering.
In the CE 233 course, the following Program Educational Outcomes were assessed:
PO 4: Students will be able to work collaboratively.
PO 5: Students will be able to communicate effectively.
In the CE 243 course, the following Program Educational Outcomes were assessed:
PO 2: Students will be able to apply modern tools in doing computations, simulations,
analysis, and design.
PO 3: Students will be able to synthesize and integrate information in the engineering
process.
PO 5: Students will be able to communicate effectively.
2. For each Program Outcome, identified in item 1, what direct (performance
measures) and indirect (e.g. surveys) information/data were collected, how much,
and by whom?
Program Educational Outcomes (2, 3, 5) for CE 243 were measured by reviewing the
course description in the university catalog, reviewing the “green sheet,” which describes
the course in detail, and having an external evaluator, William A. Bishop, attend one
course session on May 2, 2006; he interviewed four students and provided input through
a written report.
Program Educational Outcomes (4 and 5) for CE 233 were measured through interviews
of four students taking the course, conducted by an external evaluator, John Jacobsen,
who provided input through a written report.
3. How and when were the data summarized, analyzed and discussed? (e.g. faculty
retreat, assessment/curriculum committee, report to the faculty).
CE office/Al-Manaseer/CEE Department Program Plan Hambaba. Ver. 2/ 06.14.2006AB
The Graduate Program Assessments and Enhancement Process was discussed by the faculty
at meetings on May 9, 2006 and May 16, 2006. The faculty did not approve the proposed
plan but requested that further modifications and discussions be considered. A plan
for further discussion is attached.
CE 212 (Structural Dynamics) and CE 255 (Sediment Transport) were assessed in the
Fall of 2005. The assessment involved student surveys on how well the courses meet
educational objectives and course objectives. Analyses of students’ ratings were
presented to the faculty. No major changes were required for CE 212 and CE 255.
4. What findings emerged from the departmental analysis of the information/data
collected?
No major changes to the content of CE 255 and CE 212 were needed. The proposed
assessment plan needs to be discussed further in Fall 2006.
5. What actions were taken to address the findings? (e.g. curricular revision, pedagogical
changes, student support services, resource management.)
Minor revisions and improvements to the green sheets for CE 212 and CE 255 were
suggested by the external evaluator.
6. Describe plans to collect data on additional Program Outcomes next semester.
Include a calendar for collecting data on all PO’s by end of Spring 07.
The department would like to modify the Program Assessments and Enhancement
Process for our Master of Science in Civil Engineering, and to continue to assess courses.
A schedule of the proposed course assessments is attached.
The Master of Science programs in Civil Engineering are intended to develop the high degree of professional competency and
specialization required for the treatment of current engineering problems.
OBJECTIVES:
- Prepare students for their professional careers and licensure by strengthening their
knowledge in their specialization (depth) and extending their skills and knowledge base (breadth).
- Provide students advanced proficiencies for professional practice to enable them to advance in
the licensing process and equip them for advancement in their career.
- Improve students' research skills and prepare them for further graduate study.
- Provide students with experience and skills for multi-disciplinary and cross-CE disciplinary
practice.
OUTCOMES:
- Students will be able to apply advanced theory and analysis for problem solving.
- Students will be able to apply modern tools in doing computations, simulations, analysis, and
design.
- Students will be able to synthesize and integrate information in the engineering process.
- Students will be able to work collaboratively.
- Students will be able to communicate effectively.
CEE Department
Master of Science in Civil Engineering
Program Assessments and Enhancement Process
Surveys
1. Program Objectives Survey (every semester)
2. Course Objectives Survey (every semester)
3. DAC Evaluation Survey (every semester)
4. Alumni Survey (every two years)
5. Employer Focus Group (every three years)
[Schedule grid, Fall 2005 through Spring 2010: each survey is marked with an “X” for the terms in which it is administered; the per-term placement of the marks could not be fully recovered from the source layout.]
CEE MS Program
Graduate Classes Fall 2002 – Spring 2010
[Grid of graduate courses offered in each concentration (Construction, Environmental, Geotechnical, Structural, Transportation, Water Resources) for each term from Fall 2002 through Spring 2006; the per-term course placement could not be fully recovered from the source layout.]

CLASSES ASSESSED
Fall 2005: CE 212 (Structural), CE 255 (Water Resources)
Spring 2006: *CE 233 (Construction), CE 243 (Geotechnical)

CONCENTRATION
Construction: *CE 230, CE 232, *CE 233, *CE 234, CE 235, CE 236
Environmental: CE 271, CE 272, *CE 274, CE 275, *CE 279, CE 281, CE 282
Geotechnical: *CE 240, CE 241, CE 242, CE 243, CE 244, *CE 245, CE 246
Structural: CE 210, CE 212, CE 216, CE 260, CE 261, CE 264, CE 265, *CE 267, CE 269
Transportation: CE 220, CE 221, *CE 222, CE 223, *CE 224, *CE 225, CE 226
Water Resources: *CE 250, CE 251, CE 252, CE 255
[Grid of graduate courses planned in each concentration (Construction, Environmental, Geotechnical, Structural, Transportation, Water Resources) for each term from Fall 2006 through Spring 2010; the per-term course placement could not be fully recovered from the source layout.]

CLASSES ASSESSED (planned)
Fall 2006: Environmental, Transportation
Spring 2007: Structural, Water Resources
Fall 2007: Construction, Geotechnical
Spring 2008: Environmental, Transportation
Fall 2008: Structural, Water Resources
Spring 2009: Construction, Geotechnical
Fall 2009: Environmental, Transportation
Spring 2010: Structural, Water Resources

CONCENTRATION
Construction: *CE 230, CE 232, *CE 233, *CE 234, CE 235, CE 236
Environmental: CE 271, CE 272, *CE 274, CE 275, *CE 279, CE 281, CE 282
Geotechnical: *CE 240, CE 241, CE 242, CE 243, CE 244, *CE 245, CE 246
Structural: CE 210, CE 212, CE 216, CE 260, CE 261, CE 264, CE 265, *CE 267, CE 269
Transportation: CE 220, CE 221, *CE 222, CE 223, *CE 224, *CE 225, CE 226
Water Resources: *CE 250, CE 251, CE 252, CE 255
Course No.  Title                                    FT Faculty    Professor_Status_Term (last taught only)
CE 210      Advanced Mechanics                       (Vukazich)    Singhal_PT_S’05
CE 212      Structural Dynamics                      (Vukazich)    Singhal_PT_F’05
CE 216      Finite Elements with Civil Engr Appl.    (Vukazich)    Singhal_PT_S’03
CE 220      Rigid & Flexible Pavement Design         (Botha)       Signore_PT_S’05
CE 221      Advanced Highway Design                  (Botha)       Fakhry_PT_S’05
CE 222      Transportation Engr. Planning            (Botha)       Vickroy_PT_S’06
CE 223      Airport Planning & Design                (Botha)       Wei_PT_S’04
CE 224      Traffic Operations                       (Botha)       Botha_FT_F’05
CE 225      Public Transportation Systems            (Botha)       Fakhry_PT_F’04
CE 226      Topics in Transportation System          (Botha)       Vickroy_PT_F’05
CE 230      Construction Project Development         (Yates)       Yates_FT_F’04
CE 232      Construction Estimating & Cost Analysis  (Yates)       Battersby_FT_F’05
CE 233      Construction Management                  (Yates)       Yates_FT_S’06
CE 234      Construction Law                         (Yates)       Yates_FT_S’05
CE 235      Information Systems in Constr Mgmt.      (Yates)       Yates_FT_S’05
[Check marks mapping each course to outcomes 1–5 could not be recovered from the source layout.]
1. Students will be able to apply advanced theory and analysis for problem solving.
2. Students will be able to apply modern tools in doing computations, simulations, analysis, and
design.
3. Students will be able to synthesize and integrate information in the engineering process.
4. Students will be able to work collaboratively.
5. Students will be able to communicate effectively.
CEE Program

Course No.  Title                                       FT Faculty     Professor_Status_Term (last taught only)
CE 236      Construction Operations Analysis            (Yates)        Baratta_PT_F’05
CE 240      Advanced Soil Mechanics                     Al-Manaseer    Oskoorouchi_PT_F’05
CE 241      Groundwater, Seepage, and Drainage Control  Al-Manaseer    Oskoorouchi_PT_S’05
CE 242      Experimental Soil Mechanics                 Al-Manaseer    Yeung_PFT_S’00
CE 243      Advanced Foundation Engineering             Al-Manaseer    Oskoorouchi_PT_S’06
CE 244      Earth Structures                            Al-Manaseer    Oskoorouchi_PT_S’03
CE 245      Geotechnical/Structural Seminar             Al-Manaseer    Haggblade_PT_F’04
CE 246      Soil Dynamics                               Al-Manaseer    Neelakantan_XX_S’94
CE 250      Water Resources Engineering                 (Singh)        Prasad_PT_S’05
CE 251      Hydraulics of Open Channels                 (Singh)        Lee_PT_F’04
CE 252      Advanced Hydrology                          (Singh)        Prasad_PT_S’06
CE 255      Sediment Transport                          (Singh)        Butt_PT_F’05
CE 260      Matrix Analysis of Structures               Vukazich       Singhal/Vukazich_PT/FT_S’06
CE 261      Advanced Structural Concrete Design         (Al-Manaseer)  Pang_PT_F’05
1. Students will be able to apply advanced theory and analysis for problem solving.
2. Students will be able to apply modern tools in doing computations, simulations, analysis, and
design.
3. Students will be able to synthesize and integrate information in the engineering process.
4. Students will be able to work collaboratively.
5. Students will be able to communicate effectively.
Course No.  Title                                                    FT Faculty     Professor_Status_Term (last taught only)
CE 264      Prestressed Concrete Design                              (Al-Manaseer)  Aalami_PT_S’04
CE 265      Advanced Seismic Design                                  (McMullin)     McMullin_FT_S’06
CE 267      Advanced Steel Design                                    (McMullin)     Pang_PT_S’05
CE 269      Advanced Topics in Structural Design                     (Al-Manaseer)  Al-Manaseer_FT_F’04
CE 271      Water Treatment and Plant Design                         (Ndon)         Cummings_PT_S’05
CE 272      Wastewater Treatment and Plant Design                    (Ndon)         Ndon_FT_F’05
CE 274      Industrial and Hazardous Waste Management and Treatment  (Ndon)         Ndon_FT_F’97
CE 275      BioSolids and Residual Management Engineering            (Ndon)         Spicher_PFT_S’89
CE 279      Environmental Engineering Seminar                        (Ndon)         Williamson_PFT_F’00
1. Students will be able to apply advanced theory and analysis for problem solving.
2. Students will be able to apply modern tools in doing computations, simulations, analysis, and
design.
3. Students will be able to synthesize and integrate information in the engineering process.
4. Students will be able to work collaboratively.
5. Students will be able to communicate effectively.
Course No.  Title                                 Professor_Status_Term (last taught only)
CE 290      Civil Engineering Analysis I          Singh_FT_F’89
CE 291      Civil Engineering Analysis II         Singh_FT_S’87
CE 292      Civil Engineering Economic Analysis
CE 297      Special Topics in Civil Engineering
CE 298      Special Problems                      TBA
CE 299      Master's Thesis or Project            TBA
1. Students will be able to apply advanced theory and analysis for problem solving.
2. Students will be able to apply modern tools in doing computations, simulations, analysis, and
design.
3. Students will be able to synthesize and integrate information in the engineering process.
4. Students will be able to work collaboratively.
5. Students will be able to communicate effectively.
SAN JOSE STATE UNIVERSITY
GRADUATE ENGINEERING PROGRAM
Aerospace Engineering
Program Assessment
PERIOD COVERED
Spring 2006
Prepared by:
Assessment Team: Dr. Nikos J. Mourtos (Chair),
Dr. Burford J. Furman, Dr. Raymond K. Yee, Dr. Nicole Okamoto,
Dr. Periklis Papadopoulos
Report Date: 05/22/2006
MS Aerospace Engineering
Please provide the following information for AY 05/06:
1. In AY 05-06, identify which Graduate Program Educational Objectives were the focus of
assessment. Please attach a copy of the Graduate Program Educational Objectives.
All GPEOs were the focus of assessment for AY 05-06. The GPEOs for the MSAE Program are:
1. A strong foundation beyond the undergraduate level in their chosen focus area as well as in mathematics,
basic science and engineering fundamentals, to successfully compete for technical engineering positions in
the local, national and global engineering market, advance in their current position or pursue doctoral
studies.
2. Contemporary professional and lifelong learning skills to be able to apply theory to solve practical
aerospace engineering problems.
3. The expertise necessary to work in the analysis and design of aerospace
engineering systems, with possible specialization in areas such as:
Aerospace Systems
Aircraft Design
Spacecraft Systems
Space Transportation & Exploration
4. Strong verbal and written communication skills, including the ability to write engineering reports.
5. The ability to perform research and work independently to solve open-ended problems in aerospace
engineering.
2. For each GPEO identified in item 1, what direct (performance measures) and indirect (e.g.
surveys) information / data were collected, how much, and by whom?
The members of the AE Advisory Board:
(a) Rated the GPEOs in terms of their importance to the AE industry, and
(b) Assessed how well the current MSAE curriculum addresses these GPEOs.
AE Advisory Board input on the importance of the GPEOs

How important is this PEO?   BM1   BM2   BM3   BM4   BM5   Average
GPEO # 1                      5     4     3     4     4     4.0
GPEO # 2                      4     3     4     4     3     3.6
GPEO # 3                      4     4     4     4     5     4.2
GPEO # 4                      5     5     5     4     4     4.6
GPEO # 5                      5     5     5     4.5   5     4.9
Scale: 5=Very important, 4=Important, 3=I am not sure, 2=Not important, 1=Irrelevant, should not be included.
AE Advisory Board input on how well the MSAE curriculum addresses the GPEOs

How well is this PEO addressed
through the MSAE curriculum?   BM1   BM2   BM3   BM4   BM5   Average
GPEO # 1                        4     4     4     4     4     4.0
GPEO # 2                        3     3     3     3     4     3.2
GPEO # 3                        5     5     5     4     5     4.8
GPEO # 4                        5     5     5     5     5     5.0
GPEO # 5                        5     5     5     5     5     5.0
Scale: 5=Very well, 4=Well, 3=Adequately, 2=Not adequately, 1=It is not addressed at all.
The MAE Assessment Team targeted for assessment all (24) MSAE project / thesis reports for 2003-2004 (16)
and 2004-2005 (8) graduates.
The following criteria were used to evaluate each M.S. project / thesis report:
Thesis / Project Evaluation Form
Project / Thesis Title: ________   Student Name: ________   Faculty Advisor: ________   Field: AE

GPEO addressed   #    Criterion
1                1    Application of mathematics appropriate for graduate level
1                2    Application of science appropriate for graduate level
1                3    Application of engineering fundamentals appropriate for graduate level
2                4    Use of modern tools (computational or experimental)
2                5    Appropriate literature search (# and appropriateness of references cited)
2                6    Understanding of the cited literature (summary of previous work)
2                7    Understanding of the work performed in the project
3                8    In-depth analysis and / or design of an ME or AE system
4                9    Correct language and terminology
4                10   Abstract, ability to summarize, draw conclusions
4                11   Appropriate use of graphs and tables
5                12   Clear objectives (problem definition)
5                13   Appropriate assumptions / modeling of the problem

Scale: 1 = Lacking, 2 = Weak, 3 = Acceptable, 4 = Good, 5 = Excellent (Max = 5 per criterion)
Total Score (L = 13-25, W = 26-38, A = 39-49, G = 50-59, E = 60-65); Max Possible Score = 65
L = Lacking, W = Weak, A = Acceptable, G = Good, E = Excellent
A score of three (3) or above is necessary to satisfy each GPEO.
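As an aside on the form's arithmetic (13 criteria, each scored 1 to 5, so totals range from 13 to 65), the banding can be sketched as follows; `classify_report` is a hypothetical helper for illustration, not part of the department's actual process:

```python
# Hypothetical helper illustrating the evaluation form's arithmetic:
# 13 criteria, each scored 1-5, so totals range from 13 to 65.
def classify_report(scores):
    if len(scores) != 13 or not all(1 <= s <= 5 for s in scores):
        raise ValueError("expected 13 criterion scores between 1 and 5")
    total = sum(scores)
    bands = [(25, "Lacking"), (38, "Weak"), (49, "Acceptable"),
             (59, "Good"), (65, "Excellent")]
    for upper, label in bands:
        if total <= upper:
            return total, label

print(classify_report([3] * 13))  # (39, 'Acceptable')
```

Note that a report scoring the minimum acceptable 3 on every criterion totals 39, which lands exactly at the bottom of the "Acceptable" band.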
3. How and when were the data summarized, analyzed and discussed? (e.g. faculty retreat,
assessment / curriculum committee, report to the faculty).
The following data were summarized by the MAE Assessment Team and shared with the MAE
faculty.
MSAE Results Summary
Number of MSAE students who graduated in AY 03-04 & 04-05 = 24
Missing reports = 5
Reports evaluated = 19

GPEO 1, Criterion 1: Application of mathematics appropriate for graduate level
3 (16%) did not meet this objective; 16 (84%) met this objective
Scores: 1 = 2 reports, 2 = 1, 3 = 2, 4 = 5, 5 = 9

GPEO 1, Criterion 2: Application of science appropriate for graduate level
1 (5%) did not meet this objective; 18 (95%) met this objective
Scores: 1 = 0 reports, 2 = 1, 3 = 3, 4 = 3, 5 = 12

GPEO 1, Criterion 3: Application of engineering fundamentals appropriate for graduate level
1 (5%) did not meet this objective; 18 (95%) met this objective
Scores: 1 = 1 report, 2 = 0, 3 = 2, 4 = 3, 5 = 13

GPEO 2, Criterion 4: Use of modern tools (computational or experimental)
2 (11%) did not meet this objective; 17 (89%) met this objective
Scores: 1 = 2 reports, 2 = 0, 3 = 3, 4 = 7, 5 = 7

GPEO 2, Criterion 5: Appropriate literature search (# and appropriateness of references cited)
4 (21%) did not meet this objective; 15 (79%) met this objective
Scores: 1 = 3 reports, 2 = 1, 3 = 1, 4 = 3, 5 = 11

GPEO 2, Criterion 6: Understanding of the cited literature (summary of previous work)
2 (11%) did not meet this objective; 17 (89%) met this objective
Scores: 1 = 2 reports, 2 = 0, 3 = 2, 4 = 4, 5 = 11

GPEO 2, Criterion 7: Understanding of the work performed in the project
0 (0%) did not meet this objective; 19 (100%) met this objective
Scores: 1 = 0 reports, 2 = 0, 3 = 2, 4 = 2, 5 = 15

GPEO 3, Criterion 8: In-depth analysis and / or design of an AE system
3 (16%) did not meet this objective; 16 (84%) met this objective
Scores: 1 = 2 reports, 2 = 1, 3 = 6, 4 = 6, 5 = 4

GPEO 4, Criterion 9: Correct language and terminology
0 (0%) did not meet this objective; 19 (100%) met this objective
Scores: 1 = 0 reports, 2 = 0, 3 = 1, 4 = 1, 5 = 17

GPEO 4, Criterion 10: Abstract, ability to summarize, draw conclusions
0 (0%) did not meet this objective; 19 (100%) met this objective
Scores: 1 = 0 reports, 2 = 0, 3 = 1, 4 = 1, 5 = 17

GPEO 4, Criterion 11: Appropriate use of graphs and tables
2 (11%) did not meet this objective; 17 (89%) met this objective
Scores: 1 = 2 reports, 2 = 0, 3 = 1, 4 = 5, 5 = 11

GPEO 5, Criterion 12: Clear objectives (problem definition)
0 (0%) did not meet this objective; 19 (100%) met this objective
Scores: 1 = 0 reports, 2 = 0, 3 = 1, 4 = 3, 5 = 15

GPEO 5, Criterion 13: Appropriate assumptions / modeling of the problem
1 (5%) did not meet this objective; 18 (95%) met this objective
Scores: 1 = 0 reports, 2 = 1, 3 = 0, 4 = 4, 5 = 14

Total Score (L = 13-25, W = 26-38, A = 39-49, G = 50-59, E = 60-65); Max Possible Score = 65
Distribution: L = 0 reports, W = 1, A = 2, G = 5, E = 11
L = Lacking, W = Weak, A = Acceptable, G = Good, E = Excellent
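The met / did-not-meet tallies above follow mechanically from the score distributions, since a score of 3 or above satisfies a GPEO element. A minimal sketch of that tally, with an illustrative `summarize` helper that is not part of the assessment tooling:

```python
# Sketch: derive the met / did-not-meet summary from a score distribution.
# A report meets an objective element with a score of 3 or above.
def summarize(distribution):
    """distribution maps score (1-5) -> number of reports."""
    total = sum(distribution.values())
    not_met = distribution.get(1, 0) + distribution.get(2, 0)
    met = total - not_met
    return (not_met, round(100 * not_met / total),
            met, round(100 * met / total))

# Criterion 1 (application of mathematics), 19 reports evaluated:
print(summarize({1: 2, 2: 1, 3: 2, 4: 5, 5: 9}))  # (3, 16, 16, 84)
```

Applied to Criterion 1's distribution this reproduces the "3 (16%) did not meet / 16 (84%) met" split reported above.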
4. What findings emerged from departmental analysis of the information / data collected?
The following findings emerged from the analysis of the data:
A. The MSAE Program at Lockheed-Martin is currently run with little or no input from the
AE faculty.
in the area of the project / thesis. As a result, the quality of some of them was poor.
C. A large number of reports were not available at the time of the analysis (21%). Additionally, in a few
cases, committee members were replaced without their knowledge.
D. The percentage of reports that satisfied the various elements of GPEO # 1 ranged from 84% to 95%.
E. The percentage of reports that satisfied the various elements of GPEO # 2 ranged from 79% to
100%. In general, students seem to perform very well in their use of modern tools. They lack,
however, in their citations of appropriate literature (21% of the reports were found weak or lacking)
as well as in their understanding of this literature (11% of the reports were found weak or lacking).
On the other hand, almost all of the evaluated reports demonstrated a good understanding of the
work performed.
F. Students seem to perform well in GPEO # 3 (in-depth analysis / design of a system). One issue that
was identified in relation to this GPEO is that, in some cases, there was a mismatch between the
project / thesis topic, the project / thesis advisor, and the major.
G. Students seem to perform well in GPEO # 4 (communication skills). The percentage of reports that
satisfied the various elements of GPEO # 4 ranged from 89% to 100%. However, the MAE
Department needs a more standardized format for the M.S. Project Report.
H. The percentage of reports that satisfied the various elements of GPEO # 5 ranged from 95% to 100%.
5. What actions are planned to address the findings? (e.g. curricular revision, pedagogical
changes, student support services, resource management.)
The following recommendations were made to the Department to address the findings:
1. AE295A and AE299 (1st semester):
a. Faculty who serve on a student’s committee should not sign the grade form without first evaluating
the student’s end-of-semester written report.
b. The AE295 and AE299 Coordinator should not assign a grade for a student without the grade form
signed by all three committee members.
2. AE295B and AE299 (2nd semester): Students should not receive a passing grade without first
submitting to the Department a bound copy of their final report signed by all three committee members.
A non-bound copy and a receipt (showing that a copy is in the process of being bound), accompanied
by the cover page signed by all three committee members, would be acceptable.
3. Committee Issues:
a. Department policy should clarify that students are not allowed to replace committee members
at their discretion without the approval of the committee members.
b. Once established on the student's M.S. project / thesis proposal, committee members cannot be
changed without consulting the committee members themselves.
4. There should be checks and balances to ensure the integrity of our graduate programs. In particular:
a. A student's M.S. project / thesis grade should be determined by the student's committee only.
b. The coordination of the AE295 and AE299 courses should be given to an AE tenured faculty
member.
5. Although the level of mathematics and science may vary from project to project, students are
expected to use graduate level mathematics and science in their analysis or modeling. In the
Design area, students should be encouraged to be more explicit about the relevant mathematics
and science.
6. The Department should organize a presentation on literature search and review every semester.
7. By the end of the 1st month, in AE295A and AE299, students should be required to take the online
tutorials offered by the SJSU Library, such as Info-Power.
8. By the end of the 1st month, in AE295A and AE299, students should document and present a
literature review related to their project / thesis.
9. The Project / Thesis cover page should read: “…for the degree of M.S. in Aerospace Engineering.”
10. The Graduate Coordinator and Faculty Advisors will ensure that the topic area selected by their
students is appropriate for the degree and is supported by the courses taken by the student.
11. On the Project Proposal Form, a line should be added to indicate the area of specialization.
12. Students should seek a committee chair and appropriate committee members in their field of study.
Faculty should advise students in their area of expertise.
13. The MAE Department should develop a uniform M.S. Project Report guideline. One idea may be to
simply adopt the University guidelines for thesis.
14. Students admitted into a graduate program must enroll in at least one course per semester, selected
from the list of approved M.S. courses for their area of specialization.
In addition, the AE faculty strongly recommend that:
a. The coordination of the AE295A,B and AE299 courses should be given to the AE
Coordinator.
b. All MSAE thesis / project topics and committees are approved by the AE Coordinator to
ensure that (i) topics are appropriate for the MSAE degree and (ii) the committee chair and
committee members have expertise in the area of each thesis / project.
c. The MSAE Program at Lockheed-Martin should have substantially more input from and
participation of the AE faculty. Part-time instructors at Lockheed-Martin should meet with
the AE faculty member who coordinates the particular course, and curriculum changes
should be coordinated with the AE faculty.
6. Describe plans to collect data on additional GPEOs next semester.
Recommendations a, b, and c above (section 5) are very important for the quality of the MSAE Program and
must be addressed in Fall 2006 before performing further assessment tasks.
Fall 2006:
1. Implement the changes adopted by the MAE Department in Spring 2006.
2. Administer exit interviews with MSAE graduates.
Spring 2007:
1. Assess all the thesis / project reports from the 06-07 graduates.
2. Administer exit interviews with MSAE graduates.
7. Did your analysis result in revisiting/revising the GPEOs? If the answer was yes, please
explain and submit an updated version of the Student Learning Outcomes.
N/A