2009-2010 Assessment Report
Department of Computer Science
California State University, Sacramento

Du Zhang, Department Chair

Assessment Committee:
Mary Jane Lee, Coordinator
William Mitchell
Anne-Louise Radimsky
Richard Smith
Du Zhang

Submitted June 27, 2010

Option 1: Narrative Submission

In AY 2009-2010, the assessment activities of the Computer Science Department focused on the following:

Undergraduate Program
• Assessment of Program Educational Objectives – Survey of Alumni
• Assessment of Student Learning Outcomes a–d in CSC Core Courses

Graduate Programs
• Development of Assessment Plans for Objectives and Outcomes
• Assessment of Student Learning Outcomes b. Oral Communication and g. Written Communication
• Assessment of Student Learning Outcomes a–c and e–h Using Employer Evaluations
• Assessment of Student Learning Outcomes a–d Technical Content

Undergraduate Program

Program Educational Objectives – Survey of Alumni

In Fall 2009, the BS degree program in Computer Science was reviewed for external accreditation by the Computing Accreditation Commission (CAC) of the Accreditation Board for Engineering and Technology (ABET). To address a concern raised by the visiting team regarding achievement of program objectives, the department revised and implemented a survey of alumni in Spring 2010. The survey addressed all program objectives and targeted alumni who received their degrees three to six years earlier. Alumni were asked to respond to general questions about their company and their job responsibilities, followed by a question asking them to rate the importance of each program objective to their professional careers and how well they believe their CSUS education prepared them to achieve that objective. Sixteen alumni responded to the survey. In a report submitted to ABET/CAC [1], the department concluded that, “Overall, the results of this latest alumni survey provide strong evidence on the accomplishment of our program objectives.
This further substantiates the objective assessment results we obtained through the other two mechanisms: industry visits and evaluation by the department Industry Advisory Committee. Several meaningful conclusions can be drawn from this latest survey.

1. A high percentage of alumni (77%-100%) view the department’s program educational objectives as extremely/very/moderately important in their professional careers.
2. A high percentage of alumni (77%-92%) rate their CSUS education as having prepared them extremely/very/moderately well to achieve these objectives.
3. There is strong evidence that our alumni are assuming leadership roles, taking on increasing job responsibilities, pursuing professional development opportunities, and participating in life-long learning.”

Assessment of Student Learning Outcomes a–d in CSC Core Courses

In last year’s assessment report, the department recommended that a process to evaluate core topics be developed by (1) identifying fundamental performance criteria for outcomes a–d, (2) developing questions that assess these performance criteria, and (3) examining “norming” procedures for evaluators and training faculty. The department developed a table that associates, for each outcome a through i, the major performance criteria and related core courses (Appendix A). Rather than assessing knowledge of core topics in upper division elective courses as planned, the department decided to re-assess core topics in core courses, since not all students are required to take elective courses. Another change was to use course instructors as evaluators rather than randomly selected faculty, to reduce the variability that occurred in prior evaluations of student exams. As a result, “norming” procedures did not appear to be necessary. In Fall 2009, instructors for upper division core courses, CSC 130 – CSC 139, identified questions that would be used to assess at least two performance criteria in Outcomes a through d for their particular course.
These questions were submitted to the department assessment committee for review prior to implementation as final exam questions in Fall 2009. At the end of the Fall 2009 semester, faculty evaluators submitted student scores on the selected questions using a 4-point scale with 4: exceeds criterion, 3: meets criterion, 2: progressing to criterion, and 1: below expectations. Results of the assessment of core topics are provided in Appendix B. (These results were submitted as part of the department’s 30-day response to ABET/CAC.) Compared to prior results in the assessment of core topics (see the 2008-2009 Assessment Report for Computer Science), the following conclusions can be reached:

1. For Outcome a, there was improvement in the two areas of deficiency from last year. The performance criterion, knowledge of fundamental algorithms, improved from 55% to 87%. The performance criteria finite state machines (70%) and grammars (96%) were combined into one criterion this year; student scores improved to 85% for the merged criterion. However, the results for criterion a-2, understand and use essential data structures (41%), and a-7, understand functional programming paradigms (56%), showed deficiencies in student understanding. It is recommended that instructors in CSC 130 and 136 develop instructional materials that help students in their understanding and application of these concepts.
2. For Outcome b, the minimum standard was satisfied for performance criteria b-3, understand and apply design principles (87%), and b-6, demonstrate ability to design and analyze hardware components (92%). Students appear to be knowledgeable about computer systems design and development.
3. For Outcome c, the minimum was not satisfied for performance criterion c-1, understand and apply semi-formal modeling languages (59%), but was satisfied for c-5, understand and use verification and validation methods.
Because modeling languages, such as UML, are applied in several courses, e.g., CSC 20, 130, and 131, it is recommended that students be provided with additional assignments utilizing this tool.
4. For Outcome d, the results were mixed, as they were last year. Criterion d-1, competence in programming a commonly used language, was satisfied in CSC 136 (78%) but not in CSC 133 (55%). In the exam question for CSC 133, students were required to write Java code, compared to the CSC 136 question covering scope and parameter passing concepts. In general, Outcome d appears to be marginally satisfied.

Graduate Programs

During Spring 2010, in preparation for the campus-level review of the MS programs in Computer Science and in Software Engineering, a self-study report was completed by the department [2]. Section 2 of the self-study details the department’s assessment efforts for the graduate programs and represents the assessment of our graduate programs’ educational objectives and student learning outcomes. The results from that report [2] are summarized in this section.

Development of Assessment Plans for Objectives and Outcomes

The Department of Computer Science Graduate Committee and Assessment Committee jointly developed the program educational objectives and student learning outcomes for the graduate programs. The list of objectives and outcomes is presented in Appendix C. The assessment plans for our graduate programs can be found in Reference 2, pages 14 and 15.

Assessment of Student Learning Outcomes b. Oral Communication and g. Written Communication

Effective oral presentation was assessed at the department’s biannual Graduate Student Symposium on April 19, 2010. Fourteen graduate students presented their MS projects. A total of 10 faculty members assessed the presentations using a new rubric (see Appendix D). In general, our MS students have effective oral communication skills, with the percentages of students meeting or exceeding criteria between 86% and 100%.
However, emphasis should be placed in CSC 209, Research Methodology, on the importance of providing adequate justification to support the methodologies used in projects; this criterion received 71%, slightly below our minimum standard of 75%.

Effective written communication skills were assessed by evaluating the two major categories of 1) composition and completeness and 2) presentation of technical content, using a new rubric (see Appendix E). MS project reports from previous semesters (3 from Spring 2009 and 17 from Fall 2009) were assessed using the rubric. Fourteen CSC faculty and 4 alumni, who are members of our Industry Advisory Committee, evaluated the reports. With the exception of one report, there were two evaluators for each report. Analysis of the results indicated that students performed very well in the following criteria: structure, paragraph, problem statement, design specification, and development. However, deficiencies appeared in syntax and in the analysis and conclusion sections of the reports. It is recommended that the procedures be refined and the number of evaluators per report be increased for the next review cycle.

Assessment of Student Learning Outcomes a–c and e–h Using Employer Evaluations

Employers were asked to assess the performance of MS student interns who registered for CSC 195 and CSC 295 between Fall 2006 and Fall 2009. Questions asked of employers were related to performance in Outcomes a–c and e–h. Results indicated that all evaluated MS students met or exceeded performance criteria a–c and e–h, with results ranging from 98.67% to 100%.

Assessment of Student Learning Outcomes a–d Technical Content

The same written project reports used to assess writing were assessed for quality of technical content using a checklist (see Appendix F). The evaluators for writing skills also assessed technical content. The results indicated that the minimum standard of 75% was reached for all outcomes.
The technical quality of MS projects was viewed as excellent, with no deficiencies.

As a result of faculty reflection on these results, are there any program changes anticipated?

No major program changes are recommended at this time for either the undergraduate or the graduate programs, although more emphasis in select areas is suggested as indicated, e.g., supporting documentation.

Did your department engage in any other assessment activities, such as the development of rubrics or course alignment?

Two rubrics were developed for our graduate programs: one to assess oral communication (Appendix D) and the other for written communication (Appendix E).

What assessment activities are planned for the upcoming academic year?

The department will fine-tune its undergraduate assessment plans for objectives and outcomes in light of the recommendation from its accrediting body to change from a 3-year to a 2-year cycle. The change may be made by eliminating the assessment of core topics in elective courses, which are not taken by all majors. In general, we are on schedule relative to our assessment plans for our undergraduate and graduate programs.

Appendix A
Assessment of Outcomes – Performance Criteria and Core Courses
January 21, 2010 (revised 3/16/2010)

Outcome (a): Apply knowledge of mathematics, algorithmic principles, computer theory, and principles of computing systems in the modeling and design of computer-based systems that demonstrate an understanding of tradeoffs involved in design choices.

a-1. Understand and apply fundamental algorithms.(1) Core: CSc 130. Electives: CSc 148.
a-2. Understand and use appropriately essential data structures.(1) Core: CSc 130. Electives: CSc 151, 152, 165, 174, 180.
a-3. Understand tradeoffs in the selection of algorithms and data structures. Core: CSc 130, CSc 133, CSc 190/191. Electives: CSc 148.
a-4. Demonstrate knowledge of abstract machines, languages, and grammars. Core: CSc 132.
a-5. Understand and use relational databases. Core: CSc 134.
a-6. Understand predicate calculus and logic programming. Core: CSc 136.
a-7. Understand the functional programming paradigm. Core: CSc 136.
a-8. Understand layers of communication protocols. Core: CSc 138.
a-9. Understand concurrency and resource management. Core: CSc 139.

(1) See CSC 130 course description.

Outcome (b): Analyze a problem, specify the requirements, design, implement, and evaluate a computer-based system, process, component, or program that satisfies the requirements.

b-1. Understand and apply modeling and analysis techniques. Core: CSc 190/191.
b-2. Understand and apply the requirements engineering process. Core: CSc 190/191.
b-3. Understand and apply design principles. Core: CSc 131, 190/191.
b-4. Understand and apply proper testing techniques. Core: CSc 190/191.
b-5. Understand and apply project management processes and tools. Core: CSc 190/191.
b-6. Demonstrate the ability to design and analyze hardware components such as processors and memory devices. Core: CSc 137.
b-7. Understand modern computer architectures. Core: CSc 137.
b-8. Understand and apply process synchronization principles. Core: CSc 139. (3/16/2010)

Outcome (c): Apply design and development principles in the construction of software systems of varying complexity.

c-1. Understand and apply semi-formal modeling languages such as UML. Core: CSc 133.
c-2. Understand and use object-oriented design. Core: CSc 133.
c-3. Understand and use design patterns. Core: CSc 133.
c-4. Understand and use structured analysis. Core: CSc 190/191.
c-5. Understand and use verification and validation techniques. Core: CSc 131, CSc 190/191.
c-6. Understand and use software metrics. Core: CSc 131.
c-7. Understand software maintenance and prepare for it. Core: CSc 190/191.
c-8. Understand and apply documentation standards. Core: CSc 190/191.

Outcome (d): Use current skills, techniques, and tools necessary for computing practice.

d-1. Demonstrate competence to program in commonly used languages such as C++ or Java. Core: CSc 133, CSc 136.
d-2. Demonstrate proficiency in using programming development tools. Core: CSc 133.
d-3. Demonstrate competence in using system libraries. Core: CSc 133.
d-4. Demonstrate proficiency in using hardware description languages. Core: CSc 137.
d-5. Demonstrate competence in using SQL. Core: CSc 134.
d-6. Demonstrate competence in applying regular expressions, grammars, and automata. Core: CSc 132, 136.

Outcome (e): Function effectively as a member of a team to accomplish a common goal.

e-1. Cooperate and collaborate as a team member. Core: CSc 131, 190/191. Electives: CSc 148, CSc 165.
e-2. Communicate and listen; keep teammates informed. Core: CSc 131, 190/191.
e-3. Face conflicts and resolve differences. Core: CSc 131, 190/191.
e-4. Contribute equally as a participant in the project. Core: CSc 131, 190/191.

Outcome (f): Understand professional, ethical, legal, social, and security issues and responsibilities; analyze the impact of computing on individuals, organizations, and society both locally and globally.

f-1. Know, understand, and practice professional codes of conduct (i.e., the ACM Code of Ethics and Professional Conduct, the IEEE Code of Ethics, and the ACM/IEEE Software Engineering Code of Ethics and Professional Practice). Core: Phil 103, CSc 190/191.
f-2. Be able to evaluate the ethical dimensions of a computer solution to a problem. Core: Phil 103, CSc 190/191.
f-3. Understand the need for and use of proper security measures. Core: CSc 138.
f-4. Understand moral/ethical issues in resolving an ethical/moral conflict. Core: Phil 103.

Outcome (g): Write effectively.

g-1. Use language and technical level appropriate for the audience. Core: CSc 190/191.
g-2. Demonstrate an organizational pattern that is logical and conveys completeness. Core: CSc 190/191.
g-3. Use the rules of standard English. Core: CSc 190/191.
g-4. Provide adequate detail to support the solution. Core: CSc 190/191.

Outcome (h): Speak effectively.

h-1. Identify main points clearly and present them concisely. Core: CSc 131, CSc 190/191.
h-2. Demonstrate good organization. Core: CSc 131, CSc 190/191.
h-3. Attract and hold the interest of the audience. Core: CSc 131, CSc 190/191.
h-4. Present the material effectively with confidence. Core: CSc 131, CSc 190/191.
h-5. Maintain eye contact. Core: CSc 131, CSc 190/191.
h-6. Speak clearly and distinctly. Core: CSc 131, CSc 190/191.

Outcome (i): Recognize the need for, and an ability to engage in, continuing professional development.

i-1. Demonstrate the ability to identify, evaluate, and utilize opportunities and resources to learn new material not covered in classes. Graduating seniors and recent alumni; CSc 192, 194, 195, 199.
i-2. Demonstrate the ability to recognize continuing education opportunities and the importance of life-long learning to professional success. Graduating seniors and recent alumni; CSc 192, 194, 195, 199.

Appendix B
Assessment of Student Learning Outcomes / Performance Criteria

Outcome a. Apply knowledge of mathematics, algorithmic principles, computer theory, and principles of computing systems in the modeling and design of computer-based systems that demonstrate an understanding of tradeoffs involved in design choices.

Performance Criterion | Course | % Students Meeting or Exceeding Criterion
a-1. Understand and apply fundamental algorithms | 130 | 87%
a-2. Understand and use appropriately essential data structures | 130 | 41%
a-4. Demonstrate knowledge of abstract machines, languages, and grammars | 132 | 76%
a-5. Understand and use relational databases | 134 | 80%
a-6. Understand predicate calculus and logic programming | 136 | 74%
a-7. Understand the functional programming paradigm | 136 | 58%
Average: 69%

Outcome b. Analyze a problem, specify the requirements, design, implement, and evaluate a computer-based system, process, component, or program that satisfies the requirements.

Performance Criterion | Course | % Students Meeting or Exceeding Criterion
b-3. Understand and apply design principles | 131 | 87%
b-6. Demonstrate the ability to design and analyze hardware components, such as processors and memory devices | 137 | 92%
Average: 90%

Outcome c. Apply design and development principles in the construction of software systems of varying complexity.
Performance Criterion | Course | % Students Meeting or Exceeding Criterion
c-1. Understand and apply semi-formal modeling languages, such as UML | 133 | 59%
c-5. Understand and use verification and validation methods | 131 | 77%
Average: 68%

Outcome d. Use current skills, techniques, and tools necessary for computing practice.

Performance Criterion | Course | % Students Meeting or Exceeding Criterion
d-1. Demonstrate competence in programming in commonly used languages, such as C++ or Java | 133 | 55%
d-1. Demonstrate competence in programming in commonly used languages, such as C++ or Java | 136 | 78%
d-6. Demonstrate competence in applying regular expressions, grammars, and automata | 132 | 77%
Average: 70%

Appendix C
From the Department of Computer Science Graduate Program Self-Study Report

2.1 Program Educational Objectives

The Graduate Curriculum Committee and the Assessment Committee of the department developed the graduate program educational objectives listed below. Graduates of the MS programs will

1. Demonstrate advanced proficiency in the design, development, maintenance, and support of computing systems.
2. Be effective and contributing members of project teams.
3. Engage in the pursuit of professional development opportunities, and/or pursue advanced degrees.
4. Assume leadership roles in their chosen career and profession.
5. Write effectively.
6. Have effective oral communication skills.
7. Abide by the ethical standards of the profession and understand the ethical, social, and global implications of their professional activities.

2.2 Student Learning Outcomes

The Graduate Curriculum Committee and the Assessment Committee developed the following student learning outcomes for the graduate programs. At the time of graduation, MS students will be able to

a. Apply advanced knowledge of mathematics, algorithmic principles, computer theory, and principles of computing systems in the modeling and design of computer-based systems that demonstrate an understanding of tradeoffs involved in design choices.
b. Analyze a problem, specify the requirements, design, implement, and evaluate a computer-based system, process, component, or program that satisfies the requirements.
c. Apply design and development principles in the construction of software systems of varying complexity.
d. Develop and apply skills, techniques, and tools necessary for computing practice.
e. Contribute effectively as members of a team to accomplish a common goal.
f. Understand professional, ethical, legal, social, and security issues and responsibilities; analyze the impact of computing on individuals, organizations, and society both locally and globally.
g. Write effectively.
h. Speak effectively.
i. Recognize the need for and engage in continuing professional development.

Appendix D
Oral Communication Rubric for MS Project/Thesis Presentations

Date: __________________________  Project/Thesis #: _________________________
Evaluator: [ ] Faculty  [ ] Instructor  [ ] Student  [ ] Alumni  [ ] Industry

Rating scale: 4 Exceeds Criteria, 3 Meets Criteria, 2 Progressing to Criteria, 1 Below Expectations.

Organizational pattern (introduction and conclusion, sequenced material within the body, and transitions):
4: Is clearly and consistently observable, is skillful, and makes the content of the presentation cohesive.
3: Is consistently observable in the presentation.
2: Is intermittently observable in the presentation.
1: Is not observable in the presentation.

Language choices:
4: Are captivating and compelling, and enhance the effectiveness of the presentation.
3: Are adequate and generally support the effectiveness of the presentation.
2: Are limited and partially support the effectiveness of the presentation.
1: Are inappropriate and adversely impact the effectiveness of the presentation.

Delivery techniques (visual aids, question handling, posture, gesture, eye contact, and vocal expression):
4: Make the presentation compelling. Speaker appears polished and confident.
3: Make the presentation interesting. Speaker appears comfortable.
2: Make the presentation understandable. Speaker appears tentative.
1: Make the presentation difficult to understand. Speaker appears uncomfortable.

Supporting materials (background and related work, explanations, examples, illustrations, statistics, analogies, quotations from relevant authorities):
4: A variety of supporting materials provided. Makes appropriate reference to information or analysis that significantly supports the presentation and demonstrates a thorough knowledge of the problem area.
3: Adequate supporting materials provided. Makes appropriate reference to information or analysis that generally supports the presentation and demonstrates a good knowledge of the problem area.
2: Some supporting materials provided. Makes reference to information or analysis that partially supports the presentation and shows understanding of some issues of the problem area.
1: No supporting materials provided. Makes reference to irrelevant information or analysis and demonstrates a lack of understanding of the problem area.

Communication of technical content (project/thesis objectives are precisely stated, appropriately repeated, logically reasoned, and strongly supported):
4: Communication is compelling. Arguments are presented persuasively and logically.
3: Communication is clear. Arguments are adequate.
2: Communication is not convincing. Arguments are lacking.
1: Communication is poor and ineffective. Arguments are nonexistent.

Appendix E
Written Communication Rubric for MS Projects/Theses

Date: ____________________  Project/Thesis #: ____________________
Evaluator: [ ] Faculty  [ ] Industry  [ ] Student  [ ] Alumni

Table 1. Evaluation of composition and completeness
Rating scale: 4 Exceeds Criteria, 3 Meets Criteria, 2 Progressing to Criteria, 1 Below Expectations, NA.

Structure. This section evaluates the formal structure of the project/thesis, including the organization of sections and subsections. Reports should have a title and a table of contents showing logical sections and subsections.
4: The report is well organized and maintains a consistent style. Transitions are logical and smooth.
3: Report is organized with a reasonable flow of ideas. Most transitions are logical and smooth.
2: Report is somewhat organized. Transitions are not always logical and smooth.
1: Report is not organized. Little sense of wholeness and completeness. Poor transitions.

Syntax, sentence structure, and conventions of standard English. This section evaluates the author's use of language to clearly communicate ideas. Spelling and grammar are included in the evaluation.
4: Words are chosen with care in consideration of fine differences in meaning. Correct syntax, spelling, and grammar.
3: Sentence structure usually conveys the intended meaning. In general, there are few errors in syntax, spelling, and/or grammar.
2: Sentence structure sometimes conveys confusing meanings, but the intent can still be discerned from the context. A number of errors in syntax, spelling, and/or grammar.
1: Sentence structure conveys misleading meanings. Many errors in syntax, spelling, and/or grammar.

Paragraph Structure. This section evaluates the author's integration of sentences into meaningful paragraphs. Please evaluate the report with respect to the following description of a well-written paragraph: The first sentence of a paragraph establishes some perspective for the remainder of the paragraph (e.g., a topic sentence or a transitional sentence). Within a paragraph, sentences are relevant to the paragraph and are in a logical order. Near the end of the paragraph, there is some statement that unifies or completes the ideas presented in that paragraph.
4: Paragraphs are on topic and understandable. Stylistic variations show command of language.
3: Most paragraphs are on topic and understandable, with some errors. Although there may be some loss of focus, paragraphs are reasonably written.
2: Some paragraphs indicate good structure, but often paragraphs do not show unifying thought and logic. Sentences within paragraphs seem to be related.
1: Paragraphs are confusing, with unclear topic and meaning.

Table 2. Presentation of technical content
This is an evaluation of writing skills as used to convey technical content, not an evaluation of the perceived difficulty of the project. Consider whether the student has effectively communicated the attributes of the project. If any of the following aspects does not apply, then mark NA.

Problem Statement. This section evaluates the problem statement. A problem statement describes the purpose of the work (i.e., the need being addressed) as well as how the project results will accomplish that purpose.
4: Objective, nature of challenges, and value of the project are clearly established.
3: Objective, nature of challenges, and value of the project are adequately stated.
2: Some significant aspects of the objective, nature of challenges, and value of the project are missing.
1: Significant aspects of the objective, nature of challenges, and value of the project are missing.

Background and Related Work (Research). This section provides support for the project/thesis by identifying and citing background and related work.
4: Background and related work are extensively identified.
3: Background and related work are adequately identified.
2: Limited background and related work are identified.
1: No background and related work are identified.

Design Requirements. This section includes specifications of functional and/or non-functional requirements.
4: Specifications are complete. Appropriate design constraints have been identified.
3: Specifications are fairly complete. Most design constraints have been identified.
2: Some specifications are missing. Some design constraints are not identified.
1: Requirements are not specified. Design constraints are not identified.

Development Process. In this section, students document their development process. The purpose is not to write a history of the project, but to document key development decisions and the factors that should be considered in making those decisions. It is possible that this section will recommend to the reader an improvement over the development process that was actually followed.
4: Key development decision alternatives are well identified and/or compared. Reasoning shows a deep understanding of the problem area.
3: Key development decision alternatives are adequately identified and/or compared. Reasoning shows a good understanding of the problem area.
2: Limited key development decision alternatives are identified and/or compared. Reasoning shows a limited understanding of the problem area.
1: Key development decision alternatives are not identified and compared. Reasoning does not show an understanding of the problem area.

Analysis of Project Results. In this section, do not evaluate how far the student has developed the project, but evaluate whether you understand what has been accomplished in the project on the basis of data analysis and performance results.
4: All important aspects of the performance of the project are described with measured results or precise evaluative statements. The implementation of specified requirements is fully analyzed and verified.
3: Most important aspects of the performance of the project are described with measured results or evaluative statements. The implementation of specified requirements is adequately analyzed and verified.
2: Some aspects of the performance of the project are described with measured results or evaluative statements. The implementation of specified requirements is minimally analyzed and verified.
1: No aspect of the performance of the project is described with measured results or evaluative statements. The implementation of specified requirements is not analyzed and verified.

Conclusion. Evaluate how well the report summarizes and evaluates the major efforts involved in the project and discusses future work.
4: Conclusion succinctly describes the accomplishments of the effort and relates them to the original problem. Future work is fully discussed.
3: Conclusion clearly describes most of the accomplishments and relates them to the original problem statement. Future work is reasonably well discussed.
2: Conclusion describes some of the accomplishments and relates them to the original problem statement. Discussion of future work is very limited.
1: No clear summary of the project. No discussion of future work.

Appendix F
Technical Content Evaluation for MS Projects/Theses

Date: __________________________  Project/Thesis #: _________________________
Evaluator: [ ] Faculty  [ ] Industry  [ ] Student  [ ] Alumni

a. Apply advanced knowledge of mathematics, algorithmic principles, computer theory, and principles of computing systems in the modeling and design of computer-based systems that demonstrate an understanding of tradeoffs involved in design choices.
Exceeds Criteria / Meets Criteria / Progressing to Criteria / Below Expectations / NA*

b. Analyze a problem, specify the requirements, design, implement, and evaluate a computer-based system, process, component, or program that satisfies the requirements.
Exceeds Criteria / Meets Criteria / Progressing to Criteria / Below Expectations / NA*

c. Apply design and development principles in the construction of software systems of varying complexity.
Exceeds Criteria / Meets Criteria / Progressing to Criteria / Below Expectations / NA*

d. Develop and apply skills, techniques, and tools necessary for computing practice.
Exceeds Criteria / Meets Criteria / Progressing to Criteria / Below Expectations / NA*

* Mark NA only when an aspect does not apply.

References

1. Report on Alumni Survey for Objective Assessment, Department of Computer Science, College of Engineering and Computer Science, California State University, Sacramento, June 16, 2010.
2. 2009-2010 Graduate Program Self-Study Report, Department of Computer Science, California State University, Sacramento, June 7, 2010.