Rensselaer at Hartford
Department of Engineering and Sciences
CISH-6050: Software Engineering Management
Summer 2012 Syllabus

Instructor: Houman Younessi
Phone Number: 860-548-7880
Email: youneh@rpi.edu
Starting Date: Monday, 5/07/2012
Office Location: Classroom or Office #700
Office Hours: Mondays, 4:00 PM, or by appointment
Class Hours: 5:30 PM to 8:30 PM

Prerequisite: CISH-4020, Object Structures (ECSE 6770 SE1 recommended but not mandatory)

Course Objectives:

The main objective of this course is to construct a solid foundation for the understanding and application of the principles, techniques, and methodologies for managing software development. To that end, the software capability maturity model and its related levels, key process areas, and assessments will be explored. Additional topics will include current issues in software engineering management, the software crisis, the current state of the practice, modeling the software engineering process, software metrics and estimation, software validation, software engineering economics, process improvement, and risk management. Upon successful completion of this course, the student should have a good understanding of software management fundamentals and be able to identify and use an array of principles, techniques, and methodologies to effectively manage a software development project.

Required Textbooks:

There are no required textbooks for the student to purchase for this course. However, there will be assigned readings, most of which will be available from the Web.

Reference Textbooks:

For reference, a number of textbooks were used in preparing this course material. In particular, if a student needs additional background information on material covered in the course, he/she may wish to consult the following books:

T1. W. S. Humphrey, Managing the Software Process, Addison-Wesley, Reading, MA, 1989
T2. M. C. Paulk, C. V. Weber, B. Curtis, M. B. Chrissis, The Capability Maturity Model: Guidelines for Improving the Software Process, Addison-Wesley Longman, Inc., Boston, MA, 1995
T3. R. S. Pressman, Software Engineering: A Practitioner's Approach, 5th ed., McGraw-Hill, New York, 2001
T4. I. Sommerville, Software Engineering, 5th ed., Addison-Wesley, Reading, MA, 1995
T5. N. E. Fenton, S. L. Pfleeger, Software Metrics: A Rigorous and Practical Approach, 2nd ed., International Thomson Computer Press, Boston, MA, 1996
T6. R. L. Glass, Software Runaways: Lessons Learned from Massive Software Project Failures, Prentice Hall PTR, Upper Saddle River, NJ, 1998
T7. H. Younessi, Object-Oriented Defect Management of Software, Prentice-Hall, 2002
T8. I. Graham, B. Henderson-Sellers, H. Younessi, OPEN Process Specification, Addison-Wesley, 1997
T9. B. Henderson-Sellers, A. Simons, H. Younessi, OPEN Toolbox of Techniques, Addison-Wesley, 1998

Web Sites:

The students should be familiar with the following Web sites, which contain information pertinent to this course:

W1. Software Productivity Consortium: http://www.software.org
W2. Carnegie Mellon University, Software Engineering Institute: http://www.sei.cmu.edu
W3. Software Quality Institute (SPICE): http://www.sqi.gu.edu.au
W4. Software Technology Support Center: http://www.stsc.hill.af.mil
W5. The Cole Library (R@H) Databases & Journals: http://www.rh.edu/library/dbs.htm

Assigned Readings:

The student should read and be prepared to discuss the following articles for the specific lecture identified in the course schedule below. Additional articles may be assigned.
R1. W. S. Humphrey, "The Changing World of Software", Software Engineering Institute, Carnegie Mellon University, December 2001. Available at http://www.sei.cmu.edu/publications/articles/watts-humphrey/changing-world-sw.html

R2. W. S. Humphrey, "A Process or A Plan?", Software Engineering Institute, Carnegie Mellon University, December 2001. Available at http://www.sei.cmu.edu/publications/articles/watts-humphrey/process-or-plan.html

R3. W. S. Humphrey, "The Process Bureaucracy", Software Engineering Institute, Carnegie Mellon University, December 2001. Available at http://www.sei.cmu.edu/publications/articles/watts-humphrey/process-bureaucracy.html

R4. S. A. Sheard, "Evolution of the Frameworks Quagmire", IEEE Computer, July 2001, pp. 96-98. Available at http://www.software.org/pub/externalpapers/Evolution_Frameworks_Quagmire.pdf

R5. S. A. Sheard, J. G. Lake, "Systems Engineering Standards and Models Compared", Proceedings of the International Council on Systems Engineering, Vancouver, British Columbia, 1998. Available at http://www.software.org/pub/externalpapers/9804-2.html

R6. S. A. Sheard, G. J. Roedler, "Interpreting Continuous-View Capability Models for Higher Levels of Maturity", Systems Engineering, Vol. 2, No. 1, 1999. Available at http://www.software.org/pub/externalpapers/contarch/index.html

R7. M. Phillips, S. Shrum, "Creating an Integrated CMM for Systems and Software Engineering", CrossTalk, September 2000, pp. 26-27. Available at http://www.stsc.hill.af.mil/crosstalk/2000/09/phillips.html

R8. B. Pierce, "Is CMMI Ready for Prime Time?", CrossTalk, July 2000, pp. 21-24. Available at http://www.stsc.hill.af.mil/crosstalk/2000/07/pierce.html

R9. L. Carter, C. Graettinger, M. Patrick, G. Wemyss, S. Zasadni, "The Road to CMMI: Results of the First Technology Transition Workshop", Software Engineering Institute, Carnegie Mellon University, February 2002. Available at http://www.sei.cmu.edu/publications/documents/02.reports/02tr007.html

R10. S. A. Sheard, L. Ibrahim, S. Rose, W. Makhlouf, "Two Approaches to CMM-Integration", Software Productivity Consortium, April 1998. Available at http://www.software.org/pub/externalpapers/9804-1.html

R11. M. Paulk, "Practices of High Maturity Organizations", Software Engineering Institute, Carnegie Mellon University, February 2002. Available at http://www.sei.cmu.edu/publications/articles/paulk/high.maturity.survey.html

R12. M. Paulk, B. Curtis, M. B. Chrissis, C. V. Weber, "Capability Maturity Model for Software, Version 1.1", Software Engineering Institute, Carnegie Mellon University, February 1993. Available at http://www.sei.cmu.edu/publications/documents/93.reports/93.tr.024.html

R13. D. Dunaway, R. Berggren, G. des Rochettes, P. Iredale, I. Lavi, G. Taylor, "Why Do Organizations Have Assessments? Do They Pay Off?", Software Engineering Institute, Carnegie Mellon University, July 1999. Available at http://www.sei.cmu.edu/publications/documents/99.reports/99tr012/99tr012abstract.html

R14. M. Keil, D. Robey, "Blowing the Whistle on Troubled Software Projects", Communications of the ACM, Vol. 44, No. 4, April 2001, pp. 87-93. PDF file (ACMp87-Keil.pdf) available online via the ACM Digital Library when accessed from the RPI at Hartford library.

R15. ISO/IEC TR 15504:1998 - Software Process Assessment (SPICE) documentation suite. Available at http://www.sqi.gu.edu.au/spice/suite/download.html

R16. M. Paulk, D. Goldenson, D. White, "The 1999 Survey of High Maturity Organizations", Software Engineering Institute, Carnegie Mellon University, February 2000. Available at http://www.sei.cmu.edu/publications/documents/00.reports/00sr002.html
R17. S. A. Sheard, H. Lykins, J. R. Armstrong, "Overcoming Barriers to Systems Engineering Process Improvement", Software Productivity Consortium, 1999. Available at http://www.software.org/pub/externalpapers/sepi_barriers.html

R18. S. A. Sheard, "Fifty Ways to Save Your Budget: Reduced Cost Systems Engineering Process Improvement", Software Productivity Consortium, 1999. Available at http://www.software.org/pub/externalpapers/50ways.html

R19. D. Goldenson, J. Herbsleb, "After the Appraisal: A Systematic Survey of Process Improvement, its Benefits, and Factors that Influence Success", Software Engineering Institute, Carnegie Mellon University, August 1995. Available at http://www.sei.cmu.edu/publications/documents/95.reports/95.tr.009.html

WebCT and Listserv:

For this course, we will not be using the LMS for electronic communications. Distribution of course lecture notes, assignments, and class discussions will be handled through the class listserv. Work commitments and business trips will be accepted as reasons for absence from class, but they are not an acceptable basis for requesting extensions to work that must be submitted.

Oral Presentation:

Good communication skills, both verbal and written, are essential for effective management. Each student will be required to prepare and give an oral presentation to the class. The actual length of the presentation will be determined by the number of students enrolled in the class; a typical presentation should run between 5 and 8 minutes. The presentations are scheduled for 6/25/2012; students will be allocated time slots within which to give their oral presentations. More details on the presentation format, constraints, and grading will be discussed in class. Students will submit their presentation topic and a brief overview to the instructor two weeks prior to the presentation (on 6/11/2012) for review and approval.

In addition to demonstrating effective communication skills, a key result of the student presentations should be to provide other members of the course with beneficial and pertinent information in the Software Engineering and Management field. To that end, students have the option of selecting a topic from one of two general areas:

1. Lessons Learned in Software Engineering Management: A topic in this area should cover some aspect of a software development project in or near crisis and the lessons that the student learned from dealing with that crisis. For example, the discussion could be about a project with cost over-runs, schedule delays, scope creep, etc., and the approach taken to mitigate the issues. Or, the student could describe process changes in their work environment that were implemented to help prevent a software development crisis. The discussion can be based on the student's personal/professional experiences or on a case study documented in current literature. The student should select a topic that will be both informational and beneficial for the other members of the course. The discussion should not be a direct repetition of a software crisis or lessons learned that have already been discussed in class.
2. Current Topics in Software Engineering Management: A topic in this area should cover an aspect of Software Engineering Development and/or Management that supports process improvement, proper software development, software crisis prevention, etc. The presentation should educate other members of the course on topics that are new or emerging in Software Engineering, topics under debate in current literature, or topics that were presented in class only at a high level. Past presentations in this area have included establishing a software measurements plan, legal issues associated with software development, RUP and CMM Levels 2 & 3, the Personal Software Process, knowledge management, extreme programming, etc. The topic should have a direct correlation to software development and management and, if the topic was already covered in class, the student should offer a more detailed view, a different angle, a practical application, etc. of that topic.

Team Project:

Key focus areas of the Software Engineering Management course are software development, the capability maturity model, Key Process Areas (KPAs), and the maturity assessment process. To provide students with an opportunity to further explore these areas, the class will be divided into teams that will perform a maturity assessment. Following the Software Process Improvement and Capability dEtermination (SPICE) International Standard (ISO/IEC TR 15504), each team will create an assessment tool, perform an assessment of a team member's software development organization, and present the assessment results to the class.

The number of teams and the actual team size will be determined by the number of students in the class. Typically, we would expect 4 to 6 members on each team. Due to time constraints, the number of teams, and team size, each team will be assigned only a subset of the total Key Process Areas (KPAs) on which to perform its assessment. Additionally, the teams will need to ensure that at least one person on each team has access to a software development organization (i.e., a place of employment) on which to perform the assessment.

The SPICE International Standard will be used as the basis for developing the assessment tool and performing the actual assessment. Each team will be responsible for obtaining the SPICE documentation suite and adhering to the requirements specified in the SPICE documentation. The SPICE documentation can be downloaded free of charge from the Software Quality Institute web site at the following location: http://www.sqi.gu.edu.au/spice/suite/download.html

Each team will be responsible for the following deliverables for this project. Additional details on the project deliverables, documentation constraints, presentation format, and grading will be discussed in class.

Due date for the first set of deliverables: Week 8
Team write-up #1: Documents the organization being assessed, the assessment tool, and the approach to the assessment.
Team presentation #1: Assessment tool, organization being assessed, and assessment approach.
Individual/peer evaluation #1: Evaluation of peer contributions to the team and project deliverables.

Due date for the second set of deliverables: Week 11
Team write-up #2: Documents the results, findings, and recommendations of the assessment.
Team presentation #2: Assessment results, findings, and recommendations.
Individual/peer evaluation #2: Evaluation of peer contributions to the team and project deliverables.
Grading:

Grading for this course is based on a weighted sum of the Z-scores for the Oral Presentation and the Team Project. A Z-score is computed as follows:

Student Z-score = [(your score) - (class mean or median score)] / (class standard deviation)

Note: The mean score is normally used. However, the median score will be used if there is an abnormally high or low score.

General Grading:
A: Z-score > +1.00
B: -1.00 < Z-score < +1.00 (at least)
C: -2.00 < Z-score < -1.00
F: Z-score < -2.00

Weighting:
Oral Presentation: 30%
Team Project: 60%
Class Participation: 10%
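To make the mechanics of this scheme concrete, the short Python sketch below works through the calculation with entirely hypothetical scores; it is illustrative only and is not part of the official grading procedure. It computes a Z-score per graded component, combines them with the 30/60/10 weighting above, and maps the result to the letter-grade cutoffs.

# Illustrative sketch only: all scores below are hypothetical.
from statistics import mean, median, stdev

def z_scores(scores, use_median=False):
    """Convert raw scores to Z-scores against the class mean (or median)."""
    center = median(scores) if use_median else mean(scores)
    spread = stdev(scores)
    return [(s - center) / spread for s in scores]

def letter_grade(z):
    """Map a combined Z-score to a letter grade using the cutoffs above."""
    if z > 1.00:
        return "A"
    if z >= -1.00:
        return "B"
    if z >= -2.00:
        return "C"
    return "F"

# Hypothetical class results, one entry per student, for each graded component.
presentation = [82, 90, 75, 88]
project = [78, 85, 92, 70]
participation = [95, 80, 85, 90]

# Weighted sum of per-component Z-scores: 30% presentation, 60% project, 10% participation.
combined = [0.30 * zp + 0.60 * zt + 0.10 * zc
            for zp, zt, zc in zip(z_scores(presentation),
                                  z_scores(project),
                                  z_scores(participation))]
print([letter_grade(z) for z in combined])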
Attendance:

Students are expected to attend all scheduled classes for the courses in which they are enrolled and are expected to conduct themselves with consideration and regard for their fellow students.

Academic Integrity:

Student-teacher relationships are built on trust. For example, students must trust that the teachers have made appropriate decisions about the structure and the content of the courses they teach, and teachers must trust that the assignments that students turn in are their own. Acts that violate this trust undermine the educational process. As stipulated in the Rensselaer at Hartford Student Handbook, all members of the Rensselaer at Hartford community are expected to assume responsibility for honor, honesty, and integrity in their academic work. Violators of this code of behavior are subject to the Grievance and Hearing Procedure as outlined in the Handbook. Penalties for students found in violation of this policy include dismissal from Rensselaer at Hartford. All individual work, including presentations, must be the work of the individual who is presenting the work for assessment. The Rensselaer at Hartford Student Handbook can be found at http://www.rh.edu/publications/handbook/current/policies.html.

Course Schedule:

Following is the tentative schedule that we will follow for this course. We will allow for flexibility in moving a topic from one class to the next.

Session 1 (05/07)
Topics: Review Course Syllabus; Introductions; The Software Crisis; Software Engineering Basics/Definitions; Software Process and Product Quality
Reading: R1, R2, R3: Humphrey

Session 2 (05/14)
Topics: Software Process Maturity (Part 1): Background; Frameworks Quagmire (S. Sheard); Standards; SPICE; Form Project Teams
Reading: R4: Sheard; R5: Sheard, Lake

Session 3 (05/21)
Topics: Software Process Maturity (Part 2): Software Capability Maturity Model (SW-CMM); Systems Engineering Maturity Model; Staged vs. Continuous Architecture
Reading: R6: Sheard, Roedler

05/28: No Class

Session 4 (06/04)
Topics: Software Process Maturity (Part 3): Integrating SW and SE CMM Models (CMMI, FAA-iCMM, STRICOM SAE-CMM); Software Process and Project Management (Part 1) [CMM Level 2: Repeatable]: Requirements Management; Project Tracking & Oversight; Risk Management
Reading: R7: Phillips, Shrum; R8: Pierce; R9: Carter, et al. (except Appendix A, B, C); R10: Sheard, et al.

Session 5 (06/11)
Topics: Software Process and Project Management (Part 2) [CMM Level 2: Repeatable]: Project Planning; Project Estimation; Project Scheduling; Software Quality Assurance; Software Configuration Management; Subcontract Management

Session 6 (06/18)
Topics: Software Process Definition and Modeling [CMM Level 3: Defined]: Process Definition; Software Process Models (Waterfall, V-Model, Spiral; Rational Unified Process (RUP); OPEN; Evolutionary, Prototype, OO, RAD; Cleanroom, 4GT); Education/Training; Peer Reviews; Software Process Assessment; SW-CMM; Organizational gains from process assessments; Team Discussions/Questions on Assessment Project
Reading: R11: Paulk; R12: SW-CMM, Section 4; R13: Dunaway, et al.; R14: Keil, Robey; Reference: R15: SPICE, Parts 3, 4, 5, & 6

Session 7 (06/25)
No Class

Session 8 (07/02)
Due Date: Team Project Presentation & Write-up #1

Session 9 (07/09)
Topics: Managed and Measured Process [CMM Level 4: Managed]: The managed process; Data collection; Managing software quality
Reading: R16: Paulk, et al., pp. 1-11

Session 10 (07/16)
Topics: Optimizing the Software Process [CMM Level 5: Optimizing]: Defect Prevention; Automating the software process; Software process change; Software Process Improvement: Gap Analysis; Phases of process improvement; Process improvement barriers; Reducing the expense of process improvement; Process Improvement Payoffs
Reading: R17: Sheard, et al.; R18: Sheard; R19: Goldenson, Herbsleb, pp. 1-28; Reference: R15: SPICE, Part 7

Session 11 (07/23)
Due Date: Team Project Presentations & Write-up #2 on assessment results, findings, & recommendations

Session 12 (07/30)
Team Project Presentations (Continued)