Report on: Workshop on Process Modeling Technologies
Lee Osterweil, University of Massachusetts
Sue Koolmanojwong, USC-CSSE

Problem: How to Sort Through a Profusion of Approaches to Software Process?

Monday's Workshop
• Presentation of four approaches
• Discussion of how to sort through the alternatives

Phase 1: Presentation of Four Different Technologies
• Eclipse Process Framework Composer (EPFC) / Rational Method Composer (RMC)
• System Dynamics
• Object Petri Nets
• Little-JIL

Modeling Software Engineering Processes Using Eclipse Process Framework Composer (EPFC) / Rational Method Composer (RMC)
Molly Phongpaibul and Sue Koolmanojwong, March 17, 2008

Who Uses EPFC/RMC? (source: www.eclipse.org/epf)
• Management: requires realistic consistency, viable governance, and improved ROI
• Process authors: produce base methods and plug-ins
• Service providers: provide training, consulting, mentoring, and adoption services
• Process coaches: perform tailoring, publishing, support, and training
• Professionals: desire simplicity, templates, examples, and guidance
• Tool providers: want to build tools, sell tools, and sell services
• Academia: needs teachable material, ways to teach process development, use in student projects, and a path for bringing research into the mainstream

Process Representation
• Process elements are created and edited in a form-based editor (figure).

One Key Advantage: Scalability
• A method content repository typically contains:
  – Hundreds of work products
  – 30-50 roles
  – 1,000+ tasks
  – Around 100 delivery processes
• Content layers from open-source practices up through RUP, commercial extensions (e.g., IBM Global Services), and company-proprietary extensions.

Other Advantages
• Reusability
• Compatibility
• Universality

Modeling System and Software Engineering Processes with System Dynamics
Ray Madachy, USC Center for Systems and Software Engineering (madachy@usc.edu)
Annual Research Review, March 17, 2008

System Dynamics Notation
• A system is represented by x'(t) = f(x, p), where x is a vector of levels (state variables) and p is a set of parameters.
• Notation (figure legend): levels, sources/sinks, rates, information links, and auxiliary variables.
• Example system (figure): a defect generation rate feeds a level of undetected defects, which drains through a defect detection rate (governed by an auxiliary defect detection efficiency) into detected defects, or through a defect escape rate. A minimal simulation sketch of this example appears at the end of this section.

Dynamic ODC COQUALMO
• (Figure: the portion of the model for Requirements Completeness defects only.)

Dynamic ODC COQUALMO Sample Outputs
• (Figure: example of applying increased V&V for Execution Testing and Tools at 18 months.)

Some Advantages
• Rests on established, respected work (Jay Forrester, 1950s)
• Is a macro approach
• Can address the highest-level issues
• Yields nice analytic answers
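To make the notation concrete, here is a minimal sketch of the defect-flow example above, integrated with Euler's method in Python. It is not from the talk: the rate constants, time step, and structure are invented for illustration, and the actual Dynamic ODC COQUALMO model is far more detailed.

```python
# Minimal sketch of the defect-flow example: levels (state variables)
# accumulate rates (flows), i.e., x'(t) = f(x, p). All parameter values
# are invented for illustration.

def simulate(months: float = 24.0, dt: float = 0.25):
    undetected, detected, escaped = 0.0, 0.0, 0.0   # levels
    generation_rate = 40.0       # defects/month entering from a source
    detection_efficiency = 0.6   # auxiliary variable feeding the rates
    turnover = 0.5               # fraction of undetected pool processed/month

    t = 0.0
    while t < months:
        # Rates are recomputed from the current levels at every step.
        detection_rate = turnover * detection_efficiency * undetected
        escape_rate = turnover * (1.0 - detection_efficiency) * undetected

        # Euler integration: each level accumulates its net flow.
        undetected += (generation_rate - detection_rate - escape_rate) * dt
        detected += detection_rate * dt
        escaped += escape_rate * dt
        t += dt

    return undetected, detected, escaped

if __name__ == "__main__":
    u, d, e = simulate()
    print(f"undetected={u:.1f}  detected={d:.1f}  escaped={e:.1f}")
```

The point is only that levels integrate rates over time; a COQUALMO-style model would, for instance, drive detection_efficiency from V&V investment levels rather than holding it constant.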
Modeling Value-Based Process with Object Petri-Nets
LiGuo Huang (lghuang@engr.smu.edu)
Department of Computer Science & Engineering, Southern Methodist University
March 17, 2008

VBSQA-OPN System Net – VBSQA Process Framework
• (Figure: a system net spanning project initiation (α); project cost/benefit analysis (β), with exits to launch, terminate, or redefine the project; SCS definition of acceptable and desired values for Q-attributes (γ); risk analysis and architecture/technology evaluation (δ); identification of conflicting Q-attributes and tradeoff analysis (ε) when no architecture/technology combination can satisfy all Q-attribute requirements; SCS adjustment of acceptable Q-attribute values (ζ); system top-level design (η); the LCO review (θ), whose exit criterion is to provide at least one feasible architecture; and LCO-phase rework or extra work (ι) on an LCO failure. Deliverables: the system top-level design and the FRD. Legend: synchronous transitions and status transitions.)
• SCS: Success-Critical Stakeholders; Q-: quality; LCO: Life Cycle Objectives; FRD: Feasibility Rationale Description.

VBSQA-OPN Object Nets – Stakeholders' Process Instances
• (Figure: developer's and system acquirer's object nets instantiate the framework from each stakeholder's perspective. For example, the developer acquires system upgrade requirements (α) and estimates the upgrade schedule/cost and develops the DMR results chain (β), while the acquirer issues the project bidding (α) and verifies the schedule/cost and DMR results chain (β). The nets synchronize on shared activities such as requirement elicitation meetings (γ), external prototype evaluation (δ), tradeoff analysis (ε), stakeholder renegotiation (ζ), system top-level design (η), and the LCO review (θ). Legend: object-autonomous transitions, synchronous transitions, and status transitions.)

VBSQA Process Generator – Based on the VBSQA-OPN Model
• (Figure: a VBSQA Process Creator, Process Checker, and Process Simulator; a mapping between ERP software development activities and the VBSQA process framework; simulation results.)

VBSQA Process Simulator – ROI of Synchronous Stakeholder Interaction Activities
• (Figure: DIMS top-priority Q-attributes: performance and evolvability.)

Some Advantages
• Petri nets have interesting, well-defined properties
• Coordination of different views – separation of concerns
• Graphical notation
• Particularly useful for concurrency

The Little-JIL Process Definition Language
Leon J. Osterweil (ljo@cs.umass.edu)
Laboratory for Advanced Software Engineering Research, University of Massachusetts
USC Center for Systems and Software Engineering, 17 March 2008

The "Step" Is the Central Little-JIL Abstraction
• (Figure: the step icon, comprising an interface badge carrying parameters, resources, and the agent; a prerequisite badge and a postrequisite badge; the step name; a substep-sequencing badge; exception handlers keyed by exception type, each with a continuation; and artifact flows to and from substeps. A hypothetical rendering of this structure in code appears at the end of this section.)

Trivial Example: Elaboration of a Design Step (figure)

Little-JIL Environment Architecture
• (Figure: a process programmer uses editors to produce a process definition, consisting of a coordination structure plus resource and artifact definitions; analyzers such as FLAVERS and a fault tree analyzer check properties, and a simulator exercises the definition; the execution engine (Juliette) runs the process, assigning work to agents through an agenda manager and agendas, drawing on a resource repository and an artifact repository via a user interface manager.)

Some Advantages
• Broad semantics
• Precise semantics
• Analysis
• Growing toolset
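Little-JIL is a visual language, so any textual rendering is approximate. The sketch below is a hypothetical Python model of the information a step carries, named after the badges in the figure above; it is illustrative only and is not the actual Little-JIL metamodel or the Juliette API.

```python
# Hypothetical model of what a Little-JIL step carries. The names follow
# the badges described above, but this is illustrative only: Little-JIL is
# a visual language, and its real metamodel and tooling are not structured
# this way.
from dataclasses import dataclass, field
from enum import Enum
from typing import Callable, Dict, List, Optional

class Sequencing(Enum):
    """Substep-sequencing badge: how a step's substeps are carried out."""
    SEQUENTIAL = "sequential"   # substeps in order
    PARALLEL = "parallel"       # substeps concurrently
    CHOICE = "choice"           # agent picks one substep
    TRY = "try"                 # substeps attempted until one succeeds

@dataclass
class Handler:
    """Exception handler: catches a type and says how the step continues."""
    exception_type: str
    body: Optional["Step"] = None
    continuation: str = "complete"  # e.g., complete, continue, restart, rethrow

@dataclass
class Step:
    name: str
    # Interface badge: parameters, resources, and the executing agent.
    parameters: Dict[str, str] = field(default_factory=dict)
    resources: List[str] = field(default_factory=list)
    agent: str = ""
    # Prerequisite and postrequisite badges: checks before and after the step.
    prerequisite: Optional[Callable[[], bool]] = None
    postrequisite: Optional[Callable[[], bool]] = None
    # Substeps and how they are sequenced.
    sequencing: Sequencing = Sequencing.SEQUENTIAL
    substeps: List["Step"] = field(default_factory=list)
    # Handlers for exceptions raised by substeps.
    handlers: List[Handler] = field(default_factory=list)

# Illustrative use, loosely echoing the trivial design-elaboration example:
elaborate = Step(
    name="ElaborateDesign",
    agent="designer",
    sequencing=Sequencing.SEQUENTIAL,
    substeps=[Step(name="DecomposeDesign"), Step(name="ReviewDesign")],
    handlers=[Handler(exception_type="ReviewFailed", continuation="restart")],
)
```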
Phase 2: What to Make of All of This?
• Which is good for what?
• What are we missing? What needs are not covered?
• Can we compare and contrast?
• Can we combine best features?

"A Classification and Comparison Framework for Software Architecture Description Languages" by Medvidovic and Taylor as a model?
• A comparison of software architecture technologies
• Technologies are rows
• Features are columns
• Lots of work to fill in the entries
• Can we do something like that for process modeling technologies?

A Possible Approach
• What should we be doing?
  – What goals do stakeholders have?
  – The columns of a matrix (?)
• What are we currently doing?
  – What do we say our goals are?
• What are we really doing?
  – What do our technologies address?
  – The rows of a matrix (?)

Process Stakeholders
• Process performer
• Process engineer
• Manager
• Customer
• End user
• Educator/trainer
• Tool provider
• Researcher
• Union representative
• Regulator
• Standardizer (e.g., OMG)
• Domain-specific stakeholder
• ... more?

Stakeholder Goals for Process Technology
• Ambiguity tolerance
• Analysis
• Automation
• Compliance
• Composability
• Cost effectiveness / saving money
• Coverage
• Efficiency
• Evolvability
• Implementability / doability
• Interchangeability
• Learnability
• Maintainability
• Manager satisfaction
• Marketability
• Minimum product cost
• No job loss
• Non-interference (with other standards)
• Optimal product delivery time / speed
• Precision
• Preparing a negotiation stance
• Process analysis
• Process management
• Profit
• Purpose fulfillment
• Quality
• Reasoning
• Reinvention
• Repeatability
• Reusability
• Risk mitigation
• Satisfiability
• Satisfying high-value stakeholders
• Scalability
• Tailorability
• Teachability
• Understandability
• Usability
• Verifiability / conformance
• Work rules
• ... more?

Goals the Technologies Seem to Be Addressing
• Comprehension
• Coordination
• Automation
• Education and training
• Continuous improvement
• Deep understanding
• Planning and control
• Reinvention
• Strategic management
• Communication
• Standardization
• Analysis
• Risk mitigation
• Agility
These do not match the previous list very well.

First Attempt to Structure and Organize the Goals
• Survey some example stakeholder communities
• Top-level goals
• Some decomposition into subgoals

Goals for Researchers and Educators
• Understanding and comprehension
  – Education
  – Training
  – Dissemination
  – Radical reinvention
• Improvement
  – Of the workforce: more and better workers, better management, better resource allocation
  – Of the process itself: faster, better, cheaper
  – Of the product: more and better "ilities" in the product

Possible Research Roadmap
• Refine and organize the list of goals
• Turn it into a list of desired capabilities: the columns of a matrix
• Identify a list of technologies: the rows (the four presented here are only a start; some come from other disciplines, e.g., business process, workflow, and service architecture)
• Study which technologies do what well
• Identify gaps in coverage
• Suggest syntheses
(A toy sketch of such a matrix closes this report.)

Something for CSSE to Lead?
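As a closing illustration of the roadmap's matrix idea, here is a toy Python sketch. The rows are the four technologies from the workshop; the capability columns and the single filled-in entry are placeholders chosen for illustration, not assessments made at the workshop.

```python
# Toy sketch of the proposed technologies-by-capabilities matrix, in the
# spirit of Medvidovic and Taylor's ADL comparison framework. The capability
# columns are placeholders, and every entry starts as "TODO" because, as
# noted above, filling in the entries is the real work.

TECHNOLOGIES = ["EPFC/RMC", "System Dynamics", "VBSQA-OPN", "Little-JIL"]
CAPABILITIES = ["analysis", "simulation", "scalability",
                "concurrency", "teachability"]  # placeholder columns

matrix = {tech: {cap: "TODO" for cap in CAPABILITIES} for tech in TECHNOLOGIES}

def report(matrix):
    """List what each technology covers and flag capabilities no one covers."""
    for tech, row in matrix.items():
        covered = [cap for cap, entry in row.items() if entry == "yes"]
        print(f"{tech}: {', '.join(covered) if covered else 'nothing assessed yet'}")
    gaps = [cap for cap in CAPABILITIES
            if all(row[cap] != "yes" for row in matrix.values())]
    print("capabilities no technology is assessed to cover:", gaps)

if __name__ == "__main__":
    matrix["Little-JIL"]["analysis"] = "yes"  # placeholder entry, not a finding
    report(matrix)
```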