Service Areas Program Review Template for Academic Year 2013‐2014 

Please provide a concise response to all questions, and include relevant details in direct support of your responses. Bulleted lists may be used to clearly organize information.

Section 1 - Program Information

1.0 Name of Program: Institutional Research    Date: 11/13/2013

1.1 Program Review Authors (include names and campus locations): Angelina Hill

1.2 Program Director Signature: Angelina Hill    Date: 11/15/2013

1.3 Vice President Signature:    Date:

1.4 Primary Function: Institutional Research (IR) prepares timely reports to outside federal entities (e.g., IPEDS, Gainful Employment), state entities, and grant/contract agencies. IR provides data to the district from the student information system, surveys, and learning outcomes assessments. IR supports the institution's planning efforts by aiding in plan development and evaluation.

1.4.1 State briefly how the program functions support the college mission: IR is responsible for housing, analyzing, and reporting information to enhance decision making at the College of the Redwoods. IR provides support to academic, administrative, and service areas by working directly with faculty and staff to meet their data needs, and through committee interaction. IR assists internal college assessment to improve student learning, inform planning, and promote institutional effectiveness.

1.4.2 Program highlights/accomplishments: IR recently implemented Tableau, a business analytics tool, for visualizing data. The tool reports from SQL snapshots of Datatel. Several standard IR reports (e.g., enrollments, demographics, success, completions) have been redone using Tableau, as has all instructional program review data reporting. Reports utilize Tableau Public's browser-based analytics, allowing users to create their own reports by sorting and filtering variables of interest (e.g., location, program). (A hypothetical sketch of the snapshot-to-extract step behind these reports appears at the end of this subsection.)

IR has played a more prominent role in planning. The office created the 2012-2013 Institutional Effectiveness Report, which tracks the progress of annual planning items and ties them to student achievement, administrative, and assessment data in an effort to summarize the impact of the institution.

In the absence of an analyst, a database administrator/analyst position was created to assist IT and IR. Paul Chown has transferred into this position. The position is essential to getting meaningful data out of Datatel, as well as to collecting data from web applications such as the assessment reporting tool.
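The review does not document how these snapshots are produced, so the following is a minimal, hypothetical sketch of the kind of snapshot-to-extract step described above: a SQL copy of Datatel data flattened into a file that a Tableau workbook can use as a data source. The DSN, table, and column names are invented for illustration.

```python
# Hypothetical illustration only: the review says standard reports are built
# from "SQL snapshots of Datatel" but does not describe the schema or
# tooling. The DSN, table, and column names below are invented.
import csv

import pyodbc  # assumes an ODBC DSN pointing at the reporting database

SNAPSHOT_QUERY = """
    SELECT term, location, program, COUNT(*) AS headcount
    FROM enrollment_snapshot      -- hypothetical snapshot table
    GROUP BY term, location, program
"""

def export_snapshot(dsn: str, out_path: str) -> None:
    """Run the snapshot query and write a flat CSV extract for Tableau."""
    conn = pyodbc.connect(dsn)
    try:
        cursor = conn.cursor()
        cursor.execute(SNAPSHOT_QUERY)
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            # Header row from the cursor metadata, then all result rows.
            writer.writerow([col[0] for col in cursor.description])
            writer.writerows(cursor.fetchall())
    finally:
        conn.close()

if __name__ == "__main__":
    export_snapshot("DSN=ir_reporting", "enrollment_extract.csv")
```

A flat extract of this kind is one common way to feed Tableau Public, since published workbooks carry their data with them rather than querying a protected database live.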
1.4.3 Program Data:

    # of Full Time Employees:   2011-2012: 2
    # of Part Time Employees:   2012-13: 1.5
    Personnel Budget:           2011-2012: $177,314    2012-13: $119,073
    Discretionary Budget:       2011-2012: $1,341      2012-13: $1,730

Section 2 - Data Analysis

2.0 List Service Area Metrics/Indicators and provide information on changes over time (Steady/Increasing/Decreasing, etc.)

2.1 Metrics/Indicators

Tickets (fulfilled data requests)
    2011/12: 168    2012/13: 61*
    Observations: Closed tickets in Parature during 12-13 were lost in the conversion to Spiceworks. Completed tickets have risen since 2006-2007: 40 (06-07), 86 (07-08), 51 (08-09), 66 (09-10), 80 (10-11), 168 (11-12), 61* (12-13). More than 61 tickets were completed that we were not able to track because of the ticket system conversion. However, the number of tickets has likely declined because there was no dedicated IR analyst for much of the year. Paul is starting to tackle more IR-related tickets in recent months.

IR Website Hits
    2011/12: 6,971 (avg. time: 1 min 16 sec)    2012/13: 4,757 (avg. time: 1 min 28 sec)
    Observations: October didn't see the large number of visits it did when Show Cause happened.

Planning and Committee Member Survey ratings to the item "Discussions & decisions were data driven and supported by sound evidence"
    2011/12: 3.98    2012/13: 4.0
    Observations: 2010-2011 = 3.39. Uses a scale where 1 = strongly disagree and 5 = strongly agree (higher numbers are better).

Committee self-evaluation survey ratings to the item "Engages in data-driven decision making"
    2011/12: 1 = 2%, 2 = 19%, 3 = 52%, 4 = 27%
    2012/13: 1 = 0%, 2 = 56%, 3 = 22%, 4 = 22%
    Observations: Uses the ACCJC developmental rubric, where (1) is least developed/inconsistent/ad hoc and (4) is most developed/well-defined process. (A summary calculation follows this table.)
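As a quick, illustrative summary of the rubric distributions above (not part of the original review), the weighted mean of the 1-4 ratings can be computed directly from the reported percentages:

```python
# Illustrative only: weighted means of the ACCJC rubric ratings reported
# in section 2.1, treating each percentage as the share of responses at
# that rubric level.
ratings_2011_12 = {1: 0.02, 2: 0.19, 3: 0.52, 4: 0.27}
ratings_2012_13 = {1: 0.00, 2: 0.56, 3: 0.22, 4: 0.22}

def weighted_mean(dist: dict[int, float]) -> float:
    return sum(level * share for level, share in dist.items())

print(round(weighted_mean(ratings_2011_12), 2))  # 3.04
print(round(weighted_mean(ratings_2012_13), 2))  # 2.66
```

By this summary measure the self-evaluation ratings shifted downward between the two years, consistent with the note in section 4.1 that committee self-evaluations showed clear room for improvement.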
2.2 Describe how these changes affect students and/or the program: The campus continues to show small but steady gains in using data to inform decisions. The goal is for faculty and staff to know how to access and interpret information. The easier it is for employees to find and correctly use information, the more feasible it is to have a campus that makes informed decisions.

Lower numbers of tickets could indicate that the campus is aware that the IR department has fewer staff, but it could also indicate that employees are more successful at finding the information themselves. This question will be investigated further.

2.3 Provide any other relevant information, or recent changes, that affect the program: The full-time research analyst was gone for much of the 2012-2013 year. Part of the responsibilities of this position was taken on by Database Analyst Paul Chown. The program is slowly learning how to provide the same level of information with fewer human resources by using a more robust, sustainable database infrastructure.

2.4 Student Equity. Please comment on any current outcomes or initiatives related to increasing outreach, retention and student success of underrepresented students in your program:

IR worked with members of ASCR to develop a survey on multicultural awareness on campus. A major purpose of the survey was to determine students' awareness of the new Multicultural and Diversity Center, and to find out how students would like to use the center. IR also assisted members of the Veterans Resource Center in developing an intake survey.

Section 3 - Critical Reflection of Assessment Activities (2011/2012)

3.0 Describe Service Area Outcomes assessed or reviewed in the current cycle: IR evaluated the outcome of providing data reports that are highly relevant to decision making by surveying program review authors in 12-13, following a major revision of the program review dataset. Authors were asked to evaluate the utility of the data sets after their analysis was complete. Over three-fourths of respondents indicated that the analysis of program review data was useful in assessing their program. A small number of authors responded to the survey, probably because they were surveyed weeks to months after submitting their program reviews. This year, assessment questions about the data sets were embedded directly into the template in an effort to obtain more information directly following the process.

3.1 Summarize the conclusions drawn from the data and the experience of staff working to achieve the outcomes: The PRC detailed a set of improvements to the data sets for the 2013-2014 year based on feedback from the committee and survey responses from authors. Several changes were made to the data set using this feedback (e.g., persistence and graduation rates based on course-taking patterns, five-year trends provided for all reviews, frequency of completions and majors provided by year).

3.2 Summarize how assessments have led to improvement in Service Area Outcomes (top three): The impact of the revisions made to the program review data is being evaluated as part of this year's program review. This information needs to be analyzed before we'll know if the changes made improvements.

3.3 (Optional) Describe unusual assessment findings/observations that may require further research or institutional support:

Section 4 - Evaluation of Previous Plans

4.1 Describe plans/actions identified in the last program review and their current status. What measurable outcomes were achieved due to actions completed? Action plans may encompass several years; provide an update on the current status, or note whether the plan was discarded and why. Click here to view completed program reviews from last year.

Action: Administer and evaluate the Student Satisfaction Inventory (SSI)
    Current Status: The survey was administered in spring 2013. The survey results have been presented to the Board of Trustees, Expanded Cabinet, Enrollment Management Committee, Student Success Leadership Group, and the Academic Senate.
    Impact of Action: The Student Success Leadership Group developed action items to include in the Enrollment Management Plan based on SSI results. The Enrollment Management Committee reviewed scheduling data in more detail following up on the results.

Action: Implement query and reporting tool
    Current Status: The Tableau analytic tool was implemented in fall 2012. Several standard IR reports and the program review datasets have been redone using Tableau.
    Impact of Action: The impact of the new program review datasets is currently being evaluated. Usefulness of the Tableau reports on the IR website needs to be evaluated further; this has been added as an action plan below.

Action: Communicate institutional survey results to a broad audience
    Current Status: In addition to presenting the Student Satisfaction Inventory, results of the completer (alumni) survey were shared with the Board of Trustees and several planning committees. Survey data was also shared at Counselor's Day and at CR's convocation.
    Impact of Action: Survey results were presented to a large number of groups in the district.

Action: Work with the Institutional Effectiveness Committee (IEC), promoting consistent evaluation of data
    Current Status: In 2012-2013, the IR Director became the co-chair of the IEC. She worked with the IEC to revise the Institutional Effectiveness Scorecard and to write the Institutional Effectiveness Report, which included a more prominent link to data. IR also created a planning indicator database for the IEC and all planning committees to use in evaluating effectiveness.
    Impact of Action: Committee self-evaluations showed clear room for improvement in terms of using data to support decisions. We will look for improvement this year, following these changes.
Action: Include a vetted set of reports on the IR website using faculty and staff input (cont.)
    Current Status: Several reports on the IR website were revised using Tableau, and some of the content was refined. Specifically, feedback about the completion report led to breaking out degrees received by those which are active vs. inactive.
    Impact of Action: The action plan below, to evaluate the reports on the IR website created using Tableau, applies here as well.

4.2 (If applicable) Describe how funds provided in support of the plan(s) contributed to program improvement:

Section 5 - Planning

5.1 Program Plans

Based on data analysis, learning outcomes and program indicators, assessment and review, and your critical reflections, describe the actions to be taken for the 2013-2014 academic year. Use as many rows as you have actions, and add additional rows if you have more than 5 actions. Please number all rows that you add. Please be specific. This section and section 6 should include a detailed justification so that the resource prioritization committees understand your needs and their importance.

* Not all actions in this program plan section may require resources, but all resource requests must be linked to this section.

Column definitions:
    Action to be taken: List the specific action to be taken in enough detail so that someone outside of your area can understand.
    Relationship to Institutional Plans: Include the specific plan and action item relevant to your action to be taken. For example: Annual Plan 2013-2014 Theme: Persistence; or Goal 1: Student Success: EP.1.6.2 Develop a plan for narrowing the achievement gap for underrepresented student populations.
    Expected Impact on Program/Student Learning: Describe the expected impact in a way that someone outside the program can understand. The impact should be measurable.
    Relationship to Assessment: Include all assessment results that indicate that this action will yield the desired impact on the program. If the assessment has yet to be conducted, explain when and how it will be conducted.
    Resources Needed (Y/N): A yes here requires a corresponding request in the next section.
Action 1:
    Action to be taken: Evaluate and improve reports on the IR website.
    Relationship to Institutional Plans: 2013-2014 Annual Plan EP.3.4.1: Enhance Institutional Data Reports; Education Master Plan 3.4: Systematically use data to inform decision making.
    Expected Impact: Reports on the IR website should be accessible and easy to interpret. If successful, it would free up IR staff time for other projects.
    Relationship to Assessment: An assessment tool will be administered to evaluate IR reports, and the reports will be revised based on the results, leading to improved evaluations.
    Resources Needed: Yes

Action 2:
    Action to be taken: Administer and evaluate the Employee Satisfaction Inventory.
    Relationship to Institutional Plans: Education Master Plan 3.4: Systematically use data to inform decision making.
    Expected Impact: Information from this survey would allow IR to inform decision makers about employee attitudes toward mission and goals, the work environment, and insights into keeping employees satisfied.
    Relationship to Assessment: A successful survey administration will be evidenced by district decisions being based on the results of the survey.
    Resources Needed: Yes

Action 3:
    Action to be taken: Develop a web application in place of the overall section report (a sketch of what such an application might look like follows this table).
    Relationship to Institutional Plans: 2013-2014 Annual Plan EP.3.4.1: Enhance Institutional Data Reports; Education Master Plan 3.4: Systematically use data to inform decision making.
    Expected Impact: Faculty and staff will be able to obtain section-level data more efficiently. Maintaining the report will require less IR staff time than maintaining the current queries, which write to Excel.
    Relationship to Assessment: The overall section report currently fails to run at least once a month, causing users to contact us. This is expected to improve.
    Resources Needed: No

Action 4:
    Action to be taken: Join the Voluntary Framework of Accountability (VFA).
    Relationship to Institutional Plans: Education Master Plan 3.4: Systematically use data to inform decision making.
    Expected Impact: The District will have access to another dataset from which to benchmark. VFA metrics could be added to the Institutional Effectiveness Scorecard, which is currently limited in the number of external benchmarks.
    Relationship to Assessment: Usefulness of the VFA will be determined by evaluating whether or not the metrics are used to inform decisions.
    Resources Needed: Yes

Actions 5 and 6: (none)
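Action 3 does not specify an implementation, so the following is only a sketch of the kind of web application it describes: a small service that serves section-level data on demand instead of a standing report that writes to Excel. The framework (Flask), database, endpoint, and column names are all assumptions for illustration, not details from this review.

```python
# Hypothetical sketch for action 3: serve section-level data on demand
# rather than maintaining a scheduled report that writes to Excel.
# Framework, schema, and endpoint are invented for illustration.
import sqlite3  # stand-in for the actual reporting database

from flask import Flask, jsonify

app = Flask(__name__)
DB_PATH = "ir_reporting.db"  # hypothetical local copy of snapshot data

@app.route("/sections/<term>")
def sections(term: str):
    """Return per-section enrollment counts for a term as JSON."""
    conn = sqlite3.connect(DB_PATH)
    try:
        rows = conn.execute(
            "SELECT section_id, title, enrolled "
            "FROM section_snapshot WHERE term = ?",
            (term,),
        ).fetchall()
    finally:
        conn.close()
    return jsonify(
        [{"section_id": r[0], "title": r[1], "enrolled": r[2]} for r in rows]
    )

if __name__ == "__main__":
    app.run()
```

Because data is fetched when requested, a failure affects a single request rather than leaving a stale monthly file for every user.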


5.2 Provide any additional information, brief definitions, descriptions, comments, or explanations, if necessary:

Section 6 - Resource Requests

6.0 Planning Related, Operational, and Personnel Resource Requests. Requests must be accompanied by an action plan in the above section. Requests should include estimated costs. Submit a support ticket if you do not know the estimated costs. If you are requesting personnel resources, you must also include the "Request for Faculty or Staffing" forms, located at inside.redwoods.edu/program review. Submit one form for each request.

Additional Instructions:
    - Put down the full amount you are requesting in the "Amount" column. Put down the annual amount of any ongoing or recurring costs in the "Annual Recurring" column. For example, a personnel request for a permanent position might show an Amount of $30,000 and an Annual Recurring Cost of $30,000. A request for equipment might show an Amount of $5,000 and an Annual Recurring Cost of $200. A professional development request might show an Amount of $800 and a recurring cost of $0.
    - If you have a grant or some other source of funding, include in the "Request" column a brief description of the source of funds, the dollar amount that is expected to be covered by the other source, and whether the other source covers any of the annual recurring costs. Note in the "Request" column if this is a repeat request, and how many times you have submitted this request.
    - The item number must match the corresponding action # from section 5. Add rows as necessary.

Type of Request (check one): Planning (reviewed by the prioritization committees), Operational (reviewed by the Budget and Planning Committee), Personnel (reviewed by the Faculty Prioritization Committee and Associate Deans), or Professional Development (reviewed by the Professional Development Committee). Describe each request in a way that someone outside the program can understand, and list the $ Amount, $ Annual Recurring Costs, and Contact Person of the request (name, email, phone).

Action # 1: Tableau Premium allows data reports to be stored in the cloud and accessed using Tableau's server. It allows for interactive reporting in a non-public environment where student information can be protected.
    Amount: $4,500    Contact Person: Angelina Hill

Action # 2: Noel-Levitz Employee Satisfaction Survey.
    Amount: $3,500    Contact Person: Angelina Hill

Action # 4: Membership in the Voluntary Framework of Accountability.
    Amount: $2,000    Contact Person: Angelina Hill

Section 7 - Author Feedback

Provide any constructive feedback about how this template or datasets could be improved: The template did not seem suitable for administrative services. I suggest changing outcomes to objectives, and relying on the indicators used in the data section as the major source of assessment information.

How much do you agree with the following statements? (mark your choice with an x)
    This year's program review was valuable in planning for the ongoing improvement of my program.
        Strongly Agree [x]   Somewhat Agree [ ]   Neutral [ ]   Somewhat Disagree [ ]   Strongly Disagree [ ]

    Analysis of the program review data was useful in assessing my program.
        Strongly Agree [ ]   Somewhat Agree [x]   Neutral [ ]   Somewhat Disagree [ ]   Strongly Disagree [ ]
Section 8 - PRC Response by section (completed by PRC after reviewing the program review)

8.0 The response will be forwarded to the author and the supervising Director and Vice President:

S.1. Program Information: Completed.
S.2. Data Analysis: Completed. Thorough and informative. Shows improvements.
S.3. Critical Reflection of Assessment Activities: Completed. Section 3.2, how assessments have led to improvements, is still in progress for program review data sets and will be completed at the end of this academic year.
S.4. Previous Plan: Completed. Thorough.
S.5. Planning: Completed. Well done. Resource requests are noted in section 6 and show linkages.
S.6. Resource Requests: