Minutes ‐ DRC Meeting: October 15, 2015; 9:30 am

Present: Emil Mubarakshin (C), Rene Marquez (T), Rhea Estoya (H), Moon Ko (D), Sarah Crespo (D), Joan Lang (H),
Laura Cruz‐Atrian (T), Rebecca Tillberg (W), Maury Pearl (D), Bryan Ventura (E), Cathy Jin (E), Edward Pai (H), and
Anna Badalyan (T).
Over the phone: Oleg Bespalov (P), Ani Zarpas (V), Michelle Fowles (V), and Agyeman Boateng (W)
Anna chaired the meeting.
Agenda items:
1. Agenda – added to the "Items from the floor" section:
 Title IX (Rhea)
 EMSI (Maury)
2. Review 9/17/2015 minutes (all)
 Consensus on last meeting's minutes.
3. Accreditation (all)
 Report Out on Accreditation Visit (Anna)
i. Visit focus depends on the visit Chair.
ii. Outcome depends on what the college people believe – everybody must believe they are doing something
good.
iii. SLOs and Assessments are very important.
iv. Annual report is important; the numbers in the report must match the Accreditation report.
v. It is important to know what you are doing (e.g., who did the calculations and what those calculations are).
vi. Show that you are evaluating and re‐evaluating your processes.
vii. Explain that you are in transition when you are in transition (e.g., in the process of putting everything in
eLumen). Explain where you are and what you are doing.
 Michelle's visit (V)
i. Respond to the standard items using the same language.
ii. Use MIS and Scorecard, and local data; document the analysis showing examples of closing the loops. Show
the full cycle, and explain where you are.
iii. Issues with data sources: Reliance on external data without a good understanding of what the data shows.
iv. Document methodology and processes.
v. People need access to assessment examples – if using eLumen, people need to be prepared to show/access
the assessment data.
 QFE (all)
i. Anna: Colleges that have a plan to close known issues will be in better standing during the Accreditation visit.
ii. Joan: The master plan is already implemented, and the QFE needs to be connected to it. Anna: it could be a subset
(e.g., improving the quality of assessment and focusing on that).
 Data
i. Anna showed and explained Trade's introduction draft, including: geographical service area, service area
demographics (same as the equity report), educational attainment, occupations (EMSI), FTES, special
populations, etc. Anna will send a copy of the draft to the DRC.
ii. The report shows the targets, not the institution‐set standards.
iii. Pai: is the data reliable? Ani: each college needs to decide how to handle it. Gender is almost perfect; age groups
show slight differences; so do BOGG, Pell, and the combined BOGG/Pell group – Ani will send the checked fields.
iv. Emil will send info on the recommended report.
v. Maury: what about the jump in Trade's assessment results? Anna: Trade can explain that; now everybody is
being assessed with Accuplacer.
vi. Anna: You don't have to disaggregate everything.
vii. Anna: DataMart is used for the Equity report, so we use it for this report too.
viii. Bryan: Age/gender groupings show more gaps. Anna: We reported that before and were asked not to do so
again because it is confusing.
ix. Anna: Licensing data is from the college annual report.
x. Anna: We don't count the "grandfathered" students in our analysis. A different report counts contacts for
"grandfathered" students.
xi. Ed: why are the "3347" not included in the analysis? (LATTC's Accreditation Introduction Draft / Matriculation
table / Total of Target: Entering.) Anna: Many of those have been "grandfathered"; "Cum." shows the work
that has been done, and "Num Cum." is the work to be done. Again, each college has to decide how to
approach this. Ed: I would use a fourth column to show the "exempts." Emil: Adding a table that shows the
"exempts" may better explain this section.
xii. Ed: what to do with non‐performing programs? Anna: Need to provide an explanation of why the program is
in that situation.
xiii. Ed: we need to show "how" we are using the data – in terms of "part of the Program Review process."
Essentially, we are providing an introduction to Standard I.B.
xiv. Anna: Any good things need to be included in the self‐study.
xv. Anna: Standard for completions is dynamic in Trade because completions are directly related to incoming Ns.
xvi. Anna: For job placement, we use 80% of the minimum of the last 5 years for the specific program – a moving
standard too (a worked example follows this list).
xvii. Anna: For Trade, this is the year to review the standard for course completion, which is 70% at this time.
Basic Skills needs more attention compared to Vocational or Transferable courses.
xviii. Ed: DE was the area with issues in our case. My interpretation is that ACCJC wants the college to identify a
standard that is applied to the programs.
xix. Bryan: Departments are the "experts" and have a better understanding of what is possible.
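Illustrative note: the job‐placement standard described in item xvi above is a moving value (80% of the minimum rate over the last 5 years for a program), while the course‐completion standard in item xvii is currently a fixed 70%. The short Python sketch below shows only that arithmetic; the rates and program are hypothetical, not DO or Trade data.

    # Illustration only: a "moving" institution-set standard computed as 80% of
    # the minimum job-placement rate over the last 5 years. Rates are made up.
    placement_rates = {2010: 0.72, 2011: 0.68, 2012: 0.75, 2013: 0.70, 2014: 0.74}

    last_five_years = sorted(placement_rates)[-5:]
    moving_standard = 0.80 * min(placement_rates[y] for y in last_five_years)
    print(f"Job-placement standard: {moving_standard:.1%}")  # 54.4%

    # The course-completion standard, by contrast, is a fixed value at this time.
    course_completion_standard = 0.70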
4. Future Board Presentations: College Effectiveness Reports; Student Equity Plans (Maury)
 Maury shared the draft calendar for the Institutional Effectiveness Success Committee.
 Reports are scheduled for April.
 Stan is working on the data set; it is almost complete.
 2014 student survey data – some of the questions changed; we are considering recalculating measures. Stan has
a mapping of which questions were used and which ones are no longer available.
 Maury will follow up with the details later – needed about 2 weeks before the deadline.
 Ed: How is the effectiveness report included in the self‐evaluation? Anna: Trade included it. Maury: it is part of the
draft. The DO is using the colleges' reports, and it is good evidence.
 Maury: A better unified strategic plan would make the presentation superfluous, as Oleg said, but we are not
there yet. Anna: We cannot just drop the report; it has to be incorporated somewhere, maybe not done every
year. Maury: more integration is recommended in the standard.
 Maury: Two‐hour Board mock visits to the colleges are planned. Drafts should be ready for the visits.
 Maury: Anything going out to the Board needs to follow the 14‐day rule.
 Maury: The Board expects something very close to the final.
 Maury: All members of the committee are new.
5. Gainful Employment Update (Maury)
 Stan has been doing all the data work and working with the college Financial Aid administrators.
 09‐14 done back in July.
 Various issues came up, but they were simple ones (e.g., coding mismatches – CIP codes).
 Anna: Colleges need to make sure the CIP codes are correct (Z Class Instruct Programs table); a hypothetical check is sketched below.
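Illustrative note: since CIP‐code mismatches were the main (simple) issue, here is a hypothetical Python check of the kind a college could run against its program data. The field names, codes, and table contents are invented for illustration and are not the actual Z Class Instruct Programs schema.

    # Hypothetical check: flag program records whose CIP code is malformed or
    # not in a reference list. Table contents and codes are invented examples.
    import re

    valid_cip_codes = {"51.3801", "46.0302"}             # reference list (example)
    programs = [
        {"program": "Nursing", "cip": "51.3801"},
        {"program": "Electrical", "cip": "46.302"},       # malformed: missing digit
    ]

    cip_format = re.compile(r"^\d{2}\.\d{4}$")
    for row in programs:
        code = row["cip"]
        if not cip_format.match(code) or code not in valid_cip_codes:
            print(f"Check CIP code for {row['program']}: {code}")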
6. IPEDS Update (Maury)
 Lock day: 10/14
 Everything requires an explanation; the DO will take care of it unless it needs college intervention.
7. SSSP Reports Update (will plan to send out methodology for reports) (Maury)
 Maury shared the SSSP Contacts Report.
 There is a question on how to report exemptions, currently done by the colleges. If a student is exempt at
Valley, the DO codes that student as exempt at Valley only.
 Anna: the code is in the student table – if the student is exempt at one college, it impacts all colleges. Maury: the
DO exemption is in the college table. Anna: We don't know how the DO is calculating that when populating the
student (matriculation) table (an illustrative sketch follows at the end of this section).
 Ani: Has found inaccuracies in the data; it is not clear whether the dates reflect when the exemption was done or
when it was loaded into the database.
 Coordinators got "their" formulas from the state, but nothing matches the DataMart – a third set of numbers that
could be used for the SSSP appropriation. Maury got the funding formula.
 New "grandfathered" students are new to a college, but not new to the district.
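Illustrative note on the exemption‐coding question above (whether the flag lives in the district‐level student table or in a college‐level table): the hypothetical Python sketch below shows how the two choices give different answers for the same student. The table layouts, IDs, and college names used here are invented for the example and are not the DO's actual schema.

    # Hypothetical data only: same student, exemption coded at Valley.
    student_table = {"881234567": {"exempt": True}}           # district-level flag
    college_table = [
        {"student_id": "881234567", "college": "Valley", "exempt": True},
        {"student_id": "881234567", "college": "Trade",  "exempt": False},
    ]

    def exempt_district_level(student_id):
        # District-level flag: the exemption follows the student everywhere.
        return student_table[student_id]["exempt"]

    def exempt_college_level(student_id, college):
        # College-level flag: the exemption applies only where it was coded.
        return any(r["exempt"] for r in college_table
                   if r["student_id"] == student_id and r["college"] == college)

    print(exempt_district_level("881234567"))           # True at every college
    print(exempt_college_level("881234567", "Trade"))   # False: coded at Valley only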
8. 2015 Scorecard Update (Maury)
 Presented to the state – the minutes reflect that; we will be in compliance.
 The Board is thinking about setting targets.
 We do not use the state peer grouping.
9. NCES 2015‐2016 National Postsecondary Student Aid Study (Oleg)
 Pierce was invited to the NCES study. No other district college has been invited.
 Harbor has participated for the past 3 years.
10. OAC/contact reports and the FTES calculations used by the DO (Ani)
 Under initial assessment (APMS), the numbers match.
 Maury/Anna: The data is based on the student table; see the related codes.
 Maury: Credit students enrolled in SEMC; most likely enrolled.
 Maury: Contact – service contact table – use the days of month; bottom, year‐to‐date with that Sem code.
 Anna: Some contact is done during the summer semester but for the following fall.
 Maury: Some departments are confused about how to enter date/semester data. The term/date should coincide
with when the service is provided.
 Everybody in the enrollment table is included in the OAC report (any enrollment – including those who dropped
before the drop deadline).
 Bryan: Date of service is system time stamped.
 Ani: How and when are FTES calculations done? Maury: Check the "run" dates; calculate an estimate from
FTE/RDB (a rough sketch follows at the end of this section).
 Joan: There is a definition on how to round up FTES numbers for calculations.
 Anna recreated Ani's method in Brio to compare to Sarah's. Both methods resulted in the same numbers.
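Illustrative note on Ani's FTES question above: the sketch below assumes the standard weekly‐census formula, FTES = weekly student contact hours × term‐length multiplier ÷ 525. The multiplier and contact hours are illustrative only; the DO's FTE/RDB "run" and its rounding rules may differ.

    # Minimal weekly-census FTES estimate; inputs are illustrative only.
    def weekly_census_ftes(weekly_contact_hours, term_length_multiplier=17.5):
        # 525 = 15 contact hours/week x 35 weeks for one full-time student-year.
        return weekly_contact_hours * term_length_multiplier / 525

    # Example: a section generating 150 weekly student contact hours at census.
    print(round(weekly_census_ftes(150), 2))  # 5.0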
11. Items from the floor (all)
 Title IX compliance – Form R4
i. Where can we get data (for one academic year)?
ii. Maury to check where data is (Stud CCCapply table – field: ??).
 EMSI
i. Service was not discontinued.
ii. Contract includes 2 certifications per college.
iii. LATTC uses EMSI's Career Coach – Everybody is invited to check it out: lattc.emsicareercoach.com
12. Adjourn
 Meeting adjourned at 12:43 p.m.