3-D Management (Data Driven Decisions)


Tama S. Celi, Ph.D.
Statistical Analysis & Forecast Unit Manager

What is managing by data?

• Process by which employees of the agency can use data to make decisions, address issues, examine concerns, and solve problems
• Enhances the processes we already have in place to manage performance
• Reinforces the learning organization model and evidence-based practices

Why manage by data?

• Improve performance
• Effective implementation (fidelity)
• Process enhances learning and commitment to the issue
• Uses strategies that work and allows for adjustments when necessary
• Decisions based on up-to-date information
• Decisions made in a timely manner
• Focus on solutions

Who can use this process?

• Employees at every level of the organization may encounter an issue that can be addressed through this process

VADOC 3-D Model

[Model diagram: Issue Identification, Measurement Design, Data Analysis, Action Plan, Implementation, Evaluation, and Evidence-Based Practices]

Getting Started: Issue Identification

When considering an issue using 3-DM, you should be able to answer “yes” to all of the following:

– Is the issue aligned with DOC missions and goals?
– Does the issue have a clear definition that is easy to understand?
– Does the agency have authority over the issue?
– Have all of the stakeholders provided input?

Measurement Design

• Measurement should be carefully aligned with the issue you want to address:
– What is your goal?
– What measurement will provide evidence as to whether you have accomplished your goal?
• Outputs and Outcomes are very different – use caution
– Outputs measure the process: How many did you do? How long did it take?
– Outcomes measure results: Did you accomplish your goal?

Measurement Design Resources

• Is the issue already being measured?
– Look to readily available information
– Examples of readily available information include reports on the DOC website, reports on the Statistical Analysis & Forecast Unit internal website (http://idoc/operations/statistical-analysis-and-forecast-unit.aspx), Operations Efficiency Measures, CORIS reports, and agency presentations
• Is the data readily available to measure?
– Considerable data is entered routinely into databases such as VACORIS. Any data entered into a database can potentially be used for measurement and analysis
– Always check on the availability of data extracts when developing measurement strategies (see the sketch after this list)
• Does additional data need to be collected?
– Data collection is labor and resource intensive
– Data needs to be collected in a format that is analyzable
• Can it be measured at a frequency that allows us to adequately solve problems and make mid-course adjustments?
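Before committing to new data collection, it can help to inspect whatever extract already exists. The sketch below (Python/pandas) is a minimal illustration of that check; the file name and column names are assumptions, not an actual VACORIS extract layout.

```python
import pandas as pd

# Hypothetical extract; the file name and column names are illustrative only.
extract = pd.read_csv("release_extract.csv", parse_dates=["release_date"])

# How far back does the extract reach, and how current is it?
print("Coverage:", extract["release_date"].min(), "to", extract["release_date"].max())

# Are there enough records per month to support mid-course adjustments?
print(extract.groupby(extract["release_date"].dt.to_period("M")).size())
```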

Measurement Design

• Is data quality adequate for analysis?
– Reliability: consistency of the measure. Be cautious of using data that contains missing values or lacks consistency in how it is defined, collected, or entered into the database (see the sketch after this list)
– Validity: the extent to which a measurement measures what it is supposed to
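A minimal sketch of a basic reliability screen, assuming a hypothetical program-records extract; the column name and the set of expected status codes are illustrative, not actual CORIS values.

```python
import pandas as pd

# Hypothetical extract; the column name and valid codes are assumptions.
records = pd.read_csv("program_records.csv")

# Reliability check 1: share of missing values in each field.
print(records.isna().mean().sort_values(ascending=False))

# Reliability check 2: are status codes entered consistently?
valid_codes = {"COMPLETED", "REMOVED", "ENROLLED", "PENDING"}
observed = set(records["program_status"].dropna().str.upper().unique())
print("Unexpected codes:", observed - valid_codes)
```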

Data Analysis

• Methods used for analysis will vary depending on the data you are using and the question you are answering
• Correlation does not mean causation
• Consider alternative explanations (see the sketch after this list)
• For advice and guidance, help is available from the SAF Unit
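As one illustration of checking an alternative explanation, the sketch below compares an outcome across program completers and non-completers, then looks at whether another factor differs between the two groups and could account for the gap. All column names are hypothetical.

```python
import pandas as pd

# Hypothetical analysis file; all column names are illustrative only.
df = pd.read_csv("analysis_file.csv")

# Raw association: outcome rate by completion status (correlation, not causation).
print(df.groupby("completed_program")["outcome_flag"].mean())

# Alternative explanation: do the groups already differ on another factor,
# such as length of stay, that could drive the difference instead?
print(df.groupby("completed_program")["length_of_stay_days"].describe())
```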

Action Plan

• What actions are to be taken
• Who will be responsible for implementation
• When will it be done (schedule)
• Where will it be implemented
• How will it be implemented
• Desired goals (outcomes) defined
• Measurement is part of the plan
– Process (fidelity): during implementation
– Outcome: after implementation
• The plan should be ambitious, yet realistic


Implementation

• Actions are implemented according to the plan
• Process measures are in place at the start of implementation and monitored throughout the process to assure that the action items are being done according to schedule
• Fidelity is monitored to assure that action items are being done correctly
• If appropriate, data for outcomes are collected throughout implementation


Evaluation

• Outcomes are tied directly to the goals of the Action Plan
• The result of evaluation is a recommendation:
– Continue with action plan
– Modify action plan
– Start new action plan
– Investigate any new issues identified

Issue Identification Example: Identifying Reentry Program Gaps

Recidivism Reduction Plan Goal 2: Increase operational support and infrastructure needed to ensure reentry services to all incarcerated offenders

– Is the issue aligned with DOC missions and goals?
 Yes, this issue is aligned with DOC mission and goals
– Does the issue have a clear definition that is easy to understand?
 Yes, the goal is to identify those who were not receiving reentry programming prior to release from incarceration
– Does the agency have authority over the issue?
 Yes
– Have all of the stakeholders provided input?
 This goal was developed through the Recidivism Reduction Task Force, and an action team was formed involving Reentry & Programming, Central Classification Services, and the Statistical Analysis & Forecast Units

Measurement Design

• Goal Alignment
– What is your goal?
 Identification of those who did not receive reentry programming prior to release from incarceration so that strategies can be developed to reduce gaps in the future
– What measurement will provide evidence that you have accomplished your goal?
 Reduction in number and percent of releases who do not receive reentry programming over time
• Outputs
 Examination of the process by which offenders get into Reentry Programs
• Outcomes
 Reduction in number and percent of releases who do not receive reentry programming over time (see the sketch after this list)
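A minimal sketch of how the outcome measure above might be computed over time: the number and percent of SR releases per quarter with no reentry programming. The file name, columns, and 0/1 flag coding are assumptions for illustration only.

```python
import pandas as pd

# Hypothetical release-level extract; columns and 0/1 coding are illustrative only.
releases = pd.read_csv("sr_releases.csv", parse_dates=["release_date"])

quarter = releases["release_date"].dt.to_period("Q")
outcome = releases.groupby(quarter).agg(
    total_releases=("received_reentry_programming", "size"),
    no_programming=("received_reentry_programming", lambda s: (s == 0).sum()),
)
outcome["pct_no_programming"] = 100 * outcome["no_programming"] / outcome["total_releases"]
print(outcome)
```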

Measurement Design Resources

• Information already available?
 No existing reports are available on this topic
• Data available?
 Yes, in CORIS
• Additional data needed?
 TBD through the analysis process
• Timeliness of data that is available?
 Focus on a recent time period

Measurement Design Quality

• Is the data quality adequate for analysis?
– Reliability: consistency of the measure
 Select a recent 6-month period because programming data should be reliably entered during that period (SR Releases from October 2013 through March 2014)
 If testing reveals it is not, improving the quality of these data will become the next step
– Validity: identifying the target group
 Reentry & Programs identified the program names that would be considered
 Extensive collaboration and testing between the Reentry and SAF Units to identify and determine what constituted an offender who received Reentry Programming

Data Analysis: Choice of Methods

• Identifying Releases that participated in Reentry Programming (see the sketch after this list)
 State Responsible Releases identified
 Determined who received Reentry Programming, by completer/non-completer
 Determined which were in DOC facilities during their term
 Determined completer/non-completer/non-participant by LOS, bed type, release type, restricted housing, and other possible factors
• Examination of the process
 Worked with Central Classification Services to determine key dates
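A minimal sketch of the merge-and-classify step described above: joining a release list to program enrollment records and labeling each release as completer, non-completer (removed), enrolled/pending at release, or non-participant. The file names, columns, and status codes are assumptions, not the actual CORIS layout.

```python
import pandas as pd

# Hypothetical extracts; file names, columns, and status codes are illustrative only.
releases = pd.read_csv("sr_releases.csv")                # one row per release
programs = pd.read_csv("reentry_program_records.csv")    # one row per enrollment

# Keep the "best" program record per offender (completed outranks removed, etc.).
rank = {"COMPLETED": 0, "ENROLLED": 1, "PENDING": 2, "REMOVED": 3}
programs["rank"] = programs["status"].map(rank)
best = programs.sort_values("rank").drop_duplicates("offender_id")

merged = releases.merge(best[["offender_id", "status"]], on="offender_id", how="left")
merged["classification"] = merged["status"].map({
    "COMPLETED": "completer",
    "REMOVED": "non-completer",
    "ENROLLED": "enrolled/pending at release",
    "PENDING": "enrolled/pending at release",
}).fillna("non-participant")

# Share of releases in each category.
print(merged["classification"].value_counts(normalize=True))
```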

Data Analysis: Results of Analysis – Gaps Identified

• Gaps Identified
– 42% of SR Releases during that period never entered a DOC facility and, therefore, did not receive VADOC Reentry programming pre-release
• Recidivism Reduction Plan Goal #2, Objective #1, Strategy A & B
– Of the live releases from DOC facilities, 95% spent time in Reentry Programming (55% of Total SR Releases)
– Program Participants:
• 55% completed
• 27% removed
• 18% listed as enrolled or pending at time of release
• Further investigation needed on non-participants and non-completers
– Recidivism Reduction Plan Goal #2, Objective #1, Strategy A & B

Action Plan: Actions to be taken

• Referral Process
– Review extreme outliers and negative values for timestamp/sequencing issues
• Who: Statistical Analysis & Forecast and Central Classification Services Staff
• When: October 2014 – January 2015
• Determine cause of outliers and future approach
• Determine cause of sequencing issues and future approach
• Goal: Identify any impediments in the referral process to Reentry Program participation and formulate a process that will increase participation

Action Plan: Actions to be taken

• Identify reasons for non-participation on a case-by-case basis
– Who: Reentry & Programs Staff
– When: October 2014 – March 2015
– Policy/Procedure change recommendations
– Desired Goals: increased participation in reentry programming
• Recidivism Reduction Plan Goal #2, Objective #1, Strategy A & B

Action Plan: Actions to be taken

• Identify reasons for non-completion
– Reentry & Programs Staff
• Data entry issues (still enrolled)
• Look at current policies for non-completers (removals)
– Statistical Analysis & Forecast Staff
• Charged & moved (119s)
– October 2014 – March 2015
– Policy/Procedure change recommendations
– Desired Goals: increased program completion
• Recidivism Reduction Plan Goal #2, Objective #1, Strategy A & B

Implementation

• Action Items are determined
• Responsible parties are determined
• Timeline is established
• Goals are established
• Group will reconvene in April 2015 to review the results of the Action Plan

Evaluation

• Once the action plan is completed and results are in place, the analysis will be repeated to track progress
• Items that have produced the desired result (increased Reentry Program participation) will continue
• Items that have not produced the desired result will be re-addressed
• Additional items not addressed in the current Action Plan will be addressed

Moving Forward

• Increased emphasis on Data Driven Decisions
• Everybody can use the 3-D model
• All agency initiatives will include outcome evaluations
• The 3-D Model is being presented at various levels of the agency

Contact

Dr. Tama Celi, Manager
Statistical Analysis and Forecast Unit
Virginia Department of Corrections
tama.celi@vadoc.virginia.gov
804-887-8248
