National R&D Program Evaluation System in Korea

IN-DEPTH EVALUATION OF R&D PROGRAM IN KOREA
Seung Jun Yoo, Boo-jong Gill, Woo Chul Chai
Contents
1. Overview of Public R&D Program Evaluation
2. Objective
3. Procedure
4. Methodologies
5. Utilization of Evaluation Results
6. Challenges and Discussion
Overview of Public R&D Program Evaluation 1
- Players

[Diagram: players in public R&D program evaluation]
• NSTC and MOSF take charge of R&D evaluation and budget allocation; evaluation results are reported to them.
• KISTEP (evaluator) conducts the evaluations, supported by evaluation supporting groups.
• Each ministry (MEST, MOE, MKE, MIFAFF, MW, etc.) runs its R&D programs through its agencies.

*NSTC (National Science & Technology Council), MOSF (Ministry of Strategy and Finance)
Overview of Public R&D Program Evaluation 2
- R&D program management process
[Diagram: R&D program management process, linking the R&D budget, programs/projects implemented, survey/analysis, and evaluation (in-depth; self → meta)]
• Evaluation process: evaluation strategy & data collection → survey/analysis → evaluation
• Correction process: corrections & feedback → recommendations to program ministries
Overview of Public R&D Program Evaluation 3
- Overview of Public Finance Program
General Public Finance Programs
• SOC, Agriculture, Health/Welfare, National Defense,
Industry/Energy, etc.
Informatization Programs
• Information Infrastructure
• Information Utilization & Impact
R&D Programs
• Science and Technology
• R&D Human Resources Development, Infrastructure, etc.
Overview of Public R&D Program Evaluation 4
- In-depth evaluation & Self/Meta evaluation

In-depth Evaluation
• ~10 programs with evaluation issues
• logic model, evaluation questions, in-depth analysis, communications, recommendations, coordination, etc.
• depends on the evaluation questions

Self → Meta Evaluation
• 1/3 of all programs (70 out of 207 programs, '09)
• self evaluation by each ministry, then meta evaluation by MOSF/KISTEP
• depends on weighted indicators
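The "weighted indicators" scoring used in meta evaluation can be sketched as a weighted sum of indicator scores. The indicator names and weights below are hypothetical illustrations, not KISTEP's actual scheme:

```python
# Hypothetical meta-evaluation scoring: weighted sum of indicator scores.
# Indicator names and weights are invented for illustration only.
WEIGHTS = {"planning": 0.3, "management": 0.3, "performance": 0.4}

def meta_score(indicator_scores):
    """Weighted sum of indicator scores (each on a 0-100 scale)."""
    return sum(WEIGHTS[k] * v for k, v in indicator_scores.items())

program = {"planning": 80, "management": 70, "performance": 90}
print(meta_score(program))  # 0.3*80 + 0.3*70 + 0.4*90 = 81.0
```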
Objective of the Evaluation
• To increase the efficiency and effectiveness of R&D programs
: find and diagnose problems in all aspects of a program
: improve the program by applying the evaluation results
Procedure 1
- 4(5) steps
0. Selecting target program
1. Evaluation Strategy
• logic model, evaluation questions
• data collection plan, methodologies
2. Analyze & Interpret
• collect data
• data analysis & interpretation
3. Results Coordination
• interim & final results
• interviews & coordination
4. Utilization of the Results
• 4 types of corrections
• management action plan
Procedure 2
- 7-month schedule (depends on the case)
- (month 0): target program(s) selected by a selection committee based on special issues, etc.

¶ In-depth evaluation procedure for selected program(s)
- month 1: form the evaluation group, gather program data, study the target R&D program(s), identify the major evaluation questions
- month 2: develop the logic model and methodologies
- months 3-4: perform in-depth analysis (relevance, efficiency, effectiveness, program design & delivery, etc.)
Procedure 3
- month 5: interviews (researchers, program managers, etc.)
- month 5: report interim evaluation results (to MOSF and the ministries)
- month 6: report final evaluation results & recommendations

Large programs: ~10 months
Specific short-term needs: 2-3 months (specify the needs → perform evaluation)
Methodologies

Types of Evaluation → Methods
• Satisfaction → Importance-Performance Analysis (IPA)
• Qualities (Paper, Patent, etc.) → Paper: Impact Factor, Times Cited, etc.; Patent: Technology Transfer, Value, Citation, Survival Analysis, etc.
• Policy, Strategy → System Dynamics, etc.
• Portfolio (Investment, etc.) → BCG Growth Share Matrix, GE/McKinsey Matrix, etc.
Methodologies_IPA

[Diagram: IPA matrix plotting importance against performance]
• high importance, low performance → Concentrate Here!
• high importance, high performance → Keep Up Good Work!
• low importance, low performance → Low Priority!
• low importance, high performance → Possible Overkill!

Source: Martilla, J. A. & James, J. C. (1977), Importance-performance analysis. Journal of Marketing, 41(1): 78
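The four IPA quadrants can be reproduced with a simple mean-split rule: each item's importance and performance scores are compared against the sample means. The attribute names and survey scores below are invented for illustration:

```python
# Sketch of Importance-Performance Analysis (IPA): items fall into one of
# four quadrants depending on how each score compares to the mean.

def ipa_quadrant(importance, performance, imp_mean, perf_mean):
    """Return the IPA quadrant label for one evaluation item."""
    if importance >= imp_mean and performance < perf_mean:
        return "Concentrate Here"
    if importance >= imp_mean and performance >= perf_mean:
        return "Keep Up Good Work"
    if importance < imp_mean and performance < perf_mean:
        return "Low Priority"
    return "Possible Overkill"

# Hypothetical survey scores (1-5 scale) for four program attributes.
items = {
    "Funding adequacy":  (4.6, 2.8),
    "Review fairness":   (4.4, 4.2),
    "Admin paperwork":   (2.1, 2.5),
    "Web portal design": (2.3, 4.5),
}
imp_mean = sum(i for i, _ in items.values()) / len(items)
perf_mean = sum(p for _, p in items.values()) / len(items)
for name, (imp, perf) in items.items():
    print(name, "->", ipa_quadrant(imp, perf, imp_mean, perf_mean))
```

Splitting at the means (rather than the scale midpoint) is one common convention; the crosshair choice shifts borderline items between quadrants.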
Methodologies_Survival Analysis

[Diagram: survival analysis applied to the R&D output chain]
Idea/Pilot Study → Project → Paper ↔ Patent → Technology Transfer (with following studies feeding back)

Modified from: DeVol, R. & Bedroussian, A. (2006), Mind to Market: A Global Analysis of University Biotechnology Transfer and Commercialization. Milken Institute
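A standard way to estimate patent "survival" (e.g. years until a patent lapses) from records where some patents are still alive at the last observation is the Kaplan-Meier estimator. A minimal sketch with made-up maintenance records:

```python
# Minimal Kaplan-Meier estimator for right-censored data, e.g. patent
# lifetimes. All data below are hypothetical.

def kaplan_meier(times, events):
    """times[i]: observation time; events[i]: True if the event occurred
    (e.g. the patent lapsed), False if censored (still alive at last check).
    Returns [(t, S(t))] at each observed event time."""
    event_times = sorted({t for t, e in zip(times, events) if e})
    curve, s = [], 1.0
    for t in event_times:
        at_risk = sum(1 for ti in times if ti >= t)
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei)
        s *= 1 - deaths / at_risk           # conditional survival at t
        curve.append((t, s))
    return curve

# Hypothetical records for 6 patents (years observed, lapsed or not).
times = [3, 5, 5, 8, 10, 12]
events = [True, True, False, True, False, True]
for t, s in kaplan_meier(times, events):
    print(f"S({t}) = {s:.3f}")
```

The censored observations (events=False) still contribute to the at-risk counts, which is what distinguishes this from a naive fraction-surviving calculation.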
Methodologies_System Dynamics
Source: Ahn, N. (1999), A system dynamics model of a large R&D program, MIT Press
Yoo, S. et al. (2009), In-depth Evaluation of Health & Medical R&D Program in Korea, KISTEP
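System dynamics models represent a program as stocks and flows integrated over time. A toy sketch (not the cited model; all parameters are invented) of a knowledge stock fed by funded research and drained by obsolescence:

```python
# Toy stock-and-flow simulation: a knowledge "stock" accumulates research
# output (inflow) and loses value to obsolescence (outflow).
# Parameters and units are hypothetical.

def simulate(years=10, budget=100.0, productivity=0.05, decay=0.1):
    knowledge = 20.0               # initial knowledge stock (arbitrary units)
    history = [knowledge]
    for _ in range(years):
        inflow = productivity * budget  # new results from funded projects
        outflow = decay * knowledge     # knowledge becoming obsolete
        knowledge += inflow - outflow   # Euler step, dt = 1 year
        history.append(knowledge)
    return history

print([round(k, 1) for k in simulate()])
```

With constant funding the stock approaches the equilibrium productivity × budget / decay (here 50 units), illustrating the diminishing-returns behavior such models are often used to explore.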
Methodologies_Portfolio Analysis
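A BCG-style portfolio classification splits investment areas along two axes, market growth rate and relative share. The thresholds and technology areas below are illustrative assumptions, not data from the evaluation:

```python
# Hypothetical BCG Growth Share Matrix classification of R&D investment
# areas; the 10% growth and 1.0x share cutoffs are conventional defaults.

def bcg_category(growth_rate, relative_share,
                 growth_cut=0.10, share_cut=1.0):
    """Classify one area into the four BCG quadrants."""
    if growth_rate >= growth_cut:
        return "Star" if relative_share >= share_cut else "Question Mark"
    return "Cash Cow" if relative_share >= share_cut else "Dog"

# Invented portfolio: (annual growth rate, relative market share).
portfolio = {
    "Biotech":        (0.18, 1.4),
    "Semiconductors": (0.06, 2.0),
    "Legacy telecom": (0.02, 0.5),
    "Quantum":        (0.25, 0.3),
}
for area, (g, s) in portfolio.items():
    print(f"{area}: {bcg_category(g, s)}")
```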
Utilization of Evaluation Results

Types of Correction (Management Actions)
• Improve Program Delivery System: correct inefficiencies in the program delivery system so the program's goal(s) are achieved more efficiently
• Coordinate Budget Allocation: increase or cut the budget where investment is insufficient or unnecessary, respectively, so the program's goal(s) are achieved more efficiently
• Improve Research Environment: improve regulations or acts so the program's goal(s) are achieved more efficiently
• Consult Program Planning: provide better ideas or guidelines for planning new or follow-on programs
Challenges and Discussion
• Evaluation as an R&D Management Tool
- Self/Meta: measures achievement against the performance plan
- In-depth: diagnoses problems and corrects them to improve efficiency/effectiveness
→ Efficient R&D management: efficient budgeting, improved goal achievement
Challenges and Discussion
• UCI concept among stakeholders
: Understanding - Change - Improvement is important for raising the accountability (responsibility + acceptability) of evaluation
: Understanding = communication based on the facts
: Change = the 4 types of corrections
: Improvement = efficiency & effectiveness
Thank you!
Seung Jun Yoo, PhD
biojun@kistep.re.kr
www.kistep.re.kr