Paper ID #11397
The Impact of Educators’ Training in Photovoltaic Solar Energy in Developing Countries
Dr. Rim Razzouk, Arizona State University
Rim Razzouk is a Senior Instructional Designer at Arizona State University’s Ira Fulton School of Engineering. In her current position, Rim leads the curriculum development and the assessment and evaluation
processes for the VOCTEC (Vocational Training and Education for Clean Energy) project. She coordinates the production of instructional materials with subject matter experts. Rim is also responsible for the
data analyses and the write up of research reports for the purpose of continuous curriculum improvement.
Rim has a PhD in Instructional Systems/Educational Technology from the Florida State University (FSU).
Rim also holds a M.Sc degree in Instructional Systems and a Certificate in Human Performance Technology from FSU, and a B.Sc in Information Technology from Notre Dame University. Rim’s major project
and research interests include technology integration in education; assessment and evaluation; learner-centered methods and strategies; and any other methods that assist in enhancing human performance and learning improvement. Rim has authored and co-authored several published articles in peer-reviewed journals and conference proceedings.
Prof. Anshuman Razdan, Arizona State University
Anshuman Razdan is a Professor in the Ira A. Fulton Schools of Engineering in the School of Computing,
Informatics and Decision Systems Engineering (CIDSE). Dr. Razdan has a BS and MS in Mechanical
Engineering and PhD in Computer Science. He has been a pioneer in computing based interdisciplinary
collaboration and research at ASU. He leads the Image and 3D Exploitation and Analysis (I3DEA) lab
(http://i3dea.asu.edu). He is the Principal Investigator and a collaborator on several federal grants from agencies including NSF, NGA, NIH, DHS, the US Army, USAID, and the Science Foundation of Arizona. He has led or participated in over $25 million in grants in his career. Anshuman has published extensively in
refereed journals and conferences and is sought as an invited speaker for many technical and non-technical
forums. He has mentored over 30 Masters, PhDs and Post Docs. Anshuman works with industry and
global organizations and has extensive experience negotiating contracts and executing projects globally, in regions such as the Pacific Islands, Africa, Asia, and the Caribbean.
Dr. Ambika Prasad Adhikari, Arizona State University
Ambika P. Adhikari is Program Manager (Research) at the Office of Knowledge Enterprise and
Development at Arizona State University (ASU). At ASU, he is also a Research Professor (affiliate
faculty) at the School of Geographical Sciences and Urban Planning, and Sr. Sustainability Scientist at
the Julie Ann Wrigley Global Institute of Sustainability. Ambika was Sr. Planner and Impact Fees
Administrator at SRPMIC, Scottsdale, Arizona, and a Village Planner and Project Manager at City of
Phoenix. He was the Nepal Country Representative of the Switzerland based IUCN – International
Union for Conservation of Nature. Earlier, he was a Senior Director at DPRA Inc. in Toronto and
Washington DC. In Nepal, Ambika was an Associate Professor of Architecture and Planning at
Tribhuvan University. He was a member of the Government of Nepal’s National Water and Energy
Commission – the highest policy making body in this sector. He is a Fellow of the American Society of
Nepalese Engineers (ASNE).
© American Society for Engineering Education, 2015
The Impact of Educators’ Training in Photovoltaic Solar Energy in
Developing Countries
Rim Razzouk, Anshuman Razdan, and Ambika P. Adhikari
Arizona State University, 7001 East Williams Field Road, Mesa, AZ 85212
Abstract
The Vocational Training and Education for Clean Energy (VOCTEC) program, at Arizona State
University (ASU), delivers training workshops to support the global objectives of sustainability
and security of energy supply in developing countries through educating, training, and preparing
the people to use their energy resources to enhance their quality of life. In 2011, VOCTEC
received an award from the United States Agency for International Development (USAID) for
creating and delivering long-term vocational education and training in solar photovoltaics (PV) energy systems in the Pacific Islands and Africa. In this paper, we report the effectiveness
of three train-the-trainer (educators) vocational PV trainings that were delivered by the VOCTEC
program in Fiji (2013 and 2014), and Kenya in 2014. The expectation by the end of each training
was that the educators (trainees) would show an increase in learning outcomes (knowledge and
skills acquisition), and demonstrate an enhanced ability to conduct future technician/workforce
trainings on solar PV in their respective countries and communities. A total of forty-two participants in Fiji and Kenya, selected from different institutions, attended the training workshops. They engaged in a 10-day program that comprised an array of training modules on basic and advanced technical topics (e.g., installation of a solar PV system), hands-on exercises, non-technical topics (e.g., gender inclusion), and educational games to reinforce specific concepts taught in the training. The process of curriculum development was based on a specific set of
learning objectives, which motivated the development of the assessments. A framework based on
Kirkpatrick’s evaluation model was used for the assessment and evaluation of the training
intervention. This framework consists of four different focus areas: 1) reaction assessment:
measures the participants’ perception of and satisfaction with the design of the training program
and delivery of the content; 2) learning assessments: measures the extent to which the
participants acquired new knowledge and skills from the training; 3) behavior evaluation:
measures the participants’ ability to apply the newly learned knowledge and skills; and 4)
impact: measures the long-term effect of the training intervention on the educators’ knowledge
and skill acquisition within 6 months of the initial training. The data used to assess the first three
areas was collected via ten different assessment instruments administered at various times during
each workshop. Results from the data analysis indicate a high degree of participant satisfaction
with the training workshops. In terms of learning, results show a significant increase from pre- to post-assessments in all content areas. The performance measures for the hands-on exercises, and participants’ impression of their learning, triangulate the data and support this finding. Regarding the behavior measure, the participants’ perception of their preparedness and confidence in their abilities to train technicians was also high. As of now, the long-term impact measures have been collected for only the first two trainings (Fiji, 2013 and 2014), and results show that educators’ knowledge and skill acquisition were maintained even 6 months after their training. The data
for the long-term impact of the third training is being collected/analyzed. Overall, despite certain
challenges, which will be discussed in the paper, the trainings were effective as evident from the
results. Feedback and insights gained from the trainees allow us to continuously improve future
trainings and the VOCTEC program.
1. Introduction
Renewable energy (e.g., wind, solar, hydro) can support the economic development
efforts of developing countries, many of which are geographically well-placed to exploit the
renewable energy potential. For example, many Pacific Island nations and those in Africa and
Asia have significant potential renewable energy resources, which are under-exploited.
Additionally, these regions are an important hub for renewable energy practitioners and interested aid organizations, as their dependency on fossil fuel imports has exacted heavy
economic and environmental costs [4]. However, these developing countries face a number of
barriers to clean energy development, including limited financial resources, inadequate local
human capacity to support systems, high turnover of trained persons within the population, and a
lack of standardized training for technicians, operators, and engineers [3][2].
Donor agencies continue to invest in solar photovoltaics (PV) and renewable energy
technologies within the Pacific Islands region and Africa. At present, donors are generally poorly
coordinated, and often focus only on technology provision with almost no efforts directed toward
capacity building [1]. Based on the limited initiatives that have been undertaken in these regions,
building capacity in renewable energy technologies could contribute significantly to the
development of the energy sector in the Pacific Islands and Africa.
The Vocational Training and Education for Clean Energy (VOCTEC) program, at
Arizona State University (ASU), delivers training workshops to support the global objectives of
sustainability and security of energy supply in developing countries through educating, training,
and preparing the people to use their energy resources to enhance their quality of life. In 2011,
VOCTEC received a 5-year Leader with Associates (LWA) award from the United States
Agency for International Development (USAID) for creating and delivering long-term
vocational education and training in solar photovoltaics (PV), micro-hydro and small wind
energy systems that will strengthen local capacity (for both men and women) to design, install,
operate, maintain, and repair solar PV energy equipment in the Pacific islands and Africa. In
2012, VOCTEC obtained an Associate Award from USAID Manila for providing solar PV
training in 12 Pacific countries.
Toward the fulfillment of the award obligations, the ASU VOCTEC team designed and
developed curricular materials for training both PV technicians and educators. The training of
educators is one of the key mechanisms for ensuring that there will be a sustained pipeline of
solar PV technicians, particularly beyond the 5-year award. This paper focuses on VOCTEC’s experience with solar PV trainings.
The purpose of this paper is to report the effectiveness of three train-the-trainer
(educators) vocational PV trainings that were delivered by the VOCTEC program in Fiji (2013
and 2014), and Kenya in 2014. The expectation by the end of each training was that the
educators (trainees) would show an increase in learning outcomes (knowledge and skills
acquisition), and demonstrate an enhanced ability to conduct future technician/workforce
trainings on solar PV in their respective countries and communities.
2. Method
2.1. Setting and participants
In collaboration with the University of the South Pacific (USP) in Suva, Fiji, and Strathmore University (SU) in Nairobi, Kenya, Arizona State University (ASU) faculty delivered three educator training workshops. Two workshops took place at the USP campus in Suva, Fiji, one in February 2013 and another in January-February 2014. The third workshop took place at the SU campus in July 2014. To make the program effective, the trainers (educators) were selected based on specific
criteria: 1) have post-secondary education in electronics, electrical engineering, or other related
technology fields; 2) are affiliated with institutional stakeholders in the training industry; and 3)
have prior technical vocational teaching experience. The educators received training on how to
initiate and deliver vocational-level technician and installer training for off-grid solar PV
technologies. The content of the training material and assessments were similar for all the three
workshops.
The 10-day training workshops comprised an array of training modules on basic and advanced technical topics (e.g., installation of a solar PV system), hands-on exercises, non-technical topics (e.g., gender inclusion), and educational games to reinforce specific concepts taught in the training. A total of forty-two participants in Fiji and Kenya, selected from different institutions, attended the training workshops. Only forty-one participants completed the background/demographic survey. Background data revealed that 3 participants were female and 38 were male. Thirty-eight participants reported having some type of technical education
background, affiliation with a technical training institution, and prior teaching experience.
Although not required, 25 participants also reported having received prior training in solar PV.
2.2. Curriculum development
The project started with a needs assessment, involving an analysis of existing knowledge,
skills, and attitudes using self-reporting surveys as well as interviews with organization
management. The ultimate goal of the needs assessment was to help the instructional design
process of the curriculum, to define learning and performance objectives, and to meet the needs
of the local community and participants.
To design, develop, and deliver the learning material, a combination of expertise in
different areas of solar PV was required for the trainers. Faculty and staff from ASU in
collaboration with counterparts at USP and SU were involved at different levels. For
curriculum development, for example, the subject matter experts (SME) provided the initial
content materials to the VOCTEC instructional designer to design and develop training
materials. Each learning module was developed based on a specific set of learning objectives,
which motivated the development of the assessments. Alignment of curriculum and assessments
with learning objectives allows for a more accurate measurement of the level of the participants’
acquisition of knowledge. The process of curriculum development, including the assessments
and discussion items, was done iteratively with the respective SME and a team of reviewers who
provided content specific reviews and feedback to ensure that the instructional material met the
desired objectives and needs.
2.3. Evaluation framework and dependent variables
A framework based on Kirkpatrick’s evaluation model was used for the assessment and
evaluation of the training intervention. This framework consists of four different focus areas: 1)
reaction assessment: measures the participants’ perception of and satisfaction with the design of
the training program and delivery of the content; 2) learning assessments: measures the extent to
which the participants acquired new knowledge and skills from the training in 3 different content
areas (technical content, advanced technical content, and non-technical content); 3) behavior
evaluation: measures the participants’ ability to apply the newly learned knowledge and skills;
and 4) impact: measures the long-term effect of the training intervention on the educators’
knowledge and skill acquisition within 6 to 9 months of the initial training. All the assessments were developed in collaboration with professors (SMEs) who have been using similar assessment items in their classroom courses for many years; the items have been refined throughout the years, which increases their validity. Figure 1 shows the assessment and evaluation model and
the dependent variables (measures).
Figure 1: Assessment and Evaluation Framework
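For readers who prefer a compact representation, the mapping between the four focus areas and the measures and instruments described in this section can be sketched as a simple data structure. The Python snippet below is only an illustrative summary of Figure 1 and the instruments listed later in Table 1; it is not part of the VOCTEC toolset.

```python
# Schematic summary (illustrative only) of the Kirkpatrick-based framework:
# each focus area is mapped to the measures and instruments named in the paper.
EVALUATION_FRAMEWORK = {
    "reaction": {
        "measures": ["satisfaction with course", "structure/organization", "instructors"],
        "instruments": ["Readiness survey", "Post-training evaluation survey"],
    },
    "learning": {
        "measures": ["technical", "advanced technical", "non-technical", "hands-on skills"],
        "instruments": ["pre/post knowledge tests", "Hands-on evaluation"],
    },
    "behavior": {
        "measures": ["perceived preparedness", "confidence to train technicians"],
        "instruments": ["Readiness survey"],
    },
    "impact": {
        "measures": ["long-term knowledge retention", "attitudes toward the program"],
        "instruments": ["Long-term impact assessment", "attitudinal survey"],
    },
}
```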
2.4. Data collection
The data used to assess reaction, learning, behavior, and impact was collected via 11
different assessment instruments administered at various times throughout the workshops. Table 1 lists the assessment instruments, the data captured through each instrument, and the time at which each was administered.
Table 1: Assessment instruments and data collection
| Assessment Instrument | Data Collected | Time of Administration |
| --- | --- | --- |
| Background Survey | Designed to capture background information on the attendees | Completed prior to the start of the workshop |
| Pre-assessment of technician training modules | Designed to measure participants’ understanding of the technical concepts presented in the workshop | Completed at the beginning of the first day, before the training materials were delivered |
| Pre-assessment for non-technical modules | Designed to measure participants’ prior knowledge of the non-technical concepts presented in the workshop | Completed at the beginning of the first day, before the delivery of training material |
| Pre-assessment of advanced technical training modules | Designed to measure participants’ prior knowledge of the advanced technical concepts presented during the second week of the workshop | Completed on the first day of the training, before the delivery of any training materials |
| Post-assessment of technician training modules | Designed to measure participants’ understanding of the technical concepts presented in the workshop | Completed by the end of the training, after all the training materials were delivered |
| Post-assessment for non-technical modules | Designed to measure participants’ understanding of the non-technical concepts presented in the workshop | Completed by the end of the training, after all the training materials were delivered |
| Post-assessment of advanced technical training modules | Designed to measure participants’ understanding of the advanced technical concepts presented during the last couple of days of the workshop | Completed on the last day of the workshop, after all the advanced technical materials were delivered |
| Hands-on evaluation | Designed to measure the performance of the participants on the hands-on laboratory exercises | Completed by the instructional team by the end of the first week of the training, when participants completed the assigned hands-on exercises |
| Readiness survey | Designed to capture each participant’s perception of the value of the training program and their level of preparedness and confidence to deliver future technician training workshops as a result of the VOCTEC training | Completed by the end of the training, following the delivery of all the modules and before taking the post-assessments |
| Post-training evaluation survey | Designed to measure participants’ satisfaction with the training workshop | Completed on the last day of the workshop, following the delivery of all material and before the administration of the post-assessments |
| Long-term impact assessment | Designed to measure participants’ long-term acquisition of knowledge (same assessment as the technician post-assessment) | Completed 6 to 9 months after the initial training workshops |
2.5. Training format/procedures
Each workshop spanned a period of 10 days, with each day beginning at 8:30am and
ending at 5:00pm. There were two 15-minute breaks (one mid-morning and the other mid-afternoon), and an hour-long lunch period each day. The first week of the workshop was focused
on familiarizing participants with the PV Technician curriculum (i.e., technical topic
presentations, related hands-on laboratory exercises, and the two games related to PV sizing and
troubleshooting). Presentations and discussions about non-technical topics (e.g., social and
gender inclusion; tools for effective teaching) were given on the last day of the first week of the
training. In the second week of the training, participants were introduced to advanced PV topics
and related laboratory exercises. Although the advanced topics are not part of the technician
training material, they were covered to strengthen the educators’ overall understanding of PV,
and to improve their ability to explain the technician-level concepts. The trainers who gave the
training had comparable teaching experience and expertise.
3. Results
Preliminary data analysis was conducted for the pretest data to test the equality of means
and equality of error variance. Results revealed no violations of assumptions. For the main
statistical analysis, the dependent variables (outcomes) included learning outcomes (i.e., acquisition of knowledge) in three content areas: technical, advanced technical, and non-technical content modules; hands-on activities (i.e., application of skills through hands-on activities); and attitudes toward the instruction and training workshops. Although there were 42 participants, one participant did not complete the post-training evaluation survey and the readiness survey, and 3 participants did not complete the pre- and post-tests for the advanced technical training modules. Therefore, the number of participants varies among the different outcomes.
3.1. M1-Reaction
The participants’ reaction to the training program was captured using the Readiness Assessment
and the Trainer Post-Training Evaluation Survey. Between these two instruments participants
provided feedback on: 1) overall satisfaction with the training course; 2) satisfaction with the
structure/organization of the course; and 3) satisfaction with instructors. These reaction
performance objectives were assessed in multiple ways using question items that required
participants to provide their response using one of two 3-point scales and/or a 5-point rating
scale. The 3-point scales were used with question items that asked participants about their
perception of either value or knowledge; e.g., “How valuable have you found the presentations
this week.” A score of 1= “not at all valuable” or “not very knowledgeable;” 2 = “somewhat
valuable” or “somewhat knowledgeable;” and 3 = “very valuable” or “very knowledgeable.” The
5-point rating scale was used with question items that asked participants to indicate their
agreement/ disagreement with statements that expressed a desired outcome related to the
objectives, e.g., “The material was presented in an interesting way.” A score of 1= “strongly
disagree;” 2 = “disagree;” 3 = “neither agree nor disagree;” 4 = “agree;” and 5 = “strongly agree.”
Only 41 participants out of 42 completed the surveys. Table 2 shows the aggregate results of the
reaction performance objectives in the different trainings.
Table 2: Results of reaction (M1) performance objectives
| M1: Reaction Performance Objective | Mean | Std. Dev. | N | Scale |
| --- | --- | --- | --- | --- |
| 1. Overall satisfaction with training course | 4.76 | 0.50 | 41 | 5-pt. |
| 2. Satisfaction with structure/organization of course | 4.70 | 0.29 | 41 | 5-pt. |
| 3. Satisfaction with instructor | 4.86 | 0.28 | 41 | 5-pt. |
| – Knowledge level of instructors | 2.92 | 0.16 | 41 | 3-pt. |
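As an illustration of how rating-scale responses could be turned into the means and standard deviations reported in Table 2, the short Python sketch below codes the response labels numerically and aggregates them. The coding dictionaries follow the scales described above; the helper function and sample responses are hypothetical and are not the authors’ analysis script.

```python
from statistics import mean, stdev

# Hypothetical numeric coding of the two rating scales described above. The
# labels match the paper; the helper and sample responses are illustrative only.
THREE_POINT = {"not at all valuable": 1, "somewhat valuable": 2, "very valuable": 3}
FIVE_POINT = {"strongly disagree": 1, "disagree": 2, "neither agree nor disagree": 3,
              "agree": 4, "strongly agree": 5}

def aggregate(responses, scale):
    """Convert text responses to numeric codes and return (mean, std. dev., n)."""
    scores = [scale[r] for r in responses]
    return mean(scores), stdev(scores), len(scores)

# Example with invented responses to one 5-point item:
m, sd, n = aggregate(["strongly agree", "agree", "strongly agree", "agree"], FIVE_POINT)
print(f"M = {m:.2f}, SD = {sd:.2f}, N = {n}")
```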
There were two other reaction question items that differed in structure from the others. The first relates to overall satisfaction with the training course and asked participants whether they would recommend the training to others. All (100%) of the participants responded “yes.” The second
question relates to satisfaction with the structure and organization of the course, and asked
participants to provide feedback on the balance between presentation and interactive discussions
by selecting one of the following options: “not enough presentation”; “not enough discussion”;
“good balance”; “too much presentation”; and “too much discussion”. Thirty-nine out of the 41
participants who completed the post-training evaluation survey felt the balance was good.
3.2. M2-Learning outcomes
The three types of content material, i.e., technical, advanced technical, and non-technical
were presented separately in the workshop. As a result, the participants’ learning or acquisition
of new knowledge and skills in relation to each of these content areas was assessed separately
using different instruments. Collectively, the knowledge measurement instruments consist of
questions that allow for the assessment of participants’: 1) attainment of course objectives; and
2) increase in understanding. The learning outcomes will be further discussed for each content
type in the following sections.
3.2.1. Technical content
The individual learning outcomes on technical content were measured through the
technical content knowledge test at the end of each training workshop. The test included 24 items
that measured the participants’ ability to recall and apply the taught technical concepts. To
determine the effect of the training workshops on the participants’ learning outcomes in the
technical content area in solar PV, two-tailed t-tests were carried out to compare the participants’ pre-test and post-test scores in the technical content area. The
mean pre-test score for all technical content where pre-tests were administered (i.e., the Fiji 2014 and Kenya 2014 trainings) was (M ± SD) = 60.68% ± 12.11%. After the delivery of the technical
material, the overall mean post-test score for these two trainings was (M ± SD) = 84.65% ±
12.85%. The increase in knowledge from pre-test to post-test where comparison was valid (i.e.,
in these two trainings) is statistically significant for the overall technical material (t(41) =24.96, p
<0.05). Table 3 shows the scores in the technical material for each of the trainings.
Table 3: Breakdown of scores in technical material for each of the trainings
| Training | N | Mean technician-level pre-test scores | Mean technician-level post-test scores |
| --- | --- | --- | --- |
| Fiji 2013 | 16 | N/A | 88.00±13.00 |
| Fiji 2014 | 14 | 53.00±14.50 | 84.00±12.00 |
| Kenya 2014 | 12 | 68.36±9.72 | 85.30±13.70 |

* Scores are out of 100 and are given as M±SD.
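To make the pre/post comparison concrete, the sketch below shows how a two-tailed t-test on participants’ percentage scores could be run; a paired test is assumed here because the same participants took both tests, and SciPy is assumed to be available. The score arrays are placeholders, and this is not the authors’ actual analysis code.

```python
import numpy as np
from scipy import stats

# Placeholder pre- and post-test percentage scores for the same participants;
# the real analysis used the technician-level knowledge test scores.
pre = np.array([55.0, 62.5, 48.0, 70.0, 58.0])
post = np.array([80.0, 88.0, 75.0, 92.0, 84.0])

# Paired t-test; the two-tailed alternative is SciPy's default.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t({len(pre) - 1}) = {t_stat:.2f}, p = {p_value:.4f}")
```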
3.2.2. Hands-on activities for technical content
Throughout the training workshops, the participants worked in teams to complete 25
hands-on exercises provided during the training. The hands-on exercises covered topics related to
PV safety, installation, troubleshooting, and maintenance. For each exercise the participants
completed a combination of measurement and computation tasks. However, for the purpose of
assessment, the participants were assessed individually on the troubleshooting exercise, which
required them to follow the correct troubleshooting process and to fix the
identified problem(s) in the PV system. The troubleshooting exercise was chosen because it
incorporated the knowledge learned from all the other hands-on exercises. The performance of
the individuals on the exercise was assessed by instructors using the hands-on evaluation form on
the last day of the technician-level material. Participants were rated on 4 major troubleshooting
steps:
Step 1: Describe the symptoms of the problem
Step 2: Diagnose/identify the problem using a systematic approach
Step 3: Find the cause of the major problem
Step 4: Fix the problem
Participants were assessed on a 0-3 scale where a score of 0 = “Bad: didn’t perform the step
correctly”; 1 = “Fair: missing major details”; 2 = “Good: missing minor details”; 3 = “Very good/Excellent: completed the step without missing any details”. Table 4 shows the aggregate
results of the 4 steps for the three trainings.
Table 4: Aggregate results of the troubleshooting steps
| Step | Mean | Std. Dev. | N |
| --- | --- | --- | --- |
| Step 1 | 2.91 | 0.35 | 42 |
| Step 2 | 2.81 | 0.48 | 42 |
| Step 3 | 2.60 | 0.67 | 42 |
| Step 4 | 2.86 | 0.38 | 42 |
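The rubric above lends itself to a simple scoring structure. The sketch below encodes the four troubleshooting steps and the 0-3 scale and validates one participant’s ratings; the step labels and scale come from the paper, while the helper function and sample ratings are hypothetical.

```python
# Step labels and 0-3 scale as described for the troubleshooting exercise;
# the scoring helper and the example ratings below are illustrative only.
RUBRIC_STEPS = [
    "Describe the symptoms of the problem",
    "Diagnose/identify the problem using a systematic approach",
    "Find the cause of the major problem",
    "Fix the problem",
]
SCALE = {0: "Bad", 1: "Fair", 2: "Good", 3: "Very good/Excellent"}

def score_participant(ratings):
    """Validate one participant's four step ratings (each 0-3) and pair them with the steps."""
    assert len(ratings) == len(RUBRIC_STEPS), "one rating per step expected"
    assert all(r in SCALE for r in ratings), "ratings must be on the 0-3 scale"
    return dict(zip(RUBRIC_STEPS, ratings))

# Example: one hypothetical participant's ratings on the four steps.
print(score_participant([3, 3, 2, 3]))
```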
3.2.3. Advanced technical content
The second outcome variable was the individual learning outcomes in advanced technical
content. This learning outcome was measured through the advanced technical knowledge test
which comprised 19 questions that examined the participants’ understanding and knowledge of
the advanced technical content. The test was administered at the end of the trainings after the
advanced technical materials were delivered. The pre- and post-tests for the advanced technical content were completed by only 39 participants. To determine the effect of the training workshops on the participants’ learning outcomes in the advanced technical material, two-tailed t-tests were carried out to compare the participants’ pre-test and post-test scores in the advanced technical content area. The data analysis yielded significant
differences (t(38) = 29.73, p <0.05) between the pre-test mean score (M ± SD) = 65.98% ±
17.50% and the post-test mean score for all the advanced technical content (M ± SD) = 86.10% ±
8.86%. Table 5 shows the overall scores in each of the trainings.
Table 5: Breakdown of scores in advanced technical material for each of the trainings
| Training | N | Mean advanced-level pre-test scores | Mean advanced-level post-test scores |
| --- | --- | --- | --- |
| Fiji 2013 | 15 | 70.00±13.00 | 87.00±8.00 |
| Fiji 2014 | 12 | 73.00±13.00 | 86.00±11.00 |
| Kenya 2014 | 12 | 54.94±26.46 | 85.20±7.57 |

* Scores are out of 100 and are given as M±SD.
3.2.4. Non-technical content
The participants’ knowledge in the non-technical content area was also measured through
a knowledge test (non-technical knowledge test) that included 12 assessment items related to the
non-technical content (e.g., gender inclusion, entrepreneurship, and project management). Two-tailed t-tests were carried out to measure the effect of each of the trainings on the participants’ knowledge scores of the non-technical content. Overall, the mean post-test scores (M ± SD) = 77.70% ± 8.44% were higher than the pre-test scores (M ± SD) = 66.87% ± 11.24% of the non-technical knowledge test. The increase in knowledge from pre-test to post-test is statistically
significant for the overall non-technical material of all the trainings (t(41) = 23.03, p <0.05).
Table 6 shows the overall scores of the non-technical knowledge tests in each of the trainings.
Table 6: Breakdown of scores in non-technical material for each training
| Training | N | Mean non-technical level pre-test scores | Mean non-technical level post-test scores |
| --- | --- | --- | --- |
| Fiji 2013 | 16 | 82.00±11.00 | 90.00±5.00 |
| Fiji 2014 | 14 | 61.00±11.00 | 72.00±9.50 |
| Kenya 2014 | 12 | 57.61±11.74 | 71.00±10.82 |

* Scores are out of 100 and are given as M±SD.
3.3. M3-Behavior
The participants’ ability to apply and perform the newly learned knowledge and skills
following the training has so far been measured in terms of their perceived preparedness and
confidence to fulfill the objectives of the educator training program; specifically: to present PV
instructional material and demonstrate hands-on exercises in a technician training, and to utilize
inclusion and teaching strategies to train technicians. The participant responses in this respect
were captured using the Readiness Assessment instrument. This instrument utilizes a 3-point
scale where 1= “not at all prepared” or “not at all confident;” 2 = “somewhat prepared” or
“somewhat confident;” and 3 = “very prepared” or “very confident;” to have participants respond
to 5 question items that asked about their:
a) Preparedness to teach the solar PV technician course
b) Preparedness to provide technicians with information on the importance of women’s
involvement in energy transactions
c) Preparedness to use inclusive teaching practices, foster community in the classroom, and
help students make connections to the material
d) Preparedness to provide technicians in training with information on business opportunities related to PV and the entrepreneurship process
e) Confidence to recruit women for the technician training
Only 41 participants out of 42 completed the survey. Table 7 shows the aggregate results of the readiness question items for the three trainings.
Table 7: Aggregate results of the readiness question items

| Question | Mean | Std. Dev. | N |
| --- | --- | --- | --- |
| Item a | 2.81 | 0.29 | 41 |
| Item b | 2.73 | 0.47 | 41 |
| Item c | 2.85 | 0.35 | 41 |
| Item d | 2.78 | 0.40 | 41 |
| Item e | 2.64 | 0.49 | 41 |
3.3.1. Overall perception of learning and increase in knowledge
Two items were included on the Trainer Post-Training Evaluation Survey to further
assess participants’ overall perception of learning and increase in knowledge. The first of these
items asked participants to indicate their agreement with the statement: “I learned new things from the material covered in the training.” Responses were given using a 5-point scale that ranged from 1 = “strongly disagree” to 5 = “strongly agree”. The average (M ± SD) response was 4.78 ± 0.42, indicating that the participants highly valued the training.
The second item asked participants to indicate their agreement with the statement: “I learned new skills (e.g., instructional techniques) that will improve my ability to deliver future trainings.” Responses were given using the same scale as the previous question. The average (M ± SD) response was 4.71 ± 0.48.
3.4. M4- Impact
The impact assessment was measured using the long-term technical content assessment
that measured participants’ long-term understanding and knowledge acquisition of the technical
concepts presented at the workshops. This assessment consisted of the same items that were
administered in the post-assessment of the technical content. The long-term assessments were
administered within 6 to 8 months after the delivery of the trainings. So far, the impact
assessments have been collected only for the trainings that took place in Fiji in 2013 and 2014. The Kenya impact assessments are still being collected.
In total, only 13 out of 30 participants from the 2013 and 2014 educator trainings completed the long-term impact assessments. The results showed that the overall mean post-test score for those participants in 2013 and 2014 was (M = 86.69%). Around 8 months after these educator trainings, the overall mean score of the long-term impact test for those 13 participants was (M = 88.36%). In other words, the participants’ knowledge level remained stable between the training workshops and the impact assessment data collection period, which indicates a long-term positive impact on the trained participants’ knowledge acquisition. Table 8 shows the breakdown of scores in the
technical material based on the trainings.
Table 8: Breakdown of scores in technical material for each training
| Training | N | Mean Post-Test Scores | Mean Long-term Impact Scores |
| --- | --- | --- | --- |
| Fiji 2013 | 8 | 86.00 | 89.00 |
| Fiji 2014 | 5 | 87.38 | 87.72 |
In addition to the technical questions, the training program’s long-term impact was also measured using an attitudinal survey that captured each participant’s perception of the
value and impact of the program on their career and delivery of technician training workshops;
e.g., “The VOCTEC training program helped me improve the existing teaching methods and
strategies at my institution”; or “The VOCTEC training program helped me utilize the hands-on
technical activities in technician trainings”.
The participants provided feedback on 4 categories:
a) Improving techniques and proficiency;
b) Improving delivery of lectures and hands on material;
c) Enhancing institution course; and
d) Utilizing the hands-on technical activities
The participants provided their responses on a 5-point rating scale. A score of 1= “strongly
disagree”; 2 = “disagree”; 3 = “neither agree nor disagree”; 4 = “agree”; and 5 = “strongly
agree.” The mean score on all categories exceeded 4.
Table 9 shows the aggregate results of the attitudinal impact survey question items for the two trainings.
Table 9: Aggregate results of the attitudinal impact survey question items for the two trainings
| Question | Mean | Std. Dev. | N |
| --- | --- | --- | --- |
| Item a | 4.40 | 0.64 | 13 |
| Item b | 4.43 | 0.42 | 13 |
| Item c | 4.25 | 0.37 | 13 |
| Item d | 4.20 | 0.75 | 13 |
On the attitudinal impact survey, participants were also asked to provide information on the
approximate number of technicians that they have trained (through VOCTEC or non-VOCTEC
technician trainings) after completing the L-2 trainings.
The total number of technicians trained by the educators who attended the L-2 2013 and 2014 trainings is 225. More specifically, 141 technicians were trained by educators who received the L-2 2013 training, and 84 technicians were trained by educators who received the L-2 2014 training.
4. Summary and discussion
The three solar photovoltaic training workshops for educators were attended by 42
participants. They were all affiliated with an organization or institution that facilitates technical
training. The purpose of these trainings was to strengthen local capacity (for both men and
women) to design, install, operate, maintain, and repair solar PV energy equipment in the Pacific
islands and Africa.
The attendees all responded positively to their training experience. They expressed high
levels of satisfaction with the overall program, the structure/organization of the course, and the instructors. In terms of learning, results also showed a significant increase from pre- to post-assessments in all content areas (technical, advanced technical, and non-technical content). The
performance measures for the hands-on exercises, and participants’ impression of their learning,
triangulate the data and support the findings. Regarding the behavior measure, the participants’
perception of their preparedness and confidence in their abilities to train technicians was also high. Feedback and insights gained from the trainees will allow us to continuously improve
future trainings and the VOCTEC program.
As of now, the long-term impact measures were collected for only the first two educator
trainings (Fiji, 2013 and 2014), and results show that educators’ knowledge and skill acquisition
were maintained 8 months after their training.
Although the trainings had positive effects on the trainees, several challenges were encountered by VOCTEC at different times during the trainings. The first challenge was the difficulty of recruiting a large number of females for technical trainings. Another challenge was language diversity, which in some cases required local coordinators to translate some words or assessment items. The third major challenge was the difficulty of following up with the trainees to conduct long-term impact evaluations due to communication issues; for example, the lack of, or slow, internet connectivity in these countries led to a lack of responses from the participants.
Despite these and other challenges, the trainings were effective as evident from the
results. If other academic institutions decide to conduct similar trainings, they should take into consideration these and other challenges and limitations that they might face. The VOCTEC trainings only incorporated pre-/post-assessments within the same group; therefore, it would be interesting to implement an experimental research design that allows comparison between comparable groups: a group receiving the VOCTEC training and a group that did not attend the training. It would also be interesting to follow up on the two separate groups and look
at the longitudinal effects after a year.
Overall, the results indicate that the implementation of such trainings in developing countries does elicit greater knowledge and learning in the solar PV area. The growth of the renewable
energy market will continue to require increased technical know-how in developing countries,
including local capabilities to adapt, install, operate, and maintain technologies and to build local
manufacturing industries. Therefore, in light of the growing attraction of renewable energy,
national governments and international donors should continue to support trainings and
education in different renewable energy areas (e.g., solar PV, wind, micro hydro) to strengthen
local capacity and to achieve and secure sustainable energy supply.
Bibliography
[1] Acker, R.H., & Kammen, D.M. (1994). The quiet (energy) revolution: analyzing the dissemination of
photovoltaic power systems in Kenya. Report, Energy Policy.
[2] International Renewable Energy Agency (2013). Pacific lighthouses: renewable energy opportunities
and challenges in the Pacific Islands region. Report, IRENA.
[3] Johnston, et al. (2004). Renewable energy in Africa: prospects and limits. Fiji national report, GEF,
UNDP, SPREP and the Pacific Islands.
[4] Martinot, E., Chaurey, A., Lew, D., Moreira, J.R., & Wamukonya, N. (2002). Renewable energy markets in developing countries. Annual Review of Energy and the Environment, 27, 309-348.
Biography
Rim Razzouk, Ph.D. is a Senior Instructional Designer at Arizona State University’s Ira Fulton School of
Engineering. In her current position, Rim leads the curriculum development and the assessment and evaluation
processes for the VOCTEC (Vocational Training and Education for Clean Energy) project. She coordinates the
production of instructional materials with subject matter experts. Rim is also responsible for the data analyses and
the write up of research reports for the purpose of continuous curriculum improvement. Rim has a PhD in
Instructional Systems/Educational Technology from the Florida State University (FSU). Rim also holds a M.Sc
degree in Instructional Systems and a Certificate in Human Performance Technology from FSU, and a B.Sc in
Information Technology from Notre Dame University. Rim’s major project and research interests include
technology integration in education; assessment and evaluation; learner-centered methods and strategies; and any
other methods that assist in enhancing human performance and learning improvement. Rim has authored and co-authored several published articles in peer-reviewed journals and conference proceedings.
Anshuman Razdan, Ph.D. is a Professor in the Ira A. Fulton Schools of Engineering in the School of Computing,
Informatics and Decision Systems Engineering (CIDSE). Dr. Razdan has a BS and MS in Mechanical Engineering
and PhD in Computer Science. He has been a pioneer in computing based interdisciplinary collaboration and
research at ASU. He leads the Image and 3D Exploitation and Analysis (I3DEA) lab (http://i3dea.asu.edu). He is the Principal Investigator and a collaborator on several federal grants from agencies including NSF, NGA, NIH, DHS, the US Army, USAID, and the Science Foundation of Arizona. He has led or participated in over $25 million in grants
in his career. Anshuman has published extensively in refereed journals and conferences and is sought as an invited
speaker for many technical and non-technical forums. He has mentored over 30 Masters, PhDs and Post Docs.
Anshuman works with industry and global organizations and has extensive experience negotiating contracts and executing projects globally, in regions such as the Pacific Islands, Africa, Asia, and the Caribbean.
Ambika P. Adhikari, Ph.D. is Program Manager (Research) at the Office of Knowledge Enterprise and
Development at Arizona State University (ASU). At ASU, he is also a Research Professor (affiliate faculty) at the
School of Geographical Sciences and Urban Planning, and Sr. Sustainability Scientist at the Julie Ann Wrigley
Global Institute of Sustainability. Ambika was Sr. Planner and Impact Fees Administrator at SRPMIC, Scottsdale,
Arizona, and a Village Planner and Project Manager at City of Phoenix. He was the Nepal Country Representative
of the Switzerland based IUCN – International Union for Conservation of Nature. Earlier, he was a Senior Director
at DPRA Inc. in Toronto and Washington DC. In Nepal, Ambika was an Associate Professor of Architecture and
Planning at Tribhuvan University. He was a member of the Government of Nepal's National Water and Energy
Commission – the highest policy making body in this sector. He is a Fellow of the American Society of Nepalese
Engineers (ASNE).
Acknowledgement
The authors would like to acknowledge the generous support by the United States Agency for International Development (USAID) (Leader with Associates (LWA) award - Cooperative Agreement No. AID-OAAL-11-00005, and Associate Award and MFAT - Cooperative Agreement No. AID-492-LA-12-00002). Our
thanks are extended to all the VOCTEC team members and professors who contributed to the success of
the training program.