Assessing the Digital Competence of Secondary Teachers in Agusan National High
School: A Quantitative Research Using Digital Competence of Educators Framework
A Capstone by
Balmocena, Gerald L.
Oca, Erica Mae
Porta, Pamela M.
Submitted to the Department of Information Systems
College of Computing and Information Sciences (CCIS)
Caraga State University – Main Campus
In Partial Fulfillment
of the Requirements for the Degree
Bachelor of Science in Information System
June 2024
APPROVAL SHEET
This capstone project entitled Assessing the Digital Competence of Secondary Teachers in Agusan National High School: A Quantitative Research Using Digital Competence of Educators Framework, prepared and submitted by Gerald L. Balmocena, Erica Mae Oca, and Pamela M. Porta, in partial fulfillment of the requirements for the degree Bachelor of Science in Information System, is hereby accepted.
ELBERT S. MOYON
Capstone Adviser
JANE FRANCIS P. JAICTIN, MBA
Chair, Oral Examination Panel
JENIE L. PLENDER-NABAS, MSc.
Panel Member
IVY G. NALAM
Panel Member
Accepted and approved for the conferral of the degree Bachelor of Science in Information System in the 2nd semester of SY 2023-2024.
VICENTE A. PITOGO, DIT
Dean, CCIS
DEDICATION
We wish to express our sincere gratitude and dedication to everyone who has helped us along the way as we pursued knowledge and completed our capstone project. First and foremost, we are grateful for our families' unwavering support and understanding. Their constant encouragement gave us the foundation to begin our academic pursuits, and their faith in us has been a source of hope during difficult times.
We are truly grateful to our friends and mentors. Our educational experience has been enriched by your advice, wisdom, and contagious passion. Our team's collaborative spirit, built on the different viewpoints and skill sets each member brings to the table, is evidence of the effectiveness of teamwork. As we stand at the point of success, we recognize that this accomplishment is as much yours as it is ours.
Lastly, this capstone project is devoted to the pursuit of knowledge itself. It is a tribute to tenacity, intellectual curiosity, and the passion for learning. We hope that this work adds, even in a small way, to the enormous body of knowledge that is human understanding. We gratefully dedicate our capstone effort to all those who support the pursuit of excellence and the transformational potential of education.
ACKNOWLEDGMENT
The proponents wish to extend their heartfelt appreciation to all individuals whose
contributions were vital in ensuring the successful completion of this capstone project.
Their unwavering support and commitment greatly contributed to its overall success.
To begin with, the authors wish to express their sincere appreciation to their
capstone adviser, Mr. Elbert S. Moyon, for his consistent mentorship, helpful advice,
and the information he shared, all of which significantly improved the quality of this
capstone project.
Additionally, the authors extend their appreciation to the esteemed members of the capstone advisory panel, Ma'am Ivy G. Nalam, Ma'am Jenie L. Plender-Nabas, and Ma'am Jane Francis P. Jaictin (chairperson of the defense panel), for the precious time and insightful feedback they provided during the capstone defense. Their guidance was crucial to the development and improvement of this study.
The parents and guardians of the proponents, whose presence and unwavering
support have been an immense source of strength, are acknowledged. The
inexhaustible support, encouragement, and financial assistance they have provided
serve as sources of motivation, encouraging the proponents to put in additional effort.
Lastly, the proponents would like to thank their research colleagues, associates,
and peers for their unwavering support, valuable feedback, and relevant information,
which helped the capstone project succeed. Their help, the shared memories, and the collaboration will always be appreciated.
ABSTRACT
This study employs a stratified sampling approach to assess the digital competence of
educators at Agusan National High School, revealing diverse ages and teaching
experiences through demographic analysis of respondents. Utilizing Partial Least
Squares Structural Equation Modelling (PLS-SEM), the study examines the
relationships between various variables of the DigCompEdu Framework to understand
its relationship to digital competence (DC) as the higher-order construct. To determine
educators' levels of digital competency, it also makes use of the DigCompEdu
competencies and progression model. Reliability and validity of constructs are ensured
through Cronbach's alpha, composite reliability (CR), convergent, and discriminant
validity tests. The Fornell-Larcker criterion, cross-loadings, and average variance extracted (AVE) are used to evaluate the validity of latent variables, ensuring the robustness of the study's measurements. The results of structural equation modelling show that professional engagement, use of digital resources, integration of digital technologies into teaching and learning, digital assessment practices, empowering learners, and facilitating learners' digital competence are all positively related to educators' digital competence. This shows how important these factors are in creating effective digital teaching practices.
This research provides a comprehensive understanding of educators' digital
competence at Agusan National High School, emphasizing the importance of various
factors in shaping effective digital teaching practices. The findings contribute to the
broader discourse on digital literacy in education and offer insights for educational
institutions aiming to enhance their educators' digital competencies.
Keywords: DigCompEdu, PLS-SEM, Digital Competence
TABLE OF CONTENTS
APPROVAL SHEET ...............................................................................................................ii
DEDICATION ....................................................................................................................... iii
ACKNOWLEDGMENT ......................................................................................................... iv
ABSTRACT ..........................................................................................................................vi
TABLE OF CONTENTS.......................................................................................................... 1
LIST OF FIGURES ................................................................................................................. 5
LIST OF TABLES .................................................................................................................. 6
CHAPTER 1. INTRODUCTION ............................................................................. 7
1.1 Background of the Study ........................................................................................ 7
1.2 Statement of the Problem..................................................................................... 11
1.3 Objectives of the Study ........................................................................................ 12
1.4 Significance of the Study ...................................................................................... 12
1.5 Scope and Limitation of the Study ...................................................................... 13
CHAPTER 2. REVIEW OF RELATED LITERATURE ............................................................ 15
2.1 Exploring Digital Literacy and Digital Competence ............................................... 15
2.2 Educators' Digital Competence ............................................................................. 18
2.3 Identified Gaps in Educators' Digital Competence .............................................. 23
2.4 Theoretical Framework...........................................................................................26
Common Framework of Reference for Intercultural Digital Literacies (Sindoni et al., 2019)........................................................................... 29
2.5 The DigCompEdu Framework ............................................................................... 30
2.5.1 Professional Engagement...........................................................................32
2.5.2 Digital Resources ........................................................................................32
2.5.3 Teaching and Learning ...............................................................................33
2.5.4 Assessment .................................................................................................33
2.5.5 Empowering Learners ............................................................................... 34
2.5.6 Facilitating Learners’ Digital Competence .............................................. 34
2.5.7 Progression Model .....................................................................................35
2.6 Theoretical Framework ...........................................................................37
2.6.1 Professional Engagement ......................................................................... 38
2.6.2 Digital Resources ....................................................................................... 38
2.6.3 Teaching and Learning .............................................................................. 39
2.6.4 Assessment ................................................................................................ 39
2.6.5 Empowering Learners............................................................................... 40
2.6.6 Facilitating Learners’ Digital Competence ............................................... 41
CHAPTER 3 METHODOLOGY......................................................................................... 42
3.1 Research Method................................................................................................. 42
3.2 Identifying Respondents ...................................................................................... 43
3.3 Identifying Critical Dimensions/Questionnaire Tool ................................ 44
3.4 Identifying Relationship between Independent and Dependent Variable ......... 47
3.5 Data Collection ....................................................................................................... 48
3.6 Data Analysis .......................................................................................................... 49
3.7 Reliability and Validity Construct ........................................................................... 50
3.8 Progression Model and Aligned Scoring Rule for Assessing Digital Competence
of Educators .........................................................................................................52
CHAPTER 4 RESULTS AND DISCUSSION......................................................................... 54
4.1 Analysis of the Respondent Demographic Profile ............................... 54
4.2 Measurement Model ..............................................................................................55
Cross Loadings Result ................................................................................................. 60
4.3 Structural Equation Modeling ................................................................................ 61
4.4 Participants' Digital Competence ......................................................................... 68
4.5 Average Score by Competence ............................................................................. 70
4.6 Participants' DC Based on Years of Teaching ...........................................................73
4.7 DC Based on Age ...................................................................................... 74
4.8 Discussions ............................................................................................................. 76
4.9 Implication ............................................................................................................. 79
4.9.1 Implications for Practice ..................................................................................... 79
4.9.2 Implications for Future Practice......................................................................... 80
CHAPTER 5 SUMMARY, CONCLUSION AND RECOMMENDATION.............................. 83
5.1 Summary ................................................................................................................. 83
5.2 Conclusion ............................................................................................................ 84
5.3 Recommendations .............................................................................................. 85
LIST OF FIGURES
Figure 2-1. DigiLit Leicester (Fraser et al. 2013).......................................................... 16
Figure 2-2. ICT Competency Framework for Teachers from UNESCO (2018). ......... 17
Figure 2-3. Digcomp 2.2 Framework (Vuorikari et al. 2022). .................................... 18
Figure 2-4. Common Framework of Reference for Intercultural Digital Literacies
(Sindoni et al., 2019). ................................................................................................... 19
Figure 2-5. DigCompEdu Framework from Redecker (2018). ................................... 21
Figure 2-6. Progression Model of DigCompEdu (Redecker 2017)............................. 24
Figure 2-7. Proposed Framework Adapted from DigCompEdu (Redecker 2018) ... 26
Figure 3-1. Conceptual Framework of the Study. ...................................................... 31
Figure 4-1. Structural Model Results ........................................................................... 51
Figure 4-2. Participants' Level of Digital Competence (Derived from Benali et al.,
2018)............................................................................................................................. 54
Figure 4-3. Average scores by competence (Derived from Benali et al., 2018; Dias
Trindade et al., 2020). ................................................................................................. 55
Figure 4-4. Participants' Digital Competence Based on Years of Experience
(Derived from Benali et al., 2018). .............................................................................. 58
Figure 4-5. Digital Competence Based on Age (Derived from Benali et al., 2018)... 59
LIST OF TABLES
Table 3-1. Survey Questionnaire Tool .......................................................................... 33
Table 4-1. Demographic Profile of Respondents. ...................................................... 42
Table 4-2. Constructs' Reliability Test ......................................................................... 44
Table 4-3. Convergent Validity of Lower Order Constructs. ..................................... 45
Table 4-4. Fornell-Larcker Criterion .......................................................................... 47
Table 4-5. Collinearity Statistic (VIF)........................................................................... 49
Table 4-6. Path Coefficients Result ............................................................................ 52
Table 4-7. Hypothesis Testing Results........................................................................ 53
CHAPTER 1. INTRODUCTION
1.1 Background of the Study
Digital literacy, defined by Gilster (1997), is crucial for individuals to effectively
comprehend and use information presented through computers and the internet. In
the 21st century, digital literacy is foundational for personal and professional success
(Tejedo et al., 2020). Within education, digital competence and literacy encompass
essential skills like online communication, data literacy, and multimedia creation, all
used responsibly, critically, and confidently (Johannesen et al., 2014). Digital
technology has transformed education, offering various avenues to enhance
teaching and learning (Haleem et al., 2022), with tools like Microsoft Teams, Google
Classroom, Canva, learning management systems (LMS), and video conferencing
tools like Zoom being commonly used (Ballano et al., 2022).
The importance of educators possessing digital literacy and competency is
emphasized by the increasing role of digital technology in education (Tejedo et al.,
2020). Educators play a crucial role in guiding students through modern challenges,
leveraging technology for teaching, communication, and professional development
(Utami et al., 2019; Buabeng-Andoh, 2012). As our society becomes more hybrid,
sustainable learning environments require digitally literate teachers who can
effectively integrate digital tools (Dias-Trindade et al., 2022). Educators must enhance
their competence to meet the needs of the twenty-first century and adapt their
teaching strategies to changing educational contexts (Caena, 2019).
Despite significant investments in ICT infrastructure and professional development, many countries still have limited adoption and integration of ICT in teaching and learning (Buabeng-Andoh, 2012; Forutanian, 2021). In the Philippines,
the COVID-19 pandemic prompted a sudden shift to synchronous and asynchronous
digital classes, testing educators' adaptability to flexible learning using digital
resources (Al-Lily et al., 2020). This shift challenged educators' professional roles,
career satisfaction, and digital literacy compared to traditional teaching methods (Li
& Yu, 2022). Consequently, educators needed to enhance their Digital Competence
(DC) to teach effectively on long-distance online platforms due to the increased use
of digital educational technologies. As a developing country, the Philippines is
working to improve its online education system and the digital capabilities of its
educators. The COVID-19 outbreak led the country's educational systems to reassess
their proficiency in utilizing digital tools for learning, preparing for potential similar
circumstances (Ballano et al., 2022). However, there hasn't been enough research on
the effective use of digital tools by educators for online instruction, particularly in the
context of flexible learning, highlighting the need for further exploration in this area.
Agusan National High School (ANHS), located in Butuan City, Agusan del Norte,
with 400 educators from Junior and Senior High School, aims to produce competent
learners well-prepared in education and technology. However, concerns remain
regarding the digital competency and technology integration skills of its teaching
staff. ANHS may conduct internal evaluations, but a comprehensive assessment of
educators' digital competency using a different method is needed to identify gaps
and enhance the learning process, given the evolving nature of technology. This
study is crucial due to educators' critical impact on students' futures, especially at
ANHS, and the need to assess its teaching staff's digital competency. Some of the
problems faced by ANHS include limited access to digital tools, varying levels of
digital literacy among teachers, and potential gaps in how technology is used in the
curriculum. Addressing these issues is important as they directly affect the quality of
education students receive in the digital age (Garzon et al., 2023).
The DigCompEdu framework, published in 2017 through collaborative efforts
across Europe, defines the essential digital competencies required by educators to
integrate technology effectively into their teaching methods, addressing the evolving
digital landscape. DigCompEdu enables educators to engage proficiently with digital
technologies in education and foster digital competence among teachers and
educators (Redecker, 2017). Studies have demonstrated that DigCompEdu is an
effective tool for assessing educators' digital competence, exhibiting good reliability
and internal consistency (Colás-Bravo et al., 2021; Benali et al., 2018; Ghomi et al.,
2019).
To assess the digital competence levels of educators in ANHS and how they utilize
digital technologies effectively in teaching, the study will draw upon the DigCompEdu
framework. The results can guide the development of interventions and training
programs to assist educators in enhancing their digital literacy and integrating
technology effectively into their lesson plans. The study's findings will contribute to
the existing information on teachers' digital literacy and provide guidance for future
research in this field (Abella et al., 2023).
1.2 Statement of the Problem
The COVID-19 pandemic has emphasized the increasing importance of technology
in education. While technology offers opportunities to enhance teaching and
learning, there is a significant gap in teachers' digital competence, hindering effective
technology integration (Fraillon et al., 2019).
1. There is insufficient research on educators' effective use of digital tools for
online instruction, especially regarding their digital pedagogical practices
(Ballano et al., 2022).
2. Addressing this gap is crucial, as demographic factors like educators' ages
correlate with digital competence, with younger educators often showing
higher proficiency and enthusiasm for new technologies (Saripudin et al., 2021).
Therefore, this study aims to assess the digital competence of secondary
teachers at Agusan National High School. It also seeks to identify the factors
contributing to the gap in digital competence and explore strategies for enhancing
teachers' digital skills, utilizing the Digital Competence of Educators Framework.
1.3 Objectives of the Study
The general objectives of the study are to assess the current digital competence of
ANHS teachers, identify their proficiency levels, and evaluate how ANHS can support
their development of digital literacy skills to effectively utilize technology for
education. Additionally, the relationship between an educator's years of service and age and their digital competence will be investigated; as highlighted by Benali et al. (2018), there is a potential association between years of teaching experience and educators' age and their digital competence levels.
Specifically, the project aims to:
1. Assess ANHS educators' digital competence levels.
2. Identify the relationship between educators' age and years of service in relation
to the current level of digital competence among its educators.
1.4 Significance of the Study
The study aims to give significance to the following sectors of our community:
The Educators/Teachers - The results of this study will add to teachers' current body of knowledge about their digital competency. By analyzing the findings, the study can uncover gaps in teachers' knowledge and skills and make recommendations for increasing their digital competency levels, which can result in their students receiving better teaching and learning outcomes. This can eventually help educators become
more competitive in the job market, increasing their chances of professional growth
and progress.
The junior and senior high school students – This study is essential for empowering
junior and senior high students by encouraging educators to become more tech-savvy
and skilled at incorporating technology into lesson plans, making their classrooms
more interesting, collaborative, and productive. This serves as preparation that is vital
for their transition to tertiary education.
The Institution - This study provides a necessary and comprehensive review of
educators' levels of digital competence in ANHS, highlighting areas for improvement
and motivating the decision-makers to hold conferences, seminars, and training
sessions appropriate for educators. In addition to promoting more digital competency
and literacy, this action will raise the standard of education in the school,
strengthening their ability to compete globally and to quickly adopt new trends in the
constantly developing digital world.
1.5 Scope and Limitation of the Study
The study assesses the digital competence of ANHS teachers, including both junior
and senior high teachers, using a stratified sampling technique at Agusan National High
School. Researchers collect information from ANHS teachers regarding their current
digital competency levels using the DigCompEdu framework and their utilization of
digital tools in teaching techniques. However, because the survey is carried out during
working hours, the respondents' involvement is only based on their voluntary
participation in the questionnaire. Thus, the study is not able to encompass the entire
population of educators at ANHS. Data gathering is limited to institutional self-reports,
and the conclusions may not be relevant to other schools. Furthermore, this research
does not investigate the impact of educators' digital skills on student learning
outcomes; rather, it solely focuses on educators' self-assessment of their digital
competency. The responsibility for utilizing and enhancing their competency levels lies
with the educators and the institution themselves.
CHAPTER 2. REVIEW OF RELATED LITERATURE
This chapter presents the researchers' in-depth review of the literature on the topic "Digital Competence". It includes the ideas, published literature, generalizations or conclusions, methodologies, and other works related to the said topic. This chapter will help the reader grasp the idea of the topic and what the researchers want to achieve.
2.1 Exploring Digital Literacy and Digital Competence
Concepts like "digital competence" and "digital literacy" are being used more
frequently in public discourse. However, they differ according to whether the terms
are established by policy, research, or both, as well as whether they put more of an
emphasis on social practices or technical abilities (Spante et al. 2018). While "digital
competence" and "digital literacy" are sometimes used interchangeably, it is essential
to discern their distinct definitions and distinctions (Ilomäki et al., 2011).
Coined by Paul Gilster in 1997, "digital literacy" underscores cognitive abilities and
the application of information from diverse sources, prioritizing critical thinking over
technical proficiency (Chan et al., 2017). Joosten et al. (2012) adopt Gilster's (1997) definition of digital literacy, emphasizing the adaptation of skills to a new medium. They claim that our experience with the Internet is shaped by how well we master its
core competencies. Widana et al. (2018) define digital literacy as cognitive abilities
crucial for locating, assessing, producing, and sharing online content for success in
education, the workplace, and interpersonal relationships. Highlighting its critical
importance in today’s world, Kuek and Hakkennes (2020) emphasize that digital
literacy is fundamental for proficient technology use. This involves the ability to handle
technological devices, including both hardware and software functionalities (Machin-Mastromatteo, 2012). Moreover, the concept of "digital literacies," as highlighted by
Dudeney et al. (2016), underscores the integration of technical proficiency with an
understanding of appropriate social behavior. In a similar way, digital literacy is thus defined as "the capabilities required to thrive in and beyond education, in an age when digital forms of information and communication predominate" (Littlejohn et al., 2012, p. 547). In summary, the idea of digital literacy has been connected to diverse agendas and perspectives, encompassing technical proficiency, cognitive abilities, social practices, and proactive interaction with digital content (Spante et al., 2018).
Competency refers to the capacity to perform a task by applying the skills,
information, and attitudes acquired via learning (Abella et al. 2023). The European
Union (EU) Commission identifies digital competence as one of the key competences
necessary for personal fulfillment, active citizenship, social cohesion, and
employability (European Parliament, 2006). Digital competence involves basic ICT
skills, legal and ethical principles, information processing skills, creativity, and critical
thinking (UNESCO, 2017). In relation to that, Tømte et al. (2015) defines digital
17
competence for teachers as proficiency in using ICT with effective teaching judgment.
Understanding the effects on learning methodologies and student digital
development is required. Digital competence is more complicated and holistic ICT use
with pedagogical judgment in educational environments. The concentration is on
pedagogy and subject matter, with technical skills falling under the complex digital
competence idea. (Tsankov et, al. 2017) (Morellato, M. (2014). Since educators must
handle the subject and instructional tools, digital competency is crucial. According to
this notion, digital competence helps educators learn and update professional abilities
(Spante et, al. 2018).
In conclusion, digital competence in professional development for educators refers
to the instructor's ability to use ICT to improve students' knowledge and
understanding (Krumsvik, 2009). Furthermore, in education, digital competence, also
known as digital literacy, involves basic digital skills such as online communication,
understanding data, and creating multimedia. It means using these skills responsibly,
critically, and confidently in educational settings. However, it's important to note that
while competence is often used similarly to literacy, they are not direct synonyms
(Johannesen et al., 2014). Therefore, the researchers opt for the term "Digital Competence" in this study, as it forms a solid connection with the specialized knowledge and skills of instructors and plays a pivotal role in enhancing educators' professional growth (Spante et al., 2018).
2.2 Educators' Digital Competence
A study by Cruz (2018) employed a descriptive-correlational method to assess
the digital literacy skills and engagement levels of elementary teachers within selected
private schools in Cavite, aiming to inform the development of an enhanced
technology-rich teaching program. Using a structured assessment tool, the study
evaluated six key components of digital literacy. The findings revealed an overall high
level of digital literacy among respondents, categorized into varying proficiency levels.
However, specific gaps were identified, particularly in creative use and information
navigation. These gaps were discerned through a comprehensive analysis of
assessment results, utilizing statistical techniques such as frequency analysis and
correlation assessments. Qualitative data from surveys or interviews may have also
been analyzed to provide deeper insights. These methodological approaches
facilitated the identification of precise areas for intervention, informing targeted
training programs aimed at bridging identified gaps in digital literacy skills among
elementary teachers and bolstering overall digital competence.
The study of Abella et al. (2023) in Olongapo City comprehensively examined teachers'
digital literacy (DL) and digital competence (DC), identifying factors that influence
their development. Using a descriptive-correlational design with 274 participants, the
study employed validated instruments and statistical tests to address the research
hypothesis. A hierarchical multiple regression model revealed significant predictors of
respondents' DL and DC. The study highlights negative correlations between digital literacy and age/pre-service training, emphasizing that younger teachers and those with pre-service ICT training exhibit higher digital competence. This implies that teachers' age strongly influences the degree to which they adopt new technology; younger teachers tend to be more willing and capable than older ones (Saripudin et al., 2021).
In addition to the challenges posed by the COVID-19 pandemic, which limited
access to resources and hindered the development of positive ICT attitudes, the
study's three-stage regression analysis underscores the cumulative influence of
personal/work-related variables, ICT factors, and attitudes on digital literacy and
competence. This means that people who have had positive experiences with
technology, have access to good resources, and have a positive attitude towards
technology are more likely to be good at using technology (Kim et al., 2018).
Despite these findings, Olongapo City teachers demonstrate a positive attitude toward
computers and digital literacy, with favorable outcomes in the affective domain,
particularly in perceived usefulness and control. Nevertheless, the study's findings
reveal persisting gaps: the lack of adequate digital literacy programs in the Philippines
to bridge the digital divide, the scarcity of resources addressing the current
phenomenon, and the need to investigate the utilization and capacity of online
learning platforms and teachers' digital competencies in the Philippines.
A study by Dela Fuente and Biñas (2020) evaluated teachers' ICT
competence in a Philippine high school and proposed an intervention program based
on the findings. The study employed a descriptive research design to assess teachers'
ICT proficiency using the NICS-Basic skill set, covering ICT basics, word processing,
spreadsheets, presentations, information and communication, computer ethics, and
security. The key findings revealed that teachers' ICT competence was generally at an
intermediate level, and factors such as age, gender, highest educational attainment,
and teaching position did not significantly influence their ICT proficiency. However,
similar to the study of Abella et al. (2023), the number of ICT-related seminars and trainings attended in ICT basics, spreadsheets, computer ethics, and security was found to be a significant factor in improving teachers' ICT competence.
The study suggests that teachers can enhance their ICT competence for teaching
purposes by attending ICT seminars and training specifically focused on low-level ICT
skill sets. Additionally, teacher education programs and professional development
initiatives should prioritize improving teachers' ICT competence to ensure they can
deliver quality education in the digital and technological era. The study recommends
that school administrators reevaluate and strengthen their ICT programs by providing
appropriate seminars and training to enhance teachers' ICT competence, as training
and seminars have been proven effective in improving digital proficiency.
A study by Mumbing et al. (2021) aimed to determine teachers' attitudes and
technological competence. Descriptive statistics like mean and standard deviation
were used to assess Southern Mindanao respondents' data. The study found that
educators have mixed attitudes towards online teaching, finding some aspects
challenging, similar to the findings of Moralista and Oducado (2020). However, in
contrast to the perceived drawbacks of online education, such as academic
dishonesty, lack of personal touch, and technological challenges, teachers in this study
disagreed with the popular belief that students are more likely to cheat in online
classes. They also have an interest in learning new applications and technologies for
online teaching. The study's findings align with those of Huang & Liaw (2011), which
demonstrated that teachers with high technological competence tend to have a
positive attitude towards online teaching, while teachers with low technological
competence tend to have a negative attitude towards online teaching.
A study by Guillen-Gamez et al. (2021) aimed to describe the digital competence levels of teachers across various knowledge areas, genders, and age groups. Results indicate a general deficiency in digital training among teachers, irrespective of gender, age, or knowledge area. The sample comprised 13.10% of educators in Andalusia, Spain, focusing on Higher Education professors in the region. Despite the study's relatively small sample size, the DigCompEdu Self-Reflection Tool proved effective in achieving the main goal of the study. The framework
provided a structured approach to measuring digital competence, allowing for a
comprehensive analysis of educators' skills and capabilities in the digital domain. By
using the DigCompEdu framework, the study was able to compare digital competence
levels among professors from various fields of knowledge and age groups, providing
valuable insights into the training needs for enhancing educators' digital skills.
A study by Salminen et al. (2021) aimed to explore how the Basics of Digital
Pedagogy training affects the digital teaching skills of healthcare educators and
candidates. Researchers used pre- and post-tests with the OODI tool to track changes
in participants' digital competence levels. The main goal of this study is to explore the
connection of an educational intervention on the competence of health care educators
and educator candidates in digital pedagogy. Despite the study's small sample size of
only 20% of the total sample, the researchers effectively utilized DigCompEdu as a
framework to interpret respondents' data and achieve study goals. This framework
provided a structured approach to assess educators' digital competence in teaching
and learning areas. Aligning with DigCompEdu, the Basics of Digital Pedagogy
intervention covered essential aspects of digital pedagogy.
Using DigCompEdu, researchers measured participants' self-assessed digital
pedagogy competence pre- and post-intervention, enabling a comprehensive
evaluation and tracking of improvement. The study found the intervention led to
improvements across all competence areas outlined in DigCompEdu, indicating its
success in enhancing participants' digital pedagogy proficiency.
A study by Vieira et al. (2023) aimed to measure the digital proficiency of a sample
of Portuguese teachers and examine differences in digital proficiency across various
STEM subjects, including mathematics and natural sciences, physics and chemistry,
and biology and geology, with a total sample size representing only 21% of the entire
sample size. Despite the study's small sample size, the DigCompEdu framework and its
self-reflection instrument were effective in interpreting respondent data and
achieving study goals. This framework provided a structured approach to assessing
digital competence among educators, facilitating systematic evaluation of strengths
and areas for improvement. Utilizing the DigCompEdu Check-In instrument,
researchers evaluated teachers' digital proficiency across STEM subjects, enhancing
reliability and validity. Overall, the framework guided the study and enabled effective
comparison and analysis of digital proficiency among STEM teachers.
In contrast, a substantial association has been found between respondents'
attitudes toward online teaching and technology ability, suggesting that a positive
attitude predicts technological proficiency. This research highlights the need for
significant improvements in teachers' attitudes towards online teaching and their
technological competence. As suggested by Dela Fuente and Biñas (2020) and Ballano
et al. (2022), regional and central offices should provide more support to help teachers
improve in these areas. These studies highlighted the need for focused seminars and
training programs tailored to low-level ICT skill sets to significantly enhance digital
competency knowledge and skills (Dela Fuente & Biñas, 2020; Ballano et al., 2022). Teachers also have a responsibility to seek out opportunities to learn about online teaching and technology and to integrate them into their teaching methods (Tezci, 2011).
2.3 Identified Gaps in Educators' Digital Competence
A common pattern emerges from studies undertaken by Cruz (2018), Abella et al.
(2023), and Dela Fuente and Biñas (2020) in the field of digital literacy and competence
among educators. Collectively, these studies underline the imperative for intensive
training and seminars to effectively address specific gaps identified in digital literacy
and competence among teachers. The recognition of this need forms a foundational
aspect for enhancing educators' proficiency in navigating the digital landscape.
Additionally, the studies by Abella et al. (2023) and Cruz (2018) shed light on the
interplay between age and pre-service training, emphasizing their significant impact
on digital competence. These findings advocate for targeted interventions tailored to
different age groups, acknowledging the diverse needs and experiences of educators.
In addition, Mumbing et al. (2021) and Dela Fuente and Biñas (2020) delve into
the interdependent relationship between teachers' positive attitudes toward online
teaching and their technological competence. This highlights the importance of
cultivating positive attitudes as a catalyst for improving technological proficiency
among educators. Furthermore, Abella et al. (2023) and Mumbing et al. (2021) bring
attention to resource limitations, indicating a vital need for additional resources and
infrastructure to bridge the digital gap. The collective insights from these studies
underscore the multifaceted nature of challenges faced by educators, from age-related impacts to the crucial role of attitudes and the overarching need for enhanced
resources to fortify digital competencies.
Digital competence emerges as a pivotal skill set for educators in contemporary
society, as highlighted by Basilotta et al. (2022). The swift pace of technological
advancement underscores the indispensability of digital literacy in educators'
professional growth (Nguyen et al., 2023). With the responsibility of seamlessly
integrating digital technologies into the educational landscape, educators must exhibit
proficiency in navigating digital tools and platforms (Gümüş et al., 2023). The
exigencies of the COVID-19 pandemic have accentuated the need for educators to
possess robust digital literacy skills to effectively facilitate online teaching and
implement modern pedagogical models (Sánchez-Cruzado et al., 2021).
However, the inadequacy of digital literacy training for instructors and the
unpreparedness of higher education institutions for unforeseen events, such as the
pandemic, have significantly impacted teaching and learning outcomes (Udeogalanya,
2022). Addressing contemporary educational challenges necessitates an enhancement
of instructors' competency profiles and a paradigm shift in teaching methodologies to
empower 21st-century learners (Caena, 2019). As education, particularly online
learning, continues to evolve amidst rapid digitalization and computerization, the
adoption of a digital curriculum and the cultivation of digital literacy skills emerge as
imperative strategies to enhance teaching and learning outcomes (Forutanian, 2021).
From these studies, it is evident that digital competence encompasses a spectrum of skills,
including digital literacy, proficiency in utilizing digital tools for instruction, adaptability
to technological advancements, and the effective integration of digital technologies
into educational practices.
2.4. Theoretical Framework
Figure 2-1. DigiLit Leicester (Fraser et al., 2013)
The DigiLit Leicester framework, developed by Fraser et al. (2013), serves as a
comprehensive tool for assessing and enhancing the digital literacy skills of secondary
school teachers. Encompassing various aspects such as information management,
content creation, assessment, communication, safety, and professional development,
the framework categorizes proficiency into four levels: Entry, Core, Developer, and
Pioneer. Teachers can utilize this framework to self-assess their digital literacy, identify
areas for improvement, and follow suggested enhancements at each level. While the
framework remains valuable for gauging competence, it's crucial to acknowledge its
release in 2013 and the evolving nature of technology. To ensure accurate
assessments, educators should stay updated on current pedagogical theories
addressing digital challenges (From, J. et al., 2017), emphasizing the need for the
framework to reflect the ever-changing landscape of digital education (Nguyen et al.,
2023).
Figure 2-2. ICT Competency Framework for Teachers from UNESCO (2018)
The ICT Competency Framework for Teachers from UNESCO (2018) provides
recommendations and abilities needed for teachers to effectively incorporate
technology into their duties related to teaching, learning, and assessment. It helps
policymakers and teacher educators create effective training programs that help
educators learn digital skills and get ready for the digital age. The framework can
improve educational quality by improving the application of technology in the
classroom and the digital literacy of teachers. It also has three levels, namely Basic ICT competence, Intermediate ICT competence, and Advanced ICT competence.
Although the framework is useful for assessing and promoting the integration of ICT into educators' pedagogic competences, it does not address how to measure digital literacy as a whole because it focuses more on the technical aspects of digital literacy. Digital literacy is not merely about technical skill in using ICT; it goes beyond technical proficiency to include the ability to use technology successfully as well as knowledge of the social standards surrounding its proper use (Akayoglu et al., 2020).
Figure 2-3. Digcomp 2.2 Framework (Vuorikari et al. 2022)
The DigComp 2.2 framework stands as a vital tool in advancing digital skills in
Europe, providing a unified reference for individuals, companies, and governments.
With pillars such as communication, cooperation, problem-solving, digital content production, and safety, it offers proficiency levels crucial for success in the digital era. The framework serves policymakers in assessing citizens' digital literacy and devising improvement strategies. Despite its significance, limitations include an individual focus, neglecting broader socioeconomic factors like the digital divide. Its applicability is confined to the European context, necessitating frequent updates
to remain relevant amid evolving technologies (Vuorikari et al., 2022; From, J. et al.,
2017; Nguyen et al., 2023). Notably, its relevance to the study on educators' digital
competence is limited.
Figure 2-4. Common Framework of Reference for Intercultural Digital Literacies
(Sindoni et al., 2019)
The Common Framework of Reference for Intercultural Digital Literacies (CFRIDiL)
outlines five core competencies—informational, communicational, creative, critical,
and safe—within three proficiency levels. It emphasizes international skills, diversity,
and ethical considerations in the digital realm, promoting global awareness and
intercultural understanding. While similar to DigComp 2.2, CFRIDiL uniquely focuses on
intercultural digital competences. However, lacking a formal certification program and
global applicability, its acceptance and resource support need enhancement (Sindoni
et al., 2019). Despite the limitations of various frameworks, including outdated
versions and narrow scopes, each contributes valuable insights, albeit with barriers to
broader application and relevance in assessing educators' digital competence, the
primary focus of the study.
2.5 The DigCompEdu Framework
A study by Colás-Bravo et al. (2021) used the DigCompEdu model to analyze research on digital competence and sustainability over a ten-year period. The study found that the model was helpful in understanding how technology is used in teaching,
with a focus on pedagogical digital competences. It also identified areas related to
sustainable development, such as inclusion and educational quality, which are linked
to teaching digital competence.
Studies conducted by Benali et al. (2018) and Ghomi et al. (2019) adapted a scoring rule aligned with the Common European Framework of Reference (CEFR). Each question awards 0 points to the lowest answer option, 1 to the second lowest, and so on, so that the maximum number of points per question is 4 and the maximum total score is 88. The studies mentioned above found that DigCompEdu is an effective tool for measuring educators' digital competence, with findings showing that the instrument has good reliability and internal consistency (Cronbach's alpha).
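To make the aligned scoring rule concrete, the sketch below illustrates how a respondent's total score could be computed. It assumes, for illustration only, a 22-item questionnaire whose answer options are coded 0 to 4 (22 x 4 = 88 points); the item count, option coding, and example responses are assumptions of this sketch, not the study's actual instrument or data.

```python
# Illustrative sketch of the CEFR-aligned scoring rule described by
# Benali et al. (2018) and Ghomi et al. (2019): each item offers answer
# options scored 0 (lowest) to 4 (highest), so a 22-item instrument
# yields a total between 0 and 88. Item contents and responses here
# are hypothetical.

NUM_ITEMS = 22  # assumed item count (22 items x 4 points = 88)

def total_score(responses):
    """Sum the 0-4 points for one respondent's selected options."""
    if len(responses) != NUM_ITEMS:
        raise ValueError(f"expected {NUM_ITEMS} responses, got {len(responses)}")
    if any(r not in (0, 1, 2, 3, 4) for r in responses):
        raise ValueError("each response must be coded 0 to 4")
    return sum(responses)

# Example: a hypothetical respondent who mostly picks the middle options.
example_responses = [2, 3, 1, 2, 4, 2, 3, 2, 1, 2, 3,
                     2, 2, 3, 1, 2, 2, 3, 2, 1, 2, 3]
print(total_score(example_responses))  # -> 48 out of a maximum of 88
```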
Rapidly advancing technologies and globalization have led to the digital
revolution’s expansion and increased use of digital media. Due to this, there is a
demand for remote teaching and distance learning, particularly amid the COVID-19
pandemic (Whalen et al. 2020). Teachers themselves must be literate in order to
support young learners in their development of competence and to ensure the best
use of information and communication technologies (ICTs) (Napal et al. 2018). Digital
competence has risen in popularity in the educational context and is now one of the
most important skills that teachers need to possess in modern society (Basilotta et al.,
2022).
Figure 2-5. DigCompEdu Framework from Redecker (2018)
The DigCompEdu framework provides a comprehensive and structured approach
to teachers' digital competence. It ensures that educators have the necessary
attitudes, skills, and knowledge to effectively integrate digital technology into their
teaching practices (Redecker, 2017). The development of DigCompEdu involved
extensive collaboration and consultation with experts from various backgrounds,
resulting in a well-rounded and evidence-based framework. Additionally, the
framework incorporates insights from diverse sources, including local, national,
European, and international instruments, further enhancing its relevance and
applicability in different educational contexts (Ghomi et al., 2019; Punie et al., 2017; Cabero et al., 2020).
2.5.1 Professional Engagement
Digital competence in educators, as defined by Redecker (2017), encompasses their
ability to utilize technology for professional interactions with various stakeholders,
fostering their own growth and contributing to organizational innovation. The
framework consists of four subcomponents: leveraging digital technologies for
enhanced communication and instructional practices, engaging in professional
collaboration through digital platforms, practicing reflective evaluation of pedagogical
and digital approaches, and participating in continuous professional development
through various online resources and collaborative learning environments.
2.5.2 Digital Resources
Educators must identify, adapt, and manage digital educational materials that
meet learning objectives. Redecker (2017) divides these competencies into three parts.
First, educators should use effective search strategies to find learning- related digital
resources. Second, beyond selection, educators must be able to create or modify
digital resources for learning objectives. Finally, educators must be skilled at
managing, protecting, and ethically sharing digital resources while considering
copyright laws and material reuse.
2.5.3 Teaching and Learning
Digital technology enhances many teaching methods. Redecker (2017) states that
educators need digital competency to integrate technology into various learning
phases and contexts. Teaching requires creating, structuring, and using digital
technology. This domain has four main subcomponents: Digital technology improves
student learning when teachers structure material and interactions.
Second, digital teaching strategies must be assessed to facilitate educational
technology innovation and efficiency. Third, digital tools enable collaborative learning
and information creation. Fourth, digital technology helps students track progress,
collect data, and learn lifelong (Redecker, 2017).
2.5.4 Assessment
Digital education technology can support new evaluation methods and improve
assessment systems. It produces rich student behavior data that requires extensive
research and assessment to guide decisions. Teachers can modify strategies, provide
timely feedback, and track progress with digital tools. They allow customized progress
and learning outcomes assessment. Teachers must critically evaluate digital learning
data to improve instruction and student performance. Students and parents can track
progress and set goals with digital tools (Redecker, 2017).
2.5.5 Empowering Learners
Digital technologies in education enable student-centered learning and active engagement, allowing customized learning experiences. Through strategic digital tool
use, educators foster student openness, personalization, and active participation. To
reduce inequality, all ages, especially those with special needs, need equal access to
digital tools. Educators should differentiate and personalize instruction to match
students' learning styles, paces, and abilities to encourage active learning (Redecker
2017).
2.5.6 Facilitating Learners’ Digital Competence
Students must know how to utilize digital technologies safely and responsibly. This area covers media and information literacy, digital communication, content creation, and ethical use: it entails accessing internet resources, comprehending digital ethics, and becoming responsible digital citizens. Independent learning, academic success, and career success also require digital problem-solving (Redecker, 2017).
2.5.7 Progression Model
Figure 2-6. Progression Model of DigCompEdu (Redecker 2017)
The Framework also suggests a progression model for educators to assess and
improve their digital competence. It outlines six stages of digital competence
development to help educators determine how to improve their skills at their current
level. Bloom's updated taxonomy inspired these stages and their developmental
theory. This taxonomy is thought to explain the cognitive stages of learning, from
"Remembering" and "Understanding" to "Applying" and "Analysing" to "Evaluating"
and "Creating" (Armstrong, P. 2010).
Similarly, DigCompEdu's first two stages, Newcomer (A1) and Explorer (A2),
incorporate new information and develop basic digital practices; the next two,
Integrator (B1) and Expert (B2), apply, expand, and reflect on these practices; and the
highest two, Leader (C1) and Pioneer (C2), pass on knowledge, critique existing practice, and develop new practices.
Newcomer (A1): Educators who are new to adopting digital technology in the
classroom are known as newcomers. They may simply use it for administrative tasks;
thus, encouragement and incentives are needed to help them realize its potential
(Redecker et al., 2017).
Explorer (A2): Explorers are aware of the possibilities of digital technologies and are
interested in using them to improve pedagogical and professional activities. They have
begun to use digital technologies in some areas of digital competency, but without a
comprehensive or uniform strategy (Redecker et al., 2017).
Integrator (B1): Integrators use digital technologies in many ways and for different
purposes. They use them creatively to improve their careers. They want to expand
their practices. However, they are still determining which tools work best in specific
contexts and aligning digital technology with pedagogical ideas and approaches
(Redecker et al., 2017).
Experts (B2): Experts improve their work with confidence, creativity, and critical
thinking using digital technologies. They actively select digital technologies and weigh
digital strategy pros and cons. They are curious and open to new ideas, knowing there is much they have not yet tried.
They experiment to structure and solidify their methods. Any educational institution
needs experts for innovation (Redecker et al., 2017).
Leader (C1): Leaders consistently and thoroughly use digital technology to improve
teaching and practice. They use many digital methods to pick the best for each
occasion. They review and improve their methods constantly. Sharing ideas informs
coworkers of new discoveries. They inspire others by sharing knowledge (Redecker et
al., 2017).
Pioneer (C2): Pioneers are leaders among educators who challenge both educational
and digital practices. They aspire to further reinvent education by experimenting with
advanced digital tools and new forms of teaching. They are rare and serve as role
models for younger educators by driving innovation (Redecker et al., 2017).
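Building on the 0-88 scoring rule described earlier in Section 2.5, the sketch below shows how a total score could be mapped to one of the six progression levels. The cut-off points used here are illustrative assumptions for this sketch; the actual thresholds depend on the version of the DigCompEdu instrument and scoring rule adopted (e.g., Benali et al., 2018).

```python
# Illustrative mapping from a 0-88 total score to a DigCompEdu
# proficiency level. The cut-off points below are assumed for this
# sketch; actual thresholds depend on the instrument version used.

LEVEL_THRESHOLDS = [
    (19, "Newcomer (A1)"),
    (33, "Explorer (A2)"),
    (49, "Integrator (B1)"),
    (65, "Expert (B2)"),
    (80, "Leader (C1)"),
    (88, "Pioneer (C2)"),
]

def proficiency_level(score):
    """Return the DigCompEdu level label for a 0-88 total score."""
    if not 0 <= score <= 88:
        raise ValueError("score must be between 0 and 88")
    for upper_bound, label in LEVEL_THRESHOLDS:
        if score <= upper_bound:
            return label

print(proficiency_level(48))  # -> "Integrator (B1)" under these cut-offs
```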
2.6 Theoretical Framework
Figure 2-7. Theoretical Framework Adapted from DigCompEdu (Redecker 2017)
There are six dimensions that make up the Digital Competence of an Educator, namely Professional Engagement (PE), Digital Resources (DR), Teaching and Learning (TL), Assessment (A), Empowering Learners (EL), and Facilitating Learners' Digital Competence (FLDC); together, these encompass the attitudes, abilities, and knowledge crucial for the proficient use of digital technology within a learning environment (Redecker, 2018).
2.6.1 Professional Engagement
Engaged teachers demonstrate higher commitment, experience, and investment
in education, leading to enhanced digital literacy. This proficiency enables effective
integration of technology into teaching methods, positively impacting student
learning (Becker et al. 2000). Professional development further boosts teachers'
capacity to incorporate technology, improving overall learning outcomes. Including
place/community pedagogies in teacher education programs fosters professional
engagement, preparing educators for collaboration within teaching networks and the
broader community (Green 2016). Thus, the researchers hypothesized that:
H1: Professional Engagement is positively related to the digital competency level of
an educator.
2.6.2 Digital Resources
Digital literacy involves using technology to access, evaluate, and generate data.
However, resource shortages and outdated equipment may hinder digital competence
implementation. Teachers can improve their digital literacy by receiving learning and
professional development opportunities. Digital tools can help instructors become
more proficient in technology, improving student learning. Thus, using digital tools in
the classroom is crucial to improving teacher and student digital literacy (Pratolo, B.
W., & Solikhati, H. A., 2021). Thus, the researchers hypothesized that:
H2: Educators' use of Digital Resources in educating learners is positively related to
their digital competence.
2.6.3 Teaching and Learning
Developing digital literacy for teaching and learning is important since it is a
transversal ability with obvious educational consequences. In order to enhance digital
literacy, educators can use a variety of strategies, such as offering professional
development opportunities, incorporating technology into lesson planning, and
setting an example of responsible online behavior. Teaching professionals can improve
their approaches, student engagement, and learning outcomes by becoming more
digitally literate (Marín, V. I., & Castaneda, L. 2023). Thus, the researchers hypothesized
that:
H3: Educators who integrate teaching and learning through digital technologies are
positively related to their digital competence.
2.6.4 Assessment
Digital technology, particularly electronic rubrics, enhances the evaluation of
students' learning by providing precise information on abilities and performance
guidelines. This enables students to monitor their progress and allows teachers to
refine their pedagogical approaches. Incorporating digital technology into
assessments has been shown to improve student performance and readiness for practical
tests, and to foster independence and collaboration. Teachers should consider student
perspectives and exercise caution when selecting structured criteria and performance
levels. In conclusion, digital technology is a crucial tool for effective assessment,
contributing to improved student learning outcomes (Casey and Jones 2011). Thus, the
researchers hypothesized that:
H4: Educators who employ digital technology in assessing students' performance
exhibit a positive relationship with their digital competence level.
2.6.5 Empowering Learners
According to Meyers et al. (2013), educators who are skilled in using technology in
the classroom can make learning more interesting and productive for learners. For
students who may have limited access to educational materials because of their
location or socioeconomic status, digital technology can improve accessibility in
education. Digital technology can enhance learning by personalizing it to fit each
learner's needs and interests. A key component of contemporary education is
empowering students through the use of technology, emphasizing the value of having
digitally competent educators. Thus, the researchers hypothesized that:
H5: Empowering learners towards digital technology is positively related to the
digital competence level of an educator.
2.6.6 Facilitating Learners’ Digital Competence
Educators' efforts to support students' digital information and communication
abilities, which can be considered an essential component of their professional
growth aimed at fostering students' digital competence, are positively correlated with
their own digital practice; this relationship holds for instructors' self-efficacy,
frequency of use, and perception of the value of ICT. The study found that teachers'
own technological abilities, confidence, and consistency in using technology in lesson
plans significantly improve when they focus their attention on improving their students'
digital competencies.
Teachers are better able to appreciate the benefits that technology can offer to
education. This shows how prioritizing student digital competencies can boost
teaching techniques and instructor digital competency (Loving, C. 2023). Thus, the
researchers hypothesized that:
H6: Educators who facilitate learners' digital competence are positively related to
their digital competence level.
CHAPTER 3 METHODOLOGY
This chapter provides an in-depth overview of the study's methodology. It clarifies
the research area and the rationale for selecting it, and it addresses the research
design, concepts, and approaches. In addition, it presents the data collection
techniques and the methods for validating the instrument, collecting the data, and
conducting the analysis.
3.1 Research Method
Figure 3-1. Conceptual Framework of the Study
In this section, the research methodology for conducting the study is outlined,
encompassing the chosen data collection approach and the statistical techniques to
be employed. The study heavily relies on the DigCompEdu Framework developed by
Redecker et al. (2017). Firstly, the researchers aim to explore the relationship among
the constructs of DigCompEdu to determine the extent to which they capture digital
competence. This exploration also aims to provide an overview of how each variable
correlates with digital competence. Secondly, the study seeks to assess the digital
competence level of ANHS teachers using the 22 competencies outlined in the
DigCompEdu Framework, employing a quantitative approach.
Quantitative research is used for pattern analysis, prediction, and testing causal
relationships, while qualitative research is used for understanding ideas and exploring
past observations (Bhandari, 2021). Qualitative research analysis is less popular
in research papers due to the time-consuming nature of organizing data into themes
and the difficulty in generalizing the findings to a wider audience (Elkatawneh 2016).
Furthermore, a correlational research approach was chosen to determine the
connection between the age range and years of service of the educators towards their
digital competence level.
3.2 Identifying Respondents
The participants in this study consisted of teachers currently employed at
Agusan National High School, encompassing both junior and senior high school
levels, who voluntarily agreed to take part. The researchers aimed to achieve a
sizable sample size for the survey; however, due to time constraints and the survey's
scheduling during working hours, participation was limited. Consequently, data was
collected from 107 participants, representing a small fraction of the total population
of educators at ANHS, which amounts to 400 individuals. This limitation was
attributed to the survey's timing during normal working hours. The researchers
emphasized that respondents' participation in the study was solely based on their
voluntary engagement with the survey questionnaire, with only those expressing
willingness included. In the methodology, the researchers explicitly outlined their
intention to target junior and senior high school teachers, aligning with the study's
focus on assessing digital competence levels within secondary education. This
deliberate targeting aimed to provide insights relevant to the research objectives
outlined in the study's scope.
Table 3-1. Total number of respondents

Age                  Frequency    Percentage
20-30                37           34.58%
31-40                30           28.04%
41-50                20           18.69%
51-60 and above      20           18.69%
Total                107          100%

Years of Service     Frequency    Percentage
1-10                 63           58.87%
11-20                24           22.43%
21 and above         20           18.69%
Total                107          100%
3.3 Identify Critical Dimensions/Questionnaire Tool
The survey questionnaire relies heavily on the European Framework for the Digital
Competence of Educators developed by Redecker et al. (2017), with 22 competencies.
The DigCompEdu framework seeks to provide educators of all levels and in a wide
range of circumstances with a common point of reference and direction as they work
to improve their digital teaching competencies. The framework is meant to act as a
background structure to support the execution of training programs for digital
competencies and to guide policy (Redecker et al. 2017). The scoring rule for the
instrument allocates 0 points to the lowest answer option, 1 to the second lowest, and
so on, so that the maximum number of points per question is 4. The maximum total
number of points is 88 (Benali et al., 2018; European Commission Joint Research
Center, 2018).
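To make the scoring rule concrete, the short sketch below totals one respondent's 22 item answers (coded 0 to 4) into a DigCompEdu score out of 88. It is only an illustrative Python snippet under the coding just described; the function name and checks are not part of the original instrument.

def total_digcompedu_score(responses):
    """Sum the 22 DigCompEdu item scores (each coded 0-4) into a total out of 88."""
    if len(responses) != 22:
        raise ValueError("Expected answers to all 22 DigCompEdu items")
    if any(not 0 <= r <= 4 for r in responses):
        raise ValueError("Each answer must be coded from 0 (lowest) to 4 (highest)")
    return sum(responses)

# A respondent answering "Often" (coded 3) on every item scores 3 * 22 = 66.
print(total_digcompedu_score([3] * 22))  # 66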
Table 3-2. Survey Questionnaire Tool (Redecker et al., 2017)

Construct: Professional Engagement (Source: Redecker et al., 2017)
  PE1   I have the ability to use digital technologies to enhance organizational communication with learners, parents, and third parties.
  PE2   I use digital technologies to engage in collaboration with other educators, sharing and exchanging knowledge and experience.
  PE3   I constantly evaluate my practices, develop my skills, and seek professional growth.
  PE4   I am continuously expanding and updating my digital skills and knowledge through targeted training and development opportunities.

Construct: Digital Resources (Source: Redecker et al., 2017)
  DR1   I identify, assess, and select digital resources for teaching and learning.
  DR2   I consider the specific learning objective, context, pedagogical approach, and learner group when designing digital resources and planning their use.
  DR3   I can organize digital content and make it available to learners, parents, and other educators.

Construct: Teaching and Learning (Source: Redecker et al., 2017)
  TL1   To increase the efficacy of instructional interventions, I organize and incorporate digital tools and resources into my teaching.
  TL2   By utilizing digital tools and services, I enhance my relationships with students both individually and collectively, during and after class.
  TL3   I encourage students to use technology in collaborative assignments as a way to improve teamwork, communication, and the sharing of knowledge.
  TL4   I help students manage their own learning through the use of digital technologies, assisting them in planning, measuring, and commenting on their progress as well as discussing ideas and coming up with creative solutions.

Construct: Assessment (Source: Redecker et al., 2017)
  A1    I gather data on learners' progress and keep track of the learning process using digital assessment tools.
  A2    I can use digital evidence to give learners feedback on their performance and progress and guide them towards areas where they need to improve.
  A3    Digital tools help me modify my teaching methods, give students timely feedback, and provide tailored assistance based on data.

Construct: Empowering Learners (Source: Redecker et al., 2017)
  EL1   I choose and apply digital pedagogical tactics that take into account the learners' competencies, expectations, attitudes, misconceptions, and misuses of technology, as well as the contextual limits on their technology use (such as accessibility).
  EL2   I use digital tools to fulfill the various learning requirements of my pupils, enabling them to advance at different rates and levels while upholding their own particular learning objectives.
  EL3   I use technology to get students excited and involved in their learning. This helps them think critically, be creative, and solve real-world problems while making the subject matter more engaging and hands-on.

Construct: Facilitating Learners' Digital Competence (Source: Redecker et al., 2017)
  FLDC1 I create activities and assignments that help students express what they need to learn, search online, organize, analyze, and verify information.
  FLDC2 As a teacher, I incorporate lectures, assignments, and exams that demand students use digital platforms for civic involvement, communication, and collaboration in an ethical and efficient manner.
  FLDC3 I want my learning activities, assignments, and assessments to allow students to express themselves through technology, teach them how to create and change digital content, and teach them about copyright, citing sources, and crediting licenses.
  FLDC4 I make learners aware of the consequences of online misbehavior (e.g., cyberbullying, hacking) and teach them what to do if others misbehave.
  FLDC5 I encourage learners to use digital technologies creatively to solve concrete problems.
3.4 Identifying Relationship between Independent and Dependent Variable
To determine the relationship of the areas of DigCompEdu namely: Professional
Engagement, Digital Resources, Teaching and Learning, Assessment, Empowering
Learners, Facilitating Learners’ Digital Competence and digital competence, a partial
least square structural equation model (PLS-SEM) analysis was applied to carry out this
study. Structural Equation Modeling (SEM) is a statistical method used by researchers
in fields such as the social, behavioral, educational, biological, economic, marketing,
and medical sciences (Rustandi Kartawinata et al., 2021). The structural model
describes the causal relationships and their related constructs (Kang et al., 2021).
The Structural Equation Model (SEM) analysis estimates a series of regression
equations to examine the relationship between constructs (Hair et al., 2019).
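To illustrate the kind of regression equation estimated in this setup, the structural part of the model can be sketched as a single equation in which digital competence (DC) is formed by the six DigCompEdu dimensions; the weights below are symbolic placeholders rather than estimates from this study:

\mathrm{DC} = w_1\,\mathrm{PE} + w_2\,\mathrm{DR} + w_3\,\mathrm{TL} + w_4\,\mathrm{A} + w_5\,\mathrm{EL} + w_6\,\mathrm{FLDC} + \zeta

where the w_i are the formative weights estimated by PLS-SEM and \zeta is a disturbance term.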
3.5 Data Collection
Prior to conducting the research, the researchers submitted a letter to the school,
seeking authorization to carry out the study. After receiving confirmation and approval
from the school, all terms and conditions, including the survey's duration, were
mutually agreed upon. An agreement was also reached between the researchers and
the institution to ensure the confidentiality of the respondents. Subsequently, data
collection was conducted through a face-to-face survey, wherein participants utilized
a Five-Point Likert scale to rate their proficiency across 22 DigCompEdu competencies,
with scores ranging from 0 (Never) to 4 (Always). The scoring scale was aligned with
the Common European Framework of Reference (CEFR) language competence levels,
developed by the European Commission Joint Research Center. Among the
dimensions examined was teachers' adaptation to digital tools and learning
environments.
The researchers managed to gather data from only 107 participants, representing
a small fraction of the total population of educators in ANHS, which amounts to 400
individuals. This is attributed to the survey being conducted during normal working
hours. Hence, the respondents' participation in the study is based solely upon their
voluntary engagement in the survey questionnaire. Only individuals who have
expressed their willingness to complete the survey questionnaires are included. Given
the challenge of obtaining large samples in certain fields, Partial Least Squares
Structural Equation Modeling (PLS-SEM) was employed, known for its suitability in
handling small sample sizes. PLS-SEM's focus on latent variables, robustness to non-normal data, and predictive accuracy render it suitable for generating reliable results
with limited observations (Memon et al., 2021). Studies by Hair et al. (2019) and Kock
and Hadaya (2018) have demonstrated PLS-SEM's efficacy in small sample analyses,
affirming its reliability and predictive power.
3.6 Data Analysis
Structural equation modeling (SEM) was utilized by the researchers in order to
analyze the data that was collected from the teachers at ANHS and to evaluate the
relationships that existed between the latent variables. Structural equation modeling
(SEM) has been widely utilized as a method for conducting data analysis in the field of
social science (Cillo et al., 2018). In addition to enabling the testing of hypotheses, it
enables the simultaneous analysis and prediction of complex construct relationships
(Hult, 2022). In addition, the SmartPLS version 4 software was selected for
this investigation because of its capability to deliver comprehensive analyses and
results for a wide range of quantitative data types, such as the measurements of the
mean and standard deviation utilized in this study.
3.6.1 Reliability and Validity Construct
In PLS-SEM, it is essential to evaluate the reliability and validity of measurement
scales. This is because the accuracy of the results heavily relies on the quality of the
measurement model. The measurement model is an essential part of the overall
structural model as it establishes the connection between the observed indicators and
the underlying latent variables (Memon et al., 2021). Given that the framework utilized
in the study is new, it is crucial to provide a comprehensive overview of the constructs
within the framework, as well as the reliability and validity of the measurement scale.
Cronbach's Alpha and Composite Reliability (CR) can be used to assess the reliability
of the data. Cronbach's Alpha is a statistical measure used to assess the internal
consistency or reliability of a construct measurement; it reflects how closely the
components of the construct are interconnected as a cohesive unit. Values typically
range from 0 to 1. It is important to note, however, that a negative Cronbach's Alpha
can also be observed, which suggests that there may be significant issues with the
procedure being used. For example, if certain score items have opposite polarity
compared to others, the average of all the correlations between items can be
negative. Therefore, it is important to ensure that the polarity of all items is
consistently aligned (Cillo et al., 2018).
For Cronbach's Alpha, the literature offers guidelines for judging construct reliability
and validity: a value below 0.58 is considered unacceptable, a range of 0.58-0.70 is
minimally acceptable (Memon et al., 2021), a value of 0.70-0.80 is acceptable, and a
value of 0.80-0.90 is regarded as very good. Composite reliability (CR), also known as
McDonald's coefficient, is calculated by summing the true-score variances and
covariances of the indicator variables linked to a construct and dividing this sum by
the total variance of the composite. Cronbach's Alpha, in contrast, is a reliability
indicator that assumes constant factor loadings across all items (Risher et al., 2018).
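For reference, the standard formula for Cronbach's Alpha for a construct measured by k items is

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right),

where \sigma^{2}_{Y_i} is the variance of item i and \sigma^{2}_{X} is the variance of the summed scale score.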
On the other hand, Composite reliability measures how well latent construct
indicators capture the underlying concept and how consistent and reliable they are.
Composite reliability is calculated from indicator factor loadings and measurement
error variances. This metric helps evaluate the measurement model by revealing how
well the indicators represent the latent construct. Composite reliability values above
0.7 indicate stronger internal consistency among indicators and support the
measurement model's reliability (Memon et al., 2021; Risher et al., 2018).
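In the usual notation, with standardized loadings \lambda_i for a construct measured by k indicators, composite reliability is

\mathrm{CR} = \frac{\left(\sum_{i=1}^{k}\lambda_i\right)^{2}}{\left(\sum_{i=1}^{k}\lambda_i\right)^{2} + \sum_{i=1}^{k}\left(1 - \lambda_i^{2}\right)},

and values above 0.70 correspond to the internal-consistency threshold cited above.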
Data validity was assessed using convergent and discriminant validity. A
convergent validity indicator, the average variance extracted (AVE), compares the
variance captured by a construct to measurement error; in most cases, an AVE of at
least 0.5 is needed so that the variance explained exceeds the variance due to error.
Discriminant validity evaluates whether model constructs are too highly associated:
the square root of a construct's AVE is compared to its correlations with the other
constructs and should be higher than each of them. If not, the individual construct
lacks discrimination, or unique explanatory power (Dakduk et al., 2019).
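In symbols, for a construct j measured by k standardized loadings \lambda_i, the two conditions described here are

\mathrm{AVE}_j = \frac{1}{k}\sum_{i=1}^{k}\lambda_i^{2} \geq 0.5 \quad \text{and} \quad \sqrt{\mathrm{AVE}_j} > \mathrm{corr}(j, m) \ \text{for every other construct } m.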
Assessing convergent and discriminant validity is important because it ensures that the
measurement model is accurately capturing the underlying constructs and that the
constructs are distinct from each other. If the measurement model does not have good
convergent and discriminant validity, the results of the structural model may be biased
or inaccurate, leading to incorrect conclusions and recommendations. Therefore, it is
essential to assess convergent and discriminant validity in research to ensure the
validity and reliability of the results (Memon et al., 2021; Risher et al., 2018).
3.7 Progression Model and Aligned Scoring Rule for Assessing Digital
Competence of Educators
In order to ascertain the DigCompEdu competence level, the European
Commission Joint Research Center devised a scoring rule in accordance with the
language competence levels outlined in the Common European Framework of
Reference (CEFR). The initial assumption is that an individual whose proficiency
revolves around the "Sometimes" response option, denoted by a score of 44, would
be classified as an Integrator (B1); that an individual whose proficiency consists solely
of the "Often" option, as illustrated by a score of 66, would be on the verge of
advancing from Expert (B2) to Leader (C1); and that the gap between the two lowest
response options ("Never" and "Rarely") would be approximately equivalent to the
gap between Newcomers (A1) and Explorers (A2).
The scoring system is as follows: scores below 20 fall into the Newcomer (A1)
category; scores between 20 and 33 into the Explorer (A2) category (with the upper
limit corresponding to half of the items answered "Rarely" and the other half
"Sometimes"); scores between 34 and 49 into the Integrator (B1) category; and scores
between 50 and 65 into the Expert (B2) category, which divides in half the distance
between the upper limit of the Explorer (A2) category and the lower limit of the
Leader (C1) category. Leader (C1) status is assigned to scores ranging from 66 to 80,
and individuals who meet this criterion by selecting the highest option for a minimum
of two-thirds of the 22 competencies are eligible to be certified as Pioneers (C2)
(Benali et al., 2018; Redecker et al., 2017; European Commission Joint Research
Center 2018).
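Read as a lookup from total score to level, the banding above can be sketched in Python as follows. This is only one possible reading of the rule: scores above 80 are treated as Pioneer (C2), and the additional two-thirds condition is applied only when the individual item answers are available; the function name is illustrative.

def digcompedu_level(total_score, responses=None):
    """Map a 0-88 DigCompEdu total score to a progression level.

    Band edges follow the scoring rule described above. Scores above 80 are
    treated as Pioneer (C2); if the 22 item answers are supplied, Pioneer
    status additionally requires the highest option (4) on at least
    two-thirds of the items (15 of 22).
    """
    if total_score < 20:
        return "Newcomer (A1)"
    if total_score <= 33:
        return "Explorer (A2)"
    if total_score <= 49:
        return "Integrator (B1)"
    if total_score <= 65:
        return "Expert (B2)"
    if total_score <= 80:
        return "Leader (C1)"
    if responses is not None and sum(1 for r in responses if r == 4) < 15:
        return "Leader (C1)"
    return "Pioneer (C2)"

print(digcompedu_level(44))                       # Integrator (B1)
print(digcompedu_level(84, [4] * 18 + [3] * 4))   # Pioneer (C2)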
CHAPTER 4 RESULTS AND DISCUSSION
This section displays the results of data collection, data processing, and
interpretation. The researchers found a substantial association between the
independent and dependent variables, shedding light on educators' digital
competency in a specific locale. Moreover, the research model, methods, and
instrument used in this study were anchored to its general objectives.
Table 4-1. Demographic Profile of Respondents

Age                  Frequency    Percentage
20-30                37           34.58%
31-40                30           28.04%
41-50                20           18.69%
51-60 and above      20           18.69%
Total                107          100%

Years of Service     Frequency    Percentage
1-10                 63           58.87%
11-20                24           22.43%
21 and above         20           18.69%
Total                107          100%
4.1 Analysis of the Respondent Demographic Profile
According to the table, the age distribution of respondents is concentrated in
the 20-30 and 31-40 age ranges, which together comprise a majority of about 62%. In
contrast, respondents aged 41 to 60 and beyond constitute a smaller portion at
roughly 37%. The average age is computed at 38.8. This diverse age range among
teachers may yield a valuable blend of skills, as they bring varied perspectives,
especially in incorporating technology into teaching practices. In the same table,
58.87% of respondents have ten or fewer years of teaching experience, while a smaller
proportion (41.12%) has been teaching for 11 years or more. The mean teaching
experience is calculated at 14.13 years. This aligns with Table 4-1, which shows that
most respondents fall in the younger 20-40 age brackets.
4.2 Measurement Model
The researchers used PLS-SEM to analyze data collected through five Likert scale
survey questionnaires, applying the DigCompEdu scoring rule with a range from 0
(lowest) to 4 (highest). The analysis was conducted using SmartPLS version 4
software, which facilitated the partial least squares structural equation modeling (PLS-SEM) analysis. This analysis aimed to evaluate the latent variables in the model and
explore the relationships between independent variables (PE, DR, TL, A, EL, FLDC) and
the dependent variable DC (digital competence). The goal was to provide an overview
and context on how each independent variable relates to DC.
To ensure the reliability of the constructs, the researchers utilized Cronbach’s alpha
and Composite Reliability (CR). Convergent and discriminant validity were examined
using the Average Variance Extracted (AVE) to assess convergent validity, following
the approach outlined by Hair et al. (2019). Discriminant validity was assessed using
the Fornell-Larcker criterion and cross-loadings to evaluate the validity of the latent
variables. Moreover, following the specification of the measurement model for the
higher-order construct DC, a Mode B repeated indicator approach was applied to evaluate Digital
Competence (DC). The use of a Mode B Repeated Indicator is deemed more
appropriate when measuring a reflective-formative hierarchical latent variable model,
as suggested by Chin (2010) and Ringle et al. (2012). Within this model, there are six
lower-order constructs that are reflective at the lower level and formative at the higher
level, contributing to the construction and explanation of Digital Competence (DC)
depicting a reflective-formative hierarchical latent variable model. These lower-order
constructs include Professional Engagement (PE), Digital Resources (DR), Teaching
and Learning (TL), Assessment (A), Empowering Learners (EL), and Facilitating
Learners’ Digital Competence (FLDC).
Table 4-2. Construct Reliability Test

                         PE      DR      TL      A       EL      FLDC    DC
Cronbach's alpha         0.848   0.799   0.744   0.709   0.763   0.854   0.946
Composite reliability    0.898   0.882   0.841   0.835   0.864   0.896   0.951
Traditionally, Cronbach’s Alpha has been employed to assess the reliability of
constructs. However, a more precise measure of internal consistency reliability is
provided by composite reliability (CR) compared to Cronbach’s Alpha, as asserted by
Hair et al. (2014). Cronbach's alpha is considered less precise since it involves
unweighted items. In contrast, composite reliability assigns weights to items based on
the individual loadings of the construct indicators, resulting in higher reliability than
Cronbach's alpha. While Cronbach's alpha may be overly conservative and composite
reliability may be overly liberal, the true reliability of a construct is often considered to
lie between these two extremes (Hair et al., 2019).
In terms of composite reliability (CR), a value greater than 0.70 is considered
indicative of a reliable construct (Hair et al., 2019). On the other hand, a Cronbach’s
alpha exceeding 0.60 suggests that the construct is reliable (Rahmawaty et al., 2021).
These results underscore the reliability of the survey questionnaire derived from the
DigCompEdu Framework and affirm its capacity to accurately capture the essence of
Digital Competence. The findings indicate that it serves as a valuable tool for
evaluating teachers' proficiency in integrating digital technologies into their
instructional practices, an area in which technology plays a pivotal role in enhancing
student learning experiences.
Table 4-3. Convergent Validity of Lower Order Constructs

Latent Variable   Indicator Item   Outer Loadings   AVE
PE                PE1              0.829            0.688
                  PE2              0.867
                  PE3              0.794
                  PE4              0.826
DR                DR1              0.874            0.713
                  DR2              0.811
                  DR3              0.847
TL                TL1              0.815            0.574
                  TL2              0.577*
                  TL3              0.807
                  TL4              0.803
A                 A1               0.775            0.629
                  A2               0.762
                  A3               0.839
EL                EL1              0.788            0.680
                  EL2              0.828
                  EL3              0.856
FLDC              FLDC1            0.821            0.633
                  FLDC2            0.685*
                  FLDC3            0.817
                  FLDC4            0.814
                  FLDC5            0.833
Note: Values marked * (TL2 and FLDC2) do not meet the 0.70 loading threshold.
The second step in evaluating the reflective measurement model is focused on
assessing convergent validity, which gauges how effectively a construct explains the
variance within its items. Convergent validity is measured using the average variance
extracted (AVE), obtained by squaring the loading of each indicator on a construct and
calculating the mean. An AVE of 0.50 or higher is considered acceptable, indicating that
the construct accounts for at least 50% of the variance in its items (Hair et al., 2019).
Ensuring convergent validity requires that the outer loadings for each item have
substantial values, preferably exceeding 0.70 (Alchalidy et al., 2020), and the AVE
should surpass 0.50 (Hair et al., 2019). Despite potential statistical grounds for
excluding the two low-loading items, the researchers decided to retain all indicators.
This decision is crucial as eliminating questions based solely on statistical reasons could
compromise the content validity of the measures. Content validity, emphasized by
Henseler et al. (2015), ensures that survey questions effectively represent the full
scope of what researchers aim to measure. By preserving indicators, the researchers
intend to maintain a more comprehensive and accurate representation of the
construct under study.
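As a worked check against Table 4-3, the AVE for Professional Engagement can be reproduced from its four outer loadings:

\mathrm{AVE}_{PE} = \frac{0.829^{2} + 0.867^{2} + 0.794^{2} + 0.826^{2}}{4} \approx \frac{2.752}{4} \approx 0.688,

which matches the reported value and clears the 0.50 threshold.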
Apart from the two indicators noted above, the indicators of the reflective
measurement models exhibit higher loadings on their intended latent constructs than
on any other constructs in the model. These outcomes satisfy the cross-loading
evaluation criteria, providing
satisfactory evidence for discriminant validity. As established in prior tests for
construct reliability and validity, the model meets the minimum criteria, indicating the
appropriateness of the research instrument in this study.
Table 4-4. Fornell-Larcker Criterion

         PE      DR      TL      A       EL      FLDC
PE       0.829
DR       0.739   0.844
TL       0.701   0.675   0.757
A        0.577   0.554   0.631   0.793
EL       0.576   0.579   0.741   0.685   0.824
FLDC     0.641   0.699   0.758   0.700   0.774   0.796
Discriminant validity was evaluated using the Fornell-Larcker criterion, which
compares the square root of the average variance extracted (AVE) to the correlations
of latent constructs. The principle is that a latent construct should better explain the
variation of its own indicators than the variance of indicators from other latent
constructs. Consequently, the square root of each construct's AVE should be greater
than the correlations with other latent constructs (Ab Hamid et al., 2017; Risher et al.,
2019). This diagonal value, obtained by taking the square root of the average variance
extracted, also reflects how effectively a construct explains its own reflective
indicators (Salloum and Shaalan, 2019).
According to the results, the constructs (PE, DR, TL, A, EL, FLDC) exhibit good
discriminant validity, with the square root of each construct's AVE generally higher
than the off-diagonal values representing correlations between the constructs.
However, there is a minor concern with the TL and FLDC constructs, where the
correlation between FLDC and TL (0.758) is slightly higher than the square root of TL's
AVE (0.757) by 0.001. In the context of examining Digital Competence (DC), and
considering its broad scope, some overlap among indicators is expected (Tsankov et
al., 2017; Morellato, M., 2014). Despite this, the researchers chose not to modify the
indicators, ensuring that the survey questions effectively represent the full scope of
what the researchers intend to measure (Henseler et al., 2015).
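Using the values reported in Table 4-4, the single marginal case can be written out explicitly:

\sqrt{\mathrm{AVE}_{TL}} = 0.757 < 0.758 = \mathrm{corr}(\mathrm{TL}, \mathrm{FLDC}),

a shortfall of only 0.001, while every other correlation remains below the corresponding diagonal value.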
Cross Loadings Result
Cross loading is a statistical technique utilized to evaluate the alignment of
indicators with constructs in a reflective measurement model. This method examines
the correlations between indicators and constructs, ensuring that each indicator is
appropriately associated. Al-Emran et al. (2019) and Hair et al. (2019) stress the
importance of an indicator having a higher loading on its linked construct than its
correlation with other constructs to ensure valid measurement. In addition, the
correlation between any two constructs should be smaller than the square root of the
AVE of each construct. Validating constructs through cross-loading analysis is crucial.
In the reflective measurement models of this study,
indicators exhibit the highest loading on their respective constructs compared to other
constructs, satisfying cross-loading criteria and providing substantial evidence for
discriminant validity.
The model meets the minimum criteria for testing construct reliability and validity,
with minimal to no issues, confirming the suitability of the research instrument. It is
also important to emphasize that one should not prioritize strict research methods
over a strong conceptual foundation. Effective research necessitates a combination of
rigorous methods and solid conceptual frameworks (Farrell et al., 2009).
4.3 Structural Equation Modeling
Structural Equation Modeling (SEM) is a statistical tool for investigating the
relationships between variables and the constructs to which they are related (Kang &
Ahn, 2021). It is used to investigate the connection between variables and to assess the
strength of these correlations. This is performed by estimating a set of regression
equations and then assessing the collinearity of the variables, which, if present, could
lead to biased conclusions (Hair et al., 2019). Researchers can discover causal linkages
and the extent of their influence by evaluating collinearity and other elements of the
regression equation.
Table 4-5. Collinearity Statistics (VIF)

        DC
PE      2.709
DR      2.757
TL      3.308
A       2.269
EL      3.115
FLDC    3.770
The Variance Inflation Factor (VIF) quantifies how much collinearity inflates the
variance of an estimated regression coefficient. According to Hair et al. (2017), a VIF
value of 5 or more indicates the presence of collinearity that should be addressed. As
depicted in Table 4-5, each value for the constructs falls between 1.0 and 5.0; as a
result, the collinearity level of each construct satisfies the requirement.
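For reference, the VIF of a predictor j is computed from the R-squared obtained when that predictor is regressed on all the other predictors:

\mathrm{VIF}_j = \frac{1}{1 - R_j^{2}},

so the largest reported value, 3.770 for FLDC, corresponds to R_j^{2} \approx 0.73, comfortably below the R_j^{2} = 0.8 implied by the cutoff of 5.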
R-squared (R²) is used to calculate the amount of variance in the dependent
variable explained by the independent variables in a regression model. It is a
statistical metric that measures how well the regression model fits the data and takes
values between 0 and 1, with higher values suggesting a better fit. A value of 0
indicates that the model explains no variance in the dependent variable, whereas a
value of 1 shows that the model explains all of the variance in the dependent variable
(Hair et al., 2019). However, because the research used the repeated indicator
approach, regardless of Mode A or Mode B measurement, and the higher-order
construct is formative (i.e., reflective-formative or formative-formative), the
lower-order constructs already account for all of the variance in the higher-order
construct (i.e., R² = 1.0). As a result, alternative antecedent constructs cannot explain
any variation in the higher-order construct, and their paths to it are zero (insignificant)
(Ringle et al., 2012; Wetzels et al., 2009). Because this indicates a limitation of the
model (Becker et al., 2012), including a global variable or a single-item variable
capturing the essence of the construct is generally sufficient as an alternative measure
(Sarstedt et al., 2016a), and recognizing the importance of including such a single-item
variable ensures a robust evaluation of convergent validity for formatively measured
constructs (Hair et al., 2019). Nonetheless, there are no other antecedent constructs
besides the lower-order constructs; thus, the study remains relevant in explaining the
connection between the lower-order constructs and the higher-order construct (DC).
Figure 4-1. Structural Model Results
In Figure 4-1, all hypothesized paths are significant, as the lower-order construct
indicators collectively explain digital competence (DC). The research employs the
repeated indicator approach with Mode B on the higher-order construct, and the inner
path weighting scheme can yield an R-squared value of 1 for the second-order
construct. This occurs because the repeated indicator approach with Mode B fixes the
variance of the higher-order construct to 1, making the R-squared value for the
second-order construct consistently 1 (Becker, J., 2012; Ringle et al., 2012; Wetzels et
al., 2009).
Table 4-6. Path Coefficient Result

Path          T statistics   P values   Supported
PE -> DC      14.169         0.000      Yes
DR -> DC      16.042         0.000      Yes
TL -> DC      15.507         0.000      Yes
A -> DC       10.702         0.000      Yes
EL -> DC      15.953         0.000      Yes
FLDC -> DC    16.293         0.000      Yes
The reliability of reflective variables of the model is assessed using Cronbach's
Alpha, whereas for the formative construct (DC), reliability is gauged through path
coefficients. In partial least squares structural equation modeling (PLS-SEM), path
coefficients play a crucial role in offering quantitative insights into the strength and
direction of relationships among latent variables. These coefficients serve as valuable
tools for evaluating theoretical models and comprehending the interconnections
within the constructs (Becker, J., 2012; Hair et al., 2019).
As depicted in the table above, all of the variables that form digital competence
exhibit p-values of 0.000, indicating extremely strong evidence against the null
hypothesis, supporting the alternative hypothesis, and suggesting that the observed
relationships are highly statistically significant. T-values, in turn, are used to assess the
significance of the path coefficients and the relationships between latent variables in
the structural model. As the table shows, the t-values are very high, indicating a high
level of statistical significance and suggesting that the relationships between the
lower-order constructs of the model and the higher-order construct (DC) are unlikely
to be due to random chance (Hair et al., 2019). As a result, the researchers' aim of
determining the extent to which each independent variable forms the dependent
variable (DC) is strongly supported by the statistical results.

In other words, the p-values of 0.000 for all constructs comprising digital competence
provide robust evidence against the null hypothesis and in favor of the alternative
hypothesis. This suggests a positive relationship between the constructs Professional
Engagement (PE), Digital Resources (DR), Teaching and Learning (TL), Assessment (A),
Empowering Learners (EL), and Facilitating Learners' Digital Competence (FLDC) and
digital competence. These results support the researchers' hypotheses regarding the
association between these constructs and digital competence.
Table 4-7. Hypothesis Testing Results

H1: Professional Engagement is positively related to the digital competency level of an educator. Supported: Yes
H2: Educators' use of Digital Resources in educating learners is positively related to the digital competence of an educator. Supported: Yes
H3: Educators who integrate teaching and learning through digital technologies are positively related to the digital competence of an educator. Supported: Yes
H4: Educators who use digital technology in assessing students' performance are positively related to the digital competence of an educator. Supported: Yes
H5: Empowering learners towards digital technology is positively related to the digital competence of an educator. Supported: Yes
H6: Educators who facilitate learners' digital competence are positively related to the digital competence of an educator. Supported: Yes
The relationship between the DigCompEdu Framework's constructs and digital
competence was examined to determine whether the constructs are suitable for the
framework. Upon carrying out the SEM analysis, it was found that each independent
variable, namely professional engagement (PE), digital resources (DR), teaching and
learning (TL), assessment (A), empowering learners (EL), and facilitating learners'
digital competence (FLDC), is positively related to an educator's digital competence
level. As a result, all of the researchers' hypotheses are accepted and supported, and
the independent variables are shown to be significant in understanding and forming
the dependent variable, Digital Competence (DC), in an educator.
Furthermore, the results indicated that these variables (PE, DR, TL, A, EL, and FLDC)
should be represented as reflective lower-order constructs that together form DC
(Digital Competence), which serves as the formative higher-order construct of the
framework. This means that PE, DR, TL, A, EL, and FLDC actually help define or form
the concept of DC (Digital Competence).
4.4 Participants' Digital Competence
Figure 4-2. Participants' Level of Digital Competence (Derived from Benali et al., 2018)
Figure 4-2 presents data from 107 respondents, indicating that educators from
Agusan National High School demonstrate a high level of digital competence, with
more than half falling into the categories of Experts (B2) and Leaders (C1). This
suggests that most educators at Agusan National High School use digital technologies
consistently and thoroughly to improve both pedagogic and professional practices.
They display a wide range of digital methods and the ability to select the best solution
for varied situations. Educators also engage in continual reflection on and growth of
their practices; they stay current on innovations and ideas by exchanging thoughts
with peers, contributing to the enhancement of teaching and learning through digital
technology (Redecker 2018; Benali et al., 2018). However, other participants are
distributed among different categories, with 12.15% falling into the Integrator (B1)
category. Educators who belong to this category utilize digital technologies diversely
and creatively to enhance their careers and expand their practices. Nonetheless, they
are still in the process of determining the most effective tools for specific contexts
and aligning digital technology with pedagogical ideas and approaches (Redecker et
al., 2017). Meanwhile, 2.80% fall into the Pioneer (C2) category, representing educators
who challenge educational and digital practices, aspiring to reinvent education
through experimentation with advanced digital tools and innovative teaching
methods. They are rare leaders serving as role models for younger educators, driving
innovation (Redecker et al., 2017). Notably, there are no educators categorized as
Newcomers (A1), the lowest level of digital competence.
4.5 Average Score by Competence
Figure 4-3. Average scores by competence (Derived from Benali et al., 2018;
Dias- Trindade et al., 2020)
Figure 4-3 illustrates the varying levels of difficulty for the 22 DigCompEdu
competencies, scored on a scale from 0 (lowest) to 4 (highest). The mean scores
across multiple variables suggest that participants are effective in most areas on
average. However, the range of results, represented by the standard deviations,
indicates different levels of performance, leaving room for improvement among
specific individuals. Higher standard deviations point to significant performance
disparities, which may be influenced by factors such as experience, training, resource
availability, and demographic factors like age and years of experience (Benali et al.,
2018).
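A brief sketch of how such per-competency means and standard deviations could be computed from the raw 0-4 item scores is shown below. The file and column names are illustrative placeholders, not the study's actual dataset.

import pandas as pd

# Illustrative only: rows are respondents, columns are the 22 DigCompEdu items scored 0-4.
df = pd.read_csv("digcompedu_responses.csv")

summary = df.agg(["mean", "std"]).T      # mean and standard deviation for each competency
summary = summary.sort_values("mean")    # lowest-scoring competencies appear first
print(summary.round(3))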
As observed, three competencies of DigCompEdu stand out: Guidance (3.327)
refers to utilizing digital tools and services to enhance their relationships with
students both individually and collectively, during and after class. Responsible use
(3.168) refers to prioritizing the safety and well-being of learners when using digital
tools, empowering them to manage risks and use technology responsibly. Lastly,
digital communication and collaboration (3.112) refers to designing activities and
assignments that teach students how to use digital tools for communication,
teamwork, and participating responsibly in their communities.
In contrast, the competencies with the lowest average scores in DigCompEdu
suggest that some educators find them challenging to acquire. Differentiation and
Personalization (2.449) refers to enabling students to progress at various rates and
levels while achieving their own learning goals using digital tools. The creation and
modification of digital resources (2.495) entail developing new digital educational
materials, adapting open-licensed and other permissible resources, and aligning them
thoughtfully with learning objectives, contexts, pedagogical approaches, and learner
groups. Lastly, Assessment Strategies (2.533) involves using digital tools to
monitor and assess students' learning progress through diverse and appropriate
assessments.
Furthermore, the educators' overall mean score across all items on the instrument
was 2.92 on the 0-4 scale, indicating a high value. This means that educators in ANHS
answered mostly "often" on the instrument, suggesting that they frequently engage in
the behaviors it assesses and that they are aware of the impact of digital technologies
in education and are integrating them into their pedagogic competencies. However, it
is noteworthy that almost half of the respondents are younger teachers, with ages
ranging from 20 to 40, which supports the assertion that younger teachers, often
categorized as "digital natives," are more adept at utilizing digital resources (Abella et
al., 2023; Saripudin et al., 2021; Cabero et al., 2021). This finding also highlights the
areas in DigCompEdu that require improvement and additional training to enhance
educators' capacity to integrate technology into their pedagogic competencies.
4.6 Participants' Digital Competence Based on Years of Teaching
Figure 4-4. Participants' Digital Competence Based on Years of Experience (Derived
from Benali et al., 2018)
Digital competence among educators strongly correlates with their teaching
experience. Younger educators generally exhibit higher competence compared to
their more experienced counterparts. Significantly, no individuals with 1-10 years of
experience fall into the Newcomer (A1) or Explorer (A2) levels; instead, this group
begins at the Integrator (B1) level. Among educators with less than ten years of
experience, 4.76% are classified as Pioneers (C2), signifying the highest level of digital
competence. On the other hand, teachers with 11-20 years of experience display a
broader distribution, with the majority (54.16%) at the Leader (C1) level, and a small
percentage (4.17%) at the Explorer (A2) level. Among those with 21 or more years of
experience, 45% are Experts (B2), and only 35% are Leaders (C1), notably lower than the
percentages for those with less than 20 years of experience (50.79% and 54.16%).
The study reveals that teachers at Agusan National High School possess a notably
high level of competence, which can be attributed to the demographic composition
of the respondents. Specifically, more than half of the respondents, 63 out of 107, fall
within the age range of 20-40, indicating a prevalence of younger teachers. This
demographic skew likely contributes to the observed high level of digital
competence among educators. The higher competence among these younger
educators can be attributed to their increased exposure to and positive disposition
toward digital technologies, as indicated by research from Benali et al. (2018), Cabero
et al. (2021), Zakharov et al. (2021), Romero et al. (2020), and Fernandez-Diaz et al.
(2016). The prevalence of younger educators with higher digital competence levels
implies a generational advantage, likely stemming from their familiarity with and
inherent affinity for technology.
4.7 DC based on Age
Figure 4-5. Digital Competence Based on Age (Derived from Benali et al., 2018)
The figure above clearly illustrates that younger teachers, aged 20-30, exhibit
significantly higher digital competence, with 62% falling into the Leader (C1) category.
Among educators aged 31-40, competence is more evenly spread, with the Expert (B2)
and Leader (C1) categories each comprising 36.7%, and a small share falling into the
Pioneer (C2) category. In the 41-50 age group, the number of educators in the Leader
(C1) category is relatively higher, accounting for 55%, compared to 30% in the Expert
(B2) category. However, for teachers aged 51 and above, digital competence levels
are notably lower, with only 35% in the Leader (C1) category. It is also noteworthy that
10% of educators in this age group still fall into the Explorer (A2) category, contrasting
with the previous age groups, where no educators fell into this category.
Therefore, younger teachers, often categorized as "digital natives," exhibit a
higher likelihood of accessing and possessing a more significant capacity for learning
and utilizing digital resources. This age group is more adept at developing their
knowledge, abilities, and attitudes toward digital resources due to their recent
educational experiences. Their enhanced digital literacy can be attributed to the
evolution of technology in their generation, making them more accustomed to its
usage compared to earlier generations (Abella et al., 2023; Saripudin et al., 2021;
Cabero et al., 2021).
4.8 Discussions
To comprehensively explore the correlation between underlying variables in the
DigCompEdu framework, the researchers used PLS-SEM with SmartPLS 4 software. The
researchers examined both the measurement and structural models of the
DigCompEdu framework. Validating these models ensures that the measurement
instruments accurately capture theoretical constructs and that proposed relationships
between constructs are supported by the data.
This testing method improves credibility, reliability, and integrity of the research
findings, hence strengthening the overall validity of the study (Hair et al., 2019). Based
on the results of the measurement model, the lower-order constructs of digital
competence (DC), namely PE, DR, TL, A, EL, and FLDC, are instrumental in determining the
essence of digital competence. Despite the broad nature of digital competence
causing some indicators to overlap (Tsankov et al., 2017; Morellato, M., 2014), the
researchers were still able to gain valuable insights to understand it.
According to the SEM results, all constructs leading to the explanation of digital
competence demonstrate a highly significant association between the variables under
investigation. T values, ranging from 10 to 16, are similarly relatively high, indicating a
high level of statistical significance. This implies that the association between the
model's lower construct and the higher construct (DC) is unlikely to be due to chance
(Hair et al., 2019). As a result, all of the researchers' hypotheses are accepted.
Consequently, the researchers' goal of determining the extent to which each
independent variable forms the dependent variable (DC) is strongly supported and
proves to be statistically significant.
The objective of the study is to examine the fundamental digital competence skills
possessed by ANHS teachers and investigate the correlation between educators' age
range, years of service towards their levels of digital competence. Due to the study
being conducted during the educators' working hours, the researchers were only able
to collect responses from 107 out of 400 educators in ANHS, representing
approximately 26.75% of the total respondents. Hence, the assessment is limited to
only those 107 respondents. Findings reveal that Agusan National High School
educators demonstrate a significant level of digital competence, with a substantial
number classified as "Experts" and "Leaders." This demonstrates their successful
incorporation of technology into both teaching and professional initiatives,
highlighting their versatile proficiency in different digital environments.
In addition, the outcomes of the study are influenced by the age of an educator
and their years of experience. Younger educators (20-40 years old) demonstrated
higher levels of digital competence, notably in the Leader (C1) group. The gradual
decline in digital competence levels among educators aged 41 and above, with a
notable presence in the Explorer (A2) category, supports the idea that younger
teachers, often referred to as "digital natives," have the advantage in adapting to and
utilizing digital resources due to their increasing exposure to digital technology
compared to their older counterparts. This finding is supported by the studies of Abella
et al. (2023), Saripudin et al. (2021), Cabero et al. (2021), and Benali et al. (2018), where
younger teachers, ranging from less than 25 to 40 years of age, often exhibit greater
enthusiasm and better preparedness to learn, implying that an educator's age has an
impact on their eagerness and capacity to learn new technologies.
Furthermore, the study found a strong link between educators' digital proficiency
and their years of teaching experience. The absence of the Newcomer (A1) and
Explorer (A2) categories within the 1 to 10 years’ experience group signifies a distinct
pattern. The existence of educators in the Pioneer (C2) group within this experience
range, beginning at the Integrator (B1) level, demonstrates an early commitment to
digital integration. This result aligns with findings from Benali et al. (2018), Cabero et
al. (2021), Zakharov et al. (2021), Romero et al. (2020), and Fernandez-Diaz et al.
(2016), suggesting
that teachers with less than 10 years of experience tend to have lower competence
levels, while those with 10 to 15 years of experience exhibit higher levels. With more
than 15 years of experience, the scores were spread across different levels, with more
individuals in the Expert and Integrator groups. This is mainly because some new
generation teachers are more exposed to technology than the old generation (Abella
et al., 2023).
Despite the small sample sizes, the DigCompEdu framework proved highly
effective in achieving study goals and providing valuable insights into educators' digital
competence levels. It offers a structured approach to measuring digital competence,
enabling comprehensive analysis across different knowledge areas, genders, age
groups, and educational contexts. In studies by Guillen-Gamez et al. (2021), Salminen
et al. (2021), and Vieira et al. (2023), despite representing small percentages of the total
population, the framework facilitated thorough examinations of digital competence
levels. These findings underscore the robustness and effectiveness of DigCompEdu in
evaluating digital competence, even with limited sample sizes.
4.9 Implication
4.9.1 Implications for Practice
The pandemic caught the Philippines somewhat off guard, prompting a swift yet
somewhat unprepared response. This event serves as a wake-up call for our schools,
teachers, students, and other stakeholders to be better prepared for unforeseen
scenarios that may impact various aspects of public life, including the education
sector. The pandemic prompted the timely initiation of this study to assess the digital
competency levels among teachers at Agusan National High School. The goal is to shed
light on which competencies teachers should improve. While the school is dedicated
to providing students with a high-quality education, with the ultimate goal of
equipping graduates with essential skills and competencies, it is equally important for
educators to align their competencies with digital technologies. Given that digital
technologies are likely to persist for a long time, utilizing them for educational
purposes can yield positive results rather than harm.
The study's outcomes underscore the need for targeted professional
development initiatives customized to the specific digital competence levels identified
among educators at Agusan National High School (ANHS). Priority should be given to
reinforcing competencies associated with lower-order constructs such as PE, DR, A,
EL, and FLDC. Addressing these foundational skills is crucial for educators to improve
their overall digital competence, thereby facilitating effective technology integration
in both pedagogical and professional contexts.
Acknowledging the age-related differences in digital competence, institutions can
formulate age-responsive training modules. Tailored programs for younger educators
should focus on advanced digital skills and innovative teaching practices. Conversely,
older educators may derive significant benefits from foundational training initiatives
designed to bridge the exposure gap to digital technologies. This ensures a
comprehensive and inclusive approach to professional development. The
establishment of mentorship programs and leadership initiatives within ANHS can
cultivate a collaborative learning environment. Educators in the Leader (C1) and Expert
(B2) categories can serve as mentors for those at lower competence levels, fostering
knowledge exchange and skill development. This approach nurtures a culture of
continuous learning and peer support.
4.9.2 Implications for Future Practice
The DigCompEdu framework utilized in this study proved valuable in assessing
the digital competence of teachers at Agusan National High School, providing insight
into their current level of digital competency. However, to better establish the
framework's reliability and robustness in linking its lower-order constructs (PE, DR, A,
EL, and FLDC) to the higher-order construct of digital competence (DC), future
researchers are advised to plan their data gathering so that it also captures a
single-item variable reflecting the essence of the higher-order construct. Such a global
item offers an alternative measure that strengthens the evaluation of convergent
validity for the formatively measured construct (DC).
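As a purely illustrative sketch of how such a redundancy-style check could be approximated outside SmartPLS, the short script below correlates an equal-weights composite of the five lower-order scores with a hypothetical global single-item rating of digital competence. The file name, column names, and the commonly cited 0.70 guideline are assumptions for illustration only and are not drawn from this study's data.

import pandas as pd

# Hypothetical data: one row per teacher, with mean scores for the lower-order
# constructs (pe, dr, a, el, fldc) and a single global item "dc_global".
df = pd.read_csv("anhs_digcompedu_scores.csv")  # assumed file name

# Equal-weights proxy for the formatively measured higher-order construct (DC).
# A full PLS-SEM redundancy analysis would estimate indicator weights instead.
df["dc_composite"] = df[["pe", "dr", "a", "el", "fldc"]].mean(axis=1)

# Correlation between the composite and the global single item; values of
# roughly 0.70 or higher are commonly read as supporting convergent validity.
r = df["dc_composite"].corr(df["dc_global"])
print(f"Redundancy-style convergent validity check: r = {r:.2f}")

In SmartPLS itself, the analogous step is typically to estimate a separate redundancy model in which the formative DC construct predicts a construct measured only by the global item, and then to examine the resulting path coefficient.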
In addition, future research can examine how teachers' digital skills affect their
students' academic performance. Determining how teachers' different levels of
digital proficiency affect students' interest, performance, and overall learning can
help develop evidence-based teaching methods. It is also important to consider the
sample size, as a larger sample would provide a more reliable picture of teachers'
current digital proficiency. Exploring contextual factors that influence digital
competence, such as institutional support, available resources, cultural
considerations, and attitudes and perceptions toward technology, can provide a more
comprehensive understanding of the dynamics at play. This knowledge can guide the
development of strategies that align with the unique context of ANHS and similar
educational institutions.
Furthermore, the study revealed a connection between age and experience, with
younger educators generally demonstrating higher digital competence than their
older counterparts. Notably, educators aged 31 and above fall into the Explorer (A2)
category, suggesting they are enthusiastic about digital tools and are experimenting
in some areas but lack a systematic approach. The research also indicates the need to
implement teacher training programs in the three primary areas that need
improvement, as indicated by the DigCompEdu competencies on which teachers have
the lowest average scores. This suggests that some educators find these
competencies difficult to acquire. These competencies are:
•	Differentiation and Personalization. Teachers should acquire the necessary skills
and knowledge to use learning management systems (LMS) such as Moodle,
Google Classroom, or Edmodo to create personalized learning paths that let
students progress at their own pace while addressing their particular learning
objectives (a simplified illustrative sketch of such a path assignment follows this
list). Educators who possess this ability can modify their lesson plans, curriculum,
and evaluation procedures to meet the needs of students with varying learning
styles, speeds, and preferences (Redecker, 2017).
•	Creating and Modifying Digital Resources. Teachers should gain the skills and
knowledge needed to create, edit, and modify digital content for educational
purposes, ensuring that it is appropriate, effective, and aligned with the
particular requirements of the target audience and learning environment.
Educators proficient in this competency can leverage various digital tools
and platforms to create engaging and relevant instructional materials that
enhance the learning experience for their students (Redecker, 2017).
•	Assessment Strategies. Teachers should acquire the essential competencies and
knowledge to use digital technologies to design, implement, and evaluate
assessments that suit learning objectives, contexts, pedagogical approaches,
and diverse learners. Competent educators can select and use digital tools to
improve assessment, give students timely and constructive feedback, and
adapt assessment methods to the digital age. Using learning management
systems (LMS) such as Google Classroom, teachers can administer online tests,
surveys, and assignments to evaluate student progress and learning. This
competency emphasizes integrating digital tools to improve educational
assessment strategies as technology evolves (Redecker, 2017).
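To make the idea of personalized learning paths more concrete, the sketch below shows, in a simplified and purely hypothetical form, how diagnostic scores might be mapped to differentiated learning paths before they are set up in an LMS such as Moodle or Google Classroom. The thresholds, path names, and student data are invented for illustration and do not represent any actual ANHS practice or tool.

# Illustrative only: assign students to learning paths from diagnostic scores.
# Thresholds and path names are hypothetical; the actual paths would then be
# configured in the LMS (e.g., Moodle lessons or Google Classroom topics).
students = {"Ana": 42, "Ben": 78, "Carla": 91}  # diagnostic scores out of 100

def assign_path(score: int) -> str:
    """Map a diagnostic score to a differentiated learning path."""
    if score < 50:
        return "Foundations path: guided practice and extra scaffolding"
    elif score < 85:
        return "Core path: standard pacing with optional enrichment"
    return "Advanced path: independent projects and extension tasks"

for name, score in students.items():
    print(f"{name} ({score}): {assign_path(score)}")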
The domain of digital competency is vast and cannot be fully understood in a single
setting. It is therefore important to investigate other areas and environments,
particularly in urban areas, to further explore and enhance educators' current digital
competence. Doing so will enable educators to take measures that can greatly
improve their use of digital technologies, since the 22 DigCompEdu competencies can
serve as a reference for identifying which competencies teachers lack. This is
particularly important because some urban areas still lack the materials and resources
needed to effectively incorporate these technologies. Ultimately, this will help narrow
the digital divide between teachers in urban and rural areas. Exploring the patterns
and methods underlying these differences remains a promising direction for further
work in this research area.
CHAPTER 5 SUMMARY, CONCLUSION, AND RECOMMENDATIONS
5.2 Summary
To assess the digital competence of educators at Agusan National High School, this
study employed stratified sampling as the sampling approach, used PLS-SEM to
evaluate the relationships among the variables, and applied the DigCompEdu
competencies and progression model to determine each educator's digital competence
level. Demographic analysis revealed diverse ages and teaching experiences among
respondents. Cronbach's alpha and composite reliability (CR) were used to test
construct reliability, while convergent and discriminant validity were assessed to
ensure indicator validity. Discriminant validity was evaluated using the Fornell-Larcker
criterion and cross-loadings, while convergent validity was quantified using the
Average Variance Extracted (AVE).
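For reference, the general definitions and conventional thresholds behind these measurement-model checks (stated here in standard form, not as outputs of this study) are, for a construct with $k$ indicators, item variances $\sigma^2_{Y_i}$, total-score variance $\sigma^2_{X}$, standardized loadings $\lambda_i$, and error terms $\varepsilon_i$:

\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^2_{Y_i}}{\sigma^2_{X}}\right), \qquad
\mathrm{CR} = \frac{\left(\sum_{i=1}^{k}\lambda_i\right)^2}{\left(\sum_{i=1}^{k}\lambda_i\right)^2 + \sum_{i=1}^{k}\operatorname{Var}(\varepsilon_i)}, \qquad
\mathrm{AVE} = \frac{1}{k}\sum_{i=1}^{k}\lambda_i^2
\]

Values of $\alpha$ and CR of at least 0.70 and AVE of at least 0.50 are the commonly applied cut-offs, and the Fornell-Larcker criterion requires $\sqrt{\mathrm{AVE}_j} > |r_{jk}|$ for each construct $j$ against every other construct $k$.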
Structural equation modeling showed positive relationships between professional
engagement, use of digital resources, integration of digital technologies in teaching
and learning, digital assessment practices, empowering learners, and facilitating
learners' digital competence, shedding light on educators' digital competence. The
study's findings demonstrate the importance of these factors in shaping effective
digital teaching practices and provide a comprehensive understanding of educators'
digital competence. Moreover, the study found that over half of the educators are at
the Expert (B2) and Leader (C1) levels of digital competence, and none are Newcomers
(A1), demonstrating widespread use of digital technologies for pedagogical and
professional purposes. Digital competence levels are higher among younger teachers
(aged 20 to 40), with 62% classified as Leaders (C1). Educators over 51 show lower
digital competence, highlighting generational differences in digital literacy. The
association between digital competence and teaching experience shows that less
experienced teachers have higher competence levels, suggesting that younger
educators' exposure to and attitudes toward digital technologies are positive.
5.3 Conclusion
In today's world, a high level of digital competence is essential for teachers to
ensure that their students are prepared for the future. To guarantee the reliability and
validity of the research outcomes, the DigCompEdu framework was carefully validated
using PLS-SEM in the SmartPLS 4 software. The lower-order digital competence
constructs played a crucial role in defining the essence of digital competence, and
substantial connections were found among all of the constructs. ANHS teachers
demonstrated a notable degree of digital competence, as the findings highlighted
their expertise in integrating technology into teaching and professional activities
across a variety of digital contexts. Furthermore, relationships among age, experience,
and digital competence were found, which highlights the need for additional research
into the contextual elements that influence educators' digital proficiency.
In addition, it is important to recognize the limitations of the study, which
include the small sample size and its focus on ANHS alone. Future research should
widen its reach to cover other educational environments and examine more deeply
the influence that teachers' digital abilities have on their students' outcomes.
Furthermore, to conduct a full examination of convergent validity within the
DigCompEdu framework, it is essential to incorporate single-item variables that
capture the core of higher-order constructs. Despite these limitations, this research
offers useful insights into the current status of digital competence among educators
at ANHS and paves the way for further investigation into this vital field of study.
5.4 Recommendations
The study is constrained by its small number of participants and its exclusive
focus on ANHS. Future research should broaden its focus to include rural regions and
employ a significantly larger sample size in order to gain a more comprehensive
understanding of the digital competence of a given area and to avoid potential bias.
Examining a variety of educational settings and exploring the precise impact of
educators' digital skills on student achievement are also essential areas for
investigation.
Furthermore, the study found that age and experience were related, with younger
educators often more digitally competent than their older counterparts. Notably,
educators aged 31 and above fall into the Explorer (A2) category, suggesting they are
enthusiastic about digital tools and are experimenting in some areas but lack a
systematic approach. Additionally, the study reveals that several critical
competences, namely differentiation and personalization, creating and modifying
digital resources, and assessment strategies, show lower-than-average proficiency.
These are areas that teachers often hesitate to prioritize when embarking on their
own digital teaching journeys, which indicates that these competencies need to be
improved (Benali et al., 2018).
•	To close this gap, the study highlights the need for training initiatives to
improve teachers' competence with digital technologies. The findings indicate
that participation in digital competence training significantly improves
teachers' skills in leveraging technology in their pedagogical practice.
Research also indicates that fostering collaboration among educators across
generations can support learning, the growth of skills, and the improvement of
competency. Teachers from the millennial generation can help older teachers
adjust to the demands of digital technology in the classroom by offering their
own viewpoints on how to use it.
By targeting the specific areas in which teachers face the greatest challenges,
these programs can enhance educators' overall digital proficiency and ultimately
improve students' learning experience (Beneyto-Seoane et al., 2018). As evidenced by
Abella et al.'s (2023) study, teachers with ICT pre-service training demonstrate higher
levels of digital competence.
Achieving digital competence among educators in the Philippines is essential and
therefore demands collaboration between the government and the education sector.
Nevertheless, considerable barriers are posed by logistical challenges, including
limited access to digital technology, which is especially problematic for those who are
financially constrained (Dotong et al., 2016). The situation is worsened by high costs
and inadequate internet connectivity, which negatively affect students and educators
with limited financial resources (Asio et al., 2021). Moreover, difficulties with internet
access prevent teachers from acquiring the ICT competencies necessary for efficient
learning and instruction (Alvarez, 2020). In rural areas with inconsistent access to the
internet and electricity, online learning presents particular challenges (Tanucan,
Hernani, & Diano, 2021). The slow integration of digital technology in education can
be attributed to structural issues, financial constraints, and management challenges
(Dotong et al., 2016). Recognizing these challenges, comprehensive strategies,
including infrastructure improvements, financial assistance, and targeted training
programs, are needed to improve Philippine teachers' digital competence.