Exploration of Digital Equity in Texas’ Educational Service Center Region 13
Renata Geurtz
University of Texas, Austin
EDA 383: Advanced Quantitative Research and Analysis
Fall 2012
Dr. J. Wayman
DIGITAL EQUITY
1
Exploration of Digital Equity in Texas’ Educational Service Center Region 13
Abstract
As the world becomes more digitally centered, schools must prepare students to live and
work in a technology-rich society. Studies in digital equity have shown that traditionally
marginalized students are also digitally marginalized. This brief analysis investigates
digital equity at the campus level by examining the relationship between school
characteristics and technology integration practices in K-12 schools. Educators in the
State of Texas self-assess their technology integration practices on a four-point scale
across four key technology areas, and the data is reported on the School Technology and
Readiness (STaR) chart. Campus-level STaR chart average scores were compared to
accountability ratings, school type, and demographic characteristics of elementary,
middle, and high schools in Education Service Center Region 13 (n= 484). ANOVA
analyses indicate that Exemplary campuses have statistically higher levels of technology
integration than Recognized or Acceptable campuses. No differences in technology
integration were found between campus types. In a linear regression model, with
technology integration as the dependent variable and schools' demographics as
independent variables, a higher percentage of economically disadvantaged students was a
predictor when controlling for the percentages of total minority students, limited English
proficiency students, and at-risk students.
Across the globe, the nation, and Texas, technology is becoming more ubiquitous
for an ever greater diversity of citizens. Students of today will live and work in a world
where technology is a tool to accomplish almost any activity. Those who are digitally
proficient will have acquired economically valuable skills that will ensure a competitive advantage
in a more global society. In response to these 21st century economic, political, and social
realities, Texas Education Agency has addressed the importance of teaching digital
literacy through the Long Range Plan for Technology (LRPT) which was first written in
1996. The Texas Education Code, Section 32.001, requires the State Board of Education
to develop a long-range plan for technology, and it requires regular reporting to the
governor and Legislature on the progress of the plan. The overarching goal of the LRPT
is to position Texas public schools to transform teaching and learning through the
integration of technology. The plan describes a vision to prepare "each student for
success and productivity as a lifetime learner, a world-class communicator, a competitive
and creative knowledge worker, and an engaged and contributing member of our
emerging digital society" (Texas Education Agency, 2010).
Starting with the Civil Rights movement in the 1960s, educational leaders have
focused on educational equity in an American public education system that has
historically been divided by racial, ethnic, and class differences. At the advent of the digital
revolution, schools adopted computers so that students could benefit from the power of
computers and connections to the world through the Internet. The Internet has expanded
access to education, political participation, and medical information, among others – and
in doing so, created multiple paths towards a more participatory democracy.
The benefits of computing and the Internet, however, have not been distributed fairly.
Those in the racial majority and those with high levels of education and income have
participated in the technology revolution, while others have not. The gap between the
haves and the have-nots is called the digital divide. The National Center for Education
Statistics (NCES) report Computer and Internet Use by Students in 2003 (DeBell &
Chapman, 2006) documents the digital gap and identifies that it runs along demographic
and socioeconomic lines. Specifically, students and citizens who have historically been
marginalized continue to be marginalized in the technology revolution.
Before the gap can be closed, researchers need to understand who is caught in the
gap and why. Researchers need to understand the extent of the gap and identify systemic
inequities. Closing the gap is key to our nation’s economic future because an
impoverished and disconnected population with inequitable educational and employment
opportunities limits social development, economic innovation, job opportunities, and
political participation.
Digital equity is defined as equal access and opportunity to digital tools,
resources, and services to increase digital knowledge, awareness, and skills (Davis,
Fuller, Jackson, Pittman, & Sweet, 2007). Digital equity is more than access to hardware
and software; it also encompasses the opportunity to collaborate, create, critically
consider, and communicate. The digital divide based on unequal access to computers has
largely been eliminated (Gray, Thomas, & Lewis, Educational Technology in U.S. Public
Schools: Fall 2008, 2010): in US public schools, the ratio of students to instructional
computers with Internet access was 3.1 to 1. However, researchers have found significant differences in
how computers are used in the classroom and the types of technology-enhanced learning
experiences students have. In low-income schools, 83% of teachers report that their
students use technology "to learn or practice basic skills," while in high-income schools
students are more likely to prepare written text; conduct research; correspond with others;
create or use graphics or visual displays; develop and present multimedia presentations;
create art, music, movies, or webcasts; or design and produce a product (Gray, Thomas,
& Lewis, Teachers' Use of Educational Technology in U.S. Public Schools: 2009, 2010).
These inequities in technology integration practices perpetuate and accentuate existing
educational and societal inequities.
Digital equity research, recent publications, and state and national plans and
mandates are at the forefront of attention for education leaders and researchers across the
country. Nearly all research in digital equity describes the divide either at the national
(macro) level or within the classroom (micro) level. A need exists to explore digital
equity at an intermediary level, such as a state or smaller community.
The purpose of this study is to investigate digital equity by examining the
relationship between school characteristics and technology integration practices in K-12
schools located in Educational Service Center (ESC) Region 13 of the Texas public
school system. The research questions guiding this study are:
RQ1: Is there a relationship between the campus STaR chart level of progress and
campus accountability rating and school level?
RQ2: Is there a relationship between the campus STaR chart level of progress and
the percentage of economically disadvantaged students, English language learners, total
minority students, and at-risk students at the campus?
Method
This exploratory study uses a correlational design. The 484 schools in this study
are elementary (n = 301), middle (n= 108), and high (n= 75) schools located in Region
13 as defined by Texas Education Agency. SPSS was the statistical tool used to correlate
school demographic data as presented on the Academic Excellence Indicator System
(AEIS) and the levels of progress for technology integration as presented on the School
Technology and Readiness (STaR) chart. The data originated from the Texas Education
Agency (TEA) and is for the 2009/10 academic school year.
Participants.
Region 13 has 603 public schools; 484 are used in this study. Schools
were eliminated from the study for the following reasons:
1. They were charter schools;
2. They had an "L" or "X" accountability rating;
3. They were labeled as school type "B," indicating a campus that provides
K-12 education;
4. They had incomplete data in the datasets.
Datasets.
The data sets used for this analysis are publicly available data collected and
maintained by the Texas Education Agency for the 2009-2010 school year. In this study,
portions of two datasets were downloaded from the Texas Education Agency website.
Data representing technology integration were drawn from the levels of progress available
from the STaR chart. Campus demographic information is collected in the Academic
Excellence Indicator System (AEIS) (http://ritter.tea.state.tx.us/perfreport/aeis/). The two
data sets were merged in MS Excel using the 9-digit campus code.
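The Excel merge described above can be reproduced programmatically. The sketch below is a hypothetical illustration (the field names are invented, not the actual TEA column names); it inner-joins STaR and AEIS records on a shared campus code, dropping campuses that appear in only one dataset:

```python
def merge_on_campus(star_rows, aeis_rows, key="campus_code"):
    """Inner-join two lists of record dicts on a shared campus code,
    keeping only campuses present in both datasets."""
    aeis_by_code = {row[key]: row for row in aeis_rows}
    merged = []
    for star in star_rows:
        aeis = aeis_by_code.get(star[key])
        if aeis is not None:  # drop campuses missing from one dataset
            merged.append({**aeis, **star})
    return merged

# Hypothetical example records (not real TEA data)
star = [{"campus_code": "227901001", "star_avg": 2.5}]
aeis = [{"campus_code": "227901001", "eco_pct": 55.0},
        {"campus_code": "227901002", "eco_pct": 30.0}]
print(merge_on_campus(star, aeis))
```

The second AEIS campus has no STaR record and is excluded, mirroring the study's removal of schools with incomplete data.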
AEIS.
Since 1990, the TEA has presented information about schools and student
performance through the AEIS. The data presented on the AEIS is collected by the TEA
through the Public Education Information Management System (PEIMS) that is updated
annually through a comprehensive collection process. Collecting information on 1,200
school districts, more than 8,000 schools, approximately 320,000 educators, and over 4.7
million students, PEIMS is a vital data warehouse covering many aspects of
public education.
Factor Definitions.
In this research project, factors were selected based on their relevance to the
research questions. Table 1 provides a listing of the variables as well as a brief
definition (http://ritter.tea.state.tx.us/perfreport/aeis/2011/glossary.html).
Table 1
Variable names and definitions

Accountability rating: the accountability rating assigned to a district by the Texas
Education Agency (TEA). E = exemplary, A = acceptable, R = recognized,
L = academically unacceptable, X = not rated: other.

School type: schools are placed into one of four classifications based on the lowest and
highest grades in which students are enrolled at the school (i.e., in membership).
E = elementary, M = middle (including junior high school), S = secondary, and
B = both elementary/secondary (K-12).

Total minority percent: the percentage of total students identified as African American,
Hispanic, Asian/Pacific Islander, and Native American.

Percent white: the percentage of total students identified as White.

Percent economically disadvantaged: the percentage of total students reported as
economically disadvantaged, i.e., those reported as eligible for free or reduced-price
meals under the National School Lunch Program and Child Nutrition Program or other
public assistance.

Percent at-risk: the sum of the students coded as at risk of dropping out of school,
divided by the total number of students on the campus.

Percentage of limited English proficiency: former Limited English Proficiency (LEP)
students who did not receive any BE/ESL services and current LEP students receiving
any services.

STaR Chart.
In the spring of each academic year, all Texas educators are required to complete
a 24-question self-assessment of their technology integration practices via an on-line
survey. Their answers are aggregated at the campus, district, and state levels, providing
valuable information about school progress towards meeting the goals of the Long Range
Plan for Technology and No Child Left Behind, Title II, Part D. During the 2005-2006
school year, 172,783 teachers completed the Teacher STaR Chart (Instructional
Materials and Educational Technology Division, 2006). A similar number of educators
completed the STaR Chart for the 2009/10 school year. For this research project, the data
presented on the levels of progress will represent technology integration.
Questions on the assessment focus on four areas of technology integration and
mirror the four key technology goals outlined in the LRPT, which are:
1. Teaching and Learning
2. Professional Development
3. Leadership, Administration, and Instructional Support
4. Infrastructure for Technology.
Educators assess their own level of proficiency on indicators (1) Teaching and
Learning and (2) Professional Development. On indicators (3) Leadership,
Administration, and Instructional Support and (4) Infrastructure for Technology,
educators identify their perceptions of the campus environment.
On the survey instrument, educators identify their levels of progress for each of
the stated technology performance descriptions on a one to four scale. The options are:
Level 1: Early Tech
Level 2: Developing Tech
Level 3: Advanced Tech
Level 4: Target Tech
Table 2 provides an example of how teachers could identify their levels of
participation in technology-oriented professional development using the one-to-four-point
scale.
Table 2
Example of Professional Development Levels

Early Tech: Teacher has received training in skills including basic operations skills,
electronic attendance, grade book, e-mail, and integrated learning systems.

Developing Tech: Teacher receives professional development on how to integrate
technology into the curriculum, help with classroom management skills, and increase
teacher productivity.

Advanced Tech: Teacher receives professional development on how to integrate
technology to enhance and advance instruction in new ways (i.e., student collection,
analysis, and presentation of real-world data; use of edited digital video to synthesize
related concepts; cross-curricular activities in various content areas; and vertical
alignment across grade levels to connect concepts).

Target Tech: Teacher continues to participate in professional development experiences
but expands his/her influence by collaborating, mentoring, and training others. Teacher
encourages the development of student-led learning environments.

(http://starchart.epsilen.com/docs/TxTSC.pdf)
For the purposes of this research study, a campus-level average score was
calculated. The two-step process began by averaging the scores for each of the four
technology goals (Teaching and Learning, Professional Development, Leadership, and
Infrastructure). Then, these four scores were averaged to develop a campus level of
progress, or technology integration, score. For the 484 schools in Region 13, technology
integration scores ranged from a high of 3.46 to a low of 1.38, with a mean score of 2.468,
which places schools in the "developing tech" level of progress.
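The two-step averaging can be sketched as follows; the educator responses below are illustrative values, not actual STaR data:

```python
def campus_score(goal_scores):
    """Two-step campus technology-integration score: first average the
    educator responses within each of the four goal areas, then average
    the four goal-area means."""
    goal_means = [sum(scores) / len(scores) for scores in goal_scores.values()]
    return sum(goal_means) / len(goal_means)

# Illustrative educator responses for one campus (1-4 scale)
responses = {
    "Teaching and Learning":    [2, 3, 2],
    "Professional Development": [3, 3, 3],
    "Leadership":               [2, 2, 2],
    "Infrastructure":           [4, 3, 3],
}
print(round(campus_score(responses), 3))  # 2.667
```

Averaging within goals first keeps each of the four goal areas equally weighted, even if more educators answered some questions than others.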
Statistical Tests
To answer research question 1, the relationship between the campus STaR chart
level of progress and campus accountability rating and school level was established by
running a two-way ANOVA with technology integration as the dependent variable and
school type, accountability rating, and their interaction as the independent variables.
Significance was determined at the .05 alpha level, and the p-value was observed to assess
the strength of the relationship between school type, accountability rating, their
interaction, and levels of technology integration. The R-squared value measured the
strength of the model. Tukey's post hoc analysis was conducted on the significant factors.
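The study's analysis was run in SPSS. As a simplified, hypothetical illustration of what an ANOVA F statistic measures, the sketch below computes a one-way F for a single factor from toy data (the analysis above is a two-way ANOVA with an interaction term, which partitions the sums of squares further):

```python
def one_way_f(groups):
    """F statistic for a one-way ANOVA: ratio of between-group to
    within-group mean squares. `groups` maps factor level -> scores."""
    all_scores = [x for g in groups.values() for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    # Between-group sum of squares: group sizes times squared deviations
    # of group means from the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups.values())
    # Within-group sum of squares: squared deviations of each score
    # from its own group mean.
    ss_within = sum((x - sum(g) / len(g)) ** 2
                    for g in groups.values() for x in g)
    df_between = len(groups) - 1
    df_within = len(all_scores) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Toy STaR averages grouped by accountability rating (illustrative only)
groups = {"E": [3.0, 3.0, 4.0], "A": [2.0, 2.0, 3.0]}
print(round(one_way_f(groups), 3))  # 4.5
```

A large F relative to the critical value at alpha = .05 indicates that the factor explains more variation than chance alone would.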
To answer research question 2, a regression was generated using the campus
STaR chart technology integration average as the dependent factor and the percentage of
economically disadvantaged students, English language learners, total minority, and at-risk
students as the independent factors. The significance was assessed at the .05 level. The
R-squared value measured the strength of the multi-factor model.
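The regression itself was likewise run in SPSS. The sketch below illustrates the least-squares idea with a single hypothetical predictor; the actual model fits four predictors simultaneously, and the data here are invented for illustration:

```python
def simple_ols(x, y):
    """Least-squares slope and intercept for a single predictor;
    a simplified stand-in for the four-predictor model run in SPSS."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Illustrative data: percent economically disadvantaged vs. STaR average
eco_pct  = [20.0, 40.0, 60.0, 80.0]
star_avg = [2.8, 2.6, 2.4, 2.2]
slope, intercept = simple_ols(eco_pct, star_avg)
print(round(slope, 4), round(intercept, 4))  # -0.01 3.0
```

The negative slope in this toy example mirrors the direction of the relationship the study reports for economically disadvantaged students.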
Results
To describe the levels of technology integration in ESC Region 13, analysis of
variance (ANOVA) and regression techniques were used to analyze school demographics
and technology integration data.
Research Question 1
To answer research question one (does the level of technology integration vary by
campus accountability rating and school type?), a two-way ANOVA was run,
predicting technology integration with school type, accountability rating, and their
interaction. The hypothesis statement is:
H0: there is no school type x accountability rating interaction
Ha: there is a school type x accountability rating interaction
Table 3
Accountability Rating, Campus Type, and their Interaction

Tests of Between-Subjects Effects
Dependent Variable: STaR Avg

Source                          Type III SS   df    Mean Square   F          Sig.
Corrected Model                 7.721a        10    .772          6.330      .000
Intercept                       220.950       1     220.950       1811.311   .000
@2010CampusRating               2.740         3     .913          7.486      .000
SchoolType                      .419          2     .210          1.718      .181
@2010CampusRating * SchoolType  .892          5     .178          1.463      .201
Error                           57.698        473   .122
Total                           2971.201      484
Corrected Total                 65.420        483

a. R Squared = .118 (Adjusted R Squared = .099)
The F of the interaction is 1.463, which corresponds to a p-value of .201, greater
than the alpha of .05. As a result, we fail to reject the null hypothesis and conclude that
we have no statistical evidence that campus type and accountability rating
interact to explain variation in technology integration.
Since the interaction was not significant, the model was reformulated with school
type and accountability rating as the main effects. The results are listed in Table 4.
Table 4
Accountability Rating and Campus Type as Main Effects

Tests of Between-Subjects Effects
Dependent Variable: STaR Avg

Source             Type III SS   df    Mean Square   F          Sig.
Corrected Model    6.829a        5     1.366         11.143     .000
Intercept          183.949       1     183.949       1500.715   .000
@2010CampusRating  6.640         3     2.213         18.057     .000
SchoolType         1.468         2     .734          5.990      .003
Error              58.590        478   .123
Total               2971.201     484
Corrected Total    65.420        483

a. R Squared = .104 (Adjusted R Squared = .095)
At alpha .05, both school type and accountability rating are found to be significant
when controlling for each other. The F value for accountability rating is 18.057 with a
corresponding p=.000 and the F value for school type is 5.990 with a corresponding p =
.003. The R-squared value indicates that 10.4% of technology integration practice
variation can be explained by campus accountability rating and school type. The
relationship between technology integration, accountability rating, and school type is
visually represented in Graph 1.
Graph 1
Technology Integration, Accountability Rating, and School Type
Since the variables are significant, Tukey's post hoc test was conducted. The
statistical output is presented in Table 5.
Table 5
Tukey's Post Hoc Analysis of Accountability Rating

Multiple Comparisons
STaR Avg, Tukey HSD

(I) 2010        (J) 2010        Mean Diff.
Campus Rating   Campus Rating   (I-J)       Std. Error   Sig.   95% CI Lower   95% CI Upper
A               E               -.275239*   .0437645     .000   -.388068       -.162410
A               L               -.150485    .2499541     .931   -.794890       .493919
A               R               -.105247    .0421156     .061   -.213825       .003331
E               A               .275239*    .0437645     .000   .162410        .388068
E               L               .124753     .2490227     .959   -.517250       .766757
E               R               .169992*    .0361798     .000   .076717        .263266
L               A               .150485     .2499541     .931   -.493919       .794890
L               E               -.124753    .2490227     .959   -.766757       .517250
L               R               .045238     .2487383     .998   -.596032       .686508
R               A               .105247     .0421156     .061   -.003331       .213825
R               E               -.169992*   .0361798     .000   -.263266       -.076717
R               L               -.045238    .2487383     .998   -.686508       .596032
The Tukey post hoc test on campus accountability shows that statistically
significant differences exist between campus ratings Exemplary (E) (M=2.58) and
Acceptable (A) (M=2.30) and Exemplary (E) and Recognized (R) (M = 2.41). Exemplary
campuses score, on average, .27 points more than Acceptable campuses (p= .000) when
controlling for school type. Furthermore, Exemplary campuses score .17 points more
than Recognized campuses (p = .000) when controlling for school type.
Since school type was also found to be a significant main effect, a Tukey post hoc
test was run and the results are presented in Table 6.
Table 6
Tukey's Post Hoc Analysis of Campus Type

Multiple Comparisons
STaR Avg, Tukey HSD

(I) School   (J) School   Mean Diff.
Type         Type         (I-J)       Std. Error   Sig.   95% CI Lower   95% CI Upper
E            M            -.022752    .0392704     .831   -.115078       .069574
E            S            -.054372    .0451834     .452   -.160600       .051855
M            E            .022752     .0392704     .831   -.069574       .115078
M            S            -.031620    .0526238     .820   -.155341       .092100
S            E            .054372     .0451834     .452   -.051855       .160600
S            M            .031620     .0526238     .820   -.092100       .155341
Tukey’s post hoc analysis indicates no statistically significant variation of
technology integration between school types when controlling for accountability rating.
Research Question 2
To answer research question 2, whether there is a relationship between the
campus STaR chart technology integration and the percentages of (1) economically
disadvantaged students, (2) English language learners, (3) total minority, and (4) at-risk
students at the campus, SPSS was used to run a regression with these four
variables as predictors. The hypothesis statement for this analysis is:
H0: β = 0, when controlling for percentage of economically disadvantaged
students, English language learners, total minority, and at-risk students at the campus
Ha: β ≠ 0, when controlling for percentage of economically disadvantaged
students, English language learners, total minority, and at-risk students at the campus.
Table 7 shows the regression model, which is significant at the .05 alpha level.
The F-statistic of 32.74 produces a p-value of p = .000 (p < .05), so we conclude that the
percentage of economically disadvantaged students, English language learners, total
minority, and at-risk students at the campus together
explain a significant amount of variability in technology integration at the campus.
Table 7
ANOVA of Percentage of Economically Disadvantaged Students, English Language
Learners, Total Minority, and At-risk Students

ANOVAb

Model 1      Sum of Squares   df    Mean Square   F        Sig.
Regression   14.045           4     3.511         32.736   .000a
Residual     51.375           479   .107
Total        65.420           483

a. Predictors: (Constant), Total % minority, LEP %, at-risk %, ECO %
b. Dependent Variable: STaR Avg
We found the R2 value to be .215, which means that we are able to explain
21.5% of the variation in technology integration on the STaR chart by accounting for the
effects of the percentage of economically disadvantaged students, English language
learners, total minority, and at-risk students at the campus together. This is a substantial
proportion of explained variance for a four-point levels-of-progress scale.
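The R-squared value follows directly from the sums of squares reported in Table 7: the regression sum of squares divided by the total sum of squares.

```python
# R-squared from the ANOVA decomposition in Table 7:
# R^2 = SS_regression / SS_total
ss_regression = 14.045
ss_total = 65.420
r_squared = ss_regression / ss_total
print(round(r_squared, 3))  # 0.215, i.e., 21.5% of variation explained
```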
An analysis of the coefficients for each variable is listed in Table 8.
Table 8
Coefficient Analysis

Coefficientsa

                   Unstandardized              Standardized
                   Coefficients                Coefficients
Model 1            B       Std. Error          Beta     t        Sig.   95% CI Lower   95% CI Upper
(Constant)         2.744   .044                         61.920   .000   2.657          2.832
ECO %              -.007   .001                -.534    -5.515   .000   -.010          -.005
LEP %              .000    .001                .015     .214     .831   -.003          .003
at-risk %          .003    .002                .128     1.416    .157   -.001          .006
Total % minority   -.001   .001                -.056    -.637    .524   -.003          .002

a. Dependent Variable: STaR Avg
We see that the percentages of (1) English language learners, (2) total minority, and (3)
at-risk students are not statistically significant. In this model, there is not enough
evidence to conclude that the percentage of English language learners, total minority, or
at-risk students makes a difference in technology integration on the STaR chart when
accounting for the effects of the percentage of economically disadvantaged students. On
the other hand, we see that the percentage of economically disadvantaged students is
significant.
Our t-statistic here is -5.515 and our p-value is p=.000 (p<.05). We conclude that the
percentage of economically disadvantaged students makes a difference in technology
integration practices on a campus when controlling for the percentage of English
language learners, total minority, and at-risk students.
The regression coefficient for the significant variable, percentage economically
disadvantaged, is b = -.007. In this model, we estimate that the level of technology
integration decreases by .07 (on a 4-point scale) when the percentage of economically
disadvantaged students increases by 10 percentage points, when controlling for English
language learners, total minority, and at-risk students. Furthermore, we are 95% confident
that the true coefficient lies between -.010 and -.005.
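The interpretation above is straightforward arithmetic on the coefficient: multiplying b by a change in the predictor gives the estimated change in the outcome, and the confidence bounds scale the same way.

```python
# Estimated change in technology integration for a given increase in the
# percentage of economically disadvantaged students, using b from Table 8.
b_eco = -0.007     # unstandardized coefficient for ECO %
delta_pct = 10     # a 10-percentage-point increase
estimated_change = b_eco * delta_pct
print(round(estimated_change, 3))  # -0.07 on the 4-point scale

# The 95% confidence bounds for b scale the same way.
ci_low, ci_high = -0.010 * delta_pct, -0.005 * delta_pct
print(round(ci_low, 3), round(ci_high, 3))  # -0.1 -0.05
```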
Discussion
Under the direction of the Texas Legislature, the Texas Education Agency has
identified four key areas for successful technology integration: (1) teaching and learning,
(2) professional development, (3) leadership, and (4) infrastructure. On an annual basis,
over 180,000 educators in the state complete an on-line self-assessment to identify their
technology integration practices in their classrooms on a scale of one to four, with one
representing an early level of technology and four representing a target level of
technology. The results of this on-line assessment are aggregated at the campus, district,
and state levels and reported through the STaR chart. For the purposes of this
quantitative analysis, the STaR chart data was correlated with campus demographic data
to find statistically significant relationships that would indicate digital inequities.
A 4x3 ANOVA relating technology integration to campus type, accountability
rating, and their interaction found equality as well as inequality. The absence of an
interaction between campus type and accountability rating, together with the lack of
pairwise differences between campus types, indicates that elementary, middle, and high
schools are integrating technology at similar levels. A teacher in an elementary school will
integrate technology at similar levels to a teacher in a high school or middle school. The campus
accountability rating was found to be statistically significant when controlling for campus
type. We found that schools with an Exemplary accountability rating had a statistically
higher level of technology integration than schools with a Recognized or Acceptable
accountability rating, when controlling for school type. Students who attend an
Exemplary campus will experience a higher level of technology integration in their
learning experiences than students who attend a campus with a Recognized or Acceptable
campus rating. Digital inequity exists even though the campuses are rated Recognized
and Acceptable.
The second analysis explored the relationship between technology integration and
the percentages of economically disadvantaged students, limited English proficiency
students, total minority students, and at-risk students. These factors were selected for
analysis because they typically represent those who are on the "have not" side of the
digital divide. The statistical analysis found that the percentage of economically
disadvantaged students predicts the level of technology integration, when controlling for
limited English proficiency students, total minority students, and at-risk students. In
fact, 21.5% of the variation in technology integration can be explained by these
demographic factors.
Although the student to computer ratio is relatively equal across schools (Gray,
Thomas, & Lewis, Teachers' Use of Educational Technology in U.S. Public Schools:
2009, 2010), the analyses in this study indicate that technology integration is statistically
unequal. Specifically, schools rated as Exemplary exceed the technology integration
practices of schools rated as Recognized or Acceptable, while schools with high
percentages of economically disadvantaged students have lower levels of technology
integration. Students are receiving unequal technology-enhanced learning experiences
based on the accountability rating and demographic characteristics of the campus they attend.
This research study found that the digital divide is alive and well in K-12 schools
in ESC Region 13. Those on the "have" side of the divide, as represented by the
Exemplary accountability rating, participate in higher-level technology integration
learning environments, while those on the "have not" side of the divide, as represented by
a high percentage of economically disadvantaged students, participate in lower-level
technology-enhanced learning environments.
Study Limitations
The schools used in this analysis are limited to those in Region 13, the Austin area.
Region 13 represents 14% of schools in Texas (Texas Education Agency, 2011). The
STaR chart data is a self-assessment of technology integration practices by educators.
Self-assessments can over-estimate or under-estimate educational practices and may not
represent the reality of technology integration. Technology integration is a difficult
construct to define, and the STaR chart data may not be fully representative of
technology integration.
Future Research
Understanding the digital divide and who is affected by it is critical to equity in
education. Quantitative and qualitative studies are needed to investigate the extent of the
divide, identify the types of students affected by it, and determine how teachers can
overcome systemic inequities to create technology-rich educational opportunities for
students.
References
Davis, T., Fuller, M., Jackson, S., Pittman, J., & Sweet, J. (2007). A National
Consideration of Digital Equity. International Society for Technology in Education.
Washington, DC: International Society for Technology in Education.
DeBell, M., & Chapman, C. (2006). Computer and Internet Use by Students in
2003. U.S. Department of Education, National Center for Education Statistics.
Washington, DC: National Center for Education Statistics.
Gray, L., Thomas, N., & Lewis, L. (2010). Educational Technology in U.S. Public
Schools: Fall 2008. US Department of Education, National Center for Education
Statistics. Washington, DC: National Center for Education Statistics.
Gray, L., Thomas, N., & Lewis, L. (2010). Teachers' Use of Educational
Technology in U.S. Public Schools: 2009. US Department of Education. Washington,
DC: National Center for Education Statistics.
Instructional Materials and Educational Technology Division. (2006, Fall). STaR
Chart. Retrieved October 23, 2012, from Texas Education Agency:
http://starchart.epsilen.com/nclb/default.html
Texas Education Agency. (2010). 2010 Progress Report on the Long-Range
Technology Plan. Texas Education Agency, Austin.
Texas Education Agency. (n.d.). Academic Excellence Indicator System.
Retrieved October 17, 2012, from TEA: http://ritter.tea.state.tx.us/perfreport/aeis/
Texas Education Agency. (2011). Snapshot 2011: Summary Tables. Retrieved
October 23, 2012, from Texas Education Agency:
http://ritter.tea.state.tx.us/perfreport/snapshot/2011/sumtables.html
Texas Education Agency. (n.d.). STaR Chart Advanced Search. Retrieved
October 17, 2012, from STaR Chart: http://starchart.epsilen.com