
Digital Portfolios: A Study of Undergraduate Student and Faculty Use and Perceptions of
Alverno College’s Diagnostic Digital Portfolio
by
Linda Ehley
A Dissertation presented in partial fulfillment of the requirements for the
Doctor of Education/Philosophy degree in
Leadership for the Advancement of Learning and Service
College of Education
Cardinal Stritch University
May, 2006
Dissertation Approval
As members of the dissertation committee for Linda Ehley, and on behalf of the Doctoral
Program at Cardinal Stritch University, we affirm that this report meets the expectations
and academic requirements for the Ed.D. degree in Leadership for the Advancement of
Learning and Service.
Peter M. Jonas, Ph.D., Chairperson
Approval Date
Michael Dickmann, Ph.D.
Approval Date
Georgine Loacker, Ph.D.
Approval Date
As the Dean of the College of Education, and on behalf of the Doctoral Program at
Cardinal Stritch University, I affirm that this report meets the expectations and academic
requirements for the Ed.D. degree in Leadership for the Advancement of Learning and
Service.
Anthea Bojar, Ph.D.
Approval Date
Copyright © 2006 by Linda Ehley
All rights reserved
Dedication and Acknowledgements
This study is dedicated to all those who assisted me, put up with me, and provided
extreme flexibility with various deadlines, especially my family, who did not protest when
Mom was writing on Thanksgiving, Christmas, and every other holiday. To my husband,
who served as an editor and a sounding board, and who only glazed over a little when I went
on and on excitedly describing my latest "find." This study is also dedicated to my doctoral
committee, including my chair, Peter Jonas, who suffered through innumerable questions,
lively discussions on what should and should not be included (especially in Chapter 2),
and who, above all, promptly responded to all my requests and aided me in adhering to
my timeline. In particular, I would like to dedicate this study to my committee member
and mentor, Georgine Loacker. Not only did she assist me in furthering my
understanding of Alverno’s philosophy and my ability to succinctly articulate it, she also
spent an immense amount of time patiently explaining writing and grammar rules,
context setting, and flow in such a masterful manner that my writing will be forever
improved. A heartfelt thank you to all!
Abstract
Digital Portfolios: A Study of Undergraduate Student and Faculty Use and Perceptions of
Alverno College’s Diagnostic Digital Portfolio
The use of digital portfolios in higher education has significantly increased in the
last few years. According to Batson (2002), “E-Portfolios have a greater potential to alter
higher education at its very core than any other technology we’ve known thus far” (p. 1).
Despite the boom and potential of digital portfolios, research is limited, focusing mainly
on descriptions, categories, strategies for implementation, and programs under
development. Research on faculty and student use and perceptions of digital portfolios
is scarce.
This study addressed the effectiveness of Alverno College’s Diagnostic Digital
Portfolio (DDP) by describing and evaluating student and faculty use and perceptions of
the DDP during a one-year period, applying a program evaluation methodology. An
Interactive Form of program evaluation (Owen, 1999) that relies on observations,
surveys, and interviews was used in this study. Data were gathered using a three-pronged
approach: (a) mining of the DDP database (all undergraduate students and faculty who
logged onto the DDP between January 1, 2005 and June 26, 2005), (b) surveys
administered to 324 students and 93 faculty, and (c) post-survey interviews of eight
students and nine faculty.
The results of this study indicated that undergraduate students and faculty were
indeed logging onto the DDP and that they perceived the DDP as an easy-to-use, useful tool; a tool
students would like to use more often and more consistently. Student and faculty use of
the DDP has continued to increase since it was implemented in 1999. Results of this
study underscore a need for more consistent use of the DDP throughout the curriculum,
as well as the need for increased student and faculty training.
This research will be used by Alverno to evaluate and improve the DDP with the
goal of assisting student learning. Although the results of this study cannot be directly
generalized to other higher education institutions, they do provide insights into
student and faculty use and perceptions of digital portfolios. In addition, this study adds to
the body of knowledge on digital portfolios and serves as a model for other digital
portfolio evaluations and research.
Table of Contents
Page
Approval Page
Copyright Page
Dedication and Acknowledgements ............................................................................i
Abstract ........................................................................................................................ii
Table of Contents .........................................................................................................iv
List of Tables ...............................................................................................................ix
List of Figures ..............................................................................................................xiv
CHAPTER ONE: INTRODUCTION .........................................................................1
General Background ........................................................................................1
Conceptual Context of the DDP ......................................................................6
Alverno College Learning and Assessment Philosophy .......................7
Development of the Diagnostic Digital Portfolio ................................12
DDP version 2.0 ...................................................................................19
Initial Research on the DDP ................................................................21
Purpose of Study ..............................................................................................22
Significance of the Study .................................................................................24
Approach to Study ...........................................................................................25
Limitations/Delimitations ................................................................................27
Vocabulary of the Study ..................................................................................28
Summary and Forecast .....................................................................................29
CHAPTER TWO: LITERATURE REVIEW .............................................................31
Organization of Review ...................................................................................31
History of Portfolios in Education ...................................................................33
Digital Portfolios ..............................................................................................36
Categories of Digital Portfolios ...........................................................39
Tools Used for Construction of Digital Portfolios ..............................42
Benefits and Challenges of Digital Portfolios ......................................46
Student Digital Portfolios ................................................................................50
Research on Student Digital Portfolios ............................................................53
Portfolio Research in Teacher Education ............................................54
Research on Student Learning Portfolios .............................................60
Initial Research on the Diagnostic Digital Portfolio ........................................64
Grant Report Research ........................................................................65
Quantitative Data Summary ................................................................71
Qualitative Data Summary ..................................................................72
Student Interviews.....................................................................72
Preliminary Observations ............................................78
Faculty Surveys ........................................................................80
Classroom Observations ..........................................................81
ERE General Observations Concerning the DDP....................83
Self Reflection – Self Assessment ...................................................................86
Program Evaluation .........................................................................................90
Summary and Forecast .....................................................................................93
CHAPTER THREE: RESEARCH DESIGN ...............................................................95
Purpose of Study ..............................................................................................96
Participants........................................................................................................99
Data Mining of the DDP Relational Database ....................................99
Survey of Students and Faculty ............................................................99
Student and Faculty Interviews ............................................................100
Procedures and Methods ..................................................................................101
Data Mining of the DDP Relational Database ....................................101
Survey of Students and Faculty ............................................................102
Student and Faculty Interviews ............................................................108
Data Analysis ...................................................................................................109
Limitations .......................................................................................................110
Ethics ................................................................................................................112
Summary ..........................................................................................................114
CHAPTER FOUR: RESEARCH RESULTS .............................................................116
Presentation Approach .....................................................................................116
Demographic Description of Sample ...............................................................118
Database Mining ..................................................................................118
Survey of Students and Faculty ............................................................119
Student and Faculty Interviews ............................................................120
Test of Assumptions ........................................................................................121
Demographic Description of Results ...............................................................122
Student Demographic Data Analysis ...................................................122
Faculty Demographic Data Analysis ...................................................127
Sub-question 1: How Often Do Students and Faculty Log onto the DDP? .....132
Database Mining ..................................................................................132
Survey Data Analysis ...........................................................................135
Student Survey Results .............................................................137
Faculty Survey Results .............................................................139
Sub-question 2: What Do Students and Faculty Do When They Log Onto
the DDP? ..........................................................................................................140
Database Mining ..................................................................................140
Survey Data Analysis ...........................................................................144
Student Survey Results .............................................................144
Faculty Survey Results .............................................................160
Interview Data Analysis .......................................................................169
Student Interview Results .........................................................170
Faculty Interview Results .........................................................171
Sub-question 3: What features of the DDP are perceived by students and
faculty as useful or not useful? ........................................................................173
Survey Data Analysis ...........................................................................174
Student Survey Results .............................................................174
Faculty Survey Results .............................................................188
Interview Data Analysis .......................................................................196
Student Interview Results .........................................................196
Faculty Interview Results .........................................................197
Sub-question 4: What are student and faculty perceptions of the overall
usefulness of the DDP? ....................................................................................198
Survey Data Analysis ...........................................................................199
Student Survey Results .............................................................199
Faculty Survey Results .............................................................203
Interview Data Analysis .......................................................................206
Student Interview Results .........................................................206
Faculty Interview Results .........................................................208
Sub-question 5: What do students and faculty think of the ease of use of
the DDP? ..........................................................................................................209
Survey Data Analysis ...........................................................................210
Student Survey Results .............................................................210
Faculty Survey Results .............................................................213
Interview Data Analysis .......................................................................216
Student Interview Results .........................................................216
Faculty Interview Results .........................................................217
Sub-question 6: What are students and faculty perceptions concerning their
frequency of use of the DDP? ..........................................................................217
Survey Data Analysis ...........................................................................218
Student Survey Results .............................................................218
Faculty Survey Results .............................................................222
Interview Data Analysis .......................................................................225
Student Interview Results .........................................................225
Faculty Interview Results .........................................................226
Sub-question 7: What suggestions do students and faculty have on:
improvement of the usefulness of the DDP, assistance in using the DDP
more, general ideas for improvement of the DDP, and additional comments
on the DDP? .....................................................................................................227
Survey Data Analysis ...........................................................................227
What do you think could enhance the usefulness of the
DDP? .......................................................................................227
Student Survey Results .................................................228
Faculty Survey Results .................................................230
What do you think could help you use the DDP more? ...........232
Student Survey Results .................................................232
Faculty Survey Results .................................................235
What are your suggestions for improving the DDP? ...............237
Student Survey Results .................................................237
Faculty Survey Results .................................................239
Do you have any additional comments on the DDP that you
would like to share? .................................................................242
Student Survey Results .................................................242
Faculty Survey Results .................................................244
Interview Data Analysis .......................................................................246
Student Interview Results .........................................................246
Faculty Interview Results .........................................................247
Characteristics of Key Performances ...............................................................248
How many active key performances are being used by students? .......248
What discipline departments have completed key performances? ......249
How are completed key performances connected to the abilities? ......251
How are completed key performances connected to other matrices? .253
Summary of Results..........................................................................................254
CHAPTER FIVE: DISCUSSION ...............................................................................265
Overview ..........................................................................................................265
Summary of Findings .......................................................................................266
Summary of Research Sub-question Results ...................................................268
Sub-question 1: How often do students and faculty log onto the
DDP? ...................................................................................................268
Sub-question 2: What do students and faculty do when they log
onto the DDP? .....................................................................................269
Sub-question 3: What features of the DDP are perceived by
students and faculty as useful or not useful? .......................................274
Sub-question 4: What are student and faculty perceptions of the
overall usefulness of the DDP? ............................................................275
Sub-question 5: What are student and faculty perceptions of ease
of use of the DDP? ...............................................................................278
Sub-question 6: What are student and faculty perceptions
concerning the frequency of use of the DDP? .....................................280
Sub-question 7: What suggestions do students and faculty have
on: how to improve the usefulness of the DDP, how to assist them
in using the DDP more, and what general ideas would suggest
improvement of the DDP? ....................................................................282
Summary of Results on Characteristics of Key Performances .........................285
How many active key performances are being used by students? .......285
What discipline departments have completed key performances? ......285
How are completed key performances connected to the abilities?.......286
How are completed key performances connected to other matrices? .287
Comparison of the DDP to Love, McKean, and Gathercoal’s Levels of
Maturation for Digital Portfolios ......................................................................288
Relationships of Results to Previous Research.................................................294
Conclusions.......................................................................................................298
Implications for Practice ...................................................................................299
Limitations of Study .........................................................................................300
Future Research Possibilities ............................................................................302
Bibliography ................................................................................................................303
List of Tables
Table
Page
1. Barrett’s Comparison of Portfolio Development Process ...................................... 38
2. Summary of Carney’s Five Studies ........................................................................ 57
3. Guidelines for Selecting or Designing A Key Performance .................................. 68
4. Quantitative Data Summary of Initial ERE DDP Research 2000-2003 ............... 73
5. Results of 2002 ERE Student Survey Question: What kinds of things have
you done on the DDP? ............................................................................................ 75
6. Results of 2002 ERE Student Survey Question: What stands out from your
DDP experiences with the DDP?........................................................................... 75
7. Results of 2002 ERE Student Survey Question: As you know, your DDP is
accessible to you at any time. Have you found yourself using it on your
own outside of a particular course or assignment? ................................................ 76
8. Results of 2002 ERE Student Survey Question: In what ways have your
experiences with the feedback and self assessment on the DDP been alike
or different from other ways you share feedback and self assessment at the
College? .................................................................................................................. 77
9. Results of 2002 ERE Student Survey Question: What purposes do you think
faculty had in mind when they designed the DDP?................................................ 77
10. Results of 2002 ERE Student Survey Question: If you could tell the DDP
design team one thing, what would it be?............................................................... 78
11. Criteria for Ascertaining Levels of Maturation ..................................................... 93
12. Institutional and Survey Data Comparison ............................................................ 121
13. Comparison of Institutional and Survey Data for Majors and Support (Minor) ....... 124
14. Summary of Results of Student Survey Participants' Number of Semesters at
Alverno .................................................................................................................. 126
15. Number and Frequency of Students Logging onto the DDP From
August 2000 to Fall 2003........................................................................................ 133
16. Results of Student Survey Question: How many times during a typical month
do you log onto the DDP?....................................................................................... 138
17. Student Survey Statistics on Completed Key Performances .................................. 146
18. Student Survey Statistics on How Often A Key Performance is Added To
the My Work Area .................................................................................................. 147
19. Student Survey Statistics on How Often Students Upload A Self Assessment...... 149
20. Student Survey Statistics on How Often Students Check Feedback ..................... 150
21. Student Survey Statistics on How Often Students Review Past Key
Performances........................................................................................................... 152
22. Student Survey Statistics on How Often Students Use The My Resource
Area......................................................................................................................... 153
23. Student Survey Statistics on How Often Students Use The Reference Area ........ 154
24. Student Survey Statistics on How Often Students Attach A Key Performance
To A Matrix ............................................................................................................ 156
25. Student Survey Statistics on How Often Students View A Video.......................... 157
26. Student Survey Statistics on How Often Students Use the Help Menu.................. 158
27. Summary of Students’ Most Often Used Features of the DDP .............................. 159
28. Summary of Students’ Three Least-Often Used Features of the DDP ................... 160
29. Summary of Faculty Most-Used and Least-Used Features of the DDP ................. 169
30. Student Survey Statistics on Usefulness of Accessing the DDP from Off-Campus ..... 175
31. Student Survey Statistics on Usefulness of Accessing Work and Self
Assessments ............................................................................................................ 177
32. Student Survey Statistics on Usefulness of Accessing Feedback........................... 178
33. Student Survey Statistics on Usefulness of Reviewing Past Key Performances .... 179
34. Student Survey Statistics on Usefulness of My Resources Area............................ 181
35. Student Survey Statistics on Usefulness of the Reference Area............................. 182
36. Student Survey Statistics on Usefulness of Attaching a Key Performance to a
Matrix...................................................................................................................... 184
37. Student Survey Statistics on Usefulness of Viewing a Video of Their Work ........ 185
38. Student Survey Statistics on Usefulness of the Help Menu.................................... 186
39. Summary of Student Perception of the Most-Useful Features of the DDP ........... 187
40. Summary of Student Perception of the Least-Useful Features of the DDP ........... 188
41. Summary of Faculty Perception for Most-Useful and Least-Useful Features
of the DDP ............................................................................................................. 195
42. Student Survey Statistics on Overall Usefulness of the DDP................................. 200
43. Thematic Conceptual Matrix for Student Survey Responses to Overall
Usefulness of the DDP............................................................................................ 202
44. Thematic Conceptual Matrix for Faculty Survey Responses to Overall
Usefulness of the DDP............................................................................................ 205
45. Student Survey Statistics on Overall Ease of Use of the DDP .............................. 211
46. Thematic Conceptual Matrix for Student Survey Responses to Overall
Ease of Use of the DDP .......................................................................................... 212
47. Thematic Conceptual Matrix for Faculty Survey Responses to Overall
Ease of Use of the DDP .......................................................................................... 215
48. Student Survey Statistics on Frequency of Use of the DDP .................................. 219
49. Thematic Conceptual Matrix for Student Survey Responses to Frequency
of Use of the DDP .................................................................................................. 221
50. Thematic Conceptual Matrix for Faculty Survey Responses to Frequency
of Use of the DDP................................................................................................... 224
51. Thematic Conceptual Matrix for Student Survey: What could enhance the
usefulness of the DDP?........................................................................................... 229
52. Thematic Conceptual Matrix for Faculty Survey: What could enhance the
usefulness of the DDP?........................................................................................... 231
53. Thematic Conceptual Matrix for Student Survey: What do you think would help
you use the DDP more? .......................................................................................... 234
54. Thematic Conceptual Matrix for Faculty Survey: What do you think would help
you use the DDP more? .......................................................................................... 236
55. Thematic Conceptual Matrix for Student Survey: What are your suggestions for
improving the DDP? ............................................................................................... 239
56. Thematic Conceptual Matrix for Faculty Survey: What are your suggestions for
improving the DDP more? ...................................................................................... 240
57. Thematic Conceptual Matrix for Student Survey: Do you have any additional
comments on the DDP you would like to share?.................................................... 243
58. Thematic Conceptual Matrix for Faculty Survey: Do you have any additional
comments on the DDP you would like to share?.................................................... 245
59. Summary of Discipline Departments and Completed Key Performances ............. 251
60. Summary of Ability Matrix Connections to Completed Key Performances for the
Spring, 2005 Semester ............................................................................................ 252
61. Summary of DDP Relational Database Data on Completed Key Performances
Connections to Matrices (Other Than Ability Matrix) ........................................... 254
62. Summary of Student Perceptions of How Often They Use Features
of the DDP .............................................................................................................. 256
63. Summary of Faculty Perceptions of How Often They Use Features
of the DDP .............................................................................................................. 256
64. Summary of Student and Faculty Survey Results for Most-Often and Least-Often Used Features of the DDP ............................................................................ 257
65. Summary of Student Perceptions of Useful Features of the DDP .......................... 258
66. Summary of Faculty Perceptions of Useful Features of the DDP .......................... 258
67. Summary of Student and Faculty Survey Results for Most-Useful and Least-Useful Features of the DDP .................................................................................... 259
68. Comparison of Student and Faculty Survey Results for Least-Often Used
Features of the DDP................................................................................................ 272
69. Comparison of Student and Faculty Survey Results for Most-Often Used
Features of the DDP................................................................................................ 273
70. Comparison of Student and Faculty Survey Results of Least-Useful
Features of the DDP................................................................................................ 274
71. Comparison of Student and Faculty Survey Results of Most-Useful
Features of the DDP................................................................................................ 275
72. Comparison of the DDP to Love, McKean, and Gathercoal’s Levels of
Maturation for Digital Portfolios ............................................................................ 289
73. Comparison of the DDP to Level 5 Maturation: Authentic Evidence
as the Authoritative Evidence -- Webfolio ............................................................. 290
74. Comparison of ERE’s Student Experience Categories........................................... 296
List of Figures
Figure
Page
1. Ability-Based Learning/Student Assessment-as-Learning and its
connection to key performances in the DDP .......................................................... 13
2. Screen shot from Demonstration DDP (3/1/05) ..................................................... 16
3. Creation and completion of a key performance in the DDP .................................. 17
4. Screen shot from Demonstration DDP for example student Jane Alverno ........... 18
5. Student survey results: What general program are you in? ................................... 123
6. Student survey results: Do you live on campus? ................................................... 125
7. Student survey results: Are you currently full-time or part-time? ......................... 125
8. Faculty survey results: How long have you been teaching at Alverno? ................ 129
9. Faculty survey results: In what department do you primarily teach? .................... 130
10. Faculty survey results: Are you full-time or part-time faculty? ............................ 131
11. DDP relational database results: Number of times students logged onto
the DDP during spring, 2005 .................................................................................. 134
12. DDP relational database results: Number of times faculty logged onto
the DDP during spring, 2005 .................................................................................. 136
13. Student survey results: How many times a month do you log onto the DDP?....... 137
14. Faculty survey results: How many times a month do you log onto the DDP? ....... 139
15. DDP relational database results: Number of completed key performances
spring, 2005............................................................................................................. 141
16. DDP relational database results: Number of faculty files uploaded
spring, 2005............................................................................................................. 142
17. DDP relational database results: Faculty active key performances
spring, 2005............................................................................................................. 143
18. Student survey results: How many key performances have you completed
this semester? .......................................................................................................... 145
19. Student survey results: How often do students add a key performance to the
My Work area? ....................................................................................................... 147
20. Student survey results: How often do students upload a self assessment? ............. 148
21. Student survey results: How often do students check feedback? ........................... 150
22. Student survey results: How often do students review past key performances? .... 151
23. Student survey results: How often do students use the My Resource area?........... 152
24. Student survey results: How often do students use the Reference area? ................ 154
25. Student survey results: How often do students attach a key performance
to a matrix? ............................................................................................................. 155
26. Student survey results: How often do students view video?................................... 156
27. Student survey results: How often do students use the Help Menu?...................... 158
28. Faculty survey results: How many key performances do you have on the DDP?.. 161
29. Faculty survey results: How often do faculty create a key performance? .............. 162
30. Faculty survey results: How often do faculty upload student feedback?................ 163
31. Faculty survey results: How often do faculty read student work?.......................... 164
32. Faculty survey results: How often do faculty read students’ self assessments? ..... 164
33. Faculty survey results: How often do faculty use the My Resource area? ............. 165
34. Faculty survey results: How often do faculty use the Reference area? .................. 166
35. Faculty survey results: How often do faculty check a student’s past work? .......... 167
36. Faculty survey results: How often do faculty use the DDP for Narratives?........... 168
37. Faculty survey results: How often do faculty use the Help Menu? ........................ 168
38. Student perception of the usefulness of accessing the DDP from off-campus ....... 175
39. Student perception of the usefulness of accessing work and self assessments....... 176
40. Student perception of the usefulness of accessing feedback .................................. 177
41. Student perception of the usefulness of reviewing past key performances ............ 179
42. Student perception of the usefulness of My Resources .......................................... 180
43. Student perception of the usefulness of the Reference area ................................... 182
44. Student perception of the usefulness of attaching a key performance to a matrix . 183
45. Student perception of the usefulness of viewing a video of their work.................. 184
46. Student perception of the usefulness of the Help Menu ......................................... 186
47. Faculty perception of the usefulness of accessing the DDP from off-campus ....... 189
48. Faculty perception of the usefulness of providing feedback to students ................ 190
49. Faculty perception of the usefulness of viewing student work............................... 191
50. Faculty perception of the usefulness of viewing student self assessments............. 191
51. Faculty perception of the usefulness of the My Resource area .............................. 192
52. Faculty perception of the usefulness of the Reference area.................................... 193
53. Faculty perception of the usefulness of checking a student’s past work ................ 193
54. Faculty perception of the usefulness of the DDP for narratives ............................. 194
55. Faculty perception of the usefulness of the Help Menu ......................................... 195
56. Student perception of the overall usefulness of the DDP ....................................... 200
57. Faculty perception of the overall usefulness of the DDP ....................................... 204
58. Student perception of the overall ease of use of DDP ............................................ 211
59. Faculty perception of the overall ease of use of DDP ............................................ 214
60. Student perception of the frequency of use of the DDP ......................................... 219
61. Faculty perception of the frequency of use of the DDP.......................................... 223
62. Discipline departments with completed key performances? .................................. 250
CHAPTER ONE: INTRODUCTION
General Background
The use of digital, electronic, or web portfolios is increasing significantly in
higher education. During a current-issues roundtable discussion at EDUCAUSE 2004,
John Ittelson, National Learning Infrastructure Initiative (NLII) fellow, stated that
approximately 70% of higher education institutions are implementing or currently using
some form of electronic portfolio (personal communication, EDUCAUSE 2004, Denver,
October 21, 2004). According to Batson (2002), such use in higher education has
approached critical mass as electronic saturation on campuses is reached.
We seem to be beginning a new wave of technology development in higher
education. Freeing student work from paper and making it organized, searchable,
and transportable opens enormous possibilities for re-thinking whole curricula:
the evaluation of faculty, assessment of programs, certification of student work,
how accreditation works. In short, ePortfolios might be the biggest thing in
technology innovation on campus. Electronic portfolios have a greater potential
to alter higher education at its very core than any other technology application
we’ve known thus far. (p. 1)
Digital/electronic portfolios are a relatively new innovation; however, portfolios,
defined by Webster as “a selection of representative works,” have a history of use in
education, particularly in the professional and artistic disciplines. Jay Mathews (2004)
traced some of the history of portfolio use as an alternative to the selected response
method of standardized testing. He describes the history of portfolio use in education as
linked to the notion of authentic assessment as he defines it (judging a student's work
first hand, rather than summing it up with a letter or a number) and as having its roots in the
progressive education movement that started a century ago.
Although considered time-consuming, portfolios have appealed to many teachers
and students. They became a key part of the alternative public schools in the 1960's and
1970's. Portfolio use was integrated into the National Writing Project, started in 1974 at the
University of California. It gained additional strength in the 1980's with the Arts Propel
project, in which Drew Gitomer, Howard Gardner, and Dennie Palmer Wolf explored the idea
of portfolio use in writing, music, and the arts for all students. The Arts Propel project
was funded by the Rockefeller Foundation, in connection with the Educational Testing
Service and Harvard Project Zero, and involved a five-year period (1987-1993) of
experimentation with middle and high school art teachers. The curriculum not only
involved manipulating materials but also emphasized students analyzing their own work.
This analysis involved students reflecting on the learning process they used to complete
their work. Several assessment approaches were used in the project classrooms, one of
which involved students keeping a portfolio of all work, including preliminary work and
reflective writing, to be used as a reference point throughout the course (Jones, 1994, p.
25). However, the Arts Propel project was focused on learning, not on testing for
accountability.
Mathews further suggests that portfolios started losing ground as an educational
tool with the inception of the standards movement. In 1994, Daniel Koretz, a RAND
Corporation researcher, released a report on portfolio assessment in Vermont that seemed
to mark the beginning of the decline of portfolio use in grading. Mathews quotes the
Koretz report as stating that teachers complained that portfolios were cutting into
valuable teaching time. "Math teachers," Koretz said, "frequently noted that portfolio
activities take time away from basic skills and computation, which still need attention"
(Mathews, 2004, p. 13). Mathews noted that at about the same time as the Koretz report,
British Prime Minister John Major discarded the portfolio system that had been used for
20 years as the exit English examination in Great Britain. Mathews quotes Dylan
Wiliam (a British assessment expert who now works for ETS) as saying "…timed
written examinations were the fairest way to assess achievement at the end of compulsory
schooling” (Mathews, 2004, ¶14).
Mathews summarizes his findings by saying that the argument between advocates
of standardized tests and advocates of portfolios “usually ends with each side saying they
cannot trust the results produced by the other” (Mathews, 2004, ¶19). He quotes Lisa
Graham Keegan, chief executive officer of the Washington-based Education Leaders
Council, as saying “A collection of student work can be incredibly valuable, but it cannot
replace an objective and systematic diagnostic program. Hopefully, we will come to a
place where we incorporate both” (Mathews, 2004, ¶23). As digital portfolios increase in
popularity, the same issues of use and reliability are again being raised. These issues, in
addition to the confusion of terms, the plethora of types and categories, and the variety of
uses, only add to the bewilderment concerning digital portfolios.
Research on electronic portfolio use can be somewhat confusing, given the many
definitions and distinctions among terms (digital, electronic, and web-based portfolios,
and webfolios), and the variety of classifications of electronic portfolios (institutional,
program, faculty, student, advising). In addition, a large body of research on digital
portfolios focuses on the technology used to create them, on strategies for
implementation, and on the benefits of using them.
Wiedmer (1998) describes digital portfolios as an outgrowth of the Exhibitions
Project, an effort of the Coalition of Essential Schools that looked at how schools were
beginning to use authentic assessments in the early 1990's. Original distinctions among
electronic, digital, and web-based portfolios and webfolios have become somewhat
blurred. For example, Wiedmer (1998) defined an electronic portfolio as "a purposeful
collection of work, captured by electronic means, that serves as an exhibit of individual
efforts, progress, and achievements in one or more areas" (p. 586).
Barrett (2001) makes a distinction between digital and electronic portfolios: “an
Electronic Portfolio contains artifacts that may be in analog form, such as a video tape, or
may be in computer-readable form; in a Digital Portfolio, all artifacts have been
transformed into computer-readable form” (Barrett, 2001, Section 3).
Batson (2002) notes that the term ePortfolio, or "electronic portfolio," is used
to describe "…collections of student work at a Web site" (section 1). Batson goes on to
describe webfolios as a term used within the field of composition studies for "…static Web
sites where functionality derives from HTML links. E-Portfolios therefore now refer to
database-driven, dynamic Web sites" (Batson, 2002, section 1). However, Love, McKean,
and Gathercoal's (2004) definition of a webfolio is
“…a tightly integrated collection of Web-based multimedia documents that [could
include] curricular standards, course assignments, student artifacts in response to
assignments, and reviewer feedback of students’ work” (p. 26). A number of authors,
including Siemens (2004), Yancey (Cambridge and Yancey, 2001), Lorenzo and Ittelson
(2005), and Jafari (2004), refer to digital or electronic portfolios as one and the same. For
the purpose of this study, the terms digital portfolio, electronic portfolio, e-portfolio,
webfolio, and web portfolio are treated as synonymous, and the term digital portfolio is
used throughout.
In this study a digital portfolio is defined as a computer-based portfolio in which all
learning artifacts have been converted to a computer-readable (electronic) format and are
accessible through the World Wide Web.
Research on digital portfolios in higher education remains somewhat limited. A
large body of the research seems to focus on what electronic portfolios are, how they are
categorized, what they contain, how they are implemented, what types of commercial
software are available, and what types of digital portfolio programs are being
implemented at various institutions. The available research does not seem to focus on
student and faculty use and perceptions of digital portfolios. Most research on digital
portfolios in higher education centers on use by pre-service education majors,
institutional use for accreditation, or use of digital portfolios in enhancing technology
skills. The majority of this research describes the process education departments are
using to move their non-electronic portfolios to electronic versions. Gathercoal, Bryde,
Mahler, Love, and McKean (2002) described their findings on implementing web-based
digital portfolios at two institutions. They found that literature available on digital
portfolios “…had more to do with students coming to terms with technology than with
faculty using electronic portfolios to enhance teaching and learning” (p. 30). Perhaps due
to the relative newness of digital portfolios, there seems to be limited research on how
students and faculty are actually using digital or web portfolios and how they perceive the
usefulness and benefits of these electronic tools.
This study addressed the question of the use of the Diagnostic Digital Portfolio
(DDP) at Alverno College by describing and evaluating student and faculty use and
perceptions. The DDP was created in 1999 to enable Alverno students to follow their
learning progress throughout their years of study. Therefore, it includes materials to help
students analyze their patterns of learning, including learning prompts, criteria, self
assessment, feedback, and sometimes the learning products. It was designed to help
students process the feedback they receive from faculty, external assessors, and peers, in
relation to their own self assessments, to enable them to look for learning patterns and
take control of their own academic development. Another purpose of the DDP is to assist
in making Alverno’s educational process more transparent to students and others who
seek to understand the institution’s educational philosophy. It also provides actual,
accessible performance data with which graduates can create an electronic resume for
potential employers or for graduate schools. The DDP mirrors Alverno’s educational
philosophy of Ability-Based Learning and Student Assessment-as-Learning developed in
the early 1970’s.
Conceptual Context of the DDP
The DDP was designed and built on the Student Assessment-as-Learning
philosophy developed by Alverno College. To understand the focus of this study, it is
necessary to be familiar with the learning and assessment philosophy of Alverno College,
the development of the DDP, initial research on the DDP, and version 2.0 of the DDP, the
tool addressed in this study.
Alverno College Learning and Assessment Philosophy
Alverno College is a women’s liberal arts college founded by the School Sisters
of St. Francis in 1946. Located on Milwaukee’s residential south side, Alverno has a
current student enrollment of approximately 2,400 and offers undergraduate degrees in
over 50 programs of study in two time frames, Weekday College and Weekend College.
The college also offers Master of Arts degrees in Education and Nursing.
Alverno’s philosophy of learning and assessment began to be articulated by the
faculty and explicitly related to their practice in the late 1960’s, when serious questions
were being raised about the nature and value of college and liberal arts education in
general. The faculty developed this philosophy as an effort to improve their approach to
liberal arts education by explicitly making the development of student learning its core.
After several years of faculty meetings, President Joel Read asked four questions of
academic departments in the early 1970’s. These questions were:
1. What kind of questions are being asked by professionals in your field that
relate to the validity of your discipline in a total college program?
2. What is your department’s position on these?
3. How are you dealing with the problems in your general education courses, and
in the work for a major in your field?
4. What are you teaching that is so important that students cannot afford to pass
up courses in your department? (Alverno College Faculty, 1994, p. 8)
Each discipline department reported its findings, and the faculty gradually formed
a consensus that the demonstrable value of any learning experience would be its
outcomes for the student. From this idea, along with reflection on the professional experience
of the faculty and on-going review of literature, eight abilities were identified that, taken
together, would provide a framework for a liberal arts education at Alverno College.
These eight abilities were:
1. Communication
2. Analysis
3. Problem Solving
4. Valuing in Decision-Making
5. Social Interaction
6. Global Environment
7. Contemporary Events
8. Aesthetic Responsiveness (Alverno College Faculty, 1994, p. 8).
These eight abilities formed the basis for Alverno’s Ability-Based and Student
Assessment-as-Learning philosophy. Student Assessment-as-Learning is defined as: “A
multidimensional process, integral to learning, that involves observing performances of
an individual learner in action and judging them on the basis of public developmental
criteria, with resulting feedback to the learner” (Alverno College Faculty, 1994, p. 4).
The term assessment was chosen in contrast to testing because of its etymology, “sitting down
beside.” In the seventeenth century, an assessor was “one who sits down beside” or “who
shares another’s position.”
Alverno College’s assessment philosophy was influenced, in part, by the
Assessment Center Method. Loacker (1985) describes a history of assessment in business
and government that is essentially the history of the Assessment Center Method, which
focused on improved selection and screening rather than on development and learning.
When the Assessment Center Method started in England and Germany in the 1930’s,
assessment provided a new, behaviorally oriented means of selecting military officers.
The United States Office of Strategic Services used this Assessment Center Method to
select American intelligence agents. Harvard Psychological Clinic researchers, in the
1940’s, adapted and furthered the development of assessment. Led by AT&T in the
1950’s, non-military government departments and businesses added to the extensive
growth of assessment centers by using them to select managers. Loacker describes the
Assessment Center Method as one that, “…involves behavioral descriptors to develop a
rich picture of an individual’s ability, uses multiple techniques for judging the
performance and refines assessor judgment through articulation of more explicit
evidence” (Loacker, 1985, p. 48). The principles and strategies of the Assessment Center
Method were relevant to the approach to assessment developed by Alverno College.
Alverno’s assessment philosophy is founded on four basic assumptions about
learning:
1. Education goes beyond knowing to being able to do what one knows.
2. Educators are responsible for making learning more available to the learner by
articulating outcomes and making them public.
3. Abilities must be carefully identified in relation to what contemporary life
requires.
4. Assessment is integral to learning (Alverno College Faculty, 1994, p. 4).
The assessment process at Alverno mirrors these educational assumptions. In
order for students to develop abilities, to learn to do what they know, learning is viewed
as a process that continuously makes connections among all parts. The process needs to
be integrative/experiential (assessment must judge performance), characterized by self
awareness (must include self assessment, expected outcomes, and developmental criteria
that are public), active and interactive (must include feedback and elements of externality
as well as performance), developmental (assessment must be cumulative and expansive),
and transferable (assessment must be multiple in mode and context) (Alverno College,
1994, pp. 18–19).
abilities with disciplinary content. Assessment of a student’s development of the eight
abilities occurs within general education and major/minor courses, with the discipline
content of the course forming the basis for assessment.
Student Assessment-as-Learning is a dynamic system. The College’s eight
abilities are refined on the basis of current knowledge and experiences. Criteria are
continually developed on the basis of a growing understanding of the abilities within the
context of disciplines. For example, the eight abilities currently are: Communication
(including reading, writing, listening, speaking, quantitative literacy, and computers),
Analysis, Problem Solving, Valuing in Decision Making, Social Interaction, Developing
a Global Perspective, Effective Citizenship, and Aesthetic Engagement. Each ability is
defined by six developmental levels, originally identified by examining the existing
curriculum in each of the disciplines at Alverno College. They are continuously reviewed
and refined by each ability department (made up of faculty from across the college). For
example, the first four levels of analysis include:
Level 1 — Show observational skills
Level 2 — Draw reasonable inferences from observations
Level 3 — Perceive and make relationships
Level 4 — Analyze structure and organization (Alverno College, Ability-Based
Learning Program, p. 2).
Levels five and six of the eight abilities are generically articulated. For example, a
generic description of levels five and six of analysis includes:
Level 5 — Establish ability to employ frameworks from the major or support area
(minor) discipline in order to analyze
Level 6 — Master ability to employ independently the frameworks from the
major or support area (minor) discipline in order to analyze (Alverno College,
Ability-Based Learning Program, p. 2).
Within the context of a specific discipline, levels five and six are conceptualized in the
form of advanced outcomes. For example, in English, analysis is integrated with literary
content in two of the program’s six advanced outcomes:
1. Reads and interprets diverse cultural expressions in works of literature, film
and other media.
2. Communicates an understanding of literary criticism, questions its
assumptions, and uses its frameworks to analyze and evaluate works
(Advanced Outcomes in the Major – English, 2002, p. 1).
Successful demonstration, in multiple contexts, of levels one to four of each of the eight
abilities is required for graduation. Besides successful completion of levels one to four,
students must demonstrate the advanced outcomes selected by their major and support
(minor) programs which are clearly integrated into their respective disciplines.
As a student proceeds through the curriculum, a wealth of data and information in
the form of student work, assessment documents, self assessments, and faculty/assessor
feedback is generated. Keeping track of these data and making them more accessible to
students and faculty was an important consideration when Alverno’s Diagnostic Digital
Portfolio was designed. Figure 1 graphically depicts Alverno’s Ability-Based and Student
Assessment-as-Learning philosophy and how it is integrated with the DDP. This
researcher created Figure 1 from the original DDP design team notes. It depicts
Alverno’s educational philosophy; its major components of criteria, self assessment, and
feedback; and the connection of these components to the college-wide eight abilities and
advanced outcomes of the majors and supports (minors). The lower half of Figure 1
illustrates how the DDP connects to this philosophy with its use of key performances and
how key performances are organized into matrices based on the eight abilities,
major/minor advanced outcomes, Wisconsin State Teaching Standards, and Wisconsin
Content Guidelines. Figure 1 also includes examples of resources a student could enter
into her DDP.
Development of the Diagnostic Digital Portfolio
In 1994 the college started analyzing the location of student work, assessment
documents, self assessments, and instructor feedback. The Academic Vice President,
Kathleen O’Brien, identified 14 different locations where student learning artifacts were
stored. All of these learning artifacts were available to both faculty and students.
However, each learning artifact could only be viewed at the location where it was stored,
making accessibility an issue. Using technology to make these data more accessible became one
of the foci of a Title III grant, which was awarded to Alverno College in late 1998.
Figure 1. Ability-Based Learning/Student Assessment-as-Learning and its connection to key performances in the DDP
(Pictorial representation from original design team notes)
A college-wide design team was formed to develop an electronic method of
keeping track of the volume of student learning artifacts that demonstrated the Student
Assessment-as-Learning philosophy. Working with an outside consulting firm, the DDP
design team began the conceptual design of the DDP in March, 1999. Prototypes were
created, examined, tested, and refined. Because the DDP mirrors the developmental
nature of Alverno’s curriculum and abilities, the implementation of the DDP started with
all entering undergraduate students (students new to Alverno College) in October, 1999.
The focus of the DDP is student learning. The DDP was designed to assist
students in analyzing their patterns of learning and development, as well as to enhance
teaching, learning, and assessment. The design goal of the DDP was to provide an easily
accessible method of demonstrating and documenting the students’ development of the
Alverno abilities integrated into their general education and their major and support
(minor) program outcomes.
The DDP assists the student in accessing the feedback she receives from faculty,
external assessors, and peers, as well as enabling her to look for patterns in her academic
work so she can take more control of her own development, becoming a more
autonomous learner. The DDP also provides actual, accessible performance data with
which graduates can create an electronic resume for potential employers or for graduate
schools. The DDP is not, however, the student’s official record. It is a collection of
snapshots of a student’s performances across time.
The original goal of the DDP was to assist in making Alverno’s philosophy of
Ability-Based Learning and Student Assessment-as-Learning more visible to the students
and faculty by providing easy access to numerous learning artifacts already collected.
Therefore, the organizing feature of the DDP is Alverno’s Ability Matrix. This matrix
lists the eight college outcomes and divides these outcomes into four levels of
development.
The operational core of the DDP is key performances. A key performance can be
an assignment, in-class assessment, project, internship, outside-of-class assessment or
any student performance that demonstrates her learning. Key performances are selective
and do not include all work a student might complete during her college career. A key
performance consists of a name, a title, a brief description of the learning experience,
criteria, and a self assessment template. Additional documents can be attached to the
description and criteria to more fully describe the key performance.
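To make this structure concrete, the following sketch in Python (illustrative only; the field names are assumptions and do not represent the DDP's actual implementation, which was written in ASP and later PHP) models a key performance record:

from dataclasses import dataclass, field
from typing import List

@dataclass
class KeyPerformance:
    # Hypothetical representation of a key performance record;
    # field names are illustrative, not the DDP's actual schema.
    name: str                      # e.g., the course and assessment name ("CS 270 final project")
    title: str                     # short title of the learning experience
    description: str               # brief description of the learning experience
    criteria: List[str]            # criteria the performance is judged against
    self_assessment_template: str  # prompt the student responds to
    attachments: List[str] = field(default_factory=list)  # additional documents (e.g., a criteria file)

# Example of constructing a key performance (fictitious content)
cs270 = KeyPerformance(
    name="CS 270 final project",
    title="Final project",
    description="Design and implement a small database application.",
    criteria=["Analyzes the problem", "Applies course frameworks"],
    self_assessment_template="Describe how your work meets each criterion.",
    attachments=["CS270_criteria.doc"],
)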
Figure 2 is a screen shot of a key performance (CS 270 final project) from the
Demonstration (Demo) DDP. The Demo is an instance of the DDP used in presentations
that contains actual student work and feedback, with permission of the students.
However, the names have been changed to indicate fictitious students. Figure 2 illustrates
a typical key performance that contains a title, description, criteria, and self
assessment/feedback template. The key performance pictured in Figure 2 also contains
additional information on the criteria for the key performance in an attached Word
document.
Figure 2. Screen shot from the Demonstration DDP (Obtained 3/1/05)
Key performances are created by the faculty or assessors, and can be connected to
one or more of the eight abilities and/or four levels. Figure 3 was created by this
researcher to visually depict the process of creating a key performance, as well as the
student and faculty process for completing a key performance. Essentially, the
completion process involves students uploading their self assessment and a
faculty/staff/assessor uploading their feedback. Figure 3 includes types of key
performances and examples of additional files a student could upload to her DDP. Figure
3 connects and expands Figure 2, providing more information on the process of how a
key performance is created, how a key performance is completed, and what is necessary
to have the key performance appear on students’ matrices.
Figure 3. Creation and completion of a key performance in the DDP
Once the key performance is completed, it appears on a matrix in the student’s My
Portfolio tab in the DDP. A student can have numerous matrices (e.g., Ability, advanced
outcomes of majors/supports, Wisconsin State Standards). An example of the Ability
Matrix in a student’s DDP is shown in Figure 4. It represents all key performances that a
fictitious student, Jane Alverno, has completed thus far in her DDP. A key performance
can be connected to multiple abilities and levels, as indicated in Figure 4 with EN 240
(English 240). This key performance is connected to the abilities it demonstrates:
Communication level 4, Analysis levels 3 and 4, and Aesthetic Engagement level 4. To
view the actual key performance (description, criteria, self assessment, feedback, and
additional files) the student would click on the underlined key performance name (in this
case EN 240).
Figure 4. Screen shot from the Demonstration DDP for example student Jane Alverno
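The many-to-many connection between completed key performances and the abilities and levels, as displayed on the Ability Matrix, can be sketched as follows (hypothetical data structures created for illustration, not the DDP's code); the EN 240 example mirrors the connections described above:

from collections import defaultdict

# Hypothetical completed key performances with their ability/level connections.
# EN 240 mirrors the Figure 4 example: Communication 4, Analysis 3 and 4,
# Aesthetic Engagement 4.
completed = [
    ("EN 240", [("Communication", 4), ("Analysis", 3), ("Analysis", 4), ("Aesthetic Engagement", 4)]),
    ("CS 270 final project", [("Problem Solving", 3)]),
]

# Build a matrix view: (ability, level) -> list of key performance names.
matrix = defaultdict(list)
for kp_name, connections in completed:
    for ability, level in connections:
        matrix[(ability, level)].append(kp_name)

# A key performance connected to multiple abilities and levels appears in several cells.
for (ability, level), kps in sorted(matrix.items()):
    print(f"{ability}, level {level}: {', '.join(kps)}")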
In addition to the Ability Matrix, the DDP includes matrices of advanced
outcomes for each major and support (minor) offered at the college. Each program at
Alverno has a set of advanced outcomes that students must meet in order to graduate.
These advanced outcomes represent the advanced levels of the eight abilities integrated
into the discipline.
The DDP went through several minor refinements during its first four years of use;
however, the core of the DDP has remained essentially the same, allowing faculty to
create key performances (assignments, in-class assessments, experiential learning
examples, projects, outside-of-class assessments) that contain a description, specific
criteria, student self assessments, and instructor (assessor) feedback in the form of text
documents, video clips, or audio files. As use of the DDP progressed, students and
faculty informally suggested several new features. In addition, other institutions began to
inquire about adapting the DDP for use at their institutions. However, due to the specific
programming of the DDP and its vendor-specific platform, these suggestions were
difficult to implement. In March, 2003 a design team was formed to explore creating a
version of the DDP that would be flexible enough to accommodate suggestions made by
students and faculty, and also have the capability to be customized for use at other
institutions.
DDP version 2.0
During the first four years of use, training sessions for students and faculty
provided the opportunity to gather feedback concerning the functionality of the DDP,
including problems, issues, and suggestions for improvements. The DDP Operations and
DDP Policy Committees collected and analyzed this feedback, identifying several main
issues and ideas for improvement. For example, as the first class to use the DDP
approached graduation, it became apparent that these students wanted to take their DDP
with them. Students wanted a method to download information from the DDP which
retained the organizing matrices and connected learning artifacts. Alverno’s Education
Department also wanted the Wisconsin Education Standards and Content Guidelines
(standards required for Wisconsin teacher certification by the Department of Public
Instruction) to be included in the DDP. The Education Department wanted students to
have the ability to connect key performances from other discipline areas to the Wisconsin
Educational Standards and Content Guidelines. None of these improvements were
possible with the programming used to create the original DDP.
Other issues and suggestions for improvements revolved around the day-to-day
use and maintenance of the DDP. For example, in the original version of the DDP
advanced outcomes for majors and minors were listed by a number on the bottom of the
key performance, requiring students and faculty to look up the actual definition of the
advanced outcome. Faculty and students found the file upload process of the original
DDP to be cumbersome. A total of five mouse clicks were required to upload a file.
Video files needed to be identified separately from other documents being uploaded to
the DDP. Faculty and students had no way of removing a file uploaded by mistake. In the
original version of the DDP, the inability of students and faculty to remove files was
intentional, programmed to maintain the developmental nature of the students’ work, self
assessments, and feedback (a performance frozen in time). However, when students or faculty
uploaded an incorrect file, a system administrator was needed to correct the error. This
became a maintenance burden because the limited administrative and maintenance functionality
of the original version required direct access to the DDP’s back end database (SQL 7).
In March 2003, the new DDP design team met to address these suggestions and
create a new version of the DDP based on a non-proprietary system. The DDP design
team (faculty, staff, and members of the DDP Operating Committee) reported to both the
DDP Policy Committee and the Council for Student Assessment concerning the
recommendations. The design team focused creating a new version of the DDP that
enabled students and faculty to easily view the text of the matrices and provide an
administrative maintenance system. Additional matrices were added to version 2.0 of the
DDP including: Advanced Outcomes for each major and support (minor) offered at the
college, Wisconsin Educational Standards, and Department of Public Instruction Content
Guidelines.
The file upload process of the DDP was redesigned to reduce the number of
mouse clicks from five to two, and the separate identification of video files was no longer
required. Version 2.0 was programmed to allow faculty and students to remove files within a
24-hour timeframe, so that upload errors could be corrected immediately (students and
faculty are trained to check all files after uploading them). A web-based interface was
created for administrative and
maintenance purposes. Due to the complexity of converting data from the original
version of the DDP, and the relatively short time line of implementation (version 2.0 was
implemented eight months later in January 2004), the enhancement providing students
with the ability to download a copy of their DDP was postponed and subsequently
implemented in August 2005.
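As an illustration of the 24-hour removal rule described above, a minimal sketch in Python (the actual version 2.0 was written in PHP; this function and its names are assumptions made for illustration) might check whether a file can still be removed:

from datetime import datetime, timedelta

REMOVAL_WINDOW = timedelta(hours=24)  # files may be removed within 24 hours of upload

def can_remove(uploaded_at: datetime, now: datetime) -> bool:
    """Return True if a student or faculty member may still remove the file."""
    return now - uploaded_at <= REMOVAL_WINDOW

# Example: a file uploaded 3 hours ago can still be removed; one uploaded 2 days ago cannot.
now = datetime(2005, 3, 1, 12, 0)
print(can_remove(datetime(2005, 3, 1, 9, 0), now))    # True
print(can_remove(datetime(2005, 2, 27, 12, 0), now))  # False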
In addition to the data collected by the DDP Operation and DDP Policy
Committees concerning issues, problems, and improvements, data were also gathered on
DDP use. The Educational Research and Evaluation Office (ERE) gathered and analyzed
data from the DDP’s database tables, as well as student interviews and informal
interviews of faculty, concerning their use of the DDP. These interviews were conducted
in the early years following the DDP’s implementation (2000 and 2001) and were
primarily focused on providing evaluation data for the Title III grant.
Initial Research on the DDP
The Education Research and Evaluation (ERE) department has conducted
research on the DDP since 2000. Most of the research on the DDP focused on analyzing
student log-ons and the characteristics of key performances. Log-on studies, started in
2001, document a constant increase in student use of the DDP. Student interviews and
faculty surveys administered in spring, 2001 and 2002 provided some insight into DDP
use. These research studies were conducted primarily to document goals and evaluations
for the Title III and other grant reports.
In spring, 2004, a preliminary study was conducted by this researcher, related to
her doctoral studies, focusing on describing the frequency of use, student characteristics,
and characteristics of key performances on the DDP. The preliminary study analyzed the
DDP database entries from the fall 2003 semester, including student log-ons, information
on students’ major and type of program (Weekend College or Weekday College), and
completed key performances. The preliminary study also included an analysis of all active key
performances and their connection to the eight abilities and advanced outcomes of the
majors.
While this preliminary study provided some useful research into student use and
key performance characteristics, it also raised a number of questions. This preliminary
study did not research how students used the DDP once they logged on. Data from the
preliminary study indicated that students logged onto the DDP an average of 8.2 times a
semester, but they were not necessarily completing key performances. This raised the
question of what students were doing when they logged onto the DDP and what features
of the DDP they were using or not using. The preliminary study did not research faculty
use of the DDP, nor did it investigate student and faculty perceptions of the usefulness of
the DDP.
Purpose of Study
Numerous questions were raised by the initial research done by ERE and by the
preliminary study completed by this researcher. Addressing these questions, as well as
gathering research on version 2.0 of the DDP, formed the main focus of this study: How
is the DDP being used by undergraduate students and faculty at Alverno College? This
study examined undergraduate student and faculty use and perceptions of the DDP,
focusing on several sub-questions. These sub-questions include:
1. How often do students and faculty log onto the DDP?
2. What do students and faculty do when they log onto the DDP?
3. What features of the DDP are perceived by students and faculty as useful or
not useful?
4. What are student and faculty perceptions of the overall usefulness of the
DDP?
5. What do students and faculty think about the ease of use of the DDP?
6. What are student and faculty perceptions concerning their frequency of use
of the DDP?
7. What suggestions do students and faculty have on: (a) improvement of the
usefulness of the DDP, (b) assistance in using the DDP more, (c) general ideas
for improvement of the DDP, and (d) additional comments on the DDP?
Besides focusing on student and faculty use and perceptions of the DDP, this
study analyzed active key performances (available for student use) during spring, 2005.
The analysis of active key performances focused on the following sub-questions:
1. How many active key performances are being used by students?
2. What discipline departments have completed key performances?
3. How are completed key performances connected to the abilities?
4. How are completed key performances connected to other matrices?
Significance of Study
Understanding student and faculty use and perceptions of the DDP, as well as the
characteristics of completed key performances, can provide the college with valuable data
to use in evaluating the DDP. Research gathered in this study can be compared to the
initial research of the Educational Research and Evaluation Department (ERE) and the
preliminary study done by this researcher in 2004. This research study was designed to
build on the previous DDP research and add new dimensions: investigating student and
faculty use and perceptions of the DDP, analyzing completed key performances, and
analyzing connections between matrices and completed key performances.
The data from this study will be utilized by the institution to evaluate the use of
the DDP. Findings from this study will assist the institution in its on-going research to
study the DDP as a learning tool. The data gathered in this study will also be used in
evaluating institutional goals on DDP use and in determining how the DDP can be more
effective in providing students, faculty, and administration with information on student
learning and development in achieving academic goals for graduation. It will also be
used by the DDP Operation and DDP Policy Committees to create viable plans for
faculty and student training and future enhancements of the DDP.
The majority of current research on digital portfolios focuses on describing the types
of portfolios, implementation plans, benefits, and drawbacks. Limited research is
available on student and faculty use and perceptions of digital portfolios. This study can
expand this body of knowledge and serve as a basis for other digital portfolio
evaluations.
Information gathered in this research study will be shared with the National Coalition
for Electronic Portfolio Research, an initiative begun by the American Association for
Higher Education (before it disbanded) and the Pearce Center at Clemson University,
which involves Alverno College and nine other institutions (Northern Illinois University,
Bowling Green University, IUPUI, Stanford University, Portland State University,
LaGuardia Community College, Virginia Tech University, Mississippi State University,
and University of Washington). These institutions were chosen to inaugurate this
collaborative research effort concerning electronic portfolio learning. Findings from this
research study will also be shared with Alverno’s partner institution in the Electronic
Portfolio Action Connection (EPAC).
Approach to the Study
In order to describe and evaluate student and faculty use and perceptions of the
DDP, a program evaluation methodology was used that incorporated a three-prong
approach: (a) data mining of the DDP relational database, (b) student and faculty surveys,
and (c) follow-up interviews of students and faculty. The program evaluation
methodology used in this research is aligned with Owen’s (1999) program classification.
Owen’s definition of a program category follows Smith’s (1989) definition of “[a] set of
planned activities directed toward bringing about specified change(s) in an identified and
identifiable audience” (Owen, 1999, p. 24).
Owen classifies program evaluation into five categories or forms, including:
Proactive (a form that takes place before a program is designed), Clarification (a form
that concentrates on clarifying the internal structure and function of a program),
Interactive (a form that provides information about delivery or implementation of a
program), Monitoring (a form used when a program is well established and ongoing), and
Impact (a form used to assess the impact of a settled program) (Owen, 1999, p. 40). Of
these five categories, this research study used the Interactive form of evaluation which is
usually concerned with providing information on delivery or selected activities,
documenting improvements of an innovation, understanding more fully how and why a
program operates in a given way, and providing information for improving the program.
The Interactive form of program evaluation supports programs which are constantly
evolving and changing (Owen, 1999, p. 44).
Quantitative data were gathered from a variety of the DDP’s relational database
tables. These data included student and faculty log-ons, completed key performances,
student program information, faculty file uploads, created key performances, and
characteristics of key performances. Quantitative data were also gathered from surveys
administered to both students and faculty. Qualitative data were collected from portions
of the surveys and from follow-up interviews of both students and faculty.
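To illustrate the kind of data mining performed on the DDP's relational database tables, the following sketch uses a hypothetical, simplified log-on table (the table and column names are assumptions, not the DDP's actual MySQL schema) to count log-ons per student for the spring 2005 semester:

import sqlite3

# Hypothetical, simplified stand-in for one of the DDP's relational database tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logons (student_id TEXT, logon_date TEXT)")
conn.executemany(
    "INSERT INTO logons VALUES (?, ?)",
    [("s001", "2005-02-01"), ("s001", "2005-03-15"), ("s002", "2005-02-20")],
)

# Count log-ons per student during the spring 2005 semester.
rows = conn.execute(
    """
    SELECT student_id, COUNT(*) AS logon_count
    FROM logons
    WHERE logon_date BETWEEN '2005-01-01' AND '2005-05-31'
    GROUP BY student_id
    ORDER BY logon_count DESC
    """
).fetchall()

for student_id, count in rows:
    print(student_id, count)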
Students were surveyed in three groups: beginning (semesters one and two),
intermediate (semesters four and five), and advanced (semesters seven and eight) during
the spring, 2005 semester. A total of 172 beginning and 91 intermediate students were
surveyed in required general education courses and outside-of-class assessments. In
addition, 61 advanced students were surveyed in a number of advanced level courses. A
total of 93 faculty were surveyed in May, 2005 during an all college Institute, which all
full time faculty are required to attend. Both the student and faculty surveys included
demographic questions, Likert scale questions, and open-ended questions, producing
both quantitative and qualitative descriptive data. The survey questions focused on
student and faculty perceptions of their use of the DDP, their frequency of use of DDP
features, their perceptions of the usefulness of DDP features, their perceptions of the ease
of use of the DDP, and their suggestions for improving the DDP. Additional qualitative
data on student and faculty use and perceptions of the DDP were gathered using follow-up scripted interviews.
As an additional program evaluation tool, the DDP was compared to Love,
McKean, and Gathercoal’s (2004) five levels of maturation for electronic portfolios.
These levels include: Level 1 Scrapbook; Level 2 Curriculum Vitae; Level 3 Curriculum
Collaboration Between Student and Faculty; Level 4 Mentoring Leading to Mastery; and
Level 5 Authentic Evidence as Authoritative Evidence for Assessment, Evaluation, and
Reporting.
Limitations/Delimitations
Due to the specific nature of the DDP and its integration into Alverno College’s
teaching, learning, and assessment philosophy, the results of this study are not
generalizable to other digital portfolios. The results of this research study, however,
provide insights on the use and perceptions of digital portfolios that will be of interest for
other institutions. When comparing data gathered in this research study to data gathered
in the preliminary research study completed in 2004, one must remember that this study
applies to version 2.0 of the DDP. While data on log-ons, completed key performances,
and student information gathered in this study can be compared to the preliminary study,
no such comparison can be made to faculty use due to the limitations of the 2004
preliminary study. Other limitations of this study are its specificity to Alverno College,
its one-year time frame of 2005 (five years after the DDP was first implemented), and the
fact that student and faculty perceptions are limited to participants who completed a survey
and self-selected to participate in the follow-up interviews.
Research on digital portfolios seems to focus on what digital portfolios are, how
they are categorized, what they contain, and how they are implemented/used at various
institutions. A large body of research is primarily centered on digital portfolio use by
pre-service education majors, accreditation use, or enhancement of student technology
skills. According to Gathercoal, Bryde, Mahler, Love, and McKean (2002) “…portfolios
are traditionally something that is done ‘to’ students. Rarely is a portfolio something that
is done ‘with and for’ students” (p. 3). The findings of this study could be used to
enhance research on specific digital portfolio use in higher education.
Vocabulary of the Study
The terminology specific to this study is defined as the terms are introduced.
Fundamental terms used in this study include:
DDP – Alverno College’s Diagnostic Digital Portfolio, a web-based tool focusing on
student learning and enabling students to analyze their patterns of learning and
development. It is not the student’s official college record, but is a snapshot of a
student’s performances across time.
Digital Portfolio, e-portfolio, electronic portfolio, webfolio – terms used to describe an
electronic portfolio. Some authors distinguish between these terms. However, in
this study the term “digital portfolio” is used and can be considered a synonym for
e-portfolio, electronic portfolio, and webfolio.
Digital Portfolio – a computer-based portfolio in which all learning artifacts are
converted to computer readable format and are accessible through the World
Wide Web.
Key Performance – the operational core of the DDP. A key performance can be an
assignment, in-class assessment, project, internship, outside-of-class assessment
or any performance of a student that demonstrates her learning. Key
performances are selective and do not include all work a student might complete
during her college career.
Student Assessment-as-Learning – Alverno’s term for a multi-dimensional process, integral
to student learning, that involves observing performances of an individual student’s
learning in action and judging these performances on the basis of public developmental
criteria, with resulting feedback to the learner.
Weekday College (WDC) – Alverno College program that mirrors a traditional college
program in which students attend classes Monday through Friday. More than 60
different programs are offered during the weekday college timeframe.
Weekend College – Alverno College accelerated program offered during the weekend
timeframe. Eight different majors are offered during this timeframe.
Summary and Forecast
While the use of digital portfolios in higher education has significantly increased
in the last two years, research on digital portfolio use is limited. Like its paper-based
counterpart, most of the research seems to focus on the nature of digital portfolios,
including general descriptions of digital portfolios, digital portfolio terminology,
categories of digital portfolios, tools for implementation, and implementation programs.
There is limited empirical research on actual student and faculty use and perceptions of
digital portfolios, which is the focus of this study.
In preparation for this study, research was gathered on paper-based portfolios, the
predecessor of digital portfolios, general information on digital portfolios including their
history and types, student focused digital portfolios, and research on student focused
digital portfolios. In addition, initial research on the DDP was gathered, along with an
exploration of the variety of forms of program evaluation, the methodology used in this
study.
CHAPTER TWO: LITERATURE REVIEW
Organization of Review
Although digital portfolios are a relatively new phenomenon in higher education,
they have their roots in print and paper-based portfolios. A portfolio, defined in Webster
as “a selection of representative work,” has been used for years by artists and graphic
designers to seek additional work, demonstrate their art ability, and/or showcase their
work. Portfolios also have a history of use in business and education. Financial portfolios
have been used in business and contain a record of investments and financial holdings
that represent an individual’s monetary worth (Barrett, 2005, p. 2). Portfolio use in
education dates back to the 1960’s and 70’s, although portfolios seemed to fall out of
favor in the early 1990’s (Matthews, 2004). By the mid 1990’s, portfolio use was
resurfacing in the form of digital portfolios.
Early articles describe digital portfolios as a way to use technology to create more
portable and searchable portfolios – to “digitize” paper-based portfolios (Barrett, 1994).
Whether digital or paper based, substantive research on digital portfolios seems
somewhat limited. Herman and Winters (1994) state:
Evidence about the impact of portfolio assessment on curriculum and
instruction is weak, but provocative. Most educators believe that the use
of portfolios encourages productive changes in curriculum, instruction,
and student learning. Although this evidence is based solely on self-report
data (with their well-known limitations), teachers and principals seem to
think that portfolio assessment has encouraged them to rethink and to
change their curriculum and instructional practices (pp. 54-55).
Like its paper-based counterpart, most of the research on digital portfolios seems
to focus on the nature of digital portfolios, categories of digital portfolios, and current use
of digital portfolios in education. Together with a brief history of general portfolio use
in education, these areas are integrated into the main themes of this literature review.
The first theme of this literature review contains a brief history of portfolios in
education to set the stage for the development of digital portfolios. The second literature
review theme centers on digital portfolios in general; the types of digital portfolios, tools
for construction and/or implementation, and their benefits and uses. This theme sets the
context for the numerous types of digital portfolios and describes basic digital portfolio
terminology.
The third theme of this literature review focuses on a description of one type of
digital portfolio, student digital portfolios, and includes a description of several types of
student digital portfolios, their implementation, and their uses in higher education. This
theme includes specific research on student digital portfolios, most of which centers on
teacher education programs. The fourth theme flows from student digital portfolios and
describes the initial research on Alverno’s Diagnostic Digital Portfolio (DDP) completed
by Alverno College’s Educational Research and Evaluation Department (ERE).
The fifth theme of this literature review is self assessment and reflection and their
application to digital portfolios. Self assessment is a critical component of Alverno’s
Assessment-as-Learning philosophy, as well as a major topic of interest in research in
higher education, especially as it relates to portfolio use.
The final theme of this literature review focuses on program evaluation, its
various types, and applications to this study. Included in this area is a discussion of
Love, McKean, and Gathercoal’s (2004) five levels of maturation of digital portfolios.
It should be noted that research on the numerous other types of digital portfolios
is not included in this literature review, other than to set the context for student digital
portfolios. In addition, descriptions of the plethora of digital portfolio programs currently
implemented, or being implemented, in higher education, are not present in this literature
review, other than to assist in orienting the reader to the specific context of Alverno’s
digital portfolio.
As the themes in this literature review indicate, there is limited substantive
research on student and faculty use and perceptions of digital portfolios. This study
attempts to provide some substantive research using Alverno’s DDP as an example of a
student learning digital portfolio.
History of Portfolios in Education
This literature review theme is designed to provide a broad theoretical context for
portfolio use in education. Early adopters of digital portfolios viewed them as a
technological advancement that provided a container allowing students to collect and
organize portfolio artifacts in many media types (audio, video, graphics, and text) for
portfolios they were already creating (Barrett, 2005, p. 5).
Portfolio use in education has its roots in the progressive education movement
started a century ago, and it is linked to the notion of authentic assessment
(Matthews, 2004). The Arts Propel project, which used portfolios, began in the 1980’s.
Drew Gitomer, Howard Gardner, and Dennie Palmer Wolf worked on this project, which
explored portfolio use in writing, music, and the arts. This project was focused on
learning, not on testing for accountability.
The states of Vermont and Kentucky took a somewhat different route and began
to investigate the possibility of using portfolio assessments instead of standardized tests
to judge educational achievement. Vermont began a voluntary state-wide portfolio
assessment. However, in the early 1990’s a RAND Corporation report on portfolio
assessment in Vermont, according to Matthews, seemed to mark the decline of portfolio
use in grading, stating that portfolio use seemed to cut into valuable teaching time (2004,
p. 13).
Research on educational portfolio use describes a large variety of uses, objectives,
and purposes. There seem to be three purposes of portfolio use that are also apparent in
the literature on digital portfolios. Purposes for digital portfolios include: (a) a showcase
of student work, (b) a demonstration of learning, and (c) a tool for evaluation
(assessment) of learning. For example, art students are asked to assemble a portfolio of
their best work (showcase). Education majors are asked to assemble a portfolio that
demonstrates their teaching skills (demonstration of learning). In the late 1980’s the use
of the term portfolio assessment emerged in education, primarily in college writing
courses, to address the need for accountability (tool for evaluation) (Barrett, 2005, p. 2).
According to Herman and Winters (1994), portfolios were “…heralded as
vehicles that provide a more equitable and sensitive portrait of what students know and
are able to do” (p. 48). In their article entitled Portfolio Research: A Slim Collection,
Herman and Winters describe the dearth of empirical research in this area. They stated
that of the “89 entries on portfolio assessment topics found in the literature over the past
10 years, only seven articles reported technical data or employed acceptable research
methods” (p. 48). Their review found that most articles on portfolios explained rationales,
presented ideas and models for how portfolios are constructed or used, or shared ideas on
implementation. Herman and Winters provide examples of the reliability and validity of
three portfolio models, all from the K-12 environment. Their focus was on the reliability
of scores given to the portfolios. They seem to view portfolios as a product created by
students to be evaluated (scored).
A significant number of other authors, Barrett (2005), Lorenzo and Ittelson (2005)
and Wilkerson (2003), also refer to portfolios, other than showcase portfolios, as
products that are evaluated based on some type of criteria or rubric. Barrett (2005) makes
the distinction between portfolios used for assessment, which address the need for
accountability, and portfolios used as a showcase for learning, or to illuminate capabilities
not covered by standardized testing. She goes on to differentiate “portfolios used for
assessment of learning” (purpose of the portfolio prescribed by the institution) and
“portfolios that support assessment for learning” (purpose of the portfolio agreed upon
with the learner) (Barrett, 2005, p. 18). In either case it appears that the portfolio is still a
product to be evaluated.
In contrast, Paulson, Paulson and Meyers (1991) explored the question of what
makes a portfolio a portfolio. Their conclusion indicates that a portfolio is a portfolio
“…when it provides a complex and comprehensive view of student performance in
context. It is a portfolio when the student is a participant in, rather than the object of
assessment. Above all, a portfolio is a portfolio when it provides a forum that encourages
students to develop the abilities to become independent, self-directed learners” (p. 63). It
is this last view of portfolios, encouraging and documenting student learning, rather than
a product to be evaluated that is the focus of Alverno’s DDP.
As the context of portfolio assessment has expanded, technology has begun to be
used to make portfolios more compact and accessible, hence the beginnings of digital
portfolios.
Digital Portfolios
The second theme of this literature review describes digital portfolios in general.
Most research on digital portfolios is similar to that on print-based portfolios, focusing on
history of digital portfolios, the categories of digital portfolios, tools for construction
and/or implementation, and their benefits and uses.
By the mid 1990’s, portfolio use in education experienced an upsurge with the
appearance of digital portfolios, probably due to the ever-expanding use of technology in
education (Matthews, 2004). Wiedmer (1998) describes digital portfolios as growing out
of the Exhibitions Project of the Coalition of Essential Schools that investigated how
schools were beginning to use authentic assessment.
Yancy (1992) describes the beginnings of digital portfolios as coinciding with the
advent of the Web and the increase in technology use. Educational institutions started
looking at how technology could be used to enhance the accessibility and the
organization of print-based portfolios. Early digital portfolio studies describe a how-to
approach for moving print-based portfolios into digital versions or concentrate on
definitions, terminology, and classifications (Siemens, 2004; Barrett, 2001; Galloway,
2001; Lankers, 1998).
In 2002 Batson described an intersection of three trends that made digital
portfolio use so enchanting. These three trends include:
1. Student work is now mostly in electronic form, or is based on a
canonical electronic file even if it’s printed out: papers, reports,
proposals, simulations, solutions, experiments, renditions, graphics, or
just about any other kind of student work.
2. The Web is everywhere: We assume (not always true, of course) that
our students have ready access to the Web. The work is “out there” on
the Internet, and therefore the first step for transferring work to a Web
site has already been taken.
3. Databases are available through Web sites, allowing students to
manage large volumes of their work. The “dynamic” Web site that’s
database-driven, instead of HTML link-driven, has become the norm
for Web developers. (Batson, 2002, ¶ 2)
During the last five years a number of definitions pertaining to digital, electronic,
e-portfolios, web portfolios, and webfolios have been written. Barrett (2002), Batson
(2002), and Wiedmer (1998) all describe these terms with some variations. More recently
digital, electronic, or webfolios are referred to as e-portfolios (Siemens, 2004). Siemens
concedes that definitions of digital portfolios vary, “…but generally include the notion of
a digital resource (personal artifacts, instructor comments), demonstrating growth,
allowing for flexible expressions (i.e. customized folders and site areas to meet the skill
requirements of a particular job), and permitting access to varied interested parties
(parents, potential employers, fellow learners, and instructors)” (Siemens, 2004, Section
2).
According to Barrett (2002) digital portfolios are essentially a new type of
container for portfolios already being used across education and can be developed along
two paths. One type of digital portfolio uses generic tools such as word processors,
HTML editors, multimedia authoring tools, PDF (portable document format), and other
commonly used productivity software tools. The second type of digital portfolio uses a
customized system approach that involves servers, programming, and databases (Gibson
& Barrett, 2003, p. 560).
In a white paper on electronic portfolios and learner engagement (2005), Barrett
states her definition of electronic portfolios: “…uses electronic technologies as a
container, allowing students/teachers to collect and organize artifacts in many media
types (audio, video, graphic, text); and uses hypertext links to organize the material,
connecting evidence to appropriate outcomes, goals or standards” (Barrett, 2005, p. 5).
She presents a table that identifies the portfolio development process from literature and
how adding technology enhances the process.
Table 1
Barrett’s Comparison of Portfolio Development Process
Traditional Portfolio Process includes:
- Collecting
- Selecting
- Reflecting
- Projecting
- Celebrating
Adding Technology allows enhancement through:
- Archiving
- Linking/Thinking
- Storytelling
- Collaborating
- Publishing
(Barrett, 2005, p. 5)
In an EDUCAUSE Learning Initiative white paper Lorenzo and Ittelson define an
e-portfolio as “…a digitized collection of artifacts, including demonstrations, resources,
and accomplishments that represent an individual, group, community organization, or
institution... that can be comprised of text-based, graphic, or multimedia elements
archived on a Web site or on other electronic media such as a CD-ROM, or DVD” (July
2005, p. 2). Lorenzo and Ittelson include another definition of e-portfolios from the
University of British Columbia Office of Learning Technology, “…personalized, Web-based collections of work, responses to work, and reflections that are used to demonstrate
key skills and accomplishments for a variety of contexts and time periods” (July 2005,
p. 2). There is clearly a plethora of definitions for digital portfolios. Whatever the
source, the definitions seem to agree that digital portfolios use computer technology, and
more recently, as in the University of British Columbia Office of Learning Technology,
are web-based.
Alverno’s Diagnostic Digital Portfolio (DDP) follows the definitions listed above
in that the DDP is a collection of digitized learning artifacts, including self-reflection and
feedback, and is web-based and web accessible.
Categories of Digital Portfolios
Besides a variety of definitions, there are also a number of views on categories of
digital portfolios. One set of categories described by Lankers (1998) is based on how the
digital portfolio is used. Her classifications include: Developmental (documenting
student improvement in a specific subject), Proficiency (used to prove mastery in a
particular subject area), Showcase (documents a student’s best work), Teacher Planning
(used to acquire information about an incoming class of students), Employment Skills
(used to evaluate prospective employee’s work readiness skills), and College Admission
(Lankers, 1998, Section 2).
Cambridge (2001) describes another set of broad categories of digital portfolios,
based on primary ownership: student, faculty, or institution. Lorenzo and Ittelson (July
2005) have a similar view, listing the categories of digital portfolios as: student e-portfolios, teaching e-portfolios, and institutional e-portfolios. If one views a variety of
digital portfolios in each category, it would seem that these categories could be further
broken down by what the digital portfolios contain and how they are used. For example,
Hamp-Lyons and Condon (1998) describe the student digital portfolio category as usually
containing student work, self reflection, and perhaps faculty feedback. They describe
student digital portfolios as being used for a variety of purposes including
evaluation/grading, a showcase, and student learning (includes developmental and
program/discipline specific).
Cambridge (2001) describes faculty e-portfolios as containing information about
course and syllabi development, assessments, peer reviews, and learning activities. She
lists numerous uses for faculty portfolios including teaching assessment, course
assessment, and personal growth and reflection. Cambridge describes institutional digital
portfolios as containing information about particular programs, accreditation information,
and student outcomes. She lists institutional portfolio uses as including program
assessment, course assessment, and faculty assessment. However, these uses, as well as
the general categories, are very fluid with some digital portfolios falling into multiple
categories and uses.
Besides categories based on contents and use, Love, McKean, and Gathercoal
(2004) created five categories based on the level of maturation of digital portfolios.
These categories include: Scrapbook, Curriculum Vitae, Curriculum
Collaboration, Mentoring Leading to Mastery, and Authentic Evidence as the
Authoritative Evidence. These five levels of maturation are designed to help institutions
implement digital portfolios in an incremental way, bypassing the “begin at the end”
syndrome that they call a recipe for disaster (Love, McKean, and Gathercoal, 2004, p.
24).
To further assist institutions engaging in this incremental implementation
process, they provide a taxonomy for determining the level of maturation of the portfolio
program being used. This taxonomy can be useful for an institution to determine where
it is with its digital portfolio program and to identify possible next steps. Love,
McKean, and Gathercoal also assert that digital portfolio programs can be at different
levels at the same institution.
To create their levels of maturation, Love, McKean, and Gathercoal describe eight
physical and theoretical qualities inherent in portfolio/webfolio processes and their
application. These include:
1. Type of portfolio/webfolio – working or showcase
2. Organization of the portfolio/webfolio
3. Type of student artifacts in the portfolio/webfolio
4. Presence and capture of feedback and assessment based on standards
5. Nature of the portfolio/webfolio content – static, dynamic, and/or evolving
6. Heuristic processes involved in developing the portfolio/webfolio
7. Context provided for each item in the portfolio/webfolio
8. Delivery mode for the portfolio/webfolio (Love, McKean, & Gathercoal,
2004, p. 25).
These qualities combine both use and content and have significant overlap with
the previously described types of digital portfolios. Three things seem clear: digital
portfolios are no longer new, justification for use seems a given (despite the limited
research), and digital portfolios are “…heralded as the ‘next big thing’ in some
educational technology circles” (Murphy, 2003, ¶ 3).
Within the different categories and types listed above, Alverno’s Diagnostic
Digital Portfolio is a student learning portfolio, and while it can be used to extract
artifacts for a showcase portfolio (or other purposes), its main function is to document
learning for a student’s own reflection on her development. Part of this research study
used Love, McKean, and Gathercoal’s taxonomy in a systematic way, to determine the
level of maturation of the DDP.
Tools Used for Construction of Digital Portfolios
One of the most common themes in the literature on digital portfolios concerns
the tools used in their construction. Numerous articles describe the variety of tools
institutions have used to construct their digital portfolios. Some digital portfolios are
created with simple software tools (word processors, HyperStudio, Microsoft Office, and
Adobe PhotoShop) and stored on a CD-ROM (Wiedmer, 1998, Section 5). Others are
created using HTML and/or a variety of web page templates (Mullen, Bauer, & Newbold,
2001, Section 5).
Gibson and Barrett (2003) classify digital portfolio tools into two main types
based on the tools used. One type is classified as generic tools, in which learners
construct their own portfolios with generic tools (productivity software, word processors,
HTML editors, multimedia authoring tools, PDF formats) using whatever digital storage
space is available at the institution. The second type of tool used for digital portfolio
construction is a customized system approach (involves servers, programming, and
databases) in which an educational organization or company hosts an online database
environment that provides a structure and server space for learners to store and organize
their data (Gibson & Barrett, 2003, Section 3). The first approach requires the learner to
use multimedia tools and HTML, starting with a “blank slate” and constructing their own
unique portfolio collections that are difficult to compare from learner to learner. The
second approach does not require any knowledge of HTML or web construction. This
approach provides an on-line application for the user and appears more top-down,
controlled by the educational program/institution. While somewhat limiting student
control, this approach seems to maximize cross-portfolio comparisons.
Additional technology tools that could be used for digital portfolios include
eXtensible Markup Language (XML) and weblogs (blogs). XML is an open standard for
defining data elements on a web page. It is a “generalized framework for data files which
allows the same set of technologies to be applied to any type of data storage on any
computing platform” (Tosh & Werdmuller, 2004, p. 4). Weblogs or blogs refer to any
web pages or sites that contain dated entries in reverse chronological order, starting with the
most recent. One of the main issues with blogs concerns validity. Blogs can be written at any
time, by anyone, and do not necessarily contain accurate information.
According to Tosh and Werdmuller (2004), weblogs have enormous strength as a
communication medium, due in part to the immediacy and ease of publishing. Users can
click a button to load their weblog client, type some words into a box, and click another
button. Thus the entry is posted to the web for the world to see. They describe the ease of
use and immediacy of weblogs as paralleled only by email, which may explain the
increasing popularity of weblogs. Tosh and Werdmuller give two examples of weblog
sites: “Technorati.com, a weblog search engine, [that] watches nearly 2 million weblogs
and LiveJournal.com, a weblog community that has a further 2.5 million members” (Tosh
& Werdmuller, 2004, p. 4).
In addition to “home-grown” digital portfolios or open-source initiatives, the last
few years have seen a number of commercial technology vendors enter the arena. These
products mirror the numerous types, purposes, and uses for digital portfolios previously
discussed. The January 2003 issue of Syllabus lists a product round-up that describes
five of these commercial digital portfolio products.
1. iWebfolio. Created by Nuventive, this product is designed to help instructors
evaluate student work through multiple portfolios created and maintained by
the students. Students give faculty access to view and assess all materials
within a specific portfolio. Faculty can request students to lock their
portfolios to ensure that they do not alter assignments after due dates.
iWebfolios are housed on the Nuventive server.
2. Folio. Created by ePortaro, this product is positioned as a cradle-to-career
portfolio tool. Students place two basic types of information into a central
repository (the folio): (a) documents (word processing, spreadsheets, graphics
or other electronic documents) and (b) standard forms supplied by Folio or by
the university. Some of the information in the Folio can be certified by the
university as being correct, such as student grades. Students are then able to
create different portfolios, using a subset of the data available. Portfolios are
usually housed at the institution, although ePortaro does offer hosting
services.
3. E-Portfolio. Created by Chalk & Wire, this product allows students and
faculty without knowledge of web design to create showcase portfolios. The
software supports the creation of portfolios around standards such as INTASC
or ATE. Portfolios can be stored on the institution’s site or on Chalk & Wire’s
server.
4. FolioLive. Created by publisher McGraw-Hill, this product focuses primarily
on course-level assessment. Students either organize their work using
“frameworks” provided by FolioLive or create their own custom designs. The
course site contains a mechanism that enables instructors to comment on
specific student work. Portfolios are housed on McGraw-Hill’s server.
5. Web Folio Builder. Created by TaskStream, this product is a portfolio system
geared toward teachers. Teacher candidates can use the system to put together
portfolios that can serve a variety of academic and professional functions.
Student teachers can submit work to instructors for assessment and can
organize their work around state and national standards. Portfolios are housed
on TaskStream’s servers (Syllabus, 2003, pp. 38-39).
It is apparent from research on the tools used to create digital portfolios that there are
numerous possibilities. In 1999, Alverno College chose to have outside consultants and
programmers create the DDP. This original version of the DDP was written in ASP code,
with a SQL 7 database back end, and ran on a Windows server. When the College made
the decision to create a new version of the DDP in 2003, it used in-house personnel to
design, program, and convert the data from the original SQL database. The College also
made the conscious decision to move away from vendor-specific products such as Microsoft
Windows, SQL Server, and the ASP programming language, and to use open-source
systems (Linux, MySQL, and the PHP programming language).
Benefits and Challenges of Digital Portfolios
The benefits of digital portfolios have been the focus of numerous journal articles.
Perhaps it is these lists of proclaimed benefits that make digital portfolios seem so
appealing. Batson (2002) summarizes potential benefits by looking at how digital
portfolios can benefit students, faculty, and administrators. He describes the fact that
students seem most interested in the way digital portfolios can be used as resumes, both
before and after graduation. Students also can see where they are in their college career
regarding requirements (depending on the type of digital/electronic portfolio), and can
review their work and instructor comments. Faculty could use digital portfolios as their
own resume builders, to support their teaching excellence, and to help with letters of
recommendation. Batson describes the primary benefit of digital portfolios for faculty:
to “provide a tool to better manage, review, reflect, and comment on student work”
(Batson, 2002, Section 3).
Batson also states that administrators can see the potential value of digital
portfolios for:
1. Creating a system of tracking student work over time, in a single course, with
student and faculty reflections.
2. Aggregating the work of many students in a particular course to see how the
students as a whole are progressing toward learning goals.
3. Assessing many courses in similar ways that are all part of one major and
thus, by extension, assessing the entire program of study (Batson, 2002,
Section 3).
With respect to accreditation, administrators can discover how to:
1. Integrate courses with new methods, orienting syllabi and curricula around
learning goals.
2. Encourage continuity of student work from semester to semester in linked
courses.
3. Have a more fully informed and dynamic, constantly updated view of student
progress in a program, which is very helpful in formative assessment (Batson,
2002, Section 3).
Siemens (2004) describes a similar grouping of digital portfolio benefits by
focusing on the main participants of the process: learners, instructors, and institutions.
He describes several benefits for learners, as they seek to create and reflect on life
experiences, including:
1. Personal knowledge management
2. History of development and growth
3. Planning/goal setting tool
4. Assist learners in making connections between learning experiences
5. Provide the metacognitive elements needed to assist learners in planning
future learning needs based on previous successes and failures.
6. Personal control of learning history (Siemens, 2004, Section 4).
Siemens (2004) describes the benefits of digital portfolios for faculty, including:
1. Means to share content with other faculty
2. Move to more authentic assessment (as opposed to testing)
3. Preparing learners for life-long learning
4. Create an assessment-trail that is centralized and under learner control
(Siemens, 2004, Section 4).
Institutional benefits of digital portfolios listed by Siemens (2004) include:
1. Providing value for learners by allowing personal control
2. Contribute to the development of a more permanent role in the lives of
learners (i.e. education is not viewed as a 2 to 4 year relationship, but rather a
life-long relationship) (Siemens, 2004, Section 4).
These various views on the benefits of digital portfolios all center on going
beyond rote determination of knowledge to focus on actual student learning with respect
to development, goal setting, reflection, and life-long learning. Digital portfolios can also
provide institutions and faculty with tools for looking at student learning in a
developmental, dynamic, and constantly updated manner.
The main issue with these long lists of digital portfolio benefits is that they depend
on the type of digital portfolio being used. Not all digital portfolios provide all of the
benefits listed above. This fact is reinforced by Batson’s benefit list, where he prefaces
digital portfolio benefits with “could” or “may.”
Tosh and Werdmuller (2004) summarize digital portfolio benefits under three
main areas related to the use of the digital portfolio: “a learning tool for the user; a
monitoring tool for institutions and a mechanism for employment opportunities” (Tosh &
Werdmuller, 2004, p. 3). The idea of a learning tool for the user stresses that the power of
digital portfolios lies in the process as well as the product, reflecting a pedagogical shift
from a course-driven focus to a student-centered focus. This shift away from
course-driven learning could allow information and skills that normally fall through
the cracks (extra-curricular activities, work experiences, etc.) to be captured and used,
thus presenting a more in-depth portrait of the individual. Within the area of a monitoring
tool for institutions, Tosh and Werdmuller indicate that digital portfolios could help
departments more effectively demonstrate their graduates’ learning and workplace skills,
because they could enable students to demonstrate, in their own words and with their own
products, the effectiveness and value of the educational experience. As in other research
articles, Tosh and Werdmuller stress that a digital portfolio can be a great resume
enhancer, providing direct links to actual objects that can back up the applicants’ claims
(Tosh & Werdmuller, 2004, p. 3).
While there is some confusion as to the types, definitions, and uses of digital
portfolios, most agree that digital portfolios can be classified by the intended user. The
third theme of this literature review describes several types of student portfolios, the
category the DDP falls into, and their implementation and use in higher education.
Student Digital Portfolios
Within the area of student portfolios, there are many varieties, uses, and
frameworks regarding what should be included in the portfolio. In the Electronic
Portfolio White Paper, created by ePort Consortium (2003), student portfolios are
classified by their purposes and audience. Personal portfolios are those designed for self
reflection. They can be used to organize materials from classes, activities, and journal
experiences and can assist students in recognizing skills and making decisions. Learning
portfolios are those designed to showcase student learning, demonstrate how skills have
been developed, and provide a framework for assessing academic progress. Professional
portfolios can be used to demonstrate the student’s attainment of program or certification
requirements, to present accomplishments for employment, to help make career
decisions, and to review professional development for career advancement or change
(ePort Consortium, 2003, p. 11).
In her introduction to the Digitized Student Portfolios section in Electronic
Portfolios: Emerging Practices in Student, Faculty, and Institutional Learning, Kathleen
Yancey states that digital portfolios are governed by purpose and audience, as are their
paper counterparts (2001, p. 20). She refers to student portfolios as showcases of the
student’s best work to present to an employer, and as a method for documenting student
learning in courses or programs.
Regardless of the purpose or audience for the digital portfolio, numerous authors
(Chen & Mazow, Lankers, Gathercoal, et al.) agree with Yancey that digital portfolios
provide a new type of space for intellectual work, as well as opportunities to connect and
present intellectual work in new ways. Digital portfolios offer the possibility of bringing
pedagogy and assessment into alignment and can provide a connection across classes and
curriculum. Yancey goes on to say that student digital portfolios rest on the assumption
that “…the engaged learner, one who records and interprets and evaluates his or her own
learning, is the best learner” (Yancey, 2001, p. 83).
There are a wide range of opinions on what should be included in a student digital
portfolio. Simon and Forgette-Groux (2000) suggest a cross-curricular sampling of
entries that provide evidence of the cognitive, behavioral, affective, metacognitive, and
developmental dimensions of a single but complex competency, such as communication
or problem solving. In contrast, Hamp-Lyons and Condon (1998) refer to a set of
features all prefixed with the word “can” to emphasize the potentials of digital portfolios.
For example, portfolios can feature multiple examples of work, portfolios can be context
rich, portfolios can offer opportunities for selection and self assessment, and portfolios
can offer a look at development over time.
The Portfolio Clearinghouse (2004), acquired by the American Association for Higher
Education (AAHE) before it disbanded in 2005, listed 50 different portfolios used in a
variety of higher educational institutions. Various purposes are used to classify the
portfolios, including Advising (2 institutions), Integration of curriculum/co-curriculum (1
institution), Career/resume Planning (8 institutions), Program evaluation/Institutional
assessment (4 institutions), Faculty evaluation/tenure (2 institutions), Document
collection (1 institution), Reflection (19 institutions), and Student evaluation/grading (13
institutions). Of the 50 portfolios listed, two categories relate to student portfolios in
academics: the reflection and student evaluation/grading categories. Nineteen institutions
identified portfolios related to reflection. Of these 19, five institutions require the use of
reflective portfolios for all students. Four of these five institutions have web based
portfolios: Amsterdam Faculty of Education (EFA), Kalamazoo College, Stanford
University, and Indiana University-Purdue University Indianapolis. Valley City State
University’s reflective portfolio is CD based. Stanford University’s portfolio is required
for only one semester. Thirteen institutions are listed as having portfolios related to
student evaluation/grading. Of these 13, three institutions require the use of their
portfolios by all students: Sonoma State University, California State University at
Monterey Bay, and Olivet College. Sonoma State University’s portfolio is paper-based,
while the other two are a combination of paper, web, and CD.
Alverno’s Diagnostic Digital Portfolio (DDP), which is required for all students,
is a combination of reflective and student evaluation/grading types of digital portfolios. A
significant difference between Alverno’s DDP and the portfolios of the other institutions
listed above centers on how the portfolios are created and what they contain. All of the
institutions requiring portfolios from the AAHE web site, other than Alverno’s DDP,
have portfolios that are created by students, using either web-editing software or
compiling paper and electronic files. The portfolio descriptions, which focus on student
reflection on their work, do not mention the inclusion of faculty feedback.
In the institutions requiring digital portfolios, use and review processes for their
digital portfolios vary. Regarding the question of how the portfolio is reviewed and how
often, Stanford University simply states that the question is not applicable. IUPUI
describes a review committee, while California State University states their review
process varies but can include business people. Sonoma State University and Amsterdam
Faculty of Education review their portfolios three or four times a year, while Kalamazoo
College’s academic advisors review portfolios once a year. Both Olivet College and
Valley City State University talk about using their portfolios as part of an exit process at
graduation. In addition, Olivet College requires a portfolio review at the end of the
sophomore year.
Alverno’s portfolio is not similar to that of any of the institutions mentioned above,
which makes comparison difficult. The DDP is used throughout the students’ education at
Alverno. Faculty and advisors can review a student’s DDP numerous times throughout
the year. For example, students and faculty can review information on the DDP during
pre-registration, or when difficulties with student performance are noted. Students are
required to complete a mid-program portfolio assessment (end of second year) that is also
a key performance on the DDP. In this assessment, students analyze their past work for
areas of strengths and areas of improvement with respect to the eight abilities. Students
then develop a learning plan that is uploaded to the DDP. The DDP is also used to create
Alverno’s narrative transcript for each graduating student.
Research on Student Digital Portfolios
The fourth theme of this literature review is student digital portfolio research.
Within this theme, research on digital portfolio use in teacher education and digital
learning portfolios are described.
Research on digital portfolios seems to mirror that of paper-based portfolios.
Despite numerous proponents of digital portfolios, substantive research on their impact is
sparse. In 1998, Lyons noted: “there is not yet a body of systematic data documenting
their [portfolio] uses or their long-term consequences” (p. 247). Even in the area of
teacher portfolios, perhaps the most widely used type of digital portfolio, Zeichner and
Wray (2001) described the same concern, “Despite the current popularity of teaching
portfolios, there have been very few systematic studies of the nature and consequences of
their use for either assessment or developmental purposes” (p. 615).
As discussed previously, there are numerous types and categories of digital
portfolios. Research articles seem to focus on two main types of digital portfolios:
learning digital portfolios and assessment portfolios (usually “high stakes” assessment).
Of these two types, there seems to be more focus on assessment portfolios, the evaluation
of a portfolio, than on student learning portfolios.
There are numerous organizations doing research on digital portfolios including
EDUCAUSE Learning Initiative, Electronic Portfolio Action Committee (EPAC), and
the National Coalition for Electronic Portfolio Learning. However, most of their research
has focused once again on types, categories, tools used for construction, and digital
portfolio programs being implemented. One area that does seem to have substantive
research on portfolios and digital portfolios is teacher education.
Portfolio Research in Teacher Education
Perhaps due to the nature of pre-service teacher education programs and the need
to document pre-service teacher learning for accreditation, there is substantive research
on portfolios in general, as well as digital portfolios. Carney (2004) quotes a study done
by Salzman and Denner in 2002 that found “Nearly 90% of schools, colleges, and
departments of education use portfolios to make decisions about candidates” (p. 1).
A key question, posed by Carney (2004), focusing on portfolio assessment, is
central to the research on digital portfolios: “Do we have empirical evidence that
portfolios can be scored reliably and enable us to make valid interpretations about student
achievement?” (p. 3). Carney breaks this question open by asking “…even if portfolios
can be made to function in this way, is it wise to use them in such a manner, to make
high-stakes decisions, or will we have destroyed portfolios’ usefulness as a learning tool
in the process” (p. 3). She wonders if this high-stakes process will turn portfolios into
what Shulman (1998) has referred to as “very, very cumbersome multiple-choice tests”
(p. 35). While the focus of these statements centers on teacher portfolios, the questions can
be asked of digital portfolios in general.
In her comprehensive article “Setting an Agenda for Electronic Portfolio
Research: A Framework for Evaluating Portfolio Literature” (2004), Carney describes
several foci for digital portfolio research. She notes that most articles on the subject are
“…conceptual or anecdotal rather than research-based” (p. 5). She categorizes and
describes one group of articles as ethnographic descriptions of the manner in which the
portfolios are structured and implemented, accompanied by survey data on the attitudes
and beliefs of the portfolio authors. She goes on to describe another group of articles that
offer studies of self-reported data from authors who acclaim the learning benefits of
portfolios. She cautions that this type of evidence is anecdotal in nature and needs to be
triangulated by other sources.
Carney, like Barrett (2005), differentiates between portfolios designed for the
purpose of fostering learning and those where the primary purpose is assessment. In her
article she adapts the Herman and Winters (1994) framework for documenting portfolio
effectiveness which includes providing evidence that assures technical quality, fairness,
effects, and feasibility. While she acknowledges that Herman and Winters’ framework
was designed primarily for assessment portfolios, she has chosen to adapt the framework
for portfolios that have learning as their primary purpose. However, it should be noted
that in either case, assessment portfolios or learning portfolios, she views both as a
“product” to be evaluated.
Besides examining Herman and Winters’ framework, Carney also studied
portfolios as they relate to Zeichner and Wray’s (2001) six critical dimensions of
variation to make statements about portfolio effect. These critical dimensions include:
1. Purpose(s) of the portfolio
2. Control (who determines what goes into the portfolio and the degree to which
this is specified beforehand)
3. Mode of presentation (portfolio organization and format – including the
technology chosen for authoring)
4. Social Interaction (the nature and quality of the social interaction throughout
the portfolio process)
5. Involvement (Carney notes that Zeichner and Wray identify the degree of
involvement by the cooperating teacher as important for preservice portfolios. In
considering involvement more broadly, she includes other portfolio
participants such as university teachers, P-12 students, and parents) (p. 7).
Carney examined six studies of digital portfolios, which she selected from a larger
body of 22 empirical studies located during her literature research. In all cases, she uses
Herman and Winters’ categories, sometimes breaking those categories down into
assessment portfolios and learning portfolios. Although the learning portfolios are also
viewed as a product to be assessed, there is not the same factor of high stakes riding on
the evaluation of the learning portfolio.
Table 2 contains Carney’s summarization of these five studies. The main purpose
of this table is to summarize the studies she examined, rather than to present her data on
the studies’ connections to Herman and Winters’ Assessment effectiveness categories or
Zeichner and Wray’s Critical dimensions categories.
Table 2 indicates that all of Carney’s studies are concerned with preservice teachers
and that both qualitative and quantitative data were collected. A variety of
methodologies were used, including case study, sociocultural frameworks, statistical
measures, and rendering.
Table 2
Summary of Carney’s Five Studies (Carney, 2004)

Study: Avraamidou & Zembal-Saul (2003)
Type of Portfolio: Teacher – Preservice
Type of Study / Sample Size: Qualitative Case Study – 2
Methodology: Studied digital portfolios of two prospective elementary science teachers. Two main types of data: web portfolio content and reflective statements. Three analytic techniques used: pattern-matching, explanation-building, and time-series analysis. Content analysis was done on reflective statements.
Results: Data analysis revealed clear evidence of learning and professional development in: 1. Making connections between university coursework and field experience; 2. A transformation from being descriptive to being explanatory; 3. Engaging in reflective and metacognitive activities. Also noted that making thinking visible to a large audience (web) motivated authors to produce their best work and enabled them to give and receive feedback from a wide audience.

Study: Carney, J. (2001)
Type of Portfolio: Teacher – Preservice
Type of Study / Sample Size: Qualitative Case Study – 6
Methodology: How preservice teachers conceptualized themselves. Three digital portfolios and three traditional portfolios of Masters in Teaching students. Used a sociocultural frame to consider how the tool chosen for portfolio authoring interacts with other artifacts to influence conceptions of audience, purpose, form, and content. Analyzed portfolios for pedagogical content knowledge and the use of technology tools.
Results: All cases studied indicated that the preservice teachers were using their portfolio to present a portrait of self as teacher, and then to compare that image with the ideal they had formulated in their philosophies of education. The effectiveness of the portfolio was dependent upon a number of technological and psychological tools operating in subtle and often unexpected ways. Feasibility of digital portfolios for representing, assessing, and enhancing teacher knowledge will depend upon our awareness of these complex interactions and willingness to capitalize on tool affordances while ameliorating tool constraints.

Study: Derham, C. (2003)
Type of Portfolio: Teacher – Preservice
Type of Study / Sample Size: Quantitative – 30
Methodology: Dissertation – investigation of the reliability and validity of the Digital Portfolio Assessment of Teaching Competencies (D-PATCO). Examined evidence of the D-PATCO’s psychometric properties based on: test content, relations with other variables, and reliability estimates. Data collected from 30 preservice teachers over four semesters and analyzed using Pearson product-moment correlation coefficients, Cronbach’s alpha, and Cohen’s kappa coefficient.
Results: 1. Found that assessment of instructional competence is possible via a digital, preservice teacher portfolio. 2. Found that D-PATCO demonstrated theoretically acceptable relationships with several other assessments of teacher competency (Praxis II) and a generally positive expert review. 3. D-PATCO does not single-handedly address the breadth of preservice teachers’ content knowledge. 4. Low percentages of agreement between raters and inadequate evidence reflecting preservice teachers’ content knowledge.

Study: Hartmann, C. (2003)
Type of Portfolio: Teacher – Preservice
Type of Study / Sample Size: Qualitative Case Study – 7
Methodology: Dissertation – investigated how seven prospective teachers of secondary mathematics in an undergraduate teacher preparation program learned to represent their teaching practice in a digital portfolio. Data gathered over two semesters including: two semi-structured interviews, one focus group interview, observation of a portfolio seminar presentation, and three versions of the portfolio collected at different times during the process. Used a theory of rendering as a conceptual framework, the constant comparative method, and “critical incidents of practice” as the unit of analysis.
Results: Suggests that learning to render one’s practice is cognitive scaffolding for preservice teachers as they develop the habits of mind necessary for them to grow toward high-quality mathematics instruction. Powerful learning occurred for his participants when they were asked to render a single lesson multiple times. Establishing the portfolio as the beginning of a professional continuum of rendering and sharing one’s practice gave portfolio authors an intrinsic purpose, which he contended was important for preservice teachers to make connections between their teaching experiences and university coursework.

Study: Hartmann & Calandra (2004)
Type of Portfolio: Teacher – Preservice
Type of Study / Sample Size: Qualitative Case Study – 7
Methodology: Additional analysis of the first study to investigate how technology impacted learning in a community of practice. Seven participants examined.
Results: Traced the technological innovation through the group, contending that technological innovations enhanced portfolio authors’ capability for representing their teaching of mathematics.
The point Carney makes is that each of the studies she investigated includes three
important features: “multiple sources of data, triangulation of evidence, and systematic
analysis of portfolio content” (p. 24). Of note in her findings is that although not all of the
portfolio studies she used involved high-stakes evaluation, each study focused on the
portfolio as a product to be evaluated.
that Avraamidou and Zembal-Saul’s study focuses on how portfolios can be used for
learning, the learning is described as: connections with course work, transformation from
descriptive to explanatory, and engagement in reflective and metacognitive activities.
What is unclear in Avraamidou and Zembal-Saul’s study is the definition of learning and
how that learning directly connects to the knowledge and skills a teacher needs to teach
science.
In her conclusion, Carney makes the point that if portfolios are used for high-stakes
decision-making, they must be psychometrically sound, or they might be subject to a host
of legal challenges. Wilkerson and Lang (2003)
are even more forceful as to the use of portfolios in high-stakes situations. They state:
“As measurement professionals, we are frequently asked if portfolio assessment can be
used as an appropriate and safe vehicle to make summative decisions in a certification
context. Are they a good measure? Our answer is this: ‘No, unless the contents are
rigorously controlled and systematically evaluated’” (p. 2). They refer to Ingersoll and
Scannell (2002) who point out that portfolios “…are not assessments, but are instead
collections of candidate artifacts that present examples of what the candidate can do”
(Wilkerson and Lang, 2003, p. 2).
While there are numerous other studies of portfolio use in teacher education,
Carney’s comprehensive article provides a clear view of the limited substantive research
in the field of teacher education and digital portfolios. Other disciplines seem to be
following in the footsteps of teacher education. Recently, articles on portfolio use in
nursing and other health related fields have appeared. These articles seem to follow the
path of teacher education research in that they focus on the types of portfolios, benefits,
and implementation. As stated earlier, there is limited research on student learning digital
portfolios.
Research on Student Learning Portfolios
Due to the ambiguity of the word learning, it is sometimes difficult to research
student learning portfolios. There seems to be a multitude of articles “in the works” that
are described on a variety of web sites and hosted by various organizations. As in the
previous section, much of the research focuses on teacher education.
One study that does focus on student learning portfolios, done by Zou (2002),
describes how she organized her instructional practices around a mandated assessment
portfolio. While this study focuses on high-stakes portfolio assessment, in her study Zou
articulates some of the major questions and possibilities for student learning portfolios.
Zou’s study focuses on the Missouri Department of Elementary and Secondary
Education mandated implementation of the Missouri Standards for Teacher Education
Programs (MoSTEP). To document the successful attainment of these standards,
students in the education department are required to create a portfolio starting in their
second year and continuing to graduation. These student portfolios consist of two parts:
(a) artifacts selected by teacher candidates that provide evidence for how they are
meeting the standards, and (b) reflections in which the teacher candidates provide a
rationale for their selections.
Based on data she gathered from a survey, direct observations, and interactions
with her students, she found an overall passivity in students’ attitudes towards the
portfolio, including the perception of the portfolio as an add-on to the students’ already
heavy coursework. For example, in a survey of students in spring 2001, the
overwhelming majority of the students ranked the portfolios as “not very useful”. Not
only did the students think the portfolio was not useful, but Zou observed problems with
inappropriate selection of artifacts for the standards, and with student reflections on the
artifacts that were “…irrelevant to the corresponding standards” (p. 5). She attributes the
students’ lack of initiative to three factors:
1. Lack of an apparent link between the portfolio and students’ coursework, thus
the instrumental nature of the portfolio was vague.
2. Limited student knowledge about portfolio assessment in terms of its
significance, processes involved, and organizational skills needed, so the
sense of self-efficacy which is a prerequisite for any engaged endeavor was
lacking.
3. Lack of student understanding of the mandated standards which are complex
and condensed in content and wording (p. 5).
In response to these problems Zou made several substantial changes in her
teaching for the next semester (fall 2001). These changes included:
1. Aligning the instructional content with five standards she identified
(Knowledge of Subject Matter, Knowledge of Human Development,
Motivation and Classroom Management, Communication Skills, and
Professional Development).
2. Linking course assignments to the documentation of artifacts for the portfolio.
3. Providing students with concrete assistance to help them truly understand the
select MoSTEP standards.
Zou had three main research goals for her study that focused on the issues
described earlier. In addition, she wanted to investigate if the changes she was making in
her teaching had any effects. These goals included: (a) determining what, if any, benefits
and/or disadvantages could be gained by organizing instructional practice around the
assessment portfolio; (b) investigating if students’ self-efficacy as well as their overall
performance in the portfolio increase; and (c) determining if students’ attitudes towards
the portfolio change positively.
Zou conducted her study with her own class of 24 students in fall 2001, using
surveys as her primary data collection tool. She distributed surveys, which were
conceptually the same, at two points in the semester. The first distribution point was
early in the semester, when she first introduced the portfolio, and the second was at the end
of the semester, when the students had finished their portfolios.
Zou assessed students’ attitudes towards the portfolio through five aspects: their
perception of usefulness of portfolios, their perception of the importance of developing
the portfolio, their preference between portfolios and traditional assessment methods,
their indicated intention to use portfolio assessment in their future teaching, and their
expressed level of personal liking for developing a portfolio (p. 8).
Her data indicate that an overwhelming majority of students assumed a positive
attitude toward the portfolio. However, the data also show that students who belonged to
the unsure category in the first survey did not change their attitudes to positive in the
second survey. Another conclusion she drew was that after students went through the
portfolio process, their confidence in compiling the portfolio increased substantially, and
their projected grade for the portfolio seemed to become more realistic. She also notes an
apparent improvement in the students’ portfolio quality during fall 2001 as compared
to portfolios from previous semesters. For example, 64% of students in the fall 2000
semester and 70% of students in the spring 2001 semester produced portfolios that met
departmental criteria. In fall 2001, after implementing Zou’s instructional changes,
100% of the students met the criteria.
Zou observed that some students’ attitudes remained negative throughout the
process. She believes this was partially caused by the nature of the assessment portfolio.
Using three portfolio models from Wolf and Dietz (1998) -- learning, assessment, and
employment -- she theorizes that the purpose of the portfolio could affect student
attitudes. She believes that of the three models, the learning portfolio seems to trigger
student interest and motivation to a greater degree than the assessment portfolio, due to
the learning portfolio’s promotion of self reflection, self-exploration, and autonomy over
the process. The assessment portfolio, on the other hand, is restricted by its focus on
evaluation and accountability and is constrained by its emphasis on the defined standards.
The assessment portfolio process limits students’ creativity and ownership, thus
generating some negative feelings.
In her conclusion, Zou states that the study convinced her that “Preservice teachers
should start with a learning portfolio, not an assessment portfolio. A learning portfolio
permits student’s authority for making decision on their portfolio’s structure, content and
process, thus their creativity and initiatives are encouraged” (p. 17). She believes that
students should “get involved in the process first, not just the product” (p. 17), and that
the focus of a learning portfolio should be for students to reflect on growth and assess
their learning, not just to fulfill some external standard.
Critical findings in Zou’s study focus on students’ perceptions of the portfolio,
especially their perceptions of its usefulness and benefits. It also reinforces the point that
portfolios as reflective learning tools, rather than products for evaluation, can enable
students to focus on their own learning development. This is the focus of Alverno’s
Diagnostic Digital Portfolio (DDP) – student learning. This study also underscores the
need for further research into students’ perceptions of portfolios. As implementation of
the DDP commenced, Alverno’s Education Research and Evaluation Department (ERE)
began to research the use of the DDP by students and faculty.
Initial Research on Alverno’s Diagnostic Digital Portfolio
After its implementation in October 1999, research on the DDP was undertaken
by ERE. This fifth literature review theme contains several sub-themes including: (a)
research completed by ERE for grant reports (Atlantic Philanthropic Service Co.), (b)
analysis of quantitative data, (c) analysis of qualitative data collected from both students
and faculty, and (d) ERE’s general observations concerning the DDP. Because funding
for the DDP came from a variety of grants including Title III, Atlantic Philanthropic
Service Co, and Kellogg, the initial research on the DDP focused on the evaluation of
various grant objectives. In addition, ERE’s research was connected to a central goal of
the DDP - creating a system that mirrored Alverno’s Ability-Based Learning and Student
Assessment-as-Learning philosophy, providing an easily accessible location for storing
student learning artifacts previously housed in a variety of locations.
It is important to note that the DDP was designed as a system separate from the
official student information system. The learning artifacts stored in the DDP are
snapshots of student performances rather than the institution’s official record of abilities
demonstrated within courses. Also woven into the DDP’s design was the concept of
creating specific and identifiable times in a student’s curriculum where she would be
asked to reflect on her development (identify strengths and challenges with respect to the
eight abilities) and create learning goals. This reflection process was already in place
within a variety of majors, but had not been formally instituted throughout the curriculum.
The DDP was designed to assist in the reflection process by providing an accessible place
for students to review some of their past performances, self assessments, and feedback in
order to analyze their patterns of learning.
ERE research began by focusing on various grant objectives and included the
use of both qualitative and quantitative data.
Grant Report Research
The main research question addressed by ERE centered on how the DDP affects
student learning, with additional sub-questions including: how usable is the DDP for
students and how usable is the DDP for faculty? These questions formed the basis for
several grant reports and were tailored to meet the individual objectives of each grant.
For example, in a grant report to Atlantic Philanthropic Services Co. in July of 2001
several grant outcomes were evaluated. These outcomes included:
1. Design and implement a digital portfolio for each student that is part of an
accessible, searchable database system.
a. Due to the developmental nature of the DDP, the college provided
DDP accounts to all entering students, starting in October 1999. As of
May, 2001, 1,200 students (from a total of approximately 1,900) have
digital portfolio accounts (Final Report to Atlantic Philanthropic
Service Co., 2001, p. 2).
2. Design a digital portfolio that helps students diagnose their learning progress.
a. A mid-point outside-of-class assessment was redesigned to provide a
curriculum point for students to reflect on their learning progress and
set learning goals. Discipline departments designed diagnostic uses for
the DDP and teams of faculty incorporated new self assessment
strategies and in-course activities so that students would access their
portfolios and reflect on their progress. These efforts resulted in
changes to the Integrated Communication Seminars that all students
take, additions and revisions to courses in Business and Management,
Nursing, Education, Computer Studies, English, Social Science,
Professional Communication, Biology and Psychology. (Final Report
to Atlantic Philanthropic Service Co., 2001, pp. 4 – 5).
b. Unanticipated (but welcome) Outcome 2a: Developed the concept of the “key
performance”.
i. To decide what to include in the DDP, the institution continued to follow the
advice of the Educational Testing Service consultants and relied on its previous
experience in designing Alverno’s Ability-Based Curriculum – study your
institution’s teaching and assessment practices and look for patterns. Working
with this premise, the DDP design team came up with the key performance as a
central organizational unit in the DDP. The guidelines given in the report for
selecting or designing a key performance are listed in Table 3. These guidelines
were designed to assist faculty and discipline departments in selecting learning
activities to be included in the DDP as key performances. The last guideline
connects to the main goal of the DDP – to provide information on the student’s
learning and development, in a variety of modalities, throughout her studies at
Alverno College.
Table 3
Guidelines for Selecting or Designing a Key Performance (Final Report to Atlantic
Philanthropic Service Co., 2001, p. 7)
Guidelines for Selecting or Designing a Key Performance
1. The assignment or assessment elicits and enables the student to demonstrate a range
of performance
2. The performance can be related clearly to a course outcome, and if possible, an
outcome of the student’s major
3. The performance should provide meaningful information on the student’s status
(beginning, intermediate or advanced) in her major, support area or in general
education
4. The performance is usually predictive of student success in her major or support area
or in some aspect of her program
5. Taken together, the set of key performances required to be entered into the DDP by
an individual student a) provide information on the student’s progress from
beginning through advanced stages of abilities and outcomes b) provide different
modes of response from the students (e.g. an individual student’s DDP should have
written assignments and assessments but also samples of her speaking, group
interactions etc.)
c. Unanticipated (but welcome) Outcome 2b: Invented a new category of
student achievement to be included in the portfolio – the Independent
Learning Experience (ILE).
i. During the first year of implementation the college solicited feedback from
early student and faculty DDP users. This feedback indicated that students with
part-time or full-time jobs wanted to include samples of projects they completed
at their place of employment. Other students wanted to include examples of
their citizenship, such as their volunteer work for a political campaign or the
assistance they rendered to a church activity. As a result, the Council for Student
Assessment designed a process by which students could include these
independently organized learning experiences in their portfolios. These
independent learning experiences were called ILEs (Independent Learning
Experiences) and this prefix was used in the DDP to identify these experiences.
(Final Report to Atlantic Philanthropic Service Co., 2001, pp. 6 – 7).
3. Train faculty and students in the effective use of the DDP.
a. Fifty desktop and notebook computers were purchased during the life
of the grant. Space was dedicated in the Liberal Arts building for
some of the equipment in what was named the Faculty Instructional
Design Lab (FIDL). The FIDL room was used to offer drop-in small
group training on the DDP (Final Report to Atlantic Philanthropic
Service Co., 2001, p. 7).
b. Unanticipated (but welcome) Outcome 3a: Increased student practice
and expertise with information technology.
i. The Academic Computing Center noticed increased student use of the
campus-wide network and increased motivation to learn computer skills. The
college believed the DDP was partly responsible for motivating students to
become more computer savvy, since it gives them a more personal reason to do
so. A student quoted in the report stated: “This is MY academic stuff.” Despite
the fact that many Alverno students (about 16% each year) come to the college
with no, or minimal, computer background and no access to a personal computer
at home, use of the computer center is up by 20%, largely due to DDP usage
(Final Report to Atlantic Philanthropic Service Co., 2001, p. 7).
4. Disseminate what we have learned:
a. Since the inception of the grant, faculty and staff have made a number
of presentations on the DDP. Information on the DDP has been
presented to over 700 national and international participants who
attended Alverno’s annual “Day at Alverno” and summer assessment
workshops since 1998. Faculty and staff have also presented at each
of the 1999, 2000, and 2001 American Association for Higher
Education’s annual Assessment Forums on different aspects of the
DDP. Because of Alverno’s growing expertise in this area, AAHE
asked the college to join a consortium of institutions that would continue to
explore how to develop on-campus versions of eportfolios focused on
enhancing student learning and assessment (Final Report to Atlantic
Philanthropic Service Co., 2001, p. 8).
Reports for the Title III grant were more general in nature, because the focus of
this grant was on expanding technology use in general and the DDP formed only a part of
the grant’s overall objectives. Title III report data included the quantitative data reviewed
below, as well as narrative comments from ERE on general observations concerning the
DDP.
Besides the grant reports listed above, data collected concerning the DDP were of
two types: quantitative (data mined from the DDP database on number of log-ins, number
of active key performances, connection of key performances to the abilities and advanced
outcomes) and qualitative (data gathered from interviews of faculty and students,
surveys, case studies, and talk-aloud interviews). A summary of the quantitative and
qualitative data is presented below. These results are discussed in order to lay a foundation
for comparison with the results of this study.
Quantitative Data Summary
Quantitative data were gathered by mining the DDP relational database. In
October 1999, student accounts were created for all new students entering during the fall
1999 semester. The decision to start with entering students, rather than all Alverno
students, was made due to the developmental nature of the DDP. In addition, current
students could request a DDP account. These accounts were created manually by the
DDP System Administrator. The manual account creation made it difficult to analyze
how many students could log onto the DDP, since numerous students attend the college
part time and require more than four years to graduate.
First year use of the DDP focused on creating key performances that were
administered by the Assessment Center and designed by Ability Departments
(departments made up of full-time faculty who serve in these Ability Departments, as
well as in their own discipline departments and are responsible for creating criteria for
each ability). During the spring 2000 semester, selected faculty piloted various course
key performances. DDP use by the general faculty began in August, 2000.
Quantitative data on the DDP were gathered from a variety of database tables.
During the first year, October 1999 to August 2000, the data gathered were the number of
student accounts, the number of key performances created, and the number of faculty who
created key performances. These types of data were collected from October 1999 to
January 2001. Beginning in January 2001, the focus of the data gathered was on active key
performances, their connection to the abilities, and student log-ons. Student log-on
data and active key performances and their connection to the abilities were gathered from
January 2001 through January 2002. No log-on data were collected for spring 2002
through spring 2003.
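As an illustration of what mining these tables involves, the sketch below shows the kind of aggregate queries that could produce the counts reported in Table 4, written against the DDP’s PHP and MySQL stack. It is a minimal sketch only; the table and column names (logins, key_performances, kp_abilities, and so on) are hypothetical stand-ins rather than the actual DDP schema, and the date range shown is only an example.

<?php
// A minimal sketch, under assumed table names, of the aggregate "data mining"
// queries described above. This is not the DDP's actual reporting code.
$db = new mysqli('localhost', 'ddp_user', 'secret', 'ddp');
if ($db->connect_error) {
    die('Could not connect to the DDP database: ' . $db->connect_error);
}

// Total log-ons and distinct students logging on during one semester.
$result = $db->query(
    "SELECT COUNT(*) AS total_logons, COUNT(DISTINCT student_id) AS students
     FROM logins
     WHERE login_time BETWEEN '2003-08-15' AND '2004-01-15'"
);
$row = $result->fetch_assoc();
$mean = $row['students'] > 0 ? $row['total_logons'] / $row['students'] : 0;
printf("%d students, %d log-ons, mean of %.2f log-ons per student\n",
       $row['students'], $row['total_logons'], $mean);

// Active key performances and the number of distinct abilities they connect to.
$result = $db->query(
    "SELECT COUNT(DISTINCT kp.id) AS active_kps,
            COUNT(DISTINCT ka.ability_id) AS abilities
     FROM key_performances kp
     JOIN kp_abilities ka ON ka.kp_id = kp.id
     WHERE kp.status = 'active'"
);
$row = $result->fetch_assoc();
printf("%d active key performances connected to %d abilities\n",
       $row['active_kps'], $row['abilities']);

$db->close();
?>

Applied to the Fall 2003 figures in Table 4, the first query’s arithmetic is simply 12,385 log-ons divided by 1,534 students, or approximately 8.07 log-ons per student.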
A summary of these data can be found in Table 4. This summary of data indicates
that multiple types of data were collected and that not all types of data were collected each
semester. This makes it difficult to make analytic comparisons regarding increased use of
the DDP.
Qualitative Data Summary
Qualitative data were gathered by ERE through student interviews, faculty
surveys, and classroom observations completed from 2000 through 2003.
Student Interviews
In the spring and summer of 2002, ERE completed brief student perspective
interviews to explore students’ experiences with the DDP. Pilot interviews were conducted
with a small number of students earlier in the year to establish the protocol questions.
Beginning with a stratified random sample of 22 students, ERE was able to collect usable
transcripts from 11 of the interviews. ERE considered that the 11 students represented a
diversity of educational experience, time in the Alverno curriculum, and ethnic
Table 4
Quantitative Data Summary of Initial ERE Research (Taken from Numerous Educational
Research and Evaluation Department Documents, August 2000 to Fall 2003)

Spring 2000 Data Summary (August 2000): 582 student accounts; 27 active key performances; 13 different faculty created key performances; 22 faculty and staff trained in the DDP; May 1999 – entire faculty and staff attend a session introducing the DDP.

Fall 2000 Data Summary (January 2001): No log-on data available; 49 active key performances; 22 different faculty created key performances; 150 faculty and staff members attend training workshops on the DDP; 104 external assessors trained (Title III grant report 9/30/00).

Spring 2001 Data Summary (July 2001): 400 students logging in an average of 3 times; 53 active key performances; no other data available.

Fall 2001 Data Summary (January 2002): 900 students logged in an average of 5 – 6 times, with 15% logging in 10 or more times; 84 active key performances connected to 70 abilities and 14 advanced outcomes; 72 full-time faculty (69%) have logged into the DDP; 90% of external assessors used computers to complete feedback that was then uploaded to the DDP (Title III grant report 9/30/01).

Spring 2002 Data Summary (July 2002): No data available.

Fall 2002 Data Summary (January 2003): A total of 2,802 feedback documents have been uploaded by 84 faculty and staff (Title III grant report 9/30/02).

Spring 2003 Data Summary (July 2003): No data available.

Fall 2003 Data Summary (January 2004): 1,534 students logged onto the DDP a total of 12,385 times; the range of frequency of log-ons was 1 to 75, with a mean student log-on of 8.07.* 2,444 completed key performances; the range was 0 to 6, with a mean of 1.59 completed key performances per student.*

* Data taken from Pilot Study, Ehley, 2004
backgrounds. Despite the limitations of a small sample, ERE’s observations provided some
insights into the implementation of the DDP as experienced by students (Educational
Research and Evaluation Department, 2002, p. 1).
The interview protocol was based on the perspective interviews used in the
college’s longitudinal study in the 1970s and 80s.
The questions began by having students reflect on their educational experiences at the
college and then examine their activities with the DDP. Six of the ten interview questions
directly related to the DDP. These questions, along with the corresponding data and
conclusions from an internal report by the Educational Research and Evaluation
Department, are quoted below. These data are useful to make comparisons between early
student use and perceptions of the DDP and current student use and perceptions. The
interview questions used by the ERE in the 2002 student interviews were also used in this
study to provide a basis for analytic comparison.
Question 4: What kinds of things have you done on the DDP?
All of the students had some familiarity with the DDP, although some had
apparently logged in only briefly in a guided session with staff. Others were very much
veteran users. About half referred directly to the experience of uploading self
assessments. While others referred generally to uploading work in their classes, it is
likely that some of these are also referring to self assessments. However, the more
important observation may be the way in which a teacher’s guidance is seen as critical to
early experiences with the DDP.
Table 5
Results of 2002 ERE Student Survey Question: What kinds of things have you done on the
DDP?

Response Category | Example | Frequency
Enter self assessments | Upload the self assessment for the external | 6
Use DDP in a structured activity in course, assessment or other protocol (e.g. at entry) | Followed teacher’s instructions to enter and upload work | 5
Unique responses | Read feedback, just logging in, review procedures for using | 3 (1)

(Educational Research and Evaluation Department, 2002, p. 4)
Question 5: What stands out from your experience with the DDP? Why does that stand
out?
In the diversity of these responses, the access that is afforded by the DDP and the
emphasis on feedback and self assessment project the general utility that students
perceive. However, the observations about the lack of clarity and the lack of use are also
important indicators of the emphasis that students place on support from their teachers.
Table 6
Results of 2002 ERE Student Survey Question: What stands out from your experience
with the DDP?

Response Category | Example | Frequency
Access to records | Makes information on educational experiences available, go back and review feedback | 4
Lack of understanding and explanation | I don’t know what it is for; generally, instructors don’t know what it is for and use it according to specified procedures rather than to support reflection | 3
Value of feedback, self assessment | Getting instructor’s feedback offers closure; feedback helps in understanding outcomes, where I am going | 3
Learning to use in class | Experience of going through whole process with teacher; instructor’s use of template for assignments/self assessments | 2
Technical problems | There were some glitches; some problems logging in | 2

(Educational Research and Evaluation Department, 2002, p. 4)
Question 6: As you know, your DDP is accessible to you at any time. Have you found
yourself using it on your own outside the context of a particular course or assignment?
Could you give me an example? (If yes – What if anything, did you learn from reflecting
on the feedback and self assessment on the DDP?)
The uses outside of structured class activities vary between the general checking
in and targeted reflections to support learning.
Table 7
Results of 2002 ERE Student Survey Question: As you know, your DDP is accessible to
you at any time. Have you found yourself using it on your own outside the context of a
particular course or assignment?

Response Category | Example | Frequency
Yes, from home | From home, I can go on line when I am with family | 2
Yes, check on progress | Read feedback on particular abilities; foresee DDP use as a resume tool | 2
Yes, to check in | See if all materials have been entered | 1
Yes, to communicate with others | Log into DDP to help family, others understand what the education is about | 1
No | – | 4

(Educational Research and Evaluation Department, 2002, p. 4)
Question 7: In what ways have your experiences with the feedback and self assessment on
your DDP been alike or different from other ways you share feedback and self
assessment in the College?
A substantial portion of the sample specifically noted that the experience was not
so different. However, many of them found the online version valuable as a permanent
record, one that evoked good writing. A similar number saw that the DDP limited the
kind of feedback—personal, close to the event, and interactive, with more open
opportunities for self assessment.
Table 8
Results of 2002 ERE Student Survey Question: In what ways have your experiences with
the feedback and self assessment on your DDP been alike or different from other ways
you share feedback and self assessment in the College?

Response Category | Example | Frequency
Appreciates on line entry | Engages different kind of writing, more thoughtful; because it is permanent, tend to be more careful; provides a collective, systematic record | 6
Interactive, interpersonal quality of feedback is critical | DDP seems more impersonal; instructors give more personal feedback in other forms; face-to-face feedback has more power; structured, time-bound approach to writing self assessments can interfere with reflection | 5
Perceived as similar | Don’t see any difference from reading hard copy | 4

(Educational Research and Evaluation Department, 2002, p. 5)
Question 8: What purposes do you think the faculty had in mind when they designed the
DDP?
On this item, opinions were largely divided between a focus on efficiency of
storage and access, and an emphasis on improvement.
Table 9
Results of 2002 ERE Student Survey Question: What purposes do you think the faculty
had in mind when they designed the DDP?

Response Category | Example | Frequency
Better storage, convenience of access | More secure permanent record; better access for students; have paperwork more accessible | 5
Support student reflection | Support improvement; identify areas to really check on; help students evaluate work | 5
Unique responses | For instructor’s own planning; demonstrate abilities for post-college purposes; replace ACRJ (Academic Career Resource Journal) | 3

(Educational Research and Evaluation Department, 2002, p. 5)
Question 10: If you could tell the DDP design team one thing what would that be?
While there were significant concerns about making use of the DDP more simple
and direct, there was substantial concern over supporting the faculty in the effective use
of the DDP.
Table 10
Results of 2002 ERE Student Survey Question: If you could tell the DDP design team one
thing what would that be?

Response Category | Example | Frequency
Simplify functions, particularly for uploading | Deal with problems of formatting; make it easier to understand and use; make it easier to enter self assessments | 5
Better support from faculty | Need to use more often, with structure support; better training for instructors; make a handbook and educate teachers in using; have instructors stress the importance of using, at least in freshman year | 5
Create opportunities for revision | Have a period after entering, when I can go back and edit/correct | 1

(Educational Research and Evaluation Department, 2002, p. 6)
Preliminary Observations (excerpts quoted from ERE internal report). Overall, these
responses show a dispersion of perspectives on the DDP and its use; some students have
very little experience with the DDP, even after a couple of semesters; others have had
very good experiences and see it as a powerful tool, but these are usually students whose
experiences have been supported by careful faculty instruction.
In general, the students come readily to the experience of the Diagnostic Digital
Portfolio, but seem to benefit greatly from having their early use mediated, modeled, or
supported through instruction. For those with positive experiences, the perspectives
divide roughly between those with good experiences, typically involving their access of
the records, and those with very good or even great experiences who are ready to make
enduring use of reflective learning.
Distributing the responses according to the questions may underestimate some
dimensions of the responses. For example, taking all the perspectives together, there was
frequent expression that the DDP might be a good thing but that there was not enough
exposure and integrated use to make this really happen. In the tables above, this shows
up in the expressions like those regarding confusion of purpose and the need for better
faculty development. However, in the actual interviews, these were very substantive
parts of the discussion (Educational Research and Evaluation Department, 2002, pp. 6-7).
A document on Research and Evaluation Activities 2001 – 2002, prepared by
William H. Rickards, Senior Research Associate, describes student experience with the
DDP as falling into three categories. These categories include:
1. Introductory: This category involves tasks like logging on, exploring sections,
preparing and uploading self assessments, and reading feedback. This
category is guided by a faculty or staff member who works closely with the
student and directs procedures. This type of session occurs at entry to the
college and in courses in the first few semesters. It may also occur at later
points if the faculty are introducing specialized applications.
2. Supported Use: This category is linked to particular activities in a course, with
the faculty designating the use of the DDP for a particular purpose and
providing instruction as needed. The difference in this category is that the
students’ primary applications occur independent of course time and
supervision. Examples of this type of use include: the English Department’s
use of the developing reading list, AC 301 Mid-Program Review, and GEC
300. In these cases, the technical facility is a part of the student’s meta-strategies for learning and she has the opportunity to explore and define its
uses.
3. Student constructing and creating her own uses: At some point, students
develop their own patterns and applications, integrating these with their own
active engagement in their education. This category can include individual
storage strategies (readily accessible materials) or successive entries used as a
means to identify the student’s own needs and targets for development
(Rickards, 2002, p. 3)
Faculty Surveys
The Educational Research and Evaluation (ERE) Department administered two surveys to faculty. The
first survey, completed in May 2002, focused on faculty use of technology, including the
use of the DDP. The second survey was administered in May 2003 to provide
information to the Academic Affairs office on the underlying interest in distance
learning.
In the first survey, faculty were asked to identify courses they currently taught,
check technologies used (word processing, email, Internet research, course management
software, and the DDP), and briefly describe the nature of the technology used. They
were also asked to identify any other software or technology application that they used in
each course and briefly describe its use. ERE reported that 47% of faculty completing
the survey used the DDP in their course. In addition to identifying the type of technology
the faculty were using, they were asked to briefly identify the nature of the use. Out of
the 35 comments collected concerning the DDP, 23 comments indicated faculty were
using the DDP in their courses, 11 comments referred to planning to use the DDP in the
courses in the near future, and one comment indicated the faculty member did not like the
DDP and questioned its effectiveness.
In the second survey (2003), which was focused on identifying underlying interest
in distance education, faculty were asked to complete a grid for each course they would
teach during the next academic year. Each grid included a space for numerous
technologies, including the DDP, email, course management software, video
conferencing, and Internet research, to name a few. Although the questions on these two
faculty surveys were not directly comparable (due to differences in format and questions),
there were some similarities. Of the 79 faculty completing the survey, 75% reported
that they used the DDP for reflection in their courses.
Classroom Observations
Two classroom observations were done by the Educational Research and
Evaluation (ERE) Department, one in the fall of 2001 (CM 110 course) and one in the
spring of 2002 (AH 150). Both of these courses are general educational courses required
for most students at Alverno College. The observations were completed using a think
aloud protocol that invites a student to talk about what she is thinking as she is working
on the DDP.
In the fall 2001, ERE staff individually observed and queried 16 students at some
point during their DDP upload process. The observations were made during a class in
which the primary activity was to upload a final self assessment for a communications
seminar course (CM 110). The observers noted that students experienced technical
difficulties in working with the DDP. Some students had difficulty finding the right
key strokes to upload their self assessments. Others encountered problems ranging
from momentary confusion to “losing” a written self assessment. Those who
encountered some difficulty often noted that they were
comfortable with word processing, but not with the DDP. Overall, students seemed to be
completing the assignment on the DDP because they were asked by their instructor and
they saw it as a learning mechanism limited to the CM 110 (Communications: Integrated
Communication Seminar 1: Exploring Boundaries) course. The report sums up the
observation by saying, “In general, students did not seem to strongly differentiate the
DDP as distinct from working on the computer in another software platform, such as
word processing, other than that they were less familiar with it. At the same time, they
accepted it as a legitimate mechanism to deliver their self assessment and receive
instructor feedback” (Rogers & Reisetter Hart, 2001, p. 2).
A similar class observation using a think aloud protocol was completed on
another general education course, AH 150 (Arts and Humanities: Expressions and
Interpretations of Human Experience 1). In this think aloud interview and class
observation, six students completed their final performances in AH 150 and entered them
into the DDP along with their self assessments. Students had the option of completing the
upload to the DDP on their own or coming to class in the Computer Center. The observer
noted that, of the students interviewed, some were clearly comfortable with the process and
understood the commands and the system. A few students showed obvious frustration
with the process of completing the self assessment on-line, saving it, and uploading it to
the DDP. The observer also noted that a couple of students seemed to have devised their
own system for using the DDP. As one of these students was completing her final work
(a letter analyzing her perspective), she opened feedback from all her earlier assessments
that were stored in the DDP. She used the feedback from the earlier assessments to think
through any implications for her current work. The observer concluded that, “This was
not a procedure that she had been taught, but her sustained experiences with this teacher
and his encouragement for using the DDP seemed to be important factors, shaping the
context in which she developed her own patterns of use” (Rickards, 2002, p. 2).
ERE General Observations Concerning the DDP
The quantitative and qualitative data gathered on the DDP give some
understanding of DDP use and perceptions by students and faculty. In addition, the
College has found that the DDP can make it easier to gather data on its Ability-Based
Learning and Student Assessment-as-Learning philosophy. For example, in the summer
of 2000 the Educational Research and Evaluation Department (ERE), in collaboration
with the Social Interaction Ability Department, began an analysis of AC 151 – the first
social interaction outside-of-class assessment, which was done on the DDP by entering
students in the fall semester 1999. ERE analyzed the students’ self assessment along with
feedback documents (created by faculty, staff, and/or external assessors). In their random
sample of 50 students they made the following observations:
1. Overall, the self assessments in this sample showed the students taking on
more objective observations of their performances, with varying degrees of
readiness for description, analysis, and judgment.
2. Students were more likely to make judgments about the quality of the
performance rather than offer a careful description of what they had done.
3. Different assessor styles did not seem to influence the quality of the students’
written self assessment (ERE, 2000, pp. 2-3).
While this study could have been done using records previously kept in the Assessment
Center, the DDP provided easy access to the documents and an efficient method for
reviewing the documents.
In 2004 ERE began to explore the student as learner through integrated studies
with the DDP-based student learning examples as a data source. The focus of this study
was an external assessment taken by all weekday college students around the mid point of
their curriculum. AC 301, the Mid Program Portfolio Self Assessment, focused on
students reflecting on their past work, identifying strengths and challenges with respect to
the abilities, and creating a learning plan. Both of these studies (AC 151 and AC 301)
demonstrate the use of the DDP in gathering data on student learning across the
institution.
In December 2004, the Educational Research and Evaluation Department (ERE)
created an internal document, The Context for Learning Inquiries in the DDP and Similar
Portfolio Environments: Seven Propositions for an Unfinished Tool in which they
describe electronic portfolios as “unfinished tools,” because digital portfolios can be
designed to “…support reflection and learning, and yet their power only really comes into
play in the context of individual faculty and department practices and the processes that
individual students develop and employ” (ERE, 2004, p. 1). ERE listed seven
propositions on electronic portfolios that they have learned through their own experience
and research, along with collaborative work with others. These propositions are:
1. Portfolios come in many forms, but their distinctions are perhaps best
understood in terms of the learning theories that underlie their construction.
2. Benefits of portfolios derive from the students’ use. While this seems
obvious, it has great implications for how students are prepared and supported
in their use as well as for research that examines the effectiveness of the
portfolios, the related practices of faculty, and the support to faculty in their
portfolio-related practices.
3. Without particular instruction or guidance, students will tend to use portfolios
as resume-builders, that is, as a means of representing themselves as
competent, based on the evidence of their experiences.
4. The use of portfolios for more complex purposes is mediated by faculty
practices, embedded in courses and in the curriculum.
5. In cases where portfolios are consciously used as learning tools, their
effectiveness cannot be separated from faculty practices through which they
are implemented.
6. While individual faculty may use electronic portfolios in very creative ways,
their effectiveness cannot be separated from the faculty practices through
which they are implemented.
7. The designers and implementers of portfolios have an obligation to study the
tool and its effective uses. However, faculty who are not users are not likely
to be persuaded by data that an electronic portfolio is an effective addition to
their educational practices. While some faculty will see immediate ways in
which the electronic portfolio can be implemented in their teaching, in many
situations faculty will need to use the portfolio as fostering particular aspects
of the teaching and learning in their classes. This will occur through the
faculty’s own operations as much as through any particular design factor in
the tool itself. Consequently, the faculty need to jointly construct the uses of
the portfolio, with colleagues across the educational program (ERE, 2004, p.
1).
The institution’s research on the DDP has focused primarily on grant
evaluations, along with research on the DDP’s contribution to student learning. No data
have been gathered with respect to version 2.0 of the DDP (implemented in January 2004)
and whether or how it has affected student and faculty use and perceptions of the DDP.
Addressing these questions is the main goal of this study.
Self Reflection – Self Assessment
A consistent and key aspect in the research on digital portfolios is student
reflection and self assessment. Reflective portfolios are the most common type of
portfolio listed in the AAHE database. Most of the categories of portfolios listed in the
database include some form of reflection, with the possible exception of program
evaluation/institutional assessment. There are many views on the meaning of reflection.
Yancey synthesizes the definitions of reflection by Dewey, Vygotsky, and Polanyi as:
a process by which we think: reviewing, as we think about the products we
create and the ends we produce, but also about the means we use to get to
those ends; and projecting, as we plan for the learning we want to control
and accordingly manage, contextualize, understand. (Yancey, 1998, pp.
11 – 12)
For these authors, reflection requires the company of others, is a type of learning, and
requires divergent perspectives. In their words, “Reflection becomes a habit, one that
transforms” (Yancey, 1998, pp. 11 – 12).
Brew (1999) makes a clear statement of the relationship between self assessment
and reflection:
Self-assessment is usually concerned with the making of judgments about
specific aspects of achievement often in ways that are publicly defensible
(e.g. to teachers), whereas reflection tends to be a more exploratory
activity that might occur at any stage of learning and may not lead to a
directly expressible outcome. All self-assessment involves reflection, but
not all reflection leads to self-assessment. (p. 160)
Eisner (2002) discusses the need for student reflection focused on their comments
about their own work and the evidence they used to support their judgment. Eisner’s
views seem to mirror the self assessment process mentioned by Brew. Gathercoal et al.
(2002) state that a digital portfolio system invites self-evaluation and reflection and
allows students to “construct their own truth, reflecting on each artifact with many
mirrors (their peers, faculty, employers, supervisors and significant others)” (p. 2).
Alverno College has been recognized as a leader in using self assessment to assist
students to take charge of and evaluate their own learning. Alverno developed its
definition of self assessment from the ongoing study on the performance of Alverno
students. The definition of self assessment is related to the way Alverno defines student
assessment and focuses on performance that integrates knowledge and ability. “Self
assessment is the ability of a student to observe, analyze, and judge her performance on
the basis of criteria and determine how she can improve it” (Alverno College Faculty,
2002, p. 3). Alverno College spells self assessment without a hyphen to emphasize that
the self is not the object but the agent of assessment. Students are not assessing
themselves, but rather they are assessing their performance in a specific context. This
idea is also meant to assist the student in recognizing that faculty are not assessing her
personally, but are assessing her performance (Alverno College Faculty, 2002, p. 20). In
Learning that Lasts, Mentkowski and Associates (2002) state that when students integrate
performance and self-reflection they get a sense of “what I can do across settings, and
how I can improve” (p. 196). In the list of learning and action principles contained in
Learning that Lasts, the third principle is “Learning that lasts is self-aware and reflective,
self assessed and self-regarding” (p. 232).
Beneath Alverno’s concept of self assessment are a series of assumptions that
emerged from an experiential basis. These assumptions include:
1. Self assessment as integral to learning. Learning in the Alverno context is
essentially characterized by self awareness. If a student is to become a better,
self-determined learner, she needs to be self aware of the state of her own
learning, including what standards she needs to meet and how well she is
meeting them thus far.
2. Self assessment as developmental. When students are beginning to develop
their ability to self assess, they usually expect the teacher to recognize their
problems. As students develop some understanding of what self assessment
entails, they can become increasingly sophisticated in probing what they know
and what they need to do to improve.
3. Self assessment is based on public criteria. Alverno defines criteria as
representing a picture of the ability or abilities demonstrated in a performance.
They are “public” in that the criteria are explicit for all involved. Students are
called to make judgments on their performance, based on the criteria.
4. Self assessment enhanced by feedback. Students need to complete their picture
of a performance by considering others’ perception of it, based on a belief that
meaningful learning is interactive.
5. Self assessment elicited by multiple approaches. Because of the varied
learning styles, students should have access to multiple approaches to self
assessment. For example, students could use reflective journals or use self
assessment prompts provided by the faculty (Alverno College Faculty, 2000).
Cowen’s (1998) views on reflection are similar to Alverno’s concept of self
assessment. He classifies reflection as analytical or evaluative: “Reflection often involves
me in thinking how I did something – which is analytical. It can also involve me in
thinking about how well I have done something – which is evaluative” (p. 17).
Catherine Marienau (1999) reported findings that indicate a strong endorsement
of “self-assessment as an integral component of the curriculum wherein students engage
in self-assessments intentionally, regularly, and with consistent reinforcement for the
program” (p. 137). Alverno has found that the capacity to self assess becomes a key to
“…students’ ongoing learning and their transfer of learning to new contexts. These
factors endure after graduation and facilitate the transition to performing beyond college”
(Alverno College Faculty, 2000, p. 20).
Program Evaluation
There are numerous models of program evaluation. Most models describe
program evaluation in terms similar to those of Wholey, Newcomer, and Hatry:
“Program evaluation is the systematic assessment of program results and, to the extent
feasible, systematic assessment of the extent to which the program caused those results”
(2004, p. xxxiii). The term program refers to a set of resources and/or activities focused
toward one or more common goals. Owen (1999) uses an eclectic view of program
evaluation, which focuses on decision making and includes needs assessment,
benchmarking, and performance auditing under the purview of program evaluation. He
emphasizes the need to have access to knowledge that can influence a decision.
Owen describes evaluation as the process of: “negotiating an evaluation plan;
collecting and analyzing evidence to produce findings; and disseminating the findings to
identified audiences for use in: describing or understanding an evaluand [the object of the
evaluation]; or making judgments and/or decisions related to that evaluand” (p. 4). He
uses five evaluation forms, based on the “why” question, each with a defining orientation
and focus on a set of common issues that provide the framework for the planning and
conduct of the investigation. These five forms are: Proactive (evaluation takes place
before a program is designed), Clarificative (concentrates on clarifying the internal
structure and functioning of a program or policy), Interactive (provides information about
delivery or implementation of a program or about selected component elements or
activities), Monitoring (appropriate when a program is well established and ongoing), and
Impact (used to assess the impact of a settled program). Within each form there are a
series of orientations, typical issues, state of the program, major foci, timing and delivery,
key approaches, and assembly of evidence.
The form of program evaluation used in this study is Owen’s Interactive Form.
The orientation of this form is improvement. Typical issues include: What is the program
trying to achieve? How is this service going? Is the delivery working? Is delivery
consistent with the program plan? How could delivery be changed to make it more
effective? In the Interactive Form, programs are typically in development or evolving and
the evaluation is conducted during the program. These are the key approaches listed by
Owen for the Interactive Form of evaluation:
1. Responsive evaluation. This involves the documentation or illumination of the
delivery of a program. It is focused on process and takes into account the
perspectives and values of different stakeholders. It is oriented towards the
information requirements of audiences, usually the providers of the program.
2. Action research. This involves determining if the innovatory approaches to
delivery are making a difference.
3. Quality review. This approach is sometimes known as institutional self-study
and involves providing system level guidelines within which providers have a
large amount of control over the evaluation agenda.
4. Developmental evaluation. This approach involves working closely with the
program providers on a continuous improvement process, often on programs
that are innovatory and unique.
5. Empowerment evaluation. This approach involves assisting program providers
and participants to develop and evaluate their own programs, as part of a
broader goal of giving them more control over their own lives and destinies
(p. 45).
Two of these approaches are used in this study – responsive evaluation and
developmental evaluation.
Another variant of program evaluation specific to digital portfolios has recently
been advanced by Love, McKean, and Gathercoal (2004), focusing on evaluating digital portfolio
programs by their level of maturation. These levels are descriptions of developmental
stages of digital portfolio use, beginning with Level 1 (Scrapbook) and moving toward
the highest level, Level 5 (Authentic Evidence as the Authoritative Evidence). The focus
of Love, McKean, and Gathercoal’s work is the actual content and use of the digital
portfolio, rather than a rubric for evaluating individual portfolio products. The criteria
for ascertaining the level of maturation are listed in Table 11.
This table contains statements regarding system structure and functions that are
used to assess the level of maturation of a digital portfolio program. These overall
criteria for each level are further broken down by Love, McKean, and Gathercoal. They break
each of the five levels into additional categories such as: description, type,
organization, content, value to the employer, value to the student, and value to the
educator. At the first two levels, a comparison of paper portfolios, e-portfolios, and
webfolios is used. As Table 11 indicates, levels three and higher refer only to a Webfolio.
As part of this study, the DDP is compared to each of the five levels of maturation and
evidence is provided to determine the level of maturation of the DDP.
Table 11
Criteria for Ascertaining Levels of Maturation (Love, McKean, and Gathercoal, 2004, p. 27).
Each maturation level is listed with its format and its statement regarding system structure and function.

Level 1: Scrapbook (hard-copy, e-portfolio, or webfolio)
Students have no schema that guides the organization and artifact selection. A portfolio is really just a scrapbook of assignments completed in courses or awards received along the way.

Level 2: Curriculum Vitae (hard-copy, e-portfolio, or webfolio)
Student work is guided and arranged by educator, department, or institution determined curriculum requirements or standards and institution-wide “student life” contributions.

Level 3: Curriculum Collaboration (webfolio)
The student can contribute to the content structure within the departmental and program curricular framework or “student life” institutional showcase of achievements. The portfolio is a working and a showcase portfolio.

Level 4: Mentoring Leading to Mastery (webfolio)
Students can redeem their work multiple times based on feedback from a variety of interested parties: educators, mentors, administrators, parent/caregiver(s), employers, and recruiters.

Level 5: Authentic Evidence as the Authoritative Evidence (webfolio)
Work-sample assessment is linked to standards, program goals, and other descriptors like higher-order thinking taxonomies, and this data is retrieved for analysis at the individual, class, program, or institutional level.
Summary and Forecast
Digital portfolios began as a way to use technology to digitize paper-based
portfolios, which had their roots in the progressive education movement started a
century ago. Like the research on their paper-based counterparts, most of the research on digital portfolios
is general in nature and focuses on descriptions, types, categories, and implementation
strategies for digital portfolios. Most of this research is self-reporting or anecdotal and
usually concerns digital portfolio use in a specific discipline or series of courses.
One area that does seem to have substantive research concerning digital portfolios
is pre-service education. Digital portfolios used in pre-service teacher education are a
form of high stakes assessment and are evaluated as a final product. Carney (2004)
differentiates between portfolios whose primary purpose is assessment and those
portfolios designed to foster learning. In her study, Carney notes that research on digital
portfolios must go beyond anecdotal data and must include multiple sources of data and
triangulation of evidence.
A consistent and key aspect of research that directly connects to digital portfolios
is student reflection and self assessment. Alverno College has been recognized as a leader
in using self assessment to assist students to take charge of and evaluate their own
learning. The concept of self assessment is integral to Alverno Ability-Based Learning
and a critical component to the Diagnostic Digital Portfolio, a student learning portfolio.
As the themes of this literature review indicate, there is limited substantive
research on student and faculty use and perceptions of digital portfolios. This study
attempts to provide some substantive research using Alverno’s Diagnostic Digital
Portfolio as an example of a student learning portfolio. Using Owen’s (1999) Interactive
Form of program evaluation, this study gathered data from the DDP relational database,
student and faculty surveys, and follow-up student and faculty interviews to evaluate
student and faculty use and perceptions of the DDP.
CHAPTER THREE: RESEARCH DESIGN
The use of digital portfolios in higher education has increased significantly in the
last few years. Despite this boom, research on digital portfolios has focused mainly on
descriptions, types, categories, implementation, and programs under development.
Research on digital portfolio use by students and faculty, including their perceptions of
digital portfolios’ usefulness, benefits, and drawbacks, is limited. This study addressed the
question of the use of Alverno College’s Diagnostic Digital Portfolio (DDP) by
describing and evaluating student and faculty use and perceptions during the spring, 2005
semester using a program evaluation methodology.
Wholey, Hatry, and Newcomer describe program evaluation as “the systematic
assessment of program results and, to the extent feasible, systematic assessment of the
extent to which the program caused those results” (2004, p. xxxiii). They note that
program evaluation includes “…ongoing monitoring of programs, as well as one-shot
studies of program processes or program impacts” (Wholey, Hatry, Newcomer, 2004, p.
xxxiii). They also point out that for program evaluation to be useful and worth its cost, it
“…should not only assess program results but also identify ways to improve the program
evaluated” (2004, p. xxxiv). This study researched the use of the DDP and student and
faculty perception of the DDP. It also sought to identify ways to enhance and improve the
DDP.
There are many different types or forms of program evaluation. Wholey, Hatry,
and Newcomer (2004) describe several types of program evaluation, including:
Evaluability Assessments (used to evaluate program designs and explore program
reality), Implementation Evaluation (assessing the need for and feasibility of the
program, planning and designing the program, program implementation, and program
improvement), Performance Monitoring (primarily used in service areas), and Quasi-Experimentation and Random Experiments (2004, pp. 3 – 5).
Owen (1999) describes a different type of program evaluation involving various
“forms” of evaluation. These include: Proactive Form (evaluation takes place before a
program is designed), Clarificative Form (concentrates on clarifying the internal structure
and functioning of a program or policy), Interactive Form (provides information about
delivery or implementation of a program or about selected component elements or
activities), Monitoring Form (appropriate when a program is well established and
ongoing), and Impact Form (used to assess the impact of a settled program) (1999, pp. 40
– 49). This study applies the Interactive Form of program evaluation methodology.
Purpose of Study
The purpose of this study was to address the question of the use of the DDP, first
implemented at Alverno College in October of 1999 and redesigned in January of 2004.
This study examined undergraduate student and faculty use and perceptions of the DDP,
focusing on several sub-questions. These sub-questions include:
1. How often do students and faculty log onto the DDP?
2. What do students and faculty do when they log onto the DDP?
3. What features of the DDP are perceived by students and faculty as useful or
not useful?
4. Overall, what are students and faculty perceptions of the usefulness of the
DDP?
5. What do students and faculty think about the ease of use of the DDP?
6. What are students and faculty perceptions concerning their frequency of use
of the DDP?
7. What suggestions do students and faculty have on: (a) improvement of the
usefulness of the DDP, (b) assistance in using the DDP more, (c) general ideas
for improvement of the DDP, and (d) additional comments on the DDP?
Besides focusing on student and faculty use and perceptions of the DDP, this
study analyzed key performances that were active (available for student use) during
spring, 2005. The analysis of active key performances focused on four sub-questions:
1. How many active key performances are being used by students?
2. What discipline departments have completed key performances?
3. How are completed key performances connected to the abilities?
4. How are completed key performances connected to other matrices?
Of the program evaluation approaches described earlier, this study primarily
utilized Owen’s Interactive Evaluation approach, focusing on implementation and
delivery of the DDP. This study also combined the Interactive evaluation form with some
aspects of Wholey, Hatry, and Newcomer’s Evaluability Assessment to explore the
program (DDP) reality and its use at Alverno College.
The Interactive Form of evaluation is less concerned with end of program
analysis, since the key stakeholders “…never expect their program to be constant for
sufficient time to make a traditional Impact evaluation meaningful or useful. Instead,
program providers want evaluations which will support change and improvement”
(Owen, 1999, p. 222). Interactive evaluation relies on intensive onsite study including
observations, surveys, and interviews. At the provider level (the level of this study),
typical questions in Interactive evaluation include: What actually happens in this
program? What are practitioners doing that is working well? What is not working so
well? How are students affected by the program? How could we generally improve the
program for the future (Owen, 1999, p. 93)?
The typical questions in Interactive evaluation directly relate to the main purpose
of this study – to address the question of DDP use by describing and evaluating student
and faculty use and perceptions of the DDP. In order to support change and improvement,
with respect to the DDP, the College must know how students and faculty are using the
DDP, what they find useful or problematic, and what student and faculty perceptions of
the DDP are as an educational tool.
This study used a three-prong approach to gathering data on the participants in
this study, Alverno College students and faculty.
1. Mining data in the DDP relational database (quantitative). These data are
statistical in nature and were used to address the Interactive evaluation
questions of: What actually happens in this program? How is the service
going? Is the delivery consistent with the program plan?
2. Surveying students and faculty (quantitative and qualitative) concerning their
use and perceptions of the DDP. These data were used to address these
Interactive Form questions: What are practitioners doing that is working well?
What is not working so well? What are students and faculty perceptions of the
program?
3. Interviewing students and faculty (qualitative) concerning their use and
perceptions of the DDP. These data items were used to triangulate the data
gathered in the first two approaches and to expand the understanding of
suggestions for program improvement.
Participants
The participants for the three approaches were Alverno College students and
faculty. This included all enrolled undergraduate students and all faculty, full-time or
part-time, who used the DDP during the spring, 2005 semester.
Data Mining of the DDP Relational Database
Data mining participants included all students and faculty who logged onto the
DDP during the spring, 2005 semester (January 1 – June 16, 2005). This included all
current Alverno College undergraduate students and all faculty, full and/or part time.
Survey of Students and Faculty
Students were surveyed at three levels: beginning (semesters one and two);
intermediate (semesters four and five); and advanced (semesters seven and eight).
Beginning students were surveyed in connection with general education communication
seminars. The majority of entering students, except for advanced transfer students, are
required to take at least one communication seminar during their first two semesters at
the College. Surveys were administered during the second half of the semester to
beginning students in both Weekday and Weekend college programs.
Intermediate students were surveyed during two outside-of-class assessments
required of all students and taken primarily during students’ fourth or fifth semesters.
Weekday students were surveyed during AC 301 (Mid Program Portfolio Self
Assessment). This assessment is administered in the beginning, middle, and near-end of
the semester. To give students an opportunity to use the DDP during the spring, 2005
semester, only Weekday students taking AC 301 (Mid Program Portfolio Self
Assessment) in the middle and end of the semester were given the survey. Weekend
students were surveyed during AC 260 (Mid Program Portfolio Self Assessment), which is
administered during the last Weekend College session in May.
Advanced students were identified across a number of major capstone courses
(courses taken in the last two semesters) and surveyed in these courses. Division Chairs
identified the appropriate courses, the names of the faculty teaching these courses, and
gave their permission to contact those faculty.
Faculty were surveyed during the college Institute held in late May 2005. The
institutional assumption is that all full-time (non-sabbatical) and category II (faculty with
a percentage of full-time) faculty attended the Institute.
Student and Faculty Interviews
Qualitative data on student and faculty perceptions of the DDP were gathered
using follow-up scripted interviews of selected faculty and students. Participants for the
interviews were selected from surveys where the final question (Would you be interested
in participating in a follow-up interview on the DDP?) was answered yes and the
student/faculty name was listed. Interview questions were developed from those used by
ERE with student interviews conducted in 2002 and from data gathered in the student and
faculty surveys. The focus of the interview questions was to explain/expand on major
themes identified in current survey data.
Procedures and Measures
The collection techniques mirrored each of the three approaches: (a) mining of the
DDP relational database, (b) student and faculty surveys, and (c) follow-up interviews of
both faculty and students. The data gathered included both quantitative (data mining and
certain survey questions) and qualitative (survey questions and interviews).
Data Mining of the DDP Relational Database
The DDP relational database was used to gather data on student and faculty use of
the DDP. This included: the number of times students and faculty logged on during the
semester, the number of key performances completed by students, feedback uploaded by
faculty, the number of files uploaded by students and faculty, and the number of active
key performances created by faculty and/or departments. In addition, data were gathered
on completed key performances and their connection to the various matrices (ability and
advanced outcomes).
The data mined from the DDP database were in a variety of formats and from a
variety of relational tables. The user_log provided data on the users (faculty and
students) logging on and file uploads. This table, which is connected to the user table,
provided information on the user type (faculty or student), username, and status (active,
disabled, grad). Student information (college ID, date entered, anticipated graduation
date, majors and supports) which is stored in the student_info table, was connected to the
user_log to gather information on students’ programs, majors, and support areas (minors).
Student information data were also used to remove all graduate students from this study.
Data on key performances (date created, status, department and creator) found in the
key_perf_design table, were linked to the key_perf table that contains data on all
completed key performances (student ID, date assigned, date completed). This provided
information on the number, type, and department of key performances completed during
the spring, 2005 semester. These data were connected to the key_perf_matrix_links table
to analyze the connections of key performances to the various matrices (abilities and
advanced outcomes).
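To make the data mining step concrete, the sketch below illustrates the kind of join-and-count queries this description implies. It is an illustration only: the table names (user_log, user, student_info, key_perf, key_perf_design) come from the description above, but the join keys, column names, stored date format, and the use of a local SQLite copy are assumptions rather than the DDP's actual schema.

# Illustrative sketch only; the DDP schema details below are assumed, not documented.
import sqlite3
import pandas as pd

conn = sqlite3.connect("ddp_copy.db")  # hypothetical local copy of the DDP database

# Count spring 2005 log-ons per user, excluding graduate students.
logon_query = """
SELECT u.user_type,
       u.username,
       COUNT(*) AS logon_count
FROM   user_log AS ul
JOIN   user AS u            ON u.username = ul.username      -- join key assumed
LEFT JOIN student_info AS s ON s.username = u.username        -- join key assumed
WHERE  ul.log_date BETWEEN '2005-01-01' AND '2005-06-16'      -- column name assumed
  AND  (u.user_type = 'faculty' OR s.program <> 'graduate')   -- graduate filter assumed
GROUP BY u.user_type, u.username
"""
logons = pd.read_sql_query(logon_query, conn)

# Count key performances completed during the semester, by owning department.
kp_query = """
SELECT d.department,
       COUNT(*) AS completed
FROM   key_perf AS k
JOIN   key_perf_design AS d ON d.design_id = k.design_id      -- join key assumed
WHERE  k.date_completed BETWEEN '2005-01-01' AND '2005-06-16'
GROUP BY d.department
ORDER BY completed DESC
"""
completed_by_dept = pd.read_sql_query(kp_query, conn)

print(logons.groupby("user_type")["logon_count"].describe())
print(completed_by_dept.head())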
Survey of Students and Faculty
A survey was used to gather data on student and faculty perceptions on their use
of the DDP, the usefulness of various aspects/features of the DDP, the ease of use of the
DDP, and suggestions for enhancements and improvements of the DDP. The survey
enabled data to be gathered from a cross section of students and the majority of faculty.
The surveys included demographic questions (categorical data), Likert Scale
questions (ordinal data), and open-ended questions. The questions were focused on
student and faculty perceptions of their use of the DDP (number of times they logged on
in a typical month and number of key performances created), their use of features of the
DDP (uploading files, using the My Resource and the Reference areas), their perceptions
of the usefulness of DDP features, suggestions for enhancing the DDP, and the ease of
use of the DDP.
The questions for the student and faculty survey were modeled after survey
questions used in the Flashlight Program headed by Dr. Steve Ehrmann. Dr. Jeana
Abromeit, Co-Chair of the Council on Student Assessment at Alverno College, provided
advice concerning the wording of the questions. Dr. Abromeit, Professor of Social
Science, has created numerous student and faculty surveys for the institution.
Questions for the student and faculty surveys were similar. Both contained
demographic questions, perceptions of number of log-ons, evaluation of nine features
(the features depended on student or faculty access), overall perceptions of usefulness, ease of
use and frequency of use, as well as suggestions for enhancement/improvement of the
DDP. A copy of the student survey is listed below (the survey has been reformatted to fit
this document’s format).
Diagnostic Digital Portfolio (DDP) Survey
We would appreciate your participation in a study to gather information on DDP use, what you think of the
DDP and what aspects of the DDP seem useful or difficult to you. The survey will take approximately 15
minutes. All survey results will be completely anonymous and will not be used for any other purpose. By
completing this survey, you agree to be part of this study. If you would like to be a part of a follow-up
interview, you can indicate this by checking “yes” and adding your name to the last question in the survey.
In order to analyze the results of the survey we would like to gather some demographic data. Please fill out
the following questions to the best of your ability.
1. What general program are you currently in?
      WEC – weekend college
      WDC – weekday college
2. What is/are your majors and supports?
      Major(s) _____________
      Support(s) ____________
3. Do you live on campus?
      Yes      No
4. Are you currently a:
      Full Time Student?      Part Time Student?
5. How many semesters have you been at Alverno? _______

Now, we would like to ask you some questions on your use and perceptions of the DDP. Please answer the
following questions to the best of your ability.

6. During a typical month how many times do you log into the DDP? ________
7. Approximately how many Key Performances in the DDP have you completed this semester?
      0      1      2      3      4 or more

When you log onto the DDP, how often do you: (Mark an X in the appropriate column for each item)
(Columns: Do not know what this is / Never / Occasionally / Often / Very Often)
8. Add a key performance to the My Work area?
9. Upload a self assessment?
10. Check feedback for a key performance?
11. Review past key performances?
12. Use the My Resource Area?
13. Use the Reference area?
14. Attach a key performance to a matrix?
15. View a video of your work?
16. Use the Help Menu?

For you, how useful are the following functions/aspects of the DDP? (Mark an X in the appropriate column for each item)
(Columns: Do not know what this is / Not Useful / Occasionally Useful / Often Useful / Very Useful)
17. Accessing the DDP from off campus
18. Accessing my work and self assessments
19. Accessing my feedback
20. Reviewing past key performances
21. Using the My Resource area
22. Using the Reference area
23. Attaching a key performance to a matrix
24. Viewing a video of your work
25. Using the Help Menu

26. Overall, what is your opinion of the usefulness of the DDP?
      1______2______3______4_______5
      Not Useful          Useful          Extremely Useful
      Please explain:
27. What do you think could enhance the usefulness of the DDP?
28. Overall, how easy is it for you to use the DDP?
      1______2______3______4_______5
      Not Easy          Easy          Extremely Easy
      Please explain:
29. What do you think could help you use the DDP more?
30. In your opinion, are you being asked to use the DDP
      1______2______3______4_______5
      Not Enough          Enough          Too Much
      Please explain:
31. What are your suggestions for improving the DDP?
32. Do you have any additional comments on the DDP that you would like to share?
33. Would you be interested in participating in a follow-up interview on the DDP? Yes ___ No ___
      If YES please give us your name ___________________________________

Thank you for participating in this survey!
The creation of the student and faculty surveys followed techniques outlined by
Suskie (1996). Suskie’s techniques used in the survey included (1996, pp. 44 – 51):
1. Keep it short. The survey was one two-sided page.
2. Make sure each item asks only one question. All survey questions were
checked by this researcher and Dr. Jeana Abromeit for the use of “and” and
“or”.
3. Keep it readable. The use of jargon was kept to a minimum. When specific
terms from the DDP were used, a response option of “Do not know what this is”
was included.
4. Make all definitions, assumptions, and qualifiers clearly understood. The
survey was piloted with students and faculty to test the clarity of questions
and definitions.
5. Avoid making significant memory demands. Questions about logging onto the
DDP or about completed/created key performances were phrased in
relationship to a typical month or this semester. A series of choices was
included (none, 1, 2, 3, 4 or more than 4).
6. Make items easy and fast to answer. The format for the survey included short
questions and simple Likert Scales. The format of the Likert Scale questions
allowed respondents to mark anywhere along a line. Scores were rounded to
the nearest point or half point to standardize the data.
7. Keep it interesting. The format of the survey was varied so that it included
demographic data, Likert Scales in the form of tables, sliding scales
(usefulness and ease of use), and free response questions.
8. Avoid biased, loaded, leading, or sensitive questions. Questions were phrased
in terms of the opinion of the participant (For you, how useful is…, Overall
what is your opinion of…). For Likert Scale questions concerning components
of the DDP a choice of “Do not know what this is” was provided.
To help establish internal consistency, several questions were asked in different ways.
For example, participants were asked to rank the usefulness of components of the DDP
and were then asked to rank the overall usefulness of the DDP. Cronbach alpha reliability
estimates were used to analyze the internal consistency of the survey.
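A minimal sketch of these two computations appears below, assuming item responses are held in a small two-dimensional array: marks made along the continuous 1-5 line are rounded to the nearest half point, and a Cronbach alpha estimate is calculated for a set of related items. The response values are invented for illustration only and do not come from the study's data.

# Sketch only: rounding line-scale marks and estimating Cronbach's alpha.
import numpy as np

def round_to_half(x):
    """Round marks made anywhere along the 1-5 line to the nearest half point."""
    return np.round(np.asarray(x) * 2) / 2

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = related survey items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    sum_item_variances = items.var(axis=0, ddof=1).sum()
    total_score_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - sum_item_variances / total_score_variance)

# Invented responses to a handful of related usefulness items, one row per respondent.
responses = round_to_half([
    [3.2, 4.1, 3.8, 4.4],
    [2.6, 3.1, 2.9, 3.4],
    [4.7, 4.9, 4.6, 5.0],
    [3.9, 3.6, 4.1, 4.2],
])
print(cronbach_alpha(responses))  # values near 1 indicate high internal consistency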
According to Suskie (1996), there are four basic ways that survey researchers can
develop evidence of validity (1996, pp. 57 – 59). These methods are listed below, along
with how they were incorporated into the survey used in this research.
1. Compare or correlate survey results with results from a variety of other
measures and data collections methods.
a. The results of the survey were triangulated with results from the data
mining and the follow-up interviews.
2. Compare results from diverse groups to see if differences match what others
have found.
a. For students, three basic categories were used to sample students
across the curriculum: beginning, intermediate, and advanced. Within
these groups the results were compared and data analysis done on
differences between beginning, intermediate and advanced students.
b. For faculty, a comparison was made between part-time and full-time
faculty, and the length of time the faculty have been at Alverno.
3. Have people with diverse backgrounds and viewpoints review the survey
before it is administered.
a. The surveys were reviewed by students who had varying degrees of
familiarity with the DDP. Feedback from the pilot survey was
incorporated into the final version of the student survey.
b. Volunteer faculty reviewed the survey, including faculty with varying
degrees of familiarity with the DDP. Feedback from these faculty was
incorporated into the final version of the faculty survey. In addition,
Dr. Jeana Abromeit, Professor of Social Science and co-chair of the
Council on Student Assessment, reviewed the surveys and offered
suggestions and changes. Dr. Georgine Loacker, Professor of English
and past chair of the Council on Student Assessment, also reviewed
the survey and offered suggestions for clarifying the questions.
4. Pilot test the survey
a. The survey was piloted with both student and faculty groups.
Adjustments were made to the questions to improve clarity. For
example, in the first version of the survey, students and faculty were
asked their opinion of the usefulness of the DDP using a continuous
line scale ranging from 1 (not useful) to 5 (extremely useful). After
the scale the word “Comments” was used to elicit additional
information. Both students and faculty piloting the survey suggested
changing this to “Please explain”, since the word “Comments” was
frequently overlooked by those piloting the survey.
Student and Faculty Interviews
The questions for the follow-up interviews of students and faculty were created
using data gathered from the survey to assist in clarifying survey answers and identifying
issues. In addition, student interview questions were used from an ERE student interview
conducted in 2002.
Students and faculty were interviewed during the fall 2005 semester. Participants
for the interviews were selected from surveys where the final question (Would you be
interested in participating in a follow-up interview on the DDP?) was answered yes ( the
student/faculty name listed). To insure that students and faculty selected for the
interviews included diverse opinions concerning the DDP, a stratified process was used.
Students who self-selected for the interviews from each level (beginning,
intermediate, and advanced) were placed into three groups based on their general survey
comments concerning the DDP. The three groups included: negative towards the DDP,
positive towards the DDP, and neutral towards the DDP. For example, a student who
answered an open-ended question with “The DDP is so redundant and generally a waste
of time because the questions and answers are so formulaic. I wish it were more exciting”
was placed in the negative group. A student who responded “DDP is a great tool. It is
nice to be informed of the work done. I love the idea to have the video downloaded. It
helps to see where I have to work on” was placed in the positive group. For each level
(beginning, intermediate, and advanced students) one student was selected from each
group (negative, positive, neutral). Originally, nine students were to be interviewed.
However, the advanced student sample only included six students who responded yes to
the question of participating in a follow-up interview. Of these six, only two agreed to be
interviewed. A total of eight students were interviewed for this study.
A similar process was used for faculty interviews. Faculty who self selected to
participate in a follow-up interview were placed in the same three groups as students:
negative, positive, and neutral based on their survey responses. A total of six faculty were
interviewed, two from each group.
Data Analysis
The data gathered in this study were mixed, including both quantitative data (data
mining and some survey questions) and qualitative (some survey questions and follow-up
interviews). Data analysis includes:
1. Data mining. Measures of central tendency were used with the data mined
from the DDP database. Depending on the type of data, means, medians, and
standard deviations were used. Frequency distributions and bar graphs were
used where applicable.
2. Survey. Analysis of the survey data depended on the type of questions.
Categorical data were analyzed using frequency distributions, and bar graphs.
Cronbach alpha was used to test for internal consistency of the surveys.
Qualitative data were analyzed for common terms and themes using SPSS
Text Analysis software. A Thematic Conceptual Matrix, as described by
Miles and Huberman (1994), was used to display the results of each open-ended response question on the survey (seven questions). According to Miles
and Huberman (1994), a Thematic Conceptual Matrix is “…most helpful
when you have specified, or discovered, some clear conceptual themes”
(1994, p. 131). A Thematic Conceptual Matrix uses an ordering principle of
conceptual themes and has its rows and columns arranged to bring together
items that “belong together” (1994, p. 127).
3. Interviews. Qualitative data gathered from the interviews were coded and
searched for patterns of response using SPSS Text Analysis software. In
addition, a Thematic Conceptual Matrix, as described by Miles and Huberman
(1994), was used to display the results of student and faculty interviews.
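As a simple illustration of the descriptive summaries named in item 1, the sketch below computes a mean, median, standard deviation, and frequency distribution for a set of invented monthly log-on counts. It does not reproduce the study's actual analysis (the figures reported later were drawn from the DDP database, and the qualitative coding relied on SPSS Text Analysis); it only indicates the kind of computation involved.

# Sketch of the descriptive summaries named above, using invented log-on counts.
from collections import Counter
from statistics import mean, median, stdev

logons = [0, 2, 3, 3, 5, 1, 4, 2, 2, 6, 0, 3]  # hypothetical monthly log-ons per student

print(f"mean={mean(logons):.2f}  median={median(logons)}  sd={stdev(logons):.2f}")

# Frequency distribution (the counts a bar graph would plot).
for count, freq in sorted(Counter(logons).items()):
    print(f"{count} log-ons: {freq} students")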
In addition to using the above mentioned data analysis techniques, the researcher
used the data gathered in this study to assess the DDP’s level of maturation for digital
portfolios as described by Love, McKean, and Gathercoal (2004). The DDP was analyzed
using the system structure and functions of each level. Supporting evidence was provided
as to how the DDP met or did not meet each level’s statement.
Limitations
Due to the specific nature of the DDP and its integration into Alverno College’s
teaching, learning and assessment philosophy, the results of this study might not be
generalizable to other digital portfolio programs. Currently, the DDP is only being used
by Alverno College. However, version 2.0 of the DDP is a customizable version and
several institutions are in negotiations with Alverno College to use it at their institutions.
The data gathered by this study could be useful to these interested institutions or
institutions with similar attributes.
There are limitations concerning student participation in the survey. For example,
students could have been absent during administration of the survey. Intermediate
students are required to take AC 260 Mid Program Portfolio Self Assessment (Weekend
College) and AC 301 Mid Program Portfolio Self Assessment (Weekday College).
However, some students do not attend the assessment and are classified as no-shows. For
example, in the spring, 2005 semester 76.4% (55 students) of students registered for AC
260 (Weekend College) attended the assessment, while 73.9% (54 students) registered for
AC 301 (Weekday College) attended the assessment. As described earlier in this study,
surveys were not distributed to students who took AC 301 early in the semester (8
students). A total of 91 intermediate students (90.1%) participated in the survey (48 AC
260 students and 43 AC 301 students).
Identification of advanced students was somewhat problematic in that some
advanced courses contain both juniors and seniors, and not all advanced courses originally
selected completed the survey. Of the 16 courses originally identified as advanced
courses and sent the survey, seven completed and returned it (61 students).
A total of 172 beginning students, 91 intermediate students, and 61 advanced
students participated in the survey (324 in total). Other limitations of the survey include that only
one form of the survey was used and that the survey was administered only once. Depending
on the respondent, there may be a perception of controversy surrounding the use of the
DDP, and multiple administrations of the survey could have lessened this limitation.
Participation in the interviews was self-selecting for both students and faculty.
This could result in bias – either for or against the DDP. Interviewees’ answers could also
be affected by their mood, motivation, fatigue, and time constraints.
Ethics
The ethics of this study revolved around ethical considerations outlined by Suskie
(1996). Suskie listed eight key points taken from a variety of standards including the
Association for Institutional Research’s Code of Ethics (pp. 1-2). Although the context of
these key points focused on survey research, five of the eight points are applicable to any
research. Key points relevant to this study are listed below, each followed by a description
of how the three data gathering approaches used in this study addressed it.
1. Strive to conduct a survey in a manner that is free of potential bias. Minimize
potential sources of bias and disclose factors that may bias the results.
a. Database mining. There is little potential bias here, since the data were
taken from usage and completed key performance database tables.
b. Surveys. Participants were informed (in the directions) that the survey
would be confidential, unless they chose to participate in a follow-up
interview. Surveys were administered by course instructors or
assessment center staff and completed surveys were returned to this
researcher. However, no notes were made by the instructors or the
staff on students declining to complete the survey.
c. Interviews. Due to the self-selecting nature of the interviews,
participants who selected to participate could be biased. This bias could
be either positive or negative toward the DDP. For example, a participant
who thought the DDP was a very good tool could elect to be
interviewed, introducing a positive bias. The interview process (asking
for examples) assisted in identifying this bias.
2. Protect the rights of privacy of those who are surveyed, and protect the
confidentiality of individually identifiable information.
a. Data mining. Although individual users could be identified by their
institutional ID number, data mined from the DDP used a DDP key
identifier to maximize privacy.
b. Surveys. The surveys were anonymous, unless the individual self
selected to be part of a follow-up interview. Survey results only
included the category of student (beginning, intermediate, or
advanced) to help protect privacy. A unique identifier was assigned to
each completed survey.
c. Interviews. Participants self-selected for the interviews, based on their
answer to the final survey question (Would you be interested in
participating in a follow-up interview on the DDP?). Student and
faculty participants were given a unique numeric ID and category
(positive, negative, or neutral). Student interviewees were also labeled
by their student group – beginning, intermediate, or advanced.
3. Avoid harming, humiliating, embarrassing, or seriously misleading
respondents.
a. Data mining. Students and faculty are referred to by their DDP ID
number, thus protecting their privacy and avoiding any
embarrassment.
b. Survey. The nature of the survey questions sought to minimize any
harm or embarrassment to the participants. Participants also had the
option of not answering particular questions.
c. Interviews. Interviewees had the option of not answering interview
questions and all data were confidential.
4. Avoid the fraudulent use of copyrighted materials.
a. The survey was created by this researcher and any materials used in
this research study have been cited.
5. Disclose the results of the study (Suskie, pp. 1 – 2).
a. Results of this study will be available to faculty and students, via web
sites, internal publications, and presentations made to both faculty and
students at the conclusion of the study.
Summary
An Interactive program evaluation methodology was used to study the question of student and faculty use and perceptions of the DDP. Data for this study were gathered from three sources: (a) data mining (January 1, 2005 to June 20, 2005) of the DDP relational database to describe student and faculty log-ons, completed key performances, active key performances, and the connections of active key performances to the various matrices; (b) a survey of students and faculty to explore their use and perceptions of the DDP (spring, 2005); and (c) a follow-up interview with students and faculty (fall, 2005) to further explore their use and perceptions as well as to triangulate the data. Kumar explains the concept of triangulation as a procedure in which "researchers make use of multiple and different sources, methods, investigators, and theories to provide corroborating evidence" (p. 202).
This study used three data-gathering approaches as the procedure for
triangulation. A total of 172 beginning (semesters one and two), 91 intermediate
(semesters four and five), and 61 advanced (semesters seven and eight) students were
surveyed using beginning communication courses, mid-program external assessments,
and advanced courses. A total of 93 faculty were surveyed during an all-college Institute
in May 2005. Interviews of students and faculty were conducted during fall, 2005. Eight
students (three beginning, three intermediate, and two advanced) were interviewed, along
with six faculty members. The results of each approach were analyzed in order to address
the sub-questions of this study.
CHAPTER FOUR: RESEARCH RESULTS
Presentation Approach
The data analyzed in this paper were collected as a part of a study to address the
question of the use of Alverno College’s Diagnostic Digital Portfolio (DDP) by
describing and evaluating undergraduate student and faculty use and perceptions of the
DDP. This study used a program evaluation methodology that included data gathered
from the DDP relational database, student and faculty surveys, and post survey interviews
of students and faculty. The data analyzed in this study were gathered during the spring,
2005 semester (database and student/faculty surveys), with follow-up interviews
completed during the fall 2005 semester.
This study examined undergraduate student and faculty use and perceptions of the
Diagnostic Digital Portfolio (DDP), focusing on several sub-questions. These sub-questions include:
1. How often do students and faculty log onto the DDP?
2. What do students and faculty do when they log onto the DDP?
3. What features of the DDP are perceived by students and faculty as useful or
not useful?
4. What are students and faculty perceptions of the overall usefulness of the
DDP?
5. What are students and faculty perceptions of the ease of use of the DDP?
6. What are students and faculty perceptions concerning their frequency of use
of the DDP?
7. What suggestions do students and faculty have on: (a) improvement of the
usefulness of the DDP, (b) assistance in using the DDP more, (c) general ideas
for improvement of the DDP, and (d) additional comments on the DDP?
Besides focusing on undergraduate student and faculty use and perceptions of the
DDP, this study analyzed key performances that were active on the DDP (available for
student use) during the spring, 2005 semester. The analysis of active key performances
focused on the following sub-questions:
1. How many active key performances are being used by students?
2. What discipline departments have completed key performances?
3. How are completed key performances connected to the abilities?
4. How are completed key performances connected to other matrices?
An Interactive form of program evaluation methodology described by
Owen (1999) was used in this study. The Interactive form relies on intensive onsite study, including observations, surveys, and interviews. Interactive program
evaluation approaches used in this study include responsive evaluation (taking
into account the perspectives/values of the stakeholders) and developmental
evaluation (working with the program providers on a continuous improvement
process). A three-prong approach was used to collect data in this study:
1. Mining of the DDP relational database (log-ons, completed work,
connections to program and institutional outcomes, and student and
faculty use).
2. Surveys administered to students and faculty (full and part time).
3. Post-survey interviews of students and faculty.
The data gathered from these three approaches were analyzed and compared to
initial research on the DDP completed by Alverno’s Educational Research and
Evaluation Department (ERE). ERE gathered data during the early implementation of the
DDP on student log-ons, active key performances, and student perceptions of the DDP
through interviews (See Chapter 2).
Demographic Description of Sample
The participants for the three approaches were Alverno College students and
faculty, including all undergraduate students and all faculty, full-time or part-time, who
used the DDP during the spring, 2005 semester. The DDP relational database was used
to gather data on student and faculty use of the DDP, including the number of times students and faculty logged on during the semester, the number of key performances completed by
students, feedback uploaded by faculty, number of files uploaded by faculty, and the
number of active key performances created by faculty. In addition, data were gathered
from the DDP relational database on completed key performances and their connection to
the various matrices (ability and advanced outcomes).
Database Mining
All students and faculty who logged into the DDP during the spring, 2005
semester (January 1 to June 20, 2005) were included in the data analysis. There were a
total of 17,303 student log-ons to the DDP during spring, 2005 semester representing
1,893 different students. Faculty logged onto the DDP 3,953 times, representing 177
different faculty.
All test accounts were removed from the data before analysis. In addition, all graduate (Masters) students were removed from the data. These removals were accomplished by querying the student information data for major and program. However, of the data queried, 166 records had blank fields for student majors. These 166 records were included in the study. The data analyzed in this study could therefore include non-degree students, such as education licensure students or students pursuing certificate programs.
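As an illustration only, the screening step described above can be expressed in a few lines of Python with pandas. The file, table, and column names below are assumptions for the sketch and do not reflect the actual DDP schema or the software used in this study.

```python
import pandas as pd

# Hypothetical extracts; the real DDP table and column names are assumptions.
log_ons = pd.read_csv("ddp_log_ons.csv", parse_dates=["log_on_date"])  # one row per log-on
students = pd.read_csv("student_info.csv")                             # ddp_id, major, program

TEST_ACCOUNT_IDS = {"test01", "test02"}        # placeholder list of test accounts

merged = log_ons.merge(students, on="ddp_id", how="left")
merged = merged[~merged["ddp_id"].isin(TEST_ACCOUNT_IDS)]   # drop test accounts
merged = merged[merged["program"] != "Masters"]             # drop graduate students

# Records with a blank major are kept, as in the study, even though they may
# represent non-degree or certificate students.
spring_2005 = merged[(merged["log_on_date"] >= "2005-01-01") &
                     (merged["log_on_date"] <= "2005-06-20")]
print(spring_2005["ddp_id"].nunique(), "students logged on during spring, 2005")
```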
Survey of Students and Faculty
Students were surveyed in the spring, 2005 semester from three groups:
1. Beginning students – students in their first two semesters. Students were
surveyed in communication seminars, usually taken during the first two
semesters. A total of 172 beginning students were surveyed.
2. Intermediate students – students in semesters four and five. Intermediate
students were surveyed in external assessments usually taken during the fourth
or fifth semester. A total of 91 intermediate students were surveyed.
3. Advanced students – students in their last two semesters. Students were
surveyed in advanced courses usually taken during semester seven and eight. A
total of 61 advanced students were surveyed.
Faculty were surveyed during an all-college institute in May 2005. All full-time
faculty are expected to attend and part-time faculty are encouraged to attend. A total of
93 faculty were surveyed.
Faculty and student surveys contained demographic questions (quantitative),
Likert Scale questions (quantitative), and open-ended response questions (qualitative).
Student and Faculty Interviews
Additional qualitative data on undergraduate student and faculty perceptions of
the DDP were gathered using scripted interviews of selected students and faculty.
Participants for the interviews were selected from surveys where the final question
(Would you be interested in participating in a follow-up interview on the DDP?) was
answered yes and the student/faculty name was listed. To ensure that the students and faculty selected for the interviews represented diverse opinions concerning the DDP, a stratified process was used.
Students who self-selected for the interviews from each level (beginning,
intermediate, and advanced) were placed into three groups based on their survey
responses concerning the DDP. The three groups were: negative perceptions of the DDP,
positive perceptions of the DDP, and neutral perceptions of the DDP. A total of eight
students were interviewed: three beginning students, three intermediate students, and two
advanced students. Only two advanced students were interviewed due to the low number
of yes responses to the last survey question: Would you be interested in participating in a
follow-up interview on the DDP? Six advanced students responded yes to this question.
All six advanced students were contacted to participate in the interviews. However, only
two agreed to participate.
Faculty who self-selected to participate in a follow-up interview were placed in
the same three groups as students: negative perceptions of the DDP, positive perceptions
of the DDP, and neutral perceptions of the DDP. A total of six faculty were interviewed,
two from each group.
Test of Assumptions
The data gathered in this study are descriptive in nature. Measures of central tendency were used to describe student and faculty use and perceptions of the DDP. Means, medians, and standard deviations were calculated for data mined from the DDP
relational database. In order to have comparison data, means and standard deviations
were also calculated for past DDP use, beginning in August, 2000.
Depending on the type of survey question, means, standard deviations, and
medians were used to describe the data. Institutional statistics from spring, 2005 list
1,855 total undergraduate students, 1,349 Weekday College (72.7%), 506 Weekend College (27.3%), 1,310 full-time (70.6%), 549 part-time (29.4%), with 188 students living
on campus (10.1%) (Academic Services, 2005). A comparison of institutional data to
data gathered in the student surveys is given in Table 12.
Table 12
Institutional and Survey Data Comparison (Institutional Data Taken from Academic Services' Enrollment and FTE Spring, 2005)

                               Institutional Student     Student Survey
                               Data Spring, 2005         Data
WEC Undergraduate Students     27.3%                     34.6%
WDC Undergraduate Students     72.7%                     65.4%
Full-Time Students             70.6%                     76.7%
Part-Time Students             29.4%                     23.3%
Residential Students           10.1%                     10.3%
Commuter Students              89.9%                     89.7%
N =                            1,855                     324
The data gathered in the survey were comparable to the institutional data from
spring, 2005. There was a 7.3% difference in the percent of WEC/WDC students, a 6.1%
difference in the percent of full-/part-time students, and a 0.2% difference in the percent
of residential/commuter students.
A total of 93 faculty completed the faculty survey, including 84.9% full-time faculty, 5.4% part-time faculty, and 9.3% staff assessors or instructional faculty. During spring, 2005 the institution listed a total of 252 faculty, with 41.3% (104) full-time faculty, 50.4% (127) part-time faculty, and 8.3% (21) Instructional Services faculty (Academic Affairs, 2005). Faculty completed the survey during the all-college institute in May, 2005. All full-time faculty are expected to attend the institute. However, part-time faculty, depending on their position, are not required to attend the institute.
To test internal reliability (or consistency), a Cronbach alpha was calculated on the student and faculty survey data (demographic data were removed) (Santos, 1999). The
Cronbach alpha for the student survey data was 0.90. Faculty survey data had a Cronbach
alpha of 0.83. These data indicate that both the student and faculty surveys demonstrate
internal reliability.
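For reference, Cronbach's alpha can be computed directly from item-level responses. The following sketch is illustrative only; it uses invented Likert data and pandas rather than the survey files or statistical software used in this study.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of Likert scores."""
    items = items.dropna()                         # complete responses only
    k = items.shape[1]                             # number of survey items
    item_variance_sum = items.var(axis=0, ddof=1).sum()
    total_score_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variance_sum / total_score_variance)

# Invented Likert responses (demographic items already removed).
survey_items = pd.DataFrame({
    "q1": [3, 4, 2, 5, 4, 3],
    "q2": [3, 5, 2, 4, 4, 3],
    "q3": [2, 4, 3, 5, 5, 2],
})
print(round(cronbach_alpha(survey_items), 2))
```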
Demographic Description of Results
The results of this study are organized by the research sub-questions, preceded by
an analysis of the student and faculty demographic data. Each sub-question section is
further broken down by the method used to collect the data. The three methods used in
this study were (a) database mining, (b) surveys (student and faculty), and (c) interviews
(student and faculty).
Student Demographic Data Analysis
The student survey contained five demographic questions. The demographic data
included questions on: student program (WEC or WDC), major and support, residential
or commuter, full- or part-time, and number of semesters students have attended Alverno
College.
The results of the demographic question, What general program are you in?, are shown in Figure 5. The results from the intermediate student group are approximately equal. Perhaps this was related to the fact that intermediate students were surveyed in an outside-of-class assessment that is completed in both WEC and WDC. The totals for all students are similar (7.3% difference) to the institutional statistics for the spring, 2005 semester (72.7% Weekday College and 27.3% Weekend College students).

Figure 5. Student survey results: What general program are you currently in? (Bar chart: percent of WEC and WDC students within each student group: beginning, intermediate, advanced, and all students.)
The second demographic question asked students to list their majors and support
areas (minors). This question was designed to give the College information on specific
majors/support areas and their use of the DDP. Table 13 lists the major and support areas
data from the institution records and survey results.
Table 13
Comparison Institutional and Survey Data for Majors and Support Areas (Minors)

Institutional Data on Majors and Support Areas
  WEC Majors: 1. Business & Mangmt; 2. Prof. Comm.; 3. Comm. Mangmt & Tech
  WEC Supports: 1. Elective Studies; 2. Undecided; 3. Prof. Comm.
  WDC Majors: 1. Nursing; 2. El. Education; 3. Psychology; 4. Business & Mangmt; 5. Biology
  WDC Supports: 1. Undecided; 2. Elective Studies; 3. Psychology; 4. English related; 5. Social Science

Survey Results on Majors and Support Areas
  WEC Majors: 1. Business & Mangmt; 2. Prof. Communications; 3. Comm. Mangmt & Tech
  WEC Supports: 1. Blank/Undecided; 2. Elective Studies; 3. Prof. Comm.
  WDC Majors: 1. Nursing; 2. Psychology; 3. English; 4. Education; 5. Business & Mangmt
  WDC Supports: 1. Blank/Undecided; 2. Psychology; 3. Elective Studies; 4. Spanish; 5. Business & Mangmt
Results from the student surveys were comparable to data from institutional
records. WEC students who completed the survey had the same three majors listed in the
institutional data. Some student surveys listed support areas (minors), while other
students left the area blank or entered undecided. The undecided and blank survey
entries were grouped together. Student survey results for WEC supports were similar,
although the ranking order was different. Institutional data and student survey data for
WDC students were similar, but with a different rank order. WDC student surveys listed
English as one of the top five majors, while institutional data lists Biology as one of the
top five majors for WDC. WDC student survey results for support areas (minors) were
similar to institutional data for the top three support areas.
The third demographic question on the student survey asked students whether they were residential or commuter students. Figure 6 displays the student survey results for this
question. Institutional data for spring, 2005 list 10.1% of students as residential students
and 89.9% of students as commuters. The data for students surveyed was very similar to
the institutional data on residential and commuter students for the spring, 2005 semester.
Figure 6. Student survey results: Do you live on campus? (Bar chart: percent of residential and commuter students within each student group.)
The fourth demographic question asked if students were full-time or part-time students. Figure 7 displays the results of this survey question. The student survey results were similar to institutional data from spring, 2005, which listed 1,855 students, 70.6% full-time and 29.4% part-time.

Figure 7. Student survey results: Are you currently full-time or part-time? (Bar chart: percent of full-time and part-time students within each student group.)
The last demographic question asked students how many semesters they have been at Alverno. This question was included as a check to see if the survey participants were from the correct student group (beginning, intermediate, and advanced). Table 14 displays the results of this question. The data indicate that 93.6% of beginning students are in their first or second semesters. For intermediate students, 40.5% are in semester four or five; 11.2% are in semesters two or three; 24.7% are in semester six; and 23.5% have been at Alverno more than six semesters. Advanced student data indicated that 24.6% of students have attended Alverno fewer than six semesters; 21.3% are in semester six; 39.3% are in semesters seven or eight; and 14.7% of advanced students have attended Alverno for more than eight semesters.
Table 14
Summary of Results of Student Survey on Number of Semesters at Alverno

Semesters                Beginning   Intermediate   Advanced
                         Students    Students       Students
1                        40.7%
2                        52.9%       4.5%           1.6%
3                        4.7%        5.5%           3.3%
4                        1.7%        24.7%          11.5%
5                                    16.9%          8.2%
6                                    24.7%          21.3%
7                                    7.9%           13.1%
8                                    10.1%          26.2%
9                                    5.5%           3.3%
10 or More Semesters                                11.4%
Total Responses          172         89             61
Missing Responses        0           2              0
N                        172         91             61
Intermediate student results indicating fewer than four semesters of attendance at Alverno College could have been due to transfer students. Transfer students could also have accounted for advanced students listing fewer than six semesters at Alverno. Part-time students could have accounted for survey responses that indicated a higher number of semesters of attendance at Alverno. For example, beginning students who listed that they were in their third or fourth semester, intermediate students who listed that they had attended Alverno more than six semesters, or advanced students who listed more than eight semesters of attendance at Alverno, could have been part-time students.
Alverno does not keep records on semesters of student attendance. The institution
uses the classification of freshman, sophomore, junior, and senior. For the spring, 2005
semester the institution listed 25% of its students as freshman, 25% of students as
sophomores, 19% of students as juniors, and 23% of students as seniors. The institution
listed 8% of the students as non-degree students.
Taken as a whole, the demographic data gathered from student surveys were
similar to institutional demographic data for the spring, 2005 semester.
Faculty Demographic Data Analysis
The faculty were asked three demographic questions: length of time teaching at Alverno, primary teaching department, and faculty category (full- or part-time). The faculty survey was completed during the May, 2005 all-college institute. In addition to faculty, staff who teach and/or assess students attend the all-college institute and, therefore, could have accounted for the two "write-in" categories listed on the faculty survey: full-time academic staff and full-time staff assessors.
The first demographic question asked on the faculty survey concerned how long
the participant had been teaching at Alverno. Figure 8 summarizes the data gathered on
this question. There is a broad range of data from 0.3 years to 42 years at the College.
The mean number of years at Alverno was 14.8, with a standard deviation of 9.7, and a median of 15.0. These data indicated that there were no outliers (scores more than three standard deviations from the mean). The data also indicated that a large percentage (67.3%) of faculty had been at Alverno ten or more years.
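The outlier criterion used throughout this chapter (a score more than three standard deviations from the mean) can be expressed directly. The values below are illustrative only and are not the survey data.

```python
import pandas as pd

# Illustrative years-of-teaching values only; not the survey responses.
years_at_alverno = pd.Series([0.3, 2, 5, 8, 12, 15, 15, 18, 22, 27, 31, 36, 42])

mean = years_at_alverno.mean()
sd = years_at_alverno.std(ddof=1)
outliers = years_at_alverno[(years_at_alverno - mean).abs() > 3 * sd]

# For this illustrative series, no value lies more than three SDs from the mean.
print(outliers.empty)
```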
In the faculty survey the second demographic question asked participants to
identify their primary teaching department. Identification of a primary teaching
department can be difficult because numerous faculty teach in more than one department.
Figure 9 displays the 27 different departments identified in the survey. The departments
with the largest percent of faculty were Nursing (14.0%), English (9.7%), and Business
and Management and Psychology (7.5% each). Institutional data on faculty for spring,
2005 lists 252 faculty, 104 full-time, 127 part-time, and 21 Instructional Services faculty.
Institutional statistics on the four departments with the largest number of faculty were:
Nursing (12.3%), Psychology (9.1%), Business and Management (7.1%), and Education
(6.7%) (Academic Affairs, 2005). Three of the institution’s four largest faculty
departments were represented in the survey. One factor that could affect the survey data
was that part-time faculty are not required to attend the all-college institute.
Figure 8. Faculty survey results: How long have you been teaching at Alverno? (Histogram: percent of faculty by years of teaching at Alverno.)

Figure 9. Faculty survey results: In what department do you primarily teach? (Bar chart: percent of faculty by primary teaching department, across the 27 departments identified in the survey.)

The last demographic question on the faculty survey concerned the category of faculty. The faculty survey listed two categories, full-time faculty and part-time faculty. Two additional categories were written in by survey participants: full-time academic staff
and full-time staff assessors. Figure 10 displays the results of this question. The data
indicated the majority of faculty who participated in the survey were full-time (84.9%).
These results could have been influenced by the fact that only full-time faculty are
expected to attend the all-college institute. Institutional data for spring, 2005 list 252
faculty, with 41.3% full-time, 50.4% part-time, and 8.3% instructional services.
Figure 10. Faculty survey results: Are you full-time or part-time faculty? (Bar chart: percent of faculty in each category: full-time faculty, part-time faculty, full-time staff assessor, and full-time academic staff.)
Taken as a whole, the demographic data gathered from the faculty survey were
similar to institutional demographic data for faculty during the spring, 2005 semester.
The exception to this was the large number of full-time faculty who participated in the
survey.
The results of this study are described below and organized by the research sub-questions. Each research sub-question is then organized by the data-gathering approach
(database mining, student and faculty surveys, and student and faculty interviews).
Sub-question 1:
How Often Do Students and Faculty Log onto the DDP?
Data were gathered to address this question from two of the three approaches.
Database mining yielded the number of times students and faculty actually logged onto
the DDP. A survey question provided data on student and faculty perceptions of the
number of times they logged onto the DDP. The interview questions did not ask students
or faculty specifics on how often they logged onto the DDP.
Database Mining
Tracking the number of times students logged onto the DDP was one of the
original methods for recording DDP use. However, a consistent record of DDP log-on
data was not sustained during the early implementation period. Due to the variety of
initial quantitative data gathered (See Table 4) and the difficulty in comparing these data,
this researcher analyzed past user logs for the number of students who had logged onto
the DDP. The data from the user logs were adjusted to remove all test accounts and
multiple logins that were determined to be a system error (log-on times with one second
difference). A summary of these data can be found in Table 15. This table extends the data gathered by ERE in Table 12 and affords the opportunity for data comparison between semesters and years. The data in Table 15 verify that the number of students who logged onto the DDP increased each semester. Although the total number of student log-ons increased, the mean log-on decreased slightly from spring, 2003 (8.9) to fall, 2003 (8.6).
Table 15
Number and Frequency of Students Logging onto the DDP From August 2000 to Fall 2003

                        Aug. 2000   Fall 2000   Spring 2001   Fall 2001   Spring 2002   Fall 2002   Spring 2003   Fall 2003
Dates                   10/1999-    8/2000-     1/2001-       8/2001-     1/2002-       8/2002-     1/2003-       8/2003-
                        8/2000      1/2001      8/2001        1/2002      8/2002        1/2003      8/2003        1/2004
# Students logging on   726         506         519           900         957           1,240       1,396         1,528
Total # Logins          2,811       2,472       3,677         4,902       6,655         8,815       12,457        13,141
Mean                    3.9         4.9         7.1           5.5         7.0           7.1         8.9           8.6
SD                      5.2         5.8         8.2           5.0         9.6           8.2         11.0          11.2
Min/Max                 1-74        1-59        1-76          1-34        1-134         1-98        1-139         1-242
Range                   73          58          75            33          133           97          138           241
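As a sketch of the adjustment described before Table 15 (removing test accounts and collapsing duplicate log-ons recorded one second apart), the following Python fragment shows one way such a cleanup could be performed; the column names and the test-account list are assumptions.

```python
import pandas as pd

logs = pd.read_csv("ddp_user_logs.csv", parse_dates=["log_on_time"])  # user_id, log_on_time

TEST_ACCOUNTS = {"test01", "test02"}               # placeholder IDs
logs = logs[~logs["user_id"].isin(TEST_ACCOUNTS)]

# A log-on that follows the previous log-on by the same user within one second is
# treated as a duplicate produced by a system error and is dropped.
logs = logs.sort_values(["user_id", "log_on_time"])
gap = logs.groupby("user_id")["log_on_time"].diff()
logs = logs[gap.isna() | (gap > pd.Timedelta(seconds=1))]

# Per-student counts for a semester can then be summarized as in Table 15.
per_student = logs.groupby("user_id").size()
print(per_student.mean(), per_student.std(), per_student.min(), per_student.max())
```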
Figure 11 displays the number of times students logged onto the DDP for the
spring, 2005 semester. Students logged onto the DDP a total of 17,303 times,
representing 1,893 different students. The mean number of log-ons was 9.1, with a
standard deviation of 10.1. The range of log-ons was large, from 1 to 117. Over 75% of
students logged onto the DDP 12 times or less. The data from spring, 2005 indicated an
increase of the mean log-on, from 8.6 in fall, 2003 to 9.1 in spring, 2005. Institutional
data from spring, 2005 lists 1,855 undergraduate degree students and 151 non-degree
students for a total of 2,006 students. Due to missing data on student programs stored in
the DDP (166 records were missing these data), it is somewhat difficult to compare the
number of students logging onto the DDP with institutional data.
Figure 11. DDP relational database results: Number of times students logged onto the DDP during spring, 2005. (Bar chart: percent of students by number of student log-ons, ranging from 1 to 117.)
The initial research on the DDP did not include tracking faculty log-ons. Spring,
2005 was the first time data on faculty log-ons were gathered. Faculty log-on data were
edited to remove all test and generic accounts. Generic accounts were created for
multiple users. Generic accounts include Assessment Center, Ability Departments,
Faculty Teams, and External Assessors. In addition, log-on data from the DDP Assistant,
Academic Computing, and ERE were removed. It should be noted that faculty log-on
data could have included faculty who teach in the Masters Programs.
Figure 12 displays the results of faculty log-ons. Data from faculty log-ons have a
large range (1–157), with a mean of 22.0 and a standard deviation of 27.7. Due to the
large range of data, the median of 10.0 was a more accurate representation of the
frequency of faculty log-ons. Faculty logged on to the DDP a total of 3,961 times,
representing 180 different faculty. During the spring, 2005 semester the institution lists a
total of 252 faculty. These data indicated that 71.4% of faculty logged onto the DDP
during the spring, 2005 semester.
The data from the DDP relational database indicated that students who logged
onto the DDP generally logged on six times during the spring, 2005 semester. Data from
the DDP database indicated that faculty who logged onto the DDP during the spring,
2005 semester generally logged on ten times during the semester.
Figure 12. DDP relational database results: Number of times faculty logged onto the DDP during spring, 2005. (Histogram: percent of faculty by number of faculty log-ons.)

Survey Data Analysis

Surveys completed by students and faculty contained questions on their perception of the number of times they logged into the DDP each month. A standard of one month, rather than a semester, was chosen due to the variety of times during the
semester the surveys were completed. Because this was an open-ended question, participants sometimes entered a range of log-ons, and in those cases an average was recorded. For example, if a survey participant entered 1 to 2 as the number of times they log onto the DDP during a typical month, a data entry of 1.5 was made.
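A small sketch of this recording rule shows how a range response collapses to its midpoint; the helper function below is hypothetical and not part of the survey instrument.

```python
def record_monthly_log_ons(raw_response: str) -> float:
    """Convert an open-ended response such as '3' or '1 to 2' to a single value."""
    parts = raw_response.lower().replace("-", " to ").split(" to ")
    values = [float(part) for part in parts if part.strip()]
    return sum(values) / len(values)   # midpoint of a range, or the value itself

print(record_monthly_log_ons("1 to 2"))   # 1.5, as recorded in the study
print(record_monthly_log_ons("4"))        # 4.0
```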
Student Survey Results
Students were asked the number of times in a typical month they log onto the
DDP. A total of 318 students answered this question. Figure 13 displays the results of
this question. Approximately 32% of students responded they log onto the DDP once per
month. It is interesting to note that 17.9% of students responded they log onto the DDP
zero times in a typical month, while 12.3% of students log onto the DDP four or more
times per month.
Figure 13. Student survey results: How many times a month do you log onto the DDP? (Bar chart: percent of students by reported number of log-ons per month.)
Table 16 summarizes, by student category (beginning, intermediate, advanced,
and all students), the log-on data from the student survey. The mean number of log-ons
per month was 1.7 with a standard deviation of 1.7. The range of log-ons was 0 to 15 and
the median was 1.0. Due to the range of data, the median was a more accurate
representation of student log-ons per month. The data from advanced students contained
entries of 10 and 15 log-ons per month. The median is slightly higher for
beginning students (1.5), while the median for intermediate and advanced students is
equal at 1.0. Using the median (1.0), students perceived they logged onto the DDP once
per month or approximately four times a semester.
Table 16
Results of Student Survey Question: How many times during a typical month do you log onto the DDP?

Log-ons              Beginning   Intermediate   Advanced   All
                     Students    Students       Students   Students
0.0                  27          9              21         57
0.5                  8           4              4          16
1.0                  47          33             23         103
1.5                  17          5              1          23
2.0                  28          15             2          45
2.5                  7           4              0          11
3.0                  10          5              2          17
3.5                  3           3              0          6
4.0                  16          5              1          22
4.5                  1           2              1          4
5.0                  2           0              3          5
5.5                  1           0              0          1
6.0                  1           1              1          3
6.5                  0           1              0          1
7.5                  1           0              0          1
9.0                  1           0              0          1
10.0                 0           0              1          1
15.0                 0           0              1          1
Total Responses      170         87             61         318
Missing Responses    2           4              0          6
Total Respondents    172         91             61         324
Mean                 1.8         1.8            1.5        1.7
SD                   1.5         1.3            2.6        1.7
Median               1.5         1.0            1.0        1.0
Faculty Survey Results
Faculty were asked the same survey question concerning their perceptions of the
number of times they log onto the DDP in a month. Figure 14 displays
the data on faculty perceptions of the number of times they logged on in a typical month.
The data indicated that approximately 50% of faculty log onto the DDP two times a month or less. The mean was 5.1, with a standard deviation of 6.7. The range was large, from 0 to 35, and the median was 2.0. Due to the large range, the median is a more appropriate representation of faculty perceptions of the number of times they log onto the DDP in a typical month.
Figure 14. Faculty survey results: How many times a month do you log onto the DDP? (Bar chart: percent of faculty by reported number of log-ons per month.)
Data from the surveys indicated that students perceived they logged onto the DDP
once per month or approximately four times a semester. Surveys indicated that faculty
perceived they logged onto the DDP twice per month or approximately eight times a
semester.
Sub-question 2:
What Do Students and Faculty Do When They Log Onto the DDP?
Data to address this sub-question were gathered from database mining, student
and faculty surveys, and student and faculty interviews. Database mining included data
on students (number of completed key performances) and faculty (number of active key
performances and number of files uploaded). Survey data for both faculty and students
included a series of nine questions that asked how often they used various features of the
DDP. The student survey also asked the number of key performances students had
completed during the semester. The faculty survey asked how many active key
performances they had on the DDP. Interview data for both students and faculty included
questions on what they do when they log onto the DDP.
Database Mining
Students and faculty use the DDP to complete key performances. For students,
data gathered from database mining contained the number of key performances
completed during the spring, 2005 semester. For faculty, data gathered from the DDP database
included the number of feedback files uploaded, and the number of active key
performances for the spring, 2005 semester.
Student key performances are considered complete and appear on the appropriate
matrix when students have uploaded a self assessment and faculty (or assessors) have
uploaded feedback, given an overall key performance status, and a status for each matrix
connection. During the spring, 2005 semester, 1,669 students completed a total of 3,918
key performances. Figure 15 displays the number of completed key performances
during the spring, 2005 semester. The mean was 2.4, with a standard deviation of 1.5.
The median was 2.0, and the data range was 1 to 11. The data included a number of
outliers (three or more standard deviations), including students who completed seven or more key performances. In this case, the median (2.0) was a more accurate representation of the number of key performances completed during the spring, 2005 semester.

Figure 15. DDP relational database results: Number of completed key performances spring, 2005. (Bar chart: percent of students by number of completed key performances, from 1 to 11.)
Faculty must upload feedback in order for a key performance to be complete. The Assessment Center is responsible for uploading feedback on a number of required outside-of-class assessments. Due to the large number of files uploaded (846 files), data from the Assessment Center were removed. Data were also removed for the generic log-on, faculty teams (20 entries), and files uploaded by the DDP Assistant (3), to provide a more accurate picture of individual faculty file uploads.

Figure 16 displays the frequency of faculty file uploads. Uploads contained files for primary and secondary feedback. During the spring, 2005 semester, 116 faculty/assessors uploaded a total of 3,150 files. The mean number of files uploaded was 27.2, with a standard deviation of 26.8. The range of file uploads was large, 1 to 120, and contained several outliers. The median number of files uploaded by faculty was 18.0, which is a more accurate representation, given the large range.

Figure 16. DDP relational database results: Number of faculty files uploaded spring, 2005. (Histogram: percent of faculty by number of files uploaded.)
Active key performances are those currently listed on the DDP for student use. A
key performance can be created by individual faculty or by faculty groups. A faculty
member can have students complete a key performance they have created, or one created
by another faculty member or faculty group. Figure 17 displays the results from the DDP
relational database for all active key performances during the spring, 2005. A key
performance that is active may, or may not, be used during the semester. All generic
accounts were removed from the data (faculty teams, ability departments, and
Assessment Center). In addition, data were moved for the DDP Administrator and the
Internship Department. The Internship Department creates key performances for each
department’s internship courses.
Figure 17. DDP relational database results: Faculty active key performances spring, 2005. (Bar chart: percent of faculty by number of active key performances, from 1 to 16.)
There were a total of 374 active key performances, created by 100 different faculty. The mean was 3.7, with a standard deviation of 3.1 and a range of 15. The median was 3.0. The data contained two outliers; therefore, the median (3.0) is a more accurate representation of the number of active key performances created by an individual faculty member during spring, 2005.
Data from the DDP relational database indicated that students generally completed two key performances during spring, 2005. DDP relational database data indicated that faculty uploaded approximately 18 files and had three active key performances during spring, 2005.
Survey Data Analysis
There were two areas of the surveys that pertained to what students and faculty do
when they log onto the DDP. For students, the first area was a question on their
perception of how many key performances they completed during the semester. The
second area was a series of nine questions that pertained to how often they use certain
DDP features. Faculty were asked a similar series of nine questions on how often they
use certain features of the DDP, as well as how many active key performances they had
on the DDP.
Student Survey Results
Students were asked approximately how many key performances they had
completed during the semester (spring, 2005). They were given five choices:
0, 1, 2, 3, 4 or more. Figure 16 displays the results for student perceptions of the number
of key performances they completed during spring, 2005. The majority of advanced
students (50.8%) perceived they had completed no key performances, while beginning
145
students (36.5%) perceived they had completed two key performances for the semester.
Less than 10% of students responded they had completed 4 or more key performances
during the spring, 2005 semester. Data indicated that as students’ progress through the
curriculum, their perception of completed key performances during the semester
decreased.
Figure 18. Student survey results: How many key performances have you completed this semester? (Bar chart: percent of students in each group by number of completed key performances: 0, 1, 2, 3, 4 or more.)
Table 17 summarizes the statistics on student perception of the number of
completed key performances for spring, 2005. The mean for this question was 1.8, with a
standard deviation of 1.2 and a median of 2.0. It is interesting to note that the median for
advanced students was zero.
Table 17
Student Survey Statistics on Completed Key Performances

Number KP             Beginning   Intermediate   Advanced   All
                      Students    Students       Students   Students
0                     17          9              30         56
1                     27          34             8          69
2                     60          23             10         93
3                     47          12             6          65
4 or More             13          7              5          25
Total Responses       164         85             59         308
Missing Responses     8           6              2          16
Total Respondents     172         91             61         324
Mean                  2.1         1.7            1.1        1.8
SD                    1.5         1.1            1.4        1.2
Median                2.0         1.0            0.0        2.0
The second data set concerned with what students do when they log onto the DDP involved a series of nine questions on various features of the DDP. Students were asked to determine how often they use a particular feature. The choices on the survey were: Do not know what this is (0), Never (1), Occasionally (2), Often (3), and Very Often (4).
The first question asked how often students add a key performance to the My Work area of the DDP. Students must add a key performance to the My Work area before the key performance can be completed. Figure 19 displays the results of the student survey. The most frequent response for all groups of students was Occasionally (64.3%). A total of 6.1% of all students did not know the meaning of this feature. However, only 1.1% of intermediate students did not know the meaning of this feature.
Figure 19. Student survey results: How often do students add a key performance to the My Work area? (Bar chart: percent of students in each group choosing each survey response.)
Table 18 lists the data for each group and the corresponding measures of central tendency. The medians for all three student groups are identical, indicating that students' perception of how often they add a key performance to the My Work area was Occasionally. This is interesting because students must add a key performance to the My Work area before it can be completed.
Table 18
Student Survey Statistics on How Often a Key Performance is Added to the My Work Area

                           Beginning   Intermediate   Advanced   All
                           Students    Students       Students   Students
Do not know what this is   10          1              8          19
Never                      21          13             6          40
Occasionally               107         58             37         202
Often                      17          12             5          34
Very Often                 11          5              3          19
Total Responses            166         89             59         314
Missing Responses          6           2              2          10
Total Respondents          172         91             61         324
Mean                       2.0         2.1            1.8        2.0
SD                         1.1         0.7            1.0        0.9
Median                     2.0         2.0            2.0        2.0
The second question on the student survey that pertained to what students do
when they log onto the DDP asked how often they upload a self assessment. Students
must upload a self assessment in order to complete their portion of a key performance.
Figure 20 displays the results of this question. The majority of students responded they
upload a self assessment Occasionally (63.9%). The data also indicated that intermediate
students seemed to know the meaning of uploading a self assessment, because no intermediate students (0.0%) responded Do not know what this is.
Figure 20. Student survey results: How often do students upload a self assessment? (Bar chart: percent of students in each group choosing each survey response.)
Table 19 lists the data for each student group. The mean for all students was 2.2,
with a standard deviation of 0.8, and the median was 2.0 (Occasionally).
Because uploading a self assessment is required for students to complete a key
performance, it is interesting that students perceived this as something they do
Occasionally.
Table 19
Student Survey Statistics on How Often Do Students Upload a Self Assessment

                           Beginning   Intermediate   Advanced   All
                           Students    Students       Students   Students
Do not know what this is   1           0              3          4
Never                      18          5              3          26
Occasionally               109         56             39         204
Often                      29          20             12         61
Very Often                 13          8              3          24
Total Responses            170         89             60         319
Missing Responses          2           2              1          5
Total Respondents          172         91             61         324
Mean                       2.2         2.4            2.2        2.2
SD                         0.8         0.7            1.0        0.8
Median                     2.0         2.0            2.0        2.0
The third question on the student survey that pertained to what students do when
they log onto the DDP asked how often students check feedback. Depending on the key
performance, students could receive their feedback as a hard-copy in class, or be required
to access the feedback via the DDP. Figure 21 displays the results of the data. The
majority of the students (56.3%) answered that they check their feedback Occasionally.
Very few students (1.6%) did not know the meaning of this feature. There was also a low
percent of students who responded that they checked their feedback Very Often.
Figure 21. Student survey results: How often do students check feedback? (Bar chart: percent of students in each group choosing each survey response.)
Table 20 lists the data for all student groups along with the corresponding measures of central tendency. The mean for all students was 2.0, with a standard deviation of 0.8 and a median of 2.0 (Occasionally). Intermediate students were the only group in which all respondents knew the meaning of this feature, and they had the highest percent of Often responses (26.1%).
Table 20
Student Survey Statistics on How Often Students Check Feedback

                           Beginning   Intermediate   Advanced   All
                           Students    Students       Students   Students
Do not know what this is   2           0              3          5
Never                      45          9              12         66
Occasionally               93          48             38         179
Often                      26          23             5          54
Very Often                 5           8              1          14
Total Responses            171         88             59         318
Missing Responses          1           3              2          6
Total Respondents          172         91             61         324
Mean                       1.9         2.3            1.8        2.0
SD                         0.8         0.8            0.7        0.8
Median                     2.0         2.0            2.0        2.0
The fourth question on the student survey that pertained to what students do when they log onto the DDP asked how often students review past key performances. Figure 22 displays the results of the data. The majority of students answered that they review their past key performances Occasionally (48.4%). Advanced students were the only group that did not list Very Often as a response. A large percent of beginning (40.4%) and advanced (40.7%) students responded they Never reviewed past key performances.
Figure 22. Student survey results: How often do students review past key performances? (Bar chart: percent of students in each group choosing each survey response.)
Table 21 lists the data from each student group with the corresponding measures of central tendency. This question had a large percent of beginning and advanced students who responded Never (just over 40% each), while 18.6% of intermediate students responded Never. The mean for this question was 1.7, with a standard deviation of 0.8 and a median of 2.0 (Occasionally).
Table 21
Student Survey Statistics on How Often Students Review Past Key Performances

                           Beginning   Intermediate   Advanced   All
                           Students    Students       Students   Students
Do not know what this is   7           1              4          12
Never                      69          16             24         109
Occasionally               74          50             29         153
Often                      20          15             2          37
Very Often                 1           4              0          5
Total Responses            171         86             59         316
Missing Responses          1           5              2          8
Total Respondents          172         91             61         324
Mean                       1.6         2.1            1.5        1.7
SD                         0.8         0.8            0.7        0.8
Median                     2.0         2.0            2.0        2.0
The fifth question on the student survey that pertained to what students do when they log onto the DDP asked how often students use the My Resources area. Figure 23 displays the results of the data. For this survey question, the majority of students answered that they Never use the My Resources area (52.5%). For this question, 13.0% of students answered they did not know the meaning of the My Resources area, and a small percent of students (0.9%) responded they used this feature Very Often.
Figure 23. Student survey results: How often do students use the My Resources area? (Bar chart: percent of students in each group choosing each survey response.)
Table 22 lists the data from each student group with the corresponding measures
of central tendency. The mean for all students was 1.3, with a standard deviation of 0.8
and a median of 1.0 (Never).
Table 22
Student Survey Statistics on How Often Students Use the My Resources Area

                           Beginning   Intermediate   Advanced   All
                           Students    Students       Students   Students
Do not know what this is   24          7              10         41
Never                      89          46             31         166
Occasionally               45          24             16         85
Often                      11          8              2          21
Very Often                 1           2              0          3
Total Responses            170         87             59         316
Missing Responses          2           4              2          8
Total Respondents          172         91             61         324
Mean                       1.3         1.5            1.2        1.3
SD                         0.8         0.9            0.8        0.8
Median                     1.0         1.0            1.0        1.0
The sixth question on the student survey that pertained to what students do when
they log onto the DDP asked how often students use the Reference area. Figure 24
displays the results for this feature. The majority of students (53.5%) responded they
Never use the Reference area. This question also had a higher than expected percent of
students who answered they did not know the meaning of the Reference area (15.0%). A
low percent of students (1.3%) responded they used this feature Very Often and no
advanced student chose this response.
Figure 24. Student survey results: How often do students use the Reference area? (Bar chart: percent of students in each group choosing each survey response.)
Table 23 lists the data from each student group with the corresponding measures
of central tendency. The mean for all students was 1.3, with a standard deviation of 0.8
and the median of 1.0 (Never).
Table 23
Student Survey Statistics on How Often Students Use the Reference Area

                           Beginning   Intermediate   Advanced   All
                           Students    Students       Students   Students
Do not know what this is   24          9              12         47
Never                      89          48             32         168
Occasionally               45          19             14         77
Often                      11          8              1          18
Very Often                 1           1              0          4
Total Responses            170         85             59         314
Missing Responses          2           6              2          10
Total Respondents          172         91             61         324
Mean                       1.3         1.3            1.1        1.3
SD                         0.9         0.8            0.7        0.8
Median                     1.0         1.0            1.0        1.0
The seventh question on the student survey that pertained to what students do when they log onto the DDP asked how often the students attach a key performance to a matrix. Figure 25 displays the results of the data. For this survey question, 39.0% of students answered they Never attach a key performance to a matrix. This question also had a higher than expected percent of students (15.1%) who answered they did not know the meaning of attaching a key performance to a matrix. Very few students (2.5%) responded they used this feature Very Often. No intermediate students responded that they attach a key performance to a matrix Very Often.
Figure 25. Student survey results: How often do students attach a key performance to a matrix? (Bar chart: percent of students in each group choosing each survey response.)
Table 24 lists the data from each student group with the corresponding measures
of central tendencies. The mean was 1.4 with a standard deviation of 0.8 and a median of
1.0 (Never). Data indicated that students perceived they generally Never attach a key
performance to a matrix.
Table 24
Student Survey Statistics on How Often Students Attach a Key Performance to a Matrix

                           Beginning   Intermediate   Advanced   All
                           Students    Students       Students   Students
Do not know what this is   24          9              13         48
Never                      89          38             25         124
Occasionally               45          36             20         124
Often                      11          3              1          14
Very Often                 1           2              0          8
Total Responses            170         88             59         318
Missing Responses          1           3              2          6
Total Respondents          172         91             61         324
Mean                       1.5         1.4            1.2        1.4
SD                         0.9         0.8            0.8        0.8
Median                     1.0         1.0            1.0        1.0
The eighth question on the student survey that pertained to what students do when they log onto the DDP asked how often they view a video on the DDP. Figure 26 displays the results of the data. For this survey question, the majority of intermediate (69.7%) and advanced (65.0%) students answered they Never view a video of their work on the DDP. When the DDP version 2.0 was introduced, all beginning student videos were placed on the DDP. This could explain the 40.0% of beginning students who responded they Occasionally view video of their work on the DDP. All student groups had a minimal percent of students (less than 7%) who indicated they did not know the meaning of viewing video on the DDP.

Figure 26. Student survey results: How often do students view video? (Bar chart: percent of students in each group choosing each survey response.)
Table 25 lists the data from each student group with the corresponding measures of central tendency. The mean for this question was 1.6, with a standard deviation of 0.8 and a median of 2.0 (Occasionally). No advanced students responded they viewed video of their work on the DDP Often or Very Often.
Table 25
Student Survey Statistics on How Often Students View a Video

                           Beginning   Intermediate   Advanced   All
                           Students    Students       Students   Students
Do not know what this is   2           9              3          11
Never                      47          38             39         148
Occasionally               96          36             18         133
Often                      17          3              0          18
Very Often                 9           2              0          10
Total Responses            171         88             60         320
Missing Responses          1           2              1          4
Total Respondents          172         91             61         324
Mean                       1.9         1.2            1.3        1.6
SD                         0.8         0.6            0.5        0.8
Median                     2.0         1.0            1.0        2.0
The last question on the student survey that pertained to what students do when
they log onto the DDP asked how often they use the Help Menu on the DDP. Figure 27
displays the results of the data. For this survey question, the majority of students (70.0%)
responded they Never use the Help Menu. Intermediate and advanced students had no
responses for Often and Very Often. Very few students (5.0%) indicated they did not
know the meaning of this feature.
Figure 27. Student survey results: How often do students use the Help Menu? (Bar chart: percent of students in each group choosing each survey response.)
Table 26 lists the data from each student group with the corresponding measures of central tendency. The mean for this question was 1.3, with a standard deviation of 0.7 and a median of 1.0 (Never).
Table 26
Student Survey Statistics on How Often They Use the Help Menu

                           Beginning   Intermediate   Advanced   All
                           Students    Students       Students   Students
Do not know what this is   10          3              3          16
Never                      109         72             41         222
Occasionally               38          14             14         66
Often                      8           0              0          8
Very Often                 5           0              0          5
Total Responses            170         89             58         317
Missing Responses          2           2              3          7
Total Respondents          172         91             61         324
Mean                       1.4         1.1            1.2        1.3
SD                         0.8         0.4            0.5        0.7
Median                     1.0         1.0            1.0        1.0
Results from the two areas of the student survey that pertained to what students do
when they log into the DDP indicated students perceived they had completed
approximately two key performances during the spring, 2005 semester. The top four
choices for how often students use various features of the DDP are summarized in Table
27. The results indicated a similarity between the groups. However, in all cases the
choice with the highest mean was Occasionally. None of the features listed on the survey
had a mean score indicating an Often or Very Often response. These results could have
been influenced by student perceptions that they are not using the DDP very frequently and, therefore, are not using any of its features very frequently.
All three student groups responded the feature they use the most was Uploading a
Self Assessment. The second and third most often used features were similar for each
group, Add key performance to My Work and Check feedback for a key performance. In
the case of advanced students these two choices had identical means. One of the top four
choices for intermediate and advanced students was Review past key performances. This
was not a top four choice for beginning students. This could relate to the fact that
typically, beginning students do not have as many completed key performances as the
other student groups and therefore do not review their past work. It is interesting to note that beginning students’ fourth choice was View video of work. Perhaps the fact that beginning students are oriented to the DDP in a session in which they view video and self assess on their first speech (beginning of semester one) contributed to this perception.

Table 27
Summary of Students’ Most-Often Used Features of the DDP

Beginning Students:
  1. Upload a Self Assessment (M = 2.21)
  2. Add key performance to My Work (M = 1.99)
  3. Check feedback for a key performance (M = 1.92)
  4. View video of work (M = 1.91)
Intermediate Students:
  1. Upload a Self Assessment (M = 2.21)
  2. Check feedback for a key performance (M = 2.34)
  3. Add key performance to My Work (M = 2.08)
  4. Review past key performances (M = 2.06)
Advanced Students:
  1. Upload a Self Assessment (M = 2.21)
  2. Add key performance to My Work (M = 1.81)
  3. Check feedback for a key performance (M = 1.81)
  4. Review past key performances (M = 1.49)
All Students:
  1. Upload a Self Assessment (M = 2.21)
  2. Check feedback for a key performance (M = 2.02)
  3. Add key performance to My Work (M = 1.98)
  4. Review past key performances (M = 1.73)
The three least-used features of the DDP were similar for all student groups: Use
the Reference area, Use the Help Menu, and Use the My Resource area. The results are
displayed in Table 28. In the case of intermediate students, View a video of work was the
second least used feature. The feature that advanced students used second least was
Attach a key performance to a matrix. Perhaps the fact that attaching a key performance
to a matrix was a feature introduced in version 2.0 of the DDP contributed to their
perception because advanced students might not have received training on this feature.
Table 28
Summary of Students’ Least-Often Used Features of the DDP

Beginning Students:
  1. Use the Reference area (M = 1.26)
  2. Use the My Resource area (M = 1.27)
  3. Use the Help Menu (M = 1.35)
Intermediate Students:
  1. Use the Help Menu (M = 1.12)
  2. View video of work (M = 1.91)
  3. Use the Reference area (M = 1.26)
Advanced Students:
  1. Use the Reference area (M = 1.07)
  2. Attach a key performance to a matrix (M = 1.15)
  3. Use the My Resource area (M = 1.27)
All Students:
  1. Use the Reference area (M = 1.26)
  2. Use the Help Menu (M = 1.26)
  3. Use the My Resource area (M = 1.30)
Faculty Survey Results
The faculty survey contained two areas that connected to the question of what
faculty do when they log onto the DDP. The first question pertained to the faculty
members’ perception of the number of active key performances they had on the DDP.
The second area included a series of nine questions on how often faculty used various
features of the DDP.
Faculty can create their own key performances (they would be listed as the
creator) for student use, or they can have their students complete key performances
created by others. For example, if faculty are team-teaching or teaching a section of a
multi-section course, they might have their students use a key performance created by
another faculty member. There are also generic accounts on the DDP. These generic
accounts allow faculty to log on as faculty teams or ability departments in order to create
key performances that can be used by a variety of students.
Faculty were asked approximately how many active key performances they had
on the DDP. The choices for this question were: 0, 1, 2, 3, 4 or more. Figure 28 displays
the survey results, with 90 faculty responding to this question. The mean number of
active key performances was 2.1, with a standard deviation of 1.4 and a median of 2.0. Of
the faculty responding, 24.4% answered they had two active key performances on the
DDP during the spring, 2005 semester. Approximately the same number of faculty
(22.2%) responded that they had four or more active key performances on the DDP.
These data probably do not depict all the active key performances because faculty might
not consider key performances they have created using the generic username as their
own.
Figure 28. Faculty survey results: How many key performances do you have on the DDP?
In addition to the question on their perception of the number of active key performances
they had on the DDP, faculty were asked a series of nine questions concerning how often
they used a variety of DDP features. The first question faculty were asked was how often
they create a key performance. Figure 29 displays the results of the data. A total of 88
faculty responded to this question. The mean was 1.9, with a standard deviation of 0.6.
The median was 2.0, with the majority responding Occasionally. The data indicated that
2.3% of faculty did not know the meaning of creating a key performance, while 75.0%
used this feature Occasionally.
Figure 29. Faculty survey results: How often do you create a key performance?
The second question that pertained to what faculty do when they log onto the
DDP was how often they upload student feedback. Figure 30 displays the results of the
data, with 89 faculty answering this question. The mean was 2.6, with a standard
deviation of 0.9 and a median of 3.0 (Often). Over 20% of faculty responded they
uploaded student feedback Very Often, while 31.5% responded Often. Only one
respondent (1.1%) did not know the meaning of this feature.
Figure 30. Faculty survey results: How often do you upload student feedback?
The third question that pertained to what faculty do when they log onto the DDP
was how often they read student work. Students do not have to upload their work to the
DDP unless required by their instructor. Figure 31 displays the results of the data, with
82 faculty responding to this question. The mean was 2.5, with a standard deviation of
0.9 and a median of 3.0 (Often). There was one respondent (1.1%) who did not know the
meaning of this feature and 9.8% of faculty responded they view student work Very
Often. It is interesting that 40.2% of faculty responded Occasionally to this question,
despite the fact that students do not have to upload their work.
Figure 31. Faculty survey results: How often do you read student work?
The fourth question that pertained to what faculty do when they log onto the DDP asked
how often they read students’ self assessment. Figure 32 displays the results of the data.
There were 85 faculty who responded to this question. The mean was 2.5 with a standard
deviation of 0.9. The median score was 3.0 (Often). For this question, there was one
respondent (1.2%) who did not know the meaning of this feature. Over 50% of faculty
indicated they read student self assessments Often or Very Often. This feature, along with uploading feedback, had the highest median of 3.0 (Often).

Figure 32. Faculty survey results: How often do you read students’ self assessments?
The fifth question that pertained to what faculty do when they log onto the DDP asked
how often they use the My Resource area. The My Resource area is a place where
faculty can upload files and store materials electronically without directly connecting the
files to a key performance. Figure 33 displays the results of the data. There were 85
faculty responding to this question. The mean was 1.3, with a standard deviation of 0.8.
The median was 1.0 (Never). For this question, 14.1% of faculty did not know the meaning of this feature. Only one respondent (1.2%) stated they use the My
Resource area Very Often. It is interesting to note that over 50% of faculty responded
they Never use the My Resource area, even though it was designed to allow faculty to
upload any type of electronic files, similar to an Internet hard drive. Perhaps this is due to
lack of training, or the general overall perception that faculty do not use the DDP often
enough.
Figure 33. Faculty survey results: How often do you use the My Resources area?
The sixth question that pertained to what faculty do when they log onto the DDP
asked how often they use the Reference area. The Reference area contains institutional
documents that can be of use to faculty. For example, the area contains a list of all major
and support (minor) advanced outcomes and required courses. Figure 34 displays the
results of the data. There were 86 faculty that responded to this question. None of the
faculty responded that they used the Reference area Very Often. The mean was 1.5, with a
standard deviation of 0.8 and a median of 2.0 (Occasionally). Of the faculty responding,
10.5% did not know the meaning of this feature.
Figure 34. Faculty survey results: How often do you use the Reference area?
The seventh question that pertained to what faculty do when they log onto the
DDP asked how often they check a student’s past work. Figure 35 displays the results of
the data. There were 87 faculty who responded to this question. The mean was 1.9, with a
standard deviation of 0.8 and a median of 2.0 (Occasionally). Almost half of the faculty (48.3%) responded that they Occasionally use this feature.
Figure 35. Faculty survey results: How often do you check a student’s past work?
The eighth question that pertained to what faculty do when they log onto the DDP
asked how often they use the DDP for narratives. For each Alverno graduate, faculty
create a narrative transcript that describes the student’s quality of work and her
demonstration of abilities in her major and support (minor) programs. Figure 36 displays
the results of the data. There were 87 faculty who responded to this question. The mean
for this question was 1.9, with a standard deviation of 1.1 and a median of 2.0
(Occasionally). Of the faculty responding, 43.7% responded they Never use the DDP for
narratives, and 2.3% do not know the meaning of this feature.
Figure 36. Faculty survey results: How often do you use the DDP for narratives?
The last question that pertained to what faculty do when they log onto the DDP
asked how often they use the Help Menu. There were 86 faculty who responded to this
question. Figure 37 displays the results of this question. The mean was 1.4, with a
standard deviation of 0.7 and a median of 1.0 (Never). Of the faculty responding, 8.1%
did not know the meaning of this feature and no faculty responded they used the Help
Menu Very Often.
Figure 37. Faculty survey results: How often do you use the Help Menu?
Results from the two areas of the faculty survey concerning what faculty do when
they log onto the DDP indicated that faculty perceived they had two active key
performances on the DDP during spring, 2005. The top three choices for how often
faculty use various features of the DDP are summarized in Table 29. It should be noted
that the mean scores for the two most-often used features were less than 3 (choice of
Often).
Table 29
Summary of Faculty’s Most-Used and Least-Used Features of the DDP

Faculty Most-Used Features of the DDP:
  1. Upload student feedback (M = 2.62)
  2. Read student self assessment (M = 2.52)
  3. Read student work (M = 2.22)
Faculty Least-Used Features of the DDP:
  1. Use the My Resource Area (M = 1.27)
  2. Use the Help Menu (M = 1.43)
  3. Use the Reference Area (M = 1.49)
Faculty data on the three least-often used features of the DDP were similar to the students’ data (Use the My Resource Area, Use the Help Menu, and Use the Reference Area).
However, the mean for faculty was approximately 1 (choice of Never).
Students and faculty were asked their perceptions on how often they used various
features of the DDP. Student and faculty surveys listed somewhat different features, but
there were three features that were on both surveys: Using the Reference area, Using the
My Resource area, and Using the Help Menu. These three features were scored by both
students and faculty as their least-used features.
Interview Data Analysis
There were three interview questions that pertained to what students and faculty
do when they log onto the DDP. These questions were slightly different for students than
for faculty. However, both groups were asked to describe what they do when they log
onto the DDP.
Student Interview Results
Students were asked two additional questions that pertained to what they do when
they log onto the DDP. They were asked what stood out in their experiences with the
DDP and if they used the DDP outside of course requirements.
All eight students made comments concerning their infrequent use of the DDP.
For example: “I only have one instructor who has us putting things on there on a regular
basis;” “I have had a few things that were required;” and “There really wasn’t much to
upload, maybe a couple of things here and there.” All students indicated that they had uploaded self assessments and completed what was required by their instructors.
Students seemed more responsive when asked what stood out in their experiences
with the DDP. Examples of student responses included: (a) “I think the most important
thing I’ve seen is where you put in your self assessment, and the instructors put in theirs,
and you see what they said, and you have it on record;” (b) “I can click on them [abilities
on the Ability Matrix] and find out a little bit more about what they mean, and get a real
snapshot of where I’m at;” (c) “we would get an assignment and we’d have to have it
uploaded in a week and then we could have our feedback by our next class;” and
(d)“…[I] like doing speeches and…uploading and things.” There was one negative
comment on students’ DDP experiences: “It seems faculty don’t know how to use it so
you’ll do something and put the work into putting [it] on the DDP and you never get
feedback.”
Six of the eight students described using the DDP outside-of-course requirements.
Examples of how they used the DDP included: (a) “When we first starting doing it [the
DDP], I could get onto it at home, and I would look at it and would think of different
things I could do with it;” (b) “I went back to look at the course requirements for my
major and … was double checking what validations were required for the year that I
entered [on the DDP];” (c) “I like looking at my feedback from time to time. At least I
know where I am and what I need to work on;” and (d) “Just curiosity, just to play with
it. I went in to show my husband things.”
The interview data supported the data from the student surveys as to what they do
when they log onto the DDP. Students frequently mentioned they uploaded self
assessments and checked feedback. A difference between the student survey data and the
interview data was that the interview data gave a much richer picture of what students were thinking about and what they did and did not do when they logged onto the DDP.
Faculty Interview Results
Besides a general question on what they do when they log onto the DDP, faculty
were asked how they use the DDP with their students, and what stood out for them in
their experiences with the DDP.
Most faculty commented that they use the DDP at the end of their courses for
course reflections and course feedback, as well as for narratives. Faculty described
working within their departments to decide on DDP use and that they used it more in
advanced level courses. One faculty member stated they did not like using the DDP and
only used it as an optional piece with their students.
A number of faculty cited narratives, course reflections, and course feedback as some of the ways they used the DDP. Faculty seemed
to be more responsive when they described their memorable DDP experiences. For
example, one faculty member described how she changed a course assignment:
I started using it because the students do an interview on the DDP. I had them
reflect on questions after they finished the interview. We’ve changed that because
of the DDP. I have each student write feedback to the person they interviewed,
and they put it on the DDP. Then the student responds to what she learned from
both the interview and the feedback she got from the interviewer. The prompt in
the DDP, Peer Feedback, prompted me to include this. What it created was an
opportunity for me and for the students … where it [peer feedback] means
something and it’s popular. I couldn’t believe the development!
Another faculty member described how using the DDP impacted students with respect to
self assessment:
…one of the things I dearly love about the DDP is it forces students to be
very serious when they realize this is part of a long-term record. I don’t
use the word permanent, but I use it long-term. I give class time to this
because I think it is working—to really seriously reflect on what is self
assessment.
One faculty member described how the use of the DDP has impacted writing
student feedback: “…I was always very conscious of the fact that if someone else is
going to read this, I have to be able to write it in a way that it is contextual and that kind
of feedback for me takes more time. I think I became excited in what you could see in
DDP in student performances.”
There were two faculty who described negative aspects of their DDP experiences.
One faculty member described problems he had with the DDP “kicking him out” after he
uploaded feedback, but then added that this was no longer an issue. The other faculty
member described significant problems with the DDP: “Yes, well you will remember the
problems in setting up that performance. I spent a fair amount of time before I was able
to get in touch with [DDP Assistant], trying to troubleshoot it and there were
undocumented bugs in the instructions and there were problems with it not saving…I like
technology to work for me, or it’s not useful. The technology didn’t work for me.” The
faculty member mentioned later in the interview that he experienced this some time ago
and, since then, has not really used the DDP.
Faculty interview responses supported their survey data. Most faculty mentioned
they uploaded feedback, read student self assessments, and read student work. The
interviews provided a number of rich, in-depth ideas on how faculty are using the DDP
with their students and the benefits they see from using the DDP. The interviews also
provided data on what the faculty perceived as issues and problems with the DDP.
Sub-question 3:
What features of the DDP are perceived by students and faculty as useful or not useful?
Data were gathered to describe this question from two of the three approaches. No
data were gathered from the DDP relational database. Student and faculty surveys
contained a series of nine questions on their perceived usefulness of various features of
the DDP. Faculty and student interviews usually contained a question on what features of
the DDP they found the most (or least) useful.
Survey Data Analysis
Students and faculty were asked a series of nine questions on their perceived
usefulness of various features of the DDP. These questions mirrored the questions
pertaining to how often they used various features of the DDP.
Student Survey Results
Students were asked a series of nine questions on their perceived usefulness of
various features of the DDP. These features included accessing the DDP from off-campus, accessing their work and self assessments, accessing their feedback, reviewing
past key performances, using the My Resource area, using the Reference area, attaching a
key performance, viewing video of their work, and using the Help Menu. The choices on
the survey were: Do not know what this is (0), Not Useful (1), Occasionally Useful (2),
Often Useful (3), Very Useful (4).
The first question on student perceptions of the usefulness of various DDP
features concerned accessing the DDP from off-campus. Figure 38 displays the results of
this question. In general students perceived accessing the DDP from off-campus was
Very Useful (4), with the exception of beginning students who chose Occasionally Useful
(2). Of the students responding, 3.8% did not know the meaning of this feature.
Figure 38. Student perception of usefulness of accessing the DDP from off-campus
Table 30 displays the results from all student groups and the corresponding measures of central tendency. The mean for all students was 2.7, with a standard deviation of 1.2 and a median of 3.0. Beginning students had the lowest mean (2.4) and their median was 2.0 (Occasionally Useful). Intermediate and advanced students had a median of 3.0 (Often Useful).

Table 30
Student Survey Statistics on Usefulness of Accessing the DDP from Off-Campus

                            Beginning   Intermediate   Advanced   All Students
Do not know what this is           11              0          1             12
Not Useful                         27              8          6             41
Occasionally Useful                56             27         19            102
Often Useful                       26             13          7             46
Very Useful                        47             41         25            113
Total Responses                   167             89         58            314
Missing Responses                   5              2          3             10
Total Respondents                 172             91         61            324
Mean                              2.4            3.0        2.8            2.7
SD                                1.2            1.1        1.2            1.2
Median                            2.0            3.0        3.0            3.0
The second question on student perception of usefulness of DDP features
concerned accessing their work and self assessments. Figure 39 displays the results of the
data. The most frequent answer for all student groups was Occasionally Useful (2). Only
1.6% of students did not know the meaning of accessing their work and self assessments,
while all intermediate students seemed to understand the meaning of this feature.
Intermediate students also seemed to view this feature as more useful than the other
student groups, with 62.5% of intermediate students responding that they perceived accessing work and self assessments as Often Useful or Very Useful.
Figure 39. Student perception of the usefulness of accessing work and self assessments
Table 31 displays the data for all student groups and their corresponding measures of central tendency. The mean was 2.7, with a standard deviation of 1.0 and a median of 3.0 (Often Useful).

Table 31
Student Survey Statistics on Usefulness of Accessing Work and Self Assessments

                            Beginning   Intermediate   Advanced   All Students
Do not know what this is            4              0          1              5
Not Useful                         16              2          7             25
Occasionally Useful                70             31         23            124
Often Useful                       38             24         13             75
Very Useful                        42             31         14             87
Total Responses                   170             88         58            316
Missing Responses                   2              3          3              8
Total Respondents                 172             91         61            324
Mean                              2.6            3.0        2.6            2.7
SD                                1.0            0.9        1.1            1.0
Median                            2.0            3.0        2.0            3.0
The third question on student perceptions of the usefulness of DDP features
concerned accessing their feedback. Figure 40 displays the results of the data. The most
frequent student response was Occasionally Useful (2). All intermediate students seemed
to know the meaning of this feature, while 2.5% of all students did not know the meaning of accessing feedback. Over 50% of students viewed this feature as Often Useful or Very Useful. Of intermediate students, 60.9% perceived accessing feedback as Often Useful or Very Useful.
Figure 40. Student perception of the usefulness of accessing feedback
Table 32 displays the results for all student groups with the corresponding measures of central tendency. The mean was 2.7, with a standard deviation of 1.0 and a median of 3.0 (Often Useful).

Table 32
Student Survey Statistics on Usefulness of Accessing Feedback

                            Beginning   Intermediate   Advanced   All Students
Do not know what this is            6              0          2              8
Not Useful                         14              2          6             22
Occasionally Useful                70             32         23            125
Often Useful                       36             24         14             74
Very Useful                        44             29         13             86
Total Responses                   170             87         58            315
Missing Responses                   2              4          3              9
Total Respondents                 172             91         61            324
Mean                              2.6            2.9        2.5            2.7
SD                                1.1            0.9        1.1            1.0
Median                            2.0            3.0        2.0            3.0
The fourth question on student perceptions of the usefulness of DDP features
concerned reviewing past key performances. Figure 41 displays the results of the data.
The most frequent answer for all student groups was Occasionally Useful. Over 50% of
intermediate students responded Often Useful (3) or Very Useful (4), while over 50% of
beginning and advanced students responded that reviewing a key performance was
Occasionally Useful or Often Useful. Over 25% of advanced students responded that
reviewing past key performances was Not Useful.
Figure 41. Student perception of the usefulness of reviewing past key performances
Table 33 displays the results of all student groups and the corresponding measures of central tendency on student perception of the usefulness of reviewing past key performances. The mean was 2.4, with a standard deviation of 1.1. The median was 2.0 (Occasionally Useful).

Table 33
Student Survey Statistics on Usefulness of Reviewing Past Key Performances

                            Beginning   Intermediate   Advanced   All Students
Do not know what this is           10              1          3             14
Not Useful                         26              8         15             49
Occasionally Useful                70             31         21            122
Often Useful                       27             21          9             57
Very Useful                        35             24          9             68
Total Responses                   168             85         57            310
Missing Responses                   4              6          4             14
Total Respondents                 172             91         61            324
Mean                              2.3            2.7        2.1            2.4
SD                                1.1            1.0        1.1            1.1
Median                            2.0            3.0        2.0            2.0
The fifth question on student perceptions of the usefulness of DDP features
concerned using the My Resources area. Figure 42 displays the results of the data. The
most frequent response for all student groups was Not Useful, and less than 10% of all
students thought this feature was Very Useful. Over 50% of students responded that they
either did not know the meaning of this feature, or thought the My Resources area was
Not Useful. Of interest in these results was that 23.1% of beginning students did not
know the meaning of the My Resources area. Only 19.9% of students responded that
using the My Resource area was Often Useful or Very Useful.
Figure 42. Student perception of the usefulness of My Resources
Table 34 displays the results of all student groups and the corresponding measures of central tendency for student perception of the usefulness of the My Resource area. The mean for this question was 1.6, with a standard deviation of 1.2 and a median of 1.0 (Not Useful).

Table 34
Student Survey Statistics on Usefulness of My Resources Area

                            Beginning   Intermediate   Advanced   All Students
Do not know what this is           39             10          9             58
Not Useful                         47             29         29            105
Occasionally Useful                45             28         13             86
Often Useful                       23              9          5             37
Very Useful                        15              8          2             25
Total Responses                   169             84         58            311
Missing Responses                   3              7          3             13
Total Respondents                 172             91         61            324
Mean                              1.6            1.7        1.3            1.6
SD                                1.2            1.1        1.0            1.2
Median                            1.0            2.0        1.0            1.0
The sixth question on student perceptions of the usefulness of DDP features
concerned using the Reference area. Figure 43 displays the results of the data. This
question had a higher than expected percent of students (18.9%) that did not know the
meaning of the Reference area. Over 50% of advanced students responded using the
Reference area was Not Useful, while only 26.2% of beginning students responded this
feature was Not Useful. Over 50% of students responded they did not know the meaning
of the Reference area feature or they found this feature Not Useful.
Figure 43. Student perception of the usefulness of the Reference area
Table 35 displays the results of all student groups and the corresponding measures of central tendency for student perception of the usefulness of the Reference area. The mean for this question was 1.5, with a standard deviation of 1.2 and a median of 1.0 (Not Useful).

Table 35
Student Survey Statistics on Usefulness of the Reference Area

                            Beginning   Intermediate   Advanced   All Students
Do not know what this is           39              9         10             58
Not Useful                         44             36         29            109
Occasionally Useful                53             18         11             82
Often Useful                       17             10          4             31
Very Useful                        15              9          3             27
Total Responses                   168             82         57            307
Missing Responses                   4              9          4             17
Total Respondents                 172             91         61            324
Mean                              1.6            1.7        1.3            1.5
SD                                1.2            1.2        1.0            1.2
Median                            1.0            1.0        1.0            1.0
The seventh question on student perceptions of the usefulness of DDP features
concerned attaching a key performance to a matrix. Figure 44 displays the results of the
data. Close to half of advanced students (47.3%) responded that they found attaching a
key performance to a matrix Not Useful. Over 44% of all students responded they did not
know the meaning of this feature or they found it Not Useful. Intermediate students seem
to have a greater understanding of this feature (9.4% did not know the meaning of this
feature).
Figure 44. Student perception of the usefulness of attaching a key performance to a matrix
Table 36 displays the results for all student groups and the corresponding measures of central tendency for student perception of the usefulness of attaching a key performance to a matrix. The mean for this question was 1.8, with a standard deviation of 1.2 and a median of 2.0 (Occasionally Useful).

Table 36
Student Survey Statistics on Usefulness of Attaching a Key Performance to a Matrix

                            Beginning   Intermediate   Advanced   All Students
Do not know what this is           37              8         10             55
Not Useful                         34             22         26             82
Occasionally Useful                50             31         11             92
Often Useful                       26             13          5             44
Very Useful                        22             11          3             36
Total Responses                   169             85         55            309
Missing Responses                   3              6          6             15
Total Respondents                 172             91         61            324
Mean                              1.8            2.0        1.4            1.8
SD                                1.3            1.2        1.1            1.2
Median                            2.0            2.0        1.0            2.0
The eighth question on student perceptions of the usefulness of DDP features
concerned viewing a video of their work. Figure 45 displays the results of the data. Intermediate students did not seem to find this feature useful, as 42.9% responded that viewing a video of their work was Not Useful. Advanced students responded that viewing
a video of their work on the DDP was Not Useful (39.3%) or Occasionally Useful
(39.3%). Beginning students found this feature the most useful, with 40.0% responding
they found viewing a video of their work Often Useful or Very Useful.
Figure 45. Student perception of the usefulness of viewing a video of their work
Table 37 displays the results of all student groups and the corresponding measures of central tendency for student perception of the usefulness of viewing a video of their work on the DDP. The mean for this question was 2.0, with a standard deviation of 1.2 and a median of 2.0 (Occasionally Useful).

Table 37
Student Survey Statistics on Usefulness of Viewing a Video of Their Work

                            Beginning   Intermediate   Advanced   All Students
Do not know what this is           13              7          5             25
Not Useful                         28             36         22             86
Occasionally Useful                61             23         22            106
Often Useful                       29              7          3             39
Very Useful                        39             11          4             54
Total Responses                   170             84         56            310
Missing Responses                   2              7          5             14
Total Respondents                 172             91         61            324
Mean                              2.3            1.8        1.6            2.0
SD                                1.2            1.2        1.0            1.2
Median                            2.0            1.0        2.0            2.0
The last question on student perceptions of the usefulness of DDP features
concerned using the Help Menu. Figure 46 displays the results of the data. Over 50% of
intermediate and advanced students responded they found the Help Menu Not Useful.
Beginning students seemed to find the Help Menu useful, with 29.1% responding this
feature was Often Useful or Very Useful. It is interesting to note that 10.1% of students do
not know the meaning of this feature.
Figure 46. Student perception of the usefulness of the Help Menu
Table 38 displays the results of all student groups and the corresponding measures of central tendency for student perception of the usefulness of the Help Menu. The mean for this question was 1.8, with a standard deviation of 1.2 and a median of 2.0 (Occasionally Useful).

Table 38
Student Survey Statistics on Usefulness of the Help Menu

                            Beginning   Intermediate   Advanced   All Students
Do not know what this is           19              6          6             31
Not Useful                         48             44         30            122
Occasionally Useful                52             19         11             82
Often Useful                       15              8          3             26
Very Useful                        34              6          7             47
Total Responses                   168             83         57            308
Missing Responses                   4              8          4             16
Total Respondents                 172             91         61            324
Mean                              2.0            1.6        1.7            1.8
SD                                1.3            1.0        1.2            1.2
Median                            2.0            1.0        1.0            2.0
Students were asked to rate their perception of the usefulness of nine DDP
features using a scale of: Do not know what this is (0), Not Useful (1), Occasionally
Useful (2), Often Useful (3), and Very Useful (4). Table 39 lists student perceptions of the
most-useful DDP features. Although the rank order of the features differs between student
groups, all groups listed the same features as the most useful. These were: Accessing
work and self assessments, Accessing the DDP from off-campus, and Accessing feedback.
Intermediate students rated accessing the DDP from off-campus as Often Useful (3).
Beginning students’ perceptions of the most-useful features of the DDP included a
tie between Accessing work and self assessments and Accessing feedback.
Table 39
Summary of Student Perception of the Most-Useful Features of the DDP

Beginning Students:
  1. Accessing Work and Self Assessments (M = 2.58)
  2. Accessing Feedback (M = 2.58)
  3. Accessing the DDP From Off-Campus (M = 2.43)
Intermediate Students:
  1. Accessing the DDP From Off-Campus (M = 2.98)
  2. Accessing Work and Self Assessments (M = 2.95)
  3. Accessing Feedback (M = 2.92)
Advanced Students:
  1. Accessing the DDP From Off-Campus (M = 2.84)
  2. Accessing Work and Self Assessments (M = 2.55)
  3. Accessing Feedback (M = 2.52)
All Students:
  1. Accessing Work and Self Assessments (M = 2.68)
  2. Accessing the DDP From Off-Campus (M = 2.66)
  3. Accessing Feedback (M = 2.66)
Table 40 lists students’ perceptions of the least-useful features of the DDP. All student groups listed Using the Reference Area and Using the My Resources Area among their least-useful features. Intermediate students were the only group that listed Using the
Help Menu in their least-useful features, ranking it first. Student perceptions of the least
often used features of the DDP (Sub-question 2) also listed Using the Reference Area and
Using the My Resource Area.
Table 40
Summary of Student Perception of the Least-Useful Features of the DDP

Beginning Students:
  1. Using the Reference Area (M = 1.55)
  2. Using the My Resources Area (M = 1.57)
  3. Attaching a Key Performance to a Matrix (M = 1.78)
Intermediate Students:
  1. Using the Help Menu (M = 1.57)
  2. Using the Reference Area (M = 1.68)
  3. Using the My Resources Area (M = 1.71)
Advanced Students:
  1. Using the Reference Area (M = 1.32)
  2. Using the My Resources Area (M = 1.34)
  3. Attaching a Key Performance to a Matrix (M = 1.36)
All Students:
  1. Using the Reference Area (M = 1.54)
  2. Using the My Resources Area (M = 1.57)
  3. Attaching a Key Performance to a Matrix (M = 1.75)
Faculty Survey Results
Faculty were asked a series of nine questions on their perceived usefulness of
various features of the DDP. These features included accessing the DDP from off-campus, providing feedback to students, viewing student work, viewing student self
assessments, using the My Resource area, using the Reference area, checking a student’s
past work, using the DDP for narratives, and using the Help Menu. The choices on the
survey were: Do not know what this is (0), Not Useful (1), Occasionally Useful (2), Often
Useful (3), Very Useful (4).
The first question on faculty perceptions of the usefulness of DDP features
concerned accessing the DDP from off-campus. Figure 47 displays the results of this
question. Of the 87 faculty responding to this question, 6.9% did not know the meaning
of this feature and 44.8% responded accessing the DDP from off-campus was Often
Useful (3) or Very Useful (4). The mean for this question was 2.4, with a standard
deviation of 1.2 and the median was 2.0 (Occasionally Useful).
Figure 47. Faculty perception of the usefulness of accessing the DDP from off-campus
The second question on faculty perceptions of the usefulness of DDP features
concerned providing feedback to students. Figure 48 displays the results of this question.
Of the 86 faculty responding to this question, only 1.2% (one respondent) did not know
the meaning of this feature. Occasionally Useful and Very Useful were each selected by
33.7% of faculty as their perception of the usefulness of providing feedback to students.
Only 7.0% of faculty responded that this feature was Not Useful. This is interesting
because in order to complete a key performance, faculty must upload (provide) feedback
to the student. The mean for this question was 2.8, with a standard deviation of 1.0 and
the median was 3.0 (Often Useful).
Figure 48. Faculty perception of the usefulness of providing feedback to students
The third question on faculty perceptions of the usefulness of DDP features
concerned viewing student work. Figure 49 displays the results of this question. Of the 83
faculty responding to this question, only 1.2% (one respondent) did not know the
meaning of this feature. Occasionally Useful was selected by 38.6% of faculty to describe
their perception of the usefulness of viewing student work. It is interesting to note that
42.2% of faculty thought this feature was Often Useful or Very Useful despite the fact that
students are not required to upload their work unless directed by their instructor. The
mean for this question was 2.4, with a standard deviation of 1.0. The median was 2.0
(Occasionally Useful).
Figure 49. Faculty perception of the usefulness of viewing student work
The fourth question on faculty perceptions of the usefulness of DDP features
concerned viewing student self assessments. Figure 50 displays the results of this
question. A total of 85 faculty responded to this question. Only 1.2% (one respondent)
did not know the meaning of this feature. Of the faculty responding, 53.0% thought this
feature was Often Useful or Very Useful. The mean for this question was 2.6, with a
standard deviation of 1.0 and a median of 3.0 (Often Useful).
Figure 50. Faculty perception of the usefulness of viewing student self assessments
The fifth question on faculty perceptions of the usefulness of DDP features
concerned using the My Resource area. Figure 51 displays the results of this question. Of
the 82 faculty responding to this question, 18.3% did not know the meaning of this
feature, and 47.6% thought this feature was Not Useful. Only 1.2% (1 respondent)
thought this feature was Very Useful. The mean for this question was 1.3, with a standard
deviation of 0.9 and a median of 1.0 (Not Useful).
Figure 51. Faculty perception of the usefulness of the My Resources area
The sixth question on faculty perceptions of the usefulness of DDP features
concerned using the Reference area. Figure 52 displays the results of this question. A
total of 79 faculty responded to this question and 10.1% did not know the meaning of this
feature. Of the faculty responding, 39.2% thought the Reference area was Occasionally
Useful, while 17.7% thought this feature was Often Useful or Very Useful. The mean for
this question was 1.7, with a standard deviation of 1.0 and a median of 2.0 (Occasionally
Useful).
Figure 52. Faculty perception of the usefulness of the Reference area
The seventh question on faculty perceptions of the usefulness of DDP features
concerned checking a student’s past work. Figure 53 displays the results of this question.
Of the 83 faculty responding to this question, only 2.4% (2 respondents) did not know the
meaning of this feature, while 51.8% thought checking a student’s past work was
Occasionally Useful. Approximately 20% of faculty thought checking a student’s past
work was Not Useful. The mean for this question was 2.1, with a standard deviation of
0.9. The median was 2.0 (Occasionally Useful).
Figure 53. Faculty perception of the usefulness of checking a student’s past work
The eighth question on faculty perceptions of the usefulness of DDP features
concerned using the DDP for narratives. Figure 54 displays the results of this question.
Of the 81 faculty responding to this question 4.9% did not know the meaning of this
feature. A total of 29.6% of faculty thought this feature was Not Useful and 38.3%
thought using the DDP for narratives was Often Useful or Very Useful. The mean for this
question was 2.2, with a standard deviation of 1.2 and a median of 2.0 (Occasionally
Useful).
Figure 54. Faculty perception of the usefulness of the DDP for narratives
The last question on faculty perceptions of the usefulness of DDP features
concerned using the Help Menu. Figure 55 displays the results of this question. Of the 80
faculty responding to this question, 10.0% did not know the meaning of this feature, while 43.8% of faculty thought the Help Menu was Not Useful. Only 2.5% of faculty thought the Help Menu was Very Useful, and 38.8% of faculty thought the Help Menu was Occasionally Useful. The mean for this question was 1.5, with a standard deviation of 0.8 and a median of 1.0 (Not Useful).
Figure 55. Faculty perception of the usefulness of the Help Menu
Results from the faculty survey on their perceptions of the most useful DDP
features are summarized in Table 41. It should be noted that the mean scores for the most
useful features were approximately 3 (choice of Often Useful). Faculty perceptions of the least
useful DDP features included: the My Resource Area, the Help Menu, and the Reference
area. Faculty perceptions of the least-useful features were identical to their perceptions of
the least-often used features of the DDP (sub-question 2).
Table 41
Summary of Faculty Perception of the Most-Useful and Least-Useful Features of the DDP

Faculty Most-Useful Features:
  1. Providing feedback to students (M = 2.83)
  2. Viewing student self assessments (M = 2.59)
  3. Accessing the DDP from off-campus (M = 2.37)
Faculty Least-Useful Features:
  1. Using the My Resource Area (M = 1.27)
  2. Using the Help Menu (M = 1.46)
  3. Using the Reference Area (M = 1.68)
Students and faculty were asked their perceptions of the usefulness of DDP
features. While some of the features differed between student and faculty surveys, there
were some similarities. Student and faculty surveys listed three of the same DDP
features: the Reference area, the My Resource area, and the Help Menu. Both students
and faculty rated these three features as the least-useful features of the DDP.
Interview Data Analysis
During the interviews, students and faculty were asked what features they
perceive as useful or not useful. This was a general question and most students and
faculty responded by describing what they do most often when they log onto the DDP.
Student Interview Results
Students were asked a general question about their perception of useful and not
useful DDP features. In addition, students were asked a question concerning their
perceptions of the purpose of the DDP.
Students frequently mentioned they used the DDP to upload self assessments,
work, and to read feedback. However, there was an overall pattern in their perception
that they used the DDP infrequently and only when required. For example: “I go there
[DDP], I do the download because that’s what I am told and that’s it. Periodically, I will
look in there just to see what is still in there. I haven’t had a great need to refer back;”
and “Actually, I have only gone on the DDP when asked to.” Two students made
comments that indicated a limited knowledge of the DDP: “I’m not overly familiar with
the different tabs;” and “I am not real familiar with how to get to some of the stuff on
there.” Several students described using the Reference area for a variety of things, such
as finding out major requirements, using student forms and criteria sheets, and definitions
for the eight abilities, but as a whole the students interviewed were not familiar with the
Reference or My Resources areas.
Students’ responses demonstrated they have knowledge of the main purposes of
the DDP. Most students described the DDP as a place where they could store information
that they (and faculty) could access at any time. For example: (a) “I think their purposes
were so that they and the student could access information about their courses any time.
And the student could also keep track of what she’s done and look back on [it];” (b) “I
think that it was to provide very concise and condensed form of keeping track of
everything;” and (c) “a form of storing our information, and that will go on from year to
year until you complete your course and you will see your strengths and weaknesses and
how much you’ve improved.”
One student commented that she thought she originally knew the purpose of the
DDP, but now was not sure: “At first I thought it was for me to be able to track how far
I’ve come, track my validations especially in that grid, be able to access that past
coursework, etc., but now, quite frankly, I’m at a loss because I’m never asked to use it.”
Although students did not list specific features they found useful or not useful, the
interview results were similar to their survey results. Students perceived their use of the
DDP as infrequent. Students seemed to understand the purpose of the DDP, but were not
using the DDP enough to achieve this purpose.
Faculty Interview Results
Although faculty were asked what features of the DDP they found useful or not
useful, their comments concerned what they liked about the DDP and its potential. For
example: (a) “what I … really like about us now using the DDP is that summary
feedback… we should have a paragraph or two for each student for each outcome level
course that we teach;” (b) “…[I like] the DDP being used as a resource for individual
instructors to see how their students are developing… we’ve identified its potential uses
in curriculum development and also in program evaluation;” and (c) “…what would be
extremely helpful is that, somehow each one of us, in every single course that we taught,
if we just wrote three sentences describing the quality of the work of every student in the
course. Obviously it would help in writing the narrative statements.”
Faculty described using the DDP to read student self assessments, to upload
feedback, and to create narrative statements. There were two negative comments about
the DDP that focused on time and work issues. For example: “it [DDP] makes writing
narratives more extensive. You have more useful information when people put stuff on
the DDP, and I will block out whole statements and then transfer them into the narrative
statement that I’m writing, and refine it. It is a long process;” and “I timed it. A minute
and forty seconds per upload per student just to go through the process of getting from
my file, up through the network, on to the DDP and back and make sure its there. When I
have a hundred of those to do, that’s too much time.”
Interview responses reinforced the results of the faculty surveys. Most faculty
mentioned that they uploaded feedback, read student self assessments, and read student
work. However, the interviews provided a number of ideas on potential use of the DDP,
and information on what faculty perceived as issues and problems with the DDP.
Sub-question 4:
Overall, what are students and faculty perceptions of the usefulness of the DDP?
Data were gathered to describe this question from two of the three approaches. No
data were gathered from the DDP relational database for this question. Student and faculty
surveys posed a question on the overall usefulness of the DDP. Student and faculty
interviews also contained a question on the overall usefulness of the DDP.
Survey Data Analysis
Students and faculty were asked their perception of the overall usefulness of the
DDP. This question was rated on a continuous Likert Scale of 1 to 5, with 1 as Not
Useful, 3 as Useful, and 5 as Extremely Useful. In addition, this question contained an
open-ended response area titled Please Explain.
Student Survey Results
Students were asked to rate their perception of the overall usefulness of the DDP
on a Likert Scale and then explain their answer in an open-ended question. Due to the
format of the Likert Scale, students could mark anywhere on a line. Scores were rounded
to the nearest point or half point to standardize the data. Figure 56 summarizes the Likert
Scale data. There were 318 student responses with 40.6% of students responding the
DDP overall was Useful. Of the students that responded to this question, 10.1% thought
the DDP overall was Not Useful. Advanced students had the highest percent of responses
that indicated the DDP was Not Useful (21.7%). Only 5.9% of beginning students
thought the DDP overall was Not Useful. A total of 68.6% of students thought the DDP
overall was Useful to Extremely Useful. Beginning students had the highest percent of
students who responded the DDP was Extremely Useful (11.8%), while only 1.7% of
advanced students thought the DDP was Extremely Useful.
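As a small illustration of the standardization step described above (the exact rule is an assumption; the text states only that marks were rounded to the nearest point or half point), each mark on the 1-to-5 line can be rounded to the nearest half point:

    # Minimal sketch of the assumed rounding rule: snap each mark on the
    # 1-5 line to the nearest 0.5 increment.
    def round_to_half_point(mark: float) -> float:
        return round(mark * 2) / 2

    print(round_to_half_point(3.3))   # 3.5
    print(round_to_half_point(2.1))   # 2.0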
Figure 56. Student perception of the overall usefulness of the DDP
Table 42 summarizes the data for each student group and the corresponding measures of central tendency for student perception of the overall usefulness of the DDP. The mean for this question was 3.0, with a standard deviation of 1.1 and a median of 3.0 (Useful).

Table 42
Student Survey Statistics on Overall Usefulness of the DDP

                            Beginning   Intermediate   Advanced   All Students
1 (Not Useful)                     10              9         12             31
1.5                                 0              0          4              4
2                                  37             12         11             60
2.5                                 4              1          0              5
3 (Useful)                         69             33         27            129
4                                  29             25          5             59
5 (Extremely Useful)               20              9          1             30
Total Responses                   169             89         60            318
Missing Responses                   3              2          1              6
Total Respondents                 172             91         61            324
Mean                              3.1            3.1        2.4            3.0
SD                                1.1            1.1        1.0            1.1
Median                            3.0            3.0        3.0            3.0
In addition to the Likert Scale on overall usefulness of the DDP, students were
asked to explain their answers. The open-ended responses were analyzed using SPSS
Text Analysis for Survey software. The original data extraction identified 190 different
terms that were pared down to 26. All responses were placed into seven categories with
some responses falling into more than one category. The total count of categorized
responses was 241. The categories identified were: Find Useful, Negative Comments,
Frequency of Use, Use for Review, Need to Learn More, Other, and Blank responses.
Table 43 contains the results of the analysis.
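The categorization itself was carried out with SPSS Text Analysis for Surveys. Purely as an illustration of the general idea (and not the actual procedure or term lists used in the study), a keyword-based, multi-label coding of open-ended responses might look like the following sketch in Python; the keyword lists here are hypothetical.

    # Illustrative sketch only: keyword-based, multi-label coding of open-ended
    # responses. Categories mirror Table 43; the keyword lists are hypothetical
    # and are not the rules used by SPSS Text Analysis for Surveys.
    CATEGORY_KEYWORDS = {
        "Find Useful": ["useful", "helpful", "great", "good tool"],
        "Negative Comments": ["pain", "extra work", "not necessary"],
        "Frequency of Use": ["never use", "hardly", "not required", "more often"],
        "Use for Review": ["look back", "review", "past performances"],
        "Need to Learn More": ["don't know much", "don't know enough", "wish I knew"],
    }

    def categorize(response: str) -> list:
        """Return every category whose keywords appear in the response;
        a response may fall into more than one category, as in the study."""
        text = response.lower()
        matched = [cat for cat, words in CATEGORY_KEYWORDS.items()
                   if any(word in text for word in words)]
        return matched or ["Other"]

    print(categorize("I use it as a guideline; it is helpful to look back."))
    # ['Find Useful', 'Use for Review']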
Table 43 lists the thematic categories, the number of responses, and sample
responses from each category. There was a total of 324 surveys analyzed; 132 or 40.7%
had blank responses for this question. The category with the largest number of responses
was Find Useful with 95 responses (39.4%). Responses were placed in this category if
they were analyzed to be positive responses, describing what the students found useful
concerning the DDP. Any responses of a negative nature were placed into the Negative
Comments category which had a total of 55 responses (22.8%). Frequency of Use was a
category created due to the number of responses that referred to not using the DDP
enough, using the DDP infrequently, or the DDP would be useful if it was used more
often. There was a total of 53 Frequency of Use responses (22.0%). Responses referring
to using the DDP to review past performances were placed in a separate category, Used
for Review; there were 26 responses in this category (10.8%). While reviewing past
performances is a type of response that could fall into the Find Useful category, a separate
category (Used for Review) was created to keep track of this theme for the institution.
Table 43
Thematic Conceptual Matrix for Student Survey Responses to Overall Usefulness of the DDP

Category               N
Find Useful           95
Negative Comments     55
Frequency of Use      53
Use for Review        26
Need to Learn More    10
Other                  2
Blank                132

Example comments (across categories):
- Able to see everything -- can access it off-campus which is helpful.
- Don't have to fill out self assessments by hand. Self assess easier because you view yourself
- I believe it is a good tool to assess my progress as a student.
- I don't use the functions a lot now, but they will be useful when I have to use them.
- It's a great place to access past performances and to view teacher feedback quickly!
- I never use it unless I have to for a class.
- I really don't find useful, I don't even know why we use it.
- I think it is nice to have but not necessary.
- It seems like it is a lot of extra work and I've never been aware of what the purpose is.
- It seems like the DDP is always used at the very end of the semester because instructors "have to". I hardly ever need to use it in the beginning or middle of a semester.
- It's a pain! Trying to understand how to attach stuff - and you can't change things… It's a pain!
- Not all professors use it.
- Would be more useful if instructors utilized it more often.
- Although I do not access the DDP often, when I have accessed it I found it to be very helpful.
- I am not required to use it very often.
- I don't use if much, but I see its merit.
- I have hardly used the DDP and have very little work on my matrix.
- Allows me to review what I've done in class that I don't see while in class.
- I believe it is a good tool to assess my progress as a student.
- I get to view my work, my feedback from instructors, and I'll always have it because its on the computer and not on paper where I can lose it or have the chance of it getting damaged.
- I use it as a guideline for improved future work.
- I use the DDP, but only because this is where my final self assessments are. But it is helpful to look back.
- I don't know enough about it, but it sure seems like there are many uses for it. I only use it when I was told to.
- I don't know much about it. This is my first semester and my first time using it.
- Useful but wish I knew how to access all of the tools available on the DDP.
- Clear 1
- If you don't know how to use it
There were a number of responses that referred to the need to learn more about the DDP to determine its usefulness. These responses (4.1%) were placed in the Need to Learn More category. The Other category contained two responses that did not fit into any of the other categories.
Data from both the Likert Scale and open-ended responses indicated that overall, students found the DDP useful. While there were a number of negative comments (22.8%), 95 responses (39.4%) fell into the Find Useful category. Responses from the Frequency of Use category (22.0%) indicated students wanted to use the DDP more
often.
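The thematic coding followed the same pattern throughout this chapter: extract terms, pare them down, assign each response to one or more categories, and report percentages against the pool of categorized responses rather than against all surveys. The sketch below illustrates that bookkeeping in simplified form (Python; the keyword rules and function names are hypothetical stand-ins, since the actual coding was done with SPSS Text Analysis for Surveys software):

```python
from collections import Counter

# Hypothetical keyword rules standing in for the pared-down terms; the actual
# categorization was performed in SPSS Text Analysis for Surveys software.
RULES = {
    "Find Useful": ["useful", "helpful", "good tool"],
    "Negative Comments": ["pain", "not necessary", "extra work"],
    "Frequency of Use": ["never use", "hardly", "more often"],
    "Use for Review": ["look back", "review", "past performances"],
    "Need to Learn More": ["learn more", "wish i knew", "first time using"],
}

def categorize(response):
    """Return every category whose terms appear in the response;
    a single response may fall into more than one category."""
    text = response.lower()
    hits = [cat for cat, terms in RULES.items()
            if any(term in text for term in terms)]
    return hits or ["Other"]

def tally(responses):
    """Count category hits and report percentages against the total number
    of categorized responses (not the number of surveys)."""
    counts = Counter()
    for response in responses:
        if not response.strip():
            counts["Blank"] += 1
        else:
            counts.update(categorize(response))
    categorized = max(sum(n for cat, n in counts.items() if cat != "Blank"), 1)
    percents = {cat: round(100 * n / categorized, 1)
                for cat, n in counts.items() if cat != "Blank"}
    return counts, percents
```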
Faculty Survey Results
Faculty were asked to rate their perception of the overall usefulness of the DDP
on a Likert Scale of 1 to 5, with 1 as Not Useful, 3 as Useful and 5 as Extremely Useful.
In addition, this question contained an open-ended response area titled Please Explain.
Due to the format of the Likert Scale, faculty could mark anywhere on a line. Scores were rounded to the nearest half point to standardize the data.
Figure 57 summarizes the responses to the Likert Scale question of faculty
perception of the overall usefulness of the DDP. A total of 90 faculty responded to this
question, and 33.3% perceived the DDP overall as Useful. It is interesting to note that
50.0% of faculty responded with an answer that was greater than useful (choice greater
than 3), while 16.6% perceived the DDP as being less than useful (choice less than 3).
The mean was 3.5, with a standard deviation of 1.1 and a median of 3.3.
[Figure 57 chart: bars showing percent of faculty (0% to 50%) for each survey choice from 1 (Not Useful) to 5 (Extremely Useful); the tallest bar, 33.3%, is at 3 (Useful).]
Figure 57. Faculty perception of the overall usefulness of the DDP
In addition to the Likert Scale on the overall usefulness of the DDP, faculty were
asked to explain their answers. These open-ended responses were analyzed using SPSS
Text Analysis for Survey software. The original data extraction identified 146 different
terms. These terms were pared down to 26 and placed into seven categories, with some
responses falling into more than one category. There were a total of 55 responses for all
categories.
Table 44 contains the thematic categories, number of responses, and sample
responses from each category. The seven categories identified were: Find Useful,
Negative Comments, Need to Learn More, Frequency of Use, Suggestions, Other, and
Blank responses. There was a total of 93 surveys analyzed with 50 or 53.8% blank
responses. The category with the highest number of responses was Find Useful (45.5%).
Table 44
Thematic Conceptual Matrix for Faculty Survey Responses to Overall Usefulness of the DDP

Category               N
Find Useful           25
Negative Comments     19
Need to Learn More     4
Frequency of Use       3
Suggestions            3
Other                  1
Blank                 50

Example comments (across categories):
- Useful for narratives. Can cut/paste quotes from other instructors
- I think if used appropriately it's an excellent way to show development
- I'm a fan -- it is a good resource for giving feedback and documenting student progress
- I may not use it but I believe it is extremely useful to students and colleagues at the advanced levels
- It helps both teachers and students have a cumulative picture of student learning
- Very helpful for writing narratives and honors nominations has pushed me to give more complete, clear feedback has pushed students to do better self assessment
- Don't see great use factor beyond 2-3 years after graduation.
- It is not suitable for comments on specific parts of a work or for doing feedback in a variety of settings.
- I simply haven't had the opportunity to create a need for it yet.
- As reported, end of semester time and energy resources are an issue -- I have 58 files to upload right now -- need a department wide review of use.
- Better understanding/agreement re. the philosophy. In particular, I do not find video/audio useful (because its linearity) and raw performances do not help me in DDP use. I understand the student may make use of it, but that is not a substitute for analysis (feedback/assessment).
- I think we have spent a lot of money on a technological tool that has marginal value that we now need to justify. I may be wrong. I could easily be convinced that I am incorrect. However, the only value I see to the DDP is that students can look back on previous performances. While I think this is neat, I don't see how that is worth millions of dollars and thousands of hours of investment.
- It would be more useful if I fully understand how it can be used.
- Still learning -- not taking advantage of its full capabilities and struggle w/ time and choice of when to use it.
- The multiple uses --from photos of students I'm not familiar with to being able to track student growth as well as begin to get more "sense" of program outcome achievement over time.
- I would rank this higher, but I haven't fully discovered all of the possibilities.
- Would like to see more systematically used.
- Useful as long as it is used. I hope faculty reference this key performance when writing graduation narratives.
- It's most useful when there's lots there.
- The more flexibility it can get the better.
- I am generally positive about it but think a co-coordinated department effort with a structure and plan would improve its usefulness to us. However, I don't think that that is where our dept's priorities lie, even though we are making some effort in that direction.
- This workshop was stimulating.
Responses were placed in the Find Useful category if they were analyzed to be positive responses describing what the faculty found useful concerning the DDP. Any responses of a
negative nature (34.5%) were placed under Negative Comments. Two categories had a
low number of responses (Frequency of Use – 5.5% and Need to Learn More – 9.1%).
Responses that contained suggestions to assist in making the DDP more useful were
placed in the category Suggestions (5.5%). Responses that did not fit into any category
were placed in the category Other (1.8%).
Data from the Likert Scale and open-ended responses indicated that overall,
faculty found the DDP useful. Over 50% of faculty responded with a choice greater than
3 (Useful). While there were a number of negative responses (19), 25 responses indicated
that faculty found the DDP useful.
Survey responses indicated that students and faculty perceived the DDP as useful, with both sets of data having a median of 3. Responses to the open-ended question Please Explain seemed to be more positive for students, with a high number of responses indicating that students wanted to use the DDP more often (22.0%). Faculty responses to the open-ended question indicated that, while faculty perceived the DDP as useful, they saw it as more useful for students than for themselves.
Interview Data Analysis
During the interviews, students and faculty were asked their perception of the
overall usefulness of the DDP. An additional question was asked concerning any
differences they perceived in using the DDP for self assessment and feedback.
Student Interview Results
Some students described the DDP as useful. However, a number of students qualified their statements, noting the DDP could be more useful if they were using it more frequently. For example: (a) “I don’t see a major use for me in my art therapy; I don’t have a whole lot of things to put in there;” (b) “I haven’t had any reason to [use it], so I don’t see it as useful;” (c) “If it was fully functional I think it would be great…if it was encouraged to be used in each and every class I think it would be a great tool;” and (d) “the DDP would be such an awesome tool if it was used more frequently.”
Students were mixed in their responses on differences in self assessments and feedback when they used the DDP. Four students responded that there were differences when
they used the DDP. Their comments included: (a) “I was able to look at it when I had
time to look at it. If I didn’t have time when it was first in there I could come back later to
look at it, so that was good. And I can go back and look at it again easily without going
down in the basement and digging through the papers;” (b) “I was able to go back into
the DDP and pull-up my assessment, my self-assessment, and her assessment, and look
and see ‘These were your not so strong points, and this is what you needed to change’ …
it was right there on the computer;” and (c) “One thing that would be slightly different,
we did have to upload video for one of our field experiences, and then our peers watched
the video and gave us some feedback. So that was really unique…I think it also gives
people a chance to be more careful about what they’re saying because you’re not just
writing it on a piece of paper and you can’t change it.” One student spoke of the lack of
human interaction when using the DDP. Three students did not perceive any difference in
self assessment and feedback when they used the DDP.
Overall, student interview data reinforced the data from the surveys. Students
perceived the DDP as useful, but thought they were using it infrequently. Several of the
students commented that using the DDP for self assessments and feedback added the
dimension of being able to go back and review their work. However, one student missed
the “human interaction” when she used the DDP.
Perhaps students’ perception of the overall usefulness of the DDP can be summed
up in one student’s response: “It’s useful because it gives you the opportunity to see
feedback from the instructors. It’s clear communication between you and the
instructors.”
Faculty Interview Results
The majority of faculty described the DDP as being overall useful. They provided
comments concerning its usefulness including: “I really like it. I really like the public
quality and the way that it forces me to think about how this is going to help a student
and other instructors who read feedback” and “I think that for students to have specific
benchmarks and to reflect back on them no matter what their major or their career path,
it’s wonderful to learn how we evaluate oneself.”
A few faculty described the potential of the DDP to become more useful if it were
used more frequently. For example: “I’ll say it’s becoming very useful…when there’s
enough stuff in there, and we’re using it in a more effective way and a way that’s lined up
better with our philosophy, then it’s worthwhile; worth the extra effort” and “[could] be
used to create a comprehensive comment by the instructor, so when the student would
graduate we would have…some very detailed evaluations that we could use for the
narrative statement.”
Not all faculty interviewed perceived the DDP as useful. One faculty remarked:
“I don’t see from my perspective that it’s a lot of value…I don’t really see it in its present
form as a developmental tool…It’s no easier for me than going to the file cabinet in the
secretary’s office and looking at the paper record.”
Faculty’s views on differences in self assessment and feedback when they use the
DDP varied. Several faculty stated they had not used the DDP enough to make a
judgment. Three faculty described differences in self assessment and feedback when they
used the DDP. Example comments included: (a) “I think I do it kind of purposefully
when I use the DDP as a way to communicate student’s performance to other faculty. I
don’t know if I’m successful, but that is my intent;” (b) “No question about it, in 110
especially. This is the only class where I began class not using the DDP, and developed it
specifically for that purpose;” and (c) “What I’ve found—I’m not sure if I could prove
this or not—but I really think that using the DDP for a comprehensive, end of semester
self evaluation really helps the student to take stock of her learning.”
Faculty interviews reinforced their survey results. Faculty found the DDP overall
useful, but think, in most cases, the system needs to be used more frequently. Several
faculty mentioned time and work issues in the interviews, similar to those listed in the
faculty surveys. The interviews also provided numerous faculty suggestions on making
the DDP more useful.
Sub-question 5:
What do students and faculty think of the ease of use of the DDP?
Data were gathered to describe this question from two of the three approaches. No
data were gathered from the DDP relational database for this question. Student and faculty
surveys posed a question on the overall ease of use of the DDP. Student and faculty
interviews usually contained a question relating to ease of use.
Survey Data Analysis
Students and faculty were asked their perception of the ease of use of the DDP.
This question was rated on a continuous Likert Scale of 1 to 5, with 1 as Not Easy, 3 as
Easy and 5 as Extremely Easy. In addition, this question contained an open-ended
response area titled Please Explain.
Student Survey Results
Students were asked to rate their perception of the ease of use of the DDP on a
Likert Scale and then explain their answer in an open-ended question. Due to the format
of the Likert Scale, students could mark anywhere on a line. Scores were rounded to the
nearest point or half point to standardize the data. Figure 58 summarizes the Likert Scale
data. Of the students responding to this question (317), 39.1% responded that overall the
DDP was Easy to use, while 6.0% thought the DDP was Not Easy to use. Advanced
students had the highest percent of students that responded the DDP was Not Easy (9.8%)
to use. Only 3.4% of intermediate students thought the DDP was Not Easy to use. A total
of 74.1% of students thought the DDP was Easy to Extremely Easy to use. Intermediate
students had the highest percent of students responding the DDP was Extremely Easy to
use (24.7%), while only 13.1% of advanced students thought the DDP was Extremely
Easy to use.
[Figure 58 chart: grouped bars for Beginning, Intermediate, Advanced, and All Students across the survey choices 1 (Not Easy) through 5 (Extremely Easy); vertical axis shows percent of students (0% to 50%).]
Figure 58. Student perception of the overall ease of use of DDP
Table 45 displays the results for all student groups and the corresponding measures of central tendency for student perception of the overall ease of use of the DDP.
Table 45
Student Survey Statistics on Overall Ease of Use of the DDP

Response              Beginning   Intermediate   Advanced   All Students
1 Not Easy                10            3             6            19
2                         44            8             7            59
2.5                        2            1             1             4
3 Easy                    71           27            26           124
3.5                        1            0             0             1
4                         15           27            13            55
4.5                        0            1             0             1
5 Extremely Easy          24           22             8            54
Total Responses          167           89            61           317
Missing Responses          5            2             0             7
Total Respondents        172           91            61           324
Mean                     3.0          3.7           3.2           3.2
SD                       1.1          1.1           1.1           1.1
Median                   3.0          4.0           3.0           3.0
The mean for this question was 3.2, with a standard deviation of 1.1 and a median of 3.0 (Easy).
In addition to the Likert Scale on overall ease of use of the DDP, students were
asked to explain their answers. The open-ended responses were analyzed using SPSS
Text Analysis For Survey software. Table 46 displays the results of the analysis.
Table 46
Thematic Conceptual Matrix for Student Survey Responses to Overall Ease of Use of the DDP

Category                            N
Easy to Use                        97
Negative Comments                  42
Need Directions or Instructions    21
Frequency of Use                   17
Blank                             162

Example comments (across categories):
- Accessible and self explanatory
- Easy to use and navigate to appropriate area
- I am relatively computer literate so if I don't see something immediately I assume I've missed it and keep looking rather than giving up.
- I don't have any problems when using it.
- I don't know computers but I know how to log in. They have easy instructions to follow.
- It's easy once you know what you are doing, but I am still learning how to use it.
- It's not really clear how to get to certain areas, but once you're in them, it's pretty straightforward.
- Sometimes things aren't where they should be and there isn't someone there to explain.
- When I did understand it, the procedure for how to do things changed.
- With instructions, I can use the DDP but I'm not to good with computers.
- Seems a bit complicated to go through the whole process of uploading and entering info that I don't really use.
- It's kind of fussy.
- Before using it someone explained what it was a how to use it in a very understandable way.
- I keep a copy of the instructions.
- It is hard to do without instructions.
- Sometimes I have to refer to instructions on how to upload files to the DDP.
- I have to pull out my technology folder for how to access and passwords etc.
- When I am given a sheet with steps on what to do, I find it easier to use.
- At first it was a little challenging, but after I did it a few times it became easier.
- Because it is used so rarely, oftentimes one has to refresh themselves with how to operate again.
- It is difficult to remember if we don't use it very often only at end of the semester.
- With more use I would become more proficient.
- There are some things I do not know how to do because I have not been asked to do them.
- Once it’s explained it's easy but for me I have not been on it a lot so I tend to forget
The original data extraction identified 94 different terms. These terms were pared
down to 62 and placed into five categories, with some responses falling into more than
one category. The total number of responses in all categories was 177. Table 46 contains
the thematic categories, frequency of responses, and sample responses from each
category. There was a total of 324 surveys analyzed, with 162 blank responses (50.0%).
The category with the highest number of responses was Easy to Use (54.8%). Responses
were placed in this category if they described the DDP as easy to use. Any responses of a
negative nature were placed into the Negative Comments category (23.7%). Need for
Directions or Instructions was a category created due to a number of responses that
described the need to use directions or have instructions given by faculty (11.9%).
Frequency of Use was a category created due to the number of responses that referred to
not using the DDP enough, using the DDP infrequently, or the DDP would be easier to
use if it was used more often (9.6%).
Data from the Likert Scale and open-ended responses indicated that students
perceived the DDP as easy to use, with a total of 57.1% of students responding with a
choice greater than 3 (Easy). While there were a number of negative comments (42), 97
responses referred to the DDP as being easy to use. There were also 17 responses that
indicated students wanted to use the DDP more frequently.
Faculty Survey Results
Faculty were asked to rate their perception of the overall ease of use of the DDP
on a Likert Scale of 1 to 5, with 1 as Not Easy, 3 as Easy and 5 as Extremely Easy. In
addition, this question contained an open-ended response area titled Please Explain. Due
to the format of the Likert Scale, faculty could mark anywhere on a line. Scores were
rounded to the nearest point or half point to standardize the data. Figure 59 summarizes
the results of the Likert Scale question of faculty perception of the overall ease of use of
the DDP. A total of 91 faculty responded to this question, and 31.9% of faculty perceived
the DDP as Easy (3) to use. It is interesting to note that 30.0% of faculty responded with an answer that was greater than easy (choice greater than 3), while 35.2% responded with an answer that was less than easy (choice less than 3). The mean was 3.0, with a standard
deviation of 1.2 and the median was 3 (Easy).
[Figure 59 chart: bars showing percent of faculty (0% to 50%) for each survey choice from 1 (Not Easy) to 5 (Extremely Easy); the tallest bar, 31.9%, is at 3 (Easy).]
Figure 59. Faculty perception of the overall ease of use of DDP
In addition to the Likert Scale on overall ease of use of the DDP, faculty were
asked to explain their answers. These open-ended responses were analyzed using SPSS
Text Analysis for Survey software. The original data extraction identified 32 different
terms. These terms were pared down to 19 and placed into six categories, with some
responses falling into more than one category. The categories include: Ease of Use,
Negative Comments, Need Training or Directions, Other, Frequency of Use, and Blank
responses.
Table 47 contains a summary of the results, including thematic categories, number of responses, and sample responses from each category. There were a total of 93
surveys analyzed, with 60 surveys containing blank responses (64.5%). Responses were
placed into five categories, with some responses falling into multiple categories. There
were a total of 40 responses categorized.
Table 47
Thematic Conceptual Matrix for Faculty Survey Responses to Overall Ease of Use of the DDP

Category                       N
Ease of Use                   14
Negative Comments             14
Need Training or Directions    6
Other                          4
Frequency of Use               2
Blank                         60

Example comments (across categories):
- I've had lots of practice over the past 4-5 years.
- Once I learned it, it was easy.
- Very user friendly
- I needed to use it to upload assessment feedback, followed the directions provided and viola!
- Always help is available
- Relatively easy, but so far I have only uploaded text files.
- I forget how to do some things I've learned before 2. I need to separate out all the individual files (per student) vs. keeping a class file for feedback.
- I really hate how it boots me out repeatedly and the refresh button used 5X for every back button.
- Never used it.
- Numbers of students are the problem
- We can't easily revise key performances assessments without taking down, cloning, reinstalling this is not easy and mistake prone.
- It takes me many days at the end of the semester to upload feedback.
- Because I use it at the end of the semester I always need a learning refresher to get into the grove again.
- Continually need further training
- I can provide basic primary feedback but anything beyond that, i.e. setting up a performance in the DDP, I need help with.
- I'm getting better, but I still need to call on Sheila's expertise.
- I just experienced the difference between DDP through the Internet vs. Outlook. When I enter thru Outlook, there's no way to "go back" once you open feedback.
- I need to use reference and resource areas more productively.
- See 25
- I have too little experience.
- What I've done once I can repeat.
Two categories tied for the highest number of responses (35%): Ease of Use and Negative Comments. Responses were placed in the Ease of Use category if they were determined to be responses that described the ease of use of the DDP. Any responses of a negative nature were placed into the Negative Comments category. Responses were
negative nature were placed into the Negative Comments category. Responses were
placed in the Need Training/Directions category if the response referred to the faculty
needing more training or directions on using the DDP (15.0%). Responses referring to
using the DDP more often were placed in the Frequency of Use category (5.0%).
Responses that did not fit into any category were placed in the Other category (10.0%).
Survey responses indicated students and faculty perceived the DDP as easy to use.
Both student and faculty responses had a median of 3 (Easy). The open-ended responses for the question Please Explain seemed to be more positive for students, with 54.8% of
responses indicating the DDP was easy to use. Open-ended faculty responses indicated
that while they perceived the DDP as easy to use (35.0%), there were an equal number of
negative comments concerning the DDP.
Interview Data Analysis
Students and faculty were asked for their perceptions of the overall ease of use of
the DDP.
Student Interview Results
Seven of the students interviewed described the DDP as easy to use. Comments
included: (a) “It is fairly easy to use, no problems;” (b) “Not really [any problems]. I
think it is very user friendly. I mean I think it is easy to know how to get into it;” and (c)
“I think it’s gotten better. I know when I first started we couldn’t remove it [uploaded
files].” One student did not think the DDP was easy to use: “It scares me…because I
don’t know how [to use it].”
Student interview data followed the same pattern as the survey results. Students
found the DDP easy to use and experienced few problems. However, the interviews
provided the opportunity to probe student responses in more depth and ask additional
questions.
Faculty Interview Results
Faculty perceptions on the ease of use of the DDP varied. Three faculty found the
DDP easy to use, responding: “…navigating the DDP is easy for me;” and “I’ve never,
ever found anything confusing about it.” One faculty remarked the DDP has gotten
easier to use: “I think it’s a little easier. As I said, I think for me the trick is how intuitive
it is.” Three faculty described problems they have had with the DDP. These problems
included receiving the refresh message and/or timing out on the system, problems setting
up a key performance, and the inability to cut and paste comments directly onto the DDP
without creating a separate document.
Faculty interview data reinforced the findings of their survey. Faculty generally
agreed that the DDP is easy to use, but they had concerns about issues and problems they
have experienced.
Sub-question 6:
What are student and faculty perceptions concerning their frequency of use of the DDP?
Data were gathered to describe this question from two of the three approaches. No
data were gathered from database mining for this question. Student and faculty surveys
posed a question on the frequency of use of the DDP. Student and faculty interview
questions also contained a question on the frequency of use of the DDP.
Survey Data Analysis
Students and faculty were asked their perception of their frequency of use of the
DDP. This question was rated on a Likert Scale of 1 to 5. In addition, this question
contained an open-ended response area titled Please Explain.
Student Survey Results
Students were asked to rate their frequency of use of the DDP on a continuous
Likert Scale of 1 to 5 with 1 as Not Enough, 3 as Enough, and 5 as Too Much. Due to the
format of the Likert Scale, students could mark anywhere on a line. Scores were
rounded to the nearest point or half point to standardize the data. Figure 60 summarizes
the Likert Scale data on student perceptions of the frequency of use of the DDP. Of the
students responding to this question (315), 51.4% answered they were not using the DDP
enough (response less than 3), with 26.0% of students responding Not Enough (response
of 1). Only 2.2% of students perceived they were using the DDP Too Much (response of
5). A response of greater than 3 was given by 7.9% of the students. Beginning students
had the highest percent of responses that indicated they used the DDP Enough (44.6%).
[Figure 60 chart: grouped bars for Beginning, Intermediate, Advanced, and All Students across the survey choices 1 (Not Enough) through 5 (Too Much); vertical axis shows percent of students (0% to 50%).]
Figure 60. Student perception of the frequency of use of the DDP
Table 48 summarizes the results for all student groups and the corresponding measures of central tendency on their perception of the frequency of use of the DDP.
The mean for all students was 2.3, with a standard deviation of 1.0; the median was 2.0.
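As an arithmetic check, the all-student mean reported above can be approximately recovered from the rounded response frequencies shown in Table 48, weighting each scale value by its count:

\[
\bar{x} \approx \frac{1(82) + 2(77) + 2.5(3) + 3(128) + 3.5(1) + 4(17) + 5(7)}{315} = \frac{734}{315} \approx 2.3
\]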
Table 48
Student Survey Statistics on Frequency of Use of the DDP

Response              Beginning   Intermediate   Advanced   All Students
1 Not Enough              45           20            17            82
2                         33           30            15            77
2.5                        1            0             1             3
3 Enough                  74           33            21           128
3.5                        1            0             0             1
4                          8            4             5            17
5 Too Much                 4            1             2             7
Total Responses          166           88            61           315
Missing Responses          6            3             0             9
Total Respondents        172           91            61           324
Mean                     2.7          2.3           2.3           2.3
SD                       1.0          0.9           1.1           1.0
Median                   3.0          2.0           2.0           2.0
In addition to the Likert Scale on frequency of use of the DDP, students were asked to explain their answers. The open-ended responses were analyzed using SPSS Text Analysis for Survey software; Table 49 displays the results of the analysis. The original data extraction identified 55 different terms. These
terms were pared down to 39 and placed into seven categories, with some responses
falling into more than one category. There was a total of 165 categorized responses.
Table 49 contains the thematic categories, number of responses, and sample
responses for each category. There were a total of 324 surveys analyzed, with 166
surveys containing blank responses (51.2%). The category with the highest number of
responses was Frequency of Use (52.1%). Responses were placed in this category if they
described wanting to use the DDP more, not using the DDP enough, or using the DDP
more to enhance its usefulness. The category Used Right Amount contained responses (18.2%) that indicated the students thought they were using the DDP enough. Any
responses of a negative nature (15.1%) were placed into the Negative Comments
category. In some cases students referred to the DDP as being useful, or described what
they used it for. These responses (6.7%) were placed in the Useful/Used for category.
There were a number of responses (2.4%) in which students stated they were unsure or
did not know. These responses were placed in the Unsure or Don’t Know category. The
Other category contained nine responses (5.4%) that did not fit into any other category.
Table 49
Thematic Conceptual Matrix for Student Survey Responses to Frequency of Use of the DDP

Category                 N
Frequency of Use        86
Used Right Amount       30
Negative Comments       25
Useful/Used for         11
Other                    9
Unsure or Don't Know     4
Blank                  166

Example comments (across categories):
- Not required for every class
- Especially related to my major, I would like to see these things on the DDP.
- Haven't been asked by teachers to use DDP. Have only used 2 twice for two semesters.
- I have only used the DDP for assessments in certain classes or for WEC assessments.
- I have only used the DDP twice this entire semester. The more practice the better.
- It seems that for its purpose we don't use it enough. We should use it more.
- Each class recommends you use DDP.
- Enough for me, not enough for my education.
- I have been asked to use the DDP enough this semester than before and it help get me used to it.
- Just right! Too little won't reach you -- too much makes you sick of it.
- My instructors constantly remind when I need to upload and when I want to check feedback.
- Some teachers want you to upload. Some don't that's just how it is.
- I don't think the DDP is useful as it is for upper division students.
- I find the DDP to be annoying. If it was not required I would not use it. I find it to be an extra step in the self assessment process. I would be happy to just write-out a final self assessment on word and hand it in; my instructors would keep it anyway in my file I would also have a copy saved for myself if needed.
- I only use it when I asked to. If I was not asked I would not us it.
- I'm not proficient, therefore I'm uncomfortable using this site.
- Too much only because it has been not a part of my school until recently - it has come to me as extra work I was unaware of.
- We really were only told about it not how to use it.
- I find the DDP useful, esp. for viewing feedback; however, few of my course require I upload work to the DDP.
- On a regular basis I use it to check my progress, load my assessments, or go back and look at previous works.
- I like having my work on the DDP. Then I don't have to worry about keeping tract of it.
- I use DDP mostly off site It is a great tool for communications.
- Communications and NSS teachers asked us to use it.
- Incorporate the Educator and the DDP.
- See above
- Don't really have an opinion on this one.
- I am not exactly sure how often I should be using DDP.
- I am not fully sure why the DDP is used other then for review of my work.
Data from the Likert Scale and open-ended responses indicated students perceived
they do not use the DDP enough. A total of 57.1% of students responded with a choice
less than 3. While there were a number of negative comments in the open-ended response
area (15.1%), 52.1% of categorized responses referred to students wanting to use the
DDP more often.
Faculty Survey Results
Faculty were asked to rate their perception of their frequency of DDP use with
their students on a continuous Likert Scale with 1 as Never, 3 as Often and 5 as
Frequently. In addition, this question contained an open-ended response area titled Please
Explain. Due to the format of the Likert Scale, faculty could mark anywhere on a line.
Scores were rounded to the nearest point or half point to standardize the data.
Figure 61 summarizes the responses to the Likert Scale of faculty perception of
the frequency of their DDP use. A total of 86 faculty responded to this question and
24.4% of faculty perceived their DDP use with students as Often (3). It is interesting to
note that 60.5% of faculty responded with an answer less than 3 (Often), while 15.1%
perceived their use of DDP with students as Frequently (choice greater than 3). The mean
for this question was 2.5, with a standard deviation of 1.1; the median was 2.0 between
Never (1) and Often (3).
[Figure 61 chart: bars showing percent of faculty (0% to 50%) for each survey choice from 1 (Never) to 5 (Frequently); 24.4% of faculty fall at 3 (Often).]
Figure 61. Faculty perception of the frequency of use of the DDP
In addition to the Likert Scale on how often faculty used the DDP with their
students, faculty were asked to explain their answers. These open-ended responses were
analyzed using SPSS Text Analysis for Survey software. The original data extraction
identified 26 different terms. These terms were pared down to 12 and placed into six
categories. No responses fell into more than one category.
Table 50 contains a summary of the results of the SPSS Text Analysis with the
thematic categories, number of responses, and sample responses from each category. The
response categories for this question were somewhat more difficult to assign, due to the
nature of the responses. Some responses were specific to a number of times faculty used
the DDP. Three categories were created based on responses that contained numbers.
Responses that referred to two or more DDP uses (key performances) were placed in the Meets Institutional Goal category. Responses indicating infrequent or occasional use were placed in the Use Occasionally category. Responses that indicated a high amount
of DDP use were placed in the Use Frequently category. The remainder of responses
were placed in the categories Do not use, Other, and Blank. There were a total of 93
surveys analyzed with 57 surveys that contained blank responses (61.3%). There were a
total of 37 categorized responses, with some responses falling into more than one
category.
The category with the highest number of responses was Use Occasionally, with
18 responses (48.6%). The category Do Not Use had nine responses (24.3%). The
category Meets Institutional Goal contained six responses (16.2%), while the category
Use Frequently contained two responses (5.4%). The Other category contained one
response (2.7%).
Table 50
Thematic Conceptual Matrix for Faculty Survey Responses to Frequency of Use of the DDP

Category                    N
Use Occasionally           18
Do Not Use                  9
Meets Institutional Goal    7
Use Frequently              2
Other                       1
Blank                      57

Example comments (across categories):
- 1 key performance, per course
- For externals only at this point. May later include final feedback that I work process anyway.
- I am on the curve of adoption toward "often". I've made a commitment to myself to use it every semester.
- I am doing a bit more each semester. I have designed a key performance every other semester.
- I try to do one set of DDP feedback in a course.
- Only when required or reminded, sadly
- Haven't known enough yet -- too much to learn in first years of teaching.
- I find I want students to rely more on intense face-to-face feedback or paper and in person with the students work before rare, not rely on computer mediated experiences as substitutes.
- I used it but stopped. It took too much time. At 2 minutes/student to upload in a class of 30 this is 1 hour.
- Not required to put any performances on the DDP.
- 2x each semester
- Depends on the course and if t is a key performance twice each semester for such a course.
- L4 and L6 end of semester formal external assessments
- Require items to be uploaded in all classes and feedback is given via DDP
- All discipline 383 internships all discipline 483-492 internships used at end of the semester.
- I have a key performance in almost every course in my discipline.
- See comments previous page
Student survey responses on how often they used the DDP indicated they perceived the DDP as not being used enough. The students’ open-ended responses to Please Explain reinforced this view. Over 25% of student responses fell into the Frequency of Use category (use the DDP more). Over 60% of faculty chose less than often to describe their frequency of use of the DDP with their students. Faculty open-ended responses to this question pertained to how often they were using the DDP, with
Use Occasionally (48.6%) as the top category followed by Do Not Use (24.3%).
Interview Data Analysis
During the interviews, students and faculty were asked their perceptions on how
often they use the DDP.
Student Interview Results
There was a distinct pattern of infrequent use of the DDP in the student interview
responses. For example: (a) “I haven’t had to [use the DDP]. It’s kind of puzzling to
me;” (b) “We haven’t had to [use the DDP];” (c) “I think it tends to be hit-or-miss with
the faculty’s comfort with the DDP;” and (d) “...in my other advanced nursing courses I
have not done a single upload of DDP.”
Other patterns in student responses concerning frequency of use included the DDP being used more in beginning courses (less in the majors) and wanting to use the DDP more. Examples of student responses included: (a) “When I initially came to Alverno… we did a lot of DDP work. After that there really wasn’t much to upload;” (b) “…it’s really infrequently at this point, now that I’ve gotten into the upper level course work;” (c) “I haven’t had to [use the DDP]. It is kind of puzzling to me;” and (d) “It’s just hit-or-miss… potentially we could use it as our portfolio in Education.”
There were a number of student comments describing wanting to use the DDP
more or perceiving that other students were using it more. For example: (a) “I use it, but
not to the extent that I could use it;” (b) “I wish it would be more because I would like to
go in there and see [my work];” (c) “We did find out that underclassmen were using it
more;” and (d) “A lot of students are using it way more than I am.”
Student interview responses were similar to the data gathered from their surveys.
Students perceived they were using the DDP infrequently and wanted to use it more.
Faculty Interview Results
Faculty were asked how often they use the DDP with their classes. Faculty
responses indicated they thought their use of the DDP was infrequent. Five out of six
interviewees made comments concerning infrequency of use, such as: (a) “[I use it] only
in upper level courses;” (b) “I have the habit of using the DDP once in a semester;” and
(c) “I probably would want to use it more.” One faculty member indicated that he had
used the DDP in the past, but no longer used it, stating: “I am not a fan of the system. I
have not done anything else, and don’t intend to if I don’t have to.”
One notable response indicated a faculty member had gone “full circle” in their
thinking about the use of the DDP: “It’s interesting because what I’ve found is that I’m
now using the DDP when I used it in the past; in other words, I’ll start teaching a course
and it’ll prompt me to go back to assignments I’m thinking about and it’ll prompt me that
I did this assignment on the DDP.”
Student and faculty interview comments reinforced their survey results. Students
perceived they use the DDP infrequently and want to use it more. Faculty comments
indicated that while they perceived the DDP as useful, they are using it infrequently.
Sub-question 7:
What suggestions do students and faculty have on: (a) improvement of the usefulness
of the DDP, (b) assistance in using the DDP more, (c) general ideas for improvement
of the DDP, and (d) additional comments on the DDP?
Data were collected for this question from two of the three data-gathering
approaches. For this sub-question, no data were gathered from the DDP relational
database. Student and faculty surveys contained several questions that related to
suggestions on how to improve the usefulness of the DDP, how to increase the use of the
DDP, and what general ideas students and faculty had for improving the DDP. Interviews with students and faculty contained general questions on how to improve or
increase the use of the DDP.
Survey Data Analysis
The student and faculty surveys contained four open-ended questions to gather
suggestions to increase the use of the DDP and improve the program. These four
questions were used to organize the survey data:
1. What do you think could enhance the usefulness of the DDP?
2. What do you think could help you use the DDP more?
3. What are your suggestions for improving the DDP?
4. Do you have any additional comments on the DDP that you would like to
share?
What do you think could enhance the usefulness of the DDP?
Students and faculty were asked their ideas on how to enhance the usefulness of
the DDP. These open-ended responses were analyzed using SPSS Text Analysis for
Survey Software.
Student Survey Results. A total of 324 student surveys were analyzed. The
original data extraction identified 108 terms. These terms were synthesized down to 63
and placed into six categories, with some responses falling into multiple categories. The
six categories were: Frequency of Use, Suggestions, Directions/Training, Negative
Comments, Good the Way It Is, and Blank. Responses without answers, “N/A”, “no
opinion”, or “don’t know” were placed in the Blank category which contained 129
responses (39.8%). There was a total of 214 responses placed in the five other categories.
Responses were placed in the Frequency of Use category if they described
wanting to use the DDP more, not using the DDP enough, or requiring DDP use. There
were 54 responses in the Frequency of Use category (25.2%). Responses were placed in
the Suggestions category if they described a suggestion for improvement of the DDP.
There were 53 responses in the Suggestions category (24.8%). The Directions/Training
category contained 48 responses that referred to needing more training, improving
directions, or giving more instruction on how to use the DDP (22.4%). Any response of a
negative nature was placed in the Negative Comments category (n=30, 14.0%).
Responses referring to the DDP as being “good the way it is” or “don’t change a thing”
were placed in the Good the Way it is category (n=29, 13.6%).
Table 51 displays the results of the student responses on how to enhance the usefulness of the DDP. Table 51 is a Thematic Conceptual Matrix that lists the
categories, number of responses, and sample comments for each category.
Table 51
Thematic Conceptual Matrix for Student Survey: What could enhance the usefulness of the DDP?

Category               N
Frequency of Use      54
Suggestions           53
Directions/Training   48
Negative Comments     30
Good the Way It Is    29
Blank                129

Example comments (across categories):
- Actually have the instructors use it.
- All instructors should use it not just a few, then students will access it more.
- Can't think of anything other than having a requirement at least one performance posted to DDP for each class. Would help by revisiting site more often and making it more useful
- Having professors use it consistently from class to class.
- Instructors don't use it enough to have it be of any use to measure our performance or improvement.
- Making me use it in more classes none of my classes used it this semester.
- Put all feedback in one matrix. I have to go to Business and Mgmt section to view feedback.
- Please encourage instructors to upload feedback in a timely manner.
- That if you mess up and put the wrong thing you can remove it anytime not just 24 hours.
- Use it for every class for reference. Only partial info is there. I would like to find everything there for all validations. Especially major specific stuff
- User friendly with larger letters
- Same sign on user codes for everything.
- If we were a little more informed about it and someone could show us exactly how to use it.
- If the DDP is something which will be integrated more into the system, students, faculty, and staff need to be trained on it.
- Proper instruction all of its main functions at entrance to Alverno.
- The instructions could be simplified more.
- Understanding its purpose.
- A better workshop on how to use it instead of the 20 minutes when you are a beginning student.
- Don't really use it to access my learning since several of my instructors did not update it.
- Get rid of DDP!
- I really don't understand it.
- If not all validations and key work is shown on the DDP, then what is the point? Make it complete and it would be a great tool. The contrast between my validations on IOL and the DDP is HUGE. The DDP is extremely incomplete. I would love to constructively weigh in on this!!!
- It was kind of confusing using templates, saving and uploading, using matrices.
- Should have chosen one system Educator /or DDP.
- At this point I think the DDP is at its best I haven't had any problems with it.
- Can't think of anything other than having a requirement at least one performance posted to DDP for each class. Would help by revisiting site more often and making it more useful.
- Don't know – it’s great as is!
- I have found it extremely useful and don't see room for improvement.
- I don't think that anything else should be added. I like it how it is right now.
- Don't know.
- I am not sure at this time.
- None that I can think of at this time.
Faculty Survey Results. A total of 93 faculty surveys were analyzed using SPSS
Text Analysis for Survey Software. The original extraction identified 67 terms. These
terms were synthesized into 25 terms and placed into six categories, with some responses falling into multiple categories. The six categories were: Suggestions, Directions/Training, Frequency of Use, Time/Work Issues, Negative Comments, and Blank. There were 33 surveys with blank responses and these were placed in the Blank
category (35.5%). There were a total of 58 categorized responses.
The Suggestions category had the largest number of responses with 24 (41.4%).
Responses were placed in this category if they described a suggestion for improvement of
the DDP. The Directions/Training category contained 21 responses that referred to needing more training, improving directions, or giving more instruction on how to use the DDP (36.2%). The Frequency of Use category contained seven responses that described
wanting to use the DDP more, not using the DDP enough, or making the DDP a
requirement (12.1%). There were five responses that referred to the amount of time or
work it takes to use the DDP. These responses were placed into the Time/Work Issues
category (8.6%). While time and work issues were a type of response that could fall into
the Negative Comments category, a separate category was created to keep track of this
theme for the institution. Any response of a negative nature was placed in the Negative
Comments category (n=1, 1.7%).
Table 52 displays the survey results on faculty perception of what could enhance
the DDP. Table 52 is a Thematic Conceptual Matrix that lists the categories, number of
responses, and sample comments from each category.
Table 52
Thematic Conceptual Matrix for Faculty Survey: What could enhance the usefulness of the DDP?

Category               N
Suggestions           24
Directions/Training   21
Frequency of Use       7
Time/Work Issues       5
Negative Comments      1
Blank                 33

Example comments (across categories):
- Being able to look at student work their self assessment side-by-side as well as see the document I am typing feedback into.
- Bring part-time faculty on board.
- Develop department plan to see how, when, where it is used in our discipline.
- If faculty course use key performance feedback to include a balance global feedback on students performance in the entire course.
- If feedback didn't need to be separate from work. I need to make notes directly on students' work, but don't want to scan a whole paper into DDP.
- More variety of feedback modes. I like the idea of student scanning papers that have written feedback.
- Simple means of scanning/photographing pages of handwritten feedback for upload.
- A clearer sense of the "work" vs. "self assessment" functions. I was under the impression that only assessment/SA should be in the DDP.
- Continue development of interfaced. More faculty development.
- I need to learn how to scan and upload handwritten feedback and use digital audio taping.
- Make sure all students WEC, WDC are able to access and manipulate the technology effectively by the end of their semester.
- Ongoing training and use
- Training of students how to use and why and its benefits -- This shouldn't be the role of the faculty.
- From what the students say, more of their work on the DDP.
- More faculty using it and using it well.
- More use
- Find less time consuming ways of giving feedback to large classes.
- Integrate it more with other faculty work.
- Make it less of a burden on faculty, especially at the end of the semester when folks are exhausted.
- The last time I used DDP (Fall) it behaved like a beta test. The program failed during the creation of a key performance, help documentation was inaccurate or missing. Sheila was very helpful, but a well design product wouldn't have needed her sitting down with me. Similarly some students encountered significant problems uploading. It’s partly about the technology these things need to be fixed before it's useful.
Student and faculty responses to the open-ended question on suggestions to
enhance the usefulness of the DDP had some similarities. The student and faculty
response categories both included the categories: Suggestions, Directions/Training,
Negative Comments, and Frequency of Use. Both students and faculty had a high number
of responses that referred to specific suggestions for enhancing the usefulness of the DDP
and responses that referred to directions and/or training. Examples of suggestions
included (student) “I think if it was a requirement to review our progress using the DDP
as a tool it would make the DDP more meaningful” and (faculty) “Changing systems
departmentally to incorporate its use in a meaningful way.” Examples of responses in the
Directions/Training category included (student) “Learning what it is really used for
(maybe a workshop)” and (faculty) “Discussions like faculty panel use and students'
perception picked up hints like audio and scanning.”
One difference between students and faculty responses was the student category
Good the Way It Is. Faculty survey results did not include this category. An example of
a student response in this category was: “I have found it extremely useful and don't see
room for improvement.”
What do you think could help you use the DDP more?
Students and faculty were asked their ideas on what could help them use the DDP
more. These open-ended responses were analyzed using SPSS Text Analysis for Survey
Software.
Student Survey Results. A total of 324 student surveys were analyzed. The
original data extraction identified 71 terms. These terms were synthesized down to 45
terms and placed into seven categories, with some responses falling into multiple
categories. The seven categories were: Frequency of Use, Directions/Training, Suggestions, Negative Comments, Other, Useful/Positive Comments, and Blank. There
were 161 responses without answers, “N/A”, “no opinion”, or “don’t know” and these
were placed in the Blank category (49.7%). There were a total of 170 categorized
responses.
Responses were placed in the Frequency of Use category if they described
wanting to use the DDP more, not using the DDP enough, or making the DDP a
requirement. There were 70 responses in the Frequency of Use category (41.2%). The
Directions/Training category contained 42 responses that referred to needing more
training, improving directions, or giving more instruction on how to use the DDP
(24.7%). Responses were placed in the Suggestions category if they described a
suggestion for improvement of the DDP. There were 22 responses in the Suggestions
category (12.9%). Any response of a negative nature was placed in the Negative
Comments category (n=18, 10.6%). There were five responses that referred to the DDP as being good and/or listed examples of the usefulness of the DDP; these were placed in the Useful/Positive Comments category (2.9%). The Other category contained 13 responses
that did not seem to fit in any other category (7.6%).
Table 53 displays the results of the student responses concerning suggestions to
increase the use of the DDP. Table 53 is a Thematic Conceptual Matrix that lists the
categories, number of responses, and sample responses from each category.
234
Table 53
Thematic Conceptual Matrix for Student Survey: What do you think would help you use
the DDP more?

Frequency of Use (N = 70)
- Have more classes access the DDP.
- If more teachers asked me to upload key performances for classes.
- Having to do things on. Show what more use it has the uploading work on it
- If all of our work was on there/feedback.
- Make it mandatory or offer a free class on how to use it properly.
- More encouragement from instructors to use the resource section.
- The more you have to use it the better it is. Some teachers can use it more often.
- Using it more and having the sheet to explain what the DDP is and what it is about.

Directions/Training (N = 42)
- A class in it and how to use it.
- Better directions and easier available
- Either have a class focus on this or have a more one-on-one help with the DDP because some may have forgotten to know how to use it in the beginning of the year.
- If somebody would show me exactly how to do it and what each part is for.
- Student should have a class on how to use and continue to use it until it becomes second nature.

Suggestions (N = 22)
- Better orientation of site … prettier site. Bolder font more simplistic site.
- Emphasize the benefits of using the DDP to view past work and look at your matrix.
- If my teachers valued it and knew how to use it.
- Putting projects on the DDP and the DDP only.
- Reminder that it is there AND info on how to organize info under references/work.
- That the info is updated after each course w/ all instructor feedback. I've had classes where I've received no feedback.

Negative Comments (N = 18)
- I don't like to use it.
- If I could access it at home, but that has nothing to do with school.
- I really don't need to use the DDP My work is better organized at home.
- More directions on one site, instead of DDP, IOL, Educator and web mail (too much, ahhhh!)
- To much stuff to learn.

Other (N = 13)
- A laptop at home.
- As I become knowledgeable as to what it can provide I'll be more receptive.
- I think I need to make spare time in my social life to do it.
- Not being afraid to mess anything up on the DDP.

Useful/Positive Comments (N = 5)
- DDP is a good program to follow.
- Everything is great as it is I feel that I use it very often.
- Since it is accessible from everywhere, I think it works fine.

Blank/N/A/Not Sure/Don't Know (N = 161)
- N/A
- No idea
- Not sure
- Nothing really
Faculty Survey Results. A total of 93 faculty surveys were analyzed. The original
data extraction identified 42 terms. These terms were synthesized into 25 terms and
placed in eight categories, with some responses falling into multiple categories. The
eight categories were: Suggestions, Directions/Training, Time/Work Issues, Personal
Growth, Frequency of Use, Other, Negative Comments, and Blank. There were 41
surveys with blank responses, and these were placed in the Blank category (44.1%).
There was a total of 60 categorized responses.
The Suggestions category had the largest number of responses with 16 (26.7%).
Responses were placed in this category if they described a suggestion for improvement of
the DDP. The Directions/Training category contained 15 responses that referred to
needing more training, improving DDP directions, or providing more instruction on how
to use the DDP (25.0%). There were eight responses that referred to the amount of time
or work it takes to use the DDP (13.3%). These responses were placed into the
Time/Work Issues category. While time and work issues were a type of response that
could fall into the Negative Comments category, a separate category was created to keep
track of this theme for the institution. The Personal Growth category contained eight
responses concerned with what the individual faculty needed to do for their own
development and growth in using the DDP (13.3%). The Frequency of Use category
contained six responses that described wanting to use the DDP more or not using the
DDP enough (10.0%). Any response of a negative nature was placed in the Negative
Comments category (n=2, 3.3%). There were five responses that did not fit into any
category and these were placed in the Other category (8.3%).
Table 54 is a Thematic Conceptual Matrix that lists the categories, number of
responses, and sample comments from each category from the faculty survey question on
what would help faculty use the DDP more.
Table 54
Thematic Conceptual Matrix for Faculty Survey: What do you think would help you use
the DDP more?

Suggestions (N = 16)
- A stronger department context.
- Accountability
- How to "standardize" template and areas for key performances across our department.
- I could give my feedback to a staff person and that person would do the clerical job of adding stuff to the DDP.
- If students had laptops and DDP became a text/folder for every class.
- Working with DDP support staff on issues of student errors and omissions in DDP work.

Directions/Training (N = 15)
- More hands-on experience after getting education/information about the options.
- Written directions step by step
- "How to" knowledge.
- Assistance with creative ideas and with video.
- Knowing more about its features.
- Learn to do oral feedback - isn't system limited for that type (and video) re space?

Time/Work Issues (N = 8)
- Find less time-consuming ways to give feedback to large classes.
- Ideas from today will help me i.e.. Combining assessment with final course feedback etc. Things to decrease labor intensiveness would help.
- More time and training.
- Time to work with Sheila.

Personal Growth (N = 8)
- I have to think of ways to be more efficient.
- I need to get serious about using it. Is there a written tutorial?
- More imagination on my part.
- A better home computer.
- More experience and a better thought-out key performance.

Frequency of Use (N = 6)
- Use it more.
- I think I use it quite often.

Other (N = 5)
- I have some ideas about improving student reflection for growth, but I am not sure the DDP is essential to them.
- Same answer as 25

Negative Comments (N = 2)
- Disclaimers aside, it is more work and more time to use the DDP. Cut my student load so that I have the time or provide support to scan/edit/upload feedback -- otherwise, I see no likelihood of using it. After Fall I've decided not to use DDP until it works, and I see that the system will accommodate the way useful feedback is given.
- Visual ease of use

Blank (N = 41)
Student and faculty responses to the open-ended question on suggestions on how
to increase use of the DDP had some similarities. For example, both data sets contained
the categories: Suggestions, Directions/Training, Negative Comments, and Frequency of
Use. The category with the highest number of responses for students was the Frequency
of Use category. Over 40% of categorized responses indicated that students wanted to use
the DDP more often, more consistently, and across the curriculum. For
example: “More requirements to use it.”
Faculty surveys had a low response rate for the question of how to increase the
use of the DDP. There were 41 blank responses (44.1%). Faculty provided 16 suggestions
(26.7%) on how to increase the use of the DDP. These responses focused on making the
DDP more efficient and decreasing the amount of time needed to upload files. For
example: “How to ‘standardize’ template and areas for key performances across our
department." Both student and faculty responses focused on increasing training and
improving the directions.
What are your suggestions for improving the DDP?
Students and faculty were asked their suggestions for improving the DDP. These
open-ended responses were analyzed using SPSS Text Analysis for Survey Software.
Student Survey Results. A total of 324 student surveys was analyzed. The original
data extraction identified 111 terms. These terms were synthesized down to 53 terms and
placed into seven categories, with some responses falling into multiple categories. The
seven categories of student responses were Suggestions, Frequency of Use,
Directions/Training, Good the Way It Is, Negative Comments, Other, and Blank. There
were 190 surveys without answers or with responses of "N/A", "no opinion", or "don't
know". These responses were placed in the Blank category (58.6%). There was a total of
147 categorized responses.
Responses were placed in the Suggestions category if they described a suggestion
for improvement of the DDP. There were 43 student responses in the Suggestions
category (29.3%). Thirty-four responses described wanting to use the DDP more, not
using the DDP enough, or having the use of the DDP be required (23.1%). These were
placed in the Frequency of Use category. The Directions/Training category contained 29
responses that referred to needing more training, improving directions, or giving more
instruction on how to use the DDP (19.7%). Seventeen responses were positive toward
the DDP and indicated the DDP was already a useful tool. These responses were placed
in the Good the Way It Is category (11.6%). Fourteen responses of a negative nature were
placed in the Negative Comments category (9.5%). There were 13 responses
that did not seem to fit in any category and were placed in the Other category (8.8%).
Table 55 displays the results of the student responses on suggestions to improve
the DDP. Table 55 is a Thematic Conceptual Matrix that lists the categories, number of
responses, and sample responses from each category.
Table 55
Thematic Conceptual Matrix for Student Survey: What are your suggestions for
improving the DDP?

Suggestions (N = 43)
- Make it easier.
- A bit more color.
- Just to make it more spaced an instead of it being so condensed with small letters open it up a little more.
- Keep it updated w/ instructor feedback.
- Pull down menus
- Windows for submitting assessments or other work to the DDP.
- Make it easy to access the link. Require everyone to use it at least once.

Frequency of Use (N = 34)
- I think that we should upload all coursework to the DDP for easy access and review.
- Advertise/Publicize it more encourage staff and faculty to encourage students to use it.
- Either get rid of it or use it more often.
- Encourage instructors to have students complete things on the DDP.
- Have the students use it more to become familiar w/ it.

Directions/Training (N = 29)
- A clearer explanation of its uses.
- A little more help as we use it more often will help.
- Have example or explanation on the ability in each of the levels
- Make a course to teach us.
- Teach students how to use it without being afraid. We need to understand how it works better.
- To make it more clear and for us to put more self assessments on it so we can go back and read them.

Good the Way It Is (N = 17)
- Good as is!
- I think it is useful and fine currently.
- Nothing, it's good as is.
- The DDP is great to use for updating your work its fine.
- Nothing, it is pretty user friendly.

Negative Comments (N = 14)
- Again, too many sites. More functions in one centralized area would be much more helpful.
- Either get rid of it or use it more often.
- It can be difficult to navigate -- last semester there were a lot of problems and even the teacher couldn't do it. We were referred to too many people to fix it.
- The DDP is a very good tool but I find it to be useless.

Other (N = 10)
- More communication among student and staff.
- Same as 27, 29
- I'll share in person.
- I have not suggestions at the moment.
- I have no idea at this time.
- None, because I like the way how things are not having to use it.

Blank/No Opinion/Nothing (N = 190)
Faculty Survey Results. Ninety-three faculty surveys were analyzed. The original
data extraction identified 45 terms. These terms were synthesized down to 21 and placed
into five categories with some responses falling into multiple categories. The five
categories were Suggestions, Directions/Training, Negative Comments, Other, and Blank.
There were 65 surveys with blank responses that were placed in the Blank category
(69.9%). There was a total of 31 categorized responses.
The Suggestions category had the largest number of faculty responses with 19
(61.3%). Responses were placed in this category if they described a suggestion for
improvement of the DDP. The Directions/Training category contained six responses that
referred to needing more training, improving directions, or giving more instruction on
how to use the DDP (19.4%). Any response of a negative nature was placed in the
Negative Comments category (n=3, 9.7%). There were three responses that did not fall
into an existing category; these were placed in the Other category (9.7%). Table 56 is a
Thematic Conceptual Matrix that lists the categories, number of responses, and sample
comments from each category.
Table 56
Thematic Conceptual Matrix for Faculty Survey: What are your suggestions for
improving the DDP?

Suggestions (N = 19)
- Be able to look at several windows at once. It is labor intensive to go through each aspect of student's feedback.
- Create a place where summary course feedback can be found (not hiding it under the final assessment). It shouldn't be that hard, and would really help for narrative writing.
- Improve video quality.
- Keep showing us how not to make it an add-on job. Ex. I give written/oral feedback to speech students right after speech when it's "fresh" and other students offer constructive comments In this way, DDP can be limiting -- i.e. can't be live w/o extensive set-ups, technical consideration (digital recorders, etc.).
- Longer time-out period (1 to 2 hours) easier retrieval of information on the analysis tab
- Make it possible to upload whole-class data from Excel, Word or Educator. Make it possible to upload feedback w/o assigning a progress code - students who get an 'I' need the feedback the most.
- More flexibility also as we keep having class sizes of 25-30 we need ways to support both our own use and teaching/incentives for part timers.
- Set up protocol for students scanning of script/handwritten feedback -- put onus on students.

Directions/Training (N = 6)
- Easy hand-outs describing the purposes for using the DDP and how to use it for the various purposes.
- Explain its value beyond the "neat to have" factor.
- Learning new ways to enter feedback -- I hand write feedback then need to type it. Could my handwritten be entered? And if so, is it easy?
- Training by dept. so it meets our departmental needs.

Negative Comments (N = 3)
- Be able to look at several windows at once. It is labor intensive to go through each aspect of student's feedback.
- The DDP is somewhat scary because uploading feels so permanent -- easier delete functions would help.

Other (N = 3)
- I need to continue further investigation.
- They are my own personal developmental needs.
- See 25

Blank (N = 65)
There were similarities between student and faculty responses to the open-ended
question providing suggestions on how to improve the DDP. Both the student and faculty
data sets contained the categories Suggestions, Directions/Training, and Negative
Comments. The category with the highest number of
responses for students and faculty was the Suggestions category. These suggestions for
improving the DDP varied and included enhancing the layout, adding to user friendliness,
and adding additional features to increase efficiency of the program. For example, a
faculty response was: “Make it possible to upload whole-class data from Excel, Word or
Educator. Make it possible to upload feedback w/o assigning a progress code -- students
who get an I need the feedback the most.”
Over 23% of student survey responses related to frequency of use. As in other
student survey questions, these responses indicated students wanted to use the DDP more
often, more consistently, and across the curriculum. For example: “It is only really useful
if all instructors use it -- otherwise it is only pieces of your education.”
Faculty surveys had a low response rate for this question, with 65 of the 93
surveys having blank responses (69.9%). Of the categorized faculty responses, 9.7% fell
into the Negative Comments category for this survey question. The responses in
this category seemed to pertain more to personal issues, fear, and the need to reduce the work
load of using the DDP. For example: “The DDP is somewhat scary because uploading
feels so permanent -- easier delete functions would help.”
Do you have any additional comments on the DDP that you would like to share?
The last open-ended question on both the student and faculty surveys was
designed for participants to add any additional comments that were not covered
by the other survey questions.
Student Survey Results. Three hundred twenty-four student surveys were
analyzed. The original data extraction identified 37 terms. These terms were synthesized
down to 25 and placed into six categories, with a few responses falling into multiple
categories. The six student categories were Useful/Positive Comments, Negative
Comments, Frequency of Use, Suggestions, Other, and Blank. Responses without
answers, “N/A”, “no opinion”, or “don’t know” were placed in the Blank category. This
student survey question had the highest number of responses in the Blank category
(n=274, 84.6%). There was a total of 58 categorized responses.
Responses that were positive toward the DDP and/or indicated the DDP was a
useful tool, were placed in the Useful/Positive Comments category (n=21, 36.2%). Any
response of a negative nature was placed in the Negative Comments category (n=19,
32.8%). Responses were placed in the Frequency of Use category if they described
wanting to use the DDP more, not using the DDP enough, or using it consistently. There
were nine responses in the Frequency of Use category (15.5%). Six responses described
suggestions for improvement in the DDP and were placed in the Suggestions category
(10.3%). There were three responses that did not seem to fit in any category and these
responses were placed in the Other category (5.2%).
Table 57 displays the results of the student responses on any additional comments
concerning the DDP. Table 57 is a Thematic Conceptual Matrix that lists the categories,
number of responses, and sample responses from each category.
Table 57
Thematic Conceptual Matrix for Student Survey: Do you have any additional comments
on the DDP you would like to share?

Useful/Positive Comments (N = 21)
- DDP is a great tool. It is nice to be informed of the work done. I love the idea to have the video downloaded. It helps to see where I have to work on.
- I enjoy having it, but it does not seem to be a priority academically.
- I enjoy seeing how I improved throughout my courses.
- I think each student should try to make time to go on DDP. Because it will help you know if you are achieving.
- I think it is very resourceful, I also think they should show you how to use it instead of playing with it in order to figure it out.
- It is a good tool for keeping track of progress.
- No. Great job on the DDP.

Negative Comments (N = 19)
- DDP is confusing at times.
- I don't find it useful. It's a nice concept but if I don't use it why bother
- I haven't really enjoyed using it -- I've never really looked back at my work and I don't understand the purpose of using it
- The video quality on some computers is poor. Some computers have no speakers to use.
- Using the DDP is very confusing for me personally because every time I try to upload or put my work in it just doesn't show up so I think I'm doing something wrong. Maybe if there was assistance who is an expert in that kind of work to help out.
- I think it is useful, but NOT for everything.

Frequency of Use (N = 9)
- I think if the school wants students to use and appreciate this useful technology they ought to show us how to use it in the first place and then be consistent w/ using it through at our education.
- I think it's useful but not very many of my instructors have asked us to upload anything to or from it for class.
- It is a great tool but is under utilized. Would make it more useful if everything was stored there to create a more thorough portfolio once we are done.
- Either use it more or not at all!
- It's a great idea for our students I wish it appeared more to upper division work.

Suggestions (N = 6)
- I think it is very resourceful, I also think they should show you how to use it instead of playing with it in order to figure it out.
- I would like to see the DDP have our external assessments on the DDP.
- Peer feedback is an important as the prof. feedback, but we are limited only to specific course work

Other (N = 3)
- I think it will be useful in the future.
- See above

Blank/No Opinion/Nothing (N = 274)
- No thank you
- No
- Not at this time
Faculty Survey Results. Ninety-three faculty surveys were analyzed. The original
data extraction identified 37 terms. These terms were synthesized into 25 terms and
placed in five categories with a few responses falling into multiple categories. The five
categories were Suggestions, Negative Comments, Useful/Positive Comments, Other, and
Blank. Sixty-five of the 93 surveys did not have responses for this question and these
were placed in the Blank category (70.0%). There was a total of 31 categorized
responses.
The Suggestions category had the largest number of faculty responses with 10
(32.3%). Responses were placed in this category if they described a suggestion to
improve the DDP. Responses in this category could be used to enhance the process and
procedures for using the DDP. Any response of a negative nature was placed in the
Negative Comments category (n=9, 29.0%). The Useful/Positive Comments category was
created due to six responses that listed reasons why the DDP is a useful tool or were
positive toward the DDP (19.4%). There were six responses that did not fall into an
existing category and were placed in the Other category (19.4%).
Table 58 is a Thematic Conceptual Matrix that lists the faculty response
categories, number of responses, and sample comments from each category. Of note in
the results are the low number of responses in each category.
Table 58
Thematic Conceptual Matrix for Faculty Survey: Do you have any additional comments
on the DDP you would like to share?

Suggestions (N = 10)
- Use it to address institution wide issues around quantity, quality and timeliness of instructor feedback. Use it to help the college re-think and organize narrative transcripts.
- I would encourage faculty to provide balanced (affirming aspects of performance, pointing out areas required for improvement and giving advice for improvement) feedback at the end of courses.
- Should be able to post summary feedback that doesn't need a student self-assessment associated with it.
- The more students can do themselves the better. I like the idea of their scanning papers with written feedback by instructors where they choose to.

Negative Comments (N = 9)
- I realize this is a convenience for students, and I acknowledge there could be some benefits for narratives, but overhead is just too great.
- I'd like to hear from students who don't find DDP particularly useful.
- I'm very interested in it but feel creatively uninspired.
- Is not convenient because writing on student papers is much more efficient and comfortable for me. Don't need to be on computer, don't need to describe what I'm referring to, can just write on it. All options I have heard include me doing more work, uploading, scanning, recording, etc.
- With our very large transfer classes (IN 130 WDC) it is difficult to upload student work. If it were not for Sheila this would not happen.

Useful/Positive Comments (N = 6)
- I overall like the technology I enjoy sharing it and explaining/exploring it with students. I have not figured out how to make it efficient. I give feedback and summative end of semester evals.
- One thing I use the DDP for is to look up student's picture, advisor, etc. It's a way to bypass Datatel and get more information.
- I think the DDP is extremely useful, although I do not use it for my courses now because other faculty are resistant to using it and we haven't really decided what the key performances are that should be uploaded.

Other (N = 6)
- I am not opposed to the DDP. I've just never warmed up to it.
- I am not very good at giving written feedback, I do a better job with verbal feedback.
- Sheila is fantastic in helping faculty use the DDP.

Blank (N = 65)
Student and faculty responses to the open-ended question asking for additional
comments on the DDP had some similarities. Both data sets contained the categories
Suggestions, Useful/Positive Comments, and Negative Comments. Another similarity was
the high number of blank responses for both students (84.6%) and faculty (70.0%). Both
student and faculty responses contained suggestions for improving the DDP. For example,
a student response was "I enjoy having it, but it does not seem to be a priority
academically." An example of a faculty suggestion was "Should be able to post summary
feedback that doesn't need a student self-assessment associated with it."
Interview Data Analysis
Students and faculty were asked if they had any suggestions they would like to
make to the DDP design team including suggestions they had on increasing the use of the
DDP.
Student Interview Results
Students had a number of suggestions for the DDP design team. A pattern to the
responses concerned increasing the use of the DDP. For example: (a) “A lot of students
are using it way more than I am… my friends say we don’t really know how to use it....
We haven’t had to;” (b) “I didn’t think it was a design issue as much as encouraging the
faculty to make it a requirement;” (c) “Keep reintroducing the idea… the technology is
available for you and it will become beneficial for you when it comes to mid terms or
finals to provide evidence on how your progress is going;” and (d) “I talked to a few
students in preparation for this interview and I’m getting the same type of feedback.
We’d love to use it, but it never comes up. It’s never asked of us. So I don’t know if
there’s something on the instructors’ end that makes it difficult for them to use.”
Several students described why they think the DDP is useful, such as “I would
just say that it is a good idea that we have the DDP. We don’t have to accumulate papers,
and you can always go back.” and “I would probably say that it’s a great way to make
things accessible to students…I like having the forms on the DDP.”
One student went into some depth on her perceptions on how to increase the use
of the DDP. She said: “I think getting people to not be so afraid of it…there are people
who just don’t know how to use it, or they don’t understand what its potential could be…
I think it starts with people understanding why it’s important and really getting them to
buy into it.” Another student commented that the DDP is “…great just the way it is…I
just think its very user friendly and it’s easy to get into.”
Student interview responses supported the data from their surveys. Students gave
some suggestions on enhancing the DDP, especially in regard to using the DDP more
frequently. A response that perhaps summed up students’ suggestions on how to increase
the use of the DDP was “…if there were a common vision…It’s kind of like buying into
the Alverno curriculum; if you don’t, you can’t be a successful student, or teacher here.
They just have to buy into it. The concept of this really could be something great. Until
people buy into it, they’re not going to want to set aside time … to train their students in
class.”
Faculty Interview Results
Faculty interview responses described a number of general suggestions for
improving the DDP. These included “I think it would be useful to get a report of
something of the student’s progress with respect to each of the departmental outcomes
and perhaps also with respect to each of the abilities… The more that it could do
something that you could see the arc of the student’s development with respect to specific
things the more useful it is.” and “From a technical perspective it would be nice if we
could do batch uploads.”
Concerns about workload and time were mentioned, along with the need for the institution
to continue reminding faculty about the DDP and to serve as a "cheerleader" for its use. One
faculty member described his wish for the DDP: "I would wish for more final evaluative
feedback on courses. Not just associated with the performance or the project, but
something that synthesizes the student’s performance for the semester.”
Faculty interview responses were similar to data gathered from their surveys.
However, the interview protocol provided the opportunity to go into more detail and ask
additional questions. Perhaps a faculty comment that summed up a number of
interviewees’ thoughts was “I would encourage as many opportunities as possible for
faculty to use the DDP as an opportunity to record overall judgment of students’ work.”
Characteristics of Key Performances
This study was concerned with student and faculty use and perceptions of the
DDP. Key performances are the operational unit of the DDP. In order to understand the
use of the DDP during spring, 2005, an analysis of active key performances was
necessary. There were four sub-questions related to the analysis of active key
performances:
1. How many active key performances are being used by students?
2. What discipline departments have completed key performances?
3. How are completed key performances connected to the abilities?
4. How are completed key performances connected to other matrices?
All of the data to address these questions came from the DDP relational database
entries from spring, 2005.
How many active key performances are being used by students?
The DDP relational database was queried to identify all key performances that
were active (available for student use) during spring, 2005. The query identified 472
active key performances. A query was created to identify which of the 472 active key
performances were completed by students. Approximately 40% (184) of the active key
performances were completed by students during spring, 2005.
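The exact structure of the DDP's relational database is not documented in this chapter, so the sketch below uses hypothetical table and column names (key_performance, student_key_performance, status, completed) purely to illustrate the kind of counting query described above; it is not the query actually run against the DDP.

    # Hypothetical sketch: table and column names are placeholders, not the DDP schema.
    import sqlite3

    conn = sqlite3.connect("ddp_copy.db")  # assumed local extract of the relational database

    active = conn.execute(
        "SELECT COUNT(*) FROM key_performance "
        "WHERE status = 'active' AND semester = 'spring_2005'").fetchone()[0]

    completed = conn.execute(
        "SELECT COUNT(DISTINCT kp_id) FROM student_key_performance "
        "WHERE completed = 1 AND semester = 'spring_2005'").fetchone()[0]

    print(f"{completed} of {active} active key performances were completed "
          f"({100 * completed / active:.1f}%)")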
What discipline departments have completed key performances?
Key performances that were completed in spring, 2005 were sorted into discipline
departments. There were a total of 63 possible undergraduate discipline departments in
the DDP. The data indicated that 37 different discipline departments, or 59.0%, had key
performances completed by students during spring, 2005.
Figure 62 displays the results of the data on these discipline departments. The
Assessment Center (AC), which maintains all outside-of-class assessments required of all
students, had the highest percentage of completed key performances, at 23.6%. The
Communication Ability Department (CM) accounted for 19.6% of completed key
performances. There were four discipline departments that had one completed key
performance. These departments were Marketing Management, Physics, Religious
Studies, and Science.
Figure 62. Discipline Departments with completed key performances. (Bar graph; x-axis: Discipline Departments; y-axis: Percent Completed KP, ranging from 0% to 24%.)
Table 59 displays all discipline departments completing key performances during
spring, 2005, the number of completed key performances, and the percentages. Two
discipline departments accounted for 43.2% of completed key performances, Assessment
Center (AC) and Communications (CM).
Table 59
Summary of Discipline Departments and Completed Key Performances

Department                                  N      Percent
Art                                         12     0.3%
Assessment Center                           925    23.6%
Arts & Humanities                           61     1.6%
Algebra                                     165    4.2%
Biology                                     112    2.9%
Broadfield Science                          25     0.6%
Chemistry                                   102    2.6%
Computer & Information Literacy             14     0.4%
Community, Leadership & Development         8      0.2%
Communication                               767    19.6%
Communication, Management & Technology      40     1.0%
Computer Science                            174    4.4%
Education                                   127    3.2%
English                                     129    3.3%
Global Effective Citizenship                78     2.0%
History                                     18     0.5%
Independent Learning Experience             6      0.2%
Integrated Learning                         249    6.4%
Liberal Arts                                77     2.0%
Management Accounting                       32     0.8%
Business & Management                       106    2.7%
Management Marketing                        1      0.0%
Mathematics                                 63     1.6%
Nursing                                     196    5.0%
Nursing PreProfessional                     26     0.7%
Professional Communications                 76     1.9%
Psychology – Drug & Alcohol                 3      0.1%
Physics                                     1      0.0%
Philosophy                                  17     0.4%
PreProfessional Seminar                     113    2.9%
Psychology                                  150    3.8%
Religious Studies                           1      0.0%
Science                                     1      0.0%
Sociology                                   4      0.1%
Social Science                              18     0.5%
How are completed key performances connected to the abilities?
A key performance could be connected to any of the eight abilities and four
levels. For students to be able to track their development across the abilities, all abilities
and levels need to be represented in completed key performances. The DDP relational
database was queried to determine the connections between key performances completed
during spring, 2005 and the Abilities matrix. The results of the query found a total of
8,753 connections to the Ability matrix from completed key performances.
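As a rough illustration of how such a count could be produced, the sketch below groups matrix connections by ability and level; the table and column names (kp_matrix_connection, ability, level) are hypothetical, since the DDP's actual schema is not described here.

    # Hypothetical sketch of the aggregate query behind Table 60; names are placeholders.
    import sqlite3

    conn = sqlite3.connect("ddp_copy.db")
    rows = conn.execute(
        "SELECT ability, level, COUNT(*) AS connections "
        "FROM kp_matrix_connection "
        "WHERE matrix = 'Abilities' AND semester = 'spring_2005' "
        "GROUP BY ability, level "
        "ORDER BY ability, level")
    total = 0
    for ability, level, n in rows:
        total += n
        print(f"{ability:<22} level {level}: {n}")
    print("total connections:", total)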
Table 60 displays a summary of all the Ability matrix connections to completed
key performances. The table indicates that all eight abilities and four levels are
represented in completed key performances for spring, 2005.
The data indicated that 43.6% of the connections were to the Communication ability.
The Analysis and Problem Solving abilities accounted for 19.7% and 11.8% of the
connections, respectively. The Effective Citizenship ability had the lowest number of
connections to completed key performances, with 4.0%. Overall, all abilities and all
levels were represented by completed key performances during spring, 2005.
Table 60
Summary of Ability-Matrix Connections to Completed Key Performances for the Spring,
2005 Semester

Communication (total 3,813; 43.6%)
  Level 1: R-287, W-269, S-199, L-0, C-257, Q-408
  Level 2: R-124, W-198, S-758, L-157, C-135, Q-76
  Level 3: ICM-721, Q-7
  Level 4: ICM-167, Q-50
  Sub-totals: 1,299 (14.8%); 467 (5.3%); 957 (10.9%); 157 (1.8%); 392 (4.5%); 541 (6.2%)
Analysis (total 1,720; 19.7%): Level 1: 178, Level 2: 337, Level 3: 721, Level 4: 484
Problem Solving (total 1,029; 11.8%): Level 1: 68, Level 2: 107, Level 3: 492, Level 4: 362
Valuing (total 418; 4.8%): Level 1: 119, Level 2: 134, Level 3: 117, Level 4: 48
Social Interaction (total 356; 4.1%): Level 1: 197, Level 2: 41, Level 3: 15, Level 4: 103
Effective Citizenship (total 346; 4.0%): Level 1: 59, Level 2: 14, Level 3: 258, Level 4: 15
Global Perspectives (total 426; 4.9%): Level 1: 118, Level 2: 74, Level 3: 124, Level 4: 110
Aesthetic Engagement (total 645; 7.4%): Level 1: 184, Level 2: 200, Level 3: 134, Level 4: 127
Totals by level: Level 1: 2,343 (26.8%), Level 2: 2,355 (26.9%), Level 3: 2,589 (29.6%),
Level 4: 1,466 (16.8%); all levels: 8,753 (100.0%)
How are completed key performances connected to other matrices?
Besides the Abilities matrix, a key performance could be connected to matrices
representing advanced outcomes of major and support (minor) programs. There are
additional matrices that could be connected to key performances, including Wisconsin
Educational Standards (Department of Public Instruction (DPI) Standards) and a variety
of Content Guidelines Matrices set forth by DPI. A key performance could be connected
to any number of matrices multiple times. For example, a key performance in
Mathematics could be connected to the advanced outcomes for Mathematics majors,
advanced outcomes for Mathematics supports (minors), Wisconsin Education Standards
matrix, and/or the Mathematics Content Guidelines matrix.
The DDP relational database was queried to determine the connections between
key performances completed during spring, 2005 and matrices other than the Ability
matrix. Data indicated key performances completed during spring, 2005 were connected
to 29 different matrices a total of 3,487 times.
Table 61 contains a summary of the key performances completed and their
connections to matrices. The Educational Standards matrix had the highest number of
connections to completed key performances, with 602 or 17.3%. Psychology,
Mathematics Content Guidelines, English, and Computer Science matrices rounded out
the top five matrices connected to key performances completed during spring, 2005.
Table 61
Summary of DDP Relational Database Data on Completed Key Performance Connections
to Matrices (Other Than the Ability Matrix)

Matrix Name                                 No. of Connections    Percent
Art - Studio                                5                     0.1%
Art Education                               1                     0.0%
Art Education/Art Therapy                   2                     0.1%
Art Therapy                                 3                     0.1%
Arts and Humanities, Integrated             54                    1.5%
Business & Management Accounting            54                    1.5%
Business and Management                     206                   5.9%
Chemistry                                   122                   3.5%
Chemistry - Support                         12                    0.3%
Communication Management & Technology       90                    2.6%
Community Leadership & Development          42                    1.2%
Computer Science                            241                   6.9%
Computer Science - Support                  76                    2.2%
Education                                   175                   5.0%
Education Standards                         602                   17.3%
English                                     307                   8.8%
English - Support                           112                   3.2%
International Business                      50                    1.4%
Marketing Management                        22                    0.6%
Mathematics                                 38                    1.1%
Mathematics Content Guidelines              352                   10.1%
Nursing - Level 5 Junior Year               48                    1.4%
Philosophy                                  120                   3.4%
Philosophy - Support                        24                    0.7%
Professional Communication                  43                    1.2%
Psychology                                  516                   14.8%
Psychology - Support                        64                    1.8%
Social Science                              81                    2.3%
Social Science - Support                    25                    0.7%
Totals                                      3,487                 100.0%
Summary of Results
The results of this study were summarized using the seven research sub-questions
and the four sub-questions concerning the characteristics of key performances.
1. How often do students and faculty log onto the DDP?
Data from the DDP relational database indicated that 1,893 students logged
onto the DDP a total of 17,303 times during spring, 2005 (M = 9.1, SD = 10.1, Range
1-17, median = 6.0). The median indicated that the typical student logged onto the DDP
approximately six times. Survey responses indicated that students perceived that they
logged onto the DDP once per month or approximately four times a semester (M =
1.7, SD = 1.7, median = 1.0).
DDP relational database data found 180 faculty (71.4%) logged onto the DDP
3,961 times (M = 22.0, SD = 27.7, Range 1-157 median = 10.0) during spring, 2005.
The median indicated that faculty logged onto the DDP 10 times during the semester.
Survey responses indicated that faculty perceived they logged onto the DDP twice a
month or approximately eight times a semester (M = 5.1, SD = 6.7, Range 0-35,
median = 2.0).
2. What do students and faculty do when they log onto the DDP?
DDP relational database data indicated that 1,669 students completed a total
of 3,918 key performances (M = 2.4, SD = 1.5, Range 1 – 11, median = 2.0). Students
completed approximately two key performances during spring, 2005. The data indicated that
116 faculty/assessors uploaded a total of 3,150 files (M = 27.2, SD = 26.8, Range 1-120,
median = 18.0). Data were also collected on the number of active key
performances created by faculty. There were a total of 475 active key performances
created by 105 different faculty (M = 4.3, SD = 6.8, Range 1-58, median = 3.0).
Survey results indicated student perceptions of the most-often and least-often
used features of the DDP. A summary of each of the nine features by student group
(beginning, intermediate, and advanced) is displayed in Table 62.
Table 62
Summary of Student Perceptions of How Often They Use Features of the DDP

                                                  Beginning      Intermediate    Advanced
Feature                                           n      M       n      M        n      M
1. Add a key performance to the My Work area      166    2.0     89     2.1      59     1.8
2. Upload a self assessment                       170    2.2     89     2.4      60     2.2
3. Check feedback for a key performance           171    1.9     88     2.3      59     1.8
4. Review past key performances                   171    1.6     86     2.1      59     1.5
5. Use the My Resource area                       170    1.3     87     1.5      59     1.2
6. Use the Reference area                         170    1.3     85     1.3      59     1.1
7. Attach a key performance to a matrix           170    1.5     88     1.4      59     1.2
8. View a video of work                           171    1.9     88     1.2      60     1.3
9. Use the Help Menu                              170    1.4     89     1.1      58     1.2
Group sizes: Beginning N = 172, Intermediate N = 91, Advanced N = 61.
Choices: Do not know what this is (0), Never (1), Occasionally (2), Often (3), Very Often (4)
Means of all nine features were 2.2 or less (Choice of 2 = Occasionally). Table 63 displays a summary
of each of the nine features for faculty. Uploading student feedback had the highest
mean, with 2.6 (between 2-Occasionally and 3-Often). Three features had similar
means for both students and faculty: Use the My Resource area, Use the Reference
area, and Use the Help Menu.
Table 63
Summary of Faculty Perceptions of How Often They Use Features of the DDP (N = 93)

Feature                             n     M
1. Create a new key performance     88    1.9
2. Upload student feedback          89    2.6
3. Read student work                82    2.2
4. Read student self assessments    85    2.5
5. Use the My Resource area         85    1.3
6. Use the Reference area           86    1.5
7. Check a student's past work      87    1.9
8. Use the DDP for narratives       87    1.9
9. Use the Help Menu                86    1.4
Choices: Do not know what this is (0), Never (1), Occasionally (2), Often (3), Very Often (4)
A comparison of student and faculty perceptions of most-often and least-often
used DDP features is displayed in Table 64. Student and faculty interview data
supported the results of the surveys. Uploading self assessments and reading feedback
were two of the most frequent comments made during the student interviews. Faculty
described uploading student feedback and reading student self assessments and work
as frequent tasks they completed when they logged onto the DDP.
Table 64
Summary of Student and Faculty Survey Results for Most-Often and Least-Often Used
Features of the DDP

Perception of Most-Often-Used DDP Features
  Students: Upload a self assessment (M = 2.21); Check feedback for a key performance (M = 2.02); Add a key performance to My Work (M = 1.73)
  Faculty: Upload student feedback (M = 2.62); Read student self assessments (M = 2.52); Read student work (M = 2.22)

Perception of Least-Often-Used DDP Features
  Students: Use the Reference area (M = 1.25); Use the Help Menu (M = 1.26); Use the My Resource area (M = 1.30)
  Faculty: Use the My Resource area (M = 1.27); Use the Help Menu (M = 1.43); Use the Reference area (M = 1.49)
3. What features of the DDP are perceived by students to be useful or not useful?
The data from the surveys identified the features students and faculty
perceived as the most-useful and least-useful. A summary of each of the nine
features by student group (beginning, intermediate, and advanced) is displayed in
Table 65. Student perceptions of the usefulness of these nine features of the DDP
were slightly higher than their perceptions of how often they used these features.
Table 65
Summary of Student Perceptions of Useful Features of the DDP

                                                  Beginning      Intermediate    Advanced
Feature                                           n      M       n      M        n      M
1. Accessing the DDP from off-campus              167    2.4     89     3.0      58     2.8
2. Accessing my work and self assessments         170    2.6     89     3.0      58     2.6
3. Accessing my feedback                          170    2.6     87     2.9      58     2.5
4. Reviewing past key performances                168    2.3     85     2.7      57     2.1
5. Using the My Resource area                     169    1.6     84     1.7      58     1.3
6. Using the Reference area                       168    1.6     82     1.7      57     1.3
7. Attaching a key performance to a matrix        169    1.8     85     2.0      55     1.4
8. Viewing a video of work                        170    2.3     84     1.8      56     1.6
9. Using the Help Menu                            168    2.0     83     1.8      57     1.6
Group sizes: Beginning N = 172, Intermediate N = 91, Advanced N = 61.
Choices: Do not know what this is (0), Not Useful (1), Occasionally Useful (2), Often Useful (3), Very Useful (4)
Table 66 displays faculty perceptions of the usefulness of nine features of the DDP.
Faculty perceptions of the usefulness of these features were slightly higher than their
perceptions of how often they used these features.
Table 66
Summary of Faculty Perceptions of Useful Features of the DDP (N = 93)

Feature                             n     M
1. Create a new key performance     87    2.4
2. Upload student feedback          86    2.8
3. Read student work                83    2.4
4. Read student self assessments    85    2.6
5. Use the My Resource area         82    1.3
6. Use the Reference area           79    1.7
7. Check a student's past work      83    2.1
8. Use the DDP for narratives       81    2.2
9. Use the Help Menu                80    1.5
Choices: Do not know what this is (0), Never (1), Occasionally (2), Often (3), Very Often (4)
A comparison of student and faculty perceptions of the usefulness of DDP features is
displayed in Table 67. Students and faculty seem to have had similar perceptions of the
least-useful features of the DDP, with both groups selecting the Reference area and
the My Resource area as two of the least-useful features.
Table 67
Summary of Student and Faculty Survey Results for Most-Useful and Least-Useful
Features of the DDP

Perception of Most-Useful DDP Features
  Students: Accessing work and self assessments (M = 2.68); Accessing the DDP from off-campus (M = 2.66); Accessing feedback (M = 2.66)
  Faculty: Providing feedback to students (M = 2.83); Viewing student self assessments (M = 2.59); Accessing the DDP from off-campus (M = 2.37)

Perception of Least-Useful DDP Features
  Students: Using the Reference area (M = 1.54); Using the My Resource area (M = 1.57); Attaching a key performance to a matrix (M = 1.75)
  Faculty: Using the My Resource area (M = 1.27); Using the Help Menu (M = 1.46); Using the Reference area (M = 1.68)
During interviews students and faculty did not specifically name features of
the DDP they found the most useful. Students often commented that they did not
perceive themselves as using the DDP very frequently. Most faculty mentioned that they
uploaded feedback, read student self assessments, and read student work. However,
the interviews provided a number of ideas on potential use of the DDP, and
information on what faculty perceived as issues and problems with the DDP.
4. What are students’ and faculty members’ overall perceptions of the usefulness of the
DDP?
Students and faculty were asked to rate their perception of the overall
usefulness of the DDP on a Likert Scale of 1 (Not Useful), 3 (Useful), and 5
(Extremely Useful). Student survey results indicated they perceived the DDP as
overall Useful (M = 3.0, SD = 1.1, median = 3.0). Survey results indicated faculty
perceived the DDP as overall Useful (M = 3.5, SD = 1.1, median = 3.3).
The data from the open-ended survey question on overall usefulness of the
DDP supported the Likert Scale response data. The category with the highest number
of student and faculty responses was the Find Useful category. Student responses in
this category included “Able to see everything -- can access it off-campus which is
helpful” and “I believe it is a good tool to assess my progress as a student.” Faculty
responses in this category included “It helps both teachers and students have a
cumulative picture of student learning” and “Very helpful for writing narratives and
honors nominations has pushed me to give more complete, clear feedback has pushed
students to do better self assessment.”
Student and faculty interview results supported the data gathered from the
surveys. Comments generally referred to the DDP as useful, although students
responded that they felt if they used the DDP more, it would be more useful. Several
of the faculty also mentioned the DDP should be used more frequently.
5. What do students and faculty think about the ease of use of the DDP?
Students and faculty were asked to rate their perception of the ease of use of
the DDP on a Likert Scale: 1 (Not Easy), 3 (Easy), and 5 (Extremely Easy). Student
survey results indicated they perceived the DDP as easy to use (M = 3.2, SD = 1.1,
median = 3.0). Faculty survey results indicated they also perceived the DDP as easy
to use (M = 3.0, SD = 1.2, median = 3.0).
The data from the open-ended survey question on ease of use of the DDP
supported the Likert Scale response data. The category with the highest number of
responses from student and faculty surveys was Easy to Use. Student survey
responses in this category included: “Easy to use and navigate to appropriate area”
and “Accessible and self explanatory.” An example of a faculty response in the Easy
to Use category was: “I needed to use it to upload assessment feedback, followed the
directions provided and viola!”
Student interview data supported student survey results, as students viewed the
DDP as easy to use. Five of the six faculty interviewed also thought the DDP was
easy to use, although two faculty described specific problems they had encountered
in their use of the DDP.
6. What are students' and faculty members' perceptions concerning their frequency of use of the
DDP?
Students were asked to rate their perception of their frequency of use of the
DDP on a Likert Scale: 1 (Not Enough), 3 (Enough), and 5 (Too Much). Student
survey results indicated students perceived the DDP as being used enough (M = 2.3,
SD = 1.0, median = 2.0). However, 51.0% of students responded with a choice less
than 3 (less than Enough).
Faculty were asked to rate their perception of how often they use the DDP
with their students on a Likert Scale: 1 (Never), 3 (Often), and 5 (Frequently). Faculty
survey results indicated they perceived that they used the DDP with their students
slightly less than Often (M = 2.5, SD = 1.1, median = 2.0). However, 60.5% of
faculty responded with a choice of less than 3 (less than Often).
The data from the open-ended survey question on frequency of use of the
DDP supported the Likert Scale response data. The category with the largest number
of responses from students was Frequency of Use. Responses in this category
indicated students wanted to use the DDP more. Student survey responses included:
“Especially related to my major I would like to see these things on the DDP” and
“Haven’t been asked by teachers to use the DDP -- have only used it 2x this
semester.” Faculty survey responses from the Use Occasionally category included: “I
think I could use it more -- I do lots of feedback for my students but I don't put it on
the DDP -- I need to create more assessments as key performances.”
Both student and faculty interview results supported the data gathered from
the surveys. Students referred to not using the DDP enough and faculty responded
that the DDP should be used more frequently.
7. What suggestions do students and faculty have on: (a) how to improve the usefulness
of the DDP, (b) how to assist them in using the DDP more, and (c) general ideas for
improving the DDP?
Students and faculty were asked open-ended survey questions to gather data
on suggestions to increase the use of the DDP and improve the usefulness of the
DDP, followed by general ideas and comments. Student survey responses indicated a
pattern of Frequency of Use type responses. Students indicated they were not using
the DDP enough, wanted to use the DDP more, or could learn how to use the DDP if
they used it more. For example, “I think if the school wants students to use and
appreciate this useful technology they ought to show us how to use it in the first place
and then be consistent with using it through at our education.” Students also indicated
the need for increased training and directions to assist them in using the DDP more.
For example, “Either have a class focus on this or have more one-on-one help with
the DDP because some may have forgotten to know how to use it in the beginning of
the year” and “I think it is very resourceful, I also think they should show you how to
use it instead of playing with it in order to figure it out.” Students gave a variety of
suggestions to improve the program’s usefulness, such as: “Reminder that it is there
AND info, on how to organize info under references/work” and “Please encourage
instructors to upload feedback in a timely manner.”
Faculty responses to suggestions to increase the use of the DDP, improve its
usefulness, and provide additional suggestions were more varied. They responded
with numerous suggestions on how to improve the process of using the DDP with
their students. For example, “Being able to look at student work, their self assessment
side-by-side as well as see the document I am typing feedback into” and “Use it to
address institution wide issues around quantity, quality and timeliness of instructor
feedback. Use it to help the college re-think and organize narrative transcripts.”
Student and faculty interviews provided numerous suggestions on how to
increase the use of the DDP or improve the program. Of note were the stories faculty
related on how they are using the DDP with their students. These stories can provide
the institution with valuable models to share with other faculty.
Data on completed key performances were used to compile an overall picture of
key performance characteristics. Characteristics of key performances completed during
spring, 2005 included:
1. Students completed 184 out of 472 different key performances (39.0%). A
total of 3,918 key performances were completed by 1,669 different students.
2. There were 37 discipline departments represented in the 3,918 completed key
performances. The discipline departments with the highest number of
completed key performances were: Assessment Center (AC), 23.6%,
Communication (CM), 19.6%, Integrated Learning (IN), 6.4%, and Nursing
(N), 5.0%.
3. Completed key performances were connected 8,753 times to the Ability
matrix. All eight abilities and levels were represented. The abilities with the
largest number of connections to completed key performances were:
Communication (43.6%), Analysis (19.7%), and Problem Solving (11.8%).
4. Completed key performances were connected to matrices (other than the
Ability Matrix) 3,487 times, representing 29 different matrices. The matrices
with the largest number of connections to completed key performances were:
Wisconsin Education Standards (17.3%), Psychology (14.8%), Mathematics
Content Guidelines (10.1%), and English (8.8%).
The results of this study can be used to create a picture of student and faculty use
and perceptions of the DDP. In addition, knowledge gained from the DDP database,
student and faculty surveys, and student and faculty interviews can be used by the
institution to improve both the process of using the DDP and the DDP program.
CHAPTER FIVE: DISCUSSION
Overview
The purpose of this study was to address the question of the use of Alverno
College’s Diagnostic Digital Portfolio (DDP) by describing and evaluating undergraduate
student and faculty use and perceptions. An Interactive form of program evaluation
(Owen, 1999) was the methodology used in this study, which focused on providing
information on program delivery, documenting improvements/innovations, understanding
more fully how and why a program operates in a given way, and providing suggestions
for improving the program (Owen, 1999, p. 44). The key approaches connected with the
Interactive form that were used in this study are responsive evaluation (taking into account
the perspectives of the stakeholders) and developmental evaluation (working with
providers on continuous improvement).
The three data gathering methods used in this study (mining of the DDP relational
database, student and faculty surveys, and student and faculty interviews) were designed
to address both of the key approaches in the Interactive form of program evaluation:
responsive and developmental evaluation. Student and faculty use and perceptions of the
DDP were broken down into two main areas: seven sub-questions that concern usage by
students and faculty and four additional sub-questions that focus on identifying
characteristics of key performances. These two sets of sub-questions form the main part
of Chapter Five, which is preceded by a summary overview of the important points. The
summary of results of the research sub-questions is followed first by a comparison of
Alverno’s DDP to Love, McKean, and Gathercoal’s levels of maturation of digital
portfolios and then by a discussion of the relationship of this study to other research,
including Alverno’s initial research. Chapter Five ends with a discussion of conclusions,
limitations of this study, and future research possibilities.
Summary of Findings
One of the most significant findings of this study was that undergraduate students
and faculty WERE logging onto and using the DDP. An analysis of the DDP relational
database logs found students logged on six times (median) and faculty logged on 10 times
(median) during the spring, 2005 semester. Student surveys and interviews also indicated
that students wanted to use the DDP more often and more consistently throughout their
educational experience at Alverno. Faculty survey and interview results suggested that
faculty would also like to use the DDP more often with their students. However, faculty
had concerns about their level of knowledge of the DDP (training issues), as well as
concerns about organization, time, and workload issues.
Students completed an average of two key performances during spring, 2005.
Therefore, DDP use is meeting the institutional goal of a minimum of two key
performances completed each semester. In survey responses and interviews, students
described using the DDP primarily when required by faculty, but seemed to know very
little about additional DDP features and would like more training. When faculty were
asked what they have the students do on the DDP, they usually described their DDP use
within specific courses. Faculty also said they would like to use the DDP more often with
students. They expressed the need for more time, more training, and more models of
DDP use and integration.
The majority of students and faculty indicated the DDP IS useful. Survey results
found that 68.6% of students rated the DDP as useful to extremely useful, while 83.3% of
faculty rated the DDP as useful to extremely useful. One theme of the open ended survey
questions was that students wished they were using the DDP more often, and if they used
it more often, it would become even more useful. Students also discussed issues with
timing of the use of the DDP (everything happening at the end of the semester), with not
really understanding the purpose of the DDP, and with not knowing much about the DDP
in general.
Students and faculty perceived the DDP as EASY to use. Student survey results
found that 74.1% of students perceived the DDP as easy to extremely easy to use, while
6% thought it was not easy to use. Faculty survey results found 64.9% of faculty
perceived the DDP as easy to extremely easy to use, while 9.9% perceived the DDP as not
easy to use. Open-ended survey responses and interviews indicated that students
perceived the DDP would be easier to use if they used it more often.
An interesting result of this study concerned the frequency of use of the DDP.
Students perceived that the DDP was not being used enough and should be used more
often. When asked about frequency of use on the student survey, 51.4% of students
responded the DDP was not used enough. Student comments, both from open-ended
survey questions and from interviews, supported these findings. Students also stated the
need for faculty to require more consistent use of the DDP. Even students who expressed
negativity toward the system in some of their survey responses referred to the infrequent
use of the DDP as one of the reasons why they did not like it.
Faculty survey results indicated that 60.5% of faculty used the DDP less than
often with their students. Open-ended survey questions and interviews supported these
findings.
Summarizing the important results of this study made it clear that the majority of
undergraduate students and faculty perceived the DDP as an easy to use, useful tool that
should be used more often.
Summary of Research Sub-question Results
This research study focused on seven sub-questions concerning student and
faculty use and perceptions of the DDP, along with four sub-questions on the
characteristics of key performances. These sub-questions form the organization for the
discussion of the results of this study. In addition, a comparison of the DDP to Love,
McKean, and Gathercoal’s five levels of maturation for digital portfolios is described
here.
Sub-question 1: How often do students and faculty log onto the DDP?
Students and faculty WERE logging onto the DDP. An analysis of the DDP’s
relational database logs found that the median number of student log-ons for the spring,
2005 semester was six (M = 9.1, SD = 10.2). Institutional data listed a total of 2,006
undergraduate and non-degree students attending Alverno during spring, 2005. An
analysis of the DDP relational database found a total of 1,893 (94.4%) undergraduate
students logged onto the DDP during spring, 2005. Data from the DDP relational
database logs found that the median number of DDP faculty log-ons during the spring,
2005 semester was 10 (M = 22.0, SD = 27.8). Comparing these results to institutional
records of the number of faculty (180) indicated that approximately 71.4% of faculty
logged onto the DDP during spring, 2005.
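For readers interested in how per-user log-on counts of this kind might be tabulated from a relational database export, a minimal sketch follows. It is offered only as an illustration, not as the analysis actually performed in this study; the file name and column names (user_id, user_role, login_timestamp) are assumptions made for the example.

import pandas as pd

# Hypothetical export of the DDP log table; file and column names are assumed.
logs = pd.read_csv("ddp_logins_spring_2005.csv", parse_dates=["login_timestamp"])

# Count log-ons per user within the semester.
per_user = (logs.groupby(["user_role", "user_id"])
                .size()
                .rename("logons")
                .reset_index())

# Summarize by role: median, mean, and standard deviation of log-ons.
# In this study, students had a median of 6 log-ons and faculty a median of 10.
summary = per_user.groupby("user_role")["logons"].agg(["median", "mean", "std"])
print(summary)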
Survey results found that 32.4% of students perceived they logged onto the DDP once
a month, or approximately four times a semester. This is slightly lower than the findings
from the DDP database (median = six). Faculty survey results found 57.6% of faculty
perceived that they logged onto the DDP three times or less per month, or approximately
nine times a semester. Again, these results are slightly lower than the findings from the
DDP relational database (median = 10).
Although students and faculty were not specifically asked during the interviews how many
times they logged onto the DDP, one overarching interview theme was the
need to use the DDP more often. The findings of this study indicate that students and
faculty WERE logging onto the DDP and they expressed an interest in using the DDP
more often.
Sub-question 2: What do students and faculty do when they log onto the DDP?
Data from the DDP relational database were analyzed to find the average number
of key performances completed by students and the number of active key performances
and files uploaded by faculty during spring, 2005. Survey questions asked for students’
perceptions of the number of key performances they had completed during the semester,
as well as how often they used nine features of the DDP. Faculty surveys contained a
question on the number of key performances faculty used during the semester and how
often they used nine features of the DDP. Both student and faculty interviews contained
a question on what type of experiences they have had with the DDP.
An analysis of data from the DDP relational database found that during spring,
2005 students completed an average of two key performances (M = 2.4, SD = 1.5,
median = 2.0). Student survey results were similar to the database findings with 59.4% of
students responding they had completed two or more key performances during the
semester. However, 18.2% of students responded they had completed no key
performances, while 22.4% of students perceived they had completed one key
performance. It is interesting to note that survey statistics based on student groups found
that beginning students had a median of two completed key performances, while
intermediate students had a median of one, and advanced students had a median of zero.
This could be due to students’ perceptions that they used the DDP more in beginning
courses or the general perception that they used the DDP infrequently.
An analysis of the DDP relational database found that faculty uploaded a median
of 18.0 files during spring, 2005 (M = 27.2, SD = 26.8). During this same period, 347
active key performances were created by 100 different faculty (median of 3.0), which
provided a range of possible key performances for students to complete. Faculty survey
perceptions on the number of active key performances they had on the DDP were slightly
lower than database results, with a median of two.
Student interview data supported the survey results. Both intermediate and
advanced students indicated they used the DDP more frequently in their beginning
courses. For example, an advanced student said, “When I initially came to Alverno [in
the course] IN 130 we did a lot of DDP work. My initial Nursing courses had a lot of
DDP work…after that, there really wasn’t much to upload, maybe a couple of things.”
When interviewed, faculty were asked what kinds of things they have done with
the DDP; they usually responded by describing how they used the DDP with their
students in specific courses. For example:
My favorite activity is a key performance, that’s a self reflection that I arrange for
PSY 110 students. I piloted it probably three years ago, and I have used it two or
three times and it consistently gets better self assessments from my students… the
mid-term is to reflect on the theories that we have been learning…I have them
choose a theory and tell me what they understand about that theory in relationship
to a series of questions I give them … part of their mid-term self reflection is to
basically give themselves an overall evaluation on how they did on this mid-term.
I have students reflect on what they have learned to date at mid-term on theory,
and make three goals in those three domains [cognitive, psycho-social, and biosocial]. So that helps them pull in this very personal self reflection as part of the
course content, as well as pulling in theory.
Another faculty described a unique use of the DDP:
I have each student write feedback to the person they interviewed, and they put it
on the DDP. Then the student would respond to what she learned from both the
interview and the feedback she got from the interviewer. So the prompt, Peer
Feedback, prompted me to include that, and what it created was an opportunity
for me, for the students, when they give peer feedback in that way, where it means
something and it’s popular. I couldn’t believe the development.
The second aspect concerning what students and faculty did when they logged
onto the DDP focused on their perceptions of how often they used certain DDP features.
Survey choices available on how often they used the features were: Do not know what
this is (0), Never (1), Occasionally (2), Often (3), Very Often (4). Although there were
some differences in features of the DDP listed for students and faculty, there were three
features in common: the Reference area, My Resources area, and the Help Menu. Both
students and faculty agreed on these three features as the LEAST-used features. Table 68
displays these findings. The results of this study indicate that students and faculty did not
use these features very often.
Table 68
Comparison of Student and Faculty Survey Results of Least-Often Used Features of the DDP

Students: 1. Reference Area (M = 1.26); 2. Help Menu (M = 1.26); 3. My Resources (M = 1.30)
Faculty: 1. My Resources (M = 1.27); 2. Help Menu (M = 1.43); 3. Reference Area (M = 1.49)
Choices: 0 = Do not know what this is, 1 = Never, 2 = Occasionally, 3 = Often, 4 = Very Often
Although these three features are considered important by the DDP design team,
it seems apparent that students and faculty do not share this view. While some students
and faculty described using the Reference area in the interviews, only one student spoke
of using the My Resource area. These results could be due to limited training students
received on the DDP or they could be due to perceptions that students and faculty do not
use the DDP very often.
Student and faculty perceptions of the MOST used DDP features followed the
main user processes of the DDP. For example, the main DDP process for students is to
complete key performances by uploading self assessments and reading feedback. The
main DDP process for faculty is to complete key performances by uploading feedback
and assigning an overall status. Faculty usually read the student's self assessment and
work (if uploaded) before they post their feedback. Table 69 displays the student and
faculty survey results of the most often used features of the DDP.
Table 69
Comparison of Student and Faculty Survey Results of Most-Often Used Features of the DDP

Students: 1. Uploading Self Assessment (M = 2.21); 2. Checking Feedback (M = 2.02); 3. Adding a Key Performance to the My Work Area (M = 1.98)
Faculty: 1. Uploading Feedback (M = 2.62); 2. Reading Self Assessments (M = 2.52); 3. Reading Student Work (M = 2.22)
Choices: 0 = Do not know what this is, 1 = Never, 2 = Occasionally, 3 = Often, 4 = Very Often
When students were asked during the interview to describe what kinds of things
they had done on the DDP, they usually described putting assessments or assignments
into the DDP. For example, a beginning student said: “When we were IN 125 we were
told to go on the DDP to do our assessments.”
When faculty were asked what kinds of things they had done with the DDP, they
usually responded in terms of the courses in which they used the DDP. For example:
“We in the Psych department have come to an agreement that in our upper level courses,
we will always have some input into the DDP.” and “We have used the DDP for SC 120
and SC 118 courses and no longer do so.”
The results of this study found that when students and faculty logged onto the
DDP they were completing key performances. It is interesting to note that students and
faculty did not make adequate use of additional features of the DDP such as the
Reference area, My Resource area, and the Help Menu. This could be due to limited
training for both students and faculty on various features of the DDP or due to their
perceptions of their limited use of the DDP.
Sub-question 3: What features of the DDP are perceived by students and faculty as
useful or not useful?
It is interesting to note that the features of the DDP that students and faculty
considered to be the least useful are similar to the least-often used features from
the preceding sub-question. Table 70 displays the comparison of student and faculty
perceptions of the least useful features of the DDP. Students perceived the Reference and
My Resource areas as two of the least useful features. The third least useful feature for
students was attaching a key performance to a matrix. Perhaps students perceive that
attaching a key performance to a matrix is not a useful feature due to the lack of training
on why to use this feature and/or how to use it.
Table 70
Comparison of Student and Faculty Survey Results of Least-Useful Features of the DDP

Students: 1. Reference Area (M = 1.54); 2. My Resources (M = 1.57); 3. Attaching a Key Performance to a Matrix (M = 1.75)
Faculty: 1. My Resources (M = 1.27); 2. Help Menu (M = 1.46); 3. Reference Area (M = 1.68)
Choices: 0 = Do not know what this is, 1 = Not Useful, 2 = Occasionally Useful, 3 = Often Useful, 4 = Very Useful
Only a few students interviewed described using the Reference or My Resource
areas unless prompted by faculty. They seemed somewhat unsure as to how these areas
could be used. For example, when asked if she had used the Reference or My Resource
areas one student said, “No, not yet… I am not really familiar with how to get to some of
the stuff on there.”
Faculty perceptions of the least-useful features were identical to their perceptions of
the least-often used features (My Resources, Help Menu, and the Reference area).
During the interviews faculty spoke of viewing Advanced Outcomes of major programs
and using the criteria sheets from the Reference area, but were not familiar with the My
Resource area. This would imply a need to address the purpose of these two areas and
their usefulness in training sessions.
Student and faculty perceptions of the most-useful features of the DDP seemed to
mirror the main processes of the DDP. Both students and faculty listed accessing the
DDP from off-campus in their top three most useful features. Easy access to information
any time, anywhere on the Internet is becoming the norm in technology, and the data from
this study support the importance of this easy access. Table 71 displays the comparison
of student and faculty perceptions of the most useful DDP features.
Table 71
Comparison of Student and Faculty Survey Results of Most-Useful Features of the DDP

Students: 1. Accessing Work and Self Assessment (M = 2.68); 2. Accessing the DDP from Off-Campus (M = 2.66); 3. Accessing Feedback (M = 2.66)
Faculty: 1. Providing Feedback to Students (M = 2.83); 2. Viewing Student Self Assessments (M = 2.59); 3. Accessing the DDP from Off-Campus (M = 2.37)
Choices: 0 = Do not know what this is, 1 = Not Useful, 2 = Occasionally Useful, 3 = Often Useful, 4 = Very Useful
It is interesting to note that the mean scores for the most useful features were
generally between 2.6 and 2.8. This could be connected to both students and faculty
comments on the need to use the DDP more often.
Sub-question 4: What are student and faculty perceptions of the overall usefulness of the
DDP?
Data from surveys and interviews indicated that students and faculty perceived
the DDP as USEFUL. A total of 68.6% of students surveyed rated the DDP as useful to
extremely useful. Faculty survey responses indicated that 83.3% of faculty rated the DDP
as useful to extremely useful. However, 10.1% of students rated the DDP as not useful,
while 4.4% of faculty responded the DDP was not useful. Advanced students had a lower
perception of usefulness of the DDP with 21.7% responding that the DDP was not useful.
Perhaps this result is connected to advanced students’ perception that they do not use the
DDP very much after their beginning courses. Survey results found that 10.1% of
intermediate students and 5.9% of beginning students responded that the DDP was not
useful.
Of note in these findings is the pattern of student responses, both to the open-ended
survey question that asked them to explain their rating of DDP usefulness and in their
interview comments. A major theme on both the student surveys and interviews
was the infrequent and inconsistent use of the DDP. Students described the DDP as useful,
but not as useful as it could be because they are not using it much. For example, during
the interview, an intermediate student said, “The DDP would be such an awesome tool if
it was used more frequently.”
Student interview comments did not indicate negativity towards the DDP itself;
instead, their negative comments focused on the infrequency of use. For example, an
advanced student who responded hesitantly to the question of overall usefulness also said
“…if it was encouraged to be used in each and every class I think it would be a great
tool.” An intermediate student said, “I don’t see a major use for me in my art therapy. I
don’t have a whole lot of things to put in there…I haven’t had any reason to [use it], so I
don’t see it as majorly useful.” This student went on to say “I think if I were using it to
the extent I could be using it, or should be using it, that I would find the whole thing very
useful.”
Faculty perceptions of the overall usefulness of the DDP were higher than students'.
A total of 83.3% of faculty responded the DDP was useful to extremely useful (20.0%
responded extremely useful). It is interesting to note that faculty perceived the DDP as
useful but, according to student perceptions, the faculty do not seem to use it enough.
Survey data supported the interview findings with a high number of positive
open-ended comments concerning the usefulness of the DDP. The majority of negative
comments were about the DDP process, not the DDP itself. Faculty responses included
time and workload issues, various media on the DDP that they did not think were useful
(video and audio files), and/or that the DDP does not “fit” with the way they gave
feedback. One clearly negative faculty survey response was:
I think we have spent a lot of money on a technological tool that has marginal
value that we now need to justify. I may be wrong. I could easily be convinced
that I am incorrect. However, the only value I see to the DDP is that students can
look back on previous performances. While I think this is neat, I don't see how
that is worth millions of dollars and thousands of hours of investment.
Interview data seemed to be skewed toward the DDP being a very useful tool, with five
out of six student interviewees indicating they wanted to use the DDP more often.
Faculty interviewees described how they used the DDP and integrated it into their
teaching, the need to learn more about the DDP, and the need to use the DDP more often
with their students. One faculty indicated they did not like the DDP and would not use it
unless required to. However, this interviewee made note of the fact that they could
understand why students would think the DDP is useful to them. One faculty member
(classified in the negative toward the DDP group) said:
I’ll say it is becoming very useful. It is a pain in the neck, but it will always be an
effort but it’s work we have to do. When there’s enough stuff in there, and we’re
using it in a more effective way that’s lined up better with our philosophy, then
it’s worthwhile; worth the extra effort.
The results of this study found that students and faculty perceived the DDP as a
useful tool. A notable finding is that students responded they wanted to use the DDP
more often, and the more they use it, the more useful they believed the DDP would
become. Students indicated they would like more consistent use of the DDP in their
courses, especially courses within their major program. Faculty responded they wanted to
use the DDP more often with their students. It would seem that both groups agreed on
the need to use the DDP more often. Due to the processes of the DDP, in order for
students to use it more often, faculty need to have students complete key performances.
The findings of this study also indicated that it is important for the institution to create
models of DDP use that demonstrate integration with faculty teaching and assist faculty
in using the DDP more often.
Sub-question 5: What are student and faculty perceptions of ease of use of the DDP?
Survey results indicate that students and faculty perceived the DDP as EASY to
use. Approximately 74.1% of students surveyed responded the DDP was easy to
extremely easy to use, while only 6.0% of students thought the DDP was not easy to use.
Approximately 65% of faculty thought the DDP was easy to extremely easy to use, with
9.9% of faculty responding the DDP was not easy to use.
It is interesting to note that ease of use of the DDP varied with the student groups.
For example, advanced students had the highest percent of students who thought the DDP
was not easy to use (9.8%). Only 6.0% of beginning students responded that the DDP
was not easy to use. However, 33.5% of beginning students responded that the DDP was
less than easy to use. These results could be due to beginning students are just learning
the system and/or the limited training they receive. Entering students are introduced to
the DDP during a one-hour training session on technology use at Alverno College.
Besides an introduction to the DDP, students are introduced to the Academic Computing
Center (computer labs that students use), the Alverno network, and Educator. This
technology training session takes place during a two-day general introduction to Alverno
prior to the start of classes.
Negative comments from the open-ended student survey questions were generic
in nature. For example: “With instructions, I can use the DDP but I’m not good with
computers” and “It seems a bit complicated to go through the whole process of uploading
and entering info that I don’t really use.”
Faculty survey comments indicated some specific issues with the DDP, such as
issues with archiving and cloning, use of the back button, and forgetting how to use the
DDP. Faculty comments also indicated difficulties in using the DDP with large classes,
as well as the length of time it takes to upload feedback to the DDP at the end of the
semester. One faculty wrote “I never use it [DDP].”
Student and faculty interviews supported the survey results. Five out of eight
students interviewed indicated they had no problems using the DDP. One advanced
student responded “I think it has gotten better.” An intermediate student’s response
focused on not understanding some of the additional functions of the DDP, “…the only
thing [I don’t understand] is the Resources. I wish there had been more focus on it.”
Four out of six faculty interviewed described some issues with ease of use of the
DDP. Two faculty members referred to the DDP “timing out” too fast or had issues with
using the browser’s Back button. Faculty also mentioned the need to “refresh” the
screen, although they viewed this as an irritation rather than a major issue with the DDP.
One faculty said using the DDP was getting a little easier, but also indicated the need to
keep making the DDP more intuitive. Another faculty interviewed described having had
significant technical issues with the DDP in the past, saying “I am not a fan of the
system…I have not done anything else and I don’t intend to if I don’t have to.”
The results of this study indicated that both students and faculty perceived the DDP
as easy to use. The data provided several suggestions to make the DDP even easier to use.
It is interesting to note that students responded that if they used the DDP more, it would
be easier to use. Faculty survey responses included similar comments concerning
frequency of use. One faculty said, “Because I use it at the end of the semester I always
need a learning refresher to get into the groove again.”
Sub-question 6: What are student and faculty perceptions concerning the frequency of
use of the DDP?
Student perceptions of the frequency of use of the DDP indicated they believed the
DDP should be used more often. Frequency of use was a main theme in all of the
open-ended student survey questions as well as in the student interviews. The results of the
study indicated faculty perceived they should use the DDP more often with their students.
Approximately 51% of students responded that the DDP was not used enough, while
7.9% of students responded the DDP was being used more than enough. Faculty survey
results found 60.5% of faculty responded they use the DDP less than often with their
students.
Over 50% of the responses to the student survey open-ended question (please
explain your rating) commented on not using the DDP enough. Examples include:
“Haven’t been asked by teachers to use the DDP – have only used it twice for two
semesters” and “It seems that for its purpose we don’t use it enough. We should use it
more.”
Faculty responses to the open-ended survey question (please explain your
response) indicated they are learning or trying to use the DDP more often. Faculty
responses included: (a) “I am on the curve of adoption toward often;” (b) “I’ve made a
commitment to myself to use it every semester;” and (c) “I am doing a bit more each
semester. I have designed a key performance every other semester.” One faculty wrote
on their use of the DDP: “I used it but stopped. It took too much time. At 2
minutes/student to upload in a class of 30 this is 1 hour.”
Student and faculty interviews supported the survey results. All eight students
interviewed indicated they wanted to use the DDP more often and more consistently. For
example, an intermediate student said, “I wish it would be [used] more because I would
like to go in there and see [more things].” An advanced student stated: “It tends to be hit
or miss with the faculty comfort with the DDP.”
Five out of six faculty interviewed wanted to see the DDP used more often. One
faculty, in response to the question of what would assist you in your use of the DDP said,
“More stuff up there. In particular, more faculty feedback and student self assessments
that are aimed at departmental outcomes for majors and supports.” Another faculty, in
response to the question of what they would like to tell the DDP Design Team, said,
“I would encourage as many opportunities as possible for faculty to use the DDP as an
opportunity to record overall judgment of student work.”
The results of this study indicated that both students and faculty would like to see
the DDP used more often. Survey and interview comments also pointed out that students
and faculty believed the more the DDP is used, the more useful it would become. These
findings identified the importance for the institution to provide resources, support, and
training that assist faculty in increasing their use of the DDP. The findings also suggest
that the institution needs to continue to develop models that integrate DDP use
into faculty teaching and workload.
Sub-question 7: What suggestions do students and faculty have on how to improve the
usefulness of the DDP, how to assist them in using the DDP more, and what general
ideas would improve the DDP?
One clear recurring student theme in the survey questions regarding
suggestions for the DDP concerned the frequency of use and the need to use the DDP
more often. For example, on a student survey question on how to enhance the usefulness
of the DDP, one student said, “having professors use it consistently from class to class.”
On a survey question concerning suggestions for using the DDP more often, a student
said, “having more classes access the DDP.” One student, in response to a question on
suggestions for improving the DDP, said, “…either get rid of it or use it more often.”
Another theme for student responses concerning suggestions for the DDP was the
need for training. Students frequently responded that they need to understand the purpose
of the DDP. Student suggestions included: “have a class on the DDP” and “A better
workshop on how to use it instead of the 20 minutes when you are a beginning student.”
Student survey responses also offered some generic suggestions on how to improve the
DDP including: (a) “use it for every class;” (b) “same sign in code;” (c) “if my teachers
valued it and knew how to use it;” and (d) “have instructors do feedback in a timely
manner.”
Student interview suggestions mirrored the student survey suggestions, especially
with respect to using the DDP more often. In response to a question on what they would
like to tell the DDP Design Team, students said: (a) “Well, I don’t think this is [for] the
design team, as much as encouraging faculty to make it a requirement;” (b) “Keep
reintroducing the idea…I am a senior and this is the first time in a long time that we are
going to start using the DDP;” and (c) “I talked to a few students in preparation for this
interview and I’m getting the same type of feedback. We’d love to use it, but it never
comes up.”
Faculty survey responses to the questions on suggestions for the DDP contained a
pattern of calling for increased and on-going training and development, learning more
about features, and increased departmental training and planning. There were also clear
patterns of time and work load issues in faculty responses. For example: (a) “Keep
showing us how not to make it an add-on job;” (b) “Find less time-consuming ways to
give feedback to large classes;” (c) “More time and training;” and (d) “Integrate it with
other faculty work.”
Faculty suggestions for improving the DDP seemed to focus more on DDP
process. Examples of faculty survey suggestions for improvement included: (a) “Bring
part-timers aboard;” (b) “More variety of feedback modes;” (c) “Simple means of
scanning handwritten feedback;” and (d) “Accountability.” However, there were some
specific faculty suggestions on improving the DDP application including: (a) “Longer
time out;” (b) “Be able to use the Back Button;” (c) “Remove the Refresh problem;” and
(d) “Be able to see more windows at the same time.” Some faculty survey responses
pertained to institutional processes. For example, one faculty survey response was, “Use
it to address institution-wide issues around quantity, quality, and timeliness of feedback.
Use it to help the college rethink and organize narrative transcripts.”
Faculty interviews also provided some suggestions for enhancing use and
improving process. Examples include: (a) “just keep being the cheerleaders… I think if
there has to be a team of dedicated people that keep hammering away on it;” (b) “It
would be nice if we could do batch uploads so that I could take my whole class and
upload all of the feedback at once;” (c) “I think it would be useful to get a report of
something of the student’s progress with respect to each of the departmental outcomes;”
and (d) “More final evaluative feedback on courses.”
Survey and interview suggestions given by students indicated that they want to
use the DDP more often and have more training on how to use the DDP, especially with
respect to additional DDP features. Faculty comments indicated the need to have more
entries in the DDP for students and the need for more training. Faculty training
suggestions extended beyond the “how to” of the DDP and included the need for training
on the integration of the DDP into teaching and workload. The results of this study
identified faculty issues concerning using the DDP with large classes and the time it takes
to upload feedback into the DDP.
Summary of Results on Characteristics of Key Performances
This study investigated four main questions concerning the characteristics of key
performances: (a) active key performances being used by students, (b) departments that
have completed key performances, (c) key performance connection to abilities and levels,
and (d) key performance connection to other matrices.
How many active key performances are being used by students?
Data from the DDP relational database found that 38.9% of active key
performances were completed by students during the spring, 2005 semester (472 active
key performances, 184 were used). These results reveal that 61.1% of active key
performances on the DDP were not used during spring, 2005. These results could be due
to some courses being offered every other semester, faculty teaching rotations, or lack of
training in how to archive the key performance (remove it from the active list). Having
unused key performances could create an issue for students searching for a particular
active key performance to complete and could result in selection of the wrong key
performance. It is important for the institution to investigate and track
the use of active key performances and create maintenance plans that will keep the list of
active key performances as up-to-date as possible.
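As one illustration of how such tracking might be operationalized, a minimal sketch follows, assuming hypothetical exports of the active and completed key performance lists; the file and column names (for example, kp_id) are assumptions for the example and do not reflect the DDP's actual schema.

import pandas as pd

# Hypothetical exports; file and column names are assumed for illustration.
active = pd.read_csv("active_key_performances.csv")      # e.g., kp_id, title, department
completed = pd.read_csv("completed_kp_spring_2005.csv")  # e.g., kp_id, student_id

# Flag active key performances with no completions during the semester.
used_ids = set(completed["kp_id"])
active["used_this_semester"] = active["kp_id"].isin(used_ids)

unused = active[~active["used_this_semester"]]
print(f"{len(unused)} of {len(active)} active key performances "
      f"({len(unused) / len(active):.1%}) were not used this semester.")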
What discipline departments have completed key performances?
A total of 58.7% of departments had completed key performances during spring,
2005. The Assessment Center (AC) had the highest percent of completed key
performances at 23.6%, with the Communication Ability Department second at 19.6%.
These results indicated that two departments accounted for 43.2% of completed key
performances. It is important that additional departments, especially discipline
departments with majors, increase their number of completed key performances. This
suggestion is reinforced by student comments that they seem to use the DDP in their
beginning courses (general education), but use the DDP very little in their major courses.
How are completed key performances connected to the abilities?
Key performances completed during spring, 2005, were connected to all eight
abilities and all four levels (refer to Chapter One for explanation of abilities and levels).
These results indicated that students could use the DDP to demonstrate their progress in
development of the eight abilities and levels. The Communication ability accounted for
43.6% of completed key performances connected to abilities. The Analysis ability was
second with 19.7% of completed key performances connected to this ability, followed by
Problem Solving with 11.8%. The other five abilities (Valuing, Social Interaction,
Effective Citizenship, Global Perspectives, and Aesthetic Engagement) together
accounted for approximately 25% of completed key performances connected to the
Ability Matrix. While all four levels of abilities were represented by completed key
performances, level three had the highest percentage of connections, with 29.6%. Level
four had the smallest percentage of connections to completed key performances with
16.8%.
The results of this study indicated that students could demonstrate all eight
abilities and four levels in completed key performances on the DDP during spring, 2005.
This is a critical point, because the purpose of the DDP is to assist students in analyzing
their development in the eight abilities. While Communication has the highest
percentage of completed key performance connections to the Ability Matrix, it is
important to note that these connections are usually at the beginning levels (levels 1 and
2). It will be important to continue to expand the demonstration of abilities and levels on
the DDP to ensure students have sufficient numbers of key performances to analyze their
development in all of the abilities and levels.
How are completed key performances connected to other matrices?
The results from this study indicated that key performances were connected to
other matrices in the DDP; however, these connections are somewhat limited. For
example, key performances completed during spring, 2005, were connected to 29
different matrices (other than the Ability Matrix). In spring, 2005, there were 59 different
active matrices in the DDP (other than the Ability Matrix). An analysis of this data
indicated that 49.1% of matrices were connected to completed key performances.
The Wisconsin Educational Standards matrix had the highest percentage of
connections to completed key performances with 17.3%. Psychology was next at 14.8%,
followed by the Wisconsin Mathematical Guidelines with 10.1%. Approximately 69% of
all connections were from seven matrices (Wisconsin Educational Standards,
Psychology, Wisconsin Mathematical Guidelines, English (8.8%), Computer Science
(6.9%), Business and Management (5.9%), and Education (5.0%)).
The results of this study indicated a need to expand the connections of key
performances to a larger variety of matrices. These results supported findings from
student surveys and interviews which indicated that the DDP seems to be used more in
beginning general education courses (connections to ability matrix) than in intermediate
and advanced course work (usually connected to advanced outcomes matrices).
Comparison of the DDP to Love, McKean, and Gathercoal’s Levels of Maturation for
Digital Portfolios
Love, McKean, and Gathercoal’s research on levels of maturation of digital
portfolios was used to analyze the maturation level of the DDP. This analysis could
provide additional perspectives for looking at the criteria for the various levels of
maturation, especially as they apply to an institution-wide, required digital portfolio
(DDP).
To compare the DDP to Love, McKean, and Gathercoal’s five levels of maturation of
digital portfolios, the criteria set forth by the authors were used (statements regarding
system structure and function). Table 72 lists the five levels of maturation, the authors’
statement regarding system structure and function for each level, and a description of
how Alverno’s DDP meets these statements.
The authors determined their five levels of maturation by analyzing and
categorizing eight physical and theoretical qualities they believe are inherent in the
portfolio/webfolio processes and applications. These eight qualities include:
1. Type of portfolio/webfolio – working or showcase
2. Organization of the portfolio/webfolio
3. Type of student artifact in the portfolio/webfolio
4. Presence and capture of feedback and assessment based on standards
5. Nature of the portfolio/webfolio content – static or dynamic and evolving
6. Heuristic processes involved in developing the portfolio/webfolio
7. Context provided for each item in the portfolio/webfolio
8. Delivery mode for the portfolio/webfolio (Love, McKean, and Gathercoal, 2004, p. 25)

Table 72
Comparison of the DDP to Love, McKean, and Gathercoal’s Levels of Maturation for Digital Portfolios

Level 1: Scrapbook (hard-copy, eportfolio, or webfolio)
Statement regarding system structure and function: Students have no schema that guides the organization and artifact selection. A portfolio is really just a scrapbook of assignments completed in courses or awards received along the way.
Alverno’s DDP: Alverno’s DDP is not just a scrapbook. There is a specific schema (key performances) that guides the organization of the learning artifacts.

Level 2: Curriculum Vitae (hard-copy, eportfolio, or webfolio)
Statement regarding system structure and function: Student work is guided and arranged by educator-, department-, or institution-determined curriculum requirements or standards and institution-wide “student life” contributions.
Alverno’s DDP: Student work is arranged by institutional abilities, Advanced Outcomes of major and support (minor) areas, Wisconsin Educational Standards, and/or Wisconsin Content Guidelines.

Level 3: Curriculum Collaboration (webfolio)
Statement regarding system structure and function: The student can contribute to the content structure within the departmental and program curricular framework or “student life” institutional showcase of achievements. The portfolio is a working and a showcase portfolio.
Alverno’s DDP: Alverno’s DDP is both a working and a showcase portfolio. Students can elect to “pull off” selected key performances to form other portfolios. The DDP also includes areas (My Resources) and processes (Independently Learning Experiences) that allow students to make their own entries connected to curricular or extra-curricular activities.

Level 4: Mentoring Leading to Mastery (webfolio)
Statement regarding system structure and function: Students can redeem their work multiple times based on feedback from a variety of interested parties: educators, mentors, administrators, parent/caregiver(s), employers, and recruiters.
Alverno’s DDP: Alverno’s DDP is an integrated system of multiple performances that include assignments, learning resources, student work, and feedback. The DDP has the ability to include a variety of media, including audio and video. Students must complete a self assessment for each key performance, providing an emphasis on reflection. There are also selected times in the student’s curriculum where they are asked to reflect on their prior learning, assess strengths and challenges, and create learning plans for the future. The DDP does not provide the ability for students to control who can view their portfolio, because all faculty can view all student portfolios. Students do have the ability to control anyone else’s access to their portfolio.

Level 5: Authentic Evidence as the Authoritative Evidence (webfolio)
Statement regarding system structure and function: Work-sample assessment is linked to standards, program goals, and other descriptors like higher-order thinking taxonomies, and this data is retrieved for analysis at the individual, class, program, or institutional level.
Alverno’s DDP: The DDP is an integrated system of assignments, assessments, learning resources, student work, and feedback that is linked (connected) to state educational standards for pre-service teachers, institutional standards (eight abilities), and program standards (Advanced Outcomes of majors and supports), and includes multiple performances.
In addition to these eight qualities, they also considered six value-oriented issues: value
to the student, value to the employer, value to the educator, value to the educational
institution, potential for contributing to digital equity within the educational institution,
and expense involved in developing the portfolio/webfolio (Love, McKean, and
Gathercoal, 2004, p. 26).
Each of the descriptions of the levels of maturation builds on the previous level.
For example, Level 5 assumes the student can redeem their work multiple times (from
Level 4) as well as having work sample assessment linked to standards. Table 73 lists
each of the eight physical/theoretical qualities and the six value-oriented issues with a
summary description for Level 5 (Authentic Evidence as Authoritative Evidence) to
provide additional data on the comparison of the DDP to the five levels of maturation.
The last column in the table describes characteristics of the DDP that relate to each of
these qualities and issues.
Table 73
Comparison of DDP to Level 5 Maturation: Authentic Evidence as the Authoritative Evidence--Webfolio

Description
Level 5 Description: Integrated system of assignments, learning resources, student work, formative and summative feedback linked to national, state, and program standards; multiple opportunities to master curricular content.
Characteristics of DDP: The DDP is an integrated system of key performances that are linked to Alverno’s eight abilities and four levels, Advanced Outcomes of majors/minors, and/or state educational standards and content guidelines, and that include feedback.

Type
Level 5 Description: Working or showcase.
Characteristics of DDP: Developmental portfolio with the ability to create a variety of showcase portfolios.

Organization
Level 5 Description: Student work arranged by department and program curriculum initiatives and institution-wide “student life” contributions; also might include student contributions to content structure within department or program curricular framework or “student life” institutional showcase.
Characteristics of DDP: DDP is organized by matrices (see Description above); also includes learning inventories and can include entries related to co-curricular activities.

Student Artifact
Level 5 Description: Multimedia capabilities.
Characteristics of DDP: Multimedia capabilities; video portfolio for each student related to student development of speaking.

Feedback and Assessment
Level 5 Description: Formative and summative feedback, provided by teachers, mentors, administrators, parents/caregivers, employers, or recruiters; work-sample assessment linked to national, state, and program standards and retrieved for analysis at individual, class, program, or institutional level.
Characteristics of DDP: Completed key performances must contain feedback; feedback can be provided by teachers, mentors, or external assessors; key performances are organized and linked to matrices; can be retrieved by student, faculty, or institution for analysis.

Nature of Content
Level 5 Description: Dynamic content; may be revised based on instructor and/or mentor feedback until the content is “locked” by instructors.
Characteristics of DDP: Learning artifacts connected to key performances are locked after 24 hours; an area of the DDP is available for students to upload additional resources that may be modified at any time.

Heuristic Process
Level 5 Description: Student-controlled process of reflection and critical thinking mediated by choices made in program, educator, and/or student life; student responses to course and program assignments, or constructed work samples within a particular curriculum; student control over what categories of people (all teachers, students, recruiters, and so on) can view each work sample; students maintain working and showcase portfolios with the same work samples but limit access of the “showcase audience” to the best and most relevant works.
Characteristics of DDP: Completed key performances must contain student self assessments; students are capable of controlling key performance connections to certain matrices; ability to upload work samples; students have control over creating a variety of showcase portfolios for different audiences.

Context
Level 5 Description: Provided by institution, program, educators, and students; includes information about the institution, faculty, program, specific syllabi and assignments, additional help, resources, assessment criteria, and student work sample; may include product description and work samples provided by student.
Characteristics of DDP: Key performances created by faculty or departments contain context in the form of a description and criteria for judgment; student work samples may be uploaded or required by faculty.

Delivery
Level 5 Description: Electronic – anywhere, any time.
Characteristics of DDP: Internet-based – anywhere, any time.

Student Value
Level 5 Description: High—enhanced communication involving multimedia messages among student, teacher, mentors, significant others, recruiters, employers; great potential for feedback, reflection, and self-appraisal within a heuristic process.
Characteristics of DDP: Data gathered in this study indicated students value the DDP as a source of reflection and feedback that is developmental; students express the need to use the DDP more frequently.

Employer Value
Level 5 Description: High—enhanced communication involving multimedia messages among student, teacher, institution, and employers; employer can view student’s showcase portfolio, with the benefit of contextual clues from institution, syllabi, assignments, help, resources, and assessment criteria.
Characteristics of DDP: Students can create a variety of showcase portfolios for employers that include the context of the key performances, criteria for judgments, and feedback on the quality of the work based on the criteria.

Educator Value
Level 5 Description: High—enhanced communication involving multimedia messages among student, teacher, mentors, significant others, recruiters, employers; educator can repeat instructional implementation by copying course content from one semester to the next, each time enriching the content through additional resources and new curricular initiatives; educators also can ascertain which students met or exceeded standards linked to specific assignments, using assessment data to assist course revision.
Characteristics of DDP: Faculty can clone key performances to refine and develop; DDP contains a reference section with resources on ability descriptions and Advanced Outcomes statements; specific feedback on key performances can detail the quality of a student’s performance, and key performances are always linked to one or more matrices.

Institutional Value
Level 5 Description: Moderate—enhanced communication involving multimedia messages among student, teacher, mentors, significant others, recruiters, employers; institution can repeat instructional implementation by copying course content from one instructor to the next, each time enriching the content through additional resources and new curricular initiatives; institution also can ascertain which students met or exceeded standards linked to specific assignments, using assessment data to assist program revision.
Characteristics of DDP: Because the DDP mirrors Alverno’s Ability-based educational philosophy, the value to the institution is high; DDP provides “snapshots” of student performances across time and throughout the curriculum; can provide a source of data for institutional research.

Digital Equity
Level 5 Description: Highly likely (if requirement for students).
Characteristics of DDP: Required by all undergraduate students.

Expense
Level 5 Description: Low.
Characteristics of DDP: High implementation expense, moderate expense to maintain depending on programming and enhancements made.
The DDP clearly meets the majority of the qualities and issues listed by the
authors. In the Type quality, the DDP is a developmental (working) portfolio that can
also be used to create separate showcase portfolios by downloading a selection of key
performances to a computer or CD. Under Nature of Content, learning artifacts that
are connected to key performances are “locked” after 24 hours. In the Context quality,
the creation of key performances requires context to be added in the form of a description
and explicit criteria. In addition, feedback from a faculty member or external assessor is
required to complete the key performance. In the Heuristic Process quality, the DDP
does not provide complete student control over who can view their DDP. Alverno
faculty can view a student’s DDP, but security measures prohibit anyone else from
viewing it. However, as stated earlier, Alverno students have the ability to download and
create selective portfolios and control who can view these downloaded portfolios.
The results from this study indicated that student and faculty value of the DDP was
high. Students valued the ease of access and the ability to check past work and current
feedback any time, any place. Faculty viewed the DDP as a tool they would like to use
more often; one that is useful to them. One difference between level 5 maturation and the
DDP concerned institutional value. Love, McKean, and Gathercoal list the value to the
institution as moderate at level 5. The DDP’s value to the institution is high, due to the
DDP mirroring Alverno’s Ability-Based learning philosophy. The DDP also differs from
level 5 maturation in the category of Expense. The authors list expense as low at level 5
because “students can assign and reassign access to a variety of constituencies; because
students can modify webfolio items, which are instantly updated for all to see; and
because there is no delivery cost to the student” (Love, McKean, and Gathercoal, 2004, p.
34). It seems that, by expense, the authors are referring to the expense for the owner (student)
rather than the expense to the institution. For example, they list the expense for level 1
Scrapbook and level 2 Curriculum Vitae as high, while levels 3 to 5 expenses are listed as
low. While there is no direct delivery cost to the student when using the DDP, there is a
cost to the institution to maintain and/or enhance the DDP.
Relationships of Results to Previous Research
Most research on digital portfolios focuses on explaining the various types and
categories, describing digital portfolio programs, and/or explaining implementation
strategies. This study went beyond describing the DDP and provided data on student
and faculty use of the DDP. There is also limited research data on student and faculty
perceptions of digital portfolios. This study expanded the research and demonstrated that
Alverno students and faculty perceived the DDP as an easy to use, useful tool. The study also
pointed out that while faculty perceived the DDP as a useful tool, they did not seem to
use it as often as they would like. In addition, this study provided insights into issues that
could inhibit faculty use of digital portfolios.
This study reinforced Zou’s findings that digital portfolios need to be reflective
learning tools. The findings of this study mirrored Zou’s results concerning the positive
attitudes of the majority of students towards the portfolio process. Although Zou’s study
focused on teacher education and on institution-wide digital portfolios, this study’s
results reinforced one of her premises -- learning portfolios seem to trigger student
interest and motivation.
This study expanded the initial research on the DDP completed by Alverno’s
Educational Research and Evaluation Department (ERE), significantly adding to the data
on DDP use. The results of this study expanded the quantitative data gathered by ERE on
student log-ons (Table 14), as well as providing new data on faculty log-ons. This study
has also added to the qualitative data gathered by ERE from the 2002 student interviews.
Six questions from the ERE student interviews were incorporated into the student
interviews used in this study.
In a document on Research and Evaluation Activities 2001-2002, Rickards described
student experiences as falling into three categories. Table 74 outlines these three
categories, ERE’s findings, and a description of related data from this study. The data
gathered from this study supported and extended ERE’s findings. For example, ERE
described the Introductory category as being guided by faculty or staff members who
work closely with the student and directs procedures, usually occurring at the beginning
level. Data from this study indicated that tasks such as logging onto the DDP, exploring
sections, preparing/uploading self assessments, and reading feedback are completed by
students with limited faculty or staff direction. Students perceived the DDP was
relatively easy to use for these basic tasks. However, the results of this study found that
students wanted additional training on the variety of other features available on the DDP.
One pattern of student responses in this study was the need to use the DDP more
frequently and consistently throughout their curriculum. The results of this study
indicated that students did not really perceive a difference between the Introductory and
Supported Use categories. Students used the DDP independent of course time to
complete key performances assigned by their instructor. Rather than describing the need
for direct faculty or staff guidance, students described the need for additional training and
an increase in frequency of use of the DDP.
Table 74
Comparison of ERE’s Student Experience Categories

Introductory
ERE’s 2002 Findings: Tasks: logging on, exploring sections, preparing and uploading self assessments, and reading feedback. Guidance: guided by a faculty or staff member who works closely with students. Occurrence: entry to college and in the first few semesters; can occur at later points if faculty are introducing specialized applications.
Results of Study: Students did not readily differentiate introductory use from supported use. They did not mention logging onto the DDP as a specific task, but frequently referred to uploading self assessments and work and reading feedback. Students seemed to readily understand the basic procedures of uploading self assessments and work without close guidance and described the DDP as easy to use.

Supported Use
ERE’s 2002 Findings: Tasks: linked to particular activities within a course, directed by faculty, but occurring independent of course time. Guidance: not guided directly by faculty or staff. Occurrence: some department uses (English Dept. reading list), but primarily in connection with outside-of-class assessments (AC 301 Mid Program Review and GEC 300).
Results of Study: Students described using the DDP in beginning and some advanced course work, but described limited intermediate use, except for outside-of-class assessments (AC 301 and AC 260 Mid Program Portfolio Self Assessment). Students expressed the need to use the DDP more frequently and consistently to build their portfolio as well as to increase their knowledge of the DDP.

Student Constructing and Creating Own Use
ERE’s 2002 Findings: Tasks: determined by students, including storage strategies, and developed by students’ own patterns and applications. Guidance: no guidance by faculty or staff; strategies and methods developed by the student. Occurrence: determined by the student.
Results of Study: A number of students described creating their own uses for the DDP, including reviewing past performances and checking past feedback as they prepare for a new learning activity or assessment. A few students described accessing the Reference area to locate criteria sheets, descriptions of abilities, and/or Advanced Outcomes. Students expressed a need to learn more about the variety of features of the DDP, as well as to use the DDP more frequently and consistently.
Data from ERE’s 2002 interviews included several suggestions for improving the
DDP. Most of the responses dealt with simplifying functions (especially uploading),
better support from faculty (including using the DDP more often), and providing
opportunities for revision (the ability to remove and revise documents). Two of these
suggestions were addressed in the design of version 2.0 of the DDP: simplifying
uploading and providing the opportunity to revise or correct documents (for 24 hours).
Students continued to describe wanting to use the DDP more often. The results of the
study in this dissertation included a number of student and faculty suggestions on
enhancing the use of the DDP and improving the DDP program, as well as specific
suggestions for student and faculty training and development.
A significant contribution of this study is that it provided a different
perspective on how to look at digital portfolios. Barrett’s research describes “portfolios
used for assessment of learning” (the purpose of the portfolio prescribed by the
institution) and “portfolios that support assessment for learning” (the purpose of the
portfolio agreed upon with the learner) (Barrett, 2005, p. 18). This concept is mirrored by
other authors (Lorenzo & Ittelson, 2005; Wilkerson, 2003), who refer to portfolios, other
than showcase portfolios, as “products” that are evaluated based on some type of criteria
or rubric. This research study offers a different perspective: the concept of portfolios as
learning. This perspective focuses on the digital portfolio as a tool for students to analyze
their own patterns of learning and their learning process, a tool that is integrated into the
curriculum rather than a separate “assignment” to be completed or a “product” to be
evaluated. With the inclusion of feedback on portfolio entries, portfolios as learning
become snapshots of student learning performances throughout the entire curriculum.
The portfolio as learning can be used by students and faculty to set educational goals and
analyze learning progress, and it provides specific points for students to reflect on and
evaluate their learning.
Conclusions
This study provided empirical evidence that students and faculty were using the
DDP and perceived it as an easy-to-use, useful tool that can meet the goal of providing a
developmental record of student learning and self assessment in order to analyze learning
progress. Students met the institutional goal of completing two key performances during
the spring 2005 semester. Student and faculty perceptions of the most useful features of
the DDP mirrored the processes they used to complete key performances. Students and
faculty did not perceive the additional features of the DDP as useful (My Resources, the
Reference area, the Help Menu, and attaching a key performance to a matrix). This study
highlighted the importance of the institution developing training programs that go beyond
basic use of the DDP and that encourage and train students and faculty to use the DDP to
its full potential.
Key performances completed during spring 2005 provided opportunities for
students to demonstrate their development of the eight abilities and four levels of
Alverno’s ability-based curriculum. However, the number of active key performances
connected to the Valuing, Social Interaction, Effective Citizenship, Global Perspectives,
and Aesthetic Engagement abilities could be increased to provide additional opportunities
for students to demonstrate these abilities.
With respect to key performance characteristics, a majority of active key
performances were not being used. This issue should be explored and addressed to ensure
that active key performances are actually used. In addition, more of Alverno’s discipline
departments need to create and use key performances. This study also found that the
number of key performances connected to Advanced Outcomes in majors should be
increased.
In addition, the results of this study suggested increasing the use of the DDP by creating
more key performances that are used consistently throughout the curriculum. This would
not only increase the use of the DDP but also provide students with a richer picture of
their learning progress.
This researcher believes an important outcome of this study is that it provided a
different perspective for thinking about digital portfolios. Rather than being viewed as a
“product” that needs to be assessed or evaluated, a digital portfolio can be viewed from
the perspective of portfolios as learning. The process of adding student self
assessments and faculty feedback to the portfolio, as well as analyzing learning
development, becomes the focus of the portfolio, rather than how well the portfolio meets
certain criteria or rubrics. This does not diminish the need for students to create separate
portfolios that can be assessed or evaluated in a variety of discipline programs
(for example, teacher education). Using student portfolios as learning could provide the
basis for creating additional portfolios that then could be used to address the variety of
purposes of student portfolios (showcase, learning, assessment, and program evaluation).
Implications for Practice
Results of this study revealed that students wanted to use the DDP more often and
that faculty believed they should use the DDP more often with their students. Faculty
interviews provided several examples of discipline department plans for using the DDP.
These examples could be communicated to other departments, which could be
encouraged to create and implement their own plans to ensure consistent use of the DDP
for all students.
A comprehensive training and development plan for the DDP needs to be created
to ensure that students and faculty understand the purpose and features of the DDP. A
central focus of this training plan should be to ensure that students and faculty understand
the basic purpose of the DDP and how they can use it to provide a developmental picture
of student learning progress. For students, this means the development of a
comprehensive training plan that spans the entire curriculum. This plan would go beyond
“how to” complete a key performance; it would provide training on analyzing patterns of
learning, incorporate use of the Reference and My Resources areas, and include rationales
for students to form their own connections to various DDP matrices.
Faculty training needs to be expanded to provide models of DDP use that
integrate teaching and learning. Faculty interviews provided some excellent models of
this integration, which could be shared with all faculty. Additional models need to be
developed and provided as part of a comprehensive training and development plan.
This study reinforces the need to continue to collect data on the use of the DDP,
as well as to continue to explore student and faculty perceptions concerning the DDP.
Continuous research also needs to be done to monitor consistent use of the DDP
throughout students’ educational experience at Alverno College.
Limitations of Study
This study was limited to undergraduate student and faculty use and perceptions
of Alverno College’s Diagnostic Digital Portfolio. Data from the DDP relational database
and the student and faculty surveys were gathered only during the spring 2005 semester;
interview data were gathered during the following semester, fall 2005. The fact that the
data from the DDP relational database and the surveys came from only one semester
could affect the results of the study. The interviews completed during the following
semester, originally scheduled to distance the participants from their surveys, could also
have affected the results.
Data from the DDP relational database could have included students who were
not undergraduates, due to the absence of data in some student program fields
(166 records contained blank program fields). However, given the total number of
student records analyzed (1,893), this is a limited problem.
There could be limitations concerning the students who participated in the survey.
Students could have been absent when the survey was given, and the number of advanced
students who participated (61) was smaller than the number of beginning (172) and
intermediate (91) students. There were also limitations concerning the faculty who
participated in the survey. The faculty survey was completed during the May all-college
institute; typically, only full-time faculty attend the institute, so the results of this study
reflect the perceptions of full-time faculty. Additional research should be done
concerning part-time faculty use and perceptions of the DDP.
Participation in the interviews was self-selected. Students or faculty may have
had a bias, either for or against the DDP, that influenced their decision to be interviewed.
In addition, only two advanced students agreed to be interviewed. These two limitations
could skew the data concerning student perceptions.
The results of the study may not be generalizable to other digital portfolio
programs. However, despite these limitations, the process used in the program evaluation
and the subsequent results may be helpful to other schools. For example, the results of
this study could provide a model for program evaluation of other digital portfolio
programs; aspects of the data gathering techniques could be applicable to other digital
portfolio programs; and though the study focused on Alverno’s Diagnostic Digital
Portfolio, its results add to the body of research on digital portfolios.
Future Research Possibilities
This study underscored the importance of continuing research to track the use of
the DDP, including student and faculty log-ons, completed key performances, and key
performance characteristics. In addition, research should include continuing, consistent
gathering of data on student and faculty perceptions of the DDP.
The results of this study indicated the need to explore how the DDP can
contribute to institutional research on Alverno’s educational practices and philosophy.
For example, can the DDP be used to document student development of self assessment
throughout the curriculum?
This study provides the foundation for additional research on the impact of the
DDP on student learning. Now that the institution has empirical data on student and
faculty use and perceptions of the DDP, additional research should focus on the impact
of the DDP on teaching, learning, and assessment.
Bibliography
Academic Affairs. (2005). Alverno faculty report. Internal Document. Alverno College.
Academic Services. (2005). Enrollment and FTE report. Internal Document. Alverno
College.
Ahn, J. (2004, April). Electronic portfolios: Blending technology, accountability &
assessment [Electronic version]. T.H.E. Journal Online.
Allen, Z., & Rickards, W. H. (2000). The social interaction assessment: A DDP-based
analysis of feedback and self assessment. Unpublished manuscript. (Available
from Alverno College).
Alverno College (2000). Alverno College Ability-Based Learning program. Milwaukee,
WI: Alverno College Institute.
Alverno College. (2001). Final report to Atlantic Philanthropic Service Co. Unpublished
manuscript. (Available from Alverno College).
Alverno College. (2002). Ability based learning program - The English major
[Brochure]. Milwaukee, WI: Alverno College Institute.
Alverno College Faculty. (1994). Student Assessment-as-Learning at Alverno College.
Milwaukee, WI: Alverno College Institute.
Alverno College Faculty. (2000). G. Loacker (Ed.), Self assessment at Alverno College.
Milwaukee, WI: Alverno College Institute.
Alverno College's Diagnostic Digital Portfolio. (2004). Retrieved January 2004, from
http://ddp.alverno.edu
American Association for Higher Education. Retrieved December 27, 2004, from
http://www.aahe.org/ElectronicPortfolios/index.htm#NCEL
Avraamidou, L., & Zembal-Saul, C. (2002, Fall). Making the case for the use of
web-based portfolios in support of learning to teach. Journal of Interactive Online
Learning, 1(2). Retrieved December 29, 2004, from www.ncolr.org/jiol/archives/2002/2/01
Barrett, H. (1994). Teacher-supported portfolio assessment. The Computing Teacher,
3/1994. Retrieved December 29, 2004, from
http://electronicportfolios.com/portfolios/compteach0394.html
Barrett, H. (2001). Electronic portfolios - A chapter in Educational Technology; An
Encyclopedia to be published by ABC-CLIO. Retrieved December 29, 2004, from
http://www.electronicportfolios.com/portfolios/encyclopediaentry.htm
Barrett, H. (2005). White paper: Researching electronic portfolios and learner
engagement. Retrieved October 28, 2005 from
http://www.taskstream.com/reflect/whitepaper.pdf
Batson, T. (2002, December 1). The electronic portfolio boom: What's it all about?
Syllabus. Retrieved December 29, 2004, from http://www.campustechnology.com/article.asp?id=6984
Billings, D., & Kowalski, K. (2005). Learning portfolios. The Journal of Continuing
Education in Nursing 36(4), 149-151.
Cambridge, B., & Yancy, K. (2001). Electronic portfolios: Emerging practices in student,
faculty, and institutional learning. Washington, DC: American Association for
Higher Education.
Carney, J. (2004). Setting an agenda for electronic portfolio research: A framework for
evaluating portfolio literature. Retrieved November 2, 2005, from
http://it.wce.wwu.edu/carney/Presentations/AERA04/AERAresearchlit.pdf
Chapell, D. S., & Schermerhorn, J. R., Jr. (1999). Using electronic student portfolios in
management education: A stakeholder perspective. Journal of Management
Education, 23(6), 651-662.
Chen, H., & Mazow, C. (2002, October 28). Electronic learning portfolios and student
affairs. Retrieved December 29, 2004, from
www.naspa.org/netresults/PrinterFriendly.cfm?825
Creswell, J. (1998). Qualitative inquiry and research design. California: SAGE
Publications, Inc.
DDP Design Team. (2003). [Notes from meeting, May 13, 2003]. Unpublished raw data.
(Available from Alverno College).
Diez, M. (1994, October). The portfolio: Sonnet, mirror and map. Paper presented at the
Conference on Linking Liberal Arts and Teacher Education: Encouraging
Reflection through Portfolios, San Diego, CA.
Educational Research and Evaluation Department, Alverno College (2003, February 23).
DDP studies: Summary from ERE meeting. Presented at ERE Meeting.
Educational Research and Evaluation Department. (2002). DDP interviews 2002 data.
Unpublished manuscript. (Available from Alverno College).
Educational Research and Evaluation Department. (2004). The context for learning
inquiries in DDP and similar portfolio environments: Seven propositions for an
unfinished tool. Unpublished manuscript. (Available from Alverno College)
Ehley, L. (2004, April). Student use of the Diagnostic Digital Portfolio. Paper submitted
as part of doctoral work.
ePort Consortium. (2003, November 3). Electronic portfolio white paper (Version
1.0). Retrieved December 28, 2004, from
http://www.eportconsortium.org/Uploads/whitepaperV1_0.pdf
Galloway, J. (2001). Electronic portfolios (EP): A "how to" guide. Retrieved December
28, 2004, from jerrygalloway.com/pro/ep.ppr.pdf
Gathercoal, P., Bryde, B. B., Mahler, J., Love, D. O., & McKean, G. (2002, April).
Preservice teacher standards and the MAGNETIC CONNECTIONS electronic
portfolio. Paper presented at meeting of the American Educational Research
Association, New Orleans, Louisiana.
Georgi, D., & Crowe, J. (1998). Digital portfolios: A confluence of portfolio assessment
and technology. Teacher Education Quarterly. Retrieved November 7, 2005 from
http://www.csubak.edu/~dgeorgi/projects/digital.thm
Gibson, D., & Barrett, H. (2003). Directions in electronic portfolio development.
Contemporary Issues in Technology and Teacher Education, 2(4), 559-576.
Retrieved December 28, 2004, from
http://www.citejournal.org/vol2/iss4/general/article3.cfm
Hamp-Lyons, L., & Condon, W. (1998). Assessing the portfolio: Principles for practice,
theory & research. Cresskill, NJ: Hampton Press.
Hartmann, C. (2004). Using teacher portfolios to enrich the methods course experiences
of prospective mathematics teachers. School Science and Mathematics, 104 (8),
392-408.
Herbert, E. A. (1998). Lessons learned about student portfolios. Phi Delta Kappan, 79(8),
583-591.
Herman, J.L., & Winters, L. (1994). Portfolio research: A slim collection. Educational
Leadership, 52 (2), 48-55.
Hill, D. M. (2002, February). Electronic portfolios: Teacher candidate development and
assessment. Paper presented at the American Association of Colleges for Teacher
Education Conference, New York, NY.
Ittelson, J. C. (2001). Building an E-dentity for each student. EDUCAUSE Quarterly, 4,
43-45.
Jafari, A. (2004). The "sticky" ePortfolio system: Tackling challenges and identifying
attributes. EDUCAUSE Review. Retrieved July 20, 2004, from
http://www.educause.edu/pub/er/erm04/erm0442.asp
Jones, J. (1994). New directions for adult and continuing education. In R. Hiemstra & R.
G. Brockett (Eds.), Overcoming resistance to self-direction in adult learning. San
Francisco, CA: Jossey-Bass Publishers. Retrieved January 9, 2006, from http://wwwdistance.syr.edu/ndacesdch3.html
Lankers, M. (1998, April). Portfolios: A new wave in assessment. T.H.E. Journal Online.
Retrieved December 28, 2004, from
www.thejournal.com/magazine/vault/articleprintversion.cfm?aid=3380
Loacker, G., Cromwell, L., & O’Brien, K. (1985). Assessment in American higher
education: Issues and contexts. In C. Adelman (Ed.), Assessment in higher
education: To serve the learner (pp. 47 – 62). Washington DC: Office of
Educational Research and Improvement, U. S. Department of Education.
Loacker, G., & Rogers, G. (2005). Assessment at Alverno College: Student, program,
institutional. Milwaukee, WI: Alverno College Institute.
Lorenzo, G., & Ittelson, J. (2005). D. Oblinger (Ed.), An overview of E-portfolios.
Retrieved October, 2005, from
http://www.educause.edu/ir/library/pdf/ELI3002.pdf
Love, D., McKean, G., & Gathercoal, P. (2004). Portfolios to Webfolios and beyond:
Levels of maturation. EDUCAUSE Quarterly, 25(2). Retrieved December 27,
2004, from http://www.educause.edu/ir/library/pdf/eqm0224.pdf
Lyons, N. (Ed.). (1998). With portfolio in hand: Validating the new teacher
professionalism. New York: Teachers College Press.
Mathews, J. (2004). Teachers struggle for depth despite tests. washingtonpost.com, July
6, 2004. Retrieved July 20, 2004, from http://www.washingtonpost.com/wpdyn/articles/A30980-2004Jul6.html
McDermott, K. (2003). E-portfolios: More than just resumes. Dartmouth News. Retrieved
December 28, 2004, from http://www.dartmouth.edu/~news/releases/2003/april/041803.html
Miles, M., & Huberman, A. (1994). Qualitative data analysis. California: Sage
Publications, Inc.
Mullen, L., Bauer, W. I., & Newbold, W. W. (2001). Developing a university-wide
electronic portfolio system for teacher education. Retrieved December 29, 2004, from
english.ttu.edu/kairos/6.2/assessment/mullenbauernewbold/main.htm
Murphy, P. (2003, February). E-portfolios: Collections of student work move from
paper to pixels. Teaching, Learning and Technology Center, University of
California. Retrieved December 28, 2004, from www.uctltc.org/news/2003/02/feature
Newcomer, K. E., Hatry, H. P., & Wholey, J. S. (2004). Meeting the need for practical
evaluation approaches: An introduction. In J. Wholey, H. Hatry, K. Newcomer
(Eds.), Handbook of practical program evaluation (pp. xxxiii – xliv). San
Francisco: Jossey-Bass.
Owen, J., & Rogers, P. (1999). Program evaluation forms and approaches. London:
SAGE Publications Ltd.
Paulson, F. L., Paulson, P. R., & Meyer, C. A. (1991). What makes a portfolio a
portfolio? Educational Leadership, 48(5), 60-63.
Paulson, F. L., & Paulson, P. R. (1994). A guide for judging portfolios. Measurement and
Experimental Research Program (Reports Evaluative/Feasibility No. 142).
Portland, OR.
Product roundup: An electronic portfolio sampler. (2003). Syllabus, 38-39.
Rickards, W. H. (2002, January). The DDP: Research and evaluation activities fall 2001.
Paper Presented at the meeting of the Educational Research and Evaluation
Committee, Alverno College.
Rickards, W. H. (2002). The DDP: Research and evaluation activities, 2001 - 2002.
Unpublished manuscript. (Available from Alverno College).
Rickards, W. H. (2002). Observations of students using the DDP for a final course self
assessment in AH 150. Unpublished manuscript. (Available from Alverno
College).
Rogers, G., & Reisetter Hart, J. (2001). Observations of students using the DDP for a
final course self assessment in CM 110. Unpublished manuscript. (Available from
Alverno College).
Santos, J. (1999, April). Cronbach’s Alpha: A tool for assessing the reliability of scales.
Journal of Extension, 37(2). Retrieved February 7, 2006, from
http://www.joe.org/joe/1999april/tt3.html
Siemens, G. (2004, December 16). ePortfolios. Retrieved December 29, 2004, from
www.elearnspace.org/Articles/eportfolios.htm
Simon, M., & Forgette-Giroux, R. (2002). Impact of a content selection framework on
portfolio assessment at the classroom level. Assessment in Education, 7(1).
Smith, M. F. (1989). Evaluability assessment: A practical approach. Norwell, MA: Kluwer.
Stiggins, R. J. (2001). The unfulfilled promise of classroom assessment. Educational
Measurement: Issues and Practice, 20(3), 5-16.
Strudler, N., & Wetzel, K. (2005). The diffusion of electronic portfolios in teacher
education: Issues of initiation and implementation. Journal of Research on
Technology in Education, 37(4), 411-434.
Suskie, L. (1996). Questionnaire survey research. Tallahassee, FL: Association for
Institutional Research.
Tosh, D., & Werdmuller, B. (2004). ePortfolios and weblogs: One vision for
ePortfolio development. Retrieved January 5, 2005, from
http://www.eradc.org/papers/ePortfolio_Weblog.pdf
Treuer, P., & Jenson, J. (2003). Electronic portfolios need standards to thrive.
EDUCAUSE Quarterly, 26(2), 34-42.
Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (Eds.). (2004). Handbook of practical
program evaluation. San Francisco, CA: Jossey-Bass.
Wiedmer, L. T. (1998). Digital portfolios. Phi Delta Kappan, 79(8), 586-589.
Wilkerson, J. R., & Lang, W. S. (2003, December 3). Portfolios, the pied piper of teacher
certification assessments: Legal and psychometric issues. Education Policy
Analysis Archives, 11(45). Retrieved October 29, 2005, from
http://epaa.asu.edu/epaa/v11n45
Yancy, K. B. (1992). Portfolios in the writing classroom: An introduction. Urbana, IL:
National Council of Teachers of English.
Young, J. (2002, February 21). Creating online portfolios can help students see ‘big
picture,’ colleges say. The Chronicle of Higher Education. Retrieved December
28, 2004, from http://chronicle.com/chronicle/v48/4824guide.htm
Zeichner, K., & Wray, S. (2001). The teaching portfolio in US teacher education
programs: what we know and what we need to know. Teaching and Teacher
Education, 17, 613-621.
Zou, M. (2002). Organizing instructional practice around the assessment portfolio: The
gains and losses. Missouri: Southeastern Missouri State University. (ERIC
Document Reproduction Services No. ED 469 469).