Portfolio #2 – Integrated Research Statement – Evan Baum
This research statement for my second portfolio submission has five sections. The first
two sections (academic and professional experiences) will summarize and synthesize what I
have done and learned since my first portfolio review (May 2012). The third section, building
upon the first two sections, will articulate a proposed dissertation topic. Section four will outline
three possible methodologies for pursuing my proposed dissertation topic. The final section will
present an initial review of the literature on the topic.
Since Portfolio 1 – Academic Experiences
Including those classes that were in progress at the time of my first portfolio review, I
have completed 20 credits (five traditional courses, one independent study, and one internship)
since May 2012 (see program of study and internship evaluation). Four final papers from these
credits have been added to this website, the most recent of which served as a pilot study for my
proposed dissertation topic described below.
Beginning with the Ways of Knowing course, my recent academic experiences have
reaffirmed my interest in pursuing research questions that are inherently more qualitative in
nature (the paper posted from EDRS 810 being an isolated exception). My interests continue to
revolve around researching processes, contexts, and understandings within the higher education
environment, all of which are best explored through qualitative methodologies. However, my
upcoming Mixed Methods course to fulfill the advanced research requirement will give me
another perspective that will inform my thinking heading into the final portfolio.
My independent study and internship experiences (both for the Advanced Organizational
Behavior in Higher Education course) confirmed my passion for identifying as an organizational
studies thinker. While organizational studies might define my scholarly interest at a macro level,
I have been consistently interested in the topics of organizational adaptation, effectiveness,
change, and innovation in higher education as more narrowly defined areas of curiosity under the
broader umbrella of organizational studies.
As examples, the study I proposed for EDRS 810 would have explored the link between
interdisciplinary programs (as an organizational innovation strategy) and critical thinking
outcomes among first year undergraduate students. In CTCH 792, the grounded theory group
research project that was the focus of the class explored the transition experience of recent
military veterans, a project that I framed as a study looking at how colleges and universities need
to adapt as organizations to meet the needs of a growing population of students. Most recently, in
EDRS 812, I did a qualitative project looking at how four mid-level student affairs professionals
at one large, public research university understood their responsibility for demonstrating student
learning outcomes. This study was driven by my interest in exploring how change and
innovation occurring in higher education and student affairs related to the accountability
movement are influencing the work of individual practitioners.
Looking back at the motivations behind and results of these projects and courses, I now
have a greater sense of clarity and confidence in being able to articulate a more concise and
coherent research identity (or so I think).
Since Portfolio 1 – Professional Experiences
My professional responsibilities have also evolved since my first portfolio submission in
May 2012. Of greatest relevance is the addition of strategic planning responsibilities to my
position within University Life. A major aspect of this strategic planning process is identifying
how University Life can systematically demonstrate its impact on achieving (internally defined)
student learning outcomes. The larger process and this intended outcome are both clear examples
of organizational adaptation, effectiveness, innovation, and change and, considered in light of my
academic experiences since my first portfolio review, have helped me focus on my proposed
dissertation topic discussed in the next section of this paper.
Although not explicitly related to my proposed dissertation topic, it is worth mentioning
other professional experiences I have had over the last 12-15 months. I recently joined the
Association for the Study of Higher Education (ASHE) with the intention of attending the
conference in St. Louis in November 2013. The HEP program has agreed to pay my registration
fee for the special pre-conference seminar on higher education policy for currently enrolled
doctoral students. Beyond the desire to be more closely connected to the scholarly conversation
going on in the field, my interest in attending this seminar stems from my interest in
understanding the impact of the broader policy movement for accountability in higher education
from an organizational effectiveness and change perspective. In addition to ASHE, I attended the
Campus Labs World Tour in Philadelphia in June 2013 to learn more about the technology
George Mason is using for executing its data collection and assessment strategy within
University Life, and the Student Affairs Technology UnConference in Blacksburg in July 2013,
exploring a wider range of technology topics within student affairs. Lastly, I have submitted two
program proposals for NASPA 2014 in Baltimore. The first, in partnership with a fellow HEP
PhD student (Dave Farris), focuses on developing a coordinated plan for recovering from
campus tragedies, and is an extension of my day-to-day work in University Life. The second,
with a blogger from InsideHigherEd (Eric Stoller), is intended to be a reprised version of the
session I presented at NASPA 2008 in Seattle on the organizational consequences of searching
for best practices.
Proposed Dissertation Topic
My proposed dissertation topic comes from an integration of the academic and
professional experiences I have had since my first portfolio review in May 2012 and is an
extension of the pilot study I conducted in EDRS 812 during the summer of 2013. Specifically,
the topic I want to explore is how student affairs professionals make sense of their responsibility
for demonstrating student learning outcomes. The following section presents three alternative
methodologies that could be used to conduct such a study. The last section is a preliminary
literature review on the topic.
Possible Methodologies
While an array of quantitative, qualitative, and mixed research methodologies exists to
explore how student affairs professionals make sense of their responsibility for demonstrating
student learning outcomes, the topic inherently lends itself to a qualitative approach, which is
best suited for considering understandings, processes, and contexts. Below are three potential
qualitative methodology options I would consider for executing a study on this topic for my
dissertation. Before going into these options, however, let me make a few overarching comments
that apply across all three approaches.
First, my intention would be to target individuals at public institutions regardless of the
exact methodology I choose. My rationale is that the responsibility of student affairs practitioners
for demonstrating student learning outcomes is inevitably greater at a public institution than at a
private or for-profit institution (although I acknowledge that arguments could be made to the
contrary). I believe that the influence of calls for greater public accountability for the use of
state-provided funding, while potentially impacting the experience of student affairs
professionals at all institutions, has the most direct applicability within public colleges and
universities and is a relevant factor in my consideration of possible individuals and sites.
Second, I would intend to target student affairs professionals at institutions serving
10,000 or more undergraduate students (an arbitrary definition of mid-size or above). The
rationale for this limitation on my sampling is similar to the first. The responsibility of student
affairs practitioners for demonstrating student learning outcomes is arguably going to be most
vivid at institutions serving the largest numbers of students.
With these two broad contextual considerations in mind, I envision that my dissertation
could take one of three approaches to explore this topic: 1) a grounded theory study, 2) a
narrative inquiry, or 3) a hybrid case study/ethnography. My thinking on each of these three
options and the tradeoffs I foresee with each approach are below.
Option 1 – Grounded Theory
The aim of a grounded theory study is the creation of a theory that explains a process. By
exploring a process and experience from the perspective of those who live it, “grounded theory
methodology employs a systematic and structured set of procedures to build an inductively
derived theory grounded in the actual data and informed by the area under study” (Jones, Torres,
& Arminio, 2006, p. 42). The pilot project I did in EDRS 812 is an example of how this study
could be developed as a grounded theory. In that project, I interviewed four mid-level student
affairs professionals within the same large public research university about their experiences in
demonstrating student learning outcomes. For the sake of confidentiality, I selected participants
for the pilot study who were in different functional areas of student affairs (career services,
residence life, leadership programs, and student involvement), but that would not necessarily
need to be the case for my dissertation. I selected mid-level professionals within the organization
because I wanted to understand the topic for those who had recently transitioned into a new level
of responsibility. Mid-level professionals are likely those who will shoulder the majority of
responsibilities for demonstrating student learning outcomes within a student affairs division at a
large public university (with entry-level staff having greater responsibility over direct program
and service delivery and senior-level staff having greater responsibility over supervision and
management). Moreover, as my initial literature review below will show, the preponderance of
studies in student affairs on this topic focus on senior or entry-level staff, illustrating a
substantial gap in the field.
As a grounded theory study, I would look to do 15-25 one-on-one interviews with mid-level
student affairs professionals, using a semi-structured interview protocol focused on exploring the
experiences of each participant in fulfilling their responsibilities for demonstrating student
learning. I would exclude from participation anyone who is primarily responsible for assessment
in his or her office. My intention would be to explore the topic of demonstrating student learning
outcomes with those for whom it is one piece of their role, but not a primary one. My justification
for this exclusion is that mid-level professionals for whom assessment is a primary job
responsibility would offer little to advance an interesting understanding of the topic. The
individual participants could come from multiple institutions and
multiple functional areas, selected through a combination of purposeful, snowball, and
convenience sampling. Alternatively, I could choose to focus on mid-level student affairs
professionals in one functional area (all from within career services, for example), but still across
multiple institutions using the same sampling methodology. While I would conduct only one
interview per participant, I would plan to follow up with all participants after completing my
theoretical coding as a form of member checking.
There are two primary challenges that I can identify with adopting this approach.
Recruiting participants, despite the limited commitment on their part, will be a challenge,
especially if I pursue maximum variation across 15-25 different institutions. At the same time,
the more institutions represented among my interview participants, the more unique,
institution-specific factors may influence my participants’ experiences, which could present
challenges for data analysis and may leave me with only superficial (and uninteresting) findings.
Option 2 – Narrative Inquiry
A second option, narrative inquiry, in some ways presents the reverse tradeoffs of the grounded
theory approach described above. With a narrative approach, I would focus on the same topic,
still exploring the experiences of individual student affairs professionals (likely using the same
sampling criteria), but would go into much greater depth with a smaller number of participants.
Taking a narrative inquiry approach, I might have only 4 or 5 participants, but would perform
multiple interviews with them (perhaps one per month over four months), supplementing my
interviews with field observations, and asking my participants to keep a journal to further reflect
on their experiences outside of our interview conversations (an online journal with the request to
post something, perhaps in response to a particular question or prompt, every 1-2 weeks for four
months is something I envision).
In this scenario, I would almost certainly need to have all participants come from
different institutions so that participants would not influence one another during the study and to
ensure confidentiality, which would add another layer of logistical complexity, especially if field
observations are going to be a part of my data collection process. Additionally, the sustained
commitment asked of my research participants might require me to offer them some form of
compensation to justify their time with the project (although I could consider applying for grant
funding from ACPA and/or NASPA for such an expense). Furthermore, narrative inquiry is not a
methodology I have used to this point, and as such would require some additional up front
learning on my part in order to best understand its epistemological and practical foundations.
Option 3 – Hybrid Case Study/Ethnography
A third alternative for exploring the topic of how student affairs professionals make sense
of their responsibility for demonstrating student learning outcomes would be to do a hybrid case
study/ethnography, looking at one (or more) student affairs divisions (still at large public
institutions) to explore how those divisions have changed to fulfill this emerging priority. I
consider this a hybrid approach because the division (or divisions) would be bounded cases, but
the approach within each case would be ethnographic, examining a phenomenon as manifested
through a culture and its individuals.
The ideal setting for such a study would be to immerse myself (through a
combination of field observations, interviews, and other data collection methods) within a
division that is about to begin change efforts towards being able to demonstrate student learning
outcomes and observe the change process from its inception. However, this is likely an
unrealistic expectation, and consequently, I would probably find myself studying a division (or
divisions) of student affairs somewhere in the middle of its change process.
Pragmatically, this approach would seem to be the most time-intensive (as well as the
most open-ended, given that the data collection would be inextricably intertwined with a change
process that could go on indefinitely and over which I would have no real control). It would also
require me to have established a high level of trust with a number of key stakeholders across the
division (or divisions) in order to gain access, which would be difficult to do at all but my
current institution. However, this path would allow me to draw upon my existing knowledge of
organizational culture and organizational change, which could balance out some of the other
practical considerations (with the other two options above, I imagine needing to do a good
amount of exploration in the literature on topics I have only loosely explored.)
Initial Literature Review
The role and responsibility of student affairs professionals in the delivery of experiences
that foster learning and development of those enrolled at colleges and universities dates back to
the most seminal documents from the field. Published in 1937 and revised in 1949, The Student
Personnel Point of View (American Council on Education) articulates the contribution of extra-
and co-curricular experiences that supplement the student learning that occurs in the classroom.
Over the last 75 years, these foundational documents have been updated and reflected upon
(ACPA, 2012; ACPA, 1996; NASPA, 1997; NASPA, 1987), influencing the statements made in
Learning Reconsidered (ACPA & NASPA, 2004) and Learning Reconsidered 2 (ACPA &
NASPA, 2010). Throughout the revisions and updates, the responsibility of student affairs
professionals for fostering student learning and development outside of the classroom has
consistently remained a core principle of the field.
However, the responsibility for assessing co-curricular experiences offered by student
affairs professionals and demonstrating that student learning outcomes are being achieved is a
professional competency that has emerged over time. Pope and Reynolds (1997) articulated
assessment and evaluation as one of seven core competencies of student affairs professionals in a
widely influential publication that was a springboard for more recent documents describing
global professional competencies and professional standards in the field (ACPA & NASPA,
2010; CAS, 2010), assessment skills and knowledge standards (ACPA, 2006), and the role of
student affairs in accreditation (ACPA, 2013). Bresciani (2011c) argued that student affairs
professionals are naturally curious about the effectiveness of their efforts and want to inquire
about the outcomes of their work because of their passion for creating high-quality, holistic
student learning experiences. Schuh and Gansemer-Topf (2010) described this evolution,
observing that student affairs assessment and evaluation, “at least conceptually, has moved away
from evaluating students’ use of and participation in services and programs to measuring how
programs and experiences contribute to students’ learning” (p. 6).
Given the espoused importance of student learning in the field of student affairs, one
might assume that the supporting literature on the topic is robust. This assumption, regrettably,
would be at worst completely false, and at best selectively incomplete. Studies exploring the
experiences of student affairs professionals in fulfilling their responsibilities for demonstrating
student learning outcomes have been limited in both scope and number. Specifically, a review of
the literature illustrates three distinct areas of publication. First, studies exist that can be
classified as “how to” studies – those that explore the practice of implementing assessment
efforts to demonstrate student learning within the field of student affairs. Second, a number of
publications have been produced that can be grouped together as exploring global skill and
knowledge competencies of student affairs professionals, primarily through the perspectives of
senior student affairs officers and graduate preparation program faculty. Lastly, a third cluster of
studies looks at the experiences and competencies of entry-level professionals in student affairs.
The sections below explore the literature in each of these three clusters; the gaps that become
evident support the exploration of the questions posed by this study.
Cluster 1 – “How to…” Demonstrate Student Learning Outcomes Studies in Student Affairs
The writings of Upcraft and Schuh (1996) articulate the need for the first clustering of
studies: “Unfortunately, among many staff in student affairs, assessment is an unknown quantity
at best, or at worst, it is misguided and misused. It has been our experience that while everyone
in student affairs would agree that assessment is important, too often it is considered a low
priority and never conducted in any systematic, comprehensive way. And even if it is done, it is
often done poorly; as a result, it simply gathers dust on someone’s shelf, with little or no impact”
(p. 4). In a 2004 study, Doyle found, among 216 senior student affairs officers at small
colleges (500-3,000 students), “that assessment was one of the least well-practiced actions of
student affairs divisions” (p. 389), arguing that “student affairs divisions are much better at
building good relationships with students than managing their administrative responsibilities” (p.
388).
Consequently, researchers and practitioners have published a wealth of documents over
the last 15 years to show how student learning outcomes assessment (and general program
evaluation) in student affairs can be bolstered. Early studies sought to confirm that learning did
in fact happen outside of the classroom (Kuh, 1995), and that assessment was a platform for
student affairs professionals to collaborate with academic affairs (Banta & Kuh, 1998) despite
the inherent obstacles that accompany efforts to study student learning outcomes (Terenzini,
1989). Kuh (1995) did an exploratory qualitative study with graduating seniors to identify the
association between out-of-the-classroom experiences and student learning and development,
finding that students viewed these experiences as real-world laboratories. Banta and Kuh (1998)
argued that assessment efforts like these are “one of the few institutional activities in which
faculty and student affairs professionals can participate as equal partners” (p. 42), further
establishing the importance of learning outcomes assessments in student affairs.
More recent publications have sought to advance not simply that student affairs plays a
role in fostering student learning outcomes, but how best to do so, especially in light of
heightened calls for accountability in higher education (Bresciani, 2011a; Bresciani, 2011b;
Collins & Roberts, 2012; Manderino & Meents-DeCaigny, 2012; Rothenberg, 2011). One recent
study examined three student affairs divisions at large research institutions identified as having
“high-quality” assessment practices (Green, Jones, & Aloi, 2008). This study concluded that
support for learning assessment activities, particularly decentralized assessment within the
various functional areas of student affairs, coordinated by a director or a committee charged to do
so by the senior student affairs officer, was key to realizing success. Unfortunately, most of the
data collected to reach these conclusions came from the senior student affairs officer and the lead
assessment staff member in student affairs at each of the three institutions. Another recent study
looked at how student affairs divisions can build a culture of evidence at community colleges
(Oburn, 2005), stating that “by demonstrating that student affairs divisions offer quality programs
that contribute significantly to student access, learning, and success” (p. 32), student affairs
professionals can help support institutional effectiveness efforts. Seagraves and Dean (2010)
looked at accreditation efforts at three small colleges, finding that leadership from the senior
student affairs officer, an attitude of using assessment to improve programs and services, and a
collegial supportive atmosphere are also keys to developing a culture of assessment among
student affairs divisions. Lastly, Slager and Oaks (2013) describe the strategy of using
assessment coaches at one large research university to help “staff overcome barriers of
assessment such as lack of resources, lack of knowledge related to conducting a rigorous
assessment, or negative attitudes towards assessment” (p. 29).
Cluster 2 – Assessment and Evaluation as a Global Student Affairs Competency
Where the first cluster of literature on assessment in student affairs focuses on strategies
for doing assessment (either program evaluation or learning outcomes), a second cluster more
broadly considers assessment and evaluation as a skill set and knowledge base within the student
affairs profession. In a meta-analysis of 30 years of research on successful student affairs
administration, Lovell and Kosten (2000) found that 57% of studies on the subject included
research, evaluation, and assessment as a necessary skill for student affairs professionals. They
also write, “However, the level of sophistication required to demonstrate this effectiveness is
increasing. Assessing knowledge is becoming a common staple for today’s student affairs
administrators” (p. 567). An update to this study stated, “research, assessment, and evaluation
were found to be the most frequently mentioned items in the literature” (Herdlein, Reifler, &
Mrowka, 2013) as desired skills for student affairs professionals, with “55% of the articles
identifying research and assessment as an important knowledge area while 68% identified
research/assessment/evaluation as an important skill in the field” (p. 263). In yet another
meta-analysis on professionalism in student affairs, Carpenter and Stimpson (2007) found, “It may be
that, since scholarship and research are frequently not familiar tasks, they are not considered to
be as enjoyable or even as necessary, as, say, advising a student organization president or
planning a program, or any of the thousands of other tasks confronting busy student affairs
workers” (p. 272). Thus, while these skills may be valuable and necessary, those in the field may
perceive them as undesirable or unenjoyable.
In a 2004 study, Herdlein examined the perceptions of 50 senior student affairs officers
about graduate preparation programs. He found that only 16% of senior student affairs officers
considered graduates of student affairs administration programs to be proficient or above average
in assessment and research abilities. More recently, Hoffman and
Bresciani (2010) reviewed assessment-related skills, specifically student learning and
development outcomes, in 1,759 student affairs job postings from 2008, concluding “slightly
more than one in four (27.1%) of the positions posted required applicants to either demonstrate
competency in assessing student learning or to complete learning assessment duties as a part of
the job” (p. 508). They found that there were no differences in skills required from public to
private institutions or across institutions of different sizes, with multicultural services, new
student programs, and student activities having assessment skills most often included in job
requirements. They also stated, “requirements for assessment skills and duties were less
prevalent among entry-level positions that required less education and experience and more
prevalent among mid-level and senior-level jobs that carried greater requirements for the
education and experience of job applicants” (Hoffman & Bresciani, 2010, p. 507).
Taken together, these studies illuminate a number of interesting questions to be explored
by this proposed study. First, despite the importance of assessment, research, and evaluation of
student learning outcomes as a competency for student affairs professionals as evidenced by
multiple meta-analyses of the literature and surveys of senior leaders in the field, is this
responsibility truly perceived as undesirable or unenjoyable? Second, while assessment-related
skills are increasingly necessary for professionals, what is behind the perception that student
affairs graduate programs fail to develop competence of professionals in this area? Lastly, if this
skill set is more likely to be required of positions at the mid and senior-level, how do those
making a transition into positions at that level come to develop this knowledge base and skill set
if they are not responsible for doing so as a part of entry-level positions? This last question is
particularly relevant for framing my intended sampling criteria described above.
Cluster 3 – Experiences and Competencies of Entry-Level Student Affairs Professionals in the
Areas of Assessment and Evaluation
The final cluster of the literature on student affairs professionals and assessment
experiences and competencies focuses on the last question from the previous cluster of studies.
Specifically, a number of recent studies explore competencies and preparation of entry-level
professionals in the field of student affairs, many of which touch on assessment, evaluation, and
research as a skill or knowledge base. At the same time, interestingly, a recent book on becoming
socialized as an entry-level professional in student affairs fails to mention assessment or
evaluation as a component of the socialization process (Tull, Hirt, & Saunders, 2009).
Examining the perceptions of 104 senior and mid-level student affairs professionals about
competencies needed for entry-level professionals in the field, one study found that program
evaluation was ranked 25th out of 32 desired competencies (Burkard, Cole, Ott, & Stoflet, 2005).
A more recent study of senior student affairs officers and graduate preparation program faculty
found that a large gap exists between the desired and current levels of entry-level professionals’
competencies with assessment, concluding, “graduate preparation programs should also consider
placing greater emphasis on outcomes-based assessment within research and program evaluation
course sequences” (Dickerson et al., 2011, p. 476). Yet another study exploring the perceptions of
senior student affairs officers, mid-level managers, and program faculty found that faculty
members viewed administrative practices, organizational management, and change competencies
as less relevant for entry-level professionals than did senior and mid-level managers (Kuk, Cobb, &
Forrest, 2007).
Thus, while outcomes-based assessment, research, and evaluation skills may exist among
a long list of desired competencies of entry-level student affairs professionals, graduate programs
and graduate faculty seem to emphasize the development of this skill set less than practitioners.
Young and Janosik (2007) reported that graduates of CAS-compliant master’s programs reported
greater confidence in their abilities as new professionals, but that their lowest level of confidence was
in the area of research foundations. They concluded, “at least based on the responses in this study,
the curricula in master’s level preparation programs may not provide enough preparation in
assessment and research to help graduates play a meaningful role in this arena” (Young &
Janosik, 2007, p. 361). Another study surveyed over 1,200 new professionals to identify the
skills developed in their graduate programs and the extent to which these skills were used in their
first position (Waple, 2006). Out of 28 competencies learned through graduate programs, student
outcomes assessment was ranked 19th, program evaluation 20th, and assessment of student affairs
programs 21st (these competencies were ranked 24th, 17th, and 23rd respectively for perceived use
in entry-level positions) (Waple, 2006). A different survey of entry-level professionals and their
supervisors about the competencies that were developed through their graduate programs
concluded that recent entry-level staff members rated their abilities to understand quantitative
and qualitative research higher than their supervisors’ perceptions of their abilities to do so
(Cuyjet, Longwell-Grice, & Molina, 2009). Similarly, in a qualitative study of 90 new
professionals transitioning into their first job in student affairs, assessment and evaluation was
consistently described as one of the skills new professionals found themselves lacking
(Renn & Jessup-Anger, 2008).
Synthesis/Conclusion
Examined collectively, the literature reviewed above yields several themes that reinforce
the study I propose for my dissertation and the rationale behind some of my preliminary sampling
criteria, and that highlight noticeable gaps in the literature. Most
fundamentally, this literature review reaffirms that student affairs professionals have a
responsibility for demonstrating student learning outcomes. Moreover, this responsibility is
increasing, as the knowledge base and skills required for performing learning outcomes-based
assessment, research, and evaluation responsibilities gain greater importance within the field. At
the same time, there is an obvious disconnect regarding the perceived importance of this
knowledge base and skill set between graduate preparation program faculty and practitioners,
particularly for entry-level professionals. While not an explicit focus of this study, this finding
from the literature review does raise the question: if student affairs practitioners view
outcomes-based assessment, research, and evaluation knowledge and skills as increasingly important, but
graduate program faculty do not, where is this knowledge base to be acquired and how are these
important skills to be developed?
What is relevant to this study from this finding is that entry-level professionals view these
skills as comparatively less important and relevant in their entry-level positions and that they
are ill-equipped to execute assessment, research, and evaluation responsibilities should they need
to do so. If, as the literature review suggests, these responsibilities are more likely to show up in
mid- and senior-level positions in the field, and entry-level professionals do not have the
knowledge base or skill set to complete them successfully, how do mid- and senior-level
professionals effectively transition into higher-level positions with learning outcomes-based
assessment, research, and evaluation responsibilities? This question goes to the heart of my study,
my intended sampling criteria, and the need to explore how mid-level student affairs
professionals make sense of their responsibility for demonstrating student learning outcomes.
In general, there is a noticeable gap in the literature both around the experiences of
mid-level professionals and in exploring the experiences of individual practitioners doing student
learning outcomes assessment work. Research on the skills and competencies of student affairs
professionals seems to focus on the lived experiences of entry-level professionals and the
perceptions of senior and mid-level professionals, with little exploration of the lived experiences
of those at higher levels of responsibility. Additionally, writing that considers how outcomes-based
learning assessment in student affairs happens (does or should) almost always omits the
role of the individual practitioner. In taking a macro-level approach to studying outcomes
assessment in student affairs, primarily by considering organizational structures and overarching
cultural considerations that allow divisions of student affairs to effectively execute assessment
efforts, scholars miss out on the experiences of individual professionals. In effect, this gap
privileges by omission traditional assumptions about organizational effectiveness and change,
such as the assumption that senior leadership must show support for outcomes-based assessment
or that successful practices cannot happen without positions or committees to oversee them. While these
conclusions may be accurate, they miss a larger piece of the context, namely, the experiences of
student affairs professionals with responsibilities for demonstrating student learning outcomes
who are not the senior officer or the assessment director for the division. Consequently, I believe
there is much that a study like the one I am proposing could offer to the literature and to
practitioners.
References
American College Personnel Association. (2013). Accreditation and the role of the student
affairs educator. Washington, DC: Author.
American College Personnel Association. (2012). Reflections on the 75th anniversary of the
Student Personnel Point of View. Washington, DC: Author.
American College Personnel Association. (2006). ASK standards: Assessment, skills, and
knowledge content standards for student affairs practitioners and scholars. Washington,
DC: Author.
American College Personnel Association. (1996). The student learning imperative: Implications
for student affairs. Washington, DC: Author.
American College Personnel Association & National Association of Student Personnel
Administrators. (2010). Professional competency areas for student affairs practitioners.
Washington, DC: Authors.
American College Personnel Association & National Association of Student Personnel
Administrators. (2010). Learning reconsidered 2: A practical guide to implementing a
campus-wide focus on the student experience. Washington, DC: Authors.
American College Personnel Association & National Association of Student Personnel
Administrators. (2004). Learning reconsidered: A campus-wide focus on the student
experience. Washington, DC: Authors.
American Council on Education. (1949). The student personnel point of view. Washington, DC:
Author.
American Council on Education. (1937). The student personnel point of view. Washington, DC:
Author.
Banta, T.W., & Kuh, G.D. (1998). A missing link in assessment: Collaboration between
academic and student affairs professionals. Change, 30(2), 40-46.
Bresciani, M.J. (2011a). Assessment of student learning from a student affairs perspective: Part 1.
NASPA NetResults.
Bresciani, M.J. (2011b). Assessment of student learning from a student affairs perspective: Part 2.
NASPA NetResults.
Bresciani, M.J. (2011c). Making assessment meaningful: What new student affairs professionals
and those new to assessment need to know. Champaign, IL: National Institute for
Learning Outcomes Assessment.
Burkard, A., Cole, D.C., Ott, M., & Stoflet, T. (2005). Entry-level competencies of new student
affairs professionals: A delphi study. NASPA Journal, 42(3), 283-309.
Carpenter, S., & Stimpson, M.T. (2007). Professionalism, scholarly practice, and professional
development in student affairs. NASPA Journal, 44(2), 265-284.
Collins, K.M., & Roberts, D.M. (2012). Learning is not a sprint: Assessing and documenting
student leader learning in co-curricular involvement. Washington, DC: National
Association of Student Personnel Administrators.
Council for the Advancement of Standards in Higher Education (2010). CAS professional
standards for higher education. (7th ed.). Washington, DC: Author.
Cuyjet, M.J., Longwell-Grice, R., & Molina, E. (2009). Perceptions of new student affairs
professionals and their supervisors regarding the application of competencies learned in
preparation programs. Journal of College Student Development, 50(1), 104-119.
Dickerson, A.M., Hoffman, J.L., Anan, B.P., Brown, K.F., Vong, L.K., Bresciani, M.J., Monzon,
R., & Oyler, J. (2011). A comparison of senior student affairs officer and student affairs
preparatory program faculty expectations of entry-level professionals’ competencies.
Journal of Student Affairs Research and Practice, 48(4), 463-479.
Doyle, J. (2004). Student affairs division’s integration of student learning principles. NASPA
Journal, 41(2), 375-394.
Green, A.S., Jones, E., & Aloi, S. (2008). An exploration of high-quality student affairs learning
outcomes assessment practices. NASPA Journal, 45(1), 133-157.
Herdlein, R., Reifler, L., & Mrowka, K. (2013). An integrative literature review of student affairs
competencies: A meta-analysis. Journal of Student Affairs Research and Practice, 50(3),
250-269.
Herdlein, R.J. (2004). Survey of chief student affairs officers regarding relevance of graduate
preparation programs. NASPA Journal, 42(1), 51-71.
Hoffman, J.L., & Bresciani, M.J. (2010). Assessment work: Examining the prevalence and
nature of assessment competencies and skills in student affairs job postings. Journal of
Student Affairs Scholarship and Research, 47(4), 495-512.
Jones, S.R., Torres, V., & Arminio, J. (2006). Negotiating the complexities of qualitative
research in higher education: Fundamental elements and issues. New York, NY:
Routledge.
Kuh, G.D. (1995). The other curriculum: Out-of-class experiences with student learning and
personal development. Journal of Higher Education, 66(2), 123-155.
Kuk, L., Cobb, B., & Forrest, C. (2007). Perceptions of competencies of entry-level practitioners
in student affairs. NASPA Journal, 44(4), 664-691.
Lovell, C.D., & Kosten, L.A. (2000). Skills, knowledge, and personal traits necessary for success
as a student affairs administrator: A meta-analysis of thirty years of research. NASPA
Journal, 37(4), 553-572.
Manderino, M., & Meents-DeCaigny, E. (2012). Measuring learning through student affairs:
Moving beyond activity-level outcomes. NASPA NetResults.
National Association of Student Personnel Administrators. (1997). Principles of good practice
for student affairs. Washington, DC: Author.
National Association of Student Personnel Administrators. (1987). A perspective on student
affairs: A statement issued on the 50th anniversary of the Student Personnel Point of View.
Washington, DC: Author.
Oburn, M. (2005). Building a culture of evidence in student affairs. New Directions for
Community Colleges, 131, 19-32.
Pope, R.L., & Reynolds, A.L. (1997). Student affairs core competencies: Integrating
multicultural awareness, knowledge, and skills. Journal of College Student Development,
38(3), 266-277.
Renn, K.A., & Jessup-Anger, E.R. (2008). Preparing new professionals: Lessons for graduate
preparation programs from the national study of new professionals in student affairs.
Journal of College Student Development, 49(4), 319-335.
Rothenberg, L. (2011). Aligning co-curricular initiatives with learning outcomes: Key
challenges facing student affairs leaders. Washington, DC: The Advisory Board
Company.
Schuh, J.H., & Gansemer-Topf, A.M. (2010). The role of student affairs in student learning
assessment. Champaign, IL: National Institute for Learning Outcomes Assessment.
Seagraves, B., & Dean, L.A. (2010). Conditions supporting a culture of assessment in student
affairs divisions at small colleges and universities. Journal of Student Affairs Research
and Practice, 47(3), 307-324.
Slager, E.M., & Oaks, D.J. (2013). A coaching model for student affairs assessment. About
Campus, 18(3), 25-29.
Terenzini, P.T. (1989). Assessment with open eyes: Pitfalls in studying student outcomes.
Journal of Higher Education, 60(6), 644-664.
Tull, A., Hirt, J.B., & Saunders, S. (2009). Becoming socialized in student affairs administration:
A guide for new professionals and their supervisors. Sterling, VA: Stylus Publishing.
Upcraft, M. L. & Schuh, J. H. (1996). Assessment in student affairs: A guide for practitioners.
San Francisco, CA: Jossey-Bass.
Waple, J.N. (2006). An assessment of skills and competencies necessary for entry-level student
affairs work. NASPA Journal, 43(1), 1-18.
Young, D.G., & Janosik, S.M. (2007). Using CAS standards to measure learning outcomes of
student affairs preparation programs. NASPA Journal, 44(2), 341-366.