Running head: ASSESSMENT: A REFLECTION

Assessment: A Reflection
Eric Hilldorfer
Western Michigan University

Assessment: A Reflection

This class was probably the most daunting for me because I had very little experience
with assessment. Until this point, my experience with assessment was making sure students
completed evaluation sheets after they left advising sessions. That sheet was developed after a
review of our work area examined how we could receive appropriate and timely feedback from
our students. After this course, I learned that this is just one form of assessment and that there
are many other ways to go about creating assessments.
For the assessment project in this course, I was nervous about being able to work
effectively in the area of assessment since I had limited experience. There was pressure not only
to succeed, but to succeed beyond expectations and impress the department we had been paired
with for the project. During the process of the project, I was able to develop a stronger
understanding of assessment.
Purpose of the Project
For this project my partner, Anthony Ringuette, and I were given the opportunity to work
with University Recreation and the Associate Director of Business Operations, Chris Voss. We
were to examine the Consortium: Campus Recreation Impact Survey 2012-2013. This is a
national survey developed by the National Association of Student Personnel Administrators
(NASPA) that is used every three years as a way for institutions to compare the satisfaction
and overall performance of university recreation. The goal of our project was to look at the
survey, compile its data, and create a set of documents displaying that information so
University Recreation would be able to assess its needs and areas for improvement. Our results
will also be used as a way to market University Recreation on Western Michigan University's
campus.
The Survey
The Consortium: Campus Recreation Impact Survey 2012-2013 was developed to
determine the community's satisfaction with university recreation and serves as a measurement
tool to compare and evaluate how well an institution is performing based on its results. It is
described as a web-based data collection survey. One of the benefits of this type of survey is that
it allows for both qualitative and quantitative data collection (Schuh & Upcraft, 2001). This
survey contains both types of data across more than 140 questions that address the major
aspects of university recreation, including intramurals, exercise space, staff knowledge, and
many other areas. The survey was also tailored to issues specific to Western Michigan
University, which had the option of adding additional questions where it needed feedback. For
example, there is a question pertaining to students' knowledge and awareness of the Tobacco
Free Campus Initiative and whether students think it is an effective program. This is a
campus-wide initiative, but it also aligns with University Recreation's mission of promoting a
healthy, active life through recreational programs and services (Mission Statement, n.d.). The
survey was sent to students, faculty, staff, and alumni at the end of the spring 2013 semester,
from March 25 to April 19, 2013 (Survey). The real work for our project started with the
data collection.
Data Collection
Data collection was the most intimidating part for me because I had always assumed it
involved some deep scientific process for interpreting the data, but that is not always the case.
There are multiple ways to gather data, including questionnaires, surveys, focus groups,
interviews, and many other forms (Schuh & Upcraft, 2001). Because this project's survey was
contained in one electronic database, the information was easy to gather and compile. There are
very few time constraints on web-based surveys because they can be distributed at any time and
closed whenever the assessor deems appropriate (Schuh & Upcraft, 2001). Since the survey had
been sent out approximately six months earlier, the data was already there for us. The baseline
tool where the data was stored was also helpful because we were able to compile the data by
specific demographics, so the information could be specific and meaningful.
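As a rough illustration of what compiling the data by demographics could look like, the short sketch below filters a handful of made-up survey records by respondent role and summarizes each group; the field names, values, and figures are hypothetical and are not drawn from the actual survey or the tool we used.

# Hypothetical survey records; fields and values are illustrative only.
records = [
    {"role": "undergraduate", "visits_per_week": 3, "satisfied": True},
    {"role": "graduate", "visits_per_week": 1, "satisfied": False},
    {"role": "undergraduate", "visits_per_week": 5, "satisfied": True},
    {"role": "faculty/staff", "visits_per_week": 2, "satisfied": True},
]

def summarize(group_role):
    """Filter records to one demographic group and report basic figures."""
    group = [r for r in records if r["role"] == group_role]
    if not group:
        return f"{group_role}: no responses"
    avg_visits = sum(r["visits_per_week"] for r in group) / len(group)
    pct_satisfied = 100 * sum(r["satisfied"] for r in group) / len(group)
    return (f"{group_role}: {len(group)} responses, "
            f"{avg_visits:.1f} visits/week, {pct_satisfied:.0f}% satisfied")

for role in ("undergraduate", "graduate", "faculty/staff"):
    print(summarize(role))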
Qualitative vs. Quantitative
The majority of the survey contained questions that asked respondents to rate their
satisfaction using a Likert scale (i.e., Agree, Somewhat Agree, Neither Agree nor Disagree,
Somewhat Disagree, Disagree). This data would be described as quantitative because a value is
placed on each response, so the data can easily be charted or grouped to determine satisfaction
(Schuh & Upcraft, 2001). The downside is that when five answers describe satisfaction, one
respondent's "Somewhat Agree" could be another respondent's "Agree." Therefore, the data is
accurate to a point, but it can be misleading. The part of the survey that produced some of the
strongest and most honest feedback was the section with open-ended questions.
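Before turning to those open-ended questions, the brief sketch below shows one way responses to a single Likert-style item could be tallied and turned into a chartable summary; the labels, numeric values, and example responses are hypothetical and are not taken from the actual survey data.

from collections import Counter

# Hypothetical Likert responses for one survey question.
responses = [
    "Agree", "Somewhat Agree", "Agree", "Neither Agree nor Disagree",
    "Somewhat Disagree", "Agree", "Disagree", "Somewhat Agree",
]

# Assign a numeric value to each label so the data can be charted or averaged.
scale = {
    "Agree": 5,
    "Somewhat Agree": 4,
    "Neither Agree nor Disagree": 3,
    "Somewhat Disagree": 2,
    "Disagree": 1,
}

counts = Counter(responses)                       # how many of each label
average = sum(scale[r] for r in responses) / len(responses)
satisfied = sum(counts[label] for label in ("Agree", "Somewhat Agree"))
percent_satisfied = 100 * satisfied / len(responses)

for label in scale:
    print(f"{label}: {counts[label]}")
print(f"Average rating: {average:.2f}")
print(f"Percent satisfied: {percent_satisfied:.0f}%")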
This section was where the qualitative data was contained, and it was the most interesting
part of the survey. It allowed us to develop the skill of coding qualitative data: we went through
the process of taking raw data, giving the data preliminary codes, and then narrowing down and
combining codes that contained overlapping information so they could be grouped together
(Saldana, 2008). This last step led to the themes we developed that were most common across
University Recreation. This was the most enjoyable part because the coding was our own way of
working with the data and our touch on the project. Our efforts to code the data were also
collaborative in nature because we wanted to see whether we were interpreting the data the same
way or whether we had missed some of the information. According to Saldana (2008), writers of
joint research projects should make coding a collaborative effort because multiple minds bring
multiple ways to analyze and interpret data. There were only two of us, but we were able to see
the importance of collaborative coding. For example, the most common theme we found
students reporting was that the Student Recreation Center allowed them to foster meaningful
relationships while using the facilities. My partner was coding this as "friendships," while I saw
it as "community development." After discussing and looking at the data, we were able to see
the different ways we went about coding and to combine our thoughts into a solid theme. This
gave us a better understanding of how to effectively code qualitative data.
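To make that collaborative coding step concrete, the sketch below shows a simplified, hypothetical version of how two coders' preliminary codes might be reconciled into shared themes; the comments, code labels, and theme mapping are invented examples rather than the codes we actually used.

# Hypothetical open-ended comments and two coders' preliminary codes.
comments = [
    "I met my closest friends playing intramural soccer.",
    "The group fitness classes helped me feel part of campus.",
    "Weight room equipment is often broken or unavailable.",
]

coder_a = ["friendships", "friendships", "facility maintenance"]
coder_b = ["community development", "community development", "equipment issues"]

# After discussion, overlapping preliminary codes are merged into broader themes.
theme_map = {
    "friendships": "meaningful relationships",
    "community development": "meaningful relationships",
    "facility maintenance": "facility improvement",
    "equipment issues": "facility improvement",
}

themes = {}
for comment, a, b in zip(comments, coder_a, coder_b):
    theme = theme_map[a]
    assert theme_map[b] == theme, "coders need to discuss this comment again"
    themes.setdefault(theme, []).append(comment)

for theme, grouped in themes.items():
    print(f"{theme}: {len(grouped)} comment(s)")

In practice this reconciliation happened through discussion rather than a script, but the idea of mapping overlapping preliminary codes onto a single shared theme mirrors the step described above.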
Reflection of the Work
This assignment allowed me to connect with another department, evaluate data, and
develop information that was presented to the heads of a department about how their services
could be improved or maintained. It felt like an actual contribution to the University; our
supervisor seemed very pleased with the work, and we were able to help the department take
notice of certain issues that needed to be addressed. This project also allowed me to work on
my Assessment, Evaluation, and Research competencies (ACPA & NASPA, 2012) and to become
a more effective student affairs practitioner.
Impact of the Work
As a young student affairs professional, this project really allowed me to be an active
participant in a department's decision making during its assessment process. Through the
research we conducted and the data we collected and analyzed, my partner and I were able
to compile two sets of information that University Recreation was able to use almost immediately.
The first set was similar to the national results of the NASPA Consortium survey, and with it we
were able to highlight the benefits of student recreation on Western Michigan's campus. University
Recreation plans to use this as a way to market its facilities and attract students to them. The second
set of data focused on areas of success and improvement within University Recreation. One section
of this data set was devoted to promotional materials for University Recreation, and the overall
performance in that area was relatively low. Upon seeing this, the department almost immediately
tasked one of its employees with improving those materials. This suggestion for improvement was
taken seriously and acted upon. For a young student affairs professional, that was a small
accomplishment, and it reaffirmed my choice to work in this field.
Professional Development
Even though this project was based in the classroom, it allowed for more experiential
learning. Similar to the approach in Learning Reconsidered, it did not place specific focus on
learning in the classroom, but on interacting with departments, students, and other areas outside
an academic setting (NASPA & ACPA, 2004). Institutions are losing financial support every year,
and that is affecting programming and how they reach students. This project allowed us to develop
our skills while assisting a department in a way that did not take money away from certain
programs (NASPA & ACPA, 2004). It is about being able to use resources effectively, and I was
able to improve my professional competencies in the process.
Out of all the ACPA/NASPA (2012) competencies, next to Law and Policy, I felt that
Assessment, Evaluation, and Research was the one I had the least experience with since entering
the HESA program. After this course and project, I have the confidence to say that I can
differentiate among assessment, program review, evaluation, planning, and research, and that I am
able to align program and learning outcomes with organizational goals and values (ACPA &
NASPA, 2012). These experiences have given me the knowledge to perform at the basic level of
the competency, but this was my first real experience focused on improving this competency
area. I am fine with that for now, but my plan for the next two years is to be more involved in
assessment within my department so that I can reach the intermediate level. The department I
work with has a few assessment projects in place, and I will see how my role can be included in
the process. Also, within the next few months I will hopefully have a full-time job, and in that
position I will look for opportunities to be part of assessment and planning committees. That
work will also be useful in other areas of my professional life by familiarizing me with the
campus culture and with faculty and staff. After this project, I have learned that Assessment,
Evaluation, and Research is more than just a competency area; it is an integral process for
sustaining any campus.
Conclusion
This project allowed me to explore and interact with other departments on campus. It
helped me focus on a specific professional competency area, and it was an enjoyable experience.
It challenged me at times, but it let me incorporate learning from the classroom almost
immediately into the work I was conducting. This project fostered growth in my field and has
equipped me with another valuable skill for my student affairs utility belt.

References
American College Personnel Association (ACPA) & National Association of Student Personnel
Administrators (NASPA). (2012). ACPA/NASPA professional competency areas for
student affairs practitioners. Washington, DC: American College Personnel Association.
Mission Statement. (n.d.). Retrieved from http://wmich.edu/rec/about/mission
National Association of Student Personnel Administrators (NASPA) & American College
Personnel Association (ACPA). (2004). Learning reconsidered: A campus-wide focus on
the student experience (pp. 1-43).
Saldana, J. (2008). An introduction to codes and coding. In The coding manual for qualitative
researchers (pp. 1-31). Thousand Oaks, CA: Sage Publications.
Schuh, J. H., & Upcraft, M. L. (2001). Assessment practice in student affairs: An applications
manual. San Francisco, CA: Jossey-Bass.
