Sauk Valley Community College's goals for student
learning outcomes are clearly stated for each
educational program and make effective
assessment possible.
3A.1: Faculty-Driven System
Assessment of academic achievement at Sauk is a faculty-driven system: all of the
student learning outcomes, including course outcomes, area and program outcomes, and general
education competencies, have been developed and are regularly reviewed by faculty. In addition,
with oversight and approval by administration, the design and maintenance of the system is
entirely the responsibility of two groups of faculty:
Faculty Area Facilitators: For assessment purposes, the college faculty is divided into
eight areas of related courses. Five of these contain the General Education Core
Competency Areas (Social Sciences, Physical Sciences, Humanities/Fine Arts,
Communication, and Mathematics) and the remaining three are Career Program groupings
(Health Careers, Technology, and Business). Each of these areas is led by an Area
Facilitator, a compensated faculty member who is responsible for calling meetings,
moderating discussions, and providing leadership in assessment efforts. In 2008, the role
of the Area Facilitators was expanded to include operational planning and program
review for their respective areas. In addition, each Area Facilitator is a member of either
the Organizational Planning and Improvement Committee (OPIC) or the Assessment
Committee, providing faculty input on those important oversight committees.
Faculty Core Team: The primary driver of the academic assessment system is the Core
Team, a subcommittee of the Assessment Committee (Appendix). The Core Team
consists of four faculty Area Facilitators, two or more additional faculty members, and a
representative from Instructional Technology. As of 2009, the Director of Academic
Development has been added in order to coordinate a developmental education
assessment component of the system. The Vice President of Academic Services is an ex
officio member of the Core Team, serving primarily as liaison to the President's Cabinet.
The Core Team's primary function is to oversee the assessment system, which is
articulated in a formal Assessment Plan (Appendix). In addition to providing
leadership, planning, and system evaluation, the Core Team coordinates assessment-related discussions, activities, and projects.
% of Program/Disciplines*
2004-05: 91%
2005-06: 67%
2006-07: 43%
2007-08: 29%
2008-09: 17%
* Gap Analysis noted that some of the decline stems from difficulties using the digital folder.
Source: 2009-10 Assessment Annual Report
The same analysis found, however, that where a group of faculty was working together
on a project, whether a multi-faculty discipline or an area, the data was more rigorously
collected, discussed, and acted upon. In fact, in FY10, area projects had been conducted
in 100% of the transfer areas. As a result, in 2010, the system was revised so that instead
of assessing each discipline, the focus of assessment efforts shifted to area-level projects.
Goal 2: General Education - Students will develop habits of mind consistent with our
six chosen general education competencies.
The General Education Competencies (Ethics, Mathematics and Quantitative Reasoning,
Problem Solving, Communications, Technology, and Research) were developed by the
faculty for the 2003 Assessment Plan. These outcomes speak directly to the responsibility
of the college, as well as to its obligation to assist students in developing the habits of
mind that society values in the formation of citizens in a democracy (4B.1). The
competencies are referred to in assessment documentation as a "golden thread" woven
through the strands of coursework. As a result of this conception, the data for assessing
the competencies is drawn from college-level classrooms across the curriculum and
generally not from a specific course where direct instruction is provided. So, for example,
assessment events for research are carried out in classes which have research project
requirements, such as literature, nursing, or biology, rather than from Composition II
(ENG 103) where research writing is taught. This approach arises from the faculty's
desire to assess how students are acquiring and carrying out the competencies across
education experiences in order to confirm retention of the competency and to inform
direct instruction.
The Gen Ed Competency Cycle:
o Year 1: Problem Solving and Communication
o Year 2: Quantitative Reasoning and Technology
o Year 3: Ethics and Research
Instructors are asked to choose two competencies they value and to collect and
report data for them annually from appropriate college-level courses. The data is aggregated over a
three-year period and discussed during the year that the competency comes up on the
cycle. The faculty uses the data as a catalyst for discussions, both cross-curricular and
within their areas, conforming to the timeline that coordinates these discussions into the
planning and budget cycles. By design, the results of this analysis and discussion make
their way either into Operational Plans or back to the full faculty for action. Several
examples demonstrate that over time, this process is consistently providing opportunities
for improvement:
o During the first round of gen ed assessment, review of mathematics data led to
discussions between the nursing and mathematics faculty. The resulting actions included
a change in the way nursing instructors embed math in their courses and an overhaul of
the outline of MAT 106, the required math course taken by nursing students.
o As a result of the review of research data in 2007, the Communications area placed on its
Operational Plan the task of creating a guide to research for use by the entire faculty.
o In 2010, the discussion of ethics data resulted in a consensus among the faculty that the
college should develop a plagiarism/cheating statement which is consistent and should
be included on every syllabus. The commitment to accomplish this task appears on
FY11 Operational Plans, and a taskforce convened in spring 2011 to begin the process.
Cyclical assessment of the competencies also enabled the development of institution-wide projects. Selected and administered by the Core Team, the institution-wide
assessments show promise to become a valuable instrument of institutional improvement.
Three such projects have been undertaken:
FY10 - The Instructional Technology Staff and Core Team developed an online
assessment of student technology skills. Faculty volunteered to administer a test of
technology skill during the first week of the fall 2009 semester. The goal was to assess
whether incoming students could do a series of basic tasks related to file management and
word processing. The Developmental Taskforce repeated the assessment the next
semester to confirm a finding that the developmental population was especially at risk.
FY11 - The LRC Staff, Instructional Technology Staff, and Communications faculty
developed a research assessment that could be given using clicker technology or online.
A pre-test was given to all students who took the library tour during the fall semester. A
post-test was given as part of the spring semester final exam in all sections of the research
writing course, Composition II (ENG 103).
Goal 3: Career - Students will demonstrate skills necessary to obtain and advance in
employment in their chosen field.
As with transfer degrees, the A.A.S. degree faculty created specific outcomes for each
individual program in the design of the 2003 assessment system. They did not create
area-level assessments because no compelling need appeared at that time for aggregating
data across programs. Nursing, as a multi-faculty program with a highly developed
assessment process that predates the development of the college-wide system, was able to
benefit from the program-level discussions; however, the technology programs and some
of the business programs, each with a single-person faculty, struggled with program
assessment in the same way that single-person transfer disciplines did. To provide a
remedy, the career degrees have added appropriate cross-curricular aggregation in order
to benefit from the discussions and institution-level influence that area-level assessment
has shown to be valuable. The program faculty met at the fall 2010 in-service to
create a combined objective sheet that establishes outcomes based on the needs of
employers. As the self-study is in process, the system has not yet collected data, but the
end product will provide common outcomes for which data will be collected and aggregated. For
example, all of the career programs share the outcome that students will exhibit
Area level: In the 2003 conception of assessment, the Area Level was designed to create
outcomes for the General Education Core Curriculum areas required for a degree. These
outcomes make statements about the skills and habits of mind that any holder of an
Associate's degree is expected to have mastered in acquiring the breadth of thought that
higher education values. Programs were also grouped by Area, but only to manage the
system under the Area Facilitators. In the 2010 revision of the Assessment Plan, the Area
Level outcomes for the GECC have been reviewed and have generally subsumed
discipline-level outcomes. In addition, the Program Areas have created common
outcomes based on employment skills and habits of mind (described above). This change
to the system has not eliminated the program and discipline outcomes, but rather
broadened them to improve engagement and efficacy. Certain disciplines where a specific
sequence of knowledge is required, such as education and music, may need to continue to
assess discipline-level outcomes, as will some distinct programs, like nursing and
criminal justice.
Institution level: As described above, the Assessment Plan establishes the four overarching goals which all of the other outcomes serve to measure. In addition, the General
Education Competencies are maintained and assessed by the faculty at an institution level
to provide an internal, formative assessment and engender institutional improvement. As
an external, summative assessment of selected competencies, the CAAP test, to be
administered every three years, provides data.
The faculty recognizes that multiple measures of student learning exist, that indirect and direct assessment each have their place, and that some assessments are
formative and others are summative. A sampling of practices displays the array:
Classroom measures for college-level classes and for general education competencies are based
on student performance of outcomes. Each submission of data must include a description of the
assignment on which the results are based. At the course level, the course outlines provide clear
guidance as to whether an instructor may choose the assessment or comply with a course-specific
tool. At the program and area level, the faculty groups determine where the data for a project will
come from. For example, College Algebra (MAT 121) uses a common final exam from which
data is collected, aggregated and analyzed. The Communications Area projects generally direct
faculty to select from any appropriate writing assignment or speech and assess it against a
common rubric.
Besides the regular collection of classroom data, internal data about student learning is gleaned
from the general education projects conducted by the Core Team each year. These projects seek to
answer questions about student learning at an institutional level.
External data is valued as confirmation of student learning. Programs apply appropriate licensure
feedback, and transfer areas are able to make use of grade reports from some of the universities to
which students most commonly transfer. These data are reported on Operational Plans or during
program review, where they may be applied to decision-making about budgetary and curricular
change. Area faculty have selected from an array of data to focus on the most highly valued
sources. So, for example, Business values transfer grades earned at 4-year universities; Nursing's
NCLEX scores are reported on the Operational Plans; and the Technology area has requested
additional questions on the regular employment survey to meet its needs.
Formative assessment begins with initial placement testing designed to ensure that students begin
the learning process at an appropriate level and continues through various course-specific tools,
which are described on each syllabus as they relate to outcomes established in course outlines.
Most of the area and program assessment efforts of the faculty are formative in nature.
Summative data comes from a variety of sources. Administration of the CAAP test to a sampling
of prospective graduates every three years or so provides information that allows the college to
compare to peer institutions, as well as state and national benchmarks. Health careers receive
detailed licensing examination results that they can use for program improvement. Program data
aggregated from internships and capstone courses provides summative data for career
programs.
Although not regulated by the Assessment Plan, the degree to which data-influenced decision-making is embedded in Sauk's culture is revealed by the extent to which the various support units
of the college depend on data to assess the effectiveness of their programs. For example, the
Operational Plan Templates require that action items report results that are sought as a way to
benchmark project success (Appendix). When the project is complete, another column reports
results obtained. This combination sets up a sequence of reporting data, discussing
results, and taking appropriate actions similar to the one that characterizes the academic assessment process.
By using a wide variety of data from varying sources, the faculty is able to benefit from the
multi-dimensional view of student learning to improve instruction, curriculum, and the
institution.
Board: Although the Board of Trustees is kept apprised of and plays an important role in
strategic planning and in evaluating the associated data, the Assessment Plan does not
indicate any process by which assessment results are directed to the attention of the
Board. Certain external data (such as reports of university GPAs) are reported as news
items.
Students: The importance of communicating the process of the assessment system and
its value to student learning is described in the Assessment Plan. Indirectly, students are
exposed to the system through the outcome-based design of course syllabi and by
participation in classroom assessments. Directly, however, students are introduced to the
assessment process and the system through two communication methods:
o Syllabus statement: Every syllabus must include the following statement informing
students of the college's assessment program and their own involvement:
Sauk Valley Community College is an institution dedicated to continuous instructional
improvement. As part of our assessment efforts, it is necessary for us to collect and
analyze course-level data. Data drawn from students' work for the purposes of
institutional assessment will be collected and posted in aggregate, and will not identify
individual students. Your continued support in our on-going effort to provide quality
instructional services at Sauk is appreciated.
The statement is included as part of both the online and print syllabus templates.
A survey of online syllabi showed all full-time faculty were in compliance with
the requirement to include the statement (except for one first-year instructor who
also omitted other required statements). A sampling of ten adjunct syllabi showed
that all ten included the required statement.
o Assessment pamphlet: The pamphlet describing the assessment system to students
had been allowed to become outdated, so it is being revised for fall 2011. In addition, an
assessment system review checklist now prompts the Team to review the pamphlet so that
it can be kept current.
Public: The FY10 Recommendations for Change section of the Assessment Annual
Report calls for improvements in public reporting. The notion that the community
stakeholders are interested in or expect student learning results has received little or no
attention in either the design of the 2003 system or in the 2010 revised plan. The Core
Team became aware of the design flaw in spring 2010 and has recommended creation of
a webpage for annual assessment results. As the self-study concludes, no such
action has been taken. The college has regularly submitted news releases to the local
press about certain external data (for example, university GPA comparisons and CAAP
results), but has yet to do any systematic reporting of assessment projects or results.
At the area/program level, each area faculty group discusses prior year data in the fall
of the following year, moderated by an Area Facilitator. At that event, the participating
faculty have the opportunity to make appropriate changes to program or area outcomes,
rubrics, or assessment tools.
At the institutional level, the faculty discusses data and conducts projects on two general
education competencies a year. In a series of discussion events, the faculty has the
opportunity to recommend changes to the competencies themselves. In fall 2008, for
example, a recommendation came from area-level discussions to eliminate ethical
reasoning as a competency. The issue was brought to a meeting of the faculty; both sides
of the argument were presented and discussed. The resulting vote confirmed the
competency. The same process resulted in the faculty revising the research competency
objectives in a way that clarified its goals for student outcomes.
At the system level, the Assessment Plan and the charge of the Assessment Committee
call for the Faculty Core Team to conduct an annual evaluation each spring of Sauk's
assessment system (Appendix). The Core Team, based on evaluation of system data,
creates an annual report, including recommendations for change (if any) and a plan of
action for the next academic year. The report is subsequently considered at the spring
meeting of the full Assessment Committee, which consists of the Core Team, the
Academic Vice President, the Dean of Institutional Research and Planning, and all of the
Academic Deans. The annual evaluation ensures systematic oversight of the assessment
system and has resulted in several major alterations in the system:
The 2005 and 2006 reports were directed to the Organizational Planning and
Improvement Committee (OPIC). By 2006, a shift in the OPIC charge and revisions to
the new Operational Plan Template led to a recommendation that this submission to OPIC
was no longer necessary, and subsequent reports have been acted upon by the Assessment
Committee.
In 2008, the Core Team, believing that the major work of design was complete, subsumed
a separate General Education subcommittee and began more direct oversight of the
competencies. It indicated that it would refocus its creative energies from design to the
improvement of teaching and learning through application of the assessment data,
primarily by recommending and facilitating professional development related to the Gen
Ed competencies.
In spring 2009, the Core Team called for a Gap Analysis of the system. This report was
considered at a special meeting of the Assessment Committee in the fall and resulted in a
major revision of the 2005 system, which is being implemented as the self-study occurs.
At the administrative level, all of the academic administrators are members of the
Assessment Committee and have an important role in discussing, revising, and approving
the annual reports of the Core Team. In addition, as part of the 2009 Gap Analysis, key
administrators were invited to assess the system against the HLC Matrix of
Implementation, the results of which are reported in Figure 3ii below. A periodic
repetition of this evaluation has been added to the Core Team's annual review checklist to
strengthen the administrative role in reviewing the assessment process, particularly those
aspects, like Board support, that are beyond the purview of the faculty.
Figure 3ii: Administrative Evaluation of Assessment System