Report on Assessment
St. Cloud State University
Spring 2006
Submitted to: Michael Spitzer, Provost
Prepared by: Neal Voelz - University Assessment Director
Elaine Ackerman – COE
Holly Evers – University Assessment Office
Chris Inkster – LRTS
Sandra Johnson – COSE
Judy Litterst – GEC Liaison
Joseph Melcher – COSS
Suellen Rundquist – COFAH
Bradley Sleeper – HCOB
Mary Soroko – HCOB
Mitch Rubinstein and Mark Nook - Administration
University Assessment Committee and Director Accomplishments for the 2005-06 Academic Year
• As of spring 2006 approximately 93% of undergraduate academic programs (not
including minors and BES) had identified student learning outcomes (or were in the
process of doing so). In addition, approximately 91%, 88% and 78% of the
undergraduate programs had assessment plans, used direct assessment tools (often
in conjunction with indirect methods) and were using data for program improvement,
respectively (see matrices at the end of this report).
• Approximately 77% of the graduate academic programs had developed student
learning outcomes (or were in the process of doing so) by spring 2006. Approximately
67%, 65% and 40% of the graduate programs had assessment plans, used direct
assessment tools and were using data for program improvement (or were in the
process of doing so), respectively (see matrices at the end of this report).
• The University Assessment Committee met weekly during most of the 2005-06
academic year. In addition, each college assessment committee met regularly and
was chaired by a college-wide assessment coordinator.
• The University Assessment Director (Neal Voelz) served on the following committees:
Assessment (Chair), HLC Accreditation Assessment (Chair), General Education (ex officio), HLC Accreditation Steering, and Strategic Planning.
• Suggestions for significant changes to the university assessment web site were
gathered and we are in the process of updating the site. Each college also has either
constructed or is planning a separate assessment web site, specific to the college.
• The University Assessment Office has been established and consists of two rooms in
Headley Hall. One half-time assessment office administrator was hired (Holly Evers).
• Progress was made toward establishing an optional standard campus syllabus, which
includes a section on assessment (particularly student learning outcomes). Continuing
Studies presently has an online syllabus available.
• Discussions on general education assessment were continued in conjunction with
changes being proposed for the overall program (the program now has a mission,
general learning goals are being established, and translation of some of these goals
into assessable student learning outcomes has started). In particular, we discussed the
need for both locally produced and nationally produced (specifically, the Collegiate
Learning Assessment, or CLA) assessment tools. Toward this end we held a conference call with
CLA staff to discuss various issues and concerns with using this national test.
• Progress was made in establishing a standard assessment database for use at the
college/unit level. Half of the assessment committee also attended an online seminar
in May 2006 demonstrating a comprehensive assessment management system
(WEAVE online).
• A campus-wide assessment luncheon was conducted, bringing together over 60
faculty and staff to discuss the current status and future needs of assessment at
SCSU. The overarching themes from the discussion were: we need a common
language/vocabulary of assessment; assessment needs greater visibility across
campus; the administration needs to define assessment expectations outside of HLC;
we need a timeline that supersedes any changes in faculty/administration/staff; we
need to affect cultural shift. In addition, a survey of the luncheon participants yielded
the following needs: greater student awareness of assessment; making decisions
based on assessment data; including assessment in faculty job descriptions; and
having adequate resources to conduct assessment.
• Members of the University Assessment Committee were involved with the following
Faculty Forum Day presentations/workshops in April 2006:
1. A beginning discussion of assessing non academic and service units and linking
this assessment with student learning.
2. A general introduction to assessment – Assessment 101.
3. A workshop on the more specific topic of developing general education student
learning outcomes.
4. An advanced level assessment workshop entitled “Teaching, Learning,
Assessment - Closing the Loop.”
• Over half of the University Assessment Committee attended the NCA/HLC
accreditation conference in spring 2006.
• The University Assessment office sent four people directly involved in General
Education Assessment (Judy Litterst, Michelle K. Hammes, Margaret Villanueva and
Sandy Johnson) to the AAC&U Network for Academic Renewal Conference - “General
Education and Outcomes That Matter in a Changing World” held 9-11 March 2006.
The Assessment office also paid for Mary Soroko, Joe Melcher, Elaine Ackerman,
Patty Aceves, Suellen Rundquist and Brad Sleeper to attend the Collaboration
Regional Assessment Workshop in May 2006.
General Education Core and Racial Issues Assessment
• The Department of Communication Studies:
1. completed analysis of their direct measure of student learning in the public
speaking unit of CMST 192 (data collected spring 2005). That analysis showed that
they are achieving the desired learning outcomes for the unit via all delivery methods
for that course. They completed analysis for the indirect measure of the
interpersonal unit for that course. It is unlikely that the analysis for the indirect
measures of student learning for the small group and public speaking units will be
completed by May 2006. Given the degree of difficulty in doing this analysis, they will
be using direct measures for all three units of CMST 192 going forward.
2. agreed to a set of student learning outcomes for CMST 192 as a department, and
the assessment committee will submit a plan for using direct measures for assessing
those outcomes to the department this spring. They will collect data for assessment
purposes in CMST 192 during 2006-2007.
• Dale Buske, Sandra Johnson and Shawn Triplett (Mathematics), and David Robinson
(Statistics) received a 2005 assessment grant and completed the following:
1. Developed test questions to measure the core learning outcomes as they apply
to courses that fulfill the SCSU General Education core three requirement.
2. Administered these questions to students enrolled in MATH 193, STAT 193,
MATH 112, MATH 115, MATH 211 or MATH 221.
3. Analyzed student responses.
4. Are using data from analysis to recommend questions for future use and design
additional direct measures.
Their assessment was based on two of the five Core Learning Outcomes:
1. Students will identify and analyze problems in various contexts and
will design solutions.
2. Students will learn to learn by employing various methods to obtain,
classify, analyze, synthesize, and apply knowledge.
Knowing that the content of these courses varies widely, they decided to define a type
of question which could be asked using the content of each individual course. The
description of the type of questions developed is: Translate a problem described
verbally or by tables, diagrams or graphs into symbolic language, solve the problem
and interpret the result in the original context. The questions were administered in pre
and post tests. For each question, a scoring rubric was developed. Student
responses were analyzed and compared.
Pre and post tests were administered to 724 math students and 215 statistics students.
In every class, the average score improved from pre to post test. The committee
concluded that the method was sound, but that questions needed to meet clearly
stated criteria. The questions developed in the project will be improved and embedded
in tests providing a direct measure of student learning in subsequent years. These
measures will serve as models for developing further assessments in both
departments, and may be used by other departments as well. The final result will be a
set of questions, classified according to the five Core Learning Outcomes, with
appropriate scoring rubrics and accompanying data.
• Michelle Kukoleca Hammes (Political Science/Core 5 Director), Carolyn Hartz
(Philosophy/Core 4 Director) and David H. Robinson (Math/Statistics) received a 2005
assessment grant to assess learning outcomes for the Core 3, 4 and 5 areas. To date,
the General Education Program has had no systematic, continuing assessment. Their
project seeks to rectify this by creating a comprehensive, sustainable, collaborative
method of assessing most of the Core areas with a single instrument. In fall 2005 they
administered the instrument (essay) twice, once at the beginning of the semester and
once at the end. It was also administered as a pretest in the spring of 2006 and will be
administered as a post test at the end of the spring semester (These results were not
available as of early May 2006). Each time it was given to two sections of Core 5, and
counted for some of the students’ grades in these sections. Embedding the
assessment in a particular course and having the incentive of graded work enhances
the completion rate as the assessment is conducted in future semesters. Compliance
in the initial set was almost 100 percent.
• In addition to the above assessment incorporating PHIL 194 (Critical Reasoning), the
department has assessed this course with pre- and post-tests for many years. The
analysis also examines the effect of class size on test scores, the effect of including the
assessment as part of the student's grade (to increase the value of the test to students),
and each individual question in the tests. The department has been assessing the
Critical Reasoning course since 1998 using this method.
• The Racial Issues Colloquium (RIC), a group comprised of faculty teaching racial
issues courses obtained an assessment mini-grant in order to support an on-going,
longitudinal, cross-departmental assessment of Racial Issues courses. During the
summer of 2003, the Colloquium held their annual summer seminar in which they
developed a system of pre- and post-surveys that instructors in collaborating
departments would give to their students as a way to assess students' achievement of
knowledge and attitudinal learning outcomes. The cooperating departments include:
Ethnic Studies, Community Studies, History, Sociology, and Human Relations (in
COE). They used the grant to hire a graduate assistant to carry out the rather
complicated process (given the number of courses and instructors) of gathering all of
the survey results, coding the data to ensure student and instructor privacy, and having
statistical analyses done. One of their goals for the summer 2006 seminar will be to
review these results and to reflect upon how they could be used to inform beneficial
changes.
BES Assessment
The BES program has not developed an assessment plan. However, a meeting was held on 21 April
2006 to discuss the BES assessment plan. The following people were present at this meeting: Patty
Aceves, Kay Sebastian, John Hoover (SPED), Neal Voelz (BIOL/ Univ. Assessment Director), Sandy
Johnson (MATH, COSE Assessment Director), John Burgeson.
This group suggested the following:
Action Items:
• CCS must define its market and whom it serves.
a. Collect data from ISRS to determine demographics of BES students over past 5 years.
Are there trends? What types of students are we serving? Who do we want/need to be
serving?
b. Suggested that we interview several students or hold a focus group to talk about who
these students are, what the BES means to them, and how it will serve their future
purposes.
c. Gather data from current sources: NSSE, Alum surveys, Noel Levitz, ACT, etc.
• The BES orientation: The student should be creating a Program Rationale with measurable
learning outcomes. We could create an orientation course that all students would work through
to create the Program Rationale and learning outcomes.
• BES Advising: 300+ students are too many for one advisor. There should be someone
overseeing the program and we should have faculty advisors in the student’s major or area of
interest (if self-select) who would act as their program advisor in creating the Program
Rationale and learning outcomes. These advisors would need to be trained or knowledgeable
in developing measurable outcomes. Program changes would need to be approved by the
faculty advisor who would ensure that the course changes either did not adversely affect the
student’s program rationale, or would assist the student in updating their goals appropriately.
• BES Capstone Course: The BES self-select major needs to be assessable. Students should be
able to submit evidence that they have met their self-designed learning outcomes, possibly in
the form of a portfolio or e-portfolio.
• Assessing the portfolio would again require either the original faculty advisor or possibly
graduate students with knowledge in higher ed, assessment, advising, etc (ED students).
• BES Advisory Board: There should be an advisory board to provide input and feedback on the
program. This board could consist of faculty, alumni, business leaders who hire our graduates,
etc.
Written Assessment Plan:
• Outline Mission, Program Outcomes, and Student Learning Outcomes.
• For each outcome, define strengths, in progress, and needs.
• Write narrative, including evidence for each outcome.
College Assessment Accomplishments and Plans for the 2006-07 Academic Year
From Mary Soroko and Brad Sleeper, Herberger College of Business
2005-06 Activities and Accomplishments:
1. Completed the HCOB web site design for assessment for both students and faculty.
2. Expanded faculty use of eLumen.
3. Completed a faculty handbook about the college assessment program.
4. Documented program learning goals for most college programs including the MBA.
5. Developed a competency based testing model for core business requirements.
6. Conducted an assessment retreat to develop a 3 year plan for moving assessment forward in the HCOB.
7. Agreed to use assessment in RPT decision-making and to revise course evaluations to ask
students about their perceived achievement of course learning goals.
Next Steps:
1. Allocate additional resources to support assessment at the departmental level
2. Create program learning goals for international business and entrepreneurship
3. Identify where and how program goals will be assessed and begin to gather data
4. Revise RPT procedures to include assessment
5. Implement competency based testing in core business courses and revised course evaluations
6. Ensure that learning goals are explicitly stated on all course syllabi
Recommendations:
1. Allocate $5,000 to each department to fund assessment activities in each of the next two
years.
2. Pay the assessment coordinator of each department a stipend to coordinate the activities of
the department and serve as the department champion for assessment. Ask that each
coordinator do extensive reading during the summer (2006) on assessment.
3. Find ways to elevate the importance of assessment within the college by:
• Modifying RPT procedures
• Posting assessment reports on the college web site
• Having the assessment director do a formal report at the fall college retreat
4. Enhance the draft assessment booklet to provide more specific, concrete, and discipline-specific
directions (worksheets) on how to do it.
Our goal for the coming year is to make assessment part of the fabric of the college; a way of doing
business. We recognize that our efforts have largely been to persuade those who already have an
interest in assessment to participate—but this has only gotten about 10 to 15% of our faculty involved
in our program.
Consequently, for the coming year, we have asked the dean for a $25,000 budget over and above the
current commitment to assessment to support assessment activities at the departmental level. This
recommendation was the result of a retreat we held with our assessment coordinators earlier in the
month. Each department is at a different place—but generally, the assessment coordinators need to
be paid something for their efforts to create consensus within their units; to purchase resources that
will help faculty do assessment, etc. Brad and I will also be doing a presentation in the fall on our
assessment program.
College of Education Assessment (Elaine Ackerman)
College of Education Assessment Activities
2005-2006
Activity: Undergraduate Program Completer Follow-Up Study
Description: Survey distributed by mail to teacher education program completers, designed to assess how well St. Cloud State teacher education programs prepared teacher candidates to enter the field of teaching, based on INTASC (Interstate New Teacher Assessment and Support Consortium) and NCATE (National Council for Accreditation of Teacher Education) teaching standards.
Date: Ongoing – Program completers graduating in the fall receive the survey early in the spring semester; program completers graduating in the spring receive the survey early the following fall semester.

Activity: Cooperating Teacher Follow-Up Study
Description: Survey distributed by mail to cooperating teachers (i.e., teachers in school districts that host student teachers) to assess how well St. Cloud State teacher education programs prepared teacher candidates to enter the field of teaching, based on INTASC and NCATE teaching standards.
Date: Ongoing – Cooperating teachers who hosted student teachers during the fall semester receive a survey early in the spring semester; cooperating teachers who hosted student teachers in the spring receive a survey early the following fall semester.

Activity: Educational Leadership Program Completer Survey
Description: Survey distributed by mail to Educational Leadership program completers to assess how well St. Cloud State's Educational Leadership program prepared candidates to enter the field of school administration, based on NBPTS and NCATE school personnel standards.
Date: Ongoing – Program completers graduating in the fall receive the survey early in the spring semester; program completers graduating in the spring receive the survey early the following fall semester.

Activity: Principal/Employer Survey (pilot study)
Description: Survey distributed by mail to principals or employers who have employed St. Cloud State teacher education program graduates to assess how well the teacher education program prepared the candidates to enter the field of education, based on INTASC and NCATE standards.
Date: This survey instrument will be re-evaluated this summer for validity and reliability. The plan is to send the new instrument out in the spring of next year.

Activity: Student Teaching Formative and Summative Performance-Based Assessments
Description: Formative and summative assessments of student teaching performance completed by both the university supervisor and the cooperating teacher during and at the completion of student teaching assignments. Assessments are based on INTASC and MN Board of Teaching standards.
Date: Ongoing – Collected at the end of fall and spring semesters.

Activity: P-12 and 5-12 Program Evaluation Form
Description: Distributed by university supervisors to students who are currently student teaching. The form instructs students to specify the courses they found most valuable in preparing them to meet each of the INTASC standards. Students are also asked to describe their understanding of the College of Education's Conceptual Framework.
Date: Ongoing – Distributed at the end of fall and spring semesters.

Activity: ED 300 Field Experience Evaluation
Description: Completed by cooperating teachers to assess student teachers' quality of experience (e.g., the variety of tasks a student teacher participated in while in the classroom).
Date: Ongoing – Collected at the end of fall and spring semesters.

Activity: ED 441 Student Self-Assessment Form
Description: Completed by teacher candidates. Designed to encourage self-reflection in the areas addressed by INTASC standards.
Date: Ongoing – Completed by students during the ED 441 field experience. Collected fall and spring semesters.

Activity: ED 441 Final Assessments
Description: Completed by the cooperating teacher at the end of the student's ED 441 field experience to assess student competencies in relation to the INTASC standards.
Date: Ongoing – Collected fall and spring semesters.

Activity: Completion of Program Assessment Matrix
Description: The Program Assessment Matrix lists CoE programs, type of degree, program mission statement, whether the program has an assessment plan, assessable learning outcomes, assessment tools, data, and how the data are used for program improvement.
Date: Submitted to University Assessment Director April 2006.

Activity: Development of program transition points matrix
Description: The transition points matrix lists all of the requirements and assessments within each CoE program tracked or assessed at each program transition point (i.e., admission, completion of courses, pre-clinical experience, exit from clinical experience, program completion, and follow-up).
Date: Drafts of program transition points matrix due on May 1, 2006.

Activity: Attendance at NCATE Conferences
Description: The NCATE Steering Committee attended two NCATE conferences this year.
Date: October 2005 & March 2006.

Activity: Completion of PEDS (Professional Education Data System) report
Description: Report submitted to AACTE & NCATE containing general and program-specific enrollment and completion data, faculty information, distance education information, and fiscal data. Based on 2003-2004 data.
Date: May 2006.

Activity: Completion of Minnesota Measures of Teacher Quality Report (MTQM)
Description: Report submitted to the Minnesota Association of Colleges for Teacher Education listing enrollment and completion in specific degree programs, admission requirements, hours of clinical experience, partnerships with school districts, faculty experience, diversity, use of assessments, and technology. Based on 2003-2004 data.
Date: April 2006.

Activity: College of Education data system development
Description: The College of Education has completed the planning phase of the development of a data system.
Date: Ongoing – development phase in progress (January 2006).

Activity: Teacher Education Council work on Board of Teaching (BOT) Matrices
Description: TEC representatives from program areas have worked in collaboration with the Director of Assessment and Accreditation to create reports for BOT.
Date: Ongoing – final reports for BOT are to be submitted Spring 07.

Activity: Title II report; tracking of Praxis I and Praxis II passing rates
Description: Teacher education student passing rates on the Praxis exams are tracked and submitted to the Department of Education.
Date: April 2006.

Activity: NCATE Steering Committee
Description: The steering committee has met weekly during the 05-06 academic year to establish the foundation for the NCATE work.
Date: 2005-2006.

Activity: Re-opening of the Praxis Center
Description: In an effort to provide greater support to our students in Praxis testing, the center was re-opened with established center hours, a director, workshops, and tutoring.
Date: October 2005.
Planned activities:
(1) Collection of transition points—May 2006
(2) Program representatives will work during the summer to complete documentation for BOT –
Summer 2006
(3) Revise Employer/principal survey
(4) 3- and 5-year follow-up surveys of undergraduate and Ed Leadership program completers –
Fall 2006
(5) Concentrated efforts during 06-07 to prepare for BOT and NCATE accreditation visits.
Assessment Report from the College of Fine Arts and Humanities (Suellen Rundquist)
ART (Julie Baugnet)
• The department met for an assessment retreat in January.
• Implementation of new grids.
• Implementation of a blanket assessment tool for all classes.
• Assessments of all classes will start on a rotating basis.
• Faculty have gathered visuals and assessment data.
• We have worked with the Asst. Dean on sequential steps as advised.
• Praxis is in place for Art Education.
• Writing requirements for upper division are in place.
• Portfolio reviews have been ongoing.
• Student scholarship reviews have taken place.
Student Outcomes – BA in Art History
Courses: 300/331, 430, 431, 432, 433, 434, 435, 436, 437, 438, 439
• Acquaintance with the art history of non-Western cultures
• General knowledge of the monuments of all major visual art periods in the West
• General knowledge of the principal artists of all major visual art periods in the West
• Theory within the context of applicable art works
• General historical knowledge within the context of applicable art works
• Analytical and critical essay writing
• Research tools and praxis
Each outcome is marked in the department's matrix against the specific courses in which it is addressed.

Assessment Instruments in Art History
Courses: 300/331, 430, 431, 432, 433, 434, 435, 436, 437, 438, 439
The instruments used across these courses include blue books, exams, essays, and journals, mapped to the outcomes above in the department's matrix.
Student Outcomes – Ceramics
Courses: 370, 371, 372, 470, 471, 472
• Basic design principles related to ceramics
• Ability to develop solutions to design problems
• Knowledge and skills in the use of a pottery wheel
• Knowledge and skills in the use of handbuilding techniques
• Knowledge and skills in the use of clays and glazes
• Knowledge and skills in the use of kilns
• Knowledge and skills in the industrial applications of a ceramics technique related to mold-making
• Knowledge of basic business practices
• Knowledge of the place of ceramics within the history of art, design and culture
Each outcome is marked in the department's matrix against the specific courses in which it is addressed.

Student Assessment – Ceramics
Courses: 370, 371, 372, 470, 471, 472
All outcomes in all courses are assessed using an assessment documentation sheet (blanket assessment for all outcomes and courses).
Student Outcomes – Drawing
Courses: 311, 312, 313, 315, 411, 412, 413, 415
• Understanding of basic design principles, concepts, media, and formats. The ability to place organization of design elements and the effective use of drawing media at the service of producing a specific aesthetic intent and a conceptual position. The development of solutions to aesthetic and design problems should continue throughout the degree program.
• Understanding of the possibilities and limitations of the drawing medium.
• Knowledge and skills in the use of basic tools and techniques sufficient to work from concept to finished product. This includes mastery of the traditional technical and conceptual approaches to drawing.
• Functional knowledge of the history of drawing.
Each outcome is marked in the department's matrix against the specific courses in which it is addressed.

Assessment Matrix – Drawing
Courses: 311, 312, 313, 315, 411, 412, 413, 415
All outcomes in all courses are assessed using an assessment documentation sheet (blanket assessment for all outcomes and courses).
Assessment Grid – Art Education Program
Courses: 101-105 (Foundations); 201-202 (Theory); 330-331 (Art History); 390, 395, 490 (Art Methods); 305-370 (Studio)

Personal Qualities (Disposition)
• The potential to inspire others and to excite the imagination of students, engendering a respect and desire for art and visual experiences.
• The ability and desire to seek out, evaluate, and apply new ideas and developments in both art and education.
• The ability to maintain positive relationships with individuals and various social and ethnic groups, and empathize with students and colleagues of differing backgrounds.
• The ability to articulate and communicate the goals of an art program to pupils, colleagues, administrators, and parents in an effective and professionally responsible manner.
ASSESSMENT: Art 390 – Teaching portfolio (collection of educational experiences, volunteering, resume, personal art, etc.)

Art Competencies
• Studio art – the prospective art teacher must be familiar with the basic expressive, technical, procedural and organizational skills, and conceptual insights which can be developed through studio art and design experiences. Instruction should include traditional processes as well as newer technological developments in environmental and functional design fields.
ASSESSMENT: Portfolio Review
• Prospective art teachers must be able to make students emphatically aware of the all-important process of artistic creation from conceptualized image to finished art work.
ASSESSMENT: See Studio Course Assessments
• Art History – the prospective teacher must have an understanding of (1) major styles and periods of art history, analytical methods, and theories of criticism; (2) the development of past and contemporary art forms; (3) contending philosophies of art; and (4) fundamental and integral relationships of all these to the making of art.
ASSESSMENT: Grades from Art Survey Courses and the scores on the Praxis II Art Content Test
• Advanced Work – the student has the opportunity for advanced work in at least one or more studio and/or art application areas.
ASSESSMENT: Student Show, Scholarships, Exhibitions, etc.
• Technical Processes – the prospective art teacher should have functional knowledge in such areas as the physics of light, chemistry of pigments, the chemical and thermal aspects of shaping materials, and the basic technologies involved in printmaking, photography, filmmaking, and video.
ASSESSMENT: Portfolio Review and Studio

Each outcome is marked in the grid against the specific courses in which it is addressed.
Student Outcomes – Art Education Program, K-12 Licensure
Courses: 101-105 (Foundations); 201-202 (Theory); 330-331 (Art History); 390, 395, 490 (Art Methods); 305-370 (Studio)
The K-12 licensure grid maps the same Personal Qualities (Disposition) and Art Competencies outcomes listed above (studio art, the process of artistic creation, art history, advanced work, and technical processes) against these courses.
Art education methods courses should be taught by faculty who have had successful experience teaching art in elementary and secondary schools and who maintain close contact with such schools.
Dr. Kathryn Gainey, Associate Professor of Art Education, has over 25 years of experience
teaching K-12 art education.
Institutions should encourage observation and discussion of teaching prior to beginning formal study
in teacher education, whether at the freshman or at the more advanced level.
The student organization, The Future Art Educators, connects students immediately with teaching
and volunteering opportunities in classrooms. All methods courses in the art department provide
opportunities for observation and discussion on the teaching of the arts. The College of Education
provides field experiences and student teaching experiences for art education majors.
Supervised practice teaching opportunities should be provided in actual school situations. These
activities, as well as continuing laboratory experience, must be supervised by qualified art teachers.
Candidates for certification for kindergarten through high school ideally should have a period of
internship at both elementary and secondary levels and should be given substantial responsibility
for the full range of teaching and classroom management in these experiences.
Field experiences and student teaching are completed in both the elementary and high school
settings. Students are placed with certified visual art teachers who have had at least three years of
successful teaching experience.
Institutions should encourage ongoing professional studio involvement for art teachers
The Art Student Union and the Future Art Educators encourage students to continue to make and
exhibit their art. Local and state venues for exhibition are provided through these organizations
and other arts organizations.
Institutions should establish specific evaluation procedures to assess student progress and
achievement. The program of evaluation should include an initial assessment of student potential for
admission to the program, periodic assessment to determine progress throughout the program, and
further assessment after graduation. It is recommended that a college supervisor be enabled to make
at least two monthly visits during the internship to conduct individual conferences with the student
teacher and confer with cooperating school personnel
Transition Points and Assessments:
Transition Point 1
Admission to the Art Education Program - First Assessment
• Candidates must complete a Portfolio Review and Interview (after completing art Foundations).
• Candidates must complete the Major Application Form for the Art Major. The signature from the Art Education advisor is required.
• A grade point average of 2.5 is a prerequisite. Transfer credits must be identified and approved by the Art Department Chairperson on the Transfer Course Form.
• Course substitutions or changes must be identified on the Art Major/Minor Change Form.
• The PPST test must be completed before beginning the professional education sequence of courses or the art education courses.
Transition Point 2
Knowledge, Skills, and Dispositions - Midpoint Assessment
• The midpoint assessment is defined as the period of time beginning with the completion of the Major Application Form and acceptance into the art education program. During this time, students complete the professional education sequence prior to student teaching.
• Students must maintain the 2.5 grade point average.
• Art methods courses Art 390 (Visual Arts in the Secondary School) and Art 490 (Ethnic and Folk Art for Educators) are courses that the Art Education Major is required to complete during this transition. Assessment rubrics have been developed for assessment during this transition based on knowledge, skills, and dispositions. (A similar form is also used within the College of Education for assessment.)
• During this timeframe students will be required to successfully complete professional portfolios showcasing evidence in fulfilling the standards required by the Minnesota Board of Teaching.1
Transition Point 3
Student Teaching & Licensure Assessment
• Culminating Assessment – Student Teaching
It is during student teaching that the College of Education Supervising Teacher, the Cooperating Teacher, the student teacher, and the art education advisor make the final assessments. The Art Education advisor will make at least one contact with the SCSU supervisor during this period of time to communicate student progress and to provide interventions if needed.
• The Praxis II art content test can be taken after all art courses are completed. The Praxis II Pedagogy test should be taken midway through the student teaching experience. These tests are required for Minnesota Teaching Licensure.
• By the end of this transition point the student should have completed all of the necessary paperwork for graduation, completed the Praxis II, and filed for Minnesota Teaching Licensure.
Transition Point 4
Post-Graduate Follow-up and Program Assessment
• After graduation, each graduate in Art Education will be contacted to fill out a survey. The purpose of this survey is to assess the course work for teaching preparation at the University.2
• Email correspondence
• Departmental surveys
• Information obtained through the College of Education and Career Services
1. Students who do not successfully complete coursework or portfolio reviews will be referred to the Professional Concerns Committee.
2. See survey.
ASSESSMENT – Graphic Design Program
Courses: 220, 221, 320, 321, 322, 420, 421, 422, 444

Student learning outcomes:
• Solving communication problems: problem identification and research, information gathering, analysis; evaluation of outcomes
• The ability to describe and respond to the audiences and contexts which communication solutions must address, including recognition of the physical, cognitive, cultural, and social human factors that shape design decisions
• The ability to create and develop visual form in response to communication problems
• An understanding of principles of visual organization/composition and information hierarchy
• Understanding of symbolic representation, typography, aesthetics, and the construction of meaningful images
• An understanding of tools and technology, including roles in the creation, reproduction, and distribution of visual messages; relevant tools and technologies include, but are not limited to, drawing, offset printing, photography, and time-based and interactive media (film, video, computer multimedia)
• An understanding of design history, theory, and criticism from a variety of perspectives, including those of art history, linguistics, communication and information theory, technology, and the social and cultural use of design objects
• An understanding of basic business practices, including the ability to organize design projects and to work productively as a member of teams

Assessment activities mapped to these outcomes and courses in the program's matrix include: notebooks and research for an identity system; a graphic design history report; research reports on projects; personal projects and revisions for the portfolio; presentation reports on vocations, history, and issues; web site, logo, poster, game, and brochure projects; one-on-one portfolio reviews with AIGA MN and other exchanges with designers; projects (logo, newsletter, website, poster) for a non-profit organization; navigation, text, and image use for a website project; a personal identity system (stationery, logo, card); logo design and use of type; work in applications including Photoshop, Illustrator, Dreamweaver, Flash, and InDesign; course readings (Thinking with Type; How to Be a Graphic Designer; readings on the theory of information system design); collaboration on research; visiting speakers (a game designer and graphic designers) with discussions on design practice; and an assessment documentation sheet used for all courses.

Student Outcomes – Graphic Design Program
Courses: 220, 221, 320, 321, 322, 420, 421, 422, 444
The student outcomes matrix maps each of the outcomes listed above to the specific courses in which it is addressed.
COMMUNICATION STUDIES (Wendy Bjorklund)
1. We completed analysis of our direct measure of student learning in the Public
Speaking unit of CMST 192 (data collected Spring, 2005). That analysis
showed that we are achieving our desired learning outcomes for the unit via
all delivery methods for that course. We completed analysis for the indirect
measure of the interpersonal unit for that course, and you will receive that
report by the end of this semester. It is unlikely that the analysis for the
indirect measures of student learning for the small group and public speaking
units will be completed by then. Given the degree of difficulty in doing this
analysis, we will be utilizing direct measures for all three units of CMST 192
going forward.
2. We have agreed to a set of student learning outcomes for CMST 192 as a department,
and the assessment committee will submit a plan for utilizing direct measures
for assessing those outcomes to the department this spring. We will collect
data for assessment purposes in CMST 192 during 2006-2007.
3. We have revised the student learning outcome(s) for our UDWR course and it has
been approved by the department. We have submitted a rubric for assessing
writing in those classes to the department and they should vote on it during
our next departmental meeting (Wednesday, April 19). If approved, we will
be assessing the writing from our UDWR courses in 2006-2007.
4. We have submitted a first draft of student learning outcomes for our 36 credit major
to CAP (our curriculum committee), which asked us to make some revisions.
We have made revisions and CAP will be meeting to discuss them within the
next two weeks. Once they have met with CAP's approval, they will go before
the department for its approval before the end of the school year.
COMMUNICATION SCIENCES AND DISORDERS - CSD (formerly CDIS) (Monica Devers)
Assessment Tools
Throughout our program, we are involved in the ongoing assessment of our students to improve their learning. Some
examples of formative assessments in our program include:
• Midterm exams
• Quizzes
• Oral presentations
• Written assignments/papers
• Demonstration of ability to use clinical software or equipment
Examples of summative assessments in our program include:
• Comprehensive examination data
• Internship supervisor feedback on the WPACC
• Undergraduate survey data
• Graduate survey data
• Employer data – collected 1 year post-graduation
• Alumni surveys
In 2005-06, we have been involved in multiple efforts, and I am happy to report that our faculty is completely supportive
of our assessment plans to document the progress we are making and to identify areas in need of further
improvement.
2005-06 Assessment Projects
1. Recruitment and retention
We have continued our efforts to improve student recruitment and retention. This year, there have been multiple
activities on and off-campus.
• Better Hearing and Speech Month, May 2005
• Pairing of first-year and second-year students in CDIS 130 with graduate students and faculty. Multiple on-campus activities completed related to speech, language and hearing screenings, presentations and
awareness.
• Health care careers fair, St. Cloud Hospital, 9/05
• One day spent at the hospital informing the community and high school students about our professions.
• CDIS Career Awareness Day, 10/05
• Invitations sent to all first-year students in CSD 130 and to all high school counselors in the region. One day
devoted to meeting students interested in our major, providing lunch, meeting with families, touring our
facilities.
• Gear-Up Days, SCSU, 10/05
• On-campus day-long program to inform first-year students about our major.
• Presentation to High School students in St. Cloud, Spring semester 2006
• Continued assessment of first-year students in CSD 130 Introduction to Speech Pathology and Audiology
Recruitment and Retention Data
I have listed below the enrollment in our major classes that are typically taken in the junior year. You will note that
these class sizes have increased, in large part due to our recruitment efforts in CSD 130, our introductory course.
Course                           2004-05    2005-06
CSD 220 Phonetics                   40         57
CSD 324 Speech Science              19         44
CSD 460 Lang Dev.                   13         39
2. Assessment of upper division writing
In spring 2005, Dr. Devers and Dr. Rangamani applied for and were awarded an Assessment Grant. The purpose of
this grant was to implement systematic formative and summative assessment of the writing skills in our undergraduate
program. We will be tracking the writing skill development of these students and will be following that development
when some of them enter our graduate program.
3. Expansion of clinical opportunities
This year, based on our assessment of the clinical opportunities available for our students, we have developed a
Fluency group on campus. It is run by Cindy Lofton.
The mission of the Fluency group is to provide support, opportunity and socialization for people who stutter in a safe
and open environment where they can discuss feelings and attitudes regarding dysfluencies with peers and
professionals, develop awareness, and build confidence through appropriate techniques and strategies.
Clinical Assessment
The majority of our clinical educators use the Wisconsin Procedure for Appraisal of Clinical Competence (W-PACC) to
evaluate the performance of student clinicians in our clinic. These evaluations are typically done at mid-semester, and
again at the end of the semester. Some clinical educators also have students do a self-evaluation using the same
form. Student feedback on the use of this form has been positive.
Client Satisfaction/ Caregiver Satisfaction with service at the Speech-Language-Hearing Clinic
Every semester, we assess the clients or the parents/caregivers of our clients, on the services they receive in our
clinic. These ratings are based on a 5 point scale.
Scheduling
Clinician
Promptness
Client Benefits
Support Staff
Professionalism
of Clinician
Organization of
Clinician
Knowledge of
Clinician
Knowledge of
Supervisor
Explanation of
Services
Health/Safety
Precautions
Building
Accessibility
Parking
Clinic
Appearance
Therapy room
Length of
Therapy
Sessions
Client's Program
Overall
Satisfaction
Recommendation
to Others
SPRING 2005
FALL 2005
SPRING 2005
FALL 2005
Client
Satisfaction
4.60
Client Satisfaction
4.58
Parent/caregiver
Satisfaction
4.86
Parent/caregiver
Satisfaction
4.7
4.60
4.60
4.60
5.00
4.83
4.67
5.00
4.85
4.86
5
4.8
5
4.60
4.83
5.00
5
4.60
4.92
4.93
4.9
4.40
4.83
4.86
4.9
4.60
4.83
5.00
5
4.60
4.83
4.79
5
4.56
4.83
4.83
5
4.44
4.56
4.75
4.25
4.86
4.62
5
4.3
4.60
4.33
4.75
4.83
5.00
4.79
4.9
5
4.60
4.60
4.92
4.83
4.93
4.62
5
5
4.60
4.92
5.00
4.9
4.60
5.00
4.86
5
Praxis Examination Results
The Praxis Examination in Speech-Language Pathology and Audiology is a summative assessment of student learning
in the Communication Disorders field. Successful completion of the test is necessary in partial fulfillment of the
requirements for the national certificate of clinical competence in speech pathology and audiology.
Speech-Language Pathology (ETS Testing Cycle: Current Year, 2004-2005)
Program Data – Number of students taking exam: 3
Program Data – Number (and %) passed*: 3 (100%)
Graduation/Program Completion Rates
Here are our graduate program completion rates—within the expected time frame identified by us for 2004-05.
Program Completion Rate (%):
  Audiology: —
  SLP: Current Year 94; Prior Year 92; 2 Years Prior 100; Average for three years* 95.3%
Employment Rate in Profession (%):
  SLP: Current Year 100%; Prior Year 100%; 2 Years Prior 100%; Average for three years* 100%
Employment/Job Placement Rates
Alumni Surveys
Each year, we send a survey to recent graduates of our program, asking them to evaluate the education they received
and other aspects of the department. In addition to rating the education they received, we ask our alumni to provide
feedback on the department, SCSU, and the quality of their graduate education.
This year, the highest ratings were given for the statements “off campus internship placement provided valuable
learning experiences” and “CDIS Master’s degree was a good value” with average ratings of 4.8 on a 5 point scale. A
difference between this year and the previous year was noted for “received an adequate variety of on-campus clinical
experience.” Last year, there was an average rating of 2.75 and a 4.0 was given in 2003-2004.
As in previous years, the lowest ratings were given to "the CDIS department had adequate physical space/facilities."
Overall, the average ratings for this year were 0.37 higher than the previous year.
Supervisor/employer ratings of our alumni
Supervisors/employers evaluated the CSD graduates one year post-graduation. The highest ratings were given to
“adheres to professional standards and state/local regulations in implementing a program,” “effective in screening
hearing and assisting with management of hearing impaired clients,” “conducts in-service for staff as
needed/requested,” “conducts professional responsibilities in an ethical manner,” “personally uses appropriate and
correct speech and language,” “meets professional responsibilities and deadlines in a timely manner,” and “interacts
with professional colleagues and administrators in a positive manner” with average ratings of 5.0.
There was a large difference noted between this year and the previous year for “effective in screening hearing and assisting
with management of hearing impaired clients.” Last year, this statement received an average rating of 1.67 and this
year, employers provided an average of 5.0. A difference was also noted for “conducts in-service for staff as
needed/requested.” An average rating of 1 was obtained last year and a 5.0 was noted this year.
The lowest rating was given to “integrates speech-language-hearing services with inter-disciplinary collaborations” with
a rating of 4.4. Overall, the average ratings for this year were 0.80 higher than the previous year.
ENGLISH (Raymond Philippot)
The Department of English has undertaken several assessment-related steps in preparation for the Higher
Learning Commission’s visit in 2007. Below are brief snapshots, by program area, of the assessment plans
and actions thus far:
BA (Literature):
Learning outcomes have been filled in
Matrices for the outcomes have been written
Learner outcomes have been identified within courses
Learner outcomes have been identified by assignments within courses
New UDWR data and statement are available
(Matrices are attached)
BA (Rhetoric and Applied Writing):
In 2004, the department began discussions regarding the possibility of reshaping this program;
however, for various reasons the process stalled, but the composition and rhetoric faculty has rejuvenated
talks. To date, programmatic learner outcomes have not been formalized, largely because it’s a widely held
belief that the program will be reformed, and it does not make sense to establish outcomes for an “in flux”
course of study. Faculty will have learner outcomes in place by fall of 2006.
BA (Linguistics):
Learning outcomes are in place, though changes will be made for fall of 2006
The program (BS in teaching) will undergo Minnesota Board of Teaching review in 2006
The program (BS in teaching) will undergo NCATE review in 2007
BA (Creative Writing):
Learning outcomes have been formalized
Matrices for the outcomes have been filled in
Learner outcomes have been identified within courses
Learner outcomes have been identified by assignments within courses
(Matrices are attached)
BS (Communication Arts and Literature):
Learning outcomes have been formalized
Matrices for the outcomes have been filled in
Learner outcomes have been identified within courses
Learner outcomes have been identified by assignments within courses
The program will undergo MN Board of Teaching review in 2006
The program will undergo NCATE review in 2007
(Matrices are attached)
MA (General/TESL/English Education):
Graduate faculty revised mission statement
Assessment plan developed in summer/fall of 2005
Rubric established to examine student writing (self-assessment)
(See attachment)
Intensive English Center:
Assessment plan is in place
Each level is assessed annually
Analysis of level assessments results in curriculum review when appropriate
College ESL:
Assessment plan is in place
Each level is assessed annually
Analysis of assessments used to re-evaluate program
The Write Place:
Assessment plan is in place
Goals for the Center have been developed
Mission statement has been rewritten
Learner outcomes have been identified within classes
Learner outcomes have been identified by assignments within classes
Data sources have been identified
Data are being collected (ongoing)
(Matrices and goals statements are attached)
FOREIGN LANGUAGES AND LITERATURES (Michael Hasbrouck)
The following information is for the French, German and Spanish Programs, BA and BS.
Mission: The Department of Foreign Languages and Literature works to provide students with outstanding
language preparation for future language studies, as well as careers in business, teaching, and other
enterprises where international awareness and language proficiency are important. We serve a significant
role in promoting cultural diversity at SCSU and advancing external relationships with campuses and
institutions around the world. We sponsor and promote outreach and support to schools in the community
and the state.
Assessable Learning Outcomes: We have assessable learning outcomes. We base our learning outcomes
on the language proficiency scale created by the American Council for Teachers of Foreign Languages
(ACTFL). The desired assessable learning outcomes of the French BA are as follows:
• Speaking at the Intermediate-High level on the ACTFL scale.
• Writing at the Intermediate-High level on the ACTFL scale.
• Listening at the Intermediate-High level on the ACTFL scale.
• Reading at the Intermediate-High level on the ACTFL scale.
Assessment Plan: We will assess the learning outcomes through the Senior Project and using assessment
tools in the classroom in the 400 level courses. Assessment of learning outcomes for the BA is more
informal than in the BS where the State mandates the use of a standardized assessment tool such as the
Praxis II exam. Even though all the professors in our department are familiar with the ACTFL Proficiency
scale and nearly all of us have participated in seminars and workshops on assessing linguistic proficiency
using the scale, we currently have no ACTFL certified tester. Becoming ACTFL certified is a laborious and
costly venture. Given sufficient financial support and workload reduction from the SCSU and/or MnSCU
administration, many of us would be happy to pursue ACTFL certification. We currently use the data from our assessment
informally to adjust our program as needed.
Assessment Tools: We have a number of assessment tools at our disposal that will assess the desired
learning outcomes.
• Written exam (Ex)
• Oral Presentation (OPr)
• Oral Interview (OI)
• Classroom Participation (CP)
• Spontaneous Writing Evaluation (SW)
• Written Essay (WE)
• Research Paper (RP)
• Listening Comprehension Evaluation (LC)
• Dictation (D)
• Reading Comprehension Evaluation (RC)
• Senior Project (SP)
Results: Our current concrete assessment data comes in the form of the Senior Project Portfolio, which
demonstrates written and reading proficiency. We currently have no concrete assessment data on oral or
listening proficiency. We base our assessments on the use of oral and listening assessment tools in the
classroom at the 400 level.
BS Programs in French, German and Spanish
Assessment Plan: We will assess the learning outcomes through the Senior Project and using assessment
tools in the classroom in the 400 level courses. Assessment of learning outcomes for the BS is more
formalized with the State mandating the use of a standardized assessment tool such as the Praxis II exam.
Even though all the professors in our department are familiar with the ACTFL Proficiency scale and nearly
all of us have participated in seminars and workshops on assessing linguistic proficiency using the scale, we
currently have no ACTFL certified tester. Becoming ACTFL certified is a laborious and costly venture.
Given sufficient financial support and workload reduction from the SCSU and/or MnSCU administration, many of us
would be happy to pursue ACTFL certification. We currently use the data from our assessment informally to
adjust our program as needed.
Assessment Tools: We have a number of assessment tools at our disposal that will assess the desired
learning outcomes. The BS uses the Praxis II exam as its definitive assessment tool.
• Written exam (Ex)
• Oral Presentation (OPr)
• Oral Interview (OI)
• Classroom Participation (CP)
• Spontaneous Writing Evaluation (SW)
• Written Essay (WE)
• Research Paper (RP)
• Listening Comprehension Evaluation (LC)
• Dictation (D)
• Reading Comprehension Evaluation (RC)
• Senior Project (SP)
• Praxis II Exam
Results: The Department is in the process of gathering data on our students' success with the Praxis II exam.
How assessment data are used for program improvement: We use assessment for program improvement
informally. As language teachers we all know that time abroad in a country where the target language is spoken
is far and away the most important factor in our students' achieving the desired learning outcomes of a BS.
Attempts at program improvement will have little success unless increasing students' time abroad is the
centerpiece of those efforts.
MASS COMMUNICATIONS (Roya Majid)
This year we developed a new entry/exit test that measures graduates' gains in theoretical knowledge relative
to when they entered the program. The test was administered to 130 incoming intended majors in January 2006
and will be administered to graduating students in late April 2006. We are interested in the difference between
the two groups' mean scores.
MUSIC (Tim Wollenzien)
We began the year by presenting an Assessment Audit to all faculty of the Music Department at our annual
retreat prior to the beginning of Fall Semester. We had developed the audit instrument based on student
learning outcomes, which we had identified based on competencies from one of our accrediting bodies,
National Association of Schools of Music (NASM). We had nearly 100% participation and cooperation
from department faculty! We were moving towards evaluating trends and areas of focus from the data
collected through the Assessment Audit, and determining the next steps that we needed to take as we
assessed student learning outcomes in courses across our curriculum.
In December, the focus of the Assessment Committee shifted to preparing for the NCATE/BOT accreditation
process, which significantly impacts our department. We began an examination of transition points for
students throughout our programs, and realized that those needed attention and revision. As we began
working on the transition points, we discovered that we also needed to look at how we assess student
learning at the transition points. This led us to re-visit the process we had begun in the fall, and to look at the
creation of rubrics across the curriculum. We presented a successful workshop on rubric creation to our
faculty at a meeting in December, and another one in early March. Our committee members and many other
faculty members have become comfortable with writing and using rubrics, and students have also been
actively involved in the creation of some. Informal feedback indicates that students will welcome rubrics at
transition points and within courses, since expectations will often be much clearer.
We have completed Transition Point Flow Charts for our undergraduate majors (B.A., B.M., and B.S.), and
are awaiting full faculty approval. We are nearly finished creating rubrics for jury examinations (which all
performers take at the end of each semester), Junior and Senior Recitals, and the Piano Proficiency (required
of all majors).
The members of the assessment committee feel that we have had a successful year and that we have made a
great deal of progress in the assessment process. We feel that we are fairly well positioned to address the
requirements we will be expected to document for the accreditation processes in the coming years. We
anticipate showing rubrics that are being used to assess student learning in the competencies placed in
courses that our music education majors are required to take. Since the majority of our majors are music
education (B.S.), most of our courses are affected. We feel that this makes the process of department-wide
assessment a reasonable goal.
PHILOSOPHY (Carolyn G. Hartz)
We have results from three of our instruments this year: the comparative writing analysis, alumni
surveys and the senior comprehensive exam.
I. Program learning objectives
Our program learning objectives are divided into three categories:
A. Coherent comprehension of content (history of philosophy, ethics, metaphysics, epistemology, logic)
B. Thinking skills that philosophy is particularly suited to developing (analytical
abilities, critical thinking, abstract/conceptual thinking)
C. Attitudes (valuing self-understanding and the examination of one’s life, a
reflective habit of mind, love of learning, intellectual autonomy)
A: Coherent comprehension of content
1. Students will explain views of the major philosophers of the main historical
periods: Ancient/Medieval, Modern, and Contemporary, and/or describe
relationships (such as distinctions, similarities, indebtedness, and other
connections) among them.
2. Students will explain representative major issues and theories in ethics.
3. Students will explain representative basic metaphysical issues and theories.
4. Students will explain representative major epistemological issues and theories.
5. Students will explain representative fundamental concepts of logic.
B: Thinking skills that philosophy is particularly suited to developing
1. Students will analyze concepts, arguments, issues, theories, and/or views.
2. Students will critically evaluate concepts, arguments, issues, theories, and/or
views.
3. Students will use abstract concepts.
C: Attitudes
1. Students will value self-understanding and the examination of one’s life.
2. Students will gain a reflective habit of mind.
3. Students will increase their love of learning.
4. Students will exhibit intellectual autonomy.
Levels of performance in terms of which student attainment of each of these goals will be assessed:
High: The student has attained a level of competence appropriate for pursuing graduate
work in philosophy.
Middle: The student has attained a level of competence appropriate for pursuing other
graduate or professional education for which philosophical training is of value.
Low: The student has attained a level of competence at most minimally greater than that of
a control group of non-philosophy students.
N/A: The student's work either did not engage the objective in question or provided
insufficient basis for judging its level of competence.
II. Comparative Writing Analysis
The writing analysis compares students' written work from our senior seminar (which also fulfils the
upper division writing requirement) with that of a control group from a 200-level History of Philosophy
class. The former course is normally taken by students at the end of their program, and the latter by students
at the beginning of their program. I therefore call those in the latter group "pre-program" students, and those
in the former group "post-program" students. This comparison allows us to gauge the students' improvement
with respect to the skills in group B (1-3) and objective C 2, 3 and 4. The pre-program group contained 13
students, and the post-program group contained 4 students. In the following table, for each learning
objective the “pre” column gives the percentage of the pre-program group that scored each level, the “post”
column gives the percentage of the post-program group, and the “change” column gives the change in the
percentage points. The results were as follows:
                 B1                        B2                        B3
        Pre    Post   Change      Pre    Post   Change      Pre    Post   Change
H:      23.1   75.0   +51.9       23.1    0.0   -23.1       23.1   25.0    +1.9
M:      46.2   25.0   -21.2       38.2   50.0   +11.8       76.9   75.0    -1.9
L:      30.1    0.0   -30.1       23.1   25.0    +1.9        0.0    0.0     0.0
N/A:     0.0    0.0     0.0       15.2   25.0    +9.8        0.0    0.0     0.0

Learning Objectives B averages
        Pre    Post   Change
H:      23.1   33.3   +10.2
M:      53.8   50.0    -3.8
L:      17.7    8.3    -9.4
N/A:     5.1    8.3    +3.2

                 C2                        C3                        C4
        Pre    Post   Change      Pre    Post   Change      Pre    Post   Change
H:      30.1   75.0   +44.9       30.1   50.0   +19.9       38.2    0.0   -38.2
M:      61.5   25.0   -36.5       61.5   50.0   -11.5       46.2   50.0    +3.8
L:       7.7    0.0    -7.7        7.7    0.0    -7.7       15.2   50.0   +34.8
N/A:     0.0    0.0     0.0        0.0    0.0     0.0        0.0    0.0     0.0

Learning Objectives C averages
        Pre    Post   Change
H:      32.8   41.6    +8.8
M:      56.4   41.6   -14.8
L:      10.2   16.6    +6.4
N/A:     0.0    0.0     0.0

Learning Objectives overall averages
        Pre    Post   Change
H:      28.0   37.5    +9.5
M:      55.1   45.8    -9.3
L:      14.0   12.5    -1.5
N/A:     2.6    4.2    +1.6
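For reference, each cell above is the percentage of a group (pre- or post-program) scoring at a level, and the change column is simply the post percentage minus the pre percentage, in percentage points. The following is a minimal sketch of that arithmetic; the counts in it are hypothetical placeholders, not the department's actual scores.

    # Minimal sketch (hypothetical counts, not the department's data): converts
    # raw H/M/L/N-A counts into the percentage and change columns shown above.

    def distribution(counts):
        """Percentage of the group scoring at each level."""
        total = sum(counts.values())
        return {level: round(100 * n / total, 1) for level, n in counts.items()}

    # Hypothetical raw counts for one learning objective
    pre_counts = {"H": 3, "M": 6, "L": 4, "N/A": 0}   # 13 pre-program papers
    post_counts = {"H": 3, "M": 1, "L": 0, "N/A": 0}  # 4 post-program papers

    pre_pct = distribution(pre_counts)
    post_pct = distribution(post_counts)

    # "Change" is the difference in percentage points (post minus pre)
    change = {lvl: round(post_pct[lvl] - pre_pct[lvl], 1) for lvl in pre_pct}

    for lvl in ("H", "M", "L", "N/A"):
        print(f"{lvl:4} pre {pre_pct[lvl]:5}  post {post_pct[lvl]:5}  change {change[lvl]:+5}")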
Comments
Unfortunately, these data are not very significant as the post-program group contained only four
students. That said, there was a large improvement in their analytical abilities (B1) and in their
reflectiveness (C2). Very little change was apparent in their ability to use abstract concepts (B3). Their
ability to critically evaluate (B2) went down somewhat and their “love of learning” went up somewhat, but
again with only four students in the post-program group it is hard to say what this means. Our revised C4
objective, intellectual autonomy (which replaced “intellectual integrity”), had a pretty bad showing, although
this can be a function of writing assignments as well.
As always, the C category of objectives, comprising attitudes rather than skills, calls for some
caveats. First, these are much harder to gauge than skills, and hence the assessments are rather uncertain.
Second, attitudes are less apt to change over the course of the program; students are more apt to come to the
program with these attitudes. Indeed, their having some of these attitudes may be partly responsible for their
choosing a philosophy program to begin with. We know from exit interviews that many students perceive
themselves as having come to the program with these attitudes already in place and hence see the program
itself as not having cultivated them.
It would be helpful if next year we had a larger pool of both pre- and post-program writing to assess.
III. Alumni Surveys
Here are the data and commentary (in bold) from the surveys of students who graduated from 1994-2004. We sent out 76 surveys and received 16 back. Unfortunately, this is down substantially from the last time we
surveyed our graduates (34). Numbers next to categories indicate frequency of response.
1. What gender are you?
Male 9   Female 7
We seem to have made progress here—last time 82% were male.
2. Age? 21-25: 6   26-30: 4   31-35: 4   36-40: 1   over 40: 1
3. Race? Caucasian/White 12   African American/Black 2   American Indian/White 1   “Normal” 1
4. Were you a Philosophy major or Philosophy minor? a) major 9   b) minor 7
5. What other areas, if any, did you major and/or minor in?
Major(s): Sociology 3; English/English Lit./Creative Writing 3 (1 each); Psychology 1; Anthropology 1; Criminal Justice 1; Political Science 1
Minor(s): Religious Studies 6; Psychology 1; Women’s Studies 1; History 1
Religious Studies seems to be a strong interest among our students. It’s no surprise that several
majored in English, but it’s remarkable that all the other majors/minors cited are in the Social
Sciences. Should we be strengthening our connections with programs in the social sciences, or perhaps
strengthening our connections with other programs in our own college? (Or are social science
students simply more inclined to respond to surveys?)
However, these results are not out of line with past results. This time, 47% of the majors/minors cited
were in Humanities areas, while 53% were in the Social Sciences. Last time the survey was done, only
30% of them came from the Humanities, while 66% came from the Social Sciences. (The remainder
were from Science and Technology).
6. What year did you graduate?
’92 ’93 ’94 ’95 ’96 ’97: 1, 2, 3, 1
’98 ’99: 1
’00:
’01: 1   ’02: 2   ’03: 2   ’04: 3
7. What was your GPA?
3.51-4.00: 9   3.01-3.50: 7   2.51-3.00: 0   2.01-2.50: 0   0-2.00: 0
We are evidently attracting good students, although again the better ones may be more inclined to
respond to a survey. (Also, the University average GPA for graduates in the last ten years is 3.14.)
Last time the survey was done, half were above 3.5, 31% were in the second tier, and 19% were in the
third.
8. What is the highest academic degree you have earned, and in what field?
a) undergraduate degree (BA or BS?): BA 8
b) MA 6. Fields: Philosophy 2, Sociology 1, Counseling 1, English 2. Institutions: Northern Illinois U., Colorado State U., MN State U-Mankato, U. Wisconsin-Oshkosh, SCSU, UC-Riverside
c) MS 1. Field: Criminal Justice. Institution: SCSU
d) Ph.D. 1. Field: English. Institution: University of Nebraska
e) other (please specify): 0
9. Are you employed? If so, how would you describe your primary job or position?
Two said no, three said Graduate Assistant, and one each said (10 total) the following:
Instructor of Philosophy; Account analyst; College professor of English and Women’s Studies; Financial services; Instructor of Composition and ESL; Director of loss prevention; Youth pastor; Restaurant manager; Martial arts teacher; Barista in coffee shop
10. If you are not employed, are you in graduate school? If so, what are you studying (what program are
you in) and at what institution?
One each said the following:
Ph.D., Philosophy, University of Florida
Philosophy (no degree specified), Northern Illinois University
Ph.D., Sociology, South Dakota State University
Ph.D., Sociology, York University (Toronto)
Sociology (no degree specified), Bowling Green State University
Psy.D., Psychology, Chicago School of Professional Psychology
M.Div., Theology, Bethel Seminary
A healthy number of respondents have gone on to further study (not necessarily in Philosophy). Four
of the seven Masters in #8 above are in doctoral programs. Three of our graduates in the last ten
years teach in academic settings (see #9) and several more are evidently headed that way. Those who
are outside academia, as usual, are all over the map employment-wise.
In the last survey, 15% were in the legal profession, and the same number taught in higher education,
and the rest reported a wide variety of types of employment (much like this time). The undergraduate
degree was the highest earned for about half the respondents on both surveys. This time around, 44%
of the respondents reported being in graduate school, compared with 9% on the previous survey. This
may be due to the fact that the previous survey covered graduates from as far back as 30 years prior to the
survey, while the current one only goes back 10 years. Still, 63% of the current respondents are either
in graduate school or have completed it and are employed academically (nearly a third of these are in
Philosophy, or almost a fifth of all respondents), while the last time 24% were in graduate school or
employed academically.
11. If you started but did not complete a graduate program, what field was it in and at what institution?
Why did you leave?
One each said the following:
Left Criminal Justice MA at SCSU because semester conversion would lengthen time
for completion.
Offered MA/Ph.D. package in English; finished MA but left because field is
“supersaturated” with Ph.D.s and because of health problems.
12. If you are in graduate school now or have attended in the past, did you take any time off between
graduating from SCSU and starting graduate work? If so, how long, and what did you do in the meantime?
No 5
Worked in loss prevention 1
One year; traveled around the world 1
One year; nannying and mentoring youth 1
One year; worked and looked at grad programs 1
VISTA volunteer and instructor of Ethnic Studies 1
Evenly split between going on directly and taking time off.
13. Which of the following best describes how much significance your study of Philosophy has had in your
life after college?
a) a lot 9   b) some 5   c) not much 2   d) none at all 0   e) unsure 0
14. In making the judgment you made in question #13, how much importance or weight did you attach to
the following factors? Please rank the top three, 1 to 3, in descending order with 1 meaning the most
importance. (The columns list how many rated each as (1), (2) and (3). The last column lists how many rated
each in their top 3.)
Factors: a) your employment and/or income; b) the content and quality of what you learned; c) your personal growth or satisfaction; d) living a good life; e) the quality of your thinking or reasoning skills; f) using philosophical skills currently; g) participating in active citizenship; h) other (explain): prep for grad school, relationships
Counts by rank: rated (1): 2, 7, 6, 2; rated (2): 1, 3, 4, 4, 3, 2, 2; rated (3): 3, 5, 2, 2, 1, 2; h: 1, 1
Rated in their top 3: a) 6   b) 8   c) 13   d) 6   e) 9   f) 5   g) 4   h) 2
Personal growth and thinking skills are the winners here; active citizenship and using skills currently
were less important to the respondents.
15. What effect has your study of Philosophy had on your employment and/or income?
a) good effect 9   b) bad effect 1   c) little or no effect 6
16. What effect has your study of Philosophy had on your personal growth or satisfaction?
a) good effect 15   b) bad effect 0   c) little or no effect 1
17. To what extent has your study of Philosophy helped you to live a good life?
a) a lot 10   b) some 6   c) not much 0   d) none at all 0   e) unsure 0
18. To what extent do you currently use philosophical skills?
a) a lot 9   b) some 6   c) not much 1   d) none at all 0   e) unsure 0
19. To what extent has your study of Philosophy fostered your participation in active citizenship?
a) a lot 6   b) some 6   c) not much 3   d) none at all 0   e) unsure 1
20. On the whole, how would you rate the quality of the Philosophy Department’s teaching?
a) excellent 6   b) above average 10   c) average 0   d) below average 0   e) poor 0
21. On the whole, how would you rate the quality of the Department’s courses?
a) excellent 4   b) above average 9   c) average 3   d) below average 0   e) poor 0

88% said their study of Philosophy had at least some significance in their lives after graduation (97% last time).
94% said it had no negative effect on their employment and/or income after graduation, and 56% reported a positive effect (last time 50% said it had a positive effect and 50% said it had no effect).
94% said it had a good effect on their personal growth and satisfaction (97%).
100% said it helped them lead a good life to at least some extent (97%).
94% said they still use their philosophical skills after graduation to at least some extent, even though most are not going on in Philosophy (this question was not asked last time).
75% said it fostered their participation in active citizenship (this question was not asked last time). Several respondents noted in #42 that they use their philosophical skills in the context of political issues, and in #43 many respondents noted extensive community involvement.
100% rated the Philosophy Department’s teaching as excellent or above average (76%).
88% rated the Department’s courses as excellent or above average (76%).
22. Do you think you received sufficient education in Philosophy from this program--e.g., sufficient depth
and breadth? Please explain and/or suggest improvements.
Yes 8   No 1
There should be more: Special topics 2; Continental 2; History 1; Non-western 1; Nietzsche 1; Hannah Arendt 1; Plato & Aristotle 1; Debates between colleagues 1; Depth, less breadth 1
There seemed to be general satisfaction, and little consensus on suggestions.
23. What aspects of the program did you find most influential or important to you? Which were the least?
(We’ve deliberately left this question vague and open-ended.)
Most: Good or passionate professors 6; Small classes 2; Discussion format 2; Reading 2; Classes requiring daily reflections 1; Guest speakers 1; Logic 1; Developing writing skills 1; Broad courses (e.g., history) 1; Feeling respected and valued 1; Women philosophers as role models 1; Particular teachers 1; Wittgenstein class 1; Augustine class 1; Ethics 1; Methods 1; Symbolic logic 1; Studying Marxism 1; Epistemology 1
Least: No prep for grad school 1; Inactive philosophy club 1; Need more women students 1; Students need more thorough understanding 1
This question may need to be reworded—many respondents listed what they thought was good or bad
rather than what was important or unimportant. Again, there was little consensus, except that the
quality of the professors seems to be important to many of the respondents.
24. Do you favor a capstone experience/course for graduating seniors in Philosophy? If so, in what form?
E.g., would you have benefited from an internship, writing a thesis, or a course that addresses the field of
Philosophy (examining job opportunities, strategies for getting admitted to and surviving graduate school,
the culture of professional Philosophy, other ways of using Philosophy, etc.)?
Culture of professional philosophy or job opportunities 6
Thesis 5
Internship 2
Optional thesis 1
Writing sample for grad school application (not thesis) 1
Meeting on applying to grad school & what it’s like 1
Use book The Philosopher’s Tool Kit and emphasize that philosophy is practical 1
System of comprehensive exams 1
Capstone course 1
Discussion-focused seminar 1
Course on “what to do now” 1
No extra project beyond regular coursework 1
The new course on Philosophy After Graduation seems to fulfill a perceived need. Other than that
there was little consensus, except for a sizeable group who thought there should be a thesis. Several of
the other responses above would probably be addressed in the new course.
25. Do you think that, while you were in the program, you were part of an intellectual community of
philosophers? Were you friendly or connected with other majors and minors, and with the faculty? If so, do
you think this atmosphere was valuable? If you do not think there was this sense of community, do you
think this was a problem? Do you have suggestions about how to foster a sense of community in the
department?
Yes 10   No 4   Somewhat 1
Suggestions: regular meetings of philosophy club 2; “Aristotle’s café” 2; more mentoring for women 2; take philosophy to community 1; have debates 1; book club 1; movie nights 1; go to bars, etc. & discuss 1; student/faculty lounge 1; more engaging/encouraging advising 1; foster activities based on common interests 1
Respondents in general appreciated the atmosphere of intellectual community. Smaller class sizes and
seeing the same students in many classes helps this. Several of the suggestions above have merit—
philosophy club is up and running again, we will have a student lounge in our new digs in Centennial,
perhaps out-of-class book discussions can be encouraged. The philosophy club might think about a
Socrates’ café-style setup that might also take philosophy to the community. Two students thought
there should be more mentoring for women students—we might look into how we could do this.
26. Why did you choose to major/minor in Philosophy?
Interesting 3; Particular Instructor 3; Liked it 3; Particular course 2; Good foundation 1; Maintain life-long interest 1; Good at it 1; Fit in with major 1; No Rel. St. major-Phil. was 2nd choice 1; The way my mind thinks 1; Quest for truth 1; “Sweet Jesus, who knows?” 1
How effective do you believe the Program was at helping you cultivate the following skills and attitudes?
Please circle the appropriate number. Levels of Effectiveness:
1: very effective   2: somewhat effective   3: somewhat ineffective   4: very ineffective
Rated "1" (very effective):
27. Analytical abilities 11
28. Critical thinking 12
29. Abstract/conceptual thinking 9
30. Written expression 5
31. Text interpretation 4
32. Enjoyment of being challenged/love of learning 11
33. A reflective habit of mind 6
34. Intellectual integrity 6
35. Valuing self-understanding/examining one’s life 5
Counts of ratings "2" through "4": 4, 7, 10, 11, 3, 10, 9, 10, 1, 1, 2, 6, 1
36. Above we asked how effective the Program was at helping you cultivate these skills/attitudes. Please
comment on how important each, any, or all of these skills/attitudes are to you.
Rated important or very important: Analytical abilities 4; Critical thinking 6; Abstract/conceptual thinking 2; Written expression 4; Text interpretation 1; Enjoyment of being challenged/love of learning 3; Reflective habit of mind 1; Valuing self-understanding/examining one’s life 5; All are important 2
Other comments: More feedback on writing would have been good; More studying of current journals and arguments; Some of these were developed more in the English major
Respondents expressed general satisfaction with the program’s effectiveness at inculcating thinking
skills (#27-29 above, the B set of our program learning objectives), and more rated critical thinking
(#28) as important than any other.
During exit interviews students often comment that the program did not do much to foster the
attitudes listed in #32-35 above, but that this is because they had these attitudes when they came in—
indeed, that having these attitudes was part of their motivation to enter the program. The data above
(#32-35) indicate that they do perceive the program to have fostered these traits, albeit to a slightly
lesser extent than the B objectives. Perhaps time away from the program gives them a different
perspective.
Written expression is rated somewhat lower in terms of the success of the program at fostering it (as
was text interpretation, but only one respondent rated that as important). Exit interviews have
indicated a general dissatisfaction with writing in the program and we should address this.
37. About how many books do you read in a typical 3-month span? Can you name a few you’ve read lately?
0: 0   2-3: 5   4-5: 0   5-9: 5   10-15: 4
Examples: Locke’s Essay Concerning Human Understanding; Selected Works of Spinoza; Conditionals; Scholastic Philosophy; He’s Just Not That Into You; Misconceptions; Who Stole Feminism; The War Against Boys; Epistemic Justification; Knowledge and Lotteries; Four Dimensionalism; Time, Tense, and Causation; Routledge Guide to the Critique of Pure Reason; Mere Christianity; The Case for Christ; New Testament and People of God; Jesus Remembered; Reinventing Church: McWorld vs. Mustard Seed; Pentateuch: Paradise-Promised Land; Art of Biblical Narrative; Third Millennium Church; Theology of Revelations; Jesus Seminar; Cynic, Sage, or Son of God?; Having Faith; Reading the Bible Again for the First Time; Aeneid; Foucault’s Pendulum; Parables and Paradoxes; Exile and the Kingdom; Denial of Death; Science Fiction/Fantasy; Childbirth and Pregnancy books; Intellectual Hubris; From Beirut to Jerusalem; The Fabric of the Cosmos; Remnants of Auschwitz; Toltec Prophecies; An Invitation to Reflexive Sociology; Soul Retrieval; Teaching on Love; The Power of Now; Stillness Speaks; The Crystal Shard; Running with Scissors; House of Bondage; The Making of a Modern South Africa; Survivor; A Confederacy of Dunces; Who’s Your Caddy?; Underboss; One Magical Sunday; Nexus; Existential America; Star Wars and Philosophy
38. About how many newspapers do you subscribe to or read typically? What are they?
0: 7   1: 4   2: 1   3: 1   4: 1
Star-Tribune 3; NY Times 2; St. Cloud Times 1; Wall Street Journal 1; LA Times 1; Press Enterprise 1; Wausau Daily Herald 1; Toronto Star 1; Seattle Times/Tribune 1
39. About how many magazines do you subscribe to or read typically? Can you name a few you’ve read
lately?
0: 3   1-2: 3   3-5: 5   12-13: 1   20: 1
Scientific American; Small Arms Review; Sports Illustrated; National Geographic; Security Magazine; Golf; Time (2); Loss Prevention Magazine; Runner’s World; Newsweek; Government Security News; Outdoor Life; US News and World Report; Contemporary Sociology; Field and Stream; Atlantic Monthly (2); Martial Arts magazines; Minnesota Moments; Harpers; Sojourners; Popular Photography; Utne Reader; The Economist; American Baby; Mother Jones; Adbusters (2); E: Environmental Magazine; Money; The Door Magazine; Bitch: Feminist Response to Pop Culture; Wired; Z Magazine; Vegetarian Times; Cooking magazines
40. How often do you regularly read sources online? What are they?
0: 2   1: 1   2: 1   3: 1   6: 1   not very often: 2   sometimes: 1   weekly: 2   often: 2   daily: 3
Yahoo! Headlines for Reuters, AP, and LA Times (2); MLA online journals; msn.com (2); The Smirking Chimp; msnbc.com; Chronicle of Higher Education; CNN; Narco News; NY Times; scifi.com; Chicago Tribune; Commondreams.org; BBC World News; Psychology databases; LeMonde; National Council of Teachers of English; Fox News; Trade discussion boards; Reuters; Movie reviews; Salon.com; Various websites; New Republic
Everybody reads; some read a lot. The respondents read a lot of philosophical and academic/intellectual material, and appear to be well-informed and engaged intellectually and civically.
41. How are you intellectually active in your life? For example, do you belong to any book clubs? Do you
take courses or classes (e.g., community education, professional development, traditional college classes, etc.)
or attend lectures? Do you do any teaching or tutoring? What other learning opportunities do you take
advantage of?
Teach 7; In grad school 7; Reading 4; Mentor/tutor students 2; Discussions with friends 2; Book club 2; Took more philosophy classes after graduation 1; Online classes 1; Completed technology certifications 1; Attend lectures 1; Trade books and movies with friends 1; Attend workshops 1; Museum trips 1; Attend events on campus 1; Writing 1; Learn new languages 1; Professional conferences/training sessions 1; Studying for LSAT 1; Toastmasters 1; None 1
42. Can you think of non-academic situations in which you’ve used your philosophical skills since you’ve
graduated? E.g., how often do you get into philosophical discussions with others or discussions in which
your philosophical training is useful? Please describe some typical cases.
How often: daily 1
Discussions with family/friends 8; Assessing/discussing political issues 4; Discussions at parties/bars 3; Making moral decisions 2; Online discussions 2; Discussing religion 1; In classroom (as teacher) 1; Use rhetorical skills for training employees to conduct interviews/interrogations 1; Use rhetorical/argument skills in making case for company funding 1; Launching forum-based website on ethics, religion, and metaphysics 1; No 1
Again, questions #41 and #42 indicate a high degree of intellectual engagement.
43. How are you active in your community? For example, how often do you vote? What volunteer work do
you do? Are you involved in local government? PTA or other organizations?
Vote 12; Amnesty international 2; ACLU 1; Campus Democrats 1; Young Republicans (vice chair) 1; Political meetings 1; On-campus activist organizations 1; NARAL Pro-Choice America 1; Equality California 1; Lobbying for legislative changes 1; Write to congress people 1; Volunteer at community art center 1; Education reform group 1; On board of local Humane Society 1; Donate to animal rights charities 1; Board of local Crimestoppers 1; Member of Chamber of Commerce 1; Teach hunter safety for DNR 1; Community outreach to prep. students 1; Volunteer with teens 1; Pen Pal program with local elementary schools 1; Volunteer as English instructor for immigrants 1; Literacy volunteer 1; Work at preschool part-time, take children to library and educational field trips 1; Volunteer through school 1; Volunteer for kinship as a mentor 1; Not very 1
Comment: When I was in the program I never realized how important civic duties are to the department.
These kinds of things were never discussed.
Voting is, unsurprisingly, the most common evidence of civic engagement. The respondents were also
active in various political organizations and volunteered quite a bit, primarily in education contexts.
44. Is there anything else you’d like to comment on to help us in thinking about ways to improve our
program? For example, program structure, required courses and prerequisites, amount of writing, etc.
Start graduate program 3; Try to be the top Minnesota school for philosophy 1; More writing 1; More specific help with philosophical writing 1; Writing samples for grad school applications (400-level classes should do this) 1; More mentoring 1; More community opportunities 1; Capstone course on philosophy as a field of inquiry 1; Capstone course explaining the academic/job world 1; Service learning 1; Study groups 1; Be more rigorous in grading and expectations 1; More diversity 1; Better student/adviser relationship 1; Phil. 194 turns off students from taking more philosophy 1; Have students present papers 1; Have a course specifically on Plato/Aristotle 1; More Foucault 1; More Ayn Rand 1; More Eric Hoffer 1; More Wittgenstein 1
Starting a graduate program, more mentoring of various kinds, and more attention to writing were the top
suggestions. We’re investigating the first; we should consider the other two more deeply, especially
how we handle writing in the program.
IV. Conclusion
On the whole our students make progress through our program, especially with regard to the skills
objectives. Our students apparently come to our program with the attitudes already in place to a larger
degree, but some improvement is evident over the course of the program.
The results of past exit interviews corroborate student suggestions of more emphasis on and training
in philosophical writing. Our comparative writing analysis is not used to assess writing (as we have no
explicitly writing-related learning objectives). However, at some point we will need to assess our Upper
Division Writing Requirement course(s), and the writing assignments there already form our post-group
sample for the Comparative Writing Analysis in our program assessment (see II above). This would be an
ideal opportunity to gauge student writing ability. In any case, we should probably begin now to focus on
writing issues in our courses.
Senior Comprehensive Examination
Percentage of students scoring H, M, L, and N/A for each learning objective and commentary.
Learning Objectives A1-5
Part I assesses the A group of objectives (coherent comprehension of content).
Students will explain:                                               H      M      L     N/A
1. views of the major philosophers of the main historical
   periods and/or describe relationships among them.              13.0   26.1   56.6    4.3
2. representative major issues and theories in ethics.             8.7   43.5   43.5    4.3
3. representative basic metaphysical issues and theories.         13.0   26.1   56.6    4.3
4. representative major epistemological issues and theories.      21.7   30.4   43.5    4.3
5. representative fundamental concepts of logic.                  13.0   13.0   69.6    4.3
Scores on the A group of objectives are pretty dismal. On 3 of the 5 objectives more than half of the answers rated Low. We seem to be
strongest in epistemology and weakest in logic. Here are three proposals to address these results:
1) We revise the program and/or course content to make it more likely that
students will learn/retain this material.
2) We revise this part of the test. These are difficult questions to devise; they
should not be so specific that students could easily miss/forget this material
in the course of the program.
3) We use instruments in the courses that address this material to assess these
objectives rather than waiting until the students take the comprehensive
test. Advantages: a) we presumably do this anyway, and b) students are
less apt to forget material they have just studied. Disadvantages: a) we
would need to coordinate what we do in our classes more (i.e., we could not
just rely on students’ grades for a course, since a course grade would
presumably be based on more than just these objectives; assessment would
have to be on the basis of specific test, essay or paper questions, and while
these might vary from course to course or from instructor to instructor,
there would have to be a way of bringing these data together to measure
how well each of the objectives is being met); b) we may want to assess how
much of this learning is retained by students at the end of the program
rather than just how well they understood when they learned it.
Learning Objectives B1-3
Parts II and III assess the B group of objectives (thinking skills that philosophy is particularly suited to developing).
Students will:                                                        H      M      L     N/A
1. analyze concepts, arguments, issues,           Part II:           8.7   39.1   43.5    8.7
   theories, and/or views.                        Part III:         21.7   56.5    8.7   13.0
2. critically evaluate concepts, arguments,       Part II:           4.3   39.1   47.8    8.7
   issues, theories, and/or views.                Part III:         13.0   56.5   17.4   13.0
3. use abstract concepts.                         Part II:          17.4   52.2   21.7    8.7
                                                  Part III:         43.5   43.5    0.0   13.0
Scores on the B group of objectives are better than for the A group, but not stellar. Students seem to be more adept at using abstract concepts
than at analyzing or critically evaluating. But a large percentage of our students perform the thinking skills in the B group only marginally better
than non-philosophy students. Since we try to inculcate all these skills in (nearly?) all our courses, perhaps we should work on sharpening our
focus on them. Making philosophy students aware that these are program goals might be a good start and a way to begin focusing their attention
on them. A more conscious and deliberate approach to these skills in writing and class discussions might help too.
Learning Objectives C2 and C4
Parts II and III assess objectives C2 and C4 (attitudes).
Students will                                         H      M      L     N/A
2. gain a reflective habit of mind.   Part II:      17.4   52.2   21.7    8.7
                                      Part III:     30.4   47.8    8.7   13.0
4. intellectual autonomy.                           52.2   30.4    4.3   13.0
Not surprisingly, students exhibit these intellectual attitudes more strongly than they exhibit understanding (the A objectives) or thinking skills
(the B objectives). Students often comment during exit interviews that having these attitudes to begin with is one reason they went into
philosophy.
General Comment on the Results: One obvious shortcoming of using an exam like this to gauge what the students have gotten out of the
program is that they have no incentive to show us what they know or can do, since there is no penalty for doing poorly. For the most part
students have shown remarkable good will in taking the exam at all, given that it is not required. No doubt they do not deliberately do poorly to
sabotage results; still, it is equally likely that they would do better with more incentive. Incorporating the exam into a required course at the end
of their program (the Senior Seminar or Philosophy After Graduation) might make the results more meaningful, whether we retain Part I or not
(see comment on the A objectives above). If we took this route, we would need to be more careful about making sure only those nearing
graduation take the exam (or the course, perhaps); also, we would miss any minors who do not take these courses. This requirement would also
need to be built into the course and should not depend on the whim of whoever the instructor happens to be.
Appendix: Raw Data
Part I assesses the A group of objectives (coherent comprehension of content).
Students will explain:                                               H     M     L    N/A
1. views of the major philosophers of the main historical
   periods and/or describe relationships among them.                3     6    13     1
2. representative major issues and theories in ethics.              2    10    10     1
3. representative basic metaphysical issues and theories.           3     6    13     1
4. representative major epistemological issues and theories.        5     7    10     1
5. representative fundamental concepts of logic.                    3     3    16     1
Parts II and III assess the B group of objectives (thinking skills that philosophy is particularly suited to developing). Data is given as number of
Pt. II essays scoring at a level/number of Pt. III essays scoring at that level.
Students will:                                                        H      M      L     N/A
1. analyze concepts, arguments, issues, theories, and/or views.     2/5    9/13   10/2    2/3
2. critically evaluate concepts, arguments, issues, theories,
   and/or views.                                                    1/3    9/13   11/4    2/3
3. use abstract concepts.                                           4/10  12/10    5/0    2/3
Parts II and III assess objectives C2 and C4 (attitudes). Data for C2 is given as number of Pt. II essays scoring at a level/number of Pt. III essays
scoring at that level.
Students will                                          H       M       L     N/A
2. gain a reflective habit of mind.                  4/7    12/11    5/2    2/3
4. intellectual autonomy.                             12        7      1      3
THEATRE, FILM STUDIES AND DANCE (Kate Sinnett)
Mission statements for the department and individual programs were transmitted to the College of Fine Arts and
Humanities (COFAH) in fall semester. Each program has now established learner outcomes and related
them to specific required coursework in the majors and minor. This information was transmitted to COFAH
in January 2006. Film Studies has indicated specific assignments in classwork which assess the learner
outcomes. Theatre and Dance are in the process of completing this step and anticipate completing it by the
end of the semester. All programs have been made aware that learner outcomes need to be included on
future syllabi. Mission statements will be added to the website by the end of the summer or before. (We are
waiting on software to enable one of our staff members to input this information.)
Theatre - Learner Outcomes
Learner outcomes are mapped to the following required courses: Script Analysis 235; Intro to Production 236; Acting 148 or 248; Voice 250; Directing I 349; Theatre History 481 & 482; Practica (4 credits from 271-279, 371-379); Design elective (342, 345 or 346); Makeup/Costume History 240 or 244.
• Students will employ discipline-appropriate vocabulary for critiquing performances.
• Students will demonstrate a basic knowledge and understanding of major trends and landmark plays in the history of dramatic literature.
• Students will utilize discipline-appropriate research methods for understanding the historical and cultural context of a play viewed as a primary source document.
• Students will demonstrate discipline-appropriate research methods for understanding events in theatre history by writing a research paper in an upper division theatre history class.
• Students will gain practical, hands-on experience in directing actors in scenework.
• Students will demonstrate a variety of processes for critically analyzing a piece of literature for the purposes of directing.
• Students will demonstrate a variety of processes for critically analyzing a piece of literature for the purposes of acting.
• Students will gain practical, hands-on experience in demonstrating a variety of vocal techniques suitable for the theatrical stage.
• Students will demonstrate a basic knowledge of the principles behind vocal techniques, including vocal quality, projection, diction and characterization techniques suitable for the theatrical stage.
• Students will gain practical, hands-on experience in the process of creating a character and acting on stage.
• Students will demonstrate a variety of processes for critically analyzing a piece of literature for the purposes of theatrical design.
• Students will employ the appropriate techniques of visual rendering to communicate that interpretation.
• Students will demonstrate familiarity and proficiency with the current methods and technologies of constructing/implementing theatrical design.
• Students will actively participate in the program's main-stage season in at least two of the following areas: performance, constructions, technical performance, and production support.
Film Studies – Learner Outcomes
Learner outcomes are mapped to the courses FS 175, FS 260, FS 264, FS 270, FS 294, FS 394, FS 370, FS 395, FS 451, FS 452, FS 453, FS 464 and FS 496, along with the assignments that assess each outcome in each course (quizzes and exams at set weeks of the term, term and final papers, weekly essays, class participation, portfolio assignments, class presentations, research papers, projects, and experimental and narrative/documentary films).
• Students will be able to correctly use film terminology (the tools of objective description).
• Students will be able to recognize major historical landmarks in the development of cinema.
• Students will be able to analyze the role of major theoretical paradigms and aesthetic concepts in cinema.
• Students will be able to write critically about the cinema from a multicultural and international perspective.
• Students will be able to create proposals, scripts, storyboards and films.
• Students will fulfill their upper division writing requirement.
Dance Minor – Learner Outcomes
Learner outcomes are mapped to the following courses: DAN 111, DAN 112, MUS 100, DAN 211, DAN 212, DAN 319, PESS 249, TH 236, DAN 435 and DAN 471.
• Students will demonstrate an understanding of the foundations of modern and ballet technique.
• Students will execute intermediate level ballet and modern dance technique.
• Generate example of solo and group choreography.
• Students will analyze the similarities and differences among the contemporary dance forms and reform movements in the history of dance.
• Students demonstrate an understanding of the dynamics and commitment of dance company membership.
• Analyze and critique live dance performance using discipline-specific terminology.
• Students will demonstrate an understanding of anatomical language and the ability to name major muscles and bones.
• Students will be able to differentiate between different forms, styles and meters of music.
• Students will demonstrate a fundamental understanding of theatrical design.
Humanities
From Carolyn Hartz (Philosophy):
The tentative learning objectives for the Humanities Program from the original proposal
are:
• Integrate content and methodologies gained from a variety of disciplines to form opinions about the world
• Solve problems by listening, reading, discussing and integrating content creatively
• Respect disciplinary rigor
• Value knowledge to serve a vision of truth
• Respect diversity of cultures, races, and world views
• Reflect on your own learning by using self-criticism to analyze your attitudes and values
• Conceptualize how to get things done in the world
• Build on your General Education program and further integrate content into this interdisciplinary degree
These will be revisited as we develop an assessment plan.
College of Science and Engineering Assessment Summary (Sandy Johnson)
COSE Assessment Summary
2005-2006
The COSE Assessment Committee’s goals for 2005-2006 were to
• Showcase departments’ student learning outcomes.
• Develop a common assessment report template with emphasis on changes that have been
made based on assessment.
Fall Semester
• The committee addressed the question, “How do we fit with the SCSU Assessment System
Alignment Overview?”
• Each department submitted drafts of their department mission and goals.
• A subcommittee reviewed the drafts and made suggestions for alignment with college and
university mission and goals and for assessment purposes.
• The departments responded positively to all suggestions.
• The committee decided that all COSE departments should have their missions posted and
visible.
• Committee gave input into developing the SCSU General Education Mission statement.
Spring Semester
• The committee approved the design for mission posters for all departments.
• The committee decided that department mission and goals should be easily accessible on the
web.
• An assessment link was added to the COSE home page, and all department goals and student
learning outcomes were posted.
• The committee approved using the syllabus template from the University Assessment
Committee including student learning outcomes for each course.
• The committee completed the COSE assessment matrix with emphasis on changes that have
been made based on assessment.
Goals for 2006-07:
1. Correct the COSE matrix
2. Include student learning outcomes on all syllabi
3. Gather and interpret data to assess at least one student learning outcome in each department (move into
“closing the loop” phase)
4. Assess B.E.S. majors
5. Assess graduate programs
6. Monitor General Education reform
7. Address assessment of distance education courses
Assessment Accomplishments for the College of Social Sciences (Joe Melcher):
College of Social Sciences 2005-06 Assessment Report
Prepared by Joe Melcher, College Assessment Coordinator
During this year COSS departments made accelerated progress in developing systematic assessment efforts.
The most important step in this direction involved disbanding the original COSS Assessment Committee,
which had been appointed by the previous Associate Dean and had only five members. In its place I
formed a comprehensive committee with a representative from each of COSS's 12 departments (January
2006). Each member was also designated as their department's Assessment Coordinator, responsible for
encouraging the development of assessment plans for the department and/or its programs. This structure is
much more effective for communication, for sharing progress and information, and for getting feedback from
departments.
Since then, COSS Assessment Committee members have taken the lead in developing or updating
Departmental and/or Program Mission statements (as appropriate), and encouraging program faculty to
develop program and course learning outcomes.
Mission statements’ status (undergraduate)
By departments: All 12 departments and one of the two Centers (International Relations) now have
Mission statements. The only missing Mission statement is from the Social Science and Social Studies
program. This is because the program is in the process of converting to a Center for Social Studies
Education, and the new Mission is being developed as part of this process.
By programs: Besides its 12 departments, COSS has 12 Programs, of which 7 have submitted
Mission statements. Of the five programs which have not, one (Heritage Preservation, in Community
Studies) is in limbo due to faculty departures, and the rest are minor-only programs in the departments of
Criminal Justice Studies, Geography, and History.
Assessment plans’ status (undergraduate)
Ten programs have previously developed, or are now developing, assessment plans of varying type
and complexity. They include Community Development, Gerontology, Criminal Justice Studies,
Economics, Ethnic Studies, Land Surveying/Map Science, History, Psychology, Social Science and Social
Studies, and Social Work. Of these, the latter two have the most highly developed assessment plans because
of accreditation requirements. Social Science and Social Studies is accredited through NCATE (because of
its affiliation with the College of Education) and Social Work has been accredited through the national
Council on Social Work Education.
Ethnic Studies has made the greatest progress of the departments who were starting from scratch.
They took advantage of an assessment mini-grant to develop a partially standardized pre- and posttest-based
assessment of Ethnic Studies courses taught in the different departments represented in the Racial Issues
Colloquium.
Mission statement status (graduate programs)
By departments: Of 16 graduate programs, only 4 have published Missions (Gerontology,
Gerontology Certificate, Industrial-Organizational Psychology, and Social Work). Of the rest, 6 are in
progress (Applied Economics, Public and Nonprofit Institutions, Geography, Geographic Information
Science, Tourism Planning and Development, Geographic Information Science Certificate). The remaining
6 have not established formal Mission statements (Masters in Criminal Justice Studies, Public Safety
Executive Leadership Graduate Program, History, Public History Track, MS in History Teaching Emphasis,
and Political Science).
Assessment plans’ status (graduate programs)
Four graduate programs have established assessment plans and have been collecting data
(Gerontology/Gerontology Certificate, Industrial-Organizational Psychology, and Social Work). Six programs
are in the process of developing assessment plans, and the status of the others is uncertain.
List of specific activities undertaken in AY 2005-2006
o Formed an expanded COSS Assessment Committee, consisting of a representative from all
departments (effective January 2006).
o COSS Assessment Coordinator (Melcher) attended the 2006 Higher Learning Commission annual
meeting in Chicago (April 2006), focusing on assessment issues and accreditation.
o COSS Assessment Coordinator (Melcher) attended the May 15 Regional Assessment Workshop,
  "Analyzing, communicating and using assessment data to improve student learning" (Bloomington,
  MN).
o Collected Mission statements from every department and program (several of which were revised,
updated, or even brand-new)
o Received Dean Cogdill’s approval for funds to place each department’s Mission on a poster in each
department office.
o Solicited assessment plans from every department and program in COSS.
o Collected information about the status of all departmental and program assessment plans and
activities.
o Answered questions about assessment and provided feedback to instructors and department
assessment coordinators about assessment plans.
o Continued to maintain and update the COSS Assessment web page
(http://web.stcloudstate.edu/jmmelcher/Assessment_COSS/COSS_Assessment_Page.html)
o Served on the HLC Accreditation Committee Subcommittee on Assessment.
o Served on the University Assessment Committee.
o Provided feedback regarding the assessment component of an NSF Course, Curriculum, and
Laboratory Improvement (CCLI) grant proposal for the Geography department.
o As College Assessment Coordinator, wrote a letter in support of the assessment component of the
application, based upon my review.
o Participated in an assessment presentation at Faculty Forum Day (Assessment 101).
NEEDS & FUTURE DIRECTIONS
During AY 2006-2007, I will work with the COSS Assessment Committee to gather together all of the
existing assessment plans and to encourage departments and programs that have not submitted them to do so
(providing technical assistance where necessary). All departments and programs are aware that they should
have program and course assessments ready to go for Fall 2006. They should have summary reports,
including recommendations for course and programmatic change, early in Spring term, 2007.
Based on guidance from the University Assessment Committee, it will be necessary to work with the COSS
Assessment Committee to develop a College-wide policy on how to handle assessment-related data and
reports. My bias is to keep it as much in faculty hands as possible, consistent with verification that the work
is being done, and even more importantly, that changes to courses and programs are based on assessment
data.
I plan to develop a database with links to electronic copies of all assessment plans and reports submitted by
departments and programs. Its purpose will be to keep track of submissions and updates in order to ensure
continuous attention to assessment.
Working with the COSS Assessment Committee, I will plan a College-wide meeting during Spring term
2007, at which we and interested faculty will present examples of assessments and how we have used them to
drive improvements to courses and programs.
I will continue to explore the possibility of making some kind of an assessment system available to faculty in
the College (e.g., TaskStream, eLumen).
Early in Fall term 2006, I will have each Department's Mission statement mounted and placed
prominently in each department's office. The purpose will be to make the Mission more salient to students,
faculty, and staff.
I will work to ensure that all graduate and undergraduate departments and/or programs submit a list of
student learning outcomes upon which to base course assessment.
Finally, I will continue to work with departments and programs to develop program assessments that focus
on post-graduation placements and alumni and employer feedback about students' preparation for
employment.
Assessment Update for Learning Resources and Technology Services (Chris Inkster)
Assessment Plan
Learning Resources & Technology Services
2005-2006
Assessment Personnel:
Coordinator........................................................................................... Chris Inkster
Work group leaders:
Access Services........................................................................Robin Ewing
Collection Management........................................................... Julie Blake
Computer & Technology User Services.................................. Randy Kolb
InforMedia Services.................................................................Rich Josephson
Instructional Technology & Infrastructure Services................Randy Evans
Information Technology Services............................................Phil Thorson
Reference Services................................................................... Susan Motin
LR&TS Assessment Committee........................................................... Randy Kolb
Sandra Williams
Chris Inkster
Assessment Charge from Dean:
LR&TS Dean Kristi Tornquist’s charge to the LR&TS assessment project is “to assess the contributions our service
activities make to student learning in a broadly defined manner.”
Assessment Projects:
Several assessment projects from 2004-05 will be revised and repeated. To support the campus-wide emphasis on the
North Central accreditation process, LR&TS assessment projects will continue to focus on students’
awareness of and satisfaction with LR&TS services.
Miller Center Exit Survey
Purpose:
To determine student awareness of and satisfaction with a variety of LR&TS services and resources.
Audience:
Approximately 800 students exiting from the Miller Center lobby area.
Date:
Mid February 2006
Format:
One sheet (two-sided) survey with yes/no, scaled responses, and several open-ended questions. Two
versions of the survey will be used to include more questions.
Changes:
Some questions from the 2004-05 survey will be revised to clarify responses and to make
triangulation of data from other assessment instruments more consistent. The format of the survey
will be revised to increase reliability of the data.
Method:
LR&TS faculty and staff will ask students to participate in a 10-minute survey.
Analysis:
Completed surveys will be entered into SPSS by the Statistical Consulting Center and then analyzed.
Possibilities: Conduct similar surveys off-site, perhaps in Atwood, to get information from a wider variety of
students.
SCSU Telephone Survey
Purpose:
To determine student awareness of and satisfaction with a variety of LR&TS services and resources.
Audience:
Approximately 1,000 students will be called by the SCSU Survey team.
Date:
April 2006
Format:
Ten questions with yes/no, scaled responses, and two open-ended questions.
Changes:
Wording on several questions from the 2004-05 survey will be changed to clarify responses.
Method:
SCSU Survey Team will conduct the survey based on our questions.
Analysis:
SCSU Survey Team will provide statistical analysis of responses, including cross-tabulation with
demographic information for selected questions.
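As a plain illustration of what such a cross-tabulation looks like (this is not the Survey Team's actual
procedure, which is carried out in their own statistical software; the field names and data below are
invented), a table of awareness by class standing could be produced as follows:

    # Hypothetical example: cross-tabulate an awareness question against a
    # demographic field using pandas; data and column names are made up.
    import pandas as pd

    responses = pd.DataFrame({
        "class_standing": ["Freshman", "Senior", "Junior", "Senior", "Freshman"],
        "aware_of_helpdesk": ["Yes", "Yes", "No", "Yes", "No"],
    })

    # Percentage of each class standing reporting awareness of the service.
    table = pd.crosstab(
        responses["class_standing"],
        responses["aware_of_helpdesk"],
        normalize="index",
    ) * 100
    print(table.round(1))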
Focus Groups
Purpose:
To determine student awareness of and satisfaction with a variety of LR&TS services and resources.
Audience:
Several small focus groups: mixed / undergraduate / graduate / international students.
Date:
Mid spring 2006
Format:
Two to three LR&TS staff and faculty will facilitate the focus groups.
Changes:
By increasing the number and possibly the size of the focus groups, more diverse responses to
questions about LR&TS services will be collected. Questions will be determined in response to other
assessment information.
Method:
Students will respond to scripted questions designed to confirm or disconfirm information, gaps, and
perceived issues discovered in other instruments.
Analysis:
Conversations will be recorded with participants’ permission and notes will be taken. Natural
language coding will be used to analyze responses and suggestions.
Miller Center “Secret Shopper” – Tentative
Purpose:
To determine student satisfaction with customer service provided by employees at service counters.
Audience:
LR&TS service counters (Reference, Circulation, Periodicals, 2nd East Lab, HelpDesk, ResNet, open
labs across campus)
Date:
Spring semester following student worker training sessions on customer service
Format:
Some details of this unobtrusive evaluation will remain secret! Secret shoppers will not be
confrontational or “problem” patrons.
Method:
Adaptation of business model for secret shoppers will include recruitment and selection of student
shoppers, training of shoppers to ask pre-determined questions, development of response rubric, and
plan for timeline.
Analysis:
“Secret Shopper” team will analyze results and follow up with area supervisors.
LR&TS Work Group Assessments
LR&TS work groups have selected assessment projects for 2005-06. Some of these projects, marked with * below, are
planned to respond to data or gaps in data from the 2004-05 assessment report.
Access – Robin Ewing
* Student Study Room Users Survey
Serials Survey
CMLE membership annual survey
Collection Management – Julie Blake
Collection Assessment for Program / College Accreditation Reports
Communication – Dana Drazenovich
* Comment Box or Comment Webpage
CTUS – Randy Kolb
SCSU Survey – Tech Fee questions
* Student Workshop Evaluation Analysis
* Miller Center Secret Shopper
IMS – Rich Josephson
* Space Utilization by Students in the Miller Center – Summary
* Faculty Rover Feedback Forms Summary
* Workshop Evaluation Summary
ITIS – Randy Evans
SCSU Website audit by STAMATS in preparation for Website redesign
* E-Classroom User Satisfaction Survey
* HelpDesk User Follow-Up Survey Summary
Media Site Live (with Jeanne Anderson, CIM; grant submitted by Doris Bolliger)
ITS – Phil Thorson
* Survey of Top Technology Services
Reference – Susan Motin
* Library Instruction Evaluations – fall and spring semesters
* Reference Desk Survey -- fall and spring semesters
SCSU Data Sources
Data from a variety of SCSU sources relating to LR&TS services will be noted.
ACT
Selected questions
Charting Our Future
SCSU HLC / NCA focus group, November 17, 2005
http://www.stcloudstate.edu/nca/reports/111705.asp
Especially comments from Groups 8 and 9.
D2L Faculty Users at SCSU (Mert Thompson and Steve Malikowski)
LR&TS Triangulation Study
This study triangulates data from 2004-05 LR&TS assessment projects: building exit survey, telephone survey,
and focus group.
MN Online E-Student Services Audit (Round 2)
Data from survey provided by Patty Aceves, Center for Continuing Studies.
NCA Self-Study
Library and technology resources and services will be a part of several of the NCA criteria reports.
NSSE (National Survey of Student Engagement)
Selected questions.
Summer School Survey (summer 2005)
Selected questions.
The following matrices update general program information and add information on program assessment plans and tools, whether data
are being collected, and how those data are being used.
Herberger College of Business
Assessment Progress Report - Spring 2006
REVISED April 10, 2006
ACCT
BCIS
FIRE
MGMT
Mktg/Blaw
MBA
Program Goals and Curriculum Matrices
Goals
IP
C
C
IP
C
C
Matrix
IP
C
C
IP
C
C
Core
C
C
C
IP
C
NS
Other
IP
NS
NS
NS
NS
NS
IP
NS
NS
NS
IP
NS
C
IP
IP
IP
C
NS
IP
IP
IP
IP
C
NS
eLumen Participation
Carol
Dick,
Rich,
Mark?
Helen,
Janikan,
Others
Subba
Tom
NS
RPT Process
NS
NS
NS
NS
NS
NS
Faculty Expertise and Motivation
IP
IP
IP
IP
IP
IP
Learning Goals on Course Syllabi
Student Course Evaluation Forms
Core Exam
Consensus on Implementation
Graduated Responses
Tied to Program Goals
C = Completed; IP = In Progress; NS = Not Started
College of Education Program Assessment Matrix
Department
Programs
Degree
Mission
Statement
Assessment
Plan
Assessment Tools
(Measures)*
Results
(Data)
Early Childhood
Major
B.S.
Yes
http://www.stcloudstate.edu/CFS/
Yes
Minnesota BOT
Core + EC
Yes
Yes *
Yes **
Yes or
No
Yes
Early Childhood
Special Education
License
5th Year
Yes, see above.
Yes
Minnesota BOT
Core + ECSE
Yes
Yes *
Yes **
Yes
Early Childhood
Special Education
Master
M.A.
Yes, see above
Yes
Minnesota BOT
Core + ECSE
Yes
Yes *
Yes **
Yes
Parent Education
License
5th Year
Yes, see above
Yes
Minnesota BOT
Core + Pared.
Yes
Yes *
Yes **
Yes
Family Studies
Masters
M.A.
Yes, see above
Yes
Minnesota BOT
Core + Pared.
Yes
Yes *
Yes **
Yes
Yes or No
CHILD AND
FAMILY STUDIES
Assessable Learning
Outcomes
(Objectives, goals, etc.)
Yes or No
Yes or No
Direct
Indirect
*Direct measures include: TWS (Teacher Work Sampling), content analyses of student papers/projects.
**Indirect measures include: Telephone surveys, analyses of student forms and graduation records, focus groups and questionnaires.
How assessment data
are used for program
improvement
Select textbooks
adapt/change course
content. Adapt/change
course requirements
add/delete practical
experience.
Select textbooks
adapt/change course
content. Adapt/change
course requirements
add/delete practica
experience.
Select textbooks
adapt/change course
content. Adapt/change
course requirements
add/delete practica
experience.
Select textbooks
adapt/change course
content. Adapt/change
course requirements
add/delete practica
experience.
Select textbooks
adapt/change course
content. Adapt/change
course requirements
add/delete practica
experience.
Date of next self
study
Fall 2006
Fall 2006
Fall 2006
Fall 2006
Fall 2006
Department
Programs
Mission
Statement
Degree
Assessable Learning
Outcomes
(Objectives, goals, etc.)
COMMUNITY
PSYCHOLOGY
EDUCATIONAL
LEADERSHIP
Assessment Tools
(Measures)*
Results
(Data)
College Student
Development
M.S.
Yes
Yes
Yes
X
X
Yes or
No
Yes
School Counseling
M.S.
Yes
Yes
Yes
X
X
Yes
Rehabilitation
Counseling
M.S.
Yes
Yes
Yes or No
COUNSELOR
EDUCATION AND
EDUCATIONAL
PSYCHOLOGY
Assessment
Plan
Yes or No
Yes or No
Direct
Indirect
How assessment data
are used for program
improvement
Date of next self
study
To further develop
program
2006-2007
Shared with advisory
committee and
department and program
faculty
Shared with advisory
committee and
department and program
faculty
The following apply to
all five Community
Psychology programs:
2006-2007
2010-2011
Chemical
Dependency
B.S.
Yes
Yes
No
X
Yes
Community
Psychology
B.S.
Yes
Yes
Yes
X
Yes
Community
Counseling
M.S.
Yes
Yes
Yes
X
Yes
Marriage & Family
Therapy
M.S.
Yes
Yes
Yes
X
Yes
Behavior Analysis
M.S.
Yes
Yes
Yes
X
X
Yes
Education Leadership
M.S.
Yes
Yes
Yes
X
X
Yes
Faculty analyze data
and make
recommendations.
NCATE
2006-2007
Education Leadership
Ed.S.
Yes
Yes
Yes
X
X
Yes
Faculty analyze data
and make
recommendation.
NCATE
2006-2007
All licensure (K-12,
Supt., SPED, Comm.
Ed.)
No
Yes
Yes
Yes
X
X
Yes
Comprehensive internal
and external reports are
analyzed.
Spring 2005
*review of curriculum
*review of individual
courses and overall
curriculum
*improve courses
*offer more electives
*improve advising
2007 – intend to
pursue CACREP
accreditation
ABA accreditation
until 2009
Department
Programs
Mission
Statement
Degree
Assessable Learning
Outcomes
(Objectives, goals, etc.)
HUMAN
RELATIONS AND
MULTICULTURAL
EDUCATION
Assessment Tools
(Measures)*
Results
(Data)
Health Ed
B.S.
No
Yes
Yes
X
X
Yes or
No
Yes
Physical Education
B.S.
Yes
Not to Students.
Yes to faculty.
Yes
X
X
Yes
Yes or No
HEALTH, PHYSICAL
EDUCATION,
RECREATION, AND
SPORT SCIENCE
Assessment
Plan
Yes or No
Yes
Yes or No
Direct
Indirect
How assessment data
are used for program
improvement
Date of next self
study
Define goals and
objectives
2008
Designed to find
program weakness &
illustrate program
needs.
2008
Department
Programs
Mission
Statement
Degree
Assessable Learning
Outcomes
(Objectives, goals, etc.)
Yes or No
INFORMATION
MEDIA
IM major
(may include pre-SLMS students)
B. S.
Yes
http://www.stcloudstate.edu/cim/
IM minor
(may include pre-SLMS students)
Can be used
with B.A.,
B.S. or
B.E.S.
Yes
Instructional
Technology
Certificate
(undergraduate)
SLMS Licensure
Certificate
Yes
Licensure
only
Yes
Information Media Information
Technologies
M.S.
Yes
Information Media Educational Media
M.S.
Yes
Information Media Instructional
Design/Training
M.S.
Yes
Instructional
Technology
Certificate (graduate)
Certificate
Design for E-Learning
Certificate (graduate)
Certificate
Yes or No
Yes
http://www.stcloudstate.edu/cim/undergraduate/major/default.asp
Assessment
Plan
Yes or No
Assessment Tools
(Measures)*
Direct
Indirect
Results
(Data)
Yes or
No
How assessment data
are used for program
improvement
Date of next self
study
Yes
X
X
Data is examined for
areas to improve,
courses or programs are
revised.
NCATE 2007
Yes
X
X
Data is examined for
areas to improve,
courses or programs are
revised.
NCATE 2007
Yes
http://www.stcloudstate.edu/cim/undergraduate/certificate.asp
Yes
Yes
X
X
Yes
X
X
Yes
X
X
Yes
Yes
X
X
Yes
Yes
X
X
Yes
Yes
Yes
http://www.stcloudstate.edu/cim/graduate/track1/default.asp
Yes
http://www.stcloudstate.edu/cim/graduate/track2/default.asp
Yes
http://www.stcloudstate.edu/cim/graduate/track3/default.asp
Yes
Yes
X
X
Yes
Yes
Yes
X
X
Data is examined for
areas to improve,
courses or programs are
revised.
Data is examined for
areas to improve,
courses or programs are
revised.
Data is examined for
areas to improve,
courses or programs are
revised.
Data is examined for
areas to improve,
courses or programs are
revised.
Data is examined for
areas to improve,
courses or programs are
revised.
Data is examined for
areas to improve,
courses or programs are
revised.
Data is examined for
areas to improve,
courses or programs are
revised.
NCATE 2007
Department
Programs
Degree
Mission
Statement
Yes or No
SPECIAL
EDUCATION
TEACHER
DEVELOPMENT
Assessable Learning
Outcomes
(Objectives, goals, etc.)
Assessment Tools
(Measures)*
Results
(Data)
Yes
X
X
Yes or
No
Yes
Yes,
X
X
Yes
X
Yes, in
revision
Yes
Minnesota BOT, Core +
EBD
Yes or No
Learning Disabilities
Undergraduate
B.S.
Yes
http://www.stcloudstate.edu/coe/academic/departments/sped/undergrad/vision.asp
Emotional-Behavioral Disorders
Undergraduate
B.S.
Yes, See Above
Yes
Minnesota BOT (Board of
Teaching)
Core + LD
http://education.state.mn.us/html/intro_board_teach.htm
Yes
Developmental
Disabilities
B.S.
Yes, see above
Minnesota BOT
Core + EBD
Yes
Yes
http://www.stcloudstate.edu/coe/academic/departments/sped/grad/vision.asp
Yes, see above
Assessment
Plan
Minnesota BOT, Core +
DD
Yes
Minnesota BOT, Core +
LD
Yes or No
Direct
Indirect
How assessment data
are used for program
improvement
Date of next self
study
Discussed at
departmental meetings
Fall 2006
Yes
Discuss at meetings,
propose program
revisions
Fall 2006
X
Yes
See above
Fall 2006
X
X
Yes
See above
Fall 2006
Yes, in
revision
X
X
Yes
See above
Fall 2006
Learning Disabilities
Graduate
Graduate
Certificate
Emotional/
Behavioral Disorders
Graduate
Certificate
Developmental
Disabilities
Graduate
Certificate
Yes, see above
Yes, Minnesota BOT Core
+ DD
Yes, in
revision
X
X
Yes
See above
Fall 2006
Physical/Health
Disabilities
Graduate
certificate
Yes, See above
Yes, Minnesota BOT Core
+ PH/D
Yes, in
revision
X
X
Yes
See above
Fall 2006
Elementary
Education with a
specialty
B.S. Ed
No—under
revision
On individual syllabi
Yes
X
No
As an ongoing focus in
program discussions
Fall 2006
Curriculum and
Instruction
M.S.
Not specific to this
program;
On syllabi
In progress
No
Discussion within
Graduate Committee
Fall 2006
College of Fine Arts and Humanities Assessment Matrix
Department
Mission
Assessable
Statement Learning
Outcomes
(objectives,
goals)
Assessment
Plan (Which
learning
outcomes
will you be
assessing and
when?)
Assessment
Tools
(Measures)
Results
(Do you
have
results?)
How Assessment
data are used for
program
improvement
Date of next
accreditation
or self-study
2007-2008
(NASAD
accreditation)
Indirect|Direct
Art
BFA-Fine Art
Yes
Yes
Yes
Portfolios are
reviewed and
evaluated on a
numbered
scale.
Yes
BA-Fine Art
Yes
Yes
Yes
Yes
Yes
Learning outcomes
are developed
using NASAD
standards.
We have
developed 2 grids
for direct
assessment. We
are using an
assessment
documentation
sheet to assess
ongoing student
work. This
information is
used to find
strengths/weaknesses
and improve
courses as needed.
As above
BFA-Graphic
Design
BA- Art History
Yes
Yes
Yes
Yes
Yes
As above
Same
Yes
Yes
Yes
Yes
Yes
As above
Same
Same
BS-Art Education
Yes
Yes
Department
Mission
Assessable
Statement Learning
Outcomes
CSD (formerly
CDIS)
BS
Yes
MS
CommStudies
CMST
192 -- CORE
Yes the
foundation
classes.
Yes
Assessment
Plan
Yes surveys.
Exit
assessment
surveys.
Assessment
Tools
Yes
Yes
Both
Yes
Yes
Yes
Yes
Both
Yes
Yes
Yes (refined
and
submitted to
CAP)
Yes – Fall
2006
Direct
Spring
2007
B.S.
Yes
(Communication
Arts and Literature)
B.A.
Yes
Yes
Yes
Direct
In progress
In progress
No
B.S.
Interdisciplinary
Yes
No
No
No
Minor
Yes
No
No
No
Results
Results are
summarized and
used to enhance
curriculum.
How data are
used for
improvement
Same and
NCATE
2006
• Recruitment and retention
• Improve upper division writing
• Improve graduate curriculum
• Increase clinical opportunities
2009 (CAA)
Points out areas
for additional
emphasis when
teaching Public
Speaking
Self-study
2010-11
Date of next
accreditation
or self study
2009 (CAA)
BoT 2006,
NCATE 2007
Interdisciplinary
intercultural minor
Yes
No
No
No
English
BA (general)
Yes
Yes
Yes
Yes (survey)
Yes
BA (Linguistics)
Yes
Yes
Yes, indirect
No
BS
Yes
(Communication
Arts and Literature)
Yes
Yes, Part of
BA and BS
Yes
Yes, direct
Yes.
Many
sources
MA (general)
Yes
Yes –
revamped
Fall 2005
Yes
Yes, direct
Yes
MA (TESL)
Yes
Yes
Yes
Yes, direct
No
MS (English Ed)
Yes
Yes
Yes
Yes, direct
Yes
Minor
Yes
Yes
Yes, part of
BA
Yes, part of
BA
Yes
Core – Engl 191
Yes
Yes
Yes
Yes, direct
Yes
Intensive English
Center
Yes
Yes, for each
level
Yes
Yes, direct
Yes
To help inform
curriculum changes
In development for
fall 2006
To help inform
curriculum changes
and remediate
individual students
Analysis of student
writing in a variety
of courses to
determine quality;
changes made if
needed to the
curriculum
Reshape
curriculum
Analysis of student
work used to
change curriculum
Enrollment analysis
to verify program
vitality
Self study in 2002-03 resulted in
program and
objectives changes
Curriculum
changes are made
each year based on
data analysis
Self study
2013
As above
2006 BoT
2007 NCATE
Self study
2013
Self study
2013
Self study
2013
Self study
2013
Self study
2013
Self study
2013
College ESL
Yes
Yes, for each
level
Yes
Yes, direct
Yes.
Will
have
results at
end of
this
semester
Yes
Data analysis used Self study
to determine weak
2013
areas; program then
re-evaluated
The Write Place
Yes
Yes
Yes
Yes, direct and
indirect
Offer workshops,
expand program
(satellite), offer
tours, solicit
funding
Self-study
2013
Foreign Language
Yes
Yes
Yes
Yes, both
Yes
To adjust the
program as needed
Yes
Yes, both
Yes
As above
Yes
Yes
Yes, both
Yes
As above
Next self-study, external
review 2010
Self-study,
external
review 2010
BoT 2006,
NCATE 2007
French BA,
German BA,
Spanish BA
French BS,
German BS,
Spanish BS
Minor in each
MassComm
BS
Yes
Yes
Yes
Yes
Yes
Yes
As above
Yes
As above
As above
Yes
Yes
Yes
Yes, both
Yes
ACEJMC –
2010-2011
MS
Yes
Yes
No
Comprehensive exam
Yes
More emphasis on
internships; new
entry/exit test
No
Music
BA
Yes
Yes
Yes
Yes, direct
Yes
BM
Yes
Yes
Yes
Yes, direct
Yes
As above
Re-evaluate student NASM - 2010
learning outcomes
across the
curriculum
As above
NASM 2010
BS
Yes
Yes
Yes
Yes, direct
Yes
As above
NASM 2010,
BoT 2006 and
NCATE 2007
NASM 2010
MM
Yes
Not yet
No
Yes, direct
No.
Philosophy
BA
Yes
Yes
Yes
Interdisciplinary
BA
Yes
Yes
Yes
Yes. Direct – Senior
comprehensive
exam;
comparative
writing
analysis; exit
interviews;
alumni
surveys.
Indirect – Exit
interviews and
alumni surveys
Yes. As above.
Yes
Added required
capstone course;
improved writing
instruction
External
Review 2006
Yes
As above.
As above.
BA Minor
Interdisciplinary
minor
Minor for
Mathematics
majors
BES major
BES minor
Humanities BA
TFSD
Yes
Yes
Yes
Yes
Yes
Yes
Yes. As above.
Yes. As above
Yes
Yes,
As above.
As above
As above.
As above
Yes
Yes
Yes
Yes. As above
Yes
As above
As above
Yes
Yes
No (soon)
Yes
Yes
Not yet
Yes
Yes
Not yet
Yes. As above
Yes. As above
Working on it
Yes
Yes
No
As above
As above
No
As above
As above
Not sure
Theatre BA
Yes
Yes
Yes
Yes, direct
Yes,
from
Used to assess
program goals,
External
review 2006-
Theatre minor
Film Studies BA
Yes
Yes
Yes
Yes
Yes
Yes
Yes, direct
Yes, direct
Film Studies minor Yes
Dance minor (only) Yes
Yes
Yes
Yes
Yes
Yes, direct
Yes, direct
2001-present
As above
Yes
curriculum needs
and class content
As above
Used to assess
program goals,
curriculum needs
and class content
As above As above
Yes
Used to assess
program goals,
curriculum needs
and class content
07
As above
2012-13
As above
BS – NCATE,
2007; BOT
2006
College of Science and Engineering Assessment Matrix
May 2006
Report
Submitted
to COSE
Assessment
Coordinator
Mission
Statement
Assessable
Learning
Outcomes
(objectives,
goals, etc.)
Assessment
Plan
Assessment Tools
(Measures)
Direct
Yes
Yes
Yes
Aviation
Aviation Major
Bachelor of
Applied Science
Biological Sciences
Biomedical
Sciences
General Biology
Ecology and
Field Biology
Cell Biology
Biotechnology
Biology Teaching
Medical Technology
Yes
Chemistry
BA
Biochemistry
Professional
Chemistry ACS
Approved
Chemistry Teaching
Yes
Results
Indirect
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
How assessment data are
used for program
improvement
Gathered information in
many areas to improve
instruction and required a
capstone course for all
students in aviation
Based on outcomes from
the ETS major field exam,
expanded the core to
include Biol. 262 – Genetics
for all majors. Assessment
tool linked with student
learning outcomes
developed and
administered in core
courses both as a pre- and
post-test. New graduate
"bridge" courses developed
and staffed by teams of
faculty as a result of review
of the graduate program.
Generating a portfolio for
each graduating major,
including a written thesis, a
DVD of a one hour oral
presentation and a copy of
the PowerPoint
presentation delivered.
Yes
Yes
Computer Science
Computer
Science – CSAB
Accredited
Applied
Computer
Science
Earth and
Atmospheric
Sciences
Earth Science
Geology
Meteorology
Science Teaching
Electrical and
Computer
Engineering
Computer
Engineering
Electrical
Engineering
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Revised two courses and
added one new course
based on assessment
results.
Began to work on the
designation of junior level
courses in each program as
preparatory for the written
communication learner
outcome, including the
development of a common
rubric to evaluate technical
writing as the result of
assessment of student
writing quality in EAS 450
and submissions for the
Denise McGuire Research
Award
Redesigned Senior Design
sequence to be more
assessable and to include
outcomes not covered
elsewhere. Changed
Computer Engineering
program to meet ABET
requirements
Yes
Environmental and
Technological
Studies
Technology
Education
Environmental
Studies
Technology
Assessment and
Management
Mathematics
Mathematics
Mathematics
Teaching
Mechanical and
Manufacturing
Engineering
Mechanical
Engineering
Manufacturing
Engineering
Nursing Sciences
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Physics, Astronomy
and Engineering
Science
Physics
Science Teaching
Yes
Results from both indirect
and direct assessments are
discussed in ETS faculty
meetings, leading to
potential and actual changes
in curriculum as well as
other areas such as
advising, promotion,
summer school offerings,
and course sequencing.
As a result of assessing
student outcome data,
expiration dates were set
for placement test scores
and programs were
restructured at the St.
Cloud Technical College.
Redesigned program after
analyzing performance data
and student surveys.
Spring End of Year and Fall
Program Review:
mandatory assessment
times set to review data for
curricular changes, then
trend yearly assessment
data to make curricular
changes
After analysis of FCI exam
results given in Phys 231
and Phys 234, standardized
these courses and initiated
1-credit courses for
students requiring outside
help.
Yes
Statistics/Computer
Networking and
Applications
Statistics
CNA
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
As a result of assessment
of Statistics service
courses, more in-depth
instruction in Minitab was
included. CNA closed the loop
on last year's assessment
in the areas of computer
networking and Java
programming
College of Social Sciences Assessment Matrix
Department
Program
Degree
Mission
Statement
Yes or No
Community
Studies
Yes or No
Assessment
Tools
(Measures)
Assessment
Plan
Yes or No
Direct Indirect
Results
(Data)
Yes or
No
How
assessment
data are used for
program
improvement
Date of
next self
study
Yes
NA
NA
NA
NA
NA
2010
Community
Development
B.A.
Yes
Yes
Yes
Yes
Yes
Yes
2010-11
Gerontology
Minor only
Yes
Yes
Yes
Yes
Yes
Yes
2010-11
Heritage
Preservation
Minor only
No
(program is in
limbo)
No
No
No
No
No
? (Program is
in limbo)
B.A.
Yes
No
Yes
No
Yes
(Alumni
survey)
Yes
2010-11
No
No
No
No
No
No
2010-11
B.A.
Yes
Yes
Yes*
Yes
National
Test
?
Yes
2008-09
Minor only
Yes
Yes
Yes
Yes
multi-year, multiple
pre-/post-tests
since
Fall
2004
Yes
Yes
Criminal
Justice Studies
Private Security
Economics
Ethnic Studies
Assessable
Learning
Outcomes
Ethnic Studies
To be discussed in
workshop of
Summer 2006
2010-11
Department
Program
Degree
Mission
Statement
Yes or No
Assessable
Learning
Outcomes
Yes or No
Assessment
Plan
Yes or No
Assessment
Tools
(Measures)
Direct Indirect
Results
(Data)
Yes or
No
How
assessment
data are used for
program
improvement
Date of
next self
study
BA
Yes
Some (Geog
111)
In progress
Yes (at
least for
GEOG
111, a
Gen Ed
course)
No
2009
Minor
only
No
No
No
No
No
2009
Yes
Yes
Yes
Yes
Yes
2009
Minor
only
No
No
No
No
No
2009
Geography
MS
Yes
Yes
In progress
In
progress
In
progress
Geographic
Information
Science Emphasis
MS
In progress
In progress
In progress
No
No
Tourism Planning
and Development
Emphasis
M
In progress
In progress
In progress
No
No
Certificate
In progress
In progress
In progress
No
No
Geography
Geographic
Information
Science
Land
Surveying/Mapping Science
Travel and
Tourism
Geographic
Information
Science
Department
Program
Degree
Mission
Statement
Yes or No
Yes or No
Assessment
Tools
(Measures)
Direct Indirect
Results
(Data)
Yes or
No
How
assessment
data are used for
program
improvement
Date of
next self
study
Yes
No
Yes**
Yes
No
No
2008-09
African Studies
Minor only
Yes
No
No
No
No
No
2008-09
East Asian Studies
Minor only
Yes
No
No
No
No
No
2008-09
Latin American
Studies
Minor only
No
No
No
No
No
No
2008-09
B.A.
Yes
Yes
No
No
No
No
2006-07
Public
Administration
BA
Yes
Yes
No
No
No
No
2006-07
International
Relations
BA
Yes (Is in
process of
becoming a
Center,
separate from
Pol. Sci.)
Yes
Drafted
No
No
No
No
2006-07 (?)
Yes
Yes
Yes
In
progress
Yes
2006-07
In progress
(program is
developing a
new Center
for Social
Studies
Education to
run this
interdisciplinary
major)
In progress
Yes (in
conjunction with
re-accreditation
of SCSU’s
teaching
licensure
program,
NCATE)
Yes
?
?
2009-10
Political
Science
Social Science
& Social
Studies
Yes or No
Assessment
Plan
BA
History
Psychology
Assessable
Learning
Outcomes
BA
BA, BS,
BES
Department
Program
Degree
Mission
Statement
Yes or No
Women’s
Studies
Yes or No
Assessment
Plan
Yes or No
Assessment
Tools
(Measures)
Direct Indirect
Results
(Data)
Yes or
No
BS
Yes
Yes
Yes***
Yes
Yes
Yes
Anthropology
BA
Yes
(Shared with
Sociology)
No
No
No
?
Sociology
BA
Yes
No
Yes (but
indirect: student
self-assessments
of learning)
No
Yes
(student
opinions
of
learning)
Yes
Concentration in
Applied Sociology
BA
Yes
Yes
Yes (but
indirect: student
self-assessments
of learning)
Yes
Yes
BA, BS,
BES
Yes
In progress
Has been
assessing WS
201 with pre- and post-test;
also surveys
alumni
Yes
Yes
Social Work
Sociology/
Anthropology
Assessable
Learning
Outcomes
Yes
Sociology
has a
very
good
instrument
that
they
have
used to
survey
alumni.
Yes
Uses
same
survey as
above.
Yes
How
assessment
data are used for
program
improvement
Date of
next self
study
2010-11
accredited by
Council on
Social Work
Education
2008-09
2008-09
To be discussed
next year.
2008-09
2010-11
Department
Program
Degree
Mission
Statement
Yes or No
Criminal Justice
Studies
Economics
Masters in
Criminal Justice
Studies
MS
Public Safety
Executive
Leadership
Graduate
Program
Assessable
Learning
Outcomes
Yes or No
Assessment
Plan
Yes or No
Assessment
Tools
(Measures)
Direct Indirect
Results
(Data)
Yes or
No
How
assessment
data are used for
program
improvement
Date of
next self
study
No
No
No
Yes
No
2010-11
MS
More
program
description
than Mission
No
No
No
No
No
No
Applied
Economics
MS
In progress
In progress
In progress
In
progress
In
progress
No
Professional
peace officer
education
program
certified as
provider of
the academic
law
enforcement
licensing
core,
Minnesota
Peace
Officers
Standards &
Training
Board.
2008-09
Public and
Nonprofit
Institutions (a
joint program of
the Economics
and Political
Science
departments)
MS
In progress
In progress
In progress
In
progress
In
progress
No
2008-09
Department
Program
Degree
Mission
Statement
Yes or No
Geography
Gerontology
History
Yes or No
Assessment
Plan
Yes or No
Assessment
Tools
(Measures)
Direct Indirect
Results
(Data)
Yes or
No
How
assessment
data are used for
program
improvement
Date of
next self
study
Geography
MS
In progress
In progress
In progress
In
progress
In
progress
No
2008-09
Geographic
Information
Science Emphasis
MS
In progress
In progress
In progress
No
No
No
2008-09
Tourism Planning
and Development
Emphasis
M
In progress
In progress
In progress
No
No
No
2008-09
Geographic
Information
Science
Certificate
In progress
In progress
In progress
No
No
No
2008-09
Gerontology
MS
Yes (but same
as for
undergrad)
Yes
Yes
Yes
Yes
Yes
2011-12
Gerontology
Certificate
Certificate
Yes (but same
as for
undergrad)
Yes
Yes
Yes
Yes
Yes
2011-12
History
MA
No separate
Mission for
the grad
programs
No
No
No
No
No
2008-09
Public History
Track
MA
No
No
No
No
No
2008-09
MS — History
(Teaching
Emphasis)
MS
No
No
No
No
No
2008-09
2006-07
Political
Science
Psychology
Assessable
Learning
Outcomes
IndustrialOrganizational
Psychology
MA
No
No
No
No
No
No
MS
Yes
Yes
Yes
Yes
Under
develop
ment
Yes
In progress
2006-07
Department
Program
Degree
Mission
Statement
Yes or No
MSW
Social Work
Assessable
Learning
Outcomes
Assessment
Plan
Yes or No
Yes
Yes
Yes or No
Yes
Assessment
Tools
(Measures)
Direct Indirect
Yes
Yes
Results
(Data)
Yes or
No
How
assessment
data are used for
program
improvement
NA
(Program
starts
Fall 07)
Date of
next self
study
2010-11
*Lower-level courses assessed with a nationally normed instrument, the Test of Economic Literacy (TEL), co-authored by Ken Rebeck. It uses a pretest-posttest design to assess objective and attitudinal knowledge about a range of basic economic concepts. The test is used at a number of high schools and
colleges, which provides comparative data about students’ performance.
**History submitted a two-part assessment. The first part consists of pretest essay topics customized for each of three courses: American history,
European history, and world history. The essays focus on historiographical issues, including regional differences, multiple causality, multiple
perspectives, and historiographical debates. The second part is more quantitative: For each history course the students must arrange five course-relevant
historical events in correct chronological order.
***Social Work has an extensive assessment plan based upon a large-scale assessment program that they developed in connection with the Department's
CSWE accreditation (successfully completed). Three items in their assessment battery are administered to students in their Gen Ed course, Social Work
and Democratic Citizenship. These include the Professional Behavior Scale, a Writing Scale, and a Student Assessment.
Learning Resources and Technology Services Assessment Matrix
Department/Programs: LR&TS
Mission Statement: Yes
Assessable Learning Outcomes: No
Assessment Plan: Yes
Assessment Tools (Measures): Direct: No; Indirect: Yes
Results (Data): Yes
How assessment data are used for program improvement: Several improvements as a result of data from
last year’s assessment efforts include extended building hours, customer training for student workers,
added focused assessments in work areas, and increased laptops available for student checkout.
Date of next self study: Assessment efforts will be continued next year.