

Assessment Wheel

SLOs (Student Learning Outcomes)


  • Include various levels of complexity in learning (Bloom’s Taxonomy)
  • Address multiple domains of learning (Cognitive, Affective, and Behavioral)
    1. Cognitive: The outcome addresses how the student thinks or what the student knows.
    2. Affective: The outcome addresses how the student interacts with others, what the student values, or what attitudes the student holds.
    3. Behavioral: The outcome addresses what the student does (for example, executing a skill, building a project, or designing a system).
  • Make measurable
  • Make meaningful to the department (faculty care about the SLO topics, and the list stays small enough to track, ideally three to five SLOs)
  • Describe what the student is expected to learn, but do not state how it is measured
Domain of Learning | Lower-level, Measurable SLO | Upper-level, Measurable SLO
Cognitive | Students will know the ethical concerns within the field | Students will analyze ethical concerns within the field
Affective | Students will value teamwork as an essential skill | Students will effectively persuade unproductive team members to work in the group
Behavioral | Students will write clearly within the discipline | Students will integrate theoretical constructs and practical application within their writing

More SLO resources:

Assessment: How to Develop Program Outcomes by University of Hawai’i at Manoa.
A Model of Learning Objectives by Gallaudet University.
Learning Domains by Emporia State.
Introduction to Developing Student Learning Goals by Stempien and Bair from University of Colorado at Boulder.

Measures


  • Include a mix of direct measures (e.g., tests and rubrics) and indirect measures (e.g., surveys)
    For more on this, see Strategies for Direct and Indirect Assessment of Student Learning by M.J. Allen from Duke University
  • Make sustainable
  • Include measures that are meaningful and adopted by a majority of faculty
  • Make measures span various levels of the curriculum (Lower-level and Upper-level)
  • Make each measure specific to each SLO

For example, use only the part of a measure that relates to the SLO. If the SLO involves knowledge of a certain topic, the measure, or a portion of it, should be specific to that topic. If the SLO involves Critical Thinking and you are using the MFT, the measure should be the Critical Thinking section of the MFT rather than the test as a whole.
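
If a program keeps its MFT results in a spreadsheet or data file, the idea of matching the measure to the SLO can be sketched in a few lines of Python. The record layout, field names, and scores below are hypothetical; the point is simply that only the Critical Thinking subscore feeds the Critical Thinking SLO.

    # A minimal sketch with hypothetical per-student MFT records.
    # Only the Critical Thinking subscore is reported for the Critical Thinking SLO;
    # the total MFT score is ignored.
    mft_records = [
        {"student": "A", "total_score": 158, "critical_thinking": 62},
        {"student": "B", "total_score": 171, "critical_thinking": 74},
    ]

    critical_thinking_scores = [r["critical_thinking"] for r in mft_records]
    print(critical_thinking_scores)  # the only values used for this SLO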

Describe the measure, what it entails, and any scales that are used.

Schedule (Process for Collecting Evidence)

  • Make structured and simple
  • State when and how students complete the measure and how faculty are involved in administering the measure
Outcome | Measure | Process for Collecting Evidence
Students will write clearly within the discipline | Students write a term paper in DIS 201 and again in DIS 408 | Faculty collect the term papers and grade each one using a rubric from 1 (unreadable) to 5 (inspiring writing). Scores are tracked across the years for each student.
 | Students complete an exit survey and answer a question about being able to write clearly | Faculty administer the survey to the graduating students every spring semester.

Results


Targets (Expectations for this Group)

  • These mark the point in the data at which you know whether, overall, your students have learned the material.
  • Make meaningful and feasible to faculty

Percent who Met or Exceeded Expectations

  • Report results that reflect the data actually collected
  • Make specific to the SLO

For example, report on only the part of a measure that relates to the SLO. If the SLO involves knowledge of a certain topic, the result should be specific to that topic, or to the portion of the measure that covers it. If the SLO involves Critical Thinking and you have MFT scores, report the Critical Thinking scores from the MFT, not the total MFT score.

Outcome | Measure | Process for Collecting Evidence | Target | Result
Students will write clearly within the discipline | Students write a term paper in DIS 201 and again in DIS 408 | Faculty collect the term papers and grade each one using a rubric from 1 (unreadable) to 5 (inspiring writing). Scores are tracked across the years for each student. | 80% of the students will increase their writing scores from below a 4 in DIS 201 to either a 4 or 5 in DIS 408 | 75% of the students increased their writing scores
 | Students complete an exit survey and answer a question about being able to write clearly | Faculty administer the survey to the graduating students every spring semester | 80% of students will say that they write clearly or exceptionally well | 100% of the students said that they write clearly or exceptionally well
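
To make the arithmetic behind "Percent who Met or Exceeded Expectations" concrete, here is a minimal Python sketch based on the rubric example above. The student labels and scores are hypothetical placeholders; the target follows the example table (moving from below a 4 in DIS 201 to a 4 or 5 in DIS 408).

    # Hypothetical rubric scores (1-5) for each student in DIS 201 and DIS 408.
    scores = [
        {"student": "A", "dis_201": 3, "dis_408": 4},
        {"student": "B", "dis_201": 2, "dis_408": 5},
        {"student": "C", "dis_201": 3, "dis_408": 3},
        {"student": "D", "dis_201": 2, "dis_408": 4},
    ]

    # A student meets the expectation by moving from below 4 in DIS 201
    # to a 4 or 5 in DIS 408.
    met = sum(1 for s in scores if s["dis_201"] < 4 and s["dis_408"] >= 4)
    percent_met = 100 * met / len(scores)

    target = 80  # target percentage from the example table
    print(f"{percent_met:.0f}% met or exceeded expectations (target: {target}%)")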

Use of Results


  • Make specific changes (could be small steps) that address ways to improve the scores and/or problems you discovered with the SLO or the assessment tool
  • Communicate clearly why the changes were needed to improve the scores
  • Make changes that are actionable and meaningful to faculty

Examples of Changes:

  • Modified teaching
  • Added one-minute papers
  • Started taking meeting minutes (faculty start discussing the problem, and the Chair includes it as an agenda item at department meetings until it is resolved)
  • Changed assignments
  • Modified curriculum
  • Created a focus group
  • Added an Advisory Committee
  • Attended a professional conference to see what other universities are doing
  • Dropped something (e.g., a concept in the curriculum, an assignment)
  • Shared information with people who could make suggestions (e.g., faculty, Advisory Board, Alumni, and/or current students)
  • Changed the assessment process (modified the measure or changed the SLO)
Outcome | Measure | Process for Collecting Evidence | Target | Result | Use of Results
Students will write clearly within the discipline | Students write a term paper in DIS 201 and again in DIS 408 | Faculty collect the term papers and grade each one using a rubric from 1 (unreadable) to 5 (inspiring writing). Scores are tracked across the years for each student. | 80% of the students will increase their writing scores from below a 4 in DIS 201 to either a 4 or 5 in DIS 408 | 75% of the students increased their writing scores | Faculty discussed the results and decided to add a lecture and slides on APA in DIS 408 and remove a lecture on research
 | Students complete an exit survey and answer a question about being able to write clearly | Faculty administer the survey to the graduating students every spring semester | 80% of students will say that they write clearly or exceptionally well | 100% of the students said that they write clearly or exceptionally well |

Documents to Upload


  • Mission
  • Examples of Measures (Rubrics, Surveys, etc.)
  • Documentation of Stakeholder Involvement (Faculty, Advisory Board, Other Departments, etc.)
  • Meeting Minutes
  • Timeline for Assessment
  • Curriculum Map
  • Raw Data or a report that shows analyzed data

These documents are to be uploaded to the appropriate folders in the Documents Section of Weave.

Co-Curricular

Goals


  • Large, forward-thinking statements that break down mission statements.
  • Identify one goal for each service you want your unit to provide to the university.
  • Goals will be broken down further into discrete tasks in order to determine whether goals are being met.

Objectives/Outcomes


  • Specific, measurable, supporting points for goals.
  • Make meaningful to the unit.
  • Describe what the unit is expected to provide, but do not state how it is measured.
  • Use the SMART (Specific, Measurable, Attainable, Relevant, Time-bound) approach for developing objectives.

More Objective/Outcome Resources:

Student Affairs Assessment Leaders (SAAL) - Guiding Questions for Writing Effective Learning Outcomes
Kean University - SMART Learning Objectives

Measures


  • Consist of the methods through which you will determine whether or not you have met your objectives.
  • Make measurements sustainable.
  • Make each measure specific to an objective/outcome. Every measure should relate back to an objective, but an objective can have multiple measures.
  • Different objectives require different measurement types. Consider the following:
    • Direct vs. Indirect
      • Direct measures assess performance or demonstrations that show an objective is being met.
      • Indirect measures ask people to reflect on the unit's performance toward the objective.
    • Qualitative vs. Quantitative
      • Qualitative methods are generally more time-consuming but probe deeper.
        • e.g., open-ended questions, small n, text analysis
      • Quantitative methods are generally quicker and broader.
        • e.g., closed-ended questions, large n, statistical analysis
    • Formative vs. Summative
      • Formative measures take place during the program being assessed and inform improvements that can be made before the program ends.
      • Summative measures take place after the program being assessed and inform an overall judgment of whether its objectives were met.

Methods of Data Collection

Where/how will you obtain the data?

  • Do these data already exist? (e.g., Banner data, IE reports)
  • Do you need to create a survey or testing measurement?
  • Plan your data analysis concurrently with planning your measurements. Knowing how you will analyze the data saves time and, importantly, ensures that no data are lost.
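
One way to plan the analysis alongside the measurement is to draft the tally logic for a closed-ended survey question before any responses are collected, as in the minimal Python sketch below. The question scale, target categories, and responses are hypothetical placeholders.

    from collections import Counter

    # Hypothetical closed-ended survey question with a fixed response scale.
    # Deciding up front which responses count toward the target is part of
    # planning the analysis before the data arrive.
    scale = ["poor", "fair", "good", "excellent"]
    counts_toward_target = {"good", "excellent"}

    responses = ["good", "excellent", "fair", "good"]  # placeholder data

    tally = Counter(responses)
    percent = 100 * sum(tally[r] for r in counts_toward_target) / len(responses)
    print(f"{percent:.0f}% of respondents answered good or excellent")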

More Measures Resources:

Student Affairs Assessment Leaders (SAAL) - Assessment Methods
University of Wisconsin-Madison Office of Quality Improvement - Survey Fundamentals

Results


  • Targets (Expectations for this Group)
    • These mark the point in the data at which you know whether, overall, your performance has been sufficient.
    • Make meaningful and feasible to staff.
  • Percent who Met or Exceeded Expectations
    • Report results that reflect the data actually collected.
    • Make specific to the outcome/objective.

Use of Results


Did your results indicate your unit met its objective?

  • If not:
    • What could be the cause?
    • Do objective targets need to be reconsidered?
    • Does the unit need to implement an intervention to reach the objective's target?
  • If yes:
    • Great! Brag a little! (disseminate findings / close the loop)
    • Can your objective targets be raised?
  • Make specific changes (could be small steps) that address ways to improve the scores and/or problems you discovered with the objective/outcome or the assessment tool.
  • Communicate clearly why the changes were needed to improve the scores.
  • Make changes that are actionable and meaningful to staff.

Examples of changes:

  • Create an assessment tool.
  • Modify staff assignments or positions.
  • Create a focus group.
  • Establish an assessment committee.
  • Attend a professional conference.
  • Drop an unnecessary task or procedure.
  • Solicit suggestions from similar units.
  • Change the assessment process (e.g., change or add outcomes, add new measurements).
-insert table or link to table here-