Undergraduate and graduate assessment reports are to be submitted annually by each program. The report deadline is October 15th.

A printable Word Document of this template can be found here. 

Annual Program Assessment Report

Academic Year Assessed:
College:
Department:
Submitted by:

Program(s) Assessed:   
Indicate all majors, minors, certificates and/or options that are included in this assessment:

Majors/Minors/Certificates  | Options
----------------------------+---------------------------
                            |
                            |
                            |

The Assessment Report should contain the following elements, which are outlined in this template:

  1. Assessment Plan, Schedule, and Sources
  2. What was done this assessment cycle – including rubrics, how data was collected, and who analyzed it
  3. What was learned – including areas of strength and areas for improvement
  4. How we responded and plans for improvement
  5. Closing the loop

Sample reports and guidance can be found at: https://www.montana.edu/provost/assessment/program_assessment.html 


1. Assessment Plan, Schedule, and Data Sources


a. Please provide a multi-year assessment schedule showing when each program learning outcome will be assessed and by what criteria (data). (You may use the table provided, or delete it and use a different format.)

ASSESSMENT PLANNING CHART

PROGRAM LEARNING OUTCOME | 2020-2021 | 2021-2022 | 2022-2023 | 2023-2024 | Data Source*
-------------------------+-----------+-----------+-----------+-----------+-------------
                         |           |           |           |           |
                         |           |           |           |           |
                         |           |           |           |           |
                         |           |           |           |           |
                         |           |           |           |           |

*Data sources can be items such as randomly selected student essays or projects, specifically designed exam questions, student presentations or performances, or a final paper.  Do not use course evaluations or surveys as primary sources for data collection.

b. What are the threshold values by which you demonstrate student achievement? (The example provided in the table should be deleted before submission.)

Threshold Values

PROGRAM LEARNING OUTCOME | Threshold Value | Data Source
-------------------------+-----------------+------------
Example: 6) Communicate in written form about fundamental and modern microbiological concepts | The threshold value for this outcome is for 75% of assessed students to score above 2 on a 1-4 scoring rubric. | Randomly selected student essays
                         |                 |
                         |                 |
                         |                 |

2. What Was Done


a. Was the completed assessment consistent with the plan provided?
YES_____ NO_____

b. If no, please explain why the plan was altered.

 

 

c. How were data collected? (Please include method of collection and sample size.)

 

 

d. Explain the assessment process, and who participated in the analysis of the data.

 

 

e. Please provide a rubric that demonstrates how your data was evaluated.
(The example provided below should be deleted before submission. Your rubric may be very different; it just needs to explain the criteria used for evaluating student achievement.)


Example: Rubric for outcome #6

Indicators | Beginning - 1 | Developing - 2 | Competent - 3 | Accomplished - 4
-----------+---------------+----------------+---------------+-----------------
Analysis of Information, Ideas, or Concepts | Identifies problem types | Focuses on difficult problems with persistence | Understands complexity of a problem | Provides logical interpretations of data
Application of Information, Ideas, or Concepts | Uses standard solution methods | Provides a logical interpretation of the data | Employs creativity in search of a solution | Achieves clear, unambiguous conclusions from the data
Synthesis | Identifies intermediate steps required that connect previous material | Recognizes and values alternative problem-solving methods | Connects ideas or develops solutions in a clear, coherent order | Develops multiple solutions, positions, or perspectives
Evaluation | Checks the solutions against the issue | Identifies what the final solution should determine | Recognizes hidden assumptions and implied premises | Evaluates premises, relevance to a conclusion, and adequacy of support for the conclusion

This type of rubric can be used at all levels of assessment (the anticipated evaluation score may vary with course level). Some rubrics or assessments may be tailored to particular courses (e.g., designed to assess outcomes in upper-division or lower-division courses), in which case scores might be similar across course levels. Alternatively, if you are assessing more basic learning outcomes, you might expect those outcomes to be established earlier in the academic career.

NOTE: Student names must not be included in data collection. Totals of successful completions and the manner of assessment (publications, thesis/dissertation, or qualifying exam) may be presented in table format if they apply to learning outcomes.

3. What Was Learned

Based on the analysis of the data, and compared to the threshold values provided, what was learned from the assessment?

a. Areas of strength

 

 

b. Areas that need improvement

 

4. How We Responded

a. Describe how “What Was Learned” was communicated to the department, or program faculty. 

 

 

b. Based on the faculty responses, will there be any curricular or assessment changes? YES______ NO_______

 

 

c. If yes, what changes will be implemented? (Choose all that apply and describe specifically below under d.)

Gather additional data to verify or refute the result

 

Highlight areas where the acceptable performance threshold has not been met

 

Change the acceptable performance threshold

 

Evaluate the rubric to assure outcomes meet student skill level

 

Identify potential curriculum changes to try to address the problem

 

Use Bloom’s Taxonomy to consider stronger learning outcomes

 

Choose a different assignment to assess the outcome

 

Other (please describe):

 

 

 

d. Please indicate which outcome is targeted and how changes will be measured for improvement. If other criteria are used to recommend program changes (such as exit surveys or employer satisfaction surveys), please explain how the responses are driving department or program decisions.

 

 


e. When will the changes next be assessed?

 

5. Closing the Loop


a. Based on assessment from previous years, please describe program level changes that have led to outcome improvements.

 

 

NOTE: Student names must not be included in data collection. Successful completions and the manner of assessment (publications, thesis/dissertation, or qualifying exam) may be presented in table format if they apply to learning outcomes. In programs where numbers are very small and individual students could be identified, the focus should be on programmatic improvements rather than individual student success. Data should be collected throughout the year and reported annually.

 

Submit report to [email protected]