Program(s): BS Ranching Systems
Academic Year Assessed: 2023-2024
Reviewed: Spring 2025


The Assessment and Outcomes Committee (AOC) oversees and supports the assessment of all undergraduate and graduate degree programs not subject to external accreditation. The AOC evaluates Academic Program Assessment Reports and provides constructive feedback to help faculty refine program goals and enhance student learning. The ultimate purpose of academic program assessment is to ensure that academic programs meet their objectives—equipping MSU graduates with the knowledge and skills they need to succeed in the workforce or in graduate studies.

The Program Assessment Report submitted in Fall 2024 for the degree program(s) listed above was reviewed by at least two members of the Assessment and Outcomes Committee (AOC). The summary below includes commendations, recommendations, and ratings for each criterion in the report. When reviewers assigned different ratings, both are provided for your reference. Additional feedback, if offered, can be found in the “Program Report Element” section. For detailed explanations of the rating criteria, please refer to the AOC Rubric for Program Assessment Report Elements (2023-2024) at the end of the report.


The AOC values the time and thoughtful effort faculty invest in assessment. Our goal is to support you in strengthening your program and enhancing student learning. We view assessment as an ongoing, collaborative process and are here to serve as partners in continuous improvement.

Contact Assistant Provost Deborah Blanchard at deborahblanchard@montana.edu with questions or to schedule individual meetings to discuss or brainstorm program assessment goals further.

Commendations:

  • I really appreciated that the program was willing to set aside its initial assessment plan and reorganize the assessment to align with assessments occurring in other programs in the department. Recognizing that a single assessment was not providing enough information to assess this LO is a good outcome of doing this assessment and will allow the faculty to tighten up the plan for assessing this outcome in the future.
  • Discovering that the assignment used may need to be revised in order for students to demonstrate their understanding of the LO is an excellent outcome of conducting a program assessment and really speaks to how involved the faculty are in wanting to improve their programs. It was an excellent report.
  • I commend animal sciences for really reflecting on ethics and how to assess it. Though the current assessment shows that students were not quite where faculty wanted them to be, the efforts to coordinate how ethics is taught across courses and to assess it in a similar fashion across programs show a definite awareness of, and desire to improve, student learning.

Recommendations:

Reviewers noted that programs could benefit from:

  • Using Bloom’s Action Verbs for Learning Outcomes to create measurable PLOs.
  • Developing more robust assessment rubrics.
  • Using more direct evidence for assessment purposes (e.g., actual student work, prompts constructed specifically for assessment purposes, etc.)
  • X Collecting indirect evidence (e.g., surveys of students, alumni, or advisory councils) to support curricular or program changes, leading to better direct evidence artifacts/data.
  • Utilizing campus partnerships (e.g., Career Services, Writing Center, AYCSS, etc.)
  • Including an alignment chart to map PLOs from various majors, options, minors, or certificates to the program assessment LOs listed.
  • X Considering how assessment results will be used to support 7-year cycle Program Review reporting as part of closing the loop and future planning.

 

  • I was confused that only four pieces of student work were used for assessment. Were there only four students taking the course? If there were more students taking the course (even if they are declared in other programs), it is appropriate to use any student work available in a given course. This is not an assessment of individual students; it is an assessment of the program and how well the program's curriculum has been designed to meet the expectations laid out for it. That means any student taking a required course with an assignment designed to meet a PLO is "fair game" for program assessment purposes. If the course assignments are aligned with specific PLOs (even across multiple programs), then it does not really matter which program a student taking the course has declared. Seven-year program reviews require a more focused look at student enrollment, retention, and graduation rates as they relate to course enrollment, and at that level it may matter how a self-study assesses the declared students in a major. (For example, at the 7-year program review cycle, it might be found that only an average of four out of 25 students are taking a required course for a program, and their success may or may not be contributing to the overall success of the program.) The annual program assessment process is focused on continuous improvement of student learning - it gives faculty the right information to go deeper into the curriculum and adjust assignments so that what is supposed to be learned is actually happening.
  • Please use the most current reporting template - these are updated and available on the Provost's website.
  • I wonder whether the original idea of interviews with students might provide a better assessment, or whether the program might work with C-STES on an assessment tool for ethics.
  • All programs are reminded that annual program assessments support the 7-year Program Review reporting process. When applicable, programs are encouraged to document how assessment findings may inform future planning.

Specific Item Ratings and Feedback

Program Learning Outcomes - Student learning outcomes identify the intended knowledge, understandings, or abilities that students will acquire through the academic program. The majority of these outcomes are at a high cognitive level.

Rating: Outstanding

  • Reviewer: PLOs are written well.

What Was Done and How Data Were Collected (Assessment Plan) - The report describes the methodology for data collection and analysis.

Ratings: Outstanding; Achieving

  • Reviewer: The report explains that the plan was altered to align with assessing PLO #5 and to coincide with the assessment plans of other degree programs in ANRS, coordinating more robust results.
  • Reviewer: They deviated from the planned student interviews in order to better match the rest of the degree programs. They used an assignment instead, and two faculty scored it with a rubric.

What Was Learned (Assessment Findings) - Findings describe what was learned from the assessment measures. Comparisons are made to threshold values (if they are present). Thoughtful interpretation is made to define Areas of Strength and Areas That Need Improvement based on analysis of data.

Ratings: Outstanding; Excellent

How We Responded - Sharing Results with Faculty - Results were communicated to the department or program faculty, with a forum for faculty feedback and recommendations.

Rating: Outstanding

  • Reviewer: It sounds like there is a concerted effort in the animal sciences programs to figure out how to assess ethics together. Thus, the individual programs may begin to revise their original assessment plans.

How We Responded - Changes in Response to Findings - The findings are used to inform annual action plans to improve the program. Assessment findings are appropriately used as information that drives improvement in learning, instruction, curriculum, or strategic planning.

Rating: Outstanding

Closing the Loop - Based on assessment from previous years, program-level changes that have led to program improvements have been implemented and are described.

Rating: Outstanding

Note: AOC Reviewers do not always agree on ratings in their Review Reports, nor do they always provide explanation for their ratings. Any specific feedback notes related to Program Report Elements are provided in the boxes above. Refer to Recommendations for overall feedback.