
1. Past Assessment Summary. Briefly summarize the findings from the last assessment report conducted related to the PLOs being assessed this year. Include any findings that influenced this cycle’s assessment approach. Alternatively, reflect on the program assessment conducted last year, and explain how that impacted or
informed any changes made to this cycle’s assessment plan.

The previous problem-solving assessment was conducted in 2018 with a single class using an existing group project. That assessment found problem-solving scores between 3.2 and 3.9 on a 1-4 scale. One consideration for future assessments was to find a way to evaluate the degree of collaboration within a group. To address this, a single set of questions was developed for administration in multiple classes so that progression from the beginning to the end of the degree program could be evaluated. A 1-4 scale was again used to score the skills.

2. Action Research Question. What question are you seeking to answer in this cycle’s assessment?
The objective was to determine whether underclassmen and upperclassmen differ in their problem-solving ability.

3. Assessment Plan, Schedule, and Data Source(s).

  • a) Please provide a multi-year assessment schedule that will show when all program learning outcomes will be assessed, and by what criteria (data). Note: This schedule can be adjusted as needed. Attempt to assess all PLOs every three years. You may use the table provided, or you may delete and use a different format.
ASSESSMENT PLANNING SCHEDULE CHART
| Program Learning Outcome | Courses Mapped to PLOs | 2021-2022 | 2022-2023 | 2023-2024 | 2024-2025 |
|---|---|---|---|---|---|
| 1. Design and evaluate animal management systems by synthesizing and applying knowledge of biological processes related to animals and the rangeland plants that support them. (Knowledge) | | | | | |
| 2. Identify and critically evaluate scientific or technical animal science content to make informed decisions providing a foundation for lifelong learning. (Critical thinking) | | X | | | |
| 3. Demonstrate effective oral and written communication to a range of audiences and within collaborative environments. (Communication and collaboration) | | | X | | |
| 4. Use scientific principles to formulate questions, explore solutions, and solve real-world problems and advocate based on science. (Problem solving) | | | | X | |
| 5. Apply ethical standards to manage animal resources. (Ethics) | | | | | X |

 

  • b) What are the threshold values for which your program demonstrates student
    achievement? Note: Example provided in the table should be deleted before submission.
Threshold Values

| Program Learning Outcome | Threshold Value | Data Source(s) |
|---|---|---|
| Example: Communicate in written form about fundamental and modern microbiological concepts | 75% of assessed students score above 2 on a 1-4 scoring rubric. | Randomly selected student essays |
| 1. Design and evaluate animal management systems by synthesizing and applying knowledge of biological processes related to animals and the rangeland plants that support them. (Knowledge) | An average 20% improvement on knowledge test scores between freshmen and seniors. | Assessment exam |
| 2. Identify and critically evaluate scientific or technical animal science content to make informed decisions providing a foundation for lifelong learning. (Critical thinking) | 80% of assessed students score above 2 on a 1-3 scoring rubric. | Randomly selected student writing assignments |
| 3. Demonstrate effective oral and written communication to a range of audiences and within collaborative environments. (Communication and collaboration) | 80% of assessed students score above 2 on a 1-3 scoring rubric. | Evaluators attend student oral presentations and randomly select students |
| 4. Use scientific principles to formulate questions, explore solutions, and solve real-world problems and advocate based on science. (Problem solving) | 80% of assessed students score above 2 on a 1-3 scoring rubric. | A single set of questions administered in various classes to allow comparison of upperclassmen to underclassmen |
| 5. Apply ethical standards to manage animal resources. (Ethics) | 80% of assessed students score above 80% on the ethics assessment. | Module and quiz administered in D2L |

 

*Data sources should be examples of direct evidence of student learning: specifically designed exam questions, written work, performances, presentations, projects (using a program-specific rubric, not a course grading rubric); scores and pass rates on licensure exams that assess key learning goals; observations of student skill or behavior; summaries from classroom response systems; and student reflections.

Indirect evidence of student learning includes course grades, grade distributions, assignment grades, retention and graduation rates, alumni perceptions, and questions on end-of-course evaluation forms related to the course rather than the instructor. These may help identify areas of learning that need more direct assessment but should NOT be used as primary sources of direct evidence of student learning.

 

4. What Was Done.

a) Was the completed assessment consistent with the program’s assessment plan? If not, please explain the adjustments that were made. YES

b) How were data collected and analyzed and by whom? Please include method of collection and sample size.

Three questions were developed to assess the problem-solving abilities of students in the Animal Science curriculum (Appendix A). A cover sheet was attached asking for name (used only to remove duplicates), major, option, and level (freshman, etc.), along with a short paragraph about problem solving. This included the acronym IDEAL, with suggestions on how to approach the questions (Figure 1). The questions were given to students in ANSC 100 Introduction to Animal Science and many of the upper-division classes taught in the Spring of 2023 (ANSC 337 Diseases of Domestic Livestock, ANSC 322 Applied Breeding and Genetics, ANSC 432R Sheep Management, EQUS 430 Horse Management, EQUS 346 Equine Reproductive Management, and EQUS 347 Equine Form to Function). Other majors were excluded from the dataset; majors and classes were verified from university records. The objective was to determine whether underclassmen and upperclassmen differ in problem-solving ability. To minimize duplication, enrollment in each class was examined, and this helped determine which professors were asked to conduct the exercise. The number of students participating from each class and option is found in Table 1.

The IDEAL Problem-Solving Method includes:

  • I – Identify the problem.
  • D – Define an outcome.
  • E – Explore possible strategies.
  • A – Anticipate Outcomes & Act.
  • L – Look and Learn.

Table 1: Demographics of students participating in the problem-solving exercise.

Count of participants by class and major
Major Underclassmen Upperclassmen
ASSE 20 22
EQUS 14 13
LVMI 9 17
Total 43 52

 

c) Please provide a rubric that demonstrates how your data were evaluated. Note: Rubrics are program-specific NOT course grading rubrics. Example provided below should be deleted before submission – your rubric may be very different; it just needs to explain the criteria used for evaluating the student artifacts as they relate to the PLOs being assessed.

Accomplishment Levels:

  • Expert (grad-level work) (4): Rarely but occasionally seen in an undergraduate student.
  • Outstanding (3): Met the expectation but also extremely well done.
  • Meets Expectation (2): Average performance level; 50-70% of students should score here.
  • Below Expectation (1): Promising but not quite there.
  • Information not Present (0): Responder did not respond, or the assignment was a poor fit.

Define Problem (student will define a problem):

  • (4) Student produces a comprehensive definition of a problem and constructs a clear and insightful problem statement.
  • (3) Student accurately defines a problem and creates a convincing problem statement.
  • (2) Student defines a problem and constructs a detailed problem statement.
  • (1) Student begins to demonstrate the ability to define and construct a problem statement.
  • (0) Information not present.

Identify Strategies (students will solve the problem with data provided):

  • (4) Student develops a comprehensive approach for solving the problem using the data provided.
  • (3) Student applies comprehensive approaches for the problem and uses data to support it.
  • (2) Student identifies and applies data to solve a problem.
  • (1) Student identifies and applies data inadequately to solve the problem.
  • (0) Information not present.

Propose and Implement Solutions (correctly identify the solution):

  • (4) Student convincingly identifies the solution for the problem.
  • (3) Student identifies the solution but misses some nuances.
  • (2) Student misses some of the solutions.
  • (1) The solution is not accurately identified.
  • (0) Information not present.

Justification of Selected Solution:

  • (4) Student comprehensively utilizes data to justify the selection.
  • (3) Student clearly utilizes data to justify the selection.
  • (2) Student utilizes data but needs clarity in the presentation of the solution.
  • (1) Student attempts to use data but does not clearly understand the use of the data.
  • (0) Information not present.

 

The answers were evaluated by five faculty members using the rubric above. The first two students (one upperclassman and one freshman) were evaluated as a group; after that, each student was evaluated independently by two different faculty members, and the resulting evaluations were summarized. Upperclassmen were defined as juniors and seniors, while underclassmen were defined as freshmen and sophomores. Each individual's GPA, obtained from DegreeWorks, was added to the dataset. The regression coefficient was low but statistically significant, with average score increasing with GPA (Figure 2). Most of the students who completed the questions had GPAs between 2.5 and 4.0.
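The GPA relationship described above (Figure 2) is a simple linear regression of average rubric score on GPA. The sketch below is illustrative only: it fits such a regression by ordinary least squares on invented (GPA, score) pairs, not the actual assessment data.

```python
import numpy as np

# Hypothetical (GPA, average rubric score) pairs -- NOT the real assessment data.
gpa    = np.array([2.5, 2.8, 3.0, 3.2, 3.4, 3.6, 3.8, 4.0])
scores = np.array([1.1, 1.3, 1.4, 1.5, 1.6, 1.6, 1.8, 1.9])

slope, intercept = np.polyfit(gpa, scores, deg=1)   # ordinary least-squares fit
r = np.corrcoef(gpa, scores)[0, 1]                  # correlation coefficient
print(f"slope={slope:.3f}, intercept={intercept:.3f}, r^2={r**2:.3f}")
```

A positive, statistically significant slope with a low r-squared would match the pattern reported here: GPA is related to problem-solving score, but explains only part of the variation.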

5. What Was Learned.

  a) Based on the analysis of the data, and compared to the threshold values established, what was learned from the assessment?

    The upperclassmen scored significantly higher on defining the problem, identifying strategies for solving it, identifying the correct conclusion, and justifying the solution with the data presented (Table 2). Though statistically significant, the differences were not large, suggesting there is room to improve the students' ability to problem-solve. Furthermore, a score of 2 indicates that the faculty judged the individual to have met expectations in a given category; neither upperclassmen nor underclassmen averaged this score. The current assessment plan calls for 80% of evaluated students to meet expectations for problem solving. This threshold was not met: only 24% of the upperclassmen scored 2 or greater across the three problems (Table 3). Additionally, there was no difference between the options; the option data are summarized in Table 4.

 

Table 2: The effect of class and major on the problem-solving skill for Animal Science students in all
options.

| Category | Underclassmen | Upperclassmen | SE | P-value (Class) | P-value (Major) |
|---|---|---|---|---|---|
| Define the Problem | 0.94 | 1.13 | 0.37 | <0.01 | 0.42 |
| Identify a Strategy | 1.49 | 1.82 | 0.14 | <0.01 | 0.93 |
| Propose Solution | 1.63 | 2.03 | 0.20 | <0.01 | 0.92 |
| Justify Solution | 1.44 | 1.80 | 0.13 | <0.01 | 0.85 |
| Avg. Score | 1.37 | 1.70 | 0.15 | <0.01 | 0.97 |

 

Table 3: Percent of students scoring greater than or equal to 2 by question and class.

Question Underclassmen Upperclassmen
Question 1 20.93 37.73
Question 2 9.30 28.30
Question 3 16.28 24.53
Total 11.63 24.53
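The percentages in Table 3 follow from the threshold rule in section 3b: the share of assessed students scoring at or above 2 on the rubric. A minimal sketch of that calculation, using invented scores rather than the actual responses:

```python
# Hypothetical per-student rubric scores for one question -- NOT the real data.
underclassmen = [1.5, 2.0, 1.0, 2.5, 1.5, 1.0, 2.0, 1.5, 1.0, 1.5]
upperclassmen = [2.0, 2.5, 1.5, 3.0, 2.0, 1.5, 2.5, 2.0]

def pct_meeting_threshold(scores, threshold=2.0):
    """Percent of students scoring at or above the threshold."""
    return 100.0 * sum(s >= threshold for s in scores) / len(scores)

print(f"underclassmen: {pct_meeting_threshold(underclassmen):.1f}%")
print(f"upperclassmen: {pct_meeting_threshold(upperclassmen):.1f}%")
```

Applying this per question, and then across all three questions per student, yields percentages of the form shown in Table 3.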


Table 4: Average overall score by option and class.

Option Underclassmen Upperclassmen
Science 1.42 1.77
Equine Science 1.52 1.56
Livestock Management and Industry 1.28 1.83


There were some differences in the students' ability to answer individual questions; the average scores by question are summarized in Table 5. The questions may have needed more description of what was expected, which could have contributed to some of the differences. The gap between classes was smaller for questions 1 and 2, which required an understanding of EPDs for beef cattle, and larger for question 3, which simply asked students to identify the better water filter for a backcountry hiker.

Table 5: The average by class for the individual questions.

Question* Underclassmen Upperclassmen Difference
Question 1 1.42 1.79 0.37
Question 2 1.40 1.77 0.37
Question 3 1.24 1.66 0.42

*Actual questions can be found in Appendix A

b) What areas of strength in the program were identified from this assessment process?

This was one of the first times comparisons were made between the beginning and end of the program for problem solving. There is an improvement in the ability of individual students to solve problems as they progress through our curriculum. Before this, we had selected individual classes and individual assignments to evaluate the program.

c) What areas were identified that either need improvement or could be improved in a different way from this assessment process?

The comparison of underclassmen to upperclassmen suggests there is a slight improvement from the beginning of our program to later in the degree program. However, the students' problem-solving ability did not meet our goal of 80% of upperclassmen meeting expectations. The recommendation is to develop a consistent approach to problem solving in all classes. The approach shown in Figure 1 could be used to train students, but others are available online. One major weakness was the students' ability to define the problem; it is important to identify a problem before trying to solve it. Part of this could be the construction of the questions, but training students to define the problem in writing whenever they approach one would help develop their problem-solving skills. As a side note, students had trouble reading, comprehending, and following instructions, which suggests either a reading-comprehension problem or a lack of motivation to complete the exercise.

6. How We Responded.

     a) Describe how “What Was Learned” was communicated to the department, or program faculty. How did faculty discussions re-imagine new ways program assessment might contribute to program growth/improvement/innovation beyond the bare minimum of achieving program learning objectives through assessment activities conducted at the course level?

The assessment was presented to the Animal and Range Sciences faculty during our annual faculty retreat, and time was allotted for feedback and recommendations. The discussion reiterated the recommendation to develop a consistent approach to problem solving in all classes; the approach shown in Figure 1 could be used to train students, but others are available online. A major point of discussion was the students' difficulty defining the problem: since a problem must be identified before it can be solved, training students to define the problem in writing whenever they approach one would help develop their problem-solving skills.

     b) How are the results of this assessment informing changes to enhance student learning in the program?

The assessment has led to discussion among faculty on identifying a common problem-solving approach to implement throughout our curriculum and on taking a deliberate approach to problem solving in the classroom.

     c) If information outside of this assessment is informing programmatic change, please
describe that. N/A

     d) What support and resources (e.g. workshops, training, etc.) might you need to make
these adjustments? N/A

7. Closing the Loop(s). Reflect on the program learning outcomes, how they were assessed in the previous cycle (refer to #1 of the report), and what was learned in this cycle. What action will be taken to improve student learning objectives going forward?

     a) In reviewing the last report that assessed the PLO(s) in this assessment cycle, what changes proposed were implemented and will be measured in future assessment reports?

Comparing the results from this assessment cycle to the previous one suggests that group projects encouraging interaction improve the students' ability to solve a problem, though group work can also mask weaknesses in students who are less assertive in a group setting. As indicated in the previous assessment, there is a need to be able to evaluate an individual's contribution when a group assignment is used for the assessment. Utilizing a standard set of questions indicated that individuals working alone were not as good at solving problems. Because we encourage collaborative work to solve problems, it remains difficult to distinguish how much of a group's better solution reflects individual problem-solving skill.

     b) Have you seen a change in student learning based on other program adjustments made in the past? Please describe the adjustments made and subsequent changes in student learning. N/A