Interpreting Evidence of Learning

Once you are routinely collecting evidence of student learning in your course, the next step is to figure out what that evidence tells you about students’ progress toward the learning goals you have set.

The following questions may be helpful during this stage:

  • What does the data suggest about what students know or have learned?
  • What does the data reveal about students’ abilities to apply the knowledge they have learned?
  • What does the data show about what students value?
  • Which students do you still know little about (e.g., what level, majors/non-majors)?

Consider the following in answering the above:

  • Use more than one rater to increase reliability when coding qualitative data (see the agreement check sketched after this list)
  • Consider both quantitative and qualitative analysis techniques
  • Protect student confidentiality throughout the process: remove student names from papers and disassociate names from the data (see the anonymization sketch after this list)
  • Return to your original questions and issues of investigation to guide your analysis
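
To make the multiple-rater suggestion concrete, here is a minimal Python sketch of one common agreement check, Cohen’s kappa, applied to two raters’ codes for the same set of student work. The rubric categories and codes below are invented for illustration, not drawn from any particular course:

    from collections import Counter

    def cohen_kappa(rater_a, rater_b):
        """Chance-corrected agreement between two raters' category codes."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                       for c in freq_a.keys() | freq_b.keys())
        return (observed - expected) / (1 - expected)

    # Hypothetical rubric codes two raters assigned to the same ten essays.
    rater_1 = ["meets", "meets", "exceeds", "below", "meets",
               "exceeds", "below", "meets", "meets", "below"]
    rater_2 = ["meets", "meets", "exceeds", "meets", "meets",
               "exceeds", "below", "meets", "below", "below"]

    print(f"Cohen's kappa: {cohen_kappa(rater_1, rater_2):.2f}")  # ≈ 0.68

A kappa near 1 means the raters agree far more than chance would predict; a low value is a signal to revisit the rubric or coding categories before drawing conclusions from the data.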
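The confidentiality step can likewise be scripted if your scores live in a spreadsheet. The sketch below assumes a hypothetical CSV file with "name" and "score" columns and replaces each name with a random ID before analysis; adjust the file and column names to your own records:

    import csv
    import uuid

    # Hypothetical file and column names; adapt to your own course records.
    anonymous_id = {}
    with open("essay_scores.csv", newline="") as src, \
         open("essay_scores_anon.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)  # expects columns: name, score
        writer = csv.DictWriter(dst, fieldnames=["student_id", "score"])
        writer.writeheader()
        for row in reader:
            # Each name maps to a stable random ID; the mapping stays in
            # this script, so the output file carries no identifying names.
            sid = anonymous_id.setdefault(row["name"], uuid.uuid4().hex[:8])
            writer.writerow({"student_id": sid, "score": row["score"]})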

Here are some additional suggestions for looking at and interpreting the data you collect:

  • Direct evidence in the form of course assignments, whether graded or not, can be evaluated specifically with regard to the achievement of course learning goals.
  • Data from Classroom Assessment Techniques (CATs) are usually anonymous, informally collected, and often knowledge-based. The general practice is to look through all of the responses, sort them into piles, and make meaning of them by considering what they show about what students are or are not learning. From there, you can decide whether to change your teaching practice, assignments, or course readings (a simple tallying sketch follows this list).
  • For MSGF or survey feedback (both examples of indirect assessment), it’s important to look over the results and discuss them with a colleague or CNDLS staff member. It also helps to talk to your class about what changes (if any) you’re making to the course based on their feedback.
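
If you have typed up CAT responses, “sorting into piles” amounts to coding each response with a theme and tallying the themes. Here is a minimal sketch with invented muddiest-point responses from a one-minute paper; in practice you would read each response and assign it a theme by hand first:

    from collections import Counter

    # Hypothetical themes hand-coded from "muddiest point" responses.
    coded_responses = [
        "confidence intervals", "p-values", "confidence intervals",
        "sampling distributions", "p-values", "confidence intervals",
        "confidence intervals", "sampling distributions",
    ]

    # Tallying the piles shows at a glance where students struggled most.
    for theme, count in Counter(coded_responses).most_common():
        print(f"{theme}: {count}")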

4 Responses to Interpreting Evidence of Learning

  1. Daryl Nardick says:

    We need examples of rubrics.

  2. Mindy McWilliams says:

    This content seems like it was written for the program assessment context. It still needs to be better adapted to the course context.

    For example, the opening paragraph uses language that is too jargony and makes it sound too hard. If I were a faculty member, that language would turn me off.

    Also the bullets really don’t fit with course-level assessment practices. You basically wouldn’t expect someone to involve a colleague in helping them assess their own course, they wouldn’t necessarily have to use qualitative and quantitative techniques, and I don’t think the last bullet even applies in this context.

  3. Mindy McWilliams says:

    If you think about the kind of data a faculty member will have from course-level assessment data gathering, it might help in writing guidance on this page about how to look at and interpret that data.

    Remember the data was:
    1) [direct] course assignments (graded or not), or pieces of them, evaluated specifically with regard to achievement of certain course learning goals
    2) [indirect/direct] CAT data (already anonymous, informally collected, often knowledge-based). The general practice in interpreting these kinds of data is to look through it all, sort it into piles, make meaning of it by thinking about how it applies to what students are/are not getting, and make course adjustments to teaching practice, readings/sources, or assignments/practice for students.
    3) [indirect] MSGF or survey feedback. Important to look over, maybe discuss with a colleague or CNDLS staff, and address with students what you are thinking about changing (or not)

  4. Mindy McWilliams says:

    Some suggested edits to the questions:
    * What does the evidence suggest about what students have learned, or where gaps in their knowledge still remain? [thinking here about assignments and CATs]
    * What does the evidence reveal about students’ abilities to apply the knowledge they have learned? [assignments and CATs]
    * What have you learned about what students value?
    * How does what you have learned from the evidence line up with your student learning goals?
