Friday, February 19, 2016

Midterm Evaluations

I do midterm evaluations in my classes.  It's not quite midterm, but I had a sense that maybe the class wasn't going so well, so I handed them out today instead, just to get some data.  Boy, did I.  I kind of want to frame one of them, for its pure eye-rolling irony:

"I am never engaged in this class because it has nothing to do with my major."

His major?  Criminal justice.
The class?  Ethics.

Yeah.

--Prof Chiltepin


16 comments:

  1. I had the "pleasure" of busting some kids for cheating on...

    their ethics exam.

    Replies
    1. My high school ethics teacher went to jail for murder not long after I took the class (it was actually a bit more complicated than that -- it was a domestic situation, and one could argue that the guy had it coming to some extent, though not to the extent of being killed outside of self-defense, which is what happened -- but still, it did cast a new light on some of our discussions).

  2. I mean... that's pretty accurate if the behavior of law enforcement is anything to go by.

  3. "He seems to know what he's talking about."

    This was an eval I collected at midterm, so we had completed chapters 1-5 of the textbook - mostly high school chemistry-level material. Christ on a popsicle stick, I hope that I know that stuff. I just love the "seems" in the statement, as if the student isn't quite sure but he's willing to give the benefit of the doubt.

    Replies
    1. Yeah, the students being experts in the field and fully qualified to judge who else is expert as well, and all that.

  4. I... I don't fill out student evaluations. I'm sorry. I'm that guy. I've been told repeatedly they're meaningless. If I want to congratulate the professor on something they did particularly well, I'll tell them face to face. If I want them to be rewarded for it, I'll write a letter with my Goddamn name attached.

    If they did something that I didn't like, I'm not putting that on paper because it's shitty to take it out on their job. And if it was truly egregious (hasn't happened yet), again I'm writing a letter with my Goddamn name attached.

    Replies
    1. Thing is, we KNOW they aren't good at what they aim to do, but departments are still required to collect them.

      A return rate below a set threshold, or lower 'scores' (usually expressed as means, even though those don't make much sense for Likert scales), leads to various 'consequences' that take away time and energy from actual teaching, and can make your academics less happy and confident, which in turn affects their teaching effectiveness.

      So sure, they're dumb in many ways, but the administrators love them, and by not filling them out you're having a (small, but cumulative) effect on your faculty, and probably one you don't want to have. Tick a few 'average' boxes, write 'n/a' or 'nothing' in the open response boxes, and you're done... please?

      Grumpy Academic (who currently has the service role that requires writing a report to the Dean explaining every instance of 'low returns' on evaluations, an action plan for how this will be avoided next year, and an effectiveness assessment of how well last year's action plan panned out. Having to do this definitely affects my interactions with students, and my ideas, for the rest of the day at least. And sure, we push back every year on why we do this, but the growing distrust of any professional across the board in the UK (and the US) means that not wanting to do this sort of documentation is usually seen as proof of 'guilt', that is, of not taking customer-student opinions seriously, which is a Bad Thing in itself. So please just do the forms so we can tick our own boxes? The easiest report by far is the 'all modules met required levels of evaluation returns' one...).

    2. I stopped filling them out the last few years, after I started reading this site and realized how shitty the evaluation system is. It also seemed so ridiculous that we're supposed to evaluate instructors on how well they covered relevant hamster-fur weaving material. Well, since I'm new to the subject (most of the time), how the hell am I supposed to judge that?

      I also suspect these evaluations carry more weight against adjuncts. Wouldn't they be more vulnerable to negative evals because the school doesn't have much else to go by in judging adjuncts, as compared to a more senior instructor or someone with tenure? But fine, thanks for the comments here from Anonymous about why students should just fill them out anyway. Since I'm back in school again, I'll do them from now on. I used to do them only if the class was really good, or to put in a good word for an instructor who I thought might get bad evals from a clique of axe-grinders in class.

      But it is such BS. Especially the theory that a shortage of evaluations is indicative of some vague wrongdoing on the part of the instructor. Seriously, that methodology seems completely off base. As adults with experience in other parts of the civilian world and the military, we know these evaluations are BS. Soldiers often have to be -ordered- to fill out "command-climate" surveys and the like because they are simply too damn busy with other things or know the evaluations don't change anything.

      But thanks for explaining the eval system here. I will fill the damned things out from now on and pass the word.

    3. Our end-of-term evaluations are the administration's sole metric of being a "good teacher," but our midterm evals are only read by the instructor. That's actually useful, in three ways. It gives the students a chance to give useful feedback, like "please write larger, we can't read your writing." It lets them blow off some steam, like complaining about the number of exams. And it gives them the (sometimes accurate) perception that I care what they think, which can make them feel better about the class.

    4. If the instructor hands out a mid-term evaluation or a private evaluation, I'll obviously fill that out.

  5. I agree with the part that says students not filling out stuff they are sent has a negative impact on instructors (although I understand why students might not do it), and I understand about return rates, but can you say more about the issue of means? It always irks me, but I don't know enough about Likert scales and how they do (or don't) work.

    Replies
    1. Means make more sense if you have a continuous range of values to choose from, instead of integers 1 through 5 (strongly agree, agree, neutral, etc.). It's like an average household having 2.3 children; that doesn't really make sense. It's my understanding that you're supposed to use a different type of statistics when handling data consisting of integer categories like these, but I'm not really sure about the why or how.
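      As a rough illustration (hypothetical numbers, and Python purely as a sketch), two classes can have identical mean ratings and yet completely different distributions, which is part of why a mean on its own can mislead for this kind of ordered-category data:

        # Hypothetical Likert ratings (1-5) for two imaginary classes.
        from statistics import mean, median
        from collections import Counter

        class_a = [3, 3, 3, 3, 3, 3, 3, 3]  # everyone neutral
        class_b = [1, 1, 1, 1, 5, 5, 5, 5]  # polarized: half hated it, half loved it

        for name, ratings in [("A", class_a), ("B", class_b)]:
            print(name, "mean:", mean(ratings),
                  "median:", median(ratings),
                  "counts:", Counter(ratings))
        # Both means are 3.0, but the two classes plainly had different experiences.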

    2. Cheers, Ben. I remember data can be nominal, ordinal, interval, or ratio (but only because it spells "noir"), but that's about it.
      I'm pretty sure that if my teaching effectiveness was 3.2 last year and 3.1 this year, it doesn't necessarily show that I have got worse, although I believe that's how admin see it here.

    3. Regarding the means of Likert scale data, I recall reading something on a related issue when I was trying to teach myself stats. Suppose you give 1000 people a test comprising a single true-false question, and 501 of them pick the correct answer. You wouldn't refuse to report a mean score of 50.1% on the basis that an individual score can only be 0% or 100%.

      So, saying that the mean of a Likert scale item is 3.1 or 3.2 is probably OK even though it's not possible for a single rater to assign a score with that precision. What may be wrong is inferring that the difference in means is distinguishable from random chance, i.e., noise. Methinks that greatly depends on the sample size and the spread of the ratings within the two datasets (there's a quick back-of-the-envelope simulation of this below).

      But I am not a statistician, so I can't definitively say that my ramblings are significantly different from noise.
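      For what it's worth, here is that back-of-the-envelope simulation (hypothetical ratings, and Python purely as a sketch): with class-sized samples, a drop from roughly 3.2 to 3.1 in the mean is usually indistinguishable from randomly re-shuffling the same ratings between the two years.

        # Hypothetical Likert ratings (1-5) for two years of the same course.
        import random
        random.seed(0)

        last_year = [4, 3, 4, 2, 3, 5, 3, 3, 2, 3, 4, 3, 3, 2, 4, 3, 3, 4, 2, 3]  # mean 3.15
        this_year = [3, 3, 2, 4, 3, 3, 2, 5, 3, 3, 4, 2, 3, 3, 3, 4, 2, 3, 3, 3]  # mean 3.05

        def mean(xs):
            return sum(xs) / len(xs)

        observed = abs(mean(last_year) - mean(this_year))
        pooled = last_year + this_year
        n = len(last_year)

        # Permutation test: shuffle the pooled ratings and see how often a
        # random split produces a mean difference at least as large as observed.
        extreme = 0
        trials = 10_000
        for _ in range(trials):
            random.shuffle(pooled)
            if abs(mean(pooled[:n]) - mean(pooled[n:])) >= observed:
                extreme += 1

        print(f"observed difference: {observed:.2f}, p ~ {extreme / trials:.2f}")
        # With only 20 ratings per year, a 0.1 drop is typically well within noise.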

