My student evaluations analyzed, by me

I ask students to do a survey at the end of the semester — no extra credit for doing it, just a request to all four of my online classes. 85 students did the survey, though a few skipped an answer on a couple of questions.

Some surprises:

Dump the mp3 files?
For every weekly written lecture, I have audio of me reading it, with embedded QuickTime buttons on the web page, plus a download link for the whole mp3 file. I ask students whether they use the QuickTime buttons embedded in the lecture, download the mp3 file provided, or don’t use the audio at all. The number of students who download the full mp3 file was far lower than in previous years (only 6%), and the share using the embedded buttons much higher (62%). 32% rarely or never use the audio.

The mp3 option is time-consuming for me. Each lecture in each class is divided into sections, and each section has its own audio file. To get the mp3, I have to convert the whole set of .mov files, then zip them all together, and post a link to the zip file. So every time I change one little thing in one little section, I have to go through all this after re-recording the .mov file. Perhaps it’s not necessary. Students used to do it so they could listen in their cars or on iPods. Maybe they don’t anymore?
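For what it’s worth, that batch step could be scripted. Here’s a minimal sketch — the folder names, ffmpeg settings, and helper function are my own hypothetical choices, not my actual workflow, and it assumes ffmpeg and zip are installed:

```python
import subprocess
from pathlib import Path

def build_mp3_bundle(lecture_dir, dry_run=True):
    """Convert each section's .mov in lecture_dir to .mp3, then zip the results.

    With dry_run=True, the commands are printed instead of executed, so you
    can check what would happen before running the real conversion.
    """
    lecture = Path(lecture_dir)
    movs = sorted(lecture.glob("*.mov"))
    commands = []
    for mov in movs:
        mp3 = mov.with_suffix(".mp3")
        # -vn drops the video track; -q:a 4 is a reasonable VBR audio quality
        commands.append(["ffmpeg", "-i", str(mov), "-vn",
                         "-codec:a", "libmp3lame", "-q:a", "4", str(mp3)])
    # Bundle all the mp3s into one zip for the download link
    commands.append(["zip", "-j", f"{lecture.name}-audio.zip"] +
                    [str(m.with_suffix(".mp3")) for m in movs])
    for cmd in commands:
        if dry_run:
            print(" ".join(cmd))
        else:
            subprocess.run(cmd, check=True)
    return commands
```

Re-recording one section would then only mean re-running the script, rather than converting and zipping by hand each time.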

They are reading my comments on their quizzes!
For the first time I asked whether they were reading the comments I put on their quizzes — not because writing them is time-consuming, but because so many students seemed to make the same mistakes in essay after essay. Almost all the students read them, and only 2 out of 85 didn’t know the comments were there (I tell them how to access the comments in an announcement and email after each quiz).

They are also reading the sample essays.
While grading quiz essays, I copy and paste a few of the best into a file, which I then post (without students’ names) as sample essays. Because this is a separate link, and again, because many students continued with poor essays, I asked whether the samples were helpful. Only 11 of the 85 didn’t use them, and almost all the rest found them helpful. So I’ll keep doing that.

Not surprising:

They understand why they’re getting the grade they’re getting.
One of my most important questions tries to balance out their expectations with their performance. If I don’t have high marks on this one, there is a disconnect between grading and perceived achievement. Only 3 out of 85 students disagreed with the statement “I understand why I’m getting the grades I’m getting”.

There are still no good History of England textbooks.
My England class had a particularly high percentage of people (50%) not using the textbook extensively, and it was the only class with open comments on the textbook. England is impossible. There is only one true textbook (Roberts and Roberts, Prentice Hall), and it is two full volumes, meant for a two-semester course (mine is one semester). I am currently using Roy Strong’s The Story of Britain, but may have to switch to something, anything else.

They don’t want an e-book.
Given the preponderance of internet talk about e-books and open source texts, I had a hunch that students wanted to keep their old, boring, expensive printed textbook. 58% want to keep that printed textbook. 21% would prefer an e-text, and only 14% want free stuff on the open web instead of a textbook. (Well, OK, this last was a little surprising!)

Good students do the survey.
By their own report, the overwhelming majority of students doing the survey were earning As (51%) or Bs (39%) in the class. I ask this (and a question about how much effort they put into the class) to determine the scholastic level and dedication of the students responding to the survey. I find this most useful!

And my favorite question:

This one is about instructor presence: “I felt that Lisa was really present and visible during this class”. 68% strongly agreed and 29% agreed. Only 4% didn’t think so.

Their comments:

Regarding the comments, I have instituted several changes in response. A few students noted that it was hard to find the primary sources during a quiz, because they have to go back to the forums to look for them. One comment suggested the sources be somehow linked from the quiz area, but it doesn’t look like I can do that in Moodle, so I’ve added a “Search Forums” box to the main page and put instructions in the FAQ on using it and on opening another tab or window during a quiz.

Another commenter was frustrated with the long, long forum. I use a nested format as the default, but students can change the view to flat or threaded if they wish — it’s right at the top of each forum. I’ve also added this to the FAQ, with a screenshot.

Yet another complained about not being allowed to add their own sources, which I actually encouraged every week! Another note in the FAQ. (You’ll notice the FAQ has internal links, so I can point them from anywhere to one particular issue.)

I also noted that they tended to rate the quality of interaction with other students; they recognized it as part of the design. However, there was no consensus about what I’d done. One student raved about getting to know his/her classmates so well, while another said s/he never really got to know the other students.

Some bemoaned the repetition of the pattern of posting sources, then writing historical theses. Yet in the forums themselves, I saw enthusiasm for the former, and deep improvement over time on the latter. I’m sorry if they find it repetitive. Another word for that is practice! 😉

I may regret this, but in the interest of openness, I’m sharing the whole spreadsheet of results, objective polling only (no comments). Since I’m not being formally evaluated this semester, the only person’s privacy being violated here is my own!
