Feedback on teacher survey
In early 2020, the KTH LEQ team distributed a survey to collect information about the course evaluation process and the use of LEQ. Below you will find a short summary of the results as well as the team's answers to questions submitted anonymously.
The survey was distributed in February 2020 to approximately 600 people at KTH who, since 2015, have used LEQ as a course evaluation questionnaire. The response rate was about 16%, with respondents spread across all of KTH's five schools. The questionnaire contained 10 questions with mixed answer formats, such as multiple-choice questions combined with free-text fields.
Below we have answered some of the free text questions/comments that came up in the survey.
I usually get a low response rate on my LEQ, what can I do to increase it?
It is well known that most web-based questionnaires suffer from low response rates. There are several ways to increase the response rate:
1. At the start of the course, during your first meeting with the students, mention what last year’s students said about the course and what changes you have implemented as a result. If you have not made any significant changes, instead show a summary of the answers to the question “What advice would you give to next year’s students?”. By doing this, you show the students that their views on the course matter.
2. Talk to the students before you start the questionnaire. Remind them of the responses of last year’s students (see 1, above). Also explain how you will use the information gathered.
3. Another way of increasing the response rate (but with an important drawback) is to give the students some time to fill in the questionnaire during the last class meeting. The drawback concerns the LEQ statements that involve the assessment in the course, which is likely not completed before the study period is over. The assessment may hence not be evaluated properly.
Is there a difference in response rates between the different versions of the LEQ (LEQ6, LEQ12, and LEQ22)?
Yes, there is a slight difference. During 2019 the average response rates were:
LEQ6: 26 %
LEQ12: 28 %
LEQ22: 32 %
So there is no indication that a shorter questionnaire increases the response rate.
To increase the response rate of the questionnaire, it would be very helpful if it were possible to extend the response time, even after the survey has started.
A good way to increase the response rate once the questionnaire has started is to monitor the progress and perhaps email the students to remind them of the importance of answering it. Allowing time for final results/grades to be entered into Ladok before the questionnaire closes is also a good way to increase the response rate. At this point in time, it is not possible to change the closing date, but we will look into it.
Can you make the LEQ an assessment task, with bonus points or similar, to increase the number of participants?
If “assessing the course” is part of the course’s intended learning outcomes, it is possible, since the course assessment (i.e. exams, project assignments, labs, etc.) should only assess the intended learning outcomes.
Generally, you should steer away from forcing or bribing students to give opinions on the course. In the worst case, this will either trigger frustration among students who are forced to assess the course, or “greedy” behavior among students who will only fill in the questionnaire if they get a chance of winning an iPad or similar. Neither of these outcomes is desirable from a course development perspective. Would you trust data acquired through such measures?
KTH students grow quite tired of using these all the time, so they stop filling them in. Has anyone figured out the best way to do this in order not to "burn" the students?
We are investigating several measures which we plan to test in the near future.
I have been told by colleagues that the tone in LEQ free-text answers has become ruder in recent years.
Although we see rude comments from time to time, it is our firm belief that the LEQ has greatly reduced the number of rude comments directed at teaching staff. We believe that the main reason for this is that the LEQ statements steer away from evaluating the teachers and focus on assessing the learning environment. If this is a frequent problem, we suggest raising it with the director of studies of the department, or with the program director, who will have to bring it up with the students. As a teacher, you can also raise the issue with the students before initiating your LEQ. Rude comments might hurt your feelings, but they are not useful for course development and should be disregarded in the course analysis.
I find it hard to interpret the result and draw conclusions about specific changes.
Interpreting the LEQ footprint requires knowledge about the method. Research has established that students tend to learn more effectively if the course offers a favorable learning environment. The general idea with the LEQ is therefore to probe for indicators of this. Statements 1-6 investigate the meaningfulness of the course, statements 7-16 investigate the comprehensibility of the course, and statements 17-22 investigate the manageability of the course.
If specific changes have been made to the course, e.g. major changes in teaching methods, it may be useful to run a separate assessment investigating these changes. You might also add a free-text question at the end of the questionnaire, for example: “This course round we introduced peer review of the lab reports. Did you find this valuable?”
It would be useful to be able to add or delete questions in the questionnaire.
At present, it is possible to add free-text questions to the questionnaire. Unfortunately, no statistics will be generated for these questions. If there are certain aspects you would like to investigate, we recommend assessing these separately.
You may choose between three different versions of the LEQ to limit the number of statements, but it is not possible to delete individual statements.
There should be one more alternative with 0 LEQ questions (just the open questions in the end).
We are considering this option. It may be useful for courses at research level as well as master thesis courses.
The questions are presented in a strange order in the file presented to us (often one of the open-ended questions seems to come at the end). It would be great if this could be fixed.
This seems to be a bug. Please submit this to email@example.com, and we will investigate.
Many students provide good comments but it is not possible to track the same student's comments throughout the questionnaire. It happens that students answer "Same as above" and then it is impossible to know what the student refers to.
The SCA team would like this feature too. However, as far as we know, it cannot be implemented in the system we use to run the LEQ (Survey and Report). If anyone knows whether this is possible, we are more than happy to provide that service to you.
It would be good to know what score a student gave to a statement where he/she provides a comment.
This should be visible if the student has provided a comment on a specific statement (found at the end of the questionnaire report). The answers to the open questions are sorted by the number of weekly hours the student has spent on the course.
I have found that having a conversation with students yields better ideas for course development than using the results from a questionnaire.
Developing courses through discussion with students is often a very fruitful and effective way to bring about changes in a course. There are, however, a few potential problems with the method that a teacher must be aware of and consider:
According to the Swedish Higher Education Ordinance, section 14 (Högskoleförordningen 14 §):
Higher education institutions shall enable students who are participating in or have completed a course to express their experiences of, and views on, the course through a course evaluation to be organized by the higher education institution. The higher education institution shall compile the course evaluations and provide information about their results and any actions prompted by the course evaluations. The results shall be made available to the students. Ordinance (2000:651).
This means that by only having a discussion with a limited number of students as a means of assessing the course, you do not fulfill the requirements of the Higher Education Ordinance. Performing a questionnaire that is sent out to all registered students (or, e.g., a Kaizen meeting to which all registered students are invited) ensures compliance. Hence, do both!
There may be students who are uncomfortable with bringing up issues in a discussion with the teacher, especially if they are critical of the course or parts of it. Making sure students can voice their opinions on the course anonymously is therefore recommended (KTH decision V2016-1044).
There is no way of knowing whether the ideas suggested by the students in the discussion are representative. Are all students in agreement? That said, if individual students suggest changes in conversation with you that you find reasonable, that may also lead to course development.
I would like to see the LEQ integrated in Canvas, with the possibility to run questionnaires before the course ends and with statistics available for students to see.
This is already possible; some version of the LEQ is available through Canvas Commons. However, the SCA/LEQ group does not support or encourage this, as students are very reluctant to answer more than one questionnaire during a course. We would prefer that the questionnaire is provided through the quality-assured system we support.
Please rethink the questions: we need easy-to-understand, down-to-earth questions.
When the LEQ was developed, we ran several workshops with students to check whether they interpreted the statements as intended. This led to several revisions of the statements. Another reason for not changing the statements is the possibility to compare results between different course rounds. We are therefore reluctant to change the statements, but we are happy to receive suggestions for changes, as we are constantly working to improve the system.
It would be nice to get student feedback on how they interpret the questions, e.g. from year 1 students and year 5 students. I would also like to see more explanation in the questions. For example, Q20 (“I had opportunities to influence the course activities”): it is not really clear what is meant by “influence” or by “course activities”.