
Examples of using Canvas Course Analytics

The examples on this page can help you as a teacher see how Canvas Course Analytics can provide insights into your teaching through information about student progress, participation and learning behaviour. The examples describe scenarios where the tool is used, and some include screenshots of how it can look in Course Analytics.


Class progress and engagement

The following examples explore how students progress through the course and how they engage with the resources available in the Canvas room. 


1. Are my students progressing on time and according to the study plan?

Example story: "I expected at least half of my class would submit their first quiz after one week, and the second quiz after two weeks. When I checked the Course Analytics tab, I noticed that less than a quarter of the class submitted their second quiz by the deadline. A cluster of about x students submitted late or not at all. The same group struggled with the first assignment too. It’s clear some students aren’t keeping pace. In Canvas Course Analytics, I can select these students and start a dialogue with them to facilitate more help for struggling students."


2. Is student engagement with course activities satisfactory?

Example story: "The weekly engagement of students looks very high at the start of the course, with a very high rate of access to the course, but it rapidly flattened by the second week. Students' engagement isn’t consistent. Some are rarely logging in unless a deadline for assessments forces them. I also notice week by week, the engagement is dropping. I will surely have to do something to engage students more around the mid-way of the course."


3. Are some course topics or resources more attractive than others?

Example story: "The resource analytics surprised me. Students revisit the ‘Solving Systems’ videos constantly, but almost nobody is opening the ‘Diagonalisation’ notes. I can understand why it is less attractive among my students; the resource is a text-based, theory-heavy document. But the learning material includes the essential basic theories for this course. I should give more attention to this in the next lecture/discussion, to take it up and discuss, because knowing the basic theory is important."


4. Is the group more active in some weeks?

Example story: "Canvas shows big spikes for engagement in the first weeks, but little to no activity in the weeks after assignments. It’s clear that the students have a certain study pattern. They start strong and fade fast as the weeks go on. But this is not good from the course design perspective, since the workload for students around assignment deadlines will be heavy, leaving less room for students to cover all required resources and activities before assessments/examinations. If this pattern persists, I will get more delayed assignments, which will delay the engagement of the next module, and many students will miss eventual assignments (and possibly drop out) eventually."


5. Are there gaps in the activity timeline that suggest students have quiet periods in the course?

Example story: "Canvas Course Analytics shows that overall weekly logins tend to drop significantly during the weeks with no assignment deadlines. This means students are likely reading the materials and doing activities only to complete assignments. To reduce this rush, I should look into some changes to the assignment due dates to make students’ engagement patterns even across the weeks in the next course round. From next time on, I will also reserve grading and feedback time to students, not immediately after the deadline, but 2-3 days after, so students will keep logging in to receive and read feedback, and then, possibly, I will set up a discussion room for students to discuss feedback."


6. Are some discussions attracting more engagement than others?

Example story: "I noticed the engineering applications discussion was very active with many replies. Meanwhile, ‘Matrix Multiplication Efficiency’ barely had five comments. Clearly, discussion topics with real-world context or practical applications motivate students to engage in discussion more than procedural topics. Next time, I should focus more on providing real –case practical topics for the discussions."


7. How could discussions be redesigned?

Example story: "To boost participation, I’m planning to use real engineering case studies. So, instead of asking them to explain a concept, I’ll let them interpret simulation results or critique a sample code snippet. Pedagogically, students engage more when something feels authentic. I can use Canvas Course Analytics later to check how the students are engaging in the discussion and what the participation frequencies are."

Individual learner progress and engagement

The following examples focus on how individual student behaviors can inform the learning design. 


8. What activity patterns do highly engaged students show?

Example story: "Students X, Y, and Z have very steady activity patterns. Every week, they access the resources and complete activities early. Their engagement lines are consistent and strong. I can take those students’ patterns as guides for seeing the feasibility of deadlines, learning design, etc."


9. Are some students showing reduced engagement?

Example story: "Canvas trend lines show that around Week 5 – the middle of the course, several students, like Tom and Sara, stopped accessing readings and skipped two quizzes. Their early participation was strong, but it’s been declining sharply. Clearly, they’re struggling with abstract concepts. I should check if other students are having the same problems. If not, I should send them a message (from Canvas). If many other students are struggling, I should definitely revisit my learning design."


10. Do the underperforming students have a different engagement pattern?

Example story: "The gradebook shows a group of about 12 students consistently below the class average. When I cross-checked their resource usage (in Course Analytics), it all made sense; they hardly open the worked examples or tutorial videos. Low engagement equals low performance. I may need to motivate students to view those important resources, maybe they need more emphasised guidance."

Learner grade achievement

The following examples look at how assessments and results can inform the learning design. 


11. Are some assessments harder than others?

Example story: "Quiz 2 caused the biggest dip in scores; the average dropped from 72% on Quiz 1 to 45%. A lot of students left the eigenvalue proof questions blank. The Course Analytics confirm what I felt: this quiz was harder than intended. I should either give more emphasis to this in my lecture or add simpler resources as supplementary material."


12. Does greater resource use lead to greater success?

Example story: "When I checked students’ quiz performance, the highest scorers were also the ones who frequently accessed the solution walkthroughs and the MATLAB demos. The data backs it up: resource engagement is boosting success. I will have to make it more emphasised in the learning design."

Engagement with selected course resources

The following examples focus on the use of resources in the Canvas room.


13. Which course resources are accessed most and least?

Example story: "The 3D vector visualizer and row-reduction tutorials are the top hits; students were keen on trying these interactive materials. But the textbook readings are barely touched. Even the required research notes are getting far fewer views than expected. As I see these materials may not need to be mandatory, since the interactive materials give an alternative opportunity."


14. Are required resources being used?

Example story: "Only about half the class opened the ‘Essential Guide to Eigenvalues’, even though I labelled it as required. No wonder so many struggled on Quiz 2, the very resource designed to help them hasn’t been accessed. I should think about revising these resources; maybe the guide is too abstract for students."


15. Are students engaging with resources at the right times?

Example story: "The graphs show huge spikes in video views the night before quizzes, which can be a classic behaviour. Very few students explore resources consistently throughout the week. Others are waiting until the last moment, which explains the uneven performance."