The Eli Review Blog


“Highlights” Analytics Reveal Engagement, Part II

This is the third in the Investigating Your Own Course series.

Last Thursday we used the Highlights report from Eli Review’s Course Analytics feature to determine whether Bill Hart-Davidson had designed and facilitated a feedback-rich environment in his Technical Writing course. Today’s post focuses on where to look to gauge whether that feedback-rich environment was helpful.

To continue following along using data from your own course, find your Highlights report by following these steps:

  1. Click on the name of a current or previous course from your Course Dashboard.
  2. From the course homepage, click the “Analytics” link. “Engagement Highlights” are shown first.

Question 2: To what extent was this feedback-rich environment helpful?

Last week’s post focused on simple counts and averages that helped us see the types of work students were assigned and the volume of their responses. Quantity, however, can tell us only a little about quality. This post will demonstrate ways that Eli Review analytics allow users to gauge quality. We’ll use these numbers to draw conclusions about how helpful writers and the instructor found the feedback-rich environment.

How helpful on average did writers find the feedback they received?

Writers can indicate how helpful they found comments they’ve received by rating a comment with 1-5 stars. Bill teaches his class to be as stingy with the stars as he is with endorsing. In the Comments Given group, the “Ratings” column tells us the helpfulness average across all reviews. The least helpful commenter averaged 1.1 stars whereas the most helpful commenter averaged 2.6 stars. Again, that’s a big gap: a difference of 1.5 stars!

In the "Feedback and Helpfulness" table we can see how feedback given by students was rated over the duration of the course.

Note: Additional student analytics can help Bill contextualize these helpfulness ratings by showing how an individual student compared to the top and bottom 30% of the class.

How many comments received instructor endorsement?

The Highlights report also includes the count of the comments Bill endorsed. Endorsing is a way for instructors to tell writers that they agree with peers’ comments. Bill was deliberately stingy about endorsing:

Another analytic shows that he endorsed only 1% of all the comments exchanged in the 13 reviews students completed. Bill’s approach provided a strong signal to writers about helpful feedback and to reviewers when they’d given good feedback. It’s quite telling that Bill endorsed 31 comments given by one student and just one comment given by another.

From the "Feedback and Helpfulness" table, we see the reviewer with the most comments endorsed and the two reviewers with the fewest endorsed comments.

By looking at the quantity of comments he endorsed and at the average helpfulness rating according to the writers who received those comments, Bill can feel confident that writers found the feedback they received helpful.

So, to what extent was the feedback in Bill’s class helpful?

Using two metrics from the Highlights report in Bill’s class — average helpfulness of comments given and count of endorsed comments — we’re able to conclude that students had high expectations for helpful comments, which were often met.

But, there’s more to helpfulness than counts, right?

Sure. That’s why Eli Review provides additional ways to explore and export data.

These reports make visible the learning students are doing while giving and receiving feedback. It’s formative feedback for Bill about how students are learning. It’s not everything, but it’s a lot: at a glance, in real time, and at the end of a course.

In future installments of the Investigating Your Own Course series, we’ll look at more data in the Course Analytics features of Eli Review, including trend graphs, average student profiles, and comment digests.

We hope you’ll follow along with us and investigate your own classes too. Please share your findings on Facebook or Twitter with the hashtag #seelearning!

“Highlights” Analytics Reveal Engagement, Part II was published to the Eli Review Blog in the categories Features and Media, Pedagogy.
