The Eli Review Blog

New tutorials, research, teacher stories, and app updates - subscribe via RSS or our newsletter!

A More Helpful Helpfulness: Improvements to Analytics and Digests

We’re closing out 2015 with two improvements to Eli Review’s analytics to better support classroom research on peer learning. Eli helps teachers coach reviewers toward better comments by making the feedback exchange visible.

Instructors can only coach students to give better feedback when they can see how helpful writers are finding comments and quickly get a sense of what reviewers are telling writers. With that in mind, we’re making two big changes, starting today.

Improved Helpfulness Ratings


Writers use helpfulness ratings to show reviewers how much a comment helped them revise. Those ratings are useful only if they aren’t artificially skewed.

Before today, some of Eli’s analytics reported helpfulness ratings in a way that skewed them toward zero: unrated comments were counted as zeroes in the average. But a missing rating may simply mean the writer never rated the comment, not that the comment didn’t merit even a single star.

Now, all helpfulness averages exclude unrated comments.

Counting only rated comments does change this measure. Most importantly, each rated comment now contributes more to the average. If only a few of a reviewer’s comments have been rated, the helpfulness score offers a narrow rather than comprehensive picture of that reviewer’s helpfulness. The more comments writers rate, the better the score indicates how consistently a reviewer helps her classmates.
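The difference between the two averaging approaches can be sketched in a few lines of Python. This is an illustrative example, not Eli Review’s actual implementation; the ratings are hypothetical, and `None` stands in for "the writer never rated this comment."

```python
# Hypothetical star ratings (0-5) for one reviewer's comments.
# None means the writer never rated that comment.
ratings = [3, None, 5, None, 4]

# Old-style average: unrated comments counted as zero, skewing toward zero.
old_avg = sum(r if r is not None else 0 for r in ratings) / len(ratings)

# New-style average: only rated comments contribute.
rated = [r for r in ratings if r is not None]
new_avg = sum(rated) / len(rated) if rated else None

print(old_avg)  # 2.4
print(new_avg)  # 4.0
```

Note that with few rated comments, each rating moves the new-style average much more than it would have moved the old one — exactly the narrowing effect described above.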


You can see the new helpfulness ratings in action in several places across Eli’s analytics.

If you’re looking for strategies for using helpfulness ratings effectively, check out our helpful helpfulness tutorial.

Improvements to Comment Digests

While supporting several researchers investigating the comments exchanged in their classes, we’ve found a number of ways to make our current comment digest downloads more useful. Specifically, we’re adding to each row in the digest:

  • Course ID and Title, plus Review ID and Title, which help identify where specific comments came from; this makes it easier to see how comments change over time and to combine downloads from multiple tasks or courses into a single file.
  • Whether comments were added to revision plans, so you can see which comments writers planned to act on and filter the digest down to just those comments for further analysis.
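With the new columns, filtering a digest download is a one-liner in most tools. Here’s a minimal Python sketch; the column names and sample rows are hypothetical stand-ins for whatever the actual export contains.

```python
import csv
import io

# Hypothetical digest rows; the real export's columns may be named differently.
digest_csv = """Course ID,Course Title,Review ID,Review Title,Comment,Added to Revision Plan
C101,Writing 101,R1,Draft 1 Review,Tighten your thesis,yes
C101,Writing 101,R1,Draft 1 Review,Nice intro,no
C101,Writing 101,R2,Draft 2 Review,Add a source here,yes
"""

rows = list(csv.DictReader(io.StringIO(digest_csv)))

# Keep only the comments writers added to their revision plans.
planned = [r["Comment"] for r in rows if r["Added to Revision Plan"] == "yes"]
print(planned)  # ['Tighten your thesis', 'Add a source here']
```

Because each row now carries its Course and Review IDs, digests from multiple tasks or courses can be concatenated into one file and still be grouped or sorted reliably.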

Download a sample to see what the new data downloads look like.

These improvements apply to the comment digests available from any completed Review task and, in Analytics, from the Downloads tab.

For ideas about teaching reviewers to offer better comments, check out “Getting Better at Giving Helpful Feedback Matters,” where we share ideas from several instructors about using the comment digest for in-class and reflection activities.

Thanks for your feedback!

As always, these improvements are thanks to the feedback we get from instructors and from students about how Eli Review can better support your work helping writers give and get better feedback. If you have other ideas on how we might improve, please feel free to contact us!

A More Helpful Helpfulness: Improvements to Analytics and Digests was published to the Eli Review Blog in the categories Analytics, Features and Media, Feedback, Interfaces, Scholarship.
