We’re closing out 2015 with two improvements to Eli Review’s analytics to better support classroom research on peer learning. Eli helps teachers coach reviewers in offering better comments by making visible:
- how helpful writers perceive those comments to be, via a 5-star helpfulness rating
- all the comments a reviewer offered peers, collected on a single screen and in a downloadable digest
Instructors can only coach students to give better feedback when they can see how helpful writers are finding comments and quickly get a sense of what reviewers are telling writers. With that in mind, we’re making two big changes, starting today.
Improved Helpfulness Ratings
Writers use helpfulness ratings to show reviewers the extent to which a comment helps them revise. Those ratings are useful only if the way they’re calculated doesn’t skew them.
Before today, some of Eli’s analytics skewed helpfulness ratings toward zero: unrated comments were counted in the average as zero stars, even though a missing rating usually means the writer simply didn’t rate the comment, not that the comment failed to merit even a single star.
Now, all helpfulness averages exclude unrated comments.
Counting only rated comments does change the helpfulness score in a few ways. The most important is that each rated comment now contributes more to the average. If only a few of a reviewer’s comments have ratings, the helpfulness score offers a narrow picture of that reviewer’s helpfulness rather than a comprehensive one. The more comments writers rate, the better the score works as an indicator of how consistently a reviewer helps her classmates.
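Here’s a minimal sketch of the difference in Python, using a hypothetical handful of ratings in which `None` stands for a comment the writer never rated:

```python
# Hypothetical ratings for one reviewer's comments; None means the
# writer never rated that comment.
ratings = [4, 5, None, None, 3]

# Old behavior: unrated comments counted as zero stars, dragging the
# average toward zero.
old_average = sum(r or 0 for r in ratings) / len(ratings)  # 2.4 stars

# New behavior: only rated comments contribute to the average.
rated = [r for r in ratings if r is not None]
new_average = sum(rated) / len(rated)  # 4.0 stars
```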
The specific places where you can see the new helpfulness ratings in action include:
- The Engagement tab in any completed Review task.
- All Course and Student-level Analytics, particularly the Helpfulness Trend Graph.
- All data downloads that include helpfulness (engagement data and comment digests).
If you’re looking for strategies for using helpfulness ratings effectively, check out our helpful helpfulness tutorial.
Improvements to Comment Digests
While supporting several researchers investigating the comments exchanged in their classes, we’ve found a number of ways to make our current comment digest downloads more useful. Specifically, we’re adding to each row in the digest:
- Course ID and Title, plus Review ID and Title, which help identify where specific comments came from; this makes it easier to see how comments change over time and to combine downloads from multiple tasks or courses into a single file.
- Whether comments were added to revision plans, which shows which comments writers planned to act on and makes it easy to filter the digest down to just those comments for further analysis (see the sketch below).
Download a sample to see what the new data downloads look like.
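As one example of what the new columns make possible, here’s a short Python sketch that stacks several digest downloads into one table and keeps only the plan-added comments. The file names and column headers (“Course ID,” “Added to Plan”) are assumptions for illustration; check your own downloads for the exact labels.

```python
# Hypothetical sketch: combine several digest CSVs and filter to
# comments writers added to their revision plans. Column names here
# ("Course ID", "Added to Plan") are assumptions; verify them against
# the headers in your own downloads.
from glob import glob

import pandas as pd

# The new Course/Review ID and Title columns keep rows distinguishable
# once multiple downloads are stacked into a single frame.
digests = pd.concat(
    (pd.read_csv(path) for path in glob("digest_*.csv")),
    ignore_index=True,
)

# Keep only the comments that writers added to a revision plan.
planned = digests[digests["Added to Plan"] == "Yes"]
print(planned.groupby("Course ID").size())
```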
These improvements to data downloads apply specifically to the digests available in any completed Review task and, in Analytics, from the Downloads tab.
For ideas about teaching reviewers to offer better comments, check out “Getting Better at Giving Helpful Feedback Matters,” where we share ideas from several instructors about using the comment digest for in-class and reflection activities.
Thanks for your feedback!
As always, these improvements are thanks to the feedback we get from instructors and from students about how Eli Review can better support your work helping writers give and get better feedback. If you have other ideas on how we might improve, please feel free to contact us!