Mike McLeod explains how Eli values comments.
Unlike other systems where reviewers’ feedback is accepted or deleted, leaving no trace, Eli captures, stores, and surfaces reviewers’ comments in various ways.
In Eli, reviewers’ feedback is as valuable as writers’ drafts.
Jeff Grabill explains how Eli motivates reviewers.
In Eli, writers rate the helpfulness of reviewers’ comments, and those ratings generate data that allow instructors and reviewers to gauge their effort and performance in giving feedback.
Jeff explains how these ratings motivate reviewers to improve.
Model in the moment.
If you teach in a setting where students have computers, use “get the most recent results” in the “Response to Writing” tab to see a random selection of the comments reviewers are giving in real time.
Stop the class to model from these comments, and coach reviewers to follow best practices.
Borrow Bill’s pattern.
In his class, Bill often models reviewers’ comments. By discussing comments while the review is happening, he helps reviewers detect the “describe, evaluate, suggest” pattern that leads to effective, helpful feedback.
No matter your classroom setting, talk with students about the trends you notice in their feedback.
Be clear about which kinds of comments you endorsed and why you chose not to endorse others.
Praise highly rated reviewers and those who gave highly rated comments.
Select “see all comment ratings for all reviewers” to identify which reviewers might need extra help.
Encourage reflection on review.
At regular intervals, encourage students to talk and write about how they are improving as reviewers. Require them to justify their claims with evidence from their helpfulness ratings and from comments they’ve given writers. See if they can detect a pattern of effective commenting in their work.
Also, encourage students to talk about the comments they’ve received: which ones still resonate with them and why.