Review is the crucial step in Eli Review: it’s where writers get the feedback on their writing tasks that will drive their work in revision tasks. As an instructor, it’s important to think of yourself as the review coordinator – you establish what’s to be reviewed, you divide students into groups, and you define the criteria by which students respond to one another.
Once a review begins, you can use the real-time data generated by reviewers to model effective feedback and to guide further interventions. After reviews are complete, you can guide students in creating revision plans, which will outline their revision strategies and let you provide just-in-time coaching for their revision process.
- Creating a Review
- Review Report: Class-level Data
- Review Report: Student-level Data
- Downloading Review Data
1. Creating a Review
Designing effective reviews is the most complex work in Eli. Eli’s review features allow instructors to design reviews that are as simple or as complex as their learning goals require.
Step 1: Define Review Details
The first thing you’ll do is define some crucial contextual details about the review. Specifically:
- Review Title: Since you’ll likely have students do multiple reviews, and those reviews will be displayed in a table, it’s important to be as specific as possible when naming each one to make it easier to locate. Rather than something unspecific like “Review of Draft,” consider something much more detailed, like “Review of 1st Draft of Research Paper 1”.
- Due Date: Specify the date by which students must complete their reviews.
- Overview: An optional field that allows you to add detailed instructions for reviewers, links to online resources, or learning goals you feel it’s important for students to consider while reviewing.
- Privacy: An option that allows students to review one another anonymously. Rather than seeing the full names of the writers they review, they’ll see a generic label like “Classmate #1” instead. Important things to consider with anonymity:
- You, as the instructor, will always know who said what to whom. This option only conceals identities for students.
- Eli cannot detect student names inside the texts themselves. If you intend to have students review one another anonymously, be sure to instruct them not to put their names inside their writing.
This step only defines the structural characteristics of the review; later steps will allow you to specify what students review and how to structure their feedback.
Step 2: Review Materials
Once you’ve defined the descriptive details of your review, you’ll define what materials will be reviewed. These are the texts students will respond to during a review. From here you can select one or more of the writing tasks that have been assigned to the students in this course to be reviewed. Note that you won’t be able to create a review until you’ve created at least one writing task that can be reviewed.
Step 3: Reviewer Groups
The next step is to give the review some structure by dividing students into groups. Keep in mind that in Eli, writers and reviewers are matched 1:1: a student assigned to a group with 3 other students will review the texts of each of those 3 classmates. There are 3 ways to create groups:
- Add Group – use the “Add Group” button to manually create as many groups as needed and then drag students from the “Ungrouped” list into the desired group.
- Shuffle Into Existing Groups – click this link to shuffle all students in the “Ungrouped” list into the existing groups. The groups will shuffle each time the button is pressed.
- Specify # of reviews each writer gets – select the number of students you want in a group (selecting 4 from the dropdown means that each group will contain 4 students and each student will complete 3 reviews). This will create as many groups as necessary, with students randomly placed throughout.
Important things to keep in mind about groups:
- Rearrange when needed: You can always drag-and-drop students between groups if you don’t like how Eli has paired them, or if you want to distribute your strongest reviewers more evenly through the groups, for example.
- Avoid groups of 1: Eli will warn you if you create a review group with only 1 reviewer. A writer cannot review their own text.
- Late Writers: Eli makes it easy to see which writers have completed the assigned writing and are ready for feedback, and which aren’t ready yet. The “Don’t Group Late Writers” option will keep late writers from being sorted into groups.
Late writers are a particular challenge for coordinating feedback; see our tutorial for strategies on managing late writers.
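To make the group arithmetic concrete, here is a small Python sketch of the third option – shuffling students into groups of a fixed size while holding out late writers. The function name and details are illustrative assumptions, not Eli’s actual grouping code:

```python
import random

def make_review_groups(students, group_size, late_writers=()):
    # Illustrative sketch only -- not Eli's actual grouping code.
    # Each member of a group of size k reviews the other k - 1 texts.
    pool = [s for s in students if s not in set(late_writers)]
    random.shuffle(pool)
    groups = [pool[i:i + group_size] for i in range(0, len(pool), group_size)]
    # Avoid a leftover group of 1: a writer cannot review their own text.
    if len(groups) > 1 and len(groups[-1]) == 1:
        groups[-2].extend(groups.pop())
    return groups
```

With 7 students, one of them late, and a group size of 3, this yields two groups of 3, and each grouped student completes 2 reviews.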
Step 4: Add Response Types
The last step in creating a review is to give some structure to how you want students to respond to one another.
There are five different response types you can add to any review:
- Trait Identification: In a criteria-matching response, the prompt names a trait that you wish reviewers to identify in their peers’ work. You might, for example, create a prompt that says: “The article summary begins with a full citation of the article.” Reviewers respond by checking a box to indicate whether the trait is present or absent in the draft they are reviewing.
- Rating Scale: A rating is a scaled response item: the instructor asks the reviewer to respond along a numeric scale. The prompt may frame the scale for students right in the question: “On a scale of 1 to 5, with 1 being lowest and 5 highest, how confident are you that the article citation is in correct MLA format?” After a review, Eli will generate average responses for the entire class as well as for individual reviewers and will also identify the most highly rated writers.
- Likert Scale: A Likert scale is typically framed as a declarative statement with a fixed set of responses (strongly agree, agree, etc.) from which the reviewer is asked to select a single response. When adding a Likert scale, you’ll be asked to supply a prompt for students and to identify each item on the scale. After the review, you’ll see averages for each scale item as well as for individual writers.
- Comments: Comments are open-ended text responses from reviewers, similar to what a reviewer might write in the margin or at the end of a draft. Comments can be attached to specific passages in a text for assignments that have been typed or pasted into the Eli editor, or they can be made as global suggestions to the writer. Instructors can prompt reviewers to offer specific types of feedback with a comment response if they like. For instance, a teacher might say “please offer comments on claims that could use more evidence to strengthen the writer’s overall argument.”
- Final Thoughts: Final thoughts allow students to leave a summative comment for their peers. This is useful for having students offer global comments about the entire work or to reflect on the experience of the text, rather than respond to specific questions.
Important things to keep in mind about response types:
- Every review must have at least one response type.
- There is no limit to the number of response types a review can have. You may add as many trait identification sets, rating scales, or Likert scales as your learning goals require.
- Scaled response types (both rating and Likert scales) allow you to ask students to “Explain Your Response,” which both you and the writer will see after the review.
For more about the affordances of each response type, and some tips on designing your prompts, see our response type tutorial.
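As one illustration of how a scaled response type rolls up, a rating scale’s class average and its two “peer exemplars” could be computed as below. This is a hedged sketch with made-up names, not Eli’s actual implementation:

```python
def rating_summary(ratings_by_writer):
    # Sketch: class average plus the two most highly rated writers.
    # ratings_by_writer maps each writer to the ratings reviewers gave them.
    all_ratings = [r for rs in ratings_by_writer.values() for r in rs]
    class_average = sum(all_ratings) / len(all_ratings)
    per_writer = {w: sum(rs) / len(rs) for w, rs in ratings_by_writer.items()}
    exemplars = sorted(per_writer, key=per_writer.get, reverse=True)[:2]
    return class_average, exemplars
```

For ratings of {"Ana": [5, 4], "Ben": [3, 3], "Cal": [5, 5]}, the class average is about 4.17 and the exemplars are Cal and Ana.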
Saving as Draft / Assigning / Editing
When you’re done working on your review, you can proceed in two different ways:
- Save as Draft: this will allow you to save your work on a review without it being made visible to students. You can revisit and make changes to the review at any time.
- Assign to Students: when you’re ready for the review to be made visible to students and for them to begin their reviews, choose this option. You can make edits to the review after you assign it, but once students start responding, the only changes you’ll be able to make are the student groups and the due date.
It is possible to edit or delete a review once it’s been assigned to students, provided that students have not already begun responding to it. Once students have begun submitting feedback to one another, Eli will not allow you to change the review because:
- any feedback students already submitted would be lost, and
- students may have already seen and utilized feedback submitted by reviewers
Be sure the review is designed exactly as you’d like before students begin responding; after that, the system won’t allow edits, with the exception of:
- due date (to give reviewers more time)
- title (in case your labeling needs change)
- groups (to accommodate tardy writers)
Loading Existing Review Tasks
The repository display will allow you to browse all of the review tasks you’ve created in the past, even as part of another course. When you’ve found the task you want to clone, just click the “Load” link and the task form will be populated with the settings for that task.
You’ll be asked to enter a new due date, review groups (if not reused in the same course), and review materials. You’ll also be able to edit any of the settings from the previous task. Once you’re done editing, you can save the task as a draft or assign it to students, as usual. For more information on loading tasks, see the Task Repository section of the user guide.
What Students See When Reviewing
To get a sense of how reviews are displayed for students, see the Responding to Writing section of the Student User Guide.
2. Review Report – Class-Level Data
One of the crucial areas in which Eli stands apart from other review technologies is the data Eli produces during the review process. Once a review is available to students, Eli will prepare a series of reports that both communicate reviewer progress toward completion and surface useful data about writer and reviewer performance. This section outlines the review report summarizing performance for the entire class; the next section describes the report for individual students.
Each section of these reports is designed to provide instructors with:
- Real-time, live review results – at any given time, Eli will let you know how far along students are with the review. This data is generated in real-time, which means that you can measure progress and see what students are saying to one another even before they complete their reviews. This affords many pedagogical interventions, including the ability to stop reviewers in the middle of a review task and model effective responses.
- Surfacing “peer exemplars” – Eli will identify both writers and reviewers to whom students are responding favorably. You can use this data to model effective responses or as material for class critique.
Task Overview Display
The first display in the review report is an overview display that will help you coordinate the review, including tools that allow you to do the following:
- Reminder of response types – an overview of the structure you created for students and how they’ll be responding.
- Make edits – you’ll be able to alter the details and response types right up until the point where students have begun submitting reviews, after which you’ll only be able to edit the due date.
- Rearrange groups – you can move reviewers between groups or move them out of groups altogether based on their reviewing ability or their completion (or lack thereof) of the writing tasks under review.
- Anonymity toggle – sometimes called “projector mode,” this option allows you to conceal the identities of students while you project the report, perhaps to model a reviewer’s feedback.
You can navigate between the other sections of the report using the navigation tabs.
Responses to Writing
The “Responses to Writing” tab provides an overview of how the entire class is responding to the writing being reviewed.
This is an aggregate display, tracking not individuals but class-level performance. From here, you’ll be able to:
- Monitor progress toward completion – you’ll see this represented as a percentage of reviews complete versus reviews remaining.
- Assess data from quantitative response types – likert scales will be represented as tables, showing responses to each individual item, while rating scales will show the class average.
- Identify “peer exemplars” amongst writers – the data for rating scales will include links to the profiles of the two students rated most highly by their reviewers. This allows you to display those students’ writing to model their work or to engage in a conversation about why it was rated so highly.
- View selective comments – This display will surface a random sampling of feedback given to writers, allowing you to assess the overall tone of how students are responding to one another. You can refresh the display to get a new random sampling. This can be useful for modeling effective feedback.
This report will continue to evolve as students complete their reviews. Projecting it during a review and using it as a status board, including calling reviewers or writers out by name, can have very positive results and can help improve feedback.
Responses to Reviews
Where the “Responses to Writing” report is a display of reviewer responses to writing, the “Responses to Reviews” report is a display of how writers have responded to the reviews they’ve received. This report is meant to help identify the most helpful reviewers as well as the most helpful individual comments given by reviewers.
- Review statistics – you’ll see a breakdown of several data points regarding the review, including:
- Total number of comments made
- Average number of comments per reviewer
- Percentage of comments rated by writers
- Average helpfulness rating of all comments
- Highest-rated reviewers – Eli will identify the two reviewers rated most helpful by the writers they reviewed and will include a short summary of their performance and a sample of their comments.
- Breakdown of all reviewer ratings – an optional report will display a complete breakdown of all students, sorted by the helpfulness ratings given to them by the writers they reviewed.
- Highest-rated comments – Eli will select a sample of the most highly-rated comments from this review and display them here. They can be used to model effective feedback or to identify particularly helpful reviewers.
As with the “Responses to Writing” report, this report will evolve as students complete their reviews. It also becomes more useful as students rate the feedback they receive, which you can encourage by sharing this data with them – the more feedback is rated, the better the report.
Engagement Data
The “Engagement Data” section of the review report provides data about reviewer activity that can be helpful when coaching students in how to give better feedback. Instructors may choose to use this data in their evaluations, but it is primarily meant to give insight into reviewer behavior that can help improve the helpfulness of their feedback.
Included in this report:
- For both comments given and comments received by a reviewer:
- Count – the total number of comments given and received for each reviewer
- % Rated for Helpfulness – the percentage of comments that have been rated for helpfulness by the recipient
- Helpfulness Score – the average helpfulness rating (on a scale of 1-5) of the comments given by the reviewer and the ratings given to the comments the reviewer received*
- Ratio of comments given by a reviewer to the comments they received
* helpfulness calculations ignore comments that haven’t been rated – Eli does not assume that the lack of a rating implies anything about the comment’s perceived value to the recipient
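The averaging rule in that footnote can be sketched in a few lines of Python. This is an illustration of the rule as described, not Eli’s actual code; `None` stands in for an unrated comment:

```python
def helpfulness_score(ratings):
    # Unrated comments (None) are ignored rather than averaged as zero.
    rated = [r for r in ratings if r is not None]
    return sum(rated) / len(rated) if rated else None
```

A reviewer whose comments were rated 5 and 4, with two more left unrated, scores 4.5 – not 2.25.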
This report is truncated to fit into a standard browser window and is focused specifically on comments. The Engagement Data Download provides significantly more data about reviewer engagement.
3. Review Report – Student-Level Data
Where the class-level reports provide data about student performance in the aggregate, the individual student report provides a breakdown of a single student and their work in a review. Clicking on a student’s name (or their anonymized label) anywhere in the review report will open that student’s individual report, displaying two different views of that student’s work:
- Student as a writer – a complete breakdown of the feedback this student received from reviewers about their own writing.
- Student as a reviewer – a complete breakdown of all the feedback this student gave to the writers they reviewed.
The data in these reports resembles the data found in the class-level reports, but here it is focused on the performance of a single student. Both reports – whether listing feedback the student received or feedback the student gave – are displayed in a similar format. In both cases, you’ll see:
- Full text of the comment
- A link to view the comment in context of the writing (when applicable)
- The perceived helpfulness rating given by the comment recipient
- An option to give your endorsement to the comment
Endorsing an individual comment is a way for you, as the instructor, to send two messages at once:
- To the writer: writers will see your endorsement of a comment as a message that they should seriously consider what their reviewer said when preparing their revisions.
- To the reviewer: when a reviewer sees their comment has been endorsed, it sends them a message that you thought their feedback was particularly helpful.
Students will only see when comments have been endorsed, not when they haven’t been endorsed, so use of this feature is at the instructor’s discretion. See our endorsement tutorial for detailed examples of using endorsements strategically.
What Students See in Their Reports
To get a sense of what students see after a review, see the Review Report: Feedback from Reviewers section of the Student User Guide.
4. Downloading Review Data
While most data generated by reviewers is accessible through the review report, teachers and teacher-researchers often prefer to work with raw data when conducting research. In some cases, the typical browser window is also too small to allow a satisfying, usable display of all the data Eli makes available.
To that end, there are two download options for review data: the comment digest and the engagement data report.
Report Formatting – .csv files
Both data reports will download as CSV (comma-separated values) files. This type of file can be imported into almost all spreadsheet and database applications, making it possible to sort and query the data in any number of ways.
Report #1 – Comment Digest
The comment digest is a report compiled as reviewers complete their reviews. It is a collection of all of the comments exchanged between reviewers and the quantitative data about those comments*. Instructors can download digests of individual reviewers or of all the reviewers in a course.
A comment digest includes one record for every comment exchanged between reviewers, and each record contains the following data:
- names of the comment author and the recipient
- full text of each comment
- date and time of comment submission
- comment word count
- comment helpfulness rating assigned by recipient
- instructor endorsement (yes/no)
* The Comment Digest is only available for reviews in which contextual comments were enabled.
Sample: you can download a sample comment digest to preview this feature.
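Because the digest is plain CSV, it can be loaded with a few lines of standard-library Python. The column names below are illustrative assumptions – check the header row of your actual download:

```python
import csv
import io

# Hypothetical two-row digest excerpt; real column names may differ.
sample = """reviewer,recipient,comment,word_count,helpfulness,endorsed
Ana,Ben,Add a citation here.,4,5,yes
Cal,Ana,Nice thesis.,2,,no
"""

def load_digest(text):
    # Read each digest record into a dict keyed by column name.
    return list(csv.DictReader(io.StringIO(text)))

rows = load_digest(sample)
```

From here the records can be sorted, filtered (e.g. unrated comments have an empty helpfulness field), or imported into a spreadsheet.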
Report #2 – Engagement Data
This report is an extension of the data available in the Engagement Data tab of a review report. It includes one row per review participant, and each row contains the following data:
- reviewer name
- number of assigned reviews completed
- % completion of the review
- date of completion
- final comment made (yes/no)
- final comment word count
- ratio of comments given to comments received
- number of comments given as a reviewer (count)
- number of comments given rated for helpfulness by recipients
- % of comments given rated by recipients for helpfulness
- average helpfulness score of comments given
- number of comments received from reviewers
- number of comments received rated for helpfulness by the writer
- % of comments received rated by writer for helpfulness
- average helpfulness score of comments received
Things to note about how Eli assembles the Engagement Data report:
- the columns included correspond to the features enabled in the review (e.g. reviews that don’t use the “final comment” feature will have no such data in the corresponding report)
- numbers are rounded to 3 significant digits
- comments without a rating are excluded from comment totals and helpfulness – Eli does not assume that a lack of a rating implies a choice on a recipient’s part and does not hold that against a reviewer
Sample: you can download sample engagement data to preview this feature.
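The significant-digit rounding noted above can be expressed as a short formula. This Python sketch shows one common implementation; Eli’s exact rounding rule may differ:

```python
import math

def round_sig(x, sig=3):
    # Round x to `sig` significant digits.
    if x == 0:
        return 0.0
    return round(x, sig - int(math.floor(math.log10(abs(x)))) - 1)
```

For example, 0.123456 rounds to 0.123 and 4.6667 rounds to 4.67.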