Eli Review User Guide

Review Tasks

Review is the central step in Eli Review: reviewers give feedback based on criteria, and writers get the feedback they need on their writing tasks so they can revise more effectively.

As an instructor, it’s important to think of yourself as the review coordinator – you

  1. establish what’s to be reviewed
  2. define the criteria and ways reviewers give feedback to writers
  3. assign groups and the task
  4. debrief to talk with students about peer learning

Debriefing can happen during and/or after review begins. You can use the real-time data to model effective feedback. After reviews are complete, you can model peer exemplar drafts and guide students in creating revision plans.

1. Creating a Review

Eli’s review tasks allow instructors to devise guidelines for peer feedback that include checklists, ratings, and comments.

Step 1: Define Review Details

The first thing you’ll do is define some crucial contextual details about the review. Specifically:

Loading Existing Review Tasks

Instead of creating a review from scratch, you can “load from library” to borrow and customize a task from Eli’s curriculum or from any Eli course in which you’ve been an instructor or co-instructor.

Load from library is an option in the upper right.

The details in Step 1, as well as those in Steps 2 and 3, will likely need to be updated, but Step 4 (Response Types) will be complete. The repository display allows you to browse all of the review tasks you’ve created in the past, even as part of another course. When you’ve found the task you want to clone, just click the “Load” link and the task form will be populated with that task’s settings.

For more information on loading tasks, check out the tutorial on peer feedback for TILT writing assignments and see the Task Repository section of the user guide.

Step 2: Review Materials


Next, you’ll define the writing tasks to be reviewed.

You can select up to 5 tasks.

For example, you might select:

Step 3: Reviewer Groups

This step puts students into groups. Eli requires reciprocal peer learning.

Students assigned to a group with 3 other peers will be required to give feedback to all 3 peers BEFORE they can get the feedback they’ve been given by those peers.

Because students cannot “complete” a review until they’ve given feedback to all their peers, it’s important to only put writers with submitted drafts into peer groups.

The Automatic grouping type and option to Exclude late writers from groups enforce this expectation.

We refer to writers who have not yet submitted a draft as “late” writers. Late writers are a particular challenge for coordinating feedback. Reviewers grouped with late writers will see “waiting” as the status of their review task because they cannot “complete” their review until their peer “completes” the writing task. See our tutorial for strategies on managing or accommodating late writers.
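The gating described above – a review is “waiting” while a peer’s draft is missing, and “complete” only once feedback has been given to every peer – can be sketched as follows. This is an illustrative sketch, not Eli’s actual implementation; all names here are hypothetical.

```python
def review_status(reviewer, group, feedback_given, drafts_submitted):
    """Illustrative sketch of review gating (not Eli's actual code).

    reviewer         -- the student whose status we are checking
    group            -- list of students in the peer review group
    feedback_given   -- dict mapping a reviewer to the set of peers
                        they have already given feedback to
    drafts_submitted -- set of students who have submitted a draft
    """
    peers = [p for p in group if p != reviewer]
    # A late writer in the group blocks the review entirely.
    if any(p not in drafts_submitted for p in peers):
        return "waiting"
    # Feedback must be given to all peers before the review is complete.
    if all(p in feedback_given.get(reviewer, set()) for p in peers):
        return "complete"
    return "in progress"
```

Under this sketch, a reviewer never reaches “complete” while grouped with a late writer, which is why Eli recommends grouping only writers with submitted drafts.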

Grouping Types

Eli has two grouping types: Automatic and Manual.

The Automatic grouping option shuffles students into peer review groups on a scheduled review Start Date.

Importantly, Automatic grouping only forms groups of on-time writers, students who complete a review’s associated writing tasks before its Start Date. The resulting peer review groups consist exclusively of students who are ready to begin the peer learning activity, so on-time writers can begin working without waiting for late group members to complete the writing task.
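The Automatic grouping behavior described above – shuffle only on-time writers into groups, excluding late writers – might look like this in outline. This is a hedged sketch under stated assumptions, not Eli’s actual algorithm; the function name, parameters, and group-balancing behavior are all illustrative.

```python
import random

def auto_group(writers, submitted, group_size=4, seed=None):
    """Illustrative sketch of Automatic grouping (not Eli's actual code).

    Only writers who submitted before the Start Date (the `submitted`
    set) are shuffled into groups; late writers are excluded.
    """
    on_time = [w for w in writers if w in submitted]
    rng = random.Random(seed)   # seed only to make the sketch repeatable
    rng.shuffle(on_time)
    return [on_time[i:i + group_size]
            for i in range(0, len(on_time), group_size)]
```

Note that in this naive sketch the final group may be smaller than `group_size`; Eli’s actual balancing rules for uneven class sizes may differ.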

Automatic grouping has two settings:

The Manual grouping option provides 3 ways to create groups:

The edit group options allow instructors to arrange random groups that exclude late writers.

Important things to keep in mind about groups:

Keep in mind that a student in a group must give feedback to all their peers to “complete” the assignment. Reviewers grouped with late writers see the “waiting” status, but they can see the feedback they’ve personally received.

Step 4: Add Response Types

The last step in creating a review is to give some structure to how you want students to give helpful feedback.


There are five different response types you can add to any review, and we cover examples of each response type in this tutorial. Here are the basics:

Important things to keep in mind about response types:

For more about the affordances of each response type, and some tips on designing your prompts, see our response type tutorial.

Previewing / Saving as Draft / Assigning / Editing

You can use the buttons at the bottom of a review to

2. Review Report – Class-Level Data

Eli’s feedback analytics are unique, giving instructors multiple quantitative and qualitative ways to view, sort, and analyze both drafts and feedback.

Once student submissions to the review begin, Eli makes available a series of reports about reviewer progress toward completion as well as writer and reviewer performance. This section outlines the review report summarizing performance for the entire class; the next section describes the report for individual students.

Each section of these reports is designed to provide instructors with:

Task Overview Display

The first display in the review report is an overview display that will help you coordinate the review, including tools that allow you to do the following:

You can navigate between the other sections of the report using the navigation tabs.

Responses to Writing

The “Responses to Writing” tab provides an overview of how the entire class is responding to the writing being reviewed.

This is an aggregate display, tracking not individuals but class-level performance. From here, you’ll be able to:

This report will continue to evolve as students complete their reviews. Projecting it during a review and using it as a status board – even calling out helpful reviewers or writers by name – can help reviewers feel good about their contributions and can model helpful comments.

Responses to Reviews

The “Responses to Reviews” report is a display of how writers have responded to the reviews they’ve received. This report is meant to help identify the most helpful reviewers as well as the most helpful individual comments given by reviewers.

As with the “Responses to Writing” report, this report will evolve as students complete their reviews. It is also made more useful if students rate their feedback, which can be encouraged by sharing this data with them – the more feedback is rated, the better the report.

Engagement Data

The Engagement Data section of the review report provides data about reviewer activity that can be helpful when coaching reviewers to give better feedback. Instructors may choose to use this data in their evaluations, but it is primarily meant to give insight into reviewer behavior and to help improve the helpfulness of their feedback.

Engagement data can be filtered by stories:

Included in this report:

* helpfulness calculations ignore comments that haven’t been rated – Eli does not assume the lack of a rating by the recipient implies anything about their perceived value of a comment

This report is truncated to fit into a standard browser window and is focused specifically on comments. The Engagement Data Download provides significantly more data about reviewer engagement.
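The starred note above – helpfulness calculations ignore comments that haven’t been rated – might be computed along these lines. This is a hypothetical sketch of that rule, not Eli’s actual formula; the function name and data shape are assumptions.

```python
def mean_helpfulness(ratings):
    """Average helpfulness over rated comments only (illustrative).

    `ratings` maps comment ids to a numeric rating, or None when the
    recipient has not rated the comment. Unrated comments are excluded
    from the average rather than counted as zero, so a missing rating
    implies nothing about a comment's perceived value.
    """
    rated = [r for r in ratings.values() if r is not None]
    return sum(rated) / len(rated) if rated else None
```

Returning `None` when no comments have been rated mirrors the same principle: no ratings means no helpfulness score, not a score of zero.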

3. Review Report – Student-Level Data

Where the class-level reports provide data about student performance in the aggregate, the individual student report provides a breakdown of a single student and their work in a review. Clicking on a student’s name (or their anonymized label) anywhere in the review report will open that student’s individual report.

The Student Summary Report shows the student’s completion status, including submission times, for each assigned review as well as the engagement data.

Each student report has links mirroring the class-level reports:

The data in these reports resembles the data found in the class-level reports, but here it is focused specifically on student performance. Both reports – whether listing feedback the student received or feedback the student gave – are displayed in a similar format. In both cases, you’ll see:

Unlike the class-level reports, however, these student reports offer detailed information about every comment received or given by that student. The report includes for each comment:

Endorsing Feedback

Endorsing an individual comment is a way for you, as the instructor, to send two messages at once:

Students only see when comments have been endorsed, not when they haven’t been, so use of this feature is at the instructor’s discretion. See our endorsement tutorial for detailed examples of using endorsements strategically.

What Students See in Their Reports

To get a sense of what students see after a review, see the Review Report: Feedback from Reviewers section of the Student User Guide.

4. Downloading Review Data

While most data generated by reviewers is accessible through this review report, teachers and teacher-researchers often prefer to work with raw data when conducting research. In some cases, the typical browser window is too small to allow a satisfying, usable display of all the data Eli makes available.

To that end, there are two download options for review data: the comment digest and the engagement data report.

Report Formatting – .csv files

Both data reports will download as CSV (comma-separated values) files. This type of file can be imported into almost all spreadsheet and database applications, making it possible to sort and query the data in any number of ways.
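As a minimal example of working with one of these downloads programmatically, a CSV file can be read with Python’s standard `csv` module. The column names below are hypothetical placeholders, not Eli’s actual report headers.

```python
import csv
import io

# Stand-in for a downloaded report file; real reports would be opened
# with open("engagement_data.csv", newline="") instead. Column names
# here are hypothetical, not Eli's actual headers.
sample = io.StringIO(
    "reviewer,comments_given,comments_rated\n"
    "Pat,12,9\n"
    "Sam,8,8\n"
)

# DictReader yields one dict per row, keyed by the header row.
rows = list(csv.DictReader(sample))
for row in rows:
    print(row["reviewer"], int(row["comments_given"]))
```

Because every spreadsheet and database tool accepts CSV, the same file can instead be opened directly in Excel, Google Sheets, or imported into a statistics package.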

You can learn more about CSV files here:

Report #1 – Engagement Data

This report is an extension of the data available in the Engagement Data tab of a review report. It includes one row per review participant, and each row contains the following data:

Things to note about how Eli assembles the Engagement Data report:

Sample: you can download sample engagement data to preview this feature.

Report #2 – Criteria and Scale Data

This download collects all of the quantitative feedback generated during a review. Reviewer responses to criteria sets, rating scales, and Likert scales are all included in this download. Specifically, it includes the following information:

* The Criteria and Scale Data download is only available for reviews that have these features added.

Sample: you can download a sample scale and criteria dataset to preview this feature.

Report #3 – Comment Digest

The comment digest is a report compiled as reviewers complete their reviews. It is a collection of all of the comments exchanged between reviewers and the quantitative data about those comments*. Instructors can download digests of individual reviewers or of all the reviewers in a course.

A comment digest includes one record for every comment exchanged between reviewers, and each record contains the following data:

* The Comment Digest is only available for reviews in which contextual comments were enabled.

Sample: you can download a sample comment digest to preview this feature.

Sign up for a Professional Development Workshop to learn how Eli works!