Eli Review User Guide

Course-Level Analytics

The Course Analytics Report is meant to provide a portrait of the aggregate activity of all students throughout an entire course. Activity from all the individual reviews completed to date is combined in this report. When a course is first created, the report contains no data because there’s no activity on which to report; as instructors lead students through rounds of the write-review-revise cycle, the report expands.

Course Analytics provides a course-level view of student performance, focusing on averages to reveal trends. Its companion, the Student Analytics Report, provides similar information but tracks an individual’s performance over the duration of the course for comparison against class averages.

Navigating Course Analytics

Every Eli course has a unique analytics report compiled from the data generated by students during the review process. The course-level analytics report can be accessed by clicking the “Analytics” link in the primary navigation bar under the title for any given course.

Every Eli course has an “Analytics” report accessible from the main menu.

Each report contains five sub-sections with unique displays of student engagement data. Each section can be accessed via the report navigation bar.

You can access any section of the Course Analytics report via the blue navigation bar.

The report sections include the following:

  1. Engagement Highlights – a quick overview of engagement data across the course
  2. Trend Graphs – charts breaking down individual metrics over time
  3. Average Student Profile – the average student performance in each engagement metric
  4. Engagement by Task Type – levels of engagement for each type of task
  5. Data Downloads – options for downloading student performance data

When using any of the reports, keep in mind the factors described in the “Notes about Report Data” section below.

Engagement Highlights

The first section of the Course Analytics report is the Engagement Highlights display. It is designed to provide a quick overview of some of the most important engagement data collected about students, presented in two summary tables.

The "Engagement Highlights" report provides a quick view of some of the most powerful review data.

The data in the Feedback and Helpfulness table can be sorted by clicking any of the column headers, and it can also be downloaded as a CSV file for more advanced searching and sorting. The Engagement by Task Type report provides similar review data in significantly expanded form.

Trend Graphs

These reports consider individual engagement data metrics and graph their performance over time, revealing trends in student performance. The three engagement metrics currently available are the ratio of comments given to comments received, comment volume, and average helpfulness ratings.

Each graph is accompanied by a table containing all the raw data that went into plotting it, as well as a toolbar offering several ways to download those materials.

The download options available for each trend graph.

Each point on a graph and each column in the data tables corresponds to a single review completed by students in the course. If an instructor has assigned seven reviews, there will be seven points on each graph and seven columns in each table. Hovering the cursor over any data point or column header will reveal the name of that review and its relevant measure; clicking any data point on a chart will reveal the individual report for that review.

Hovering the cursor over any point on a trend graph or table column header will reveal relevant information.

Switching between different graphs is quick and easy: above each graph is a dropdown labeled “Switch Trend Graphs”; click it for easy access to each of the three reports.

Easily switch between the three available trend graphs.

Ratio of Comments Given to Comments Received

This graph is meant to help instructors see, in general, how much feedback each reviewer has given in relation to how much they’ve received. The given/received ratio is calculated from the number of contextual comments each student exchanged during reviews; for the class average, the individual ratios are aggregated into a single score.
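The calculation described above can be sketched as follows. This is an illustrative sketch only, not Eli Review’s actual implementation; the field names (“given”, “received”) are assumptions for the example.

```python
# Hypothetical sketch of the given/received ratio. Field names and the
# aggregation method are assumptions, not Eli Review's actual code.

def given_received_ratio(comments_given, comments_received):
    """Ratio of contextual comments a student gave to those they received."""
    if comments_received == 0:
        return None  # no feedback received, so the ratio is undefined
    return comments_given / comments_received

def class_average_ratio(students):
    """Aggregate individual ratios into a single class-level score."""
    ratios = [given_received_ratio(s["given"], s["received"]) for s in students]
    ratios = [r for r in ratios if r is not None]
    return sum(ratios) / len(ratios) if ratios else None
```

Under this sketch, a student who gave 12 comments and received 10 would have a ratio of 1.2, indicating they gave slightly more feedback than they received.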

The trend graph for ratio of comments given to comments received by each student.

The given/received ratio indicates how evenly a student participates as both a giver and a receiver of feedback.

Ideally, this graph would show a flat line at 1 – the closer the ratio is to 1, the more evenly distributed feedback is between writers and reviewers.

While this data is not scientific, it can give helpful insight into student behavior and suggest when a student might benefit from some direct instruction.

Comment Volume

This graph is meant to help an instructor see, in general, how much feedback is being exchanged during reviews.

The comment volume trend graph.

Seeing comment volumes plotted in this chart can reveal patterns in how much feedback students exchange from review to review.

The table below the volume chart lists the exact number of comments given by every reviewer who participated in a review as well as the class average for comparison.

Average Helpfulness Ratings

Writers are given the option to rate each comment they receive on a scale of 1-5 stars. While it’s up to the individual instructor to coach students on when to rate their feedback and how to assess it (what a one-star comment looks like compared to a five-star comment), those ratings are compiled in this trend graph.
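The compilation described above amounts to averaging each review’s star ratings. The sketch below is an assumption about how such averages might be computed, with made-up review names and ratings for illustration:

```python
# Hypothetical sketch of compiling 1-5 star ratings into per-review
# averages. The data layout and review names are invented for illustration.

def average_helpfulness(ratings):
    """Mean of the 1-5 star ratings received in a single review."""
    return sum(ratings) / len(ratings) if ratings else None

# Example ratings for two reviews (invented data):
reviews = {
    "Review 1": [5, 5, 4, 5],  # early reviews tend to be rated generously
    "Review 4": [3, 4, 2, 4],  # later ratings are often applied more sparingly
}
trend = {name: average_helpfulness(stars) for name, stars in reviews.items()}
```

Plotting one such average per review produces the trend line shown in the graph.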

The trend graph plotting changes in comment helpfulness over time.

This chart can help answer questions about how students rate the feedback they receive over time.

Common trends: ratings graphs often slope downward over time. In early reviews, writers are eager to rate all of the feedback they receive very highly out of a desire to be friendly and collegial. Over time, as they get better at differentiating helpful from unhelpful feedback, ratings will often decline as students apply high ratings more sparingly.

The table below the helpfulness chart lists the helpfulness averages for individual students on the comments they gave as part of a review.

Average Student Profile

This report is intended to help instructors understand what the performance of an average student looks like in their Eli reviews. When assessing overall performance, knowing what the average student did in a course as well as what kind of feedback they both gave and received can help illustrate where more coaching may be helpful.

The Average Student Profile report.

Additionally, the data in these tables is intended to help instructors answer larger questions like “how much help did each student receive?” when compiling reports on student performance and engagement.

Engagement by Task Type

This report provides instructors with statistics about student performance on the different types of tasks they have assigned to their students.

The Engagement by Task Type report.

The table will help an instructor understand, at a class level, how students engaged with each type of task, and can inform the design of new tasks for this class or for future classes.


Data Downloads

These features give instructors the ability to download all of their raw student data. While most of this data is accessible through a browser, these tools allow instructors to export the underlying data and perform their own queries and sorts. This is particularly helpful for teacher researchers investigating their own practice or for writing researchers studying student review behaviors.
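As one example of the kind of custom query the downloads make possible, the sketch below sorts a downloaded CSV with Python’s standard library. The filename and column names here are assumptions; match them to the headers in your actual download.

```python
# Hypothetical sketch of querying a downloaded engagement CSV.
# Column names ("student", "comments_given") are assumptions; check the
# headers of your actual Eli Review download and adjust accordingly.
import csv

def top_reviewers(path, n=5):
    """Return the n students who gave the most comments, per the CSV."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    rows.sort(key=lambda r: int(r["comments_given"]), reverse=True)
    return [(r["student"], int(r["comments_given"])) for r in rows[:n]]
```

The same pattern extends to any sort or filter a spreadsheet could perform, but in a scriptable, repeatable form.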

Data download options for class-level engagement and comment data.

For a complete breakdown of the data available in these downloads, and about formats and methods for using that data, see the Data Download FAQ. You can also view samples of the data available via download.

Notes about Report Data

There are a few factors to keep in mind about what data is included in these reports.

