The Eli Review Blog

New tutorials, research, teacher stories, and app updates - Subscribe via our newsletter!

3 Feedback Design Lessons

That I learned the hard way so you don’t have to

Partnering with instructors to design helpful feedback tasks aligned to the criteria in their writing tasks is one of the best parts of my role as the director of professional development at Eli Review. This summer, I had the chance to design peer feedback and revision tasks for three very different courses:

  • a large lecture introduction to business course where students compose a nine-chapter business plan,
  • a second-year legal writing course on appeals, and
  • a first-year writing course where students conduct an inquiry research project in writing studies.

As a designer, I try to make sure that peer review tasks extend the instruction in discipline-based thinking and writing. One of my litmus test questions is:

Will a writer who submitted a weak draft be able to use these questions to give helpful feedback to others and then turn back to their own draft with greater insight?

This question helps me focus on identifying the most important criteria. It also reminds me that students have to write to each other about these criteria in their own words and from their own standpoint. They need guidance in what to look for as well as in what to say.

As I puzzled my way through these three different discipline-based courses, I realized how important it is to

  1. listen to students,
  2. repeat the big idea, and
  3. focus on what’s next instead of what’s missing or wrong.

Listen to students.

Helpful reviewers include a friendly handshake and a wave.

When designing feedback tasks, there’s simply no substitute for having access to peer-to-peer conversation. For two of the three course designs, I was able to refer to the comment digest and downloads from previous Eli Review courses. I read through the feedback to get a sense of which criteria students talked about, which they didn’t mention, and what they threw in for good measure. Most importantly, I paid attention to how they expressed their feedback. 

Listening to students changed my approach to writing instructions for commenting. My typical instructions refer to Bill Hart-Davidson’s describe-evaluate-suggest pattern for helpful comments. 

These instructions guide reviewers in restating the writer’s main point, identifying how well the draft meets expectations, and offering next steps for building on strengths or improving weaknesses.

The comment instructions drafted by the course’s original teaching team focused on these principles of effective feedback, with cues about length:

The writer is going to keep improving this idea in the next stage writing assignment, and eventually this draft will contribute to a graded chapter writing project. If the writer was going to invest 20 more minutes in improving this draft now, what should they focus on and why?

Your goal is to write at least one paragraph (100 words) of feedback that would help in planning out these 20 minutes. In 1-2 sentences, restate the most important idea in this draft. Then, in 2-3 sentences, explain how clearly you think that idea was developed within the writing and how you think that idea will play out in reality (for instance, discuss if you think these ideas are feasible, etc.). Finally, in 2-3 sentences, suggest ways the writer can proactively address issues related to clarity in the writing and infeasibility of the idea. What additional terms/topics from the readings/lectures might they include to substantiate their thoughts and ideas? There’s no need for writers to correct their grammar if they have nothing of substance to say, so focus your feedback on improving the clarity of ideas.

(Original instructions)

After reading what reviewers actually wrote in response to these instructions, I realized that in addition to those three moves, the best feedback also tried to forge a personal connection with the writer and included a closing remark that looked forward to the next draft. Like a handshake and a wave, these two moves set the stage for some reviewers to offer specific and thorough feedback.

Many reviewers who didn’t give substantive feedback stumbled through even these polite moves. They seemed to be struggling with their own authority to engage with other students’ drafts at all. Unsure they had anything to offer, they resorted to praise, repeating missed criteria, or mentioning minor corrections.

I revised the instructions to be both more personal and more concrete:

The writer is going to keep improving this idea in the next stage writing assignment, and eventually this draft will contribute to a graded chapter writing project.

  • If the writer were going to invest 20 more minutes in improving this draft now, what should they focus on and why?
  • What additional terms/topics from the readings/lectures might they include to substantiate their thoughts and ideas?

Your goal is to write at least one paragraph (100 words) of feedback that would help in planning out these 20 minutes. There’s no need for writers to correct their grammar if they have nothing of substance to say, so focus your feedback on improving the application of course concepts and the clarity of ideas.

Guidelines for 100+ Words of Feedback

The guidelines below include the purpose of each part of your feedback paragraph with sample sentence starters used by previous students in this stage. You are NOT expected to use all the sentence starters; include one or more per section or come up with your own.

Connect with the Writer

In 1-2 sentences, make a personal connection with the writer. This is good practice for responding to cold emails and other workplace communication.

  • Based on my personal experiences with __, I wanted to mention that ….
  • As a ___ major, I’m really drawn to this idea because ….
  • You are prioritizing ___, which is often overlooked and undervalued ….

Describe the Idea Back to the Writer

In 1-2 sentences, restate the most important idea in this draft. This lets the writer know you’ve understood their point, which makes your feedback more valuable.

  • It sounds like a major part of your product is to ….
  • I could see your product in a store like ….
  • I need a description of the product so that I can picture it in my head: would I see ….?

Evaluate How Well the Chapter Meets Expectations

In 2-3 sentences, explain how you think that idea will play out in reality.

  • I love ___;  however, I think in practice …
  • I think you are not addressing competitors/demographics such as ….
  • You name the competitors, but I don’t know enough about them to tell how your product is different. I think you should ….
  • This plan seems too similar to what the competitors are doing; you might differentiate your company by ….
  • If you are competing with ___, I don’t see how you can ….
  • I suggest going back to the SWOT PowerPoint because you’ve mixed up internal/external factors; it should be ….
  • You mentioned, “__.” Why do you think that is?
  • It seems like you contradict yourself by saying ___ and then ___.

Suggest Ways to Build on Strengths or Improve Weaknesses

In 2-3 sentences, suggest ways the writer can proactively address issues related to the application of course concepts, infeasibility of the idea, and clarity in the writing. 

  • This seems like overkill for a very niche problem because …. What about….?
  • You will need to ____. Perhaps you could sacrifice ___ in order to ____. This approach would ….
  • The biggest roadblock you’d be facing is ….
  • The example about ___ does not seem to fit because ….
  • I think you need an example of research to back up the idea that ….
  • Try answering, “….?”
  • What will motivate ___ to ___?
  • What if ____ happened? How would ___?
  • When you say, “…”, it may come off as ___ to people who ___. How could you say that idea so that it …?
  • Make sure you split this up so that …. 
  • I wasn’t entirely engaged in reading until …. 
  • To make this flow better, try to ….

Closing Remarks

In 1-2 sentences, wrap up your feedback.

  • I admired how you ….
  • I think you’ve picked a very creative approach because ….
  • I think the success of this business plan relies a lot on ….
(Revised instructions)

These elaborate instructions include sentence templates, drawn from previous students’ feedback, for each move. They embed a model for what to say and how to say it in the review task itself. The examples also include as many praise statements as critique statements. These five moves and sample sentences reflect peer-to-peer conversation about criteria rather than forcing students to imitate instructor-to-student feedback.

Altogether, these changes make feedback friendlier.

Repeat, repeat, repeat.

Keep asking the most important question.

Our team at Eli Review harps on practice: “Your Students Aren’t Revising Enough,” “No Pain, No Gain,” “Turn Rehearsal into Practice,” “Who leveled up?” We also talk about predictable routines: “Tips from Eli Review Instructors” and “Time Management in Peer Learning.” I thought I had a good grasp on the ways repetition supports learners.

Working on the legal writing course pushed me to think about repetition differently. It was a difficult course to build because I’m not a lawyer. I understood most of the words in the textbook, and the examples made sense when I was staring at the page. But, if I stepped away and tried to come back, all the vocabulary and criteria made me feel dizzy. Immersed, I felt fairly confident. Distracted, everything was gobbledygook. I realized that students would have the same experience of the peer review tasks. As reviewers of messy drafts, they’d feel adrift amid the sea of criteria.

How do you anchor a host of nitty-gritty details? My solution was to repeat a big picture idea. In the last section of 8 out of 10 peer feedback tasks in the legal writing course, reviewers will answer this question:

What more could the writer do to put the judge in the frame of mind to rule for the client?

It’s a big idea straight from the textbook Winning on Appeal, 3rd edition (Dysart, Southwick, Aldisert). The question reminds reviewers of the audience, not merely conventions, and encourages students to pull all the feedback together toward one persuasive purpose.

The repetition of this question also develops an important habit. It will be a great outcome if students leave the course and reflexively ask, “What more could the writer do to put the judge in the frame of mind to rule for the client?” 

Generate, not critique.

Encourage reviewers to imagine what’s next in the writing process.

Nearly twenty years ago (eek!), I had the opportunity to update the peer review section in Andrea Lunsford’s The St. Martin’s Handbook, 5th edition, and one of the key parts was presenting peer review as “thinking alongside the writer.” That language was not new then, but it has animated all my work on feedback since.

I know good feedback is more like taking a turn kneading dough, guided by a few Great British Bake Off window-pane tests, than like wine-tasting. With dough, a little more starch, a little more water, a few more kneads, a little more rest, or a temperature change might help. With wine, the vintage gets graded because there’s no changing the rain on the grapes. Good reviewers contribute; they aren’t primarily flaw-finders.

Baker holds bread dough to the light, stretching for window-pane test.

“Bagels – Windowpane” by grongar is licensed under CC BY 2.0

The course design for an inquiry project in writing studies forced me to back away from critique even more as a goal for peer feedback. My typical strategy when figuring out what to cover in a review boils down to three components: must have, must not have, and what varies to determine quality. But, that strategy didn’t work well for this class.

Students choose their own genre for the final project, which asks them to report a debate/conversation among writing teachers in published scholarship. Though the requirements for cited sources are specific, every other aspect of the project varies with the writer’s choices. In fact, the research unfolds across several weeks, so some writers might have three citations and others five when the peer review happens. Since so much depends on the writer’s choices and progress to date, the peer review instructions can’t highlight common problems or offer shared criteria. The instructions have to help reviewers think alongside the writer about something that is quite abstract and not yet even half-baked.

I needed a window-pane test that would work with pizza dough, sourdough, focaccia, you name it. And, reviewers needed to be able to trust their own judgment. I developed this rating scale and final comment combination:

How clear is the writer’s report of the conversation to you at this point in the writing process?

10 stars – Crystal clear. I understand the stakes, personalities, conflicts, alliances, conversation turns, and memorable lines.

7 stars – Fuzzy. I can tell most of what’s going on.

4 stars – Blurry. I know one or two things for certain.

1 star – Unclear. I don’t follow much of this conversation at all.

What factors lead to your star rating on the clarity of the conversation?

Write 3-5 sentences of feedback that help the writer understand your experience as a reader of this collection of paragraphs about sources and their connections. What advice do you have as the writer transforms this project building draft into the final product?

——Sample Sentences——

  • I rated this draft’s clarity about the conversation as ____ because ___.
  • As a reader, I thought the most striking source/frame/concept was ___ because ___. The concrete detail that most helped me was ___. I could really tell that a key turn in the conversation comes in ___.
  • The paragraph about ___ seemed repetitive/general/choppy/unfocused. I felt like I needed to know more about ____ and less about ___.
  • It is unclear to me how ___ connects to ___. I think you could say more about ___ in order to ___.
  • From reading this draft, I learned that ___. That’s really interesting because ___.
  • As you transform this draft into the final product, I hope you dig more into ___. 

Asking reviewers to rate the clarity of the draft is pretty risky. Often, reviewers think “clear” means: “I knew every word and didn’t stumble much.” Instead, “clear” in this context means: “I get the big picture of this scholarly debate from the details presented here, and the order of details is interesting and easy to follow.” Reviewers have to read the draft’s potential as much as they read its contents. This is hard. Then, reviewers need to offer feedback that helps writers transform notes about sources into their target genre for the final product. Giving this kind of generative feedback that imagines where the draft can go is very different from showing a writer how to earn more points by meeting expectations, finding the errant topic sentence, or pointing out comma errors.

Will this clarity rating and explanation help reviewers think alongside writers as they turn their notes into a polished genre of their choice? I don’t know, but we’ll be able to listen to students and adjust the instructions next time.

Course Design Services

Contact our team of Eli Review Professional Development Coaches if you’d like us to design your course.

This fall, we’re offering a $500 honorarium to instructors of large lecture classes (200+ students) who are willing to assign 5 reviews (we’ll help design them!) and participate in a couple of meetings. Learn more about becoming a Feedback Partner.

3 Feedback Design Lessons was published to the Eli Review Blog in the categories Example Peer Feedback Tasks, Pedagogy, Professional Development.

Sign up for a Professional Development Workshop to learn how Eli works!