How Can I Get the Most Out of Peer Review?


Peer-review workshops serve many useful functions for student writers, most notably:

  • They provide writers with real readers who must make sense of the writing.
  • They help writers improve their reading/critical analysis skills.
  • Most obviously, they help writers improve their writing skills and final products.

In this section, we discuss eight principles that can help you get the most out of peer review.

Specify Tasks for the Peer Review

Even if you decide to let students do an "open" review (in which they imagine themselves as members of the target audience and give "reader response" reactions), make that task clear as you set up the peer-review session.

If you want to have students review particular features of a document, make sure that those tasks are clear and precise. Typically, deriving points for review from the assessment criteria for the task will help students see connections between the task and its evaluation.

Although you can list tasks on the board, students often prefer a worksheet that notes specific tasks. If students can write their commentary on a word processor, they are likely to write more extensive comments, so take advantage of computer support whenever possible. More and more Web 2.0 tools are being incorporated into peer review as well, so consider possibilities beyond the word processor.

Consider Sequencing the Peer-Review Tasks in Multiple Workshops

If you want students to look for particular features of a document, keep in mind that students often feel most comfortable moving through a sequence from simply identifying a feature, to evaluating it, to suggesting revisions. Particularly if you give students multiple peer-review opportunities, design each workshop sheet to build on the prior one with this sequence in mind. And as you design these worksheets, label each level of task clearly so that students know whether they are to identify features, evaluate them, or suggest revisions as part of a given peer-review session.

Model How to Use the Workshop Sheet or Criteria List Before Peer Review

Although most students will have had experience with peer review in writing classes in high school and freshman composition, they can still benefit from understanding each teacher's expectations for the peer-review session. One of the most effective techniques is to provide a sample student draft (either as a handout or projected for the class) and to elicit class comments on each point on your workshop sheet. Teachers can then elaborate on points students bring up or clarify which writing skills the points on the workshop sheet are designed to help students review.

Model Effective Commenting

The least helpful comment to receive from a peer reviewer is "It looks OK to me." We want students to find strengths or positive features in a draft, but we need to encourage them to be as specific as possible about both strengths and weaknesses.

Other points that you should remind students of as you model how to give effective commentary in peer review:

  • Always point out strengths as well as elements that need more work.
  • Try to attend to larger issues first (audience, purpose, organization, detail, etc.). Talk about sentences, word choices, and punctuation only late in the peer-review process.
  • Be specific. Point to particular places in the draft where revision will be helpful.
  • Don't hesitate to respond as a reader, especially early in the review process, for example,
    • I got confused here.
    • I saw your point clearly here.
    • I was convinced by your example or analogy or argument.
  • If you disagree with the comments of another peer reviewer, say so. Not all readers react in the same way, and divergent points of view can help writers see options for revising.
  • Make comments in a spirit of helpfulness, and take comments in the same spirit.

Model How to Handle Divergent Advice

Remind students that they are responsible for the final drafts they submit to you but that they should carefully weigh each comment they receive from a peer reviewer. Comments that suggest radically different revisions of the same part of a draft generally help writers see various ways to revise but may confuse students about what to do. Students need not choose one of the suggested revisions, but they should note that multiple suggestions pointed at the same part of a draft typically highlight a place where some revision is necessary for readers.

Think About Logistics

The logistics of peer review are generally simple, but they do require some forethought. If you want students to read drafts in a round-robin exercise or to exchange drafts with one other student, you don't need to require any photocopying. But if you want each student to read three other drafts, make sure you remind students to bring three copies of their drafts to class on the day of the exchange.

You can let students pick their own peer-review partners or group members, but you might also consider assigning peer reviewers based on your knowledge of students' writing and editing skills.

If you hold in-class peer-review sessions, circulate during the session to make sure students are on track and to intervene as necessary. Also, save a few minutes at the end of the session to discuss common problems with the class as a whole.

Provide Adequate Time for Students to Complete Thorough Peer Review

The longer the draft or the more complex the criteria, the longer students will take to complete a thorough peer review. If you assign shorter documents, you can easily devote a part of a class to peer review or ask students to complete the peer review outside of class. But if you assign long, complex documents, consider breaking the peer review into several short chunks. For instance, students might complete one peer reading looking just for problems with focus, another for weaknesses in organization and development, and still another for issues with graphics. Finally, students might have one or two additional peer-review sessions devoted exclusively to mechanics.

If you're concerned about taking so much class time for multiple peer reviews, consider the alternatives outlined under "Do writing and peer review take up too much class time?"

Build in Incentives for Helpful Comments

If students don't see the value of peer review, they are unlikely to spend much time reviewing others' drafts or to take peer advice seriously. The most effective way to encourage students to take peer review seriously, both as the reviewer and as the writer, is to include effective peer review as part of the overall grade for the assignment. Skimming peer review comments will take just a few minutes (even for multiple reviews of complex documents), and you'll quickly see which students provided the most helpful commentary. Alternatively, you can ask students to rank their peer reviewers and base the peer review part of the grade on peer ratings.

If you're uncomfortable weighing the quality of peer reviewing in the assignment grade, consider dividing the course grade to include a separate class participation or peer-reviewing grade.

Sample Workshop Sheet: Worksheet for Public Audience Draft


Writer_____________________ Reader ________________________

Ask the writer what questions or concerns he or she has about the text. Ask about the target audience, and then read the draft carefully. Respond to the writer's concerns before you complete the rest of this worksheet.

***********************************************************************

  1. Note here the audience for the paper you're reviewing. Be as specific as possible.
  2. Where does the writer need more detail? (Also, take a moment to note questions that would help the writer see where and what detail to add.)
  3. Do sentences within paragraphs and paragraphs within sections track logically? Where do you stumble over gaps in logic? Which headings might be clearer for readers? Where should the writer add headings? Suggest specific revisions for greater coherence.
  4. Has the writer found an appropriate style and tone for this text? Where might the writer revise to make style and tone more effective?
  5. Has the writer cited appropriate and unbiased sources of information? Are quotations integrated into the text? Are the citations clear? Do you see any places where the writer needs to cite a source but now doesn't? Point those out to the writer.
  6. Consider format constraints. What will the writer need to attend to soon in terms of visual features for this text?
  7. Take any one paragraph and revise it for clarity and conciseness. Suggest other paragraphs that could benefit from the same revisions.
  8. The strongest part of this text now is
  9. The part of the text that still needs work is

Beyond the Basics

Writing teachers have incorporated peer review as a regular part of assignment design and sequencing since the 1970s. In a 30-year retrospective of an early article on peer review, Flynn (2011) reviews more recent scholarship and research on peer review. As she notes, early work on peer review was often promotional rather than critical, and this trend continues, she claims, as researchers investigate peer review in ESL and in computer-mediated contexts for composition classes. Flynn's point notwithstanding, well-structured peer review is one of several collaborative and cooperative approaches to learning that can improve students' written work, as Henry & Ledbetter (2011) explain in detail. Particularly helpful elements of their text are its emphasis on explicit modeling of "the collaborative task of problem solving" a particular writing challenge (8) and its appendix, which lays out the specific steps for modeling successful peer review.

As we look beyond composition classes, recent empirical studies of peer review support even the earliest anecdotal scholarship: students will not immediately provide substantive and useful feedback without instructional support. With guidance and training, however, students can improve the quality of their feedback (see, for instance, Ernest et al., 2011), and many of the empirical studies of peer review note that students appreciate the opportunity to receive response to their writing in advance of grades (see, for example, Poronnik & Moni, 2006).

Work by Patchen, Charney, & Schunn (2009) underscores both the value of peer review and the necessity of training students and instructors to give particular kinds of feedback. In this study, students' peer-review comments were compared with commentary from a writing instructor and from a content specialist. Although students' comments resembled instructors' comments, students praised peers twice as often as instructors did (141); the content instructor was most likely to focus on problems, followed by the peer reviewers and then the writing instructor; and both instructors were more likely than students to focus on problems in content. The writing instructor was most likely to suggest solutions, followed by the students and then the content instructor. As Patchen et al. point out, then, students can be trained to look more closely at content and to avoid mitigating the problems they see in a text. Content instructors can be reminded to praise more often and to offer more solutions to the problems they note in texts. (See also Cho et al., 2006, for additional comparison of peer and instructor feedback.)

A cross-course study of peer review by Van den Berg et al. (2006) exemplifies typical problems that teachers might have with peer review if they do not attend carefully to logistics. More important, though, are their summary findings on the quality of peer assessment (PA):

  1. PA at the start (paper-outline) as well as at the end (draft version) of the writing process generates more feedback on the text's structure and writing process.
  2. When students present the oral part of PA in small feedback groups they engage more in analysing and revising, than when they have to present the outcome of their assessment publicly.
  3. A combination of written and oral feedback is more profitable than written or oral feedback only. In their oral feedback, students interact to clarify the text and suggest measures for revision. In their written feedback, students focus more on structure, whereas in oral feedback they focus more on style. (146)

Other recent empirical studies in biology (Gunersel et al., 2008), biochemistry (Hartberg et al., 2008), bioengineering (Volz & Saterbak, 2009), and elementary education (Trautmann, 2009) support the value of peer review, particularly when it is computer-mediated. The first three articles refer to Calibrated Peer Review (for more detail, see Russell et al., 2004, or visit http://cpr.molsci.ucla.edu/Home.aspx), which Gunersel and colleagues describe: "CPR assignments engage students in writing and in reviewing their peers' work, and include a calibration phase during which students practice reviewing according to an instructor-designed rubric" (25). In the first two studies, CPR significantly improved some students' writing proficiency. Volz & Saterbak present a more nuanced analysis drawn from a smaller classroom study, which showed that students are more able to evaluate peers' work than their own and that the CPR framework constrains teachers and students in their assessments. However, these researchers further note that "CPR has been an asset to our course—both in that it has provided a novel peer review experience for the students and in that it has offered us a unique opportunity to understand more precisely where our students are struggling in their ability to enact disciplinary arguments." Reynolds & Moscovitz (2008), however, note significant limitations of CPR that teachers should consider before and as they first work with the tool.

Trautmann reports on a study of 77 elementary education majors using an online peer-review system in a course on learning and teaching science. Key findings include more revision under the peer-review conditions, with revisions "that had been influenced more by reviews received than by ideas they had developed while reviewing other students' reports" (697). Moreover, 74 of the 77 students completed a post-review survey: "Eighty-one percent agreed or strongly agreed with the statement, 'I changed my mind about something in my report because of comments I received through peer review,' and 86% agreed or strongly agreed that peer reviewing other students' work had helped them to improve their own scientific writing" (697).

Not all empirical studies are equally encouraging about using peer review in disciplinary courses. Covill (2010), for example, notes that "using peer review in college psychology classes, and a formal approach to reviewing, may not improve the quality of students [sic] writing but does affect the timing of students' revision behavior (before, rather than after, handing in the first draft) and students' attitudes toward instruction" (199). Some of the most recent empirical work on peer review, however, collected in a special issue of the Journal of Writing Research, continues to build a more nuanced view of how and when peer review can be most useful for different populations of writers and varied disciplinary settings. The introduction to the special issue (Goldin et al., 2012) sets the context for the specific articles by reviewing major findings in empirical studies of peer review over the last dozen years.

Perhaps more important, recent work on peer review (as one of a number of collaborative or peer-assisted learning techniques) draws on broad theoretical perspectives, including situated cognition and constructivist learning theory. Such theory draws on Vygotsky and Piaget and frames peer review as both collaborative and scaffolded learning. (See Trautmann, pp. 687-87, for a crisp explanation of these theoretical frameworks.) Longfellow et al. (2008), for example, describe a project and study that "supports Vygotsky's notion of how expert scaffolding from peers or near-peers enables new knowledge to become meaningful" (102) and "that successful 'expert' students may be better equipped and better placed than lecturers to pass on these particular skills to novice students" (103) in a peer-facilitated setting.

Moving beyond undergraduate writing contexts, Maurer (1999) extends the case for peer review to graduate studies, notably to introductory MA-level courses in sociology. Maurer argues that because graduate students are now less likely to find jobs in academe, they need to be prepared broadly, both to know disciplinary theories and content and to collaborate successfully in professional settings. Peer review enhances beginning graduate students' critical thinking skills as readers and writers and thereby improves their arguments. Beyond these advantages, peer review also fosters a collaborative awareness of peer readers and their needs.

Like other authors we mention here, Maurer also cautions that even graduate students will not immediately practice effective peer review. Graduate students, like undergraduates, need clear criteria to guide their reading and response to peers' drafts, "including the cogency of the thesis, the use of supporting evidence, clarity of communication, and other criteria germane to the seminar" (178). Moreover, graduate students also need help recognizing their strengths as responders so that they can overcome their reluctance to critique peers and improve their critical thinking, reading, and writing.

Finally, descriptive accounts of innovations in peer review continue to appear in disciplines as diverse as physiology (Pelaez, 2002), statistics (Cline, 2008), civil engineering, with particular emphasis on large classes (Smith & Kampf, 2004), and business (Holst-Larkin, 2008). Reviewing pedagogy journals in a particular area will supplement the more general (and generalizable) advice offered here.

For additional examples of peer-review questions, see Trautmann (2009) and Goldin (2011). For examples of grading rubrics that could be revised into peer-review prompts, see Leydens & Santi (2006), Poronnik & Moni (2006), Timmerman et al. (2011), Goldin & Ashley (2012), and Hartberg et al. (2008), listed below. For a detailed account of the different functions of peer-review rubrics, see Goldin & Ashley (2012).

References

Cho, K., Schunn, C.D., & Charney, D. (2006). Commenting on writing: Typology and perceived helpfulness of comments from novice peer reviewers and subject matter experts. Written Communication, 23(3), 260-294.

Cline, K.S. (2008). A writing-intensive statistics course. PRIMUS, 18(5), 399-410.

Covill, A.E. (2010). Comparing peer review and self-review as ways to improve college students' writing. Journal of Literacy Research, 42(2), 199-226.

Ernest, A., Johnson, P., & Kelly-Riley, D. (2011). Assessing rhetorically: Evidence of student progress in small-group writing tutorials. Learning Assistance Review, 16(2), 23-40.

Flynn, E. (2011). Re-viewing peer review. The Writing Instructor, 8 (December 2011). Retrieved October 31, 2012, from http://www.writinginstructor.com/30review

Goldin, I.M. (2011). A focus on content: The use of rubrics in peer review to guide students and instructors. Retrieved from http://etd.library.pitt.edu/ETD/available/etd-07142011-004329/

Goldin, I.M., & Ashley, K.D. (2012). Eliciting formative assessment in peer review. Journal of Writing Research, 4(2), 203-237.

Goldin, I.M., Ashley, K.D., & Schunn, C.D. (2012). Redesigning educational peer review interactions using computer tools: An introduction. Journal of Writing Research, 4(2), 111-119.

Gunersel, A.B., Simpson, N.J., Aufderheide, K.J., & Wang, L. (2008). Effectiveness of Calibrated Peer Review™ for improving writing and critical thinking skills in biology undergraduate students. Journal of the Scholarship of Teaching and Learning, 8(2), 25-37.

Hartberg, Y., Gunersel, A.B., Simpson, N.J., & Balester, V. (2008). Development of student writing in biochemistry using calibrated peer review. Journal of the Scholarship of Teaching and Learning, 8(1), 29-44.

Henry, J., & Ledbetter, L. (2011). Teaching intellectual teamwork in WAC courses through peer review. Currents in Teaching and Learning, 3(2), 4-21.

Holst-Larkin, J. (2008). Actively learning about readers: Audience modelling in business writing. Business Communication Quarterly, 71(1), 75-80.

Leydens, J., & Santi, P. (2006). Optimizing faculty use of writing as a learning tool in geoscience education. Journal of Geoscience Education, 54(4), 491-502.

Longfellow, E., May, S., Burke, L., & Marks-Maran, D. (2008). "They had a way of helping that actually helped": A case study of a peer-assisted learning scheme. Teaching in Higher Education, 13(1), 93-105.

Maurer, S.B. (1999). The role of the area seminar in graduate education. Response to "Rethinking the Graduate Seminar." Teaching Sociology, 27(2), 174-179.

Patchen, M.M., Charney, D., & Schunn, C.D. (2009). A validation study of students' end comments: Comparing comments by students, a writing instructor, and a content instructor. Journal of Writing Research, 1(2), 124-152.

Pelaez, N. J. (2002). Problem-based writing with peer review improves academic performance in physiology. Advances in Physiology Education, 26(3), 174-184.

Poronnik, P., & Moni, R.W. (2006). The opinion editorial: Teaching physiology outside the box. Advances in Physiology Education, 30(2), 73-82.

Reynolds, J., & Moscovitz, C. (2008). Calibrated Peer Review assignments in science courses: Are they designed to promote critical thinking and writing skills? Journal of College Science Teaching, 38(2), 60-66.

Russell, A. A., Cunningham, S., & George, Y. S. (2004). Calibrated Peer Review: A writing and critical thinking instructional tool. In Invention and Impact: Building Excellence in Undergraduate Science, Technology, Engineering and Mathematics (STEM) Education. American Association for the Advancement of Science.

Smith, K., & Kampf, C. (2004). Developing writing assignments and feedback strategies for maximum effectiveness in large classroom environments. Proceedings of the 2004 IEEE International Professional Communication Conference (pp. 77-82), Minneapolis, MN, September 2004. New York: IEEE Press.

Timmerman, B.E.C., Strickland, D.C., Johnson, R.L., & Payne, J.R. (2011). Development of a 'universal' rubric for assessing undergraduates' scientific reasoning skills using scientific writing. Assessment & Evaluation in Higher Education, 36(5), 509-547.

Trautmann, N.M. (2009). Interactive learning through web-mediated peer review of student science reports. Educational Technology Research and Development, 57(5), 685-704.

Van den Berg, I., Admiraal, W., & Pilot, A. (2006). Designing student peer assessment in higher education: Analysis of written and oral peer feedback. Teaching in Higher Education, 11(2), 135-147.

Volz, T., & Saterbak, A. (2009, January 19). Students' strengths and weaknesses in evaluating technical arguments as revealed through implementing Calibrated Peer Review™ in a bioengineering laboratory. Across the Disciplines, 6. Retrieved November 8, 2012, from https://wac.colostate.edu/docs/atd/technologies/volz_saterbak.doc.