Using PeerMark - guidance for staff

Contents


...

  1. Staff set up a Turnitin assignment to which students submit their work.
  2. Staff set up an associated PeerMark, including review questions and settings for how many pieces of work each student reviews, by when, how papers are allocated, whether reviews are anonymous, etc.
  3. Students submit their work to the Turnitin assignment.
  4. Students review others' work.
  5. Students receive the feedback given by their peers, along with a mark for their own review(s).
  6. [[CHECK]] Depending on the timing, students can then incorporate this feedback into a final, perhaps credit-bearing, submission.

Info
Edinburgh University has made a number of PeerMark case studies available, including the experiences and thinking of staff setting up peer assessment for the first time. NB: you may need to allow the media in your web browser's security settings. For general peer assessment design principles and case studies, see the University of Strathclyde's PEER Toolkit and the contributions from Eva Sorensen (Chemical Engineering) and Richard Milne (Virology) on UCL's Teaching & Learning Portal. For troubleshooting, see this guidance from the University of Reading.

...

In the 'PeerMark Assignment' tab of the PeerMark Manager, you enter basic information about the activity.

Title

This will appear for students and should be distinctive and descriptive.

Point value (required)

The marks available for the peer review itself, i.e. not for the work being reviewed. This reflects research findings that asking students to assign numeric marks to their peers exacerbates any sense of risk and brings undue complications and pressure to peer review without any particular learning benefit.

Instructions to students

Brief guidance about what students should do and why.

Start date, Due date, Post date

NB: How do these relate to the Turnitin assignment's dates?

Make sure you click the 'Save & Continue' button to proceed to the next tab.


 

Info

Considerations

Instructions. Students tend to prefer tutor marking, which may indicate positivist beliefs about objectivity in marking and the assumption that there is a correct mark for their work which is not open to interpretation (McConlogue, 2012). Most researchers into peer assessment (including Bloxham and West, 2007; McConlogue, 2014; Nicol, 2010; Topping, 2009) stress the need to discuss with students the rationale, criteria and expectations for peer review before, during and after the activity, rather than relying on textual instructions alone. Discussing or negotiating expectations could clarify how much time students are expected to spend on each review and how much feedback they should give. These particulars would help to even out the quality and quantity of peer feedback and avoid perceptions of unfairness (Cartney, 2010).

Point value. This should be sufficient to indicate to the students that their participation in peer review matters.

Dates. Because PeerMark is for formative feedback only, it opens up possibilities for students to review draft work at an early, relatively unpolished stage which remains open to rewriting on the basis of feedback (Colvill, 2010). In that case, set the Feedback Release Date to allow time for students to make changes in advance of their final credit-bearing submission. The time allowed for the PeerMark activity (i.e. between the Start Date and the Due Date) should reflect the time students are expected to spend, and allow for their other commitments.

 

 

...

On the 'PeerMark Assignment' tab there is a link for additional settings. Here is some explanation of the less obvious ones.

'Award full points if review is written' 

If ticked, tutors will not be able to mark the reviews; a student will need to meet the set requirements for every part of the review in order to get the available marks, on an all-or-nothing basis. If unticked, tutors can assign and differentiate marks for each student's review.

'Allow students to view author and reviewer names'

If left unticked, you probably need to remind students not to put any identifying information in the title, filename, or body of their work.

'Paper(s) automatically distributed by Peermark'

This sets the number of randomly allocated papers each student has to review.

'Paper(s) selected by the student'

This sets the number of papers a student can choose to review. Students can review a combination of allocated and selected papers.

'Require self-review'

If ticked, each student has to review their own paper. It isn't currently possible to select self-review only: the number of papers automatically distributed by PeerMark has to be at least one.

Info

Considerations

Award full points if the review is written. Where this all-or-nothing setting is deployed as an incentive to participate, keep in mind the importance of dialogue at all stages.

Allow students to view author and reviewer names. Setting the peer review to be anonymous will prevent friendship, enmity or power dynamics from determining the review, and forestall collusion. Clear criteria, and an ethos which encourages mutual constructive criticism while discouraging platitudes, are other measures to allay the social discomfort students may feel about commenting on others' work. It may be necessary to work out with students a convention for referencing each other's work in the absence of names, should they want to do so.

Allow submitters to read all papers after the Start Date. As well as allowing students to benchmark their work, this allows students to select work to review, if this has been enabled in the settings.

Allow students to read ALL papers and ALL reviews after the Feedback Release Date. Again, this communicates to students that they are welcome and encouraged to benchmark both their submissions and their reviews, and opens up the possibility of conversations which outlast the PeerMark activity.

Distribution of papers. Keep in mind boredom, tiredness and time pressures when deciding how many submissions each student should review. Falchikov and Goldfinch (2000) found that larger numbers of reviewers did not bring any validity gains and may reduce reliability due to the 'diffusion of responsibility' effect, whereby students are less likely to perceive their own review as mattering. (Falchikov and Goldfinch were comparing peer and tutor numeric marks rather than feedback, though.)

Require self-review. This would provide opportunities for students

 

...