
Using PeerMark - guidance for staff


...

  1. Staff set up a Turnitin assignment to which students submit their work.
  2. Staff set up an associated PeerMark assignment, including review questions, how many pieces of work each student reviews, by when, how reviews are allocated, whether they are anonymous, etc.
  3. Students submit their work to the Turnitin assignment.
  4. Students review others' work - formative feedback only, without a numeric mark.
  5. Students receive the feedback given by their peers, along with a mark for their own review(s).
  6. Depending on the timing, students can then incorporate this feedback into a final, perhaps credit-bearing, submission.
  7. Students can receive a mark from a staff assessor for their review.

Info
Edinburgh University has made a number of PeerMark case studies available, including the experiences and thinking of staff setting up peer assessment for the first time. NB you may need to allow the media in your web browser's security settings. For general peer assessment design principles and case studies, see the University of Strathclyde's PEER Toolkit and contributions from Eva Sorensen (Chemical Engineering) and Richard Milne (Virology) on UCL's Teaching & Learning Portal. For troubleshooting, see this guidance from the University of Reading.

...

Info

Considerations

Won't students take each other's ideas? This is a reservation widely held by students. Richard Milne (UCL Centre for Virology) comments on his own experience of setting up peer review activities: "I wasn't worried about students stealing each others' ideas ... you discuss a subject with somebody else and then formulate your own way of thinking about it based on the conversation you’ve had". Students can be encouraged to credit each other's ideas (and a convention can be agreed for circumstances of anonymity).

Can students at any level of knowledge carry out good peer reviews? In their meta-analysis comparing the validity of tutor and student assessments, Falchikov and Goldfinch (2000) could not find evidence that peer assessment in higher-level courses was any more reliable (between different assessors) or valid (against a standard) than at introductory levels. They speculate that careful preparation by tutors and students can compensate for the more limited subject knowledge of students at earlier stages of their course.


Can peer assessment work in every subject area? Although they found some differences, Falchikov and Goldfinch's meta-analysis of academic-peer agreement in marking did not find that subject area had a significant effect on the quality of peer assessment. They also report that peer assessment of academic products (e.g. essays, posters) or processes (e.g. oral presentation skills, groupwork participation) has more validity than peer assessment in the context of professional practice (e.g. internships). This may be related to students' greater experience with academic products and processes. Their research also suggests that while students can cope with peer assessment in one new discipline, requiring multi-disciplinary assessments is likely to reduce validity.

...

In the 'Peermark Assignment' tab of the PeerMark Manager you enter basic information about the activity.

Title

This will appear for students and should be distinctive and descriptive.

Point value (required)

The marks available for the peer review itself - i.e. not for the reviewed work. This reflects research findings that asking students to assign numeric marks to their peers exacerbates any sense of risk and adds undue complication and pressure to peer review without bringing any particular learning benefit.

Instructions to students

Brief guidance about what students should do and why.

Start date, Due date, Post date

NB check how these relate to the Turnitin assignment's own dates.

Make sure you click the 'Save & Continue' button to proceed to the next tab.


 

Info

Considerations

Instructions. Students tend to prefer tutor marking, which may indicate positivist beliefs about objectivity in marking and an assumption that there is a correct mark for their work which is not open to interpretation (McConlogue, 2012). Most researchers into peer assessment (including Bloxham and West, 2007; McConlogue, 2014; Nicol, 2010; Orsmond, 2004; Topping, 2009) stress the need to involve students in discussing - and ideally negotiating - the yardstick against which they will measure themselves and others. They recommend discussing the rationale, criteria and expectations for peer and/or self review before, during and after the activity, rather than relying on textual instructions alone. Discussing or negotiating expectations can clarify how much time students are expected to spend on each review and how much feedback should be given, which will help even out the quality and quantity of peer feedback and avoid perceptions of unfairness (Cartney, 2010).

Dates. The fact that PeerMark is for formative feedback only opens up the possibility of students reviewing draft work at an early, relatively unpolished stage which remains open to rewriting on the basis of feedback (Covill, 2010). If so, set the Feedback Release Date to allow time for students to make changes in advance of their final, perhaps credit-bearing, submission. The time allowance for the PeerMark activity (i.e. between the Start Date and Due Date) should reflect the time students are expected to spend, and allow for their other commitments.

 

 

...

On the 'Peermark Assignment' tab there is a link for additional settings. Here's some explanation for the less obvious ones.

'Award full points if review is written' 

If ticked, tutors will not be able to mark the reviews; a student will need to meet the set requirements for every part of the review in order to get the available marks, on an all-or-nothing basis. If unticked, tutors can assign and differentiate marks for each student's review.

'Allow students to view author and reviewer names'

If left unticked, you probably need to remind students not to put any identifying information in the title, filename, or body of their work.

'Paper(s) automatically distributed by Peermark'

This sets the number of randomly allocated papers each student has to review.

'Paper(s) selected by the student'

This sets the number of papers a student can choose to review. Students can review a combination of allocated and selected papers.

'Require self-review'

If checked, a student has to review their own paper. It isn't currently possible to select self-review only - the number of papers allocated by PeerMark has to be at least one.

Info

Considerations

Award full points if the review is written. Where this all-or-nothing setting is deployed as an incentive to participate, keep in mind the importance of dialogue at all stages.

Allow students to view author and reviewer names. Setting the peer review to be anonymous will prevent friendship, enmity or power dynamics from determining the review, and will forestall collusion. Clear criteria and an ethos which encourages mutual constructive criticism while discouraging platitudes are other measures to allay the social discomfort students may feel about commenting on others' work. It may be necessary to work out with students a convention for referencing each other's work in the absence of names, should they want to do so.

Allow submitters to read all papers after the Start Date. As well as letting students compare different work, this allows them to select work to review, if this has been enabled in the settings.

Allow students to read ALL papers and ALL reviews after the Feedback Release Date. Again, this communicates to students that they are welcome and encouraged to benchmark both their submissions and their reviews, and opens up the possibility of conversations which outlast the PeerMark activity.

Distribution of papers. Keep in mind boredom, tiredness and time pressures when deciding how many submissions each student should review. Falchikov and Goldfinch (2000) found that larger numbers of reviewers did not bring any validity gains and may reduce reliability due to the 'diffusion of responsibility' effect, whereby students are less likely to perceive their own review as mattering. (Falchikov and Goldfinch were comparing peer and tutor marks rather than feedback, though.)

Require self-review. Since one of the aims of peer assessment is to help students use the criteria in their own work, self-review is likely to be a helpful exercise (Orsmond, 2004). However, it may pose a distinct, idiosyncratic or cultural set of complications related to self-esteem, self-confidence, modesty, and how students habitually estimate their own ability (Saito and Fujita, 2004). For this reason PeerMark requires at least one peer review, whether or not there is a self-review.

 

Adding Questions

The 'PeerMark Questions' tab of the PeerMark Manager allows you to create the questions you want the peer reviewers to answer. To add a question, click 'Add question'.

Enter your question text and choose the question type. There are two types of question you can use: a 'Free Response' question, for example "What is the thesis of the paper?", and a 'Scale' question, for example "How well does the introduction pull you in as a reader?" with a scale from 'Not very well' to 'Really well'.

 

For a 'Free Response' question, enter the minimum answer length (in words).

For a 'Scale' question, enter the scale size and the lowest and highest values.

You can also use libraries to manage your PeerMark questions. Clicking on 'Library Settings' allows you to create and delete libraries, and to save and retrieve questions from those libraries. There is also a 'Sample Library' from which you can add pre-made questions.

 

Info

Considerations

  • Who decides the questions? Although the questions will be aligned to the assessment criteria and the intended learning outcomes of the course, most of the work on peer assessment stresses the importance of involving students in developing and clarifying the criteria, even if they arrive at similar criteria to the tutors'. The purpose is to increase a sense of ownership, reduce anxiety, and reach a shared understanding about the meaning of the criteria, which improves reliability and validity (Falchikov and Goldfinch, 2000) - and with those, confidence in the process. Orsmond (2004, Section 2) discusses alternative techniques for introducing assessment criteria to students, including practice in applying the criteria.
  • What kinds of questions? Falchikov and Goldfinch (2000) found that asking students to make a global judgement (i.e. of the submission as a whole) based on distinct criteria was more effective than either a judgement without criteria, or separate judgements of separate dimensions of the submission. McConlogue (2014) points out that as well as value judgements, reviewed students will also expect feedback that makes suggestions about how to improve; PeerMark open questions allow tutors to prompt student reviewers for these suggestions.
     
  • Order of questions. Topping (2009) recommends asking students to give positive feedback first, since this improves subsequent acceptance of negative feedback.
  • Opportunities for practice. Again, there is a clear recommendation from the literature that students have the opportunity to rehearse working with the criteria. This could fit well with the aforementioned recommended discussion of the criteria.

 

 

Distribution

Note

Please note that after reviewing has started you won't be able to pair students - so do make any allocations in advance.

In the 'Distribution' tab of the PeerMark Manager you can see all the student accounts associated with this assignment and how they will be allocated reviews. If you want to, this is also where you can determine who reviews whose work.

If you can't see all the accounts you are expecting, click outside of the Peermark Manager to return to your Turnitin assignment page; then click its 'Turnitin Students' tab. From there you can click 'Enrol all students', which will bring in all students 'enrolled' in that Moodle course area.

 

If you need to exempt a student from the PeerMark activity, you can exclude them by clicking their adjacent red Minus icon; their name displays greyed-out and they gain a green Plus icon, which you can click if you need to reinstate them.

If you want to pair students (so that a particular student is allocated the work of another particular student to review, overriding any other distribution settings) you can do so by clicking the blue Plus icon and then selecting a student to pair with from the dropdown list. Paired students are then required to review the work they are allocated.

Info

Considerations

Does it matter which students review which other students' work? Tutors may want to connect students on the basis of interest. As well as matching students through PeerMark, another way to achieve this is to set up groups in your Moodle area and apply these to the Turnitin assignment. Alternatively, PeerMark has the option of letting students choose the work they review (though this introduces the possibility that some students will receive more feedback than others - usually contentious).

 

 

 

Accessing PeerMark reviews

In the 'Submission Inbox' you can see details of all the PeerMark assignments set up for that Turnitin assignment.

Click on the 'Launch Peermark Reviews' icon

If there is more than one PeerMark assignment set up for this Turnitin assignment, you can select the one you want.

The 'Reviews' tab shows you a list of the students.

Students that have submitted a paper will have an icon next to them under the 'Review' column. The 'Received' column shows how many reviews a student's submission has received. The 'Submitted' column shows how many reviews a student has submitted. Clicking on the numbers with a grey background takes you to either the 'Received Reviews' or the 'Submitted Reviews' tab, and from there, clicking on the blue 'tick' icon launches the document viewer.

Tutors can also write reviews. Clicking on the blue 'Write instructor review' icon in the 'Review' column allows you to write an additional review as the assignment tutor. NB check whether reviewed students can distinguish the tutor review from peer reviews.

If you have left a review for a submission it will display a green 'Edit instructor review' icon.

 

 

...

References

  • Bloxham, S., & West, A. (2007). Learning to write in higher education: students’ perceptions of an intervention in developing understanding of assessment criteria. Teaching in Higher Education, 12(1), 77–89.
  • Cartney, P. (2010). Exploring the use of peer assessment as a vehicle for closing the gap between feedback given and feedback used. Assessment & Evaluation in Higher Education, 35(5), 551–564. 
  • Covill, A. (2010). Comparing Peer Review and Self-Review as Ways to Improve College Students’ Writing. Journal of Literacy Research, 42(2), 199–226.
     
  • Falchikov, N., & Goldfinch, J. (2000). Student Peer Assessment in Higher Education: A Meta-Analysis Comparing Peer and Teacher Marks. Review of Educational Research, 70(3), 287–322.
  • McConlogue, T. (2012). But is it fair? Developing students’ understanding of grading complex written work through peer assessment. Assessment & Evaluation in Higher Education, 37(1), 113–123.
  • McConlogue, T. (2014). Making judgements: investigating the process of composing and receiving peer feedback. Studies in Higher Education, 1–12.
  • Milne, R. (2013). Peer review of virology essays. Available from: https://www.ucl.ac.uk/teaching-learning/case-studies-news/assessment-feedback/peer-review-of-virology-essays
  • Nicol, D. (2007). Peer Evaluation in Assessment Review project. Available from: http://www.reap.ac.uk/PEER.aspx
  • Nicol, D. (2010). From monologue to dialogue: improving written feedback processes in mass higher education. Assessment & Evaluation in Higher Education, 35(5), 501–517.
  • Orsmond, P. (2004). Self- and peer-assessment: guidance on practice in the biosciences. Leeds: Centre for Bioscience, Higher Education Academy.
  • Saito, H., & Fujita, T. (2004). Characteristics and user acceptance of peer rating in EFL writing classrooms. Language Teaching Research, 8(1), 31–54.
     
  • Schommer, M. (1990). Effects of beliefs about the nature of knowledge on comprehension. Journal of Educational Psychology, 82(3), 498–504.
  • Sorensen, E. (2013). Experiences of using peer assessment in a 4th year design module. Available from: http://www.ucl.ac.uk/teaching-learning/case-studies-news/assessment-feedback/peer-assessment-chemical-engineering
  • Topping, K. J. (2009). Peer Assessment. Theory Into Practice, 48(1), 20–27.
  • Yorke, M. (2003). Formative assessment in higher education: moves towards theory and the enhancement of pedagogic practice. Higher Education, 45, 477–501.
 

...

 

...