Using PeerMark - guidance for staff

...

  • Students' ability to understand and work with assessment criteria.
  • Students' participation in the authentic academic practice of peer review.
  • Insights, gained through articulating judgements and producing constructive feedback which suggests how peers can improve, into how students can go about critiquing and improving their own work.
  • The possibility of feedback that is quicker, more individualised, and more plentiful than tutors are able to provide. 
  • The possibility of feedback on students' draft work, with sufficient time for amendments before its deadline.
  • Avoiding 'learned dependence' (Yorke, 2003) - students' over-reliance on tutor opinions, and over-humility about the importance of their own understandings.
  • Triangulation - the original submission, peer reviews and tutor assessment (not to mention self assessment where used) can be compared, giving students new perspectives on their submission, the criteria, and the reviews they have written.
  • Relatedly, insights into subjectivity and governance in the assessment process.
  • Also relatedly, a departure from monologic, transmissive feedback as students weigh up the differences in the reviews. This in turn promises a desirable change in the way feedback is received from simple certainties to more sophisticated, evaluative thinking (Schommer, 1990).
  • An occasion for dialogue with tutors and peers about assessment.

...

Info

Considerations

Won't students take each other's ideas? This is one reservation which is widely held by students. Richard Milne (UCL Centre for Virology) comments on his own experience of setting up peer review activities: "I wasn't worried about students stealing each other's ideas ... you discuss a subject with somebody else and then formulate your own way of thinking about it based on the conversation you've had". Students can be encouraged to credit each other's ideas (and a convention can be agreed for circumstances of anonymity).

Can students at any level of knowledge carry out good peer reviews? In their meta-analysis comparing the validity of tutor and student assessments, Falchikov and Goldfinch (2000) could not find evidence that peer assessment in higher-level courses was any more reliable or valid than at introductory levels. They speculate that careful preparation by tutors and students can compensate for the more limited subject knowledge of students at earlier stages of their course.


Can peer assessment work in every subject area? Although Falchikov and Goldfinch (2000) found some differences in their meta-analysis of academic-peer agreement in marking (arts, social sciences and medical sciences had lower agreement in some cases), they did not find that subject area had a significant effect on the quality of peer assessment. They also report that peer assessments of academic products (e.g. essays, posters) or processes (e.g. oral presentation skills, groupwork participation) have more validity than those in the context of professional practice (e.g. internships). This may be related to students' greater experience with academic products and processes. Their research also suggests that while students can cope with peer assessment in one new discipline, requiring multi-disciplinary assessments is likely to reduce validity.

...

In the 'PeerMark Assignment' tab of the PeerMark Manager you enter basic information about the activity.

Title

This will appear for students and should be distinctive and descriptive.

Point value (required)

The marks available for the peer review itself, i.e. not for the reviewed work. This reflects research findings that asking students to assign numeric marks to their peers exacerbates any sense of risk and adds undue complication and pressure to peer review without bringing any particular learning benefit.

Instructions to students

Brief guidance about what students should do and why.

Start date, Due date, Post date

NB How do these relate to the Turnitin assignment's dates?

Make sure you click the 'Save & Continue' button to proceed to the next tab.
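To make these fields and their relationships concrete, here is a minimal sketch in plain Python of the basic settings described above. The field names, the example values, and the assumption that the dates run Start, then Due, then Post are illustrative only - this is not a Turnitin API.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class PeerMarkBasics:
    # Hypothetical summary of the 'PeerMark Assignment' tab fields above;
    # field names are illustrative, not part of any Turnitin API.
    title: str           # distinctive and descriptive; shown to students
    point_value: int     # marks for the review itself, not the reviewed work
    instructions: str    # brief guidance on what to do and why
    start_date: datetime
    due_date: datetime
    post_date: datetime  # when feedback is released to students

    def check_dates(self) -> None:
        # Assumption: dates run start -> due -> post; check against your own setup.
        if not (self.start_date <= self.due_date <= self.post_date):
            raise ValueError("Expected start_date <= due_date <= post_date")

# Example: a one-week review window, with feedback released a week after the due date.
plan = PeerMarkBasics(
    title="Peer review of draft essay",
    point_value=10,
    instructions="Read the allocated drafts and answer the review questions constructively.",
    start_date=datetime(2024, 3, 1),
    due_date=datetime(2024, 3, 8),
    post_date=datetime(2024, 3, 15),
)
plan.check_dates()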


 

Info

Considerations

Instructions. Students tend to prefer tutor marking, which may indicate positivist beliefs about objectivity in marking and the assumption that there is a correct mark for their work which is not open to interpretation (McConlogue, 2012). Most researchers into peer assessment (including Bloxham and West, 2007; McConlogue, 2014; Nicol, 2010; Topping, 2009) stress the need to involve students in discussing, and ideally negotiating, the rationale, criteria and expectations for peer review before, during and after the activity, rather than relying on textual instructions alone. Discussing or negotiating expectations could clarify how much time students are expected to spend on each review and indicate how much feedback should be given. These discussions can help to even out the quality and quantity of peer feedback and avoid perceptions of unfairness (Cartney, 2010).

Point value. This should be sufficient to indicate to students that their participation in peer review matters.

Dates. The fact that PeerMark is for formative feedback only raises possibilities for students to review draft work at an early, relatively unpolished stage which remains open to rewriting on the basis of feedback (Covill, 2010). In that case, set the Post date (when feedback is released) to allow time for students to make changes in advance of their final credit-bearing submission. The time allowance for the PeerMark activity (i.e. between the Start date and Due date) should reflect the time students are expected to spend, and allow for their other commitments.

 

 

...

On the 'PeerMark Assignment' tab there is a link to additional settings. Here is some explanation of the less obvious ones.

'Award full points if review is written' 

If ticked, tutors will not be able to mark the reviews; a student will need to meet the set requirements for every part of the review in order to get the available marks, on an all-or-nothing basis. If unticked, tutors can assign and differentiate marks for each student's review.

'Allow students to view author and reviewer names'

If left unticked, you probably need to remind students not to put any identifying information in the title, filename, or body of their work.

'Paper(s) automatically distributed by PeerMark'

This sets the number of randomly allocated papers each student has to review.

'Paper(s) selected by the student'

This sets the number of papers a student can choose to review. Students can review a combination of allocated and selected papers.

'Require self-review'

If ticked, a student has to review their own paper. It isn't currently possible to select self-review only - the number of papers automatically distributed by PeerMark has to be at least one.
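As a rough planning aid for the distribution settings above, the sketch below adds up how many reviews each student would write. The function name is hypothetical and the arithmetic (the self-review counted as one extra review on top of allocated and selected papers) is an assumption, not a Turnitin calculation; the only constraint taken from this page is that self-review cannot be used without at least one automatically distributed paper.

def reviews_per_student(auto_distributed: int, self_selected: int,
                        require_self_review: bool) -> int:
    # Hypothetical planning aid: total reviews each student writes under the
    # settings described above (not a figure produced by Turnitin itself).
    if require_self_review and auto_distributed < 1:
        # The page notes self-review alone is not possible: at least one paper
        # must be automatically distributed by PeerMark.
        raise ValueError("Self-review requires at least one automatically distributed paper")
    return auto_distributed + self_selected + (1 if require_self_review else 0)

# e.g. two allocated papers, one chosen by the student, plus a self-review
print(reviews_per_student(2, 1, True))  # -> 4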

Info

Considerations

Award full points if the review is written. Where this all-or-nothing setting is deployed as an incentive to participate, keep in mind the importance of dialogue at all stages.

Allow students to view author and reviewer names. Setting the peer review to be anonymous will prevent friendship, enmity or power processes from determining the review, and will forestall collusion. Clear criteria and an ethos which encourages mutual constructive criticism while discouraging platitudes are other measures to allay the social discomfort students may feel about commenting on others' work. It may be necessary to work out with students a convention for referencing each other's work in the absence of names, should they want to do so.

Allow submitters to read all papers after the Start Date. As well as allowing students to benchmark their work by comparing it with others', this allows them to select work to review, if this has been enabled in the settings.

Allow students to read ALL papers and ALL reviews after the Feedback Release Date. Again, this communicates to students that they are welcome and encouraged to benchmark both their submissions and their reviews, and opens up the possibility of conversations which outlast the PeerMark activity.

Distribution of papers. Keep in mind boredom, tiredness and time pressures when deciding how many submissions each student should review. Falchikov and Goldfinch (2000) found that larger numbers of reviewers did not bring any validity gains and may reduce reliability due to the 'diffusion of responsibility' effect, whereby students are less likely to perceive their own review as mattering. (Falchikov and Goldfinch were comparing peer and tutor numeric marks rather than feedback, though.)

Require self-review. Orsmond (2004) suggests that reviewing their own work gives students opportunities to apply the criteria to their own submission and to compare their own judgements with those of their reviewers.

 

...

The 'PeerMark Questions' tab of the PeerMark Manager allows you to create the questions you want the peer reviewers to answer. To add a question, click 'Add question'.

Enter your question text and choose the question type. There are two types of question you can use: a 'Free Response' question, for example "What is the thesis of the paper?", and a 'Scale' question, for example "How well does the introduction pull you in as a reader?" (scale from 'Not very well' to 'Really well').

 

For a 'Free Response' question, enter the minimum answer length (counted in words).

For a 'Scale' question, enter the scale size and the lowest and highest values.

You can also use libraries to manage your PeerMark questions. Clicking 'Library Settings' allows you to create and delete libraries, and to save and retrieve questions from those libraries. There is also a 'Sample Library' from which you can add pre-made questions.
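For planning purposes, the two example questions above can be written out as simple data before being entered into PeerMark. The structure below is an illustrative sketch only - the class names, the minimum answer length of 30 words and the five-point scale are assumptions, not values taken from Turnitin.

from dataclasses import dataclass
from typing import Union

@dataclass
class FreeResponseQuestion:
    prompt: str
    min_answer_length: int   # minimum answer length, counted in words

@dataclass
class ScaleQuestion:
    prompt: str
    scale_size: int          # number of points on the scale
    lowest_label: str
    highest_label: str

# The example questions from this page, expressed as data (values are illustrative).
questions: list[Union[FreeResponseQuestion, ScaleQuestion]] = [
    FreeResponseQuestion("What is the thesis of the paper?", min_answer_length=30),
    ScaleQuestion("How well does the introduction pull you in as a reader?",
                  scale_size=5, lowest_label="Not very well", highest_label="Really well"),
]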

 

Info

Considerations

  • Questions. These relate to the assessment criteria and the intended learning outcomes of the course. However, there is a clear message from the peer assessment literature about the importance of involving students in developing and clarifying criteria, even if they arrive at similar criteria to the tutors. The purpose here is to increase a sense of ownership, reduce anxiety, and also reach a shared understanding of the meaning of the criteria, which improves reliability and validity (Falchikov and Goldfinch, 2000) - and with those, confidence in the process.
  • What kinds of questions? Orsmond (2004, Section 2) discusses alternative techniques for introducing assessment criteria to students, including practice in applying the criteria. McConlogue (2014) points out that, as well as value judgements, reviewed students will also expect feedback that makes suggestions about how to improve.
  • Order of questions. Topping (2009) recommends asking students to give positive feedback first, since this improves subsequent acceptance of negative feedback.
  • Opportunities for practice. Again, there is a clear recommendation from the literature that students should have the opportunity to rehearse working with the criteria. This could fit well with the discussion of the criteria recommended above.

 

...

References

  • Bloxham, S., & West, A. (2007). Learning to write in higher education: students’ perceptions of an intervention in developing understanding of assessment criteria. Teaching in Higher Education, 12(1), 77–89.
  • Cartney, P. (2010). Exploring the use of peer assessment as a vehicle for closing the gap between feedback given and feedback used. Assessment & Evaluation in Higher Education, 35(5), 551–564. 
  • Covill, A. (2010). Comparing Peer Review and Self-Review as Ways to Improve College Students’ Writing. Journal of Literacy Research, 42(2), 199–226.
     
  • Falchikov, N., & Goldfinch, J. (2000). Student Peer Assessment in Higher Education: A Meta-Analysis Comparing Peer and Teacher Marks. Review of Educational Research, 70(3), 287–322.
  • McConlogue, T. (2012). But is it fair? Developing students’ understanding of grading complex written work through peer assessment. Assessment & Evaluation in Higher Education, 37(1), 113–123.
  • McConlogue, T. (2014). Making judgements: investigating the process of composing and receiving peer feedback. Studies in Higher Education, 1–12. 
  • Milne, R. (2013). Peer review of virology essays. Available from: https://www.ucl.ac.uk/teaching-learning/case-studies-news/assessment-feedback/peer-review-of-virology-essays
  • Nicol, D. (2007). Peer Evaluation in Assessment Review project. Available from http://www.reap.ac.uk/PEER.aspx
  • Nicol, D. (2010). From monologue to dialogue: improving written feedback processes in mass higher education. Assessment & Evaluation in Higher Education, 35(5), 501–517.
  • Orsmond, P. (2004). Self- and peer-assessment: guidance on practice in the biosciences. Leeds: Centre for Bioscience, Higher Education Academy.
  • Schommer, M. (1990). Effects of beliefs about the nature of knowledge on comprehension. Journal of Educational Psychology, 82(3), 498–504.
  • Sorensen, E. (2013). Experiences of using peer assessment in a 4th year design module. Available from: http://www.ucl.ac.uk/teaching-learning/case-studies-news/assessment-feedback/peer-assessment-chemical-engineering
  • Topping, K. J. (2009). Peer Assessment. Theory Into Practice, 48(1), 20–27.
  • Yorke, M. (2003). Formative assessment in higher education: moves towards theory and the enhancement of pedagogic practice. Higher Education, 45, 477–501.
 

 

 
