Case Study: Gamified Peer Review: Making Feedback Fun
Freyja Thoroddsen Sigurdardottir; Rannveig Sigurvinsdottir; and John Baird
Summary
This case study focuses on a Gamified Peer Review (GPR) activity designed to position students as partners in a learning process that promotes engagement, feedback quality, and skill development. Our design gamifies the ‘student-as-rater’ action where students evaluate the quality of feedback received, an under-incentivised and under-studied element of GPR design. Most students found the activity valuable and reported performance gains in giving and receiving feedback. Motivation was enhanced by meaningful peer review and the game element, but somewhat undermined by perceived unfairness and excessive workload.
Overview
Peer review is a structured and reciprocal process where students evaluate and provide feedback on peers’ work while also receiving feedback on their own. They can then use this as ‘feed forward’ to improve their work before final submission (Nicol et al., 2014; Quinton & Smallbone, 2010). Gamification incorporates game elements – points and leaderboards being the most common (Indriasari et al., 2020) – into learning to enhance student engagement (Dicheva et al., 2018; Zeybek & Saygı, 2023).
Important characteristics of active learning include:
- active engagement in constructing one’s learning (Prince, 2004)
- taking responsibility for one’s own learning and, sometimes, that of others
- engagement in higher order cognitive skills like evaluation and creation (Freeman et al., 2014; Odom et al., 2009).
Our Gamified Peer Review (GPR) activity is a form of active learning because students evaluate and produce feedback on each other’s work before critically evaluating this feedback and using it to improve their own work. The game component contributes to a positive motivational climate. Our design gamifies the ‘student-as-rater’ action where students evaluate the quality of feedback received, an under-incentivised and under-studied element of GPR design (Indriasari et al., 2020).
Activity Design
The GPR activity was introduced in 2023 to a course with a typical enrolment of 15–25 undergraduate students. We used the FeedbackFruits Peer Review tool to support implementation.
GPR has two components (see Figure 1):
- Peer review (25% of the final grade): Students develop a presentation slide deck over the semester, submitting it three times for a round of peer review (‘student-as-author’; Indriasari et al., 2020). They also review the slide decks of three peers each round (‘student-as-reviewer’; Indriasari et al., 2020). Students use the feedback to refine their final submission (‘student-as-author’). They also submit a reflection justifying changes made (or not) based on the feedback (‘student-as-author’).
- Gamified rating (10% of the final grade): For each round of peer review, students rank the helpfulness of the feedback they receive from each peer by awarding gold, silver, and bronze medals (‘student-as-rater’; Indriasari et al., 2020). These medals translate into experience points (XPs) which contribute to their final grade (see Figure 2). A leaderboard displaying accumulated XPs is published after each review round.
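As a concrete illustration, the medal-to-XP tally behind the leaderboard can be sketched in a few lines of Python. The XP values per medal below are hypothetical stand-ins; the actual conversion is the one defined in Figure 2.

```python
# Minimal sketch of the gamified rating mechanic. The XP values per medal
# are hypothetical; the real conversion is defined in Figure 2.
from collections import defaultdict

XP_PER_MEDAL = {"gold": 30, "silver": 20, "bronze": 10}  # hypothetical values

def tally_xp(ratings):
    """Sum XPs per reviewer from (reviewer, medal) ratings across all rounds."""
    totals = defaultdict(int)
    for reviewer, medal in ratings:
        totals[reviewer] += XP_PER_MEDAL[medal]
    return dict(totals)

def leaderboard(totals):
    """Return reviewers ranked by accumulated XPs, highest first."""
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)

# Two rounds of ratings for three reviewers A, B, C
ratings = [("A", "gold"), ("B", "silver"), ("C", "bronze"),
           ("B", "gold"), ("A", "bronze")]
print(leaderboard(tally_xp(ratings)))  # [('B', 50), ('A', 40), ('C', 10)]
```

Whatever tool is used, the design choice is the same: medals are ranked (each reviewer can receive at most one medal per round from each author), and the accumulated totals drive both the published leaderboard and the 10% grade component.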


The GPR process is asynchronous and multi-phased, enhancing accessibility. Students have up to a week to complete each step, allowing them to engage at a time that suits them. This structure is particularly helpful for those who benefit from additional processing time, experience performance anxiety, or are studying in a language that is not their first.
Students are encouraged to approach peer review with respect and care. They are reminded that reviews reflect real effort and should be evaluated thoughtfully. While students are not expected to agree with all feedback, they are expected to engage with it critically. In this way, GPR aligns with inclusive teaching principles by allowing students to engage through writing, reviewing, and evaluating feedback – each supporting different strengths and approaches to learning.
To create a safe partnership environment, peer reviews are anonymised – students do not know whose work they are reviewing or who reviewed theirs – and since 2025, the leaderboard is anonymous by default. These features reduce social pressure and mitigate bias, particularly for students who may feel excluded in more traditional or competitive settings.
The GPR activity is discipline agnostic and can be used with any student-generated ‘product’. In our case, this was a slide deck, but it could equally be an essay, report, video, podcast, and so on.
There are a range of educational technology tools, or combinations of tools, that could be used to implement this activity. This case study is intended to empower educators to adapt the GPR activity to their educational technology context.
Rationale
Feedback is crucial for student learning (Chickering & Gamson, 1987; Fraser et al., 1987; Topping, 1998) but can be time-consuming. Timely feedback is particularly important when students can revise and resubmit assignments (Henderson et al., 2019). Peer review serves as an effective alternative to teacher feedback, enhancing learning for both the reviewer and the recipient without adding to the teacher’s workload (Nicol et al., 2014). Students often find peer feedback more understandable and helpful than teacher feedback (Falchikov, 2005; Topping, 1998) while receiving multiple peer reviews increases the quantity and diversity of feedback (Topping, 1998).
A vital factor in the success of any (peer review) activity is ‘the extent to which [students] intend to engage in [the] activity’ i.e., their motivation (Jones, 2018, p. 5). Motivation is essential for effective learning (Williams & Williams, 2011) and is linked to curiosity, persistence, and performance (Vallerand et al., 1992). Gamification can enhance motivation by incorporating game elements that promote participation and engagement (Zeybek & Saygı, 2023). In our GPR activity, gamification is not intended just as an extrinsic motivator – e.g., grades or credits (Ryan & Deci, 2000) – but as a structured approach to foster meaningful peer feedback.
Evaluation: Student Response
Survey data were collected from students in 2023 and 2024. Participation was voluntary and anonymous, and students could choose to skip questions. 24 of 27 students (89%) participated in 2023 and 17 of 18 (94%) in 2024.
The survey followed the APA ethical principles and code of conduct and was carried out in accordance with the principles of the Declaration of Helsinki. Quantitative data were used to examine the concepts of Acceptability (Perceived educational value and Process acceptance) (adapted from Grainger et al., 2018), Cost (adapted from Hulleman et al., 2016), and Reported performance gains.
Qualitative data were used to examine the impact of the peer review and game components on student motivation. We used the Expectancy-Value-Cost model of motivation (Barron & Hulleman, 2015; Hulleman et al., 2016) to frame our analysis.
Quantitative Analysis
An independent samples t-test showed no significant differences in the concepts studied between the years 2023 and 2024, so the data were pooled for analysis. Figures 3 through 6 show the percentage of participants who agreed and disagreed with survey statements in relation to each construct.




Did Students Perceive Educational Value in the Gamified Peer Review Activity?
50–60% of students agreed with most statements, so overall perceived educational value was reasonably high. A notable exception was whether the group wanted this activity to be implemented in other courses. While it is difficult to conclude exactly why, further scrutiny of the data revealed that the number of peer review rounds appears to make a difference. As Figure 7 shows, with four rounds of peer review (2023 cohort), 54.2% of participants would not have recommended this approach for use in other courses; with only three rounds (2024 cohort), that number dropped to 17.6%. We suggest this opinion reflects the perceived workload of additional rounds of peer review, and therefore recommend using three rounds.

Were Students Satisfied with the Implementation of the Gamified Peer Review Activity Process?
Most found the peer review tool easy to use, but the group was quite split on whether they were satisfied with the activity process and the guidance they received.
Did Students Report Performance Gains From Engaging in the Gamified Peer Review Activity?
73% reported performance gains in terms of their own approach to giving peer reviews and 88% in terms of the impact of feedback on the quality of their final submission. 55% felt that their own final submission had improved because of reviewing peers.
Did Engaging in the Gamified Peer Review Activity Come at a Cost?
Most felt it took too much time, and just over half felt it took time from more important things in the course.
How did Acceptability (Perceived Educational Value and Process Acceptance), Reported Performance Gains and Cost Relate to Each Other?
The survey items formed reliable factors, as items within each concept were highly intercorrelated (α > 0.70). The correlation analysis (Table 1) shows significant relationships between the concepts studied. We can see that perceived educational value was statistically related to greater satisfaction with the process, more reported performance gains, and lower perceived cost. Process acceptability was also related to greater reported performance gains.
Table 1. Correlations between the concepts studied

| | Process | Reported performance gains | Cost |
|---|---|---|---|
| Perceived educational value | 0.708** | 0.660** | –0.316* |
| Process | | 0.412** | –0.227 |
| Reported performance gains | | | –0.057 |

*p < 0.05, **p < 0.001
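The coefficients in Table 1 are Pearson correlations. As a minimal sketch of how such a coefficient is computed – using hypothetical 5-point survey scores, not the study’s data – the calculation reduces to covariance divided by the product of the standard deviations:

```python
# Pearson correlation from first principles. The scores below are
# hypothetical stand-ins, not the study's actual survey data.
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores for two concepts from the same eight respondents
value_scores = [4, 5, 3, 4, 2, 5, 4, 3]     # perceived educational value
process_scores = [4, 4, 3, 5, 2, 5, 3, 3]   # process acceptance
print(round(pearson_r(value_scores, process_scores), 3))  # prints 0.813
```

In practice a statistics package (e.g., `scipy.stats.pearsonr`) would also return the p-value used to flag significance in the table.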
Qualitative Analysis
Peer Review Component
Of the 33 comments on the peer review component, 16 were positive and 17 negative. The biggest positive impact on motivation related to the Value construct – Is the task worth doing? (15 of 16 positive comments). Students reported the peer review component as meaningful and practical (e.g., ‘useful’), intrinsically motivating (e.g., ‘interesting’), fostering positive peer relationships (e.g., ‘supportive’), and providing variety (e.g., ‘a nice change’).
By contrast, the biggest negative impact related to the Cost construct – Am I free of barriers that prevent me from investing my time, energy, and resources into the activity? (13 of 17 negative comments). Students felt the peer review component involved an unreasonable workload or was unnecessary, and that it provoked negative psychological reactions (e.g., ‘boring’).
Game Component
Of the 39 comments on the game component, 22 were positive and 17 negative. The biggest positive impact on motivation again related to the Value construct (20 of 22 positive comments). Students reported the game component as intrinsically motivating (e.g., ‘fun’) and meaningful and practical (e.g., ‘a good method to improve and take the task more seriously’).
By contrast, the biggest negative impact related to perceived unfairness in medal distribution (10 of 17 negative comments). This reflected either ‘friendship marking’ (Dochy et al., 1999), e.g., ‘some students give out higher medals to their friends’, or careless rating due to poor engagement (Kao, 2013), resulting in mismatches between the medals awarded and the quality of the feedback received. For example, one student commented: ‘I was discussing it with my peers and some of them gave medals randomly or gave the person who was criticising their work a bronze medal.’ Perceived unfairness is an overarching demotivational issue, potentially impacting one or all of the expectancy, value, and cost dimensions.
Evaluation: Instructor Response
The instructor has found that student reactions to the activity vary; some appreciate its value, while others see it as an unnecessary burden. She acknowledges that students may struggle with new teaching methods.
She has observed that, overall, students engage thoughtfully with peer feedback, distinguishing between useful critiques and those that may not apply. Some students have sought clarification on feedback they have received, either to ensure they understand it correctly or to discuss whether they should incorporate it. She also notes that the quality of the slide decks improves from draft to final submission.
Strategies to Support Student Engagement
We found the following strategies supported students in successfully engaging with this activity:
- Clear articulation of the activity’s value to ensure student buy-in.
- Provision of clear, multi-format guidance on completing the process and providing good feedback.
- A practice opportunity at the start of the course to ensure familiarity with the process and identify any technical issues.
- Use of course announcements and course calendar deadlines as reminders for completing GPR process stages.
- Teacher monitoring and timely support, particularly for the first round of reviews.
- In-class opportunities to complete all or part of the process. This facilitates student support and helps mitigate the Cost impact in terms of time.
Key Takeaways
Preparation: Do not implement this design at the last minute. Plan thoroughly before teaching begins.
Relevance and value: Emphasise that the design is about meaningful engagement, not excessive work. Invest in supporting student understanding of the value of the activity and how to successfully engage with it.
Simplicity: Limit the number of review rounds and keep the process as streamlined as possible, balancing workload against performance gains.
References
Barron K. E., & Hulleman, C. S. (2015). Expectancy-value-cost model of motivation. In J. D. Wright (Ed.), International Encyclopedia of Social and Behavioral Sciences, 8, 503–509. http://doi.org/10.1016/B978-0-08-097086-8.26099-6
Centre for Quality Support and Development. (2023). Using a variety of assessment to support inclusive learning. University of Reading. https://www.reading.ac.uk/cqsd/-/media/project/functions/cqsd/documents/ade/tandl-resources/prp-type-and-variety-of-assessments.pdf
Chickering, A. W., & Gamson, Z. F. (1987). Seven principles of good practice in undergraduate education. AAHE Bulletin, 39, 3–7.
Dicheva, D., Irwin, K., & Dichev, C. (2018). Motivational factors in educational gamification. In 2018 IEEE 18th International Conference on Advanced Learning Technologies. http://doi.org/10.1109/ICALT.2018.00102
Dochy, F., Segers, M., & Sluijsmans, D. (1999). The use of self-, peer and co-assessment in higher education: A review. Studies in Higher Education, 24(3), 331–350. http://doi.org/10.1080/03075079912331379935
Falchikov, N. (2005). Improving Assessment Through Student Involvement: Practical Solutions for Aiding Learning in Higher and Further Education. Routledge.
Fraser, B. J., Walberg, H. J., Welch, W. W., & Hattie, J. A. (1987). Syntheses of educational productivity research. International Journal of Educational Research, 11(2), 147–252. https://doi.org/10.1016/0883-0355(87)90035-8
Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111
Grainger, R., Dai, W., Osborne, E., & Kenwright, D. (2018). Medical students create multiple-choice questions for learning in pathology education: a pilot study. BMC Medical Education, 18(1), 201. https://doi.org/10.1186/s12909-018-1312-1
Henderson, M., Phillips, M., Ryan, T., Boud, D., Dawson, P., Molloy, E., & Mahoney, P. (2019). Conditions that enable effective feedback. Higher Education Research & Development, 38(7), 1401–1416. https://doi.org/10.1080/07294360.2019.1657807
Hulleman, C. S., Barron, K. E., Kosovich, J. J., & Lazowski, R. A. (2016). Student motivation: Current theories, constructs, and interventions within an expectancy-value framework. In A. A. Lipnevich et al. (Eds.), Psychosocial Skills and School Systems in the 21st Century (pp. 241–278). Springer International Publishing. http://doi.org/10.1007/978-3-319-28606-8_10
Indriasari, T. D., Luxton-Reilly, A., & Denny, P. (2020). Gamification of student peer review in education: A systematic literature review. Education and Information Technologies, 25, 5205–5234. https://doi.org/10.1007/s10639-020-10228-x
Jones, B. (2018). Motivating students by design: Practical strategies for professors (2nd ed.). CreateSpace.
Kao, G.Y.-M. (2013). Enhancing the quality of peer review by reducing student ‘free riding’: Peer assessment with positive interdependence. British Journal of Educational Technology, 44(1), 112–124. https://doi.org/10.1111/j.1467-8535.2011.01278.x
Nicol, D., Thomson, A., & Breslin, C. (2014). Rethinking feedback practices in higher education: A peer review perspective. Assessment & Evaluation in Higher Education, 39(1), 102–122. https://doi.org/10.1080/02602938.2013.795518
Odom, S.E., Glenn, B.L., Sanner, S., & Cannella, K.A. (2009). Group Peer Review as an Active Learning Strategy in a Research Course. The International Journal of Teaching and Learning in Higher Education, 21(1), 108–117.
Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223–231. https://doi.org/10.1002/j.2168-9830.2004.tb00809.x
Quinton, S., & Smallbone, T. (2010). Feeding forward: Using feedback to promote student reflection and learning – a teaching model. Innovations in Education and Teaching International, 47(1), 125–135. https://doi.org/10.1080/14703290903525911
Ryan, R. M., & Deci, E. L. (2000). Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemporary Educational Psychology, 25(1), 54–67. https://doi.org/10.1006/ceps.1999.1020
Topping, K. (1998). Peer assessment between students in colleges and universities. Review of Educational Research, 68(3), 249–276. https://doi.org/10.3102/00346543068003249
Vallerand, R., Pelletier, L., Blais, M., Brière, N., Senécal, C., & Vallieres, E. (1992). The academic motivation scale: A measure of intrinsic, extrinsic, and amotivation in education. Educational and Psychological Measurement, 52(4), 1003–1017. https://doi.org/10.1177/0013164492052004025
Williams, K., & Williams, C. (2011). Five key ingredients for improving motivation. Research in Higher Education Journal, 11. http://aabri.com/manuscripts/11834.pdf
Zeybek, N., & Saygı, E. (2023). Gamification in education: Why, where, when, and how? – A systematic review. Games and Culture, 19(2), 237–264. https://doi.org/10.1177/15554120231158625
About the authors
Freyja Thoroddsen Sigurdardottir is a lecturer at the Department of Business and Economics at Reykjavik University. Rannveig Sigurvinsdottir is an associate professor at the Department of Psychology at Reykjavik University. John Baird is an Educational Developer at Reykjavik University.
Corresponding Author: Freyja Thoroddsen Sigurdardottir, freyjas@ru.is