TY - JOUR
T1 - A Systematic Review of Peer Assessment Design Elements
AU - Alqassab, Maryam
AU - Strijbos, Jan Willem
AU - Panadero, Ernesto
AU - Fernández Ruiz, Javier
AU - Warrens, Matthijs
AU - To, Jessica
N1 - Funding Information:
The authors would like to thank Nikolai Klitzing, Meike Faber, Fabian Kracher, Monique Messado, Elvetia Parker, Wai Shan Tong, and Alex Horton for their assistance in coding trails.
Funding Information:
The first author’s research was funded by the German Elite Network of Bavaria (Reference Number: K-GS-2012-209) during the initial conceptualization and data search stages (March 2014 – October 2016), and by the Spanish Ministry of Science and Innovation (Ministerio de Ciencia e Innovación) under the Juan de la Cierva Incorporación program (Reference number: IJC2020-043302-I) during the first and second revisions of this manuscript (April 2022 – August 2022). Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature.
Publisher Copyright:
© 2023, The Author(s).
PY - 2023/3
Y1 - 2023/3
N2 - The growing number of peer assessment studies in recent decades has created diverse design options for researchers and teachers to implement peer assessment. However, it is still unknown whether there are more commonly used peer assessment formats and design elements that could be considered when designing peer assessment activities in educational contexts. This systematic review aims to determine the diversity of peer assessment designs and practices in research studies. A literature search was performed in the electronic databases PsycINFO, PsycARTICLES, Web of Science Core Collection, Medline, ERIC, Academic Search Premier, and EconLit. Using data from 449 research studies (derived from 424 peer-reviewed articles), design differences were investigated for subject domains, assessment purposes, objects, outcomes, and moderators/mediators. Arts and humanities was the most frequent subject domain in the reviewed studies, and two-thirds of the studies had a formative purpose of assessment. The most frequently used object of assessment was written assessment, and beliefs and perceptions were the most investigated outcomes. Gender topped the list of the investigated moderators/mediators of peer assessment. Latent class analysis of 27 peer assessment design elements revealed a five-class solution reflecting latent patterns that best describe the variability in peer assessment designs (i.e., prototypical peer assessment designs). Only ten design elements significantly contributed to these patterns, with an associated effect size R2 ranging from .204 to .880, indicating that peer assessment designs in research studies are not as diverse as they theoretically can be.
KW - Instructional design
KW - Peer assessment
KW - Peer assessment diversity
KW - Systematic review
UR - http://www.scopus.com/inward/record.url?scp=85147183534&partnerID=8YFLogxK
U2 - 10.1007/s10648-023-09723-7
DO - 10.1007/s10648-023-09723-7
M3 - Article
AN - SCOPUS:85147183534
SN - 1040-726X
VL - 35
JO - Educational Psychology Review
JF - Educational Psychology Review
IS - 1
M1 - 18
ER -