TY - JOUR
T1 - METhodological RadiomICs Score (METRICS): a quality scoring tool for radiomics research endorsed by EuSoMII
AU - Kocak, Burak
AU - Akinci D'Antonoli, Tugba
AU - Mercaldo, Nathaniel
AU - Alberich-Bayarri, Angel
AU - Baessler, Bettina
AU - Ambrosini, Ilaria
AU - Andreychenko, Anna E
AU - Bakas, Spyridon
AU - Beets-Tan, Regina G H
AU - Bressem, Keno
AU - Buvat, Irene
AU - Cannella, Roberto
AU - Cappellini, Luca Alessandro
AU - Cavallo, Armando Ugo
AU - Chepelev, Leonid L
AU - Chu, Linda Chi Hang
AU - Demircioglu, Aydin
AU - deSouza, Nandita M
AU - Dietzel, Matthias
AU - Fanni, Salvatore Claudio
AU - Fedorov, Andrey
AU - Fournier, Laure S
AU - Giannini, Valentina
AU - Girometti, Rossano
AU - Groot Lipman, Kevin B W
AU - Kalarakis, Georgios
AU - Kelly, Brendan S
AU - Klontzas, Michail E
AU - Koh, Dow-Mu
AU - Kotter, Elmar
AU - Lee, Ho Yun
AU - Maas, Mario
AU - Marti-Bonmati, Luis
AU - Müller, Henning
AU - Obuchowski, Nancy
AU - Orlhac, Fanny
AU - Papanikolaou, Nikolaos
AU - Petrash, Ekaterina
AU - Pfaehler, Elisabeth
AU - Pinto Dos Santos, Daniel
AU - Ponsiglione, Andrea
AU - Sabater, Sebastià
AU - Sardanelli, Francesco
AU - Seeböck, Philipp
AU - Sijtsema, Nanna M
AU - Stanzione, Arnaldo
AU - Traverso, Alberto
AU - Ugga, Lorenzo
AU - Vallières, Martin
AU - van Dijk, Lisanne V
AU - van Griethuysen, Joost J M
AU - van Hamersvelt, Robbert W
AU - van Ooijen, Peter
AU - Vernuccio, Federica
AU - Wang, Alan
AU - Williams, Stuart
AU - Witowski, Jan
AU - Zhang, Zhongyi
AU - Zwanenburg, Alex
AU - Cuocolo, Renato
N1 - © 2024. The Author(s).
PY - 2024
Y1 - 2024
AB - PURPOSE: To propose a new quality scoring tool, METhodological RadiomICs Score (METRICS), to assess and improve the research quality of radiomics studies. METHODS: We conducted an online modified Delphi study with a group of international experts. It was performed in three consecutive stages: Stage#1, item preparation; Stage#2, panel discussion among EuSoMII Auditing Group members to identify the items to be voted on; and Stage#3, four rounds of the modified Delphi exercise by panelists to determine the items eligible for METRICS and their weights. The consensus threshold was 75%. Based on the median ranks derived from expert panel opinion and their rank-sum-based conversion to importance scores, the category and item weights were calculated. RESULTS: In total, 59 panelists from 19 countries participated in the selection and ranking of the items and categories. The final METRICS tool included 30 items within 9 categories. According to their weights, the categories were, in descending order of importance: study design, imaging data, image processing and feature extraction, metrics and comparison, testing, feature processing, preparation for modeling, segmentation, and open science. A web application and a repository were developed to streamline the calculation of the METRICS score and to collect feedback from the radiomics community. CONCLUSION: In this work, we developed a scoring tool for assessing the methodological quality of radiomics research, with a large international panel and a modified Delphi protocol. With its conditional format covering methodological variations, it provides a well-constructed framework of key methodological concepts for assessing the quality of radiomics research papers. CRITICAL RELEVANCE STATEMENT: A quality assessment tool, METhodological RadiomICs Score (METRICS), is made available by a large group of international domain experts, with a transparent methodology, aiming at evaluating and improving research quality in radiomics and machine learning. KEY POINTS: • A methodological scoring tool, METRICS, was developed for assessing the quality of radiomics research, with a large international expert panel and a modified Delphi protocol. • The proposed scoring tool presents expert opinion-based importance weights of categories and items with a transparent methodology for the first time. • METRICS accounts for varying use cases, from handcrafted radiomics to entirely deep learning-based pipelines. • A web application has been developed to help with the calculation of the METRICS score (https://metricsscore.github.io/metrics/METRICS.html) and a repository created to collect feedback from the radiomics community (https://github.com/metricsscore/metrics).
U2 - 10.1186/s13244-023-01572-w
DO - 10.1186/s13244-023-01572-w
M3 - Article
C2 - 38228979
SN - 1869-4101
VL - 15
JO - Insights into Imaging
JF - Insights into Imaging
IS - 1
M1 - 8
ER -