Abstract
The most appropriate next step in depression treatment after the initial treatment fails is unclear. This study explores the suitability of the Markov decision process for optimizing sequential treatment decisions for depression. We conducted a formal comparison of a Markov decision process approach and mainstream state-transition models as used in health economic decision analysis to clarify differences in the model structure. We performed two reviews: the first to identify existing applications of the Markov decision process in the field of healthcare and the second to identify existing health economic models for depression. We then illustrated the application of a Markov decision process by reformulating an existing health economic model. This provided input for discussing the suitability of a Markov decision process for solving sequential treatment decisions in depression. The Markov decision process and state-transition models differed in terms of flexibility in modeling actions and rewards. In all, 23 applications of a Markov decision process within the context of somatic disease were included, 16 of which concerned sequential treatment decisions. Most existing health economic models relating to depression have a state-transition structure. The example application replicated the health economic model and, in addition, enabled dynamic comparisons of more interventions over time than was possible with traditional state-transition models. Markov decision processes have been successfully applied to address sequential treatment-decision problems, although the results have been published mostly in economics journals that are not related to healthcare. One advantage of a Markov decision process compared with state-transition models is that it allows an extended action space: the possibility of making dynamic comparisons of different treatments over time. Within the context of depression, although existing state-transition models are too basic to evaluate sequential treatment decisions, the assumptions of a Markov decision process could be satisfied. The Markov decision process could therefore serve as a powerful model for optimizing sequential treatment in depression. This would require a sufficiently elaborate state-transition model at the cohort or patient level.
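As a purely illustrative aside (not taken from the article), the sketch below shows what a Markov decision process for sequential treatment choice can look like in code: a few hypothetical health states, two hypothetical treatment actions, and value iteration to find the state-dependent optimal treatment. All state names, transition probabilities, and rewards are assumptions for illustration, not values from the study; the point is the extended action space, i.e., choosing a possibly different treatment in each state over time.

```python
# Minimal illustrative sketch (not from the paper): a Markov decision process
# for sequential treatment choice, solved with value iteration.
# All states, actions, transition probabilities, and rewards are hypothetical.

import numpy as np

states = ["depressed", "response", "remission"]      # hypothetical health states
actions = ["treatment_A", "treatment_B"]              # hypothetical treatments

# transition[a][s] = probability distribution over next states (made-up numbers)
transition = {
    "treatment_A": np.array([
        [0.60, 0.25, 0.15],   # from "depressed"
        [0.20, 0.55, 0.25],   # from "response"
        [0.10, 0.20, 0.70],   # from "remission"
    ]),
    "treatment_B": np.array([
        [0.50, 0.30, 0.20],
        [0.15, 0.55, 0.30],
        [0.05, 0.15, 0.80],
    ]),
}

# reward[a][s] = immediate reward per cycle, e.g. QALYs net of cost (made-up)
reward = {
    "treatment_A": np.array([0.40, 0.70, 0.90]),
    "treatment_B": np.array([0.35, 0.65, 0.88]),
}

def value_iteration(gamma=0.95, tol=1e-8):
    """Return the optimal value per state and the best treatment per state."""
    v = np.zeros(len(states))
    while True:
        # Q(s, a) = r(s, a) + gamma * sum_s' P(s' | s, a) * V(s')
        q = np.array([reward[a] + gamma * transition[a] @ v for a in actions])
        v_new = q.max(axis=0)
        if np.max(np.abs(v_new - v)) < tol:
            return v_new, [actions[i] for i in q.argmax(axis=0)]
        v = v_new

values, policy = value_iteration()
for s, val, a in zip(states, values, policy):
    print(f"{s:>10}: value = {val:.3f}, best next treatment = {a}")
```

The resulting policy assigns a (possibly different) treatment to each health state, which is the kind of dynamic, state-dependent comparison of interventions that a plain state-transition model with a fixed treatment sequence cannot express.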
Original language | English |
---|---|
Pages (from-to) | 1015–1032 |
Number of pages | 18 |
Journal | Pharmacoeconomics |
Volume | 40 |
Early online date | 14 Sep 2022 |
DOIs | |
Status | Published - Nov 2022 |