Purpose:
Empirical utility scores are now available for all health states related to cervical cancer screening and treatment. We estimated the impact of using these empirical scores, rather than utility scores taken from the literature, on the cost-effectiveness of cervical cancer screening.
Method:
We first reviewed the literature on cost-effectiveness analyses (CEAs) of cervical cancer screening published between 2003 and 2013, focusing on studies that used quality-adjusted life years (QALYs), and evaluated the differences in utility assumptions among the publications. For each CEA, and separately for the empirical data, we calculated the number of days lost due to loss of quality of life in the different health states by multiplying the assumed loss in utility for each state by its mean duration.
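A minimal sketch of this days-lost calculation, assuming days lost equal (1 − utility) × mean duration in days; the health states, utility scores, and durations below are hypothetical placeholders, not values taken from the reviewed CEAs or the empirical data:

```python
# Sketch of the days-lost calculation described above.
# All utility scores and durations are hypothetical placeholders.

def days_lost(utility: float, duration_days: float) -> float:
    """Days of full health lost: (1 - utility) times the time spent in the state."""
    return (1.0 - utility) * duration_days

# Hypothetical health states: (utility score, mean duration in days)
health_states = {
    "screening attendance": (0.99, 7),
    "treatment of precancerous lesion": (0.91, 90),
    "cervical cancer, early stage": (0.76, 365),
}

for state, (utility, duration) in health_states.items():
    print(f"{state}: {days_lost(utility, duration):.1f} days lost")
```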
We used the Microsimulation Screening Analysis (MISCAN) model to estimate the impact of these different utility scores on the cost-effectiveness (cost per QALY gained) of primary human papillomavirus (HPV) screening compared with no screening.
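The cost-effectiveness measure reported below is the standard incremental cost per QALY gained of screening versus no screening; a minimal sketch of that ratio, with placeholder totals rather than actual MISCAN output:

```python
# Sketch of the incremental cost-effectiveness ratio (cost per QALY gained)
# of screening versus no screening. All numbers are placeholders, not MISCAN output.

def cost_per_qaly_gained(cost_screening: float, cost_no_screening: float,
                         qaly_screening: float, qaly_no_screening: float) -> float:
    """Incremental cost divided by incremental QALYs of screening vs. no screening."""
    delta_cost = cost_screening - cost_no_screening
    delta_qaly = qaly_screening - qaly_no_screening
    return delta_cost / delta_qaly

# Placeholder discounted cohort-level totals
print(f"EUR {cost_per_qaly_gained(3_200_000, 1_200_000, 24_200, 24_000):,.0f} per QALY gained")
```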
Results:
The utility scores, and hence the number of days lost due to loss of quality of life, assumed for women in health states related to cervical cancer and its prevention are very heterogeneous across the CEAs, and also differ from the empirical data. These differences result in substantial variation in the cost-effectiveness of primary HPV screening compared with no screening, ranging from €8,035 to €13,518 per QALY gained (see Figure). When the empirical data were used, the cost-effectiveness of primary HPV screening was €11,839 per QALY gained.
Conclusion:
The assumed number of days lost as a consequence of loss in quality of life for different health states in CEAs of cervical cancer screening has a major effect on the estimated cost-effectiveness ratio of screening. Compared with the empirical data, most CEAs overestimated the number of QALYs gained by screening and therefore overestimated the cost-effectiveness of screening. Utility assessment in CEAs therefore needs to be based on good-quality data. Our analysis indicates that, for comparability, CEAs should include extensive sensitivity analyses on quality-of-life assumptions and should also report costs per life year gained.