ORAL ABSTRACTS: APPLICATIONS IN OUTCOMES RESEARCH AND POLICY
Purpose: Effectively treating tuberculosis (TB) requires administering drugs to which the infection is not resistant. Though costly, drug sensitivity testing (DST) of patients receiving first-line treatment can triage those with multi-drug-resistant (MDR) TB to appropriate but expensive treatment alternatives. In India, patients receive DST if they have not responded to four months of treatment, as measured by the imperfect but inexpensive sputum smear (SS) test. We seek to determine the optimal time to administer DST and the patterns of SS results that should prompt DST. If DST is administered too soon, many patients without MDR TB will be unnecessarily tested. If administered too late, patients with MDR TB may continue to transmit disease and experience declining health.
Method: We use a partially observable Markov decision process (POMDP) to determine the optimal timing and frequency of SS test information collection and DST testing in India. We estimate parameters such as patient response to treatment, patient dynamics while on treatment (the possibility of default or death), and discounted lifetime costs and health benefits using clinical studies and our previously published TB microsimulation model. We solve the POMDP using value iteration on a constrained feasible belief set.
Result: Current policy appears suboptimal given India's relatively high national estimates of MDR TB prevalence and transmission. For these estimates, DST should be administered to all patients at the outset of treatment. We project that this could save $7800 per TB patient in discounted net monetary benefits after accounting for averted downstream transmissions. However, in settings where the risk of transmission or MDR prevalence is much lower than the national average, a patient's SS result sequence can change the optimal DST timing, and individually tailored testing policies would be optimal. See Figure 1: national averages for India lie in the white region but districts with low MDR prevalence or transmission have different optimal policies that vary by patient SS outcomes.
Conclusion: India should revise the drug sensitivity testing protocol in its national first-line TB treatment program to provide DST during the first month of treatment in areas of average or high MDR TB prevalence and transmission, and may wish to consider individually tailored DST regimens in low-transmission, low-MDR-prevalence areas to reduce financial costs.
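The belief-state value iteration described in the Method can be sketched in miniature. The model below is a deliberately stripped-down two-action belief MDP (wait and observe one SS result, or administer DST and stop), with made-up likelihoods and costs rather than the abstract's calibrated parameters; it only illustrates how value iteration over a discretized belief grid yields a DST-timing threshold on P(MDR).

```python
import numpy as np

# Illustrative parameters only -- NOT the calibrated values from the model.
P_SSPOS_MDR = 0.8       # P(smear-positive | MDR TB on first-line drugs)
P_SSPOS_NON = 0.3       # P(smear-positive | non-MDR TB responding to drugs)
COST_DST = 50.0         # one-time cost of drug sensitivity testing
COST_MDR_MONTH = 400.0  # monthly cost of unrecognized MDR (illness + transmission)
GAMMA = 0.97            # monthly discount factor

beliefs = np.linspace(0.0, 1.0, 201)  # discretized belief grid over P(MDR)

def bayes_update(b, ss_positive):
    """Posterior P(MDR) after one sputum-smear result."""
    like_m = P_SSPOS_MDR if ss_positive else 1 - P_SSPOS_MDR
    like_n = P_SSPOS_NON if ss_positive else 1 - P_SSPOS_NON
    return b * like_m / (b * like_m + (1 - b) * like_n)

# Precompute observation probabilities and belief transitions on the grid.
p_pos = beliefs * P_SSPOS_MDR + (1 - beliefs) * P_SSPOS_NON
b_pos = np.array([bayes_update(b, True) for b in beliefs])
b_neg = np.array([bayes_update(b, False) for b in beliefs])

V = np.zeros_like(beliefs)
for _ in range(1000):  # value iteration until convergence
    # Action 1: administer DST now -- resolves uncertainty, episode ends.
    q_dst = np.full_like(beliefs, -COST_DST)
    # Action 2: wait a month, observe an SS result, update the belief.
    q_wait = (-COST_MDR_MONTH * beliefs
              + GAMMA * (p_pos * np.interp(b_pos, beliefs, V)
                         + (1 - p_pos) * np.interp(b_neg, beliefs, V)))
    V_new = np.maximum(q_dst, q_wait)
    if np.max(np.abs(V_new - V)) < 1e-6:
        V = V_new
        break
    V = V_new

policy = np.where(q_dst >= q_wait, "DST", "wait")
threshold = beliefs[np.argmax(policy == "DST")]
print(f"Administer DST once P(MDR) exceeds roughly {threshold:.2f}")
```

Under these toy numbers the optimal policy is a threshold rule on the belief: when the prior P(MDR) already exceeds the threshold, DST is optimal at the outset, consistent with the abstract's finding for India's national estimates. The full model adds treatment dynamics, default and death, and transmission, which can make the optimal timing depend on the observed SS sequence.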
Method: Databases (PubMed, Embase, the Cochrane Library, and Web of Science) were searched for RCTs investigating antiviral or interferon therapy in hepatitis B e antigen (HBeAg) positive and/or HBeAg negative patients with chronic hepatitis B (CHB), published in English before October 29, 2014. Network meta-analyses (NMA) were conducted to estimate pooled effectiveness and safety using the following outcomes: 1) efficacy for HBeAg positive patients: virologic response; alanine aminotransferase (ALT) normalization; HBeAg loss; HBeAg seroconversion; and HBsAg loss; 2) efficacy for HBeAg negative patients: virologic response and ALT normalization; and 3) safety: serious adverse events, any adverse events, and withdrawal due to adverse events.
Result: A total of 62 studies were selected for inclusion. In HBeAg positive patients, tenofovir was most effective in achieving virologic response (predicted probability: 86%) and was significantly better than adefovir, telbivudine, entecavir, pegylated interferon, and interferon. In terms of other efficacy outcomes (i.e., ALT normalization, HBeAg seroconversion, and HBsAg loss), tenofovir was not significantly better than the other treatments. In HBeAg negative patients, tenofovir was the most effective in achieving virologic response (98%) and was significantly better than adefovir, telbivudine, pegylated interferon, and interferon. Tenofovir was not significantly better than the other treatments for ALT normalization. There was no significant difference between tenofovir and other oral agents for safety outcomes.
Conclusion: Our study found that for HBeAg positive patients, tenofovir is the most effective treatment followed by entecavir, for the outcomes of virologic response and ALT normalization; for HBeAg loss and HBeAg seroconversion, pegylated interferon is the most effective treatment followed by tenofovir. For HBeAg negative patients, tenofovir is also the most effective treatment, followed by adefovir and entecavir, in terms of virologic response and ALT normalization. Current practice guidelines should be informed by cost-effectiveness and patient perspectives, in addition to evidence regarding effectiveness and safety.
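The inverse-variance pooling that underlies each pairwise comparison in a network meta-analysis can be illustrated as follows. The trial log odds ratios and standard errors below are hypothetical, not values from the review, and a real NMA additionally combines direct and indirect evidence across the whole treatment network.

```python
import numpy as np

# Hypothetical per-trial log odds ratios and standard errors for one
# pairwise comparison (e.g., tenofovir vs. a comparator on virologic
# response); values are made up for illustration.
log_or = np.array([0.9, 1.1, 0.7, 1.3])
se = np.array([0.30, 0.25, 0.40, 0.35])

w = 1.0 / se**2                              # inverse-variance weights
pooled = np.sum(w * log_or) / np.sum(w)      # fixed-effect pooled log OR
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(f"Pooled OR = {np.exp(pooled):.2f}, "
      f"95% CI ({np.exp(ci[0]):.2f}, {np.exp(ci[1]):.2f})")
```

This is the fixed-effect building block; the review's NMA would typically use a random-effects network model and derive predicted response probabilities and treatment rankings from it.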
Purpose: To assess attributable health and cost outcomes associated with community-acquired Clostridium difficile infection (CDI).
Method: We conducted a population-based matched cohort study. Between 01/01/2003 and 31/12/2010, we identified incident cases of community-acquired CDI (infected patients) defined as patients with the ICD-10-CA code A04.7 present during: an emergency department (ED) visit (principal/non-principal diagnosis, index date: ED registration date); a non-elective hospital admission (principal/non-principal diagnosis) with length of stay ≤2 days (index date: hospital admission date); or a non-elective hospital admission (principal diagnosis) with CDI symptoms (e.g., diarrhea) documented during a physician or ED visit within two weeks prior to the hospital admission date (index date: physician visit or ED registration date). We followed infected patients until 31/12/2011. Infected patients were matched 1:1 without replacement to uninfected subjects in the general population using propensity-score and hard matching on a set of baseline characteristics. Health outcomes included colectomy within 1-year post index date and all-cause mortality. Cost outcomes (from the healthcare payer perspective in 2012 Canadian dollars) included phase-specific costs, up-to-1-year costs unadjusted for survival, and up-to-3-year costs adjusted for survival.
Result: We identified 7,903 patients infected with community-acquired CDI. The crude mean annual incidence was 7.8 per 100,000. The mean age was 63.5 years (standard deviation=22.0) and 63% were female. The relative risk for undergoing a colectomy within 1-year post index date was 5.53 (95% confidence interval [CI], 3.30-9.27) and the relative risk for mortality within 1-year post index date was 1.58 (95%CI, 1.44-1.75). Infected patients had 1.3- to 5.3-fold higher mean costs versus uninfected subjects. The mean attributable cost (adjusted for survival) of an incident community-acquired CDI patient was $8,881 (95%CI: $7,951-$9,904) in the first year, $2,663 in the second year, and $2,480 in the third year. Mean attributable costs were generally higher among those diagnosed in 2010 (possibly due to a virulent strain), males, those aged ≥65 years, and those who died within 1-year after the index date.
Conclusion: Community-acquired CDI is associated with a substantial health and economic burden. CDI leads to a greater risk for colectomy and all-cause mortality, and higher short- and long-term healthcare costs. This is the first study to evaluate the costs of community-acquired CDI using a large population-based sample and to evaluate long-term costs of community-acquired CDI.
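The 1:1 matching without replacement described in the Method can be sketched as greedy nearest-neighbour matching on the propensity score. The scores below are synthetic and the 0.05 caliper is an assumption for illustration; in the study, scores came from a model of baseline characteristics and were combined with hard matching.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic propensity scores for illustration only.
ps_infected = rng.uniform(0.2, 0.8, size=50)     # CDI cases
ps_uninfected = rng.uniform(0.0, 1.0, size=500)  # candidate controls

def greedy_match(cases, controls, caliper=0.05):
    """1:1 greedy nearest-neighbour matching on the propensity score,
    without replacement, within a caliper."""
    available = np.ones(len(controls), dtype=bool)
    pairs = []
    for i, p in enumerate(cases):
        dist = np.abs(controls - p)
        dist[~available] = np.inf        # controls already used are excluded
        j = int(np.argmin(dist))
        if dist[j] <= caliper:           # accept only matches within caliper
            pairs.append((i, j))
            available[j] = False
    return pairs

pairs = greedy_match(ps_infected, ps_uninfected)
print(f"Matched {len(pairs)} of {len(ps_infected)} cases")
```

With a large control pool relative to the case count, nearly all cases find a within-caliper match; unmatched cases would be excluded from the cohort.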
Method: 4,173 individuals ≥5 years of age who presented at ambulatory centers in 2011-2012 for treatment of acute respiratory illness (≤7 days' duration) with cough or fever were included. Eligible enrollees provided nasal and pharyngeal swabs for real-time reverse-transcriptase polymerase chain reaction (RT-PCR) influenza testing, along with self-reported symptoms, personal characteristics, and self-reported influenza vaccination status. Classification and regression tree (CART) analysis was used to develop a series of prediction models, with sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and area under the curve (AUC) calculated for each.
Result: 645 enrollees <65 years and 60 enrollees ≥65 years had PCR-confirmed influenza. Antiviral medication was prescribed for 14% of those individuals. Among nine possible clinical features, CART selected the best combination (fever, cough, fatigue, shortness of breath, and household smoke), with a sensitivity of 81%, specificity of 52%, PPV of 24%, NPV of 94%, and AUC = 0.68. Limiting the sample to the 345 patients for whom antivirals are clearly recommended (i.e., individuals <65 years with a high-risk condition or individuals ≥65 years) and who presented for care ≤2 days from symptom onset, the presence of fever and cough resulted in a prediction algorithm with 86% sensitivity, 47% specificity, 27% PPV, 95% NPV, and AUC = 0.67.
Conclusion: Among outpatients ≥5 years of age, the CART-based recursive-partitioning algorithm estimated the probability of influenza with good sensitivity and high NPV, but low PPV, in an influenza season with low disease prevalence. After further testing in seasons with higher influenza prevalence, CART may be used to exclude many patients who do not need antivirals and to indicate who should be considered for confirmatory viral testing.
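A recursive-partitioning workflow like the one above can be sketched with scikit-learn's `DecisionTreeClassifier` on synthetic symptom data. The features, effect sizes, and tree settings below are illustrative assumptions, not the study's; the point is how sensitivity, specificity, PPV, NPV, and AUC fall out of the fitted tree.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(1)
n = 2000

# Synthetic 0/1 symptom indicators; influenza status is generated so that
# fever and cough are genuinely predictive (purely illustrative).
fever = rng.integers(0, 2, n)
cough = rng.integers(0, 2, n)
fatigue = rng.integers(0, 2, n)
X = np.column_stack([fever, cough, fatigue])
p_flu = 0.05 + 0.45 * fever + 0.25 * cough
y = rng.random(n) < p_flu

# Shallow tree, as in CART clinical-prediction work, to keep rules readable.
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(X, y)
pred = tree.predict(X)

tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
sens = tp / (tp + fn)
spec = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
auc = roc_auc_score(y, tree.predict_proba(X)[:, 1])
print(f"sensitivity={sens:.2f} specificity={spec:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f} AUC={auc:.2f}")
```

In practice, these metrics would be computed on held-out data (or via cross-validation) rather than on the training sample, and low disease prevalence depresses PPV even for a well-calibrated tree, as the abstract observes.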
Method: Retrospective cohort study of 1,586 adults with non-valvular AF or flutter seen in primary care settings of an integrated healthcare system between December 2012 and March 2014. Treatment recommendations were made by an Atrial Fibrillation Decision Support Tool (AFDST) based on projections for quality-adjusted life expectancy (QALE) calculated by a decision-analytic model that integrates patient-specific risk factors for stroke and hemorrhage and examines strategies of no antithrombotic therapy, aspirin, or oral anticoagulation.
Result: Current treatment was discordant from recommended treatment in 45% (326/725) of women and in 39% (338/860) of men (p = 0.02). Among the elderly (age ≥85), current treatment was discordant from recommended treatment in 35% (89/258), while treatment was discordant among 43% (575/1,328) of patients <85 years of age (p < 0.01). We further examined age categories in 5-year increments and found that discordant therapy was as high as 60-70% in those between the ages of 31 and 50. Among the 326 women with discordant treatment, 99% (322/326) of discordance was due to under-treatment and 1% (4/326) to overtreatment. Among the 338 men with discordant treatment, 81% (274/338) was due to under-treatment and 19% (64/338) to overtreatment. Among the 89 elderly patients with discordant treatment, 98% (87/89) of discordance was due to under-treatment and 2% (2/89) to overtreatment, whereas in those <85 years of age, 88% (509/575) was due to under-treatment and 12% (66/575) to overtreatment.
Conclusion: Women are still undertreated with antithrombotic therapy for AF. Somewhat surprisingly, compared with older patients, a larger proportion of patients <85 years of age are receiving treatment that is discordant from recommended therapy. Furthermore, in women and the elderly the major reason for discordant therapy is under-treatment, whereas in men and younger patients a larger proportion of discordance is due to overtreatment.
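The AFDST's decision-analytic model is not reproduced in the abstract. As an illustration of the kind of patient-specific stroke-risk input such a tool integrates, here is the widely used CHA2DS2-VASc score for non-valvular AF (the abstract does not state which risk score the AFDST uses).

```python
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 stroke_or_tia, vascular_disease):
    """CHA2DS2-VASc stroke-risk score for non-valvular AF.
    Shown as an illustration of patient-specific stroke-risk input;
    the AFDST's internal model is not public in this abstract."""
    score = 0
    score += 2 if age >= 75 else (1 if age >= 65 else 0)  # age: 2 pts ≥75, 1 pt 65-74
    score += 1 if female else 0                # sex category
    score += 1 if chf else 0                   # congestive heart failure
    score += 1 if hypertension else 0
    score += 1 if diabetes else 0
    score += 2 if stroke_or_tia else 0         # prior stroke/TIA: 2 pts
    score += 1 if vascular_disease else 0
    return score

# Example: an 80-year-old woman with hypertension and diabetes.
s = cha2ds2_vasc(age=80, female=True, chf=False, hypertension=True,
                 diabetes=True, stroke_or_tia=False, vascular_disease=False)
print(s)  # 2 (age) + 1 (sex) + 1 (HTN) + 1 (DM) = 5
```

A decision tool like the AFDST would combine such a stroke-risk estimate with a bleeding-risk estimate and convert both into projected QALE under each treatment strategy.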