Abstract:
Mathematical models of disease progression predict disease outcomes and are useful epidemiological tools for planners and evaluators of health interventions. The R package gems is a tool that simulates disease progression in patients and predicts the effect of different interventions on patient outcome. Disease progression is represented by a series of events (e.g., diagnosis, treatment and death), displayed in a directed acyclic graph. The vertices correspond to disease states and the directed edges represent events. The package gems allows simulations based on a generalized multistate model that can be described by a directed acyclic graph with continuous transition-specific hazard functions. The user can specify an arbitrary hazard function and its parameters. The model includes parameter uncertainty, does not need to be a Markov model, and may take the history of previous events into account. Applications are not limited to the medical field and extend to other areas where multistate simulation is of interest. We provide a technical explanation of the multistate models used by gems, explain the functions of gems and their arguments, and show a sample application.
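The simulation approach described here can be illustrated with a short sketch: a directed acyclic graph whose edges carry transition-specific sojourn-time samplers, with the earliest sampled transition determining the next event (competing risks). This is a conceptual Python sketch only, not the gems R API; the three-state graph, the exponential and Weibull hazards and their parameters are illustrative assumptions, and the parameter uncertainty and history-dependent hazards that gems supports are omitted.

    import random

    # Illustrative three-state graph: diagnosis -> treatment -> death, with a direct
    # diagnosis -> death edge. Each sampler stands in for a transition-specific hazard.
    GRAPH = {
        "diagnosis": {
            "treatment": lambda: random.expovariate(1 / 2.0),    # assumed mean of 2 years
            "death": lambda: random.weibullvariate(10.0, 1.5),   # assumed Weibull hazard
        },
        "treatment": {
            "death": lambda: random.weibullvariate(15.0, 1.5),
        },
        "death": {},  # absorbing state
    }

    def simulate_patient(start="diagnosis"):
        """Walk the DAG: in each state, sample a time for every outgoing transition
        and follow the earliest one (competing risks)."""
        t, state, history = 0.0, start, [(start, 0.0)]
        while GRAPH[state]:
            times = {nxt: sampler() for nxt, sampler in GRAPH[state].items()}
            state = min(times, key=times.get)
            t += times[state]
            history.append((state, t))
        return history

    for _ in range(3):
        print(simulate_patient())  # e.g. [('diagnosis', 0.0), ('treatment', 1.3), ('death', 9.8)]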
Abstract:
BACKGROUND The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART with different monitoring strategies, and focused on POC-VL monitoring. METHODS We used a mathematical model to simulate cohorts of patients from start of ART until death. We modeled 13 strategies (no 2nd-line, clinical, CD4 (with or without targeted VL), POC-VL, and laboratory-based VL monitoring, with different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs). We calculated incremental cost-effectiveness ratios (ICERs). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. RESULTS Introducing 2nd-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-US$5488/DALY averted and that of VL monitoring US$951-US$5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except for the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between 1st- and 2nd-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. CONCLUSION Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of 2nd-line ART. Our Excel tool is useful for determining optimal monitoring strategies for specific settings, with specific sex- and age-distributions and unit costs.
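The ICERs reported above follow the usual definition: the difference in mean lifetime cost divided by the difference in DALYs averted between a strategy and its comparator. A minimal sketch of that calculation follows; the strategy names and the cost and DALY figures are hypothetical placeholders, not outputs of the model or the Excel tool.

    def icer(cost_new, cost_ref, dalys_averted_new, dalys_averted_ref):
        """Incremental cost-effectiveness ratio, in US$ per DALY averted."""
        return (cost_new - cost_ref) / (dalys_averted_new - dalys_averted_ref)

    # Hypothetical mean lifetime cost and DALYs averted per patient (placeholders)
    clinical = {"cost": 4000.0, "dalys_averted": 6.0}
    poc_vl   = {"cost": 5500.0, "dalys_averted": 7.2}

    print(icer(poc_vl["cost"], clinical["cost"],
               poc_vl["dalys_averted"], clinical["dalys_averted"]))  # 1250.0 US$/DALY averted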
Abstract:
Malawi adopted the Option B+ strategy in 2011. Its success in reducing mother-to-child transmission depends on coverage and timing of HIV testing. We assessed HIV status ascertainment and its predictors during pregnancy. HIV status ascertainment was 82.3% (95% confidence interval: 80.2 to 85.9) in the pre-Option B+ period and 85.7% (95% confidence interval: 83.4 to 88.0) in the Option B+ period. Higher HIV ascertainment was independently associated with older age, attending antenatal care more than once, and registration in 2010. The observed high variability of HIV ascertainment between sites (50.6%-97.7%) and over time suggests that HIV test kit shortages and insufficient numbers of staff posed major barriers to reducing mother-to-child transmission.
Abstract:
BACKGROUND Febrile neutropenia (FN) and other infectious complications are some of the most serious treatment-related toxicities of chemotherapy for cancer, with a mortality rate of 2% to 21%. The two main types of prophylactic regimens are granulocyte (macrophage) colony-stimulating factors (G(M)-CSF) and antibiotics, frequently quinolones or cotrimoxazole. Current guidelines recommend the use of colony-stimulating factors when the risk of febrile neutropenia is above 20%, but they do not mention the use of antibiotics. However, both regimens have been shown to reduce the incidence of infections. Since no systematic review had compared the two regimens, this review was undertaken. OBJECTIVES To compare the efficacy and safety of G(M)-CSF versus antibiotics in cancer patients receiving myelotoxic chemotherapy. SEARCH METHODS We searched The Cochrane Library, MEDLINE, EMBASE, databases of ongoing trials, and conference proceedings of the American Society of Clinical Oncology and the American Society of Hematology (1980 to December 2015). We planned to include both full-text and abstract publications. Two review authors independently screened search results. SELECTION CRITERIA We included randomised controlled trials (RCTs) comparing prophylaxis with G(M)-CSF versus antibiotics for the prevention of infection in cancer patients of all ages receiving chemotherapy. All study arms had to receive identical chemotherapy regimens and other supportive care. We included full-text publications, abstracts, and unpublished data if sufficient information on study design, participant characteristics, interventions and outcomes was available. We excluded cross-over trials, quasi-randomised trials and post-hoc retrospective trials. DATA COLLECTION AND ANALYSIS Two review authors independently screened the results of the search strategies, extracted data, assessed risk of bias, and analysed data according to standard Cochrane methods. We did the final interpretation together with an experienced clinician. MAIN RESULTS In this updated review, we included no new randomised controlled trials. We included two trials in the review, one with 40 breast cancer patients receiving high-dose chemotherapy and G-CSF compared to antibiotics, and a second evaluating 155 patients with small-cell lung cancer receiving GM-CSF or antibiotics. We judged the overall risk of bias as high in the G-CSF trial, as neither patients nor physicians were blinded and not all included patients were analysed as randomised (7 out of 40 patients). We considered the overall risk of bias in the GM-CSF trial to be moderate, because of the risk of performance bias (neither patients nor personnel were blinded), but low risk of selection and attrition bias. For the trial comparing G-CSF to antibiotics, all-cause mortality was not reported. There was no evidence of a difference for infection-related mortality, with zero events in each arm. Microbiologically or clinically documented infections, severe infections, quality of life, and adverse events were not reported. There was no evidence of a difference in frequency of febrile neutropenia (risk ratio (RR) 1.22; 95% confidence interval (CI) 0.53 to 2.84). The quality of the evidence for the two reported outcomes, infection-related mortality and frequency of febrile neutropenia, was very low, due to the low number of patients evaluated (high imprecision) and the high risk of bias. There was no evidence of a difference in terms of median survival time in the trial comparing GM-CSF and antibiotics.
Two-year survival rates were 6% (0 to 12%) in both arms (high imprecision, low quality of evidence). There were four toxic deaths in the GM-CSF arm and three in the antibiotics arm (3.8%), without evidence of a difference (RR 1.32; 95% CI 0.30 to 5.69; P = 0.71; low quality of evidence). Grade III or IV infections occurred in 28% of patients in the GM-CSF arm and 18% in the antibiotics arm, without any evidence of a difference (RR 1.55; 95% CI 0.86 to 2.80; P = 0.15, low quality of evidence). There were 5 episodes of grade IV infections out of 360 cycles in the GM-CSF arm and 3 episodes out of 334 cycles in the cotrimoxazole arm (0.8%), with no evidence of a difference (RR 1.55; 95% CI 0.37 to 6.42; P = 0.55; low quality of evidence). There was no significant difference between the two arms for non-haematological toxicities such as diarrhoea, stomatitis, infections, neurologic, respiratory, or cardiac adverse events. Grade III and IV thrombopenia occurred significantly more frequently in the GM-CSF arm (60.8%) compared to the antibiotics arm (28.9%) (RR 2.10; 95% CI 1.41 to 3.12; P = 0.0002; low quality of evidence). Neither infection-related mortality, incidence of febrile neutropenia, nor quality of life was reported in this trial. AUTHORS' CONCLUSIONS As we found only two small trials with 195 patients altogether, no conclusion for clinical practice is possible. More trials are necessary to assess the benefits and harms of G(M)-CSF compared to antibiotics for infection prevention in cancer patients receiving chemotherapy.
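The risk ratios and confidence intervals quoted in this review can be reproduced from 2x2 event counts with the standard log-normal approximation; the sketch below is generic, and the counts in the example call are placeholders rather than the trial data.

    import math

    def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
        """Risk ratio of arm A versus arm B with a 95% CI on the log scale.
        Undefined when either arm has zero events (as for infection-related
        mortality in the G-CSF trial), which then needs a different method."""
        rr = (events_a / n_a) / (events_b / n_b)
        se_log_rr = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
        lower = math.exp(math.log(rr) - z * se_log_rr)
        upper = math.exp(math.log(rr) + z * se_log_rr)
        return rr, lower, upper

    print(risk_ratio_ci(events_a=12, n_a=80, events_b=9, n_b=78))  # placeholder counts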
Abstract:
BACKGROUND AND AIMS Hepatitis C virus (HCV) infection is a leading cause of morbidity and mortality in people living with HIV. In many countries, access to direct-acting antiviral agents to treat HCV is restricted to individuals with advanced liver disease (METAVIR stage F3 or F4). Our goal was to estimate the long-term impact of deferring HCV treatment for men who have sex with men (MSM) who are coinfected with HIV and often have multiple risk factors for liver disease progression. METHODS We developed an individual-based model of liver disease progression in HIV/HCV-coinfected MSM. We estimated liver-related morbidity and mortality as well as the median time spent with replicating HCV infection when individuals were treated in liver fibrosis stages F0, F1, F2, F3 or F4 on the METAVIR scale. RESULTS The percentage of individuals who died of liver-related complications was 2% if treatment was initiated in F0 or F1. It increased to 3% if treatment was deferred until F2, 7% if it was deferred until F3 and 22% if deferred until F4. The median time individuals spent with replicating HCV increased from 5 years if treatment was initiated in F2 to almost 15 years if it was deferred until F4. CONCLUSIONS Deferring HCV therapy until advanced liver fibrosis is established could increase liver-related morbidity and mortality in HIV/HCV-coinfected individuals, and substantially prolong the time individuals spend with replicating HCV infection.
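A rough sketch of how such deferral thresholds can be compared in an individual-based model: each simulated person progresses through the METAVIR stages until the chosen treatment stage is reached, and time with replicating HCV and liver-related deaths are tallied. The sojourn times and the death probability below are placeholders, not the parameters or results of the published model.

    import random

    STAGES = ["F0", "F1", "F2", "F3", "F4"]

    def simulate_one(treat_stage, mean_years_per_stage=6.0, p_liver_death_if_f4=0.2):
        """Progress through fibrosis stages with exponential sojourn times until the
        treatment stage is reached (placeholder: liver death only possible at F4)."""
        years = 0.0
        for stage in STAGES:
            if stage == treat_stage:
                break
            years += random.expovariate(1.0 / mean_years_per_stage)
        liver_death = treat_stage == "F4" and random.random() < p_liver_death_if_f4
        return years, liver_death

    def compare(n=10_000):
        for treat_stage in STAGES:
            runs = [simulate_one(treat_stage) for _ in range(n)]
            median_years = sorted(t for t, _ in runs)[n // 2]
            deaths = sum(d for _, d in runs) / n
            print(f"treat at {treat_stage}: median {median_years:.1f} years with "
                  f"replicating HCV, {deaths:.1%} liver-related deaths")

    compare()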
Abstract:
BACKGROUND The number of patients in need of second-line antiretroviral drugs is increasing in sub-Saharan Africa. We aimed to project the need for second-line antiretroviral therapy in adults in sub-Saharan Africa up to 2030. METHODS We developed a simulation model for HIV and applied it to each sub-Saharan African country. We used the WHO country intelligence database to estimate the number of adult patients receiving antiretroviral therapy from 2005 to 2014. We fitted the number of adult patients receiving antiretroviral therapy to observed estimates, and predicted first-line and second-line needs between 2015 and 2030. We present results for sub-Saharan Africa and eight selected countries. We present 18 scenarios, combining the availability of viral load monitoring, speed of antiretroviral scale-up, and rates of retention and switching to second-line therapy. HIV transmission was not included. FINDINGS Depending on the scenario, 8.7-25.6 million people are expected to receive antiretroviral therapy in 2020, of whom 0.5-3.0 million will be receiving second-line antiretroviral therapy. The proportion of patients on treatment receiving second-line therapy was highest (15.6%) in the scenario with perfect retention and immediate switching, no further scale-up, and universal routine viral load monitoring. In 2030, the estimated number of patients receiving antiretroviral therapy will remain within a similar range, but the number receiving second-line antiretroviral therapy will increase to 0.8-4.6 million (6.6-19.6%). The need for second-line antiretroviral therapy was two to three times higher if routine viral load monitoring was implemented throughout the region, compared with a scenario of no further viral load monitoring scale-up. For each monitoring strategy, the future proportion of patients receiving second-line antiretroviral therapy differed only minimally between countries. INTERPRETATION Donors and countries in sub-Saharan Africa should prepare for a substantial increase in the need for second-line drugs during the next few years as access to viral load monitoring improves. An urgent need exists to decrease the costs of second-line drugs. FUNDING World Health Organization, Swiss National Science Foundation, National Institutes of Health.
Abstract:
Bovine tuberculosis (bTB) is a (re-)emerging disease in European countries, including Switzerland. This study assesses the seroprevalence of infection with Mycobacterium bovis and closely related agents in wild boar (Sus scrofa) in Switzerland, because wild boar are potential maintenance hosts of these pathogens. The study employs harmonised laboratory methods to facilitate comparison with the situation in other countries. Eighteen out of 743 blood samples tested seropositive by ELISA (2.4%, CI: 1.5-3.9%), and the results for 61 animals previously assessed using culture and PCR indicated that this serological test was not 100% specific for M. bovis, cross-reacting with M. microti. Nevertheless, serology appears to be an appropriate test methodology in the harmonisation of wild boar testing throughout Europe. In accordance with previous findings, the low seroprevalence found in wild boar suggests that wildlife is an unlikely source of the M. bovis infections recently detected in cattle in Switzerland. This finding contrasts with the epidemiological situation in southern Spain.
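The seroprevalence interval quoted above is a standard binomial proportion confidence interval for 18 positives out of 743 samples. A quick check with the Wilson score method is sketched below; the paper's exact interval method is not stated, so small differences from the reported 1.5-3.9% are expected.

    from math import sqrt

    def wilson_ci(k, n, z=1.96):
        """Wilson score interval for a binomial proportion k/n."""
        p = k / n
        centre = (p + z * z / (2 * n)) / (1 + z * z / n)
        half = (z / (1 + z * z / n)) * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
        return centre - half, centre + half

    lower, upper = wilson_ci(18, 743)
    print(f"{18 / 743:.1%} ({lower:.1%} to {upper:.1%})")  # about 2.4% (1.5% to 3.8%)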
Abstract:
BACKGROUND Tuberculosis (TB) is the leading cause of death in South Africa. The burden of disease varies by age, with peaks in TB notification rates in the HIV-negative population at ages 0-5, 20-24, and 45-49 years. There is little variation between age groups in the rates in the HIV-positive population. The drivers of this age pattern remain unknown. METHODS We developed an age-structured simulation model of Mycobacterium tuberculosis (Mtb) transmission in Cape Town, South Africa. We considered five states of TB progression: susceptible, infected (latent TB), active TB, treated TB, and treatment default. Latently infected individuals could be re-infected; a previous Mtb infection slowed progression to active disease. We further considered three states of HIV progression: HIV-negative, HIV-positive, and on antiretroviral therapy. To parameterize the model, we analysed treatment outcomes from the Cape Town electronic TB register and social mixing patterns from a Cape Town community, and used literature estimates for other parameters. To investigate the main drivers behind the age patterns, we conducted sensitivity analyses on all parameters related to the age structure. RESULTS The model replicated the age patterns in HIV-negative TB notification rates of Cape Town in 2009. The simulated TB notification rate in HIV-negative patients was 1000/100,000 person-years (pyrs) in children aged <5 years and decreased to 51/100,000 pyrs in children aged 5-15 years. The peak in early adulthood occurred at 25-29 years (463/100,000 pyrs). After a subsequent decline, simulated TB notification rates gradually increased from the age of 30 years. Sensitivity analyses showed that the dip after the early adult peak was due to the protective effect of latent TB and that retreatment TB was mainly responsible for the rise in TB notification rates from the age of 30 years. CONCLUSION The protective effect of a first latent infection on subsequent infections and the faster progression in previously treated patients are the key determinants of the age structure of TB notification rates in Cape Town.
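A minimal sketch of the state structure described in the METHODS: five TB states crossed with three HIV states for each simulated, age-tracked individual. The Python representation and the listed transitions are illustrative assumptions; the published model's rates, reinfection dynamics and social mixing structure are not reproduced here.

    from dataclasses import dataclass
    from enum import Enum, auto

    class TB(Enum):
        SUSCEPTIBLE = auto()
        LATENT = auto()       # infected (latent TB); reinfection possible
        ACTIVE = auto()
        TREATED = auto()
        DEFAULT = auto()      # treatment default

    class HIV(Enum):
        NEGATIVE = auto()
        POSITIVE = auto()
        ON_ART = auto()

    @dataclass
    class Person:
        age: float
        tb: TB = TB.SUSCEPTIBLE
        hiv: HIV = HIV.NEGATIVE
        previously_treated: bool = False  # previously treated patients progress faster

    # Illustrative allowed TB transitions (no rates attached)
    TB_TRANSITIONS = {
        TB.SUSCEPTIBLE: [TB.LATENT],
        TB.LATENT: [TB.ACTIVE],                # prior infection slows this progression
        TB.ACTIVE: [TB.TREATED, TB.DEFAULT],
        TB.TREATED: [TB.ACTIVE, TB.LATENT],    # retreatment TB via relapse or reinfection
        TB.DEFAULT: [TB.ACTIVE],
    }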
Abstract:
OBJECTIVE To estimate the cost-effectiveness of prevention of mother-to-child transmission (MTCT) of HIV with lifelong antiretroviral therapy (ART) for pregnant and breastfeeding women ('Option B+') compared with ART during pregnancy or breastfeeding only, unless clinically indicated ('Option B'). DESIGN Mathematical modelling study of first and second pregnancy, informed by data from the Malawi Option B+ programme. METHODS Individual-based simulation model. We simulated cohorts of 10 000 women and their infants during two subsequent pregnancies, including the breastfeeding period, with either Option B+ or B. We parameterized the model with data from the literature and by analysing programmatic data. We compared total costs of antenatal and postnatal care, and lifetime costs and disability-adjusted life-years of the infected infants between Option B+ and Option B. RESULTS During the first pregnancy, 15% of the infants born to HIV-infected mothers acquired the infection. With Option B+, 39% of the women were on ART at the beginning of the second pregnancy, compared with 18% with Option B. For second pregnancies, the rates of MTCT were 11.3% with Option B+ and 12.3% with Option B. The incremental cost-effectiveness ratio comparing the two options ranged between about US$ 500 and US$ 1300 per DALY averted. CONCLUSION Option B+ prevents more vertical transmissions of HIV than Option B, mainly because more women are already on ART at the beginning of the next pregnancy. Option B+ is a cost-effective strategy for the prevention of MTCT if the total future costs and lost lifetime of the infected infants are taken into account.
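To make the second-pregnancy comparison concrete, the reported transmission rates translate into roughly the following numbers per simulated cohort; this is back-of-the-envelope arithmetic from the figures above (it assumes all 10 000 women reach a second pregnancy, which the model does not necessarily), not an additional model output.

    # Rough arithmetic from the reported second-pregnancy MTCT rates
    cohort = 10_000
    infections_option_b_plus = cohort * 0.113   # about 1130 infants infected with Option B+
    infections_option_b = cohort * 0.123        # about 1230 infants infected with Option B
    print(infections_option_b - infections_option_b_plus)  # about 100 transmissions averted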