376 results for Logistic Regression
Abstract:
Background: Vancomycin is a cornerstone antibiotic for the management of severe Gram-positive infections. However, high doses of vancomycin are associated with a risk of nephrotoxicity. This study aimed to evaluate the relationship between the evolution of vancomycin trough concentration and the occurrence of nephrotoxicity, and to identify risk factors for both vancomycin-associated nephrotoxicity and vancomycin overexposure. Methods: A total of 1240 patient records from our hospital therapeutic drug monitoring database between 2007 and 2011 were screened and grouped according to predefined criteria defining vancomycin overexposure (one or more occurrences of a trough level ≥ 20 mg/L) and treatment-related nephrotoxicity (rise of serum creatinine by ≥ 50% over baseline). A representative sample of 150 cases was selected for in-depth analysis. Weighted logistic regression analyses were used to test associations between vancomycin overexposure, nephrotoxicity, and other predictors of interest. Results: Patients with high trough concentrations were more likely to develop nephrotoxicity (odds ratio: 4.12; p < 0.001). Specific risk factors, notably concomitant nephrotoxic treatments and comorbid conditions (heart failure), independently increased the risk of either nephrotoxicity or vancomycin overexposure. Finally, the exploration of temporal relationships between variations of vancomycin trough concentrations and creatinine levels was in line with circular causality, with some antecedence of vancomycin on creatinine changes. Conclusion: Our results confirm the important nephrotoxic potential of vancomycin and indicate that the use of this drug warrants thorough individualization in conditions likely to increase its exposure, together with reactive adjustment based on therapeutic drug monitoring.
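The weighted analysis described above relies on the 150-case in-depth sample standing in for the full screened database; one common way to achieve this is inverse-probability sampling weights. A minimal sketch, in which the group sizes and sampling scheme are illustrative assumptions, not figures from the study:

```python
# Inverse-probability sampling weights for a case sample drawn from a
# screened database: each sampled record is weighted by N_group / n_sampled,
# so the analysis sample represents the full screened population.
# Group sizes below are illustrative, not the study's actual counts.

def sampling_weights(group_totals, group_sampled):
    """Return weight per group = screened count / sampled count."""
    return {g: group_totals[g] / group_sampled[g] for g in group_totals}

weights = sampling_weights(
    group_totals={"overexposed": 240, "not_overexposed": 1000},  # screened (illustrative)
    group_sampled={"overexposed": 75, "not_overexposed": 75},    # drawn into the sample
)
# Each sampled "overexposed" record then stands for 240/75 = 3.2 screened records,
# and these weights are passed to a weighted logistic regression fit.
```

These weights would then be supplied to the regression routine (e.g. as frequency or probability weights) so that coefficient estimates refer to the screened population rather than the sample.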
Abstract:
BACKGROUND: Oral contraceptives are known to reduce the incidence rate of endometrial cancer, but it is uncertain how long this effect lasts after use ceases, or whether it is modified by other factors. METHODS: Individual participant datasets were sought from principal investigators and provided centrally for 27 276 women with endometrial cancer (cases) and 115 743 without endometrial cancer (controls) from 36 epidemiological studies. The relative risks (RRs) of endometrial cancer associated with oral contraceptive use were estimated using logistic regression, stratified by study, age, parity, body-mass index, smoking, and use of menopausal hormone therapy. FINDINGS: The median age of cases was 63 years (IQR 57-68) and the median year of cancer diagnosis was 2001 (IQR 1994-2005). 9459 (35%) of 27 276 cases and 45 625 (39%) of 115 743 controls had ever used oral contraceptives, for median durations of 3·0 years (IQR 1-7) and 4·4 years (IQR 2-9), respectively. The longer that women had used oral contraceptives, the greater the reduction in risk of endometrial cancer; every 5 years of use was associated with a risk ratio of 0·76 (95% CI 0·73-0·78; p<0·0001). This reduction in risk persisted for more than 30 years after oral contraceptive use had ceased, with no apparent decrease between the RRs for use during the 1960s, 1970s, and 1980s, despite higher oestrogen doses in pills used in the early years. However, the reduction in risk associated with ever having used oral contraceptives differed by tumour type, being stronger for carcinomas (RR 0·69, 95% CI 0·66-0·71) than sarcomas (0·83, 0·67-1·04; case-case comparison: p=0·02). In high-income countries, 10 years' use of oral contraceptives was estimated to reduce the absolute risk of endometrial cancer arising before age 75 years from 2·3 to 1·3 per 100 women. INTERPRETATION: Use of oral contraceptives confers long-term protection against endometrial cancer.
These results suggest that, in developed countries, about 400 000 cases of endometrial cancer before the age of 75 years have been prevented over the past 50 years (1965-2014) by oral contraceptives, including 200 000 in the past decade (2005-14). FUNDING: Medical Research Council, Cancer Research UK.
Abstract:
BACKGROUND: Lack of donor organs remains a major obstacle in organ transplantation. Our aim was to evaluate (1) the association between engaging in high-risk recreational activities and attitudes toward organ donation and (2) the degree of reciprocity between organ acceptance and donation willingness in young men. METHODS: A 17-item, close-ended survey was offered to male conscripts ages 18 to 26 years in all Swiss military conscription centers. Predictors of organ donation attitudes were assessed in bivariate analyses and multiple logistic regression. Reciprocity of the intentions to accept and to donate organs was assessed by means of donor card status. RESULTS: In the 1559 responses analyzed, neither motorcycling nor practicing extreme sports reached a significant association with donor card holder status. Family communication about organ donation, a student or academic profession, and living in a Latin linguistic region were predictors of positive organ donation attitudes, whereas residence in a German-speaking region and practicing any religion predicted reluctance. Significantly more respondents were willing to accept than to donate organs, especially among those without family communication concerning organ donation. CONCLUSIONS: For the first time, it was shown that high-risk recreational activities do not influence organ donation attitudes. Second, a considerable discrepancy in organ donation reciprocity was identified. We propose that increasing this reciprocity could eventually increase organ donation rates.
Abstract:
BACKGROUND: Artemisinin-resistant Plasmodium falciparum has emerged in the Greater Mekong sub-region and poses a major global public health threat. Slow parasite clearance is a key clinical manifestation of reduced susceptibility to artemisinin. This study was designed to establish baseline clearance values in patients from Sub-Saharan African countries with uncomplicated malaria treated with artemisinin-based combination therapies (ACTs). METHODS: A literature review in PubMed was conducted in March 2013 to identify all prospective clinical trials (uncontrolled trials, controlled trials and randomized controlled trials) of ACTs conducted in Sub-Saharan Africa between 1960 and 2012. Individual patient data from these studies were shared with the WorldWide Antimalarial Resistance Network (WWARN) and pooled using an a priori statistical analytical plan. Factors affecting early parasitological response were investigated using logistic regression with study sites fitted as a random effect. The risk of bias in included studies was evaluated based on study design, methodology and missing data. RESULTS: In total, 29,493 patients from 84 clinical trials were included in the analysis, treated with artemether-lumefantrine (n = 13,664), artesunate-amodiaquine (n = 11,337) and dihydroartemisinin-piperaquine (n = 4,492). Overall parasite clearance was rapid. The parasite positivity rate (PPR) decreased from 59.7 % (95 % CI: 54.5-64.9) on day 1 to 6.7 % (95 % CI: 4.8-8.7) on day 2 and 0.9 % (95 % CI: 0.5-1.2) on day 3. The 95th percentile of observed day 3 PPR was 5.3 %.
Independent risk factors predictive of day 3 positivity were: high baseline parasitaemia (adjusted odds ratio (AOR) = 1.16 (95 % CI: 1.08-1.25) per 2-fold increase in parasite density, P < 0.001); fever (>37.5 °C) (AOR = 1.50 (95 % CI: 1.06-2.13), P = 0.022); severe anaemia (AOR = 2.04 (95 % CI: 1.21-3.44), P = 0.008); low/moderate transmission settings (AOR = 2.71 (95 % CI: 1.38-5.36), P = 0.004); and treatment with the loose formulation of artesunate-amodiaquine (AOR = 2.27 (95 % CI: 1.14-4.51), P = 0.020, compared with dihydroartemisinin-piperaquine). CONCLUSIONS: The three ACTs assessed in this analysis continue to achieve rapid early parasitological clearance across the sites assessed in Sub-Saharan Africa. A threshold of 5 % day 3 parasite positivity from a minimum sample size of 50 patients provides a more sensitive benchmark in Sub-Saharan Africa than the currently recommended threshold of 10 % for triggering further investigation of artemisinin susceptibility.
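A "per 2-fold increase" odds ratio like the one above typically arises from entering parasite density on a log2 scale, so the logistic coefficient exponentiates to an odds ratio per doubling. A minimal sketch of that back-transformation (the coefficient value is chosen only to reproduce the reported AOR, not taken from the study's model output):

```python
import math

# When log2(parasite density) is a covariate in a logistic model, its
# coefficient beta exponentiates to the odds ratio per 2-fold increase.
def or_per_doubling(beta):
    return math.exp(beta)

beta = math.log(1.16)  # illustrative: a coefficient that yields an AOR of 1.16
print(round(or_per_doubling(beta), 2))  # → 1.16
```

The same logic applies to any log-transformed covariate: a unit change on the log-base-b scale corresponds to a b-fold change on the original scale.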
Abstract:
BACKGROUND: Low vitamin D status has been associated with an increased risk of developing type 2 diabetes and insulin resistance (IR), although this has been recently questioned. OBJECTIVE: We examined the association between serum vitamin D metabolites and incident IR. METHODS: This was a prospective, population-based study derived from the CoLaus (Cohorte Lausannoise) study including 3856 participants (aged 51.2 ± 10.4 y; 2217 women) free from diabetes or IR at baseline. IR was defined as a homeostasis model assessment (HOMA) index >2.6. Fasting plasma insulin and glucose were measured at baseline and at follow-up to calculate the HOMA index. The association of vitamin D metabolites with incident IR was analyzed by logistic regression, and the results were expressed for each independent variable as ORs and 95% CIs. RESULTS: During the 5.5-y follow-up, 649 (16.9%) incident cases of IR were identified. Participants who developed IR had lower baseline serum concentrations of 25-hydroxyvitamin D3 [25(OH)D3 (25-hydroxycholecalciferol); 45.9 ± 22.8 vs. 49.9 ± 22.6 nmol/L; P < 0.001], total 25(OH)D3 (25(OH)D3 + 3-epi-25-hydroxyvitamin D3 [3-epi-25(OH)D3]; 49.1 ± 24.3 vs. 53.3 ± 24.1 nmol/L; P < 0.001), and 3-epi-25(OH)D3 (4.2 ± 2.9 vs. 4.3 ± 2.5 nmol/L; P = 0.01) but a higher 3-epi- to total 25(OH)D3 ratio (0.09 ± 0.05 vs. 0.08 ± 0.04; P = 0.007). Multivariable analysis adjusting for month of sampling, age, and sex showed an inverse association between 25(OH)D3 and the likelihood of developing IR [ORs (95% CIs): 0.86 (0.68, 1.09), 0.60 (0.46, 0.78), and 0.57 (0.43, 0.75) for the second, third, and fourth quartiles compared with the first 25(OH)D3 quartile; P-trend < 0.001]. Similar associations were found between total 25(OH)D3 and incident IR. There was no significant association between 3-epi-25(OH)D3 and IR, yet a positive association was observed between the 3-epi- to total 25(OH)D3 ratio and incident IR.
Further adjustment for body mass index, sedentary status, and smoking attenuated the association between 25(OH)D3, total 25(OH)D3, and the 3-epi- to total 25(OH)D3 ratio and the likelihood of developing IR. CONCLUSION: In the CoLaus study in healthy adults, the risk of incident IR is not associated with serum concentrations of 25(OH)D3 and total 25(OH)D3.
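The IR definition above rests on the homeostasis model assessment. A minimal sketch, assuming the standard HOMA-IR formula (fasting insulin in µU/mL times fasting glucose in mmol/L, divided by 22.5 — a well-known convention, not stated in the abstract) together with the study's >2.6 cut-off:

```python
def homa_ir(fasting_insulin_uU_per_mL, fasting_glucose_mmol_per_L):
    """Standard HOMA-IR: insulin (µU/mL) x glucose (mmol/L) / 22.5."""
    return fasting_insulin_uU_per_mL * fasting_glucose_mmol_per_L / 22.5

def incident_ir(homa, threshold=2.6):
    """Classify insulin resistance with the study's HOMA > 2.6 cut-off."""
    return homa > threshold

# Illustrative values, not study data:
h = homa_ir(12.0, 5.5)  # = 12 * 5.5 / 22.5 ≈ 2.93
print(round(h, 2), incident_ir(h))  # → 2.93 True
```

The resulting binary indicator (incident IR yes/no) is what serves as the outcome in the logistic regression described in the abstract.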
Abstract:
BACKGROUND: Several studies have observed associations of various aspects of diet with mental health, but little is known about the relationship between following the 5-a-day recommendation for fruit and vegetable consumption and mental health. Thus, we examined the associations of the Swiss daily recommended fruit and vegetable intake with psychological distress. METHODS: Data from 20,220 individuals aged 15+ years from the 2012 Swiss Health Survey were analyzed. The recommended portions of fruit and vegetables per day were defined as 5-a-day (at least 2 portions of fruit and 3 of vegetables). The outcome was perceived psychological distress over the previous 4 weeks (measured by the 5-item mental health index [MHI-5]). High distress (MHI-5 score ≤ 52), moderate distress (MHI-5 > 52 and ≤ 72) and low distress (MHI-5 > 72 and ≤ 100) were differentiated, and multinomial logistic regression analyses adjusted for known confounding factors were performed. RESULTS: The 5-a-day recommendation was met by 11.6 % of the participants with low distress, 9.3 % of those with moderate distress, and 6.2 % of those with high distress. Consumers fulfilling the 5-a-day recommendation had lower odds of being highly or moderately distressed than individuals consuming less fruit and vegetables (moderate vs. low distress: OR = 0.82, 95 % confidence interval [CI] 0.69-0.97; high vs. low distress: OR = 0.55, 95 % CI 0.41-0.75). CONCLUSIONS: Daily intake of 5 servings of fruit and vegetables was associated with lower psychological distress. Longitudinal studies are needed to further determine the causal nature of this relationship.
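The three-level outcome used in the multinomial model above follows directly from the MHI-5 cut-offs stated in the abstract (assuming the transformed 0-100 MHI-5 scale):

```python
def distress_category(mhi5_score):
    """Map an MHI-5 score (0-100 scale) to the distress bands used in the study:
    <= 52 high, > 52 to <= 72 moderate, > 72 to <= 100 low distress."""
    if not 0 <= mhi5_score <= 100:
        raise ValueError("MHI-5 scores range from 0 to 100")
    if mhi5_score <= 52:
        return "high"
    if mhi5_score <= 72:
        return "moderate"
    return "low"

print(distress_category(50), distress_category(60), distress_category(90))
# → high moderate low
```

This categorical variable is the dependent variable of the multinomial logistic regression, with low distress as the implied reference category for the reported ORs.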
Abstract:
BACKGROUND: Anti-cancer treatment and the cancer population have evolved since the last European Organisation for Research and Treatment of Cancer (EORTC) fungemia survey, and there are few recent large epidemiological studies. METHODS: This was a prospective cohort study including 145 030 admissions of patients with cancer from 13 EORTC centers. Incidence, clinical characteristics, and outcome of fungemia were analyzed. RESULTS: Fungemia occurred in 333 (0.23%; 95% confidence interval [CI], .21-.26) patients, ranging from 0.15% in patients with solid tumors to 1.55% in hematopoietic stem cell transplantation recipients. In 297 evaluable patients, age ranged from 17 to 88 years (median 56 years), 144 (48%) patients were female, 165 (56%) had solid tumors, and 140 (47%) had hematological malignancies. Fungemia, including polymicrobial infection, was due to Candida spp. in 267 (90%) patients: C. albicans in 128 (48%) and other Candida spp. in 145 (54%). Favorable overall response was achieved in 113 (46.5%) patients by week 2. After 4 weeks, the survival rate was 64% (95% CI, 59%-70%) and was not significantly different between Candida spp. Multivariable logistic regression identified baseline septic shock (odds ratio [OR], 3.04; 95% CI, 1.22-7.58) and tachypnoea (OR, 2.95; 95% CI, 1.66-5.24) as poor prognostic factors, while antifungal prophylaxis prior to fungemia (OR, 0.20; 95% CI, .06-.62) and remission of underlying cancer (OR, 0.18; 95% CI, .06-.50) were protective. CONCLUSIONS: Fungemia, mostly due to Candida spp., was rare in cancer patients from EORTC centers but was associated with substantial mortality. Antifungal prophylaxis and remission of cancer predicted better survival.
Abstract:
BACKGROUND: According to the gateway hypothesis, tobacco use is a gateway to cannabis use. However, there is increasing evidence that cannabis use also predicts the progression of tobacco use (the reverse gateway hypothesis). Unfortunately, the importance of cannabis use compared with other predictors of tobacco use is less clear. The aim of this study was to examine which variables, in addition to cannabis use, best predict the onset of daily cigarette smoking in young men. METHODS: A total of 5,590 young Swiss men (mean age = 19.4 years, SD = 1.2) provided data on their substance use, socio-demographic background, religion, health, social context, and personality at baseline and after 18 months. We modelled the predictors of progression to daily cigarette smoking using logistic regression analyses (n = 4,230). RESULTS: In the multivariate overall model, use of cannabis remained among the strongest predictors of the onset of daily cigarette use. Daily cigarette use was also predicted by a lifetime use of at least 50 cigarettes, occasional cigarette use, educational level, religious affiliation, parental situation, peers with psychiatric problems, and sociability. CONCLUSIONS: Our results highlight the relevance of cannabis use compared with other potential predictors of the progression of tobacco use and thereby support the reverse gateway hypothesis.
Abstract:
The study aimed at assessing the link between physical activity (PA), sports activity and snus use among young men in Switzerland. Data from the Cohort Study on Substance Use Risk Factors (C-SURF) were used to measure PA with the International Physical Activity Questionnaire and sports activity with a single item. Multivariate logistic regression analysis was conducted to measure the association between snus use, PA and sports activity. Similar models were run for smoking and snuff use. In separate models, snus use showed a dose-response association with PA (high level: OR = 1.72; 95% CI 1.16-2.55) and with exercise frequency, both for individuals exercising once a week or more often (OR = 1.65; 95% CI 1.26-2.16; p < 0.001) and for those exercising almost every day (OR = 2.27; 95% CI 1.72-3.01; p < 0.001). When entered simultaneously, only sports and exercise maintained a basically unchanged significant dose-response relationship, whereas PA became non-significant. No significant dose-response relation was found for cigarette smoking or snuff use, indicating that the association with sport is specific to snus rather than to tobacco use in general or smokeless tobacco in particular. This study showed that the association between snus use and sports is not specific to Nordic countries.
Abstract:
BACKGROUND: Endovascular treatment for acute ischemic stroke patients was recently shown to improve recanalization rates and clinical outcome in a well-defined study population. Intravenous thrombolysis (IVT) alone fails to recanalize the occlusion in certain patients and is of little value in others. Accordingly, we aimed at identifying predictors of recanalization in patients treated with or without IVT. METHODS: In the observational Acute Stroke Registry and Analysis of Lausanne (ASTRAL) registry, we selected those stroke patients (1) with an arterial occlusion on computed tomography angiography (CTA) imaging, (2) who had an arterial patency assessment at 24 hours (CTA/magnetic resonance angiography/transcranial Doppler), and (3) who were treated with IVT or had no revascularization treatment. Based on 2 separate logistic regression analyses, predictors of spontaneous and post-thrombolytic recanalization were generated. RESULTS: Partial or complete recanalization was achieved in 121 of 210 (58%) thrombolyzed patients. Recanalization was associated with atrial fibrillation (odds ratio, 1.6; 95% confidence interval, 1.2-3.0) and absence of early ischemic changes on CT (1.1, 1.1-1.2) and inversely correlated with the presence of a significant extracranial (EC) stenosis or occlusion (.6, .3-.9). In nonthrombolyzed patients, partial or complete recanalization was significantly less frequent (37%, P < .01). Recanalization was independently associated with a history of hypercholesterolemia (2.6, 1.2-5.6) and a proximal site of the intracranial occlusion (2.5, 1.2-5.4), and inversely correlated with a decreased level of consciousness (.3, .1-.8), EC pathology (.3, .1-.6), and basilar artery pathology (.1, .0-.6). CONCLUSIONS: Various clinical findings, cardiovascular risk factors, and arterial pathology on acute CTA-based imaging are moderately associated with spontaneous and post-thrombolytic arterial recanalization at 24 hours.
If confirmed in other studies, this information may influence patient selection toward the most appropriate revascularization strategy.
Abstract:
BACKGROUND: Due to the underlying diseases and the need for immunosuppression, patients after lung transplantation are particularly at risk for gastrointestinal (GI) complications that may negatively influence long-term outcome. The present study assessed the incidence and impact of GI complications after lung transplantation and aimed to identify risk factors. METHODS: A retrospective analysis was performed of all 227 consecutive single- and double-lung transplantations carried out at the University Hospitals of Lausanne and Geneva between January 1993 and December 2010. Logistic regressions were used to test the effect of potentially influencing variables on the binary outcomes of overall, severe, and surgery-requiring complications, followed by a multiple logistic regression model. RESULTS: The final analysis included 205 patients; 22 patients were excluded due to re-transplantation, multiorgan transplantation, or incomplete datasets. GI complications were observed in 127 patients (62 %). Gastro-esophageal reflux disease was the most commonly observed complication (22.9 %), followed by inflammatory or infectious colitis (20.5 %) and gastroparesis (10.7 %). Major GI complications (Dindo/Clavien III-V) were observed in 83 (40.5 %) patients and were fatal in 4 patients (2.0 %). Multivariate analysis identified double-lung transplantation (p = 0.012) and the early (1993-1998) transplantation period (p = 0.008) as independent risk factors for developing major GI complications. Forty-three (21 %) patients required surgery, such as colectomy, cholecystectomy, and fundoplication in 6.8, 6.3, and 3.9 % of the patients, respectively. Multivariate analysis identified a Charlson comorbidity index of ≥3 as an independent risk factor for developing GI complications requiring surgery (p = 0.015). CONCLUSION: GI complications after lung transplantation are common. Outcome was rather encouraging in the setting of our transplant center.
Abstract:
AIM: This study quantified the impact of perinatal predictors and medical centre on the outcome of very low-gestational-age neonates (VLGANs) born at <32 completed weeks in Switzerland. METHODS: Using prospectively collected data from a 10-year cohort of VLGANs, we developed logistic regression models for three different time points: delivery, NICU admission and seven days of age. The models predicted survival to discharge without severe neonatal morbidity, such as major brain injury, moderate or severe bronchopulmonary dysplasia, retinopathy of prematurity (≥stage three) or necrotising enterocolitis (≥stage three). RESULTS: From 2002 to 2011, 6892 VLGANs were identified: 5854 (85%) of the live-born infants survived, and 84% of the survivors did not have severe neonatal complications. Predictors of adverse outcome at delivery and on NICU admission were low gestational age, low birthweight, male sex, multiple birth, birth defects and lack of antenatal corticosteroids. Proven sepsis was an additional risk factor on day seven of life. The medical centre remained a statistically significant factor at all three time points after adjusting for perinatal predictors. CONCLUSION: After adjusting for perinatal factors, the survival of Swiss VLGANs without severe neonatal morbidity was strongly influenced by the medical centre that treated them.
Abstract:
Children with Wiskott-Aldrich syndrome (WAS) are often first diagnosed with immune thrombocytopenia (ITP), potentially leading to both inappropriate treatment and the delay of life-saving definitive therapy. WAS is traditionally differentiated from ITP based on the small size of WAS platelets. In practice, microthrombocytopenia is often not present or not appreciated in children with WAS. To develop an alternative method of differentiating WAS from ITP, we retrospectively reviewed all complete blood counts and measurements of immature platelet fraction (IPF) in 18 subjects with WAS and 38 subjects with a diagnosis of ITP treated at our hospital. Examination of peripheral blood smears revealed a wide range of platelet sizes in subjects with WAS. Mean platelet volume (MPV) was not reported in 26% of subjects, and subjects in whom MPV was not reported had lower platelet counts than subjects in whom MPV was reported. Subjects with WAS had a lower IPF than would be expected for their level of thrombocytopenia, and the IPF in subjects with WAS was significantly lower than in subjects with a diagnosis of ITP. Using logistic regression, we developed and validated a rule based on platelet count and IPF that was more sensitive for the diagnosis of WAS than the MPV, and was applicable regardless of the level of platelets or the availability of the MPV. Our observations demonstrate that MPV is often not available in severely thrombocytopenic subjects, which may hinder the diagnosis of WAS. In addition, subjects with WAS have a low IPF, which is consistent with the notion that a platelet production defect contributes to the thrombocytopenia of WAS. Knowledge of this detail of WAS pathophysiology makes it possible to differentiate WAS from ITP with increased sensitivity, thereby allowing a physician to spare children with WAS from inappropriate treatment and to make definitive therapy available in a timely manner.
Abstract:
BACKGROUND: Gemcitabine plus cisplatin (GC) has been adopted as a neoadjuvant regimen for muscle-invasive bladder cancer despite the lack of Level I evidence in this setting. METHODS: Data were collected using an electronic data-capture platform from 28 international centers. Eligible patients had clinical T-classification 2 (cT2) through cT4aN0M0 urothelial cancer of the bladder and received neoadjuvant GC or methotrexate, vinblastine, doxorubicin, plus cisplatin (MVAC) before undergoing cystectomy. Logistic regression was used to compute propensity scores as the predicted probabilities of patients being assigned to MVAC versus GC given their baseline characteristics. These propensity scores were then included in a new logistic regression model to estimate an adjusted odds ratio comparing the odds of attaining a pathologic complete response (pCR) between patients who received MVAC and those who received GC. RESULTS: In total, 212 patients (146 patients in the GC cohort and 66 patients in the MVAC cohort) met criteria for inclusion in the analysis. The majority of patients in the MVAC cohort (77%) received dose-dense MVAC. The median age of patients was 63 years, they were predominantly men (74%), and they received a median of 3 cycles of neoadjuvant chemotherapy. The pCR rate was 29% in the MVAC cohort and 31% in the GC cohort. There was no significant difference in the pCR rate when adjusted for propensity scores between the 2 regimens (odds ratio, 0.91; 95% confidence interval, 0.48-1.72; P = .77). In an exploratory analysis evaluating survival, the hazard ratio comparing hazard rates for MVAC versus GC adjusted for propensity scores was not statistically significant (hazard ratio, 0.78; 95% confidence interval, 0.40-1.54; P = .48). CONCLUSIONS: Patients who received neoadjuvant GC and MVAC achieved comparable pCR rates in the current analysis, providing evidence to support what has become routine practice. Cancer 2015;121:2586-2593. 
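The two-step propensity adjustment described in this abstract (fit a treatment-assignment model, then include the predicted probabilities as a covariate in the outcome model) can be sketched as follows. Everything here is illustrative: the data are synthetic, the gradient-descent fitter is a stand-in for a standard logistic regression routine, and nothing reproduces the study's numbers:

```python
import math, random

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Tiny logistic regression fit by batch gradient descent; w[0] is the intercept."""
    n = len(X)
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(steps):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = 1 / (1 + math.exp(-z)) - yi  # prediction error
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

def predict(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1 / (1 + math.exp(-z))

random.seed(0)
x = [random.gauss(0, 1) for _ in range(200)]                   # standardized baseline covariate
mvac = [1 if xi + random.gauss(0, 1) > 0 else 0 for xi in x]   # covariate drives regimen choice

# Step 1: propensity score = predicted probability of receiving MVAC
# given baseline characteristics.
ps_model = fit_logistic([[xi] for xi in x], mvac)
ps = [predict(ps_model, [xi]) for xi in x]

# Step 2: outcome model pCR ~ regimen + propensity score;
# exp(coefficient on the regimen term) is the propensity-adjusted odds ratio.
pcr = [1 if random.random() < 0.3 else 0 for _ in x]           # ~30% pCR, independent of regimen
out_model = fit_logistic([[t, p] for t, p in zip(mvac, ps)], pcr)
adjusted_or = math.exp(out_model[1])
```

Including the propensity score as a regression covariate is only one of several ways to use it (matching, stratification, and weighting are common alternatives); the abstract describes the covariate-adjustment variant.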
Abstract:
BACKGROUND: Transmitted human immunodeficiency virus type 1 (HIV-1) drug resistance (TDR) mutations are transmitted from nonresponding patients (defined as patients with no initial response to treatment and those with an initial response for whom treatment later failed) or from patients who are naive to treatment. Although the prevalence of drug resistance in patients who are not responding to treatment has declined in developed countries, the prevalence of TDR mutations has not. The mechanisms causing this paradox are poorly explored. METHODS: We included recently infected, treatment-naive patients with genotypic resistance tests performed ≤1 year after infection and before 2013. Potential risk factors for TDR mutations were analyzed using logistic regression. The association between the prevalence of TDR mutations and population viral load (PVL) among treated patients during 1997-2011 was estimated with Poisson regression for all TDR mutations and individually for the most frequent resistance mutations against each drug class (ie, M184V/L90M/K103N). RESULTS: We included 2421 recently infected, treatment-naive patients and 5399 patients with no response to treatment. The prevalence of TDR mutations fluctuated considerably over time. Two opposing developments could explain these fluctuations: a generally continuous increase in the prevalence of TDR mutations (odds ratio, 1.13; P = .010), punctuated by sharp decreases in prevalence when new drug classes were introduced. Overall, the prevalence of TDR mutations increased with decreasing PVL (rate ratio [RR], 0.91 per 1000 decrease in PVL; P = .033). Additionally, we observed that the transmitted high-fitness-cost mutation M184V was positively associated with the PVL of nonresponding patients carrying M184V (RR, 1.50 per 100 increase in PVL; P < .001). Such an association was absent for K103N (RR, 1.00 per 100 increase in PVL; P = .99) and negative for L90M (RR, 0.75 per 100 increase in PVL; P = .022).
CONCLUSIONS: Transmission of antiretroviral drug resistance is temporarily reduced by the introduction of new drug classes and is driven by nonresponding and treatment-naive patients. These findings suggest a continuous need for new drugs and for early detection and treatment of HIV-1 infection.