979 results for MULTILEVEL LOGISTIC-REGRESSION


Relevance: 80.00%

Abstract:

BACKGROUND: The human immunodeficiency virus type 1 reverse-transcriptase mutation K65R is a single-point mutation that has become more frequent after increased use of tenofovir disoproxil fumarate (TDF). We aimed to identify predictors for the emergence of K65R, using clinical data and genotypic resistance tests from the Swiss HIV Cohort Study. METHODS: A total of 222 patients with genotypic resistance tests performed while receiving treatment with TDF-containing regimens were stratified by detectability of K65R (K65R group, 42 patients; wild-type group, 180 patients). Patient characteristics at the start of that treatment were analyzed. RESULTS: In an adjusted logistic regression, TDF treatment with nonnucleoside reverse-transcriptase inhibitors and/or didanosine was associated with the emergence of K65R, whereas the presence of any of the thymidine analogue mutations D67N, K70R, T215F, or K219E/Q was protective. The previously undescribed mutational pattern K65R/G190S/Y181C was observed in 6 of 21 patients treated with efavirenz and TDF. Salvage therapy after TDF treatment was started for 36 patients with K65R and for 118 patients from the wild-type group. Proportions of patients attaining human immunodeficiency virus type 1 loads <50 copies/mL after 24 weeks of continuous treatment were similar for the K65R group (44.1%; 95% confidence interval, 27.2%-62.1%) and the wild-type group (51.9%; 95% confidence interval, 42.0%-61.6%). CONCLUSIONS: In settings where thymidine analogue mutations are less likely to be present, such as at the start of first-line therapy or after extended treatment interruptions, combinations of TDF with other K65R-inducing components or with efavirenz or nevirapine may carry an enhanced risk of the emergence of K65R. The finding of a distinct mutational pattern selected by treatment with TDF and efavirenz suggests a potential fitness interaction between K65R and nonnucleoside reverse-transcriptase inhibitor-induced mutations.
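
A minimal sketch of an adjusted logistic regression of the kind reported above, using statsmodels; the file and column names (k65r, nnrti_or_ddi, tam_present, baseline_log_vl) are hypothetical placeholders, not the study's actual variables.

```python
# Hedged sketch, assuming a flat per-patient table with a binary outcome
# column and adjusted exposure columns; names are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tdf_cohort.csv")  # hypothetical dataset

# Binary outcome (emergence of K65R); exposures adjusted for one another.
result = smf.logit("k65r ~ nnrti_or_ddi + tam_present + baseline_log_vl",
                   data=df).fit()

# Odds ratios with 95% confidence intervals, as reported in such analyses.
or_table = np.exp(result.conf_int())
or_table["OR"] = np.exp(result.params)
print(or_table)
```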

Relevance: 80.00%

Abstract:

BACKGROUND: Alcohol consumption contributes to morbidity and mortality in HIV-infected individuals. Here, we aimed to study self-reported alcohol consumption and to determine its association with adherence to antiretroviral therapy (ART) and HIV surrogate markers. METHODS: Cross-sectional data on daily alcohol consumption from August 2005 to August 2007 were analysed and categorized according to the World Health Organization definition (light, moderate or severe health risk). Multivariate logistic regression models and Pearson's χ² statistics were used to test the influence of alcohol use on endpoints. RESULTS: Of 6,323 individuals, 52.3% consumed alcohol less than once a week in the past 6 months. Alcohol intake was deemed light in 39.9%, moderate in 5.0% and severe in 2.8%. Higher alcohol consumption was significantly associated with older age, less education, injection drug use, being in a drug maintenance programme, psychiatric treatment, hepatitis C virus coinfection and a longer time since diagnosis of HIV. Lower alcohol consumption was found in males, non-Caucasians, individuals currently on ART and those with more ART experience. In patients on ART (n=4,519), missed doses and alcohol consumption were positively correlated (P<0.001). ART-pretreated severe alcohol consumers were more often off treatment despite having CD4+ T-cell counts <200 cells/μl; however, severe alcohol consumption per se did not delay starting ART. In treated individuals, alcohol consumption was not associated with worse HIV surrogate markers. CONCLUSIONS: Higher alcohol consumption in HIV-infected individuals was associated with several psychosocial and demographic factors, non-adherence to ART and, in pretreated individuals, being off treatment despite low CD4+ T-cell counts.

Relevance: 80.00%

Abstract:

Contact surveillance is an important strategy to ensure effective early diagnosis and control of leprosy; passive detection may not be as efficient because it depends directly on the ready availability of health care services and health education campaigns. The aim of this study was to provide further evidence that contact surveillance is the most effective strategy for the control of leprosy. The analysed data were obtained from a cohort of contacts and cases diagnosed through a national referral service for leprosy. We analysed data from patients diagnosed between 1987 and 2010 at the Souza Araújo Ambulatory in Rio de Janeiro. Epidemiological characteristics of leprosy cases diagnosed through contact surveillance and characteristics of passively detected index cases were compared using a conditional logistic regression model. Cases diagnosed by contact surveillance were found earlier in the progression of the disease, resulting in less severe clinical presentations, lower initial and final disability grades, lower initial and final bacterial indices and a lower prevalence of disease reactions. In this respect, contact surveillance proved to be an effective tertiary prevention strategy, indicating that active surveillance is especially important in areas of high endemicity, such as Brazil.

Relevance: 80.00%

Abstract:

BACKGROUND: This study aimed to investigate the influence of deep sternal wound infection (DSWI) on long-term survival following cardiac surgery. MATERIAL AND METHODS: In our institutional database we retrospectively evaluated the medical records of 4732 adult patients who underwent open-heart surgery from January 1995 through December 2005. Predictive factors for DSWI were determined using logistic regression analysis. Each patient with DSWI was then matched with 2 controls without DSWI, according to the risk factors identified previously. After checking the balance achieved by matching, short-term mortality was compared between groups using a paired test, and long-term survival was compared using Kaplan-Meier analysis and a Cox proportional hazards model. RESULTS: Overall, 4732 records were analyzed. The mean age of the investigated population was 69.3±12.8 years. DSWI occurred in 74 (1.56%) patients. Significant independent predictive factors for DSWI were active smoking (OR 2.19, 95% CI 1.35-3.53, p=0.001), obesity (OR 1.96, 95% CI 1.20-3.21, p=0.007), and insulin-dependent diabetes mellitus (OR 2.09, 95% CI 1.05-10.06, p=0.016). Median follow-up in the matched set was 125 months (IQR 99-162). After matching, in-hospital mortality was higher in the DSWI group (8.1% vs. 2.7%, p=0.03), but DSWI was not an independent predictor of long-term survival (adjusted HR 1.5, 95% CI 0.7-3.2, p=0.33). CONCLUSIONS: The results presented in this report indicate that post-sternotomy deep wound infection does not influence long-term survival in a general adult cardiac surgical population.
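
A brief sketch of the two-stage survival comparison described here, using the lifelines library in Python; the dataset and column names (time_months, died, dswi, and the matching covariates) are assumptions for illustration, not the authors' data.

```python
# Hedged sketch: Kaplan-Meier curves by DSWI status, then a Cox proportional
# hazards model for the adjusted hazard ratio. All names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

matched = pd.read_csv("matched_cohort.csv")  # hypothetical matched dataset

km = KaplanMeierFitter()
for label, grp in matched.groupby("dswi"):
    km.fit(grp["time_months"], event_observed=grp["died"], label=f"DSWI={label}")
    km.plot_survival_function()  # overlay the curves for the two groups

cph = CoxPHFitter()
cph.fit(matched[["time_months", "died", "dswi", "smoking", "obesity", "diabetes"]],
        duration_col="time_months", event_col="died")
cph.print_summary()  # hazard ratio for dswi with its 95% CI
```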

Relevance: 80.00%

Abstract:

AIM To investigate the incidence of neoplasms in inflammatory bowel disease (IBD) patients and the potential causative role of thiopurines. METHODS We performed an observational descriptive study comparing the incidence of malignancies in IBD patients treated with thiopurines and patients not treated with these drugs. We included 812 patients, who were divided into two groups according to whether or not they had received thiopurines. We studied the baseline characteristics of both groups (age at diagnosis, sex, type of IBD, etc.) and the treatments received (azathioprine, mercaptopurine, infliximab, adalimumab or other immunomodulators), as well as the incidence of neoplasms. Univariate analysis was performed with Student's t-test, the χ² test or the Wilcoxon exact test as appropriate. A logistic regression analysis was performed as multivariate analysis. Statistical significance was established at P values of less than 0.05, and 95% CIs were used for the odds ratios. RESULTS Among the 812 patients included, 429 (52.83%) had received thiopurines: 79.5% azathioprine, 14% mercaptopurine and 6.5% both drugs. Women made up 44.76% of patients treated with thiopurines and 46.48% of patients who did not receive this treatment (P > 0.05). The proportion of ulcerative colitis patients treated with thiopurines was 30.3%, compared with 66.67% of patients not treated (P < 0.001). The mean azathioprine dose was 123.79 ± 36.5 mg/d (range: 50-250 mg/d), the mean usage time 72.16 ± 55.7 mo (range: 1-300 mo) and the accumulated dose over this time 274.32 ± 233.5 g (range: 1.5-1350 g). For mercaptopurine, the mean dose was 74.7 ± 23.9 mg/d (range: 25-150 mg/d), the mean usage time 23.37 ± 27.6 mo (range: 1-118 mo), and the accumulated dose over this time 52.2 ± 63.5 g (range: 1.5-243 g). Thiopurine S-methyltransferase activity was tested in 66% of patients treated with thiopurines, of whom 98.2% had intermediate or high activity. Among the patients treated with thiopurines, 27.27% (112 patients) and 11.66% (50 patients) received treatment with infliximab and adalimumab respectively, compared with only 1.83% (7 patients) and 0.78% (3 patients) in the group of patients who did not receive thiopurines (P < 0.001 for both). Finally, 6.8% (29 patients) of those treated with thiopurines received other immunosuppressants (methotrexate, tacrolimus, cyclosporin), compared with 1% (4 patients) of patients not treated with thiopurines (P < 0.001). Among patients treated with thiopurines, 3.97% developed a malignancy, compared with 8.1% of those not treated (P = 0.013). The most frequent neoplasms were colorectal (12 cases in patients not treated with thiopurines, none in treated patients, P < 0.001), followed by non-melanoma skin cancer (8 patients treated with thiopurines and 6 not treated, P > 0.05). CONCLUSION In our experience, thiopurine therapy did not increase the development of malignancies in IBD patients and was an effective and safe treatment for these diseases.

Relevance: 80.00%

Abstract:

Given the very large amount of data collected every day through population surveys, much new research could reuse this information instead of collecting new samples. Unfortunately, the relevant data are often dispersed across different files obtained through different sampling designs. Data fusion is a set of methods used to combine information from different sources into a single dataset. In this article, we are interested in a specific problem: the fusion of two data files, one of which is quite small. We propose a model-based procedure combining a logistic regression with an Expectation-Maximization algorithm. Results show that, despite the lack of data, this procedure can perform better than standard matching procedures.
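
A minimal sketch of how a logistic regression can be embedded in an EM loop for this kind of fusion problem, assuming the small file observes a binary variable z that the large file lacks; this illustrates the general technique, not the authors' procedure, and all names are hypothetical.

```python
# Illustrative EM + logistic regression fusion (a sketch, not the paper's code).
# x_small/z_small: covariates and the observed binary target in the small file;
# x_large: the same covariates in the large file, where z is missing.
import numpy as np
from sklearn.linear_model import LogisticRegression

def fuse(x_small, z_small, x_large, n_iter=50):
    clf = LogisticRegression().fit(x_small, z_small)  # initialize on observed data
    for _ in range(n_iter):
        # E-step: expected probability that z=1 for each large-file record.
        p = clf.predict_proba(x_large)[:, 1]
        # M-step: refit on both files; each large-file row enters twice,
        # as z=1 weighted by p and as z=0 weighted by 1-p.
        X = np.vstack([x_small, x_large, x_large])
        y = np.concatenate([z_small, np.ones(len(x_large)), np.zeros(len(x_large))])
        w = np.concatenate([np.ones(len(x_small)), p, 1 - p])
        clf = LogisticRegression().fit(X, y, sample_weight=w)
    return clf.predict_proba(x_large)[:, 1]  # fused estimates of z
```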

Relevance: 80.00%

Abstract:

OBJECTIVES: To assess the extent to which stage at diagnosis and adherence to treatment guidelines may explain the persistent differences in colorectal cancer survival between the USA and Europe. DESIGN: A high-resolution study using detailed clinical data on Dukes' stage, diagnostic procedures, treatment and follow-up, collected directly from medical records by trained abstractors under a single protocol, with standardised quality control and central statistical analysis. SETTING AND PARTICIPANTS: 21 population-based registries in seven US states and nine European countries provided data for random samples comprising 12 523 adults (15-99 years) diagnosed with colorectal cancer during 1996-1998. OUTCOME MEASURES: Logistic regression models were used to compare adherence to 'standard care' in the USA and Europe. Net survival and excess risk of death were estimated with flexible parametric models. RESULTS: The proportion of Dukes' A and B tumours was similar in the USA and Europe, while that of Dukes' C was more frequent in the USA (38% vs 21%) and of Dukes' D more frequent in Europe (22% vs 10%). Resection with curative intent was more frequent in the USA (85% vs 75%). Elderly patients (75-99 years) were 70-90% less likely to receive radiotherapy and chemotherapy. Age-standardised 5-year net survival was similar in the USA (58%) and Northern and Western Europe (54-56%) and lowest in Eastern Europe (42%). The mean excess hazard up to 5 years after diagnosis was highest in Eastern Europe, especially among elderly patients and those with Dukes' D tumours. CONCLUSIONS: The wide differences in colorectal cancer survival between Europe and the USA in the late 1990s are probably attributable to earlier stage and more extensive use of surgery and adjuvant treatment in the USA. Elderly patients with colorectal cancer received surgery, chemotherapy or radiotherapy less often than younger patients, despite evidence that they could also have benefited.

Relevance: 80.00%

Abstract:

Rationale: Children with atopic diseases in early life are frequently found to have positive IgE tests to nuts without a history of previous ingestion. We aimed to identify risk factors for reactions to nuts at their first introduction. Methods: A detailed retrospective case note and database analysis was performed. Inclusion criteria were: patients aged 3 to 16 years who had had a standardized food challenge to peanut and/or tree nuts because of primary sensitisation to the nut (positive specific IgE or SPT). Factors relating to food challenge outcome were assessed in detail with univariate and multivariate logistic regression analysis. Results: There were 98 food challenges (48% peanut, 52% tree nut), with 29 positive, 67 negative and 2 inconclusive challenges. A positive maternal history and a specific IgE > 2 kU/l were each associated with a significantly increased risk of a positive food challenge (OR 3.54; 95% CI 1.28 to 9.81; and OR 4.82; 95% CI 1.57 to 14.86; respectively). Challenge outcome showed no significant association with the type of nut, age, presence of other food allergies, paternal or sibling atopic history, other atopic conditions or severity of previous reactions to other foods. Conclusions: We have demonstrated that a maternal atopic history and a specific IgE > 2 kU/l are associated with a significantly increased likelihood of a positive food challenge in children with primary sensitisation to nuts. Although these easily identifiable factors require further prospective validation, we suggest they should be considered when deciding the need for a nut challenge.
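
As a worked illustration, the odds ratios quoted above (e.g., OR 3.54, 95% CI 1.28 to 9.81) are the kind of figure that falls out of a 2×2 table of challenge outcomes; the counts in the example call below are invented, not the study's data.

```python
# Worked sketch: odds ratio and Wald 95% CI from a 2x2 table.
import math

def odds_ratio_ci(a, b, c, d):
    """a=exposed & positive, b=exposed & negative,
    c=unexposed & positive, d=unexposed & negative."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

print(odds_ratio_ci(20, 15, 9, 24))  # hypothetical counts
```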

Relevance: 80.00%

Abstract:

OBJECTIVES: We aimed to (i) evaluate psychological distress in adolescent survivors of childhood cancer and compare them to siblings and a norm population; (ii) compare the severity of distress of distressed survivors and siblings with that of psychotherapy patients; and (iii) determine risk factors for psychological distress in survivors. METHODS: We sent a questionnaire to all childhood cancer survivors aged <16 years when diagnosed, who had survived ≥ 5 years and were aged 16-19 years at the time of study. Our control groups were same-aged siblings, a norm population, and psychotherapy patients. Psychological distress was measured with the Brief Symptom Inventory-18 (BSI-18) assessing somatization, depression, anxiety, and a global severity index (GSI). Participants with a T-score ≥ 57 were defined as distressed. We used logistic regression to determine risk factors. RESULTS: We evaluated the BSI-18 in 407 survivors and 102 siblings. Fifty-two survivors (13%) and 11 siblings (11%) had scores above the distress threshold (T ≥ 57). Distressed survivors scored significantly higher in somatization (p=0.027) and GSI (p=0.016) than distressed siblings, and also scored higher in somatization (p ≤ 0.001) and anxiety (p=0.002) than psychotherapy patients. In the multivariable regression, psychological distress was associated with female sex, self-reported late effects, and low perceived parental support. CONCLUSIONS: The majority of survivors did not report psychological distress. However, the severity of distress of distressed survivors exceeded that of distressed siblings and psychotherapy patients. Systematic psychological follow-up can help to identify survivors at risk and support them during the challenging period of adolescence.

Relevance: 80.00%

Abstract:

BACKGROUND: Anemia is a common condition in CKD that has been identified as a cardiovascular (CV) risk factor in end-stage renal disease, constituting a predictor of low survival. The aim of this study was to define the onset of anemia of renal origin and its association with the evolution of kidney disease and clinical outcomes in stage 3 CKD (CKD-3). METHODS: This epidemiological, prospective, multicenter, 3-year study included 439 CKD-3 patients. The origin of nephropathy and comorbidity (Charlson score: 3.2) were recorded. The clinical characteristics of patients who developed anemia according to EBPG guidelines were compared with those of patients who did not, followed by multivariate logistic regression, Kaplan-Meier curves and ROC curves to investigate factors associated with the development of renal anemia. RESULTS: During the 36-month follow-up period, 50% reached CKD-4 or 5, and approximately 35% were diagnosed with anemia (85% of renal origin). The probability of developing renal anemia was 0.12, 0.20 and 0.25 at 1, 2 and 3 years, respectively. Patients who developed anemia were mainly men (72% anemic vs. 69% non-anemic). The mean age was 68 vs. 65.5 years and baseline proteinuria was 0.94 vs. 0.62 g/24 h (anemic vs. non-anemic, respectively). Baseline MDRD values were 36 vs. 40 mL/min and albumin 4.1 vs. 4.3 g/dL; the reduction in MDRD was greater in those who developed anemia (6.8 vs. 1.6 mL/min/1.73 m² over 3 years). These patients progressed earlier to CKD-4 or 5 (18 vs. 28 months), with a higher proportion of hospitalizations (31 vs. 16%), major CV events (16 vs. 7%), and higher mortality (10 vs. 6.6%) than those without anemia. Multivariate logistic regression indicated a significant association between baseline hemoglobin (OR=0.35; 95% CI: 0.24-0.28), glomerular filtration rate (OR=0.96; 95% CI: 0.93-0.99), female sex (OR=0.19; 95% CI: 0.10-0.40) and the development of renal anemia. CONCLUSIONS: Renal anemia is associated with a more rapid evolution to CKD-4, and a higher risk of CV events and hospitalization in non-dialysis-dependent CKD patients. This suggests that special attention should be paid to anemic CKD-3 patients.
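
The year-by-year probabilities quoted above (0.12, 0.20, 0.25) are the kind of quantity a Kaplan-Meier fit yields directly; a short sketch with lifelines follows, where the dataset and column names are assumptions.

```python
# Hedged sketch: cumulative probability of developing renal anemia at 1, 2
# and 3 years from a Kaplan-Meier fit. All names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("ckd3_cohort.csv")  # hypothetical follow-up data

km = KaplanMeierFitter()
km.fit(df["months_followed"], event_observed=df["developed_anemia"])

# Cumulative incidence = 1 - survival probability at each horizon.
for months in (12, 24, 36):
    print(f"P(anemia by {months} mo) ~ {1 - km.predict(months):.2f}")
```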

Relevance: 80.00%

Abstract:

Leprosy remains prevalent in Brazil. ErbB2 is a receptor for leprosy bacilli entering Schwann cells, which mediates Mycobacterium leprae-induced demyelination, and the ERBB2 gene lies within a leprosy susceptibility locus on chromosome 17q11-q21. To determine whether polymorphisms at the ERBB2 locus contribute to this linkage peak, three haplotype tagging single nucleotide polymorphisms (tag-SNPs) (rs2517956, rs2952156, rs1058808) were genotyped in 72 families (208 cases; 372 individuals) from the state of Pará (PA). All three tag-SNPs were associated with leprosy per se [best SNP rs2517956: odds ratio (OR) = 2.22; 95% confidence interval (CI) 1.37-3.59; p = 0.001]. Lepromatous (LL) (OR = 3.25; 95% CI 1.37-7.70; p = 0.007) and tuberculoid (TT) (OR = 1.79; 95% CI 1.04-3.05; p = 0.034) leprosy both contributed to the association, which is consistent with the previous linkage to chromosome 17q11-q21 in the population from PA and supports a functional role for ErbB2 in disease pathogenesis. To attempt to replicate these findings, six SNPs (rs2517955, rs2517956, rs1810132, rs2952156, rs1801200, rs1058808) were genotyped in a population-based sample of 570 leprosy cases and 370 controls from the state of Rio Grande do Norte (RN) and the results were analysed using logistic regression. However, none of the associations was replicated in the RN sample, whether analysed for leprosy per se, LL leprosy, TT leprosy, erythema nodosum leprosum or reversal reaction. The role of polymorphisms at ERBB2 in controlling susceptibility to leprosy in Brazil therefore remains unclear.

Relevance: 80.00%

Abstract:

Background: Multiple logistic regression is precluded from many practical applications in ecology that aim to predict the geographic distributions of species because it requires absence data, which are rarely available or are unreliable. In order to use multiple logistic regression, many studies have simulated "pseudo-absences" through a number of strategies, but it is unknown how the choice of strategy influences models and their geographic predictions of species. In this paper we evaluate the effect of several prevailing pseudo-absence strategies on the predictions of the geographic distribution of a virtual species whose "true" distribution and relationship to three environmental predictors was predefined. We evaluated the effect of using (a) real absences, (b) pseudo-absences selected randomly from the background and (c) two-step approaches: pseudo-absences selected from low-suitability areas predicted by either Ecological Niche Factor Analysis (ENFA) or BIOCLIM. We compared how the choice of pseudo-absence strategy affected model fit, predictive power, and information-theoretic model selection results. Results: Models built with true absences had the best predictive power, best discriminatory power, and the "true" model (the one that contained the correct predictors) was supported by the data according to AIC, as expected. Models based on random pseudo-absences had among the lowest fit, but yielded the second highest AUC value (0.97), and the "true" model was also supported by the data. Models based on two-step approaches had intermediate fit, the lowest predictive power, and the "true" model was not supported by the data. Conclusion: If ecologists wish to build parsimonious GLM models that will allow them to make robust predictions, a reasonable approach is to use a large number of randomly selected pseudo-absences, and to perform model selection based on an information-theoretic approach. However, the resulting models can be expected to have limited fit.
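
A short sketch of the recommended strategy (random pseudo-absences plus AIC-based selection among candidate GLMs), using statsmodels; file names, predictor names and candidate formulas are hypothetical stand-ins.

```python
# Hedged sketch of strategy (b): random pseudo-absences from the background,
# then information-theoretic model selection. All names are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

presences = pd.read_csv("presences.csv")    # presence sites with predictors
background = pd.read_csv("background.csv")  # background sites with predictors

# Draw a large random sample of pseudo-absences from the background.
pseudo = background.sample(n=10_000, replace=True, random_state=1)
data = pd.concat([presences.assign(y=1), pseudo.assign(y=0)], ignore_index=True)

# Fit candidate logistic GLMs and select by AIC.
formulas = ["y ~ temp", "y ~ temp + precip", "y ~ temp + precip + elevation"]
fits = {f: smf.logit(f, data=data).fit(disp=0) for f in formulas}
best = min(fits, key=lambda f: fits[f].aic)
print("AIC-best model:", best, "| AIC:", round(fits[best].aic, 1))
```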

Relevance: 80.00%

Abstract:

AIM: To investigate the relationships between six classes of non-medical prescription drug use (NMPDU) and five personality traits. METHODS: Representative baseline data on 5777 Swiss men around 20 years old were taken from the Cohort Study on Substance Use Risk Factors. NMPDU of opioid analgesics, sedatives/sleeping pills, anxiolytics, antidepressants, beta-blockers and stimulants over the previous 12 months was measured. Personality was assessed using the Brief Sensation Seeking Scale; attention deficit-hyperactivity (ADH) using the Adult Attention-Deficit-Hyperactivity Disorder Self-Report Scale; and aggression/hostility, anxiety/neuroticism and sociability using the Zuckerman-Kuhlman Personality Questionnaire. Logistic regression models for each personality trait were fitted, as were seven multiple logistic regression models predicting each NMPDU outcome, adjusting for all personality traits and covariates. RESULTS: Around 10.7% of participants reported NMPDU in the last 12 months, with opioid analgesics most prevalent (6.7%), followed by sedatives/sleeping pills (3.0%), anxiolytics (2.7%) and stimulants (1.9%). Sensation seeking (SS), ADH, aggression/hostility and anxiety/neuroticism (but not sociability) were significantly positively associated with at least one drug class (ORs ranged from 1.24, 95% CI: 1.04-1.48, to 1.86, 95% CI: 1.47-2.35). Aggression/hostility, anxiety/neuroticism and ADH were significantly and positively related to almost all NMPDU. Sociability was inversely related to NMPDU of sedatives/sleeping pills and anxiolytics (OR, 0.70; 95% CI: 0.51-0.96 and OR, 0.64; 95% CI: 0.46-0.90, respectively). SS was related only to stimulant use (OR, 1.74; 95% CI: 1.14-2.65). CONCLUSION: People with higher scores for ADH, aggression/hostility and anxiety/neuroticism are at higher risk of NMPDU. Sociability appeared to protect against NMPDU of sedatives/sleeping pills and anxiolytics.

Relevance: 80.00%

Abstract:

Certain host single nucleotide polymorphisms (SNPs) affect the likelihood of a sustained virological response (SVR) to treatment in subjects infected with hepatitis C virus (HCV). SNPs in the promoters of the interleukin (IL)-10 (-1082 A/G, rs1800896), myxovirus resistance protein 1 (-123 C/A, rs17000900 and -88 G/T, rs2071430) and tumour necrosis factor (TNF) (-308 G/A, rs1800629 and -238 G/A, rs361525) genes and the outcome of PEGylated α-interferon plus ribavirin therapy were investigated. This analysis was performed in Brazilian HCV genotype 1-infected patients: 114 who had an SVR, 85 non-responders and 64 relapsers. A significantly increased risk of a null virological response was observed in patients carrying at least one A allele at position -308 [odds ratio (OR) = 2.58, 95% confidence interval (CI) = 1.44-4.63, p = 0.001] or -238 (OR = 7.33, 95% CI = 3.59-14.93, p < 0.001) of the TNF promoter. The risk of relapsing was also elevated (-308: OR = 2.87, 95% CI = 1.51-5.44, p = 0.001; -238: OR = 4.20, 95% CI = 1.93-9.10, p < 0.001). Multiple logistic regression of TNF diplotypes showed that patients with at least two copies of the A allele had an even higher risk of a null virological response (OR = 16.43, 95% CI = 5.70-47.34, p < 0.001) or of relapsing (OR = 6.71, 95% CI = 2.18-20.66, p = 0.001). No statistically significant association was found between the other SNPs under study and anti-HCV therapy response.

Relevance: 80.00%

Abstract:

Data on biliary carriage of bacteria and, specifically, of bacteria with worrisome and unexpected resistance traits (URB) are lacking. A prospective study (April 2010 to December 2011) was performed that included all patients admitted for <48 h for elective laparoscopic cholecystectomy in a Spanish hospital. Bile samples were cultured and epidemiological/clinical data recorded. Logistic regression models (stepwise) were fitted using bactobilia or bactobilia by URB as dependent variables. Models (P < 0.001) with the highest R² values were considered. A total of 198 patients (40.4% males; age, 55.3 ± 17.3 years) were included. Bactobilia was found in 44 of them (22.2%). The presence of bactobilia was associated (Cox R², 0.30) with previous biliary endoscopic retrograde cholangiopancreatography (ERCP) (odds ratio [OR], 8.95; 95% confidence interval [CI], 2.96 to 27.06; P < 0.001), previous admission (OR, 2.82; 95% CI, 1.10 to 7.24; P = 0.031), and age (OR, 1.09 per year; 95% CI, 1.05 to 1.12; P < 0.001). Ten of the 44 (22.7%) patients with bactobilia carried URB: 1 Escherichia coli isolate (CTX-M), 1 Klebsiella pneumoniae isolate (OXA-48), 3 high-level gentamicin-resistant enterococci, 1 vancomycin-resistant Enterococcus isolate, 3 Enterobacter cloacae strains, and 1 imipenem-resistant Pseudomonas aeruginosa strain. Bactobilia by URB (versus that by non-URB) was associated only (Cox R², 0.19) with previous ERCP (OR, 11.11; 95% CI, 1.98 to 62.47; P = 0.006). In the analysis of patients with bactobilia by URB versus the remaining patients, previous ERCP (OR, 35.284; 95% CI, 5.320 to 234.016; P < 0.001), previous intake of antibiotics (OR, 7.200; 95% CI, 0.962 to 53.906; P = 0.050), and age (OR, 1.113 per year of age; 95% CI, 1.028 to 1.206; P = 0.009) were associated with bactobilia by URB (Cox R², 0.19; P < 0.001). Previous antibiotic exposure (in addition to age and previous ERCP) was a risk driver for bactobilia by URB. This may have implications for prophylactic/therapeutic measures.
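
The stepwise models above are ranked by a Cox (Cox-Snell) pseudo-R²; a short sketch of how that statistic falls out of a fitted logistic regression follows, with a hypothetical dataset and column names.

```python
# Hedged sketch of the Cox-Snell pseudo-R^2:
#   R^2 = 1 - exp((2/n) * (ll_null - ll_model)).
# File and column names are assumptions for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bile_cultures.csv")  # hypothetical

fit = smf.logit("bactobilia ~ prior_ercp + prior_admission + age",
                data=df).fit(disp=0)
r2_cox = 1 - np.exp((2 / fit.nobs) * (fit.llnull - fit.llf))
print(f"Cox R^2 = {r2_cox:.2f}")
```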