979 results for Multivariable logistic regression
Abstract:
SUMMARY The main objective was to evaluate the association between SNPs and haplotypes of the FABP1-4 genes and type 2 diabetes, as well as their interaction with fat intake, in a general Spanish population. The association was replicated in a second population in which the HOMA index was also evaluated. METHODS 1217 unrelated individuals were selected from a population-based study [Hortega study: 605 women; mean age 54 y; 7.8% with type 2 diabetes]. The replication population included 805 subjects from Segovia, a neighboring region of Spain (446 females; mean age 52 y; 10.3% with type 2 diabetes). Type 2 diabetes mellitus (DM2) was defined in a similar way in both studies. Fifteen SNPs within the FABP1-4 genes, previously associated with metabolic traits or with potential influence on gene expression, were genotyped with SNPlex and tested. Age, sex and BMI were used as covariates in the logistic regression model. RESULTS One polymorphism (rs2197076) and two haplotypes of FABP1 showed a strong association with the risk of DM2 in the original population. This association was further confirmed in the second population as well as in the pooled sample. None of the other analyzed variants in the FABP2, FABP3 and FABP4 genes were associated. There was no formal interaction between rs2197076 and fat intake. A significant association between rs2197076 and the haplotypes of FABP1 and HOMA-IR was also present in the replication population. CONCLUSIONS The study supports the role of common variants of the FABP1 gene in the development of type 2 diabetes in Caucasians.
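The covariate-adjusted model described in the METHODS section (each SNP tested with age, sex and BMI as covariates) can be sketched as below. The data, effect sizes and additive genotype coding are synthetic placeholders, not the study's data or its exact model specification.

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Fit a logistic regression by Newton-Raphson (IRLS).

    X: (n, p) design matrix including an intercept column.
    y: (n,) binary outcome (e.g. type 2 diabetes yes/no).
    Returns the coefficient vector beta.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-np.clip(X @ beta, -30, 30)))
        # Newton step: beta += (X' W X)^-1 X' (y - p), with W = diag(p(1-p))
        H = X.T @ (X * (p * (1.0 - p))[:, None])
        beta = beta + np.linalg.solve(H, X.T @ (y - p))
    return beta

# Hypothetical synthetic cohort: genotype coded additively (0/1/2 risk
# alleles), adjusted for age, sex and BMI as in the abstract's model.
rng = np.random.default_rng(0)
n = 2000
geno = rng.integers(0, 3, n)
age = rng.normal(55, 10, n)
sex = rng.integers(0, 2, n)
bmi = rng.normal(27, 4, n)
logit = -8.0 + 0.5 * geno + 0.05 * age + 0.2 * sex + 0.1 * bmi
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = np.column_stack([np.ones(n), geno, age, sex, bmi])
beta = fit_logistic(X, y)
or_per_allele = np.exp(beta[1])  # adjusted odds ratio per risk allele
```

Exponentiating the genotype coefficient gives the covariate-adjusted odds ratio per risk allele, the quantity such association studies usually report.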
Abstract:
Risk factor surveillance is a complementary tool of morbidity and mortality surveillance that improves the likelihood that public health interventions are implemented in a timely fashion. The aim of this study was to identify population predictors of malaria outbreaks in endemic municipalities of Colombia with the goal of developing an early warning system for malaria outbreaks. We conducted a multiple-group, exploratory, ecological study at the municipal level. Each of the 290 municipalities with endemic malaria that we studied was classified according to the presence or absence of outbreaks. The measurement of variables was based on historic registries and logistic regression was performed to analyse the data. Altitude above sea level [odds ratio (OR) 3.65, 95% confidence interval (CI) 1.34-9.98], variability in rainfall (OR 1.85, 95% CI 1.40-2.44) and the proportion of inhabitants over 45 years of age (OR 0.17, 95% CI 0.08-0.38) were factors associated with malaria outbreaks in Colombian municipalities. The results suggest that environmental and demographic factors could have a significant ability to predict malaria outbreaks on the municipal level in Colombia. To advance the development of an early warning system, it will be necessary to adjust and standardise the collection of required data and to evaluate the accuracy of the forecast models.
Abstract:
Dengue virus (DENV) and parvovirus B19 (B19V) infections are acute exanthematic febrile illnesses that are not easily differentiated on clinical grounds and affect the paediatric population. Patients with these acute exanthematic diseases were studied. Fever was more frequent in DENV than in B19V-infected patients. Arthritis/arthralgias with DENV infection were shown to be significantly more frequent in adults than in children. The circulating levels of interleukin (IL)-1 receptor antagonist (Ra), CXCL10/inducible protein-10 (IP-10), CCL4/macrophage inflammatory protein-1 beta and CCL2/monocyte chemotactic protein-1 (MCP-1) were determined by multiplex immunoassay in serum samples obtained from B19V-infected (n = 37) and DENV-infected (n = 36) patients and from healthy individuals (n = 7). Forward stepwise logistic regression analysis revealed that circulating CXCL10/IP-10 tends to be associated with DENV infection and that IL-1Ra was significantly associated with DENV infection. Similar analysis showed that circulating CCL2/MCP-1 tends to be associated with B19V infection. In dengue fever, increased circulating IL-1Ra may exert antipyretic actions in an effort to counteract the already increased concentrations of IL-1β, while CXCL10/IP-10 was confirmed as a strong pro-inflammatory marker. Recruitment of monocytes/macrophages and upregulation of the humoral immune response by CCL2/MCP-1 by B19V may be involved in the persistence of the infection. Children with B19V or DENV infections had levels of these cytokines similar to those of adult patients.
Abstract:
BACKGROUND: Social support has been found to be protective from adverse health effects of psychological stress. We hypothesized that higher social support would predict a more favorable course of Crohn's disease (CD) directly (main effect hypothesis) and via moderating other prognostic factors (buffer hypothesis). METHODS: Within a multicenter cohort study we observed 597 adults with CD for 18 months. We assessed social support using the ENRICHD Social Support Inventory. Flares, nonresponse to therapy, complications, and extraintestinal manifestations were recorded as a combined endpoint indicating disease deterioration. We controlled for several demographic, psychosocial, and clinical variables of potential prognostic importance. We used multivariate binary logistic regression to estimate the overall effect of social support on the odds of disease deterioration and to explore main and moderator effects of social support by probing interactions with other predictors. RESULTS: The odds of disease deterioration decreased by 1.5 times (95% confidence interval [CI]: 1.2-1.9) for an increase of one standard deviation (SD) of social support. In case of low body mass index (BMI) (i.e., 1 SD below the mean or <19 kg/m(2)), the odds decreased by 1.8 times for an increase of 1 SD of social support. In case of low social support, the odds increased by 2.1 times for a decrease of 1 SD of BMI. Low BMI was not predictive under high social support. CONCLUSIONS: The findings suggest that elevated social support may favorably affect the clinical course of CD, particularly in patients with low BMI. (Inflamm Bowel Dis 2010;).
Abstract:
BACKGROUND: The human immunodeficiency virus type 1 reverse-transcriptase mutation K65R is a single-point mutation that has become more frequent after increased use of tenofovir disoproxil fumarate (TDF). We aimed to identify predictors for the emergence of K65R, using clinical data and genotypic resistance tests from the Swiss HIV Cohort Study. METHODS: A total of 222 patients with genotypic resistance tests performed while receiving treatment with TDF-containing regimens were stratified by detectability of K65R (K65R group, 42 patients; undetected K65R group, 180 patients). Patient characteristics at start of that treatment were analyzed. RESULTS: In an adjusted logistic regression, TDF treatment with nonnucleoside reverse-transcriptase inhibitors and/or didanosine was associated with the emergence of K65R, whereas the presence of any of the thymidine analogue mutations D67N, K70R, T215F, or K219E/Q was protective. The previously undescribed mutational pattern K65R/G190S/Y181C was observed in 6 of 21 patients treated with efavirenz and TDF. Salvage therapy after TDF treatment was started for 36 patients with K65R and for 118 patients from the wild-type group. Proportions of patients attaining human immunodeficiency virus type 1 loads <50 copies/mL after 24 weeks of continuous treatment were similar for the K65R group (44.1%; 95% confidence interval, 27.2%-62.1%) and the wild-type group (51.9%; 95% confidence interval, 42.0%-61.6%). CONCLUSIONS: In settings where thymidine analogue mutations are less likely to be present, such as at start of first-line therapy or after extended treatment interruptions, combinations of TDF with other K65R-inducing components or with efavirenz or nevirapine may carry an enhanced risk of the emergence of K65R. The finding of a distinct mutational pattern selected by treatment with TDF and efavirenz suggests a potential fitness interaction between K65R and nonnucleoside reverse-transcriptase inhibitor-induced mutations.
Abstract:
BACKGROUND: Alcohol consumption leading to morbidity and mortality affects HIV-infected individuals. Here, we aimed to study self-reported alcohol consumption and to determine its association with adherence to antiretroviral therapy (ART) and HIV surrogate markers. METHODS: Cross-sectional data on daily alcohol consumption from August 2005 to August 2007 were analysed and categorized according to the World Health Organization definition (light, moderate or severe health risk). Multivariate logistic regression models and Pearson's χ² statistics were used to test the influence of alcohol use on endpoints. RESULTS: Of 6,323 individuals, 52.3% consumed alcohol less than once a week in the past 6 months. Alcohol intake was deemed light in 39.9%, moderate in 5.0% and severe in 2.8%. Higher alcohol consumption was significantly associated with older age, less education, injection drug use, being in a drug maintenance programme, psychiatric treatment, hepatitis C virus coinfection and with a longer time since diagnosis of HIV. Lower alcohol consumption was found in males, non-Caucasians, individuals currently on ART and those with more ART experience. In patients on ART (n=4,519), missed doses and alcohol consumption were positively correlated (P<0.001). Severe alcohol consumers, who were pretreated with ART, were more often off treatment despite having CD4+ T-cell count <200 cells/µl; however, severe alcohol consumption per se did not delay starting ART. In treated individuals, alcohol consumption was not associated with worse HIV surrogate markers. CONCLUSIONS: Higher alcohol consumption in HIV-infected individuals was associated with several psychosocial and demographic factors, non-adherence to ART and, in pretreated individuals, being off treatment despite low CD4+ T-cell counts.
Abstract:
Contact surveillance is an important strategy to ensure effective early diagnosis and control of leprosy; passive detection may not be as efficient because it is directly tied to the ready availability of health care services and health education campaigns. The aim of this study was to reinforce that contact surveillance is the most effective strategy for the control of leprosy. The analysed data were obtained from a cohort of contacts and cases diagnosed through a national referral service for leprosy. We analysed data from patients diagnosed between 1987-2010 at the Souza Araújo Ambulatory in Rio de Janeiro. Epidemiological characteristics of leprosy cases diagnosed through contact surveillance and characteristics of passively detected index cases were compared using a conditional logistic regression model. Cases diagnosed by contact surveillance were found earlier in the progression of the disease, resulting in less severe clinical presentations, lower levels of initial and final disability grades, lower initial and final bacterial indices and a lower prevalence of disease reaction. In this respect, contact surveillance proved to be an effective tertiary prevention strategy, indicating that active surveillance is especially important in areas of high endemicity, such as Brazil.
Abstract:
BACKGROUND: This study aimed to investigate the influence of deep sternal wound infection on long-term survival following cardiac surgery. MATERIAL AND METHODS: In our institutional database we retrospectively evaluated medical records of 4732 adult patients who received open-heart surgery from January 1995 through December 2005. The predictive factors for DSWI were determined using logistic regression analysis. Then, each patient with deep sternal wound infection (DSWI) was matched with 2 controls without DSWI, according to the risk factors identified previously. After checking the balance resulting from matching, short-term mortality was compared between groups using a paired test, and long-term survival was compared using Kaplan-Meier analysis and a Cox proportional hazard model. RESULTS: Overall, 4732 records were analyzed. The mean age of the investigated population was 69.3±12.8 years. DSWI occurred in 74 (1.56%) patients. Significant independent predictive factors for deep sternal infections were active smoking (OR 2.19, 95% CI 1.35-3.53, p=0.001), obesity (OR 1.96, 95% CI 1.20-3.21, p=0.007), and insulin-dependent diabetes mellitus (OR 2.09, 95% CI 1.05-10.06, p=0.016). Mean follow-up in the matched set was 125 months, IQR 99-162. After matching, in-hospital mortality was higher in the DSWI group (8.1% vs. 2.7%, p=0.03), but DSWI was not an independent predictor of long-term survival (adjusted HR 1.5, 95% CI 0.7-3.2, p=0.33). CONCLUSIONS: The results presented in this report clearly show that post-sternotomy deep wound infection does not influence long-term survival in an adult general cardiac surgical patient population.
Abstract:
AIM To investigate the incidence of neoplasms in inflammatory bowel disease (IBD) patients and the potential causative role of thiopurines. METHODS We performed an observational descriptive study comparing the incidence of malignancies in IBD patients treated with thiopurines and patients not treated with these drugs. We included 812 patients, who were divided into two groups depending on whether or not they had received thiopurines. We studied the baseline characteristics of both groups (age at diagnosis, sex, type of IBD, etc.) and the treatments received (azathioprine, mercaptopurine, infliximab, adalimumab or other immunomodulators), as well as the incidence of neoplasms. Univariate analysis was performed with Student's t test, the χ² test or the Wilcoxon exact test as appropriate. A logistic regression analysis was performed as multivariate analysis. Statistical significance was established at P values of less than 0.05, and 95%CIs were used for the odds ratios. RESULTS Among the 812 patients included, 429 (52.83%) had received thiopurines: 79.5% azathioprine, 14% mercaptopurine and 6.5% both drugs. 44.76% of patients treated with thiopurines and 46.48% of patients who did not receive this treatment were women (P > 0.05). The proportion of ulcerative colitis patients treated with thiopurines was 30.3%, compared to 66.67% of patients not treated (P < 0.001). Mean azathioprine dose was 123.79 ± 36.5 mg/d (range: 50-250 mg/d), mean usage time was 72.16 ± 55.7 mo (range: 1-300 mo) and the accumulated dose over this time was 274.32 ± 233.5 g (range: 1.5-1350 g). For mercaptopurine, the mean dose was 74.7 ± 23.9 mg/d (range: 25-150 mg/d), the mean usage time 23.37 ± 27.6 mo (range: 1-118 mo), and the accumulated dose over this time 52.2 ± 63.5 g (range: 1.5-243 g). Thiopurine S-methyltransferase activity was tested in 66% of patients treated with thiopurines, of whom 98.2% had intermediate or high activity.
Among the patients treated with thiopurines, 27.27% (112 patients) and 11.66% (50 patients) received treatment with infliximab and adalimumab respectively, whereas only 1.83% (7 patients) and 0.78% (3 patients) received these drugs in the group of patients who did not receive thiopurines (P < 0.001 for both). Finally, 6.8% (29 patients) of those treated with thiopurines received other immunosuppressants (methotrexate, tacrolimus, ciclosporin), compared to 1% (4 patients) of patients not treated with thiopurines (P < 0.001). Among patients treated with thiopurines, 3.97% developed a malignancy, whereas among those not treated, neoplasms occurred in 8.1% (P = 0.013). The most frequent neoplasms were colorectal (12 cases in patients not treated with thiopurines but none in those treated, P < 0.001), followed by non-melanoma skin cancer (8 patients treated with thiopurines and 6 not treated, P > 0.05). CONCLUSION In our experience, thiopurine therapy did not increase the development of malignancies in IBD patients, and was an effective and safe treatment for these diseases.
Abstract:
Given the very large amount of data obtained every day through population surveys, much new research could use this existing information instead of collecting new samples. Unfortunately, relevant data are often scattered across different files obtained through different sampling designs. Data fusion is a set of methods used to combine information from different sources into a single dataset. In this article, we are interested in a specific problem: the fusion of two data files, one of which is quite small. We propose a model-based procedure combining a logistic regression with an Expectation-Maximization algorithm. Results show that despite the lack of data, this procedure can perform better than standard matching procedures.
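The abstract does not spell out the procedure, but the core model-based idea can be loosely illustrated: fit a logistic model for the fusion variable on the small donor file, then impute that variable into the large recipient file from its predicted probabilities. This is a simplified sketch, not the authors' EM-embedded algorithm, and all data below are hypothetical.

```python
import numpy as np

def fit_logit(X, y, n_iter=25):
    """Logistic regression fit by Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-np.clip(X @ beta, -30, 30)))
        H = X.T @ (X * (p * (1.0 - p))[:, None])
        beta = beta + np.linalg.solve(H, X.T @ (y - p))
    return beta

rng = np.random.default_rng(1)

# Donor file (small): both the shared variable x and the target z observed.
n_small = 300
x_small = rng.normal(0, 1, n_small)
z_small = (rng.random(n_small)
           < 1 / (1 + np.exp(-(0.3 + 1.2 * x_small)))).astype(float)

# Recipient file (large): only x observed; z must be fused in.
n_large = 3000
x_large = rng.normal(0, 1, n_large)

# M-step analogue: estimate the model on the complete (donor) data.
X_small = np.column_stack([np.ones(n_small), x_small])
beta = fit_logit(X_small, z_small)

# E-step analogue: fill in z for the recipient file, either as the
# expected probability or as a random draw from it.
p_large = 1 / (1 + np.exp(-(beta[0] + beta[1] * x_large)))
z_imputed = (rng.random(n_large) < p_large).astype(float)
```

Drawing `z_imputed` from the predicted probabilities, rather than thresholding them, preserves the variable's marginal distribution in the fused dataset.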
Abstract:
Background: Obesity is a major risk factor for type 2 diabetes mellitus (T2DM). A proper anthropometric characterisation of T2DM risk is essential for disease prevention and clinical risk assessment. Methods: Longitudinal study in 37 733 participants (63% women) of the Spanish EPIC (European Prospective Investigation into Cancer and Nutrition) cohort without prevalent diabetes. Detailed questionnaire information was collected at baseline and anthropometric data gathered following standard procedures. A total of 2513 verified incident T2DM cases occurred after 12.1 years of mean follow-up. Multivariable Cox regression was used to calculate hazard ratios of T2DM by levels of anthropometric variables. Results: Overall and central obesity were independently associated with T2DM risk. BMI showed the strongest association with T2DM in men whereas waist-related indices were stronger independent predictors in women. Waist-to-height ratio revealed the largest area under the ROC curve in men and women, with optimal cut-offs at 0.60 and 0.58, respectively. The most discriminative waist circumference (WC) cut-off values were 99.4 cm in men and 90.4 cm in women. Absolute risk of T2DM was higher in men than women for any combination of age, BMI and WC categories, and remained low in normal-waist women. The population risk of T2DM attributable to obesity was 17% in men and 31% in women. Conclusions: Diabetes risk was associated with higher overall and central obesity indices even at normal BMI and WC values. The measurement of waist circumference in the clinical setting is strongly recommended for the evaluation of future T2DM risk in women.
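Optimal ROC cut-offs like the waist-to-height ratios of 0.60 and 0.58 above are commonly the thresholds maximizing Youden's J (sensitivity + specificity − 1), though the abstract does not state the exact criterion used. A minimal sketch with simulated (not EPIC) data:

```python
import numpy as np

def youden_cutoff(scores, labels):
    """Return the threshold maximizing Youden's J = sens + spec - 1."""
    pos = labels.sum()
    neg = len(labels) - pos
    best_j, best_t = -1.0, None
    for t in np.unique(scores):          # every observed value is a candidate
        pred = scores >= t
        sens = np.sum(pred & (labels == 1)) / pos
        spec = np.sum(~pred & (labels == 0)) / neg
        j = sens + spec - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

# Hypothetical waist-to-height ratio distributions, chosen for illustration.
rng = np.random.default_rng(2)
whtr = np.concatenate([rng.normal(0.55, 0.05, 1000),   # non-cases
                       rng.normal(0.63, 0.05, 1000)])  # incident T2DM cases
t2dm = np.concatenate([np.zeros(1000), np.ones(1000)])

cutoff, j = youden_cutoff(whtr, t2dm)
```

With equal-variance Gaussian groups the optimal threshold sits near the midpoint of the two means, which is why the recovered cut-off lands close to 0.59 here.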
Abstract:
OBJECTIVES: To assess the extent to which stage at diagnosis and adherence to treatment guidelines may explain the persistent differences in colorectal cancer survival between the USA and Europe. DESIGN: A high-resolution study using detailed clinical data on Dukes' stage, diagnostic procedures, treatment and follow-up, collected directly from medical records by trained abstractors under a single protocol, with standardised quality control and central statistical analysis. SETTING AND PARTICIPANTS: 21 population-based registries in seven US states and nine European countries provided data for random samples comprising 12 523 adults (15-99 years) diagnosed with colorectal cancer during 1996-1998. OUTCOME MEASURES: Logistic regression models were used to compare adherence to 'standard care' in the USA and Europe. Net survival and excess risk of death were estimated with flexible parametric models. RESULTS: The proportion of Dukes' A and B tumours was similar in the USA and Europe, while that of Dukes' C was more frequent in the USA (38% vs 21%) and of Dukes' D more frequent in Europe (22% vs 10%). Resection with curative intent was more frequent in the USA (85% vs 75%). Elderly patients (75-99 years) were 70-90% less likely to receive radiotherapy and chemotherapy. Age-standardised 5-year net survival was similar in the USA (58%) and Northern and Western Europe (54-56%) and lowest in Eastern Europe (42%). The mean excess hazard up to 5 years after diagnosis was highest in Eastern Europe, especially among elderly patients and those with Dukes' D tumours. CONCLUSIONS: The wide differences in colorectal cancer survival between Europe and the USA in the late 1990s are probably attributable to earlier stage and more extensive use of surgery and adjuvant treatment in the USA. Elderly patients with colorectal cancer received surgery, chemotherapy or radiotherapy less often than younger patients, despite evidence that they could also have benefited.
Abstract:
Rationale: Children with atopic diseases in early life are frequently found with positive IgE tests to nuts, without a history of previous ingestion. We aimed to identify risk factors for reactions to nuts at their first introduction. Methods: A detailed retrospective case note and database analysis was performed. Inclusion criteria were: patients aged 3 to 16 years who had had a standardized food challenge to peanut and/or tree nuts due to primary sensitisation to the nut (positive specific IgE or SPT). A detailed assessment was performed of factors relating to food challenge outcome with univariate and multivariate logistic regression analysis. Results: There were 98 food challenges (48% peanut, 52% tree nut) with 29 positive, 67 negative and 2 inconclusive challenges. A positive maternal history and a specific IgE > 2 kU/l were strongly associated with a significantly increased risk of a positive food challenge (OR 3.54; 95% CI 1.28 to 9.81; and OR 4.82; 95% CI 1.57 to 14.86; respectively). There was no significant association between the type of nut, age, presence of other food allergies, paternal or sibling atopic history, other atopic conditions or severity of previous reaction to other foods. Conclusions: We have demonstrated an association between the presence of a maternal atopic history and a specific IgE > 2 kU/l, and a significant increase in the likelihood of a positive food challenge in children with primary sensitisation to nuts. Although requiring further prospective validation we suggest these easily identifiable components should be considered when deciding the need for a nut challenge.
Abstract:
BACKGROUND: Anemia is a common condition in CKD that has been identified as a cardiovascular (CV) risk factor in end-stage renal disease, constituting a predictor of low survival. The aim of this study was to define the onset of anemia of renal origin and its association with the evolution of kidney disease and clinical outcomes in stage 3 CKD (CKD-3). METHODS: This epidemiological, prospective, multicenter, 3-year study included 439 CKD-3 patients. The origin of nephropathy and comorbidity (Charlson score: 3.2) were recorded. The clinical characteristics of patients that developed anemia according to EBPG guidelines were compared with those that did not, followed by multivariate logistic regression, Kaplan-Meier curves and ROC curves to investigate factors associated with the development of renal anemia. RESULTS: During the 36-month follow-up period, 50% reached CKD-4 or 5, and approximately 35% were diagnosed with anemia (85% of renal origin). The probability of developing renal anemia was 0.12, 0.20 and 0.25 at 1, 2 and 3 years, respectively. Patients that developed anemia were mainly men (72% anemic vs. 69% non-anemic). The mean age was 68 vs. 65.5 years and baseline proteinuria was 0.94 vs. 0.62 g/24 h (anemic vs. non-anemic, respectively). Baseline MDRD values were 36 vs. 40 mL/min and albumin 4.1 vs. 4.3 g/dL; the reduction in MDRD was greater in those that developed anemia (6.8 vs. 1.6 mL/min/1.73 m² over 3 years). These patients progressed earlier to CKD-4 or 5 (18 vs. 28 months), with a higher proportion of hospitalizations (31 vs. 16%), major CV events (16 vs. 7%), and higher mortality (10 vs. 6.6%) than those without anemia. Multivariate logistic regression indicated a significant association between baseline hemoglobin (OR=0.35; 95% CI: 0.24-0.28), glomerular filtration rate (OR=0.96; 95% CI: 0.93-0.99), female sex (OR=0.19; 95% CI: 0.10-0.40) and the development of renal anemia.
CONCLUSIONS: Renal anemia is associated with a more rapid evolution to CKD-4, and a higher risk of CV events and hospitalization in non-dialysis-dependent CKD patients. This suggests that special attention should be paid to anemic CKD-3 patients.
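Cumulative probabilities like the 0.12, 0.20 and 0.25 at 1, 2 and 3 years above are typically read off a Kaplan-Meier curve, which handles patients censored before developing anemia. A minimal product-limit estimator, demonstrated on a small made-up sample rather than the study's data:

```python
import numpy as np

def km_event_probability(time, event, horizon):
    """Kaplan-Meier estimate of P(event occurs by `horizon`).

    time:  follow-up time in months.
    event: 1 = developed anemia, 0 = censored at that time.
    """
    s = 1.0
    for t in np.unique(time[event == 1]):   # distinct event times, ascending
        if t > horizon:
            break
        n_risk = np.sum(time >= t)          # still under observation at t
        d = np.sum((time == t) & (event == 1))
        s *= 1.0 - d / n_risk               # product-limit update
    return 1.0 - s

# Made-up follow-up data: months to anemia (event=1) or censoring (event=0).
time = np.array([6, 10, 12, 14, 20, 24, 30, 36, 36, 36])
event = np.array([1, 0, 1, 1, 0, 1, 0, 0, 0, 0])

p1 = km_event_probability(time, event, 12)  # P(anemia by 1 year)
p2 = km_event_probability(time, event, 24)  # P(anemia by 2 years)
p3 = km_event_probability(time, event, 36)  # P(anemia by 3 years)
```

Because censored patients leave the risk set without counting as events, these estimates exceed the naive fraction of observed events at each horizon.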
Abstract:
Leprosy remains prevalent in Brazil. ErbB2 is a receptor for leprosy bacilli entering Schwann cells, which mediates Mycobacterium leprae-induced demyelination and the ERBB2 gene lies within a leprosy susceptibility locus on chromosome 17q11-q21. To determine whether polymorphisms at the ERBB2 locus contribute to this linkage peak, three haplotype tagging single nucleotide polymorphisms (tag-SNPs) (rs2517956, rs2952156, rs1058808) were genotyped in 72 families (208 cases; 372 individuals) from the state of Pará (PA). All three tag-SNPs were associated with leprosy per se [best SNP rs2517959 odds ratio (OR) = 2.22; 95% confidence interval (CI) 1.37-3.59; p = 0.001]. Lepromatous (LL) (OR = 3.25; 95% CI 1.37-7.70; p = 0.007) and tuberculoid (TT) (OR = 1.79; 95% CI 1.04-3.05; p = 0.034) leprosy both contributed to the association, which is consistent with the previous linkage to chromosome 17q11-q21 in the population from PA and supports the functional role of ErbB2 in disease pathogenesis. To attempt to replicate these findings, six SNPs (rs2517955, rs2517956, rs1810132, rs2952156, rs1801200, rs1058808) were genotyped in a population-based sample of 570 leprosy cases and 370 controls from the state of Rio Grande do Norte (RN) and the results were analysed using logistic regression analysis. However, none of the associations were replicated in the RN sample, whether analysed for leprosy per se, LL leprosy, TT leprosy, erythema nodosum leprosum or reversal reaction conditions. The role of polymorphisms at ERBB2 in controlling susceptibility to leprosy in Brazil therefore remains unclear.