921 results for random coefficient regression model


Relevance: 100.00%

Abstract:

Objective. To evaluate the beneficial effect of antimalarial treatment on lupus survival in a large, multiethnic, international longitudinal inception cohort. Methods. Socioeconomic and demographic characteristics, clinical manifestations, classification criteria, laboratory findings, and treatment variables were examined in patients with systemic lupus erythematosus (SLE) from the Grupo Latino Americano de Estudio del Lupus Eritematoso (GLADEL) cohort. The diagnosis of SLE, according to the American College of Rheumatology criteria, was assessed within 2 years of cohort entry. Cause of death was classified as active disease, infection, cardiovascular complications, thrombosis, malignancy, or other cause. Patients were subdivided by antimalarial use, grouped according to those who had received antimalarial drugs for at least 6 consecutive months (user) and those who had received antimalarial drugs for <6 consecutive months or who had never received antimalarial drugs (nonuser). Results. Of the 1,480 patients included in the GLADEL cohort, 1,141 (77%) were considered antimalarial users, with a mean duration of drug exposure of 48.5 months (range 6-98 months). Death occurred in 89 patients (6.0%). A lower mortality rate was observed in antimalarial users compared with nonusers (4.4% versus 11.5%; P < 0.001). Seventy patients (6.1%) had received antimalarial drugs for 6-11 months, 146 (12.8%) for 1-2 years, and 925 (81.1%) for >2 years. Mortality rates among users by duration of antimalarial treatment (per 1,000 person-months of followup) were 3.85 (95% confidence interval [95% CI] 1.41-8.37), 2.7 (95% CI 1.41-4.76), and 0.54 (95% CI 0.37-0.77), respectively, while for nonusers, the mortality rate was 3.07 (95% CI 2.18-4.20) (P for trend < 0.001). After adjustment for potential confounders in a Cox regression model, antimalarial use was associated with a 38% reduction in the mortality rate (hazard ratio 0.62, 95% CI 0.39-0.99). Conclusion. 
Antimalarial drugs were shown to have a protective effect, possibly in a time-dependent manner, on SLE survival. These results suggest that the use of antimalarial treatment should be recommended for patients with lupus.
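The mortality rates above are expressed per 1,000 person-months of follow-up. As a minimal sketch of how such a rate is computed (the counts below are hypothetical, not taken from the GLADEL data):

```python
def rate_per_1000_pm(deaths, person_months):
    """Crude mortality rate per 1,000 person-months of follow-up."""
    return 1000.0 * deaths / person_months

# Hypothetical group: 12 deaths observed over 3,900 person-months.
print(round(rate_per_1000_pm(12, 3900), 2))  # 3.08
```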

Relevance: 100.00%

Abstract:

Background-Peculiar aspects of Chagas cardiomyopathy raise concerns about the efficacy and safety of sympathetic blockade. We studied the influence of beta-blockers in patients with Chagas cardiomyopathy. Methods and Results-We examined the REMADHE trial and grouped patients according to etiology (Chagas versus non-Chagas) and beta-blocker therapy. The primary end point was all-cause mortality or heart transplantation. Altogether 456 patients were studied; 27 (5.9%) underwent heart transplantation and 202 (44.3%) died. Chagas etiology was present in 68 (14.9%) patients; they had lower body mass index (24.1+/-4.1 versus 26.3+/-5.1, P=0.001), smaller end-diastolic left ventricle diameter (6.7+/-1.0 cm versus 7.0+/-0.9 cm, P=0.001), a smaller proportion of beta-blocker therapy (35.8% versus 68%, P<0.001), and a higher proportion of spironolactone therapy (74.6% versus 57.8%, P=0.003). Twenty-four (35.8%) patients with Chagas disease were under beta-blocker therapy and had lower serum sodium (136.6+/-3.1 versus 138.4+/-3.1 mEq/L, P=0.05) and lower body mass index (22.5+/-3.3 versus 24.9+/-4.3, P=0.03) compared with those who did not receive beta-blockers. Survival was lower in patients with Chagas heart disease than in patients with other etiologies. When only patients under beta-blockers were considered, the survival of patients with Chagas disease was similar to that of other etiologies. The survival of patients with beta-blockers was higher than that of patients without beta-blockers. In a Cox regression model, left ventricle end-diastolic diameter (hazard ratio, 1.78; CI, 1.15 to 2.76; P=0.009) and beta-blockers (hazard ratio, 0.37; CI, 0.14 to 0.97; P=0.044) were independently associated with survival. Conclusions-Our study suggests that beta-blockers may have beneficial effects on the survival of patients with heart failure and Chagas heart disease, warranting further investigation in a prospective, randomized trial.

Relevance: 100.00%

Abstract:

Background Heart failure and diabetes often occur simultaneously in patients, but the prognostic value of glycemia in chronic heart failure is debatable. We evaluated the role of glycemia in the prognosis of heart failure. Methods Outpatients with chronic heart failure from the Long-term Prospective Randomized Controlled Study Using Repetitive Education at Six-Month Intervals and Monitoring for Adherence in Heart Failure Outpatients (REMADHE) trial were grouped according to the presence of diabetes and level of glycemia. All-cause mortality/heart transplantation and unplanned hospital admission were evaluated. Results Four hundred fifty-six patients were included (135 [29.5%] female, 124 [27.2%] with diabetes mellitus, age 50.2 +/- 11.4 years, and left-ventricle ejection fraction of 34.7% +/- 10.5%). During follow-up (3.6 +/- 2.2 years), 27 (5.9%) patients underwent heart transplantation and 202 (44.2%) died; survival was similar in patients with and without diabetes mellitus. When patients with and without diabetes were categorized according to glucose range (glycemia <= 100 mg/dL [5.5 mmol/L]), as well as when distributed in quintiles of glucose, survival was significantly worse among patients with lower levels of glycemia. This finding persisted in a Cox proportional hazards regression model that included gender, etiology, left ventricle ejection fraction, left ventricle diastolic diameter, creatinine level, beta-blocker therapy, and functional status (hazard ratio 1.45, 95% CI 1.09-1.69, P = .039). No difference regarding unplanned hospital admission was found. Conclusion We report an inverse association between glycemia and mortality in outpatients with chronic heart failure. These results point to a new pathophysiologic understanding of the interactions between diabetes mellitus, hyperglycemia, and heart disease. (Am Heart J 2010; 159: 90-7.)

Relevance: 100.00%

Abstract:

Methods. A prospective cohort study was conducted with 831 pregnant women from antenatal clinics in primary healthcare in Sao Paulo, Brazil. The Clinical Interview Schedule-Revised and demographic questionnaires were administered between the 20th and 30th weeks of gestation. Information on infant weight and gestational age at birth was obtained from hospital records. Univariate analyses were used to examine the association between the main exposure and main outcomes. Statistical associations were examined with the chi-square test and a logistic regression model. Results. The prevalence of common mental disorders (CMD) during gestation was 33.6% (95% CI: 30.4-36.9). The follow-up rate was 99.5%. Sixty-three (7.6%) newborns were classified as low birth weight (LBW) and 56 (6.9%) were classified as preterm birth (PTB). CMD during pregnancy was not associated with risk of PTB (adjusted OR: 1.03, 95% CI: 0.57-1.88) or LBW (adjusted OR: 1.09, 95% CI: 0.62-1.91). Conclusions. CMD prevalence is high among low-income and low-risk pregnant women attended by public health services in a middle-income country, but CMD did not confer an increased risk for adverse obstetric outcomes.
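The reported prevalence interval can be approximately reproduced with a normal-approximation (Wald) confidence interval for a proportion; a sketch using the abstract's 33.6% prevalence and n = 831:

```python
import math

def wald_ci_percent(p, n, z=1.96):
    """Normal-approximation 95% CI for a proportion, returned in percent."""
    se = math.sqrt(p * (1 - p) / n)
    return 100 * (p - z * se), 100 * (p + z * se)

lo, hi = wald_ci_percent(0.336, 831)  # prevalence and n from the abstract
print(round(lo, 1), round(hi, 1))  # close to the reported 30.4-36.9
```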

Relevance: 100.00%

Abstract:

Objective: Bronchial typical carcinoid tumors are low-grade malignancies. However, metastases are diagnosed in some patients. Predicting the individual risk of metastasis is relevant for determining which patients are eligible for radical lymphadenectomy and which should be followed up because of distant metastasis risk. Our objective was to screen for predictive criteria of bronchial typical carcinoid tumor aggressiveness based on a logistic regression model using clinical, pathological and biomolecular data. Methods: A multicenter retrospective cohort study was performed, including 330 consecutive patients operated on for bronchial typical carcinoid tumors and followed up for more than 10 years in two university hospitals. Data selected to predict the individual risk for both nodal and distant metastasis were: age, gender, TNM staging, tumor diameter and location (central/peripheral), tumor immunostaining index of p53, Ki67 and Bcl2, and the extracellular density of neoformed microvessels and of collagen/elastic extracellular fibers. Results: Nodal and distant metastasis incidence was 11% and 5%, respectively. Univariate analysis identified all the studied biomarkers as related to nodal metastasis. Multivariate analysis identified one predictive variable for nodal metastasis: neoangiogenesis, quantified by the density of neoformed pathological microvessels. Distant metastasis was related to male gender. Discussion: Predictive models based on clinical and biomolecular data could be used to predict individual risk for metastasis. Patients at high individual risk for lymph node metastasis should be considered candidates for mediastinal lymphadenectomy. Those at high risk of distant metastasis should be followed up as having an aggressive disease. Conclusion: The individual risk of bronchial typical carcinoid tumor metastasis in patients operated on can be calculated as a function of biomolecular data.
Prediction models can detect high-risk patients and help surgeons to identify patients requiring radical lymphadenectomy and help oncologists to identify those as having an aggressive disease requiring prolonged follow-up. (C) 2008 European Association for Cardio-Thoracic Surgery. Published by Elsevier B.V. All rights reserved.

Relevance: 100.00%

Abstract:

Background Mucosal leishmaniasis is caused mainly by Leishmania braziliensis and occurs months or years after the cutaneous lesions. This progressive disease destroys cartilage and osseous structures of the face, pharynx and larynx. Objective and methods The aim of this study was to analyse the significance of clinical and epidemiological findings, diagnosis and treatment for the outcome and recurrence of mucosal leishmaniasis, using a binary logistic regression model, in 140 patients with mucosal leishmaniasis from a Brazilian centre. Results The median age of patients was 57.5 years, and systemic arterial hypertension was the most prevalent secondary disease found in patients with mucosal leishmaniasis (43%). Diabetes, chronic nephropathy, viral hepatitis, allergy and coagulopathy were each found in less than 10% of patients. Human immunodeficiency virus (HIV) infection was found in 7 of 140 patients (5%). Rhinorrhea (47%) and epistaxis (75%) were the most common symptoms. N-methyl-glucamine showed a cure rate of 91% and a recurrence rate of 22%. Pentamidine showed a similar cure rate (91%) and recurrence rate (25%). Fifteen patients received itraconazole, with a cure rate of 73% and a recurrence rate of 18%. Amphotericin B was used in 30 patients, with an 82% response rate and a recurrence rate of 7%. The binary logistic regression analysis demonstrated that systemic arterial hypertension and HIV infection were associated with treatment failure (P < 0.05). Conclusion Current first-line mucosal leishmaniasis therapy shows an adequate cure rate but later recurrence. HIV infection and systemic arterial hypertension should be investigated before starting treatment of mucosal leishmaniasis. Conflicts of interest The authors are not part of any associations or commercial relationships that might represent conflicts of interest in the writing of this study (e.g. pharmaceutical stock ownership, consultancy, advisory board membership, relevant patents, or research funding).

Relevance: 100.00%

Abstract:

Hematological disturbances are common in systemic lupus erythematosus (SLE). Specifically, autoimmune hemolytic anemia (AHA) may manifest in SLE patients at the time of diagnosis or within the first year of the disease. AHA is often associated with thrombocytopenia, lupus nephritis, and central nervous system activity. In this study we investigated these associations in Brazilian patients with SLE. Forty-four consecutive SLE patients who had a history of AHA were matched by age, gender, and disease duration with 318 SLE patients without AHA, who formed the control group. All patients fulfilled the revised American College of Rheumatology criteria for SLE and were followed up at our service. Clinical and laboratory manifestations were similar in both groups, except for the predominance of leukopenia, thrombocytopenia, and anti-dsDNA on univariate analysis in the AHA group. The multivariate logistic regression model revealed an increased risk only for thrombocytopenia in the AHA group compared to the control group (odds ratio, 2.70; 95% confidence interval, 1.32-5.50). Our results corroborate previous data that AHA increases the risk of thrombocytopenia in individuals with SLE. This association suggests a common mechanism in the pathophysiologies of AHA and SLE.
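The reported odds ratio and confidence interval are consistent on the log scale, where the interval is symmetric around log(OR). A sketch that back-calculates the standard error from the reported bounds and reconstructs the interval (values taken from the abstract):

```python
import math

or_hat, lo, hi = 2.70, 1.32, 5.50  # reported OR and 95% CI
se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # implied SE of log(OR)
ci = (math.exp(math.log(or_hat) - 1.96 * se),
      math.exp(math.log(or_hat) + 1.96 * se))
print(round(se, 3), round(ci[0], 2), round(ci[1], 2))  # recovers ~1.32-5.51
```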

Relevance: 100.00%

Abstract:

Purpose: To evaluate rates of visual field progression in eyes with optic disc hemorrhages and the effect of intraocular pressure (IOP) reduction on these rates. Design: Observational cohort study. Participants: The study included 510 eyes of 348 patients with glaucoma who were recruited from the Diagnostic Innovations in Glaucoma Study (DIGS) and followed for an average of 8.2 years. Methods: Eyes were followed annually with clinical examination, standard automated perimetry visual fields, and optic disc stereophotographs. The presence of optic disc hemorrhages was determined on the basis of masked evaluation of optic disc stereophotographs. Evaluation of rates of visual field change during follow-up was performed using the visual field index (VFI). Main Outcome Measures: The evaluation of the effect of optic disc hemorrhages on rates of visual field progression was performed using random coefficient models. Estimates of rates of change for individual eyes were obtained by best linear unbiased prediction (BLUP). Results: During follow-up, 97 (19%) of the eyes had at least 1 episode of disc hemorrhage. The overall rate of VFI change in eyes with hemorrhages was significantly faster than in eyes without hemorrhages (-0.88%/year vs. -0.38%/year, respectively, P < 0.001). The difference in rates of visual field loss pre- and post-hemorrhage was significantly related to the reduction of IOP in the post-hemorrhage period compared with the pre-hemorrhage period (r = -0.61; P < 0.001). Each 1 mmHg of IOP reduction was associated with a difference of 0.31%/year in the rate of VFI change. Conclusions: There was a beneficial effect of treatment in slowing rates of progressive visual field loss in eyes with optic disc hemorrhage. Further research should elucidate the reasons why some patients with hemorrhages respond well to IOP reduction and others seem to continue to progress despite a significant reduction in IOP levels. 
Financial Disclosure(s): Proprietary or commercial disclosure may be found after the references. Ophthalmology 2010; 117: 2061-2066 (C) 2010 by the American Academy of Ophthalmology.
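The random coefficient models above estimate a rate of VFI change for each eye; conceptually, each eye's BLUP is a shrunken version of its ordinary least-squares slope. As a minimal sketch of the per-eye slope those models start from (pure Python, hypothetical noise-free data):

```python
def ols_slope(t, y):
    """Least-squares slope of y on t (e.g., VFI in % versus years)."""
    n = len(t)
    tbar, ybar = sum(t) / n, sum(y) / n
    num = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
    return num / sum((ti - tbar) ** 2 for ti in t)

# Hypothetical eye losing 0.88% VFI per year (the mean rate reported
# for hemorrhage eyes above), observed annually over 8 years.
years = list(range(9))
vfi = [100 - 0.88 * t for t in years]
print(round(ols_slope(years, vfi), 2))  # -0.88
```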

Relevance: 100.00%

Abstract:

PURPOSE. To evaluate the effect of disease severity on the diagnostic accuracy of the Cirrus Optical Coherence Tomograph (Cirrus HD-OCT; Carl Zeiss Meditec, Inc., Dublin, CA) for glaucoma detection. METHODS. One hundred thirty-five glaucomatous eyes of 99 patients and 79 normal eyes of 47 control subjects were recruited from the longitudinal Diagnostic Innovations in Glaucoma Study (DIGS). The severity of the disease was graded based on the visual field index (VFI) from standard automated perimetry. Imaging of the retinal nerve fiber layer (RNFL) was obtained using the optic disc cube protocol available on the Cirrus HD-OCT. Pooled receiver operating characteristic (ROC) curves were initially obtained for each parameter of the Cirrus HD-OCT. The effect of disease severity on diagnostic performance was evaluated by fitting an ROC regression model, with VFI used as a covariate, and calculating areas under the ROC curve (AUCs) for different levels of disease severity. RESULTS. The largest pooled AUCs were for average thickness (0.892), inferior quadrant thickness (0.881), and superior quadrant thickness (0.874). Disease severity had a significant influence on the detection of glaucoma. For the average RNFL thickness parameter, AUCs were 0.962, 0.932, 0.886, and 0.822 for VFIs of 70%, 80%, 90%, and 100%, respectively. CONCLUSIONS. Disease severity had a significant effect on the diagnostic performance of the Cirrus HD-OCT and thus should be considered when interpreting results from this device and when considering its potential applications for diagnosing glaucoma in various clinical settings. (Invest Ophthalmol Vis Sci. 2010;51:4104-4109) DOI:10.1167/iovs.09-4716
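The AUC values above have a simple rank-based interpretation: the probability that a randomly chosen glaucomatous eye has a thinner RNFL measurement than a randomly chosen normal eye. A sketch of that Mann-Whitney computation (the thickness values below are hypothetical):

```python
def auc(diseased, healthy):
    """AUC as the fraction of (diseased, healthy) pairs in which the
    diseased eye has the thinner RNFL; ties count as 0.5."""
    wins = sum((d < h) + 0.5 * (d == h) for d in diseased for h in healthy)
    return wins / (len(diseased) * len(healthy))

# Hypothetical average RNFL thickness values in microns.
print(auc([70, 75, 80, 95], [85, 90, 100, 105]))  # 0.875
```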

Relevance: 100.00%

Abstract:

Background It is noteworthy that there is a clear clinical, epidemiological and pathophysiological association between upper and lower airway inflammation in rhinitis and asthma. Objective The aim of this cross-sectional study was to compare eosinophil counts in induced sputum and nasal lavage fluids in asthma, assessing their association and the accuracy of nasal eosinophilia as a predictor of sputum eosinophilia. Methods Clinical evaluation, the asthma control questionnaire (ACQ), pre- and post-bronchodilator spirometry, and nasal and sputum sampling were performed. Nasal eosinophilia was analysed with a receiver operating characteristic curve and a logistic regression model. Results In 140 adults, the post-bronchodilator forced expiratory volume in 1 s (FEV(1)) did not differ between patients with or without sputum eosinophilia (P=0.18). After adjustment for upper airway symptoms, age, ACQ score and post-bronchodilator FEV(1), sputum eosinophilia was associated with a 52-fold increase in the odds of nasal eosinophilia, whereas each 1% increase in bronchodilator response was associated with a 7% increase in the odds of nasal eosinophilia. Conclusion This study brings further evidence that upper airway diseases are an important component of the asthma syndrome. Furthermore, monitoring of nasal eosinophilia by quantitative cytology may be useful as a surrogate for sputum cytology as a component of a composite measurement for determining airway inflammation.

Relevance: 100.00%

Abstract:

Aim: To determine the possible factors predicting the insulin requirement in pregnancies complicated by gestational diabetes mellitus (GDM). Method: A total of 294 patients with GDM diagnosed by the 100-g/3-h oral glucose tolerance test (OGTT) were studied. The following factors were analyzed: maternal age, nulliparity, family history of diabetes, prepregnancy BMI, prior GDM, prior fetal macrosomia, multiple pregnancy, polyhydramnios, gestational age at diagnosis of GDM, smoking, hypertension, number of abnormal 100-g/3-h OGTT values, and glycated hemoglobin (HbA1c). The association between each factor and the need for insulin therapy was then analyzed individually. The performance of these factors to predict the probability of insulin therapy was estimated using a logistic regression model. Results: Univariate analysis showed a positive correlation between insulin therapy and prepregnancy BMI, family history of diabetes, hypertension, prior GDM, prior fetal macrosomia, number of abnormal 100-g/3-h OGTT values, and HbA1c (P < 0.05). Prepregnancy BMI, family history of diabetes, number of abnormal 100-g/3-h OGTT values and HbA1c were statistically significant variables in the logistic regression model. Conclusions: The probability of insulin therapy can be estimated in pregnant women with GDM based on prepregnancy BMI, family history of diabetes, number of abnormal 100-g/3-h OGTT values, and HbA1c concentration. (C) 2010 Elsevier Ireland Ltd. All rights reserved.
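A fitted logistic model like the one above turns the four significant predictors into a probability of insulin therapy via the logistic function. The coefficients in this sketch are invented for illustration (the abstract does not report them); only the predictor set is taken from the study:

```python
import math

def insulin_probability(x, coefs, intercept):
    """Predicted probability from a logistic model: 1 / (1 + exp(-z))."""
    z = intercept + sum(b * xi for b, xi in zip(coefs, x))
    return 1 / (1 + math.exp(-z))

# Predictors: prepregnancy BMI, family history of diabetes (0/1),
# number of abnormal 100-g/3-h OGTT values, HbA1c (%).
# Coefficients and intercept below are hypothetical.
p = insulin_probability([31.0, 1, 3, 6.2],
                        coefs=[0.08, 0.9, 0.6, 0.7], intercept=-9.0)
print(round(p, 2))  # 0.63
```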

Relevance: 100.00%

Abstract:

Studies that have investigated ascorbic acid (AA) concentrations in cord blood have pointed to significant associations with maternal blood AA concentrations, smoking, age, diet, type of delivery, duration of gestation, fetal distress and birth weight. The aim of the present study was to determine the relationship between cord blood AA concentrations in newborns and maternal characteristics. A total of 117 healthy Brazilian parturients were included in this cross-sectional study. The concentrations of AA in blood were determined by HPLC. Data concerning socio-economic, demographic, obstetric, nutritional and health characteristics of the parturients, including alcohol consumption and smoking habit, were assessed by a standardised questionnaire. A FFQ was used to investigate the intake of foods rich in vitamin C. Cord blood AA concentration was significantly correlated with per capita income (r 0.26; P=0.005), maternal blood AA concentration (r 0.48; P<0.001) and maternal vitamin C-rich food intake score (r 0.36; P<0.001). A linear regression model including maternal AA concentration, alcohol consumption, smoking, parity, vitamin C-rich food intake score and per capita income explained 31.13% of the variation in cord blood AA concentrations in newborns. We recommend further experimental studies to assess the effects of ethanol on placental AA uptake, and epidemiological cohort studies to evaluate in detail the influence of maternal alcohol consumption on cord blood AA concentrations.
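The "31.13% of the variation" figure is the model's coefficient of determination, R². A minimal sketch of how R² is computed from observations and model predictions (toy numbers, not the study data):

```python
def r_squared(y, y_pred):
    """Share of the variance in y explained by the predictions."""
    ybar = sum(y) / len(y)
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    ss_res = sum((yi - pi) ** 2 for yi, pi in zip(y, y_pred))
    return 1 - ss_res / ss_tot

# Toy data: predictions capture most, but not all, of the spread.
y = [10, 12, 14, 16, 18]
y_pred = [11, 12, 13, 16, 18]
print(round(r_squared(y, y_pred), 2))  # 0.95
```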

Relevance: 100.00%

Abstract:

Background: Around 15% of patients die or become dependent after cerebral vein and dural sinus thrombosis (CVT). Method: We used the International Study on Cerebral Vein and Dural Sinus Thrombosis (ISCVT) sample (624 patients, with a median follow-up time of 478 days) to develop a Cox proportional hazards regression model to predict outcome, dichotomised by a modified Rankin Scale score > 2. From the model hazard ratios, a risk score was derived and a cut-off point selected. The model and the score were tested in 2 validation samples: (1) the prospective Cerebral Venous Thrombosis Portuguese Collaborative Study Group (VENOPORT) sample with 91 patients; (2) a sample of 169 consecutive CVT patients admitted to 5 ISCVT centres after the end of the ISCVT recruitment period. Sensitivity, specificity, c statistics and overall efficiency to predict outcome at 6 months were calculated. Results: The model (hazard ratios: malignancy 4.53; coma 4.19; thrombosis of the deep venous system 3.03; mental status disturbance 2.18; male gender 1.60; intracranial haemorrhage 1.42) had overall efficiencies of 85.1, 84.4 and 90.0% in the derivation sample and validation samples 1 and 2, respectively. Using the risk score (range from 0 to 9) with a cut-off of >= 3 points, overall efficiency was 85.4, 84.4 and 90.1% in the derivation sample and validation samples 1 and 2, respectively. Sensitivity and specificity in the combined samples were 96.1 and 13.6%, respectively. Conclusions: The CVT risk score has a good estimated overall rate of correct classifications in both validation samples, but its specificity is low. It can be used to avoid unnecessary or dangerous interventions in low-risk patients, and may help to identify high-risk CVT patients. Copyright (C) 2009 S. Karger AG, Basel
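Applying such a point-based score reduces to summing points for the risk factors present and comparing the total with the cut-off. The point weights below are hypothetical stand-ins chosen only to match the score's 0-9 range; the published score derives its own weights from the hazard ratios above:

```python
# Hypothetical point weights (the real score assigns its own values).
WEIGHTS = {"malignancy": 2, "coma": 2, "deep_venous_thrombosis": 2,
           "mental_status_disturbance": 1, "male": 1,
           "intracranial_haemorrhage": 1}  # maximum total: 9

def risk_score(patient):
    """Sum the points for each risk factor flagged True."""
    return sum(w for k, w in WEIGHTS.items() if patient.get(k))

def high_risk(patient, cutoff=3):
    """Classify as high risk when the score reaches the cut-off."""
    return risk_score(patient) >= cutoff

patient = {"coma": True, "male": True}  # hypothetical patient
print(risk_score(patient), high_risk(patient))  # 3 True
```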

Relevance: 100.00%

Abstract:

OBJECTIVE: To estimate the effects of combined spinal-epidural and traditional epidural analgesia on uterine basal tone and its association with the occurrence of fetal heart rate (FHR) abnormalities. METHODS: Seventy-seven laboring patients who requested pain relief during labor were randomly assigned to combined spinal-epidural (n=41) or epidural analgesia (n=36). Uterine contractions and FHR were recorded 15 minutes before and after analgesia. Uterine tone was evaluated with intrauterine pressure catheter. Primary outcomes were the elevation of baseline uterine tone and occurrence of FHR prolonged decelerations or bradycardia after analgesia. The influence of other variables such as oxytocin use, hypotension, and speed of pain relief were estimated using a logistic regression model. RESULTS: The incidence of all outcomes was significantly greater in the combined spinal-epidural group compared with epidural: uterine hypertonus (17 compared with 6; P=.018), FHR abnormalities (13 compared with 2; P<.01), and both events simultaneously (11 compared with 1; P<.01). Logistic regression analysis showed the type of analgesia as the only independent predictor of uterine hypertonus (odds ratio 3.526, 95% confidence interval 1.21-10.36; P=.022). For the occurrence of FHR abnormalities, elevation of uterine tone was the independent predictor (odds ratio 18.624, 95% confidence interval 4.46-77.72; P<.001). Regression analysis also found a correlation between decrease on pain scores immediately after analgesia and the estimated probability of occurrence of hypertonus and FHR abnormalities. CONCLUSION: Combined spinal-epidural analgesia is associated with a significantly greater incidence of FHR abnormalities related to uterine hypertonus compared with epidural analgesia. The faster the pain relief after analgesia, the higher the probability of uterine hypertonus and FHR changes. 
CLINICAL TRIAL REGISTRATION: Umin Clinical Trials Registry, http://www.umin.ac.jp/ctr/index.htm, UMIN000001186

Relevance: 100.00%

Abstract:

Background: Widespread use of prostate-specific antigen screening has resulted in younger and healthier men being diagnosed with prostate cancer. Their demands and expectations of surgical intervention are much higher and cannot be adequately addressed with the classic trifecta outcome measures. Objective: A new and more comprehensive method for reporting outcomes after radical prostatectomy, the pentafecta, is proposed. Design, setting, and participants: From January 2008 through September 2009, details of 1111 consecutive patients who underwent robot-assisted radical prostatectomy performed by a single surgeon were retrospectively analyzed. Of 626 potent men, 332 who underwent bilateral nerve sparing and who had 1 yr of follow-up were included in the study group. Measurements: In addition to the traditional trifecta outcomes, two perioperative variables were included in the pentafecta: no postoperative complications and negative surgical margins. Patients who attained the trifecta and concurrently the two additional outcomes were considered as having achieved the pentafecta. A logistic regression model was created to evaluate independent factors for achieving the pentafecta. Results and limitations: Continence, potency, biochemical recurrence-free survival, and trifecta rates at 12 mo were 96.4%, 89.8%, 96.4%, and 83.1%, respectively. With regard to the perioperative outcomes, 93.4% had no postoperative complication and 90.7% had negative surgical margins. The pentafecta rate at 12 mo was 70.8%. On multivariable analysis, patient age (p = 0.001) was confirmed as the only factor independently associated with the pentafecta. Conclusions: A more comprehensive approach for reporting prostate surgery outcomes, the pentafecta, is being proposed. We believe that pentafecta outcomes more accurately represent patients' expectations after minimally invasive surgery for prostate cancer.
This approach may be beneficial and may be used when counseling patients with clinically localized disease. (C) 2011 European Association of Urology. Published by Elsevier B. V. All rights reserved.