986 results for: Logistic regression methodology
Abstract:
BACKGROUND: There is uncertain evidence of the effectiveness of 5-aminosalicylates (5-ASA) in inducing and maintaining response and remission of active Crohn's disease (CD), and weak evidence to support their use in post-operative CD. AIM: To assess the frequency and determinants of 5-ASA use in CD patients and to evaluate physicians' perception of clinical response and side effects of 5-ASA. METHODS: Data from the Swiss Inflammatory Bowel Disease Cohort, which has collected data on a large sample of IBD patients since 2006, were analysed. Information from questionnaires regarding utilisation of treatments and perception of response to 5-ASA was evaluated. Logistic regression modelling was performed to identify factors associated with 5-ASA use. RESULTS: Of 1420 CD patients, 835 (59%) were treated with 5-ASA at some point between diagnosis and latest follow-up. Disease duration >10 years and colonic location were both significantly associated with 5-ASA use. 5-ASA treatment was judged to be successful in 46% (378/825) of treatment episodes (physician global assessment). Side effects prompting discontinuation of therapy were reported in 12% (98/825) of episodes in which 5-ASA had been stopped. CONCLUSIONS: 5-Aminosalicylates were frequently prescribed to patients with Crohn's disease in the Swiss IBD cohort. This observation stands in contrast to the scientific evidence demonstrating a very limited role of 5-ASA compounds in the treatment of Crohn's disease.
Abstract:
OBJECTIVES: To describe disease characteristics and treatment modalities in a multidisciplinary cohort of systemic lupus erythematosus (SLE) patients in Switzerland. METHODS: Cross-sectional analysis of 255 patients included in the Swiss SLE Cohort and recruited from centres specialising in Clinical Immunology, Internal Medicine, Nephrology and Rheumatology. Clinical data were collected with a standardised form. Disease activity was assessed using the Safety of Estrogens in Lupus Erythematosus National Assessment-SLE Disease Activity Index (SELENA-SLEDAI), an integer physician's global assessment score (PGA) ranging from 0 (inactive) to 3 (very active disease), and the erythrocyte sedimentation rate (ESR). The relationship between SLE treatment and activity was assessed by propensity score methods using a mixed-effect logistic regression with a random effect on the contributing centre. RESULTS: Of the 255 patients, 82% were women and 82% were of European ancestry. The mean age at enrolment was 44.8 years and the median SLE duration was 5.2 years. Patients from Rheumatology had a significantly later disease onset. Renal disease was reported in 44% of patients. The PGA indicated active disease in 49% of patients, median SLEDAI was 4 and median ESR was 14 mm in the first hour. Prescription rates of anti-malarial drugs ranged from 3% among nephrologists to 76% among rheumatologists. Patients regularly using anti-malarial drugs had significantly lower SELENA-SLEDAI scores and ESR values. CONCLUSION: In our cohort, patients in Rheumatology had a significantly later SLE onset than those in Nephrology. Anti-malarial drugs were mostly prescribed by rheumatologists and internists and less frequently by nephrologists, and appeared to be associated with less active SLE.
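As a rough illustration of the kind of propensity-score comparison described above, the sketch below fits a logistic propensity model for anti-malarial use and compares disease activity with inverse-probability weights. All column names (antimalarial, sledai, renal_disease, centre) are hypothetical, and the cohort's mixed-effect model with a random centre effect is simplified here to a fixed centre covariate.

```python
# Hypothetical sketch of a propensity-score adjustment; not the study's actual model.
import pandas as pd
import statsmodels.formula.api as smf

def propensity_weighted_effect(df: pd.DataFrame) -> float:
    # 1) Logistic propensity model for receiving anti-malarial drugs
    #    (centre entered as a fixed covariate instead of a random effect).
    ps_model = smf.logit(
        "antimalarial ~ age + renal_disease + C(centre)", data=df
    ).fit(disp=0)
    ps = ps_model.predict(df)

    # 2) Inverse-probability-of-treatment weights.
    w = df["antimalarial"] / ps + (1 - df["antimalarial"]) / (1 - ps)

    # 3) Weighted comparison of disease activity (SELENA-SLEDAI)
    #    between treated and untreated patients.
    wls = smf.wls("sledai ~ antimalarial", data=df, weights=w).fit()
    return wls.params["antimalarial"]
```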
Abstract:
From January 1995 to August 1997 we prospectively evaluated the clinical presentation, laboratory findings and short-term survival of smear-positive pulmonary tuberculosis (TB) patients who sought care at our hospital. After providing written informed consent, the patients were interviewed and laboratory tests were performed. Information on survival and death was collected through September 1998. Eighty-six smear-positive pulmonary TB patients were enrolled; 26.7% were HIV-seropositive. Seventeen HIV-seronegative pulmonary TB patients (19.8%) had chronic diseases in addition to TB. In the multiple logistic regression analysis, a CD4+ cell count ≤200 cells/mm³ was independently associated with HIV seropositivity. In the Cox regression model fitted to all patients, HIV seropositivity and age ≥50 years were independently associated with decreased survival. Among HIV-seronegative persons, the presence of an additional disease increased the risk of death almost six-fold. Use of antiretroviral drugs was associated with a lower risk of death among HIV-seropositive smear-positive pulmonary TB patients (RH = 0.32, 95% CI 0.10-0.92). In our study, smear-positive pulmonary TB patients had a low short-term survival rate that was strongly associated with HIV infection, age and co-morbidities. Therapy with antiretroviral drugs reduced the short-term risk of death among HIV-seropositive patients after TB diagnosis.
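A minimal sketch of a Cox proportional-hazards model of the sort used above, written with the lifelines package; the toy data frame and column names are invented for illustration only.

```python
# Illustrative Cox regression: time to death with HIV status, age group and
# comorbidity as covariates. Data are made up.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time_days":   [420, 180, 365, 90, 500],   # follow-up time
    "died":        [0, 1, 0, 1, 0],            # event indicator
    "hiv_pos":     [0, 1, 0, 1, 0],
    "age_ge_50":   [0, 0, 1, 1, 0],
    "comorbidity": [1, 0, 0, 1, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_days", event_col="died")
cph.print_summary()   # hazard ratios with 95% confidence intervals
```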
Abstract:
Background: Newer antiepileptic drugs (AEDs) are increasingly prescribed and seem to have efficacy comparable to the classical AEDs while being better tolerated. Very scarce data exist regarding their prognostic impact in patients with status epilepticus (SE). We therefore analyzed the evolution of prescription of newer AEDs between 2006 and 2010 in our prospective SE database, and assessed their impact on SE prognosis. Methods: We found 327 SE episodes occurring in 271 adults. The use of older versus newer AEDs (levetiracetam, pregabalin, topiramate, lacosamide) and its relationship to outcome (return to clinical baseline conditions, new handicap, or death) were analyzed. Logistic regression models were applied to adjust for known SE outcome predictors. Results: We observed an increasing prescription of newer AEDs over time (30% of patients received them at the beginning of the study vs. 42% towards the end). In univariate analyses, patients treated with newer AEDs had worse outcomes than those treated with classical AEDs only (19% vs. 9% for mortality; 33% vs. 64% for return to baseline, p<0.001). After adjustment for etiology and SE severity, use of newer AEDs was independently related to a reduced likelihood of return to baseline (p<0.001), but not to increased mortality. Conclusion: This retrospective study shows an increase in the use of newer AEDs for SE treatment, but does not suggest an improved prognosis following their prescription. Also, in view of their higher price, well-designed prospective assessments of their efficacy and tolerability should be conducted before widespread use in SE.
Abstract:
Altitudinal tree lines are mainly constrained by temperature, but can also be influenced by factors such as human activity, particularly in the European Alps, where centuries of agricultural use have affected the tree line. Over recent decades this trend has been reversed due to changing agricultural practices and land abandonment. We aimed to combine a statistical land-abandonment model with a forest dynamics model, to take into account the combined effects of climate and human land use on the Alpine tree line in Switzerland. Land-abandonment probability was expressed by a logistic regression function of degree-day sum, distance from forest edge, soil stoniness, slope, proportion of employees in the secondary and tertiary sectors, proportion of commuters and proportion of full-time farms. This was implemented in the TreeMig spatio-temporal forest model. Distance from forest edge and degree-day sum vary through feedback from the dynamics part of TreeMig and from climate change scenarios, while the other variables remain constant for each grid cell over time. The new model, TreeMig-LAb, was tested on theoretical landscapes, where the variables in the land-abandonment model were varied one by one. This confirmed the strong influence of distance from forest and slope on the abandonment probability. Degree-day sum has a more complex role, with opposite influences on land abandonment and forest growth. TreeMig-LAb was also applied to a case study area in the Upper Engadine (Swiss Alps), along with a model in which abandonment probability was held constant. Two scenarios were used: natural succession only (100% probability) and a probability of abandonment based on past transition proportions in that area (2.1% per decade). The former showed new forest growing in all but the highest-altitude locations. The latter was more realistic as to the number of newly forested cells, but their location was random and the resulting landscape heterogeneous. Using the logistic regression model gave results consistent with observed patterns of land abandonment: existing forests expanded and gaps closed, leading to an increasingly homogeneous landscape.
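The land-abandonment component can be illustrated with a short logistic-probability sketch; the coefficients below are made up and are not the fitted TreeMig-LAb values.

```python
# Illustrative logistic land-abandonment probability for one grid cell.
# Coefficients are hypothetical placeholders, not the paper's estimates.
import numpy as np

BETA = {
    "intercept": -2.0,
    "degree_day_sum": -0.001,
    "dist_forest_edge": -0.01,
    "slope": 0.03,
    "stoniness": 0.02,
}

def abandonment_probability(cell: dict) -> float:
    """Logistic model: p = 1 / (1 + exp(-x'beta))."""
    eta = BETA["intercept"] + sum(BETA[k] * cell[k] for k in cell)
    return 1.0 / (1.0 + np.exp(-eta))

p = abandonment_probability(
    {"degree_day_sum": 900, "dist_forest_edge": 50, "slope": 25, "stoniness": 10}
)
abandoned = np.random.default_rng(0).random() < p  # stochastic cell transition
```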
Abstract:
BACKGROUND: Chest pain raises concern about the possibility of coronary heart disease. Scoring methods have been developed to identify coronary heart disease in emergency settings, but not in primary care. METHODS: Data were collected from a multicenter Swiss clinical cohort study including 672 consecutive patients with chest pain who had visited one of 59 family practitioners' offices. Using the delayed diagnosis as the reference standard, we derived a prediction rule to rule out coronary heart disease by means of a logistic regression model. Known cardiovascular risk factors, pain characteristics, and physical signs associated with coronary heart disease were explored to develop a clinical score. Patients diagnosed with angina or acute myocardial infarction within the year following their initial visit comprised the coronary heart disease group. RESULTS: The coronary heart disease score was derived from eight variables: age, gender, duration of chest pain from 1 to 60 minutes, substernal chest pain location, pain increasing with exertion, absence of a tenderness point on palpation, cardiovascular risk factors, and personal history of cardiovascular disease. The area under the receiver operating characteristic (ROC) curve was 0.95 (95% confidence interval 0.92 to 0.97). Using this score, 413 patients with values below the 5th percentile of scores among coronary heart disease patients were considered low risk. Internal validity was confirmed by bootstrapping. External validation using data from a German cohort (Marburg, n = 774) yielded an area under the ROC curve of 0.75 (95% confidence interval 0.72 to 0.81), with a sensitivity of 85.6% and a specificity of 47.2%. CONCLUSIONS: This score, based only on history and physical examination, is a complementary tool for ruling out coronary heart disease in primary care patients complaining of chest pain.
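A schematic example of such a derivation and internal-validation workflow (logistic model, ROC area, bootstrap interval) might look as follows; the arrays stand in for the study's predictors and outcome and are purely illustrative.

```python
# Illustrative derivation/validation sketch, not the study's actual analysis.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def derive_and_validate(X: np.ndarray, y: np.ndarray, n_boot: int = 500):
    """Fit a logistic model, report apparent AUC and a bootstrap interval."""
    model = LogisticRegression(max_iter=1000).fit(X, y)
    auc = roc_auc_score(y, model.predict_proba(X)[:, 1])

    rng = np.random.default_rng(42)
    boot_aucs = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))   # resample patients with replacement
        if len(np.unique(y[idx])) < 2:
            continue                            # skip degenerate resamples
        m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        boot_aucs.append(roc_auc_score(y, m.predict_proba(X)[:, 1]))
    return auc, np.percentile(boot_aucs, [2.5, 97.5])

# Toy usage with random data standing in for the eight chest-pain predictors.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=300) > 0).astype(int)
print(derive_and_validate(X, y))
```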
Abstract:
Visceral larva migrans syndrome caused by Toxocara mainly affects children between 2 and 5 years of age, is generally asymptomatic, and its seroprevalence varies from 3 to 86% in different countries. A total of 399 schoolchildren from 14 public schools of the Butantã region, São Paulo city, Brazil, were evaluated by Toxocara serology (enzyme-linked immunosorbent assay). Epidemiological data related to Toxocara infection, recorded on a study protocol form, were submitted to multiple logistic regression analysis to define a risk profile. Blood was collected on filter paper by finger puncture, with all samples tested in duplicate. Considering titers > 1/160 as positive, the seroprevalence obtained was 38.8%. Among infected children, the mean age was 9.4 years, with a similar distribution between genders. A significant association was observed with onychophagia, residence with a dirty backyard, living in a slum, previous wheezing episodes, school attended, and family income (p < 0.05). All of these variables except "living in a slum" were considered determinants of a risk profile for the acquisition of Toxocara infection. A monthly income above five minimum salaries represented a protective factor, although of low relevance. Toxocara eggs were found in at least one of the soil samples obtained from five schools with a high prevalence of Toxocara infection, indicating frequent soil contamination by this agent.
Abstract:
Atrial fibrillation (AF) is a frequent arrhythmia after conventional coronary artery bypass grafting. With the advent of the minimally invasive technique for left internal mammary artery-left anterior descending coronary artery (LIMA-LAD) grafting, we analyzed the incidence and risk factors of postoperative AF in this patient population. This prospective study included all patients undergoing isolated LIMA-LAD grafting with the minimally invasive technique between January 1994 and June 2000. Twenty-four possible risk factors for postoperative AF were entered into univariate and multivariate logistic regression analyses. Postoperative AF occurred in 21 of the 90 patients (23.3%) analyzed. Double- or triple-vessel disease was present in 12/90 patients (13.3%). On univariate analysis, right coronary artery disease (p <0.01), age (p = 0.01), and diabetes (p = 0.04) were found to be risk factors for AF. On multivariate analysis, right coronary artery disease was identified as the sole significant risk factor (p = 0.02). In this patient population, the incidence of AF after minimally invasive coronary artery bypass is in the range reported for conventional coronary artery bypass grafting. Right coronary artery disease was found to be an independent predictor, which may be related to the fact that the diseased right coronary artery was not revascularized at the time of the surgical procedure. For the same reason, this risk factor may find broader application to noncardiac thoracic surgery.
Abstract:
Purpose: to assess trends in the self-reported prevalence of cardiovascular risk factors (CV RFs: hypertension, dyslipidaemia, diabetes) and their management in the Swiss population for the period 1992 to 2007. Methods: four national health interview surveys conducted between 1992 and 2007 in representative samples of the Swiss population (63,782 subjects overall). Self-reported CV RF prevalence, treatment and control levels were computed after weighting. Weights were calculated by raking ratio such that the marginal distribution of the weighted totals conforms to the marginal distribution of the targeted population. Multivariate analysis adjusted for age, sex, education, nationality and SMI was conducted using logistic regression. Results: the prevalence of all CV RFs increased between 1992 and 2007 (see table). The self-reported prevalence of treatment among subjects with CV RFs also increased, and this was confirmed by multivariate analysis: ORs for hypocholesterolaemic treatment relative to 1992 were 0.64 [0.52-0.78], 1.39 [1.18-1.65] and 2.00 [1.69-2.36] for 1997, 2002 and 2007, respectively. Still, in 2007, circa 40% of hypertensive, 60% of dyslipidaemic and 50% of diabetic subjects were not treated. Conversely, adequate control of CV RFs was reported by treated subjects, with an increase during the study period. This increase was confirmed by multivariate analysis (not shown). Conclusion: the self-reported prevalence of hypertension, dyslipidaemia and diabetes increased between 1992 and 2007 in the Swiss population. Despite good control among treated subjects, a significant percentage of subjects with CV RFs remain untreated.
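Assuming raking weights have already been computed per respondent, the weighted prevalence and weighted logistic regression steps could be sketched as below; the column names (survey_year, hypertension, weight, age, sex) are hypothetical, and using freq_weights is a simplification of a full survey-design analysis.

```python
# Hypothetical sketch of survey-weighted prevalence and weighted logistic regression.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def weighted_trend(df: pd.DataFrame):
    # Weighted prevalence of hypertension by survey year.
    prev = df.groupby("survey_year").apply(
        lambda g: (g["hypertension"] * g["weight"]).sum() / g["weight"].sum()
    )

    # Logistic regression of hypertension on survey year, adjusted for age and
    # sex, with the raking weights passed as frequency weights (a simplification).
    model = smf.glm(
        "hypertension ~ C(survey_year) + age + sex",
        data=df,
        family=sm.families.Binomial(),
        freq_weights=df["weight"],
    ).fit()
    return prev, model
```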
Abstract:
OBJECTIVE: To compare surgical site infection (SSI) rates in open versus laparoscopic appendectomy, cholecystectomy, and colon surgery, and to investigate the effect of laparoscopy on SSI in these interventions. BACKGROUND: Lower rates of SSI have been reported among the various advantages associated with laparoscopy compared with open surgery, particularly in cholecystectomy. However, biases such as the lack of postdischarge follow-up and confounding factors might have contributed to the observed differences between the two techniques. METHODS: This observational study was based on prospectively collected data from an SSI surveillance program in 8 Swiss hospitals between March 1998 and December 2004, including a standardized postdischarge follow-up. SSI rates were compared between laparoscopic and open interventions. Factors associated with SSI were identified using logistic regression models to adjust for potential confounding factors. RESULTS: SSI rates in laparoscopic and open interventions were 59/1051 (5.6%) versus 117/1417 (8.3%) in appendectomy (P = 0.01), 46/2606 (1.7%) versus 35/444 (7.9%) in cholecystectomy (P < 0.0001), and 35/311 (11.3%) versus 400/1781 (22.5%) in colon surgery (P < 0.0001). After adjustment, laparoscopic interventions were associated with a decreased risk of SSI: OR = 0.61 (95% CI 0.43-0.87) in appendectomy, 0.27 (0.16-0.43) in cholecystectomy, and 0.43 (0.29-0.63) in colon surgery. The observed effect of laparoscopic techniques was due to a reduction in the rates of incisional infections rather than organ/space infections. CONCLUSION: When feasible, a laparoscopic approach should be preferred over open surgery to lower the risk of SSI.
Abstract:
INTRODUCTION: Reduced cerebral perfusion pressure (CPP) may worsen secondary damage and outcome after severe traumatic brain injury (TBI); however, the optimal management of CPP is still debated. STUDY HYPOTHESIS: We hypothesized that the impact of CPP on outcome is related to brain tissue oxygen tension (PbtO2) and that reduced CPP may worsen TBI prognosis when it is associated with brain hypoxia. DESIGN: Retrospective analysis of a prospective database. METHODS: We analyzed 103 patients with severe TBI who underwent continuous PbtO2 and CPP monitoring for an average of 5 days. For each patient, the duration of reduced CPP (<60 mm Hg) and brain hypoxia (PbtO2 <15 mm Hg for >30 min [1]) was calculated with a linear interpolation method, and the relationship between CPP and PbtO2 was analyzed with Pearson's linear correlation coefficient. Outcome at 30 days was assessed with the Glasgow Outcome Score (GOS), dichotomized as good (GOS 4-5) versus poor (GOS 1-3). Multivariable associations with outcome were analyzed with stepwise forward logistic regression. RESULTS: Reduced CPP (n=790 episodes; mean duration 10.2 ± 12.3 h) was observed in 75 (74%) patients and was frequently associated with brain hypoxia (46/75; 61%). Time during which reduced CPP was associated with normal brain oxygen did not differ significantly between patients with poor versus good outcome (8.2 ± 8.3 vs. 6.5 ± 9.7 h; P=0.35). In contrast, time during which reduced CPP occurred simultaneously with brain hypoxia was longer in patients with poor than in those with good outcome (3.3 ± 7.4 vs. 0.8 ± 2.3 h; P=0.02). Outcome was significantly worse in patients who had both reduced CPP and brain hypoxia (61% had GOS 1-3 vs. 17% in those with reduced CPP but no brain hypoxia; P<0.01). Patients in whom a positive CPP-PbtO2 correlation (r>0.3) was found were also more likely to have a poor outcome (69 vs. 31% in patients with no CPP-PbtO2 correlation; P<0.01). Brain hypoxia was an independent risk factor for poor prognosis (odds ratio for favorable outcome 0.89 [95% CI 0.79-1.00] per hour spent with PbtO2 <15 mm Hg; P=0.05, adjusted for CPP, age, GCS, Marshall CT score and APACHE II). CONCLUSIONS: Low CPP may significantly worsen outcome after severe TBI when it is associated with brain tissue hypoxia. PbtO2-targeted management of CPP may optimize TBI therapy and improve outcome of head-injured patients.
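Two of the computational steps mentioned above (time below a threshold estimated by linear interpolation of intermittently sampled monitoring data, and the within-patient CPP-PbtO2 correlation) can be sketched as follows on made-up monitoring values; the thresholds follow the abstract (CPP <60 mm Hg, PbtO2 <15 mm Hg).

```python
# Illustrative sketch on fabricated monitoring data, not the study's dataset.
import numpy as np
from scipy.stats import pearsonr

def hours_below(t_hours: np.ndarray, values: np.ndarray, threshold: float,
                step_hours: float = 1 / 60) -> float:
    """Resample the signal on a fine grid by linear interpolation and
    count the time it spends below the threshold."""
    grid = np.arange(t_hours[0], t_hours[-1], step_hours)
    interp = np.interp(grid, t_hours, values)
    return float((interp < threshold).sum() * step_hours)

t = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0])            # hours since monitoring start
cpp = np.array([72, 58, 55, 63, 70, 66], dtype=float)    # mm Hg
pbto2 = np.array([22, 14, 12, 18, 24, 20], dtype=float)  # mm Hg

low_cpp_h = hours_below(t, cpp, 60.0)
low_pbto2_h = hours_below(t, pbto2, 15.0)
r, p = pearsonr(cpp, pbto2)   # positive r suggests pressure-dependent oxygenation
```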
Abstract:
Background: Hypotension, a common intra-operative incident, carries an important potential for morbidity. It is most often manageable and sometimes preventable, which makes its study important. We therefore aimed to examine hospital variations in the occurrence of intra-operative hypotension and its predictors. As secondary endpoints, we determined to what extent hypotension relates to the risk of postoperative incidents and death. Methods: We used the Anaesthesia Databank Switzerland, built on routinely and prospectively collected data on all anaesthesias in 21 hospitals. The three outcomes were assessed using multi-level logistic regression models. Results: Among 147,573 anaesthesias, the occurrence of hypotension ranged from 0.6 to 5.2% across participating hospitals, and from 0.3 up to 12% across surgical specialties. Most events (73.4%) were minor single events. Age, ASA status, combined general and regional anaesthesia techniques, duration of surgery, and hospitalization were significantly associated with hypotension. Although significantly associated, the emergency status of the surgery had a weaker effect. Hospitals' odds ratios for hypotension varied from 0.12 to 2.50 (p ≤0.001) with respect to the mean prevalence of 3.1%, even after adjusting for patient and anaesthesia factors and for type of surgery. At least one postoperative incident occurred in 9.7% of the interventions, including 0.03% deaths. Intra-operative hypotension was associated with a higher risk of postoperative incidents and death. Conclusions: Wide variations in the occurrence of hypotension amongst hospitals remain after adjustment for risk factors. Although differential reporting from hospitals may exist, variations in anaesthesia techniques and blood pressure maintenance could also have contributed. Intra-operative hypotension is associated with morbidity and sometimes death, and constant vigilance must therefore be advocated.
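A hedged sketch of a multi-level (random-intercept) logistic model of this kind, using statsmodels' variational Bayes mixed GLM as one possible fitting approach; the column names are illustrative and are not the databank's actual variables.

```python
# Hypothetical multi-level logistic regression: hypotension as the outcome,
# patient/anaesthesia factors as fixed effects, random intercept per hospital.
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

def fit_hypotension_model(df: pd.DataFrame):
    model = BinomialBayesMixedGLM.from_formula(
        "hypotension ~ age + C(asa_status) + surgery_duration + emergency",
        {"hospital": "0 + C(hospital)"},   # random intercept for each hospital
        data=df,
    )
    result = model.fit_vb()                # variational Bayes fit
    print(result.summary())
    return result
```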
Abstract:
BACKGROUND: Serosorting is practiced by men who have sex with men (MSM) to reduce human immunodeficiency virus (HIV) transmission. This study evaluates the prevalence of serosorting with casual partners, and analyses the characteristics and estimated numbers of serosorters in Switzerland in 2007-2009. METHODS: Data were extracted from cross-sectional surveys conducted in 2007 and 2009 among self-selected MSM recruited online, through gay newspapers, and through gay organizations. Nested models were fitted to ascertain the appropriateness of pooling the datasets. Multiple logistic regression analysis was performed on pooled data to determine the association between serosorting and demographic, lifestyle-related, and health-related factors. Extrapolations were performed by applying proportions of various types of serosorters to Swiss population data collected in 2007. RESULTS: A significant and stable proportion of MSM (approximately 39% in both 2007 and 2009) intentionally engaged in serosorting with casual partners in Switzerland. Variables significantly associated with serosorting were: gay organization membership (aOR = 1.67), frequent internet use for sexual encounters (aOR = 1.71), having had a sexually transmitted infection (STI) at any time in the past 12 months (aOR = 1.70), HIV-positive status (aOR = 0.52), regularly frequenting sex-on-premises venues (aOR = 0.42), and unprotected anal intercourse (UAI) with partners of different or unknown HIV status in the past 12 months (aOR = 0.22). Approximately one-fifth of serosorters declared HIV negativity without having been tested in the past 12 months; 15.8% reported not knowing their own HIV status. CONCLUSION: The particular risk profile of serosorters having UAI with casual partners (multiple partners, STI history, and inadequate testing frequency) requires specific preventive interventions tailored to HIV status.
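The nested-model pooling check mentioned above can be illustrated with a likelihood-ratio test comparing a pooled logistic model against one with survey-year interactions; the variable names below are hypothetical.

```python
# Illustrative likelihood-ratio test for pooling two survey waves.
import scipy.stats as st
import statsmodels.formula.api as smf

def pooling_lr_test(df):
    pooled = smf.logit(
        "serosorting ~ age + hiv_positive + sti_12m", data=df
    ).fit(disp=0)
    by_year = smf.logit(
        "serosorting ~ (age + hiv_positive + sti_12m) * C(survey_year)", data=df
    ).fit(disp=0)
    lr = 2 * (by_year.llf - pooled.llf)          # likelihood-ratio statistic
    df_diff = by_year.df_model - pooled.df_model  # extra parameters in the rich model
    p_value = st.chi2.sf(lr, df_diff)
    return lr, p_value   # a large p-value supports pooling the two surveys
```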
Abstract:
INTRODUCTION: HIV-infected pregnant women are very likely to engage in HIV medical care to prevent transmission of HIV to their newborn. After delivery, however, childcare and competing commitments might lead to disengagement from HIV care. The aim of this study was to quantify loss to follow-up (LTFU) from HIV care after delivery and to identify risk factors for LTFU. METHODS: We used data on 719 pregnancies within the Swiss HIV Cohort Study from 1996 to 2012 with information on follow-up visits available. Two LTFU events were defined: no clinical visit for >180 days and no visit for >360 days in the year after delivery. Logistic regression analysis was used to identify risk factors for an LTFU event after delivery. RESULTS: Median maternal age at delivery was 32 years (IQR 28-36); 357 (49%) women were black, 280 (39%) white, 56 (8%) Asian and 4% of other ethnicities. One hundred and seven (15%) women reported a history of intravenous drug use (IDU). The majority (524, 73%) of women received their HIV diagnosis before pregnancy; most of those (413, 79%) had lived with diagnosed HIV for longer than three years and two-thirds (342, 65%) were already on antiretroviral therapy (ART) at the time of conception. Of the 181 women diagnosed during pregnancy by a screening test, 80 (44%) were diagnosed in the first trimester, 67 (37%) in the second and 34 (19%) in the third trimester. Of 357 (69%) women who had been seen in HIV medical care during the three months before conception, 93% achieved an undetectable HIV viral load (VL) at delivery. Of 62 (12%) women whose last medical visit was more than six months before conception, only 72% achieved an undetectable VL (p=0.001). Overall, 247 (34%) women were LTFU over 180 days in the year after delivery and 86 (12%) women were LTFU over 360 days, with 43 (50%) of those women later returning to care. Being LTFU for 180 days was significantly associated with a history of IDU (aOR 1.73, 95% CI 1.09-2.77, p=0.021) and with not achieving an undetectable VL at delivery (aOR 1.79, 95% CI 1.03-3.11, p=0.040), after adjusting for maternal age, ethnicity, time of HIV diagnosis and being on ART at conception. CONCLUSIONS: Women with a history of IDU and women with a detectable VL at delivery were more likely to be LTFU after delivery. This is of concern for their own health, as well as for the risk to sexual partners and subsequent pregnancies. Further strategies should be developed to enhance retention in medical care beyond pregnancy.
Abstract:
INTRODUCTION: Common variation in the CHRNA5-CHRNA3-CHRNB4 gene region is robustly associated with smoking quantity. Conversely, the association of one of the most significant single nucleotide polymorphisms (SNPs; rs1051730 within the CHRNA3 gene) with perceived difficulty or willingness to quit smoking among current smokers is unknown. METHODS: Cross-sectional study including current smokers, 502 women and 552 men. Heaviness of smoking index (HSI), difficulty quitting, quit attempts, and intention to quit smoking were assessed by questionnaire. RESULTS: The rs1051730 SNP was associated with increased HSI (age-, gender-, and education-adjusted mean ± SE: 2.6 ± 0.1, 2.2 ± 0.1, and 2.0 ± 0.1 for the AA, AG, and GG genotypes, respectively, p < .01). Multivariate logistic regression adjusting for gender, age, education, leisure-time physical activity, and personal history of cardiovascular or lung disease showed rs1051730 to be associated with higher smoking dependence (odds ratio [OR] and 95% CI for each additional A-allele: 1.38 [1.11-1.72] for smoking more than 20 cigarette equivalents/day; 1.31 [1.00-1.71] for an HSI ≥5; and 1.32 [1.05-1.65] for smoking within 5 min after waking up) and borderline associated with difficulty quitting (OR = 1.29 [0.98-1.70]), but this relationship was no longer significant after adjusting for nicotine dependence. No relationship was found with willingness (OR = 1.03 [0.85-1.26]), attempts (OR = 1.00 [0.83-1.20]), or preparation (OR = 0.95 [0.38-2.38]) to quit. Similar findings were obtained for other SNPs, but their effect on nicotine dependence was no longer significant after adjusting for rs1051730. CONCLUSIONS: These data confirm the effect of rs1051730 on nicotine dependence but fail to show any relationship with difficulty, willingness, or motivation to quit.