178 results for LOGISTIC REGRESSION WITH STATE-DEPENDENT SAMPLE SELECTION
Abstract:
QUESTION UNDER STUDY: To assess which high-risk acute coronary syndrome (ACS) patient characteristics played a role in prioritising access to the intensive care unit (ICU), and whether introducing clinical practice guidelines (CPG) explicitly stating ICU admission criteria altered this practice. PATIENTS AND METHODS: All consecutive patients with ACS admitted to our medical emergency centre over 3 months before and after CPG implementation were prospectively assessed. The impact of demographic and clinical characteristics (age, gender, cardiovascular risk factors, and clinical parameters upon admission) on ICU hospitalisation of high-risk patients (defined as retrosternal pain of prolonged duration with ECG changes and/or positive troponin blood level) was studied by logistic regression. RESULTS: Before and after CPG implementation, 328 and 364 patients, respectively, were assessed for suspicion of ACS. Before CPG implementation, 36 of the 81 high-risk patients (44.4%) were admitted to the ICU. After CPG implementation, 35 of the 90 high-risk patients (38.9%) were admitted to the ICU. Male patients were more frequently admitted to the ICU before CPG implementation (OR=7.45, 95% CI 2.10-26.44), but not after (OR=0.73, 95% CI 0.20-2.66). Age played a significant role in both periods (OR=1.57, 95% CI 1.24-1.99), with both young and advanced age significantly reducing ICU admission, although to a lesser extent after CPG implementation. CONCLUSION: Prioritisation of access to the ICU for high-risk ACS patients was age-dependent, but not focused on the cardiovascular risk factor profile. CPG implementation explicitly stating ICU admission criteria decreased discrimination against women, but other factors are likely to play a role in bed allocation.
Abstract:
OBJECTIVE: The present study aimed to measure the prevalence of adult attention deficit hyperactivity disorder (ADHD) in a large, representative sample of young Swiss men and to assess factors associated with this disorder. METHODS: Our sample consisted of 5656 Swiss men (mean age 20 years) who participated in the Cohort Study on Substance Use Risk Factors (C-SURF). ADHD was assessed with the World Health Organization (WHO) adult ADHD Self Report Screener (ASRS). Logistic regression analyses were conducted to assess the association between ADHD and several socio-demographic, clinical and familial factors. RESULTS: The prevalence of ADHD was 4.0%, and it was higher in older and French-speaking conscripts. A higher prevalence was also identified among men whose mothers had completed primary or high school/university and those with a family history of alcohol or psychiatric problems. Additionally, adults with ADHD demonstrated impairment in their professional life, as well as considerable mental health impairment. CONCLUSION: Our results demonstrate that ADHD is common among young Swiss men. The impairments in function and mental health we observed highlight the need for further support and interventions to reduce the burden on affected individuals. Interventions that incorporate the whole family also seem crucial.
Abstract:
BACKGROUND: In vitro aggregating brain cell cultures containing all types of brain cells have been shown to be useful for neurotoxicological investigations. The cultures are used for the detection of nervous system-specific effects of compounds by measuring multiple endpoints, including changes in enzyme activities. Concentration-dependent neurotoxicity is determined at several time points. METHODS: A Markov model was set up to describe the dynamics of brain cell populations exposed to potentially neurotoxic compounds. Brain cells were assumed to be either in a healthy or a stressed state, with only stressed cells being susceptible to cell death. Cells could switch between these states or die, with concentration-dependent transition rates. Since cell numbers were not directly measurable, intracellular lactate dehydrogenase (LDH) activity was used as a surrogate. Assuming that changes in cell numbers are proportional to changes in intracellular LDH activity, stochastic enzyme activity models were derived. Maximum likelihood and least squares regression techniques were applied to estimate the transition rates. Likelihood ratio tests were performed to test hypotheses about the transition rates. Simulation studies were used to investigate the performance of the transition rate estimators and to analyze the error rates of the likelihood ratio tests. The stochastic time-concentration activity model was applied to intracellular LDH activity measurements after 7 and 14 days of continuous exposure to propofol. The model describes transitions from healthy to stressed cells and from stressed cells to death. RESULTS: The model predicted that propofol would affect stressed cells more than healthy cells. Increasing the propofol concentration from 10 to 100 μM reduced the mean waiting time for transition to the stressed state by 50%, from 14 to 7 days, whereas the mean duration to cellular death decreased more dramatically, from 2.7 days to 6.5 hours. CONCLUSION: The proposed stochastic modeling approach can be used to discriminate between different biological hypotheses regarding the effect of a compound on the transition rates. The effects of different compounds on the transition rate estimates can be quantitatively compared. Data can be extrapolated at late measurement time points to investigate whether costly and time-consuming long-term experiments could possibly be eliminated.
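To make the three-state structure concrete, here is a minimal numerical sketch in Python. It assumes a continuous-time Markov chain with constant rates lam (healthy to stressed) and mu (stressed to death), treats relative intracellular LDH activity as proportional to the number of living cells, and uses a Gaussian error model with invented measurements; it illustrates the general approach, not the authors' actual likelihood or data.

```python
import numpy as np
from scipy.optimize import minimize

# Three-state chain: healthy -> stressed -> dead, with constant rates
# lam (healthy -> stressed) and mu (stressed -> death), in 1/days.

def living_fraction(t, lam, mu):
    """Expected fraction of cells still alive (healthy or stressed) at time t."""
    healthy = np.exp(-lam * t)
    if abs(lam - mu) < 1e-12:
        stressed = lam * t * np.exp(-lam * t)
    else:
        stressed = lam / (mu - lam) * (np.exp(-lam * t) - np.exp(-mu * t))
    return healthy + stressed

def neg_log_lik(log_params, t_obs, ldh_rel):
    """Gaussian error on relative LDH activity (assumed error model)."""
    lam, mu, sigma = np.exp(log_params)      # log scale keeps parameters positive
    pred = living_fraction(t_obs, lam, mu)
    return 0.5 * np.sum(((ldh_rel - pred) / sigma) ** 2 + 2 * np.log(sigma))

# Hypothetical relative LDH measurements after 7 and 14 days of exposure.
t_obs = np.array([7.0, 7.0, 14.0, 14.0])
ldh_rel = np.array([0.85, 0.80, 0.60, 0.55])

fit = minimize(neg_log_lik, x0=np.log([0.1, 0.1, 0.05]), args=(t_obs, ldh_rel))
lam_hat, mu_hat, _ = np.exp(fit.x)
print(f"mean waiting time to the stressed state: {1 / lam_hat:.1f} days")
print(f"mean duration from stressed state to death: {1 / mu_hat:.1f} days")
```

Under this parameterisation the mean waiting times quoted in the results correspond to the reciprocals of the fitted rates, and a likelihood ratio test between nested rate models can be built by comparing the minimised negative log-likelihoods.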
Abstract:
BACKGROUND: Prediction of clinical course and outcome after severe traumatic brain injury (TBI) is important. OBJECTIVE: To examine whether clinical scales (Glasgow Coma Scale [GCS], Injury Severity Score [ISS], and Acute Physiology and Chronic Health Evaluation II [APACHE II]) or radiographic scales based on admission computed tomography (Marshall and Rotterdam) were associated with intensive care unit (ICU) physiology (intracranial pressure [ICP], brain tissue oxygen tension [PbtO2]) and clinical outcome after severe TBI. METHODS: One hundred one patients (median age, 41.0 years; interquartile range [26-55]) with severe TBI who had ICP and PbtO2 monitoring were identified. The relationship between admission GCS, ISS, APACHE II, Marshall and Rotterdam scores and ICP, PbtO2, and outcome was examined by using mixed-effects models and logistic regression. RESULTS: Median (25%-75% interquartile range) admission GCS and APACHE II without GCS scores were 3.0 (3-7) and 11.0 (8-13), respectively. Marshall and Rotterdam scores were 3.0 (3-5) and 4.0 (4-5). Mean ICP and PbtO2 during the patients' ICU course were 15.5 ± 10.7 mm Hg and 29.9 ± 10.8 mm Hg, respectively. Three-month mortality was 37.6%. Admission GCS was not associated with mortality. APACHE II (P = .003), APACHE-non-GCS (P = .004), Marshall (P < .001), and Rotterdam scores (P < .001) were associated with mortality. No relationship between GCS, ISS, Marshall, or Rotterdam scores and subsequent ICP or PbtO2 was observed. The APACHE II score was inversely associated with median PbtO2 (P = .03) and minimum PbtO2 (P = .008) and had a stronger correlation with the amount of time that PbtO2 was reduced. CONCLUSION: Following severe TBI, factors associated with outcome may not always predict a patient's ICU course and, in particular, intracranial physiology.
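As a sketch of the mixed-effects portion of such an analysis, the code below regresses repeated PbtO2 measurements on an admission score with a random intercept per patient. The data, column names and effect sizes are simulated assumptions for illustration; the study's actual models and covariates are not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_patients, n_obs = 30, 10                      # hypothetical cohort size
patient = np.repeat(np.arange(n_patients), n_obs)
apache = np.repeat(rng.integers(5, 25, size=n_patients), n_obs)

# Simulated PbtO2 (mm Hg): patient-level random intercept plus a weak
# negative association with the admission score, plus measurement noise.
pbto2 = (35 - 0.3 * apache
         + np.repeat(rng.normal(0, 4, n_patients), n_obs)
         + rng.normal(0, 3, size=n_patients * n_obs))

df = pd.DataFrame({"patient": patient, "apache": apache, "pbto2": pbto2})

# Random-intercept model: each patient contributes several PbtO2 values.
fit = smf.mixedlm("pbto2 ~ apache", data=df, groups=df["patient"]).fit()
print(fit.summary())
```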
Abstract:
The diagnosis of inflammatory bowel disease (IBD), comprising Crohn's disease (CD) and ulcerative colitis (UC), continues to present difficulties due to unspecific symptoms and limited test accuracies. We aimed to determine the diagnostic delay (time from first symptoms to IBD diagnosis) and to identify associated risk factors. A total of 1591 IBD patients (932 CD, 625 UC, 34 indeterminate colitis) from the Swiss IBD cohort study (SIBDCS) were evaluated. The SIBDCS collects data on a large sample of IBD patients from hospitals and private practice across Switzerland through physician and patient questionnaires. The primary outcome measure was diagnostic delay. Diagnostic delay in CD patients was significantly longer compared to UC patients (median 9 versus 4 months, P < 0.001). Seventy-five percent of CD patients were diagnosed within 24 months compared to 12 months for UC and 6 months for IC patients. Multivariate logistic regression identified age <40 years at diagnosis (odds ratio [OR] 2.15, P = 0.010) and ileal disease (OR 1.69, P = 0.025) as independent risk factors for long diagnostic delay in CD (>24 months). In UC patients, nonsteroidal antiinflammatory drug (NSAID) intake (OR 1.75, P = 0.093) and male gender (OR 0.59, P = 0.079) showed a trend toward association with long diagnostic delay (>12 months). Whereas the median delay for diagnosing CD, UC, and IC seems to be acceptable, there exists a long delay in a considerable proportion of CD patients. More public awareness work needs to be done in order to reduce patient and doctor delays in this target population.
Abstract:
PURPOSE: To examine the associations between substance use and other health-risk behaviors and quality of life (QOL) among young men. METHODS: The analytical sample consisted of 5,306 young Swiss men who participated in the Cohort Study on Substance Use Risk Factors. Associations between seven distinct self-reported health-risk behaviors (risky single-occasion drinking; volume drinking; cigarette smoking; cannabis use; use of any other illicit drugs; sexual intercourse without a condom; low physical activity) were assessed via chi-square analysis. Logistic regression analyses were conducted to study the associations between each particular health-risk behavior and either physical or mental QOL (assessed with the SF-12v2) while adjusting for socio-demographic variables and the presence of all other health-risk behaviors. RESULTS: Most health-risk behaviors co-occurred. However, low physical activity was either unrelated or negatively related to other health-risk behaviors. Almost all health-risk behaviors were associated with a greater likelihood of compromised QOL. However, sexual intercourse without a condom (not associated with either physical or mental QOL) and frequent risky single-occasion drinking (not related to mental QOL after adjusting for the presence of other health-risk behaviors; positively associated with physical QOL) differed from this pattern. CONCLUSIONS: Health-risk behaviors are mostly associated with compromised QOL. However, sexual intercourse without a condom and frequent risky single-occasion drinking differ from this pattern and may therefore be particularly difficult to change relative to other health-risk behaviors.
Abstract:
This prospective study applies an extended Information-Motivation-Behavioural Skills (IMB) model to establish predictors of HIV-protection behaviour among HIV-positive men who have sex with men (MSM) during sex with casual partners. Data were collected from anonymous, self-administered questionnaires and analysed using descriptive and backward elimination regression analyses. In a sample of 165 HIV-positive MSM, 82 participants between the ages of 23 and 78 (M=46.4, SD=9.0) had sex with casual partners during the three-month period under investigation. About 62% (n=51) had always used a condom when having sex with casual partners. From the original IMB model, only subjective norm predicted condom use. More important predictors that increased condom use were low consumption of psychotropics, high satisfaction with sexuality, numerous changes in sexual behaviour after diagnosis, low social support from friends, alcohol use before sex and habitualised condom use with casual partner(s). The explanatory power of the calculated regression model was 49% (p<0.001). The study reveals the importance of personal and social resources and of routines for condom use, and provides information for the research-based conceptualisation of prevention services addressing especially people living with HIV ("positive prevention").
Abstract:
Aim: The diagnosis of inflammatory bowel disease (IBD), comprising Crohn's disease (CD) and ulcerative colitis (UC), continues to present difficulties due to unspecific symptoms and limited test accuracies. We aimed to determine the diagnostic delay (time from first symptoms to IBD diagnosis) and to identify associated risk factors in a national cohort in Switzerland. Materials and Methods: A total of 1,591 IBD patients (932 CD, 625 UC, 34 indeterminate colitis) from the Swiss IBD cohort study (SIBDCS) were evaluated. The SIBDCS collects data on a large sample of IBD patients from hospitals and private practice across Switzerland through physician and patient questionnaires. The primary outcome measure was the diagnostic delay. Results: Diagnostic delay in CD patients was significantly longer compared to UC patients (median 9 vs. 4 months, P < 0.001). Seventy-five percent of CD patients were diagnosed within 24 months compared to 12 months for UC and 6 months for IC patients. Multivariate logistic regression identified age <40 years at diagnosis (OR 2.15, P = 0.010) and ileal disease (OR 1.69, P = 0.025) as independent risk factors for long diagnostic delay in CD (>24 months). A trend for long diagnostic delay (>12 months) was associated with NSAID intake (OR 1.75, P = 0.093) and male gender (OR 0.59, P = 0.079) in UC patients. Conclusions: Whereas the median delay for diagnosing CD, UC, and IC seems to be acceptable, there exists a long delay in a considerable proportion of CD patients. More public awareness work needs to be done in order to reduce patient's and doctor's delay in this target population.
Abstract:
BACKGROUND: Surgical recurrence rates among patients with Crohn's disease with ileocolic resection (ICR) remain high, and factors predicting surgical recurrence remain controversial. We aimed to identify risk and protective factors for repetitive ICRs among patients with Crohn's disease in a large cohort of patients. METHODS: Data on 305 patients after first ICR were retrieved from our cross-sectional and prospective database (median follow-up: 15 yr [0-52 yr]). Data were compared between patients with 1 (ICR = 1, n = 225) or more than 1 (ICR >1, n = 80) resection. Clinical phenotypes were classified according to the Montreal Classification. Gender, family history of inflammatory bowel disease, smoking status, type of surgery, immunomodulator therapy, and biological therapy before, parallel to, and after the first ICR were analyzed. RESULTS: The mean duration from diagnosis until first ICR did not differ significantly between the groups, being 5.93 ± 7.65 years in the ICR = 1 group and 5.36 ± 6.35 years in the ICR >1 group (P = 0.05). Mean time to second ICR was 6.7 ± 5.74 years. In the multivariate logistic regression analysis, ileal disease location (odds ratio [OR], 2.42; 95% confidence interval [CI], 1.02-5.78; P = 0.05) was a significant risk factor. A therapy with immunomodulators at the time of or within 1 year after the first ICR (OR, 0.23; 95% CI, 0.09-0.63; P < 0.01) was a protective factor. Neither smoking (OR, 1.16; 95% CI, 0.66-2.06), gender (male OR, 0.85; 95% CI, 0.51-1.42), nor family history (OR, 1.68; 95% CI, 0.84-3.36) had a significant impact on surgical recurrence. CONCLUSIONS: Immunomodulators have a protective impact regarding surgical recurrence after ICR. In contrast, ileal disease location constitutes a significant risk factor for a second ICR.
Abstract:
Human genetic variation contributes to differences in susceptibility to HIV-1 infection. To search for novel host resistance factors, we performed a genome-wide association study (GWAS) in hemophilia patients highly exposed to potentially contaminated factor VIII infusions. Individuals with hemophilia A and a documented history of factor VIII infusions before the introduction of viral inactivation procedures (1979-1984) were recruited from 36 hemophilia treatment centers (HTCs), and their genome-wide genetic variants were compared with those from matched HIV-infected individuals. Homozygous carriers of known CCR5 resistance mutations were excluded. Single nucleotide polymorphisms (SNPs) and inferred copy number variants (CNVs) were tested using logistic regression. In addition, we performed a pathway enrichment analysis, a heritability analysis, and a search for epistatic interactions with CCR5 Δ32 heterozygosity. A total of 560 HIV-uninfected cases were recruited: 36 (6.4%) were homozygous for CCR5 Δ32 or m303. After quality control and SNP imputation, we tested 1 081 435 SNPs and 3686 CNVs for association with HIV-1 serostatus in 431 cases and 765 HIV-infected controls. No SNP or CNV reached genome-wide significance. The additional analyses did not reveal any strong genetic effect. Highly exposed, yet uninfected hemophiliacs form an ideal study group to investigate host resistance factors. Using a genome-wide approach, we did not detect any significant associations between SNPs and HIV-1 susceptibility, indicating that common genetic variants of major effect are unlikely to explain the observed resistance phenotype in this population.
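The per-variant testing described here amounts to a logistic regression of case/control status on genotype for each SNP, with genome-wide significance conventionally declared at p < 5e-8. The sketch below uses simulated additive genotype dosages and is not the authors' pipeline; real GWAS analyses rely on dedicated software and adjust for covariates such as ancestry.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_cases, n_controls, n_snps = 431, 765, 1000        # analysed sample sizes from the abstract
y = np.r_[np.ones(n_cases), np.zeros(n_controls)]   # 1 = uninfected case, 0 = infected control
genotypes = rng.binomial(2, 0.3, size=(n_cases + n_controls, n_snps))  # simulated 0/1/2 dosages

p_values = np.empty(n_snps)
for j in range(n_snps):
    X = sm.add_constant(genotypes[:, j].astype(float))
    p_values[j] = sm.Logit(y, X).fit(disp=0).pvalues[1]   # p-value of the genotype term

# Genome-wide significance threshold.
hits = np.flatnonzero(p_values < 5e-8)
print(f"{hits.size} SNPs reach genome-wide significance (expected: none under the null)")
```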
Abstract:
Introduction: There is little information regarding compliance with dietary recommendations in Switzerland. Objectives: To assess the trends in compliance with dietary recommendations in the Geneva population for the period 1999-2009. Methods: Ten cross-sectional, population-based surveys (Bus Santé study). Dietary intake was assessed using a self-administered, validated semi-quantitative Food Frequency Questionnaire. Compliance with the Swiss Society for Nutrition recommendations for nutrient intake was assessed. In all, 9320 participants aged 35 to 75 years (50% women) were included. Trends were assessed by logistic regression adjusting for age, smoking status, education and nationality, using survey year as the independent variable. Results: After excluding participants with extreme intakes, the percentage of participants with a cholesterol consumption < 300 mg/day increased from 40.8% in 1999 to 43.6% in 2009 for men (multivariate-adjusted p for trend = 0.04) and from 57.8% to 61.4% in women (multivariate-adjusted p for trend = 0.06). Calcium intake > 1 g/day decreased from 53.3% to 46.0% in men and from 47.6% to 40.7% in women (multivariate-adjusted p for trend < 0.001). Adequate iron intake decreased from 68.3% to 65.3% in men and from 13.3% to 8.4% in women (multivariate-adjusted p for trend < 0.001). Conversely, no significant changes were observed for carbohydrates, protein, total fat (including saturated, monounsaturated and polyunsaturated fatty acids), fibre, vitamins D and A. Conclusion: Few improvements were noted in adherence to dietary recommendations in the Geneva population between 1999 and 2009. The low and decreasing prevalence of adequate calcium and iron intake are of concern.
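The trend analysis described (logistic regression of compliance on survey year, adjusted for age, smoking status, education and nationality) can be sketched as below. The data frame, variable names and effect sizes are invented for illustration; the p-value on the survey-year coefficient plays the role of the multivariate-adjusted p for trend.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "survey_year": rng.integers(1999, 2010, size=n),
    "age":         rng.integers(35, 76, size=n),
    "smoker":      rng.integers(0, 2, size=n),
    "education":   rng.choice(["low", "mid", "high"], size=n),
    "swiss":       rng.integers(0, 2, size=n),
})
# Simulated outcome: probability of meeting a recommendation drifts slightly with year.
lin_pred = -0.5 + 0.03 * (df["survey_year"] - 1999) - 0.2 * df["smoker"]
df["compliant"] = rng.binomial(1, 1 / (1 + np.exp(-lin_pred)))

model = smf.logit("compliant ~ survey_year + age + smoker + C(education) + swiss",
                  data=df).fit(disp=0)
# The survey_year coefficient and its p-value give the adjusted test for trend.
print(model.params["survey_year"], model.pvalues["survey_year"])
```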
Abstract:
To evaluate the efficacy of anti-J5 serum in the treatment of severe infectious purpura, 73 children were randomized to receive either anti-J5 (40) or control (33) plasma. Age, blood pressure, and biologic risk factors were similar in both groups. At admission, however, tumor necrosis factor serum concentrations were 974 +/- 173 pg/ml compared with 473 +/- 85 pg/ml (P = .023) and interleukin-6 serum concentrations were 129 +/- 45 compared with 19 +/- 5 ng/ml (P = .005) in the control and treated groups, respectively. The duration of shock and the occurrence of complications were similar in both groups. The mortality rate was 36% in the control group and 25% in the treated group (P = .317; odds ratio, 0.76; 95% confidence interval, 0.46-1.26). This trend disappeared after correction for imbalances in risk factors at randomization using a logistic regression model. These results suggest that anti-J5 plasma did not affect the course or mortality of severe infectious purpura in children.
Abstract:
BACKGROUND: There is uncertain evidence of effectiveness of 5-aminosalicylates (5-ASA) to induce and maintain response and remission of active Crohn's disease (CD), and weak evidence to support their use in post-operative CD. AIM: To assess the frequency and determinants of 5-ASA use in CD patients and to evaluate the physicians' perception of clinical response and side effects to 5-ASA. METHODS: Data from the Swiss Inflammatory Bowel Disease Cohort, which has collected data on a large sample of IBD patients since 2006, were analysed. Information from questionnaires regarding utilisation of treatments and perception of response to 5-ASA was evaluated. Logistic regression modelling was performed to identify factors associated with 5-ASA use. RESULTS: Of 1420 CD patients, 835 (59%) were ever treated with 5-ASA from diagnosis to latest follow-up. Disease duration >10 years and colonic location were both significantly associated with 5-ASA use. 5-ASA treatment was judged to be successful in 46% (378/825) of treatment episodes (physician global assessment). In 12% (98/825) of treatment episodes, 5-ASA was stopped because of side effects. CONCLUSIONS: 5-Aminosalicylates were frequently prescribed in patients with Crohn's disease in the Swiss IBD cohort. This observation stands in contrast to the scientific evidence demonstrating a very limited role of 5-ASA compounds in the treatment of Crohn's disease.
Abstract:
BACKGROUND: Chest pain raises concern for the possibility of coronary heart disease. Scoring methods have been developed to identify coronary heart disease in emergency settings, but not in primary care. METHODS: Data were collected from a multicenter Swiss clinical cohort study including 672 consecutive patients with chest pain who had visited one of 59 family practitioners' offices. Using a delayed-diagnosis reference standard, we derived a prediction rule to rule out coronary heart disease by means of a logistic regression model. Known cardiovascular risk factors, pain characteristics, and physical signs associated with coronary heart disease were explored to develop a clinical score. Patients diagnosed with angina or acute myocardial infarction within the year following their initial visit comprised the coronary heart disease group. RESULTS: The coronary heart disease score was derived from eight variables: age, gender, duration of chest pain from 1 to 60 minutes, substernal chest pain location, pain increasing with exertion, absence of a tenderness point at palpation, cardiovascular risk factors, and personal history of cardiovascular disease. The area under the receiver operating characteristic curve was 0.95 (95% confidence interval 0.92-0.97). Using the 5th percentile of the score among coronary heart disease patients as the cut-off, 413 patients were classified as low risk. Internal validity was confirmed by bootstrapping. External validation using data from a German cohort (Marburg, n = 774) revealed an area under the receiver operating characteristic curve of 0.75 (95% confidence interval 0.72-0.81), with a sensitivity of 85.6% and a specificity of 47.2%. CONCLUSIONS: This score, based only on history and physical examination, is a complementary tool for ruling out coronary heart disease in primary care patients complaining of chest pain.
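Scores of this kind are typically built by fitting a logistic regression on the candidate predictors, using the linear predictor (or rounded coefficients) as the score, quantifying discrimination with the area under the ROC curve, and checking internal validity by bootstrapping. The sketch below illustrates that workflow on simulated data with the eight predictors named above; the data, coefficients and cut-offs are assumptions, not the published rule.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 672                                     # cohort size from the abstract
X = np.column_stack([
    rng.integers(30, 85, n),                # age
    rng.integers(0, 2, n),                  # male gender
    rng.integers(0, 2, n),                  # chest pain duration 1-60 min
    rng.integers(0, 2, n),                  # substernal location
    rng.integers(0, 2, n),                  # pain increases with exertion
    rng.integers(0, 2, n),                  # no tenderness point at palpation
    rng.integers(0, 4, n),                  # number of cardiovascular risk factors
    rng.integers(0, 2, n),                  # history of cardiovascular disease
])
lin_pred = -9 + 0.08 * X[:, 0] + 1.0 * X[:, 1] + X[:, 2:].sum(axis=1)
y = rng.binomial(1, 1 / (1 + np.exp(-lin_pred)))   # 1 = coronary heart disease

model = LogisticRegression(max_iter=1000).fit(X, y)
score = X @ model.coef_[0]                  # linear predictor used as a raw score
print("apparent AUC:", roc_auc_score(y, score))

# Simple bootstrap: refit on resampled data, score the original sample each time.
aucs = []
for _ in range(200):
    idx = rng.integers(0, n, n)
    boot = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    aucs.append(roc_auc_score(y, X @ boot.coef_[0]))
print("bootstrap mean AUC:", np.mean(aucs))
```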
Abstract:
BACKGROUND AND AIMS: The Senecio hybrid zone on Mt Etna, Sicily, is characterized by steep altitudinal clines in quantitative traits and genetic variation. Such clines are thought to be maintained by a combination of 'endogenous' selection arising from genetic incompatibilities and environment-dependent 'exogenous' selection leading to local adaptation. Here, the hypothesis was tested that local adaptation to the altitudinal temperature gradient contributes to maintaining divergence between the parental species, S. chrysanthemifolius and S. aethnensis. METHODS: Intra- and inter-population crosses were performed between five populations from across the hybrid zone and the germination and early seedling growth of the progeny were assessed. KEY RESULTS: Seedlings from higher-altitude populations germinated better under low temperatures (9-13 °C) than those from lower-altitude populations. Seedlings from higher-altitude populations had lower survival rates under warm conditions (25/15 °C) than those from lower-altitude populations, but also attained greater biomass. There was no altitudinal variation in growth or survival under cold conditions (15/5 °C). Population-level plasticity increased with altitude. Germination, growth and survival of natural hybrids and experimentally generated F1s generally exceeded those of the worse-performing parent. CONCLUSIONS: Limited evidence was found for endogenous selection against hybrids but relatively clear evidence was found for divergence in seed and seedling traits, which is probably adaptive. The combination of low-temperature germination and faster growth in warm conditions might enable high-altitude S. aethnensis to maximize its growth during a shorter growing season, while the slower growth of S. chrysanthemifolius may be an adaptation to drought stress at low altitudes. This study indicates that temperature gradients are likely to be an important environmental factor generating and maintaining adaptive divergence across the Senecio hybrid zone on Mt Etna.