945 results for multivariate regression tree
Abstract:
The genetic diversity of three temperate fruit tree phytoplasmas, ‘Candidatus Phytoplasma prunorum’, ‘Ca. P. mali’ and ‘Ca. P. pyri’, has been established by multilocus sequence analysis. Among the four genetic loci used, the genes imp and aceF distinguished 30 and 24 genotypes, respectively, and showed the highest variability. The percentage of substitution for imp ranged from 50 to 68% depending on the species. The percentage of substitution varied between 9 and 12% for aceF, whereas it was between 5 and 6% for pnp and secY. In the case of ‘Ca. P. prunorum’, the three most prevalent aceF genotypes were detected in both plants and insect vectors, confirming that the prevalent isolates are propagated by insects. The four isolates known to be hypo-virulent had the same aceF sequence, indicating a possible monophyletic origin. The haplotype network reconstructed by eBURST revealed that, among the 34 haplotypes of ‘Ca. P. prunorum’, the four hypo-virulent isolates also grouped together in the same clade. Genotyping of some Spanish and Azerbaijani ‘Ca. P. pyri’ isolates showed that they shared some alleles with ‘Ca. P. prunorum’, supporting, for the first time to our knowledge, the existence of inter-species recombination between these two species.
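To make the per-locus variability measure concrete, here is a minimal sketch of how a percentage of substitution between two aligned sequences can be computed. This is not the authors' MLSA pipeline; the sequences and the locus name are illustrative placeholders.

```python
# Minimal sketch: percentage of substitution between two aligned gene
# sequences, the per-locus variability measure used in MLSA comparisons.
# The fragments below are invented placeholders, not real phytoplasma data.

def substitution_percentage(seq_a: str, seq_b: str) -> float:
    """Percent of aligned positions that differ (gap positions ignored)."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned to equal length"
    compared = [(a, b) for a, b in zip(seq_a, seq_b) if a != "-" and b != "-"]
    diffs = sum(1 for a, b in compared if a != b)
    return 100.0 * diffs / len(compared)

# Illustrative aligned fragments of a hypothetical imp locus:
isolate_1 = "ATGGCTAAGCTA"
isolate_2 = "ATGACTTAGCTT"
print(f"imp substitution: {substitution_percentage(isolate_1, isolate_2):.1f}%")
```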
Abstract:
The diagnosis of inflammatory bowel disease (IBD), comprising Crohn's disease (CD) and ulcerative colitis (UC), continues to present difficulties due to unspecific symptoms and limited test accuracies. We aimed to determine the diagnostic delay (time from first symptoms to IBD diagnosis) and to identify associated risk factors. A total of 1591 IBD patients (932 CD, 625 UC, 34 indeterminate colitis) from the Swiss IBD cohort study (SIBDCS) were evaluated. The SIBDCS collects data on a large sample of IBD patients from hospitals and private practices across Switzerland through physician and patient questionnaires. The primary outcome measure was diagnostic delay. Diagnostic delay in CD patients was significantly longer compared to UC patients (median 9 versus 4 months, P < 0.001). Seventy-five percent of CD patients were diagnosed within 24 months, compared to 12 months for UC and 6 months for IC patients. Multivariate logistic regression identified age <40 years at diagnosis (odds ratio [OR] 2.15, P = 0.010) and ileal disease (OR 1.69, P = 0.025) as independent risk factors for long diagnostic delay in CD (>24 months). In UC patients, nonsteroidal anti-inflammatory drug (NSAID) intake (OR 1.75, P = 0.093) and male gender (OR 0.59, P = 0.079) were associated with long diagnostic delay (>12 months). Whereas the median delay for diagnosing CD, UC, and IC seems to be acceptable, there exists a long delay in a considerable proportion of CD patients. More public awareness work needs to be done in order to reduce patient and doctor delays in this target population.
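For readers unfamiliar with how such analyses yield odds ratios, here is a minimal sketch of a multivariate logistic regression in Python: the reported ORs are the exponentiated model coefficients. The data, variable names, and effect sizes below are synthetic illustrations, not the SIBDCS cohort.

```python
# Minimal sketch: multivariate logistic regression for long diagnostic delay.
# Synthetic illustrative data only -- not the SIBDCS cohort.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age_under_40": rng.integers(0, 2, n),   # 1 = diagnosed before age 40
    "ileal_disease": rng.integers(0, 2, n),  # 1 = ileal disease location
})
# Simulate the outcome so both factors raise the odds of delay > 24 months.
logit = -1.0 + 0.77 * df["age_under_40"] + 0.52 * df["ileal_disease"]
df["long_delay"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["age_under_40", "ileal_disease"]])
fit = sm.Logit(df["long_delay"], X).fit(disp=0)
# Odds ratios are the exponentiated coefficients, as reported in the abstract.
print(np.exp(fit.params))
```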
Abstract:
Objectives: To investigate the associations between falls before hospital admission, falls during hospitalization, and length of stay in elderly people admitted to post-acute geriatric rehabilitation. Method: History of falling in the 12 months before admission was recorded among 249 older persons (mean age 82.3±7.4 years, 69.1% women) consecutively admitted to post-acute rehabilitation. Data on medical, functional and cognitive status were collected upon admission. Falls during hospitalization and length of stay were recorded at discharge. Results: Overall, 92 (40.4%) patients reported no fall in the 12 months before admission; 63 (27.6%) reported 1 fall, and 73 (32.0%) reported multiple falls. Occurrence of previous falls (one or more) was significantly associated with in-stay falls (19.9% of previous fallers fell during the stay vs 7.6% of patients without a history of falling, P=.01), and with a longer length of stay (22.4 ± 10.1 days vs 27.1 ± 14.3 days, P=.01). In multivariate robust regression controlling for gender, age, and functional and cognitive status, history of falling remained significantly associated with a longer rehabilitation stay (2.8 days more in single fallers, P=.05, and 3.3 days more in multiple fallers, P=.01, compared to non-fallers). Conclusion: History of falling in the 12 months prior to post-acute geriatric rehabilitation is independently associated with a longer rehabilitation length of stay. Previous fallers also have an increased risk of falling during the rehabilitation stay. This suggests that hospital fall prevention measures should particularly target these high-risk patients.
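A minimal sketch of what such a covariate-adjusted robust regression can look like in Python, assuming a Huber M-estimator (the abstract does not specify the estimator); all data and coefficients below are synthetic.

```python
# Minimal sketch: robust regression of length of stay on fall history,
# adjusting for age and gender. Synthetic data, not the study's cohort.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 249
fall_history = rng.choice(["none", "single", "multiple"], size=n)
df = pd.DataFrame({
    "single_faller": (fall_history == "single").astype(int),
    "multiple_faller": (fall_history == "multiple").astype(int),
    "age": rng.normal(82, 7, n),
    "female": rng.integers(0, 2, n),
})
# Simulate length of stay with extra days for fallers, echoing the abstract.
los = (22 + 2.8 * df["single_faller"] + 3.3 * df["multiple_faller"]
       + 0.05 * (df["age"] - 82) + rng.normal(0, 10, n))

X = sm.add_constant(df)
fit = sm.RLM(los, X, M=sm.robust.norms.HuberT()).fit()
print(fit.params)  # faller coefficients = extra days of rehabilitation stay
```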
Abstract:
OBJECTIVE: Minimizing unwarranted prescription of antibiotics remains an important objective. Because of the heterogeneity between units regarding patient mix and other characteristics, site-specific targets for reduction must be identified. Here we present a model to address the issue by means of an observational cohort study. SETTING: A tertiary, multidisciplinary, neonatal, and pediatric intensive care unit of a university teaching hospital. PATIENTS: All newborns and children present in the unit (n = 456) between September 1998 and March 1999. Reasons for admission included postoperative care after cardiac surgery, major neonatal or pediatric surgery, severe trauma, and medical conditions requiring critical care. METHODS: Antibiotics given and the indications for their initiation were recorded daily. After discontinuation, each treatment episode was assessed for the presence or absence of infection. RESULTS: Of the 456 patients, 258 (56.6%) received systemic antibiotics, amounting to 1815 exposure days (54.6%) during 3322 hospitalization days. Of these, 512 (28%) were prescribed as prophylaxis and 1303 for suspected infection. Treatment for suspected ventilator-associated pneumonia accounted for 616 (47%) of 1303 treatment days and suspected sepsis for 255 days (20%). Patients were classified as having no infection or viral infection during 552 (40%) treatment days. The average weekly exposure rate in the unit varied considerably during the 29-week study period (range: 40-77/100 hospitalization days). Patient characteristics did not explain this variation. CONCLUSION: In this unit the largest reduction in antibiotic treatment would result from measures that help rule out suspected ventilator-associated pneumonia and from curtailing extended prophylaxis.
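The unit-level exposure figures above reduce to simple rate arithmetic; as a worked check of the reported 54.6%, using only numbers from the abstract:

```python
# Worked check of the exposure-rate arithmetic reported above:
# antibiotic exposure days per 100 hospitalization days.
exposure_days = 1815
hospitalization_days = 3322
rate = 100 * exposure_days / hospitalization_days
print(f"{rate:.1f} exposure days / 100 hospitalization days")  # ~54.6
```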
Abstract:
Aim: The diagnosis of inflammatory bowel disease (IBD), comprising Crohn's disease (CD) and ulcerative colitis (UC), continues to present difficulties due to unspecific symptoms and limited test accuracies. We aimed to determine the diagnostic delay (time from first symptoms to IBD diagnosis) and to identify associated risk factors in a national cohort in Switzerland. Materials and Methods: A total of 1,591 IBD patients (932 CD, 625 UC, 34 indeterminate colitis) from the Swiss IBD cohort study (SIBDCS) were evaluated. The SIBDCS collects data on a large sample of IBD patients from hospitals and private practices across Switzerland through physician and patient questionnaires. The primary outcome measure was the diagnostic delay. Results: Diagnostic delay in CD patients was significantly longer compared to UC patients (median 9 vs. 4 months, P < 0.001). Seventy-five percent of CD patients were diagnosed within 24 months, compared to 12 months for UC and 6 months for IC patients. Multivariate logistic regression identified age <40 years at diagnosis (OR 2.15, P = 0.010) and ileal disease (OR 1.69, P = 0.025) as independent risk factors for long diagnostic delay in CD (>24 months). A trend for long diagnostic delay (>12 months) was associated with NSAID intake (OR 1.75, P = 0.093) and male gender (OR 0.59, P = 0.079) in UC patients. Conclusions: Whereas the median delay for diagnosing CD, UC, and IC seems to be acceptable, there exists a long delay in a considerable proportion of CD patients. More public awareness work needs to be done in order to reduce patients' and doctors' delays in this target population.
Abstract:
Predictive species distribution modelling (SDM) has become an essential tool in biodiversity conservation and management. The choice of grain size (resolution) of the environmental layers used in modelling is one important factor that may affect predictions. We applied 10 distinct modelling techniques to presence-only data for 50 species in five different regions to test whether: (1) a 10-fold coarsening of resolution affects the predictive performance of SDMs, and (2) any observed effects depend on the type of region, modelling technique, or species considered. Results show that a 10-fold change in grain size does not severely affect predictions from species distribution models. The overall trend is towards degradation of model performance, but improvement can also be observed. Changing grain size does not equally affect models across regions, techniques, and species types. The strongest effect is on regions and species types, with tree species in the data sets (regions) with the highest locational accuracy being most affected. Changing grain size had little influence on the ranking of techniques: boosted regression trees remain best at both resolutions. The number of occurrences used for model training had an important effect, with larger sample sizes resulting in better models, which tended to be more sensitive to grain. The effect of grain change was only noticeable for models reaching sufficient performance and/or with initial data whose intrinsic error is smaller than the coarser grain size.
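A minimal sketch of the kind of comparison described, assuming scikit-learn's gradient-boosted trees as a stand-in for the boosted regression trees used in the study; the environmental layers, the coarsening scheme, and the species data below are all synthetic.

```python
# Minimal sketch: fitting a boosted-tree SDM at two grain sizes and
# comparing predictive performance by AUC. Synthetic data only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
fine_env = rng.normal(size=(n, 4))            # fine-grain environmental layers
presence = (fine_env[:, 0] + 0.5 * fine_env[:, 1] + rng.normal(0, 1, n)) > 0

# Crude stand-in for a 10-fold coarsening: average blocks of 10 sites so
# each site sees its neighbourhood mean rather than its local value.
coarse_env = fine_env.reshape(n // 10, 10, 4).mean(axis=1).repeat(10, axis=0)

for name, env in [("fine", fine_env), ("coarse", coarse_env)]:
    X_tr, X_te, y_tr, y_te = train_test_split(env, presence, random_state=0)
    model = GradientBoostingClassifier().fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name} grain AUC: {auc:.3f}")
```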
Abstract:
Lung transplantation has evolved from an experimental procedure to a viable therapeutic option in many countries. In Switzerland, the first lung transplant was performed in November 1992, more than ten years after the first successful procedure worldwide. Subsequently, a prospective national lung transplant registry was established, principally to enable quality control. The data of all patients transplanted in the two Swiss lung transplant centres, Zurich University Hospital and Centre de Romandie (Geneva-Lausanne), were analysed. In 10 years, 242 lung transplants were performed. Underlying lung diseases were cystic fibrosis including bronchiectasis (32%), emphysema (32%), parenchymal disorders (19%), pulmonary hypertension (11%) and lymphangioleiomyomatosis (3%). There were only 3% redo procedures. The 1-, 5- and 9-year survival rates were 77% (95% CI 72-82), 64% (95% CI 57-71) and 56% (95% CI 45-67), respectively. The 5-year survival rate of patients transplanted since 1998 was 72% (95% CI 64-80). Multivariate Cox regression analysis revealed that survival was significantly better in this group compared to those transplanted before 1998 (HR 0.44, 95% CI 0.26-0.75). Patients aged 60 years and older (HR 5.67, 95% CI 2.50-12.89) and those with pulmonary hypertension (HR 2.01, 95% CI 1.10-3.65) had a significantly worse prognosis. The most frequent causes of death were infections (29%), bronchiolitis obliterans syndrome (25%) and multiple organ failure (14%). The 10-year Swiss experience of lung transplantation compares favourably with the international data. The best results are obtained in cystic fibrosis, pulmonary emphysema and parenchymal disorders.
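A minimal sketch of a multivariate Cox regression of this shape, using the lifelines library; the covariate names mirror the abstract, but every data point is simulated, so the printed hazard ratios are illustrative only.

```python
# Minimal sketch: multivariate Cox regression for post-transplant survival.
# Synthetic data, not the Swiss registry.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 242
df = pd.DataFrame({
    "era_post_1998": rng.integers(0, 2, n),   # transplanted 1998 or later
    "age_60_plus": rng.integers(0, 2, n),
    "pulm_hypertension": rng.integers(0, 2, n),
})
# Simulate survival times whose hazard reflects the reported directions.
hazard = 0.1 * np.exp(-0.8 * df["era_post_1998"] + 1.7 * df["age_60_plus"]
                      + 0.7 * df["pulm_hypertension"])
df["years"] = rng.exponential(1 / hazard)
df["died"] = (df["years"] < 9).astype(int)    # administrative censoring at 9 y
df.loc[df["years"] > 9, "years"] = 9.0

cph = CoxPHFitter().fit(df, duration_col="years", event_col="died")
print(cph.summary[["exp(coef)"]])             # hazard ratios per covariate
```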
Abstract:
BACKGROUND: Surgical recurrence rates among patients with Crohn's disease after ileocolic resection (ICR) remain high, and the factors predicting surgical recurrence remain controversial. We aimed to identify risk and protective factors for repeated ICRs in a large cohort of patients with Crohn's disease. METHODS: Data on 305 patients after a first ICR were retrieved from our cross-sectional and prospective database (median follow-up: 15 yr [0-52 yr]). Data were compared between patients with 1 (ICR = 1, n = 225) or more than 1 (ICR >1, n = 80) resection. Clinical phenotypes were classified according to the Montreal Classification. Gender, family history of inflammatory bowel disease, smoking status, type of surgery, and immunomodulator and biological therapy before, parallel to, and after the first ICR were analyzed. RESULTS: The mean duration from diagnosis until first ICR did not differ significantly between the groups, being 5.93 ± 7.65 years in the ICR = 1 group and 5.36 ± 6.35 years in the ICR >1 group (P = 0.05). Mean time to second ICR was 6.7 ± 5.74 years. In the multivariate logistic regression analysis, ileal disease location (odds ratio [OR], 2.42; 95% confidence interval [CI], 1.02-5.78; P = 0.05) was a significant risk factor. Therapy with immunomodulators at the time of or within 1 year after the first ICR (OR, 0.23; 95% CI, 0.09-0.63; P < 0.01) was a protective factor. Neither smoking (OR, 1.16; 95% CI, 0.66-2.06), gender (male OR, 0.85; 95% CI, 0.51-1.42), nor family history (OR, 1.68; 95% CI, 0.84-3.36) had a significant impact on surgical recurrence. CONCLUSIONS: Immunomodulators have a protective impact on surgical recurrence after ICR. In contrast, ileal disease location constitutes a significant risk factor for a second ICR.
Abstract:
Introduction: There is little information regarding compliance with dietary recommendations in Switzerland. Objectives: To assess trends in compliance with dietary recommendations in the Geneva population for the period 1999-2009. Methods: Ten cross-sectional, population-based surveys (Bus Santé study). Dietary intake was assessed using a self-administered, validated, semi-quantitative food frequency questionnaire. Compliance with the Swiss Society for Nutrition recommendations for nutrient intake was assessed. In all, 9320 participants aged 35 to 75 years (50% women) were included. Trends were assessed by logistic regression adjusting for age, smoking status, education and nationality, using survey year as the independent variable. Results: After excluding participants with extreme intakes, the percentage of participants with a cholesterol consumption < 300 mg/day increased from 40.8% in 1999 to 43.6% in 2009 in men (multivariate-adjusted p for trend = 0.04) and from 57.8% to 61.4% in women (multivariate-adjusted p for trend = 0.06). Calcium intake > 1 g/day decreased from 53.3% to 46.0% in men and from 47.6% to 40.7% in women (multivariate-adjusted p for trend < 0.001). Adequate iron intake decreased from 68.3% to 65.3% in men and from 13.3% to 8.4% in women (multivariate-adjusted p for trend < 0.001). Conversely, no significant changes were observed for carbohydrates, protein, total fat (including saturated, monounsaturated and polyunsaturated fatty acids), fibre, or vitamins D and A. Conclusion: Few improvements were noted in adherence to dietary recommendations in the Geneva population between 1999 and 2009. The low and decreasing prevalence of adequate calcium and iron intake is of concern.
Abstract:
Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Q(st)-F(st)) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = 2F(st)/(1 - F(st))G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2F(st)/(1 - F(st))] and (ii) that the MANOVA estimates of mean square matrices between and within populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Q(st)-F(st) comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. We discuss practical requirements for the proper application of our test in empirical studies and potential extensions.
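To make the neutral expectation concrete: given F(st), the neutral proportionality coefficient is 2F(st)/(1 - F(st)), and one can check whether an estimated D matrix is close to that multiple of G. The toy check below is only a numerical sketch of this expectation, not the published CPC/Bartlett test, and all matrices are invented.

```python
# Minimal numerical sketch of the neutral pattern D = [2*Fst/(1-Fst)] * G:
# estimate the proportionality coefficient between the among-population (D)
# and within-population (G) covariance matrices and compare it to the
# neutral value. Toy matrices only.
import numpy as np

rng = np.random.default_rng(3)
fst = 0.2
neutral_coef = 2 * fst / (1 - fst)            # = 0.5 under neutrality

G = np.array([[2.0, 0.6],
              [0.6, 1.0]])                    # within-population (G) matrix
D = neutral_coef * G + rng.normal(0, 0.02, (2, 2))  # noisy neutral D
D = (D + D.T) / 2                             # keep it symmetric

# Simple estimate of the proportionality coefficient c in D ~ c * G:
c_hat = np.trace(D @ np.linalg.inv(G)) / G.shape[0]
print(f"estimated c = {c_hat:.3f}, neutral expectation = {neutral_coef:.3f}")
```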
Abstract:
Atrial fibrillation (AF) is a frequent arrhythmia after conventional coronary artery bypass grafting. With the advent of the minimally invasive technique for left internal mammary artery-left anterior descending coronary artery (LIMA-LAD) grafting, we analyzed the incidence and risk factors of postoperative AF in this patient population. This prospective study included all patients undergoing isolated LIMA-LAD grafting with the minimally invasive technique between January 1994 and June 2000. Twenty-four possible risk factors for postoperative AF were entered into univariate and multivariate logistic regression analyses. Postoperative AF occurred in 21 of the 90 patients (23.3%) analyzed. Double- or triple-vessel disease was present in 12/90 patients (13.3%). On univariate analysis, right coronary artery disease (p <0.01), age (p = 0.01), and diabetes (p = 0.04) were found to be risk factors for AF. On multivariate analysis, right coronary artery disease was identified as the sole significant risk factor (p = 0.02). In this patient population, the incidence of AF after minimally invasive coronary artery bypass is in the range of that reported for conventional coronary artery bypass grafting. Right coronary artery disease was found to be an independent predictor, and this may be related to the fact that in this patient population the diseased right coronary artery was not revascularized at the time of the surgical procedure. For the same reason, this risk factor may find a broader application to noncardiac thoracic surgery.
Abstract:
PURPOSE: We report the long-term results of a randomized clinical trial comparing induction therapy with single-agent rituximab given once per week for 4 weeks versus the same induction followed by four cycles of maintenance therapy every 2 months in patients with follicular lymphoma. PATIENTS AND METHODS: Patients (prior chemotherapy, 138; chemotherapy-naive, 64) received single-agent rituximab and, if nonprogressive, were randomly assigned to no further treatment (observation) or four additional doses of rituximab given at 2-month intervals (prolonged exposure). RESULTS: At a median follow-up of 9.5 years, and with all living patients having been observed for at least 5 years, the median event-free survival (EFS) was 13 months in the observation arm and 24 months in the prolonged exposure arm (P < .001). At 8 years, 5% of patients in the observation arm were event-free, compared with 27% in the prolonged exposure arm. Of previously untreated patients receiving prolonged treatment after responding to rituximab induction, 45% were still event-free at 8 years. The only favorable prognostic factor for EFS in a multivariate Cox regression was the prolonged rituximab schedule (hazard ratio, 0.59; 95% CI, 0.39 to 0.88; P = .009), whereas being chemotherapy-naive, presenting with stage lower than IV, and showing a VV phenotype at position 158 of the Fc-gamma RIIIA receptor were not of independent prognostic value. No long-term toxicity potentially due to rituximab was observed. CONCLUSION: An important proportion of patients experienced long-term remission after prolonged exposure to rituximab, particularly if they had no prior treatment and responded to rituximab induction.
Abstract:
Purpose: To assess trends in the self-reported prevalence of cardiovascular risk factors (CV RFs: hypertension, dyslipidaemia, diabetes) and their management in the Swiss population for the period 1992 to 2007. Methods: Four national health interview surveys conducted between 1992 and 2007 in representative samples of the Swiss population (63,782 subjects overall). Self-reported CV RF prevalence, treatment and control levels were computed after weighting. Weights were calculated by raking ratio such that the marginal distribution of the weighted totals conforms to the marginal distribution of the targeted population. Multivariate analysis adjusted for age, sex, education, nationality and SMI was conducted using logistic regression. Results: The prevalence of all CV RFs increased between 1992 and 2007 (see table). The self-reported prevalence of treatment among subjects with CV RFs also increased, and this was confirmed by multivariate analysis: ORs for hypocholesterolaemic treatment relative to 1992 were 0.64 [0.52-0.78], 1.39 [1.18-1.65] and 2.00 [1.69-2.36] for 1997, 2002 and 2007, respectively. Still, in 2007, circa 40% of hypertensive, 60% of dyslipidaemic and 50% of diabetic subjects were not treated. Conversely, adequate control of CV RFs was reported by treated subjects, with an increase during the study period. This increase was confirmed by multivariate analysis (not shown). Conclusion: The self-reported prevalence of hypertension, dyslipidaemia and diabetes increased between 1992 and 2007 in the Swiss population. Despite good control among treated subjects, a significant percentage of subjects with CV RFs remain untreated.
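A minimal sketch of raking-ratio weighting as described in the methods: sample weights are iteratively rescaled until the weighted marginal totals match known population margins. The toy example below uses two binary variables and invented margins; the survey itself weighted on more dimensions.

```python
# Minimal sketch: raking ratio (iterative proportional fitting) of survey
# weights to known population margins. Toy data and margins only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
sample = pd.DataFrame({
    "sex": rng.choice(["m", "f"], 1000, p=[0.6, 0.4]),      # biased sample
    "age_grp": rng.choice(["<50", "50+"], 1000, p=[0.7, 0.3]),
})
targets = {"sex": {"m": 490.0, "f": 510.0},                  # population margins
           "age_grp": {"<50": 550.0, "50+": 450.0}}

w = np.ones(len(sample))
for _ in range(20):                        # iterate until margins converge
    for var, margin in targets.items():
        for level, total in margin.items():
            mask = (sample[var] == level).to_numpy()
            w[mask] *= total / w[mask].sum()

# Weighted marginal totals now match the targets:
for var, margin in targets.items():
    print(var, {lvl: round(w[(sample[var] == lvl).to_numpy()].sum(), 1)
                for lvl in margin})
```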