945 results for multivariate regression tree
Abstract:
Understanding niche evolution, dynamics, and the response of species to climate change requires knowledge of the determinants of the environmental niche and species range limits. Mean values of climatic variables are often used in such analyses. In contrast, the increasing frequency of climate extremes suggests the importance of understanding their additional influence on range limits. Here, we assess how measures representing climate extremes (i.e., interannual variability in climate parameters) explain and predict spatial patterns of 11 tree species in Switzerland. We find a clear, although comparatively small, improvement (+20% in adjusted D², +8% and +3% in cross-validated True Skill Statistic and area under the receiver operating characteristic curve values) in models that use measures of extremes in addition to means. The primary effect of including information on climate extremes is a correction of local overprediction and underprediction. Our results demonstrate that measures of climate extremes are important for understanding the climatic limits of tree species and assessing species niche characteristics. The inclusion of climate variability likely will improve models of species range limits under future conditions, where changes in mean climate and increased variability are expected.
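The evaluation metrics named above, the True Skill Statistic (TSS) and the area under the ROC curve (AUC), can be computed directly from binary observations and model outputs. A minimal NumPy sketch with hypothetical inputs (not the study's data or code; the AUC uses the Mann-Whitney rank identity and ignores ties):

```python
import numpy as np

def tss(y_true, y_pred):
    """True Skill Statistic = sensitivity + specificity - 1."""
    y_true, y_pred = np.asarray(y_true, bool), np.asarray(y_pred, bool)
    tp = (y_true & y_pred).sum()
    tn = (~y_true & ~y_pred).sum()
    fp = (~y_true & y_pred).sum()
    fn = (y_true & ~y_pred).sum()
    return tp / (tp + fn) + tn / (tn + fp) - 1

def auc(y_true, scores):
    """Area under the ROC curve via the rank-sum identity (no tie correction)."""
    y_true = np.asarray(y_true, bool)
    ranks = np.asarray(scores, float).argsort().argsort() + 1  # 1-based ranks
    n_pos, n_neg = y_true.sum(), (~y_true).sum()
    return (ranks[y_true].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

A perfect model gives TSS = 1 and AUC = 1; a no-skill model gives TSS = 0 and AUC = 0.5.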
Abstract:
Based on data available in the Information System for Notifiable Diseases, predictive factors of favorable treatment outcomes were identified for pulmonary tuberculosis among patients diagnosed between 2001 and 2004 and living in Recife-PE, Brazil. Uni- and multivariate logistic regression methods were used. In multivariate analysis, the following factors remained: Age (years), 0 to 9 (OR=4.27; p=0.001) and 10 to 19 (OR=1.78; p=0.011), greater chance of cure than over 60; Education (years), 8 to 11 (OR=1.52; p=0.049), greater chance of cure than no education; Type of entry, new cases (OR=3.31; p<0.001) and relapse (OR=3.32; p<0.001), greater chances of cure than restart after abandonment; Treatment time (months), 5 to 6 (OR=9.15; p<0.001), 6 to 9 (OR=27.28; p<0.001), and more than 9 (OR=24.78; p<0.001), greater chances of cure than less than 5; Health Unit District, DS I (OR=1.60; p=0.018) and DS IV (OR=2.87; p<0.001), greater chances of cure than DS VI.
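The odds ratios above come from logistic regression; for a single binary factor, an unadjusted OR and its 95% CI can be read off a 2×2 table. A sketch with hypothetical counts (Woolf's logit method, not the study's fitted model):

```python
import math

def odds_ratio(a, b, c, d):
    """Unadjusted OR with 95% CI (Woolf's method) from a 2x2 table:
    a = exposed & cured, b = exposed & not cured,
    c = unexposed & cured, d = unexposed & not cured."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + z * se) for z in (-1.96, 1.96))
    return or_, lo, hi
```

A multivariate model adjusts each such OR for the other factors, which simple 2×2 tables cannot do.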
Abstract:
Quantifying the impacts of inbreeding and genetic drift on fitness traits in fragmented populations is becoming a major goal in conservation biology. Such impacts occur at different levels and involve different sets of loci. Genetic drift randomly fixes slightly deleterious alleles leading to different fixation load among populations. By contrast, inbreeding depression arises from highly deleterious alleles in segregation within a population and creates variation among individuals. A popular approach is to measure correlations between molecular variation and phenotypic performances. This approach has been mainly used at the individual level to detect inbreeding depression within populations and sometimes at the population level but without consideration about the genetic processes measured. For the first time, we used in this study a molecular approach considering both the interpopulation and intrapopulation level to discriminate the relative importance of inbreeding depression vs. fixation load in isolated and non-fragmented populations of European tree frog (Hyla arborea), complemented with interpopulational crosses. We demonstrated that the positive correlations observed between genetic heterozygosity and larval performances on merged data were mainly caused by co-variations in genetic diversity and fixation load among populations rather than by inbreeding depression and segregating deleterious alleles within populations. Such a method is highly relevant in a conservation perspective because, depending on how populations lose fitness (inbreeding vs. fixation load), specific management actions may be designed to improve the persistence of populations.
Abstract:
OBJECTIVES: The objectives were to identify the social and medical factors associated with emergency department (ED) frequent use and to determine if frequent users were more likely to have a combination of these factors in a universal health insurance system. METHODS: This was a retrospective chart review case-control study comparing randomized samples of frequent users and nonfrequent users at the Lausanne University Hospital, Switzerland. The authors defined frequent users as patients with four or more ED visits within the previous 12 months. Adult patients who visited the ED between April 2008 and March 2009 (study period) were included, and patients leaving the ED without medical discharge were excluded. For each patient, the first ED electronic record within the study period was considered for data extraction. Along with basic demographics, variables of interest included social (employment or housing status) and medical (ED primary diagnosis) characteristics. Significant social and medical factors were used to construct a logistic regression model, to determine factors associated with frequent ED use. In addition, comparison of the combination of social and medical factors was examined. RESULTS: A total of 359 of 1,591 frequent and 360 of 34,263 nonfrequent users were selected. Frequent users accounted for less than a 20th of all ED patients (4.4%), but for 12.1% of all visits (5,813 of 48,117), with a maximum of 73 ED visits. No difference in terms of age or sex occurred, but more frequent users had a nationality other than Swiss or European (n = 117 [32.6%] vs. n = 83 [23.1%], p = 0.003). 
Adjusted multivariate analysis showed that social and specific medical vulnerability factors most increased the risk of frequent ED use: being under guardianship (adjusted odds ratio [OR] = 15.8; 95% confidence interval [CI] = 1.7 to 147.3), living closer to the ED (adjusted OR = 4.6; 95% CI = 2.8 to 7.6), being uninsured (adjusted OR = 2.5; 95% CI = 1.1 to 5.8), being unemployed or dependent on government welfare (adjusted OR = 2.1; 95% CI = 1.3 to 3.4), the number of psychiatric hospitalizations (adjusted OR = 4.6; 95% CI = 1.5 to 14.1), and the use of five or more clinical departments over 12 months (adjusted OR = 4.5; 95% CI = 2.5 to 8.1). Having two of four social factors increased the odds of frequent ED use (adjusted OR = 5.4; 95% CI = 2.9 to 9.9), and similar results were found for medical factors (adjusted OR = 7.9; 95% CI = 4.6 to 13.4). A combination of social and medical factors was markedly associated with frequent ED use, as frequent users were 10 times more likely to have three of them (on a total of eight factors; 95% CI = 5.1 to 19.6). CONCLUSIONS: Frequent users accounted for a moderate proportion of visits at the Lausanne ED. Social and medical vulnerability factors were associated with frequent ED use. In addition, frequent users were more likely to have both social and medical vulnerabilities than were other patients. Case management strategies might address the vulnerability factors of frequent users to prevent inequities in health care and related costs.
Abstract:
BACKGROUND: Toll-like receptors (TLRs) are essential components of the immune response to fungal pathogens. We examined the role of TLR polymorphisms in conferring a risk of invasive aspergillosis among recipients of allogeneic hematopoietic-cell transplants. METHODS: We analyzed 20 single-nucleotide polymorphisms (SNPs) in the toll-like receptor 2 gene (TLR2), the toll-like receptor 3 gene (TLR3), the toll-like receptor 4 gene (TLR4), and the toll-like receptor 9 gene (TLR9) in a cohort of 336 recipients of hematopoietic-cell transplants and their unrelated donors. The risk of invasive aspergillosis was assessed with the use of multivariate Cox regression analysis. The analysis was replicated in a validation study involving 103 case patients and 263 matched controls who received hematopoietic-cell transplants from related and unrelated donors. RESULTS: In the discovery study, two donor TLR4 haplotypes (S3 and S4) increased the risk of invasive aspergillosis (adjusted hazard ratio for S3, 2.20; 95% confidence interval [CI], 1.14 to 4.25; P=0.02; adjusted hazard ratio for S4, 6.16; 95% CI, 1.97 to 19.26; P=0.002). The haplotype S4 was present in carriers of two SNPs in strong linkage disequilibrium (1063 A/G [D299G] and 1363 C/T [T399I]) that influence TLR4 function. In the validation study, donor haplotype S4 also increased the risk of invasive aspergillosis (adjusted odds ratio, 2.49; 95% CI, 1.15 to 5.41; P=0.02); the association was present in unrelated recipients of hematopoietic-cell transplants (odds ratio, 5.00; 95% CI, 1.04 to 24.01; P=0.04) but not in related recipients (odds ratio, 2.29; 95% CI, 0.93 to 5.68; P=0.07). In the discovery study, seropositivity for cytomegalovirus (CMV) in donors or recipients, donor positivity for S4, or both, as compared with negative results for CMV and S4, were associated with an increase in the 3-year probability of invasive aspergillosis (12% vs. 1%, P=0.02) and death that was not related to relapse (35% vs. 22%, P=0.02). CONCLUSIONS: This study suggests an association between the donor TLR4 haplotype S4 and the risk of invasive aspergillosis among recipients of hematopoietic-cell transplants from unrelated donors.
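The 3-year probabilities quoted above are survival-type estimates; the study itself used multivariate Cox regression, but the simplest nonparametric ingredient, a Kaplan-Meier curve, can be sketched in a few lines (illustrative only; right-censoring supported, no competing-risk correction):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. times: follow-up times;
    events: 1 = event observed, 0 = right-censored."""
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    curve, surv, n_at_risk = [], 1.0, len(times)
    for t in np.unique(times):
        at = times == t
        d = events[at].sum()              # events at time t
        if d:
            surv *= 1 - d / n_at_risk
        curve.append((float(t), surv))
        n_at_risk -= at.sum()             # events and censored leave the risk set
    return curve
```

The 3-year event probability is then 1 minus the survival value at 3 years; a Cox model additionally adjusts for covariates such as donor haplotype and CMV serostatus.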
Abstract:
Machine learning and pattern recognition methods have been used to diagnose Alzheimer's disease (AD) and mild cognitive impairment (MCI) from individual MRI scans. Another application of such methods is to predict clinical scores from individual scans. Using relevance vector regression (RVR), we predicted individuals' performances on established tests from their MRI T1-weighted image in two independent data sets. From Mayo Clinic, 73 probable AD patients and 91 cognitively normal (CN) controls completed the Mini-Mental State Examination (MMSE), Dementia Rating Scale (DRS), and Auditory Verbal Learning Test (AVLT) within 3 months of their scan. Baseline MRIs from the Alzheimer's Disease Neuroimaging Initiative (ADNI) comprised the other data set; 113 AD, 351 MCI, and 122 CN subjects completed the MMSE and Alzheimer's Disease Assessment Scale-Cognitive subtest (ADAS-cog), and 39 AD, 92 MCI, and 32 CN ADNI subjects completed the MMSE, ADAS-cog, and AVLT. Predicted and actual clinical scores were highly correlated for the MMSE, DRS, and ADAS-cog tests (P<0.0001). Training with one data set and testing with another demonstrated stability between data sets. DRS, MMSE, and ADAS-cog correlated better than AVLT with whole-brain grey matter changes associated with AD. This result underscores their utility for screening and tracking disease. RVR offers a novel way to measure interactions between structural changes and neuropsychological tests beyond that of univariate methods. In clinical practice, we envision using RVR to aid in diagnosis and predict clinical outcome.
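Relevance vector regression is a sparse Bayesian kernel method; as a simplified, non-Bayesian stand-in, kernel ridge regression with an RBF kernel illustrates the same predict-a-score-from-features structure (hypothetical code, not the study's pipeline):

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, gamma=1.0, alpha=1e-3):
    """Solve (K + alpha*I) c = y, so that f(x) = sum_i c_i k(x, x_i)."""
    return np.linalg.solve(rbf(X, X, gamma) + alpha * np.eye(len(X)), y)

def krr_predict(X_train, coef, X_new, gamma=1.0):
    """Evaluate the fitted kernel expansion at new inputs."""
    return rbf(X_new, X_train, gamma) @ coef
```

In the study's setting, rows of X would be (dimension-reduced) voxel features and y a clinical score such as MMSE; RVR additionally prunes most coefficients to zero, keeping only "relevance vectors".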
Abstract:
In this study we determined whether blood pressure readings made with a fixed-size cuff systematically differed from readings made with a triple-bladder cuff (Tricuff) that automatically adjusts bladder width to arm circumference, and we assessed the subsequent clinical and epidemiological effects. Blood pressure was measured with a standard cuff or a Tricuff in 454 patients visiting an outpatient clinic in the Seychelles (Indian Ocean). Overall means of within-individual standard cuff-Tricuff differences in systolic and diastolic blood pressures were examined in relation to arm circumference and sex. The standard cuff-Tricuff difference in systolic and diastolic blood pressures increased monotonically with circumference (from 4.7 ± 0.8/3.2 ± 0.7 mm Hg for an arm circumference of 30 to 31 cm to 10.0 ± 1.1/8.0 ± 0.9 mm Hg for an arm circumference ≥36 cm) and was larger in women than men. Multivariate linear regression indicated independent effects of arm circumference and sex. Forty percent of subjects with a diastolic blood pressure ≥95 mm Hg measured with a standard cuff had values less than 95 mm Hg measured with a Tricuff. Extrapolation to the entire population of the Seychelles decreased the prevalence of blood pressure ≥160/95 mm Hg by 11.5% and 24.0% in men and women, respectively, aged 35 to 64 years. The age-adjusted effect of body mass index on systolic and diastolic blood pressures decreased twofold using blood pressure readings made with a Tricuff instead of a standard cuff.(ABSTRACT TRUNCATED AT 250 WORDS)
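The "independent effects of arm circumference and sex" correspond to coefficients of a multivariate linear model, which can be estimated by ordinary least squares. A sketch on simulated data (the coefficients of 0.9 mm Hg per cm and 1.5 mm Hg for female sex are invented for illustration, not the study's estimates):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Hypothetical data: cuff-minus-Tricuff BP difference (mm Hg) modelled on
# arm circumference (cm) and sex (0 = man, 1 = woman).
circ = rng.uniform(28, 40, n)
sex = rng.integers(0, 2, n)
diff = 0.9 * (circ - 28) + 1.5 * sex + rng.normal(0.0, 1.0, n)

# Design matrix with intercept; least-squares fit recovers both effects.
X = np.column_stack([np.ones(n), circ, sex])
beta, *_ = np.linalg.lstsq(X, diff, rcond=None)
# beta[1]: effect per cm of circumference; beta[2]: additional difference in women
```

Each coefficient is the effect of its variable with the other held fixed, which is what "independent effects" means here.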
Abstract:
Population viability analyses (PVA) are increasingly used in metapopulation conservation plans. Two major types of models are commonly used to assess vulnerability and to rank management options: population-based stochastic simulation models (PSM such as RAMAS or VORTEX) and stochastic patch occupancy models (SPOM). While the first set of models relies on explicit intrapatch dynamics and interpatch dispersal to predict population levels in space and time, the latter is based on spatially explicit metapopulation theory where the probability of patch occupation is predicted given the patch area and isolation (patch topology). We applied both approaches to a European tree frog (Hyla arborea) metapopulation in western Switzerland in order to evaluate the concordances of both models and their applications to conservation. Although some quantitative discrepancies appeared in terms of network occupancy and equilibrium population size, the two approaches were largely concordant regarding the ranking of patch values and sensitivities to parameters, which is encouraging given the differences in the underlying paradigms and input data.
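A stochastic patch occupancy model of the kind contrasted above reduces to a colonisation-extinction update driven by patch area and isolation, in the spirit of Hanski's incidence function model. A toy sketch (all parameter values are illustrative, not fitted to the Hyla arborea network):

```python
import numpy as np

def spom_step(p, areas, dists, alpha=1.0, y=1.0, x=1.0, e=0.1, rng=None):
    """One step of a stochastic patch occupancy model (incidence-function style).
    p: boolean occupancy vector; areas: patch areas; dists: pairwise distances."""
    rng = np.random.default_rng() if rng is None else rng
    kernel = np.exp(-alpha * dists)
    np.fill_diagonal(kernel, 0.0)           # a patch does not colonise itself
    conn = kernel @ (areas * p)             # connectivity S_i of each patch
    C = conn**2 / (conn**2 + y**2)          # colonisation probability
    E = np.minimum(1.0, e / areas**x)       # extinction probability (area-dependent)
    new = p.copy()
    new[~p] = rng.random((~p).sum()) < C[~p]
    new[p] = rng.random(p.sum()) >= E[p]
    return new
```

Iterating this update and averaging over replicates yields the occupancy probabilities that SPOMs compare against PSM population trajectories.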
Abstract:
BACKGROUND: Polymorphisms in IFNL3 and IFNL4, the genes encoding interferon λ3 and interferon λ4, respectively, have been associated with reduced hepatitis C virus clearance. We explored the role of such polymorphisms on the incidence of cytomegalovirus (CMV) infection in solid-organ transplant recipients. METHODS: White patients participating in the Swiss Transplant Cohort Study in 2008-2011 were included. A novel functional TT/-G polymorphism (rs368234815) in the CpG region upstream of IFNL3 was investigated. RESULTS: A total of 840 solid-organ transplant recipients at risk for CMV infection were included, among whom 373 (44%) received antiviral prophylaxis. The 12-month cumulative incidences of CMV replication and disease were 0.44 and 0.08, respectively. Patients homozygous for the minor rs368234815 allele (-G/-G) tended to have a higher cumulative incidence of CMV replication (subdistribution hazard ratio [SHR], 1.30 [95% confidence interval {CI}, .97-1.74]; P = .07), compared with other patients (TT/TT or TT/-G). The association was significant among patients followed by a preemptive approach (SHR, 1.46 [95% CI, 1.01-2.12]; P = .047), especially in patients receiving an organ from a seropositive donor (SHR, 1.92 [95% CI, 1.30-2.85]; P = .001), but not among those who received antiviral prophylaxis (SHR, 1.13 [95% CI, .70-1.83]; P = .6). These associations remained significant in multivariate competing risk regression models. CONCLUSIONS: Polymorphisms in the IFNL3/4 region influence susceptibility to CMV replication in solid-organ transplant recipients, particularly in patients not receiving antiviral prophylaxis.
Abstract:
1. As trees in a given cohort progress through ontogeny, many individuals die. This risk of mortality is unevenly distributed across species because of many processes such as habitat filtering, interspecific competition and negative density dependence. Here, we predict and test the patterns that such ecological processes should inscribe on both species and phylogenetic diversity as plants recruit from saplings to the canopy. 2. We compared species and phylogenetic diversity of sapling and tree communities at two sites in French Guiana. We surveyed 2084 adult trees in four 1-ha tree plots and 943 saplings in sixteen 16-m² subplots nested within the tree plots. Species diversity was measured using Fisher's alpha (species richness) and Simpson's index (species evenness). Phylogenetic diversity was measured using Faith's phylogenetic diversity (phylogenetic richness) and Rao's quadratic entropy index (phylogenetic evenness). The phylogenetic diversity indices were inferred using four phylogenetic hypotheses: two based on rbcLa plastid DNA sequences obtained from the inventoried individuals with different branch lengths, a global phylogeny available from the Angiosperm Phylogeny Group, and a combination of both. 3. Taxonomic identification of the saplings was performed by combining morphological and DNA barcoding techniques using three plant DNA barcodes (psbA-trnH, rpoC1 and rbcLa). DNA barcoding enabled us to increase species assignment and to assign unidentified saplings to molecular operational taxonomic units. 4. Species richness was similar between saplings and trees, but in about half of our comparisons, species evenness was higher in trees than in saplings. This suggests that negative density dependence plays an important role during the sapling-to-tree transition. 5. Phylogenetic richness increased between saplings and trees in about half of the comparisons.
Phylogenetic evenness increased significantly between saplings and trees in a few cases (4 out of 16) and only with the most resolved phylogeny. These results suggest that negative density dependence operates largely independently of the phylogenetic structure of communities. 6. Synthesis. By contrasting species richness and evenness across size classes, we suggest that negative density dependence drives shifts in composition during the sapling-to-tree transition. In addition, we found little evidence for a change in phylogenetic diversity across age classes, suggesting that the observed patterns are not phylogenetically constrained.
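The evenness measures used above, Simpson's index and Rao's quadratic entropy, reduce to short formulas over abundance vectors; a minimal sketch with hypothetical counts:

```python
import numpy as np

def simpson_diversity(counts):
    """Simpson's diversity 1 - sum(p_i^2): the probability that two randomly
    drawn individuals belong to different species."""
    p = np.asarray(counts, float)
    p = p / p.sum()
    return 1.0 - (p ** 2).sum()

def rao_entropy(counts, dist):
    """Rao's quadratic entropy: expected pairwise (e.g. phylogenetic)
    distance between two randomly drawn individuals."""
    p = np.asarray(counts, float)
    p = p / p.sum()
    return p @ np.asarray(dist, float) @ p
```

With all interspecies distances set to 1 (and 0 within species), Rao's entropy collapses to Simpson's index, which is why the two are natural taxonomic and phylogenetic counterparts.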
Abstract:
We investigated sex-specific recombination rates in Hyla arborea, a species with nascent sex chromosomes and male heterogamety. Twenty microsatellites were clustered into six linkage groups, all showing suppressed or very low recombination in males. Seven markers were sex linked, none of them showing any sign of recombination in males (r=0.00 versus 0.43 on average in females). This opposes classical models of sex chromosome evolution, which envision an initially small differential segment that progressively expands as structural changes accumulate on the Y chromosome. For autosomes, maps were more than 14 times longer in females than in males, which seems the highest ratio documented so far in vertebrates. These results support the pleiotropic model of Haldane and Huxley, according to which recombination is reduced in the heterogametic sex by general modifiers that affect recombination on the whole genome.
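The recombination fractions quoted (r = 0.00 in males versus 0.43 on average in females) are estimated from the proportion of recombinant offspring at phase-known marker pairs; a trivial sketch with hypothetical counts:

```python
def recombination_fraction(n_recombinant, n_total):
    """Point estimate of the recombination fraction r between two loci
    from phase-known offspring counts."""
    return n_recombinant / n_total

# A male:female contrast like the one reported shows up as r near 0 in males
# while r in females approaches free recombination (0.5). Counts are invented.
r_male = recombination_fraction(0, 120)
r_female = recombination_fraction(52, 120)
```

Linkage-map lengths are then built by converting such pairwise r estimates into map distances, so r = 0 in males directly produces the compressed male maps described above.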
Abstract:
In contrast with mammals and birds, most poikilothermic vertebrates feature structurally undifferentiated sex chromosomes, which may result either from frequent turnovers, or from occasional events of XY recombination. The latter mechanism was recently suggested to be responsible for sex-chromosome homomorphy in European tree frogs (Hyla arborea). However, no single case of male recombination has been identified in large-scale laboratory crosses, and populations from NW Europe consistently display sex-specific allelic frequencies with male-diagnostic alleles, suggesting the absence of recombination in their recent history. To address this apparent paradox, we extended the phylogeographic scope of investigations, by analyzing the sequences of three sex-linked markers throughout the whole species distribution. Refugial populations (southern Balkans and Adriatic coast) show a mix of X and Y alleles in haplotypic networks, and no more within-individual pairwise nucleotide differences in males than in females, testifying to recurrent XY recombination. In contrast, populations of NW Europe, which originated from a recent postglacial expansion, show a clear pattern of XY differentiation; the X and Y gametologs of the sex-linked gene Med15 present different alleles, likely fixed by drift on the front wave of expansions, and kept differentiated since. Our results support the view that sex-chromosome homomorphy in H. arborea is maintained by occasional or historical events of recombination; whether the frequency of these events indeed differs between populations remains to be clarified.
Abstract:
BACKGROUND: Early diagnosis of postoperative orthopaedic infections is important in order to rapidly initiate adequate antimicrobial therapy. There are currently no reliable diagnostic markers to differentiate infectious from noninfectious causes of postoperative fever. We investigated the value of the serum procalcitonin level in febrile patients after orthopaedic surgery. METHODS: We prospectively evaluated 103 consecutive patients with new onset of fever within ten days after orthopaedic surgery. Fever episodes were classified as being of infectious or noninfectious origin by two independent investigators who were blinded to procalcitonin results. White blood-cell count, C-reactive protein level, and procalcitonin level were assessed on days 0, 1, and 3 of the postoperative fever. RESULTS: Infection was diagnosed in forty-five (44%) of 103 patients and involved the respiratory tract (eighteen patients), urinary tract (eighteen), joints (four), surgical site (two), bloodstream (two), and soft tissues (one). Unlike C-reactive protein levels and white blood-cell counts, procalcitonin values were significantly higher in patients with infection compared with patients without infection on the day of fever onset (p = 0.04), day 1 (p = 0.07), and day 3 (p = 0.003). Receiver operating characteristic analysis demonstrated that procalcitonin had the highest diagnostic accuracy, with a value of 0.62, 0.62, and 0.71 on days 0, 1, and 3, respectively. In a multivariate logistic regression analysis, procalcitonin was a significant predictor for postoperative infection on days 0, 1, and 3 of fever, with an odds ratio of 2.3 (95% confidence interval, 1.1 to 4.4), 2.3 (95% confidence interval, 1.1 to 5.2), and 3.3 (95% confidence interval, 1.2 to 9.0), respectively. CONCLUSIONS: Serum procalcitonin is a helpful diagnostic marker supporting clinical and microbiological findings for more reliable differentiation of infectious from noninfectious causes of fever after orthopaedic surgery.