957 results for Gases, Rare--Statistical methods.
Abstract:
This project examines similarities and differences between the automated condition data collected on and off county paved roads and the manual condition data collected by Iowa Department of Transportation (DOT) staff in 2000 and 2001. The researchers also provided staff support to the advisory committee in exploring other options for the highway needs process. The results show that the automated condition data can be used in a converted highway needs process, with no major differences between the two methods. Although the foundation rating difference was significant, the foundation rating weighting factor in HWYNEEDS is minimal and should not have a major impact. For the RUTF formula-based distribution, the results clearly show the superiority of the condition-based analysis over the non-condition-based one; that correlation can be further enhanced by adding more distress variables to the analysis.
Abstract:
The contribution of muscle biopsies to the diagnosis of neuromuscular disorders, and the indications for various methods of examination, are investigated through analysis of 889 biopsies from patients suffering from myopathic and/or neurogenic disorders. Histoenzymatic studies performed on frozen material, together with immunohistochemistry and electron microscopy, provided specific diagnoses in all the neurogenic disorders (polyneuropathies and motor neuron diseases), whereas one third of myopathies remained uncertain. Comparison of the neuropathological data with the clinical indications for histological investigation shows that muscle biopsies reveal the diagnosis in 25% of cases (mainly in congenital and metabolic myopathies) and confirm and/or complete the clinical diagnosis in 50%. In the remaining cases, with nonspecific abnormalities, neuropathological investigation may help the clinician by excluding well-defined neuromuscular disorders. Analysis of the studies performed and of their results shows the contribution and specificity of each method for diagnosis. Statistical evaluation of this series indicates that cryostat sectioning for histochemical and immunochemical studies and electron microscopy increases the rate of diagnosis of neuromuscular diseases: full investigation was necessary for the diagnosis in 30% of cases. The interpretation of the wide range of pathological reactions in muscle requires close cooperation with the clinician.
Abstract:
This paper investigates the use of ensembles of predictors to improve the performance of spatial prediction methods. Support vector regression (SVR), a popular method from the field of statistical machine learning, is used. Several instances of SVR are combined using different data sampling schemes (bagging and boosting). Bagging performs well: it reduces error while being more computationally efficient than training a single SVR model. Boosting, however, does not improve results on this specific problem.
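The bagging scheme the abstract describes can be sketched as follows. This is not the authors' implementation: a k-nearest-neighbour regressor stands in for SVR so the example stays dependency-free, and the 2-D "spatial" data are synthetic.

```python
import numpy as np

def knn_predict(Xtr, ytr, Xte, k=3):
    # Base learner: average the targets of the k nearest training points
    # (a stand-in for SVR in this sketch).
    d = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(axis=-1)
    idx = np.argsort(d, axis=1)[:, :k]
    return ytr[idx].mean(axis=1)

def bagging_predict(Xtr, ytr, Xte, n_estimators=20, seed=0):
    # Bagging: fit each base learner on a bootstrap resample of the
    # training set, then average the individual predictions.
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_estimators):
        b = rng.integers(0, len(Xtr), len(Xtr))  # bootstrap indices
        preds.append(knn_predict(Xtr[b], ytr[b], Xte))
    return np.mean(preds, axis=0)

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, (300, 2))  # 2-D coordinates as inputs
y = np.sin(X[:, 0]) + 0.5 * np.cos(X[:, 1]) + rng.normal(0, 0.1, 300)
Xtr, ytr, Xte, yte = X[:200], y[:200], X[200:], y[200:]

mse_single = np.mean((knn_predict(Xtr, ytr, Xte) - yte) ** 2)
mse_bagged = np.mean((bagging_predict(Xtr, ytr, Xte) - yte) ** 2)
print(mse_single, mse_bagged)
```

Averaging over bootstrap resamples mainly reduces the variance of the base learner, which is why bagging tends to help unstable predictors.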
Abstract:
Next-generation sequencing techniques such as exome sequencing can successfully detect all genetic variants in a human exome, and this has proved useful, together with the implementation of variant filters, for identifying disease-causing mutations. Two filters are mainly used for the identification of mutations: low allele frequency and the computational annotation of the genetic variant. Bioinformatic tools that predict the effect of a given variant may err because of existing biases in databases, and they sometimes show limited agreement among themselves. Advances in functional and comparative genomics are needed to properly annotate these variants. The goals of this study are, first, to functionally annotate Common Variable Immunodeficiency disease (CVID) variants with the available bioinformatic methods in order to assess the reliability of these strategies. Second, because the development of new methods to reduce the number of candidate genetic variants is an active and necessary field of research, we explore the utility of gene-function information at the organism level as a filter for identifying rare-disease genes. It has recently been proposed that only 10-15% of human genes are essential, and we would therefore expect severe rare diseases to be caused mostly by mutations in them. Our goal is to determine whether or not these rare and severe diseases are caused by deleterious mutations in these essential genes. If this hypothesis were true, using essential genes as a filter would be an interesting way to identify disease-causing mutations.
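The filtering cascade discussed above (allele frequency, predicted effect, and the proposed essentiality filter) can be illustrated with a toy example. All gene names, frequencies, predictions, and the essential-gene set below are invented for the sketch; this is not the study's pipeline or data.

```python
# Hypothetical candidate variants: each carries an allele frequency and a
# computational annotation, as in the two standard filters described above.
candidates = [
    {"gene": "LRBA",   "allele_freq": 0.0001, "predicted": "deleterious"},
    {"gene": "GENE_X", "allele_freq": 0.0300, "predicted": "deleterious"},
    {"gene": "NFKB1",  "allele_freq": 0.0002, "predicted": "benign"},
]

# Stand-in for a curated list of essential genes (the proposed third filter).
essential_genes = {"LRBA", "NFKB1"}

def passes_filters(v, max_freq=0.001):
    # Keep a variant only if it is rare, predicted deleterious, and falls
    # in an essential gene.
    return (v["allele_freq"] <= max_freq
            and v["predicted"] == "deleterious"
            and v["gene"] in essential_genes)

kept = [v["gene"] for v in candidates if passes_filters(v)]
print(kept)  # → ['LRBA']
```

Each filter removes one of the three candidates here: the frequency filter drops GENE_X, the annotation filter drops the NFKB1 variant, and only the rare deleterious variant in an essential gene survives.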
Abstract:
The objective of this study was to estimate the potential of method restriction as a public health strategy in suicide prevention. Data from the Swiss Federal Statistical Office and the Swiss Institutes of Forensic Medicine from 2004 were gathered and categorized into suicide submethods according to accessibility to restriction of means. Of suicides in Switzerland, 39.2% are accessible to method restriction. The highest proportions were found in private weapons (13.2%), army weapons (10.4%), and jumps from hot-spots (4.6%). The presented method permits the estimation of the suicide prevention potential of a country by method restriction and the comparison of restriction potentials between suicide methods. In Switzerland, reduction of firearm suicides has the highest potential to reduce the total number of suicides.
Abstract:
PURPOSE: Primary bone lymphoma (PBL) represents less than 1% of all malignant lymphomas. In this study, we assessed the disease profile, outcome, and prognostic factors in patients with Stage I and II PBL. PATIENTS AND METHODS: Thirteen Rare Cancer Network (RCN) institutions enrolled 116 consecutive patients with PBL treated between 1987 and 2008. Eighty-seven patients underwent chemoradiotherapy (CXRT) without (78) or with (9) surgery, 15 underwent radiotherapy (RT) without (13) or with (2) surgery, and 14 received chemotherapy (CXT) without (9) or with (5) surgery. The median RT dose was 40 Gy (range, 4-60). The median number of CXT cycles was six (range, 2-8). Median follow-up was 41 months (range, 6-242). RESULTS: The overall response rate at the end of treatment was 91% (complete response [CR] 74%, partial response [PR] 17%). Local recurrence or progression was observed in 12 patients (10%) and systemic recurrence in 17 (15%). The 5-year overall survival (OS), lymphoma-specific survival (LSS), and local control (LC) rates were 76%, 78%, and 92%, respectively. In univariate analyses (log-rank test), favorable prognostic factors for OS and LSS were an International Prognostic Index (IPI) score ≤1 (p = 0.009), high-grade histology (p = 0.04), CXRT (p = 0.05), CXT (p = 0.0004), CR (p < 0.0001), and RT dose >40 Gy (p = 0.005). For LC, only CR and Stage I were favorable factors. In multivariate analysis, IPI score, RT dose, CR, and CXT independently influenced outcome (OS and LSS); CR was the only predictive factor for LC. CONCLUSION: This large multicenter retrospective study confirms the good prognosis of early-stage PBL treated with combined CXRT. An adequate RT dose and a complete CXT regimen were associated with better outcome.
Abstract:
Because data on rare species usually are sparse, it is important to have efficient ways to sample additional data. Traditional sampling approaches are of limited value for rare species because a very large proportion of randomly chosen sampling sites are unlikely to shelter the species. For these species, spatial predictions from niche-based distribution models can be used to stratify the sampling and increase sampling efficiency. New data sampled are then used to improve the initial model. Applying this approach repeatedly is an adaptive process that may increase the number of new occurrences found. We illustrate the approach with a case study of a rare and endangered plant species in Switzerland and a simulation experiment. Our field survey confirmed that the method helps in the discovery of new populations of the target species in remote areas where the predicted habitat suitability is high. In our simulations the model-based approach provided a significant improvement (by a factor of 1.8 to 4, depending on the measure) over simple random sampling. In terms of cost this approach may save up to 70% of the time spent in the field.
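The intuition behind model-based stratification can be shown with a toy simulation, in the spirit of (but much simpler than) the simulation experiment above. The suitability surface and occupancy model below are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sites = 10_000

# Invented habitat-suitability predictions: most sites are unsuitable,
# as expected for a rare species.
suitability = rng.beta(0.5, 5.0, n_sites)

# Occupancy: presence probability proportional to suitability (toy model).
occupied = rng.random(n_sites) < suitability * 0.3

budget = 200  # number of sites we can afford to visit

# Strategy 1: simple random sampling of sites.
random_sites = rng.choice(n_sites, budget, replace=False)

# Strategy 2: model-based sampling -- visit the top-suitability stratum first.
model_sites = np.argsort(suitability)[-budget:]

found_random = occupied[random_sites].sum()
found_model = occupied[model_sites].sum()
print(found_random, found_model)
```

Because most randomly chosen sites have near-zero suitability, concentrating the fixed budget on the high-suitability stratum yields many more discoveries per site visited, which is the efficiency gain the abstract reports.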
Abstract:
We report a case of abdominal eventration associated with cystic fibrosis, diagnosed by mid-trimester ultrasonography. The defect involved the abdominal muscles and their aponeurotic sheath but spared the skin. There was no associated malformation. The outcome was favorable after surgery, and the infant was well at the age of 6 months.
Abstract:
In this paper, some steganalytic techniques designed to detect the existence of messages hidden by histogram-shifting methods are presented. First, techniques to identify specific histogram-shifting methods, based on visible marks on the histogram or on abnormal statistical distributions, are suggested. Then, we present a general technique capable of detecting all of the histogram-shifting techniques analyzed. This technique is based on the effect of histogram-shifting methods on the "volatility" of the histogram of differences, which is reduced whenever new data are hidden.
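The first idea above, detecting a visible mark on the histogram, can be sketched on a toy image. This is a simplified illustration, not the paper's detectors: a bare histogram-shifting step (payload embedding omitted) empties the bin next to the histogram peak, and the detector simply looks for an empty interior bin flanked by well-populated neighbours. The threshold is arbitrary.

```python
import numpy as np

def hs_shift(img):
    # Simplified histogram-shifting step: values above the histogram peak
    # move up by one, leaving bin peak+1 empty (the visible mark).
    # Payload embedding and overflow handling are omitted for brevity.
    h = np.bincount(img.ravel(), minlength=256)
    peak = int(h.argmax())
    out = img.astype(int)
    out[out > peak] += 1
    return out.clip(0, 255).astype(np.uint8)

def has_suspicious_gap(img):
    # Detector: flag an empty interior bin whose two neighbours are both
    # well populated -- very unlikely in a natural histogram.
    h = np.bincount(img.ravel(), minlength=256)
    for v in range(1, 255):
        if h[v] == 0 and h[v - 1] > 20 and h[v + 1] > 20:
            return True
    return False

rng = np.random.default_rng(0)
# Toy "natural" cover image with a smooth, roughly Gaussian histogram.
cover = rng.normal(128, 8, (64, 64)).round().clip(0, 255).astype(np.uint8)
stego = hs_shift(cover)
print(has_suspicious_gap(cover), has_suspicious_gap(stego))
```

Real embedding refills part of the emptied bin with payload bits, which is why the paper's general detector works on the volatility of the histogram of differences rather than on a literal empty bin.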
Abstract:
The objective of this work was to determine how taxonomy benefited from quantitative, site-based ecological sampling methods in enchytraeid studies. Enchytraeids (small relatives of earthworms) were sampled in different phases of rain forest regeneration in the southern Mata Atlântica in Paraná, Brazil. The research combined ecological and taxonomic work, because enchytraeids are poorly studied and difficult to identify, and many new species were expected. The availability of large numbers of specimens made it possible to test species diagnoses by investigating the range of character variation across larger series of specimens. Simplified species diagnoses, adapted to local conditions, were developed that allowed the identification of all specimens, juveniles included. Key characters and character states are presented for the three genera Achaeta, Hemienchytraeus, and Guaranidrilus. Among several new species, a rare species, possibly a remnant of the autochthonous forest fauna, was found and described.
Abstract:
BACKGROUND: PCR has the potential to detect and precisely quantify specific DNA sequences, but it is not yet often used as a fully quantitative method. A number of data collection and processing strategies have been described for the implementation of quantitative PCR. However, they can be experimentally cumbersome, their relative performances have not been evaluated systematically, and they often remain poorly validated statistically and/or experimentally. In this study, we evaluated the performance of known methods and compared them with newly developed data processing strategies in terms of resolution, precision, and robustness. RESULTS: Our results indicate that simple methods that do not rely on the estimation of the efficiency of the PCR amplification may provide reproducible and sensitive data, but that they do not quantify DNA with precision. Other evaluated methods based on sigmoidal or exponential curve fitting were generally of both poor resolution and precision. A statistical analysis of the parameters that influence efficiency indicated that it depends mostly on the selected amplicon and to a lesser extent on the particular biological sample analyzed. Thus, we devised various strategies based on individual or averaged efficiency values, which were used to assess the regulated expression of several genes in response to a growth factor. CONCLUSION: Overall, qPCR data analysis methods differ significantly in their performance, and this analysis identifies methods that provide DNA quantification estimates of high precision, robustness, and reliability. These methods allow reliable estimation of relative expression ratios of two-fold or higher, and our analysis provides an estimation of the number of biological samples that have to be analyzed to achieve a given precision.
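One standard form of the efficiency-based strategies discussed above is the efficiency-corrected relative expression ratio (a Pfaffl-type calculation). The Ct values and efficiencies below are illustrative, not from this study, and the study's own strategies may differ in detail.

```python
def expression_ratio(E_target, dCt_target, E_ref, dCt_ref):
    # Efficiency-corrected relative expression ratio.
    # E is the amplification efficiency per cycle (E = 2 means perfect
    # doubling); dCt = Ct(control) - Ct(treated) for each gene.
    # The target gene's fold change is normalised by the reference gene's.
    return (E_target ** dCt_target) / (E_ref ** dCt_ref)

# Illustrative values: the target amplifies with efficiency 1.95 and shifts
# by 3 cycles; the reference gene has efficiency 2.0 and shifts by 0.5.
ratio = expression_ratio(1.95, 3.0, 2.0, 0.5)
print(round(ratio, 2))  # → 5.24
```

Using per-amplicon (or averaged) efficiencies instead of assuming E = 2 everywhere is exactly the kind of choice the abstract's comparison addresses: a small efficiency error compounds exponentially over the Ct difference.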
Abstract:
BACKGROUND: To assess the clinical profile, treatment outcome, and prognostic factors in primary breast lymphoma (PBL). METHODS: Between 1970 and 2000, 84 consecutive patients with PBL were treated in 20 institutions of the Rare Cancer Network. Forty-six patients had Ann Arbor stage IE, 33 stage IIE, 1 stage IIIE, 2 stage IVE, and 2 an unknown stage. Twenty-one underwent mastectomy, 39 conservative surgery, and 23 biopsy; 51 received radiotherapy (RT) with (n = 37) or without (n = 14) chemotherapy. The median RT dose was 40 Gy (range 12-55 Gy). RESULTS: Ten patients (12%) progressed locally and 43 (55%) had a systemic relapse. The central nervous system (CNS) was the site of relapse in 12 cases (14%). The 5-year overall survival, lymphoma-specific survival, disease-free survival, and local control rates were 53%, 59%, 41%, and 87%, respectively. In univariate analyses, favorable prognostic factors were early stage, conservative surgery, RT administration, and combined-modality treatment. Multivariate analysis showed that early stage and the use of RT were favorable prognostic factors. CONCLUSION: The outcome of PBL is fair. Local control is excellent with RT or combined-modality treatment, but systemic relapses, including those in the CNS, occur frequently.
Abstract:
Motivation: The comparative analysis of gene gain and loss rates is critical for understanding the role of natural selection and adaptation in shaping gene family sizes. Studying complete genome data from closely related species allows accurate estimation of gene family turnover rates. Current methods and software tools, however, are not well designed for dealing with certain kinds of functional elements, such as microRNAs or transcription factor binding sites. Results: Here, we describe BadiRate, a new software tool to estimate family turnover rates, as well as the number of elements at internal phylogenetic nodes, by likelihood-based methods and parsimony. It implements two stochastic population models, which provide the appropriate statistical framework for testing hypotheses such as lineage-specific gene family expansions or contractions. We have assessed the accuracy of BadiRate by computer simulations, and have also illustrated its functionality by analyzing a representative empirical dataset.
Abstract:
Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of consequences of inappropriate actions and decisions, and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure, and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reassessed periodically by governments, insurance companies, or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and give no information on exposure, intensity, or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP, and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk in order to raise awareness among policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases, and statistical analysis. It required a large amount of data (1.7 TB on both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive risk model was developed to reveal the distribution of hazards, exposure, and risk, and to identify underlying risk factors; this was done for several hazards (e.g., floods, tropical cyclones, earthquakes, and landslides). Two multiple-risk indexes were generated to compare countries. The results include an evaluation of the roles of hazard intensity, exposure, poverty, and governance in the pattern and trends of risk. It appears that vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as intensity increases.

Locally, the method was tested to highlight the influence of climate change and ecosystem decline on hazards. In northern Pakistan, deforestation exacerbates landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed rapid glacier retreat and provided an assessment of the remaining ice volume as well as scenarios of possible evolution.

These results were presented to different audiences, including 160 governments. The results and data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to different scales and issues, with good prospects for adaptation to other research areas. Risk characterization at the global level and identification of the role of ecosystems in disaster risk are rapidly developing fields. This research revealed many challenges; some were resolved, while others remain limitations. However, it is clear that the level of development, and moreover unsustainable development, shapes a large part of disaster risk, and that the dynamics of risk are governed primarily by global change.