948 results for Cryptography Statistical methods


Relevance: 80.00%

Abstract:

Environmental computer models are deterministic models designed to predict environmental phenomena such as air pollution or meteorological events. Numerical model output is given in terms of averages over grid cells, usually at high spatial and temporal resolution. However, these outputs are often biased, of unknown calibration, and provided without any information about the associated uncertainty. Conversely, data collected at monitoring stations are more accurate, since they essentially provide the true levels. Given the leading role played by numerical models, it is now important to compare model output with observations. Statistical methods developed to combine numerical model output and station data are usually referred to as data fusion. In this work, we first combine ozone monitoring data with ozone predictions from the Eta-CMAQ air quality model in order to forecast, in real time, the current 8-hour average ozone level, defined as the average of the previous four hours, the current hour, and the predictions for the next three hours. We propose a Bayesian downscaler model based on first differences with a flexible coefficient structure and an efficient computational strategy to fit the model parameters. Model validation for the eastern United States shows a consequential improvement of our fully inferential approach over the current real-time forecasting system. Furthermore, we consider the introduction of temperature data from a weather forecast model into the downscaler, showing improved real-time ozone predictions. Finally, we introduce a hierarchical model to obtain spatially varying uncertainty associated with numerical model output. We show how we can learn about such uncertainty through suitable stochastic data fusion modeling using some external validation data. We illustrate our Bayesian model by providing the uncertainty map associated with a temperature output over the northeastern United States.
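As a hedged orientation to the kind of downscaler referred to above, the usual spatially varying regression of station observations on gridded model output can be written as follows; the thesis' specific first-difference coefficient structure and priors are not reproduced here.

```latex
% Y(s,t): station observation at location s and time t; x(B_s,t): numerical-model
% output for the grid cell B_s containing s.
\[
  Y(s,t) \;=\; \beta_0(s,t) \;+\; \beta_1(s,t)\, x(B_s,t) \;+\; \varepsilon(s,t),
  \qquad \varepsilon(s,t) \sim N(0,\sigma^2_{\varepsilon}),
\]
% with spatially varying coefficients, e.g. \beta_j(s,t) = \beta_j + w_j(s,t) for
% Gaussian-process terms w_j. A first-difference variant instead relates
% Y(s,t) - Y(s,t-1) to x(B_s,t) - x(B_s,t-1), which is convenient for real-time updating.
```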

Relevance: 80.00%

Abstract:

Neurodevelopment of preterm children has become an outcome of major interest since the improvement in survival due to advances in neonatal care. Many studies have focused on the relationship between prenatal characteristics and neurodevelopmental outcome in order to identify the subgroups of preterm infants at higher risk. The aim of this study is to analyze growth and development trajectories jointly in order to investigate their association. A total of 346 children born at the S.Orsola Hospital in Bologna from 01/01/2005 to 30/06/2011 with a birth weight of less than 1500 grams were followed up in a longitudinal study at different intervals from 3 to 24 months of corrected age. During follow-up visits, the infants' main biometric characteristics were measured and the Griffiths Mental Development Scale was administered to assess neurodevelopment. Latent Curve Models were developed to estimate the trajectories of length and of neurodevelopment, both separately and combined in a single model, and to assess the influence of clinical and socio-economic variables. The neurodevelopment trajectory declined stepwise over time, while the length trajectory increased steeply until 12 months and flattened afterwards. Higher initial values of length were correlated with higher initial values of neurodevelopment and predicted a more steeply declining neurodevelopment trajectory. SGA preterm infants and those from families with higher socio-economic status had a less steeply declining neurodevelopment slope, while being born to a migrant mother had a negative effect on neurodevelopment through the mediating effect of being taller at 3 months. A longer stay in the NICU (used as a proxy of morbidity) was predictive of lower initial neurodevelopment levels. At 24 months, neurodevelopment is more homogeneous among preterm infants and can be evaluated more accurately. The association between neurodevelopment and physical growth may provide further insights into the determinants of preterm infants' outcomes. Sound statistical methods, exploiting all the information collected in a longitudinal study, may be more appropriate for this kind of analysis.
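A minimal sketch of a latent curve (latent growth) model of the kind used above, written generically; the time scores, covariates and parallel-process specification of the study itself are not reproduced here.

```latex
% y_{it}: outcome (length or Griffiths score) for child i at follow-up occasion t.
\[
  y_{it} \;=\; \eta_{0i} \;+\; \lambda_t\,\eta_{1i} \;+\; \varepsilon_{it},
  \qquad
  \eta_{0i} \;=\; \alpha_0 + \gamma_0^{\top} x_i + \zeta_{0i},
  \qquad
  \eta_{1i} \;=\; \alpha_1 + \gamma_1^{\top} x_i + \zeta_{1i},
\]
% where \lambda_t are time scores, x_i collects clinical and socio-economic covariates,
% and (\zeta_{0i}, \zeta_{1i}) have an unrestricted covariance. In a parallel-process
% (combined) model, the intercept and slope factors of the length and neurodevelopment
% trajectories are additionally linked through covariances or regressions.
```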

Relevance: 80.00%

Abstract:

Despite the scientific achievements of the last decades in astrophysics and cosmology, the majority of the Universe's energy content is still unknown. A potential solution to the “missing mass problem” is the existence of dark matter in the form of WIMPs. Due to the very small cross section for WIMP-nucleon interactions, the number of expected events is very limited (about 1 event/tonne/year), thus requiring detectors with a large target mass and a low background level. The aim of the XENON1T experiment, the first tonne-scale LXe-based detector, is to be sensitive to WIMP-nucleon cross sections as low as 10^-47 cm^2. To assess whether such a detector can reach this goal, Monte Carlo simulations of the background are mandatory. To this end, the GEANT4 toolkit has been used to implement the detector geometry and to simulate the decays from the various background sources, both electromagnetic and nuclear. From the analysis of the simulations, the background level was found to be fully acceptable for the purposes of the experiment: about 1 background event in a 2 tonne-year exposure. Using the Maximum Gap method, the XENON1T sensitivity was then evaluated, and the minimum of the WIMP-nucleon cross-section limit was found at 1.87 x 10^-47 cm^2, at 90% CL, for a WIMP mass of 45 GeV/c^2. The results were independently cross-checked with the Likelihood Ratio method, which confirmed them to within less than a factor of two, an agreement that is entirely acceptable considering the intrinsic differences between the two statistical methods. The thesis thus shows that the XENON1T detector will be able to reach its design sensitivity, lowering the limits on the WIMP-nucleon cross section by about two orders of magnitude with respect to current experiments.
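As a hedged illustration of the Maximum Gap method mentioned above, the sketch below implements Yellin's C0 construction for an upper limit on the expected number of signal events; the event positions used in the example are hypothetical, and the real XENON1T sensitivity analysis is considerably more elaborate.

```python
import numpy as np
from math import exp, factorial

def max_gap_c0(x, mu):
    """Yellin's C0(x, mu): probability that the largest gap between events is
    smaller than x, when mu events are expected (both in expected-event units)."""
    m = int(mu // x)
    total = 0.0
    for k in range(m + 1):
        term = (k * x - mu) ** k * exp(-k * x) / factorial(k)
        if mu - k * x > 0:
            term *= 1.0 + k / (mu - k * x)
        total += term
    return total

def max_gap_upper_limit(signal_cdf_of_events, cl=0.90):
    """Upper limit on the expected number of signal events, given the positions of
    the observed events mapped through the expected signal CDF onto [0, 1]."""
    pts = np.sort(np.concatenate(([0.0], signal_cdf_of_events, [1.0])))
    gap_fraction = np.max(np.diff(pts))          # largest gap as a fraction of the spectrum
    for mu in np.linspace(0.1, 100.0, 5000):
        if max_gap_c0(gap_fraction * mu, mu) >= cl:
            return mu                            # smallest mu excluded at the requested CL
    return None

# Hypothetical example: three events observed at these signal-CDF positions.
print(max_gap_upper_limit(np.array([0.12, 0.47, 0.81])))
```

The resulting limit on the expected number of signal events would then be converted into a cross-section limit by dividing by exposure and detection efficiency.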

Relevance: 80.00%

Abstract:

Glaucoma is, after cataract, the second most frequent cause of blindness worldwide, with millions of people affected by this initially largely symptom-free neurodegenerative disease. Diagnostic options have so far been largely limited to the measurement of intraocular pressure and the assessment of the ocular fundus by an experienced ophthalmologist. A laboratory-based prophylactic test is not yet available, and the number of undetected cases is correspondingly high. Valuable time is thereby lost that could be used for effective therapy. Regarding the pathogenesis of glaucoma, several interacting pathomechanisms are assumed today, which include, besides the mechanical effects of elevated IOP, hypoxia, reduced neurotrophin supply, excitotoxicity, oxidative stress, and the involvement of autoimmune processes. Independently of the pathomechanism, extensive degenerative processes are established in the optic nerve head, the retinal ganglion cells, and the axons of the optic nerve, ultimately leading to the irreversible loss of these neurons. These pathological processes in the CNS leave traces at the proteome level that can be detected with modern mass spectrometric methods in combination with multivariate statistical methods and represented as so-called biomarker candidates of defined molecular weight. In this work, a workflow was developed that makes it possible to identify and characterize these biomarker candidates in blood serum and in tear fluid in simple, reproducible steps. Departing from the established bottom-up proteomics methodology, a method following a top-down philosophy had to be developed that allows the traces of glaucoma in the proteome to be detected and characterized. This was achieved using mass spectrometric methods such as SELDI-TOF® and MALDI-TOF/TOF as well as bead-, gel- and liquid-chromatography-based separation and fractionation techniques. The successful combination of these methods led to the identification of a whole series of biomarker candidates. Among the identified proteins whose corresponding SELDI peaks lie in the mass range of biomarker candidates are cytokines and effector molecules of innate immunity, stress-inducible kinases, factors that protect telomeres, proliferation markers, neuronal antigens, and transport proteins. In addition, components involved in neuronal neurotrophin supply, neuronal receptors and antigens, and components of the complement system and of the MHC-I complex were identified. All of these identified proteins are described and characterized in detail with respect to their function and their possible role in the pathogenesis of glaucoma. This provides a comprehensive insight into all pathomechanisms that, according to current knowledge, are thought to play a role in the pathogenesis of glaucoma.
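Purely as an illustrative sketch of the multivariate step described above, and not the workflow actually used in the thesis, a discrimination of glaucoma versus control samples from their mass-peak intensities might look like this; all data here are synthetic placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical data: rows = serum or tear-fluid samples, columns = SELDI/MALDI peak intensities.
rng = np.random.default_rng(0)
X = rng.lognormal(size=(60, 120))     # 60 samples, 120 mass peaks
y = np.repeat([0, 1], 30)             # 0 = control, 1 = glaucoma

# Dimension reduction followed by a simple discriminant classifier, assessed by
# cross-validation; peaks that drive the discrimination would be carried forward
# as biomarker candidates for identification, e.g. by MS/MS.
model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
print(cross_val_score(model, X, y, cv=5).mean())
```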

Relevance: 80.00%

Abstract:

This is the second part of a study investigating a model-based transient calibration process for diesel engines. The first part addressed the data requirements and data processing required for empirical transient emission and torque models. The current work focuses on modelling and optimization. The unexpected result of this investigation is that, when trained on transient data, simple regression models perform better than more powerful methods such as neural networks or localized regression. This result has been attributed to extrapolation over data that have estimated rather than measured transient air-handling parameters. The challenges of detecting and preventing extrapolation using statistical methods that work well with steady-state data have been explained. The concept of constraining the distribution of statistical leverage relative to the distribution of the starting solution, in order to prevent extrapolation during the optimization process, has been proposed and demonstrated. Separate from the issue of extrapolation is the need to prevent the search from being quasi-static. Second-order linear dynamic constraint models have been proposed to prevent the search from returning solutions that would be feasible if each point were run at steady state but are unrealistic in a transient sense. Dynamic constraint models translate commanded parameters into the parameters actually achieved, which then feed into the transient emission and torque models. Combined model inaccuracies have been used to adjust the optimized solutions. To frame the optimization problem within reasonable dimensionality, the coefficients of commanded surfaces that approximate engine tables are adjusted during search iterations, each of which involves simulating the entire transient cycle. The resulting strategy differs from the corresponding manual calibration strategy, results in lower emissions and improved efficiency, and is intended to improve rather than replace the manual calibration process.
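A minimal sketch of a second-order linear dynamic constraint model of the kind described above, written as a discrete-time filter that maps a commanded parameter trajectory onto the trajectory actually achieved; the gain, natural frequency and damping values are placeholders rather than identified values from the study.

```python
import numpy as np

def second_order_response(u, dt, gain=1.0, wn=2.0, zeta=0.8):
    """Simulate gain * wn^2 / (s^2 + 2*zeta*wn*s + wn^2) applied to the commanded
    signal u, sampled every dt seconds, using forward-Euler integration."""
    y = np.zeros_like(u, dtype=float)   # achieved parameter
    dy = 0.0                            # its rate of change
    for k in range(1, len(u)):
        ddy = gain * wn**2 * u[k - 1] - 2.0 * zeta * wn * dy - wn**2 * y[k - 1]
        dy += ddy * dt
        y[k] = y[k - 1] + dy * dt
    return y

# Hypothetical commanded step (e.g. an air-handling set point) over a transient segment;
# the achieved trajectory is what would feed the transient emission and torque models.
t = np.arange(0.0, 5.0, 0.01)
commanded = np.where(t > 1.0, 1.0, 0.0)
achieved = second_order_response(commanded, dt=0.01)
```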

Relevance: 80.00%

Abstract:

The objective of this study was to develop a criteria catalogue serving as a guideline for authors to improve the quality of reporting of experiments in basic research in homeopathy. A Delphi process was initiated, comprising three rounds of adjusting and phrasing plus two consensus conferences. European researchers who had published experimental work within the last 5 years were involved. The resulting checklist for authors provides a catalogue of 23 criteria. The “Introduction” should focus on the underlying hypotheses and the homeopathic principle investigated, and should state whether the experiments are exploratory or confirmatory. “Materials and methods” should comprise information on the object of investigation, the experimental setup, parameters, intervention and statistical methods. A more detailed description of the homeopathic substances is required, for example their manufacture, dilution method and starting point of dilution. A further result of the Delphi process is to raise scientists' awareness of reporting blinding, allocation, replication, quality control and system performance controls. The “Results” section should provide the exact number of treated units per setting included in each analysis and should state missing samples and dropouts. Results presented in tables and figures are as important as appropriate measures of effect size, uncertainty and probability. The “Discussion” should offer more than a general interpretation of the results in the context of current evidence; it should also address limitations and appraise the suitability of the chosen experimental model. Authors of homeopathic basic research publications are encouraged to apply our checklist when preparing their manuscripts. Feedback on the applicability, strengths and limitations of the list is encouraged to enable future revisions.

Relevance: 80.00%

Abstract:

The present study investigates the relationships among perceived arousal (continuous self-rating), autonomic nervous system activity (heart rate, heart rate variability) and musical characteristics (sound intensity, musical rhythm) while listening to a complex musical piece. Twenty amateur musicians listened to two performances of Chopin's "Tristesse" with different rhythmic shapes. Besides conventional statistical methods for analyzing psychophysiological reactions (heart rate, respiration rate) and musical variables, semblance analysis was used. Perceived arousal correlated strongly with sound intensity; heart rate showed only a partial response to changes in sound intensity. Larger changes in heart rate were caused by the version with more rhythmic tension. The low-/high-frequency ratio of heart rate variability increased during music listening, whereas the high-frequency component decreased. We conclude that autonomic nervous system activity can be modulated not only by sound intensity but also by the interpreter's use of rhythmic tension. Semblance analysis enables us to track the subtle correlations between musical and physiological variables.
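As a hedged sketch of the LF/HF measure mentioned above, one common way to compute it from RR intervals is shown below, using the conventional 0.04-0.15 Hz and 0.15-0.40 Hz bands; this is not necessarily the exact procedure used in the study, and the RR series here is synthetic.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def lf_hf_ratio(rr_ms, fs=4.0):
    """LF/HF ratio of heart rate variability from RR intervals given in milliseconds."""
    rr_s = np.asarray(rr_ms) / 1000.0
    beat_times = np.cumsum(rr_s)                               # time stamp of each beat
    grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
    rr_even = interp1d(beat_times, rr_s, kind="cubic")(grid)   # evenly resampled tachogram
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(rr_even)))
    df = f[1] - f[0]
    lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df              # low-frequency power
    hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df              # high-frequency power
    return lf / hf

# Hypothetical RR series: roughly 70 bpm with mild beat-to-beat variability.
rng = np.random.default_rng(1)
rr = 857 + rng.normal(0, 25, size=300)
print(lf_hf_ratio(rr))
```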

Relevance: 80.00%

Abstract:

Instrumental temperature series are often affected by artificial breaks (“break points”) due to, for example, changes in station location, land use, or instrumentation. The Swiss climate observation network offers a high number and density of stations, many long and relatively complete daily to sub-daily temperature series, and well-documented station histories (i.e., metadata). However, for many climate observation networks outside of Switzerland, detailed station histories are missing, incomplete, or inaccessible. To correct these records, the use of reliable statistical break detection methods is necessary. Here, we apply three statistical break detection methods to high-quality Swiss temperature series and use the available metadata to assess the methods. Due to the complex terrain in Switzerland, we are able to assess these methods under specific local conditions such as Foehn or crest situations. We find that the temperature series of all stations are affected by artificial breaks (on average, one break point per 48 years), with discrepancies in the abilities of the methods to detect breaks. However, by combining the three statistical methods, almost all of the detected break points are confirmed by metadata. In most cases, these break points can be ascribed to a combination of factors in the station history.
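One widely used break-detection statistic, shown here as a minimal sketch applied to a candidate-minus-reference series; the paper compares several detection methods, which are not reproduced here.

```python
import numpy as np

def snht_break(q):
    """Standard Normal Homogeneity Test statistic for a (candidate - reference) series q.
    Returns the maximum statistic and the most likely break position."""
    z = (q - q.mean()) / q.std(ddof=0)            # standardized difference series
    n = len(z)
    t = np.empty(n - 1)
    for k in range(1, n):                         # candidate break after element k
        t[k - 1] = k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
    return t.max(), int(np.argmax(t)) + 1         # compare t.max() with tabulated critical values

# Hypothetical example: a 60-year annual series with a 0.8-unit shift inserted after year 35.
rng = np.random.default_rng(2)
q = rng.normal(0.0, 0.3, 60)
q[35:] += 0.8
print(snht_break(q))
```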

Relevance: 80.00%

Abstract:

The AEGISS (Ascertainment and Enhancement of Gastrointestinal Infection Surveillance and Statistics) project aims to use spatio-temporal statistical methods to identify anomalies in the space-time distribution of non-specific gastrointestinal infections in the UK, using the Southampton area in southern England as a test case. In this paper, we use the AEGISS project to illustrate how spatio-temporal point process methodology can be used in the development of a rapid-response spatial surveillance system. Current surveillance of gastroenteric disease in the UK relies on general practitioners reporting cases of suspected food poisoning through a statutory notification scheme, voluntary laboratory reports of the isolation of gastrointestinal pathogens, and standard reports of general outbreaks of infectious intestinal disease by public health and environmental health authorities. However, most statutory notifications are made only after a laboratory reports the isolation of a gastrointestinal pathogen. As a result, detection is delayed and the ability to react to an emerging outbreak is reduced. For more detailed discussion, see Diggle et al. (2003). A new and potentially valuable source of data on the incidence of non-specific gastroenteric infections in the UK is NHS Direct, a 24-hour phone-in clinical advice service. NHS Direct data are less likely than reports by general practitioners to suffer from spatially and temporally localized inconsistencies in reporting rates. Also, reporting delays by patients are likely to be reduced, as no appointments are needed. Against this, NHS Direct data sacrifice specificity: each call to NHS Direct is classified only according to the general pattern of reported symptoms (Cooper et al., 2003). The current paper focuses on the use of spatio-temporal statistical analysis for early detection of unexplained variation in the spatio-temporal incidence of non-specific gastroenteric symptoms, as reported to NHS Direct. Section 2 describes our statistical formulation of this problem, the nature of the available data and our approach to predictive inference. Section 3 describes the stochastic model. Section 4 gives the results of fitting the model to NHS Direct data. Section 5 shows how the model is used for spatio-temporal prediction. The paper concludes with a short discussion.
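A sketch of the kind of spatio-temporal point-process formulation typically used in this setting, namely a log-Gaussian Cox process; the exact specification fitted to the NHS Direct data is given in the paper's Section 3 and is not reproduced here.

```latex
% Conditional on a latent Gaussian field S, reported cases form an inhomogeneous
% Poisson process with intensity
\[
  \lambda(x,t) \;=\; \lambda_0(x)\,\mu_0(t)\,\exp\{S(x,t)\},
\]
% where \lambda_0(x) describes the spatial distribution of the population at risk,
% \mu_0(t) captures purely temporal effects such as day-of-week and seasonal patterns,
% and S(x,t) is a spatio-temporal Gaussian process. Surveillance then flags locations
% and times where the predictive probability \Pr\{S(x,t) > c \mid \text{data}\}
% of an unexplained excess is large.
```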

Relevance: 80.00%

Abstract:

There are numerous statistical methods for quantitative trait linkage analysis in human studies. An ideal such method would have high power to detect genetic loci contributing to the trait, would be robust to non-normality in the phenotype distribution, would be appropriate for general pedigrees, would allow the incorporation of environmental covariates, and would be appropriate in the presence of selective sampling. We recently described a general framework for quantitative trait linkage analysis, based on generalized estimating equations, of which many current methods are special cases. This procedure is appropriate for general pedigrees and easily accommodates environmental covariates. In this paper, we use computer simulations to investigate the power and robustness of a variety of linkage test statistics built upon our general framework. We also propose two novel test statistics that take account of higher moments of the phenotype distribution in order to accommodate non-normality. These new linkage tests are shown to have high power and to be robust to non-normality. While we have not yet examined the performance of our procedures in the context of selective sampling via computer simulations, the proposed tests satisfy all of the other qualities of an ideal quantitative trait linkage analysis method.
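A generic sketch of the estimating-equation framework referred to above; the notation is illustrative, and the particular working covariances and higher-moment test statistics proposed in the paper are not reproduced.

```latex
% For pedigree i, let Y_i collect pairwise trait-based responses (for example, products or
% squared differences of trait values of relative pairs) and \mu_i(\beta) their expected
% values as a function of estimated IBD sharing and covariates. The parameters solve the
% generalized estimating equation
\[
  U(\beta) \;=\; \sum_{i=1}^{N} D_i^{\top} V_i^{-1}\bigl(Y_i - \mu_i(\beta)\bigr) \;=\; 0,
  \qquad D_i = \frac{\partial \mu_i}{\partial \beta^{\top}},
\]
% with V_i a working covariance matrix; linkage at a locus is tested through the component
% of \beta that multiplies the estimated IBD sharing at that locus.
```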

Relevance: 80.00%

Abstract:

A marker that is strongly associated with outcome (or disease) is often assumed to be effective for classifying individuals according to their current or future outcome. However, for this to be true, the associated odds ratio must be of a magnitude rarely seen in epidemiological studies. An illustration of the relationship between odds ratios and receiver operating characteristic (ROC) curves shows, for example, that a marker with an odds ratio as high as 3 is in fact a very poor classification tool. If a marker identifies 10 percent of controls as positive (false positives) and has an odds ratio of 3, then it will only correctly identify 25 percent of cases as positive (true positives). Moreover, the authors illustrate that a single measure of association such as an odds ratio does not meaningfully describe a marker’s ability to classify subjects. Appropriate statistical methods for assessing and reporting the classification power of a marker are described. The serious pitfalls of using more traditional methods based on parameters in logistic regression models are illustrated.
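The 10 percent / 25 percent figures quoted above follow directly from writing the odds ratio in terms of the true- and false-positive rates; the arithmetic, worked out:

```latex
\[
  \mathrm{OR} \;=\; \frac{\mathrm{TPR}/(1-\mathrm{TPR})}{\mathrm{FPR}/(1-\mathrm{FPR})}
  \quad\Longrightarrow\quad
  \frac{\mathrm{TPR}}{1-\mathrm{TPR}} \;=\; \mathrm{OR}\cdot\frac{\mathrm{FPR}}{1-\mathrm{FPR}}
  \;=\; 3 \cdot \frac{0.10}{0.90} \;=\; \frac{1}{3},
  \qquad\text{so}\qquad
  \mathrm{TPR} \;=\; \frac{1/3}{1 + 1/3} \;=\; 0.25 .
\]
% The marker therefore corresponds to the ROC point (FPR, TPR) = (0.10, 0.25):
% a very poor classifier despite an odds ratio of 3.
```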

Relevance: 80.00%

Abstract:

This paper considers statistical models in which two different types of events, such as the diagnosis of a disease and the remission of the disease, occur alternately over time and are observed subject to right censoring. We propose nonparametric estimators for the joint distribution of the bivariate recurrence times and the marginal distribution of the first recurrence time. In general, the marginal distribution of the second recurrence time cannot be estimated due to an identifiability problem, but a conditional distribution of the second recurrence time can be estimated nonparametrically. In the literature, statistical methods have been developed to estimate the joint distribution of bivariate recurrence times based on data from the first pair of censored bivariate recurrence times. These methods are not efficient in the current model because recurrence times of higher order are not used. Asymptotic properties of the estimators are established. Numerical studies demonstrate that the estimators perform well with practical sample sizes. We apply the proposed method to a Danish psychiatric case register data set to illustrate the methods and theory.
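A notational sketch of the observation scheme that produces the identifiability problem mentioned above; this is illustrative only, and the paper's estimators are not reproduced.

```latex
% Let T_1, T_2 be the first two alternating gap times (e.g. time to diagnosis, then time to
% remission) and C an independent right-censoring time. One observes
\[
  X_1 = \min(T_1, C), \quad \delta_1 = \mathbf{1}\{T_1 \le C\}, \qquad
  X_2 = \min\bigl(T_2,\,(C - T_1)_+\bigr), \quad \delta_2 = \mathbf{1}\{T_1 + T_2 \le C\}.
\]
% The second gap time is censored by C - T_1, which is in general correlated with T_2
% through T_1 (induced dependent censoring); hence the marginal distribution of T_2 is not
% identifiable without further assumptions, while the joint distribution of (T_1, T_2),
% the marginal of T_1, and a conditional distribution of T_2 can still be estimated
% nonparametrically.
```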

Relevance: 80.00%

Abstract:

Utilizing remote sensing methods to assess landscape-scale ecological change is rapidly becoming a dominant force in the natural sciences. Powerful and robust non-parametric statistical methods are also actively being developed to complement the unique characteristics of remotely sensed data. The focus of this research is to utilize these powerful, robust remote sensing and statistical approaches to shed light on woody plant encroachment into native grasslands, a troubling ecological phenomenon occurring throughout the world. Specifically, this research investigates western juniper encroachment within the sage-steppe ecosystem of the western USA. Western juniper trees are native to the intermountain west and are ecologically important, providing structural diversity and habitat for many species. However, after nearly 150 years of post-European-settlement changes to this threatened ecosystem, natural ecological processes such as fire regimes no longer limit the range of western juniper to rocky refugia and other areas protected from the short fire return intervals historically common in the region. Consequently, sage-steppe communities with high juniper densities exhibit negative impacts such as reduced structural diversity, degraded wildlife habitat and, ultimately, the loss of biodiversity. Much of today's sage-steppe ecosystem is transitioning to juniper woodland. Additionally, the majority of western juniper woodlands have not yet reached their full potential in either range or density. The first section of this research investigates the biophysical drivers responsible for the juniper expansion patterns observed in the sage-steppe ecosystem. The second section is a comprehensive accuracy assessment of classification methods used to identify juniper tree cover from multispectral 1 m spatial resolution aerial imagery.
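A hedged sketch of the kind of accuracy assessment mentioned in the last sentence: a standard confusion-matrix summary of a juniper / non-juniper classification against reference points. The data below are synthetic placeholders, not the thesis' imagery or classifiers.

```python
import numpy as np
from sklearn.metrics import accuracy_score, cohen_kappa_score, confusion_matrix

# Hypothetical validation points: reference labels (e.g. photo-interpreted) versus the
# classified map (1 = juniper cover, 0 = other), with roughly 85% agreement built in.
rng = np.random.default_rng(3)
reference = rng.integers(0, 2, size=500)
classified = np.where(rng.random(500) < 0.85, reference, 1 - reference)

cm = confusion_matrix(reference, classified)          # rows = reference, columns = classified
producers_accuracy = np.diag(cm) / cm.sum(axis=1)     # 1 - omission error, per class
users_accuracy = np.diag(cm) / cm.sum(axis=0)         # 1 - commission error, per class

print("confusion matrix:\n", cm)
print("overall accuracy:", accuracy_score(reference, classified))
print("producer's accuracy:", producers_accuracy)
print("user's accuracy:", users_accuracy)
print("kappa:", cohen_kappa_score(reference, classified))
```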

Relevance: 80.00%

Abstract:

INTRODUCTION: Liver cirrhosis develops in only a minority of heavy drinkers. Genetic factors may account for some of the variation in the progression of fibrosis in alcoholic liver disease (ALD). Transforming growth factor beta 1 (TGFbeta1) is a key profibrogenic cytokine in fibrosis, and its gene contains several polymorphic sites. A single nucleotide polymorphism at codon 25 has been suggested to affect fibrosis progression in patients with chronic hepatitis C virus infection, fatty liver disease, and hereditary hemochromatosis. Its contribution to the progression of ALD has not yet been investigated sufficiently. PATIENTS AND METHODS: One hundred and fifty-one heavy drinkers without apparent ALD, 149 individuals with alcoholic cirrhosis, and 220 alcoholic cirrhotics who underwent liver transplantation (LTX) were genotyped for TGFbeta1 codon 25 variants. RESULTS: Univariate analysis suggested that the genotypes Arg/Pro or Pro/Pro are associated with decompensated liver cirrhosis requiring LTX. However, after adjusting for patients' age, these genotypes did not confer a significant risk for cirrhosis requiring LTX. CONCLUSION: TGFbeta1 codon 25 genotypes Arg/Pro and Pro/Pro are not associated with alcoholic liver cirrhosis. Our study emphasizes the need for adequate statistical methods and accurate study design when evaluating the contribution of genetic variants to the course of chronic liver diseases.
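A hedged sketch of the kind of age-adjusted analysis reported above: a logistic regression of decompensated cirrhosis requiring LTX on codon 25 carrier status and age. Variable names and data are hypothetical placeholders, not the study's records.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data frame, one row per subject.
rng = np.random.default_rng(4)
n = 520
df = pd.DataFrame({
    "ltx": rng.integers(0, 2, n),          # 1 = decompensated cirrhosis requiring LTX
    "pro_carrier": rng.integers(0, 2, n),  # 1 = Arg/Pro or Pro/Pro at TGFbeta1 codon 25
    "age": rng.normal(50.0, 10.0, n),
})

# Crude association, then the age-adjusted model.
unadjusted = smf.logit("ltx ~ pro_carrier", data=df).fit(disp=0)
adjusted = smf.logit("ltx ~ pro_carrier + age", data=df).fit(disp=0)

print("crude OR:", np.exp(unadjusted.params["pro_carrier"]))
print("age-adjusted OR:", np.exp(adjusted.params["pro_carrier"]),
      "p =", adjusted.pvalues["pro_carrier"])
```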