995 results for Gibbs excess models


Relevance: 30.00%

Publisher:

Abstract:

Most statistical analysis, in both theory and practice, is concerned with static models: models with a proposed set of parameters whose values are fixed across observational units. Static models implicitly assume that the quantified relationships remain the same across the design space of the data. While this is reasonable under many circumstances, it can be a dangerous assumption when dealing with sequentially ordered data. The mere passage of time always brings fresh considerations, and the interrelationships among parameters, or subsets of parameters, may need to be continually revised. When data are gathered sequentially, dynamic interim monitoring may be useful as new subject-specific parameters are introduced with each new observational unit. Sequential imputation via dynamic hierarchical models is an efficient strategy for handling missing data and analyzing longitudinal studies. Dynamic conditional independence models offer a flexible framework that exploits the Bayesian updating scheme for capturing the evolution of both population and individual effects over time. While static models often describe aggregate information well, they often do not reflect conflicts in the information at the individual level. Dynamic models prove advantageous over static models in capturing both individual and aggregate trends. Computations for such models can be carried out via the Gibbs sampler. An application using small-sample repeated-measures normally distributed growth-curve data is presented.
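The Gibbs-sampler computations this abstract refers to can be illustrated with a minimal conjugate sketch: a single-group normal model with synthetic data, alternating draws of the mean and variance from their full conditionals. This is an illustration only, not the dissertation's dynamic hierarchical growth-curve model.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(5.0, 2.0, size=200)   # synthetic observations
n = len(y)

mu, sigma2 = 0.0, 1.0                # initial values
mu_draws = []
for it in range(2000):
    # mu | sigma2, y ~ Normal (flat prior on mu)
    mu = rng.normal(y.mean(), np.sqrt(sigma2 / n))
    # sigma2 | mu, y ~ Inverse-Gamma (Jeffreys prior 1/sigma2)
    sse = np.sum((y - mu) ** 2)
    sigma2 = 1.0 / rng.gamma(n / 2.0, 2.0 / sse)
    if it >= 500:                    # discard burn-in draws
        mu_draws.append(mu)

posterior_mean = np.mean(mu_draws)
```

Each full conditional is available in closed form here, which is exactly what makes the Gibbs sampler attractive for hierarchical normal models.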

Relevance: 30.00%

Publisher:

Abstract:

This paper revisits the issue of conditional volatility in real GDP growth rates for Canada, Japan, the United Kingdom, and the United States. Previous studies find high persistence in the volatility. This paper shows that this finding largely reflects a nonstationary variance. Output growth in the four countries became noticeably less volatile over the past few decades. In this paper, we employ the modified ICSS algorithm to detect structural change in the unconditional variance of output growth. One structural break exists in each of the four countries. We then use generalized autoregressive conditional heteroskedasticity (GARCH) specifications to model output growth and its volatility, with and without the break in volatility. Once we incorporate the break in the variance equation of output for the four countries, the time-varying variance falls sharply in Canada, Japan, and the U.K. and disappears in the U.S., while excess kurtosis vanishes in Canada, Japan, and the U.S. and drops substantially in the U.K. That is, the integrated GARCH (IGARCH) effect proves spurious and the GARCH model is misspecified if researchers neglect a nonstationary unconditional variance.
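The ICSS family of tests is built on the Inclán-Tiao cumulative-sum-of-squares statistic. A minimal sketch on a synthetic series with a volatility break (illustrative data only, not the paper's GDP growth series or its modified algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic series with a variance break at t = 150 (volatility falls)
e = np.concatenate([rng.normal(0, 2.0, 150), rng.normal(0, 0.5, 150)])

T = len(e)
C = np.cumsum(e ** 2)                     # cumulative sum of squares
k = np.arange(1, T + 1)
D = C / C[-1] - k / T                     # Inclan-Tiao centred statistic
k_hat = int(np.argmax(np.abs(D))) + 1     # candidate break location
stat = np.sqrt(T / 2.0) * np.max(np.abs(D))
break_found = stat > 1.358                # 5% asymptotic critical value
```

Under constant variance, C grows linearly and D stays near zero; a variance break bends the cumulative path, and the location of the largest deviation estimates the break date.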

Relevance: 30.00%

Publisher:

Abstract:

With the recognition of the importance of evidence-based medicine, there is an emerging need for methods to systematically synthesize available data. Specifically, methods to provide accurate estimates of test characteristics for diagnostic tests are needed to help physicians make better clinical decisions. To provide more flexible approaches for meta-analysis of diagnostic tests, we developed three Bayesian generalized linear models. Two of these models, a bivariate normal and a binomial model, analyzed pairs of sensitivity and specificity values while incorporating the correlation between these two outcome variables. Noninformative independent uniform priors were used for the variance of sensitivity, specificity and correlation. We also applied an inverse Wishart prior to check the sensitivity of the results. The third model was a multinomial model where the test results were modeled as multinomial random variables. All three models can include specific imaging techniques as covariates in order to compare performance. Vague normal priors were assigned to the coefficients of the covariates. The computations were carried out using the 'Bayesian inference using Gibbs sampling' implementation of Markov chain Monte Carlo techniques. We investigated the properties of the three proposed models through extensive simulation studies. We also applied these models to a previously published meta-analysis dataset on cervical cancer as well as to an unpublished melanoma dataset. In general, our findings show that the point estimates of sensitivity and specificity were consistent among Bayesian and frequentist bivariate normal and binomial models. However, in the simulation studies, the estimates of the correlation coefficient from Bayesian bivariate models are not as good as those obtained from frequentist estimation regardless of which prior distribution was used for the covariance matrix. 
The Bayesian multinomial model consistently underestimated the sensitivity and specificity regardless of the sample size and correlation coefficient. In conclusion, the Bayesian bivariate binomial model provides the most flexible framework for future applications because of the following strengths: (1) it facilitates direct comparison between different tests; (2) it captures the variability in both sensitivity and specificity simultaneously as well as the intercorrelation between the two; and (3) it can be directly applied to sparse data without ad hoc correction.
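As a simple point of reference for the bivariate models described above, sensitivity and specificity can be pooled on the logit scale with inverse-variance weights. This frequentist-style sketch (with hypothetical study counts) ignores the between-outcome correlation that the paper's Bayesian bivariate models capture:

```python
import numpy as np

# hypothetical per-study counts: (true pos, diseased, true neg, non-diseased)
studies = [(45, 50, 80, 100), (38, 40, 70, 90), (90, 100, 150, 180)]

def logit(p):
    return np.log(p / (1 - p))

sens_l, spec_l, w_se, w_sp = [], [], [], []
for tp, npos, tn, nneg in studies:
    se = (tp + 0.5) / (npos + 1)                    # continuity-corrected
    sp = (tn + 0.5) / (nneg + 1)
    sens_l.append(logit(se))
    spec_l.append(logit(sp))
    # inverse of the approximate variance of the logit
    w_se.append(1.0 / (1 / (tp + 0.5) + 1 / (npos - tp + 0.5)))
    w_sp.append(1.0 / (1 / (tn + 0.5) + 1 / (nneg - tn + 0.5)))

pooled_sens = 1 / (1 + np.exp(-np.average(sens_l, weights=w_se)))
pooled_spec = 1 / (1 + np.exp(-np.average(spec_l, weights=w_sp)))
```

The continuity correction is what lets this kind of pooling run on sparse cells at all; the Bayesian binomial model avoids the need for it, which is strength (3) above.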

Relevance: 30.00%

Publisher:

Abstract:

ESAT 2014. 27th European Symposium on Applied Thermodynamics, Eindhoven University of Technology, July 6-9, 2014.

Relevance: 30.00%

Publisher:

Abstract:

We performed a quantitative comparison of brittle thrust wedge experiments to evaluate the variability among analogue models and to appraise the reproducibility and limits of model interpretation. Fifteen analogue modeling laboratories participated in this benchmark initiative. Each laboratory received a shipment of the same type of quartz and corundum sand and all laboratories adhered to a stringent model building protocol and used the same type of foil to cover base and sidewalls of the sandbox. Sieve structure, sifting height, filling rate, and details on off-scraping of excess sand followed prescribed procedures. Our analogue benchmark shows that even for simple plane-strain experiments with prescribed stringent model construction techniques, quantitative model results show variability, most notably for surface slope, thrust spacing and number of forward and backthrusts. One of the sources of the variability in model results is related to slight variations in how sand is deposited in the sandbox. Small changes in sifting height, sifting rate, and scraping will result in slightly heterogeneous material bulk densities, which will affect the mechanical properties of the sand, and will result in lateral and vertical differences in peak and boundary friction angles, as well as cohesion values once the model is constructed. Initial variations in basal friction are inferred to play the most important role in causing model variability. Our comparison shows that the human factor plays a decisive role, and even when one modeler repeats the same experiment, quantitative model results still show variability. Our observations highlight the limits of up-scaling quantitative analogue model results to nature or for making comparisons with numerical models. 
The frictional behavior of sand is highly sensitive to small variations in material state or experimental set-up, and hence, it will remain difficult to scale quantitative results such as number of thrusts, thrust spacing, and pop-up width from model to nature.

Relevance: 30.00%

Publisher:

Abstract:

We investigate whether relative contributions of genetic and shared environmental factors are associated with an increased risk of melanoma. Data from the Queensland Familial Melanoma Project comprising 15,907 subjects arising from 1912 families were analyzed to estimate the additive genetic, common and unique environmental contributions to variation in the age at onset of melanoma. Two complementary approaches for analyzing correlated time-to-onset family data were considered: the generalized estimating equations (GEE) method, in which one can estimate relationship-specific dependence simultaneously with regression coefficients that describe the average population response to changing covariates; and a subject-specific Bayesian mixed model, in which heterogeneity in regression parameters is explicitly modeled and the different components of variation may be estimated directly. The proportional hazards and Weibull models were utilized, as both produce natural frameworks for estimating relative risks while adjusting for simultaneous effects of other covariates. A simple Markov chain Monte Carlo method for covariate imputation of missing data was used, and the actual implementation of the Bayesian model was based on Gibbs sampling using the freeware package BUGS. In addition, we also used a Bayesian model to investigate the relative contribution of genetic and environmental effects on the expression of naevi and freckles, which are known risk factors for melanoma.
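The Weibull model for age at onset can be sketched with a maximum-likelihood fit on synthetic, fully observed onset ages via the standard fixed-point iteration for the shape parameter. This is an illustration only: the study's actual analysis additionally handles censoring, covariates, and family structure.

```python
import numpy as np

rng = np.random.default_rng(5)
# synthetic ages at onset, Weibull with shape 1.5 and scale 60 (assumed values)
x = 60.0 * rng.weibull(1.5, size=500)

logx = np.log(x)
k = 1.0                                   # initial shape guess
for _ in range(200):                      # fixed-point iteration for the shape MLE
    xk = x ** k
    k = 1.0 / (np.sum(xk * logx) / np.sum(xk) - logx.mean())
scale = np.mean(x ** k) ** (1.0 / k)      # MLE of the scale parameter
```

The Weibull shape directly controls how the hazard of onset rises or falls with age, which is why it pairs naturally with proportional hazards reasoning.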

Relevance: 30.00%

Publisher:

Abstract:

Since the discovery in the 1970s that dendritic abnormalities in cortical pyramidal neurons are the most consistent pathologic correlate of mental retardation, research has focused on how dendritic alterations are related to reduced intellectual ability. Due in part to obvious ethical problems and in part to the lack of fruitful methods to study neuronal circuitry in the human cortex, there is little data about the microanatomical contribution to mental retardation. The recent identification of the genetic bases of some mental retardation-associated alterations, coupled with the technology to create transgenic animal models and the introduction of powerful sophisticated tools in the field of microanatomy, has led to a growth in studies of the alterations of pyramidal cell morphology in these disorders. Studies of individuals with Down syndrome, the most frequent genetic disorder leading to mental retardation, allow the analysis of the relationships between cognition, genotype and brain microanatomy. In Down syndrome the crucial question is to define the mechanisms by which an excess of normal gene products, in interaction with the environment, directs and constrains neural maturation, and how this abnormal development translates into cognition and behaviour. In the present article we discuss mainly Down syndrome-associated dendritic abnormalities and plasticity and the role of animal models in these studies. We believe that through the further development of such approaches, the study of the microanatomical substrates of mental retardation will contribute significantly to our understanding of the mechanisms underlying human brain disorders associated with mental retardation. (C) 2004 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Publisher:

Abstract:

A comparison of a constant (continuous delivery of 4% FiO2) and a variable (initial 5% FiO2 with adjustments to induce low amplitude EEG (LAEEG) and hypotension) hypoxic/ischemic insult was performed to determine which insult was more effective in producing a consistent degree of survivable neuropathological damage in a newborn piglet model of perinatal asphyxia. We also examined which physiological responses contributed to this outcome. Thirty-nine 1-day-old piglets were subjected to either a constant hypoxic/ischemic insult of 30- to 37-min duration or a variable hypoxic/ischemic insult of 30-min low peak amplitude EEG (LAEEG < 5 μV) including 10 min of low mean arterial blood pressure (MABP < 70% of baseline). Control animals (n = 6) received 21% FiO2 for the duration of the experiment. At 72 h, the piglets were euthanased, their brains removed and fixed in 4% paraformaldehyde and assessed for hypoxic/ischemic injury by histological analysis. Based on neuropathology scores, piglets were grouped as undamaged or damaged; piglets that did not survive to 72 h were grouped separately as dead. The variable insult resulted in a greater number of piglets with neuropathological damage (undamaged = 12.5%, damaged = 68.75%, dead = 18.75%), while the constant insult resulted in a large proportion of undamaged piglets (undamaged = 50%, damaged = 22.2%, dead = 27.8%). A hypoxic insult varied to maintain peak amplitude EEG < 5 μV results in a greater number of survivors with a consistent degree of neuropathological damage than a constant hypoxic insult. Physiological variables MABP, LAEEG, pH and arterial base excess were found to be significantly associated with neuropathological outcome. (c) 2006 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Publisher:

Abstract:

Eukaryotic genomes display segmental patterns of variation in various properties, including GC content and degree of evolutionary conservation. DNA segmentation algorithms are aimed at identifying statistically significant boundaries between such segments. Such algorithms may provide a means of discovering new classes of functional elements in eukaryotic genomes. This paper presents a model and an algorithm for Bayesian DNA segmentation and considers the feasibility of using it to segment whole eukaryotic genomes. The algorithm is tested on a range of simulated and real DNA sequences, and the following conclusions are drawn. Firstly, the algorithm correctly identifies non-segmented sequence, and can thus be used to reject the null hypothesis of uniformity in the property of interest. Secondly, estimates of the number and locations of change-points produced by the algorithm are robust to variations in algorithm parameters and initial starting conditions and correspond to real features in the data. Thirdly, the algorithm is successfully used to segment human chromosome 1 according to GC content, thus demonstrating the feasibility of Bayesian segmentation of eukaryotic genomes. The software described in this paper is available from the author's website (www.uq.edu.au/~uqjkeith/) or upon request to the author.
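The core of any segmentation approach is scoring candidate change-points. A minimal likelihood-based sketch in the spirit of the method above, on a synthetic GC-coded (0/1) sequence with a single boundary (the paper's Bayesian algorithm handles an unknown number of change-points and is far more general):

```python
import numpy as np

rng = np.random.default_rng(2)
# synthetic sequence: a GC-poor segment followed by a GC-rich segment
x = np.concatenate([rng.random(300) < 0.3, rng.random(300) < 0.6]).astype(int)

def bernoulli_ll(seg):
    """Log-likelihood of a segment under its own (smoothed) GC proportion."""
    n, s = len(seg), seg.sum()
    p = (s + 0.5) / (n + 1.0)
    return s * np.log(p) + (n - s) * np.log(1 - p)

n = len(x)
# gain in log-likelihood from splitting at k, over the unsegmented model
gains = [bernoulli_ll(x[:k]) + bernoulli_ll(x[k:]) - bernoulli_ll(x)
         for k in range(10, n - 10)]
change_point = int(np.argmax(gains)) + 10
```

Applied recursively (binary segmentation), this split-scoring idea extends to multiple boundaries; the Bayesian formulation instead places a prior over segmentations and samples them.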

Relevance: 30.00%

Publisher:

Abstract:

The present study addresses the problem of predicting the properties of multicomponent systems from those of corresponding binary systems. Two types of multicomponent polynomial models have been analysed. A probabilistic interpretation of the parameters of the Polynomial model, which explicitly relates them with the Gibbs free energies of the generalised quasichemical reactions, is proposed. The presented treatment provides a theoretical justification for such parameters. A methodology of estimating the ternary interaction parameter from the binary ones is presented. The methodology provides a way in which the power series multicomponent models, where no projection is required, could be incorporated into the Calphad approach.
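A common concrete instance of the polynomial models discussed above is the Redlich-Kister expansion for the binary excess Gibbs energy, combined with a Muggianu-type symmetric projection to estimate a ternary from its binaries, as used in the Calphad approach. The interaction coefficients below are assumed values for illustration only:

```python
def gE_binary(x1, A):
    """Redlich-Kister excess Gibbs energy of a binary (J/mol).
    A: interaction coefficients A_k (hypothetical values)."""
    x2 = 1.0 - x1
    return x1 * x2 * sum(a * (x1 - x2) ** k for k, a in enumerate(A))

# hypothetical binary coefficients for the pairs 1-2, 1-3 and 2-3
A12, A13, A23 = [2000.0, -500.0], [1500.0], [800.0, 300.0]

def gE_ternary_muggianu(x):
    """Muggianu projection: each binary term is evaluated at the symmetric
    pair composition v_ij = (1 + x_i - x_j)/2 and reweighted to the ternary
    point, so the expression reduces exactly to the binaries on the edges."""
    x1, x2, x3 = x
    out = 0.0
    for xi, xj, A in [(x1, x2, A12), (x1, x3, A13), (x2, x3, A23)]:
        vi = (1.0 + xi - xj) / 2.0
        out += (xi * xj / (vi * (1.0 - vi))) * gE_binary(vi, A)
    return out
```

On a binary edge (one mole fraction zero) the weight of the two foreign pairs vanishes and the remaining weight equals one, so the binary model is recovered exactly; no separate ternary interaction parameter appears unless one is added, which is precisely the estimation question the abstract addresses.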

Relevance: 30.00%

Publisher:

Abstract:

The terrigenous sediment proportion of the deep-sea sediments from off Northwest Africa has been studied in order to distinguish between the aeolian and the fluvial sediment supply. The present and fossil Saharan dust trajectories were recognized from the distribution patterns of the aeolian sediment. The following time slices have been investigated: Present, 6,000, 12,000 and 18,000 y. B. P. Furthermore, the quantity of dust deposited off the Saharan coast has been estimated. For this purpose, 80 surface sediment samples and 34 sediment cores have been analysed. The stratigraphy of the cores has been established from oxygen isotopic curves, 14C-dating, foraminiferal transfer temperatures, and carbonate contents. Silt-sized biogenic opal generally accounts for less than 2 % of the total insoluble sediment proportion. Only under productive upwelling waters and off river mouths does the opal proportion exceed 2 % significantly. The modern terrigenous sediment from off the Saharan coast is generally characterized by intensely stained quartz grains. They indicate an origin from southern Saharan and Sahelian laterites, and a zonal aeolian transport at midtropospheric levels, between 1.5 and 5.5 km, by 'Harmattan' winds. The dust particles follow large outbreaks of Saharan air across the African coast between 15° and 21° N. Their trajectories are centered at about 18° N and continue further into a clockwise gyre situated south of the Canary Islands. This course is indicated by a sickle-shaped tongue of coarser grain sizes in the deep-sea sediment. Such loess-sized terrigenous particles only settle within a zone extending to 700 km offshore. Fine silt and clay sized particles, with grain sizes smaller than 10-15 µm, drift still further west and can be traced up to more than 4,000 km from their source areas. Additional terrigenous silt which is poor in stained quartz occurs within a narrow zone off the western Sahara between 20° and 27° N only.
It depicts the present dust supply by the trade winds close to the surface. The dust load originates from the northwestern Sahara, the Atlas Mountains and coastal areas, which contain a particularly low amount of stained quartz. The distribution pattern of these pale quartz sediments reveals a SSW-dispersal of dust, consistent with the present trade wind direction from the NNE. In comparison to the sediments from off the Sahara and the deeper subtropical Atlantic, the sediments off river mouths, in particular off the Senegal river, are characterized by an additional input of fine-grained terrigenous particles (< 6 µm). This is due to fluvial suspension load. The fluvial discharge leads to a relative excess of fine-grained particles and is observed in a correlation diagram of the modal grain sizes of terrigenous silt with the proportion of fine fraction (< 6 µm). The aeolian sediment contribution by the Harmattan winds strongly decreased during the Climatic Optimum at 6,000 y. B. P. The dust discharge of the trade winds is hardly detectable in the deep-sea sediments. This probably indicates a weakened atmospheric circulation. In contrast, the fluvial sediment supply reached a maximum, and can be traced to beyond Cape Blanc. Thus, the Saharan climate was more humid at 6,000 y. B. P. A latitudinal shift of the Harmattan-driven dust outbreaks cannot be observed. Also during the Glacial, 18,000 y. B. P., Harmattan dust transport crossed the African coast at latitudes of 15°-20° N. Its sediment load increased strongly, and markedly coarser grains spread further into the Atlantic Ocean. An expanded zone of pale-quartz sediments indicates an enhanced dust supply by the trade winds blowing from the NE. No synglacial fluvial sediment contribution can be recognized between 12° and 30° N. This indicates a dry glacial climate and a strengthened atmospheric circulation over the Sahelian and Saharan region. The climatic transition phase at 12,000 y. B. P., between the last Glacial and the Interglacial, which is comparable to the Allerød in Europe, is characterized by an intermediate supply of terrigenous particles. The Harmattan dust transport was weaker than during the Glacial. The northeasterly trade winds were still intensive. River supply reached a first postglacial maximum seaward of the Senegal river mouth. This indicates increasing humidity over the southern Sahara and a weaker atmospheric circulation as compared to the Glacial. The accumulation rates of the terrigenous silt proportion (> 6 µm) decrease exponentially with increasing distance from the Saharan coast. Those of the terrigenous fine fraction (< 6 µm) follow the same trend and show almost similar gradients. Accordingly, the terrigenous fine fraction is also believed to result predominantly from aeolian transport. In the Atlantic deep-sea sediments, the annual terrigenous sediment accumulation has fluctuated, from about 60 million tons p. a. during the Late Glacial (13,500-18,000 y. B. P., aeolian supply only), to about 33 million tons p. a. during the Holocene Climatic Optimum (6,000-9,000 y. B. P., mainly fluvial supply), when the river supply reached a maximum, and to about 45 million tons p. a. during the last 4,000 years B. P. (fluvial supply only south of 18° N).

Relevance: 30.00%

Publisher:

Abstract:

Excess nutrient loads carried by streams and rivers are a great concern for environmental resource managers. In agricultural regions, excess loads are transported downstream to receiving water bodies, potentially causing algal blooms, which can lead to numerous ecological problems. To better understand nutrient load transport, and to develop appropriate water management plans, it is important to have accurate estimates of annual nutrient loads. This study used a Monte Carlo sub-sampling method and error-corrected statistical models to estimate annual nitrate-N loads from two watersheds in central Illinois. The performance of three load estimation methods (the seven-parameter log-linear model, the ratio estimator, and the flow-weighted averaging estimator) applied at one-, two-, four-, six-, and eight-week sampling frequencies was compared. Five error correction techniques (the existing composite method and four new techniques developed in this study) were applied to each combination of sampling frequency and load estimation method. On average, the most accurate error reduction technique (proportional rectangular) resulted in 15% and 30% more accurate load estimates when compared to the most accurate uncorrected load estimation method (the ratio estimator) for the two watersheds. Using error correction methods, it is possible to design more cost-effective monitoring plans by achieving the same load estimation accuracy with fewer observations. Finally, the optimum combinations of monitoring threshold and sampling frequency that minimize the number of samples required to achieve specified levels of accuracy in load estimation were determined. For one- to three-week sampling frequencies, combined threshold/fixed-interval monitoring approaches produced the best outcomes, while fixed-interval-only approaches produced the most accurate results for four- to eight-week sampling frequencies.
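The ratio estimator evaluated above can be sketched on synthetic daily flow and concentration records (illustrative values, not the Illinois watershed data): concentration is sampled on a coarse schedule, while flow is recorded continuously, and the mean sampled load per unit flow is scaled by the full flow record.

```python
import numpy as np

rng = np.random.default_rng(3)
days = 365
flow = rng.lognormal(2.0, 0.4, days)                  # synthetic daily discharge
conc = 4.0 + 0.3 * flow + rng.normal(0, 0.5, days)    # synthetic nitrate-N conc.
true_load = float(np.sum(flow * conc))                # "true" annual load

sampled = np.arange(0, days, 14)                      # two-week sampling interval
# ratio estimator: mean sampled load per unit flow, scaled by the full flow record
ratio = np.mean(flow[sampled] * conc[sampled]) / np.mean(flow[sampled])
est_load = float(ratio * np.sum(flow))
```

Because flow is observed every day, the estimator only has to interpolate the concentration-flow relationship, which is why it typically beats simple averaging of the sampled loads.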

Relevance: 30.00%

Publisher:

Abstract:

Understanding how virus strains offer protection against closely related emerging strains is vital for creating effective vaccines. For many viruses, including Foot-and-Mouth Disease Virus (FMDV) and the Influenza virus, where multiple serotypes often co-circulate, in vitro testing of large numbers of vaccines can be infeasible. Therefore, the development of an in silico predictor of cross-protection between strains is important to help optimise vaccine choice. Vaccines will offer cross-protection against closely related strains, but not against those that are antigenically distinct. To be able to predict cross-protection, we must understand the antigenic variability within a virus serotype and across distinct lineages of a virus, and identify the antigenic residues and evolutionary changes that cause the variability. In this thesis we present a family of sparse hierarchical Bayesian models for detecting relevant antigenic sites in virus evolution (SABRE), as well as an extended version of the method, the extended SABRE (eSABRE) method, which better takes into account the data collection process. The SABRE methods are sparse Bayesian hierarchical models that use spike and slab priors to identify sites in the viral protein which are important for the neutralisation of the virus. In this thesis we demonstrate how the SABRE methods can be used to identify antigenic residues within different serotypes, and show how the SABRE method outperforms established methods (mixed-effects models based on forward variable selection or l1 regularisation) on both synthetic and viral datasets. In addition we test a number of different versions of the SABRE method, comparing conjugate and semi-conjugate prior specifications and an alternative to the spike and slab prior: the binary mask model. We also propose novel proposal mechanisms for the Markov chain Monte Carlo (MCMC) simulations, which improve mixing and convergence over the established component-wise Gibbs sampler.
The SABRE method is then applied to datasets from FMDV and the Influenza virus in order to identify a number of known antigenic residues and to provide hypotheses about other potentially antigenic residues. We also demonstrate how the SABRE methods can be used to create accurate predictions of the important evolutionary changes of the FMDV serotypes. In this thesis we provide an extended version of the SABRE method, the eSABRE method, based on a latent variable model. The eSABRE method takes further into account the structure of the datasets for FMDV and the Influenza virus through the latent variable model and improves the modelling of the error. We show how the eSABRE method outperforms the SABRE methods in simulation studies and propose a new information criterion for selecting the random effects factors that should be included in the eSABRE method: the block integrated Widely Applicable Information Criterion (biWAIC). We demonstrate how biWAIC performs comparably to two other methods for selecting the random effects factors, and combine it with the eSABRE method to apply it to two large Influenza datasets. Inference in these large datasets is computationally infeasible with the SABRE methods, but as a result of the improved structure of the likelihood, the eSABRE method offers a computational improvement that allows it to be used on these datasets. The results of the eSABRE method show that we can use the method in a fully automatic manner to identify a large number of antigenic residues on a variety of the antigenic sites of two Influenza serotypes, as well as making predictions of a number of nearby sites that may also be antigenic and are worthy of further experimental investigation.
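The spike-and-slab idea underlying the SABRE models can be illustrated in its simplest conjugate form: comparing the marginal likelihood of the data under a point-mass "spike" at zero against a normal "slab" prior for a single coefficient. This synthetic one-predictor sketch is only the kernel of the idea; the SABRE models embed it in a full hierarchical mixed-effects setting.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
x = rng.normal(size=n)
y = 1.2 * x + rng.normal(0, 1.0, size=n)   # effect present, so the slab should win

sigma2, tau2, w = 1.0, 1.0, 0.5            # noise var, slab var, prior P(include)

# log marginal likelihood under the spike (beta = 0), constants common
# to both hypotheses dropped
ll_spike = -0.5 * np.sum(y ** 2) / sigma2
# log marginal likelihood under the slab: y ~ N(0, sigma2*I + tau2*x x^T),
# evaluated with the 1-d Woodbury identity for a single predictor
s = sigma2 + tau2 * np.sum(x ** 2)
ll_slab = (-0.5 * np.sum(y ** 2) / sigma2
           + 0.5 * (tau2 / sigma2) * (x @ y) ** 2 / s
           - 0.5 * np.log(s / sigma2))

# posterior probability that the coefficient is included
log_odds = np.log(w / (1 - w)) + ll_slab - ll_spike
p_include = 1.0 / (1.0 + np.exp(-log_odds))
```

In a Gibbs sampler this inclusion probability is exactly what drives the update of each site's binary indicator, which is how the SABRE methods decide which residues matter for neutralisation.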

Relevance: 30.00%

Publisher:

Abstract:

The present manuscript focuses on out-of-equilibrium physics in two-dimensional models. Its purpose is to present results on out-of-equilibrium dynamics in its non-perturbative aspects. This can be understood in two different ways: the former relates to integrability, which is non-perturbative by nature; the latter relates to phenomena emerging in the out-of-equilibrium dynamics of non-integrable models that are not accessible by standard perturbative techniques. In the study of out-of-equilibrium dynamics, two different protocols are used throughout this work: the bipartitioning protocol, within the Generalised Hydrodynamics (GHD) framework, and the quantum quench protocol. With the GHD machinery we study the Staircase Model, highlighting how the hydrodynamic picture sheds new light on the physics of Integrable Quantum Field Theories; with quench protocols we analyse different setups where a non-perturbative description is needed and various dynamical phenomena emerge, such as the manifestation of a dynamical Gibbs effect, confinement and the emergence of Bloch oscillations preventing thermalisation.