933 results for Bayesian Mixture Model, Cavalieri Method, Trapezoidal Rule


Relevance: 40.00%

Abstract:

Purpose: Dynamic near infrared fluorescence imaging of the urinary tract provides a promising way to diagnose ureteropelvic junction obstruction. Initial studies demonstrated the ability to visualize urine flow and peristalsis in great detail. We analyzed the efficacy of near infrared imaging in evaluating ureteropelvic junction obstruction, renal involvement and the anatomical detail provided compared to conventional imaging modalities. Materials and Methods: Ten swine underwent partial or complete unilateral ureteral obstruction. Groups were survived for the short or the long term. Imaging was performed with mercaptoacetyltriglycine diuretic renogram, magnetic resonance urogram, excretory urogram, ultrasound and near infrared imaging. Scoring systems for ureteropelvic junction obstruction were developed for magnetic resonance urogram and near infrared imaging. Physicians and medical students graded ureteropelvic junction obstruction based on magnetic resonance urogram and near infrared imaging results. Results: Markers of vascular and urinary dynamics were quantitatively consistent among control renal units. The same markers were abnormal in obstructed renal units with significantly different times of renal phase peak, start of pelvic phase and start of renal uptake. Such parameters were consistent with those obtained with mercaptoacetyltriglycine diuretic renography. Near infrared imaging provided live imaging of urinary flow, which was helpful in identifying the area of obstruction for surgical planning. Physicians and medical students categorized the degree of obstruction appropriately for fluorescence imaging and magnetic resonance urogram. Conclusions: Near infrared imaging offers a feasible way to obtain live, dynamic images of urine flow and ureteral peristalsis. Qualitative and quantitative parameters were comparable to those of conventional imaging. Findings support fluorescence imaging as an accurate, easy to use method of diagnosing ureteropelvic junction obstruction.

Relevance: 40.00%

Abstract:

We analyze the global phase diagram of a Maier-Saupe lattice model with the inclusion of shape-disordered degrees of freedom to mimic a mixture of oblate and prolate molecules (discs and cylinders). In the neighborhood of a Landau multicritical point, solutions of the statistical problem can be written as a Landau-de Gennes expansion for the free energy. If the shape-disordered degrees of freedom are quenched, we confirm the existence of a biaxial nematic structure. If orientational and disorder degrees of freedom are allowed to thermalize, this biaxial solution becomes thermodynamically unstable. Also, we use a two-temperature formalism to mimic the presence of two distinct relaxation times, and show that a slight departure from complete thermalization is enough to stabilize a biaxial nematic phase.
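
For reference, the Landau-de Gennes expansion mentioned above is, in its generic form, an expansion of the free energy in the invariants of the tensor order parameter Q; the coefficients below are illustrative and not those derived from the Maier-Saupe mixture model, and the standard biaxiality measure indicates what distinguishes a biaxial nematic solution from a uniaxial one.

```latex
% Generic Landau-de Gennes expansion in the invariants of the tensor order
% parameter Q (coefficients a, b, c are illustrative, not the model's):
f(\mathbf{Q}) = \tfrac{1}{2}\,a\,\operatorname{Tr}\mathbf{Q}^{2}
              - \tfrac{1}{3}\,b\,\operatorname{Tr}\mathbf{Q}^{3}
              + \tfrac{1}{4}\,c\,\bigl(\operatorname{Tr}\mathbf{Q}^{2}\bigr)^{2} + \dots

% Standard biaxiality measure: beta^2 = 0 for uniaxial states (prolate or
% oblate), 0 < beta^2 <= 1 for biaxial nematic states.
\beta^{2} = 1 - \frac{6\,\bigl(\operatorname{Tr}\mathbf{Q}^{3}\bigr)^{2}}
                     {\bigl(\operatorname{Tr}\mathbf{Q}^{2}\bigr)^{3}}
```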

Relevance: 40.00%

Abstract:

The objective of this paper is to model variations in test-day milk yields of first lactations of Holstein cows by random regression (RR) using B-spline functions and Bayesian inference, in order to fit adequate and parsimonious models for the estimation of genetic parameters. The authors used 152,145 test-day milk yield records from 7,317 first lactations of Holstein cows. The model included additive genetic, permanent environmental and residual random effects; in addition, contemporary group and linear and quadratic effects of the age of the cow at calving were included as fixed effects. The average lactation curve of the population was modelled with a fourth-order orthogonal Legendre polynomial. The authors concluded that a cubic B-spline with seven random regression coefficients for both the additive genetic and permanent environment effects was the best model according to residual mean square and residual variance estimates. Moreover, they suggested that a lower-order model (a quadratic B-spline with seven random regression coefficients for both random effects) could be adopted, because it yielded practically the same genetic parameter estimates with greater parsimony.
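
As a rough illustration of the random regression setup described above, the sketch below builds a clamped cubic B-spline basis with seven coefficients over the lactation trajectory and simulates one cow's test-day deviations. The knot positions, the 5-305 day range and the variance components are assumptions made for illustration, not the values estimated in the paper.

```python
# Sketch: cubic B-spline basis with seven coefficients for a random
# regression test-day model, plus a simulated cow (illustrative values only).
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(7)

k = 3                                          # cubic B-spline
inner = np.linspace(5, 305, 5)                 # knots giving 7 basis functions
knots = np.r_[[5] * k, inner, [305] * k]       # clamped knot vector
n_basis = len(knots) - k - 1                   # = 7
basis = BSpline(knots, np.eye(n_basis), k)     # columns evaluate each basis fn

dim = np.arange(5, 305, 30)                    # test days in milk for one cow
Z = basis(dim)                                 # design matrix, shape (n_records, 7)

# Simulate one cow's test-day deviations: additive genetic + permanent
# environment random regressions on the same basis, plus a residual
a = rng.multivariate_normal(np.zeros(n_basis), 2.0 * np.eye(n_basis))
pe = rng.multivariate_normal(np.zeros(n_basis), 1.0 * np.eye(n_basis))
y_dev = Z @ (a + pe) + rng.normal(0, 1.5, size=len(dim))
print(np.round(y_dev, 2))
```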

Relevance: 40.00%

Abstract:

In this article, we propose a new Bayesian flexible cure rate survival model, which generalises the stochastic model of Klebanov et al. [Klebanov LB, Rachev ST and Yakovlev AY. A stochastic-model of radiation carcinogenesis - latent time distributions and their properties. Math Biosci 1993; 113: 51-75], and has much in common with the destructive model formulated by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de Sao Carlos, Sao Carlos-SP. Brazil, 2009 (accepted in Lifetime Data Analysis)]. In our approach, the accumulated number of lesions or altered cells follows a compound weighted Poisson distribution. This model is more flexible than the promotion time cure model in terms of dispersion. Moreover, it possesses an interesting and realistic interpretation of the biological mechanism of the occurrence of the event of interest as it includes a destructive process of tumour cells after an initial treatment or the capacity of an individual exposed to irradiation to repair altered cells that results in cancer induction. In other words, what is recorded is only the damaged portion of the original number of altered cells not eliminated by the treatment or repaired by the repair system of an individual. Markov Chain Monte Carlo (MCMC) methods are then used to develop Bayesian inference for the proposed model. Also, some discussions on the model selection and an illustration with a cutaneous melanoma data set analysed by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de Sao Carlos, Sao Carlos-SP. Brazil, 2009 (accepted in Lifetime Data Analysis)] are presented.
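
The latent mechanism described above can be sketched with a small simulation: an initial count of altered cells is thinned by a destructive (treatment or repair) process, subjects left with no surviving cells are cured, and the event time is the first activation among the survivors. The sketch uses an ordinary Poisson in place of the paper's compound weighted Poisson and exponential activation times; both are assumptions made for illustration.

```python
# Sketch: simulating the latent destructive mechanism behind a cure rate model
# of this kind (ordinary Poisson and exponential activation times assumed).
import numpy as np

rng = np.random.default_rng(3)

def simulate_times(n, eta=2.0, p_survive=0.4, rate=0.5):
    """Return event times (np.inf marks cured subjects)."""
    m = rng.poisson(eta, size=n)               # initiated lesions / altered cells
    d = rng.binomial(m, p_survive)             # cells surviving treatment/repair
    times = np.full(n, np.inf)                 # d == 0  ->  cured, no event
    has_cells = d > 0
    # the event occurs at the first activated surviving cell
    times[has_cells] = np.array(
        [rng.exponential(1.0 / rate, size=di).min() for di in d[has_cells]]
    )
    return times

t = simulate_times(10000)
print("empirical cure fraction:  ", np.mean(np.isinf(t)))
# For this choice, D ~ Poisson(eta * p_survive), so P(cured) = exp(-eta * p_survive)
print("theoretical cure fraction:", np.exp(-2.0 * 0.4))
```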

Relevance: 40.00%

Abstract:

This thesis presents Bayesian solutions to inference problems for three types of social network data structures: a single observation of a social network, repeated observations on the same social network, and repeated observations on a social network developing through time. A social network is conceived as a structure consisting of actors and their social interaction with each other. A common conceptualisation of social networks is to let the actors be represented by nodes in a graph, with edges between pairs of nodes that are relationally tied to each other according to some definition. Statistical analysis of social networks is to a large extent concerned with modelling these relational ties, which lends itself to empirical evaluation. The first paper deals with a family of statistical models for social networks called exponential random graphs that takes various structural features of the network into account. In general, the likelihood functions of exponential random graphs are only known up to a constant of proportionality. A procedure for performing Bayesian inference using Markov chain Monte Carlo (MCMC) methods is presented. The algorithm consists of two basic steps, one in which an ordinary Metropolis-Hastings updating step is used, and another in which an importance sampling scheme is used to calculate the acceptance probability of the Metropolis-Hastings step. In the second paper, a method for modelling reports given by actors (or other informants) on their social interaction with others is investigated in a Bayesian framework. The model contains two basic ingredients: the unknown network structure and functions that link this unknown network structure to the reports given by the actors. These functions take the form of probit link functions. An intrinsic problem is that the model is not identified, meaning that there are combinations of values of the unknown structure and the parameters in the probit link functions that are observationally equivalent. Instead of using restrictions to achieve identification, it is proposed that the different observationally equivalent combinations of parameters and unknown structure be investigated a posteriori. Estimation of parameters is carried out using Gibbs sampling with a switching device that enables transitions between posterior modal regions. The main goal of the procedures is to provide tools for comparisons of different model specifications. Papers 3 and 4 propose Bayesian methods for longitudinal social networks. The premise of the models investigated is that overall change in social networks occurs as a consequence of sequences of incremental changes. Models for the evolution of social networks using continuous-time Markov chains are meant to capture these dynamics. Paper 3 presents an MCMC algorithm for exploring the posteriors of parameters for such Markov chains. More specifically, the unobserved evolution of the network in between observations is explicitly modelled, thereby avoiding the need to deal with explicit formulas for the transition probabilities. This enables likelihood-based parameter inference in a wider class of network evolution models than has been available before. Paper 4 builds on the proposed inference procedure of Paper 3 and demonstrates how to perform model selection for a class of network evolution models.
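
The importance-sampling idea used in the first paper's Metropolis-Hastings step can be illustrated on a toy edge-only exponential random graph, where the ratio of normalising constants Z(theta')/Z(theta) is estimated from draws simulated at the current parameter value. The toy statistic, prior and tuning constants below are assumptions; the thesis' algorithm targets general ERGM statistics and would use a full graph sampler in place of the closed-form draw.

```python
# Sketch: Bayesian inference for a toy edge-only exponential random graph.
# The ratio of normalising constants in the Metropolis-Hastings acceptance
# probability is estimated by importance sampling from the current model.
import numpy as np

rng = np.random.default_rng(1)
n_nodes = 20
n_pairs = n_nodes * (n_nodes - 1) // 2

# "Observed" network summarised by its sufficient statistic s(x) = edge count
theta_true = -1.0
s_obs = rng.binomial(n_pairs, 1.0 / (1.0 + np.exp(-theta_true)))

def sample_stats(theta, size):
    """Draw edge counts from the model at theta (exact for this toy model;
    a general ERGM would need an MCMC graph sampler here)."""
    return rng.binomial(n_pairs, 1.0 / (1.0 + np.exp(-theta)), size=size)

def log_prior(theta):
    return -0.5 * theta ** 2 / 10.0            # N(0, 10) prior (assumed)

theta, chain = 0.0, []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.3)        # symmetric random-walk proposal
    # Importance-sampling estimate of log Z(prop) - log Z(theta), using
    # Z(prop)/Z(theta) = E_{x ~ p(.|theta)}[ exp((prop - theta) * s(x)) ]
    a = (prop - theta) * sample_stats(theta, 200)
    log_z_ratio = a.max() + np.log(np.mean(np.exp(a - a.max())))
    log_alpha = ((prop - theta) * s_obs + log_prior(prop) - log_prior(theta)
                 - log_z_ratio)
    if np.log(rng.uniform()) < log_alpha:
        theta = prop
    chain.append(theta)

print("posterior mean of theta:", np.mean(chain[1000:]))
```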

Relevance: 40.00%

Abstract:

The aim of the thesis is to formulate a suitable Item Response Theory (IRT) based model to measure HRQoL (as a latent variable) using a mixed-responses questionnaire and relaxing the hypothesis of a normally distributed latent variable. The new model is a combination of two models already presented in the literature, that is, a latent trait model for mixed responses and an IRT model for a skew-normal latent variable. It is developed in a Bayesian framework; a Markov chain Monte Carlo procedure is used to generate samples from the posterior distribution of the parameters of interest. The proposed model is tested on a questionnaire composed of five discrete items and one continuous item used to measure HRQoL in children, the EQ-5D-Y questionnaire. A large sample of children recruited in schools was used. In comparison with a model for only discrete responses and a model for mixed responses with a normal latent variable, the new model shows better performance in terms of deviance information criterion (DIC), chain convergence times and precision of the estimates.
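
A minimal sketch of the kind of data structure the model addresses: mixed responses (five discrete items plus one continuous item) generated from a skew-normal latent HRQoL trait. The item parameters, the probit link for the discrete items and the linear form of the continuous item are illustrative assumptions, not the specification or estimates reported in the thesis.

```python
# Sketch: mixed item responses driven by a skew-normal latent trait
# (illustrative parameters; binary items stand in for the ordinal EQ-5D-Y items).
import numpy as np
from scipy.stats import skewnorm, norm

rng = np.random.default_rng(0)
n_persons = 1000
theta = skewnorm.rvs(a=4.0, loc=0.0, scale=1.0,
                     size=n_persons, random_state=rng)   # skewed latent HRQoL

# Five discrete items with probit links
disc = rng.uniform(0.8, 1.5, size=5)           # discriminations (assumed)
diff = np.linspace(-1.0, 1.0, 5)               # difficulties (assumed)
p = norm.cdf(disc * (theta[:, None] - diff))   # person x item probabilities
y_discrete = rng.binomial(1, p)

# One continuous item (e.g. a VAS-type score) with a linear link plus noise
y_continuous = 50 + 20 * theta + rng.normal(0, 5, size=n_persons)
print(y_discrete[:3], np.round(y_continuous[:3], 1))
```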

Relevance: 40.00%

Abstract:

In this PhD thesis the crashworthiness topic is studied with the perspective of the development of a small-scale experimental test able to characterize a material in terms of energy absorption. The material properties obtained are then used to validate a numerical model of the experimental test itself. Consequently, the numerical model, calibrated on the specific material, can be extended to more complex structures and used to simulate their energy absorption behavior. The experimental activity started at the University of Washington in Seattle, WA (USA) and continued at the Second Faculty of Engineering, University of Bologna, Forlì (Italy), where the numerical model for the simulation of the experimental test was implemented and optimized.

Relevance: 40.00%

Abstract:

Forest models are tools for explaining and predicting the dynamics of forest ecosystems. They simulate forest behavior by integrating information on the underlying processes in trees, soil and atmosphere. Bayesian calibration is the application of probability theory to parameter estimation. It is a method, applicable to all models, that quantifies output uncertainty and identifies key parameters and variables. This study aims at testing the Bayesian calibration procedure on different types of forest models, to evaluate their performances and the uncertainties associated with them. In particular, we aimed at 1) applying a Bayesian framework to calibrate forest models and test their performances in different biomes and different environmental conditions, 2) identifying and solving structure-related issues in simple models, and 3) identifying the advantages of the additional information made available when calibrating forest models with a Bayesian approach. We applied the Bayesian framework to calibrate the Prelued model on eight Italian eddy-covariance sites in Chapter 2. The ability of Prelued to reproduce the estimated Gross Primary Productivity (GPP) was tested over contrasting natural vegetation types that represented a wide range of climatic and environmental conditions. The issues related to Prelued's multiplicative structure were the main topic of Chapter 3: several different MCMC-based procedures were applied within a Bayesian framework to calibrate the model, and their performances were compared. A more complex model was applied in Chapter 4, focusing on the application of the physiology-based model HYDRALL to the forest ecosystem of Lavarone (IT) to evaluate the importance of additional information in the calibration procedure and its impact on model performances, model uncertainties, and parameter estimation. Overall, the Bayesian technique proved to be an excellent and versatile tool for calibrating forest models of different structure and complexity, on different kinds and numbers of variables and with different numbers of parameters involved.
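
A minimal sketch of Bayesian calibration in this spirit: a random-walk Metropolis sampler calibrating a toy light-use-efficiency GPP model against synthetic daily GPP observations. The toy model, priors, data and tuning constants are assumptions made for illustration; they stand in for Prelued/HYDRALL and the eddy-covariance data used in the thesis.

```python
# Sketch: Bayesian calibration of a toy light-use-efficiency GPP model with a
# random-walk Metropolis sampler (illustrative model, priors and data).
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "observations": daily photosynthetically active radiation and GPP
par = rng.uniform(2, 30, size=365)                  # mol m-2 d-1 (assumed units)
beta_true, sigma_true = 0.6, 0.8
gpp_obs = beta_true * par + rng.normal(0, sigma_true, size=365)

def log_posterior(beta, sigma):
    if sigma <= 0:
        return -np.inf
    resid = gpp_obs - beta * par
    log_lik = -0.5 * np.sum((resid / sigma) ** 2) - len(resid) * np.log(sigma)
    log_prior = -0.5 * (beta / 5.0) ** 2 - np.log(sigma)   # weak priors (assumed)
    return log_lik + log_prior

theta = np.array([0.3, 1.0])                        # initial (beta, sigma)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, [0.02, 0.05])      # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(*prop) - log_posterior(*theta):
        theta = prop
    samples.append(theta.copy())
samples = np.array(samples[5000:])                  # discard burn-in
print("posterior means (beta, sigma):", samples.mean(axis=0))
```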

Relevance: 40.00%

Abstract:

The PM3 semiempirical quantum-mechanical method was found to systematically describe intermolecular hydrogen bonding in small polar molecules. PM3 shows charge transfer from the donor to the acceptor molecule on the order of 0.02-0.06 units of charge when strong hydrogen bonds are formed. The PM3 method is predictive: calculated hydrogen bond energies with an absolute magnitude greater than 2 kcal mol-1 suggest that the global minimum is a hydrogen-bonded complex, while absolute energies less than 2 kcal mol-1 imply that other van der Waals complexes are more stable. The geometries of the PM3 hydrogen-bonded complexes agree with high-resolution spectroscopic observations, gas electron diffraction data, and high-level ab initio calculations. The main limitations of the PM3 method are the underestimation of hydrogen bond lengths by 0.1-0.2 Å for some systems and the underestimation of reliable experimental hydrogen bond energies by approximately 1-2 kcal mol-1. The PM3 method predicts that ammonia is a good hydrogen bond acceptor and a poor hydrogen bond donor when interacting with neutral molecules. Electronegativity differences between F, N, and O predict that donor strength follows the order F > O > N and acceptor strength follows the order N > O > F. In the calculations presented in this article, the PM3 method mirrors these electronegativity differences, predicting the F-H···N bond to be the strongest and the N-H···F bond the weakest. It appears that the PM3 Hamiltonian is able to model hydrogen bonding because of the reduction of two-center repulsive forces brought about by the parameterization of the Gaussian core-core interactions. The ability of the PM3 method to model intermolecular hydrogen bonding means that reasonably accurate quantum-mechanical calculations can be applied to small biological systems.

Relevance: 40.00%

Abstract:

Generalized linear mixed models with semiparametric random effects are useful in a wide variety of Bayesian applications. When the random effects arise from a mixture of Dirichlet process (MDP) model, normal base measures and Gibbs sampling procedures based on the Pólya urn scheme are often used to simulate posterior draws. These algorithms are applicable in the conjugate case when (for a normal base measure) the likelihood is normal. In the non-conjugate case, the algorithms proposed by MacEachern and Müller (1998) and Neal (2000) are often applied to generate posterior samples. Some common problems associated with simulation algorithms for non-conjugate MDP models include convergence and mixing difficulties. This paper proposes an algorithm based on the Pólya urn scheme that extends the Gibbs sampling algorithms to non-conjugate models with normal base measures and exponential family likelihoods. The algorithm proceeds by making Laplace approximations to the likelihood function, thereby reducing the procedure to that of conjugate normal MDP models. To ensure the validity of the stationary distribution in the non-conjugate case, the proposals are accepted or rejected by a Metropolis-Hastings step. In the special case where the data are normally distributed, the algorithm is identical to the Gibbs sampler.
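
A sketch of the Laplace-approximation step described above, assuming a Poisson (log-link) likelihood for a single random effect: the non-conjugate conditional is replaced by a normal density centred at its mode with a curvature-based variance, and in the full algorithm the resulting proposal is corrected by a Metropolis-Hastings accept/reject step. The prior and data values below are illustrative.

```python
# Sketch: Laplace approximation of a Poisson-likelihood random-effect update,
# reducing a non-conjugate conditional to a conjugate normal form
# (illustrative prior and data; not the paper's full algorithm).
import numpy as np

def laplace_normal_approx(y, prior_mean, prior_var, n_newton=25):
    """Approximate p(b | y) ∝ Poisson(y | exp(b)) * N(b | prior_mean, prior_var)
    by a normal density centred at the posterior mode."""
    b = np.log(y + 0.5)                       # crude starting value
    for _ in range(n_newton):
        grad = y - np.exp(b) - (b - prior_mean) / prior_var
        hess = -np.exp(b) - 1.0 / prior_var
        b -= grad / hess                      # Newton step towards the mode
    return b, -1.0 / hess                     # approximate mean and variance

mode, var = laplace_normal_approx(y=7, prior_mean=0.0, prior_var=1.0)
print(f"normal approximation: mean {mode:.3f}, variance {var:.3f}")
# In the full algorithm this normal proposal is accepted or rejected with a
# Metropolis-Hastings step, so the exact stationary distribution is preserved.
```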