996 results for Mixture Distribution


Relevance: 30.00%

Abstract:

High biogenic sedimentation rates in the late Neogene at DSDP Site 590 (1293 m) provide an exceptional opportunity to evaluate late Neogene (late Miocene to latest Pliocene) paleoceanography in waters transitional between temperate and warm-subtropical water masses. Oxygen and carbon isotope analyses and quantitative planktonic foraminiferal data have been used to interpret the late Neogene paleoceanographic evolution of this site. Faunal and isotopic data from Site 590 show a progression of paleoceanographic events between 6.7 and 4.3 Ma, during the latest Miocene and early Pliocene. First, a permanent depletion in both planktonic and benthic foraminiferal δ13C, between 6.7 and 6.2 Ma, can be correlated to the globally recognized late Miocene carbon isotope shift. Second, a 0.5 per mil enrichment in benthic foraminiferal δ18O between 5.6 and 4.7 Ma in the latest Miocene to early Pliocene corresponds to the latest Miocene oxygen isotopic enrichment at Site 284, located in temperate waters south of Site 590. This enrichment in δ18O coincides with a time of cool surface waters, as suggested by high frequencies of Neogloboquadrina pachyderma and low frequencies of the warmer-water planktonic foraminifers, as well as by an enrichment in planktonic foraminiferal δ18O relative to the earlier Miocene. By 4.6 Ma, benthic foraminiferal δ18O values become depleted and remain fairly stable until about 3.8 Ma. The early Pliocene (~4.3 to 3.2 Ma) is marked by a significant increase in biogenic sedimentation rates (37.7 to 83.3 m/m.y.). During this time, the heaviest values in planktonic foraminiferal δ18O are associated with a decrease in the gradient between surface- and intermediate-water δ13C and δ18O, a 1.0 per mil depletion in the δ13C of two species of planktonic foraminifers, and a mixture of warm and cool planktonic foraminiferal elements. These data suggest that localized upwelling at the Subtropical Divergence produced an increase in surface-water productivity during the early Pliocene. A two-step enrichment in benthic foraminiferal δ18O occurs in the late Pliocene sequence at Site 590. A 0.3 per mil average enrichment at about 3.6 Ma is followed by a 0.5 per mil enrichment at 2.7 Ma. These two events can be correlated with the two-step isotopic enrichment associated with late Pliocene climatic instability and the initiation of Northern Hemisphere glaciation.

Relevance: 30.00%

Abstract:

We consider the problem of assessing the number of clusters in a limited number of tissue samples containing gene expressions for possibly several thousands of genes. It is proposed to use a normal mixture model-based approach to the clustering of the tissue samples. One advantage of this approach is that the question of the number of clusters in the data can be formulated in terms of a test on the smallest number of components in the mixture model compatible with the data. This test can be carried out on the basis of the likelihood ratio test statistic, using resampling to assess its null distribution. The effectiveness of this approach is demonstrated on simulated data and on some microarray datasets considered previously in the bioinformatics literature.
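The resampling test described above can be sketched in a few lines. The following is a minimal illustration rather than the authors' implementation: it uses scikit-learn's GaussianMixture and a parametric bootstrap to approximate the null distribution of the likelihood ratio statistic for g versus g + 1 components; the function and parameter names are this sketch's own.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def bootstrap_lrt(X, g0, n_boot=99, seed=0):
    """Parametric-bootstrap LRT of H0: g0 components vs H1: g0 + 1."""
    fit = lambda Xd, g: GaussianMixture(g, n_init=5, random_state=seed).fit(Xd)
    m0, m1 = fit(X, g0), fit(X, g0 + 1)
    # score() returns the mean log-likelihood per sample, so rescale by n
    lrt_obs = 2.0 * len(X) * (m1.score(X) - m0.score(X))
    null_lrts = []
    for _ in range(n_boot):
        Xb, _ = m0.sample(len(X))              # resample under H0
        b0, b1 = fit(Xb, g0), fit(Xb, g0 + 1)
        null_lrts.append(2.0 * len(Xb) * (b1.score(Xb) - b0.score(Xb)))
    p = (1 + sum(v >= lrt_obs for v in null_lrts)) / (n_boot + 1)
    return lrt_obs, p
```

Testing g = 1, 2, ... in turn and stopping at the first non-rejected g then yields the smallest number of components compatible with the data.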

Relevance: 30.00%

Abstract:

Motivation: An important problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. We provide a straightforward and easily implemented method for estimating the posterior probability that an individual gene is null. The problem can be expressed in a two-component mixture framework, using an empirical Bayes approach. Current methods of implementing this approach either have limitations because of the minimal assumptions they make, or, under more specific assumptions, are computationally intensive. Results: By converting the value of the test statistic used to test the significance of each gene to a z-score, we propose a simple two-component normal mixture that adequately models the distribution of this score. The usefulness of our approach is demonstrated on three real datasets.
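The two-component mixture admits a compact EM implementation. The sketch below fixes the null component at N(0, 1), a common empirical-null choice that is an assumption of this illustration, and returns the posterior probability that each gene is null; the paper's actual fitting procedure may differ in detail.

```python
import numpy as np
from scipy.stats import norm

def posterior_null_prob(z, n_iter=200):
    """EM for f(z) = pi0*N(0,1) + (1-pi0)*N(mu,sigma^2); returns P(null | z)."""
    pi0, mu, sigma = 0.9, float(np.mean(z)), float(np.std(z))
    for _ in range(n_iter):
        f0 = pi0 * norm.pdf(z, 0.0, 1.0)
        f1 = (1.0 - pi0) * norm.pdf(z, mu, sigma)
        tau0 = f0 / (f0 + f1)                  # E-step: P(gene i null | z_i)
        w = 1.0 - tau0                         # M-step: weighted moments
        pi0 = float(tau0.mean())
        mu = float(np.sum(w * z) / np.sum(w))
        sigma = float(np.sqrt(np.sum(w * (z - mu) ** 2) / np.sum(w)))
    return tau0
```

Genes with a small posterior null probability are then flagged as differentially expressed.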

Relevance: 30.00%

Abstract:

Minimization of a sum-of-squares or cross-entropy error function leads to network outputs which approximate the conditional averages of the target data, conditioned on the input vector. For classification problems, with a suitably chosen target coding scheme, these averages represent the posterior probabilities of class membership, and so can be regarded as optimal. For problems involving the prediction of continuous variables, however, the conditional averages provide only a very limited description of the properties of the target variables. This is particularly true for problems in which the mapping to be learned is multi-valued, as often arises in the solution of inverse problems, since the average of several correct target values is not necessarily itself a correct value. In order to obtain a complete description of the data, for the purposes of predicting the outputs corresponding to new input vectors, we must model the conditional probability distribution of the target data, again conditioned on the input vector. In this paper we introduce a new class of network models obtained by combining a conventional neural network with a mixture density model. The complete system is called a Mixture Density Network, and can in principle represent arbitrary conditional probability distributions in the same way that a conventional neural network can represent arbitrary functions. We demonstrate the effectiveness of Mixture Density Networks using both a toy problem and a problem involving robot inverse kinematics.
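The core of the construction can be made concrete with a short sketch. Here the "network" is represented only by its raw outputs for one input vector; a softmax turns the logits into mixing coefficients and an exponential keeps the widths positive, which together define the full conditional density p(t | x). Names are illustrative.

```python
import numpy as np

def mdn_conditional_pdf(logits, means, log_sigmas, t):
    """Evaluate p(t | x) from the network's raw outputs for one input x."""
    pis = np.exp(logits - logits.max())
    pis /= pis.sum()                        # softmax -> mixing coefficients
    sigmas = np.exp(log_sigmas)             # exp keeps the widths positive
    comps = np.exp(-0.5 * ((t - means) / sigmas) ** 2) \
            / (sigmas * np.sqrt(2.0 * np.pi))
    return float(np.sum(pis * comps))
```

Training minimises the negative log-likelihood -Σ_n log p(t_n | x_n) with respect to the network weights; for a multi-valued inverse problem one can then report, for instance, the mean of the most probable mixture component rather than the (possibly invalid) conditional average.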

Relevance: 30.00%

Abstract:

Mixture Density Networks are a principled method to model conditional probability density functions which are non-Gaussian. This is achieved by modelling the conditional distribution for each pattern with a Gaussian Mixture Model whose parameters are generated by a neural network. This thesis presents a novel method to introduce regularisation in this context for the special case where the means and variances of the spherical Gaussian kernels in the mixtures are fixed to predetermined values. Guidelines for how these parameters can be initialised are given, and it is shown how to apply the evidence framework to mixture density networks to achieve regularisation. This also provides an objective stopping criterion that can replace the `early stopping' methods that have previously been used. If the neural network used is an RBF network with fixed centres, this opens up new opportunities for improved initialisation of the network weights, which are exploited to start training relatively close to the optimum. The new method is demonstrated on two data sets. The first is a simple synthetic data set, while the second is a real-life data set, namely satellite scatterometer data used to infer the wind speed and wind direction near the ocean surface. For both data sets the regularisation method performs well in comparison with earlier published results. Ideas on how the constraint on the kernels may be relaxed to allow fully adaptable kernels are presented.
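The evidence-framework update at the heart of such regularisation has a standard MacKay-style form. A hedged sketch, assuming a single weight-decay hyperparameter alpha and access to the eigenvalues of the data-error Hessian (the thesis's own derivation for the MDN case is more involved):

```python
import numpy as np

def update_alpha(w, lambdas, alpha):
    """One evidence re-estimation step for the weight-decay parameter alpha.

    w       -- current network weight vector
    lambdas -- eigenvalues of the Hessian of the data error
    """
    gamma = np.sum(lambdas / (lambdas + alpha))  # effective number of parameters
    return gamma / np.dot(w, w)                  # alpha <- gamma / (2 * E_W)
```

With the regulariser set this way, training can be run to convergence rather than being cut off by an early-stopping heuristic.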

Relevance: 30.00%

Abstract:

We have proposed a novel robust inversion-based neurocontroller that searches for the optimal control law by sampling from the estimated Gaussian distribution of the inverse plant model. However, for problems involving the prediction of continuous variables, a Gaussian model approximation provides only a very limited description of the properties of the inverse model. This is usually the case for problems in which the mapping to be learned is multi-valued or involves hysteretic transfer characteristics, as often arises in inverse plant models. In order to obtain a complete description of the inverse model, a more general multicomponent distribution must be modeled. In this paper we test whether our proposed sampling approach can be used with arbitrary conditional probability distributions, which are modeled here by a mixture density network. Importance sampling provides a structured and principled approach to constrain the complexity of the search space for the ideal control law. The effectiveness of importance sampling from an arbitrary conditional probability distribution is demonstrated using a simple single-input single-output static nonlinear system with hysteretic characteristics in the inverse plant model.
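A minimal sketch of the sampling step, with the MDN's conditional distribution stood in for by a fixed one-dimensional Gaussian mixture and a toy plant; the names, the tanh plant, and the exp(-cost) weighting are all assumptions of this illustration, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
# A fixed Gaussian-mixture proposal, standing in for the MDN-modelled
# inverse plant distribution p(u | y_target).
pis = np.array([0.6, 0.4])
mus = np.array([-1.0, 2.0])
sigmas = np.array([0.3, 0.5])

def proposal_pdf(u):
    """Density of the Gaussian-mixture proposal at each candidate u."""
    z = (u[:, None] - mus) / sigmas
    return (pis * np.exp(-0.5 * z ** 2) / (sigmas * np.sqrt(2 * np.pi))).sum(axis=1)

def plant_cost(u, y_target=1.0):
    """Toy static nonlinear plant: squared tracking error of tanh(u)."""
    return (np.tanh(u) - y_target) ** 2

comp = rng.choice(len(pis), size=1000, p=pis)   # draw candidate controls
u = rng.normal(mus[comp], sigmas[comp])
w = np.exp(-plant_cost(u)) / proposal_pdf(u)    # importance weights
u_best = u[np.argmax(w)]                        # best-weighted control action
```

Because candidates are drawn where the inverse model puts probability mass, the search is confined to plausible control actions instead of sweeping the whole input range.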

Relevance: 30.00%

Abstract:

The morphology of an asphalt mixture can be defined as a set of parameters describing the geometrical characteristics of its constituent materials, their relative proportions, and their spatial arrangement in the mixture. The present study investigates the effect of this morphology on the meso- and macro-mechanical response of the mixture. An analysis approach based on X-ray computed tomography (CT) data is used for the meso-structural characterisation. Image processing techniques are used to systematically vary the internal structure and obtain different morphology structures. A morphology framework is used to characterise the average mastic coating thickness around the main load-carrying structure in each of these structures. Uniaxial tension simulations show that the mixtures with the lowest coating thickness exhibit better inter-particle interaction, with more continuous load distribution chains between adjacent aggregate particles, fewer stress concentrations, and less strain localisation in the mastic phase.

Relevance: 30.00%

Abstract:

Located at a subtropical latitude, the expansive Florida Everglades contains a mixture of tropical and temperate diatom taxa, as well as a unique flora adapted to the calcareous, often excessively hot, seasonally flooded wetland conditions. This flora has been poorly documented taxonomically, although diatoms are recognized as important indicators of environmental change in this threatened ecosystem. Gomphonema is a dominant genus in the freshwater marsh, and is represented by highly variable species complexes, including Gomphonema gracile Ehrenberg, Gomphonema intricatum var. vibrio Ehrenberg sensu Fricke, Gomphonema vibrioides Reichardt & Lange-Bertalot and Gomphonema parvulum (Kützing) Grunow. These taxa have been shown to exhibit wide morphological variation in other regions, resulting in considerable nomenclatural confusion. We collected Gomphonema from 237 sites distributed throughout the freshwater Everglades and used qualitative and quantitative morphological data to identify 20 distinguishable populations. Taxonomic assignments were based on descriptions and/or observations of type material of relevant taxa when possible, but deviations from original morphological range descriptions were common. We then compared morphological variation in Everglades Gomphonema taxa to that reported for the same taxa in other regions and suggest revisions of taxonomic concepts when necessary.

Relevance: 30.00%

Abstract:

The Dirichlet distribution is a multivariate generalization of the Beta distribution and an important multivariate continuous distribution in probability and statistics. In this report, we review the Dirichlet distribution and study its properties, including statistical and information-theoretic quantities involving this distribution. Relationships between the Dirichlet distribution and other distributions are also discussed. There are several ways to generate random variables with a Dirichlet distribution; the stick-breaking approach and the Pólya urn method are discussed. In Bayesian statistics, the Dirichlet distribution and the generalized Dirichlet distribution can both serve as a conjugate prior for the Multinomial distribution. The Dirichlet distribution has many applications in different fields. We focus on the unsupervised learning of a finite mixture model based on the Dirichlet distribution. The Initialization Algorithm and the Dirichlet Mixture Estimation Algorithm are both reviewed for estimating the parameters of a Dirichlet mixture. Three experimental results are shown: the estimation of artificial histograms, the summarization of image databases, and human skin detection.
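The stick-breaking construction mentioned above is short enough to state directly: for alpha = (a_1, ..., a_K), break off a Beta(a_i, a_{i+1} + ... + a_K) fraction of the remaining stick at each step. A minimal sketch:

```python
import numpy as np

def dirichlet_stick_breaking(alpha, rng=None):
    """Draw one sample from Dirichlet(alpha) by stick-breaking."""
    rng = rng or np.random.default_rng()
    alpha = np.asarray(alpha, dtype=float)
    x, remaining = np.zeros(len(alpha)), 1.0
    for i in range(len(alpha) - 1):
        v = rng.beta(alpha[i], alpha[i + 1:].sum())  # fraction broken off
        x[i] = remaining * v
        remaining *= 1.0 - v
    x[-1] = remaining                                # last piece gets the rest
    return x
```

The result has the same law as np.random.default_rng().dirichlet(alpha), which gives an easy sanity check.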

Relevance: 30.00%

Abstract:

We measured the distribution in absolute magnitude - circular velocity space for a well-defined sample of 199 rotating galaxies of the Calar Alto Legacy Integral Field Area Survey (CALIFA) using their stellar kinematics. Our aim in this analysis is to avoid subjective selection criteria and to take volume and large-scale structure factors into account. Using stellar velocity fields instead of gas emission line kinematics allows including rapidly rotating early-type galaxies. Our initial sample contains 277 galaxies with available stellar velocity fields and growth curve r-band photometry. After rejecting 51 velocity fields that could not be modelled because of the low number of bins, foreground contamination, or significant interaction, we performed Markov chain Monte Carlo modelling of the velocity fields, from which we obtained the rotation curve and kinematic parameters and their realistic uncertainties. We performed an extinction correction and calculated the circular velocity v_circ accounting for the pressure support of a given galaxy. The resulting galaxy distribution on the M_r - v_circ plane was then modelled as a mixture of two distinct populations, allowing robust and reproducible rejection of outliers, a significant fraction of which are slow rotators. The selection effects are understood well enough that we were able to correct for the incompleteness of the sample. The 199 galaxies were weighted by volume and large-scale structure factors, which enabled us to fit a volume-corrected Tully-Fisher relation (TFR). More importantly, we also provide the volume-corrected distribution of galaxies in the M_r - v_circ plane, which can be compared with cosmological simulations. The joint distribution of the luminosity and circular velocity space densities, representative over the range of -20 > M_r > -22 mag, can place more stringent constraints on the galaxy formation and evolution scenarios than linear TFR fit parameters or the luminosity function alone.
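The two-population modelling step can be illustrated with a deliberately simplified stand-in: a two-component Gaussian mixture fitted in the (M_r, log v_circ) plane, keeping galaxies whose membership probability in the dominant component exceeds a cut. The actual analysis is richer than this; the 0.5 threshold and the use of scikit-learn are assumptions of the sketch.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def reject_outliers(M_r, v_circ, threshold=0.5):
    """Keep galaxies assigned to the dominant (rotation-supported) population."""
    X = np.column_stack([M_r, np.log10(v_circ)])
    gm = GaussianMixture(n_components=2, n_init=10, random_state=0).fit(X)
    resp = gm.predict_proba(X)            # membership probabilities
    main = int(np.argmax(gm.weights_))    # index of the dominant population
    return resp[:, main] > threshold
```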

Relevance: 30.00%

Abstract:

Survival models are widely applied in engineering to model time-to-event data, where censoring is a common issue. Whether parametric or not, such models may not always fit heterogeneous data well. The present study relies on survival data for critical pumps, where traditional parametric regression might be improved upon to obtain better approximations. Accounting for the censoring and using an empirical method to split the data into two subgroups, so that separate models can be fitted to the censored data, we mixed two distinct distributions following a mixture-model approach. We conclude that this is a good way to fit data that does not follow a usual parametric distribution and to obtain reliable parameter estimates. A constant cumulative hazard rate policy was also used to determine optimum inspection times from the fitted mixture model, which can be compared with the current maintenance policies to check whether changes should be introduced.
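A hedged sketch of the fitting step: a two-component mixture likelihood for right-censored failure times, maximised numerically. Weibull components are assumed here purely to make the sketch concrete; the study's own component choices may differ. Events contribute the mixture density, censored observations the mixture survival function.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def neg_log_lik(params, t, event):
    """params = [logit(pi), log c1, log s1, log c2, log s2]."""
    pi = 1.0 / (1.0 + np.exp(-params[0]))
    c1, s1, c2, s2 = np.exp(params[1:])
    # mixture density and survival function
    f = pi * weibull_min.pdf(t, c1, scale=s1) + (1 - pi) * weibull_min.pdf(t, c2, scale=s2)
    S = pi * weibull_min.sf(t, c1, scale=s1) + (1 - pi) * weibull_min.sf(t, c2, scale=s2)
    # events (event == 1) contribute f, censored times contribute S
    ll = np.where(event == 1, np.log(f + 1e-300), np.log(S + 1e-300))
    return -np.sum(ll)

def fit_mixture(t, event):
    x0 = np.array([0.0, 0.0, np.log(np.median(t)), 0.5, np.log(2 * np.median(t))])
    return minimize(neg_log_lik, x0, args=(t, event), method="Nelder-Mead")
```

From the fitted model, inspection times under a constant cumulative hazard rate policy can be read off as the times at which the cumulative hazard -log S(t) grows by a chosen constant increment.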

Relevance: 20.00%

Abstract:

Although various abutment connections and materials have recently been introduced, insufficient data exist regarding the effect of stress distribution on their mechanical performance. The purpose of this study was to investigate the effect of different abutment materials and platform connections on stress distribution in single anterior implant-supported restorations with the finite element method. Nine experimental groups were modeled from the combination of 3 platform connections (external hexagon, internal hexagon, and Morse tapered) and 3 abutment materials (titanium, zirconia, and hybrid) as follows: external hexagon-titanium, external hexagon-zirconia, external hexagon-hybrid, internal hexagon-titanium, internal hexagon-zirconia, internal hexagon-hybrid, Morse tapered-titanium, Morse tapered-zirconia, and Morse tapered-hybrid. Finite element models consisted of a 4×13-mm implant, anatomic abutment, and lithium disilicate central incisor crown cemented over the abutment. A 49 N occlusal load was applied in 6 steps to simulate the incisal guidance. Equivalent von Mises stress (σvM) was used for the qualitative and quantitative evaluation of the implant and abutment in all the groups, and the maximum (σmax) and minimum (σmin) principal stresses were used for the numerical comparison of the zirconia parts. The highest abutment σvM occurred in the Morse-tapered groups and the lowest in the external hexagon-hybrid, internal hexagon-titanium, and internal hexagon-hybrid groups. The σmax and σmin values were lower in the hybrid groups than in the zirconia groups. The stress distribution concentrated in the abutment-implant interface in all the groups, regardless of the platform connection or abutment material. The platform connection influenced the stress on abutments more than the abutment material did. The stress values for implants were similar among different platform connections, but greater stress concentrations were observed in internal connections.

Relevance: 20.00%

Abstract:

In acquired immunodeficiency syndrome (AIDS) studies it is quite common to observe viral load measurements collected irregularly over time. Moreover, these measurements can be subject to upper and/or lower detection limits, depending on the quantification assay. A complication arises when these continuous repeated measures have heavy-tailed behavior. For such data structures, we propose a robust censored linear model based on the multivariate Student's t-distribution. To compensate for the autocorrelation among irregularly observed measures, a damped exponential correlation structure is employed. An efficient expectation-maximization-type algorithm is developed for computing the maximum likelihood estimates, obtaining as by-products the standard errors of the fixed effects and the log-likelihood function. The proposed algorithm uses closed-form expressions at the E-step that rely on formulas for the mean and variance of a truncated multivariate Student's t-distribution. The methodology is illustrated through an application to a Human Immunodeficiency Virus-AIDS (HIV-AIDS) study and several simulation studies.
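The damped exponential correlation (DEC) structure mentioned here is simple to write down: corr(y_ij, y_ik) = phi ** (|t_ij - t_ik| ** theta) with 0 <= phi < 1, so theta = 1 recovers a continuous-time AR(1) and theta between 0 and 1 slows the decay at long lags. A sketch of the matrix construction, with names of this illustration's own choosing:

```python
import numpy as np

def dec_corr_matrix(times, phi, theta):
    """Damped exponential correlation matrix for irregular measurement times.

    Assumes theta > 0 so the diagonal (zero time gap) is exactly 1.
    """
    d = np.abs(np.subtract.outer(times, times))  # pairwise time gaps
    return phi ** (d ** theta)
```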

Relevance: 20.00%

Abstract:

The aim of this study was to evaluate, by photoelastic analysis, the stress distribution on short and long implants of two dental implant systems supporting 2-unit implant-supported fixed partial prostheses of 8 mm and 13 mm heights. Sixteen photoelastic models were divided into 4 groups: I: long implant (5 × 11 mm) (Neodent), II: long implant (5 × 11 mm) (Bicon), III: short implant (5 × 6 mm) (Neodent), and IV: short implant (5 × 6 mm) (Bicon). The models were positioned in a circular polariscope associated with a load cell, and static axial (0.5 kgf) and nonaxial (15°, 0.5 kgf) loads were applied to each group for both prosthetic crown heights. Three-way ANOVA was used to compare the factors implant length, crown height, and implant system (α = 0.05). The results showed that implant length was a statistically significant factor for both axial and nonaxial loading. The 13 mm prosthetic crown did not result in statistically significant differences in stress distribution between the implant systems and implant lengths studied, regardless of load type (P > 0.05). It can be concluded that short implants showed higher stress levels than long implants, and that implant system and length were not relevant factors when the prosthetic crown height was increased.