894 results for gaussian mixture model


Relevance: 80.00%

Abstract:

This paper proposes three tests to determine whether a given nonlinear device noise model is in agreement with accepted thermodynamic principles. These tests are applied to several models. One conclusion is that every Gaussian noise model for any nonlinear device predicts thermodynamically impossible circuit behavior: these models should be abandoned. But the nonlinear shot-noise model predicts thermodynamically acceptable behavior under a constraint derived here. Further, this constraint specifies the current noise amplitude at each operating point from knowledge of the device v–i curve alone. For the Gaussian and shot-noise models, this paper shows how the thermodynamic requirements can be reduced to concise mathematical tests involving no approximations.

Relevance: 80.00%

Abstract:

In most studies on civil wars, the determinants of conflict have hitherto been explored under the assumption that the actors involved were either unitary or stable. However, if this intra-group homogeneity assumption does not hold, empirical econometric estimates may be biased. We use a Fixed Effects Finite Mixture Model (FE-FMM) approach to address this issue; it provides a representation of heterogeneity when data originate from different latent classes and the affiliation is unknown, and it allows us to identify sub-populations within a population as well as the determinants of their behaviors. By combining various data sources for the period 2000–2005, we apply this methodology to the Colombian conflict. Our results highlight behavioral heterogeneity in guerrilla armed groups and their distinct economic correlates. By contrast, paramilitaries behave as a rather homogeneous group.
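The latent-class logic behind a finite mixture model can be sketched with an ordinary Gaussian mixture fitted by EM on synthetic data; the fixed-effects extension and the conflict data themselves are beyond this illustration, and scikit-learn's `GaussianMixture` stands in for the FE-FMM estimator.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic observations drawn from two latent classes with different behavior
x = np.concatenate([rng.normal(0.0, 1.0, 300),
                    rng.normal(5.0, 1.0, 300)]).reshape(-1, 1)

gm = GaussianMixture(n_components=2, random_state=0).fit(x)
labels = gm.predict(x)            # estimated latent-class affiliation per observation
means = sorted(gm.means_.ravel()) # recovered class-specific means
```

With the affiliation unknown a priori, EM both assigns observations to sub-populations and estimates each class's parameters, which is the separation the abstract relies on.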

Relevance: 80.00%

Abstract:

A wind-tunnel study was conducted to investigate ventilation of scalars from urban-like geometries at neighbourhood scale by exploring two different geometries: a uniform-height roughness and a non-uniform-height roughness, both with an equal plan and frontal density of λ_p = λ_f = 25%. In both configurations a sub-unit of the idealized urban surface was coated with a thin layer of naphthalene to represent area sources. The naphthalene sublimation method was used to measure directly the total area-averaged transport of scalars out of the complex geometries. At the same time, naphthalene vapour concentrations controlled by the turbulent fluxes were detected using a fast Flame Ionisation Detection (FID) technique. This paper describes the novel use of a naphthalene-coated surface as an area source in dispersion studies. Particular emphasis was also given to testing whether the concentration measurements were independent of Reynolds number. For low wind speeds, transfer from the naphthalene surface is determined by a combination of forced and natural convection. Compared with a propane point-source release, a 25% higher free-stream velocity was needed for the naphthalene area source to yield Reynolds-number-independent concentration fields. Ventilation transfer coefficients w_T/U derived from the naphthalene sublimation method showed that, whilst there was enhanced vertical momentum exchange due to obstacle-height variability, advection was reduced and dispersion from the source area was not enhanced. Thus, the height variability of a canopy is an important parameter when generalising urban dispersion. Fine-resolution concentration measurements in the canopy showed the effect of height variability on dispersion at street scale. Rapid vertical transport in the wake of individual high-rise obstacles was found to generate elevated point-like sources. A Gaussian plume model was used to analyse differences in the downstream plumes. Intensified lateral and vertical plume spread and plume dilution with height were found for the non-uniform-height roughness.
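For reference, the standard ground-reflected Gaussian plume formula used in this kind of analysis can be sketched as follows; the linear growth of the dispersion coefficients with downwind distance and all parameter values are illustrative assumptions, not values from the study.

```python
import numpy as np

def gaussian_plume(x, y, z, Q=1.0, U=5.0, H=10.0, a=0.08, b=0.06):
    """Ground-reflected Gaussian plume concentration at (x, y, z) for a
    point source of strength Q at height H in wind speed U; the dispersion
    coefficients grow linearly with downwind distance x (a simplification)."""
    sigma_y, sigma_z = a * x, b * x
    lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2.0 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2.0 * sigma_z**2)))
    return Q / (2.0 * np.pi * U * sigma_y * sigma_z) * lateral * vertical

centreline = gaussian_plume(200.0, 0.0, 10.0)   # on the plume axis
off_axis = gaussian_plume(200.0, 50.0, 10.0)    # 50 m to the side
```

The reflected term (the `z + H` exponential) models the ground as a perfect reflector, which is the usual assumption when comparing downstream plume spread and dilution.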

Relevance: 80.00%

Abstract:

The article considers screening human populations with two screening tests. If either of the two tests is positive, then full evaluation of the disease status is undertaken; however, if both diagnostic tests are negative, then the disease status remains unknown. This procedure leads to a data constellation in which, for each disease status, the 2 × 2 table associated with the two diagnostic tests used in screening has exactly one empty, unknown cell. To estimate the unobserved cell counts, previous approaches assume independence of the two diagnostic tests and use specific models, including the special mixture model of Walter or unconstrained capture–recapture estimates. Often, as is also demonstrated in this article by means of a simple test, the independence of the two screening tests is not supported by the data. Two new estimators are suggested that allow association between the screening tests, although the form of association must be assumed to be homogeneous over disease status. These estimators are modifications of the simple capture–recapture estimator and are easy to construct. The estimators are investigated for several screening studies with fully evaluated disease status, in which the superior behavior of the new estimators compared to the previous conventional ones can be shown. Finally, the performance of the new estimators is compared with maximum likelihood estimators, which are more difficult to obtain in these models. The results indicate that the loss of efficiency is minor.
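The simple capture–recapture estimator that the new estimators modify is, in its classical Lincoln–Petersen form, a one-liner; the function name and the screening counts below are hypothetical.

```python
def lincoln_petersen(n1, n2, m):
    """Classical capture-recapture estimate of total population size:
    n1 and n2 are the positives on each test, m the positives on both.
    Valid only under independence of the two tests."""
    if m == 0:
        raise ValueError("no overlap between tests: estimate undefined")
    return n1 * n2 / m

# Hypothetical screening counts: 120 positive on test 1, 90 on test 2, 45 on both
n_hat = lincoln_petersen(n1=120, n2=90, m=45)
```

The independence assumption built into this estimator is exactly what the article's simple test challenges, motivating the modified estimators that allow association between the tests.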

Relevance: 80.00%

Abstract:

The paper considers meta-analysis of diagnostic studies that use a continuous score for classification of study participants into healthy or diseased groups. Classification is often done on the basis of a threshold or cut-off value, which might vary between studies. Consequently, conventional meta-analysis methodology focusing solely on separate analyses of sensitivity and specificity might be confounded by a potentially unknown variation of the cut-off value. To cope with this phenomenon it is suggested to use, instead, an overall estimate of the misclassification error previously suggested and used as Youden's index; furthermore, it is argued that this index is less prone to between-study variation of cut-off values. A simple Mantel–Haenszel estimator is suggested as a summary measure of the overall misclassification error, which adjusts for a potential study effect. The measure of the misclassification error based on Youden's index is advantageous in that it easily allows an extension to a likelihood approach, which is then able to cope with unobserved heterogeneity via a nonparametric mixture model. All methods are illustrated with an example of a diagnostic meta-analysis on duplex Doppler ultrasound, with angiography as the standard, for stroke prevention.
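Youden's index, the misclassification measure this approach builds on, is straightforward to compute from a single study's 2 × 2 counts; the counts below are invented for illustration.

```python
def youden_index(tp, fn, fp, tn):
    """Youden's J = sensitivity + specificity - 1, a single-study summary
    of misclassification that combines both error rates."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity + specificity - 1

# Invented 2x2 counts giving sensitivity 0.8 and specificity 0.9
j = youden_index(tp=80, fn=20, fp=10, tn=90)
```

Because J combines sensitivity and specificity into one number per study, pooling it (e.g. via a Mantel–Haenszel-type summary) sidesteps the cut-off variation that confounds separate pooling of the two rates.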

Relevance: 80.00%

Abstract:

Much of the atmospheric variability in the North Atlantic sector is associated with variations in the eddy-driven component of the zonal flow. Here we present a simple method to specifically diagnose this component of the flow using the low-level wind field (925–700 hPa). We focus on the North Atlantic winter season in the ERA-40 reanalysis. Diagnostics of the latitude and speed of the eddy-driven jet stream are compared with conventional diagnostics of the North Atlantic Oscillation (NAO) and the East Atlantic (EA) pattern. This shows that the NAO and the EA both describe combined changes in the latitude and speed of the jet stream. It is therefore necessary, but not always sufficient, to consider both the NAO and the EA in identifying changes in the jet stream. The jet stream analysis suggests that there are three preferred latitudinal positions of the North Atlantic eddy-driven jet stream in winter. This result is in very good agreement with the application of a statistical mixture model to the two-dimensional state space defined by the NAO and the EA. These results are consistent with several other studies which identify four European/Atlantic regimes, comprising three jet stream patterns plus European blocking events.
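The regime-detection step can be imitated by fitting Gaussian mixtures of increasing size to points in a two-dimensional state space and selecting the number of components by BIC; the (NAO, EA)-like values below are synthetic stand-ins, not ERA-40 data, and the regime centres are invented.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Synthetic stand-in for daily (NAO, EA) index pairs in three regimes
centers = np.array([[-1.5, 0.0], [0.0, 1.2], [1.5, -0.8]])
pts = np.vstack([rng.normal(c, 0.3, size=(200, 2)) for c in centers])

# Fit mixtures of increasing size and pick the number of regimes by BIC
bics = {k: GaussianMixture(n_components=k, random_state=0).fit(pts).bic(pts)
        for k in range(1, 6)}
best_k = min(bics, key=bics.get)
```

With well-separated synthetic regimes the criterion recovers three components, mirroring the three preferred jet positions found in the reanalysis.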

Relevance: 80.00%

Abstract:

The rate at which a given site in a gene-sequence alignment evolves over time may vary. This phenomenon, known as heterotachy, can bias or distort phylogenetic trees inferred from models of sequence evolution that assume rates of evolution are constant. Here, we describe a phylogenetic mixture model designed to accommodate heterotachy. The method sums the likelihood of the data at each site over more than one set of branch lengths on the same tree topology. A branch-length set that is best for one site may differ from the branch-length set that is best for some other site, thereby allowing different sites to have different rates of change throughout the tree. Because rate variation may not be present in all branches, we use a reversible-jump Markov chain Monte Carlo algorithm to identify those branches in which reliable amounts of heterotachy occur. We implement the method in combination with our 'pattern-heterogeneity' mixture model, applying it to simulated data and five published datasets. We find that complex evolutionary signals of heterotachy are routinely present over and above variation in the rate or pattern of evolution across sites, that the reversible-jump method requires far fewer parameters than conventional mixture models to describe them, and that it serves to identify the regions of the tree in which heterotachy is most pronounced. The reversible-jump procedure also removes the need for a posteriori tests of 'significance' such as the Akaike or Bayesian information criterion tests, or Bayes factors. Heterotachy has important consequences for the correct reconstruction of phylogenies as well as for tests of hypotheses that rely on accurate branch-length information. These include molecular clocks, analyses of the tempo and mode of evolution, comparative studies and ancestral-state reconstruction. The model is available from the authors' website, and can be used for the analysis of both nucleotide and morphological data.
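The core computation, i.e. summing each site's likelihood over several branch-length sets, reduces to a weighted sum; the per-site likelihoods and weights below are hypothetical numbers, not output of the actual phylogenetic model.

```python
import numpy as np

def site_log_likelihood(site_liks, weights):
    """Log of the mixture likelihood of one alignment site:
    L_i = sum_k w_k * L(D_i | tree, branch-length set k)."""
    return np.log(np.dot(weights, site_liks))

weights = np.array([0.6, 0.4])           # mixture weights over branch-length sets
site_liks = np.array([[0.02, 0.005],     # site 1 likelihood under each set
                      [0.001, 0.03]])    # site 2 likelihood under each set
total = sum(site_log_likelihood(s, weights) for s in site_liks)
```

Because the sum is taken per site, a site dominated by the first branch-length set and a site dominated by the second both contribute their best-fitting component, which is how different sites get different rates on the same topology.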

Relevance: 80.00%

Abstract:

Accelerated failure time models with a shared random component are described, and are used to evaluate the effect of explanatory factors and different transplant centres on survival times following kidney transplantation. Different combinations of the distribution of the random effects and baseline hazard function are considered and the fit of such models to the transplant data is critically assessed. A mixture model that combines short- and long-term components of a hazard function is then developed, which provides a more flexible model for the hazard function. The model can incorporate different explanatory variables and random effects in each component. The model is straightforward to fit using standard statistical software, and is shown to be a good fit to the transplant data. Copyright (C) 2004 John Wiley & Sons, Ltd.
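The short- and long-term components can be combined as a two-component survival mixture; the exponential baselines and parameter values here are a simplifying assumption for illustration, not the distributions fitted to the transplant data.

```python
import numpy as np

def mixture_survival(t, pi, lam_short, lam_long):
    """S(t) = pi * S_short(t) + (1 - pi) * S_long(t), with exponential
    components standing in for the short- and long-term hazards."""
    return pi * np.exp(-lam_short * t) + (1 - pi) * np.exp(-lam_long * t)

# Survival probability at t = 1 for a hypothetical parameter set
s1 = mixture_survival(1.0, pi=0.3, lam_short=2.0, lam_long=0.1)
```

In the paper's formulation each component can carry its own covariates and random effects; the mixture weight then governs how much of the population follows the short-term hazard.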

Relevance: 80.00%

Abstract:

The study of motor unit action potential (MUAP) activity from electromyographic signals is an important stage in neurological investigations that aim to understand the state of the neuromuscular system. In this context, the identification and clustering of MUAPs that exhibit common characteristics, and the assessment of which data features are most relevant for the definition of such cluster structure, are central issues. In this paper, we propose the application of an unsupervised Feature Relevance Determination (FRD) method to the analysis of experimental MUAPs obtained from healthy human subjects. In contrast to approaches that require a priori information from the data, this FRD method is embedded in a constrained mixture model, known as Generative Topographic Mapping, which simultaneously performs clustering and visualization of MUAPs. The experimental results of the analysis of a data set consisting of MUAPs measured from the surface of the First Dorsal Interosseous, a hand muscle, indicate that the MUAP features corresponding to the hyperpolarization period in the physiological process of generation of muscle fibre action potentials are consistently estimated as the most relevant and, therefore, as those that should be paid preferential attention in the interpretation of the MUAP groupings.

Relevance: 80.00%

Abstract:

A generalized or tunable-kernel model is proposed for probability density function estimation based on an orthogonal forward regression procedure. Each stage of the density estimation process determines a tunable kernel, namely its center vector and diagonal covariance matrix, by minimizing a leave-one-out test criterion. The kernel mixing weights of the constructed sparse density estimate are finally updated using the multiplicative nonnegative quadratic programming algorithm to ensure the nonnegativity and unity constraints, and this weight-updating process additionally has the desired ability to further reduce the model size. The proposed tunable-kernel model has advantages, in terms of model generalization capability and model sparsity, over the standard fixed-kernel model that restricts kernel centers to the training data points and employs a single common kernel variance for every kernel. On the other hand, it does not optimize all the model parameters together and thus avoids the problems of high-dimensional, ill-conditioned nonlinear optimization associated with the conventional finite mixture model. Several examples are included to demonstrate the ability of the proposed tunable-kernel model to construct very compact and accurate density estimates.
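The multiplicative nonnegative quadratic programming step for the kernel mixing weights can be sketched schematically as a multiplicative update followed by renormalization; the exact update rule in the paper may differ, and `B`, `v` here are generic quadratic-program data, not quantities from the density estimator.

```python
import numpy as np

def mnqp_weights(B, v, iters=200):
    """Schematic multiplicative update for min 0.5 w'Bw - w'v subject to
    w >= 0 and sum(w) = 1, with renormalization enforcing the unity
    constraint; assumes B and v have nonnegative entries."""
    w = np.full(len(v), 1.0 / len(v))
    for _ in range(iters):
        w = w * v / (B @ w)   # multiplicative step keeps w nonnegative
        w = w / w.sum()       # re-impose the unity constraint
    return w

weights = mnqp_weights(np.eye(3), np.array([1.0, 2.0, 1.0]))
```

Because the update is multiplicative, a weight driven to zero stays at zero, which is the mechanism by which this step can further prune the model.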

Relevance: 80.00%

Abstract:

Mixture model techniques are applied to a daily index of monsoon convection from ERA‐40 reanalysis to show regime behavior. The result is the existence of two significant regimes showing preferred locations of convection within the Asia/Western‐North Pacific domain, with some resemblance to active‐break events over India. Simple trend analysis over 1958–2001 shows that the first regime has become less frequent while the second has become much more dominant. Both undergo a change in structure contributing to the total OLR trend over the ERA‐40 period. Stratifying the data according to a large‐scale dynamical index of monsoon interannual variability, we show the regime occurrence to be strongly perturbed by the seasonal condition, in agreement with conceptual ideas. This technique could be used to further examine predictability issues relating the seasonal mean and intraseasonal monsoon variability, or to explore changes in monsoon behavior in centennial‐scale model integrations.

Relevance: 80.00%

Abstract:

In public goods experiments, stochastic choice, censoring and motivational heterogeneity give scope for disagreement over the extent of unselfishness, and over whether it is reciprocal or altruistic. We show that these problems can be addressed econometrically by estimating a finite mixture model to isolate types, incorporating double censoring and a tremble term. Most subjects act selfishly, but a substantial proportion are reciprocal, with altruism playing only a marginal role. Isolating reciprocators enables a test of Sugden's model of voluntary contributions. We estimate that reciprocators display a self-serving bias relative to the model.

Relevance: 80.00%

Abstract:

The extensive shoreline deposits of Lake Chilwa, southern Malawi, a shallow water body today covering 600 km² of a basin of 7500 km², are investigated for their record of late Quaternary highstands. OSL dating, applied to 36 samples from five sediment cores from the northern and western marginal sand ridges, reveals a highstand record spanning 44 ka. Using two different grouping methods, highstand phases are identified at 43.7–33.3 ka, 26.2–21.0 ka and 17.9–12.0 ka (total error method) or 38.4–35.5 ka, 24.3–22.3 ka, 16.2–15.1 ka and 13.5–12.7 ka (Finite Mixture Model age components), with two further discrete events recorded at 11.01 ± 0.76 ka and 8.52 ± 0.56 ka. Highstands are comparable to the timing of wet phases from other basins in East and southern Africa, demonstrating wet conditions in the region before the LGM, which was dry, and a wet Lateglacial, which commenced earlier in the southern compared to the northern hemisphere in East Africa. We find no evidence that wet phases are insolation driven, but analysis of the dataset and GCM modelling experiments suggest that Heinrich events may be associated with enhanced monsoon activity in East Africa, both in timing and as a possible causal mechanism.