181 results for statistical speaker models

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance:

80.00%

Publisher:

Abstract:

Finite-size scaling analysis turns out to be a powerful tool to calculate the phase diagram as well as the critical properties of two-dimensional classical statistical mechanics models and quantum Hamiltonians in one dimension. The most widely used method to locate quantum critical points is the so-called crossing method, where the estimates are obtained by comparing the mass gaps of two distinct lattice sizes. The success of this method is due to its simplicity and its ability to provide accurate results even for relatively small lattice sizes. In this paper, we introduce an estimator that locates quantum critical points by exploiting the known distinct behavior of the entanglement entropy in critical and noncritical systems. As a benchmark test, we use this new estimator to locate the critical point of the quantum Ising chain and the critical line of the spin-1 Blume-Capel quantum chain. The tricritical point of this last model is also obtained. A comparison with the standard crossing method is also presented. The method we propose is simple to implement in practice, particularly in density matrix renormalization group calculations, and, like the crossing method, it yields remarkably accurate results for quite small lattice sizes. Our applications show that the proposed method has several advantages over the standard crossing method, and we believe it will become popular in future numerical studies.
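
A minimal sketch of the crossing method described above, under illustrative assumptions: the transverse-field Ising chain (critical at g = 1 for unit coupling) is diagonalized exactly for two small sizes, and the crossing of the scaled gaps L·Δ(L, g) is located with a root finder. The chain sizes, bracket, and tolerance are arbitrary choices, not the paper's.

```python
import numpy as np
from scipy.optimize import brentq

sx = np.array([[0., 1.], [1., 0.]])
sz = np.array([[1., 0.], [0., -1.]])
id2 = np.eye(2)

def op_at(site_ops, L):
    """Tensor product placing the given single-site operators on an L-site chain."""
    out = np.array([[1.]])
    for i in range(L):
        out = np.kron(out, site_ops.get(i, id2))
    return out

def gap(L, g):
    """Mass gap of H = -sum_i sz_i sz_{i+1} - g sum_i sx_i (open boundaries)."""
    H = np.zeros((2 ** L, 2 ** L))
    for i in range(L - 1):
        H -= op_at({i: sz, i + 1: sz}, L)
    for i in range(L):
        H -= g * op_at({i: sx}, L)
    E = np.linalg.eigvalsh(H)
    return E[1] - E[0]

L1, L2 = 8, 10
f = lambda g: L1 * gap(L1, g) - L2 * gap(L2, g)   # scaled gaps cross at the estimate
g_c = brentq(f, 0.5, 1.5, xtol=1e-3)
print(f"crossing estimate of g_c: {g_c:.3f}  (exact value for this model: 1)")
```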

Relevance:

40.00%

Publisher:

Abstract:

We consider a simple Maier-Saupe statistical model with the inclusion of disorder degrees of freedom to mimic the phase diagram of a mixture of rodlike and disklike molecules. A quenched distribution of shapes leads to a phase diagram with two uniaxial and a biaxial nematic structure. A thermalized distribution, however, which is more adequate to liquid mixtures, precludes the stability of this biaxial phase. We then use a two-temperature formalism, and assume a separation of relaxation times, to show that a partial degree of annealing is already sufficient to stabilize a biaxial nematic structure.
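
A toy numerical contrast (not from the paper) between the quenched and thermalized (annealed) disorder averages that drive the result above, for a single spin in a random field: the quenched free energy averages ln Z over the disorder, while the annealed one averages Z itself.

```python
import numpy as np

T = 1.0
hs = np.array([0.5, 2.0])              # two equally likely disorder realizations (illustrative)
Z = 2 * np.cosh(hs / T)                # partition function of a spin s = +-1 in field h
F_quenched = -T * np.mean(np.log(Z))   # disorder frozen on the observation timescale
F_annealed = -T * np.log(np.mean(Z))   # disorder equilibrates together with the spin
print(F_quenched, F_annealed)          # annealed <= quenched, by Jensen's inequality
```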

Relevance:

30.00%

Publisher:

Abstract:

Dental impression is an important step in the preparation of prostheses, since it reproduces the anatomic and surface details of teeth and adjacent structures. The objective of this study was to evaluate the linear dimensional alterations in gypsum dies obtained with different elastomeric materials, using a resin coping impression technique with individual shells. A stainless steel master cast with fixed prosthesis characteristics and two prepared abutment teeth was used to obtain the impressions. Reference points (A, B, C, D, E, F, G and H) were recorded on the occlusal and buccal surfaces of the abutments to register the distances. The impressions were obtained using the following materials: polyether, mercaptan-polysulfide, addition silicone, and condensation silicone. The transfer impressions were made with custom trays and an irreversible hydrocolloid material and were poured with type IV gypsum. The distances between the identified points in the gypsum dies were measured using an optical microscope, and the results were statistically analyzed by ANOVA (p < 0.05) and Tukey's test. The mean distances were registered as follows: addition silicone (AB = 13.6 µm, CD = 15.0 µm, EF = 14.6 µm, GH = 15.2 µm), mercaptan-polysulfide (AB = 36.0 µm, CD = 36.0 µm, EF = 39.6 µm, GH = 40.6 µm), polyether (AB = 35.2 µm, CD = 35.6 µm, EF = 39.4 µm, GH = 41.4 µm) and condensation silicone (AB = 69.2 µm, CD = 71.0 µm, EF = 80.6 µm, GH = 81.2 µm). All of the measurements found in the gypsum dies were compared to those of the master cast. The results demonstrated that addition silicone provides the best stability of the compounds tested, followed by polyether, polysulfide and condensation silicone. No statistical differences were found between the polyether and mercaptan-polysulfide materials.
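
A hedged sketch of the statistical analysis named above (one-way ANOVA at p < 0.05 followed by Tukey's test), written against synthetic per-die values drawn around the reported group means; these numbers are for illustration only, not the study's raw data.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
means = {"addition": 13.6, "polyether": 35.2, "polysulfide": 36.0, "condensation": 69.2}
groups = {name: rng.normal(mu, 3.0, size=10) for name, mu in means.items()}  # 10 synthetic dies each

F, p = f_oneway(*groups.values())
print(f"ANOVA: F = {F:.1f}, p = {p:.2g}")

values = np.concatenate(list(groups.values()))
labels = np.repeat(list(groups.keys()), 10)
print(pairwise_tukeyhsd(values, labels, alpha=0.05))   # pairwise material comparisons
```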

Relevance:

30.00%

Publisher:

Abstract:

The enzyme purine nucleoside phosphorylase from Schistosoma mansoni (SmPNP) is an attractive molecular target for the treatment of major parasitic infectious diseases, with special emphasis on its role in the discovery of new drugs against schistosomiasis, a tropical disease that affects millions of people worldwide. In the present work, we have determined the inhibitory potency of, and developed descriptor- and fragment-based quantitative structure-activity relationships (QSAR) for, a series of 9-deazaguanine analogs as inhibitors of SmPNP. Significant statistical parameters (descriptor-based model: r² = 0.79, q² = 0.62, r²pred = 0.52; fragment-based model: r² = 0.95, q² = 0.81, r²pred = 0.80) were obtained, indicating the predictive potential of the models for untested compounds. The fragment-based model was then used to predict the inhibitory potency of a test set of compounds, and the predicted values are in good agreement with the experimental results.
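
A sketch of how the cross-validated q² quoted above is typically computed for a QSAR model, via leave-one-out prediction; X and y below are random stand-ins for the real descriptor matrix and measured potencies.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(1)
X = rng.normal(size=(25, 4))                         # 25 analogs x 4 descriptors (placeholder)
y = X @ np.array([1.0, -0.5, 0.3, 0.0]) + rng.normal(scale=0.2, size=25)

y_loo = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
press = np.sum((y - y_loo) ** 2)                     # predictive residual sum of squares
q2 = 1.0 - press / np.sum((y - y.mean()) ** 2)
print(f"q2 = {q2:.2f}")
```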

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to determine the reproducibility, reliability and validity of measurements made on digital models compared to plaster models. Fifteen pairs of plaster models were obtained from orthodontic patients with permanent dentition before treatment. These were digitized to be evaluated with the program Cécile3 v2.554.2 beta. Two examiners measured, three times each, the mesiodistal width of all the teeth present, the intercanine, interpremolar and intermolar distances, and the overjet and overbite. The plaster models were measured using a digital vernier caliper. Student's t-test for paired samples and the intraclass correlation coefficient (ICC) were used for statistical analysis. The ICCs of the digital models were 0.84 ± 0.15 (intra-examiner) and 0.80 ± 0.19 (inter-examiner). The mean difference of the digital models was 0.23 ± 0.14 and 0.24 ± 0.11 for each examiner, respectively. When the two types of measurements were compared, the values obtained from the digital models were lower than those obtained from the plaster models (p < 0.05), although the differences were considered clinically insignificant (differences < 0.1 mm). The Cécile digital models are a clinically acceptable alternative for use in orthodontics.
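
A minimal sketch of the paired comparison reported above (Student's t-test for paired samples on the same distances measured on digital and plaster models); the arrays are hypothetical values, not the study's data.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(2)
plaster = np.array([8.12, 7.95, 10.31, 6.88, 9.47])          # mm, hypothetical distances
digital = plaster - np.abs(rng.normal(0.05, 0.03, size=5))   # digital reads slightly lower

t, p = ttest_rel(digital, plaster)
print(f"t = {t:.2f}, p = {p:.3f}, mean difference = {np.mean(digital - plaster):.3f} mm")
```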

Relevance:

30.00%

Publisher:

Abstract:

In recent years, there has been increasing interest in understanding the physical properties of collisionless plasmas, mostly because of the large number of astrophysical environments (e.g. the intracluster medium, ICM) containing magnetic fields that are strong enough to be coupled with the ionized gas, and characterized by densities sufficiently low to prevent pressure isotropization with respect to the magnetic line direction. Under these conditions, a new class of kinetic instabilities arises, such as the firehose and mirror instabilities, which have been studied extensively in the literature. Their role in the turbulence evolution and cascade process in the presence of pressure anisotropy, however, is still unclear. In this work, we present the first statistical analysis of turbulence in collisionless plasmas using three-dimensional numerical simulations that solve the double-isothermal magnetohydrodynamic equations with the Chew-Goldberger-Low closure (CGL-MHD). We study models with different initial conditions to account for the firehose and mirror instabilities and to obtain different turbulent regimes. We find that subsonic and supersonic CGL-MHD turbulence shows only small differences compared to the MHD models in most cases. However, in the regimes of strong kinetic instabilities, the statistics, i.e. the probability distribution functions (PDFs) of density and velocity, are very different. In subsonic models, the instabilities cause an increase in the dispersion of density, while the dispersion of velocity is increased by a large factor in some cases. Moreover, the spectra of density and velocity show increased power at small scales, explained by the high growth rate of the instabilities. Finally, we calculate the structure functions of velocity and density fluctuations in the local reference frame defined by the direction of the magnetic lines. The results indicate that in some cases the instabilities significantly increase the anisotropy of the fluctuations. These results, even though preliminary and restricted to very specific conditions, show that the physical properties of turbulence in collisionless plasmas, such as those found in the ICM, may be very different from what has been generally believed.
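
As a sketch of the last analysis step mentioned above, the snippet below computes a second-order structure function S2(l) = <|v(x + l) - v(x)|²> for separations along one axis of a synthetic periodic field; the actual simulation data and the local magnetic-field frame are of course not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)
v = rng.normal(size=(64, 64, 64))        # stand-in for one velocity component

def s2(field, max_sep=16, axis=0):
    """Second-order structure function along one axis, using periodic shifts."""
    return np.array([np.mean((np.roll(field, -l, axis=axis) - field) ** 2)
                     for l in range(1, max_sep + 1)])

print(s2(v)[:5])
```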

Relevance:

30.00%

Publisher:

Abstract:

Creation of cold dark matter (CCDM) can be described macroscopically by a negative pressure, and therefore the mechanism is capable of accelerating the Universe without the need for an additional dark energy component. In this framework, we discuss the evolution of perturbations by considering a neo-Newtonian approach where, unlike in standard Newtonian cosmology, the fluid pressure is taken into account even in the homogeneous and isotropic background equations (Lima, Zanchin, and Brandenberger, MNRAS 291, L1, 1997). The evolution of the density contrast is calculated in the linear approximation and compared to the one predicted by the ΛCDM model. The difference between the CCDM and ΛCDM predictions at the perturbative level is quantified using three different statistical methods, namely: a simple χ²-analysis in the relevant parameter space, a Bayesian statistical inference, and, finally, a Kolmogorov-Smirnov test. We find that under certain circumstances the CCDM scenario analyzed here predicts an overall dynamics (including the Hubble flow and the matter fluctuation field) which fully recovers that of the traditional cosmic concordance model. Our basic conclusion is that such a reduction of the dark sector provides a viable alternative description to the accelerating ΛCDM cosmology.
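
A sketch of the third comparison method named above, a two-sample Kolmogorov-Smirnov test; the two arrays are placeholders for the density-contrast values predicted by the CCDM and ΛCDM models on the same grid.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(4)
delta_ccdm = rng.normal(loc=1.00, scale=0.10, size=200)   # hypothetical CCDM sample
delta_lcdm = rng.normal(loc=1.02, scale=0.10, size=200)   # hypothetical LambdaCDM sample

stat, p = ks_2samp(delta_ccdm, delta_lcdm)
print(f"KS statistic = {stat:.3f}, p = {p:.3f}")          # large p: statistically equivalent
```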

Relevance:

30.00%

Publisher:

Abstract:

The mass function of cluster-size halos and their redshift distribution are computed for 12 distinct accelerating cosmological scenarios and confronted with the predictions of the conventional flat ΛCDM model. The comparison with ΛCDM is performed in a two-step process. First, we determine the free parameters of all models through a joint analysis involving the latest cosmological data: type Ia supernovae, the cosmic microwave background shift parameter, and baryon acoustic oscillations. Apart from a braneworld-inspired cosmology, it is found that the derived Hubble relation of the remaining models reproduces the ΛCDM results with approximately the same degree of statistical confidence. Second, in order to attempt to distinguish the different dark energy models from the expectations of ΛCDM, we analyze the predicted cluster-size halo redshift distribution on the basis of two future cluster surveys: (i) an X-ray survey based on the eROSITA satellite, and (ii) a Sunyaev-Zel'dovich survey based on the South Pole Telescope. As a result, we find that the predictions of 8 out of the 12 dark energy models can be clearly distinguished from the ΛCDM cosmology, while the predictions of 4 models are statistically equivalent to those of the ΛCDM model, as far as the expected cluster mass function and redshift distribution are concerned. The present analysis suggests that this technique is very competitive with independent tests probing the late-time evolution of the Universe and the associated dark energy effects.

Relevance:

30.00%

Publisher:

Abstract:

Here, I investigate the use of Bayesian updating rules to model how social agents change their minds in the case of continuous opinion models. Given another agent's statement about the continuous value of a variable, we will see that interesting dynamics emerge when an agent assigns to that value a likelihood that is a mixture of a Gaussian and a uniform distribution. This represents the idea that the other agent might have no idea what is being talked about. The effect of updating only the first moment of the distribution is studied first, and we will see that this generates results similar to those of the bounded confidence models. When the second moment is also updated, several different opinions always survive in the long run, as agents become more stubborn with time. However, depending on the probability of error and the initial uncertainty, those opinions might be clustered around a central value.
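
A minimal sketch of an update rule of the kind described above, assuming the agent's opinion is summarized by a Gaussian N(m, v) and the other agent's statement x is modeled with a Gaussian-plus-uniform mixture likelihood; all parameter names and values here are illustrative, not taken from the paper.

```python
import math

def update(m, v, x, p=0.9, s2=0.1, u=1.0):
    """Posterior (mean, variance) after hearing statement x.

    p  : prior probability that the statement is informative
    s2 : variance of the Gaussian (informative) component
    u  : density of the uniform ('no idea') component over its range
    """
    var = v + s2
    gauss = math.exp(-(x - m) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)
    w = p * gauss / (p * gauss + (1 - p) * u)    # posterior weight of the Gaussian part
    k = v / var                                  # gain of the Gaussian update
    m_g, v_g = m + k * (x - m), v * (1 - k)      # moments if the statement is informative
    m_new = w * m_g + (1 - w) * m                # mixture mean
    second = w * (v_g + m_g ** 2) + (1 - w) * (v + m ** 2)
    return m_new, second - m_new ** 2            # updating both moments shrinks v over time

print(update(0.5, 0.04, 0.8))
```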

Relevance:

30.00%

Publisher:

Abstract:

Today, several different unsupervised classification algorithms are commonly used to cluster similar patterns in a data set based only on its statistical properties. Especially in image applications, self-organizing methods for unsupervised classification have been successfully applied to cluster pixels or groups of pixels in order to perform segmentation tasks. The first important contribution of this paper is the development of a self-organizing method for data classification, named the Enhanced Independent Component Analysis Mixture Model (EICAMM), built by modifying the Independent Component Analysis Mixture Model (ICAMM). The modifications address some of the model's limitations and make it more efficient. Moreover, a pre-processing methodology is also proposed, based on combining Sparse Code Shrinkage (SCS) for image denoising with the Sobel edge detector. In the experiments of this work, EICAMM and other self-organizing models were applied to segment images in their original and pre-processed versions. A comparative analysis showed satisfactory and competitive image segmentation results for the proposals presented herein. © 2008 Published by Elsevier B.V.
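
A sketch of the pre-processing stage mentioned above, with the Sobel edge detector applied to a denoised image; a simple Gaussian filter stands in here for the Sparse Code Shrinkage step, which is considerably more involved.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(5)
image = rng.random((128, 128))                         # placeholder input image

denoised = ndimage.gaussian_filter(image, sigma=1.0)   # stand-in for SCS denoising
gx = ndimage.sobel(denoised, axis=0)                   # horizontal gradient
gy = ndimage.sobel(denoised, axis=1)                   # vertical gradient
edges = np.hypot(gx, gy)                               # edge magnitude map
```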

Relevance:

30.00%

Publisher:

Abstract:

Mixed models have become important in analyzing the results of experiments, particularly those that require more complicated models (e.g., those that involve longitudinal data). This article describes a method for deriving the terms in a mixed model. Our approach extends an earlier method by Brien and Bailey to explicitly identify terms for which autocorrelation and smooth trend arising from longitudinal observations need to be incorporated in the model. At the same time we retain the principle that the model used should include, at least, all the terms that are justified by the randomization. This is done by dividing the factors into sets, called tiers, based on the randomization and determining the crossing and nesting relationships between factors. The method is applied to formulate mixed models for a wide range of examples. We also describe the mixed model analysis of data from a three-phase experiment to investigate the effect of time of refinement on Eucalyptus pulp from four different sources. Cubic smoothing splines are used to describe differences in the trend over time and unstructured covariance matrices between times are found to be necessary.
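
For readers unfamiliar with fitting models of this general kind, a small hypothetical sketch follows: a mixed model with refinement time as a fixed effect and pulp source as a random intercept. The data frame, formula and structure are illustrative only; the experiment's actual model also involves smoothing splines and unstructured covariance matrices.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
df = pd.DataFrame({
    "time": np.tile([0, 30, 60, 90], 8),        # refinement times (illustrative)
    "source": np.repeat(list("ABCD"), 8),       # four pulp sources
    "y": rng.normal(size=32),                   # placeholder response
})
fit = smf.mixedlm("y ~ time", df, groups=df["source"]).fit()
print(fit.summary())
```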

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present various diagnostic methods for polyhazard models. Polyhazard models are a flexible family for fitting lifetime data. Their main advantage over single hazard models, such as the Weibull and log-logistic models, is that they accommodate a wide range of nonmonotone hazard shapes, such as bathtub and multimodal curves. Some influence methods, such as the local influence and the total local influence of an individual, are derived, analyzed and discussed. The computation of the likelihood displacement, as well as of the normal curvature in the local influence method, is also discussed. Finally, an example with real data is given for illustration.
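
A small illustration (not from the paper) of why polyhazard models capture nonmonotone shapes: adding a decreasing and an increasing Weibull hazard yields a bathtub curve.

```python
import numpy as np

def weibull_hazard(t, shape, scale):
    """Hazard function of a Weibull(shape, scale) lifetime."""
    return (shape / scale) * (t / scale) ** (shape - 1)

t = np.linspace(0.01, 5, 500)
h = weibull_hazard(t, 0.5, 1.0) + weibull_hazard(t, 3.0, 4.0)   # two-component polyhazard
print(f"hazard minimum at t = {t[h.argmin()]:.2f}")             # interior minimum: bathtub shape
```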

Relevance:

30.00%

Publisher:

Abstract:

Many models exist in the literature to explain the success of technological innovation. However, no studies have addressed the graphic formats used to represent technological innovation models and their impact, or how these models are understood by non-specialists in technology management. Thus, the main objective of this paper is to propose a new graphic configuration to represent technological innovation management. Based on the literature, the innovation model is first presented in the traditional format. Next, the same model is designed in a new graphic format, named 'the see-saw of competitiveness', showing the interfaces among the identified factors. The two graphic formats were compared by a group of graduate students in terms of ease of understanding the conceptual model of innovation. The statistical analysis shows that the see-saw of competitiveness is preferred.

Relevance:

30.00%

Publisher:

Abstract:

Objective: The aim of this article is to propose an integrated framework for extracting and describing patterns of disorders from medical images using a combination of linear discriminant analysis and active contour models. Methods: A multivariate statistical methodology was first used to identify the most discriminating hyperplane separating two groups of images (from healthy controls and patients with schizophrenia) contained in the input data. The present work then makes the differences found by the multivariate statistical method explicit by subtracting the discriminant models of controls and patients, weighted by the pooled variance between the two groups. A variational level-set technique was used to segment clusters of these differences. A label for each anatomical change was obtained using the Talairach atlas. Results: In this work, all the data were analysed simultaneously rather than assuming a priori regions of interest. As a consequence, by using active contour models, we were able to obtain regions of interest that were emergent from the data. The results were evaluated using, as a gold standard, well-known facts about the neuroanatomical changes related to schizophrenia. Most of the items in the gold standard were covered by our result set. Conclusions: We argue that such an investigation provides a suitable framework for characterising the high complexity of magnetic resonance images in schizophrenia, as the results obtained indicate a high sensitivity rate with respect to the gold standard. © 2010 Elsevier B.V. All rights reserved.
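
A sketch of the first stage described above, under generic assumptions: a linear discriminant is fitted to two groups of feature vectors, and its weight vector, which defines the separating hyperplane, is read back as a map over the features; the data are random stand-ins for the image groups.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(7)
X = rng.normal(size=(40, 500))          # 40 subjects x 500 voxel features (placeholder)
y = np.array([0] * 20 + [1] * 20)       # controls vs patients

lda = LinearDiscriminantAnalysis().fit(X, y)
weights = lda.coef_.ravel()             # discriminative map over the voxel features
print(weights.shape)
```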

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we compare the performance of two statistical approaches for the analysis of data from social research. In the first approach, we use normal models with joint regression modelling of the mean and of the variance heterogeneity. In the second approach, we use hierarchical models. In the first case, individual and social variables are included as explanatory variables in the regression modelling of both the mean and the variance, while in the second case the variance at level 1 of the hierarchical model depends on the individuals' ages, and at level 2 the variance is assumed to change according to socioeconomic stratum. Applying these methodologies, we analyze a Colombian height data set to find differences that can be explained by socioeconomic conditions. We also present some theoretical and empirical results concerning the two models. From this comparative study, we conclude that it is better to jointly model the mean and the variance heterogeneity in all cases. We also observe that the convergence of the Gibbs sampling chain used in the Markov chain Monte Carlo method for the joint modelling of the mean and variance heterogeneity is quickly achieved.
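
A compact sketch of the first approach above, assuming a single covariate: a normal model whose mean and log-variance are both regressed on an explanatory variable, fitted by maximizing the likelihood; the variable names and toy data are illustrative, not the Colombian data set.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(8)
age = rng.uniform(5, 18, 300)
height = 80 + 5 * age + rng.normal(scale=0.3 * age)      # heteroscedastic toy data

def nll(theta):
    """Negative log-likelihood with mean and log-variance both linear in age."""
    b0, b1, g0, g1 = theta
    mu = b0 + b1 * age
    logvar = g0 + g1 * age                               # variance heterogeneity model
    return 0.5 * np.sum(logvar + (height - mu) ** 2 / np.exp(logvar))

fit = minimize(nll, x0=[70.0, 4.0, 0.0, 0.0], method="Nelder-Mead")
print(fit.x)                                             # [b0, b1, g0, g1]
```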