942 results for generalized multiscale entropy
Abstract:
The extension of Boltzmann-Gibbs thermostatistics proposed by Tsallis introduces an additional parameter q alongside the inverse temperature β. Here, we show that a previously introduced generalized Metropolis dynamics used to evolve spin models is not local and does not obey detailed energy balance. In this dynamics, locality is only retrieved for q = 1, which corresponds to the standard Metropolis algorithm. Nonlocality implies very time-consuming computer calculations, since the energy of the whole system must be reevaluated whenever a single spin is flipped. To circumvent this costly calculation, we propose a generalized master equation, which gives rise to a local generalized Metropolis dynamics that obeys detailed energy balance. To compare the critical values obtained with other generalized dynamics, we perform equilibrium Monte Carlo simulations for the Ising model. Using short-time nonequilibrium numerical simulations, we also calculate for this model the critical temperature and the static and dynamical critical exponents as functions of q. Even for q ≠ 1, we show that suitable time-evolving power laws can be found for each initial condition. Our numerical experiments corroborate the literature results when we use the nonlocal dynamics, showing that short-time parameter determination also works in this case. However, the dynamics governed by the new master equation leads to different critical temperatures and critical exponents, thus affecting the universality classes. We further propose a simple algorithm to optimize power-law fits of the time evolution, based on two successive refinements in a log-log plot.
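As an illustration of the kind of dynamics discussed above, the sketch below implements a single-spin-flip Metropolis sweep for the 2D Ising model in which the usual Boltzmann acceptance factor is replaced by a Tsallis q-exponential weight of the local energy change. This is a minimal, commonly used generalization meant only to convey the idea; the local transition rates that follow from the paper's generalized master equation are not reproduced here, and the lattice size, coupling and function names (q_exp, metropolis_sweep) are illustrative choices.

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential; reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def metropolis_sweep(spins, beta, q, rng):
    """One lattice sweep of single-spin-flip dynamics with a q-exponential
    acceptance weight (illustrative; not necessarily the paper's exact rule)."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        # local energy change for flipping spin (i, j) on a periodic square lattice
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nn
        if rng.random() < min(1.0, q_exp(-beta * dE, q)):
            spins[i, j] *= -1
    return spins

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(32, 32))
spins = metropolis_sweep(spins, beta=0.44, q=1.1, rng=rng)
```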
Abstract:
We provide a detailed account of the spatial structure of the Brazilian sardine (Sardinella brasiliensis) spawning and nursery habitats, using ichthyoplankton data from nine surveys (1976-1993) covering the Southeastern Brazilian Bight (SBB). The spatial variability of sardine eggs and larvae was partitioned into predefined spatial-scale classes (broad scale, 200-500 km; medium scale, 50-100 km; and local scale, <50 km). The relationship between density distributions at both developmental stages and environmental descriptors (temperature and salinity) was also explored within these spatial scales. Spatial distributions of sardine eggs were mostly structured on medium and local scales, while larvae were characterized by broad- and medium-scale distributions. Broad- and medium-scale surface temperatures were positively correlated with sardine densities for both developmental stages. Correlations with salinity were predominantly negative and concentrated on a medium scale. Broad-scale structuring might be explained by mesoscale processes, such as pulsing upwelling events and Brazil Current meandering at the northern portion of the SBB, while medium-scale relationships may be associated with local estuarine outflows. The results indicate that processes favouring vertical stability might regulate the spatial extent of suitable spawning and nursery habitats for the Brazilian sardine.
Abstract:
Non-commutative geometry indicates a deformation of the energy-momentum dispersion relation f(E) = E/(pc) ≠ 1 for massless particles. This distorted energy-momentum relation can affect the radiation-dominated phase of the universe at sufficiently high temperature. This prompted the idea of non-commutative inflation by Alexander et al (2003 Phys. Rev. D 67 081301) and Koh and Brandenberger (2007 JCAP06(2007) 021 and JCAP11(2007) 013). These authors studied a one-parameter family of non-relativistic dispersion relations that leads to inflation: the family of curves f(E) = 1 + (λE)^α. We show here how the conceptually different structure of symmetries of non-commutative spaces can lead, in a mathematically consistent way, to the fundamental equations of non-commutative inflation driven by radiation. We describe how this structure can be considered independently of (but including) the idea of non-commutative spaces as a starting point of the general inflationary deformation of SL(2, C). We analyze the conditions on the dispersion relation that lead to inflation as a set of inequalities which play the same role as the slow-roll conditions on the potential of a scalar field. We study conditions for a possible numerical approach to obtain a general one-parameter family of dispersion relations that lead to successful inflation.
Abstract:
A rigorous asymptotic theory for Wald residuals in generalized linear models is not yet available. The authors provide matrix formulae of order O(n^(-1)), where n is the sample size, for the first two moments of these residuals. The formulae can be applied to many regression models widely used in practice. The authors suggest adjusted Wald residuals for these models, with approximately zero mean and unit variance. The expressions were used to analyze a real dataset. Some simulation results indicate that the adjusted Wald residuals are better approximated by the standard normal distribution than the unadjusted Wald residuals.
Abstract:
Rényi and von Neumann entropies quantifying the amount of entanglement in ground states of critical spin chains are known to satisfy a universal law which is given by the conformal field theory (CFT) describing their scaling regime. This law can be generalized to excitations described by primary fields in CFT, as was done by Alcaraz et al in 2011 (see reference [1], of which this work is a completion). An alternative derivation is presented, together with numerical verifications of our results in different models belonging to the c = 1, 1/2 universality classes. Oscillations of the Rényi entropy in excited states are also discussed.
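For reference, the universal ground-state law referred to above is the standard CFT result for a single interval of length l in a periodic critical chain of length L (the constants c'_n are non-universal); the excited-state generalization studied in the paper modifies this by universal functions determined by the primary operator. The conventions below (periodic boundary conditions, one interval) are assumed for concreteness.

```latex
S_n(l) \;=\; \frac{c}{6}\left(1+\frac{1}{n}\right)
\ln\!\left[\frac{L}{\pi}\sin\!\left(\frac{\pi l}{L}\right)\right] + c'_n,
\qquad
S_{\mathrm{vN}}(l) \;=\; \lim_{n\to 1} S_n(l)
\;=\; \frac{c}{3}\ln\!\left[\frac{L}{\pi}\sin\!\left(\frac{\pi l}{L}\right)\right] + c'_1 .
```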
Abstract:
Increasing age is associated with a reduction in overall heart rate variability as well as changes in the complexity of physiologic dynamics. The aim of this study was to verify whether the alterations in autonomic modulation of heart rate caused by the aging process could be detected by Shannon entropy (SE), conditional entropy (CE) and symbolic analysis (SA). Complexity analysis was carried out in 44 healthy subjects divided into two groups: an old group (n = 23, 63 ± 3 years) and a young group (n = 21, 23 ± 2 years). SE, CE [complexity index (CI) and normalized CI (NCI)] and SA (0V, 1V, 2LV and 2ULV patterns) were analyzed over short heart period series (200 cardiac beats) derived from ECG recordings during 15 min of rest in a supine position. The sequences characterized by three heart periods with no significant variations (0V), and those with two significant unlike variations (2ULV), reflect changes in sympathetic and vagal modulation, respectively. The unpaired t test (or Mann-Whitney rank sum test when appropriate) was used in the statistical analysis. In the aging process, the distributions of patterns (SE) remain similar to those of young subjects. However, the regularity is significantly different: the patterns are more repetitive in the old group (a decrease of CI and NCI). The amounts of the pattern types also differ: 0V is increased while 2LV and 2ULV are reduced in the old group. These differences indicate a marked change in autonomic regulation. CE and SA are feasible techniques to detect alterations in autonomic control of heart rate in the old group.
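A minimal sketch of the symbolic analysis step is given below, assuming the common scheme of uniform quantization of the RR series into six levels and classification of three-beat words into 0V, 1V, 2LV and 2ULV patterns; the quantization choices and function name are illustrative and may differ from the exact procedure used in the study.

```python
import numpy as np

def symbolic_analysis(rr, n_levels=6):
    """Classify 3-beat patterns of an RR-interval series into 0V, 1V, 2LV, 2ULV
    (uniform 6-level quantization; details may differ from the paper)."""
    rr = np.asarray(rr, dtype=float)
    lo, hi = rr.min(), rr.max()
    symbols = np.minimum(((rr - lo) / (hi - lo + 1e-12) * n_levels).astype(int),
                         n_levels - 1)
    counts = {"0V": 0, "1V": 0, "2LV": 0, "2ULV": 0}
    for k in range(len(symbols) - 2):
        d1 = symbols[k + 1] - symbols[k]     # first symbol variation
        d2 = symbols[k + 2] - symbols[k + 1] # second symbol variation
        n_var = int(d1 != 0) + int(d2 != 0)
        if n_var == 0:
            counts["0V"] += 1
        elif n_var == 1:
            counts["1V"] += 1
        elif d1 * d2 > 0:
            counts["2LV"] += 1               # two like variations
        else:
            counts["2ULV"] += 1              # two unlike variations
    total = sum(counts.values())
    return {k: 100.0 * v / total for k, v in counts.items()}  # percentages

rates = symbolic_analysis(np.random.default_rng(1).normal(800, 30, 200))
```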
Abstract:
We used the statistical measures of information entropy, disequilibrium and complexity to infer a hierarchy of equations of state for two types of compact stars from the broad class of neutron stars, namely those with hadronic composition and those with strange quark composition. Our results show that, since order costs energy, Nature would favor the exotic strange stars, even though the question of how the strange stars form cannot be answered within this approach. (C) 2012 Elsevier B.V. All rights reserved.
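The sketch below illustrates, for a discrete probability distribution, how an entropy-disequilibrium complexity measure of the LMC type can be computed; the paper applies analogous measures to the continuous density profiles of compact stars, so this is only a toy version with assumed conventions (normalized Shannon entropy, Euclidean disequilibrium).

```python
import numpy as np

def lmc_complexity(p):
    """LMC-style statistical complexity C = H * D for a discrete distribution:
    H is the Shannon entropy normalized to [0, 1], D the Euclidean distance
    from the uniform distribution (disequilibrium)."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    n = len(p)
    nz = p[p > 0]
    H = -np.sum(nz * np.log(nz)) / np.log(n)   # normalized information entropy
    D = np.sum((p - 1.0 / n) ** 2)             # disequilibrium
    return H * D

print(lmc_complexity([0.1, 0.2, 0.3, 0.4]))
```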
Abstract:
A twisted generalized Weyl algebra A of degree n depends on a base algebra R, n commuting automorphisms σ_i of R, n central elements t_i of R and on some additional scalar parameters. In a paper by Mazorchuk and Turowska, it is claimed that certain consistency conditions for the σ_i and t_i are sufficient for the algebra to be nontrivial. However, in this paper we give an example which shows that this is false. We also correct the statement by finding a new set of consistency conditions and prove that the old and new conditions together are necessary and sufficient for the base algebra R to map injectively into A. In particular, they are sufficient for the algebra A to be nontrivial. We speculate that these consistency relations may play a role in other areas of mathematics, analogous to the role played by the Yang-Baxter equation in the theory of integrable systems.
Abstract:
The generalized finite element method (GFEM) is applied to a nonconventional hybrid-mixed stress formulation (HMSF) for plane analysis. In the HMSF, three approximation fields are involved: stresses and displacements in the domain, and displacement fields on the static boundary. The GFEM-HMSF shape functions are then generated by the product of a partition of unity associated with each field and polynomial enrichment functions. In principle, the enrichment can be conducted independently over each of the HMSF approximation fields. However, the stability and convergence features of the resulting numerical method can be affected, mainly by spurious modes generated when enrichment is arbitrarily applied to the displacement fields. With the aim of efficiently exploring the enrichment possibilities, an extension to GFEM-HMSF of the conventional Zienkiewicz patch test is proposed as a necessary condition to ensure numerical stability. Finally, once the extended patch test is satisfied, some numerical analyses focusing on selective enrichment over distorted meshes formed by bilinear quadrilateral finite elements are presented, thus showing the performance of the GFEM-HMSF combination.
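The sketch below shows the generic GFEM construction mentioned above in its simplest 1D form: shape functions built as products of a hat-function partition of unity and polynomial enrichments. It is a hedged illustration only; the paper enriches the separate stress and displacement fields of the hybrid-mixed formulation, which is not reproduced here, and the function names are made up for the example.

```python
import numpy as np

def hat(x, nodes, i):
    """Standard 1D hat function N_i at node i: a partition of unity on the mesh."""
    xi = nodes[i]
    y = np.zeros_like(x)
    if i > 0:
        left = nodes[i - 1]
        m = (x >= left) & (x <= xi)
        y[m] = (x[m] - left) / (xi - left)
    if i < len(nodes) - 1:
        right = nodes[i + 1]
        m = (x >= xi) & (x <= right)
        y[m] = (right - x[m]) / (right - xi)
    return y

def gfem_shape_functions(x, nodes, degree=2):
    """Generic GFEM enrichment: products N_i(x) * ((x - x_i)/h)^j, j = 0..degree.
    Illustrative 1D sketch only; it does not reproduce the stress/displacement
    enrichment of the hybrid-mixed formulation."""
    h = nodes[1] - nodes[0]
    phis = []
    for i in range(len(nodes)):
        Ni = hat(x, nodes, i)
        for j in range(degree + 1):
            phis.append(Ni * ((x - nodes[i]) / h) ** j)
    return np.array(phis)

x = np.linspace(0.0, 1.0, 101)
phis = gfem_shape_functions(x, np.linspace(0.0, 1.0, 5))
```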
Abstract:
Lattice calculations of the QCD trace anomaly at temperatures T < 160 MeV have been shown to match hadron resonance gas model calculations, which include an exponentially rising hadron mass spectrum. In this paper we perform a more detailed comparison of the model calculations to lattice data that confirms the need for an exponentially increasing density of hadronic states. Also, we find that the lattice data is compatible with a hadron density of states that goes as ρ(m) ~ m^(-a) exp(m/T_H) at large m, with a > 5/2 (where T_H ~ 167 MeV). With this specific subleading contribution to the density of states, heavy resonances are most likely to undergo two-body decay (instead of multiparticle decay), which facilitates their inclusion into hadron transport codes. Moreover, estimates for the shear viscosity and the shear relaxation time coefficient of the hadron resonance model computed within the excluded volume approximation suggest that these transport coefficients are sensitive to the parameters that define the hadron mass spectrum.
Abstract:
Background: Thalamotomies and pallidotomies were commonly performed before the deep brain stimulation (DBS) era. Although ablative procedures can lead to significant dystonia improvement, longer periods of analysis reveal disease progression and functional deterioration. Today, the same patients seek additional treatment possibilities. Methods: Four patients with generalized dystonia who previously had undergone bilateral pallidotomy came to our service seeking additional treatment because of dystonic symptom progression. Bilateral subthalamic nucleus DBS (B-STN-DBS) was the treatment of choice. The patients were evaluated with the Burke-Fahn-Marsden Dystonia Rating Scale (BFMDRS) and the Unified Dystonia Rating Scale (UDRS) before and 2 years after surgery. Results: All patients showed significant functional improvement, averaging 65.3% in BFMDRS (P = .014) and 69.2% in UDRS (P = .025). Conclusions: These results suggest that B-STN-DBS may be an interesting treatment option for generalized dystonia, even for patients who have already undergone bilateral pallidotomy. (c) 2012 Movement Disorder Society
Abstract:
We study the von Neumann and Rényi entanglement entropy of long-range harmonic oscillators (LRHO) by both theoretical and numerical means. We show that the entanglement entropy in massless harmonic oscillators increases logarithmically with the sub-system size as S ~ (c_eff/3) log l. Although the entanglement entropy of LRHOs shares some similarities with the entanglement entropy at conformal critical points, we show that the Rényi entanglement entropy presents some deviations from the expected conformal behaviour. In the massive case we demonstrate that the behaviour of the entanglement entropy with respect to the correlation length is also logarithmic, as in the short-range case. Copyright (c) EPLA, 2012
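For context, the entanglement entropy of a block of coupled harmonic oscillators can be computed numerically from the ground-state correlation matrices, as sketched below for a short-range chain; long-range couplings, as studied in the paper, would only change the coupling matrix K. The chain length, small regulator mass and function name are assumptions of this illustration.

```python
import numpy as np

def harmonic_chain_entropy(N, ell, m2=1e-8):
    """Von Neumann entanglement entropy of the first `ell` sites of a harmonic
    chain of N coupled oscillators (nearest-neighbour couplings, fixed ends;
    the small mass m2 regulates the zero mode). Standard Gaussian-state method."""
    # H = p.p/2 + x.K.x/2 with tridiagonal coupling matrix K
    K = (np.diag((2.0 + m2) * np.ones(N))
         + np.diag(-np.ones(N - 1), 1) + np.diag(-np.ones(N - 1), -1))
    w, V = np.linalg.eigh(K)
    X = V @ np.diag(0.5 / np.sqrt(w)) @ V.T   # <x x^T> in the ground state
    P = V @ np.diag(0.5 * np.sqrt(w)) @ V.T   # <p p^T> in the ground state
    nu = np.sqrt(np.linalg.eigvals(X[:ell, :ell] @ P[:ell, :ell]).real)
    nu = nu[nu > 0.5 + 1e-12]                 # modes with nu = 1/2 contribute 0
    return float(np.sum((nu + 0.5) * np.log(nu + 0.5)
                        - (nu - 0.5) * np.log(nu - 0.5)))

print(harmonic_chain_entropy(100, 20))
```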
Abstract:
Fractal theory presents a large number of applications to image and signal analysis. Although the fractal dimension can be used as an image object descriptor, a multiscale approach, such as the multiscale fractal dimension (MFD), increases the amount of information extracted from an object. MFD provides a curve which describes object complexity along the scale. However, this curve presents much redundant information, which could be discarded without loss in performance. Thus, it is necessary to use a descriptor technique to analyze this curve and also to reduce the dimensionality of these data by selecting its meaningful descriptors. This paper shows a comparative study among different techniques for MFD descriptor generation. It compares the use of well-known and state-of-the-art descriptors, such as Fourier, Wavelet, Polynomial Approximation (PA), Functional Data Analysis (FDA), Principal Component Analysis (PCA), Symbolic Aggregate Approximation (SAX), kernel PCA, Independent Component Analysis (ICA), and geometrical and statistical features. The descriptors are evaluated in a classification experiment using Linear Discriminant Analysis over the descriptors computed from MFD curves from two data sets: generic shapes and rotated fish contours. Results indicate that PCA, FDA, PA and Wavelet Approximation provide the best MFD descriptors for recognition and classification tasks. (C) 2012 Elsevier B.V. All rights reserved.
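A minimal sketch of the pipeline described above follows: a multiscale fractal dimension curve is estimated from local slopes of a box-counting law and then reduced to a few descriptors with PCA. It is illustrative only; the paper's MFD may be based on a dilation (Bouligand-Minkowski) approach rather than box counting, and the random binary images stand in for real shape data.

```python
import numpy as np

def box_counts(img, sizes):
    """Box counting of a binary shape image at several box sizes."""
    counts = []
    for s in sizes:
        H = img.shape[0] // s * s
        W = img.shape[1] // s * s
        blocks = img[:H, :W].reshape(H // s, s, W // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
    return np.array(counts, dtype=float)

def mfd_curve(img, sizes):
    """Multiscale fractal dimension estimate: local slope of log N(s) vs log(1/s)."""
    logs = np.log(1.0 / np.asarray(sizes, dtype=float))
    logN = np.log(box_counts(img, sizes))
    return np.gradient(logN, logs)            # one MFD value per scale

def pca_descriptors(curves, k=4):
    """Project a set of MFD curves (rows) onto their first k principal components."""
    X = curves - curves.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:k].T

rng = np.random.default_rng(0)
shapes = rng.random((10, 128, 128)) > 0.7     # stand-in for binary shape images
sizes = [2, 4, 8, 16, 32]
curves = np.vstack([mfd_curve(s, sizes) for s in shapes])
descriptors = pca_descriptors(curves)
```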
Abstract:
Background: Prostate cancer is a serious public health problem that affects quality of life and has a significant mortality rate. The aim of the present study was to quantify the fractal dimension and Shannon’s entropy in the histological diagnosis of prostate cancer. Methods: Thirty-four patients with prostate cancer, aged 50 to 75 years, who had undergone radical prostatectomy participated in the study. Histological slides of normal (N), hyperplastic (H) and tumor (T) areas of the prostate were digitally photographed at three different magnifications (40x, 100x and 400x) and analyzed. The fractal dimension (FD), Shannon’s entropy (SE) and number of cell nuclei (NCN) in these areas were compared. Results: FD analysis demonstrated the following significant differences between groups: T vs. N and H vs. N (p < 0.05) at a magnification of 40x; T vs. N (p < 0.01) at 100x; and H vs. N (p < 0.01) at 400x. SE analysis revealed the following significant differences between groups: T vs. H and T vs. N (p < 0.05) at 100x; and T vs. H and T vs. N (p < 0.001) at 400x. NCN analysis demonstrated the following significant differences between groups: T vs. H and T vs. N (p < 0.05) at 40x; T vs. H and T vs. N (p < 0.0001) at 100x; and T vs. H and T vs. N (p < 0.01) at 400x. Conclusions: The quantification of FD and SE, together with the number of cell nuclei, has potential clinical applications in the histological diagnosis of prostate cancer.
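As a small illustration of one of the measures above, the sketch below computes Shannon’s entropy of a grayscale image’s intensity histogram; the paper’s exact preprocessing of the histology photographs (colour handling, nucleus segmentation, fractal-dimension estimation) is not reproduced, and the random image is a stand-in.

```python
import numpy as np

def image_shannon_entropy(gray, bins=256):
    """Shannon entropy (bits) of a grayscale image's intensity histogram."""
    hist, _ = np.histogram(gray, bins=bins, range=(0, 256))
    p = hist[hist > 0] / hist.sum()
    return float(-np.sum(p * np.log2(p)))

gray = np.random.default_rng(0).integers(0, 256, size=(512, 512))
print(image_shannon_entropy(gray))
```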
Abstract:
Background: The generalized odds ratio (GOR) was recently suggested as a genetic model-free measure for association studies. However, its properties have not been extensively investigated. We used Monte Carlo simulations to investigate type-I error rates, power and bias in both effect size and between-study variance estimates of meta-analyses using the GOR as a summary effect, and compared these results to those obtained by the usual approaches of model specification. We further applied the GOR in a real meta-analysis of three genome-wide association studies in Alzheimer's disease. Findings: For bi-allelic polymorphisms, the GOR performs virtually identically to a standard multiplicative model of analysis (e.g. per-allele odds ratio) for variants acting multiplicatively, but slightly augments the power to detect variants with a dominant mode of action, while reducing the probability of detecting recessive variants. Although there were differences between the GOR and the usual approaches in terms of bias and type-I error rates, both the simulation- and real data-based results provided little indication that these differences will be substantial in practice for meta-analyses involving bi-allelic polymorphisms. However, the use of the GOR may be slightly more powerful for the synthesis of data from tri-allelic variants, particularly when susceptibility alleles are less common in the populations (≤10%). This gain in power may depend on knowledge of the direction of the effects. Conclusions: For the synthesis of data from bi-allelic variants, the GOR may be regarded as a multiplicative-like model of analysis. The use of the GOR may be slightly more powerful in the tri-allelic case, particularly when susceptibility alleles are less common in the populations.
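The sketch below computes a generalized odds ratio from case and control genotype counts ordered by mutational load, using one common formulation (the odds that a randomly drawn case carries a higher load than a randomly drawn control versus the reverse); the exact estimator and the example counts are assumptions and may differ from those used in the paper.

```python
import numpy as np

def generalized_odds_ratio(cases, controls):
    """Generalized odds ratio from genotype counts ordered by mutational load
    (e.g. [aa, Aa, AA]): probability that a case has a higher load than a
    control, divided by the probability of the reverse. One common
    formulation; it may differ in detail from the paper's estimator."""
    cases = np.asarray(cases, dtype=float)
    controls = np.asarray(controls, dtype=float)
    higher = sum(cases[i] * controls[j]
                 for i in range(len(cases)) for j in range(len(controls)) if i > j)
    lower = sum(cases[i] * controls[j]
                for i in range(len(cases)) for j in range(len(controls)) if i < j)
    return higher / lower

# hypothetical genotype counts for a bi-allelic variant: [aa, Aa, AA]
print(generalized_odds_ratio([120, 240, 140], [180, 220, 100]))
```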