921 results for Entropy of Tsallis


Relevance:

30.00%

Publisher:

Abstract:

Using only linear interactions and a local parity measurement, we show how entanglement can be detected between two harmonic oscillators. The scheme generalizes to measure both linear and nonlinear functionals of an arbitrary oscillator state. This leads to many applications including purity tests, eigenvalue estimation, entropy, and distance measures, all without the need for nonlinear interactions or complete state reconstruction. Remarkably, experimental realization of the proposed scheme is already within the reach of current technology with linear optics.
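For context (this identity is standard in the continuous-variable literature and is not spelled out in the abstract, so its use here is an assumption), schemes of this kind typically exploit the fact that the overlap of two single-mode states equals the expectation value of a swap operator, which for bosonic modes is the photon-number parity of the antisymmetric mode produced by a balanced beam splitter:

    \mathrm{Tr}[\rho_1\rho_2] = \mathrm{Tr}\big[(\rho_1\otimes\rho_2)\,\hat V\big],
    \qquad \hat V = e^{i\pi\hat c^{\dagger}\hat c},
    \qquad \hat c = \tfrac{1}{\sqrt{2}}(\hat a - \hat b).

With \rho_1=\rho_2=\rho this gives the purity \mathrm{Tr}[\rho^2]; higher moments obtained the same way feed eigenvalue, entropy, and distance estimates.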

Relevance:

30.00%

Publisher:

Abstract:

We use series expansion methods to calculate the dispersion relation of the one-magnon excitations for the spin-1/2 triangular-lattice nearest-neighbor Heisenberg antiferromagnet above a three-sublattice ordered ground state. Several striking features are observed compared to the classical (large-S) spin-wave spectra. Whereas at low energies the dispersion is only weakly renormalized by quantum fluctuations, significant anomalies are observed at high energies. In particular, we find roton-like minima at special wave vectors and strong downward renormalization in large parts of the Brillouin zone, leading to very flat or dispersionless modes. We present a detailed comparison of our calculated excitation energies in the Brillouin zone with the spin-wave dispersion to order 1/S calculated recently by Starykh, Chubukov, and Abanov [Phys. Rev. B 74, 180403(R) (2006)]. We find many common features but also some quantitative and qualitative differences. We show that at temperatures as low as 0.1J the thermally excited rotons make a significant contribution to the entropy. Consequently, unlike for the square-lattice model, a nonlinear sigma model description of the finite-temperature properties is only applicable at temperatures below 0.1J. Finally, we review recent NMR measurements on the organic compound κ-(BEDT-TTF)2Cu2(CN)3. We argue that these are inconsistent with long-range order and with a description of the low-energy excitations in terms of interacting magnons, and that therefore a Heisenberg model with only nearest-neighbor exchange does not offer an adequate description of this material.
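For reference (standard linear spin-wave theory for the 120-degree ordered state, quoted here from the general literature rather than from the abstract), the classical dispersion that such series results are compared against is

    \omega_{\mathbf k} = 3JS\sqrt{(1-\gamma_{\mathbf k})(1+2\gamma_{\mathbf k})},
    \qquad \gamma_{\mathbf k} = \tfrac{1}{3}\Big[\cos k_x + 2\cos\tfrac{k_x}{2}\cos\tfrac{\sqrt{3}k_y}{2}\Big],

with the lattice constant set to one; the quantum corrections discussed above renormalize this form, most strongly near the roton-like minima.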

Relevance:

30.00%

Publisher:

Abstract:

We consider a problem of robust performance analysis of linear discrete time-varying systems on a bounded time interval. The system is represented in state-space form. It is driven by a random input disturbance with imprecisely known probability distribution; this distributional uncertainty is described in terms of entropy. The worst-case performance of the system is quantified by its a-anisotropic norm. Computing the anisotropic norm is reduced to solving a set of difference Riccati and Lyapunov equations together with an equation of a special form.
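For orientation (the definition is not restated in the abstract, and the notation below is an assumption), the a-anisotropic norm is usually defined as the worst-case gain over stationary Gaussian disturbances whose mean anisotropy, an entropy-based measure of deviation from white noise, does not exceed a:

    \|F\|_a = \sup_{W:\ \overline{\mathbf A}(W)\le a} \frac{\|FW\|_{\mathcal P}}{\|W\|_{\mathcal P}},

where \|\cdot\|_{\mathcal P} is the power norm and \overline{\mathbf A}(W) the mean anisotropy of the shaping filter W; the norm interpolates between an H_2-type norm at a = 0 and the H_\infty norm as a grows.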

Relevance:

30.00%

Publisher:

Abstract:

In this paper we propose a fast adaptive importance sampling method for the efficient simulation of buffer overflow probabilities in queueing networks. The method comprises three stages. First, we estimate the minimum cross-entropy tilting parameter for a small buffer level; next, we use this as a starting value for estimating the optimal tilting parameter for the actual (large) buffer level; finally, the tilting parameter just found is used to estimate the overflow probability of interest. We identify three distinct properties of the method which together explain why it works well; we conjecture that they hold for quite general queueing networks. Numerical results support this conjecture and demonstrate the high efficiency of the proposed algorithm.
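The three-stage procedure can be illustrated on a toy rare-event problem. The sketch below (a minimal illustration under assumed settings, not the paper's queueing-network model) estimates P(S_n >= gamma) for a sum of exponentials: a cross-entropy update of an exponential tilting parameter is run at a small level, reused as the starting value at the large level, and the resulting tilt drives the final importance-sampling estimate.

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate(v, n, reps):
        # reps replications of n i.i.d. Exp(mean=v) variables
        return rng.exponential(v, size=(reps, n))

    def likelihood_ratio(x, v):
        # density ratio of the nominal Exp(1) law to the tilted Exp(mean=v) law
        return np.exp(np.sum(-x + x / v, axis=1)) * v ** x.shape[1]

    def ce_tilt(v, n, gamma, reps=10_000):
        # one cross-entropy update of the tilting parameter for level gamma
        x = simulate(v, n, reps)
        hit = x.sum(axis=1) >= gamma
        w = likelihood_ratio(x, v) * hit
        return v if w.sum() == 0 else float((w * x.mean(axis=1)).sum() / w.sum())

    n, gamma_small, gamma_large = 5, 8.0, 20.0
    v = ce_tilt(1.0, n, gamma_small)              # stage 1: small (easy) level
    v = ce_tilt(v, n, gamma_large, reps=50_000)   # stage 2: actual (large) level
    x = simulate(v, n, 100_000)                   # stage 3: final estimate
    est = np.mean(likelihood_ratio(x, v) * (x.sum(axis=1) >= gamma_large))
    print(f"tilt = {v:.3f}, estimated overflow probability = {est:.3e}")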

Relevance:

30.00%

Publisher:

Abstract:

Two probabilistic interpretations of the n-tuple recognition method are put forward in order to allow this technique to be analysed with the same Bayesian methods used in connection with other neural network models. Elementary demonstrations are then given of the use of maximum likelihood and maximum entropy methods for tuning the model parameters and assisting their interpretation. One of the models can be used to illustrate the significance of overlapping n-tuple samples with respect to correlations in the patterns.
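One way to give the method a probabilistic reading (an illustrative sketch only; it is not claimed to be either of the two interpretations put forward in the paper) is to treat each n-tuple as a class-conditional categorical distribution estimated by maximum likelihood with Laplace smoothing, combined naive-Bayes style. The sketch assumes binary input patterns.

    import numpy as np

    rng = np.random.default_rng(1)

    class NTupleML:
        # n-tuple recogniser with a maximum-likelihood (smoothed count) reading
        def __init__(self, n_bits, n_classes, n_tuples=20, tuple_size=4, alpha=1.0):
            self.tuples = [rng.choice(n_bits, tuple_size, replace=False)
                           for _ in range(n_tuples)]
            self.alpha = alpha
            self.counts = np.zeros((n_classes, n_tuples, 2 ** tuple_size))

        def _addresses(self, X):
            # map each binary pattern to one integer address per tuple
            powers = 2 ** np.arange(len(self.tuples[0]))
            return np.stack([(X[:, idx] @ powers).astype(int)
                             for idx in self.tuples], axis=1)

        def fit(self, X, y):
            for a, c in zip(self._addresses(X), y):
                self.counts[c, np.arange(len(self.tuples)), a] += 1
            return self

        def predict(self, X):
            A = self._addresses(X)
            seen = self.counts.sum(axis=2, keepdims=True)
            probs = (self.counts + self.alpha) / (seen + self.alpha * self.counts.shape[2])
            t = np.arange(len(self.tuples))
            scores = [np.log(probs[c, t, A]).sum(axis=1)
                      for c in range(self.counts.shape[0])]
            return np.argmax(scores, axis=0)

Overlapping n-tuple samples show up here as correlated addresses across tuples, which the naive product of per-tuple likelihoods ignores; this is one way to see why sample overlap and pattern correlations matter, as the abstract notes.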

Relevance:

30.00%

Publisher:

Abstract:

A formalism for modelling the dynamics of Genetic Algorithms (GAs) using methods from statistical mechanics, originally due to Prugel-Bennett and Shapiro, is reviewed, generalized and improved upon. This formalism can be used to predict the averaged trajectory of macroscopic statistics describing the GA's population. These macroscopics are chosen to average well between runs, so that fluctuations from mean behaviour can often be neglected. Where necessary, non-trivial terms are determined by assuming maximum entropy with constraints on known macroscopics. Problems of realistic size are described in compact form and finite population effects are included, often proving to be of fundamental importance. The macroscopics used here are cumulants of an appropriate quantity within the population and the mean correlation (Hamming distance) within the population. Including the correlation as an explicit macroscopic provides a significant improvement over the original formulation. The formalism is applied to a number of simple optimization problems in order to determine its predictive power and to gain insight into GA dynamics. Problems which are most amenable to analysis come from the class where alleles within the genotype contribute additively to the phenotype. This class can be treated with some generality, including problems with inhomogeneous contributions from each site, non-linear or noisy fitness measures, simple diploid representations and temporally varying fitness. The results can also be applied to a simple learning problem, generalization in a binary perceptron, and a limit is identified for which the optimal training batch size can be determined for this problem. The theory is compared to averaged results from a real GA in each case, showing excellent agreement if the maximum entropy principle holds. Some situations where this approximation breaks down are identified. In order to fully test the formalism, an attempt is made on the strongly NP-hard problem of storing random patterns in a binary perceptron. Here, the relationship between the genotype and phenotype (training error) is strongly non-linear. Mutation is modelled under the assumption that perceptron configurations are typical of perceptrons with a given training error. Unfortunately, this assumption does not provide a good approximation in general. It is conjectured that perceptron configurations would have to be constrained by other statistics in order to accurately model mutation for this problem. Issues arising from this study are discussed in conclusion and some possible areas of further research are outlined.
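As a concrete illustration of the macroscopics mentioned above (a minimal measurement sketch, assuming a binary-encoded population and an arbitrary fitness callable; it does not implement the statistical-mechanics update equations themselves), the fitness cumulants and the mean correlation within a population can be computed as follows.

    import numpy as np

    def macroscopics(pop, fitness):
        # pop: (P, L) array of 0/1 genotypes; fitness: callable on the population
        f = fitness(pop)
        k1 = f.mean()                        # first cumulant: mean fitness
        k2 = f.var()                         # second cumulant: variance
        k3 = ((f - k1) ** 3).mean()          # third cumulant (third central moment)
        s = 2 * pop - 1                      # map {0, 1} to {-1, +1}
        overlap = (s @ s.T) / pop.shape[1]   # pairwise overlaps in [-1, 1]
        iu = np.triu_indices(len(pop), k=1)
        corr = overlap[iu].mean()            # mean correlation between distinct members
        return k1, k2, k3, corr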

Relevance:

30.00%

Publisher:

Abstract:

We present a novel method for predicting the onset of a spontaneous (paroxysmal) atrial fibrillation episode by representing the electrocardiograph (ECG) output as two time series corresponding to the interbeat intervals and the lengths of the atrial component of the ECG. We then show how different entropy measures can be calculated from both of these series and combined in a neural network, trained using the Bayesian evidence procedure, to form an effective predictive classifier.
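A minimal sketch of the feature-construction step is given below (the specific entropy measures and the Bayesian-evidence training of the network are not reproduced; the histogram entropy here is just one possible choice): one entropy value per derived ECG series is computed and stacked into a classifier input vector.

    import numpy as np

    def shannon_entropy(x, bins=16):
        # histogram-based Shannon entropy of a 1-D series
        counts, _ = np.histogram(x, bins=bins)
        p = counts[counts > 0] / counts.sum()
        return float(-(p * np.log(p)).sum())

    def features(rr_intervals, atrial_lengths):
        # one entropy per series; the paper combines several such measures
        return np.array([shannon_entropy(rr_intervals),
                         shannon_entropy(atrial_lengths)])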

Relevance:

30.00%

Publisher:

Abstract:

Using techniques from statistical physics, the annealed VC entropy for hyperplanes in high-dimensional spaces is calculated as a function of the margin for a spherical Gaussian distribution of inputs.
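For reference (standard statistical-learning notation, not quoted from the abstract), the annealed VC entropy of a hypothesis class \Lambda on \ell inputs is

    H_{\mathrm{ann}}^{\Lambda}(\ell) = \ln \mathbb{E}_{x_1,\dots,x_\ell}\, N^{\Lambda}(x_1,\dots,x_\ell),

where N^{\Lambda}(x_1,\dots,x_\ell) counts the labellings the class realizes on the sample; the calculation described above evaluates this average for margin-constrained hyperplanes under a spherical Gaussian input distribution.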

Relevance:

30.00%

Publisher:

Abstract:

The concept of entropy rate is well defined in dynamical systems theory, but it cannot be applied directly to finite real-world data sets. With this in mind, Pincus developed Approximate Entropy (ApEn), which uses ideas from Eckmann and Ruelle to create a regularity measure based on entropy rate that can be used to determine the influence of chaotic behaviour in a real-world signal. However, this measure was found not to be robust, and so an improved formulation known as Sample Entropy (SampEn) was created by Richman and Moorman to address these issues. We have developed a new, related regularity measure which is not based on the theory provided by Eckmann and Ruelle and proves to be a better-behaved measure of complexity than the previous measures whilst still retaining a low computational cost.
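For concreteness, a minimal Sample Entropy implementation in the usual Richman and Moorman form is sketched below (the paper's new measure is not specified in the abstract, so only this baseline is shown); the tolerance r is taken as a fraction of the series' standard deviation.

    import numpy as np

    def sample_entropy(x, m=2, r=0.2):
        # SampEn = -ln(A/B): B counts pairs of length-m templates within
        # tolerance (Chebyshev distance), A the same pairs extended to m+1;
        # self-matches are excluded.
        x = np.asarray(x, dtype=float)
        tol = r * x.std()
        def count(mm):
            t = np.lib.stride_tricks.sliding_window_view(x, mm)[: len(x) - m]
            d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=2)
            return int((d[np.triu_indices(len(t), k=1)] <= tol).sum())
        B, A = count(m), count(m + 1)
        return np.inf if A == 0 or B == 0 else -np.log(A / B)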

Relevance:

30.00%

Publisher:

Abstract:

This paper explores a new method of analysing muscle fatigue within the muscles predominantly used during microsurgery. The electromyographic (EMG) data captured from these muscles are analysed for any defining patterns relating to muscle fatigue. The analysis consists of dynamically embedding the EMG signals from a single muscle channel into an embedded matrix. Muscle fatigue is then quantified by an entropy defined on the singular values of the dynamically embedded (DE) matrix. The paper compares this new method with the traditional method of using mean frequency shifts in the EMG signal's power spectral density. Linear regressions are fitted to the results from both methods, and the coefficients of variation of both their slope and point of intercept are determined. It is shown that the complexity method is slightly more robust, in that the coefficient of variation for the DE method is lower than that of the conventional mean frequency analysis.
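A minimal sketch of the delay (dynamical) embedding and an entropy defined on the singular-value spectrum of the resulting matrix is given below; the window length, delay, and exact entropy definition used in the paper are assumptions here.

    import numpy as np

    def sv_entropy(emg, window=30, delay=1):
        # build the dynamically embedded matrix: rows are lagged windows of the signal
        span = (window - 1) * delay
        X = np.stack([emg[i : i + span + 1 : delay]
                      for i in range(len(emg) - span)])
        s = np.linalg.svd(X, compute_uv=False)     # singular values
        p = s / s.sum()
        p = p[p > 0]
        return float(-(p * np.log(p)).sum() / np.log(len(s)))  # normalised entropy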

Relevance:

30.00%

Publisher:

Abstract:

The preparation and characterization of two new neutral ferric complexes with a desolvation-induced discontinuous spin-state transformation above room temperature are reported. The compounds, Fe(Hthpy)(thpy)·CH3OH·3H2O (1) and Fe(Hmthpy)(mthpy)·2H2O (2), are low-spin (LS) at room temperature and below, whereas their nonsolvated forms are high-spin (HS), exhibiting zero-field splitting. In these complexes, Hthpy, Hmthpy and thpy, mthpy are the singly and doubly deprotonated forms of pyridoxal thiosemicarbazone and pyridoxal methylthiosemicarbazone, respectively; each is an O,N,S-tridentate ligand. The molecular structures were determined at 100(1) K using single-crystal X-ray diffraction and correspond to a triclinic system (space group P1) for 1 and a monoclinic unit cell (space group P21/c) for 2. The structures were refined to final error indices RF = 0.0560 for 1 and RF = 0.0522 for 2. The chemical inequivalence of the ligands was clearly established, as the "extra" hydrogen atom on the monodeprotonated ligands (Hthpy, Hmthpy) was found to be bound to the nitrogen of the pyridine ring. The ligands are all in the thiol form; the doubly deprotonated chelates (thpy, mthpy) have C-S bond lengths slightly longer than those of the singly deprotonated forms. There is a three-dimensional network of hydrogen bonds in both compounds. The discontinuous spin-state transformation is accompanied by the liberation of solvate molecules, as also evidenced by DSC analysis. Heat capacity data for the LS and HS phases are tabulated at selected temperatures, and the enthalpy and entropy changes associated with the change of spin state were determined to be ΔH = 12.5 ± 0.3 kJ mol-1 and ΔS = 33.3 ± 0.8 J mol-1 K-1 for 1, and ΔH = 6.5 ± 0.3 kJ mol-1 and ΔS = 17.6 ± 0.8 J mol-1 K-1 for 2.
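As a rough consistency check inferred from the reported values (not a figure quoted in the abstract, and only indicative since the transformation is desolvation-driven rather than a simple thermal spin-crossover equilibrium), the temperature implied by \Delta H/\Delta S lies above room temperature for both compounds:

    T \approx \Delta H/\Delta S \approx 12.5\times10^{3}/33.3 \approx 375\ \mathrm{K}\ \text{for}\ \mathbf{1},
    \qquad 6.5\times10^{3}/17.6 \approx 369\ \mathrm{K}\ \text{for}\ \mathbf{2}.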

Relevance:

30.00%

Publisher:

Abstract:

WiMAX has been introduced as a competitive alternative for metropolitan broadband wireless access technologies. It is connection-oriented and can provide very high data rates, large service coverage, and flexible quality of service (QoS). Due to the large number of connections and flexible QoS supported by WiMAX, uplink access in WiMAX networks is very challenging, since the medium access control (MAC) protocol must efficiently manage the bandwidth and related channel allocations. In this paper, we propose and investigate a cost-effective WiMAX bandwidth management scheme, named the WiMAX partial sharing scheme (WPSS), in order to provide good QoS while achieving better bandwidth utilization and network throughput. The proposed bandwidth management scheme is compared with a simple but inefficient scheme, named the WiMAX complete sharing scheme (WCPS). A maximum entropy (ME) based analytical model (MEAM) is proposed for the performance evaluation of the two bandwidth management schemes. The reason for using MEAM for the performance evaluation is that it can efficiently model a large-scale system in which the number of stations or connections is generally very high, while traditional simulation and analytical approaches (e.g., Markov models) do not perform well due to their high computational complexity. We model the bandwidth management scheme as a queueing network model (QNM) that consists of interacting multiclass queues for different service classes. Closed-form expressions for the state and blocking probability distributions are derived for these schemes. Simulation results verify the MEAM numerical results and show that WPSS can significantly improve the network's performance compared to WCPS.
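As a toy illustration of the maximum-entropy principle that such analytical models rest on (this is not the paper's MEAM; the state space and the single moment constraint are invented for the example), the ME distribution over queue occupancies subject to a prescribed mean can be computed numerically.

    import numpy as np
    from scipy.optimize import minimize

    def max_entropy_dist(n_states, mean_occupancy):
        # maximise Shannon entropy subject to normalisation and a mean constraint
        k = np.arange(n_states)
        def neg_entropy(p):
            p = np.clip(p, 1e-12, None)
            return float((p * np.log(p)).sum())
        cons = [{"type": "eq", "fun": lambda p: p.sum() - 1.0},
                {"type": "eq", "fun": lambda p: p @ k - mean_occupancy}]
        p0 = np.full(n_states, 1.0 / n_states)
        res = minimize(neg_entropy, p0, bounds=[(0, 1)] * n_states, constraints=cons)
        return res.x

    p = max_entropy_dist(16, mean_occupancy=3.0)   # geometric-like ME solution

The solution is geometric-like, which is the kind of closed form that ME queueing analyses typically exploit when deriving state and blocking probabilities.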

Relevance:

30.00%

Publisher:

Abstract:

Web APIs have gained increasing popularity in recent Web service technology development owing to the simplicity of their technology stack and the proliferation of mashups. However, efficiently discovering Web APIs and their documentation on the Web is still a challenging task even with the best resources available on the Web. In this paper we cast the problem of detecting Web API documentation as a text classification problem of deciding whether a given Web page is Web API related or not. We propose a supervised generative topic model called feature latent Dirichlet allocation (feaLDA), which offers a generic probabilistic framework for automatic detection of Web APIs. feaLDA not only captures the correspondence between data and the associated class labels, but also provides a mechanism for incorporating side information such as labelled features automatically learned from the data, which can effectively help improve classification performance. Extensive experiments on our Web API documentation dataset show that the feaLDA model outperforms three strong supervised baselines, namely naive Bayes, support vector machines, and the maximum entropy model, by over 3% in classification accuracy. In addition, feaLDA also gives superior performance when compared against other existing supervised topic models.
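A minimal sketch of the maximum-entropy baseline mentioned above (realised here as multinomial logistic regression over TF-IDF features via scikit-learn; the documents and labels are invented placeholders, not the paper's dataset):

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    docs = ["GET /v1/users returns JSON; authenticate with an API key",
            "Our team enjoyed the company picnic last Friday",
            "POST /orders accepts a JSON body and returns HTTP 201",
            "Recipe: mix flour, sugar and eggs, then bake for 30 minutes"]
    labels = [1, 0, 1, 0]   # 1 = Web API documentation page

    maxent = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    maxent.fit(docs, labels)
    print(maxent.predict(["The /v2/payments endpoint returns an XML response"]))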

Relevance:

30.00%

Publisher:

Abstract:

We present an assessment of the practical value of existing traditional and non-standard measures for discriminating healthy people from people with Parkinson's disease (PD) by detecting dysphonia. We introduce a new measure of dysphonia, Pitch Period Entropy (PPE), which is robust to many uncontrollable confounding effects including noisy acoustic environments and normal, healthy variations in voice frequency. We collected sustained phonations from 31 people, 23 with PD. We then selected 10 highly uncorrelated measures, and an exhaustive search of all possible combinations of these measures finds four that in combination lead to overall correct classification performance of 91.4%, using a kernel support vector machine. In conclusion, we find that non-standard methods in combination with traditional harmonics-to-noise ratios are best able to separate healthy from PD subjects. The selected non-standard methods are robust to many uncontrollable variations in acoustic environment and individual subjects, and are thus well-suited to telemonitoring applications.
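The combination search can be sketched generically as follows (an illustration only; the actual feature set, validation protocol, and SVM settings of the study are not reproduced): every k-feature subset is scored with a cross-validated kernel SVM and the best-scoring subset is returned.

    from itertools import combinations
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def best_subset(X, y, k=4, cv=5):
        # exhaustive search over all k-feature subsets of the columns of X
        best_idx, best_acc = None, 0.0
        for idx in combinations(range(X.shape[1]), k):
            clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
            acc = cross_val_score(clf, X[:, list(idx)], y, cv=cv).mean()
            if acc > best_acc:
                best_idx, best_acc = idx, acc
        return best_idx, best_acc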

Relevance:

30.00%

Publisher:

Abstract:

The implementation of Enterprise Resource Planning (ERP) systems requires large investments, yet ineffective implementations of such projects are commonly observed. A considerable number of these projects have been reported to fail or to take longer than initially planned, and previous studies show that attempts at rapid implementation have often been unsuccessful and that failure to meet the fundamental goals of these projects has imposed substantial costs on investors. Some of the major consequences are a reduction in demand for such products and increased skepticism among managers and investors of ERP systems. In this regard, it is important to understand the factors determining the success or failure of ERP implementation. The aim of this paper is to study the critical success factors (CSFs) in implementing ERP systems and to develop a conceptual model which can serve as a basis for ERP project managers. These critical success factors, called “core critical success factors”, are extracted from 62 published papers using content analysis and the entropy method. The proposed conceptual model has been verified in the context of five multinational companies.
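The entropy method named here is commonly the Shannon-entropy weighting procedure from multi-criteria analysis; a generic sketch follows (the layout of the decision matrix is an assumption, not the paper's actual data).

    import numpy as np

    def entropy_weights(D):
        # D: rows = sources (papers), columns = candidate success factors,
        # entries = non-negative scores (e.g. mention frequencies from content analysis)
        P = D / D.sum(axis=0, keepdims=True)              # column-wise proportions
        logP = np.log(np.where(P > 0, P, 1.0))            # log(1) = 0 handles empty cells
        E = -(P * logP).sum(axis=0) / np.log(D.shape[0])  # entropy per factor, in [0, 1]
        d = 1.0 - E                                       # degree of diversification
        return d / d.sum()                                # normalised importance weights

Factors with lower entropy (scores concentrated in fewer sources) receive larger weights; ranking by such weights is one way to single out the “core” factors.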