924 results for ENTROPY
Abstract:
Background and Purpose-Functional MRI is a powerful tool to investigate recovery of brain function in patients with stroke. An inherent assumption in functional MRI data analysis is that the blood oxygenation level-dependent (BOLD) signal is stable over the course of the examination. In this study, we evaluated the validity of this assumption in patients with chronic stroke. Methods-Fifteen patients performed a simple motor task with repeated epochs using the paretic and the unaffected hand in separate runs. The corresponding BOLD signal time courses were extracted from the primary and supplementary motor areas of both hemispheres. Statistical maps were obtained by the conventional General Linear Model and by a parametric General Linear Model. Results-Stable BOLD amplitude was observed when the task was executed with the unaffected hand. Conversely, the BOLD signal amplitude in both primary and supplementary motor areas was progressively attenuated in every patient when the task was executed with the paretic hand. The conventional General Linear Model analysis failed to detect brain activation during movement of the paretic hand. However, the proposed parametric General Linear Model corrected the misdetection problem and showed robust activation in both primary and supplementary motor areas. Conclusions-The use of data analysis tools that are built on the premise of a stable BOLD signal may lead to misdetection of functional regions and underestimation of brain activity in patients with stroke. The present data urge the use of caution when relying on the BOLD response as a marker of brain reorganization in patients with stroke. (Stroke. 2010; 41:1921-1926.)
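The abstract does not spell out the parametric General Linear Model; as a rough, hypothetical illustration of the idea, the numpy sketch below augments a standard boxcar regressor with a time-modulated copy, so that a run-long attenuation of the BOLD amplitude loads onto the second regressor rather than going undetected. All names and parameter values are invented for the example.

```python
# Hypothetical sketch of a parametric GLM for a declining BOLD response.
# Assumptions (not from the paper): a boxcar task regressor plus a
# linearly time-modulated copy capture progressive amplitude attenuation.
import numpy as np

n_scans, epoch = 120, 10
boxcar = np.tile(np.r_[np.ones(epoch), np.zeros(epoch)], n_scans // (2 * epoch))
trend = np.linspace(0.0, 1.0, n_scans)             # run-long modulation
X = np.column_stack([boxcar,                        # conventional regressor
                     boxcar * trend,                # parametric (attenuation) term
                     np.ones(n_scans)])             # baseline

# Simulated voxel: task response whose amplitude decays over the run.
y = boxcar * (1.5 - 1.2 * trend) + 0.3 * np.random.randn(n_scans)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # beta[1] < 0 flags the progressive attenuation
```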
Abstract:
The anisotropic norm of a linear discrete-time-invariant system measures system output sensitivity to stationary Gaussian input disturbances of bounded mean anisotropy. Mean anisotropy characterizes the degree of predictability (or colouredness) and spatial non-roundness of the noise. The anisotropic norm falls between the H-2 and H-infinity norms and accommodates their loss of performance when the probability structure of input disturbances is not exactly known. This paper develops a method for numerical computation of the anisotropic norm which involves linked Riccati and Lyapunov equations and an associated equation of a special type.
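For background, two standard limiting cases from the anisotropy-based control literature (well known, though not stated in this abstract) show how the a-anisotropic norm of a system F with m inputs interpolates between the H-2 and H-infinity norms:

```latex
% Limiting cases of the a-anisotropic norm (standard results, not from the abstract):
\[
  \|F\|_{a=0} \;=\; \frac{\|F\|_{2}}{\sqrt{m}},
  \qquad
  \lim_{a \to \infty} \|F\|_{a} \;=\; \|F\|_{\infty},
\]
% so small anisotropy bounds recover (scaled) H-2 performance, while large
% bounds recover the worst-case H-infinity gain.
```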
Abstract:
Classical dynamics is formulated as a Hamiltonian flow in phase space, while quantum mechanics is formulated as unitary dynamics in Hilbert space. These different formulations have made it difficult to directly compare quantum and classical nonlinear dynamics. Previous solutions have focused on computing quantities associated with a statistical ensemble, such as variance or entropy. However, a more direct comparison would compare classical predictions to the quantum predictions for continuous simultaneous measurement of position and momentum of a single system. In this paper we give a theory of such measurement and show that chaotic behavior in classical systems can be reproduced by continuously measured quantum systems.
Abstract:
Motivated by the application of twisted current algebras in describing the entropy of the AdS_3 black hole, we investigate the simplest twisted current algebra sl(3, C)_k^{(2)}. A free field representation of the twisted algebra and the corresponding twisted Sugawara energy-momentum tensor are obtained by using three (beta, gamma) pairs and two scalar fields. Primary fields and two screening currents of the first kind are presented. (C) 2001 Published by Elsevier Science B.V.
Abstract:
Seven hundred and nineteen samples from throughout the Cainozoic section in CRP-3 were analysed by a Malvern Mastersizer laser particle analyser, in order to derive a stratigraphic distribution of grain-size parameters downhole. Entropy analysis of these data (using the method of Woolfe and Michibayashi, 1995) allowed recognition of four groups of samples, each group characterised by a distinctive grain-size distribution. Group 1, which shows a multi-modal distribution, corresponds to mudrocks, interbedded mudrock/sandstone facies, muddy sandstones and diamictites. Group 2, with a sand-grade mode but showing wide dispersion of particle size, corresponds to muddy sandstones, a few cleaner sandstones and some conglomerates. Group 3 and Group 4 are also sand-dominated, with better grain-size sorting, and correspond to clean, well-washed sandstones of varying mean grain-size (medium and fine modes, respectively). The downhole disappearance of Group 1, and the dominance of Groups 3 and 4, reflect a concomitant change from mudrock- and diamictite-rich lithology to a section dominated by clean, well-washed sandstones with minor conglomerates. Progressive downhole increases in percentage sand and principal mode also reflect these changes. Significant shifts in grain-size parameters and entropy group membership were noted across sequence boundaries and seismic reflectors, as recognised in other studies.
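The Woolfe and Michibayashi (1995) procedure is not reproduced here; as a hypothetical illustration, the sketch below computes the Shannon entropy of a binned grain-size distribution, the quantity on which such entropy groupings are typically based.

```python
# Hypothetical illustration: Shannon entropy of a binned grain-size
# distribution (the Woolfe & Michibayashi procedure itself is not shown).
import numpy as np

def grain_size_entropy(fractions):
    """Entropy (nats) of one sample's grain-size distribution.

    fractions: weight fractions per size bin, summing to ~1.
    A sharply unimodal (well-sorted) sandstone has low entropy;
    a multi-modal sample such as a diamictite has high entropy.
    """
    p = np.asarray(fractions, dtype=float)
    p = p / p.sum()
    p = p[p > 0]                      # convention: 0 * log 0 = 0
    return float(-(p * np.log(p)).sum())

well_sorted_sand = [0.02, 0.05, 0.85, 0.05, 0.03]
diamictite      = [0.22, 0.18, 0.20, 0.21, 0.19]
print(grain_size_entropy(well_sorted_sand))  # low
print(grain_size_entropy(diamictite))        # high, near log(5)
```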
Abstract:
A remarkable feature of quantum entanglement is that an entangled state of two parties, Alice (A) and Bob (B), may be more disordered locally than globally. That is, S(A) > S(A, B), where S(·) is the von Neumann entropy. It is known that satisfaction of this inequality implies that a state is nonseparable. In this paper we prove the stronger result that for separable states the vector of eigenvalues of the density matrix of system AB is majorized by the vector of eigenvalues of the density matrix of system A alone. This gives a strong sense in which a separable state is more disordered globally than locally and a new necessary condition for separability of bipartite states in arbitrary dimensions.
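As a hedged numerical illustration (not from the paper), the sketch below tests this majorization condition on the eigenvalue vectors of rho_AB and rho_A for two-qubit states: the separable maximally mixed state satisfies it, while a Bell state violates it.

```python
# Hypothetical check of the separability criterion lambda(rho_AB) < lambda(rho_A)
# in the majorization order.
import numpy as np

def majorized_by(x, y, tol=1e-12):
    """True if x is majorized by y (vectors padded with zeros to equal length)."""
    n = max(len(x), len(y))
    xs = np.sort(np.pad(x, (0, n - len(x))))[::-1]
    ys = np.sort(np.pad(y, (0, n - len(y))))[::-1]
    return bool(np.all(np.cumsum(xs) <= np.cumsum(ys) + tol))

def satisfies_criterion(rho_ab):
    rho_a = rho_ab.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)  # partial trace over B
    return majorized_by(np.linalg.eigvalsh(rho_ab), np.linalg.eigvalsh(rho_a))

bell = np.zeros((4, 4))
bell[0, 0] = bell[0, 3] = bell[3, 0] = bell[3, 3] = 0.5   # |Phi+><Phi+|
print(satisfies_criterion(np.eye(4) / 4))  # True: separable state passes
print(satisfies_criterion(bell))           # False: entangled Bell state fails
```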
Abstract:
We describe in detail the theory underpinning the measurement of density matrices of a pair of quantum two-level systems (qubits). Our particular emphasis is on qubits realized by the two polarization degrees of freedom of a pair of entangled photons generated in a down-conversion experiment; however, the discussion applies in general, regardless of the actual physical realization. Two techniques are discussed, namely, a tomographic reconstruction (in which the density matrix is linearly related to a set of measured quantities) and a maximum likelihood technique which requires numerical optimization (but has the advantage of producing density matrices that are always non-negative definite). In addition, a detailed error analysis is presented, allowing errors in quantities derived from the density matrix, such as the entropy or entanglement of formation, to be estimated. Examples based on down-conversion experiments are used to illustrate our results.
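For the single-qubit case, the linear tomographic reconstruction reduces to the textbook Stokes-parameter formula rho = (I + sum_i s_i sigma_i)/2; the sketch below is a minimal illustration of that idea (not the paper's two-qubit procedure).

```python
# Minimal single-qubit linear tomography from Stokes parameters
# (illustrative only; the paper treats the two-qubit polarization case).
import numpy as np

I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def reconstruct(s):
    """rho = (I + s_x*sx + s_y*sy + s_z*sz) / 2 from the Stokes vector s."""
    return 0.5 * (I2 + s[0] * sx + s[1] * sy + s[2] * sz)

# Ideal expectations for |+> = (|0> + |1>)/sqrt(2): s = (1, 0, 0).
rho = reconstruct([1.0, 0.0, 0.0])
print(np.round(rho, 3))          # [[0.5 0.5], [0.5 0.5]]
print(np.linalg.eigvalsh(rho))   # [0, 1] here; noisy data can give negative
                                 # eigenvalues, motivating the maximum
                                 # likelihood step described in the abstract
```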
Abstract:
The continuous parametric pumping of a lossy superconducting QED cavity supporting a field prepared initially as a superposition of coherent states is discussed. In contrast to classical pumping, we verify that the phase sensitivity of the parametric pumping makes the asymptotic behaviour of the cavity field state strongly dependent on the phase theta of the coherent state |alpha>, where alpha = |alpha| e^(i theta). Here we consider theta = pi/4 and theta = -pi/4, and we analyse the evolution of the purity of the superposition states with the help of the linear entropy and fidelity functions. We also analyse the decoherence process quantitatively through the Wigner function for both states, verifying that the decay is slightly modified when compared to the free decoherence case: for theta = -pi/4 the process is accelerated, while for theta = pi/4 it is delayed.
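For reference, the two purity/closeness measures mentioned here have the standard definitions (general background, not specific to this paper):

```latex
% Standard definitions of linear entropy and fidelity:
\[
  S_{\mathrm{lin}}(\rho) \;=\; 1 - \operatorname{Tr}\!\left(\rho^{2}\right),
  \qquad
  F(t) \;=\; \langle \psi(0) \,|\, \rho(t) \,|\, \psi(0) \rangle ,
\]
% S_lin vanishes for a pure state and grows as the superposition decoheres;
% F measures the overlap of the evolved state with the initial one.
```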
Abstract:
In this paper we study some purely mathematical considerations that arise in a paper of Cooper on the foundations of thermodynamics that was published in this journal. Connections with mathematical utility theory are studied and some errors in Cooper's paper are rectified. (C) 2001 Academic Press.
Abstract:
The flow field and the energy transport near thermoacoustic couples are simulated using a 2D full Navier-Stokes solver. The thermoacoustic couple plate is maintained at a constant temperature; plate lengths, which are short and long compared with the particle displacement lengths of the acoustic standing waves, are tested. Also investigated are the effects of plate spacing and the amplitude of the standing wave. Results are examined in the form of energy vectors, particle paths, and overall entropy generation rates. These show that a net heat-pumping effect appears only near the edges of thermoacoustic couple plates, within about a particle displacement distance from the ends. A heat-pumping effect can be seen even on the shortest plates tested when the plate spacing exceeds the thermal penetration depth. It is observed that energy dissipation near the plate increases quadratically as the plate spacing is reduced. The results also indicate that there may be a larger scale vortical motion outside the plates which disappears as the plate spacing is reduced. (C) 2002 Acoustical Society of America.
Abstract:
Recently, quantum tomography has been proposed as a fundamental tool for prototyping a few-qubit quantum device. It allows the complete reconstruction of the state produced from a given input into the device. From this reconstructed density matrix, relevant quantum information quantities such as the degree of entanglement and entropy can be calculated. Generally, orthogonal measurements have been discussed for this tomographic reconstruction. In this paper, we extend the tomographic reconstruction technique to two new regimes. First, we show how nonorthogonal measurements allow the reconstruction of the state of the system, provided the measurements span the Hilbert space. We then detail how quantum-state tomography can be performed for multiple qudits, with a specific example illustrating how to achieve this in one- and two-qutrit systems.
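As a hypothetical sketch of reconstruction from nonorthogonal measurements that span the space (not the paper's qudit construction), the example below inverts a symmetric tetrahedral qubit POVM by solving the linear system Tr(E_k rho) = p_k in a Hermitian operator basis.

```python
# Hypothetical sketch: linear-inversion tomography from nonorthogonal
# measurements, here a tetrahedral qubit POVM spanning the 2x2 Hermitian space.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2)

# Tetrahedron directions -> POVM elements E_k = (I + n_k . sigma) / 4.
ns = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)
povm = [(I2 + n[0] * sx + n[1] * sy + n[2] * sz) / 4 for n in ns]

rho_true = np.array([[0.7, 0.2 - 0.1j], [0.2 + 0.1j, 0.3]])
probs = np.array([np.trace(E @ rho_true).real for E in povm])

# Solve Tr(E_k rho) = p_k as a linear system in the basis {I, sx, sy, sz}/2.
basis = [I2 / 2, sx / 2, sy / 2, sz / 2]
A = np.array([[np.trace(E @ B).real for B in basis] for E in povm])
coeffs = np.linalg.solve(A, probs)
rho_est = sum(c * B for c, B in zip(coeffs, basis))
print(np.allclose(rho_est, rho_true))  # True for ideal (noise-free) data
```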
Abstract:
Sensitivity of output of a linear operator to its input can be quantified in various ways. In Control Theory, the input is usually interpreted as disturbance and the output is to be minimized in some sense. In stochastic worst-case design settings, the disturbance is considered random with imprecisely known probability distribution. The prior set of probability measures can be chosen so as to quantify how far the disturbance deviates from the white-noise hypothesis of Linear Quadratic Gaussian control. Such deviation can be measured by the minimal Kullback-Leibler informational divergence from the Gaussian distributions with zero mean and scalar covariance matrices. The resulting anisotropy functional is defined for finite power random vectors. Originally, anisotropy was introduced for directionally generic random vectors as the relative entropy of the normalized vector with respect to the uniform distribution on the unit sphere. The associated a-anisotropic norm of a matrix is then its maximum root mean square or average energy gain with respect to finite power or directionally generic inputs whose anisotropy is bounded above by a≥0. We give a systematic comparison of the anisotropy functionals and the associated norms. These are considered for unboundedly growing fragments of homogeneous Gaussian random fields on multidimensional integer lattice to yield mean anisotropy. Correspondingly, the anisotropic norms of finite matrices are extended to bounded linear translation invariant operators over such fields.
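For concreteness, the quantities described in this abstract are commonly written as follows in the anisotropy-based robust control literature (notation ours, not quoted from the abstract): for an m-dimensional finite-power random vector w with density f and differential entropy h(w),

```latex
% Anisotropy functional as minimal Kullback-Leibler divergence from the
% scalar-covariance Gaussians (standard formula in this literature):
\[
  \mathbf{A}(w)
  \;=\;
  \min_{\lambda > 0} D\!\left(f \,\middle\|\, \mathcal{N}(0, \lambda I_m)\right)
  \;=\;
  \frac{m}{2}\,\ln\!\left( \frac{2\pi e}{m}\, \mathbf{E}\,|w|^{2} \right) - h(w),
\]
% and the a-anisotropic norm of an operator F is its worst-case RMS gain
% over inputs whose anisotropy does not exceed a:
\[
  \|F\|_{a}
  \;=\;
  \sup\Big\{ \frac{\|Fw\|}{\|w\|} \;:\; \mathbf{A}(w) \le a \Big\}.
\]
```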
Abstract:
This paper presents a new model, based on thermodynamics and the molecular interactions between molecules, to describe the vapour-liquid phase equilibria and surface tension of a pure component. The model assumes that the bulk fluid can be characterised as a set of parallel layers; because of this molecular structure, we term the model the molecular layer structure theory (MLST). Each layer has two energetic components. One is the interaction energy of one molecule of that layer with all surrounding layers. The other is the intra-layer Helmholtz free energy, which accounts for the internal energy and the entropy of that layer. The equilibrium between two separating phases is derived from the minimum of the grand potential, and the surface tension is calculated as the excess of the Helmholtz energy of the system. We test this model with a number of components (argon, krypton, ethane, n-butane, iso-butane, ethylene and sulphur hexafluoride), and the results are very satisfactory. (C) 2002 Elsevier Science B.V. All rights reserved.
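The statement that surface tension is obtained as an excess Helmholtz energy corresponds to the textbook thermodynamic definition (general background, not the MLST working equations):

```latex
% Textbook definition behind "surface tension as excess Helmholtz energy"
% (general thermodynamics, not the MLST working equations):
\[
  \gamma \;=\; \left( \frac{\partial F}{\partial A} \right)_{N,V,T},
\]
% the Helmholtz free energy cost per unit area of interface at fixed
% particle number, volume and temperature.
```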
Abstract:
The objective of this paper is to define Historicity in Economic Sciences by applying the principles of Entropy and methodological indeterminism. This implies the definition of two kinds of economic universes: one characterized by ergodicity and the reversibility of time and processes, and the other by the opposite properties. The first part deals with the construction of the subject of study and the nature of the analysis proper to these two universes. Taking this dichotomy into account, the second part examines its implications as regards the nature of equilibrium, the properties of stability and instability, and the closure of the systems.
Abstract:
Background: The regulating mechanisms of branching morphogenesis in fetal rat lung explants have been an essential tool for molecular research. This work presents a new methodology to accurately quantify the epithelium, the outer contour and the peripheral airway buds of lung explants during cellular development from microscopic images. Methods: The outer contour was defined using an adaptive, multi-scale threshold algorithm whose level was automatically calculated based on an entropy maximization criterion. The inner lung epithelium was defined by a clustering procedure that groups small image regions according to the minimum description length principle and local statistical properties. Finally, the number of peripheral buds was counted as the number of branch ends in a skeletonized image of the inner lung epithelium. Results: The time for lung branching morphometric analysis was reduced by 98% compared with the manual method. The best results were obtained in the first two days of cellular development, with smaller standard deviations. Non-significant differences were found between the automatic and manual results on all culture days. Conclusions: The proposed method introduces a series of advantages related to its intuitive use and accuracy, making the technique suitable for images with different lighting characteristics and allowing a reliable comparison between different researchers.
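Entropy-maximization thresholding of the kind mentioned in the Methods is commonly implemented with Kapur's criterion; the sketch below is a minimal, hypothetical illustration of that criterion, not the authors' adaptive multi-scale code.

```python
# Minimal Kapur-style maximum-entropy threshold (illustrative only;
# not the paper's adaptive multi-scale implementation).
import numpy as np

def max_entropy_threshold(image, bins=256):
    """Pick the gray level that maximizes the summed entropies of the
    background and foreground histogram partitions (Kapur's criterion)."""
    hist, _ = np.histogram(image, bins=bins, range=(0, bins))
    p = hist.astype(float) / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, bins):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 == 0 or p1 == 0:
            continue
        q0, q1 = p[:t] / p0, p[t:] / p1
        h = -(q0[q0 > 0] * np.log(q0[q0 > 0])).sum() \
            - (q1[q1 > 0] * np.log(q1[q1 > 0])).sum()
        if h > best_h:
            best_t, best_h = t, h
    return best_t

img = np.random.randint(0, 256, (64, 64))   # stand-in for a microscope image
mask = img >= max_entropy_threshold(img)    # binary outer-contour candidate
```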