917 results for COHERENT
Abstract:
The observation of coherent tunnelling in MgO:Cu2+, MgO:Ag2+ and CaO:Cu2+ was a crucial discovery in the realm of the Jahn-Teller (JT) effect. The main reasons favoring this dynamic behavior are now clarified through ab initio calculations on Cu2+- and Ag2+-doped cubic oxides. Small JT distortions and an unexpectedly low anharmonicity of the eg JT mode are behind the energy barriers smaller than 25 cm^-1 derived through CASPT2 calculations for MgO:Cu2+, MgO:Ag2+ and CaO:Cu2+. The low anharmonicity is shown to come from a strong vibrational coupling of MO6^10- units (M = Cu, Ag) to the host lattice. The average distance between the d9 impurity and the ligands is found to vary significantly on passing from MgO to SrO, largely following the lattice parameter.
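For context, the connection among small JT distortions, weak anharmonicity, and low barriers can be sketched with the textbook single-mode E⊗e "warped Mexican hat" surface. This is a schematic only, not the paper's ab initio CASPT2 treatment; K, V1 and beta are generic model parameters.

```latex
% Lower adiabatic sheet of the E \otimes e JT problem with cubic warping;
% (\rho,\varphi) are polar coordinates in the e_g distortion plane.
E_{-}(\rho,\varphi) \approx \tfrac{1}{2}K\rho^{2} - |V_{1}|\,\rho + \beta\,\rho^{3}\cos 3\varphi
% Minima lie on the trough \rho_{0} \approx |V_{1}|/K at the three angles with
% \cos 3\varphi = -\,\mathrm{sgn}(\beta); the barrier between adjacent wells is
% B \approx 2|\beta|\,\rho_{0}^{3}, so a small JT radius \rho_{0} and weak
% anharmonicity \beta yield barriers below ~25 cm^{-1}, permitting coherent
% tunnelling rather than a static distortion.
```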
Squeezed Coherent State Representation of Scalar Field and Particle Production in the Early Universe
Abstract:
The present work is an attempt to explain particle production in the early universe. We argue that nonzero values of the stress-energy tensor evaluated in the squeezed vacuum state can be due to particle production, and this supports the concept of particle production from zero-point quantum fluctuations. In the present calculation we use the squeezed coherent state introduced by Fan and Xiao [7]. The vacuum expectation values of the stress-energy tensor defined prior to any dynamics in the background gravitational field give all information about particle production. Squeezing of the vacuum is achieved by means of the background gravitational field, which plays the role of a parametric amplifier [8]. The present calculation shows that the vacuum expectation values of the energy density and pressure contain terms in addition to the classical zero-point energy terms. The calculation of the particle production probability shows that the probability increases as the squeezing parameter increases, reaches a maximum value, and then decreases.
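The qualitative behaviour described here can be illustrated with the standard single-mode squeezed vacuum; this is a sketch only, not the paper's calculation with the Fan-Xiao squeezed coherent state in a curved background.

```latex
% Squeezed vacuum |r\rangle = S(r)|0\rangle with S(r)=\exp[\tfrac{r}{2}(a^{2}-a^{\dagger 2})].
% The energy expectation acquires a term beyond the zero-point contribution:
\langle r|\,\hbar\omega\,(a^{\dagger}a + \tfrac{1}{2})\,|r\rangle = \hbar\omega\,(\sinh^{2} r + \tfrac{1}{2})
% Only even occupation numbers appear, i.e. particles are produced in pairs:
P(2n) = \frac{(2n)!}{2^{2n}(n!)^{2}}\,\frac{\tanh^{2n} r}{\cosh r}
% For fixed n \ge 1, P(2n) grows from zero, peaks, and then decays as the
% squeezing parameter r increases, matching the behaviour in the abstract.
```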
Abstract:
As the number of processors in distributed-memory multiprocessors grows, efficiently supporting a shared-memory programming model becomes difficult. We have designed the Protocol for Hierarchical Directories (PHD) to allow shared-memory support for systems containing massive numbers of processors. PHD eliminates bandwidth problems by using a scalable network, decreases hot-spots by not relying on a single point to distribute blocks, and uses a scalable amount of space for its directories. PHD provides a shared-memory model by synthesizing a global shared memory from the local memories of processors. PHD supports sequentially consistent read, write, and test-and-set operations. This thesis also introduces a method of describing locality for hierarchical protocols and employs this method in the derivation of an abstract model of the protocol behavior. An embedded model, based on the work of Johnson [ISCA19], describes the protocol behavior when mapped to a k-ary n-cube. The thesis uses these two models to study the average height in the hierarchy that operations reach, the longest path messages travel, the number of messages that operations generate, the inter-transaction issue time, and the protocol overhead for different locality parameters, degrees of multithreading, and machine sizes. We determine that multithreading is useful only for approximately two to four threads; any additional interleaving does not decrease the overall latency. For small machines and high-locality applications, this limitation is due mainly to the length of the running threads. For large machines with medium to low locality, this limitation is due mainly to the protocol overhead being too large. Our study using the embedded model shows that in situations where the run length between references to shared memory is at least an order of magnitude longer than the time to process a single state transition in the protocol, applications exhibit good performance. If separate controllers for processing protocol requests are included, the protocol scales to 32k-processor machines as long as the application exhibits hierarchical locality: at least 22% of the global references must be able to be satisfied locally; at most 35% of the global references are allowed to reach the top level of the hierarchy.
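The two-to-four-thread result is consistent with a simple saturating-utilization view of multithreading. The sketch below is a toy model in that spirit, not the thesis's embedded k-ary n-cube model; all parameter names and values are illustrative.

```python
# Toy model: each thread runs for run_len cycles between shared-memory
# references, then stalls for latency cycles (network plus protocol
# overhead); extra threads hide the stall until the processor saturates.

def utilization(threads: int, run_len: float, latency: float) -> float:
    return min(1.0, threads * run_len / (run_len + latency))

if __name__ == "__main__":
    for t in range(1, 9):
        print(t, round(utilization(t, run_len=100.0, latency=200.0), 2))
    # Utilization saturates near threads ~ 1 + latency/run_len (= 3 here);
    # beyond that, extra interleaving no longer reduces overall latency,
    # echoing the two-to-four-thread finding above.
```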
Abstract:
Planners in public and private institutions would like coherent forecasts of the components of age-specific mortality, such as causes of death. This has been difficult to achieve because the relative values of the forecast components often fail to behave in a way that is coherent with historical experience. In addition, when the group forecasts are combined the result is often incompatible with an all-groups forecast. It has been shown that cause-specific mortality forecasts are pessimistic when compared with all-cause forecasts (Wilmoth, 1995). This paper abandons the conventional approach of using log mortality rates and forecasts the density of deaths in the life table. Since these values obey a unit-sum constraint for both conventional single-decrement life tables (only one absorbing state) and multiple-decrement tables (more than one absorbing state), they are intrinsically relative rather than absolute values across decrements as well as ages. Using the methods of Compositional Data Analysis pioneered by Aitchison (1986), death densities are transformed into real space so that the full range of multivariate statistics can be applied, then back-transformed to positive values so that the unit-sum constraint is honoured. The structure of the best-known single-decrement mortality-rate forecasting model, devised by Lee and Carter (1992), is expressed in compositional form and the results from the two models are compared. The compositional model is extended to a multiple-decrement form and used to forecast mortality by cause of death for Japan.
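A minimal sketch of the compositional workflow, assuming a years-by-ages matrix D of life-table death densities with unit row sums; the transform here is the centred log-ratio, the rank-1 decomposition mirrors Lee-Carter, and all function names are illustrative, not the paper's implementation.

```python
import numpy as np

def clr(x, eps=1e-12):
    # Centred log-ratio: maps unit-sum compositions to real space (Aitchison).
    logx = np.log(x + eps)
    return logx - logx.mean(axis=1, keepdims=True)

def inv_clr(y):
    # Back-transform: exponentiate and renormalise so rows sum to one.
    z = np.exp(y)
    return z / z.sum(axis=1, keepdims=True)

def compositional_lee_carter(D, horizon):
    """D: years x ages matrix of death densities d(x,t), rows summing to 1."""
    Y = clr(D)
    a = Y.mean(axis=0)                      # age pattern (alpha_x analogue)
    U, s, Vt = np.linalg.svd(Y - a, full_matrices=False)
    k = U[:, 0] * s[0]                      # period index (kappa_t analogue)
    b = Vt[0]                               # age response (beta_x analogue)
    drift = (k[-1] - k[0]) / (len(k) - 1)   # random walk with drift forecast
    k_fut = k[-1] + drift * np.arange(1, horizon + 1)
    return inv_clr(a + np.outer(k_fut, b))  # forecasts are unit-sum by construction
```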
Abstract:
Abstract taken in part from the author.
Abstract:
Abstract taken from the publication.
Abstract:
Abstract based on that of the publication. Part of a monographic issue entitled 'Política educativa per a una societat en crisi'.
Abstract:
Abstract based on the author's abstract in Catalan.
Abstract:
A quasi-optical technique for characterizing micromachined waveguides is demonstrated with wideband time-resolved terahertz spectroscopy. A transfer-function representation is adopted to describe the relation between the signals at the input and output ports of the waveguides. The time-domain responses were discretized, and the waveguide transfer function was obtained through a parametric approach in the z domain after describing the system with an autoregressive with exogenous input (ARX) model. The number of modes assumed a priori to propagate in the structure was inferred from comparisons of the theoretical with the measured characteristic impedance, as well as from parsimony arguments. Measurements for a precision WR-8 waveguide-adjustable short as well as for G-band reduced-height micromachined waveguides are presented. (C) 2003 Optical Society of America.
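A minimal sketch of ARX identification from discretized waveforms, assuming uniformly sampled input and output signals u and y; the estimator here is plain least squares, and the model orders and function names are illustrative rather than those used in the paper.

```python
import numpy as np

def fit_arx(u, y, na=4, nb=4):
    """Least-squares ARX fit:
    y[t] + a1*y[t-1] + ... + a_na*y[t-na] = b0*u[t] + ... + b_nb*u[t-nb].
    u: input-port waveform, y: output-port waveform (same sampling grid)."""
    n = max(na, nb)
    rows = []
    for t in range(n, len(y)):
        past_y = [-y[t - i] for i in range(1, na + 1)]
        past_u = [u[t - i] for i in range(0, nb + 1)]
        rows.append(past_y + past_u)
    Phi = np.asarray(rows)
    theta, *_ = np.linalg.lstsq(Phi, y[n:], rcond=None)
    return theta[:na], theta[na:]   # (a, b): H(z) = B(z)/A(z)

def freq_response(a, b, n_freq=512):
    # Evaluate H(e^{jw}) = B(e^{jw}) / A(e^{jw}), A(z) = 1 + sum_i a_i z^{-i}.
    w = np.linspace(0, np.pi, n_freq)
    zB = np.exp(-1j * np.outer(w, np.arange(len(b))))
    zA = np.exp(-1j * np.outer(w, np.arange(1, len(a) + 1)))
    return w, (zB @ b) / (1 + zA @ a)
```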
Abstract:
Methods have recently been developed that make use of electromagnetic radiation at terahertz (THz) frequencies, the region of the spectrum between millimetre wavelengths and the infrared, for imaging purposes. Radiation at these wavelengths is non-ionizing and subject to far less Rayleigh scatter than visible or infrared wavelengths, making it suitable for medical applications. This paper introduces THz pulsed imaging and discusses its potential for in vivo medical applications in comparison with existing modalities.