984 results for Kähler-Einstein Metrics


Relevance: 20.00%

Publisher:

Abstract:

We show that two evanescently coupled χ(2) parametric oscillators provide a tunable bright source of quadrature-squeezed light, Einstein-Podolsky-Rosen (EPR) correlations and quantum entanglement. Analysing the system in the above-threshold regime, we demonstrate that these properties can be controlled by adjusting the coupling strengths and the cavity detunings. As this can be implemented with integrated optics, it provides a possible route to rugged and stable EPR sources.
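
A hedged aside for readers unfamiliar with how continuous-variable EPR correlations are certified: one commonly used figure of merit (not necessarily the exact criterion of this paper) is the Reid inference-variance product. In the convention where the vacuum quadrature variance is normalized to 1, demonstrating the paradox between the two output modes requires

```latex
% Reid EPR criterion (vacuum quadrature variance normalized to 1):
% a product of inference variances below 1 demonstrates the paradox.
V_{\mathrm{inf}}(\hat{X}_1)\,V_{\mathrm{inf}}(\hat{Y}_1)
  = V\!\left(\hat{X}_1 - g_x \hat{X}_2\right)
    V\!\left(\hat{Y}_1 + g_y \hat{Y}_2\right) < 1,
```

with the gains g_x, g_y chosen to minimize the inference variances.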

Relevance: 20.00%

Publisher:

Abstract:

We study a model for a two-mode atomic-molecular Bose-Einstein condensate. Starting with a classical analysis, we determine the phase-space fixed points of the system. It is found that bifurcations of the fixed points naturally separate the coupling-parameter space into four regions. The different regions give rise to qualitatively different dynamics. We then show that this classification holds true for the quantum dynamics.
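
As a rough illustration of the classical analysis described above (not the paper's actual Hamiltonian), the sketch below locates the phase-space fixed points of a schematic two-mode atom-molecule Hamiltonian H(z, θ) and classifies their stability from the linearized dynamics; the functional form and the parameters delta and g are illustrative assumptions.

```python
# Sketch: locate and classify fixed points of a schematic two-mode
# atom-molecule Hamiltonian H(z, theta). The Hamiltonian form and the
# parameters delta, g are illustrative assumptions, not the paper's model.
import numpy as np
from scipy.optimize import fsolve

delta, g = 0.5, 1.0  # illustrative detuning and coupling

def H(z, th):
    # z: atom-molecule population imbalance in (-1, 1); th: relative phase
    return delta * z + g * (1.0 - z) * np.sqrt(max(1.0 + z, 0.0) / 2.0) * np.cos(th)

def eqs_of_motion(v, eps=1e-6):
    # Hamilton's equations: dz/dt = -dH/dtheta, dtheta/dt = dH/dz
    z, th = v
    dH_dth = (H(z, th + eps) - H(z, th - eps)) / (2 * eps)
    dH_dz = (H(z + eps, th) - H(z - eps, th)) / (2 * eps)
    return [-dH_dth, dH_dz]

def jacobian(v, eps=1e-5):
    J = np.zeros((2, 2))
    for j in range(2):
        dv = np.zeros(2)
        dv[j] = eps
        J[:, j] = (np.array(eqs_of_motion(np.asarray(v) + dv))
                   - np.array(eqs_of_motion(np.asarray(v) - dv))) / (2 * eps)
    return J

# Scan a grid of initial guesses and keep the distinct converged fixed points.
fixed_points = []
for z0 in np.linspace(-0.9, 0.9, 7):
    for th0 in (0.0, np.pi):
        sol, info, ok, msg = fsolve(eqs_of_motion, [z0, th0], full_output=True)
        if ok == 1 and -1 < sol[0] < 1 and \
                not any(np.allclose(sol, fp, atol=1e-4) for fp in fixed_points):
            fixed_points.append(sol)

# Purely imaginary eigenvalues -> centre (stable); real parts -> saddle/unstable.
for fp in fixed_points:
    eig = np.linalg.eigvals(jacobian(fp))
    kind = "centre (stable)" if np.max(np.abs(eig.real)) < 1e-4 else "saddle/unstable"
    print(f"fixed point: z = {fp[0]:+.3f}, theta = {fp[1]:+.3f} -> {kind}")
```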

Relevance: 20.00%

Publisher:

Abstract:

We compare theoretically the tripartite entanglement available from the use of three concurrent χ(2) nonlinearities with that available from three independent squeezed states mixed on beamsplitters, using an appropriate version of the van Loock-Furusawa inequalities. We also define three-mode generalizations of the Einstein-Podolsky-Rosen paradox which provide an alternative means of demonstrating the inseparability of the density matrix.
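
For context, a commonly quoted form of the van Loock-Furusawa conditions (in the convention where the vacuum quadrature variance is 1, so that [x̂, p̂] = 2i) is given below; the usual statement is that violating any two of the three inequalities is sufficient to demonstrate full tripartite inseparability.

```latex
% One commonly quoted form of the van Loock-Furusawa inequalities
% (vacuum quadrature variance = 1). Fully separable three-mode states
% satisfy all three; violation of any two demonstrates full inseparability.
\begin{aligned}
V(\hat{x}_1 - \hat{x}_2) + V(\hat{p}_1 + \hat{p}_2 + g_3 \hat{p}_3) &\ge 4,\\
V(\hat{x}_2 - \hat{x}_3) + V(g_1 \hat{p}_1 + \hat{p}_2 + \hat{p}_3) &\ge 4,\\
V(\hat{x}_1 - \hat{x}_3) + V(\hat{p}_1 + g_2 \hat{p}_2 + \hat{p}_3) &\ge 4,
\end{aligned}
```

where the g_i are arbitrary real gains that may be optimized over.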

Relevance: 20.00%

Publisher:

Abstract:

The aim of this work was to investigate the relative environmental impacts of various power generators, recognising that the plants are located in quite different environments and that different receptors will therefore experience different impacts. Based on the IChemE sustainability metrics paradigm, we calculated potential environmental indicators (P-EI) that represent the environmental burden of the masses of potential pollutants discharged into different receiving media. However, a P-EI may not be significant in practice, as it may not be expressed at all under different receiving conditions; to include some measure of receiver significance, we therefore developed a methodology that takes into account specific environmental indicators (S-EI), which refer to the environmental attributes of a specific site. In this context, we acquired site-specific environmental data related to the airsheds and water catchment areas in different locations for a limited number of environmental indicators, such as human health (carcinogenic) effects, atmospheric acidification, photochemical (ozone) smog and eutrophication. The S-EI results from this analysis show that atmospheric acidification has the highest impact value, while health risks due to fly ash emissions are considered less significant because many coal power plants in Australia are located in low-population-density airsheds. The contributions of coal power plants to photochemical (ozone) smog and eutrophication were not significant. In this study, we have taken emission-related data trends to reflect technology performance (i.e., the P-EI indicators), while a real sustainability metric can be associated only with the specific environmental conditions of the relevant sites (i.e., the S-EI indicators).
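
To make the potential-indicator idea concrete, a minimal sketch of an IChemE-style burden calculation is given below; all emission figures, potency factors and the receptor weighting are hypothetical illustrations, not values from this study.

```python
# Sketch of an IChemE-style potential environmental indicator (P-EI):
# burden for an impact category = sum over pollutants of
# (mass emitted) x (potency factor relative to a reference substance).
# All masses, potency factors and weights below are hypothetical.

# Atmospheric acidification potency factors expressed as kg-SO2-equivalent
# per kg of pollutant (illustrative values).
potency_so2_eq = {"SO2": 1.0, "NOx": 0.7, "HCl": 0.88}

# Hypothetical annual emissions from one plant, in tonnes.
emissions_t = {"SO2": 1200.0, "NOx": 900.0, "HCl": 15.0}

acidification_burden = sum(
    emissions_t[p] * potency_so2_eq[p] for p in emissions_t
)
print(f"P-EI (atmospheric acidification): {acidification_burden:.0f} t SO2-eq/yr")

# A site-specific indicator (S-EI) would further weight this burden by
# attributes of the receiving airshed (e.g. population density, existing
# background load); here a single hypothetical weighting is applied.
receptor_weight = 0.2  # e.g. sparsely populated airshed
print(f"S-EI (illustrative): {acidification_burden * receptor_weight:.0f} weighted t SO2-eq/yr")
```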

Relevance: 20.00%

Publisher:

Abstract:

We describe the production of Bose-Einstein condensates (BECs) on a new type of atom chip based on a silver foil. Our atom chip is fabricated with thick wires capable of carrying currents of several amperes without overheating. The silver surface is highly reflective to light resonant with the optical transitions used for Rb. The pattern on the chip consists of two parallel Z-trap wires, capable of producing a two-wire guide, and two additional endcap wires for varying the axial confinement. Condensates are produced in magnetic microtraps formed within 1 mm of the surface of the chip. We have observed the fragmentation of cold atom clouds when they are brought close to the chip surface. This results from a perturbed trapping potential caused by nanometer-scale deviations of the current path through the wires on the chip. We present results on the fragmentation of cold clouds at distances below 100 µm from the wires and investigate the origin of the deviating current. The fragmentation has different characteristics to those seen with copper conductors. The dynamics of atoms in these microtraps is also investigated.

Relevance: 20.00%

Publisher:

Abstract:

Representing knowledge using domain ontologies has been shown to be a useful mechanism and format for managing and exchanging information. Due to the difficulty and cost of building ontologies, a number of ontology libraries and search engines are coming into existence to facilitate reusing such knowledge structures. The need for ontology ranking techniques is becoming crucial as the number of ontologies available for reuse continues to grow. In this paper we present AKTiveRank, a prototype system for ranking ontologies based on an analysis of their structures. We describe the metrics used in the ranking system and present an experiment on ranking the ontologies returned by a popular search engine for an example query.
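
To make the ranking idea concrete, here is a generic weighted-sum sketch of structure-based ontology ranking; the measure names, weights and scores are illustrative placeholders, not necessarily the ones AKTiveRank actually uses.

```python
# Generic sketch of structure-based ontology ranking: each candidate
# ontology gets a handful of structural scores (already normalized to
# [0, 1]), which are combined as a weighted sum and sorted.
# The measures, weights, and scores are illustrative, not AKTiveRank's.
from dataclasses import dataclass

@dataclass
class OntologyScores:
    name: str
    class_match: float   # how well query terms match class labels
    density: float       # richness of relations/attributes around matched classes
    centrality: float    # how central the matched classes are in the graph

WEIGHTS = {"class_match": 0.5, "density": 0.3, "centrality": 0.2}

def rank(candidates: list[OntologyScores]) -> list[tuple[str, float]]:
    scored = [
        (c.name,
         WEIGHTS["class_match"] * c.class_match
         + WEIGHTS["density"] * c.density
         + WEIGHTS["centrality"] * c.centrality)
        for c in candidates
    ]
    return sorted(scored, key=lambda t: t[1], reverse=True)

if __name__ == "__main__":
    results = rank([
        OntologyScores("onto-a.owl", 0.9, 0.4, 0.6),
        OntologyScores("onto-b.owl", 0.7, 0.8, 0.5),
        OntologyScores("onto-c.owl", 0.5, 0.6, 0.9),
    ])
    for name, score in results:
        print(f"{name}: {score:.2f}")
```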

Relevance: 20.00%

Publisher:

Abstract:

PURPOSE: To provide a consistent standard for the evaluation of different types of presbyopic correction. SETTING: Eye Clinic, School of Life and Health Sciences, Aston University, Birmingham, United Kingdom. METHODS: The presbyopic corrections examined were accommodating intraocular lenses (IOLs), simultaneous multifocal and monovision contact lenses, and varifocal spectacles. Binocular near visual acuity measured with different optotypes (uppercase letters, lowercase letters, and words) and reading metrics assessed with the Minnesota Near Reading chart (reading acuity, critical print size [CPS], CPS reading speed) were intercorrelated (Pearson product moment correlations) and assessed for concordance (intraclass correlation coefficients [ICC]) and agreement (Bland-Altman analysis) as indications of clinical usefulness. RESULTS: Nineteen accommodating IOL cases, 40 simultaneous contact lens cases, and 38 varifocal spectacle cases were evaluated. Other than CPS reading speed, all near visual acuity and reading metrics correlated well with each other (r > 0.70, P < .001). Near visual acuity measured with uppercase letters was highly concordant (ICC, 0.78) and in close agreement with that measured with lowercase letters (±0.17 logMAR). Near word acuity agreed well with reading acuity (±0.16 logMAR), which in turn agreed well with near visual acuity measured with uppercase letters (±0.16 logMAR). Concordance (ICC, 0.18 to 0.46) and agreement (±0.24 to 0.30 logMAR) of CPS with the other near metrics were moderate. CONCLUSION: Measurement of near visual ability in presbyopia should be standardized to include assessment of near visual acuity with logMAR uppercase-letter optotypes, the smallest logMAR print size that maintains maximum reading speed (CPS), and reading speed. J Cataract Refract Surg 2009; 35:1401-1409.
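
For readers less familiar with the agreement statistics used here, the sketch below computes a Pearson correlation together with the Bland-Altman bias and 95% limits of agreement for two paired acuity measures; the data are hypothetical.

```python
# Sketch: Pearson correlation plus Bland-Altman bias and 95% limits of
# agreement for two paired near-acuity measures (logMAR). Values are
# hypothetical; in the study this is done for each pair of metrics.
import numpy as np
from scipy.stats import pearsonr

uppercase_logmar = np.array([0.30, 0.22, 0.40, 0.18, 0.35, 0.28, 0.45, 0.20])
lowercase_logmar = np.array([0.38, 0.30, 0.52, 0.25, 0.47, 0.36, 0.58, 0.31])

r, p = pearsonr(uppercase_logmar, lowercase_logmar)

diff = lowercase_logmar - uppercase_logmar
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)  # half-width of the 95% limits of agreement

print(f"Pearson r = {r:.2f} (p = {p:.3f})")
print(f"Bland-Altman bias = {bias:+.3f} logMAR, "
      f"95% LoA = {bias - loa:+.3f} to {bias + loa:+.3f}")
```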

Relevance: 20.00%

Publisher:

Abstract:

This thesis is concerned with exact solutions of Einstein's field equations of general relativity, in particular, when the source of the gravitational field is a perfect fluid with a purely electric Weyl tensor. General relativity, cosmology and computer algebra are discussed briefly. A mathematical introduction to Riemannian geometry and the tetrad formalism is then given. This is followed by a review of some previous results and known solutions concerning purely electric perfect fluids. In addition, some orthonormal and null tetrad equations of the Ricci and Bianchi identities are displayed in a form suitable for investigating these space-times. Conformally flat perfect fluids are characterised by the vanishing of the Weyl tensor and form a sub-class of the purely electric fields in which all solutions are known (Stephani 1967). The number of Killing vectors in these space-times is investigated and results presented for the non-expanding space-times. The existence of stationary fields that may also admit 0, 1 or 3 spacelike Killing vectors is demonstrated. Shear-free fluids in the class under consideration are shown to be either non-expanding or irrotational (Collins 1984) using both orthonormal and null tetrads. A discrepancy between Collins (1984) and Wolf (1986) is resolved by explicitly solving the field equations to prove that the only purely electric, shear-free, geodesic but rotating perfect fluid is the Gödel (1949) solution. The irrotational fluids with shear are then studied and solutions due to Szafron (1977) and Allnutt (1982) are characterised. The metric is simplified in several cases where new solutions may be found. The geodesic space-times in this class and all Bianchi type 1 perfect fluid metrics are shown to have a metric expressible in a diagonal form. The position of spherically symmetric and Bianchi type 1 space-times in relation to the general case is also illustrated.
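
For reference, the "purely electric" condition used throughout can be stated via the standard splitting of the Weyl tensor with respect to the fluid four-velocity u^a (sign and index conventions vary between authors):

```latex
% Electric and magnetic parts of the Weyl tensor relative to u^a
% (one common convention); "purely electric" means H_{ab} = 0.
E_{ab} = C_{acbd}\,u^{c}u^{d}, \qquad
H_{ab} = \tfrac{1}{2}\,\eta_{ac}{}^{ef}\,C_{efbd}\,u^{c}u^{d}, \qquad
\text{purely electric} \;\Longleftrightarrow\; H_{ab} = 0 .
```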

Relevance: 20.00%

Publisher:

Abstract:

The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis starts from the observation that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in estimating these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the "classic" metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies was carried out, which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as being essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of "classic" program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data-flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
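
As a reminder of the "classic" measures being extended to Prolog, Halstead's software-science metrics are built from operator/operand counts; with n_1, n_2 the numbers of distinct operators and operands and N_1, N_2 their total occurrences, the standard definitions are:

```latex
% Halstead's measures: vocabulary n, length N, volume V,
% difficulty D and effort E, from operator/operand counts.
n = n_1 + n_2, \qquad N = N_1 + N_2, \qquad V = N \log_2 n, \qquad
D = \frac{n_1}{2}\cdot\frac{N_2}{n_2}, \qquad E = D \, V .
```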

Relevance: 20.00%

Publisher:

Abstract:

Energy consumption and energy efficiency have become very important issues both in optimizing current telecommunications networks and in designing future ones. Energy and power metrics are being introduced in order to enable assessment and comparison of the energy consumption and power efficiency of telecommunications networks and other transmission equipment. The standardization of these metrics is a significant ongoing activity aiming to define baseline energy and power metrics for telecommunications systems. This article provides an up-to-date overview of the energy and power metrics being proposed by the various standardization bodies and subsequently adopted worldwide by equipment manufacturers and network operators.
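
As one concrete example of the kind of metric being standardized (an assumption about which specific metrics the article covers), several bodies have used a power-per-throughput rating of the general form shown below, with weighted variants that average the power draw over idle, partial-load and full-load operating points.

```latex
% A generic power-per-throughput efficiency rating for network equipment:
% measured power divided by effective throughput, typically in W/Gbps
% (lower is better).
\mathrm{ECR} = \frac{P_{\text{measured}}}{T_{\text{effective}}}
\qquad [\mathrm{W/Gbps}]
```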

Relevance: 20.00%

Publisher:

Abstract:

In this paper, we present a theoretical study of a Bose-Einstein condensate of interacting bosons in a quartic trap in one, two, and three dimensions. Using the Thomas-Fermi approximation, suitably complemented by numerical solutions of the Gross-Pitaevskii equation, we study the ground-state condensate density profiles, the chemical potential, the effects of cross-terms in the quartic potential, the temporal evolution of the various energy components of the condensate, and the width oscillations of the condensate. The results obtained are compared with the corresponding results for a Bose condensate in harmonic confinement.
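
For concreteness, in the Thomas-Fermi limit the kinetic-energy term of the Gross-Pitaevskii equation is dropped, so the density simply mirrors the inverted trap. For a pure quartic potential (written here without the cross-terms discussed in the abstract) this gives

```latex
% Thomas-Fermi density in a quartic trap (3D, cross-terms omitted):
% the density follows the inverted potential up to the chemical potential.
V(\mathbf{r}) = \lambda\,(x^4 + y^4 + z^4), \qquad
n(\mathbf{r}) = \frac{\mu - V(\mathbf{r})}{g}\,
  \Theta\!\bigl(\mu - V(\mathbf{r})\bigr), \qquad
g = \frac{4\pi\hbar^{2} a_s}{m},
```

with the chemical potential μ fixed by the normalization ∫ n d³r = N.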

Relevance: 20.00%

Publisher:

Abstract:

To what extent does competitive entry create a structural change in key marketing metrics? New players may be just a temporary nuisance to incumbents, but could also fundamentally change the latter's performance evolution, or induce them to permanently alter their spending levels and/or pricing decisions. Similarly, the addition of a new marketing channel could permanently shift shopping preferences, or could just create a short-lived migration from existing channels. The steady-state impact of a given entry or channel addition on various marketing metrics is intrinsically an empirical issue for which we need an appropriate testing procedure. In this study, we introduce a testing sequence that allows for the endogenous determination of potential change (break) locations, thereby accounting for lead and/or lagged effects of the introduction of interest. By not restricting the number of potential breaks to one (as is commonly done in the marketing literature), we quantify the impact of the new entrant(s) while controlling for other events that may have taken place in the market. We illustrate the methodology in the context of the Dutch television advertising market, which was characterized by the entry of several late movers. We find that the steady-state growth of private incumbents' revenues was slowed by the quasi-simultaneous entry of three new players. Contrary to industry observers' expectations, such a slowdown was not experienced in the related markets of print and radio advertising.
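
As a simplified, self-contained illustration of break-point testing (a single-break version of the multiple-break, endogenous-location procedure described above), the sketch below finds the break date that minimizes the residual sum of squares of a mean-shift model and compares it against the no-break fit; the series is simulated.

```python
# Sketch: endogenous single-break detection in a marketing-metric series by
# minimizing the residual sum of squares of a mean-shift model. The real
# procedure allows multiple breaks and richer dynamics; data are simulated.
import numpy as np

rng = np.random.default_rng(0)
T, true_break = 120, 70
y = np.concatenate([
    10 + 0.05 * np.arange(true_break),       # pre-entry level/trend
    9 + 0.01 * np.arange(T - true_break),    # slower growth after entry
]) + rng.normal(scale=0.4, size=T)

def sse_with_break(series, k):
    """Sum of squared residuals when fitting separate means before/after k."""
    return (((series[:k] - series[:k].mean()) ** 2).sum()
            + ((series[k:] - series[k:].mean()) ** 2).sum())

trim = 10  # do not search too close to the sample edges
candidates = range(trim, T - trim)
sse = np.array([sse_with_break(y, k) for k in candidates])
k_hat = list(candidates)[int(np.argmin(sse))]

sse_no_break = ((y - y.mean()) ** 2).sum()
# Chow/sup-F style statistic for a mean shift at the estimated break date;
# its critical values are non-standard and would be simulated in practice.
f_stat = (sse_no_break - sse.min()) / (sse.min() / (T - 2))

print(f"estimated break at t = {k_hat} (true break at t = {true_break})")
print(f"sup-F style statistic = {f_stat:.1f}")
```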