9 results for Quantitative reconstruction
Abstract:
[EN] Data contained in this record come from the following academic activity (from which it is possible to locate additional records related to the Monastery):
Abstract:
Contributed to: 4th International Conference, EuroMed 2012, Limassol, Cyprus, October 29 – November 3, 2012.
Abstract:
Feasible tomography schemes for large particle numbers must possess, besides an appropriate data acquisition protocol, an efficient way to reconstruct the density operator from the observed finite data set. Since state reconstruction typically requires the solution of a nonlinear large-scale optimization problem, this is a major challenge in the design of scalable tomography schemes. Here we present an efficient state reconstruction scheme for permutationally invariant quantum state tomography. It works for all common state-of-the-art reconstruction principles, including, in particular, maximum likelihood and least squares methods, which are the preferred choices in today's experiments. This high efficiency is achieved by greatly reducing the dimensionality of the problem, employing a particular representation of permutationally invariant states known from spin coupling, combined with convex optimization, which has clear advantages in speed, control and accuracy over commonly employed numerical routines. First prototype implementations easily allow the reconstruction of a 20-qubit state in a few minutes on a standard computer.
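To make the optimization step concrete, here is a minimal sketch of least-squares density-matrix reconstruction posed as a convex program, written for a single qubit with the cvxpy library. The measured expectation values are synthetic placeholders, and the sketch does not implement the permutationally invariant representation described in the abstract; it only illustrates the kind of constrained convex problem involved.

```python
# Minimal sketch: least-squares state reconstruction as a convex program
# (single qubit, Pauli measurements). Illustration only; not the authors'
# permutationally invariant implementation.
import numpy as np
import cvxpy as cp

# Pauli operators used as the measurement basis.
paulis = {
    "X": np.array([[0, 1], [1, 0]], dtype=complex),
    "Y": np.array([[0, -1j], [1j, 0]], dtype=complex),
    "Z": np.array([[1, 0], [0, -1]], dtype=complex),
}

# Synthetic expectation values (in an experiment these come from counts).
measured = {"X": 0.70, "Y": 0.02, "Z": 0.68}

rho = cp.Variable((2, 2), hermitian=True)            # density operator
residuals = [cp.real(cp.trace(rho @ P)) - measured[k]
             for k, P in paulis.items()]
objective = cp.Minimize(cp.sum_squares(cp.hstack(residuals)))
constraints = [rho >> 0, cp.real(cp.trace(rho)) == 1]  # physicality constraints

cp.Problem(objective, constraints).solve()
print(np.round(rho.value, 3))
```

The positivity and unit-trace constraints keep the estimate physical, which is exactly what makes the least-squares principle a convex program rather than an unconstrained fit.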
Abstract:
Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger-type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties.
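For the two-qubit case referenced above, the following sketch computes Wootters' concurrence with numpy; the Bell-state example is only an illustration, and this is not the authors' three-qubit procedure.

```python
# Minimal sketch: Wootters' concurrence for a two-qubit density matrix.
import numpy as np

def concurrence(rho: np.ndarray) -> float:
    """C(rho) = max(0, l1 - l2 - l3 - l4), where l_i are the square roots of
    the eigenvalues of rho * rho_tilde, sorted in decreasing order."""
    sy = np.array([[0, -1j], [1j, 0]])
    yy = np.kron(sy, sy)
    rho_tilde = yy @ rho.conj() @ yy               # spin-flipped state
    eigvals = np.linalg.eigvals(rho @ rho_tilde)
    lam = np.sort(np.sqrt(np.abs(eigvals.real)))[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# Example: a Bell state has concurrence ~1, a product state has concurrence 0.
bell = np.zeros((4, 4), dtype=complex)
bell[0, 0] = bell[0, 3] = bell[3, 0] = bell[3, 3] = 0.5
print(concurrence(bell))
```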
Abstract:
Climate change is an important environmental problem and one whose economic implications are many and varied. This paper starts from the presumption that mitigation of greenhouse gases is a necessary policy that has to be designed in a cost-effective way. It is well known that market instruments are the best option for cost effectiveness. But the discussion regarding which of the various market instruments should be used, how they may interact and what combinations of policies should be implemented is still open and very lively. In this paper we propose a combination of instruments: the marketable emission permits already in place in Europe for major economic sectors, and a CO2 tax for economic sectors not included in the emission permit scheme. The study uses an applied general equilibrium model for the Spanish economy to compute the results obtained with the proposed new mix of instruments. As the combination of the market for emission permits and the CO2 tax admits different possibilities that depend on how the mitigation effort is distributed among the economic sectors, we concentrate on four possibilities: cost-effective, egalitarian, proportional-to-emissions, and proportional-to-output distributions. Other alternatives to the CO2 tax are also analysed (taxes on energy, on oil and on electricity). Our findings suggest that careful, well-designed policies are needed, as any deviation imposes significant additional costs that increase more than proportionally with the level of emissions reduction targeted by the EU.
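As a back-of-the-envelope illustration (not the applied general equilibrium model used in the paper) of why deviating from the cost-effective allocation raises total mitigation costs, consider two sectors with hypothetical quadratic abatement-cost curves; all parameter values below are assumptions for the sketch.

```python
# Toy illustration: cost-effective vs. egalitarian abatement split for two
# sectors with assumed quadratic abatement costs C_i(a) = c_i * a**2.
# The cost-effective split equalizes marginal costs; an equal split is more
# expensive whenever the cost parameters differ.
def total_cost(a1, a2, c1, c2):
    return c1 * a1**2 + c2 * a2**2

c1, c2 = 1.0, 4.0      # assumed abatement-cost parameters
A = 10.0               # total abatement target (arbitrary units)

# Cost-effective: 2*c1*a1 = 2*c2*a2 and a1 + a2 = A  =>  a1 = A*c2/(c1 + c2)
a1_eff = A * c2 / (c1 + c2)
a2_eff = A - a1_eff

# Egalitarian: each sector abates half of the target.
a1_eq = a2_eq = A / 2

print("cost-effective:", total_cost(a1_eff, a2_eff, c1, c2))  # 80.0
print("egalitarian:   ", total_cost(a1_eq, a2_eq, c1, c2))    # 125.0
```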
Abstract:
4 p.
Abstract:
44 p.
Abstract:
The main aim of this doctoral thesis is, first, to offer an alternative reconstruction of Proto-Ainu and, second, to apply concepts of holistic diachronic typology in order to discern some evolutionary pattern that helps to answer the question: why is the Ainu language the way it is in its geolinguistic context (an AOV, agent-object-verb, language with prefixes), when in the Eurasian region the usual profile is 'AOV language with suffixes'? In short, the thesis explores the possibilities offered by holistic diachronic typology, combined with more traditional methods, for investigating the prehistoric stages of isolated languages, that is, languages without known relatives, such as Ainu, Basque, Zuni or Burushaski. The work is divided into three large blocks comprising eight chapters, an appendix with the new Proto-Ainu reconstructions, and the bibliography. The first block opens with Chapter 1, which gives a brief presentation of the Ainu languages and their philology. Chapter 2 is devoted to the reconstruction of Proto-Ainu phonology. The pioneering reconstruction is that of A. Vovin (1992), which in fact serves as the basis on which new elements are expanded, corrected or modified. Chapter 3 describes the historical morphology of the Ainu languages. Chapter 4 investigates this option within a broader framework aimed at analysing the elementary patterns of word formation. Chapter 5, which opens the second block, presents a diachronic typological hypothesis, due to P. Donegan and D. Stampe, with which specialists in Munda and Mon-Khmer languages have arrived at a reconstruction of Proto-Austroasiatic according to which the agglutinative type of the Munda languages would be secondary to the original monosyllabic type of the Mon-Khmer languages. Chapter 6 returns to the traditional perspective of geographical linguistics, without neglecting some of the typological considerations raised in the previous chapter (the fact that the Donegan and Stampe hypothesis does not work for Ainu does not mean that diachronic typology cannot still be useful). Chapter 7 presents some inconsistencies that arise when the supposed archaeological evidence is combined with the linguistic scenario described in the previous chapters. The general conclusions are presented in Chapter 8. The appendix is a comparative table of the two reconstructions of Proto-Ainu available to date, namely those proposed by A. Vovin in his seminal 1992 study and in Chapter 3 of the present thesis. The table includes 686 reconstructions (a simple cross-reference with Vovin is possible, since both are ordered alphabetically).
Abstract:
In multisource industrial scenarios (MSIS), activities that generate NOAA (nano-objects and their agglomerates and aggregates) coexist with other productive sources of airborne particles, such as parallel manufacturing processes or electrical and diesel machinery. A distinctive characteristic of MSIS is the spatially complex distribution of aerosol sources, as well as their potential differences in dynamics, owing to the possibility of multi-task configurations at any given time. The background signal is therefore expected to challenge the aerosol analyzers over a potentially wide range of concentrations and size distributions, depending on the multisource configuration at a given time. Monitoring and prediction based on statistical analysis of time series captured by on-line particle analyzers in industrial scenarios have proven feasible for predicting particle number concentration (PNC) evolution, provided a certain quality of the net signal (the difference between the signal at the source and the background). However, the analysis and modelling of inconsistent time series affected by a low signal-to-noise ratio (SNR) could provide a misleading basis for decision making. In this context, this work explores the use of stochastic models based on the ARIMA methodology to monitor and predict exposure values (PNC). The study was carried out in an MSIS, with a case study focused on the manufacture of perforated tablets of nano-TiO2 by cold pressing.
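A minimal sketch of the ARIMA monitoring and forecasting workflow described above, using statsmodels; the synthetic PNC series, the model order and the action threshold are illustrative assumptions, not the study's fitted values.

```python
# Minimal sketch: fit an ARIMA model to a PNC time series and forecast ahead.
# Synthetic data and parameters are assumptions for illustration only.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# Synthetic 1-minute particle number concentration (PNC) series.
pnc = pd.Series(
    5000 + np.cumsum(rng.normal(0, 50, 480)),
    index=pd.date_range("2020-01-01 08:00", periods=480, freq="min"),
)

model = ARIMA(pnc, order=(1, 1, 1)).fit()   # order chosen for illustration only
forecast = model.get_forecast(steps=30)     # predict the next 30 minutes

threshold = 8000                            # hypothetical action level (particles/cm^3)
if (forecast.predicted_mean > threshold).any():
    print("Predicted PNC exceeds the action level within the next 30 minutes")
print(forecast.predicted_mean.tail())
```

In practice the model order would be selected from the measured series (e.g. via information criteria) and the threshold would come from the exposure assessment, as the abstract implies.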