933 results for multi-quantum-well
Abstract:
Quantum feedback can stabilize a two-level atom against decoherence (spontaneous emission), putting it into an arbitrary (specified) pure state. This requires perfect homodyne detection of the atomic emission, and instantaneous feedback. Inefficient detection was considered previously by two of us. Here we allow for a non-zero delay time tau in the feedback circuit. Because a two-level atom is a non-linear optical system, an analytical solution is not possible. However, quantum trajectories allow a simple numerical simulation of the resulting non-Markovian process. We find the effect of the time delay to be qualitatively similar to that of inefficient detection. The solution of the non-Markovian quantum trajectory will not remain fixed, so that the time-averaged state will be mixed, not pure. In the case where one tries to stabilize the atom in the excited state, an approximate analytical solution to the quantum trajectory is possible. The result, that the purity (P = 2 Tr[rho^2] - 1) of the average state is given by P = 1 - 4 gamma tau (where gamma is the spontaneous emission rate), is found to agree very well with the numerical results. (C) 2001 Elsevier Science B.V. All rights reserved.
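As a minimal illustration of the quantities quoted above (not the authors' trajectory simulation), the sketch below evaluates the purity measure P = 2 Tr[rho^2] - 1 for a hypothetical time-averaged qubit state with a small ground-state admixture eps = gamma*tau, for which P = 1 - 4*eps*(1 - eps) is approximately 1 - 4*gamma*tau; the state and the parameter values are assumptions chosen purely for illustration.

```python
import numpy as np

def purity(rho):
    """Purity measure used above: P = 2 Tr[rho^2] - 1 (1 for a pure qubit state, 0 when maximally mixed)."""
    return 2.0 * np.trace(rho @ rho).real - 1.0

# Hypothetical time-averaged state: target excited state |e> with a small
# ground-state admixture eps (taken here as eps = gamma*tau for illustration).
gamma, tau = 1.0, 0.01                      # spontaneous emission rate and feedback delay (arbitrary units)
eps = gamma * tau
rho_avg = np.diag([1.0 - eps, eps])         # basis ordering: (|e>, |g>)

print(purity(rho_avg))                      # 1 - 4*eps*(1 - eps) = 0.9604
print(1.0 - 4.0 * gamma * tau)              # approximate analytical scaling quoted above: 0.96
```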
Abstract:
This paper deals with atomic systems coupled to a structured reservoir of quantum EM field modes, with particular relevance to atoms interacting with the field in photonic band gap materials. The case of high Q cavities has been treated elsewhere using Fano diagonalization based on a quasimode approach, showing that the cavity quasimodes are responsible for pseudomodes introduced to treat non-Markovian behaviour. The paper considers a simple model of a photonic band gap case, where the spatially dependent permittivity consists of a constant term plus a small spatially periodic term that leads to a narrow band gap in the spectrum of mode frequencies. Most treatments of photonic band gap materials are based on the true modes, obtained numerically by solving the Helmholtz equation for the actual spatially periodic permittivity. Here the field modes are first treated in terms of a simpler quasimode approach, in which the quasimodes are plane waves associated with the constant permittivity term. Couplings between the quasimodes occur owing to the small periodic term in the permittivity, with selection rules for the coupled modes being related to the reciprocal lattice vectors. This produces a field Hamiltonian in quasimode form. A matrix diagonalization method may be applied to relate true mode annihilation operators to those for quasimodes. The atomic transitions are coupled to all the quasimodes, and the true mode atom-EM field coupling constants (one-photon Rabi frequencies) are related to those for the quasimodes; expressions are also obtained for the true mode density. The results for the one-photon Rabi frequencies differ from those assumed in other work. Expressions for atomic decay rates are obtained using the Fermi Golden rule, although these are valid only well away from the band gaps.
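To make the quasimode picture concrete, here is a toy one-dimensional "nearly free photon" sketch of my own (not the paper's Fano/quasimode formalism): plane-wave quasimodes q + nG, with G a reciprocal lattice vector, are coupled by a small constant V standing in for the weak periodic part of the permittivity, and diagonalizing the resulting matrix gives "true mode" frequencies with a gap of order 2V at the zone edge. All parameter values are assumed for illustration.

```python
import numpy as np

# Toy 1D nearly-free-photon model: quasimodes q + n*G with frequencies c|q + n*G|
# are coupled by a small constant V (weak periodic permittivity term).
c, a, V = 1.0, 1.0, 0.05
G = 2.0 * np.pi / a
n_modes = 7                                   # truncated quasimode basis n = -3..3

def true_mode_freqs(q):
    ns = np.arange(n_modes) - n_modes // 2
    H = np.diag(c * np.abs(q + ns * G)).astype(float)
    for i in range(n_modes - 1):              # selection rule: modes differing by one reciprocal lattice vector couple
        H[i, i + 1] = H[i + 1, i] = V
    return np.linalg.eigvalsh(H)              # "true mode" frequencies of the toy model

# A gap of roughly 2*V opens at the Brillouin-zone edge q = G/2:
edge = true_mode_freqs(G / 2)
print(edge[1] - edge[0])                      # ~ 2*V = 0.1
```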
Abstract:
We develop a systematic theory of critical quantum fluctuations in the driven parametric oscillator. Our analytic results agree well with stochastic numerical simulations. We also compare the results obtained in the positive-P representation, as a fully quantum-mechanical calculation, with the truncated Wigner phase-space equation, also known as the semiclassical theory. We show where these results agree and where they differ in calculations taken beyond the linearized approximation. We find that the optimal broadband noise reduction occurs just above threshold. In this region, where there are large quantum fluctuations in the conjugate variance and macroscopic quantum superposition states might be expected, we find that the quantum predictions correspond very closely to the semiclassical theory.
Abstract:
Which gates are universal for quantum computation? Although it is well known that certain gates on two-level quantum systems (qubits), such as the controlled-NOT, are universal when assisted by arbitrary one-qubit gates, it has only recently become clear precisely what class of two-qubit gates is universal in this sense. We present an elementary proof that any entangling two-qubit gate is universal for quantum computation, when assisted by one-qubit gates. A proof of this result for systems of arbitrary finite dimension has been provided by Brylinski and Brylinski; however, their proof relies on a long argument using advanced mathematics. In contrast, our proof provides a simple constructive procedure which is close to optimal and experimentally practical.
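As a concrete companion to the criterion stated above, the sketch below checks whether a given two-qubit gate creates entanglement from a particular product input, by computing the purity of the reduced state of one qubit. A gate is entangling if it entangles some product input, so testing specific inputs as done here is illustrative rather than a complete decision procedure.

```python
import numpy as np

def is_entangling_on(U, psi_a, psi_b, tol=1e-9):
    """Return True if the two-qubit gate U maps psi_a (x) psi_b to an entangled state,
    detected via the purity of the reduced state of qubit A."""
    psi = U @ np.kron(psi_a, psi_b)
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
    rho_a = np.einsum('ijkj->ik', rho)              # partial trace over qubit B
    return np.trace(rho_a @ rho_a).real < 1.0 - tol

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
zero = np.array([1, 0], dtype=complex)

print(is_entangling_on(CNOT, plus, zero))            # True: CNOT|+0> is a Bell state
print(is_entangling_on(np.eye(4, dtype=complex), plus, zero))  # False: the identity is not entangling
```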
Abstract:
The isotope composition of Pb is difficult to determine accurately due to the lack of a stable normalisation ratio. Double and triple-spike addition techniques provide one solution and presently yield the most accurate measurements. A number of recent studies have claimed that improved accuracy and precision could also be achieved by multi-collector ICP-MS (MC-ICP-MS) Pb-isotope analysis using the addition of Tl of known isotope composition to Pb samples. In this paper, we verify whether the known isotope composition of Tl can be used for correction of mass discrimination of Pb with an extensive dataset for the NIST standard SRM 981, comparison of MC-ICP-MS with TIMS data, and comparison with three isochrons from different geological environments. When all our NIST SRM 981 data are normalised with one constant Tl-205/Tl-203 of 2.38869, the following averages and reproducibilities were obtained: Pb-207/Pb-206 = 0.91461+/-18; Pb-208/Pb-206 = 2.1674+/-7; and Pb-206/Pb-204 = 16.941+/-6. These two sigma standard deviations of the mean correspond to 149, 330, and 374 ppm, respectively. Accuracies relative to triple-spike values are 149, 157, and 52 ppm, respectively, and thus well within uncertainties. The largest component of the uncertainties stems from the Pb data alone and is not caused by differential mass discrimination behaviour of Pb and Tl. In routine operation, variation of sample introduction memory and production of isobaric molecular interferences in the spectrometer's collision cell currently appear to be the ultimate limitation to better reproducibility. A comparative study of five different datasets from actual samples (bullets, international rock standards, carbonates, metamorphic minerals, and sulphide minerals) demonstrates that in most cases geological scatter of the sample exceeds the achieved analytical reproducibility. We observe good agreement between TIMS and MC-ICP-MS data for international rock standards but find that such comparison does not constitute the ultimate test for the validity of the MC-ICP-MS technique. Two attempted isochrons resulted in geological scatter (in one case small) in excess of analytical reproducibility. However, in one case (leached Great Dyke sulphides) we obtained a true isochron (MSWD = 0.63) age of 2578.3 +/- 0.9 Ma, which is identical to and more precise than a recently published U-Pb zircon age (2579 +/- 3 Ma) for a Great Dyke websterite [Earth Planet. Sci. Lett. 180 (2000) 1-12]. We regard the reproducibility of this age by means of an isochron as a robust test of accuracy over a wide dynamic range. We show that reliable and accurate Pb-isotope data can be obtained by careful operation of second-generation MC-ICP magnetic sector mass spectrometers. (C) 2002 Elsevier Science B.V. All rights reserved.
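For readers unfamiliar with the Tl-doping approach, the sketch below applies the standard exponential mass-fractionation law, using the Tl-205/Tl-203 value quoted above as the normalisation ratio. The "measured" ratios are made up for illustration, the atomic masses are rounded, and this is not the paper's data-reduction code.

```python
import numpy as np

# Exponential-law mass-bias correction using an internal Tl spike (illustrative values).
M = {'Tl203': 202.9723, 'Tl205': 204.9744,
     'Pb204': 203.9730, 'Pb206': 205.9745, 'Pb207': 206.9759, 'Pb208': 207.9767}

TL_TRUE = 2.38869                         # Tl-205/Tl-203 normalisation value quoted above

def correct(ratio_meas, m_num, m_den, tl_meas):
    """Correct a measured isotope ratio with the exponential fractionation law."""
    f = np.log(TL_TRUE / tl_meas) / np.log(M['Tl205'] / M['Tl203'])
    return ratio_meas * (m_num / m_den) ** f

# Hypothetical raw data: instrumental mass bias makes heavy/light ratios read slightly high.
tl_meas = 2.4129                          # made-up measured Tl-205/Tl-203
pb76_meas = 0.9190                        # made-up measured Pb-207/Pb-206
print(correct(pb76_meas, M['Pb207'], M['Pb206'], tl_meas))   # ~0.914, i.e. pulled toward the reference value
```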
Abstract:
We have recently developed a scalable Artificial Boundary Inhomogeneity (ABI) method [Chem. Phys. Lett. 366, 390–397 (2002)] based on the utilization of the Lanczos algorithm, and in this work explore an alternative iterative implementation based on the Chebyshev algorithm. Detailed comparisons between the two iterative methods have been made in terms of efficiency as well as convergence behavior. The Lanczos subspace ABI method was also further improved by the use of a simpler three-term backward recursion algorithm to solve the subspace linear system. The two different iterative methods are tested on the model collinear H+H2 reactive state-to-state scattering problem.
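As a minimal illustration of the Chebyshev side of the comparison (a random symmetric toy matrix, not the ABI operator or the actual scattering implementation), the sketch below generates Chebyshev vectors T_k(Hs)v with the standard three-term recursion after scaling the spectrum into [-1, 1].

```python
import numpy as np

# Three-term Chebyshev recursion T_{k+1}(Hs) v = 2 Hs T_k(Hs) v - T_{k-1}(Hs) v
# on a toy symmetric "Hamiltonian" whose spectrum is scaled into [-1, 1].
rng = np.random.default_rng(0)
H = rng.standard_normal((200, 200))
H = 0.5 * (H + H.T)
emin, emax = np.linalg.eigvalsh(H)[[0, -1]]
Hs = (2.0 * H - (emax + emin) * np.eye(200)) / (emax - emin)

v = rng.standard_normal(200)
v /= np.linalg.norm(v)
t_prev, t_curr = v.copy(), Hs @ v             # T_0 v and T_1 v
cheb_vectors = [t_prev, t_curr]
for k in range(2, 50):                        # generate T_2 v ... T_49 v
    t_next = 2.0 * Hs @ t_curr - t_prev
    cheb_vectors.append(t_next)
    t_prev, t_curr = t_curr, t_next

# Norms stay bounded (<= 1) because |T_k(x)| <= 1 on the scaled spectrum.
print(len(cheb_vectors), np.linalg.norm(cheb_vectors[-1]))
```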
Abstract:
Implementing multi-level governance has been a key priority in EU cohesion policy. This study assesses the perceived achievements and shortcomings in implementing the European Social Fund, the main financial instrument of EU social policy, by analyzing its deficits and weaknesses as well as the poor participation, in the design and implementation of this fund, of the local agents who are in direct contact with the beneficiaries.
Abstract:
In recent decades, equipment availability has become an important issue, as has its dependence on component characteristics such as reliability and maintainability. This is of particular importance when dealing with high-risk industrial equipment, where these factors play a fundamental role in risk management whenever safety or large economic values are at stake. As availability is a function of reliability, maintainability, and maintenance support activities, the main goal is to improve one or more of these factors. This paper shows how maintainability can influence availability and presents a methodology to select the most important attributes for maintainability using a partial Multi-Criteria Decision Making (pMCDM) approach. Improvements in maintainability can be analyzed by treating it as a probability derived from a restore probability density function g(t).
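As an illustration of the relationship exploited here (with made-up numbers and an assumed lognormal restore-time distribution, not the paper's data or method), the SciPy sketch below computes steady-state availability A = MTTF / (MTTF + MTTR), obtaining MTTR as the mean of the restore pdf g(t) and maintainability M(t) as its cumulative distribution.

```python
import numpy as np
from scipy import integrate, stats

# Steady-state availability from reliability (MTTF) and maintainability (restore pdf g(t)).
failure_rate = 1.0 / 2000.0                 # assumed constant failure rate -> MTTF = 2000 h
mttf = 1.0 / failure_rate

g = stats.lognorm(s=0.6, scale=8.0)         # hypothetical restore-time distribution g(t), in hours
mttr, _ = integrate.quad(lambda t: t * g.pdf(t), 0.0, np.inf)

availability = mttf / (mttf + mttr)
maintainability_24h = g.cdf(24.0)           # M(t): probability the restore is completed within t hours
print(round(mttr, 2), round(availability, 5), round(maintainability_24h, 3))
```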
Abstract:
We write down the renormalization-group equations for the Yukawa-coupling matrices in a general multi-Higgs-doublet model. We then assume that the matrices of the Yukawa couplings of the various Higgs doublets to right-handed fermions of fixed quantum numbers are all proportional to each other. We demonstrate that, in the case of the two-Higgs-doublet model, this proportionality is preserved by the renormalization-group running only in the cases of the standard type-I, II, X, and Y models. We furthermore show that a similar result holds even when there are more than two Higgs doublets: the Yukawa-coupling matrices to fermions of a given electric charge remain proportional under the renormalization-group running if and only if there is a basis for the Higgs doublets in which all the fermions of a given electric charge couple to only one Higgs doublet.
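One common way to write the proportionality assumption described above is the "aligned" ansatz below; this is a sketch of the ansatz only (conventions for the Yukawa Lagrangian vary), restating the abstract's result in that notation.

```latex
% Proportionality (alignment) ansatz: every doublet couples to the right-handed
% fermions of a given charge through the same flavour matrix, up to a constant.
\begin{equation}
  Y_f^{(a)} \;=\; c_f^{(a)}\, Y_f , \qquad f = u,\, d,\, \ell , \qquad a = 1, \dots, n_{\text{doublets}} .
\end{equation}
% For two doublets this structure survives renormalization-group running only in the
% type-I, II, X and Y models; for more doublets it survives if and only if there is a
% Higgs basis in which each charge sector couples to a single doublet.
```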
Abstract:
This work focused on a multi-purpose estuarine environment (river Sado estuary, SW Portugal) around which a number of activities (e.g., fishing, farming, heavy industry, tourism and recreational activities) coexist with urban centres totalling about 200 000 inhabitants. Based on previous knowledge of the hazardous chemicals within the ecosystem and their potential toxicity to benthic species, this project intended to evaluate the impact of estuarine contaminants on human and ecosystem health. An integrative methodology was used, based on epidemiological, analytical and biological data and comprising several lines of evidence, namely human contamination pathways, human health effects, consumption of local produce, contamination of estuarine sediments, wells and soils, effects on commercial benthic organisms, and the genotoxic potential of sediments. The epidemiological survey confirmed the occurrence of direct and indirect (through the food chain) exposure of the local population to estuarine contaminants. Furthermore, the complex mixture of contaminants (e.g., metals, pesticides, polycyclic aromatic hydrocarbons) trapped in the estuary sediments was toxic to human liver cells exposed in vitro, causing cell death, oxidative stress and genotoxic effects that might constitute a risk factor for the development of chronic-degenerative diseases in the long term. Finally, the integration of data from several endpoints indicated that the estuary is moderately impacted by toxicants that also affect the aquatic biota. Nevertheless, the human health risk can only be correctly assessed through a biomonitoring study including the quantification of contaminants (or their metabolites) in biological fluids as well as biomarkers of early biological effects (e.g., biochemical, genetic and omics-based endpoints) and genetic susceptibility in the target population. These data should be supported by a detailed survey to assess the impact of consuming contaminated seafood and local farm products on human health and, particularly, on the development of metabolic diseases or cancer.
Abstract:
Although emotion is an important factor in our everyday life, it is often forgotten in the development of systems to be used by people. In this work we present an architecture for a ubiquitous group decision support system able to support people in group decision processes. The system considers the emotional factors of the participants involved, as well as the argumentation between them. Particular attention is given to one of the components of this system: the multi-agent simulator, which models the human participants, considers their emotional characteristics, and allows the exchange of hypothetical arguments among them.
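Purely as a hypothetical illustration of what such a simulated participant might look like (the class, attribute names, and update rule below are my own inventions, not the system described above), here is a minimal sketch of an agent that carries an emotional state and receives arguments:

```python
from dataclasses import dataclass, field

@dataclass
class SimulatedParticipant:
    """Hypothetical simulated participant with a simple emotional state."""
    name: str
    valence: float = 0.0          # emotional "pleasantness", in [-1, 1]
    arousal: float = 0.0          # emotional intensity, in [0, 1]
    preferred_option: str = ""
    received_args: list = field(default_factory=list)

    def receive_argument(self, argument: str, persuasiveness: float) -> None:
        """Store an incoming argument and let it nudge the emotional state."""
        self.received_args.append(argument)
        self.arousal = min(1.0, self.arousal + 0.1 * persuasiveness)
        self.valence = max(-1.0, min(1.0, self.valence - 0.05 * persuasiveness))

alice = SimulatedParticipant("Alice", preferred_option="A")
alice.receive_argument("Option B is cheaper", persuasiveness=0.8)
print(alice.arousal, alice.valence, alice.received_args)
```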
Abstract:
In recent years it has become increasingly clear that the mammalian transcriptome is highly complex and includes a large number of small non-coding RNAs (sncRNAs) and long non-coding RNAs (lncRNAs). Here we review the biogenesis pathways of the three classes of sncRNAs, namely short interfering RNAs (siRNAs), microRNAs (miRNAs) and PIWI-interacting RNAs (piRNAs). These ncRNAs have been extensively studied and are involved in pathways leading to specific gene silencing and to the protection of genomes against viruses and transposons, for example. LncRNAs have also emerged as pivotal molecules for the transcriptional and post-transcriptional regulation of gene expression, a role supported by their tissue-specific expression patterns, subcellular distribution, and developmental regulation. Therefore, we also focus our attention on their role in differentiation and development. SncRNAs and lncRNAs play critical roles in defining DNA methylation patterns as well as in chromatin remodeling, thus having a substantial effect on epigenetics. The identification of some overlaps in their biogenesis pathways and functional roles raises the hypothesis that these molecules act in concert in vivo, creating complex regulatory networks in which cooperation with regulatory proteins is necessary. We also highlight the implications of deregulated biogenesis and gene expression of sncRNAs and lncRNAs in human diseases such as cancer.
Abstract:
Doctoral Thesis, Marine Sciences (Marine Biology)
Abstract:
Solvent extraction is treated as a multi-criteria optimization problem, since several chemical species with similar extraction kinetic properties are frequently present in the aqueous phase and selective extraction is not practicable. This optimization, applied to mixer–settler units, considers the best parameters and operating conditions, as well as the best structure or process flow-sheet. Global process optimization is performed for a specific flow-sheet and a comparison of Pareto curves for different flow-sheets is made. The positive-weight sum approach, linked to the sequential quadratic programming method, is used to obtain the Pareto set. In all investigated structures, recovery increases with hold-up, residence time and agitation speed, while purity shows the opposite behaviour. For the same treatment capacity, counter-current arrangements are shown to promote recovery without significant impairment of purity. Recycling the aqueous phase is shown to be irrelevant, but organic recycling with as many stages as economically feasible clearly improves the design criteria and reduces the optimal organic flow-rate.
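The sketch below shows the general weighted-sum plus SQP pattern on a toy two-objective problem: a set of positive weights is scanned, each scalarised problem is solved with an SQP-type solver (SciPy's SLSQP), and the resulting points trace a Pareto-like trade-off. The "recovery" and "purity" surrogates, decision variables, and bounds are invented for illustration and are not the paper's mixer–settler model.

```python
import numpy as np
from scipy.optimize import minimize

def recovery(x):                # toy surrogate: increases with residence time and agitation
    return 1.0 - np.exp(-0.8 * x[0] - 0.3 * x[1])

def purity(x):                  # toy surrogate: decreases as operation gets more aggressive
    return 1.0 / (1.0 + 0.4 * x[0] + 0.2 * x[1])

bounds = [(0.1, 5.0), (0.1, 5.0)]
pareto = []
for w in np.linspace(0.05, 0.95, 10):                # positive weights on the two objectives
    obj = lambda x, w=w: -(w * recovery(x) + (1.0 - w) * purity(x))
    res = minimize(obj, x0=[1.0, 1.0], method='SLSQP', bounds=bounds)
    pareto.append((recovery(res.x), purity(res.x)))

for r, p in pareto:
    print(f"recovery={r:.3f}  purity={p:.3f}")        # the two criteria move in opposite directions
```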
Abstract:
Due to usage conditions, hazardous environments or intentional causes, physical and virtual systems are subject to faults in their components, which may affect their overall behaviour. In a ‘black-box’ agent modelled by a set of propositional logic rules, in which just a subset of components is externally visible, such faults may only be recognised by examining some output function of the agent. A (fault-free) model of the agent’s system provides the expected output for a given input; if the real output differs from that predicted output, then the system is faulty. However, some faults may only become apparent in the system output when appropriate inputs are given. A number of problems regarding both testing and diagnosis thus arise, such as testing a fault, testing the whole system, finding possible faults and differentiating them to locate the correct one. The corresponding optimisation problems of finding solutions that require minimum resources are also very relevant in industry, as is minimal diagnosis. In this dissertation we use a well-established set of benchmark circuits to address such diagnostic problems and propose and develop models with different logics, which we formalise and generalise as much as possible. We also prove that all techniques generalise to agents and to multiple faults. The developed multi-valued logics extend the usual Boolean logic (suitable for fault-free models) by encoding values with some dependency (usually on faults). Such logics thus allow modelling an arbitrary number of diagnostic theories. Each problem is subsequently solved with CLP solvers that we implement and discuss, together with a new efficient search technique that we present. We compare our results with other approaches such as SAT (which require substantial duplication of circuits), showing the effectiveness of constraints over multi-valued logics, and also the adequacy of a general set constraint solver (with special inferences over set functions such as cardinality) on other problems. In addition, for an optimisation problem, we integrate local search with a constructive approach (branch-and-bound) using a variety of logics to improve an existing efficient tool based on SAT and ILP.
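To make the basic testing notion concrete (this is a brute-force toy of my own, not the constraint-based multi-valued-logic approach developed in the dissertation), the sketch below finds the input vectors that expose a single stuck-at fault in a two-gate circuit by comparing fault-free and faulty outputs.

```python
from itertools import product

def circuit(a, b, c, stuck=None):
    """Tiny combinational circuit: an AND gate feeding an OR gate.
    stuck = (wire_name, value) forces an internal wire to a constant."""
    w = a and b                              # internal wire: w = AND(a, b)
    if stuck and stuck[0] == 'w':
        w = stuck[1]
    out = w or c                             # output: OR(w, c)
    if stuck and stuck[0] == 'out':
        out = stuck[1]
    return out

fault = ('w', 0)                             # wire w stuck-at-0
tests = [inp for inp in product([0, 1], repeat=3)
         if circuit(*inp) != circuit(*inp, stuck=fault)]
print(tests)                                 # inputs that expose the fault: [(1, 1, 0)]
```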