964 results for Realization
Abstract:
Nitrogen-doped silicon (NIDOS) films have been deposited by low-pressure chemical vapor deposition from silane (SiH4) and ammonia (NH3) at high temperature (750°C), and the influence of the NH3/SiH4 gas ratio on the films' deposition rate, refractive index, stoichiometry, microstructure, electrical conductivity, and thermomechanical stress is studied. The chemical species derived from silylene (SiH2) in the gas phase are shown to be responsible for the deposition of NIDOS and/or (silicon-rich) silicon nitride. The competition between these two deposition phenomena finally leads to very high deposition rates (100 nm/min) for a low NH3/SiH4 gas ratio (R ≤ 0.1). Moreover, complex variations of the NIDOS film properties are evidenced and related to the dual behavior of the nitrogen atom in silicon, acting either as an n-type substitutional impurity or as an insulating interstitial impurity, depending on the Si–N atomic bonding. Finally, the use of NIDOS deposition for the realization of microelectromechanical systems is investigated.
Abstract:
In groundwater applications, Monte Carlo methods are employed to model the uncertainty in geological parameters. However, their brute-force application becomes computationally prohibitive for highly detailed geological descriptions, complex physical processes, and a large number of realizations. The Distance Kernel Method (DKM) overcomes this issue by clustering the realizations in a multidimensional space based on the flow responses obtained by means of an approximate (computationally cheaper) model; the uncertainty is then estimated from the exact responses, which are computed only for one representative realization per cluster (the medoid). Usually, DKM is employed to decrease the size of the sample of realizations considered to estimate the uncertainty. We propose instead to use the information from the approximate responses for uncertainty quantification. The subset of exact solutions provided by DKM is then employed to construct an error model and correct the potential bias of the approximate model. Two error models are devised; both employ the difference between the approximate and exact medoid solutions, but they differ in the way medoid errors are interpolated to correct the whole set of realizations. The Local Error Model rests upon the clustering defined by DKM and can be seen as a natural way to account for intra-cluster variability; the Global Error Model employs a linear interpolation of all medoid errors, regardless of the cluster to which a given realization belongs. These error models are evaluated for an idealized pollution problem in which the uncertainty of the breakthrough curve needs to be estimated. For this numerical test case, we demonstrate that the error models improve the uncertainty quantification provided by the DKM algorithm and are effective in correcting the bias of the estimate computed solely from the approximate multiscale finite volume (MsFV) results. The framework presented here is not specific to the methods considered and can be applied to other combinations of approximate models and techniques for selecting a subset of realizations.
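A minimal sketch of the two correction schemes described above, assuming a precomputed clustering and synthetic response arrays; all names are hypothetical, and the linear interpolation of the Global Error Model is approximated here by a least-squares fit of the medoid errors against a scalar summary of each approximate response:

```python
import numpy as np

def correct_realizations(approx, labels, medoid_idx, exact_medoids, mode="local"):
    """Correct approximate responses using exact solutions at the cluster medoids.

    approx        : (n_real, n_t) approximate responses for all realizations
    labels        : (n_real,) cluster label of each realization, in [0, n_clusters)
    medoid_idx    : (n_clusters,) realization index of each cluster's medoid
    exact_medoids : (n_clusters, n_t) exact responses computed at the medoids
    """
    # Error observed at each medoid: exact minus approximate response.
    medoid_err = exact_medoids - approx[medoid_idx]

    if mode == "local":
        # Local Error Model: every realization inherits the error of its own
        # cluster's medoid, a natural way to account for intra-cluster variability.
        return approx + medoid_err[labels]

    # Global Error Model (stand-in): fit medoid errors linearly against a scalar
    # summary of the approximate response, then apply the fit to all realizations.
    s_med = approx[medoid_idx].mean(axis=1)
    X_med = np.column_stack([np.ones_like(s_med), s_med])
    coef, *_ = np.linalg.lstsq(X_med, medoid_err, rcond=None)
    s_all = approx.mean(axis=1)
    X_all = np.column_stack([np.ones_like(s_all), s_all])
    return approx + X_all @ coef
```

In practice, `labels` and `medoid_idx` would come from the DKM step itself, e.g. k-medoids clustering in the multidimensional space built from the approximate flow responses.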
Abstract:
We have investigated, in the L-S coupling scheme, the appearance of triplet pairing in fermionic droplets in which a single nl shell is active. The method is applied to a constant-strength model, for which we discuss the different phase transitions that take place as the number of particles in the shell is varied. Droplets of 3He atoms are a plausible physical scenario for the realization of the model.
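For orientation, the familiar constant-strength pairing Hamiltonian for a single shell is shown below in its standard singlet (seniority) form; this is a textbook expression, not quoted from the paper, whose triplet L-S analogue is the model actually studied:

```latex
H = \varepsilon \sum_{m} a^{\dagger}_{m} a_{m}
    \;-\; G \sum_{m,\,m' > 0} a^{\dagger}_{m}\, a^{\dagger}_{\bar{m}}\, a_{\bar{m}'}\, a_{m'},
```

where $\bar{m}$ denotes the time-reversed partner of the single-particle state $m$ and $G > 0$ is the constant pairing strength.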
Abstract:
The decay of orthopositronium into three photons produces a physical realization of a pure state with three-party entanglement. Its quantum correlations are analyzed using recent results from quantum information theory, looking for the final state that has the maximal amount of Greenberger-Horne-Zeilinger (GHZ)-like correlations. This state allows for a statistical dismissal of local realism stronger than the one obtained using any entangled state of two spin-one-half particles.
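For concreteness, the three-party GHZ state referred to above is, in the standard basis,

```latex
|\mathrm{GHZ}\rangle = \frac{1}{\sqrt{2}} \left( |000\rangle + |111\rangle \right),
```

the canonical example of a pure three-party entangled state whose correlations cannot be reproduced by any local realistic model.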
Abstract:
A pseudoclassical model for a relativistic spinning particle is studied. The only physically meaningful world line is the one without Zitterbewegung. The Poincaré realization for this situation is constructed.
Abstract:
Optimal and finite positive operator-valued measurements on a finite number N of identically prepared systems have recently been presented. With physical realization in mind, we propose here optimal and minimal generalized quantum measurements for two-level systems. We explicitly construct them up to N = 7 and verify that they are minimal up to N = 5.
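As a reminder of the objects involved, a generalized (positive operator-valued) measurement with outcomes labeled r is a set of operators O_r satisfying

```latex
O_r \ge 0, \qquad \sum_{r} O_r = \mathbb{1}, \qquad p(r) = \operatorname{Tr}\!\left( \rho^{\otimes N} O_r \right);
```

"minimal" here refers to the number of outcomes r needed to achieve the optimal measurement.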
Abstract:
In this paper we consider diffusion of a passive substance C in a temporally and spatially inhomogeneous two-dimensional medium. As a realization of the latter we choose a phase-separating medium consisting of two substances A and B, whose dynamics is governed by the Cahn-Hilliard equation. Assuming different diffusion coefficients of C in A and B, we find that the variance of the distribution function of C grows less than linearly in time. We derive a simple identity for the variance using a probabilistic ansatz and are then able to identify the interface between A and B as the main cause of this nonlinear dependence. We argue that, for very large times, the time-dependent diffusion "constant" approaches a constant asymptotic value D∞ like t^(-1/3). The latter is calculated approximately by employing the effective-medium approximation and by fitting the simulation data to this time dependence.
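A minimal sketch of the fitting step mentioned at the end, assuming the asymptotic form D(t) ≈ D∞ + a·t^(-1/3) and synthetic data in place of the actual simulation output:

```python
import numpy as np
from scipy.optimize import curve_fit

def d_model(t, d_inf, a):
    # Assumed asymptotic form: D(t) approaches D_inf like t^(-1/3).
    return d_inf + a * t ** (-1.0 / 3.0)

# Synthetic stand-in for the measured time-dependent diffusion "constant".
rng = np.random.default_rng(1)
t = np.logspace(2, 6, 50)
d_obs = 0.42 + 1.7 * t ** (-1.0 / 3.0) + rng.normal(0.0, 0.005, t.size)

(d_inf, a), _ = curve_fit(d_model, t, d_obs, p0=(0.5, 1.0))
print(f"D_infinity ~= {d_inf:.3f}, prefactor a ~= {a:.3f}")
```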
Abstract:
We explore the phase diagram of a two-component ultracold atomic Fermi gas interacting with zero-range forces in the limit of weak coupling. We focus on the dependence of the pairing gap and the free energy on variations in the number densities of the two species while the total density of the system is held fixed. As the density asymmetry is increased, the system exhibits a transition from a homogeneous Bardeen-Cooper-Schrieffer (BCS) phase to phases with spontaneously broken global space symmetries. One such realization is deformed Fermi surface superfluidity (DFS), which exploits the possibility of deforming the Fermi surfaces of the species into ellipsoidal form at zero total momentum of the Cooper pairs. The critical asymmetries at which the transition from DFS to the unpaired state occurs are larger than those for the BCS phase. In this precritical region the DFS phase lowers the pairing energy of the asymmetric BCS state. We compare the DFS phase quantitatively to another realization of superconducting phases with broken translational symmetry: the single-plane-wave Larkin-Ovchinnikov-Fulde-Ferrell phase, which is characterized by a nonvanishing center-of-mass momentum of the Cooper pairs. The possibility of detecting the DFS phase in time-of-flight experiments is discussed and quantified for the case of 6Li atoms trapped in two different hyperfine states.
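Schematically, and only as an illustration (this parametrization is an assumption, not taken from the abstract), the DFS phase lets the two Fermi surfaces acquire opposite ellipsoidal (quadrupolar) deformations,

```latex
\varepsilon_{F,i}(\theta) = \varepsilon_{F,i} \left[ 1 + \beta_i\, P_2(\cos\theta) \right], \qquad i = 1, 2,
```

with P_2 the second Legendre polynomial and the deformation parameters β_i chosen to minimize the free energy at fixed densities.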
Abstract:
A haplotype is an m-long binary vector. The XOR-genotype of two haplotypes is the m-vector of their coordinate-wise XOR. We study the following problem: given a set of XOR-genotypes, reconstruct their haplotypes so that the set of resulting haplotypes can be mapped onto a perfect phylogeny (PP) tree. The question is motivated by the study of population evolution in human genetics and is a variant of the perfect phylogeny haplotyping problem, which has received intensive attention recently. Unlike the latter problem, in which the input is "full" genotypes, here we assume less informative input, which may therefore be more economical to obtain experimentally. Building on ideas of Gusfield, we show how to solve the problem in polynomial time, by a reduction to the graph realization problem. The actual haplotypes are not uniquely determined by the tree they map onto, and the tree itself may or may not be unique. We show that tree uniqueness implies uniquely determined haplotypes, up to inherent degrees of freedom, and give a sufficient condition for uniqueness. To actually determine the haplotypes given the tree, additional information is necessary. We show that two or three full genotypes suffice to reconstruct all the haplotypes, and we present a linear algorithm for identifying those genotypes.
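The forward direction of the basic definition is easy to state in code; a minimal sketch (hypothetical helper, haplotypes as binary lists):

```python
from typing import List

Haplotype = List[int]  # an m-long binary vector

def xor_genotype(h1: Haplotype, h2: Haplotype) -> Haplotype:
    """Coordinate-wise XOR of two haplotypes: 1 exactly where the sites differ."""
    assert len(h1) == len(h2)
    return [a ^ b for a, b in zip(h1, h2)]

# Example: the XOR-genotype records the heterozygous sites but not which
# haplotype carries which allele -- hence the reconstruction problem.
h1 = [0, 1, 1, 0, 1]
h2 = [0, 1, 0, 1, 1]
print(xor_genotype(h1, h2))  # [0, 0, 1, 1, 0]
```

Recovering the haplotypes from the XOR-genotypes is the hard inverse direction, which the paper solves under the perfect-phylogeny constraint via the reduction to graph realization.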
Abstract:
The most important features of the proposed spherical gravitational wave detectors are closely linked with their symmetry. Hollow spheres share this property with the solid ones considered in the literature so far, and constitute an interesting alternative for the realization of an omnidirectional gravitational wave detector. In this paper we address the problem of how a hollow elastic sphere interacts with an incoming gravitational wave and find an analytical solution for its normal mode spectrum and response, as well as for its energy absorption cross sections. It appears that this shape can be designed to have relatively low resonance frequencies (~200 Hz) while keeping a large cross section, so its frequency range overlaps with that of the projected large interferometers. We also apply these results to discuss the performance of a hollow sphere as a detector for a variety of gravitational wave signals.
Abstract:
It is shown that the world-volume field theory of a single D3-brane in a supergravity D3-brane background admits finite-energy, non-singular Abelian monopoles and dyons preserving 1/2 or 1/4 of the N=4 supersymmetry and saturating a Bogomolnyi-type bound. The 1/4-supersymmetric solitons provide a world-volume realization of string-junction dyons. We also discuss the dual M-theory realization of the 1/2-supersymmetric dyons as finite-tension self-dual strings on the M5-brane, and of the 1/4-supersymmetric dyons as their intersections.
Abstract:
It is shown that a IIA superstring carrying D0-brane charge can be "blown up", in a Minkowski vacuum background, to a (1/4)-supersymmetric tubular D2-brane, supported against collapse by the angular momentum generated by crossed electric and magnetic Born-Infeld fields. This supertube can be viewed as a world-volume realization of the sigma-model Q lump.
Abstract:
Today, perhaps without realizing it, Iowans are factoring climate change into their lives and activities. Current farming practices and flood mitigation efforts, for example, reflect warmer winters, longer growing seasons, warmer nights, higher dew-point temperatures, increased humidity, greater annual stream flows, and more frequent severe precipitation events (Fig. 1) than were prevalent during the past 50 years. Some of the effects of these changes (such as the longer growing season) may be positive, while others (particularly the tendency toward greater precipitation events that lead to flooding) are negative. Climate change embodies all of these results, and many more, in a complex manner. The Iowa legislature has been proactive in seeking advice about climate change and its impacts on our state. In 2007, Governor Culver and the Iowa General Assembly enacted Senate File 485 and House File 2571 to create the Iowa Climate Change Advisory Council (ICCAC). ICCAC members reported an emissions inventory and forecast for Iowa's greenhouse gases (GHGs), policy options for reducing Iowa's GHG emissions, and two scenarios charting GHG reductions of 50% and 90% by 2050 from a 2005 baseline. Following issuance of the final report in December 2008, the General Assembly enacted a new bill in 2009 (Sec. 27, Section 473.7, Code 2009 amended) that set in motion a review of climate change impacts and policies in Iowa. This report is the result of that 2009 bill. It continues the dialogue between Iowa's stakeholders, scientific community, and the state legislature that was begun with these earlier reports.
Abstract:
Radioactive soil-contamination mapping and risk assessment is a vital issue for decision makers. Traditional approaches to mapping the spatial concentration of radionuclides employ various regression-based models, which usually provide a single-valued prediction, accompanied in some cases by an estimation error. Such approaches do not provide the capability for rigorous uncertainty quantification or probabilistic mapping. Machine learning is a recent and fast-developing approach based on learning patterns and information from data. Artificial neural networks for prediction mapping have been especially powerful in combination with spatial statistics. A data-driven approach provides the opportunity to integrate additional relevant information about spatial phenomena into a prediction model for more accurate spatial estimates and associated uncertainty. Machine-learning algorithms can also be used for a wider spectrum of problems than before: classification, probability density estimation, and so forth. Stochastic simulations are used to model spatial variability and uncertainty. Unlike regression models, they provide multiple realizations of a particular spatial pattern, which allow uncertainty and risk quantification. This paper reviews the most recent methods of spatial data analysis, prediction, and risk mapping based on machine learning and stochastic simulations, in comparison with more traditional regression models. The radioactive fallout from the Chernobyl Nuclear Power Plant accident is used to illustrate the application of the models to prediction and classification problems. This fallout is a unique case study that poses the challenging task of analyzing huge amounts of data ('hard' direct measurements, as well as supplementary information and expert estimates) and solving particular decision-oriented problems.
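A minimal sketch of the contrast drawn above between single-value prediction and realization-based probabilistic mapping; the ensemble here is synthetic, standing in for the output of a stochastic simulation algorithm:

```python
import numpy as np

# Synthetic ensemble: n_real realizations of a contamination field on a grid.
rng = np.random.default_rng(0)
n_real, ny, nx = 200, 50, 50
realizations = rng.lognormal(mean=0.0, sigma=0.5, size=(n_real, ny, nx))

threshold = 1.5  # hypothetical regulatory contamination level

# Quantities a single regression map cannot provide:
prob_exceed = (realizations > threshold).mean(axis=0)              # P(C > threshold) per cell
p10, p50, p90 = np.percentile(realizations, [10, 50, 90], axis=0)  # local quantile maps

print(f"maximum exceedance probability on the grid: {prob_exceed.max():.2f}")
```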