974 results for augmented-wave method
Abstract:
We present the quantum theory of the far-off-resonance continuous-wave Raman laser using the Heisenberg-Langevin approach. We show that the simplified quantum Langevin equations for this system are mathematically identical to those of the nondegenerate optical parametric oscillator in the time domain, with the following associations: pump ↔ pump, Stokes ↔ signal, and Raman coherence ↔ idler. We derive analytical results for both the steady-state behavior and the time-dependent noise spectra, using standard linearization procedures. In the semiclassical limit, these results agree with previous purely semiclassical treatments, which yield excellent agreement with experimental observations. The analytical time-dependent results predict perfect photon-statistics conversion from the pump to the Stokes field and nonclassical behavior under certain operational conditions.
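As a rough illustration of the stated correspondence, the nondegenerate optical parametric oscillator is usually written as three coupled Heisenberg-Langevin equations; the operator labels, damping rates γ, and coupling κ below are generic textbook notation, not the paper's:

```latex
% Illustrative NDPO-form Langevin equations (external pump drive omitted).
% Under the paper's mapping, a_p plays the pump, a_s the signal (Stokes),
% and a_i the idler (Raman coherence).
\begin{aligned}
\dot{a}_p &= -\gamma_p a_p - \kappa\, a_s a_i + F_p(t), \\
\dot{a}_s &= -\gamma_s a_s + \kappa\, a_p a_i^{\dagger} + F_s(t), \\
\dot{a}_i &= -\gamma_i a_i + \kappa\, a_p a_s^{\dagger} + F_i(t),
\end{aligned}
```

where the F_k(t) are the associated Langevin noise operators.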
Abstract:
This paper describes the modification of a two-dimensional finite-element long-wave hydrodynamic model to predict the net currents and water levels attributable to the influence of waves. Tests examine the effects of applying wave-induced forces, including comparisons with a physical experiment. An example of a real river system is presented with comparisons to measured data, which demonstrate the importance of simulating the combined effects of tides and waves on hydrodynamic behavior. (C) 2002 Elsevier Science Ltd. All rights reserved.
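The abstract does not spell out how the wave-induced forces are applied; the standard route in depth-averaged models is to add radiation-stress gradients (after Longuet-Higgins and Stewart) to the momentum equations. A minimal sketch of that forcing term, with illustrative array names, is:

```python
# Hedged sketch: wave forcing as radiation-stress gradients (standard approach;
# the paper's exact formulation is not given in the abstract).
import numpy as np

def wave_forcing(Sxx, Sxy, Syy, dx, dy, rho=1025.0):
    """Wave force per unit mass from radiation-stress components
    (2-D arrays on a regular grid). Returns (Fx, Fy)."""
    dSxx_dx = np.gradient(Sxx, dx, axis=1)
    dSxy_dy = np.gradient(Sxy, dy, axis=0)
    dSxy_dx = np.gradient(Sxy, dx, axis=1)
    dSyy_dy = np.gradient(Syy, dy, axis=0)
    Fx = -(dSxx_dx + dSxy_dy) / rho
    Fy = -(dSxy_dx + dSyy_dy) / rho
    return Fx, Fy
```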
Abstract:
A new wavelet-based adaptive framework for solving population balance equations (PBEs) is proposed in this work. The technique is general, powerful and efficient, and requires no prior assumptions about the characteristics of the processes. Because number densities can vary steeply across the size range, a new strategy is developed to select the optimal order of resolution and the collocation points based on an interpolating wavelet transform (IWT). The proposed technique has been tested for size-independent agglomeration, agglomeration with a linear summation kernel and agglomeration with a nonlinear kernel. In all cases, the predicted and analytical particle size distributions (PSDs) are in excellent agreement. Further work on the solution of general population balance equations with nucleation, growth and agglomeration, and of steady-state population balance equations, will be presented within this framework. (C) 2002 Elsevier Science B.V. All rights reserved.
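A minimal sketch of the kind of adaptive collocation an interpolating wavelet transform enables: keep dyadic points whose interpolation detail exceeds a threshold, so that resolution concentrates where the number density varies steeply. The grid, threshold and test function below are illustrative, not the paper's:

```python
# Hedged sketch: select collocation points where the interpolating-wavelet
# detail (interpolation error at dyadic midpoints) exceeds a threshold.
import numpy as np

def adaptive_points(f, a, b, max_level=8, eps=1e-3):
    """Return the grid points kept by thresholding midpoint interpolation errors."""
    keep = {a, b}
    for level in range(1, max_level + 1):
        n = 2 ** level
        x = np.linspace(a, b, n + 1)
        coarse = x[::2]          # points of the previous level
        mids = x[1::2]           # new dyadic midpoints
        # linear prediction from the coarse level (higher-order stencils in practice)
        predicted = 0.5 * (f(coarse[:-1]) + f(coarse[1:]))
        detail = np.abs(f(mids) - predicted)
        keep.update(mids[detail > eps])
    return np.array(sorted(keep))

# Example: a steep, front-like profile keeps more points near the front.
pts = adaptive_points(lambda x: np.exp(-200 * (x - 0.3) ** 2), 0.0, 1.0)
```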
Abstract:
We develop a method for determining the elements of the pressure tensor at a radius r in a cylindrically symmetric system, analogous to the so-called method of planes used in planar systems [B. D. Todd, Denis J. Evans, and Peter J. Daivis, Phys. Rev. E 52, 1627 (1995)]. We demonstrate its application in determining the radial shear stress dependence during molecular dynamics simulations of the forced flow of methane in cylindrical silica mesopores. Such expressions are useful for the examination of constitutive relations in the context of transport in confined systems.
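The cylindrical expressions themselves are not given in the abstract; as a rough illustration of the underlying idea (momentum flux across a dividing surface), here is the configurational part of the planar method-of-planes shear stress from the cited Todd-Evans-Daivis formulation, which the paper generalizes to a cylindrical surface of radius r. The pair lists are illustrative inputs, and the kinetic (surface-crossing) part is omitted:

```python
# Hedged sketch: configurational part of the planar method-of-planes shear
# stress P_zx(z); the paper derives the analogue for a cylindrical surface.
import numpy as np

def mop_shear_configurational(z_pairs, fxij_pairs, z_plane, area):
    """z_pairs: (z_i, z_j) for each unordered interacting pair.
    fxij_pairs: x-component of the force on particle i due to particle j.
    Pairs straddling the plane contribute with a sign set by which side
    particle i sits on; the sum is divided by the plane area."""
    total = 0.0
    for (zi, zj), fxij in zip(z_pairs, fxij_pairs):
        if (zi - z_plane) * (zj - z_plane) < 0.0:   # pair straddles the plane
            total += np.sign(zi - z_plane) * fxij
    return total / area
```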
Abstract:
The influence of the dispersion of vapor-grown carbon nanofibers (VGCNF) on the electrical properties of VGCNF/epoxy composites has been studied. A homogeneous dispersion of the VGCNF does not imply better electrical properties. In fact, it is demonstrated that the simplest of the tested dispersion methods results in higher conductivity, since the presence of well-distributed nanofiber clusters appears to be a key factor for increasing composite conductivity.
Abstract:
This study aimed to evaluate the efficiency of the multiple centroid method for studying the adaptability of alfalfa genotypes (Medicago sativa L.). In this method, the genotypes are compared with ideotypes defined by the bi-segmented regression model, according to the researcher's interest. Thus, genotype classification is carried out as determined by the objective of the researcher and the proposed recommendation strategy. Despite the great potential of the method, it needs to be evaluated in a biological context (with real data). In this context, we used data on the evaluation of dry matter production of 92 alfalfa cultivars, with 20 cuttings, from an experiment in randomized blocks with two replicates carried out from November 2004 to June 2006. The multiple centroid method proved efficient for classifying alfalfa genotypes. Moreover, it gave no ambiguous indications, provided that the ideotypes were defined according to the researcher's interest, facilitating data interpretation.
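A minimal sketch of the centroid idea, classifying each genotype by its nearest ideotype in the mean-response space; the ideotype definitions and the Euclidean metric are illustrative, not the paper's exact procedure:

```python
# Hedged sketch: nearest-ideotype (centroid) classification of genotypes.
import numpy as np

def classify_by_centroid(genotype_responses, ideotypes):
    """genotype_responses: (n_genotypes, n_environments) array of mean responses.
    ideotypes: dict mapping an ideotype label (e.g. 'high in favorable and
    unfavorable environments') to an (n_environments,) ideal response vector.
    Returns the nearest ideotype label for each genotype."""
    names = list(ideotypes)
    centroids = np.stack([ideotypes[k] for k in names])                    # (k, m)
    d = np.linalg.norm(genotype_responses[:, None, :] - centroids[None], axis=2)
    return [names[i] for i in d.argmin(axis=1)]
```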
Abstract:
Given the dynamic nature of cardiac function, correct temporal alignment of pre-operative models and intra-operative images is crucial for augmented reality in cardiac image-guided interventions. As such, the current study focuses on the development of an image-based strategy for temporal alignment of multimodal cardiac imaging sequences, such as cine Magnetic Resonance Imaging (MRI) or 3D Ultrasound (US). First, we derive a robust, modality-independent signal from the image sequences, estimated by computing the normalized cross-correlation between each frame in the temporal sequence and the end-diastolic frame. This signal acts as a surrogate for the left-ventricle (LV) volume curve over time, whose variation indicates different temporal landmarks of the cardiac cycle. We then perform the temporal alignment of these surrogate signals derived from MRI and US sequences of the same patient through Dynamic Time Warping (DTW), allowing both sequences to be synchronized. The proposed framework was evaluated in 98 patients who had undergone both 3D+t MRI and US scans. The end-systolic frame could be accurately estimated as the minimum of the image-derived surrogate signal, presenting a relative error of 1.6 ± 1.9% and 4.0 ± 4.2% for the MRI and US sequences, respectively, thus supporting its association with key temporal instants of the cardiac cycle. The use of DTW reduces the desynchronization of the cardiac events in the MRI and US sequences, making it possible to temporally align multimodal cardiac imaging sequences. Overall, a generic, fast and accurate method for temporal synchronization of MRI and US sequences of the same patient was introduced. This approach could be straightforwardly used for the correct temporal alignment of pre-operative MRI information and intra-operative US images.
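A minimal sketch of the two steps described, with illustrative array shapes and a basic DTW (the paper's exact implementation is not given in the abstract):

```python
# Hedged sketch: image-derived surrogate signal + DTW alignment.
import numpy as np

def ncc_signal(frames, ed_index=0):
    """Normalized cross-correlation of every frame with the end-diastolic frame.
    frames: (T, ...) array of 2-D/3-D image frames."""
    ref = frames[ed_index].ravel().astype(float)
    ref = (ref - ref.mean()) / ref.std()
    out = []
    for f in frames:
        v = f.ravel().astype(float)
        v = (v - v.mean()) / v.std()
        out.append(np.dot(ref, v) / ref.size)
    return np.array(out)

def dtw_path(a, b):
    """Basic dynamic time warping between two 1-D signals; returns the warping path."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = abs(a[i - 1] - b[j - 1])
            D[i, j] = c + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    path, i, j = [], n, m                      # backtrack the optimal alignment
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        i, j = (i - 1, j - 1) if step == 0 else (i - 1, j) if step == 1 else (i, j - 1)
    return path[::-1]

# mri_sig = ncc_signal(mri_frames); us_sig = ncc_signal(us_frames)
# path = dtw_path(mri_sig, us_sig)   # pairs of temporally matched frame indices
```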
Abstract:
A method was developed for the rapid screening and ranking of the general toxicity exerted by xenobiotics, using Saccharomyces cerevisiae as the model organism. To standardize the experimental conditions, the relationship between the absorbance at 525 nm and the number of cells in suspension per milliliter of culture medium was established, and a standard curve and its defining equation (Y = 6.8219E-08 X + 0.0327) were calculated. Cultures of Saccharomyces cerevisiae in complete yeast medium (YPD: glucose 2%, peptone 0.5% and yeast extract 1%) were exposed to different concentrations of nicotine and the resulting growth inhibition was evaluated.
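A minimal sketch of how the quoted standard curve would be used in practice, inverting Y = 6.8219E-08·X + 0.0327 with Y the absorbance at 525 nm and X the cells per milliliter (the variable roles are inferred from the abstract):

```python
# Hedged sketch: cell density from absorbance via the quoted standard curve.
SLOPE = 6.8219e-08       # absorbance units per (cell/mL), from the quoted equation
INTERCEPT = 0.0327       # absorbance at zero cells

def cells_per_ml(a525):
    """Invert Y = SLOPE * X + INTERCEPT for X (cells/mL)."""
    return (a525 - INTERCEPT) / SLOPE

# Example: an A525 reading of 0.50 corresponds to approximately 6.85e6 cells/mL.
print(f"{cells_per_ml(0.50):.3e}")
```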
Abstract:
In this text, we intend to explore augmented reality as a means to visualise interactive communication projects. With the ARToolkit, Virtools and 3ds Max applications, we aim to show how to create a portable interactive platform that relies on the surrounding environment and fiducial markers to construct the game's scenario. We plan to show that the realism of the simulation, together with the merging of artificial objects with the real world, can generate interactive empathy between players and their avatars.
Abstract:
A brief description is presented of the main features of the health planning technique developed by the "Centro de Estudios del Desarrollo" (CENDES) in Venezuela and proposed by the Pan American Health Organization for use in Latin America. This presentation is followed by an appraisal of the planning method, with comments on both its positive aspects and its negative points. Comments are also made on other recent WHO/PAHO publications on health planning. In conclusion, the CENDES technique is considered a health planning method of great potential, especially for use in underdeveloped areas, the success of its application depending on the ability of the health planners to introduce the modifications necessary to adapt it to local circumstances.
Abstract:
Storm and tsunami deposits are generated by similar depositional mechanisms, making them hard to discriminate using classic sedimentologic methods. Here we propose an original approach to identify tsunami-induced deposits by combining numerical simulation and rock magnetism. To test our method, we investigate the tsunami deposit of the Boca do Rio estuary generated by the 1755 Lisbon earthquake, which is well described in the literature. We first test the 1755 tsunami scenario using a numerical inundation model to provide physical parameters for the tsunami wave. Then we use concentration-sensitive (MS, SIRM) and grain-size-sensitive (chi(ARM), ARM, B1/2, ARM/SIRM) magnetic proxies coupled with SEM microscopy to unravel the magnetic mineralogy of the tsunami-induced deposit and its associated depositional mechanisms. In order to study the connection between the tsunami deposit and the different sedimentologic units present in the estuary, the magnetic data were processed by multivariate statistical analyses. Our numerical simulation shows a large inundation of the estuary, with flow depths varying from 0.5 to 6 m and a run-up of about 7 m. Magnetic data show a dominance of paramagnetic minerals (quartz) mixed with a lesser amount of ferromagnetic minerals, namely titanomagnetite and titanohematite, both of detrital origin and reworked from the underlying units. Multivariate statistical analyses indicate a better connection between the tsunami-induced deposit and a mixture of Units C and D. All these results point to a scenario where the energy released by the tsunami wave was strong enough to overtop and erode a substantial amount of sand from the littoral dune and mix it with reworked materials from underlying layers at least 1 m deep. The method tested here represents an original and promising tool to identify tsunami-induced deposits in similar embayed beach environments.
Abstract:
This paper is an elaboration of the DECA algorithm [1] to blindly unmix hyperspectral data. The underlying mixing model is linear, meaning that each pixel is a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. The proposed method, like DECA, is tailored to highly mixed scenarios in which geometric-based approaches fail to identify the simplex of minimum volume enclosing the observed spectral vectors. We therefore resort to a statistical framework, where the abundance fractions are modeled as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. With respect to DECA, we introduce two improvements: 1) the number of Dirichlet modes is inferred based on the minimum description length (MDL) principle; 2) the generalized expectation maximization (GEM) algorithm we adopt to infer the model parameters is improved by using alternating minimization and augmented Lagrangian methods to compute the mixing matrix. The effectiveness of the proposed algorithm is illustrated with simulated and real data.
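A minimal sketch of the linear mixing model with simplex-constrained (Dirichlet) abundances on which the statistical framework rests; the dimensions, Dirichlet parameters and noise level are illustrative, and the MDL/GEM machinery is not reproduced:

```python
# Hedged sketch: linear mixing with simplex-constrained (Dirichlet) abundances.
import numpy as np

rng = np.random.default_rng(0)
n_bands, n_endmembers, n_pixels = 50, 3, 1000

M = rng.uniform(0.0, 1.0, size=(n_bands, n_endmembers))     # endmember signatures
A = rng.dirichlet(alpha=[5.0, 2.0, 1.0], size=n_pixels).T   # abundances: >= 0, sum to 1
Y = M @ A + 0.01 * rng.standard_normal((n_bands, n_pixels)) # observed spectral vectors

assert np.allclose(A.sum(axis=0), 1.0)   # the constraints the Dirichlet enforces
```

Choosing the number of Dirichlet modes by MDL then amounts to penalizing the negative log-likelihood of such a mixture with the usual (number of parameters / 2)·log N term.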