20 results for quantization artifacts
in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
The aim of this study was to determine whether image artifacts caused by orthodontic metal accessories interfere with the accuracy of 3D CBCT model superimposition. A dry human skull was subjected to three CBCT scans: first without orthodontic brackets (T1), then with stainless steel brackets bonded without arch wires (T2), and finally with orthodontic arch wires inserted into the bracket slots (T3). The image surfaces were registered and the 3D models superimposed. Within-subject surface distances between T1-T2, T1-T3 and T2-T3 were computed for comparison among the three data sets. The minimum and maximum Hausdorff Distance units (HDu) between the corresponding data points of the T1 and T2 CBCT 3D surface images were 0.000000 and 0.049280 HDu, respectively, with a mean distance of 0.002497 HDu. The minimum and maximum Hausdorff Distances between T1 and T3 were 0.000000 and 0.047440 HDu, respectively, with a mean distance of 0.002585 HDu. Between T2 and T3, the minimum, maximum and mean Hausdorff Distances were 0.000000, 0.025616 and 0.000347 HDu, respectively. In the current study, the image artifacts caused by metal orthodontic accessories did not compromise the accuracy of the 3D model superimposition. Color-coded maps of the overlaid structures complemented the computed Hausdorff Distances and demonstrated a precise fusion between the data sets.
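The superimposition accuracy above is quantified with Hausdorff distances between corresponding surface points. As a minimal illustration of the metric itself (not the authors' registration pipeline), the symmetric Hausdorff distance between two point clouds can be sketched in NumPy; the toy arrays below are hypothetical:

```python
import numpy as np

def hausdorff(a, b):
    """Symmetric Hausdorff distance between point sets a (N x 3) and b (M x 3)."""
    # pairwise Euclidean distances between every point of a and every point of b
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    # directed distances: worst-case nearest-neighbour distance in each direction
    d_ab = d.min(axis=1).max()
    d_ba = d.min(axis=0).max()
    return max(d_ab, d_ba)

# toy example: a unit square and a copy translated by 0.1 along x
a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0], [1.0, 1.0, 0.0]])
b = a + np.array([0.1, 0.0, 0.0])
print(hausdorff(a, b))  # 0.1 (up to floating-point rounding) for this pure translation
```

For surface meshes the same idea applies with (much larger) vertex arrays, and the mean of the nearest-neighbour distances gives the mean distance reported in the abstract.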
Abstract:
We construct a consistent theory of a quantum massive Weyl field. We start with the formulation of the classical field theory approach for the description of massive Weyl fields. It is demonstrated that the standard Lagrange formalism cannot be applied to the study of massive first-quantized Weyl spinors. Nevertheless, we show that the classical field theory description of massive Weyl fields can be implemented within the Hamiltonian formalism or using the extended Lagrange formalism. We then carry out a canonical quantization of the system and discuss the independent ways of quantizing a massive Weyl field. We also compare our results with previous approaches to the treatment of massive Weyl spinors. Finally, a new interpretation of the Majorana condition is proposed.
Abstract:
In this work, we report some results on the stochastic quantization of the spherical model. We start by reviewing basic aspects of the method, with emphasis on the connection between the Langevin equation and supersymmetric quantum mechanics, with a view to applying this connection to the spherical model. The intuitive idea is that, when applied to the spherical model, this gives rise to a supersymmetric version identified with the one studied in Phys. Rev. E 85, 061109 (2012). Before investigating this aspect in detail, we study the stochastic quantization of the mean spherical model, which is simpler to implement than the model with the strict constraint. We also highlight some points concerning more traditional methods discussed in the literature, such as canonical and path-integral quantization. To produce a supersymmetric version grounded in the Nicolai map, we investigate the stochastic quantization of the strict spherical model. We show that the result of this process is an off-shell supersymmetric extension of the quantum spherical model (with the precise supersymmetric constraint structure). This analysis establishes a connection between the classical model and its supersymmetric quantum counterpart. The supersymmetric version constructed in this way is a more natural one and gives further support and motivation to investigate similar connections in other models in the literature.
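The Langevin-equation core of stochastic quantization, in which a degree of freedom evolves in a fictitious time with drift -dS/dx plus Gaussian noise so that it equilibrates to a distribution proportional to exp(-S), can be sketched for a single harmonic degree of freedom. The quadratic toy action below is a hypothetical stand-in, not the spherical model itself:

```python
import numpy as np

rng = np.random.default_rng(0)

def langevin_sample(dS, x0=0.0, dt=0.01, n_steps=200_000):
    """Discretized Langevin equation  dx/dtau = -dS/dx + eta,
    with <eta(t) eta(t')> = 2 delta(t - t').
    The equilibrium distribution of x is proportional to exp(-S(x))."""
    x, samples = x0, []
    for _ in range(n_steps):
        x += -dS(x) * dt + np.sqrt(2.0 * dt) * rng.normal()
        samples.append(x)
    return np.array(samples[n_steps // 2:])  # discard the first half as burn-in

# harmonic "action" S(x) = x^2/2, so dS/dx = x; the equilibrium average <x^2> is 1
xs = langevin_sample(lambda x: x)
print(np.mean(xs**2))  # close to 1, up to statistical and step-size error
```

Equal-fictitious-time correlators of the equilibrated process reproduce the quantum-mechanical expectation values, which is the starting point that the supersymmetric (Nicolai-map) structure mentioned above builds on.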
Abstract:
The transport properties of the two-dimensional system in HgTe-based quantum wells containing simultaneously electrons and holes of low densities are examined. The Hall resistance, as a function of perpendicular magnetic field, reveals an unconventional behavior, different from the classical N-shaped dependence typical of bipolar systems with electron-hole asymmetry. The quantum features of the magnetotransport are explained by means of a numerical calculation of the Landau level spectrum based on the Kane Hamiltonian. The origin of the quantum Hall plateau sigma(xy) = 0 near the charge neutrality point is attributed to special features of Landau quantization in our system.
Abstract:
Purpose - The purpose of this paper is to develop an efficient numerical algorithm for the self-consistent solution of the Schrodinger and Poisson equations in one-dimensional systems. The goal is to compute the charge-control and capacitance-voltage characteristics of quantum wire transistors. Design/methodology/approach - The paper presents a numerical formulation employing a non-uniform finite difference discretization scheme, in which the wavefunctions and electronic energy levels are obtained by solving the Schrodinger equation through the split-operator method, while a relaxation method in the FTCS ("Forward Time Centered Space") scheme is used to solve the two-dimensional Poisson equation. Findings - The numerical model is validated by taking previously published results as a benchmark and is then applied to yield the charge-control characteristics and the capacitance-voltage relationship for a split-gate quantum wire device. Originality/value - The paper helps to fulfill the need for C-V models of quantum wire devices. To do so, the authors implemented a straightforward method for calculating the two-dimensional electronic carrier density n(x,y). The formulation reduces the computational procedure to a much simpler problem, similar to the one-dimensional quantization case, significantly diminishing the running time.
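The relaxation step for the two-dimensional Poisson equation can be illustrated with a minimal Jacobi-style iteration on a uniform grid. The paper itself uses a non-uniform finite-difference scheme coupled self-consistently to the Schrodinger equation; the grid size, charge density and grounded-box boundary conditions below are hypothetical:

```python
import numpy as np

def relax_poisson(rho, h, n_iter=5000):
    """Jacobi relaxation for  d2phi/dx2 + d2phi/dy2 = -rho,  phi = 0 on the boundary.
    Equivalent to FTCS pseudo-time stepping of a diffusion equation run to its
    steady state at the stability-limit time step."""
    phi = np.zeros_like(rho)
    for _ in range(n_iter):
        # each interior node is replaced by the average of its four neighbours
        # plus the local source term (five-point Laplacian stencil)
        phi[1:-1, 1:-1] = 0.25 * (phi[2:, 1:-1] + phi[:-2, 1:-1]
                                  + phi[1:-1, 2:] + phi[1:-1, :-2]
                                  + h * h * rho[1:-1, 1:-1])
    return phi

# toy problem: a point-like charge in the middle of a grounded square box
n = 33
h = 1.0 / (n - 1)
rho = np.zeros((n, n))
rho[n // 2, n // 2] = 1.0 / h**2
phi = relax_poisson(rho, h)
print(phi[n // 2, n // 2])  # the potential peaks at the charge location
```

In a self-consistent loop, the density n(x,y) built from the Schrodinger eigenstates would replace the toy rho, and the resulting phi would feed back into the potential term of the Schrodinger equation until convergence.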
Abstract:
This work assessed the homogeneity of the climate series from the Institute of Astronomy, Geophysics and Atmospheric Sciences (IAG) weather station using various statistical techniques. The record from this target station is one of the longest in Brazil, having commenced in 1933 with observations of precipitation, with temperatures and other variables added in 1936. It is thus one of the few stations in Brazil with enough data for long-term climate variability and climate change studies. There is, however, a possibility that its data have been contaminated by artifacts over time. Admittedly, there was an intervention on the observations in 1958, with the replacement of instruments, whose impact has not yet been evaluated. The station's surroundings also changed over time from rural to urban, which may have influenced the homogeneity of the observations and makes the station less representative for climate studies over larger spatial scales. Homogeneity of the target station was assessed by applying both absolute (single-station) tests and tests relative to the regional climate, on an annual scale, to daily precipitation, relative humidity, maximum (TMax), minimum (TMin), and wet bulb temperatures. Among these quantities, only precipitation does not exhibit any inhomogeneity. A clear signal of the 1958 change of instruments was detected in the TMax and relative humidity data, the latter certainly because of its strong dependence on temperature. This signal is not very clear in TMin, which instead presents non-climatic discontinuities around 1953 and around 1970. A significant homogeneity break is found around 1990 for TMax and wet bulb temperature. The discontinuities detected after 1958 may have been caused by urbanization, as the observed warming trend at the station is considerably greater than that of the regional climate.
Abstract:
Distributed Software Development (DSD) is a development strategy that meets globalization demands for increased productivity and reduced costs. However, temporal distance, geographical dispersion and socio-cultural differences create additional challenges and, especially, add new requirements related to the communication, coordination and control of projects. Among these demands is the need for a software process that adequately supports distributed software development. This paper presents an integrated development and test approach that takes the peculiarities of distributed teams into account. Its purpose is to support DSD by providing better project visibility, improving communication between the development and test teams, and minimizing the ambiguity and the difficulty of understanding the artifacts and activities. The integrated approach was conceived on four pillars: (i) identifying the DSD peculiarities related to development and test processes; (ii) defining the elements needed to compose an integrated development and test approach that supports distributed teams; (iii) describing and specifying the workflows, artifacts, and roles of the approach; and (iv) representing the approach appropriately so that it can be communicated and understood effectively.
Abstract:
Although the occurrence of glandular trichomes is frequently reported for aerial vegetative organs, many questions still remain open about the presence of such trichomes in underground systems. Here we present, for the first time, a comparative study of the structure, ultrastructure and chemical aspects of both the aerial and the underground glandular trichomes of two Chrysolaena species, C. obovata and C. platensis. Glandular trichomes (GTs) were examined using LM, SEM, and TEM and also analyzed by GC-MS and by HPLC coupled to UV/DAD and HR-ESI-MS (HPLC-UV-MS). In both aerial (leaf and bud) and underground (rhizophore) organs, the GTs are multicellular, biseriate and formed by five pairs of cells: a pair of support cells, a pair of basal cells, and three pairs of secreting cells. At the beginning of the secretory process, these secreting cells show an abundance of smooth ER. The same classes of secondary metabolites are biosynthesized and stored in both the aerial and underground GTs of C. platensis and C. obovata. The GTs from aerial and underground organs have similar cellular and sub-cellular anatomy; however, the belowground trichomes show a higher diversity of compounds than those from the leaves. We also demonstrate by means of HPLC-UV-DAD that the sesquiterpene lactones are located inside the trichomes and that the hirsutinolides are not artifacts. (C) 2012 Elsevier GmbH. All rights reserved.
Abstract:
Introduction: The objective of this study was to evaluate the ability of large-volume cone-beam computed tomography (CBCT) to detect horizontal root fracture and to test the influence of a metallic post. Methods: Forty teeth were examined by large-volume CBCT (20-cm height and 15-cm diameter cylinder) at 0.2-mm voxel resolution, and 2 observers analyzed the samples for the presence and localization of horizontal root fracture. Results: Accuracy in the groups without a metallic post ranged from 33%-68%, whereas for the samples with a metallic post the values varied widely (38%-83%). Intraobserver agreement showed no statistically significant difference between the groups with and without a metallic post; both ranged from very weak to weak (kappa = 0.09-0.369). Conclusions: The low accuracy and low intraobserver and interobserver agreement reflect the difficulty of adequately diagnosing horizontal root fractures with large-volume CBCT using a small-voxel reconstruction. (J Endod 2012;38:856-859)
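The observer agreement above is reported as kappa values. Cohen's kappa corrects the raw agreement rate for the agreement expected by chance, computed from each observer's marginal label frequencies. A short sketch with hypothetical binary ratings (not the study's data):

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters: (p_o - p_e) / (1 - p_e)."""
    n = len(r1)
    # observed agreement: fraction of items both raters labelled identically
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    # chance agreement from each rater's marginal label frequencies
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / n**2
    return (p_o - p_e) / (1 - p_e)

# toy example: two observers scoring 10 teeth for fracture (1) / no fracture (0)
r1 = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]
r2 = [1, 0, 0, 0, 1, 0, 1, 1, 0, 1]
print(round(cohens_kappa(r1, r2), 3))  # → 0.4
```

On the common verbal scale, values like the study's 0.09-0.369 fall in the "very weak to weak" band, while 1.0 would be perfect agreement and 0 no better than chance.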
Abstract:
We report a morphology-based approach for the automatic identification of outlier neurons, as well as its application to the NeuroMorpho.org database, which contains more than 5,000 neurons. Each neuron in a given analysis is represented by a feature vector composed of 20 measurements, which are then projected into a two-dimensional space by applying principal component analysis. Bivariate kernel density estimation is then used to obtain the probability distribution for the group of cells, so that the cells with the highest probabilities are understood as archetypes while those with the lowest probabilities are classified as outliers. The potential of the methodology is illustrated in several cases involving uniform cell types as well as cell types of specific animal species. The results provide insights into the distribution of cells, yielding single- and multi-variate clusters, and suggest that outlier cells tend to be more planar and tortuous. The proposed methodology can be used in several situations involving one or more categories of cells, as well as for the detection of new categories and possible artifacts.
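The pipeline described above (feature vectors, PCA to two dimensions, bivariate kernel density, lowest-density cells flagged as outliers) can be sketched end to end. The random feature matrix, fixed kernel bandwidth and outlier count below are hypothetical stand-ins for the paper's 20 morphometric measurements and its density-based cutoff:

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical feature matrix: 200 cells x 20 morphometric measurements;
# the first 5 cells are strongly displaced to act as known outliers
X = rng.normal(size=(200, 20))
X[:5] += 6.0

# step 1: PCA to two dimensions via SVD of the centred data
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
proj = Xc @ Vt[:2].T  # coordinates on the first two principal components

# step 2: bivariate Gaussian kernel density estimate with a fixed bandwidth h
h = 1.5
d2 = ((proj[:, None, :] - proj[None, :, :]) ** 2).sum(axis=2)
density = np.exp(-d2 / (2 * h * h)).sum(axis=1)

# step 3: the lowest-density cells are flagged as outliers
outliers = np.argsort(density)[:5]
print(sorted(outliers.tolist()))  # the displaced cells come out with the lowest density
```

High-density cells in this picture play the role of the archetypes, and the density cutoff (here simply "the 5 lowest") is the tunable part of the method.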
Abstract:
We propose an integral formulation of the equations of motion of a large class of field theories which leads in a quite natural and direct way to the construction of conservation laws. The approach is based on generalized non-abelian Stokes theorems for p-form connections, and its appropriate mathematical language is that of loop spaces. The equations of motion are written as the equality of a hyper-volume ordered integral to a hyper-surface ordered integral on the border of that hyper-volume. The approach applies to integrable field theories in (1 + 1) dimensions, Chern-Simons theories in (2 + 1) dimensions, and non-abelian gauge theories in (2 + 1) and (3 + 1) dimensions. The results presented in this paper are relevant for the understanding of global properties of those theories. As a special byproduct we solve a long-standing problem in (3 + 1)-dimensional Yang-Mills theory, namely the construction of conserved charges, valid for any solution, which are invariant under arbitrary gauge transformations. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
Cone beam computed tomography (CBCT) can be considered a valuable imaging modality for improving diagnosis and treatment planning and for achieving true guidance in several craniofacial surgical interventions. A new concept and perspective in medical informatics is the discussion of the new interactive imaging workflow. The aim of this article was to present, in a short literature review, the usefulness of CBCT technology as an important alternative imaging modality, highlighting current practices and thought-provoking near-term future applications for craniofacial surgical assessment. The article explains the state of the art of CBCT improvements, medical workstations, and the perspectives for the dedicated hardware and software that can be used with the CBCT source. In conclusion, CBCT technology is developing rapidly, and many advances are on the horizon. Further progress in medical workstations, engineering capabilities, and independent software (some of it open source) should be pursued with this new imaging method. The perspectives, challenges, and pitfalls of CBCT are delineated and evaluated along with the technological developments.
Abstract:
We discuss the construction of coherent states (CS) for systems with continuous spectra. First, we propose to adapt the Malkin-Manko approach, developed for systems with discrete spectra, to the case under consideration. Following this approach, we consider two examples, a free particle and a particle in a linear potential. Second, we generalize the action-angle CS approach to systems with continuous spectra. In the first approach we start with a well-defined quantum formulation (canonical quantization) of a physical system, and the construction of CS follows from that quantization. In the second approach, the quantization procedure is inherent to the CS construction itself.
Abstract:
A specific separated-local-field NMR experiment, dubbed Dipolar-Chemical-Shift Correlation (DIPSHIFT), is frequently used to study molecular motions by probing reorientations through the changes in the X-H dipolar coupling and T-2. In systems where the coupling is weak or the reorientation angle is small, a recoupled variant of the DIPSHIFT experiment is applied, in which the effective dipolar coupling is amplified by a REDOR-like pi-pulse train. However, a previously described constant-time variant of this experiment is not sensitive to the motion-induced T-2 effect, which precludes the observation of motions over a wide range of rates, from hundreds of Hz to around a MHz. We present a DIPSHIFT implementation which amplifies the dipolar couplings and is still sensitive to T-2 effects. Spin dynamics simulations, analytical calculations and experiments demonstrate the sensitivity of the technique to molecular motions and suggest the best experimental conditions for avoiding imperfections. Furthermore, an in-depth theoretical analysis of the interplay between REDOR-like recoupling and proton decoupling, based on Average-Hamiltonian Theory, explains the origin of many artifacts found in literature data. (C) 2012 Elsevier Inc. All rights reserved.
Abstract:
This study explores educational technology and management education by analyzing fidelity in game-based management education interventions. A sample of 31 MBA students was selected to help answer the research question: To what extent do MBA students tend to recognize specific game-based academic experiences, in terms of fidelity, as relevant to their managerial performance? Two distinct game-based interventions (BG1 and BG2) with key differences in fidelity levels were explored: BG1 presented higher physical and functional fidelity levels and lower psychological fidelity levels. Hypotheses were tested with data on the overall perceived quality of the game-based interventions, collected from the participants shortly after their experiences. The findings reveal a higher overall perception of quality for BG1: (a) better for testing strategies, (b) offering better business and market models, (c) based on a pace that better stimulates learning, and (d) presenting a fidelity level that better supports real-world performance. This study supports the conclusion that MBA students tend to recognize, to a large extent, that specific game-based academic experiences are relevant and meaningful to their managerial development, mostly when the adopted artifacts have heightened fidelity levels. Agents must be ready and motivated to explore the new, to try and err, and to learn collaboratively in order to perform.