950 results for tensor tomography


Relevance:

20.00%

Publisher:

Abstract:

The effect of the tensor component of the Skyrme effective nucleon-nucleon interaction on the single-particle structure in superheavy elements is studied. A selection of the available Skyrme forces has been chosen and their predictions for the proton and neutron shell closures investigated. The inclusion of the tensor term with realistic coupling strength parameters leads to a small increase in the spin-orbit splitting between the proton 2f7/2 and 2f5/2 partners, opening the Z=114 shell gap over a wide range of nuclei. The Z=126 shell gap, predicted by these models in the absence of the tensor term, is found to be strongly dependent on neutron number, with a Z=138 gap opening for large neutron numbers; this has consequent implications for the synthesis of neutron-rich superheavy elements. The predicted neutron shell structures remain largely unchanged by inclusion of the tensor component.

Relevance:

20.00%

Publisher:

Abstract:

This paper is concerned with tensor clustering with the assistance of dimensionality-reduction approaches. A class of formulations for tensor clustering is introduced based on Tucker decomposition models. In this formulation, an extra tensor mode is formed by a collection of tensors of the same dimensions and is then used to assist a Tucker decomposition in order to achieve data dimensionality reduction. By using different regularizations, we design two types of clustering models for the tensors: a PCA Tensor Clustering model and a Non-negative Tensor Clustering model. The tensor clustering problem can thus be solved by an optimization method based on an alternating coordinate scheme. Interestingly, our experiments show that the proposed models yield performance comparable to, or even better than, recent clustering algorithms based on matrix factorization.
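
As a rough illustration of the pipeline described above (and not the authors' implementation), the following sketch stacks same-sized tensors along an extra sample mode, applies a Tucker decomposition for dimensionality reduction, and clusters the resulting sample-mode factor. It assumes the tensorly and scikit-learn libraries; the helper name cluster_tensors and all parameter choices are hypothetical.

# Illustrative sketch only: Tucker-based tensor clustering in the spirit of the
# abstract above, assuming tensorly and scikit-learn are available.
import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker
from sklearn.cluster import KMeans

def cluster_tensors(tensors, mode_ranks, sample_rank, n_clusters):
    """Stack same-sized tensors along a new mode, Tucker-decompose the stack,
    and cluster the samples via the factor matrix of the stacking mode."""
    X = tl.tensor(np.stack(tensors, axis=0))              # extra "sample" mode 0
    core, factors = tucker(X, rank=[sample_rank] + list(mode_ranks))
    sample_features = factors[0]                           # one row per tensor
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(sample_features)

# Toy usage: 40 random 8x8x8 tensors grouped into 3 clusters (synthetic data).
tensors = [np.random.rand(8, 8, 8) for _ in range(40)]
print(cluster_tensors(tensors, mode_ranks=(4, 4, 4), sample_rank=5, n_clusters=3))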

Relevance:

20.00%

Publisher:

Abstract:

Traditional dictionary learning algorithms find a sparse representation of high-dimensional data by transforming samples into one-dimensional (1D) vectors. This 1D model loses the inherent spatial structure of the data. An alternative is to employ tensor decomposition for dictionary learning on the data in their original structural form, a tensor, by learning multiple dictionaries along each mode and the corresponding sparse representation with respect to the Kronecker product of these dictionaries. To learn tensor dictionaries along each mode, all existing methods update each dictionary iteratively in an alternating manner. Because atoms from each mode dictionary jointly contribute to the sparsity of the tensor, treating each mode dictionary independently, as existing works do, ignores the correlations between atoms of different mode dictionaries. In this paper, we propose a joint multiple dictionary learning method for tensor sparse coding, which exploits atom correlations for sparse representation and updates multiple atoms from each mode dictionary simultaneously. In this algorithm, the Frequent-Pattern Tree (FP-tree) mining algorithm is employed to exploit frequent atom patterns in the sparse representation. Inspired by the idea of K-SVD, we develop a new dictionary update method that jointly updates the elements in each pattern. Experimental results demonstrate that our method outperforms other tensor-based dictionary learning algorithms.
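
To make the Kronecker-product structure of tensor sparse coding concrete, the sketch below codes a single 2-D sample over two mode dictionaries. It is not the joint FP-tree/K-SVD update method proposed in the paper, and it only assumes NumPy and scikit-learn; the dictionary sizes are arbitrary illustrative choices.

# Illustrative sketch: sparse coding of a 2-D sample against the Kronecker
# product of two mode dictionaries (not the paper's joint update method).
import numpy as np
from sklearn.linear_model import orthogonal_mp

rng = np.random.default_rng(0)
D1 = rng.standard_normal((8, 12))            # mode-1 dictionary (8 rows, 12 atoms)
D2 = rng.standard_normal((6, 10))            # mode-2 dictionary (6 rows, 10 atoms)
X = rng.standard_normal((8, 6))              # one 2-D sample

# vec(X) ~ (D2 kron D1) vec(A): the joint dictionary is the Kronecker product
# of the mode dictionaries, and A holds the sparse coefficients over atom pairs.
D_joint = np.kron(D2, D1)                    # shape (48, 120)
a = orthogonal_mp(D_joint, X.flatten(order="F"), n_nonzero_coefs=5)
A = a.reshape(12, 10, order="F")
X_hat = D1 @ A @ D2.T                        # reconstruction via the mode dictionaries
print(np.linalg.norm(X - X_hat))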

Relevance:

20.00%

Publisher:

Abstract:

Classical regression methods take vectors as covariates and estimate the corresponding vectors of regression parameters. When addressing regression problems on covariates of more complex form, such as multi-dimensional arrays (i.e. tensors), traditional computational models can be severely compromised by ultrahigh dimensionality as well as complex structure. By exploiting the special structure of tensor covariates, the tensor regression model provides a promising way to reduce the model's dimensionality to a manageable level, leading to efficient estimation. Most existing tensor-based methods estimate each individual regression problem independently, based on tensor decompositions that allow simultaneous projections of an input tensor onto more than one direction along each mode. In practice, however, multi-dimensional data are often collected under the same or very similar conditions, so the data share some common latent components while each regression task also has its own independent parameters. It is therefore beneficial to analyse the regression parameters of all the regressions in a linked way. In this paper, we propose a tensor regression model based on the Tucker decomposition, which simultaneously identifies the common components of the parameters across all regression tasks and the independent factors contributing to each particular task. Under this paradigm, the number of independent parameters along each mode is constrained by a sparsity-preserving regulariser. Linked multiway parameter analysis and sparsity modelling further reduce the total number of parameters, with lower memory cost than tensor-based counterparts. The effectiveness of the new method is demonstrated on real data sets.
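
As an illustration of how a structured, low-rank coefficient reduces the parameter count, the sketch below fits the simplest case, matrix covariates with a rank-1 coefficient, by alternating least squares. It is not the linked Tucker-based model of the paper; the function name and the synthetic data are for illustration only.

# Illustrative sketch of low-rank tensor (here: matrix) regression:
# fit y_i ~ u^T X_i v by alternating least squares over u and v.
import numpy as np

def rank1_matrix_regression(X, y, n_iter=50):
    """X: (n, p, q) matrix covariates, y: (n,) responses.
    Returns vectors u (p,) and v (q,) such that y_i ~ u^T X_i v."""
    n, p, q = X.shape
    u, v = np.ones(p), np.ones(q)
    for _ in range(n_iter):
        Zu = X @ v                              # fix v: y_i ~ (X_i v)^T u
        u, *_ = np.linalg.lstsq(Zu, y, rcond=None)
        Zv = np.einsum('ipq,p->iq', X, u)       # fix u: y_i ~ (X_i^T u)^T v
        v, *_ = np.linalg.lstsq(Zv, y, rcond=None)
    return u, v

# Synthetic check: responses generated from a true rank-1 coefficient matrix.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5, 4))
u_true, v_true = rng.standard_normal(5), rng.standard_normal(4)
y = np.einsum('ipq,p,q->i', X, u_true, v_true) + 0.01 * rng.standard_normal(200)
u, v = rank1_matrix_regression(X, y)
print(np.linalg.norm(np.outer(u, v) - np.outer(u_true, v_true)))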

Relevance:

20.00%

Publisher:

Abstract:

Micro-computed tomography (μCT) has been successfully used to study the cardiovascular system of mouse embryos in situ. Using barium as a suitable contrast agent, blood vessels have been imaged and quantitatively analysed, for example for blood volume and vessel size, in embryos aged 14.5 to 16.5 days. The advantage of this imaging modality is that it provides three-dimensional information while leaving samples intact for further study.

Relevance:

20.00%

Publisher:

Abstract:

Subspace clustering groups a set of samples drawn from a union of several linear subspaces into clusters, so that the samples in the same cluster are drawn from the same linear subspace. In the majority of existing work on subspace clustering, clusters are built based on feature information, while sample correlations in their original spatial structure are simply ignored. Moreover, the original high-dimensional feature vectors contain noisy and redundant information, and the time complexity grows exponentially with the number of dimensions. To address these issues, we propose a tensor low-rank representation (TLRR) and sparse coding-based subspace clustering method (TLRRSC) that simultaneously considers feature information and spatial structure. TLRR seeks the lowest-rank representation over the original spatial structure along all spatial directions. Sparse coding learns a dictionary along the feature space, so that each sample can be represented by a few atoms of the learned dictionary. The affinity matrix used for spectral clustering is built from the joint similarities in both the spatial and feature spaces. TLRRSC can thus capture the global structure and inherent feature information of the data, and provides robust subspace segmentation from corrupted data. Experimental results on both synthetic and real-world data sets show that TLRRSC outperforms several established state-of-the-art methods.
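
The following skeleton shows the representation-then-spectral-clustering pipeline in a deliberately simplified form: each sample is sparsely coded over the remaining samples, the coefficients are symmetrised into an affinity matrix, and spectral clustering is applied. This is not the TLRRSC algorithm (no tensor low-rank representation is computed); it assumes NumPy and scikit-learn, and the helper name is hypothetical.

# Simplified representation-based subspace clustering skeleton (illustration only).
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import SpectralClustering

def subspace_cluster(X, n_clusters, alpha=0.01):
    """X: (n_samples, n_features). Each sample is coded over the other samples."""
    n = X.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        others = np.delete(np.arange(n), i)
        coder = Lasso(alpha=alpha, max_iter=5000).fit(X[others].T, X[i])
        C[i, others] = coder.coef_
    affinity = np.abs(C) + np.abs(C).T           # symmetrised similarities
    sc = SpectralClustering(n_clusters=n_clusters, affinity='precomputed')
    return sc.fit_predict(affinity)

# Toy example: samples drawn from two 1-D subspaces of R^10.
rng = np.random.default_rng(2)
b1, b2 = rng.standard_normal(10), rng.standard_normal(10)
X = np.vstack([np.outer(rng.standard_normal(20), b1),
               np.outer(rng.standard_normal(20), b2)])
print(subspace_cluster(X, n_clusters=2))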

Relevance:

20.00%

Publisher:

Abstract:

Tensor clustering is an important tool for exploiting the intrinsically rich structure of real-world multiway (tensor) datasets. Standard practice when dealing with such datasets is to apply subspace clustering to vectorized multiway data; however, vectorization of tensorial data does not exploit the complete structural information. In this paper, we propose a subspace clustering algorithm that avoids any vectorization. Our approach is based on a novel heterogeneous Tucker decomposition model that takes cluster membership information into account. We propose a new clustering algorithm that alternates between the different modes of the proposed heterogeneous tensor model. All but the last mode have closed-form updates. Updating the last mode reduces to optimizing over the multinomial manifold, for which we investigate second-order Riemannian geometry and propose a trust-region algorithm. Numerical experiments show that our proposed algorithm competes effectively with state-of-the-art clustering algorithms based on tensor factorization.
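
The paper's key step, updating the membership (last-mode) factor over the multinomial manifold with a second-order trust-region method, is not reproduced here. The sketch below shows a much cruder alternative under the same constraint: a projected-gradient update of a simplex-constrained membership matrix with the other factors held fixed. All names and sizes are hypothetical.

# Illustration only: projected-gradient update of a simplex-constrained
# membership factor (a simplification, not the trust-region method of the paper).
import numpy as np

def project_simplex(v):
    """Euclidean projection of a vector onto the probability simplex."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def update_membership(Xmat, B, M, step=0.01, n_steps=100):
    """Minimize ||Xmat - M @ B||_F^2 over M with rows on the simplex,
    holding the basis B (from the other Tucker modes) fixed."""
    for _ in range(n_steps):
        G = 2.0 * (M @ B - Xmat) @ B.T                 # Euclidean gradient
        M = np.vstack([project_simplex(row) for row in (M - step * G)])
    return M

# Toy usage: 30 samples unfolded along the membership mode, 3 clusters.
rng = np.random.default_rng(3)
Xmat, B = rng.random((30, 12)), rng.random((3, 12))
M0 = np.full((30, 3), 1.0 / 3.0)
print(update_membership(Xmat, B, M0).argmax(axis=1))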

Relevance:

20.00%

Publisher:

Abstract:

Introduction: The aim of this study was to evaluate the accuracy of two imaging methods in diagnosing apical periodontitis (AP), using histopathological findings as the gold standard. Methods: The periapex of 83 treated or untreated roots of dogs' teeth was examined using periapical radiography (PR), cone-beam computed tomography (CBCT) scans, and histology. Sensitivity, specificity, predictive values, and accuracy of the PR and CBCT diagnoses were calculated. Results: PR detected AP in 71% of roots, CBCT scans detected AP in 84%, and AP was histologically diagnosed in 93% (p = 0.001). Overall, sensitivity was 0.77 and 0.91 for PR and CBCT, respectively. Specificity was 1 for both. Negative predictive value was 0.25 and 0.46 for PR and CBCT, respectively. Positive predictive value was 1 for both. Diagnostic accuracy (true positives + true negatives) was 0.78 and 0.92 for PR and CBCT (p = 0.028), respectively. Conclusion: CBCT scanning was more sensitive in detecting AP than PR, which was more likely to miss AP when it was still present. (J Endod 2009;35:1009-1012)
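
For reference, the figures quoted above are the standard diagnostic metrics computed from a 2x2 confusion matrix; the sketch below defines them, and the counts in the example call are hypothetical, not the study's data.

# Standard diagnostic metrics from a 2x2 confusion matrix (illustration only).
def diagnostic_metrics(tp, fp, tn, fn):
    sensitivity = tp / (tp + fn)          # proportion of diseased roots detected
    specificity = tn / (tn + fp)          # proportion of healthy roots correctly ruled out
    ppv = tp / (tp + fp)                  # positive predictive value
    npv = tn / (tn + fn)                  # negative predictive value
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, ppv, npv, accuracy

print(diagnostic_metrics(tp=70, fp=0, tn=5, fn=8))   # hypothetical counts only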

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this study was to compare the favorable outcome of root canal treatment as determined by periapical radiographs (PRs) and cone-beam computed tomography (CBCT) scans. Ninety-six roots of dogs' teeth were used to form four groups (n = 24). In group 1, root canal treatments were performed in healthy teeth. Root canals in groups 2 through 4 were infected until apical periodontitis (AP) was radiographically confirmed. Roots with AP were treated by one-visit therapy in group 2, by two-visit therapy in group 3, and left untreated in group 4. The radiolucent area in the PRs and the volume of the CBCT-scanned periapical lesions were measured before and 6 months after treatment. In groups 1, 2, and 3, a favorable outcome (lesions absent or reduced) was shown in 57 (79%) roots using PRs but in only 25 (35%) roots using CBCT scans (p = 0.0001). Unfavorable outcomes occurred more frequently after one-visit therapy than after two-visit therapy when determined by CBCT scans (p = 0.023). (J Endod 2009;35:723-726)

Relevance:

20.00%

Publisher:

Abstract:

We present models of the upper-mantle velocity structure beneath SE and Central Brazil obtained from independent tomographic inversions of P- and S-wave relative arrival-time residuals (including core phases) from teleseismic earthquakes. The events were recorded by a total of 92 stations deployed through different projects, institutions and time periods during the years 1992-2004. Our results show correlations with the main tectonic structures and reveal new anomalies not observed in previous works. All interpretations are based on robust anomalies, which appear in the different inversions for P- and S-waves. The resolution varies throughout our study volume and has been analyzed through different theoretical test inversions. High-velocity anomalies are observed in the western portion of the Sao Francisco Craton, supporting the hypothesis that this craton was part of a major Neoproterozoic plate (San Franciscan Plate). Low-velocity anomalies beneath the Tocantins Province (mainly fold belts between the Amazon and Sao Francisco Cratons) are interpreted as due to lithospheric thinning, which is consistent with the good correlation between intraplate seismicity and low-velocity anomalies in this region. Our results show that the basement of the Parana Basin is formed by several blocks separated by suture zones, in accordance with the model of Milani & Ramos. The slab of the Nazca Plate can be observed as a high-velocity anomaly beneath the Parana Basin at depths between 700 and 1200 km. Further, we confirm the low-velocity anomaly in the NE area of the Parana Basin, which has been interpreted by VanDecar et al. as a fossil conduit of the Tristan da Cunha plume related to the Parana flood basalt eruptions during the opening of the South Atlantic.

Relevance:

20.00%

Publisher:

Abstract:

Astronomy has evolved almost exclusively through the use of spectroscopic and imaging techniques, operated separately. With the development of modern technologies, it is possible to obtain data cubes in which both techniques are combined simultaneously, producing images with spectral resolution. Extracting information from them can be quite complex, and hence the development of new methods of data analysis is desirable. We present a method for analysing data cubes (data from single-field observations, containing two spatial dimensions and one spectral dimension) that uses Principal Component Analysis (PCA) to express the data in a form of reduced dimensionality, facilitating efficient information extraction from very large data sets. PCA transforms the system of correlated coordinates into a system of uncorrelated coordinates ordered by principal components of decreasing variance. The new coordinates are referred to as eigenvectors, and the projections of the data onto these coordinates produce images we call tomograms. The association of the tomograms (images) with eigenvectors (spectra) is important for the interpretation of both. The eigenvectors are mutually orthogonal, and this property is fundamental for their handling and interpretation. When the data cube shows objects that present uncorrelated physical phenomena, the eigenvectors' orthogonality may be instrumental in separating and identifying them. By handling eigenvectors and tomograms, one can enhance features, extract noise, compress data, extract spectra, etc. We applied the method, for illustration purposes only, to the central region of the low-ionization nuclear emission-line region (LINER) galaxy NGC 4736, and demonstrate that it has a type 1 active nucleus not previously known. Furthermore, we show that it is displaced from the centre of its stellar bulge.
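
A minimal sketch of the decomposition described above: the cube is unfolded into one spectrum per spaxel, the spectral covariance matrix is diagonalized, the eigenvectors become eigenspectra, and the projections of the data onto them form the tomograms. The function name and cube dimensions are assumptions for illustration.

# Minimal PCA-tomography sketch for a data cube with axes (x, y, wavelength).
import numpy as np

def pca_tomography(cube):
    """cube: array of shape (nx, ny, n_lambda). Returns eigenvectors (spectra)
    ordered by decreasing variance and the corresponding tomograms (images)."""
    nx, ny, nl = cube.shape
    data = cube.reshape(nx * ny, nl)                  # one spectrum per spaxel
    data = data - data.mean(axis=0)                   # remove the mean spectrum
    cov = data.T @ data / (data.shape[0] - 1)         # spectral covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]                 # decreasing variance
    eigvecs = eigvecs[:, order]                       # columns are eigenspectra
    tomograms = (data @ eigvecs).reshape(nx, ny, nl)  # projections -> images
    return eigvecs, tomograms

# Example with a random cube of 32x32 spaxels and 200 spectral channels.
cube = np.random.rand(32, 32, 200)
spectra, tomos = pca_tomography(cube)
print(spectra.shape, tomos.shape)   # (200, 200) (32, 32, 200)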

Relevance:

20.00%

Publisher:

Abstract:

We report the use of optical coherence tomography (OCT) to detect and quantify the demineralization process induced by S. mutans biofilm in human third molars. Artificial lesions were induced by an S. mutans microbiological culture, and the samples (N = 50) were divided into groups according to the demineralization time: 3, 5, 7, 9, and 11 days. The OCT system was implemented using a light source delivering an average power of 96 μW in the sample arm, with spectral characteristics allowing an axial resolution of 23 μm. The images were produced with a lateral scan step of 10 μm and analyzed individually. From the evaluation of these images, lesion depth was calculated as a function of demineralization time. The depth of the lesion in the root dentine increased from 70 μm to 230 μm (corrected by the enamel refractive index, 1.62 at 856 nm), depending on the exposure time. The lesion depth in root dentine was correlated with demineralization time, showing that it follows a geometric progression resembling a bacterial growth law. [Figure: progression of lesion depth in root dentine as a function of exposure time, following a geometric progression resembling a bacterial growth law.] (C) 2009 by Astro Ltd. Published exclusively by WILEY-VCH Verlag GmbH & Co. KGaA.
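
A sketch of fitting an exponential (geometric-progression) growth law to lesion depth versus demineralization time, assuming SciPy is available; the depth values below are hypothetical placeholders, not the measured data.

# Illustration only: fit depth(t) = a * exp(b * t) to hypothetical depth data.
import numpy as np
from scipy.optimize import curve_fit

days = np.array([3, 5, 7, 9, 11])                    # demineralization times (days)
depth_um = np.array([70, 95, 130, 175, 230])          # hypothetical depths (micrometres)

def growth(t, a, b):
    return a * np.exp(b * t)

(a, b), _ = curve_fit(growth, days, depth_um, p0=(50, 0.1))
print(f"depth(t) ~ {a:.1f} * exp({b:.3f} * t) micrometres")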

Relevance:

20.00%

Publisher:

Abstract:

In this work we evaluate the effectiveness of computed tomography images as a tool to determine magnetic nanoparticle biodistribution in biological tissues. For this purpose, tomography images of magnetic nanoparticles composed of Fe3O4 and coated with 2,3-dimercaptosuccinic acid (DMSA) were generated at several material concentrations. The comparison of CT numbers calculated from these images, generated under clinical conditions, with typical CT numbers for biological tissues shows that detection of the nanoparticles in most tissues is only possible at high material concentrations. (C) 2010 Elsevier B.V. All rights reserved.
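
For reference, the CT numbers compared above are Hounsfield units derived from linear attenuation coefficients; a minimal helper follows, with illustrative coefficient values only.

# CT number (Hounsfield units) from linear attenuation coefficients.
def ct_number(mu, mu_water):
    """HU = 1000 * (mu - mu_water) / mu_water."""
    return 1000.0 * (mu - mu_water) / mu_water

print(ct_number(mu=0.0215, mu_water=0.0195))   # illustrative attenuation coefficients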

Relevance:

20.00%

Publisher:

Abstract:

The mean value of the one-loop energy-momentum tensor in thermal QED with an electric-like background that creates particles from the vacuum is calculated. The problem is essentially different from calculations of effective actions (similar to the Heisenberg-Euler action) in backgrounds that respect the stability of the vacuum. The role of a constant electric background in the violation of both the stability of the vacuum and the thermal character of the particle distribution is investigated. Restrictions on the electric field and on the duration over which one can neglect the back-reaction of the created particles are established.