991 results for TENSOR



Abstract:

We study properties of subspace lattices related to the continuity of the map Lat and the notion of reflexivity. We characterize various “closedness” properties in different ways and give the hierarchy between them. We investigate several properties related to tensor products of subspace lattices and show that the tensor product of the projection lattices of two von Neumann algebras, one of which is injective, is reflexive.
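
For orientation, the map Lat and the notion of reflexivity used here are the standard ones from invariant subspace theory; the following is the usual textbook definition rather than text from the abstract:

```latex
\operatorname{Lat}\mathcal{A}
   = \{\, M \ \text{closed subspace} : AM \subseteq M \ \text{for all}\ A \in \mathcal{A} \,\},
\qquad
\operatorname{Alg}\mathcal{L}
   = \{\, A \in B(H) : AM \subseteq M \ \text{for all}\ M \in \mathcal{L} \,\},
```

and a subspace lattice $\mathcal{L}$ is called reflexive when $\operatorname{Lat}\operatorname{Alg}\mathcal{L} = \mathcal{L}$.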

Abstract:

We show that if $\mathcal{A}$ is the tensor product of finitely many continuous nest algebras, $\mathcal{B}$ is a CDCSL algebra and $\mathcal{A}$ and $\mathcal{B}$ have the same normaliser semi-group, then either $\mathcal{A} = \mathcal{B}$ or $\mathcal{A}^* = \mathcal{B}$.


Abstract:

The purpose of the present paper is to lay the foundations for a systematic study of tensor products of operator systems. After giving an axiomatic definition of tensor products in this category, we examine in detail several particular examples of tensor products, including a minimal, maximal, maximal commuting, maximal injective and some asymmetric tensor products. We characterize these tensor products in terms of their universal properties and give descriptions of their positive cones. We also characterize the corresponding tensor products of operator spaces induced by a certain canonical inclusion of an operator space into an operator system. We examine notions of nuclearity for our tensor products which, on the category of C*-algebras, reduce to the classical notion. We exhibit an operator system S which is not completely order isomorphic to a C*-algebra yet has the property that for every C*-algebra A, the minimal and maximal tensor product of S and A are equal.
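
For comparison with the last result, the classical C*-algebraic notion of nuclearity that these tensor products generalize can be stated as follows (a standard definition, included here only for orientation):

```latex
A \ \text{is nuclear} \iff A \otimes_{\min} B = A \otimes_{\max} B
\quad \text{for every C*-algebra } B.
```

The operator system S exhibited in the paper satisfies the analogous identity against every C*-algebra A, yet is not completely order isomorphic to any C*-algebra.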


Abstract:

We introduce a method for measuring the full stress tensor in a crystal utilising the properties of individual point defects. By measuring the perturbation to the electronic states of three point defects with $C_{3v}$ symmetry in a cubic crystal, sufficient information is obtained to construct all six independent components of the symmetric stress tensor. We demonstrate the method using photoluminescence from nitrogen-vacancy colour centres in diamond. The method breaks the inverse relationship between spatial resolution and sensitivity that is inherent to existing bulk strain measurement techniques and thus offers a route to nanoscale strain mapping in diamond and other materials in which individual point defects can be interrogated.
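
To make the counting explicit (a standard fact about symmetric tensors, not a claim specific to this paper), the symmetric stress tensor in a Cartesian frame has exactly six independent entries:

```latex
\sigma =
\begin{pmatrix}
\sigma_{xx} & \sigma_{xy} & \sigma_{xz} \\
\sigma_{xy} & \sigma_{yy} & \sigma_{yz} \\
\sigma_{xz} & \sigma_{yz} & \sigma_{zz}
\end{pmatrix},
```

so three inequivalently oriented $C_{3v}$ defects provide enough independent constraints to reconstruct the full tensor, as the abstract states.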


Abstract:

We show that Kraus' property $S_{\sigma}$ is preserved under taking weak* closed sums with masa-bimodules of finite width and establish an intersection formula for weak* closed spans of tensor products, one of whose terms is a masa-bimodule of finite width. We initiate the study of the question of when operator synthesis is preserved under the formation of products and prove that the union of finitely many sets of the form $\kappa \times \lambda$, where $\kappa$ is a set of finite width while $\lambda$ is operator synthetic, is, under a necessary restriction on the sets $\lambda$, again operator synthetic. We show that property $S_{\sigma}$ is preserved under spatial Morita subordinance.


Abstract:

We show that, if M is a subspace lattice with the property that the rank one subspace of its operator algebra is weak* dense, L is a commutative subspace lattice and P is the lattice of all projections on a separable Hilbert space, then L⊗M⊗P is reflexive. If M is moreover an atomic Boolean subspace lattice while L is any subspace lattice, we provide a concrete lattice-theoretic description of L⊗M in terms of projection valued functions defined on the set of atoms of M. As a consequence, we show that the Lattice Tensor Product Formula holds for Alg M and any other reflexive operator algebra and give several further corollaries of these results.
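
For reference, the Lattice Tensor Product Formula mentioned in the last sentence is usually stated in the following form (standard formulation in the subspace lattice literature; the exact hypotheses under which it is verified are those of the paper):

```latex
\operatorname{Lat}\bigl(\mathcal{A} \,\bar{\otimes}\, \mathcal{B}\bigr)
  = \operatorname{Lat}\mathcal{A} \otimes \operatorname{Lat}\mathcal{B},
```

where $\mathcal{A}$ and $\mathcal{B}$ are reflexive operator algebras and $\bar{\otimes}$ denotes the weak* closed tensor product.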


Abstract:

How can we correlate the neural activity in the human brain as it responds to typed words with properties of these terms (like ‘edible’, ‘fits in hand’)? In short, we want to find latent variables that jointly explain both the brain activity and the behavioral responses. This is one of many settings of the Coupled Matrix-Tensor Factorization (CMTF) problem.

Can we accelerate any CMTF solver so that it runs within a few minutes instead of tens of hours to a day, while maintaining good accuracy? We introduce Turbo-SMT, a meta-method capable of doing exactly that: it boosts the performance of any CMTF algorithm by up to 200x, along with an up to 65-fold increase in sparsity, with accuracy comparable to the baseline.

We apply Turbo-SMT to BrainQ, a dataset consisting of a (nouns, brain voxels, human subjects) tensor and a (nouns, properties) matrix, with coupling along the nouns dimension. Turbo-SMT is able to find meaningful latent variables, as well as to predict brain activity with competitive accuracy.
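
As a concrete picture of the CMTF model behind this summary (and the closely related one that follows), here is a minimal NumPy sketch of a plain alternating-least-squares CMTF baseline. It is not Turbo-SMT itself, and the function name, rank and iteration count are illustrative assumptions:

```python
import numpy as np

def khatri_rao(U, V):
    # Column-wise Kronecker product: shape (U.shape[0] * V.shape[0], R).
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

def cmtf_als(X, Y, rank=5, n_iters=50, seed=0):
    """Plain ALS baseline for coupled matrix-tensor factorization (sketch).

    X : (I, J, K) tensor, e.g. (nouns, brain voxels, subjects)
    Y : (I, M)   matrix,  e.g. (nouns, properties)
    The factor A on mode 0 is shared between X and Y -- the coupling.
    """
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    M = Y.shape[1]
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    D = rng.standard_normal((M, rank))

    # Mode-n unfoldings of X, with index conventions matching khatri_rao above.
    X0 = X.reshape(I, J * K)
    X1 = np.moveaxis(X, 1, 0).reshape(J, I * K)
    X2 = np.moveaxis(X, 2, 0).reshape(K, I * J)

    for _ in range(n_iters):
        # Shared factor A: fit the tensor unfolding and the matrix jointly.
        G = np.hstack([khatri_rao(B, C).T, D.T])   # (rank, J*K + M)
        T = np.hstack([X0, Y])                     # (I,    J*K + M)
        A = np.linalg.lstsq(G.T, T.T, rcond=None)[0].T
        # Remaining factors via ordinary CP-ALS updates.
        B = np.linalg.lstsq(khatri_rao(A, C), X1.T, rcond=None)[0].T
        C = np.linalg.lstsq(khatri_rao(A, B), X2.T, rcond=None)[0].T
        D = np.linalg.lstsq(A, Y, rcond=None)[0].T
    return A, B, C, D
```

Turbo-SMT, as summarized above, is described as a meta-method wrapped around any such baseline solver rather than a replacement for it.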





Abstract:

How can we correlate neural activity in the human brain as it responds to words with behavioral data expressed as answers to questions about these same words? In short, we want to find latent variables that explain both the brain activity and the behavioral responses. We show that this is an instance of the Coupled Matrix-Tensor Factorization (CMTF) problem. We propose Scoup-SMT, a novel, fast, and parallel algorithm that solves the CMTF problem and produces a sparse latent low-rank subspace of the data. In our experiments, we find that Scoup-SMT is 50-100 times faster than a state-of-the-art algorithm for CMTF, along with a 5-fold increase in sparsity. Moreover, we extend Scoup-SMT to handle missing data without degradation of performance. We apply Scoup-SMT to BrainQ, a dataset consisting of a (nouns, brain voxels, human subjects) tensor and a (nouns, properties) matrix, with coupling along the nouns dimension. Scoup-SMT is able to find meaningful latent variables, as well as to predict brain activity with competitive accuracy. Finally, we demonstrate the generality of Scoup-SMT by applying it to a Facebook dataset (users, friends, wall postings); there, Scoup-SMT spots spammer-like anomalies.
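
In symbols, the coupled factorization problem that both Scoup-SMT and Turbo-SMT address can be written as follows (the standard CMTF objective with the shared factor $A$ on the coupled nouns mode; the notation is assumed here rather than quoted from the papers):

```latex
\min_{A,B,C,D}\;
\Bigl\| \mathcal{X} - \sum_{r=1}^{R} a_r \circ b_r \circ c_r \Bigr\|_F^2
\;+\;
\bigl\| Y - A D^{\top} \bigr\|_F^2 ,
```

where $\mathcal{X}$ is the (nouns, brain voxels, subjects) tensor, $Y$ the (nouns, properties) matrix, $a_r, b_r, c_r$ are the columns of $A, B, C$, and $\circ$ denotes the outer product.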


Abstract:

We had previously shown that regularization principles lead to approximation schemes, such as Radial Basis Functions, which are equivalent to networks with one layer of hidden units, called Regularization Networks. In this paper we show that Regularization Networks encompass a much broader range of approximation schemes, including many of the popular general additive models, Breiman's hinge functions and some forms of Projection Pursuit Regression. In the probabilistic interpretation of regularization, the different classes of basis functions correspond to different classes of prior probabilities on the approximating function spaces, and therefore to different types of smoothness assumptions. In the final part of the paper, we also show a relation between activation functions of the Gaussian and sigmoidal type.
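
As a small illustration of the Radial Basis Function case mentioned above, here is a minimal NumPy sketch of a Gaussian RBF regularization network, i.e. regularized least squares with one Gaussian unit centred on each training point; the parameter names and values are illustrative:

```python
import numpy as np

def rbf_regularization_network(X, y, sigma=1.0, lam=1e-3):
    """Fit f(x) = sum_i c_i * exp(-||x - x_i||^2 / (2 sigma^2)).

    One Gaussian "hidden unit" per training point; lam is the
    regularization (smoothness) parameter. Returns a predictor.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)

    def kernel(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    # Regularized least squares for the coefficients: (K + lam I) c = y.
    K = kernel(X, X)
    c = np.linalg.solve(K + lam * np.eye(len(X)), y)

    def predict(Xnew):
        return kernel(np.asarray(Xnew, dtype=float), X) @ c

    return predict

# Example: noisy 1-D sine regression.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
f = rbf_regularization_network(X, y, sigma=0.7, lam=1e-2)
print(f(np.array([[0.0], [1.5]])))
```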


Abstract:

Diffusion tensor magnetic resonance imaging, which measures directional information of water diffusion in the brain, has emerged as a powerful tool for human brain studies. In this paper, we introduce a new Monte Carlo-based fiber tracking approach to estimate brain connectivity. One of the main characteristics of this approach is that all parameters of the algorithm are automatically determined at each point using the entropy of the eigenvalues of the diffusion tensor. Experimental results demonstrate the good performance of the proposed approach.
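
To make the eigenvalue-entropy quantity concrete, here is a minimal NumPy sketch under the usual convention of normalizing the three eigenvalues of the diffusion tensor to a probability distribution; it is not the authors' exact parameter-selection rule, and the example tensors are illustrative:

```python
import numpy as np

def diffusion_tensor_entropy(D):
    """Entropy of the normalized eigenvalues of a 3x3 diffusion tensor D.

    Near 0 for strongly anisotropic (fiber-like) diffusion profiles,
    near log(3) for isotropic ones; such a scalar can be used to adapt
    per-voxel tracking parameters.
    """
    eigvals = np.linalg.eigvalsh(D)          # real eigenvalues of the symmetric tensor
    eigvals = np.clip(eigvals, 1e-12, None)  # guard against numerical negatives
    p = eigvals / eigvals.sum()              # normalize to a distribution
    return float(-(p * np.log(p)).sum())

# Example: an anisotropic vs. an isotropic tensor (units arbitrary).
D_fiber = np.diag([1.7e-3, 0.3e-3, 0.2e-3])
D_iso   = np.diag([0.8e-3, 0.8e-3, 0.8e-3])
print(diffusion_tensor_entropy(D_fiber), diffusion_tensor_entropy(D_iso))
```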