987 results for TENSOR LEED


Relevance: 20.00%

Abstract:

We show that Kraus' property $S_{\sigma}$ is preserved under taking weak* closed sums with masa-bimodules of finite width and establish an intersection formula for weak* closed spans of tensor products, one of whose terms is a masa-bimodule of finite width. We initiate the study of the question of when operator synthesis is preserved under the formation of products and prove that the union of finitely many sets of the form $\kappa \times \lambda$, where $\kappa$ is a set of finite width while $\lambda$ is operator synthetic, is, under a necessary restriction on the sets $\lambda$, again operator synthetic. We show that property $S_{\sigma}$ is preserved under spatial Morita subordinance.

Relevance: 20.00%

Abstract:

We show that, if M is a subspace lattice with the property that the rank one subspace of its operator algebra is weak* dense, L is a commutative subspace lattice and P is the lattice of all projections on a separable Hilbert space, then L⊗M⊗P is reflexive. If M is moreover an atomic Boolean subspace lattice while L is any subspace lattice, we provide a concrete lattice theoretic description of L⊗M in terms of projection valued functions defined on the set of atoms of M. As a consequence, we show that the Lattice Tensor Product Formula holds for AlgM and any other reflexive operator algebra and give several further corollaries of these results.

Relevance: 20.00%

Abstract:

How can we correlate the neural activity in the human brain as it responds to typed words with properties of those words (such as ‘edible’ or ‘fits in hand’)? In short, we want to find latent variables that jointly explain both the brain activity and the behavioral responses. This is one of many settings of the Coupled Matrix-Tensor Factorization (CMTF) problem.

Can we accelerate any CMTF solver so that it runs in a few minutes instead of tens of hours to a day, while maintaining good accuracy? We introduce Turbo-SMT, a meta-method capable of doing exactly that: it boosts the performance of any CMTF algorithm by up to 200x, along with an up to 65-fold increase in sparsity, at accuracy comparable to the baseline.

We apply Turbo-SMT to BrainQ, a dataset consisting of a (nouns, brain voxels, human subjects) tensor and a (nouns, properties) matrix, with coupling along the nouns dimension. Turbo-SMT is able to find meaningful latent variables, as well as to predict brain activity with competitive accuracy.
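The coupled factorization underlying this setting can be made concrete with a small sketch. Turbo-SMT itself is a meta-method that wraps an existing CMTF solver; the code below only shows a plain alternating-least-squares CMTF core of the kind such a solver might use, written in Python/NumPy. The rank R, the function and variable names, and the ALS scheme are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def khatri_rao(U, V):
    # Column-wise Kronecker product: (I*J) x R
    I, R = U.shape
    J, _ = V.shape
    return (U[:, None, :] * V[None, :, :]).reshape(I * J, R)

def unfold(T, mode):
    # Mode-n unfolding, consistent with the khatri_rao ordering used below
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def cmtf_als(X, Y, R=5, n_iters=50, seed=0):
    """Coupled Matrix-Tensor Factorization by alternating least squares.

    X : (I, J, K) tensor, e.g. (nouns, brain voxels, human subjects)
    Y : (I, M) matrix, e.g. (nouns, properties), coupled with X along mode 0
    Returns CP factors A (I,R), B (J,R), C (K,R) and matrix factor D (M,R)
    such that X ~ [[A, B, C]] and Y ~ A @ D.T.
    """
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    M = Y.shape[1]
    A, B, C, D = (rng.standard_normal((n, R)) for n in (I, J, K, M))

    X1, X2, X3 = unfold(X, 0), unfold(X, 1), unfold(X, 2)
    for _ in range(n_iters):
        # Coupled update of A: fit the mode-0 unfolding and Y simultaneously
        lhs = np.vstack([khatri_rao(B, C), D])          # (J*K + M, R)
        rhs = np.hstack([X1, Y])                        # (I, J*K + M)
        A = np.linalg.lstsq(lhs, rhs.T, rcond=None)[0].T
        B = np.linalg.lstsq(khatri_rao(A, C), X2.T, rcond=None)[0].T
        C = np.linalg.lstsq(khatri_rao(A, B), X3.T, rcond=None)[0].T
        D = np.linalg.lstsq(A, Y, rcond=None)[0].T
    return A, B, C, D
```

Turbo-SMT's reported speedups come from how it wraps a solver of this kind, not from the core itself, which is shown here only to make the coupled objective concrete.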




Relevance: 20.00%

Abstract:

Low-energy electron diffraction (LEED) has become the most successful technique in surface crystallography [1], but because of the complexity of the surface-electron scattering interactions, analyses of LEED data are still conducted on a trial-and-error basis: a direct-inversion method for treating LEED intensity data remains an attractive goal [2]. Building on recent theoretical and experimental developments in electron holography from surface structures [3-16], we show here that three-dimensional images with atomic resolution can be obtained by a direct transform of conventional LEED intensity spectra.
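The paper's specific transform is not reproduced in this summary. As a heavily hedged illustration of the general idea only, a Barton/Helmholtz-Kirchhoff-type holographic inversion forms a phased sum of the normalized intensity modulation χ(k) over the measured wave vectors, so that scatterers show up as peaks of |A(r)|; the kernel below is that generic object-reference phase, not the kernel used in the paper.

```python
import numpy as np

def holographic_image(k_vectors, chi, grid_points):
    """Generic phased-sum holographic reconstruction from diffraction intensities.

    k_vectors  : (N, 3) scattering wave vectors (1/Angstrom)
    chi        : (N,) normalized modulation chi(k) = (I(k) - I0(k)) / I0(k)
    grid_points: (M, 3) real-space positions r to image (Angstrom)

    Returns |A(r)|^2 where A(r) = sum_k chi(k) * exp(i * (|k| |r| - k . r)).
    """
    k_mag = np.linalg.norm(k_vectors, axis=1)                    # (N,)
    r_mag = np.linalg.norm(grid_points, axis=1)                  # (M,)
    phase = np.outer(r_mag, k_mag) - grid_points @ k_vectors.T   # (M, N)
    A = (chi[None, :] * np.exp(1j * phase)).sum(axis=1)
    return np.abs(A) ** 2
```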

Relevance: 20.00%

Abstract:

Molecularly adsorbed CO on Pd{110} has been shown (R. Raval et al., Chem. Phys. Lett. 167 (1990) 391, ref. [1]) to induce a substantial reconstruction of the surface in the coverage range 0.3 < θ ≤ 0.75. Throughout this coverage range, the adsorbate-covered reconstructed surface exhibits a (4 x 2) LEED pattern. However, the exact nature of the reconstruction remains uncertain. We have conducted a LEED I(E) "fingerprinting" analysis of the CO/Pd{110}-(4 x 2) structure in order to establish the type of reconstruction induced in the metal surface. This study shows that the LEED I(E) profiles of the integral order and appropriate half-order beams of the CO/Pd{110}-(4 x 2) pattern closely resemble the I(E) profiles theoretically calculated for a Pd{110}-(1 x 2) missing-row structure. Additionally, there is a strong resemblance to the experimental LEED I(E) profiles for the Cs/Pd{110}-(1 x 2) structure which has also been shown to exhibit the missing-row structure. On the basis of this evidence we conclude that the CO/Pd{110}-(4 x 2) LEED pattern arises from a missing-row reconstruction of the Pd{110} surface which gives rise to a strong underlying (1 x 2) pattern plus a poorly ordered CO overlayer which produces weak, diffuse fourth-order spots in the LEED pattern.

Relevance: 20.00%

Abstract:

How can we correlate neural activity in the human brain as it responds to words with behavioral data expressed as answers to questions about these same words? In short, we want to find latent variables that explain both the brain activity and the behavioral responses. We show that this is an instance of the Coupled Matrix-Tensor Factorization (CMTF) problem. We propose Scoup-SMT, a novel, fast, and parallel algorithm that solves the CMTF problem and produces a sparse latent low-rank subspace of the data. In our experiments, we find that Scoup-SMT is 50-100 times faster than a state-of-the-art algorithm for CMTF, along with a 5-fold increase in sparsity. Moreover, we extend Scoup-SMT to handle missing data without degradation of performance. We apply Scoup-SMT to BrainQ, a dataset consisting of a (nouns, brain voxels, human subjects) tensor and a (nouns, properties) matrix, with coupling along the nouns dimension. Scoup-SMT is able to find meaningful latent variables, as well as to predict brain activity with competitive accuracy. Finally, we demonstrate the generality of Scoup-SMT by applying it to a Facebook dataset (users, friends, wall postings); there, Scoup-SMT spots spammer-like anomalies.
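The abstract states that missing data are handled without loss of performance, but the mechanism is not described in this summary. As an illustration only (not the authors' method), one generic way to make any CMTF solver tolerate missing entries is an EM-style imputation loop: fill the unobserved entries with the current model reconstruction, re-run the solver, and repeat. The sketch below assumes a solve_cmtf(X, Y, R) callable returning CP factors, such as the ALS core sketched earlier in this listing.

```python
import numpy as np

def cp_reconstruct(A, B, C):
    # Rebuild the (I, J, K) tensor from CP factors: X[i,j,k] = sum_r A[i,r] B[j,r] C[k,r]
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def cmtf_with_missing(X, mask, Y, solve_cmtf, R=5, n_outer=10):
    """EM-style imputation wrapper around an arbitrary CMTF solver.

    X    : (I, J, K) tensor; values at unobserved entries are ignored
    mask : boolean array, True where X was actually observed
    Y    : (I, M) coupled matrix (assumed fully observed here)
    solve_cmtf : callable (X, Y, R) -> (A, B, C, D)
    """
    X_filled = np.where(mask, X, X[mask].mean())    # crude initial fill
    for _ in range(n_outer):
        A, B, C, D = solve_cmtf(X_filled, Y, R)
        X_hat = cp_reconstruct(A, B, C)
        X_filled = np.where(mask, X, X_hat)         # keep observed data, impute the rest
    return A, B, C, D
```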

Relevance: 20.00%

Abstract:

Poster for a final project, Bachelor's degree in Urban Planning. Institut d'urbanisme, Université de Montréal.

Relevance: 20.00%

Abstract:

This thesis contributes to a general theory of project design. Set within a demand shaped by the challenges of sustainable development, the main objective of this research is to contribute a theoretical model of design that better situates the use of tools and standards for assessing the sustainability of a project. The fundamental principles of these normative instruments are analysed along four dimensions: ontological, methodological, epistemological and teleological. Indicators of certain counter-productive effects related, in particular, to the application of these standards confirm the need for a theory of qualitative judgement. Our main hypothesis builds on the conceptual framework offered by the notion of the "precautionary principle", whose earliest formulations date back to the early 1970s and which aimed precisely at remedying the shortcomings of traditional scientific assessment tools and methods. The thesis is divided into five parts. Beginning with a historical review of the classical models of design theory (design thinking), it focuses on how the consideration of sustainability has evolved. From this perspective, the theories of "green design" dating from the early 1960s and the theories of "ecological design" from the 1970s and 1980s ultimately converged with the more recent theories of "sustainable design" from the early 1990s onward. The various approaches to the precautionary principle are then examined from the standpoint of project sustainability. Risk-assessment standards are compared with approaches based on the precautionary principle, revealing certain limits in the design of a project. A first theoretical model of design integrating the main dimensions of the precautionary principle is thus sketched out. This model proposes an overall view for judging a project that integrates principles of sustainable development and presents itself as an alternative to traditional risk-assessment approaches, which are both deterministic and instrumental. The precautionary-principle hypothesis is then proposed and examined in the specific context of the architectural project. This exploration begins with a presentation of the classical notion of "prudence" as it was historically used to guide architectural judgement. What, then, of the challenges posed to the judgement of architectural projects by the rise of standardized assessment methods (e.g. Leadership in Energy and Environmental Design, LEED)? The thesis proposes a reinterpretation of design theory as formulated by Donald A. Schön as a way of taking assessment tools such as LEED into account. This exercise, however, reveals an epistemological obstacle that must be taken into account in a reformulation of the model. In keeping with constructivist epistemology, a new theoretical model is then confronted with the study and illustration of three contemporary Canadian architecture competitions that adopted the LEED standardized sustainability assessment method. A preliminary series of "tensions" is identified in the process of designing and judging the projects.
These tensions are then categorized into their conceptual counterparts, constructed at the intersection of the precautionary principle and design theories. They fall into four categories: (1) conceptualization: analogical/logical; (2) uncertainty: epistemological/methodological; (3) comparability: interpretive/analytical; and (4) proposition: universality/contextual relevance. These conceptual tensions are treated as so many vectors correlating with the theoretical model, which they help to enrich without constituting validations in the positivist sense of the term. These confrontations with real cases make it possible to define more precisely the epistemological obstacle identified earlier. The thesis thus highlights the generally underestimated impacts of environmental standardization on the process of designing and judging projects. It takes as a non-restrictive example the study of Canadian architecture competitions for public buildings. The conclusion underscores the need for a new form of "reflexive prudence" as well as a more critical use of current sustainability assessment tools. It calls for an instrumentation founded on the overall integration of environmental approaches rather than on their opposition.

Relevance: 20.00%

Abstract:

We had previously shown that regularization principles lead to approximation schemes, such as Radial Basis Functions, which are equivalent to networks with one layer of hidden units, called Regularization Networks. In this paper we show that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models, Breiman's hinge functions and some forms of Projection Pursuit Regression. In the probabilistic interpretation of regularization, the different classes of basis functions correspond to different classes of prior probabilities on the approximating function spaces, and therefore to different types of smoothness assumptions. In the final part of the paper, we also show a relation between activation functions of the Gaussian and sigmoidal type.
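For concreteness, the Gaussian case of such a regularization network is regularized least squares with one radial basis unit centred on each data point: f(x) = Σ_i c_i G(x − x_i), with coefficients solving (G + λI)c = y. A minimal NumPy sketch follows; the kernel width σ and regularization λ are illustrative choices, not values from the paper.

```python
import numpy as np

def gaussian_basis(X, centers, sigma=1.0):
    # G[i, j] = exp(-||x_i - c_j||^2 / (2 sigma^2)), one hidden unit per centre
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_regularization_network(X, y, lam=1e-2, sigma=1.0):
    # Coefficients of f(x) = sum_i c_i G(x - x_i): solve (G + lam * I) c = y
    G = gaussian_basis(X, X, sigma)
    return np.linalg.solve(G + lam * np.eye(len(X)), y)

def predict(X_train, c, X_new, sigma=1.0):
    # Evaluate the network at new inputs
    return gaussian_basis(X_new, X_train, sigma) @ c
```

Choosing a different basis (additive splines, hinge functions, and so on) corresponds, in the paper's probabilistic reading, to a different prior on the approximating function space.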

Relevance: 20.00%

Abstract:

Diffusion tensor magnetic resonance imaging, which measures directional information of water diffusion in the brain, has emerged as a powerful tool for human brain studies. In this paper, we introduce a new Monte Carlo-based fiber tracking approach to estimate brain connectivity. One of the main characteristics of this approach is that all parameters of the algorithm are automatically determined at each point using the entropy of the eigenvalues of the diffusion tensor. Experimental results show the good performance of the proposed approach.
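The key quantity named here, the entropy of the diffusion-tensor eigenvalues, can be sketched directly; how that entropy is then mapped to the tracking parameters is the paper's contribution and is not reproduced. A minimal NumPy version, with function names and the normalization by log 3 as illustrative assumptions:

```python
import numpy as np

def eigenvalue_entropy(D):
    """Shannon entropy of the normalized eigenvalues of a 3x3 diffusion tensor.

    Low entropy  -> strongly anisotropic diffusion (one dominant direction);
    high entropy -> nearly isotropic diffusion.
    """
    eigvals = np.linalg.eigvalsh(D)            # real, ascending for symmetric D
    eigvals = np.clip(eigvals, 1e-12, None)    # guard against non-positive values
    p = eigvals / eigvals.sum()
    return float(-(p * np.log(p)).sum() / np.log(3.0))   # normalized to [0, 1]
```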

Relevance: 20.00%

Abstract:

Diffusion Tensor Imaging (DTI) is a new magnetic resonance imaging modality capable of producing quantitative maps of the microscopic natural displacements of water molecules that occur in brain tissues as part of the physical diffusion process. This technique has become a powerful tool in the investigation of brain structure and function because it allows for in vivo measurements of white matter fiber orientation. The application of DTI in clinical practice requires specialized processing and visualization techniques to extract and represent the acquired information in a comprehensible manner. Tracking techniques are used to infer patterns of continuity in the brain by following, in a step-wise manner, the path of a set of particles dropped into a vector field. In this way, white matter fiber maps can be obtained.
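As a hedged sketch of the step-wise tracking idea described above (not any particular clinical algorithm): follow the principal eigenvector of the local diffusion tensor with small Euler steps, stopping when the anisotropy drops or the particle leaves the volume. Step size and stopping threshold are illustrative assumptions.

```python
import numpy as np

def track_fiber(tensor_field, seed, step=0.5, max_steps=2000, fa_stop=0.15):
    """Follow the principal diffusion direction from a seed point (Euler steps).

    tensor_field : (X, Y, Z, 3, 3) array of diffusion tensors
    seed         : starting position in voxel coordinates, shape (3,)
    """
    def principal_direction_and_fa(D):
        w, V = np.linalg.eigh(D)                       # eigenvalues in ascending order
        w = np.clip(w, 0.0, None)
        md = w.mean()
        fa = np.sqrt(1.5 * ((w - md) ** 2).sum() / max((w ** 2).sum(), 1e-12))
        return V[:, -1], fa                            # direction of the largest eigenvalue

    path = [np.asarray(seed, dtype=float)]
    prev_dir = None
    for _ in range(max_steps):
        i, j, k = np.round(path[-1]).astype(int)
        if not (0 <= i < tensor_field.shape[0]
                and 0 <= j < tensor_field.shape[1]
                and 0 <= k < tensor_field.shape[2]):
            break                                      # left the imaged volume
        direction, fa = principal_direction_and_fa(tensor_field[i, j, k])
        if fa < fa_stop:
            break                                      # diffusion too isotropic to follow
        if prev_dir is not None and direction @ prev_dir < 0:
            direction = -direction                     # keep a consistent orientation
        path.append(path[-1] + step * direction)
        prev_dir = direction
    return np.array(path)
```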

Relevance: 20.00%

Abstract:

We study complete continuity properties of operators onto ℓ₂ and prove several results in the Dunford–Pettis theory of JB∗-triples and their projective tensor products, culminating in characterisations of the alternative Dunford–Pettis property for the projective tensor product of E and F, where E and F are JB∗-triples.

Relevance: 20.00%

Abstract:

A quantitative low-energy electron diffraction (LEED) analysis has been performed for the p(2 x 2)-S and c(2 x 2)-S surface structures formed by exposing the (1 x 1) phase of Ir{100} to H2S at 750 K. S is found to adsorb on the fourfold hollow sites in both structures, leading to Pendry R-factor values of 0.17 for the p(2 x 2)-S and 0.16 for the c(2 x 2)-S structures. The distances between S and the nearest and next-nearest Ir atoms were found to be similar in both structures: 2.36 ± 0.01 Å and 3.33 ± 0.01 Å, respectively. The buckling in the second substrate layer is consistent with other structural studies of S adsorption on fcc{100} transition metal surfaces: 0.09 Å for the p(2 x 2)-S and 0.02 Å for the c(2 x 2)-S structures. The (1 x 5) reconstruction, which is the most stable phase of clean Ir{100}, is completely lifted and a c(2 x 2)-S overlayer is formed after exposure to H2S at 300 K followed by annealing to 520 K. CO temperature-programmed desorption (TPD) experiments indicate that the major factor in the poisoning of Ir by S is site blocking.
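The Pendry R-factor values quoted above compare experimental and calculated I(E) curves through their logarithmic derivatives. A minimal sketch of that comparison follows; the common energy grid and the value of the imaginary inner potential V0i are assumptions (a few eV is typical).

```python
import numpy as np

def pendry_r_factor(E, I_exp, I_the, v0i=4.0):
    """Pendry R-factor between experimental and calculated LEED I(E) curves.

    E     : energies (eV), common grid for both curves
    I_exp : experimental intensities
    I_the : calculated intensities
    v0i   : imaginary part of the inner potential (eV)
    """
    def y_function(I):
        # Y = L / (1 + V0i^2 L^2) with L = I'/I, the logarithmic derivative
        L = np.gradient(I, E) / np.clip(I, 1e-12, None)
        return L / (1.0 + (v0i * L) ** 2)

    Y1, Y2 = y_function(I_exp), y_function(I_the)
    num = np.trapz((Y1 - Y2) ** 2, E)
    den = np.trapz(Y1 ** 2 + Y2 ** 2, E)
    return num / den
```

A value near 0 indicates closely matching curves; values around 0.2, as reported here, are conventionally taken as a good structural fit.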