992 results for Basis-set


Relevance: 20.00%

Abstract:

Chapter 20: Clustering User Data for User Modelling in the GUIDE Multi-modal Set-top Box. P. M. Langdon and P. Biswas. 20.1 ... It utilises advanced user modelling and simulation in conjunction with a single-layer interface that permits a ...

Relevance: 20.00%

Abstract:

Reusing steel and aluminum components would reduce the need for new production, possibly creating significant savings in carbon emissions. Currently, there is no clearly defined set of strategies or barriers to enable assessment of appropriate component reuse; neither is it possible to predict future levels of reuse. This work presents a global assessment of the potential for reusing steel and aluminum components. A combination of top-down and bottom-up analyses is used to allocate the final destinations of current global steel and aluminum production to product types. A substantial catalogue has been compiled for these products characterizing key features of steel and aluminum components, including design specifications, requirements in use, and current reuse patterns. To estimate the fraction of end-of-life metal components that could be reused for each product, the catalogue formed the basis of a set of semi-structured interviews with industrial experts. The results suggest that approximately 30% of steel and aluminum used in current products could be reused. Barriers to reuse are examined, prompting recommendations for redesign that would facilitate future reuse.

Relevance: 20.00%

Abstract:

We review some recently published methods to represent atomic neighbourhood environments, and analyse their relative merits in terms of their faithfulness and suitability for fitting potential energy surfaces. The crucial properties that such representations (sometimes called descriptors) must have are differentiability with respect to moving the atoms, and invariance to the basic symmetries of physics: rotation, reflection, translation, and permutation of atoms of the same species. We demonstrate that certain widely used descriptors that initially look quite different are specific cases of a general approach, in which a finite set of basis functions with increasing angular wave numbers are used to expand the atomic neighbourhood density function. Using the example system of small clusters, we quantitatively show that this expansion needs to be carried to higher and higher wave numbers as the number of neighbours increases in order to obtain a faithful representation, and that variants of the descriptors converge at very different rates. We also propose an altogether new approach, called Smooth Overlap of Atomic Positions (SOAP), that sidesteps these difficulties by directly defining the similarity between any two neighbourhood environments, and show that it is still closely connected to the invariant descriptors. We test the performance of the various representations by fitting models to the potential energy surface of small silicon clusters and the bulk crystal.
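
To make the expansion concrete, here is a minimal Python sketch, not the authors' code: it expands a delta-function neighbour density in Gaussian radial basis functions and spherical harmonics, then forms the rotation-invariant power spectrum from which a SOAP-style similarity follows. All function names, basis choices, and parameter values are illustrative assumptions.

```python
# Illustrative sketch of an invariant neighbourhood descriptor; assumes a
# delta-function neighbour density and Gaussian radial basis functions.
import numpy as np
from scipy.special import sph_harm

def radial_basis(r, n_max=4, r_cut=5.0, width=0.5):
    """Evaluate n_max Gaussian radial basis functions at the distances r."""
    centres = np.linspace(0.0, r_cut, n_max)
    return np.exp(-(r[:, None] - centres[None, :]) ** 2 / (2 * width ** 2))

def power_spectrum(positions, n_max=4, l_max=3):
    """Rotation-invariant power spectrum of one atomic neighbourhood."""
    r = np.linalg.norm(positions, axis=1)
    theta = np.arctan2(positions[:, 1], positions[:, 0])       # azimuthal angle
    phi = np.arccos(np.clip(positions[:, 2] / r, -1.0, 1.0))   # polar angle
    g = radial_basis(r, n_max)            # shape (n_neighbours, n_max)
    blocks = []
    for l in range(l_max + 1):
        # c[m, n] = sum_i g_n(r_i) * conj(Y_lm(theta_i, phi_i))
        c = np.stack([(g * np.conj(sph_harm(m, l, theta, phi))[:, None]).sum(0)
                      for m in range(-l, l + 1)])
        # p[n, n'] = sum_m c[m, n] * conj(c[m, n']) is invariant to rotation
        blocks.append(np.real(np.einsum('mn,mp->np', c, np.conj(c))).ravel())
    return np.concatenate(blocks)

def similarity(env_a, env_b):
    """Normalised dot product of power spectra: a SOAP-style kernel."""
    pa, pb = power_spectrum(env_a), power_spectrum(env_b)
    return float(pa @ pb / (np.linalg.norm(pa) * np.linalg.norm(pb)))
```

Because the sum over m cancels the Wigner rotation of the expansion coefficients, rotating either environment leaves the similarity unchanged, which is the invariance property the abstract emphasises.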

Relevance: 20.00%

Abstract:

When searching for characteristic subpatterns in potentially noisy graph data, it appears self-evident that having multiple observations would be better than having just one. However, it turns out that the inconsistencies introduced when different graph instances have different edge sets pose a serious challenge. In this work we address this challenge for the problem of finding maximum weighted cliques. We introduce the concept of the most persistent soft-clique: a subset of vertices that 1) is almost fully or at least densely connected, 2) occurs in all or almost all graph instances, and 3) has the maximum weight. We present a measure of clique-ness that essentially counts the number of edges missing to make a subset of vertices into a clique. With this measure, we show that the problem of finding the most persistent soft-clique can be cast either as a) a max-min two-person game optimization problem, or b) a min-min soft-margin optimization problem. Both formulations lead to the same solution when using a partial Lagrangian method to solve the optimization problems. Through experiments on synthetic data and on real social network data, we show that the proposed method reliably finds soft cliques in graph data, even when the data are distorted by random noise or unreliable observations. Copyright 2012 by the author(s)/owner(s).
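
The clique-ness measure described above can be illustrated directly: the sketch below simply counts missing edges for a candidate subset. The graph representation and names are assumptions for illustration, not the paper's code.

```python
# Count how many edges a vertex subset is missing before it forms a clique.
import numpy as np

def missing_edges(adj, subset):
    """Edges absent among `subset` in the symmetric 0/1 adjacency matrix."""
    k = len(subset)
    sub = adj[np.ix_(subset, subset)]
    present = int(np.triu(sub, k=1).sum())    # edges actually present
    return k * (k - 1) // 2 - present         # complete-graph count minus present

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 1],
                [1, 1, 0, 0],
                [0, 1, 0, 0]])
print(missing_edges(adj, [0, 1, 2]))  # 0: {0, 1, 2} is a clique
print(missing_edges(adj, [0, 1, 3]))  # 1: edge (0, 3) is missing
```

A subset is a perfect clique when the count is zero; a soft-clique tolerates a small positive count, traded off against weight and persistence across graph instances.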

Relevance: 20.00%

Abstract:

In a wind-turbine gearbox, planet bearings exhibit a high failure rate and are considered one of the most critical components. Development of efficient vibration-based fault detection methods for these bearings requires a thorough understanding of their vibration signature. Much work has been done to study the vibration properties of healthy planetary gear sets and to identify fault frequencies in fixed-axis bearings. However, vibration characteristics of planetary gear sets containing localized planet bearing defects (spalls or pits) have not been studied so far. In this paper, we propose a novel analytical model of a planetary gear set with ring gear flexibility and localized bearing defects as two key features. The model is used to simulate the vibration response of a planetary system in the presence of a defective planet bearing with faults on the inner or outer raceway. The characteristic fault signature of a planetary bearing defect is determined and sources of modulation sidebands are identified. The findings from this work will be useful to improve existing sensor placement strategies and to develop more sophisticated fault detection algorithms. Copyright © 2011 by ASME.
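
For context on where such fault frequencies come from, the classical kinematic formulas for rolling-element bearings are sketched below. Taking the planet's spin rate relative to the carrier as the reference rotation is an assumption of this sketch, and all names are illustrative; the paper's full analytical model is not reproduced.

```python
# Classical ball-pass frequencies for a rolling-element bearing; for a planet
# bearing, the reference rate f_r is assumed (for this sketch) to be the
# planet's spin frequency relative to the carrier.
import math

def ball_pass_frequencies(f_r, n_balls, d_ball, d_pitch, contact_deg=0.0):
    """Return (BPFO, BPFI): outer- and inner-raceway defect frequencies in Hz."""
    ratio = (d_ball / d_pitch) * math.cos(math.radians(contact_deg))
    bpfo = 0.5 * n_balls * f_r * (1.0 - ratio)  # outer-race defect frequency
    bpfi = 0.5 * n_balls * f_r * (1.0 + ratio)  # inner-race defect frequency
    return bpfo, bpfi

# Example: 8 rollers, 10 mm rollers on a 50 mm pitch circle, 25 Hz spin rate
print(ball_pass_frequencies(25.0, 8, 10.0, 50.0))
```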

Relevance: 20.00%

Abstract:

The brain encodes visual information with limited precision. Contradictory evidence exists as to whether the precision with which an item is encoded depends on the number of stimuli in a display (set size). Some studies have found evidence that precision decreases with set size, but others have reported constant precision. These groups of studies differed in two ways. The studies that reported a decrease used displays with heterogeneous stimuli and tasks with a short-term memory component, while the ones that reported constancy used homogeneous stimuli and tasks that did not require short-term memory. To disentangle the effects of heterogeneity and short-term memory involvement, we conducted two main experiments. In Experiment 1, stimuli were heterogeneous, and we compared a condition in which target identity was revealed before the stimulus display with one in which it was revealed afterward. In Experiment 2, target identity was fixed, and we compared heterogeneous and homogeneous distractor conditions. In both experiments, we compared an optimal-observer model in which precision is constant with set size with one in which it depends on set size. We found that precision decreases with set size when the distractors are heterogeneous, regardless of whether short-term memory is involved, but not when they are homogeneous. This suggests that heterogeneity, not short-term memory, is the critical factor. In addition, we found that precision exhibits variability across items and trials, which may partly be caused by attentional fluctuations.
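
The closing observation, that precision varies across items and trials, is commonly formalised by drawing precision from a gamma distribution. The sketch below illustrates that idea under assumed parameter names and values; it is not the paper's fitted model.

```python
# Variable-precision encoding sketch: mean precision falls with set size, and
# per-item precision is gamma-distributed around that mean.
import numpy as np

rng = np.random.default_rng(0)

def encode(stimuli, J1=60.0, alpha=1.0, tau=0.3):
    """Noisy measurements of circular `stimuli` (radians) at set size N."""
    N = len(stimuli)
    J_bar = J1 * N ** (-alpha)          # mean precision decreases with set size
    J = rng.gamma(J_bar / tau, tau, N)  # item- and trial-level variability
    # Precision is used directly as the von Mises concentration here; this is
    # an approximation for brevity (strictly, J is a function of kappa).
    return rng.vonmises(stimuli, J)
```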

Relevance: 20.00%

Abstract:

Accurate and efficient computation of the distance function d for a given domain is important for many areas of numerical modeling. Distance function algorithms based on partial differential equations (e.g. of Hamilton-Jacobi type) offer desirable computational efficiency and accuracy. In this study, as an alternative, a Poisson equation based level set (distance function) is considered and solved using the meshless boundary element method (BEM). Its application to shape topology analysis, including the medial axis for domain decomposition, geometric de-featuring, and other aspects of numerical modeling, is assessed. © 2011 Elsevier Ltd. All rights reserved.
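
As a minimal illustration of the Poisson-based distance idea, the 1D sketch below uses finite differences rather than the paper's boundary element method, together with one common recovery formula; all details are assumptions for illustration.

```python
# Solve the Poisson problem u'' = -1, u(0) = u(1) = 0, then recover the wall
# distance via d = -|u'| + sqrt(u'^2 + 2u), which is exact for this 1D case.
import numpy as np

n, h = 101, 1.0 / 100
x = np.linspace(0.0, 1.0, n)

# Central-difference Laplacian on the interior nodes.
A = (np.diag(-2.0 * np.ones(n - 2)) + np.diag(np.ones(n - 3), 1)
     + np.diag(np.ones(n - 3), -1)) / h ** 2
u = np.zeros(n)
u[1:-1] = np.linalg.solve(A, -np.ones(n - 2))

grad = np.gradient(u, h)                          # |u'| by finite differences
d = -np.abs(grad) + np.sqrt(grad ** 2 + 2.0 * u)  # distance recovery formula
print(np.max(np.abs(d - np.minimum(x, 1.0 - x)))) # error is essentially zero
```

In higher dimensions the recovered d is only an approximation to the true distance, which is why the paper assesses its fitness for medial-axis and de-featuring applications.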

Relevance: 20.00%

Abstract:

Change detection is a classic paradigm that has been used for decades to argue that working memory can hold no more than a fixed number of items ("item-limit models"). Recent findings force us to consider the alternative view that working memory is limited by the precision in stimulus encoding, with mean precision decreasing with increasing set size ("continuous-resource models"). Most previous studies that used the change detection paradigm have ignored effects of limited encoding precision by using highly discriminable stimuli and only large changes. We conducted two change detection experiments (orientation and color) in which change magnitudes were drawn from a wide range, including small changes. In a rigorous comparison of five models, we found no evidence of an item limit. Instead, human change detection performance was best explained by a continuous-resource model in which encoding precision is variable across items and trials even at a given set size. This model accounts for comparison errors in a principled, probabilistic manner. Our findings sharply challenge the theoretical basis for most neural studies of working memory capacity.
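
To make the contrast between the two model classes concrete, a toy item-limit observer is sketched below; the parameter names (K slots, guess rate g) are assumptions, not the paper's notation.

```python
# Toy item-limit observer: a change is noticed only if the changed item
# happened to occupy one of the K memory slots; otherwise the observer guesses.
def item_limit_hit_rate(N, K=3, g=0.5):
    """P(report 'change') at set size N under a K-slot model."""
    p_stored = min(K, N) / N          # chance the changed item was stored
    return p_stored + (1.0 - p_stored) * g

for N in (1, 2, 4, 8):
    print(N, round(item_limit_hit_rate(N), 3))
```

A continuous-resource model instead stores every item but with precision that falls as N grows, so small changes are missed increasingly often rather than performance hitting a hard item limit; the paper's data favour the latter pattern.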

Relevance: 20.00%

Abstract:

In a companion paper (McRobie (2013), arXiv:1304.3918), a simple set of 'elemental' estimators was presented for the Generalized Pareto tail parameter. Each elemental estimator involves only three log-spacings; is absolutely unbiased for all values of the tail parameter; is location- and scale-invariant; and is valid for all sample sizes $N$, even as small as $N = 3$. It was suggested that linear combinations of such elementals could then be used to construct efficient unbiased estimators. In this paper, the analogous mathematical approach is taken to the Generalised Extreme Value (GEV) distribution. The resulting elemental estimators, although not absolutely unbiased, are found to have very small bias, and may thus provide a useful basis for the construction of efficient estimators.
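
The elemental estimators themselves are not reproduced here. As a related illustration only, the classical Pickands (1975) estimator likewise builds a tail-parameter estimate from spacings of order statistics:

```python
# Pickands estimator: tail parameter from spacings of order statistics.
import numpy as np

def pickands(sample, k):
    """Pickands estimate of the extreme-value tail parameter xi."""
    x = np.sort(sample)[::-1]          # descending order statistics
    num = x[k - 1] - x[2 * k - 1]      # spacing of upper order statistics
    den = x[2 * k - 1] - x[4 * k - 1]
    return np.log(num / den) / np.log(2.0)

rng = np.random.default_rng(1)
data = rng.pareto(2.0, 20_000)         # heavy tail with true xi = 1/2
print(pickands(data, k=500))           # should land near 0.5
```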

Relevance: 20.00%

Abstract:

Looking for a target in a visual scene becomes more difficult as the number of stimuli increases. In a signal detection theory view, this is due to the cumulative effect of noise in the encoding of the distractors, and potentially, on top of that, to an increase of the noise (i.e., a decrease of precision) per stimulus with set size, reflecting divided attention. It has long been argued that human visual search behavior can be accounted for by the first factor alone. While such an account seems to be adequate for search tasks in which all distractors have the same, known feature value (i.e., are maximally predictable), we recently found a clear effect of set size on encoding precision when distractors are drawn from a uniform distribution (i.e., when they are maximally unpredictable). Here we interpolate between these two extreme cases to examine which of the two conclusions holds more generally as distractor statistics are varied. In one experiment, we vary the level of distractor heterogeneity; in another, we dissociate distractor homogeneity from predictability. In all conditions in both experiments, we found a strong decrease of precision with increasing set size, suggesting that precision being independent of set size is the exception rather than the rule.
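
The first factor, cumulative distractor noise, can be demonstrated with a toy max-rule observer: even with per-stimulus precision held fixed, accuracy drops as set size grows. The sketch below is illustrative, not the paper's model.

```python
# Max-rule signal detection: report 'target present' iff the largest response
# exceeds a criterion. Accuracy falls with N from distractor noise alone.
import numpy as np

rng = np.random.default_rng(2)

def max_rule_accuracy(N, d_prime=2.0, trials=100_000):
    """Accuracy of a max-rule observer at set size N with fixed precision."""
    noise = rng.normal(size=(trials, N))
    signal = noise.copy()
    signal[:, 0] += d_prime                 # one stimulus carries the target
    criterion = d_prime / 2.0
    hits = (signal.max(axis=1) > criterion).mean()
    correct_rejections = (noise.max(axis=1) <= criterion).mean()
    return 0.5 * (hits + correct_rejections)

for N in (1, 2, 4, 8):
    print(N, round(max_rule_accuracy(N), 3))  # accuracy falls as N grows
```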

Relevance: 20.00%

Abstract:

In this paper, we tackle the problem of learning a linear regression model whose parameter is a fixed-rank matrix. We study the Riemannian manifold geometry of the set of fixed-rank matrices and develop efficient line-search algorithms. The proposed algorithms have many applications, scale to high-dimensional problems, enjoy local convergence properties and confer a geometric basis to recent contributions on learning fixed-rank matrices. Numerical experiments on benchmarks suggest that the proposed algorithms compete with the state-of-the-art, and that manifold optimization offers a versatile framework for the design of rank-constrained machine learning algorithms. Copyright 2011 by the author(s)/owner(s).
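
A minimal sketch of one projected step of such a line search is given below, assuming a truncated-SVD retraction; it is an illustration of the general idea, not the paper's algorithms.

```python
# One gradient step for linear regression over fixed-rank matrices, using
# truncated SVD to retract the iterate back onto the rank-r set.
import numpy as np

def retract(M, r):
    """Project M onto the rank-r manifold via truncated SVD."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def gradient_step(W, X, y, r, lr=0.01):
    """One descent step for the loss sum_i (<W, X_i> - y_i)^2 over rank-r W."""
    residuals = np.einsum('ijk,jk->i', X, W) - y  # predictions minus targets
    grad = np.einsum('i,ijk->jk', residuals, X) / len(y)
    return retract(W - lr * grad, r)              # step, then retract
```

Truncated-SVD retraction is one standard choice; the paper's line-search algorithms exploit the Riemannian geometry of the fixed-rank set, which this sketch does not attempt to reproduce.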