959 results for Set-Valued Mapping


Relevance: 20.00%

Abstract:

Effective management is key to ensuring the current and future sustainability of land, water and energy resources. Identifying the complexities of such management is not an easy task, especially since past studies have focussed on these resources in isolation from one another. However, with rapid population growth and increasing awareness that changing climatic conditions may affect the demand for and supply of food, water and energy, there has been a growing need to integrate planning decisions relating to these three resources. The paper visualises linked resources by drawing a set of interconnected Sankey diagrams for energy, water and land. These track the changes from basic resources (e.g. coal, surface water, groundwater and cropland) through transformations (e.g. fuel refining and desalination) to final services (e.g. sustenance, hygiene and transportation). The focus here is on the water analysis aspects of the tool, which uses California as a detailed case study. The movement of water in California is traced from source to service by mapping the different transformations of water from when it becomes available, through its use, to further treatment, to final sinks (including recycling and reuse of that resource). The connections that water has with energy and land resources in California are highlighted: the amount of energy used to pump and treat water, the amount of water used for energy production, and the land resources that create a water demand by producing crops for food. By mapping water in this way, policy-makers and resource managers can more easily understand the competing uses of water (environment, agriculture and urban use) through the identification of the services it delivers (e.g. sanitation, agriculture, landscaping), the potential opportunities for improving the management of the resource (e.g. building new desalination plants, reducing the demand for services), and the connections with other resources which are often overlooked in a traditional sector-based management strategy.
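The source-to-service tracking described above can be sketched as a directed flow graph. The toy example below (node names and volumes are illustrative, not the paper's California data) checks the core consistency property a Sankey diagram visualizes: at every transformation node, inflow equals outflow.

```python
from collections import defaultdict

# Hypothetical water flows: (source, target, volume in km^3/yr).
# Node names and numbers are illustrative only.
flows = [
    ("surface water", "treatment", 20.0),
    ("groundwater", "treatment", 10.0),
    ("treatment", "urban use", 12.0),
    ("treatment", "agriculture", 18.0),
    ("urban use", "recycling", 4.0),
    ("recycling", "agriculture", 4.0),
]

def node_balance(flows):
    """Return net inflow minus outflow for every node."""
    balance = defaultdict(float)
    for src, dst, vol in flows:
        balance[src] -= vol
        balance[dst] += vol
    return dict(balance)

balance = node_balance(flows)
# Intermediate (transformation) nodes should balance to ~0;
# sources are net negative, final services net positive.
assert abs(balance["treatment"]) < 1e-9
assert abs(balance["recycling"]) < 1e-9
```

A balance check like this is what makes a flow map trustworthy before it is rendered as a diagram.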

Relevance: 20.00%

Abstract:

Commercial far-range (>10 m) spatial data collection methods for acquiring infrastructure geometric data are not completely automated because of the manual pre- and/or post-processing work they require. The amount of human intervention required and, in some cases, the high equipment costs associated with these methods impede their adoption in the majority of infrastructure mapping activities. This paper presents an automated stereo vision-based method, as an alternative and inexpensive solution, for producing a sparse Euclidean 3D point cloud of an infrastructure scene from two video streams captured by a pair of calibrated cameras. In this process, SURF features are automatically detected and matched between each pair of stereo video frames, and 3D coordinates of the matched feature points are calculated via triangulation. The detected SURF features in two successive video frames are automatically matched, and the RANSAC algorithm is used to discard mismatches. The quaternion motion estimation method is then used along with bundle adjustment optimization to register successive point clouds. The method was tested on a database of infrastructure stereo video streams. The validity and statistical significance of the results were evaluated by comparing the spatial distances of randomly selected feature points with their corresponding tape measurements.
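The triangulation step mentioned above can be illustrated in isolation. The sketch below uses the standard linear (DLT) two-view triangulation with synthetic projection matrices; it is not the paper's pipeline (SURF detection and RANSAC are omitted, and the camera geometry here is made up for the example).

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one matched point from two views.
    P1, P2: 3x4 projection matrices; x1, x2: (u, v) image coordinates."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)      # null vector = homogeneous 3D point
    X = Vt[-1]
    return X[:3] / X[3]              # de-homogenize

# Synthetic calibrated rig: identity intrinsics, 1 m baseline along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.5, 0.2, 4.0])
x1 = P1 @ np.append(X_true, 1.0)
x2 = P2 @ np.append(X_true, 1.0)
x1, x2 = x1[:2] / x1[2], x2[:2] / x2[2]   # project to image coordinates

X_est = triangulate(P1, P2, x1, x2)
assert np.allclose(X_est, X_true)
```

With noise-free matches the DLT recovers the point exactly; with real matched features the residual of this linear system is what bundle adjustment subsequently minimizes.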

Relevance: 20.00%

Abstract:

Commercial far-range (>10 m) infrastructure spatial data collection methods are not completely automated: they require a significant amount of manual post-processing work and, in some cases, significant equipment costs. This paper presents a method that is the first step of a stereo videogrammetric framework and holds the promise of addressing these issues. Under this method, video streams are initially collected from a calibrated set of two video cameras. For each pair of simultaneous video frames, visual feature points are detected and their spatial coordinates are computed. The result, in the form of a sparse 3D point cloud, is the basis for the next steps in the framework (i.e., camera motion estimation and dense 3D reconstruction). A data set collected from an ongoing infrastructure project is used to show the merits of the method. A comparison with existing tools is also presented to indicate how the proposed method differs in level of automation and accuracy of results.
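For a rectified, calibrated stereo pair, the per-frame spatial coordinate computation reduces to back-projection from disparity. The sketch below shows that geometry with hypothetical rig parameters (focal length and baseline are invented for the example, not taken from the paper).

```python
import numpy as np

def stereo_to_3d(uL, vL, uR, f, B, cx=0.0, cy=0.0):
    """Back-project one matched feature from a rectified stereo pair.
    f: focal length (px); B: baseline (m). Assumes row-aligned matches."""
    d = uL - uR                  # disparity in pixels
    Z = f * B / d                # depth from similar triangles
    X = (uL - cx) * Z / f
    Y = (vL - cy) * Z / f
    return np.array([X, Y, Z])

# Hypothetical rig: 800 px focal length, 0.3 m baseline.
p = stereo_to_3d(uL=440.0, vL=120.0, uR=400.0, f=800.0, B=0.3)
# 40 px disparity -> Z = 800 * 0.3 / 40 = 6 m
assert abs(p[2] - 6.0) < 1e-9
```

Repeating this for every matched feature in a frame pair yields exactly the sparse 3D point cloud the abstract describes.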

Relevance: 20.00%

Abstract:

When searching for characteristic subpatterns in potentially noisy graph data, it appears self-evident that having multiple observations would be better than having just one. However, it turns out that the inconsistencies introduced when different graph instances have different edge sets pose a serious challenge. In this work we address this challenge for the problem of finding maximum weighted cliques. We introduce the concept of the most persistent soft-clique: a subset of vertices that (1) is fully or at least densely connected, (2) occurs in all or almost all graph instances, and (3) has maximum weight. We present a measure of clique-ness that essentially counts the number of edges missing to make a subset of vertices into a clique. With this measure, we show that the problem of finding the most persistent soft-clique can be cast either as (a) a max-min two-person game optimization problem or (b) a min-min soft-margin optimization problem. Both formulations lead to the same solution when a partial Lagrangian method is used to solve the optimization problems. Through experiments on synthetic data and on real social network data, we show that the proposed method reliably finds soft cliques in graph data, even when the data are distorted by random noise or unreliable observations. Copyright 2012 by the author(s)/owner(s).
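The clique-ness measure described above (counting edges missing to complete a clique) is straightforward to state directly; the sketch below implements only that counting step on a toy graph, not the game-theoretic or soft-margin optimization itself.

```python
from itertools import combinations

def missing_edges(vertices, edges):
    """Count edges missing to make `vertices` a clique.
    `edges` is a set of frozenset pairs (undirected graph)."""
    return sum(
        1 for u, v in combinations(sorted(vertices), 2)
        if frozenset((u, v)) not in edges
    )

# Toy graph: a triangle 0-1-2 plus a pendant vertex 3 attached to 2.
edges = {frozenset(e) for e in [(0, 1), (0, 2), (1, 2), (2, 3)]}
assert missing_edges({0, 1, 2}, edges) == 0      # a true clique
assert missing_edges({0, 1, 2, 3}, edges) == 2   # 0-3 and 1-3 are absent
```

A soft-clique tolerates a small value of this count; averaging (or taking a worst case) over multiple graph instances is what makes the subset "persistent".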

Relevance: 20.00%

Abstract:

In a wind-turbine gearbox, planet bearings exhibit a high failure rate and are considered among the most critical components. Developing efficient vibration-based fault detection methods for these bearings requires a thorough understanding of their vibration signature. Much work has been done to study the vibration properties of healthy planetary gear sets and to identify fault frequencies in fixed-axis bearings. However, the vibration characteristics of planetary gear sets containing localized planet bearing defects (spalls or pits) have not been studied so far. In this paper, we propose a novel analytical model of a planetary gear set with ring gear flexibility and localized bearing defects as two key features. The model is used to simulate the vibration response of a planetary system in the presence of a defective planet bearing with faults on the inner or outer raceway. The characteristic fault signature of a planetary bearing defect is determined and the sources of modulation sidebands are identified. The findings from this work will be useful for improving existing sensor placement strategies and for developing more sophisticated fault detection algorithms. Copyright © 2011 by ASME.
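The fixed-axis fault frequencies mentioned above follow from standard bearing kinematics (ball-pass frequencies for outer and inner raceway defects). The sketch below uses those textbook formulas with illustrative geometry, not the paper's planetary model; in a planetary gear set these frequencies are additionally modulated by the carrier rotation.

```python
import math

def bearing_fault_freqs(f_shaft, n_balls, d_ball, d_pitch, contact_deg=0.0):
    """Standard kinematic fault frequencies of a rolling-element bearing.
    f_shaft: relative shaft speed (Hz). Returns (BPFO, BPFI) in Hz."""
    r = (d_ball / d_pitch) * math.cos(math.radians(contact_deg))
    bpfo = 0.5 * n_balls * f_shaft * (1.0 - r)   # outer-raceway defect
    bpfi = 0.5 * n_balls * f_shaft * (1.0 + r)   # inner-raceway defect
    return bpfo, bpfi

# Illustrative planet-bearing geometry (not taken from the paper).
bpfo, bpfi = bearing_fault_freqs(f_shaft=10.0, n_balls=8,
                                 d_ball=0.01, d_pitch=0.05)
# r = 0.2 -> BPFO = 32 Hz, BPFI = 48 Hz
assert abs(bpfo - 32.0) < 1e-9 and abs(bpfi - 48.0) < 1e-9
```

An inner-raceway defect passes through the load zone once per shaft revolution, which is one source of the modulation sidebands the paper analyzes.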

Relevance: 20.00%

Abstract:

The brain encodes visual information with limited precision. Contradictory evidence exists as to whether the precision with which an item is encoded depends on the number of stimuli in a display (set size). Some studies have found evidence that precision decreases with set size, but others have reported constant precision. These groups of studies differed in two ways: the studies that reported a decrease used displays with heterogeneous stimuli and tasks with a short-term memory component, while those that reported constancy used homogeneous stimuli and tasks that did not require short-term memory. To disentangle the effects of heterogeneity and short-term memory involvement, we conducted two main experiments. In Experiment 1, stimuli were heterogeneous, and we compared a condition in which target identity was revealed before the stimulus display with one in which it was revealed afterward. In Experiment 2, target identity was fixed, and we compared heterogeneous and homogeneous distractor conditions. In both experiments, we compared an optimal-observer model in which precision is constant with set size against one in which it depends on set size. We found that precision decreases with set size when the distractors are heterogeneous, regardless of whether short-term memory is involved, but not when they are homogeneous. This suggests that heterogeneity, not short-term memory, is the critical factor. In addition, we found that precision varies across items and trials, which may partly be caused by attentional fluctuations.
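The contrast between the two model variants compared above can be illustrated with a common parameterization (an assumption here, not the paper's exact model): per-item precision J(N) = J1 · N^(-alpha), where alpha = 0 gives set-size-independent precision and alpha > 0 gives a decrease with set size. Precision is inverse variance, so encoding noise has standard deviation J^(-1/2).

```python
import random

def simulate_error_sd(set_size, J1=10.0, alpha=1.0, trials=20000, seed=0):
    """Sample SD of encoding errors when precision J = J1 * N**(-alpha).
    Illustrative power-law parameterization; precision = inverse variance."""
    rng = random.Random(seed)
    sd = (J1 * set_size ** (-alpha)) ** -0.5
    errs = [rng.gauss(0.0, sd) for _ in range(trials)]
    mean = sum(errs) / trials
    return (sum((e - mean) ** 2 for e in errs) / trials) ** 0.5

# With alpha = 1, error SD grows like sqrt(N); with alpha = 0 it is flat.
sd1 = simulate_error_sd(1)
sd4 = simulate_error_sd(4)
assert 1.8 < sd4 / sd1 < 2.2                       # roughly doubles
assert abs(simulate_error_sd(4, alpha=0.0) - sd1) < 0.02
```

Fitting alpha (rather than fixing it at 0) is, in spirit, how the set-size-dependent observer model is distinguished from the constant-precision one.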

Relevance: 20.00%

Abstract:

Accurate and efficient computation of the distance function d for a given domain is important in many areas of numerical modeling. Distance function algorithms based on partial differential equations (e.g. of Hamilton–Jacobi type) have desirable computational efficiency and accuracy. In this study, as an alternative, a Poisson equation based level set (distance function) is considered and solved using the meshless boundary element method (BEM). Its application to shape topology analysis, including the medial axis for domain decomposition, geometric de-featuring and other aspects of numerical modeling, is assessed. © 2011 Elsevier Ltd. All rights reserved.
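The Poisson-based construction can be sketched in 1D: solve u'' = -1 with u = 0 on the boundary, then recover an approximate distance as d = -|u'| + sqrt(u'^2 + 2u), a normalization commonly used with this approach. The sketch below uses finite differences in place of the paper's BEM; on an interval the construction happens to reproduce the exact distance min(x, 1 - x).

```python
import numpy as np

# 1D domain [0, 1]: solve u'' = -1, u(0) = u(1) = 0, by finite differences.
n = 101
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]

# Tridiagonal Laplacian on the interior nodes (Dirichlet boundaries).
A = np.diag(-2.0 * np.ones(n - 2)) \
    + np.diag(np.ones(n - 3), 1) + np.diag(np.ones(n - 3), -1)
u = np.zeros(n)
u[1:-1] = np.linalg.solve(A / h**2, -np.ones(n - 2))

# Normalize the Poisson solution into an approximate distance function.
du = np.gradient(u, h)
d = -np.abs(du) + np.sqrt(du**2 + 2.0 * u)

# On an interval this is exact: d(x) = min(x, 1 - x).
assert np.allclose(d, np.minimum(x, 1.0 - x), atol=1e-2)
```

The kink of d at x = 0.5 is the 1D analogue of the medial axis used in the paper for domain decomposition.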

Relevance: 20.00%

Abstract:

Demand for aluminum in final products has increased 30-fold since 1950 to 45 million tonnes per year, with forecasts predicting this exceptional growth to continue so that demand will reach 2-3 times today's level by 2050. Aluminum production uses 3.5% of global electricity and causes 1% of global CO2 emissions, while meeting a 50% cut in emissions by 2050 against growing demand would require at least a 75% reduction in CO2 emissions per tonne of aluminum produced--a challenging prospect. In this paper we trace the global flows of aluminum from liquid metal to final products, revealing for the first time a complete map of the aluminum system and providing a basis for future study of the emissions abatement potential of material efficiency. The resulting Sankey diagram also draws attention to two key issues. First, around half of all liquid aluminum (~39 Mt) produced each year never reaches a final product, and a detailed discussion of these high yield losses shows significant opportunities for improvement. Second, aluminum recycling, which avoids the high energy costs and emissions of electrolysis, requires significant "dilution" (~8 Mt) and "cascade" (~6 Mt) flows of higher aluminum grades to make up for the shortfall in scrap supply and to obtain the desired alloy mix, increasing the energy required for recycling.

Relevance: 20.00%

Abstract:

The fundamental aim of clustering algorithms is to partition data points. We consider tasks where the discovered partition is allowed to vary with some covariate such as space or time. One approach would be to use fragmentation-coagulation processes, but these, being Markov processes, are restricted to linear or tree-structured covariate spaces. We define a partition-valued process on an arbitrary covariate space using Gaussian processes. We use the process to construct a multitask clustering model, which partitions data points in a similar way across multiple data sources, and a time series model of network data, which allows cluster assignments to vary over time. We describe sampling algorithms for inference and apply our method to defining cancer subtypes based on different types of cellular characteristics, finding regulatory modules in gene expression data from multiple human populations, and discovering time-varying community structure in a social network.
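A minimal illustration of a covariate-dependent partition (this is a simplification for intuition, not the paper's model or inference scheme): sample one smooth, GP-like function per cluster over the covariate and assign each point to the largest function, so cluster membership changes smoothly as the covariate varies.

```python
import numpy as np

rng = np.random.default_rng(0)

def smooth_functions(x, n_clusters, length_scale=0.5):
    """Sample one smooth random function per cluster on covariate grid x,
    using a squared-exponential (GP prior) covariance."""
    diff = x[:, None] - x[None, :]
    K = np.exp(-0.5 * (diff / length_scale) ** 2)
    K += 1e-8 * np.eye(len(x))                    # jitter for stability
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal((len(x), n_clusters))

x = np.linspace(0.0, 1.0, 50)                     # covariate (e.g. time)
f = smooth_functions(x, n_clusters=3)
labels = np.argmax(f, axis=1)                     # partition varies with x

assert labels.shape == (50,)
assert set(labels.tolist()) <= {0, 1, 2}
```

Because the underlying functions are smooth in x, nearby covariate values tend to induce the same assignment, which is the key property a partition-valued process on a covariate space provides.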

Relevance: 20.00%

Abstract:

Looking for a target in a visual scene becomes more difficult as the number of stimuli increases. In a signal detection theory view, this is due to the cumulative effect of noise in the encoding of the distractors and, potentially on top of that, to an increase of the noise (i.e., a decrease of precision) per stimulus with set size, reflecting divided attention. It has long been argued that human visual search behavior can be accounted for by the first factor alone. While such an account seems adequate for search tasks in which all distractors have the same, known feature value (i.e., are maximally predictable), we recently found a clear effect of set size on encoding precision when distractors are drawn from a uniform distribution (i.e., when they are maximally unpredictable). Here we interpolate between these two extreme cases to examine which of the two conclusions holds more generally as distractor statistics are varied. In one experiment, we vary the level of distractor heterogeneity; in another, we dissociate distractor homogeneity from predictability. In all conditions in both experiments, we found a strong decrease of precision with increasing set size, suggesting that precision being independent of set size is the exception rather than the rule.

Relevance: 20.00%

Abstract:

The aim of this report is to compare the trapped field distribution under local heating applied at the sample edge for different sample morphologies. Hall probe mappings of the magnetic induction trapped in YBCO bulk samples maintained out of thermal equilibrium were performed on YBCO bulk single domains, YBCO single domains with regularly spaced hole arrays, and YBCO superconducting foams. The capability of heat draining was quantified by two criteria: the average induction decay and the size of the thermally affected zone caused by local heating of the sample. Among the three investigated sample shapes, the drilled single domain displays a trapped induction that is only weakly affected by the local heating while retaining a high trapped field. Finally, a simple numerical model of the heat flux spreading into a drilled sample is used to suggest design rules for the hole configuration and size. © 2005 IOP Publishing Ltd.
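The kind of heat-spreading computation mentioned at the end can be sketched with a 1D explicit finite-difference diffusion step (all parameters below are illustrative; the report's own model and geometry are not reproduced here).

```python
import numpy as np

# 1D explicit finite-difference sketch of heat spreading from a locally
# heated edge. Parameters are illustrative, not the report's values.
n, steps = 50, 2000
alpha, dx, dt = 1e-2, 1.0 / 50, 1e-3      # diffusivity, grid step, time step
assert alpha * dt / dx**2 <= 0.5          # explicit-scheme stability limit

T = np.zeros(n)
T[0] = 1.0                                # heated edge held at fixed temperature
for _ in range(steps):
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T[0], T[-1] = 1.0, 0.0                # Dirichlet boundary conditions

# Temperature decays monotonically away from the heated edge.
assert np.all(np.diff(T) <= 1e-12)
```

The extent of the elevated-temperature region in such a simulation plays the role of the "thermally affected zone" criterion used in the report, and holes act as extra boundaries that drain heat.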