68 results for Rough Set
Influencing factors of successful transitions towards product-service systems: A simulation approach
Abstract:
Product-Service Systems (PSS) are business strategies that shift and extend product value towards functional usage and the related required services. From a theoretical point of view, the PSS concept has been known for a decade, and many authors have reported plausible success factors: higher profits over the entire life-cycle, reduced environmental burden, and localization of required services. Nevertheless, the promises of PSS remain quantitatively unproven, relying on a simple theory that involves a few constructs with some empirical grounding but that is limited by weak conceptualization, few propositions, and rough underlying theoretical logic. A plausible way to analyze the possible evolution of a PSS strategy is to consider it as a new business proposition competing in a traditional Product-Oriented (PO) market, assumed to be at its own equilibrium state at a given time. Analyzing the dynamics associated with a possible transition from a traditional PO to a PSS strategy allows the main parameters and variables influencing an eventual successful adoption to be investigated. This research is worthwhile because organizations undergoing a fundamental PSS strategy shift are concerned about key change and inertia processes which, despite equilibrium theory and because of negative feedback loops, could economically undermine the return of their PSS proposition. In this paper, the authors propose a qualitative System Dynamics (SD) approach that treats the PSS as a perturbation of an existing PO market characterized by a set of known parameters. The proposed model incorporates several PSS factors able to influence the success of a PSS proposition under a set of given and justified assumptions, attempting to place this business strategy in a dynamic framework.
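To make the System Dynamics framing concrete, here is a minimal, hypothetical two-stock sketch: a PSS proposition perturbing a PO market, with a reinforcing adoption loop and a balancing (negative feedback) loop. The parameter names, functional forms, and values are illustrative assumptions, not the authors' model.

```python
import numpy as np

# Hypothetical System Dynamics sketch: a PSS proposition perturbs a
# Product-Oriented (PO) market initially at equilibrium. `pss_share` grows
# via a reinforcing adoption loop and is damped by a balancing loop
# (e.g., service-capacity strain rising with adoption). All names and
# values are illustrative assumptions, not parameters from the paper.
def simulate_pss_adoption(steps=200, dt=0.1,
                          adoption_rate=0.08,     # reinforcing loop strength
                          capacity_strain=0.12):  # balancing loop strength
    pss_share = 0.01          # small perturbation of the PO equilibrium
    history = []
    for _ in range(steps):
        reinforcing = adoption_rate * pss_share * (1.0 - pss_share)
        balancing = capacity_strain * pss_share ** 2
        pss_share += dt * (reinforcing - balancing)
        history.append(pss_share)
    return np.array(history)

shares = simulate_pss_adoption()
print(f"final PSS market share: {shares[-1]:.3f}")
```

With these illustrative values the system settles at a partial-adoption equilibrium rather than full market takeover, which is the kind of qualitative outcome such a perturbation analysis probes.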
Abstract:
In this article, we detail the methodology developed to construct arbitrarily high-order schemes, linear and WENO, on 3D mixed-element unstructured meshes made up of general convex polyhedral elements. The approach is tailored specifically to the solution of scalar level set equations for application to incompressible two-phase flow problems. The construction of WENO schemes on 3D unstructured meshes is notoriously difficult, as it involves a much higher level of complexity than 2D approaches. This is due to the multiplicity of geometrical considerations introduced by the extra dimension, especially on mixed-element meshes. We have therefore developed a number of algorithms specifically to handle mixed-element meshes composed of convex polyhedra with convex polygonal faces. The contribution of this work concerns several areas of interest: the formulation of an improved methodology in 3D, the minimisation of computational runtime through the maximum use of pre-processing operations in the implementation, the generation of novel methods to handle complex 3D mixed-element meshes, and finally the application of the method to the transport of a scalar level set.
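As a much simpler point of reference for the WENO idea, the sketch below implements the classical 1D fifth-order reconstruction with Jiang-Shu smoothness indicators. It is only a textbook illustration, not the authors' 3D mixed-element polyhedral method.

```python
import numpy as np

def weno5_left(v, i):
    """Classical 1D fifth-order WENO reconstruction of cell averages v at
    the face i+1/2, biased to the left (Jiang-Shu smoothness indicators).
    A textbook 1D sketch, not the paper's 3D polyhedral scheme."""
    eps = 1e-6
    # Candidate third-order reconstructions on the three sub-stencils.
    q0 = ( 2*v[i-2] - 7*v[i-1] + 11*v[i]  ) / 6.0
    q1 = (  -v[i-1] + 5*v[i]   +  2*v[i+1]) / 6.0
    q2 = ( 2*v[i]   + 5*v[i+1] -    v[i+2]) / 6.0
    # Smoothness indicators penalise oscillatory sub-stencils.
    b0 = 13/12*(v[i-2]-2*v[i-1]+v[i])**2 + 1/4*(v[i-2]-4*v[i-1]+3*v[i])**2
    b1 = 13/12*(v[i-1]-2*v[i]+v[i+1])**2 + 1/4*(v[i-1]-v[i+1])**2
    b2 = 13/12*(v[i]-2*v[i+1]+v[i+2])**2 + 1/4*(3*v[i]-4*v[i+1]+v[i+2])**2
    # Nonlinear weights collapse to the linear ones (1/10, 6/10, 3/10)
    # in smooth regions, recovering fifth-order accuracy.
    a = np.array([0.1/(eps+b0)**2, 0.6/(eps+b1)**2, 0.3/(eps+b2)**2])
    w = a / a.sum()
    return w[0]*q0 + w[1]*q1 + w[2]*q2

x = np.linspace(0, 2*np.pi, 64, endpoint=False)
phi = np.sin(x)                # a smooth 1D "level set" profile
print(weno5_left(phi, 10))     # reconstructed face value near x[10]
```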
Abstract:
Chapter 20: Clustering User Data for User Modelling in the GUIDE Multi-modal Set-top Box. P. M. Langdon and P. Biswas. 20.1 ... It utilises advanced user modelling and simulation in conjunction with a single layer interface that permits a ...
Abstract:
This paper describes a new approach to modelling the forces on a tread block of a free-rolling tyre in contact with a rough road. A theoretical analysis based on realistic tread mechanical properties and road roughness is presented, indicating partial contact between a tread block and a rough road. Hence, an asperity-scale indentation model is developed using a semi-empirical formulation that takes into account both the rubber viscoelasticity and the tread block geometry. The model aims to capture the essential details of the contact at the simplest level, making it suitable as part of a time-domain dynamic analysis of the coupled tyre-road system. The indentation model is found to correlate well with finite element (FE) predictions and is validated against experimental results from a rolling contact rig. When coupled to a deformed tyre belt profile, the indentation model predicts normal and tangential force histories inside the tyre contact patch that show good agreement with FE predictions.
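For a flavour of what a "simplest level" indentation law can look like, here is a hypothetical sketch combining a Hertz-type elastic term with a linear viscous term as a crude stand-in for rubber viscoelasticity. The functional form and constants are illustrative assumptions; the paper's semi-empirical formulation is not reproduced here.

```python
# Hypothetical asperity-scale indentation sketch: Hertzian elastic term plus
# a linear viscous term standing in for rubber viscoelasticity. The constants
# and the functional form are illustrative assumptions, not the paper's model.
def indentation_force(delta, delta_dot, k_hertz=2.0e6, c_visc=150.0):
    """Normal force on a tread block indented to depth delta (m) at
    rate delta_dot (m/s); contact is lost when delta <= 0."""
    if delta <= 0.0:
        return 0.0  # partial contact: this asperity is not engaged
    elastic = k_hertz * delta ** 1.5      # Hertzian point-contact scaling
    viscous = c_visc * delta_dot          # rate-dependent rubber losses
    return max(elastic + viscous, 0.0)    # no adhesion: force cannot pull

print(indentation_force(1e-4, 0.02))
```

A law this cheap to evaluate is what makes per-asperity forcing feasible inside a time-domain simulation of the coupled tyre-road system.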
Abstract:
Commercial far-range (>10 m) infrastructure spatial data collection methods are not completely automated. They require a significant amount of manual post-processing work and, in some cases, the equipment costs are significant. This paper presents a method that is the first step of a stereo videogrammetric framework and promises to address these issues. Under this method, video streams are initially collected from a calibrated set of two video cameras. For each pair of simultaneous video frames, visual feature points are detected and their spatial coordinates are then computed. The result, in the form of a sparse 3D point cloud, is the basis for the next steps in the framework (i.e., camera motion estimation and dense 3D reconstruction). A set of data collected from an ongoing infrastructure project is used to show the merits of the method. A comparison with existing tools is also presented, to indicate the performance differences of the proposed method in the level of automation and the accuracy of results.
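A minimal sketch of the described step, using OpenCV: detect feature points in a simultaneous calibrated frame pair, match them, and triangulate a sparse 3D point cloud. The file names, the choice of ORB features, and the projection matrices P1/P2 are placeholder assumptions standing in for the paper's calibrated setup.

```python
import cv2
import numpy as np

# Placeholder inputs: one simultaneous frame from each calibrated camera.
img_l = cv2.imread("frame_left.png", cv2.IMREAD_GRAYSCALE)
img_r = cv2.imread("frame_right.png", cv2.IMREAD_GRAYSCALE)

# Detect and describe visual feature points in each frame.
orb = cv2.ORB_create(2000)
kp_l, des_l = orb.detectAndCompute(img_l, None)
kp_r, des_r = orb.detectAndCompute(img_r, None)

# Brute-force Hamming matching with cross-check to reject weak matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des_l, des_r)

pts_l = np.float32([kp_l[m.queryIdx].pt for m in matches]).T  # 2xN
pts_r = np.float32([kp_r[m.trainIdx].pt for m in matches]).T  # 2xN

# 3x4 projection matrices from the stereo calibration (placeholders here:
# identity left camera and a 0.1 m horizontal baseline).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

pts4d = cv2.triangulatePoints(P1, P2, pts_l, pts_r)
cloud = (pts4d[:3] / pts4d[3]).T   # sparse 3D point cloud, Nx3
print(cloud.shape)
```

In the framework described, a cloud like this feeds the subsequent camera motion estimation and dense reconstruction stages.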
Abstract:
When searching for characteristic subpatterns in potentially noisy graph data, it appears self-evident that having multiple observations would be better than having just one. However, it turns out that the inconsistencies introduced when different graph instances have different edge sets pose a serious challenge. In this work, we address this challenge for the problem of finding maximum weighted cliques. We introduce the concept of the most persistent soft-clique: a subset of vertices that (1) is almost fully, or at least densely, connected, (2) occurs in all or almost all graph instances, and (3) has the maximum weight. We present a measure of clique-ness that essentially counts the number of edges missing to make a subset of vertices into a clique. With this measure, we show that the problem of finding the most persistent soft-clique can be cast either as (a) a max-min two-person game optimization problem or (b) a min-min soft-margin optimization problem. Both formulations lead to the same solution when a partial Lagrangian method is used to solve the optimization problems. Through experiments on synthetic data and on real social network data, we show that the proposed method is able to reliably find soft cliques in graph data, even when the data are distorted by random noise or unreliable observations.
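The clique-ness idea (counting the edges missing to make a vertex subset a clique, across all instances) is easy to illustrate; the sketch below does exactly that with networkx. The scoring function is a plain illustration: the paper's max-min / soft-margin optimization is not implemented here.

```python
import itertools
import networkx as nx

# Count, over all graph instances, the edges absent within `subset` that
# would be needed to make it a clique. Zero means a clique in every
# instance; small values indicate a persistent soft-clique candidate.
def missing_edges(graphs, subset):
    missing = 0
    for G in graphs:
        for u, v in itertools.combinations(subset, 2):
            if not G.has_edge(u, v):
                missing += 1
    return missing

g1 = nx.complete_graph(4)            # clean observation
g2 = nx.complete_graph(4)
g2.remove_edge(0, 1)                 # noisy observation: one edge lost
print(missing_edges([g1, g2], [0, 1, 2, 3]))   # -> 1: almost a clique
```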
Abstract:
In a wind-turbine gearbox, planet bearings exhibit a high failure rate and are considered one of the most critical components. Developing efficient vibration-based fault detection methods for these bearings requires a thorough understanding of their vibration signature. Much work has been done to study the vibration properties of healthy planetary gear sets and to identify fault frequencies in fixed-axis bearings. However, the vibration characteristics of planetary gear sets containing localized planet bearing defects (spalls or pits) have not been studied so far. In this paper, we propose a novel analytical model of a planetary gear set with ring gear flexibility and localized bearing defects as two key features. The model is used to simulate the vibration response of a planetary system in the presence of a defective planet bearing with faults on the inner or outer raceway. The characteristic fault signature of a planetary bearing defect is determined, and sources of modulation sidebands are identified. The findings from this work will be useful for improving existing sensor placement strategies and developing more sophisticated fault detection algorithms.
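For context, the fixed-axis fault frequencies the abstract contrasts with are textbook formulas, sketched below; the paper's planetary setting (moving carrier, flexible ring gear) further modulates these. The numeric inputs are illustrative assumptions.

```python
import math

# Textbook fixed-axis rolling-element bearing defect frequencies. Shown only
# as the familiar baseline the abstract refers to; a planetary bearing's
# signature differs because the planet carrier itself rotates.
def bearing_fault_frequencies(f_shaft, n_balls, d_ball, d_pitch, phi_deg=0.0):
    """Ball-pass frequencies (Hz) for outer- and inner-raceway defects."""
    ratio = (d_ball / d_pitch) * math.cos(math.radians(phi_deg))
    bpfo = 0.5 * n_balls * f_shaft * (1.0 - ratio)  # outer-race defect
    bpfi = 0.5 * n_balls * f_shaft * (1.0 + ratio)  # inner-race defect
    return bpfo, bpfi

# Illustrative values: 25 Hz shaft, 9 balls, 12 mm balls on a 60 mm pitch.
bpfo, bpfi = bearing_fault_frequencies(f_shaft=25.0, n_balls=9,
                                       d_ball=0.012, d_pitch=0.060)
print(f"BPFO = {bpfo:.1f} Hz, BPFI = {bpfi:.1f} Hz")
```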
Abstract:
The brain encodes visual information with limited precision. Contradictory evidence exists as to whether the precision with which an item is encoded depends on the number of stimuli in a display (set size). Some studies have found evidence that precision decreases with set size, but others have reported constant precision. These groups of studies differed in two ways: the studies that reported a decrease used displays with heterogeneous stimuli and tasks with a short-term memory component, while those that reported constancy used homogeneous stimuli and tasks that did not require short-term memory. To disentangle the effects of heterogeneity and short-term memory involvement, we conducted two main experiments. In Experiment 1, stimuli were heterogeneous, and we compared a condition in which target identity was revealed before the stimulus display with one in which it was revealed afterward. In Experiment 2, target identity was fixed, and we compared heterogeneous and homogeneous distractor conditions. In both experiments, we compared an optimal-observer model in which precision is constant with set size against one in which it depends on set size. We found that precision decreases with set size when the distractors are heterogeneous, regardless of whether short-term memory is involved, but not when they are homogeneous. This suggests that heterogeneity, not short-term memory, is the critical factor. In addition, we found that precision exhibits variability across items and trials, which may partly be caused by attentional fluctuations.
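The two model classes being compared can be summarized by one precision law. A common parameterization in this literature is a power law J(N) = J1 * N^(-alpha), where alpha = 0 gives the constant-precision model and alpha > 0 a set-size-dependent one; the sketch below uses illustrative values, not the paper's fitted parameters.

```python
# Two competing precision laws: constant vs. power-law decline in set size N.
# J(N) = j1 * N**(-alpha); alpha = 0 recovers the constant-precision model.
# j1 and alpha values here are illustrative assumptions.
def precision(set_size, j1=20.0, alpha=0.0):
    """Encoding precision per item as a function of set size."""
    return j1 * set_size ** (-alpha)

for n in (1, 2, 4, 8):
    const = precision(n, alpha=0.0)     # constant-precision model
    varying = precision(n, alpha=1.0)   # set-size-dependent model
    print(f"N={n}: constant J={const:.1f}, power-law J={varying:.1f}")
```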