58 results for uncertainty-based coordination
in Aston University Research Archive
Abstract:
In the Light Controlled Factory, part-to-part assembly and reduced weight will be enabled through the use of predictive fitting processes; low-cost, high-accuracy reconfigurable tooling will be made possible by active compensation; improved control will allow accurate robotic machining; and quality will be improved through the use of traceable, uncertainty-based quality control throughout the production system. A number of challenges must be overcome before this vision can be realized: 1) controlling industrial robots for accurate machining; 2) compensation of measurements for thermal expansion; 3) compensation of measurements for refractive index changes; 4) development of embedded metrology tooling for in-tooling measurement and active tooling compensation; and 5) development of software for the planning and control of integrated metrology networks, based on quality control with uncertainty evaluation and control systems for predictive processes. This paper describes how these challenges are being addressed, in particular the central challenge of developing large-volume measurement process models within an integrated dimensional variation management (IDVM) system.
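As an illustration of challenge 2): dimensional metrology conventionally reports lengths at a reference temperature of 20 °C, so a measurement taken at another temperature is corrected using the part's coefficient of thermal expansion. The sketch below is a minimal example of such a correction; the function name, the example values, and the simple linear-expansion model are illustrative assumptions, not the paper's actual process model.

```python
def compensate_thermal_expansion(measured_length_mm, temp_c,
                                 alpha_per_c, ref_temp_c=20.0):
    """Correct a length measured at temp_c back to the 20 degC reference.

    Assumes simple linear expansion: L(T) = L_ref * (1 + alpha * (T - T_ref)).
    """
    return measured_length_mm / (1.0 + alpha_per_c * (temp_c - ref_temp_c))

# Illustrative values only: a 2 m aluminium part (alpha ~ 23e-6 / degC)
# measured on a shop floor at 26 degC.
corrected = compensate_thermal_expansion(2000.0, 26.0, 23e-6)
print(f"Length at 20 degC: {corrected:.3f} mm")  # ~1999.724 mm
```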
Abstract:
Visual detection performance (d') is usually an accelerating function of stimulus contrast, which could imply a smooth, threshold-like nonlinearity in the sensory response. Alternatively, Pelli (1985 Journal of the Optical Society of America A 2 1508-1532) developed the 'uncertainty model' in which responses were linear with contrast, but the observer was uncertain about which of many noisy channels contained the signal. Such internal uncertainty effectively adds noise to weak signals, and predicts the nonlinear psychometric function. We re-examined these ideas by plotting psychometric functions (as z-scores) for two observers (SAW, PRM) with high precision. The task was to detect a single, vertical, blurred line at the fixation point, or identify its polarity (light vs dark). Detection of a known polarity was nearly linear for SAW but very nonlinear for PRM. Randomly interleaving light and dark trials reduced performance and rendered it nonlinear for SAW, but had little effect for PRM. This occurred for both single-interval and 2AFC procedures. The whole pattern of results was well predicted by our Monte Carlo simulation of Pelli's model, with only two free parameters. SAW (highly practised) had very low uncertainty. PRM (with little prior practice) had much greater uncertainty, resulting in lower contrast sensitivity, nonlinear performance, and no effect of external (polarity) uncertainty. For SAW, identification was about √2 better than detection, implying statistically independent channels for stimuli of opposite polarity, rather than an opponent (light-dark) channel. These findings strongly suggest that noise and uncertainty, rather than sensory nonlinearity, limit visual detection.
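A minimal Monte Carlo sketch of the uncertainty model described above: the observer monitors M noisy channels, the signal adds a contrast-proportional response to one of them, and the decision is based on the maximum channel output. With M = 1 the predicted psychometric function (as a z-score) is close to linear in contrast; with large M it becomes accelerating. The channel counts, noise level and trial numbers are illustrative assumptions, not the fitted parameters from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def prop_correct_2afc(contrast, n_channels, n_trials=20000):
    """2AFC detection with a max rule over n_channels noisy linear channels.

    Signal interval: one channel carries mean response = contrast;
    all other channels (and the blank interval) carry zero-mean unit noise.
    """
    signal = rng.normal(0.0, 1.0, (n_trials, n_channels))
    signal[:, 0] += contrast                       # linear transduction
    blank = rng.normal(0.0, 1.0, (n_trials, n_channels))
    correct = signal.max(axis=1) > blank.max(axis=1)
    return correct.mean()

for m in (1, 100):                                 # low vs high uncertainty
    pcs = [prop_correct_2afc(c, m) for c in (0.5, 1.0, 2.0, 4.0)]
    print(f"M={m:3d}:", [f"{p:.3f}" for p in pcs])
```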
Abstract:
Hierarchical knowledge structures are frequently used within clinical decision support systems as part of the model for generating intelligent advice. The nodes in the hierarchy inevitably have varying influence on the decision-making processes, which needs to be reflected by parameters. If the model has been elicited from human experts, it is not feasible to ask them to estimate the parameters because there will be so many in even moderately sized structures. This paper describes how the parameters could be obtained from data instead, using only a small number of cases. The original method [1] is applied to a particular web-based clinical decision support system called GRiST, which uses its hierarchical knowledge to quantify the risks associated with mental-health problems. The knowledge was elicited from multidisciplinary mental-health practitioners, but the tree has several thousand nodes, all requiring an estimation of their relative influence on the assessment process. The method described in the paper shows how they can be obtained from about 200 cases instead. It greatly reduces the experts' elicitation tasks and has the potential to be generalised to similar knowledge-engineering domains where relative weightings of node siblings are part of the parameter space.
Abstract:
Dynamically adaptive systems (DASs) are intended to monitor the execution environment and then dynamically adapt their behavior in response to changing environmental conditions. The uncertainty of the execution environment is a major motivation for dynamic adaptation; it is impossible to know at development time all of the possible combinations of environmental conditions that will be encountered. To date, the work performed in requirements engineering for a DAS includes requirements monitoring and reasoning about the correctness of adaptations, where the DAS requirements are assumed to exist. This paper introduces a goal-based modeling approach to develop the requirements for a DAS, while explicitly factoring uncertainty into the process and resulting requirements. We introduce a variation of threat modeling to identify sources of uncertainty and demonstrate how the RELAX specification language can be used to specify more flexible requirements within a goal model to handle the uncertainty.
Abstract:
We have attempted to bring together two areas which are challenging for both IS research and practice: forms of coordination and management of knowledge in the context of global, virtual software development projects. We developed a more comprehensive, knowledge-based model of how coordination can be achieved, and illustrated the heuristic and explanatory power of the model when applied to global software projects experiencing different degrees of success. We first reviewed the literature on coordination and determined what is known about coordination of knowledge in global software projects. From this we developed a new, distinctive knowledge-based model of coordination, which was then employed to analyze two case studies of global software projects, at SAP and Baan, to illustrate the utility of the model.
Abstract:
Inventory control in complex manufacturing environments encounters various sources of uncertainty and imprecision. This paper presents a fuzzy knowledge-based approach to solving the problem of order quantity determination in the presence of uncertain demand, lead time and actual inventory level. Uncertain data are represented by fuzzy numbers, and vaguely defined relations between them are modeled by fuzzy if-then rules. The proposed representation and inference mechanism are verified using a large number of examples, and the results of three representative cases are summarized. Finally, a comparison between the developed fuzzy knowledge-based approach and traditional, probabilistic approaches is discussed.
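As a hedged sketch of the kind of fuzzy rule described above: demand and inventory level are represented by triangular membership functions, a rule such as "IF demand is high AND inventory is low THEN order quantity is large" is evaluated with min for AND, and the rule outputs are combined by centroid defuzzification. The membership functions, rule base and numeric values here are illustrative assumptions, not those used in the paper.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Universe of discourse for the order quantity (units).
q = np.linspace(0, 1000, 1001)

def order_quantity(demand, inventory):
    # Antecedent memberships (illustrative shapes).
    demand_high = tri(demand, 400, 800, 1200)
    inv_low = tri(inventory, -200, 0, 300)
    demand_low = tri(demand, -200, 100, 500)
    inv_high = tri(inventory, 100, 400, 800)

    # Two Mamdani-style rules: min for AND, clip consequents, aggregate with max.
    large = np.minimum(tri(q, 400, 700, 1000), min(demand_high, inv_low))
    small = np.minimum(tri(q, 0, 150, 400), min(demand_low, inv_high))
    agg = np.maximum(large, small)

    # Centroid defuzzification.
    return (q * agg).sum() / agg.sum() if agg.sum() > 0 else 0.0

print(order_quantity(demand=700, inventory=50))  # leans towards a large order
```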
Abstract:
We consider an inversion-based neurocontroller for solving control problems of uncertain nonlinear systems. Classical approaches do not use uncertainty information in the neural network models. In this paper we show how knowledge of this uncertainty can be exploited to advantage by developing a novel robust inverse control method. Simulations on a nonlinear, uncertain second-order system illustrate the approach.
Abstract:
This paper presents a problem structuring methodology to assess real option decisions in the face of unpredictability. Based on principles of robustness analysis and scenario planning, we demonstrate how decision-aiding can facilitate participation in project settings and achieve effective decision making through the use of real options reasoning. We argue that robustness heuristics developed in earlier studies can be practical proxies for real options performance, and hence indicators of efficient flexible planning. The developed framework also highlights how to integrate real options solutions into firms' strategic plans and operating actions. An application of the methodology to a location decision is provided for illustration.
Abstract:
Authors from Burrough (1992) to Heuvelink et al. (2007) have highlighted the importance of GIS frameworks which can handle incomplete knowledge in data inputs, in decision rules and in the geometries and attributes modelled. It is particularly important for this uncertainty to be characterised and quantified when GI data is used for spatial decision making. Despite a substantial and valuable literature on means of representing and encoding uncertainty and its propagation in GI (e.g., Hunter and Goodchild 1993; Duckham et al. 2001; Couclelis 2003), no framework yet exists to describe and communicate uncertainty in an interoperable way. This limits the usability of the ever-increasing Internet resources of geospatial data, which are based on specifications that provide frameworks for the 'GeoWeb' (Botts and Robin 2007; Cox 2006). In this paper we present UncertML, an XML schema which provides a framework for describing uncertainty as it propagates through many applications, including online risk management chains. This uncertainty description ranges from simple summary statistics (e.g., mean and variance) to complex representations such as parametric, multivariate distributions at each point of a regular grid. The philosophy adopted in UncertML is that all data values are inherently uncertain (i.e., they are random variables, rather than values with defined quality metadata).
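A hedged sketch of the idea: instead of publishing a bare value, a service publishes the value as a random variable, here a Gaussian with a mean and a variance, serialized as XML. The element and namespace names below are written in the UncertML style from memory rather than quoted from the schema, so they should be checked against the published specification before use.

```python
import xml.etree.ElementTree as ET

# Illustrative namespace/element names in the UncertML style; verify against
# the published schema before relying on them.
NS = "http://www.uncertml.org/2.0"
ET.register_namespace("un", NS)

def gaussian_xml(mean, variance):
    """Serialize a scalar observation as a Gaussian random variable."""
    dist = ET.Element(f"{{{NS}}}GaussianDistribution")
    ET.SubElement(dist, f"{{{NS}}}mean").text = str(mean)
    ET.SubElement(dist, f"{{{NS}}}variance").text = str(variance)
    return ET.tostring(dist, encoding="unicode")

# A sensor reading of 19.7 degC with measurement variance 0.04.
print(gaussian_xml(19.7, 0.04))
```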
Abstract:
Scenario planning is a strategy tool with growing popularity in both academia and practical situations. Current practices in the teaching of scenario planning are largely based on existing literature which utilises scenario planning to develop strategies for the future, primarily considering the assessment of perceived macro-external environmental uncertainties. However, there is a body of literature, hitherto ignored by scenario planning researchers, which suggests that Perceived Environmental Uncertainty (PEU) influences the micro-external or industrial environment as well as the internal environment of the organisation. This paper provides a review of the most dominant theories on the scenario planning process, demonstrates the need to consider PEU theory within scenario planning, and presents how this can be done. The scope of this paper is to enhance the scenario planning process as a tool taught for strategy development. A case vignette is developed, based on published scenarios, to demonstrate the potential utilisation of the proposed process.
Abstract:
The rapid global loss of biodiversity has led to a proliferation of systematic conservation planning methods. In spite of their utility and mathematical sophistication, these methods only provide approximate solutions to real-world problems where there is uncertainty and temporal change. The consequences of errors in these solutions are seldom characterized or addressed. We propose a conceptual structure for exploring the consequences of input uncertainty and oversimplified approximations to real-world processes for any conservation planning tool or strategy. We then present a computational framework based on this structure to quantitatively model species representation and persistence outcomes across a range of uncertainties. These include factors such as land costs, landscape structure, species composition and distribution, and temporal changes in habitat. We demonstrate the utility of the framework using several reserve selection methods including simple rules of thumb and more sophisticated tools such as Marxan and Zonation. We present new results showing how outcomes can be strongly affected by variation in problem characteristics that are seldom compared across multiple studies. These characteristics include number of species prioritized, distribution of species richness and rarity, and uncertainties in the amount and quality of habitat patches. We also demonstrate how the framework allows comparisons between conservation planning strategies and their response to error under a range of conditions. Using the approach presented here will improve conservation outcomes and resource allocation by making it easier to predict and quantify the consequences of many different uncertainties and assumptions simultaneously. Our results show that without more rigorously generalizable results, it is very difficult to predict the amount of error in any conservation plan. These results imply the need for standard practice to include evaluating the effects of multiple real-world complications on the behavior of any conservation planning method.
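The following is a minimal sketch of the kind of analysis such a framework performs, under stated assumptions: a simple greedy cost-efficiency rule of thumb selects habitat patches within a budget, species occurrence data and land costs are perturbed across Monte Carlo draws, and the output is the distribution of how many species reach a representation target. It is not the authors' computational framework, nor Marxan or Zonation; it only illustrates propagating input uncertainty through a selection rule.

```python
import numpy as np

rng = np.random.default_rng(1)
n_patches, n_species, budget, target = 50, 20, 200.0, 1

# "True" inputs (synthetic): patch costs and species presence/absence.
cost = rng.uniform(1, 20, n_patches)
presence = rng.random((n_patches, n_species)) < 0.15

def greedy_select(cost_est, presence_est):
    """Rule of thumb: pick patches by species-per-cost until budget is spent."""
    score = presence_est.sum(axis=1) / cost_est
    chosen, spent = [], 0.0
    for i in np.argsort(-score):
        if spent + cost_est[i] <= budget:
            chosen.append(i)
            spent += cost_est[i]
    return np.array(chosen, dtype=int)

# Monte Carlo over input uncertainty: noisy costs, imperfect survey data.
outcomes = []
for _ in range(1000):
    cost_est = cost * rng.lognormal(0.0, 0.3, n_patches)      # cost error
    detect = presence & (rng.random(presence.shape) < 0.8)    # missed species
    sel = greedy_select(cost_est, detect)
    represented = (presence[sel].sum(axis=0) >= target).sum() # scored on truth
    outcomes.append(represented)

print(f"species represented: mean {np.mean(outcomes):.1f}, "
      f"5th pct {np.percentile(outcomes, 5):.0f} of {n_species}")
```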
Abstract:
When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases, it will be possible to make observations, or occasionally to use physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often either not available or not even possible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering, or at least estimating, this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves the assessment of the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to come to a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system, covering both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty. The first of these examples is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment. A key challenge in the web-based setting is the design of the user interface and the method of interaction between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision-making process.
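As a minimal sketch of the statistical core of such a tool: an expert supplies a median and two tail quantiles for an uncertain quantity, and a parametric distribution is fitted by least squares on the quantiles, broadly in the spirit of SHELF-style elicitation. The choice of a normal distribution, the elicited values, and the fitting criterion are assumptions for illustration, not the method implemented in the UncertWeb tools.

```python
import numpy as np
from scipy import stats, optimize

# Elicited judgements (illustrative): 5th, 50th and 95th percentiles.
probs = np.array([0.05, 0.50, 0.95])
quantiles = np.array([12.0, 20.0, 35.0])

def loss(params):
    mu, log_sigma = params
    fitted = stats.norm.ppf(probs, loc=mu, scale=np.exp(log_sigma))
    return np.sum((fitted - quantiles) ** 2)

res = optimize.minimize(loss, x0=[quantiles[1], 0.0])
mu, sigma = res.x[0], np.exp(res.x[1])
print(f"fitted Normal(mu={mu:.2f}, sigma={sigma:.2f})")
# Feed back to the expert: implied quantiles of the fitted distribution.
print(stats.norm.ppf(probs, loc=mu, scale=sigma).round(2))
```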
Abstract:
The research described within this thesis is concerned with the investigation of transition metal ion complexation within hydrophilic copolymer membranes. The membranes are copolymers of 4-methyl-4'-vinyl-2,2'-bipyridine, the 2-hydroxyethyl ester of 4,4'-dicarboxy-2,2'-bipyridine and bis-(5-vinylsalicylidene)ethylenediamine with 2-hydroxyethyl methacrylate. The effect of the polymer matrix on the formation and properties of transition metal ion complexes has been studied, specifically Cr(III) & Fe(II) salts for the bipyridyl-based copolymer membranes and Co(II), Ni(II) & Cu(II) salts for the salenH2-based copolymer membranes. The concomitant effects of complex formation on the properties of the polymer matrix, e.g. on mechanical strength, have also been studied. A detailed body of work into the kinetics and thermodynamics of the formation of Cu(II) complexes in the salenH2-based copolymer membranes has been performed. The rate of complex formation is found to be very slow, while the value of K for the equilibrium of complex formation is found to be unexpectedly small and shows a slight anion dependence. These phenomena are explained in terms of the effects of the heterogeneous phase provided by the polymer matrix. The transport of Cr(III) ions across uncomplexed and Cr(III)-pre-complexed bipyridyl-based membranes has been studied. In both cases, no Cr(III) coordination occurs within the time-scale of an experiment. Pre-complexation of the membrane does not lead to a change in the rate of permeation of Cr(III) ions. The transport of Co(II), Ni(II) & Cu(II) ions across salenH2-based membranes shows that there is no detectable lag-time in transport of the ions, despite independent evidence that complex formation within the membranes does occur. Finally, the synthesis of a number of functionalised ligands is described. Although they were found to be non-polymerisable by the methods employed in this research, they remain interesting ligands which provide a starting point for further functionalisation.
Abstract:
In this paper we present a novel method for emulating a stochastic, or random-output, computer model and show its application to a complex rabies model. The method is evaluated in terms of both accuracy and computational efficiency on synthetic data and on the rabies model. We address the issue of experimental design and provide empirical evidence on the effectiveness of utilizing replicate model evaluations compared to a space-filling design. We employ the Mahalanobis error measure to validate the predictions of the heteroscedastic Gaussian process based emulator for both the mean and (co)variance. The emulator allows efficient screening to identify important model inputs and a better understanding of the complex behaviour of the rabies model.
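A brief sketch of the validation step mentioned above, under stated assumptions: given an emulator's predictive mean vector and predictive covariance at held-out inputs, the Mahalanobis error compares the joint discrepancy against its reference chi-squared distribution. The toy predictive quantities below are placeholders standing in for the heteroscedastic GP's outputs, not the paper's emulator.

```python
import numpy as np
from scipy import stats

def mahalanobis_error(y, mean, cov):
    """D^2 = (y - mean)^T cov^{-1} (y - mean) for validation runs y."""
    resid = y - mean
    return resid @ np.linalg.solve(cov, resid)

# Placeholder predictive quantities standing in for an emulator's output
# at 10 held-out simulator runs.
rng = np.random.default_rng(2)
mean = rng.normal(size=10)
cov = np.diag(rng.uniform(0.5, 2.0, 10))     # heteroscedastic variances
y = rng.multivariate_normal(mean, cov)       # "simulator" outputs

d2 = mahalanobis_error(y, mean, cov)
# If the emulator is well calibrated, D^2 ~ chi-squared with 10 dof.
print(f"D^2 = {d2:.2f}, 95% reference interval: "
      f"[{stats.chi2.ppf(0.025, 10):.2f}, {stats.chi2.ppf(0.975, 10):.2f}]")
```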
Abstract:
The thesis begins with a conceptual model of the way that language diversity affects the strategies, organisation and subsidiary control policies of multinational companies. The model is based solely on the researcher's personal experience of working in a variety of international management roles, but in Chapter 2 a wide-ranging review of related academic literature finds evidence to support the key ideas. The model is developed as a series of propositions which are tested in a comparative case study, refined and then re-tested in a global survey of multinational subsidiaries. The principal findings of the empirical phases of the thesis endorse the main tenets of the model:
- Language difference between parent and subsidiary will impair communication, create mistrust and impede relationship development.
- The resulting feelings of uncertainty, suspicion and mistrust will influence the decisions taken by the parent company.
- Parent companies will have heightened sensitivity to language issues and will implement policies to manage language differences.
- They will adopt low-risk strategies in host countries where they are concerned about language difference.
- They will use organisational and manpower strategies to minimise the consequences and risks of the communication problems with the subsidiary.
- As a consequence, the level of integration and knowledge flow between parent and subsidiary will be curtailed.
- They will adopt styles of control that depend least on their ability to communicate with their subsidiary.
Although there is adequate support for all of the above conclusions, on some key points the evidence of the Case Studies and Survey is contradictory. The thesis therefore closes with an agenda for further research that would address these inconsistencies.