58 results for uncertainty-based coordination


Relevance: 30.00%

Abstract:

This paper investigates a neural network-based probabilistic decision support system for assessing drivers' knowledge, with the objective of developing a renewal policy for driving licences. The probabilistic model correlates drivers' demographic data with their results in a simulated written driving exam (SWDE). The probabilistic decision support system classifies drivers into two groups: those passing and those failing a SWDE. Assessing drivers' knowledge within a probabilistic framework allows uncertainty information to be quantified and incorporated into the decision-making system. The results obtained in a Jordanian case study indicate that the probabilistic decision support system is more reliable than conventional deterministic decision support systems. Implications of the proposed probabilistic decision support system for the licence renewal decision, and the possibility of including extra assessment methods, are discussed.
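The abstract does not describe the implementation, but the core idea, a classifier that outputs a pass/fail probability rather than a hard label so that uncertain cases can be flagged, can be sketched as follows. The logistic-regression stand-in, the synthetic demographic features, and the 0.5 threshold are illustrative assumptions, not the authors' neural network.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative demographic features (e.g. age, years since licence issue, prior violations)
# and pass/fail outcomes of a simulated written driving exam; all values are synthetic.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] - 0.5 * X[:, 2] + rng.normal(scale=0.8, size=200) > 0).astype(int)

# A probabilistic classifier: outputs P(pass | demographics) rather than a hard label.
model = LogisticRegression().fit(X, y)
proba = model.predict_proba(X[:5])[:, 1]

# Decisions near 0.5 are uncertain; a renewal policy could flag them for extra assessment.
for p in proba:
    decision = "pass" if p >= 0.5 else "fail"
    print(f"P(pass) = {p:.2f} -> predicted {decision}, uncertainty = {min(p, 1 - p):.2f}")
```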

Relevance: 30.00%

Abstract:

In nonlinear and stochastic control problems, learning an efficient feed-forward controller is not amenable to conventional neurocontrol methods. For these approaches, estimating and then incorporating uncertainty in the controller and feed-forward models can produce more robust control results. Here, we introduce a novel inversion-based neurocontroller for solving control problems involving uncertain nonlinear systems, which can also compensate for multi-valued systems. The approach uses recent developments in neural networks, especially in the context of modelling statistical distributions, which are applied to forward and inverse plant models. Provided that certain conditions are met, an estimate of the intrinsic uncertainty of the network outputs can be obtained from the statistical properties of the networks. More generally, multicomponent distributions can be modelled by the mixture density network. Based on importance sampling from these distributions, a novel robust inverse control approach is obtained. This importance sampling provides a structured and principled approach to constraining the complexity of the search space for the ideal control law. The developed methodology circumvents the dynamic programming problem by using the predicted neural network uncertainty to localise the possible control solutions to consider. A nonlinear multi-variable system with different delays between the input-output pairs is used to demonstrate the successful application of the developed control algorithm. The proposed method is suitable for redundant control systems and allows us to model strongly non-Gaussian distributions of the control signal as well as processes with hysteresis. © 2004 Elsevier Ltd. All rights reserved.
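As a rough illustration of the sampling idea, the sketch below draws candidate control actions from a two-component Gaussian mixture (standing in for a mixture density network's output for the inverse model) and weights them by how closely a forward model reproduces the target output. The plant, the mixture parameters, and the weighting kernel are all hypothetical, not the paper's networks.

```python
import numpy as np

rng = np.random.default_rng(1)

def forward_model(u):
    # Stand-in for a learned forward plant model y = f(u); hypothetical dynamics.
    return np.sin(u) + 0.1 * u

# Mixture parameters such as an MDN might emit for the inverse model p(u | y_target):
# weights, means and standard deviations of the components (illustrative values).
weights = np.array([0.6, 0.4])
means   = np.array([0.8, 2.6])
stds    = np.array([0.3, 0.4])

y_target = 0.9

# Sample candidate controls from the mixture (the "localised" search space) ...
comp = rng.choice(len(weights), size=500, p=weights)
candidates = rng.normal(means[comp], stds[comp])

# ... and weight them by how closely the forward model reaches the target output.
errors = (forward_model(candidates) - y_target) ** 2
importance = np.exp(-errors / (2 * 0.05 ** 2))
importance /= importance.sum()

u_best = candidates[np.argmax(importance)]
print(f"selected control u = {u_best:.3f}, predicted output = {forward_model(u_best):.3f}")
```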

Relevance: 30.00%

Abstract:

Modern advances in technology have led to more complex manufacturing processes whose success centres on the ability to control these processes with a very high level of accuracy. Plant complexity inevitably leads to poor models that exhibit a high degree of parametric or functional uncertainty. The situation becomes even more complex if the plant to be controlled is characterised by a multivalued function or exhibits a number of modes of behaviour during its operation. Since an intelligent controller is expected to operate and guarantee the best performance where complexity and uncertainty coexist and interact, control engineers and theorists have recently developed new techniques under the framework of intelligent control to enhance controller performance for more complex and uncertain plants. These techniques are based on incorporating model uncertainty, and the resulting control algorithms have been shown to give more accurate control results under uncertain conditions. In this paper, we survey some approaches that appear promising for enhancing the performance of intelligent control systems in the face of higher levels of complexity and uncertainty.

Relevance: 30.00%

Abstract:

Last mile relief distribution is the final stage of humanitarian logistics. It refers to the supply of relief items from local distribution centers to the disaster-affected people (Balcik et al., 2008). In the last mile relief distribution literature, researchers have focused on the use of optimisation techniques for determining the exact optimal solution (Liberatore et al., 2014), but there is a need to include behavioural factors alongside those optimisation techniques in order to obtain better predictive results. This paper explains how improving coordination increases the effectiveness of the last mile relief distribution process. The methodology has two stages:

Interviews: The authors conducted interviews with the Indian Government and with South Asian NGOs to identify the critical factors for final relief distribution. After thematic and content analysis of the interviews and the reports, the authors found several behavioural factors which affect the final relief distribution.

Model building: Last mile relief distribution in India follows a specific framework described in the Indian Government disaster management handbook. We modelled this framework using agent-based simulation and investigated the impact of coordination on effectiveness, defined as the speed and accuracy with which aid is delivered to affected people. We tested through simulation modelling whether coordination improves effectiveness.
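The paper's agent-based model follows the Indian Government framework and is not reproduced here; the toy sketch below only illustrates the kind of experiment described, varying a coordination parameter and measuring delivery speed and the share of demand met. The agents, demand values, and coordination rule are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(coordination, n_centres=5, n_villages=50, steps=100):
    """Toy agent-based run: distribution centres serve villages; with higher
    coordination, centres share demand information and duplicate fewer deliveries."""
    demand = rng.integers(1, 10, size=n_villages)       # relief items needed per village
    delivered = np.zeros(n_villages)
    for t in range(steps):
        for _ in range(n_centres):
            unmet = np.where(delivered < demand)[0]
            if len(unmet) == 0:
                return t, 1.0                            # all demand met: (time, accuracy)
            if rng.random() < coordination:
                # Coordinated choice: deliver to the village with the largest unmet need.
                target = unmet[np.argmax(demand[unmet] - delivered[unmet])]
            else:
                # Uncoordinated choice: random village, possibly already fully served.
                target = rng.integers(n_villages)
            delivered[target] += 1
    accuracy = np.minimum(delivered, demand).sum() / demand.sum()
    return steps, accuracy

for c in (0.2, 0.5, 0.9):
    time_taken, accuracy = simulate(c)
    print(f"coordination={c:.1f}: time={time_taken}, share of demand met={accuracy:.2f}")
```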

Relevance: 30.00%

Abstract:

In dimensional metrology, often the largest source of uncertainty of measurement is thermal variation. Dimensional measurements are currently scaled linearly, using ambient temperature measurements and coefficients of thermal expansion, to the ideal metrology condition of 20 °C. This scaling is particularly difficult to implement with confidence in large volumes, as the temperature is unlikely to be uniform, resulting in thermal gradients. A number of well-established computational methods are used in the design phase of product development for the prediction of thermal and gravitational effects, and these could be used to a greater extent in metrology. This paper outlines the theory of how physical measurements of dimension and temperature can be combined more comprehensively throughout the product lifecycle, from design through to the manufacturing phase. The Hybrid Metrology concept is also introduced: an approach to metrology which promises to improve product and equipment integrity in future manufacturing environments. The Hybrid Metrology System combines various state-of-the-art physical dimensional and temperature measurement techniques with established computational methods to better predict thermal and gravitational effects.
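For concreteness, the standard linear scaling mentioned in the abstract can be written as a one-line correction; the material, temperature, and coefficient of thermal expansion below are illustrative values only.

```python
# Linear rescaling of a length measurement to the 20 °C reference temperature,
# as described in the abstract: L_20 = L_measured / (1 + alpha * (T - 20)).
def scale_to_20c(length_measured_mm, temperature_c, cte_per_c):
    return length_measured_mm / (1.0 + cte_per_c * (temperature_c - 20.0))

# Illustrative values: a 5 m aluminium feature measured at 24.5 °C (CTE ~ 23e-6 /°C).
# In a large volume the temperature is unlikely to be uniform, so a single ambient
# reading already carries uncertainty of its own.
print(scale_to_20c(5000.0, 24.5, 23e-6))   # ≈ 4999.48 mm
```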

Relevance: 30.00%

Abstract:

The uncertainty of measurements must be quantified and considered in order to prove conformance with specifications and to make other meaningful comparisons based on measurements. While there is a consistent methodology for the evaluation and expression of uncertainty within the metrology community, industry frequently uses the alternative Measurement Systems Analysis (MSA) methodology. This paper sets out to clarify the differences between uncertainty evaluation and MSA, and presents a novel hybrid methodology for industrial measurement which enables a correct evaluation of measurement uncertainty while utilising the practical tools of MSA. In particular, the use of Gage R&R ANOVA and Attribute Gage studies within a wider uncertainty evaluation framework is described. This enables in-line measurement data to be used to establish repeatability and reproducibility, without time-consuming repeatability studies being carried out, while maintaining a complete consideration of all sources of uncertainty and therefore enabling conformance to be proven with a stated level of confidence. Such a rigorous approach to product verification will become increasingly important in the era of the Light Controlled Factory, with metrology acting as the driving force to achieve right-first-time, highly automated manufacture of high-value, large-scale products such as aircraft, spacecraft and renewable power generation structures.
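A minimal sketch of the kind of Gage R&R ANOVA calculation referred to above is shown below, using a crossed parts-by-operators design and expressing the results as standard uncertainties. The data are synthetic and the statsmodels-based implementation is an assumption, not the paper's methodology.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)

# Synthetic in-line measurement data: 10 parts, 3 operators, 2 repeats each.
parts, operators, repeats = 10, 3, 2
rows = []
for p in range(parts):
    for o in range(operators):
        for _ in range(repeats):
            value = 50 + 0.5 * p + 0.1 * o + rng.normal(scale=0.05)
            rows.append({"part": p, "operator": o, "y": value})
df = pd.DataFrame(rows)

# Crossed Gage R&R via two-way ANOVA with interaction.
model = smf.ols("y ~ C(part) * C(operator)", data=df).fit()
anova = sm.stats.anova_lm(model, typ=2)
ms = anova["sum_sq"] / anova["df"]          # mean squares

# Variance components (negative estimates clipped to zero, as is conventional).
var_repeat = ms["Residual"]
var_inter = max((ms["C(part):C(operator)"] - ms["Residual"]) / repeats, 0.0)
var_oper = max((ms["C(operator)"] - ms["C(part):C(operator)"]) / (parts * repeats), 0.0)

# Expressed as standard uncertainties for use inside a GUM-style uncertainty budget.
u_repeatability = np.sqrt(var_repeat)
u_reproducibility = np.sqrt(var_oper + var_inter)
print(f"u(repeatability) = {u_repeatability:.4f}, u(reproducibility) = {u_reproducibility:.4f}")
```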

Relevance: 30.00%

Abstract:

Design verification in the digital domain, using model-based principles, is a key research objective to address the industrial requirement for reduced physical testing and prototyping. For complex assemblies, the verification of design and the associated production methods is currently fragmented, prolonged and sub-optimal, as it uses digital and physical verification stages that are deployed in a sequential manner using multiple systems. This paper describes a novel, hybrid design verification methodology that integrates model-based variability analysis with measurement data of assemblies, in order to reduce simulation uncertainty and allow early design verification from the perspective of satisfying key assembly criteria.
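One way to picture the integration described here is a Monte Carlo variation analysis in which an assumed design tolerance is replaced by measured data for a produced component, changing the predicted assembly variation. The gap stack-up, tolerances, and measurement distribution below are hypothetical and far simpler than the paper's model-based variability analysis.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy gap stack-up: gap = slot - (part_a + part_b). All tolerances are assumptions.
n = 100_000
slot   = rng.normal(30.00, 0.05, n)
part_a = rng.normal(14.95, 0.04, n)

# Purely model-based prediction: part_b drawn from its design tolerance.
part_b_design = rng.normal(14.95, 0.04, n)
gap_design = slot - (part_a + part_b_design)

# Hybrid prediction: part_b resampled from (synthetic) measurements of produced parts,
# which here turn out tighter than the design assumption.
measured_b = rng.normal(14.95, 0.015, 2000)
part_b_measured = rng.choice(measured_b, size=n, replace=True)
gap_hybrid = slot - (part_a + part_b_measured)

for name, gap in (("model-only", gap_design), ("hybrid", gap_hybrid)):
    print(f"{name}: mean gap = {gap.mean():.3f} mm, std = {gap.std():.3f} mm, "
          f"P(gap < 0) = {(gap < 0).mean():.4f}")
```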

Relevance: 30.00%

Abstract:

Uncertainty text detection is important to many social-media-based applications, since more and more users utilize social media platforms (e.g., Twitter, Facebook, etc.) as an information source, producing or deriving interpretations based on them. However, existing uncertainty cues are ineffective in the social media context because of its specific characteristics. In this paper, we propose a variant annotation scheme for uncertainty identification and construct the first uncertainty corpus based on tweets. We then conduct experiments on the generated tweet corpus to study the effectiveness of different types of features for uncertainty text identification. © 2013 Association for Computational Linguistics.
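As a trivial baseline for the feature study described above, one might extract lexical-cue and surface features from each tweet, as in the sketch below; the cue list and example tweets are invented and do not reflect the paper's annotation scheme or feature set.

```python
import re

# A few illustrative uncertainty cues; real cue lexicons are far larger and the
# paper's features are not limited to lexical matching.
CUES = {"maybe", "perhaps", "might", "allegedly", "reportedly", "rumor", "possibly", "idk"}

def uncertainty_features(tweet: str) -> dict:
    tokens = re.findall(r"[a-z']+", tweet.lower())
    cue_hits = [t for t in tokens if t in CUES]
    return {
        "has_cue": bool(cue_hits),
        "cue_count": len(cue_hits),
        "has_question_mark": "?" in tweet,
        "cues": cue_hits,
    }

print(uncertainty_features("Reportedly the station might reopen tomorrow?"))
print(uncertainty_features("The station reopens tomorrow at 9am."))
```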

Relevance: 30.00%

Abstract:

The focus of this study is on governance decisions in a concurrent channels context under conditions of uncertainty. The study examines how a firm chooses to deploy its sales force in times of uncertainty, and the subsequent performance outcomes of those deployment choices. The theoretical framework is based on multiple theories of governance, including transaction cost analysis (TCA), agency theory, and institutional economics. Three uncertainty variables are investigated in this study. The first two are demand and competitive uncertainty, which are considered industry-level forms of market uncertainty. The third, political uncertainty, is chosen because it is an important dimension of institutional environments, capturing non-economic circumstances such as regulations and political systemic issues. The study employs longitudinal secondary data from a Thai hotel chain, comprising monthly observations from January 2007 to December 2012. This hotel chain operates in four countries, Thailand, the Philippines, the United Arab Emirates (Dubai), and Egypt, all of which experienced substantial demand, competitive, and political uncertainty during the study period, making them ideal contexts for this study. Two econometric models, both deploying Newey-West estimations, are employed to test 13 hypotheses. The first model considers the relationship between uncertainty and governance. The second model is a Newey-West variant using an Instrumental Variables (IV) estimator within a Two-Stage Least Squares (2SLS) framework, to test the direct effect of uncertainty on performance and the moderating effect of governance on the relationship between uncertainty and performance. The observed relationship between uncertainty and governance follows a core prediction of TCA: that vertical integration is the preferred choice of governance when uncertainty rises. As for the subsequent performance outcomes, the results corroborate that uncertainty has a negative effect on performance. Importantly, the findings show that becoming more vertically integrated cannot help moderate the effect of demand and competitive uncertainty, but can significantly moderate the effect of political uncertainty. These findings have significant theoretical and practical implications, extend our knowledge of the impact of uncertainty, and bring an institutional perspective to TCA. Further, they offer managers novel insight into the nature of different types of uncertainty, their impact on performance, and how channel decisions can mitigate these impacts.
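The first econometric model is a regression with Newey-West (HAC) standard errors; a minimal sketch of such an estimation on synthetic monthly data is shown below. The variables, lag length, and data-generating process are illustrative assumptions, not the study's specification.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)

# Synthetic monthly series standing in for the hotel data (illustrative only):
# performance regressed on an uncertainty index and a governance (vertical integration) share.
n = 72                                                     # six years of monthly observations
uncertainty = rng.normal(size=n).cumsum() * 0.1            # persistent, autocorrelated regressor
governance = rng.uniform(0, 1, size=n)
noise = np.convolve(rng.normal(size=n + 3), np.ones(4) / 4, mode="valid")  # serially correlated errors
performance = 2.0 - 0.8 * uncertainty + 0.5 * governance + noise

X = sm.add_constant(np.column_stack([uncertainty, governance]))
# Newey-West (HAC) standard errors: robust to heteroskedasticity and autocorrelation.
model = sm.OLS(performance, X).fit(cov_type="HAC", cov_kwds={"maxlags": 12})
print(model.summary())
```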

Relevance: 30.00%

Abstract:

Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduces additional difficulty for assembly accuracy and error estimation. Planar surfaces, as key product characteristics, are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-volume products. To evaluate the accuracy of the assembly system, an error propagation model for measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, the general coordinate vector is adopted to represent the position of the components. The error transmission functions are simplified into a linear model, and the coordinates of the reference points are composed of a theoretical value and a random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final coordination accuracy is mainly determined by the measurement error of the planar surface on the small components. To reduce the uncertainty of the plane measurement, an evaluation index of the measurement strategy is presented. This index reflects the distribution of the sampling point set and can be calculated from an inertia moment matrix. Finally, a practical application is introduced to validate the evaluation index.
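The linearised propagation step can be summarised compactly: if the simplified error transmission function has Jacobian J and the input errors are zero-mean normal with covariance Σ, the output covariance is JΣJᵀ. The sketch below uses an invented Jacobian and error magnitudes purely for illustration.

```python
import numpy as np

# Linearised error propagation for an assembly: if the component pose x depends on
# reference-point coordinates r through x = f(r), then near the nominal value
# Cov(x) ≈ J Cov(r) Jᵀ, with J the Jacobian of the linearised transmission function.

# Illustrative Jacobian mapping 3 reference-point errors to a 2-DOF pose error.
J = np.array([[1.0, 0.5, 0.0],
              [0.0, 1.0, -0.8]])

# Independent, normally distributed measurement and fixture errors (standard deviations in mm).
sigmas = np.array([0.02, 0.05, 0.01])
cov_in = np.diag(sigmas ** 2)

cov_out = J @ cov_in @ J.T
print("pose standard deviations (mm):", np.sqrt(np.diag(cov_out)))
```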

Relevance: 30.00%

Abstract:

A mild template removal of microcrystalline beta zeolite, based on Fenton chemistry, was optimized. Fenton detemplation was studied in terms of its window of applicable conditions, reaction rate and scale-up. TGA and CHN elemental analysis were used to evaluate the detemplation effectiveness, while ICP, XRD, LPHR-Ar physisorption, and 27Al MAS NMR were applied to characterize the structure and texture of the resulting materials. The material properties were compared to those obtained by calcination. By understanding the interplay of the relevant parameters of the Fenton chemistry, the process can be optimized in order to make it industrially attractive for scale-up. The H2O2 utilization can be minimized down to 15 mL H2O2/g (88 °C, 30 ppm Fe), implying a high solid concentration and low consumption of H2O2. When the Fe concentration must be minimized, values as low as 5 ppm Fe can be applied (88 °C, 30 mL H2O2/g) to achieve full detemplation. The reaction time to completeness can be reduced to 5 h by combining a Fe-oxalate catalyst with UV radiation. The protocol was scaled up to 100 times its original recipe. In terms of the material's properties, the scaled material is structurally comparable to the calcined counterpart (comparable Si/Al and XRD patterns), while it displays benefits in terms of texture and Al coordination, the latter with full preservation of the tetrahedral Al.

Relevance: 30.00%

Abstract:

Most pavement design procedures incorporate reliability to account for the effect of uncertainty and variability in design inputs on predicted performance. The load and resistance factor design (LRFD) procedure, which delivers an economical section while considering the variability of each design input separately, has been recognised as an effective tool for incorporating reliability into design procedures. This paper presents a new reliability-based calibration, in LRFD format, for a mechanics-based fatigue cracking analysis framework. The paper employs a two-component reliability analysis methodology that utilises a central composite design-based response surface approach and a first-order reliability method. The reliability calibration was achieved based on a number of field pavement sections that have well-documented performance histories and high-quality field and laboratory data. The effectiveness of the developed LRFD procedure was evaluated by performing pavement designs for various target reliabilities and design conditions. The results show excellent agreement between the target and actual reliabilities. Furthermore, it is clear from the results that more design features need to be included in the reliability calibration to minimise the deviation of the actual reliability from the target reliability.
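As a simplified illustration of checking a design against a target reliability, the sketch below estimates a failure probability by Monte Carlo for a hypothetical fatigue limit state and converts it to a reliability index; the distributions and target value are assumptions, not the calibrated framework (which uses a response surface and a first-order reliability method).

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)

# Hypothetical limit-state function for fatigue cracking: g = resistance - load effect,
# with g <= 0 denoting failure. Distributions are illustrative only.
n = 1_000_000
resistance = rng.lognormal(mean=np.log(2.0), sigma=0.20, size=n)
load_effect = rng.lognormal(mean=np.log(1.0), sigma=0.25, size=n)
g = resistance - load_effect

pf = np.mean(g <= 0)        # estimated probability of failure
beta = -norm.ppf(pf)        # corresponding reliability index

beta_target = 1.645         # e.g. roughly a 95% target reliability
print(f"Pf = {pf:.4%}, beta = {beta:.2f}, meets target: {beta >= beta_target}")
```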

Relevance: 30.00%

Abstract:

The relationship between uncertainty and firms’ risk-taking behaviour has been a focus of investigation since early discussion of the nature of enterprise activity. Here, we focus on how firms’ perceptions of environmental uncertainty and their perceptions of the risks involved impact on their willingness to undertake green innovation. Analysis is based on a cross-sectional survey of UK food companies undertaken in 2008. The results reinforce the relationship between perceived environmental uncertainty and perceived innovation risk and emphasise the importance of macro-uncertainty in shaping firms’ willingness to undertake green innovation. The perceived (market-related) riskiness of innovation also positively influences the probability of innovating, suggesting either a proactive approach to stimulating market disruption or an opportunistic approach to innovation leadership.