971 results for Process uncertainty


Relevance:

30.00%

Publisher:

Abstract:

Gaussian processes provide natural non-parametric prior distributions over regression functions. In this paper we consider regression problems where there is noise on the output, and the variance of the noise depends on the inputs. If we assume that the noise is a smooth function of the inputs, then it is natural to model the noise variance using a second Gaussian process, in addition to the Gaussian process governing the noise-free output value. We show that prior uncertainty about the parameters controlling both processes can be handled, and that the posterior distribution of the noise rate can be sampled from using Markov chain Monte Carlo methods. Our results on a synthetic data set give a posterior noise variance that approximates the true variance well.
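
A minimal sketch of the modelling idea described above, assuming squared-exponential covariances, fixed hyperparameters and a simple random-walk Metropolis sampler over the latent log noise variances; the synthetic data and all settings below are illustrative and not taken from the paper.

```python
import numpy as np

def sq_exp(x1, x2, var, length):
    """Squared-exponential covariance matrix."""
    d = x1[:, None] - x2[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

rng = np.random.default_rng(0)

# Synthetic data with input-dependent noise (assumed, for illustration only).
n = 60
x = np.linspace(0, 1, n)
true_log_var = -4.0 + 3.0 * x                      # noise variance grows with x
y = np.sin(2 * np.pi * x) + rng.normal(0, np.exp(0.5 * true_log_var))

# Fixed hyperparameters for both processes (in the paper these would also carry
# prior uncertainty; here they are assumed known to keep the sketch short).
K_f = sq_exp(x, x, var=1.0, length=0.2)                        # GP over the noise-free output
K_g = sq_exp(x, x, var=2.0, length=0.3) + 1e-6 * np.eye(n)     # GP over log noise variance
L_g = np.linalg.cholesky(K_g)

def log_target(g):
    """log p(y | g) with the noise-free GP integrated out, plus the GP prior over g."""
    C = K_f + np.diag(np.exp(g)) + 1e-8 * np.eye(n)
    Lc = np.linalg.cholesky(C)
    alpha = np.linalg.solve(Lc.T, np.linalg.solve(Lc, y))
    log_lik = -0.5 * y @ alpha - np.log(np.diag(Lc)).sum()
    beta = np.linalg.solve(L_g.T, np.linalg.solve(L_g, g))
    log_prior = -0.5 * g @ beta
    return log_lik + log_prior

# Random-walk Metropolis over the latent log noise variances.
g = np.full(n, -2.0)
cur = log_target(g)
samples = []
for it in range(5000):
    prop = g + 0.05 * (L_g @ rng.normal(size=n))   # prior-correlated symmetric proposal
    new = log_target(prop)
    if np.log(rng.uniform()) < new - cur:
        g, cur = prop, new
    if it >= 2000 and it % 10 == 0:
        samples.append(np.exp(g))                  # posterior noise variance draws

posterior_noise_var = np.mean(samples, axis=0)
print(posterior_noise_var[:5])
```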

Relevance:

30.00%

Publisher:

Abstract:

It is generally assumed when using Bayesian inference methods for neural networks that the input data contains no noise or corruption. For real-world (errors in variable) problems this is clearly an unsafe assumption. This paper presents a Bayesian neural network framework which allows for input noise given that some model of the noise process exists. In the limit where this noise process is small and symmetric it is shown, using the Laplace approximation, that there is an additional term to the usual Bayesian error bar which depends on the variance of the input noise process. Further, by treating the true (noiseless) input as a hidden variable and sampling this jointly with the network's weights, using Markov Chain Monte Carlo methods, it is demonstrated that it is possible to infer the unbiased regression over the noiseless input.
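
As a rough numerical illustration of the kind of correction described, the sketch below inflates the usual Bayesian error bar by a first-order term involving the squared gradient of the network output with respect to the input and the input-noise variance; the tiny network, the noise levels and the exact form of the term are assumptions for illustration and may differ in detail from the paper's expression.

```python
import numpy as np

rng = np.random.default_rng(1)

# A tiny fitted network (weights assumed already trained; values are illustrative).
W1, b1 = rng.normal(size=(8, 1)), rng.normal(size=8)
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=1)

def predict(x):
    h = np.tanh(W1 @ x + b1)
    return (W2 @ h + b2)[0]

def grad_wrt_input(x, eps=1e-5):
    """Numerical derivative dy/dx at a scalar input (central differences)."""
    return (predict(x + eps) - predict(x - eps)) / (2 * eps)

x_star = np.array([0.3])
sigma2_bayes = 0.05   # usual Bayesian error bar (assumed, from the weight posterior)
sigma2_input = 0.01   # variance of the input-noise process (assumed known)

# First-order inflation of the error bar by small, symmetric input noise:
# var_total ~= var_bayes + (dy/dx)^2 * var_input.
dydx = grad_wrt_input(x_star)
sigma2_total = sigma2_bayes + dydx ** 2 * sigma2_input
print(f"error bar without input noise: {np.sqrt(sigma2_bayes):.3f}")
print(f"error bar with input noise:    {np.sqrt(sigma2_total):.3f}")
```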

Relevance:

30.00%

Publisher:

Abstract:

Traditionally, geostatistical algorithms are contained within specialist GIS and spatial statistics software. Such packages are often expensive, with relatively complex user interfaces and steep learning curves, and cannot be easily integrated into more complex process chains. In contrast, Service Oriented Architectures (SOAs) promote interoperability and loose coupling within distributed systems, typically using XML (eXtensible Markup Language) and Web services. Web services provide a mechanism for a user to discover and consume a particular process, often as part of a larger process chain, with minimal knowledge of how it works. Wrapping current geostatistical algorithms with a Web service layer would thus increase their accessibility, but raises several complex issues. This paper discusses a solution to providing interoperable, automatic geostatistical processing through the use of Web services, developed in the INTAMAP project (INTeroperability and Automated MAPping). The project builds upon Open Geospatial Consortium standards for describing observations, typically used within sensor webs, and employs Geography Markup Language (GML) to describe the spatial aspect of the problem domain. Thus the interpolation service is extremely flexible, being able to support a range of observation types, and can cope with issues such as change of support and differing error characteristics of sensors (by utilising descriptions of the observation process provided by SensorML). XML is accepted as the de facto standard for describing Web services, due to its expressive capabilities which allow automatic discovery and consumption by ‘naive’ users. Any XML schema employed must therefore be capable of describing every aspect of a service and its processes. However, no schema currently exists that can define the complex uncertainties and modelling choices that are often present within geostatistical analysis. We show a solution to this problem, developing a family of XML schemata to enable the description of a full range of uncertainty types. These types will range from simple statistics, such as the kriging mean and variances, through to a range of probability distributions and non-parametric models, such as realisations from a conditional simulation. By employing these schemata within a Web Processing Service (WPS) we show a prototype moving towards a truly interoperable geostatistical software architecture.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a greedy Bayesian experimental design criterion for heteroscedastic Gaussian process models. The criterion is based on the Fisher information and is optimal in the sense of minimizing parameter uncertainty for likelihood-based estimators. We demonstrate the validity of the criterion under different noise regimes and present experimental results from a rabies simulator to demonstrate the effectiveness of the resulting approximately optimal designs.
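
A minimal sketch of a greedy Fisher-information design loop in the spirit of the criterion described above, using a simple heteroscedastic linear-Gaussian model as a stand-in for the Gaussian process likelihood; the candidate grid, feature map and noise model are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Candidate design points and a simple feature map (stand-in model, for illustration).
candidates = np.linspace(0, 1, 101)
features = lambda x: np.stack([np.ones_like(x), x, x ** 2], axis=-1)

def noise_var(x):
    """Assumed input-dependent noise variance (illustrative heteroscedastic model)."""
    return 0.05 + 0.5 * x ** 2

def fisher_contribution(x):
    """Per-point Fisher information for the mean parameters under Gaussian noise."""
    phi = features(np.atleast_1d(x))[0]
    return np.outer(phi, phi) / noise_var(x)

# Greedy design: repeatedly add the candidate that most increases
# the log-determinant of the accumulated Fisher information matrix.
n_design = 10
info = 1e-6 * np.eye(3)          # small jitter so the determinant is defined
design = []
for _ in range(n_design):
    scores = [np.linalg.slogdet(info + fisher_contribution(c))[1] for c in candidates]
    best = candidates[int(np.argmax(scores))]
    design.append(best)
    info = info + fisher_contribution(best)

print("greedy design points:", np.round(sorted(design), 3))
```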

Relevance:

30.00%

Publisher:

Abstract:

Scenario planning is a strategy tool with growing popularity in both academia and practical situations. Current practices in the teaching of scenario planning are largely based on existing literature which utilises scenario planning to develop strategies for the future, primarily considering the assessment of perceived macro-external environmental uncertainties. However, there is a body of literature hitherto ignored by scenario planning researchers, which suggests that Perceived Environmental Uncertainty (PEU) influences the micro-external or industry environment as well as the internal environment of the organisation. This paper provides a review of the most dominant theories on the scenario planning process, demonstrates the need to consider PEU theory within scenario planning, and presents how this can be done. The scope of this paper is to enhance the scenario planning process as a tool taught for strategy development. A case vignette is developed based on published scenarios to demonstrate the potential utilisation of the proposed process.

Relevance:

30.00%

Publisher:

Abstract:

It is generally assumed when using Bayesian inference methods for neural networks that the input data contains no noise. For real-world (errors in variable) problems this is clearly an unsafe assumption. This paper presents a Bayesian neural network framework which accounts for input noise provided that a model of the noise process exists. In the limit where the noise process is small and symmetric it is shown, using the Laplace approximation, that this method adds an extra term to the usual Bayesian error bar which depends on the variance of the input noise process. Further, by treating the true (noiseless) input as a hidden variable, and sampling this jointly with the network’s weights, using a Markov chain Monte Carlo method, it is demonstrated that it is possible to infer the regression over the noiseless input. This leads to the possibility of training an accurate model of a system using less accurate, or more uncertain, data. This is demonstrated on both a synthetic noisy sine wave problem and a real problem of inferring the forward model for a satellite radar backscatter system used to predict sea surface wind vectors.
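
A compact sketch of the joint-sampling idea on a noisy sine wave, assuming a small one-hidden-layer network, Gaussian priors and a plain random-walk Metropolis sampler over the concatenated vector of weights and latent (noiseless) inputs; the network size, priors and proposal scales are illustrative choices rather than those of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Noisy sine wave: both the inputs and the outputs are observed with noise.
n = 40
x_true = np.sort(rng.uniform(0, 2 * np.pi, n))
sigma_x, sigma_y = 0.15, 0.1
x_obs = x_true + rng.normal(0, sigma_x, n)
y_obs = np.sin(x_true) + rng.normal(0, sigma_y, n)

H = 10                            # hidden units of a small one-hidden-layer network
n_w = 3 * H + 1                   # W1, b1, W2, b2 flattened

def net(w, x):
    W1, b1, W2, b2 = w[:H], w[H:2*H], w[2*H:3*H], w[3*H]
    return np.tanh(np.outer(x, W1) + b1) @ W2 + b2

def log_post(w, x_lat):
    """Joint log posterior of the weights and the latent (noiseless) inputs."""
    ll_y = -0.5 * np.sum((y_obs - net(w, x_lat)) ** 2) / sigma_y ** 2
    ll_x = -0.5 * np.sum((x_obs - x_lat) ** 2) / sigma_x ** 2   # input-noise model
    lp_w = -0.5 * np.sum(w ** 2)                                # Gaussian weight prior
    return ll_y + ll_x + lp_w

# Random-walk Metropolis over the concatenated (weights, latent inputs) vector.
w = rng.normal(0, 0.5, n_w)
x_lat = x_obs.copy()
cur = log_post(w, x_lat)
keep = []
for it in range(20000):
    w_p = w + 0.02 * rng.normal(size=n_w)
    x_p = x_lat + 0.02 * rng.normal(size=n)
    new = log_post(w_p, x_p)
    if np.log(rng.uniform()) < new - cur:
        w, x_lat, cur = w_p, x_p, new
    if it >= 10000 and it % 50 == 0:
        keep.append(net(w, np.linspace(0, 2 * np.pi, 50)))

mean_curve = np.mean(keep, axis=0)     # posterior mean regression over the noiseless input
print(mean_curve[:5])
```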

Relevance:

30.00%

Publisher:

Abstract:

In this paper we present a novel method for emulating a stochastic, or random output, computer model and show its application to a complex rabies model. The method is evaluated both in terms of accuracy and computational efficiency on synthetic data and the rabies model. We address the issue of experimental design and provide empirical evidence on the effectiveness of utilizing replicate model evaluations compared to a space-filling design. We employ the Mahalanobis error measure to validate the heteroscedastic Gaussian process-based emulator predictions for both the mean and (co)variance. The emulator allows efficient screening to identify important model inputs and better understanding of the complex behaviour of the rabies model.
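
A minimal sketch of the Mahalanobis validation diagnostic mentioned above, comparing held-out outputs against an emulator's predictive mean and covariance; the predictive quantities below are random placeholders rather than output of the rabies emulator.

```python
import numpy as np

def mahalanobis_error(y_valid, mean_pred, cov_pred):
    """Mahalanobis distance between validation outputs and emulator predictions.

    For a well-calibrated emulator this statistic should be close to the number
    of validation points.
    """
    diff = y_valid - mean_pred
    L = np.linalg.cholesky(cov_pred + 1e-10 * np.eye(len(diff)))
    z = np.linalg.solve(L, diff)
    return float(z @ z)

# Illustrative placeholder predictions (assumed, not the rabies emulator's output).
rng = np.random.default_rng(4)
m = 25
mean_pred = rng.normal(size=m)
A = rng.normal(size=(m, m))
cov_pred = A @ A.T / m + 0.1 * np.eye(m)
y_valid = mean_pred + np.linalg.cholesky(cov_pred) @ rng.normal(size=m)

D = mahalanobis_error(y_valid, mean_pred, cov_pred)
print(f"Mahalanobis error: {D:.1f} (expected around {m} for a calibrated emulator)")
```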

Relevance:

30.00%

Publisher:

Abstract:

This thesis examines the phenomenon of strategy-making as practised by small professional football clubs. The study was undertaken because football clubs were perceived to have problems with strategy-making and because it was believed that the specific circumstances of football clubs could be outside the range covered by conventional views of strategy-making. The characteristics of the club environment are its uncertainty and unpredictability, simultaneous competition and co-operation, strong regulations, and a not-for-profit orientation. Small clubs in particular face a constant struggle for financial viability and survival, due in part to split business and playing objectives. The study was designed to establish the extent and nature of the difficulties clubs experience, with a view to preparing the way for practical guidance on how to overcome them. Clearly, in order to survive in the long term, small professional football clubs require very effective strategic decisions. This study has addressed this issue by inquiring into the nature of strategy-making for these organisations, with the objective of establishing the general direction in which the football clubs in question should be moving. As a result, the main research question to guide this investigation was determined as: why do small professional football clubs have difficulties making strategies? The investigation was based on an analysis of the concept of strategy and its elements: the strategic vision and objectives, the process by which strategic action comes about, the strategic action itself, and the context within which this action occurs. Data have been collected, analysed and interpreted in relation to each of these elements. Together with a wide variety of published material, 20 small football clubs have been sampled and personal interviews were conducted with board members of those clubs. The findings indicate that small football clubs do indeed experience considerable difficulties in making strategies, the reasons for which lie both in the characteristics of their competitive environment and in their approaches to strategy-making. The competitive environment is characterised by a cartel-like structure with a high degree of regulation, high levels of uncertainty, little control over the core product or the production process, short-term business cycles and a close geographical link between a club and its local market. The management of clubs is characterised by the need to balance conflicting sporting and business objectives. Formal planning techniques are of little use in the small football club context, as decision-making processes have a strong political character and the development of novel strategies is hindered by a strong, conservative industry paradigm and a lack of financial and managerial resources. It is concluded that there is no simple advice to be given to clubs, as they must re-examine the relationship between their playing and business objectives to create a unified and workable approach.

Relevance:

30.00%

Publisher:

Abstract:

In recent years there has been a great effort to combine the technologies and techniques of GIS and process models. This project examines the issues of linking a standard current-generation 2.5D GIS with several existing model codes. The focus for the project has been the Shropshire Groundwater Scheme, which is being developed to augment flow in the River Severn during drought periods by pumping water from the Shropshire Aquifer. Previous authors have demonstrated that under certain circumstances pumping could reduce the soil moisture available for crops. This project follows earlier work at Aston in which the effects of drawdown were delineated and quantified through the development of a software package that implemented a technique which brought together the significant spatially varying parameters. This technique is repeated here, but using a standard GIS called GRASS. The GIS proved adequate for the task, and the added functionality provided by the general-purpose GIS - the data capture, manipulation and visualisation facilities - was of great benefit. The bulk of the project is concerned with examining the issues of linking GIS and environmental process models. To this end a groundwater model (Modflow) and a soil moisture model (SWMS2D) were linked to the GIS, and a crop model was implemented within the GIS. A loose-linked approach was adopted and secondary and surrogate data were used wherever possible. The issues examined include: justification of a loose-linked versus a closely integrated approach; how, technically, to achieve the linkage; how to reconcile the different data models used by the GIS and the process models; control of the movement of data between models of environmental subsystems in order to model the total system; the advantages and disadvantages of using a current-generation GIS as a medium for linking environmental process models; generation of input data, including the use of geostatistics, stochastic simulation, remote sensing, regression equations and mapped data; issues of accuracy, uncertainty and simply providing adequate data for the complex models; and how such a modelling system fits into an organisational framework.

Relevance:

30.00%

Publisher:

Abstract:

Diagnosing faults in wastewater treatment, like diagnosis of most problems, requires bi-directional plausible reasoning. This means that both predictive (from causes to symptoms) and diagnostic (from symptoms to causes) inferences have to be made, depending on the evidence available, in reasoning towards the final diagnosis. The use of computer technology for the purpose of diagnosing faults in the wastewater process has been explored, and a rule-based expert system was initiated. It was found that such an approach has serious limitations in its ability to reason bi-directionally, which makes it unsuitable for diagnostic tasks under conditions of uncertainty. The probabilistic approach known as Bayesian Belief Networks (BBNs) was then critically reviewed and found to be well suited to diagnosis under uncertainty. The theory and application of BBNs are outlined. A full-scale BBN for the diagnosis of faults in a wastewater treatment plant based on the activated sludge system has been developed in this research. Results from the BBN show good agreement with the predictions of wastewater experts. It can be concluded that BBNs are far superior to rule-based systems based on certainty factors in their ability to diagnose faults and make predictions in complex operating systems with inherently uncertain behaviour.
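
As a toy illustration of the bi-directional reasoning the abstract refers to, the snippet below computes both the predictive probability of a symptom given a fault and the diagnostic probability of the fault given the symptom via Bayes' rule; the fault, symptom and probabilities are invented and are not taken from the wastewater BBN.

```python
# Toy two-node Bayesian network: Fault -> Symptom (probabilities are illustrative).
p_fault = 0.05                       # prior probability of, say, a bulking-sludge fault
p_symptom_given_fault = 0.90         # predictive direction: cause -> symptom
p_symptom_given_no_fault = 0.10

# Predictive inference: probability of observing the symptom at all.
p_symptom = (p_symptom_given_fault * p_fault
             + p_symptom_given_no_fault * (1 - p_fault))

# Diagnostic inference: Bayes' rule turns the same table around, symptom -> cause.
p_fault_given_symptom = p_symptom_given_fault * p_fault / p_symptom

print(f"P(symptom)         = {p_symptom:.3f}")
print(f"P(fault | symptom) = {p_fault_given_symptom:.3f}")
```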

Relevance:

30.00%

Publisher:

Abstract:

This thesis reviews the main methodological developments in public sector investment appraisal and finds growing evidence that appraisal techniques are not fulfilling their earlier promise. It is suggested that an important reason for this failure lies in the inability of these techniques to handle uncertainty except in a highly circumscribed fashion. It is argued that a more fruitful approach is to strive for flexibility. Investment projects should be formulated with a view to making them responsive to a wide range of possible future events, rather than embodying a solution which is optimal for one configuration of circumstances only. The distinction drawn in economics between the short and the long run is used to examine the nature of flexibility. The concept of long-run flexibility is applied to the pre-investment range of choice open to the decision-maker. It is demonstrated that flexibility is reduced at a very early stage of decision-making by the conventional system of appraisal, which evaluates only a small number of options. The pre-appraisal filtering process is considered further in relation to decision-making models. It is argued that for public sector projects the narrowing down of options is best understood in relation to an amended mixed-scanning model which places importance on the process by which the 'national interest' is determined. Short-run flexibility deals with operational characteristics, the degree to which particular projects may respond to changing demands when the basic investment is already in place. The tension between flexibility and cost is noted. A short case study on the choice of electricity generating plant is presented. The thesis concludes with a brief examination of the approaches used by successive British governments to public sector investment, particularly in relation to the nationalised industries.

Relevance:

30.00%

Publisher:

Abstract:

Hierarchical knowledge structures are frequently used within clinical decision support systems as part of the model for generating intelligent advice. The nodes in the hierarchy inevitably have varying influence on the decision-making processes, which needs to be reflected by parameters. If the model has been elicited from human experts, it is not feasible to ask them to estimate the parameters because there will be so many in even moderately sized structures. This paper describes how the parameters could be obtained from data instead, using only a small number of cases. The original method [1] is applied to a particular web-based clinical decision support system called GRiST, which uses its hierarchical knowledge to quantify the risks associated with mental-health problems. The knowledge was elicited from multidisciplinary mental-health practitioners, but the tree has several thousand nodes, all requiring an estimation of their relative influence on the assessment process. The method described in the paper shows how they can be obtained from about 200 cases instead. It greatly reduces the experts’ elicitation tasks and has the potential to be generalised to similar knowledge-engineering domains where relative weightings of node siblings are part of the parameter space.
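
The abstract does not spell out the estimation procedure of [1], but the sketch below illustrates one simple way relative sibling weights could in principle be recovered from cases: non-negative least squares of the parent's assessed value on its children's scores, normalised to sum to one. The node structure, data and method here are assumptions for illustration only, not the paper's algorithm.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(5)

# Illustrative data: for 200 cases, scores of three sibling nodes (children)
# and the expert-assessed value of their parent node. All values are invented.
n_cases, n_children = 200, 3
child_scores = rng.uniform(0, 1, size=(n_cases, n_children))
true_w = np.array([0.6, 0.3, 0.1])                       # weights to recover
parent_value = child_scores @ true_w + rng.normal(0, 0.02, n_cases)

# Non-negative least squares, then normalise so the sibling weights sum to one.
w_hat, _ = nnls(child_scores, parent_value)
w_hat = w_hat / w_hat.sum()
print("estimated relative influences:", np.round(w_hat, 3))
```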

Relevance:

30.00%

Publisher:

Abstract:

Scenario planning is a strategy tool with growing popularity in both academia and practical situations. Current practices of scenario planning are largely based on existing literature which utilises scenario planning to develop strategies for the future, primarily considering the assessment of perceived macro-external environmental uncertainties. However, there is a body of literature hitherto ignored by scenario planning researchers, which suggests that Perceived Environmental Uncertainty (PEU) influences the micro-external as well as the internal environment of the organisation. This paper reviews the most dominant theories on the scenario planning process and PEU, developing three propositions for the practice of the scenario planning process. Furthermore, it shows how these propositions can be integrated into the scenario planning process in order to improve the development of strategy.

Relevance:

30.00%

Publisher:

Abstract:

Purpose: The purpose of this paper is to investigate the relations between perceived business uncertainty (PBU), use of external risk management (RM) consultants, formalisation of RM, magnitude of RM methods and perceived organisational outcomes. Design/methodology/approach: This paper is based on a questionnaire survey of members of the Chartered Institute of Management Accountants in the UK. Using AMOS 17.0, the paper tests the strength of the direct and indirect effects among the variables and explores the fit of the overall path model. Findings: The results indicate significant and positive associations exist between the extent of PBU and the level of RM formalisation, as well as between the level of RM formalisation and the magnitude of RM methods adopted. The use of external RM consultants is also found to have a significant and positive impact on the magnitude of RM methods adopted. Finally, both the extent of RM formalisation and the magnitude of RM methods adopted are seen to be significantly associated with overall improvement in organisational outcomes. Research limitations/implications: The study uses perceptual measures of the level of business uncertainty, usage of RM and organisational outcomes. Further, the respondents are members of a management accounting professional body and the views of other managers, such as risk managers, who are also important to the governance process are not incorporated. Originality/value: This study provides empirical evidence on the impact of RM design and usage on improvements in organisational outcomes. It contributes to the RM literature where empirical research is needed in order to be comparable with the traditional management control system literature.

Relevance:

30.00%

Publisher:

Abstract:

Biomass-to-Liquid (BTL) is one of the most promising low-carbon processes available to support the expanding transportation sector. This multi-step process produces hydrocarbon fuels from biomass, the so-called “second generation biofuels” that, unlike first generation biofuels, have the ability to make use of a wider range of biomass feedstock than just plant oils and sugar/starch components. A BTL process based on gasification has yet to be commercialised. This work focuses on the techno-economic feasibility of nine BTL plants. The scope was limited to hydrocarbon products as these can be readily incorporated and integrated into conventional markets and supply chains. The evaluated BTL systems were based on pressurised oxygen gasification of wood biomass or bio-oil and were characterised by different fuel synthesis processes, including Fischer-Tropsch synthesis, the Methanol to Gasoline (MTG) process and the Topsoe Integrated Gasoline (TIGAS) synthesis. This was the first time that these three fuel synthesis technologies were compared in a single, consistent evaluation. The selected process concepts were modelled using the process simulation software IPSEpro to determine mass balances, energy balances and product distributions. For each BTL concept, a cost model was developed in MS Excel to estimate capital, operating and production costs. An uncertainty analysis based on the Monte Carlo statistical method was also carried out to examine how uncertainty in the input parameters of the cost model could affect its output (i.e. the production cost). This was the first time that an uncertainty analysis was included in a published techno-economic assessment study of BTL systems. It was found that bio-oil gasification cannot currently compete with solid biomass gasification due to the lower efficiencies and higher costs associated with the additional thermal conversion step of fast pyrolysis. Fischer-Tropsch synthesis was the most promising fuel synthesis technology for commercial production of liquid hydrocarbon fuels since it achieved higher efficiencies and lower costs than TIGAS and MTG. None of the BTL systems were competitive with conventional fossil fuel plants. However, if the government tax take were reduced by approximately 33% or a subsidy of £55/t dry biomass were available, transport biofuels could be competitive with conventional fuels. Large-scale biofuel production may be possible in the long term through subsidies, fuel price rises and legislation.
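
A minimal sketch of the kind of Monte Carlo uncertainty propagation described above: uncertain inputs of a highly simplified production-cost model are sampled from assumed distributions and the spread of the resulting production cost is reported. The cost structure, parameter values and distributions are placeholders, not figures from the thesis.

```python
import numpy as np

rng = np.random.default_rng(6)
n_samples = 10_000

# Illustrative uncertain inputs of a simplified production-cost model
# (distributions and values are placeholders, not from the BTL cost models).
capital_cost = rng.triangular(250e6, 300e6, 400e6, n_samples)   # plant capital, GBP
biomass_price = rng.uniform(40, 80, n_samples)                  # GBP per dry tonne
biomass_use = 400_000                                           # dry tonnes per year
fuel_output = 90_000                                            # tonnes of fuel per year
capital_charge = 0.13                                           # annualisation factor
other_opex = rng.normal(15e6, 2e6, n_samples)                   # GBP per year

annual_cost = capital_charge * capital_cost + biomass_price * biomass_use + other_opex
production_cost = annual_cost / fuel_output                     # GBP per tonne of fuel

p5, p50, p95 = np.percentile(production_cost, [5, 50, 95])
print(f"production cost: median {p50:.0f} GBP/t (90% interval {p5:.0f}-{p95:.0f})")
```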