915 results for Uncertainty in Illness Theory


Relevance:

100.00%

Publisher:

Abstract:

Inventory control in complex manufacturing environments encounters various sources of uncertainty and imprecision. This paper presents a fuzzy knowledge-based approach to determining the order quantity in the presence of uncertain demand, lead time, and actual inventory level. Uncertain data are represented by fuzzy numbers, and vaguely defined relations between them are modeled by fuzzy if-then rules. The proposed representation and inference mechanism are verified on a large number of examples, and the results of three representative cases are summarized. Finally, the developed fuzzy knowledge-based approach is compared with traditional probabilistic approaches.
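
As a rough illustration of this kind of approach, the sketch below encodes two fuzzy if-then rules with triangular fuzzy numbers and defuzzifies the result into a crisp order quantity. The membership functions, rules, and numbers are invented for illustration and are not the paper's actual knowledge base.

```python
# Minimal fuzzy order-quantity sketch (illustrative; the paper's actual
# rule base and membership functions are not reproduced here).

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def order_quantity(demand, inventory):
    """Two-rule Mamdani-style inference with weighted-average defuzzification."""
    # Rule 1: IF demand is high AND inventory is low THEN order is large.
    w1 = min(tri(demand, 60, 100, 140), tri(inventory, 0, 0, 40))
    # Rule 2: IF demand is low AND inventory is high THEN order is small.
    w2 = min(tri(demand, 0, 20, 60), tri(inventory, 30, 80, 120))
    peaks = {"large": 120.0, "small": 10.0}   # consequent fuzzy-set peaks
    total = w1 + w2
    if total == 0:
        return 0.0                             # no rule fires; order nothing
    return (w1 * peaks["large"] + w2 * peaks["small"]) / total

print(order_quantity(demand=90, inventory=15))  # mostly "large" -> big order
print(order_quantity(demand=25, inventory=90))  # mostly "small" -> small order
```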

Relevance:

100.00%

Publisher:

Abstract:

OpenMI is a widely used standard for exchanging data between integrated models; to date it has mostly been applied to dynamic, deterministic models. Within the FP7 UncertWeb project we are developing mechanisms and tools to support the management of uncertainty in environmental models. In this paper we explore the integration of the UncertWeb framework with OpenMI, to assess the issues that arise when propagating uncertainty in OpenMI model compositions and the degree of integration possible with UncertWeb tools. In particular, we develop an uncertainty-enabled model of a simple Lotka-Volterra system with an interface conforming to the OpenMI standard, exploring uncertainty in the initial predator and prey levels and in the parameters of the model equations. We use the Elicitator tool developed within UncertWeb to identify the initial-condition uncertainties, and show how these can be integrated, using UncertML, with simple Monte Carlo propagation mechanisms. The mediators we develop for OpenMI models are generic and produce standard Web services that expose the OpenMI models to a Web-based framework. We discuss what further work is needed to develop a more complete system and show how it might be used in practice.
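
The following sketch shows the kind of Monte Carlo propagation described above: assumed log-normal initial predator and prey levels are sampled and pushed through a Lotka-Volterra integration. The distributions and parameter values are placeholders, not the elicited values from the study.

```python
# Illustrative Monte Carlo propagation of initial-condition uncertainty
# through a Lotka-Volterra system (all distributions/parameters assumed).
import numpy as np
from scipy.integrate import odeint

def lotka_volterra(state, t, alpha, beta, delta, gamma):
    prey, pred = state
    return [alpha * prey - beta * prey * pred,
            delta * prey * pred - gamma * pred]

rng = np.random.default_rng(0)
t = np.linspace(0.0, 20.0, 200)
runs = []
for _ in range(1000):
    # Sample uncertain initial prey/predator levels (log-normal, assumed).
    x0 = [rng.lognormal(np.log(10.0), 0.2), rng.lognormal(np.log(5.0), 0.2)]
    runs.append(odeint(lotka_volterra, x0, t, args=(1.0, 0.1, 0.075, 1.5)))

runs = np.array(runs)                                 # (samples, time, species)
prey_mean = runs[:, :, 0].mean(axis=0)                # ensemble mean trajectory
prey_p95 = np.percentile(runs[:, :, 0], 95, axis=0)   # upper uncertainty band
```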

Relevance:

100.00%

Publisher:

Abstract:

Error and uncertainty in remotely sensed data come from several sources and can be increased or mitigated by the processing to which the data are subjected (e.g. resampling, atmospheric correction). Historically, the effects of such uncertainty have been considered only in aggregate and evaluated in a confusion matrix, which becomes high-level metadata and so is commonly ignored. However, some of the sources of uncertainty can be explicitly identified and modelled, and their effects (which often vary across space and time) visualized. Others can be considered in aggregate, but their spatial effects can still be visualized. This process of visualization is of particular value for users who need to assess the importance of data uncertainty for their own practical applications. This paper describes a Java-based toolkit which uses interactive, linked views to enable visualization of data uncertainty by a variety of means. This allows users to treat error and uncertainty as integral elements of image data, to be viewed and explored, rather than as labels or indices attached to the data. © 2002 Elsevier Science Ltd. All rights reserved.
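
A minimal sketch of the underlying idea, treating uncertainty as an image layer in its own right: the toolkit itself is Java-based with interactive linked views, so this Python/matplotlib snippet with shared axes is only a loose analogue on synthetic data.

```python
# Show a classified image next to a per-pixel uncertainty surface
# (synthetic data; shared axes give linked pan/zoom when interactive).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
classes = rng.integers(0, 4, size=(50, 50))          # mock classified image
uncertainty = rng.uniform(0.0, 1.0, size=(50, 50))   # mock per-pixel uncertainty

fig, (ax1, ax2) = plt.subplots(1, 2, sharex=True, sharey=True)
ax1.imshow(classes, cmap="tab10")
ax1.set_title("Classification")
ax2.imshow(uncertainty, cmap="viridis")
ax2.set_title("Per-pixel uncertainty")
plt.show()
```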

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this paper is (1) to highlight some recent and heretofore unpublished results in the theory of multiplier sequences and (2) to survey some open problems in this area of research. For the sake of clarity of exposition, we have grouped the problems into three subsections, although several of them are interrelated. For the reader's convenience, we have included the pertinent definitions, cited references and related results, and, in several instances, elucidated the problems with examples.
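
For readers unfamiliar with the central object, the standard definition (going back to Pólya and Schur) can be stated as follows; the example is a classical one and not specific to this survey.

```latex
% Standard definition; the worked example is classical.
A sequence $\{\gamma_k\}_{k=0}^{\infty}$ of real numbers is a
\emph{multiplier sequence} if, for every polynomial
$p(x) = \sum_{k=0}^{n} a_k x^k$ with only real zeros, the polynomial
\[
  \Gamma[p](x) \;=\; \sum_{k=0}^{n} \gamma_k\, a_k\, x^k
\]
also has only real zeros. For example, $\gamma_k = k$ is a multiplier
sequence: here $\Gamma[p](x) = x\,p'(x)$, and real-rootedness is
preserved under differentiation by Rolle's theorem.
```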

Relevance:

100.00%

Publisher:

Abstract:

The question of forming an aim-oriented description of the object domain of a decision support process is outlined. Two main problems in the estimation and evaluation of data and knowledge uncertainty in decision support systems, the straight (direct) and the reverse (inverse) problem, are formulated. Three conditions, serving as formalized criteria for the aim-oriented construction of the input, internal, and output spaces of a decision support system, are proposed. Definitions of apparent and hidden data uncertainties on a measuring scale are given.

Relevance:

100.00%

Publisher:

Abstract:

In nonlinear and stochastic control problems, learning an efficient feed-forward controller is not amenable to conventional neurocontrol methods. For these approaches, estimating and then incorporating uncertainty in the controller and the feed-forward models can produce more robust control results. Here, we introduce a novel inversion-based neurocontroller for solving control problems involving uncertain nonlinear systems that can also compensate for multi-valued systems. The approach uses recent developments in neural networks, especially in the context of modelling statistical distributions, which are applied to forward and inverse plant models. Provided that certain conditions are met, an estimate of the intrinsic uncertainty in the outputs of a neural network can be obtained from the statistical properties of the network. More generally, multi-component distributions can be modelled by a mixture density network. Based on importance sampling from these distributions, a novel robust inverse control approach is obtained. This importance sampling provides a structured and principled way to constrain the complexity of the search space for the ideal control law. The developed methodology circumvents the dynamic programming problem by using the predicted neural network uncertainty to localise the possible control solutions to consider. A nonlinear multivariable system with different delays between the input-output pairs is used to demonstrate the successful application of the developed control algorithm. The proposed method is suitable for redundant control systems and allows us to model strongly non-Gaussian distributions of the control signal as well as processes with hysteresis. © 2004 Elsevier Ltd. All rights reserved.
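
The selection step can be sketched as follows: sample control candidates from a Gaussian mixture of the kind a mixture density network would output for a multi-valued inverse, then use the forward model to weight and localise the candidates. All parameters and the toy plant below are assumptions for illustration, not the paper's trained networks.

```python
# Sampling-based inverse control sketch: draw candidates from a (here
# hard-coded) Gaussian mixture over controls, then keep the candidate
# the forward model says tracks the target best.
import numpy as np

rng = np.random.default_rng(2)

# Assumed mixture parameters for a multi-valued inverse solution:
weights = np.array([0.5, 0.5])
means   = np.array([-1.0, 2.0])   # two distinct control branches
stds    = np.array([0.3, 0.3])

def forward_model(u):
    """Stand-in plant; the real one would be a trained forward network."""
    return u ** 2 - u             # non-monotonic, hence multi-valued inverse

target = 2.0
comps = rng.choice(len(weights), size=500, p=weights)
candidates = rng.normal(means[comps], stds[comps])    # sample the mixture

errors = (forward_model(candidates) - target) ** 2    # score via forward model
best_control = candidates[np.argmin(errors)]          # localized search result
```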

Relevance:

100.00%

Publisher:

Abstract:

The sheer volume of citizen weather data collected and uploaded to online data hubs is immense, but as with any citizen data it is difficult to assess the accuracy of the measurements. Within this project we quantify how much data is available, where it comes from, the frequency at which it is collected, and the types of automatic weather stations being used. We also list the numerous possible sources of error and uncertainty in citizen weather observations before showing evidence of such effects in real data. A thorough intercomparison field study was conducted, testing popular models of citizen weather stations; from this study we were able to parameterise key sources of bias. Most significantly, the project develops a complete quality control system through which citizen air temperature observations can be passed, its structure heavily informed by the results of the field study. Using a Bayesian framework, the system learns and updates its estimates of the calibration and radiation-induced biases inherent to each station. We then show the benefit of correcting for these learnt biases over using the original uncorrected data. The system also attaches an uncertainty estimate to each observation, giving applications that choose to incorporate such observations a measure on which to base their confidence in the data. The system relies on interpolated temperature and radiation observations from neighbouring professional weather stations, for which a Bayesian regression model is used. We acknowledge some of the assumptions and flaws of the developed system and suggest further work needed to bring it to an operational setting. Such a system should allow applications to leverage the additional value that citizen weather data bring to long-standing professional observing networks.
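
A minimal sketch of the Bayesian updating idea, assuming a conjugate normal-normal model for a single station's calibration bias against an interpolated professional reference; the project's actual system also models radiation-induced bias and uses a Bayesian regression for the reference interpolation.

```python
# Conjugate normal-normal update of one station's calibration bias
# (all numbers invented for illustration).
mu, var = 0.0, 1.0          # prior belief about the station's bias (degC, degC^2)
obs_var = 0.25              # assumed variance of a single comparison

for station_temp, reference_temp in [(21.3, 20.6), (18.9, 18.2), (25.1, 24.5)]:
    residual = station_temp - reference_temp        # noisy glimpse of the bias
    post_var = 1.0 / (1.0 / var + 1.0 / obs_var)    # precision-weighted update
    mu = post_var * (mu / var + residual / obs_var)
    var = post_var

corrected = 22.0 - mu                 # debias a new reading from this station
sigma = (var + obs_var) ** 0.5        # uncertainty attached to that observation
print(f"estimated bias {mu:.2f} +/- {var ** 0.5:.2f} degC")
```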

Relevance:

100.00%

Publisher:

Abstract:

This paper details a method of estimating the uncertainty of dimensional measurement for a three-dimensional coordinate measuring machine. An experimental procedure was developed to compare three-dimensional coordinate measurements with calibrated reference points. The reference standard used to calibrate these reference points was a fringe-counting interferometer, with a multilateration-like technique employed to establish three-dimensional coordinates; this is an extension of the established technique of comparing measured lengths with calibrated lengths. Specifically, a distributed coordinate measurement device was tested, consisting of a network of Rotary-Laser Automatic Theodolites (R-LATs); this system is known commercially as indoor GPS (iGPS). The method was found to be practical and was used to estimate that the uncertainty of measurement for the basic iGPS system is approximately 1 mm at a 95% confidence level throughout a measurement volume of approximately 10 m × 10 m × 1.5 m. © 2010 IOP Publishing Ltd.
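
The comparison step can be sketched as follows, with invented coordinates: the deviations between measured and calibrated reference points are summarized as an expanded uncertainty with coverage factor k = 2, corresponding to the quoted 95% confidence level.

```python
# Deviations between measured and calibrated reference coordinates,
# summarized as an expanded uncertainty (illustrative numbers only).
import numpy as np

measured  = np.array([[0.001, 2.000, 0.499], [4.999, 2.001, 0.500],
                      [9.998, 1.999, 0.501]])   # tracker readings (m)
reference = np.array([[0.000, 2.000, 0.500], [5.000, 2.000, 0.500],
                      [10.000, 2.000, 0.500]])  # interferometer-calibrated points

deviations = np.linalg.norm(measured - reference, axis=1)  # 3-D point errors
rms = np.sqrt(np.mean(deviations ** 2))
expanded_uncertainty = 2.0 * rms    # k = 2 coverage, ~95% for normal errors
print(f"U(95%) ~ {expanded_uncertainty * 1000:.2f} mm")
```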

Relevance:

100.00%

Publisher:

Abstract:

Generalizing from experience gained in solving practical problems, Koopmans set about developing the linear activity analysis model. He was surprised to find that the economics of his day possessed no unified, sufficiently exact theory of production or system of concepts for it. In a pioneering paper he therefore also laid the foundations of an axiomatic production theory resting on the concept of technological sets, as a theoretical framework for the linear activity analysis model. He is credited with the exact definition of the concepts of production efficiency and efficiency prices, and with proving their mutually conditioning relationship within the linear activity analysis model. Koopmans treated the purely technical definition of efficiency in use today only as a special case; his aim was to introduce and analyse the concept of economic efficiency. In this paper we reconstruct his results on the latter using the duality theorems of linear programming. We show, first, that his proofs are equivalent to proving the duality theorems of linear programming and, second, that economic efficiency prices are in fact shadow prices in the modern sense. We also point out that his model for interpreting economic efficiency can be regarded as a direct predecessor of the Arrow-Debreu-McKenzie models of general equilibrium theory, containing almost all of their essential elements and concepts; the equilibrium prices are none other than Koopmans's efficiency prices. Finally, we reinterpret Koopmans's model as a possible tool for the microeconomic description of the firm's technology. Journal of Economic Literature (JEL) codes: B23, B41, C61, D20, D50.
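
For reference, the primal-dual pair of linear programs underlying such a reconstruction is the standard one; the dual variables play the role of the shadow (efficiency) prices discussed above.

```latex
% Standard primal-dual pair; y gives the shadow (efficiency) prices.
\begin{align*}
  \text{(P)}\quad & \max_{x}\; c^{\top}x \quad \text{s.t.}\; Ax \le b,\; x \ge 0,\\
  \text{(D)}\quad & \min_{y}\; b^{\top}y \quad \text{s.t.}\; A^{\top}y \ge c,\; y \ge 0.
\end{align*}
% Strong duality: at optima, $c^{\top}x^{*} = b^{\top}y^{*}$, and $y^{*}_i$
% measures the marginal value of relaxing resource constraint $i$.
```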

Relevance:

100.00%

Publisher:

Abstract:

The aim of this paper is to survey game-theoretic modelling of the behaviour of global players in mitigation of and adaptation to climate change. Three main classes of games are applied to specific aspects of temperature rise: behavioural games, common-pool resource (CPR) games, and negotiation games. Game-theoretic instruments are useful for analysing strategies under uncertain circumstances, such as the occurrence and impacts of climate change. To analyse the international players' relations, actions, attitudes toward carbon emissions, negotiating power, and motives, several games are applied to climate change in this paper. Solutions to the externality problem are also surveyed.
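
As a minimal illustration of the behaviour-game setting, the sketch below checks a textbook two-player emissions game with prisoner's-dilemma payoffs (the numbers are invented): unilateral incentives push both players to emit, which is exactly the externality problem the surveyed games address.

```python
# 2x2 emissions game with prisoner's-dilemma payoffs (illustrative numbers).
payoffs = {                     # (row player, column player)
    ("Mitigate", "Mitigate"): (3, 3),
    ("Mitigate", "Emit"):     (0, 4),
    ("Emit",     "Mitigate"): (4, 0),
    ("Emit",     "Emit"):     (1, 1),
}

strategies = ["Mitigate", "Emit"]
for r in strategies:
    for c in strategies:
        # Nash check: neither player gains by a unilateral deviation.
        row_ok = all(payoffs[(r, c)][0] >= payoffs[(r2, c)][0] for r2 in strategies)
        col_ok = all(payoffs[(r, c)][1] >= payoffs[(r, c2)][1] for c2 in strategies)
        if row_ok and col_ok:
            print("Nash equilibrium:", (r, c))   # prints ('Emit', 'Emit')
```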

Relevance:

100.00%

Publisher:

Abstract:

This article compares the conclusions of the models of Oliver Hart and his co-authors with Williamson's views on transaction costs. It shows that, on the question of firm versus market, the two schools use different tools but reason similarly. The article covers Williamson's criticism of Hart, namely that bargaining in Hart's models carries no transaction costs, as well as the criticism of that criticism. Hart's ideas are supported by the recently developed reference-point theory within the property-rights approach, which also offers experimental means of testing the various assumptions.

Relevance:

100.00%

Publisher:

Abstract:

Variation and uncertainty in estimated evaporation were determined over time and between two locations in Florida Bay, a subtropical estuary. Meteorological data were collected from September 2001 to August 2002 at Rabbit Key and Butternut Key within the Bay. Evaporation was estimated using both vapor flux and energy budget methods. The results were placed into a long-term context using 33 years of temperature and rainfall data collected in south Florida. Evaporation was also estimated from these long-term data using an empirical formula relating evaporation to clear-sky solar radiation and air temperature. Evaporation estimates for the 12-month period ranged from 144 to 175 cm yr⁻¹, depending on location and method, with an average of 163 cm yr⁻¹ (± 9%). Monthly values ranged from 9.2 to 18.5 cm, with the highest value observed in May, corresponding with the maximum in measured net radiation. Uncertainty estimates derived from measurement errors in the data were as much as 10%, large enough to obscure differences in evaporation between the two sites. Differences among all estimates for any month indicate the overall uncertainty in monthly evaporation, which ranged from 9% to 26%. Over the 33-year period (1970-2002), estimated annual evaporation from Florida Bay ranged from 148 to 181 cm yr⁻¹, with an average of 166 cm yr⁻¹. Rainfall was consistently lower than evaporation in Florida Bay, with a long-term average of 106 cm yr⁻¹. Rainfall considered alone was uncorrelated with evaporation at both monthly and annual time scales; when the seasonal variation in clear-sky radiation was also taken into account, both net radiation and evaporation were significantly suppressed in months with high rainfall.
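
As a rough plausibility check on monthly values of this magnitude (not the paper's empirical formula, which is not reproduced here), net radiation can be converted into an equivalent depth of evaporated water via the latent heat of vaporization:

```python
# Energy-limited upper bound on evaporation from net radiation.
LATENT_HEAT = 2.45e6        # J kg^-1, latent heat of vaporization of water
WATER_DENSITY = 1000.0      # kg m^-3

def evaporation_cm_per_month(net_radiation_w_m2, days=30):
    """Depth of water (cm/month) evaporable if all net radiation drives
    latent heat flux; an upper bound for an energy-limited estuary."""
    energy = net_radiation_w_m2 * 86400 * days          # J m^-2 per month
    depth_m = energy / (LATENT_HEAT * WATER_DENSITY)    # metres of water
    return depth_m * 100.0

print(evaporation_cm_per_month(150.0))  # ~15.9 cm, same order as the May peak
```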

Relevance:

100.00%

Publisher:

Abstract:

Laser trackers have been widely used in many industries to meet increasingly high accuracy requirements. In laser tracker measurement, performing an accurate error analysis and uncertainty evaluation is complex and difficult. This paper first reviews the working principle of single-beam laser trackers and the state of the art of the key technologies in both industry and academia, followed by a comprehensive analysis of uncertainty sources. A generic laser tracker modelling method is formulated and the framework of a virtual laser system (VLS) is proposed. The VLS can be used for measurement planning, measurement accuracy optimization, and uncertainty evaluation. The completed virtual laser tracking system should take into consideration all the uncertainty sources affecting coordinate measurement and establish an uncertainty model that behaves identically to the real system. © Springer-Verlag Berlin Heidelberg 2010.
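
One building block of such a virtual system can be sketched as a Monte Carlo propagation of assumed range and angle uncertainties through the spherical-to-Cartesian measurement equation; the nominal values and standard uncertainties below are illustrative, not the paper's tracker model.

```python
# Propagate assumed range/angle uncertainties of a single-beam tracker
# to a Cartesian coordinate uncertainty estimate via Monte Carlo.
import numpy as np

rng = np.random.default_rng(3)
r, az, el = 5.0, 0.4, 0.2               # nominal range (m) and angles (rad)
sigma_r, sigma_ang = 5e-6, 1e-5         # assumed standard uncertainties

n = 100_000
rs  = rng.normal(r,  sigma_r,   n)
azs = rng.normal(az, sigma_ang, n)
els = rng.normal(el, sigma_ang, n)

x = rs * np.cos(els) * np.cos(azs)      # spherical-to-Cartesian equation
y = rs * np.cos(els) * np.sin(azs)
z = rs * np.sin(els)

cov = np.cov(np.vstack([x, y, z]))      # 3x3 coordinate covariance
print(np.sqrt(np.diag(cov)))            # per-axis standard uncertainty (m)
```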