15 results for probabilistic rules
at University of Queensland eSpace - Australia
Abstract:
Action systems are a construct for reasoning about concurrent, reactive systems, in which concurrent behaviour is described by interleaving atomic actions. Sere and Troubitsyna have proposed an extension to action systems in which actions may be expressed and composed using discrete probabilistic choice as well as demonic nondeterministic choice. In this paper we develop a trace-based semantics for probabilistic action systems. This semantics provides a simple theoretical base on which practical refinement rules for probabilistic action systems may be justified.
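As a rough illustration of the two kinds of choice mentioned above, the following Python sketch interprets one interleaved atomic action that composes a discrete probabilistic choice with a demonic nondeterministic one. The toy state, the coin/increment actions, and the weights are invented for the example and are not taken from the paper; a real demonic choice is resolved adversarially, so the arbitrary pick below is only a stand-in for simulation.

# Illustrative only: toy interpreter contrasting discrete probabilistic choice
# with demonic nondeterministic choice in an interleaved atomic action.
import random

def prob_choice(alternatives):
    # Discrete probabilistic choice: pick an action according to fixed weights.
    actions, weights = zip(*alternatives)
    return random.choices(actions, weights=weights, k=1)[0]

def demonic_choice(alternatives):
    # Demonic nondeterminism: any enabled alternative may be taken; an
    # arbitrary pick stands in for the adversarial resolution here.
    return random.choice(alternatives)

def step(state):
    # One atomic action: a probabilistic coin update composed with a
    # demonically chosen increment.
    coin = prob_choice([("heads", 0.5), ("tails", 0.5)])
    inc = demonic_choice([1, 2])
    return {"coin": coin, "count": state["count"] + inc}

state = {"coin": None, "count": 0}
for _ in range(5):
    state = step(state)
print(state)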
Abstract:
Back and von Wright have developed algebraic laws for reasoning about loops in the refinement calculus. We extend their work to reasoning about probabilistic loops in the probabilistic refinement calculus. We apply our algebraic reasoning to derive transformation rules for probabilistic action systems, focusing in particular on data refinement rules. Our extension is interesting because some well-known transformation rules that are applicable to standard programs are not applicable to probabilistic ones: we identify some of these important differences and develop alternative rules where possible. In particular, our data refinement rules for probabilistic action systems are new.
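For orientation only, one classical (non-probabilistic) law of the kind being extended is the loop unfolding law of the standard refinement calculus, with g a guard and S a statement; this is quoted as background and is not one of the paper's probabilistic rules:

\[
\mathbf{while}\ g\ \mathbf{do}\ S\ \mathbf{od} \;=\; \mathbf{if}\ g\ \mathbf{then}\ (S\,;\,\mathbf{while}\ g\ \mathbf{do}\ S\ \mathbf{od})\ \mathbf{else}\ \mathbf{skip}.
\]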
Abstract:
Market-based transmission expansion planning gives investors information on where the most cost-efficient places to invest are, and brings benefits to those who invest in the grid. However, both market issues and power system adequacy problems are the system planners' concern. In this paper, a hybrid probabilistic criterion, the Expected Economical Loss (EEL), is proposed as an index to evaluate a system's overall expected economical losses during operation in a competitive market. It reflects both the investors' and the planners' points of view and improves on the traditional reliability cost. By applying EEL, system planners can obtain a clear idea of the transmission network's bottlenecks and of the amount of losses arising from these weak points, and can consequently assess the worth of providing reliable services. The EEL also contains valuable information for investors to guide their investment decisions. This index can truly reflect the random behaviors of power systems and the uncertainties arising from the electricity market. The performance of the EEL index is enhanced by applying a Normalized Coefficient of Probability (NCP), so that it can be utilized in large real power systems. A numerical example is carried out on the IEEE Reliability Test System (RTS), showing how the EEL can predict the current system bottleneck under future operational conditions and how EEL can be used as one of the planning objectives to determine future optimal plans. Monte Carlo simulation, a well-known simulation method, is employed to capture the probabilistic characteristics of the electricity market, and Genetic Algorithms (GAs) are used as a multi-objective optimization tool.
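The abstract does not give the EEL or NCP formulas themselves, so the Python sketch below only shows the general shape of a Monte Carlo estimate of an expected economical loss over sampled system states. The availability model, transfer limits, and energy price are assumptions made purely for illustration.

# Illustrative only: generic Monte Carlo estimate of an expected economical
# loss; the sampling model and loss function are assumed, not the paper's EEL.
import random

def sample_system_state():
    # Assumed toy model: each of three lines is available with probability
    # 0.98, and hourly load is drawn uniformly from 80-120 MW.
    lines_up = [random.random() < 0.98 for _ in range(3)]
    load_mw = random.uniform(80.0, 120.0)
    return lines_up, load_mw

def economical_loss(lines_up, load_mw, price_per_mwh=50.0):
    # Assumed loss: 50 MW of transfer capability per available line; any load
    # that cannot be served is valued at an illustrative energy price.
    capacity_mw = 50.0 * sum(lines_up)
    curtailed_mw = max(0.0, load_mw - capacity_mw)
    return curtailed_mw * price_per_mwh

def expected_economical_loss(n_samples=100_000):
    total = 0.0
    for _ in range(n_samples):
        lines_up, load_mw = sample_system_state()
        total += economical_loss(lines_up, load_mw)
    return total / n_samples

print(f"Estimated expected economical loss: {expected_economical_loss():.2f} $/h")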
Abstract:
The graded-fermion algebra and quasispin formalism are introduced and applied to obtain the gl(m|n) ↓ osp(m|n) branching rules for the two-column tensor irreducible representations of gl(m|n), for the case m ≤ n (n > 2). In the case m < n, all such irreducible representations of gl(m|n) are shown to be completely reducible as representations of osp(m|n). This is also shown to be true for the case m = n, except for the spin-singlet representations, which contain an indecomposable representation of osp(m|n) with composition length 3. These branching rules are given in fully explicit form. (C) 1999 American Institute of Physics. [S0022-2488(99)04410-2].
Abstract:
The study investigated the social rules applicable to selection interviews, and the attributions made by interviewers in response to rule-breaking behaviours by candidates. Sixty personnel specialists (31 males and 29 females) participated in the main study, which examined their perceptions of social rules and attributions about rule breaking in their work experience. They listened to audiotapes of actual selection interviews, and made judgments about hireability, communication competence, and specific social rules. Results indicated that interview rules could be categorized into two groups: specific interview presentation skills and general interpersonal competence. While situational attributions were more salient in explaining the breaking of general interpersonal competence rules, internal attributions (ability, effort) were more salient explanations for the breaking of more specific interview rules (with the exception of the preparation rule, where lack of effort was the most likely explanation for rule breaking). Candidates previously judged as competent communicators were rated more favourably on both global and specific measures of rule-following competence, as well as on hireability. The theoretical and practical implications of combining social rules and attribution theory in the study of selection interviews are discussed.
Abstract:
The present paper addresses two major concerns that were identified when developing neural network based prediction models and which can limit their wider applicability in the industry. The first problem is that neural network models do not appear to be readily available to a corrosion engineer. Therefore the first part of this paper describes a neural network model of CO2 corrosion which was created using a standard commercial software package and simple modelling strategies. It was found that such a model was able to capture practically all of the trends noticed in the experimental data with acceptable accuracy. This exercise has proven that a corrosion engineer could readily develop a neural network model such as the one described here for any problem at hand, given that sufficient experimental data exist. This applies even in cases where the understanding of the underlying processes is poor. The second problem arises in cases where not all the required inputs for a model are known, or where they can be estimated only with a limited degree of accuracy. In such situations it is advantageous to have models that can take a range rather than a single value as input. One such model, based on the so-called Monte Carlo approach, is presented. A number of comparisons are shown which illustrate how a corrosion engineer might use this approach to rapidly test the sensitivity of a model to the uncertainties associated with the input parameters. (C) 2001 Elsevier Science Ltd. All rights reserved.
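A minimal Python sketch of the range-based Monte Carlo idea described above: input parameters are sampled from assumed ranges and pushed through a prediction model to obtain a distribution of outputs. The corrosion_rate function and all ranges below are placeholders invented for the sketch, not the paper's trained neural network or data.

# Illustrative only: propagating input ranges through a prediction model by
# Monte Carlo sampling, as a sensitivity check when inputs are uncertain.
import random
import statistics

def corrosion_rate(temperature_c, ph, co2_partial_pressure_bar):
    # Placeholder model standing in for a trained neural network predictor.
    return 0.1 * co2_partial_pressure_bar * max(0.0, temperature_c - 20.0) / ph

def monte_carlo(n=10_000):
    rates = []
    for _ in range(n):
        # Inputs given as ranges rather than single values (assumed bounds).
        t = random.uniform(40.0, 60.0)      # temperature, deg C
        ph = random.uniform(4.5, 5.5)       # pH
        p_co2 = random.uniform(0.5, 2.0)    # CO2 partial pressure, bar
        rates.append(corrosion_rate(t, ph, p_co2))
    return rates

rates = monte_carlo()
print(f"mean = {statistics.mean(rates):.3f} mm/y, "
      f"stdev = {statistics.stdev(rates):.3f} mm/y")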
Abstract:
In the treatment of atherosclerotic disease, stenting in the presence of a glycoprotein (GP) IIb/IIIa antagonist is becoming an increasingly common procedure. The ‘Do Tirofiban and ReoPro Give Similar Efficacy Trial’ (TARGET) was designed to determine whether the cheaper tirofiban was as effective and safe as abciximab in the prevention of ischaemic events with stenting. Unexpectedly, abciximab was shown to be superior to tirofiban. Tirofiban is a selective GP IIb/IIIa antagonist whereas abciximab has additional anti-inflammatory actions, which may contribute to its superiority.
Abstract:
The purpose of this study was threefold: first, the study was designed to illustrate the use of data and information collected in food safety surveys in a quantitative risk assessment. In this case, the focus was on the food service industry; however, similar data from other parts of the food chain could be similarly incorporated. The second objective was to quantitatively describe and better understand the role that the food service industry plays in the safety of food. The third objective was to illustrate the additional decision-making information that is available when uncertainty and variability are incorporated into the modelling of systems. (C) 2002 Elsevier Science B.V. All rights reserved.
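One common way of keeping uncertainty and variability separate in a quantitative risk model is a two-dimensional (nested) Monte Carlo simulation. The Python sketch below shows the general shape of such a calculation using an invented exponential dose-response parameter and invented dose distributions; it is not the model used in the study.

# Illustrative only: two-dimensional Monte Carlo that keeps uncertainty (in a
# dose-response parameter) separate from variability (between servings).
import math
import random

def simulate_risk(n_uncertainty=200, n_variability=2000):
    mean_risks = []
    for _ in range(n_uncertainty):
        # Outer loop: uncertainty about the dose-response parameter r.
        r = random.uniform(1e-4, 1e-3)
        total = 0.0
        for _ in range(n_variability):
            # Inner loop: serving-to-serving variability in ingested dose.
            dose = random.lognormvariate(2.0, 1.0)
            total += 1.0 - math.exp(-r * dose)   # exponential dose-response
        mean_risks.append(total / n_variability)
    mean_risks.sort()
    return mean_risks

risks = simulate_risk()
print(f"median risk per serving: {risks[len(risks) // 2]:.2e}")
print(f"95th percentile over uncertainty: {risks[int(0.95 * len(risks))]:.2e}")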
Abstract:
The present study utilized a social rules approach to investigate the relative influence of gender and status on managers' self-evaluations of their effectiveness in handling a dominating subordinate. In the first study, 84 White middle-class participants identified the prescriptive and proscriptive rules for socially appropriate responding to a stimulus situation involving a pushy subordinate. Four rule sets were identified, for female and male managers and subordinates respectively. The rule sets shared a number of common rules and showed some variation according to gender roles. In the second study, 91 White middle-class participants rated the individual rules for importance and also rated their personal and managerial effectiveness when responding to the stimulus situation using gender- and status-consistent and gender- and status-inconsistent response strategies. Both men and women rated the female gender- and status-consistent strategy as most effective, and rated the status-inconsistent strategy as less effective than a gender-inconsistent response. Results were interpreted as providing more support for a situational, gender-related theory of workplace behavior than for a traditional gender role perspective.
Abstract:
Regional commodity forecasts are being used increasingly in agricultural industries to enhance their risk management and decision-making processes. These commodity forecasts are probabilistic in nature and are often integrated with a seasonal climate forecast system. The climate forecast system is based on a subset of analogue years drawn from the full climatological distribution. In this study we sought to measure forecast quality for such an integrated system. We investigated the quality of a commodity (i.e. wheat and sugar) forecast based on a subset of analogue years in relation to a standard reference forecast based on the full climatological set. We derived three key dimensions of forecast quality for such probabilistic forecasts: reliability, distribution shift, and change in dispersion. A measure of reliability was required to ensure no bias in the forecast distribution. This was assessed via the slope of the reliability plot, which was derived from examination of probability levels of forecasts and associated frequencies of realizations. The other two dimensions related to changes in features of the forecast distribution relative to the reference distribution. The relationship of 13 published accuracy/skill measures to these dimensions of forecast quality was assessed using principal component analysis in case studies of commodity forecasting using seasonal climate forecasting for the wheat and sugar industries in Australia. There were two orthogonal dimensions of forecast quality: one associated with distribution shift relative to the reference distribution and the other associated with relative distribution dispersion. Although the conventional quality measures aligned with these dimensions, none measured both adequately. We conclude that a multi-dimensional approach to assessment of forecast quality is required and that simple measures of reliability, distribution shift, and change in dispersion provide a means for such assessment. The analysis presented was also relevant to measuring quality of probabilistic seasonal climate forecasting systems. The importance of retaining a focus on the probabilistic nature of the forecast and avoiding simplifying, but erroneous, distortions was discussed in relation to applying this new forecast quality assessment paradigm to seasonal climate forecasts. Copyright (C) 2003 Royal Meteorological Society.
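As an illustration of the reliability dimension described above, the following Python sketch estimates the slope of a reliability plot from a set of ensemble (analogue-year style) forecasts and matching realizations. The data are synthetic and the particular quantile and least-squares choices are assumptions made for the example, not the paper's exact procedure; a slope near 1 indicates a reliable forecast.

# Illustrative only: slope of a reliability plot for probabilistic forecasts,
# each forecast given as an ensemble of analogue values.
import random

def reliability_slope(forecast_ensembles, observations, levels=None):
    levels = levels or [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
    observed_freqs = []
    for p in levels:
        hits = 0
        for ensemble, obs in zip(forecast_ensembles, observations):
            q = sorted(ensemble)[int(p * (len(ensemble) - 1))]  # p-quantile
            if obs <= q:
                hits += 1
        observed_freqs.append(hits / len(observations))
    # Least-squares slope of observed frequency against forecast probability.
    n = len(levels)
    mean_p = sum(levels) / n
    mean_f = sum(observed_freqs) / n
    num = sum((p - mean_p) * (f - mean_f) for p, f in zip(levels, observed_freqs))
    den = sum((p - mean_p) ** 2 for p in levels)
    return num / den

# Synthetic check: forecasts drawn from the same distribution as the outcomes
# should give a slope close to 1.
ensembles = [[random.gauss(0, 1) for _ in range(30)] for _ in range(500)]
outcomes = [random.gauss(0, 1) for _ in range(500)]
print(f"reliability slope: {reliability_slope(ensembles, outcomes):.2f}")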