938 results for acceptance probability
Abstract:
Market-based transmission expansion planning informs investors of the most cost-efficient places to invest and brings benefits to those who invest in the grid. However, both market issues and power system adequacy problems are system planners' concerns. In this paper, a hybrid probabilistic criterion, the Expected Economical Loss (EEL), is proposed as an index to evaluate a system's overall expected economic losses during operation in a competitive market. It reflects both the investors' and the planner's points of view and further improves on the traditional reliability cost. By applying the EEL, system planners can obtain a clear idea of the transmission network's bottlenecks and the amount of loss arising from these weak points. Subsequently, it enables planners to assess the worth of providing reliable services. The EEL also contains valuable information for investors to guide their investment decisions. The index truly reflects the random behavior of power systems and the uncertainties of the electricity market. The performance of the EEL index is enhanced by applying a Normalized Coefficient of Probability (NCP), so that it can be applied to large real power systems. A numerical example on the IEEE Reliability Test System (RTS) shows how the EEL can predict the current system bottleneck under future operational conditions and how the EEL can be used as one of the planning objectives to determine future optimal plans. Monte Carlo simulation, a well-known simulation method, is employed to capture the probabilistic characteristics of the electricity market, and Genetic Algorithms (GAs) are used as the multi-objective optimization tool.
Abstract:
We analyze the quantum dynamics of radiation propagating in a single-mode optical fiber with dispersion, nonlinearity, and Raman coupling to thermal phonons. We start from a fundamental Hamiltonian that includes the principal known nonlinear effects and quantum-noise sources, including linear gain and loss. Both Markovian and frequency-dependent, non-Markovian reservoirs are treated. This treatment allows quantum Langevin equations, which have a classical form except for additional quantum-noise terms, to be derived. In practical calculations, it is more useful to transform to Wigner or +P quasi-probability operator representations. These transformations result in stochastic equations that can be analyzed by use of perturbation theory or exact numerical techniques. The results have applications to fiber-optics communications, networking, and sensor technology.
Abstract:
Argumentation is modelled as a game where the payoffs are measured in terms of the probability that the claimed conclusion is, or is not, defeasibly provable, given a history of arguments that have actually been exchanged, and given the probability of the factual premises. The probability of a conclusion is calculated using a standard variant of Defeasible Logic, in combination with standard probability calculus. A new element of the present approach is that the exchange of arguments is analysed with game-theoretical tools, yielding a prescriptive and, to some extent, even predictive account of the actual course of play. A brief comparison with existing argument-based dialogue approaches confirms that such a prescriptive account of actual argumentation has been largely absent from the approaches proposed so far.
Abstract:
The received view of an ad hoc hypothesis is that it accounts for only the observation(s) it was designed to account for, and so non-adhocness is generally held to be necessary or important for an introduced hypothesis or modification to a theory. Attempts by Popper and several others to convincingly explicate this view, however, prove to be unsuccessful or of doubtful value, and familiar and firmer criteria for evaluating the hypotheses or modified theories so classified are characteristically available. These points are obscured largely because the received view fails to adequately separate psychology from methodology or to recognise ambiguities in the use of 'ad hoc'.
Abstract:
Quasi-birth-and-death (QBD) processes with infinite “phase spaces” can exhibit unusual and interesting behavior. One of the simplest examples of such a process is the two-node tandem Jackson network, with the “phase” giving the state of the first queue and the “level” giving the state of the second queue. In this paper, we undertake an extensive analysis of the properties of this QBD. In particular, we investigate the spectral properties of Neuts’s R-matrix and show that the decay rate of the stationary distribution of the “level” process is not always equal to the convergence norm of R. In fact, we show that we can obtain any decay rate from a certain range by controlling only the transition structure at level zero, which is independent of R. We also consider the sequence of tandem queues that is constructed by restricting the waiting room of the first queue to some finite capacity, and then allowing this capacity to increase to infinity. We show that the decay rates for the finite truncations converge to a value, which is not necessarily the decay rate in the infinite waiting room case. Finally, we show that the probability that the process hits level n before level 0 given that it starts in level 1 decays at a rate which is not necessarily the same as the decay rate for the stationary distribution.
Abstract:
Extracting human postural information from video sequences has proved a difficult research question. The most successful approaches to date have been based on particle filtering, whereby the underlying probability distribution is approximated by a set of particles. The shape of the underlying observational probability distribution plays a significant role in determining the success, in both accuracy and efficiency, of any visual tracker. In this paper we compare the approaches used by other authors and present a cost-path approach that is commonly used in image segmentation problems but is currently not widely used in tracking applications.
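The particle-filtering idea described in this abstract can be sketched as a minimal bootstrap filter on a one-dimensional toy tracking problem. The random-walk motion model, Gaussian observation model, noise scales, and function name below are illustrative assumptions, not the authors' posture tracker:

```python
import math
import random

def bootstrap_particle_filter(observations, n_particles=500, seed=0):
    # Minimal bootstrap particle filter for a 1-D random-walk state
    # observed with unit-variance Gaussian noise (toy model).
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in observations:
        # Predict: propagate each particle through the motion model.
        particles = [x + rng.gauss(0.0, 0.5) for x in particles]
        # Update: weight each particle by the observation likelihood N(y; x, 1).
        weights = [math.exp(-0.5 * (y - x) ** 2) for x in particles]
        total = sum(weights)
        probs = [w / total for w in weights]
        # Resample: draw a new particle set proportional to the weights.
        particles = rng.choices(particles, weights=probs, k=n_particles)
        estimates.append(sum(particles) / n_particles)
    return estimates
```

The observation-likelihood step is where the "shape of the observational probability distribution" discussed in the abstract enters: replacing the Gaussian with a differently shaped likelihood changes both the accuracy and the efficiency of the tracker.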
Abstract:
Nearest–neighbour balance is considered a desirable property for an experiment to possess in situations where experimental units are influenced by their neighbours. This paper introduces a measure of the degree of nearest–neighbour balance of a design. The measure is used in an algorithm which generates nearest–neighbour balanced designs and is readily modified to obtain designs with various types of nearest–neighbour balance. Nearest–neighbour balanced designs are produced for a wide class of parameter settings, and in particular for those settings for which such designs cannot be found by existing direct combinatorial methods. In addition, designs with unequal row and column sizes, and designs with border plots are constructed using the approach presented here.
Abstract:
In this paper, we propose a fast adaptive importance sampling method for the efficient simulation of buffer overflow probabilities in queueing networks. The method comprises three stages. First, we estimate the minimum cross-entropy tilting parameter for a small buffer level; next, we use this as a starting value for the estimation of the optimal tilting parameter for the actual (large) buffer level. Finally, the tilting parameter just found is used to estimate the overflow probability of interest. We study various properties of the method in more detail for the M/M/1 queue and conjecture that similar properties also hold for quite general queueing networks. Numerical results support this conjecture and demonstrate the high efficiency of the proposed algorithm.
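The final stage of the method sketched in this abstract, estimating the overflow probability with a fixed tilting parameter, can be illustrated on the embedded random walk of the M/M/1 queue. The sketch below assumes the classical exchanged-rates exponential tilting as the change of measure, rather than the authors' adaptive cross-entropy stages, and the function name is illustrative:

```python
import random

def mm1_overflow_is(p, B, n_samples, seed=0):
    # Importance-sampling estimate of P(queue hits level B before 0,
    # starting from level 1) for the embedded M/M/1 random walk with
    # up-step probability p (arrival) and down-step probability q = 1 - p.
    # Simulation uses the exchanged-rates tilting (up-prob q, down-prob p);
    # each step multiplies the likelihood ratio accordingly.
    rng = random.Random(seed)
    q = 1.0 - p
    total = 0.0
    for _ in range(n_samples):
        level, weight = 1, 1.0
        while 0 < level < B:
            if rng.random() < q:      # tilted up-step
                level += 1
                weight *= p / q       # likelihood ratio of an up-step
            else:                     # tilted down-step
                level -= 1
                weight *= q / p       # likelihood ratio of a down-step
        if level == B:
            total += weight           # paths absorbed at 0 contribute 0
    return total / n_samples
```

For this walk the exact probability is (1 - r) / (1 - r**B) with r = q/p, which makes the estimator easy to check; under the exchanged-rates tilting the likelihood ratio of every successful path is the same constant, which is why the estimator has very low variance.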
Abstract:
We consider a branching model, which we call the collision branching process (CBP), that accounts for the effect of collisions, or interactions, between particles or individuals. We establish that there is a unique CBP, and derive necessary and sufficient conditions for it to be nonexplosive. We review results on extinction probabilities, and obtain explicit expressions for the probability of explosion and the expected hitting times. The upwardly skip-free case is studied in some detail.
Abstract:
Bellerophon is a program for detecting chimeric sequences in multiple-sequence datasets by an adaptation of partial treeing analysis. It was specifically developed to detect 16S rRNA gene chimeras in PCR clone libraries of environmental samples but can be applied to other nucleotide sequence alignments.
Abstract:
Data mining is the process of identifying valid, implicit, previously unknown, potentially useful and understandable information from large databases. It is an important step in the process of knowledge discovery in databases (Olaru & Wehenkel, 1999). In a data mining process, input data can be structured, semi-structured, or unstructured, and can take text, categorical or numerical values. One of the important characteristics of data mining is its ability to handle data that are large in volume, distributed, time-variant, noisy, and high-dimensional. A large number of data mining algorithms have been developed for different applications. For example, association rule mining can be useful for market basket problems, clustering algorithms can be used to discover trends in unsupervised learning problems, classification algorithms can be applied in decision-making problems, and sequential and time series mining algorithms can be used in predicting events, fault detection, and other supervised learning problems (Vapnik, 1999). Classification is among the most important tasks in data mining, particularly for data mining applications in engineering fields. Together with regression, classification is mainly used for predictive modelling. A number of classification algorithms are now in practical use. According to Sebastiani (2002), the main classification algorithms can be categorized as: decision tree and rule based approaches such as C4.5 (Quinlan, 1996); probability methods such as the Bayesian classifier (Lewis, 1998); on-line methods such as Winnow (Littlestone, 1988) and CVFDT (Hulten, 2001); neural network methods (Rumelhart, Hinton & Williams, 1986); example-based methods such as k-nearest neighbours (Duda & Hart, 1973); and SVMs (Cortes & Vapnik, 1995). Other important techniques for classification tasks include Associative Classification (Liu et al., 1998) and Ensemble Classification (Tumer, 1996).
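As an illustration of the example-based methods this abstract cites (Duda & Hart, 1973), a minimal k-nearest-neighbour classifier using Euclidean distance and majority vote might look as follows; the function name and data layout are illustrative assumptions:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    # train: list of (feature_vector, label) pairs; query: feature vector.
    # Classify the query by majority vote among the k training points
    # closest to it in Euclidean distance.
    nearest = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]
```

A usage sketch: with three labelled points clustered near the origin and three near (5, 5), a query at (0.2, 0.2) is voted into the first cluster's class and one at (5.2, 5.2) into the second's.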
Abstract:
There are many techniques for electricity market price forecasting. However, most of them are designed for expected-price analysis rather than price spike forecasting, and an effective method of predicting the occurrence of spikes has not yet appeared in the literature. In this paper, a data mining based approach is presented to give a reliable forecast of the occurrence of price spikes. Combined with the spike value prediction techniques developed by the same authors, the proposed approach aims at providing a comprehensive tool for price spike forecasting. Feature selection techniques are first described to identify the attributes relevant to the occurrence of spikes. A brief introduction to the classification techniques is given for completeness. Two algorithms, a support vector machine and a probability classifier, are chosen as the spike occurrence predictors and are discussed in detail. Realistic market data are used to test the proposed model, with promising results.
Abstract:
Australia is an increasingly important ally for the United States. It is willing to be part of challenging global missions, and its strong economy and growing self-confidence suggest a more prominent role in both global and regional affairs. Moreover, its government has worked hard to strengthen the link between Canberra and Washington. Political and strategic affinities between the two countries have been reflected in--and complemented by--practiced military interoperability, as the two allies have sustained a pattern of security cooperation in relation to East Timor, Afghanistan and Iraq over the last four years. This growing collaboration between the two countries suggests that a reinvention of the traditional bilateral security relationship is taking place. At the core of this process lies an agreement about the need for engaging in more proactive strategic behavior in the changing global security environment, and a mutual acceptance of looming military and technological interdependence. But this new alliance relationship is already testing the boundaries of bipartisan support for security policy within Australia. Issues of strategic doctrine, defense planning, and procurement are becoming topics of fierce policy debate. Such discussion is likely to be sharpened in the years ahead as Australia's security relationship with the United States settles into a new framework.
Abstract:
This paper reviews the ecological status of the mahogany glider and describes its distribution, habitat and abundance, life history, and the threats to it. Three serial surveys of Brisbane residents provide data on respondents' knowledge of the mahogany glider. The results provide information about respondents' attitudes to the mahogany glider, to its conservation and to relevant public policies, and about how these attitudes vary as participants' knowledge of the mahogany glider alters. Similarly, data are provided and analysed on respondents' willingness to pay to conserve the mahogany glider. Population viability analysis is applied to estimate the habitat area required for a minimum viable population of the mahogany glider, one ensuring at least a 95% probability of its survival for 100 years. Places are identified in Queensland where the requisite minimum area of critical habitat can be conserved. Using the survey results as a basis, the likely willingness of groups of Australians to pay for the conservation of the mahogany glider is estimated, and consequently their willingness to pay for the minimum required area of its habitat. Methods for estimating the cost of protecting this habitat are outlined. Australia-wide benefits seem to exceed the costs. Establishing a national park containing a minimum viable population of the mahogany glider is an appealing management option, and would also be beneficial in conserving other endangered wildlife species. Therefore, economic benefits additional to those estimated for the mahogany glider itself can be obtained.
Abstract:
Polytomous Item Response Theory Models provides a unified, comprehensive introduction to the range of polytomous models available within item response theory (IRT). It begins by outlining the primary structural distinction between the two major types of polytomous IRT models. This focuses on the two types of response probability that are unique to polytomous models and their associated response functions, which are modeled differently by the different types of IRT model. It describes, both conceptually and mathematically, the major specific polytomous models, including the Nominal Response Model, the Partial Credit Model, the Rating Scale Model, and the Graded Response Model. Important variations, such as the Generalized Partial Credit Model, are also described, as are less common variations, such as the Rating Scale version of the Graded Response Model. Relationships among the models are also investigated, and the operation of measurement information is described for each major model. Practical examples of the major models using real data are provided, as is a chapter on choosing an appropriate model. Figures are used throughout to illustrate important elements as they are described.
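The cumulative-boundary structure of the Graded Response Model mentioned in this abstract can be sketched in a few lines: each boundary probability P(X >= k) is a logistic function of ability, and category probabilities are differences of adjacent boundaries. The single-discrimination logistic form below is the standard formulation; the function name and parameter layout are illustrative:

```python
import math

def grm_category_probs(theta, a, b):
    # Graded Response Model category probabilities for one item.
    # theta: ability; a: discrimination; b: increasing list of K-1
    # category thresholds. Boundary curves P(X >= k) = logistic(a*(theta - b[k-1]));
    # the probability of scoring exactly k is the difference of
    # adjacent boundary curves.
    def logistic(x):
        return 1.0 / (1.0 + math.exp(-x))
    # Boundaries padded with P(X >= 0) = 1 and P(X >= K) = 0.
    star = [1.0] + [logistic(a * (theta - bk)) for bk in b] + [0.0]
    return [star[k] - star[k + 1] for k in range(len(b) + 1)]
```

Because the thresholds are ordered, the boundary curves never cross, so the category probabilities are nonnegative and sum to one at every ability level.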