49 results for Probabilistic metrics

in University of Queensland eSpace - Australia


Relevance: 20.00%

Publisher:

Abstract:

Market-based transmission expansion planning gives investors information on where the most cost-efficient place to invest is and brings benefits to those who invest in the grid. However, both market issues and power system adequacy problems are the system planner's concern. In this paper, a hybrid probabilistic criterion, the Expected Economical Loss (EEL), is proposed as an index to evaluate a system's overall expected economic losses during operation in a competitive market. It takes both the investor's and the planner's point of view and further improves on the traditional reliability cost. By applying EEL, system planners can obtain a clear idea of the transmission network's bottleneck and the amount of loss arising from this weak point. Subsequently, it enables planners to assess the worth of providing reliable services. The EEL also contains valuable information for investors to guide their investment decisions. This index can truly reflect the random behaviour of power systems and the uncertainties of the electricity market. The performance of the EEL index is enhanced by applying a Normalized Coefficient of Probability (NCP), so it can be utilized in large real power systems. A numerical example is carried out on the IEEE Reliability Test System (RTS), which shows how the EEL can predict the current system bottleneck under future operational conditions and how EEL can be used as one of the planning objectives to determine future optimal plans. A well-known simulation method, Monte Carlo simulation, is employed to capture the probabilistic characteristics of the electricity market, and Genetic Algorithms (GAs) are used as a multi-objective optimization tool.
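A minimal sketch of how a Monte Carlo estimate of an expected-economical-loss style index can be formed, assuming a single corridor with an invented capacity, outage probability, and curtailment cost; none of the numbers or the cost function come from the paper.

```python
import random

# Illustrative assumptions: one transmission corridor with a capacity limit,
# a random load level, and a random outage state. These numbers only show
# the Monte Carlo structure, not the paper's actual system data.
CAPACITY_MW = 400.0          # assumed corridor limit
OUTAGE_PROB = 0.02           # assumed probability the corridor is lost
LOSS_COST_PER_MWH = 1000.0   # assumed value of curtailed energy ($/MWh)

def sample_economic_loss():
    """One Monte Carlo draw of the hourly economic loss from curtailment."""
    load = random.gauss(350.0, 60.0)                 # uncertain demand (MW)
    available = 0.0 if random.random() < OUTAGE_PROB else CAPACITY_MW
    curtailed = max(load - available, 0.0)           # demand that cannot be served
    return curtailed * LOSS_COST_PER_MWH             # $ lost in this hour

def expected_economic_loss(n_samples=100_000):
    """Sample mean approximates the expectation over load and outage states."""
    return sum(sample_economic_loss() for _ in range(n_samples)) / n_samples

if __name__ == "__main__":
    print(f"Estimated EEL-style index: ${expected_economic_loss():,.0f}/h")
```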

Relevance: 20.00%

Publisher:

Abstract:

Action systems are a construct for reasoning about concurrent, reactive systems, in which concurrent behaviour is described by interleaving atomic actions. Sere and Troubitsyna have proposed an extension to action systems in which actions may be expressed and composed using discrete probabilistic choice as well as demonic nondeterministic choice. In this paper we develop a trace-based semantics for probabilistic action systems. This semantics provides a simple theoretical base on which practical refinement rules for probabilistic action systems may be justified.

Relevance: 20.00%

Publisher:

Abstract:

We present a review of perceptual image quality metrics and their application to still image compression. The review describes how image quality metrics can be used to guide an image compression scheme and outlines the advantages, disadvantages and limitations of a number of quality metrics. We examine a broad range of metrics ranging from simple mathematical measures to those which incorporate full perceptual models. We highlight some variation in the models for luminance adaptation and the contrast sensitivity function and discuss what appears to be a lack of a general consensus regarding the models which best describe contrast masking and error summation. We identify how the various perceptual components have been incorporated in quality metrics, and identify a number of psychophysical testing techniques that can be used to validate the metrics. We conclude by illustrating some of the issues discussed throughout the paper with a simple demonstration. (C) 1998 Elsevier Science B.V. All rights reserved.
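As an illustration of the "simple mathematical measures" end of the range reviewed above, the sketch below computes MSE and PSNR for a toy image pair; full perceptual metrics replace this raw pixel error with contrast-weighted, masked error terms.

```python
import numpy as np

def mse(reference, distorted):
    """Mean squared error between two images with pixel values in [0, 255]."""
    ref = np.asarray(reference, dtype=np.float64)
    dst = np.asarray(distorted, dtype=np.float64)
    return np.mean((ref - dst) ** 2)

def psnr(reference, distorted, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means less distortion."""
    err = mse(reference, distorted)
    return float("inf") if err == 0 else 10.0 * np.log10(peak ** 2 / err)

# Toy example: a flat grey image corrupted with mild Gaussian noise.
rng = np.random.default_rng(0)
original = np.full((64, 64), 128.0)
noisy = np.clip(original + rng.normal(0, 5, original.shape), 0, 255)
print(f"PSNR of noisy image: {psnr(original, noisy):.1f} dB")
```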

Relevance: 20.00%

Publisher:

Abstract:

An assessment of the changes in the distribution and extent of mangroves within Moreton Bay, southeast Queensland, Australia, was carried out. Two assessment methods were evaluated: spatial and temporal pattern metrics analysis, and change detection analysis. Currently, about 15,000 ha of mangroves are present in Moreton Bay. These mangroves are important ecosystems, but are subject to disturbance from a number of sources. Over the past 25 years, there has been a loss of more than 3800 ha, as a result of natural losses and mangrove clearing (e.g. for urban and industrial development, agriculture and aquaculture). However, areas of new mangroves have become established over the same time period, offsetting these losses to create a net loss of about 200 ha. These new mangroves have mainly appeared in the southern bay region and the bay islands, particularly on the landward edge of existing mangroves. In addition, spatial patterns and species composition of mangrove patches have changed. The pattern metrics analysis provided an overview of mangrove distribution and change in the form of single metric values, while the change detection analysis gave a more detailed and spatially explicit description of change. An analysis of the effects of spatial scales on the pattern metrics indicated that they were relatively insensitive to scale at spatial resolutions less than 50 m, but that most metrics became sensitive at coarser resolutions, a finding which has implications for mapping of mangroves based on remotely sensed data. (C) 2003 Elsevier Science B.V. All rights reserved.
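A rough sketch of two of the simplest pattern metrics (patch count and mean patch size) computed on an invented binary mangrove raster, assuming SciPy is available; the paper's actual metric set and Moreton Bay data are not reproduced here.

```python
import numpy as np
from scipy import ndimage

def patch_metrics(binary_map, cell_area_ha=1.0):
    """Number of patches and mean patch size for a binary mangrove raster."""
    labelled, n_patches = ndimage.label(binary_map)          # connected patches
    sizes = ndimage.sum(binary_map, labelled, index=range(1, n_patches + 1))
    return n_patches, float(np.mean(sizes)) * cell_area_ha

# Invented 5 x 5 map (1 = mangrove); only demonstrates the computation.
mangrove = np.array([
    [1, 1, 0, 0, 0],
    [1, 0, 0, 1, 1],
    [0, 0, 0, 1, 0],
    [0, 1, 0, 0, 0],
    [0, 1, 1, 0, 0],
])
print(patch_metrics(mangrove, cell_area_ha=0.25))
```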

Relevance: 20.00%

Publisher:

Abstract:

The present paper addresses two major concerns that were identified when developing neural network based prediction models and which can limit their wider applicability in the industry. The first problem is that neural network models do not appear to be readily available to a corrosion engineer. Therefore the first part of this paper describes a neural network model of CO2 corrosion which was created using a standard commercial software package and simple modelling strategies. It was found that such a model was able to capture practically all of the trends noticed in the experimental data with acceptable accuracy. This exercise has proven that a corrosion engineer could readily develop a neural network model such as the one described below for any problem at hand, given that sufficient experimental data exist. This applies even in cases when the understanding of the underlying processes is poor. The second problem arises in cases when all the required inputs for a model are not known or can be estimated only with a limited degree of accuracy. It seems advantageous to have models that can take a range rather than a single value as input. One such model, based on the so-called Monte Carlo approach, is presented. A number of comparisons are shown which illustrate how a corrosion engineer might use this approach to rapidly test the sensitivity of a model to the uncertainties associated with the input parameters. (C) 2001 Elsevier Science Ltd. All rights reserved.
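A hedged sketch of the range-as-input idea: uncertain inputs are sampled from assumed uniform ranges and propagated through a placeholder response function standing in for the trained neural network; the function, ranges, and units are illustrative only.

```python
import random
import statistics

def corrosion_rate(temperature_c, ph, co2_partial_pressure_bar):
    """Placeholder response surface standing in for the trained neural network.
    The real model in the paper is a fitted network, not this formula."""
    return max(0.0, 0.1 * temperature_c - 2.0 * (ph - 4.0)
               + 1.5 * co2_partial_pressure_bar)

def monte_carlo_rate(n=20_000):
    """Propagate input ranges (uniform assumptions here) through the model."""
    samples = []
    for _ in range(n):
        t = random.uniform(40.0, 60.0)     # temperature known only as a range
        ph = random.uniform(4.5, 5.5)      # pH known only as a range
        pco2 = random.uniform(0.5, 2.0)    # CO2 partial pressure range
        samples.append(corrosion_rate(t, ph, pco2))
    return statistics.mean(samples), statistics.stdev(samples)

mean_rate, spread = monte_carlo_rate()
print(f"Predicted rate: {mean_rate:.2f} mm/y (spread {spread:.2f} mm/y)")
```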

Relevance: 20.00%

Publisher:

Abstract:

The purpose of this study was threefold: first, the study was designed to illustrate the use of data and information collected in food safety surveys in a quantitative risk assessment. In this case, the focus was on the food service industry; however, similar data from other parts of the food chain could be similarly incorporated. The second objective was to quantitatively describe and better understand the role that the food service industry plays in the safety of food. The third objective was to illustrate the additional decision-making information that is available when uncertainty and variability are incorporated into the modelling of systems. (C) 2002 Elsevier Science B.V. All rights reserved.

Relevance: 20.00%

Publisher:

Abstract:

Landscape metrics are widely applied in landscape ecology to quantify landscape structure. However, many are poorly tested and require rigorous validation if they are to serve as reliable indicators of habitat loss and fragmentation, such as Montreal Process Indicator 1.1e. We apply landscape ecology theory, supported by exploratory and confirmatory statistical techniques, to empirically test landscape metrics for reporting Montreal Process Indicator 1.1e in continuous dry eucalypt forests of sub-tropical Queensland, Australia. Target biota examined included: the Yellow-bellied Glider (Petaurus australis); the diversity of nectar- and sap-feeding glider species including P. australis, the Sugar Glider P. breviceps, the Squirrel Glider P. norfolcensis, and the Feathertail Glider Acrobates pygmaeus; six diurnal forest bird species; total diurnal bird species diversity; and the density of nectar-feeding diurnal bird species. Two scales of influence were considered: the stand-scale (2 ha), and a series of radial landscape extents (500 m - 2 km; 78 - 1250 ha) surrounding each fauna transect. For all biota, stand-scale structural and compositional attributes were found to be more influential than landscape metrics. For the Yellow-bellied Glider, the proportion of trace habitats with a residual element of old spotted-gum/ironbark eucalypt trees was a significant landscape metric at the 2 km landscape extent. This is a measure of habitat loss rather than habitat fragmentation. For the diversity of nectar- and sap-feeding glider species, the proportion of trace habitats with a high coefficient of variation in patch size at the 750 m extent was a significant landscape metric. None of the landscape metrics tested was important for diurnal forest birds. We conclude that no single landscape metric adequately captures the response of the region's forest biota per se. This poses a major challenge to regional reporting of Montreal Process Indicator 1.1e, fragmentation of forest types.

Relevance: 20.00%

Publisher:

Abstract:

Regional commodity forecasts are being used increasingly in agricultural industries to enhance their risk management and decision-making processes. These commodity forecasts are probabilistic in nature and are often integrated with a seasonal climate forecast system. The climate forecast system is based on a subset of analogue years drawn from the full climatological distribution. In this study we sought to measure forecast quality for such an integrated system. We investigated the quality of a commodity (i.e. wheat and sugar) forecast based on a subset of analogue years in relation to a standard reference forecast based on the full climatological set. We derived three key dimensions of forecast quality for such probabilistic forecasts: reliability, distribution shift, and change in dispersion. A measure of reliability was required to ensure no bias in the forecast distribution. This was assessed via the slope of the reliability plot, which was derived from examination of probability levels of forecasts and associated frequencies of realizations. The other two dimensions related to changes in features of the forecast distribution relative to the reference distribution. The relationship of 13 published accuracy/skill measures to these dimensions of forecast quality was assessed using principal component analysis in case studies of commodity forecasting using seasonal climate forecasting for the wheat and sugar industries in Australia. There were two orthogonal dimensions of forecast quality: one associated with distribution shift relative to the reference distribution and the other associated with relative distribution dispersion. Although the conventional quality measures aligned with these dimensions, none measured both adequately. We conclude that a multi-dimensional approach to assessment of forecast quality is required and that simple measures of reliability, distribution shift, and change in dispersion provide a means for such assessment. The analysis presented was also relevant to measuring quality of probabilistic seasonal climate forecasting systems. The importance of retaining a focus on the probabilistic nature of the forecast and avoiding simplifying, but erroneous, distortions was discussed in relation to applying this new forecast quality assessment paradigm to seasonal climate forecasts. Copyright (C) 2003 Royal Meteorological Society.
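A small sketch of the reliability dimension described above: bin forecast probabilities, compare them with observed frequencies, and take the slope of that reliability plot, where a slope near 1 indicates an unbiased forecast distribution. The binning scheme and toy data are assumptions, not the paper's case-study data.

```python
import numpy as np

def reliability_slope(forecast_probs, outcomes, n_bins=10):
    """Slope of observed frequency vs. forecast probability.

    forecast_probs: forecast probabilities of an event, in [0, 1]
    outcomes: 1 if the event occurred, else 0
    """
    forecast_probs = np.asarray(forecast_probs, dtype=float)
    outcomes = np.asarray(outcomes, dtype=float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    centres, freqs = [], []
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (forecast_probs >= lo) & (forecast_probs < hi)
        if mask.any():
            centres.append(forecast_probs[mask].mean())
            freqs.append(outcomes[mask].mean())
    slope, _ = np.polyfit(centres, freqs, 1)   # fit the reliability line
    return slope

# Toy check: outcomes generated consistently with the stated probabilities,
# so the reliability slope should come out close to 1.
rng = np.random.default_rng(1)
p = rng.uniform(0, 1, 5000)
y = rng.uniform(0, 1, 5000) < p
print(f"Reliability slope: {reliability_slope(p, y):.2f}")
```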

Relevance: 20.00%

Publisher:

Abstract:

Rocks used as construction aggregate in temperate climates deteriorate to differing degrees because of repeated freezing and thawing. The magnitude of the deterioration depends on the rock's properties. Aggregate, including crushed carbonate rock, is required to have minimum geotechnical qualities before it can be used in asphalt and concrete. In order to reduce the chance of premature and expensive repairs, extensive freeze-thaw tests are conducted on potential construction rocks. These tests typically involve 300 freeze-thaw cycles and can take four to five months to complete. Less time-consuming tests that (1) predict durability as well as the extended freeze-thaw test or that (2) reduce the number of rocks subject to the extended test could save considerable amounts of money. Here we use a probabilistic neural network to try to predict durability, as determined by the freeze-thaw test, using four rock properties measured on 843 limestone samples from the Kansas Department of Transportation. Modified freeze-thaw tests and less time-consuming specific gravity (dry), specific gravity (saturated), and modified absorption tests were conducted on each sample. Durability factors of 95 or more, as determined from the extensive freeze-thaw tests, are viewed as acceptable; rocks with values below 95 are rejected. If only the modified freeze-thaw test is used to predict which rocks are acceptable, about 45% are misclassified. When 421 randomly selected samples and all four standardized and scaled variables were used to train a probabilistic neural network, the rate of misclassification of 422 independent validation samples dropped to 28%. The network was trained so that each class (group) and each variable had its own coefficient (sigma). In an attempt to reduce errors further, an additional class was added to the training data to predict durability values greater than 84 and less than 98, resulting in only 11% of the samples misclassified. About 43% of the test data was classed by the neural net into the middle group; these rocks should be subject to full freeze-thaw tests. Thus, use of the probabilistic neural network would mean that the extended test would only need to be applied to 43% of the samples, and 11% of the rocks classed as acceptable would fail early.
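A minimal Parzen-window sketch of a probabilistic neural network with a per-class kernel width (sigma), in the spirit of the per-class, per-variable sigmas mentioned above; the rock-property features, class centres, and widths are invented for illustration.

```python
import numpy as np

def pnn_classify(x, train_x, train_y, sigma_by_class):
    """Probabilistic neural network (Parzen-window) class posterior.

    Each class gets its own Gaussian kernel width; the data and widths here
    are made up and only demonstrate the mechanics.
    """
    x = np.asarray(x, dtype=float)
    scores = {}
    for cls in np.unique(train_y):
        pts = train_x[train_y == cls]
        sigma = sigma_by_class[cls]
        d2 = np.sum((pts - x) ** 2, axis=1)                 # squared distances
        scores[cls] = np.mean(np.exp(-d2 / (2.0 * sigma ** 2)))
    total = sum(scores.values())
    return {cls: s / total for cls, s in scores.items()}    # normalised posteriors

# Toy training set: two rock-property features, two durability classes.
rng = np.random.default_rng(2)
accept = rng.normal([2.6, 0.8], 0.1, (50, 2))   # hypothetical "durable" rocks
reject = rng.normal([2.3, 2.0], 0.2, (50, 2))   # hypothetical "non-durable" rocks
X = np.vstack([accept, reject])
y = np.array([1] * 50 + [0] * 50)
print(pnn_classify([2.55, 0.9], X, y, sigma_by_class={0: 0.3, 1: 0.15}))
```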

Relevance: 20.00%

Publisher:

Abstract:

We compute the Dirac indices for the two spin structures kappa(0) and kappa(1) for Eguchi-Hanson metrics with nonzero total mass. We show that the Dirac indices do not vanish in general, and an axial anomaly exists. When the metric has zero total mass, the Dirac index vanishes for the spin structure kappa(0), and no axial anomaly exists in this case.

Relevance: 20.00%

Publisher:

Abstract:

Network building and exchange of information by people within networks is crucial to the innovation process. Contrary to older models, in social networks the flow of information is noncontinuous and nonlinear. There are critical barriers to information flow that operate in a problematic manner. New models and new analytic tools are needed for these systems. This paper introduces the concept of virtual circuits and draws on recent concepts of network modelling and design to introduce a probabilistic switch theory that can be described using matrices. It can be used to model multistep information flow between people within organisational networks, to provide formal definitions of efficient and balanced networks and to describe distortion of information as it passes along human communication channels. The concept of multi-dimensional information space arises naturally from the use of matrices. The theory and the use of serial diagonal matrices have applications to organisational design and to the modelling of other systems. It is hypothesised that opinion leaders or creative individuals are more likely to emerge at information-rich nodes in networks. A mathematical definition of such nodes is developed and it does not invariably correspond with centrality as defined by early work on networks.
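A toy illustration of multistep information flow expressed with matrices: if P[i, j] is the probability that person i relays a piece of information to person j in one step (rows need not sum to 1, allowing for blocked or lost flow), then powers of P give multistep flow probabilities. This is a generic matrix formulation, not the paper's specific serial-diagonal-matrix switch theory.

```python
import numpy as np

# Hypothetical 4-person network; all probabilities are invented.
P = np.array([
    [0.0, 0.7, 0.2, 0.0],
    [0.1, 0.0, 0.6, 0.3],
    [0.0, 0.4, 0.0, 0.5],
    [0.0, 0.0, 0.2, 0.0],
])

def k_step_flow(P, k):
    """Probability of information reaching j from i in exactly k relays."""
    return np.linalg.matrix_power(P, k)

# Information starting at person 0: where is it likely to be after 3 relays?
start = np.array([1.0, 0.0, 0.0, 0.0])
print(start @ k_step_flow(P, 3))
```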

Relevance: 20.00%

Publisher:

Abstract:

Evolutionary algorithms perform optimization using a population of sample solution points. An interesting development has been to view population-based optimization as the process of evolving an explicit, probabilistic model of the search space. This paper investigates a formal basis for continuous, population-based optimization in terms of a stochastic gradient descent on the Kullback-Leibler divergence between the model probability density and the objective function, represented as an unknown density of assumed form. This leads to an update rule that is related and compared with previous theoretical work, a continuous version of the population-based incremental learning algorithm, and the generalized mean shift clustering framework. Experimental results are presented that demonstrate the dynamics of the new algorithm on a set of simple test problems.
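A sketch of a continuous PBIL-style update, one of the algorithms the paper relates its analysis to: a Gaussian sampling density is nudged toward the statistics of the best samples each generation. The test function, learning rate, and elite fraction are arbitrary assumptions, and this is not the paper's exact KL-divergence gradient rule.

```python
import numpy as np

def continuous_pbil(objective, dim=2, pop_size=50, elite_frac=0.2,
                    lr=0.1, iters=200, seed=0):
    """Evolve an explicit Gaussian model of good solutions (minimisation)."""
    rng = np.random.default_rng(seed)
    mean = np.zeros(dim)
    std = np.ones(dim) * 2.0
    n_elite = max(1, int(pop_size * elite_frac))
    for _ in range(iters):
        pop = rng.normal(mean, std, size=(pop_size, dim))      # sample population
        elite = pop[np.argsort([objective(x) for x in pop])[:n_elite]]
        mean = (1 - lr) * mean + lr * elite.mean(axis=0)        # shift the model
        std = (1 - lr) * std + lr * elite.std(axis=0)           # adapt dispersion
    return mean

# Simple test problem: minimise the sphere function, optimum at the origin.
print(continuous_pbil(lambda x: float(np.sum(x ** 2))))
```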

Relevance: 20.00%

Publisher:

Abstract:

Deregulation and market practices in the power industry have brought great challenges to the system planning area. In particular, they introduce a variety of uncertainties into system planning. New techniques are required to cope with such uncertainties. As a promising approach, probabilistic methods are attracting more and more attention from system planners. In small signal stability analysis, generation control parameters play an important role in determining the stability margin. The objective of this paper is to investigate power system state matrix sensitivity characteristics with respect to system parameter uncertainties using analytical and numerical approaches, and to identify those parameters that have a great impact on the system eigenvalues and, therefore, on the system's stability properties. The identified parameter variations need to be investigated with priority. The results can be used to help Regional Transmission Organizations (RTOs) and Independent System Operators (ISOs) perform planning studies under the open access environment.
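A minimal numerical sketch of the kind of sensitivity being investigated: finite-difference sensitivity of state-matrix eigenvalues to a single control parameter, computed on an invented two-state system rather than a real small-signal model.

```python
import numpy as np

def state_matrix(gain):
    """Toy two-state system matrix parameterised by a controller gain.
    Stands in for the small-signal state matrix; the structure and numbers
    are illustrative only."""
    return np.array([[0.0, 1.0],
                     [-5.0, -gain]])

def eigenvalue_sensitivity(build_matrix, param, eps=1e-6):
    """Finite-difference sensitivity of the eigenvalues to one parameter."""
    lam0 = np.sort_complex(np.linalg.eigvals(build_matrix(param)))
    lam1 = np.sort_complex(np.linalg.eigvals(build_matrix(param + eps)))
    return (lam1 - lam0) / eps

print(eigenvalue_sensitivity(state_matrix, param=1.5))
```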

Relevance: 20.00%

Publisher:

Abstract:

We consider the statistical problem of catalogue matching from a machine learning perspective with the goal of producing probabilistic outputs, and using all available information. A framework is provided that unifies two existing approaches to producing probabilistic outputs in the literature, one based on combining distribution estimates and the other based on combining probabilistic classifiers. We apply both of these to the problem of matching the HI Parkes All Sky Survey radio catalogue with large positional uncertainties to the much denser SuperCOSMOS catalogue with much smaller positional uncertainties. We demonstrate the utility of probabilistic outputs by a controllable completeness and efficiency trade-off and by identifying objects that have high probability of being rare. Finally, possible biasing effects in the output of these classifiers are also highlighted and discussed.
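A generic likelihood-ratio sketch of probabilistic positional matching, assuming a Gaussian positional error for the radio source and a uniform background density of unrelated optical sources; it shows the flavour of a probabilistic output but is not the distribution-combination or classifier-combination framework developed in the paper.

```python
import numpy as np

def match_probability(sep_arcsec, sigma_arcsec, background_density_per_arcsec2):
    """Posterior-style weight that a candidate counterpart is the true match."""
    # Gaussian positional likelihood for the observed separation.
    likelihood = np.exp(-0.5 * (sep_arcsec / sigma_arcsec) ** 2) / \
                 (2.0 * np.pi * sigma_arcsec ** 2)
    # Compare against the chance of an unrelated background source.
    lr = likelihood / background_density_per_arcsec2
    return lr / (1.0 + lr)

# Hypothetical candidate 8" from a radio position with 10" uncertainty,
# against an assumed background source density.
print(f"{match_probability(8.0, 10.0, 1e-3):.2f}")
```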

Relevance: 20.00%

Publisher:

Abstract:

Background: The structure of proteins may change as a result of the inherent flexibility of some protein regions. We develop and explore probabilistic machine learning methods for predicting a continuum secondary structure, i.e. assigning probabilities to the conformational states of a residue. We train our methods using data derived from high-quality NMR models. Results: Several probabilistic models not only successfully estimate the continuum secondary structure, but also provide a categorical output on par with models directly trained on categorical data. Importantly, models trained on the continuum secondary structure are also better than their categorical counterparts at identifying the conformational state for structurally ambivalent residues. Conclusion: Cascaded probabilistic neural networks trained on the continuum secondary structure exhibit better accuracy in structurally ambivalent regions of proteins, while sustaining an overall classification accuracy on par with standard, categorical prediction methods.
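A small sketch of what a continuum target changes in practice: the network output is a probability distribution over conformational states, and the loss is taken against a distribution of states across NMR models rather than a single categorical label. The logits and target fractions below are invented.

```python
import numpy as np

def softmax(z):
    """Convert raw network scores to a probability distribution over states."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def cross_entropy(predicted, target):
    """Loss against a distribution over states (helix, strand, coil)."""
    return -float(np.sum(target * np.log(predicted + 1e-12)))

# Hypothetical network output for one residue and a continuum target derived
# from an ensemble of NMR models (e.g. 60% helix, 10% strand, 30% coil).
logits = np.array([1.2, -0.5, 0.3])            # helix, strand, coil scores
continuum_target = np.array([0.6, 0.1, 0.3])   # fractions across NMR models
categorical_target = np.array([1.0, 0.0, 0.0]) # majority-state label

p = softmax(logits)
print("Predicted state probabilities:", np.round(p, 2))
print("Continuum loss:  ", round(cross_entropy(p, continuum_target), 3))
print("Categorical loss:", round(cross_entropy(p, categorical_target), 3))
```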