906 results for Timed and Probabilistic Automata


Relevance:

30.00%

Publisher:

Abstract:

A new methodology is proposed for the analysis of generation capacity investment in a deregulated market environment. The investment appraisal is carried out within a probabilistic framework: the probabilistic production simulation (PPC) algorithm is used to compute the expected energy generated, taking into account system load variations and plant forced outage rates, while a Monte Carlo approach is applied to model the electricity price variability seen in a realistic network. The model captures the price, and hence profitability, uncertainties faced by generation companies. Seasonal variations in electricity prices and system demand are modeled independently. The method is validated on the IEEE RTS, augmented with realistic market and plant data, by using it to compare the financial viability of several generator investments employing either conventional or directly connected generator (powerformer) technologies. The significance of the results is assessed using several financial risk measures.
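
As a rough sketch of the Monte Carlo side of such an appraisal, the following Python fragment samples seasonal prices and plant availability to build a profit distribution and read off risk measures. All plant and market figures are invented for illustration; they are not the paper's data, and the log-normal price model is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative plant and market assumptions (not the paper's data).
capacity_mw = 400.0                       # rated capacity
forced_outage = 0.08                      # forced outage rate
run_cost = 25.0                           # marginal cost, $/MWh
season_mean = [45.0, 38.0, 52.0, 41.0]    # seasonal mean prices, $/MWh
season_hours = 8760 // 4
n_trials = 2000

profits = np.empty(n_trials)
for k in range(n_trials):
    profit = 0.0
    for mean_price in season_mean:
        # Log-normal draws stand in for observed price variability.
        prices = rng.lognormal(np.log(mean_price), 0.35, season_hours)
        # The plant is available each hour with probability 1 - forced_outage.
        available = rng.random(season_hours) > forced_outage
        run = available & (prices > run_cost)   # dispatch only when profitable
        profit += ((prices[run] - run_cost) * capacity_mw).sum()
    profits[k] = profit

print(f"expected annual profit: {profits.mean():,.0f} $")
print(f"5% value-at-risk:       {np.quantile(profits, 0.05):,.0f} $")
```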

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a method for analyzing the first-order sensitivity of eigenvalues with respect to the operating parameters of a power system. The method is based on expressing the system state matrix explicitly in terms of sub-matrices; the eigenvalue sensitivity is then calculated from this explicitly formed state matrix. A 4th-order generator model and a 4th-order exciter model are used to form the system state matrix. A case study on the New England 10-machine 39-bus system demonstrates the effectiveness of the proposed method, which can be applied to eigenvalue sensitivity analysis of large-scale power systems with respect to operating parameters.
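
The standard first-order result behind this kind of analysis is dλ_i/dp = ψ_i (∂A/∂p) φ_i, where φ_i and ψ_i are the right and left eigenvectors of the state matrix A, normalised so that ψ_i φ_i = 1. Below is a minimal numerical sketch using a central finite difference for ∂A/∂p rather than the paper's explicit sub-matrix expressions; the 2-state test matrix is a toy example.

```python
import numpy as np

def eigenvalue_sensitivities(build_A, p, dp=1e-6):
    """First-order sensitivities dlambda_i/dp of all eigenvalues of A(p).

    Uses dlambda_i/dp = psi_i (dA/dp) phi_i, where phi_i / psi_i are the
    right / left eigenvectors with psi_i phi_i = 1. dA/dp is taken by
    central finite difference, not the paper's sub-matrix expressions."""
    dA = (build_A(p + dp) - build_A(p - dp)) / (2 * dp)
    lam, Phi = np.linalg.eig(build_A(p))  # columns of Phi: right eigenvectors
    Psi = np.linalg.inv(Phi)              # rows of Psi: left eigenvectors
    sens = np.array([Psi[i] @ dA @ Phi[:, i] for i in range(len(lam))])
    return lam, sens

# Toy example: a damping-like parameter d entering the state matrix.
build_A = lambda d: np.array([[0.0, 377.0],
                              [-0.12, -d]])
lam, sens = eigenvalue_sensitivities(build_A, 0.25)
print(lam, sens)   # complex mode pair and its sensitivity to d
```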

Relevance:

30.00%

Publisher:

Abstract:

Grid computing is an advanced technique for collaboratively solving complicated scientific problems using geographically and organisationally dispersed computational, data storage and other resources. Grid computing could provide significant benefits to every aspect of power system analysis that involves computation. Building on our previous research, this paper presents a novel grid computing approach for probabilistic small signal stability (PSSS) analysis in electric power systems with uncertainties. A prototype computing grid was implemented in our research lab to carry out PSSS analysis on two benchmark systems. Compared with traditional computing techniques, the computing grid gave better performance for PSSS analysis in terms of computing capacity, speed, accuracy and stability. In addition, a computing grid framework for power system analysis is proposed based on this study.
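
As a toy stand-in for the computing pattern involved, the sketch below farms Monte Carlo small-signal samples out to worker processes (standing in for grid nodes). The 2-state model and the 5% parameter spread are invented for illustration; they are not the benchmark systems of the paper.

```python
import numpy as np
from multiprocessing import Pool

A0 = np.array([[0.0, 377.0],
               [-0.12, -0.25]])   # toy 2-state model; values are illustrative

def sample_worst_eigenvalue(seed):
    """One Monte Carlo sample: perturb the nominal state matrix to model
    parameter uncertainty and return the largest real part among its
    eigenvalues (positive => small-signal unstable)."""
    rng = np.random.default_rng(seed)
    A = A0 * (1.0 + 0.05 * rng.standard_normal(A0.shape))
    return np.linalg.eigvals(A).real.max()

if __name__ == "__main__":
    with Pool() as pool:           # worker processes stand in for grid nodes
        worst = pool.map(sample_worst_eigenvalue, range(20_000))
    print("estimated P(unstable):", np.mean(np.array(worst) > 0.0))
```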

Relevance:

30.00%

Publisher:

Abstract:

Grid computing is an emerging technology that provides high-performance computing capability and collaboration mechanisms for solving complex problems collaboratively with existing resources. In this paper, a grid computing based framework is proposed for probabilistic power system reliability and security analysis. The proposed grid is named the Reliability and Security Grid (RSA-Grid), and its architecture is presented. A prototype system has been built for the further development of grid-based services for power system reliability and security assessment based on probabilistic techniques, which require high-performance computing and large amounts of memory. Preliminary results from the prototype show that RSA-Grid can provide comprehensive assessment results for real power systems efficiently and economically.

Relevance:

30.00%

Publisher:

Abstract:

As an alternative to traditional evolutionary algorithms (EAs), population-based incremental learning (PBIL) maintains a probabilistic model of the best individual(s). Originally, PBIL was applied in binary search spaces; more recently, work has been done to extend it to continuous spaces. In this paper, we review two such extensions of PBIL. An improved version of PBIL based on a Gaussian model is proposed that combines two main features: a new updating rule that takes into account all the individuals and their fitness values, and a self-adaptive learning rate parameter. Furthermore, a new continuous PBIL employing a histogram probabilistic model is proposed. Experimental results are presented that highlight the features of the new algorithms.
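
A minimal sketch of a continuous Gaussian-model PBIL in the spirit described: the mean moves toward a fitness-weighted average of all individuals rather than only the best, and the learning rate decays over time. The constants are illustrative choices, not the paper's settings.

```python
import numpy as np

def pbil_gaussian(fitness, dim, pop=50, iters=300, lr0=0.1, seed=0):
    """Continuous PBIL with a Gaussian probabilistic model (minimisation).

    The model mean is updated toward a fitness-weighted mean of ALL
    individuals, and the learning rate follows a simple decaying
    schedule; both are sketches of the features named in the text."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.full(dim, 3.0), np.full(dim, 2.0)
    for t in range(iters):
        X = rng.normal(mu, sigma, size=(pop, dim))
        f = np.array([fitness(x) for x in X])
        w = (f.max() - f) + 1e-12          # lower fitness -> larger weight
        w /= w.sum()
        target = w @ X                     # fitness-weighted population mean
        lr = lr0 / (1.0 + 0.01 * t)        # simple self-adaptive schedule
        mu = (1 - lr) * mu + lr * target
        sigma = (1 - lr) * sigma + lr * np.sqrt(w @ (X - target) ** 2)
    return mu

print(pbil_gaussian(lambda x: np.sum(x ** 2), dim=5))   # approaches the origin
```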

Relevance:

30.00%

Publisher:

Abstract:

To navigate successfully in a novel environment, a robot needs to be able to Simultaneously Localize And Map (SLAM) its surroundings. The most successful solutions to this problem so far have involved probabilistic algorithms, but there has also been much promising work on systems based on the workings of the part of the rodent brain known as the hippocampus. In this paper we present a biologically plausible system called RatSLAM that uses competitive attractor networks to carry out SLAM in a probabilistic manner. The system can effectively perform parameter self-calibration and SLAM in one dimension. Tests in two-dimensional environments revealed the inability of the RatSLAM system to maintain multiple pose hypotheses in the face of ambiguous visual input. These results support recent rat experiments suggesting that current competitive attractor models are not a complete solution to the hippocampal modelling problem.
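
To make the mechanism concrete, here is a minimal one-dimensional competitive attractor update (local excitation, global inhibition, normalisation) in the general style of such networks. It is a sketch of the idea, not the RatSLAM pose-cell implementation, and all constants are invented.

```python
import numpy as np

def attractor_step(act, inject_at=None, inhibition=0.02):
    """One update of a 1-D competitive attractor network: local
    excitation pulls activity into a single packet, while global
    inhibition plus normalisation makes the cells compete."""
    offsets = np.arange(-3, 4)
    kernel = np.exp(-0.5 * (offsets / 1.5) ** 2)   # local excitatory weights
    excited = np.zeros(act.size)
    for offset, w in zip(offsets, kernel):
        excited += w * np.roll(act, offset)        # wrap-around excitation
    if inject_at is not None:                      # e.g. visual input injects energy
        excited[inject_at] += 1.0
    excited = np.maximum(excited - inhibition * excited.sum(), 0.0)
    return excited / excited.sum()                 # competitive normalisation

act = np.full(100, 1 / 100)                        # uniform initial activity
for _ in range(30):
    act = attractor_step(act, inject_at=42)
print(np.argmax(act))                              # the packet settles at cell 42
```

Note that a single activity packet is exactly the behaviour reported above: the network commits to one pose hypothesis, which is why ambiguous input that demands multiple simultaneous hypotheses is problematic.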

Relevance:

30.00%

Publisher:

Abstract:

Reliable perception of the real world is a key feature for autonomous vehicles and Advanced Driver Assistance Systems (ADAS). Obstacle detection (OD) is one of the main components for correct reconstruction of the dynamic world. Historical approaches based on stereo vision and other 3D perception technologies (e.g. LIDAR) have been adapted first to ADAS and later to autonomous ground vehicles, providing excellent results. Obstacle detection is a very broad field that has attracted a large body of work in recent years; academic research has clearly established the essential role of these systems in active safety systems for accident prevention, reflecting the innovative systems introduced by industry.

Such systems need to accurately assess situational criticalities and, simultaneously, the driver's awareness of them. This requires obstacle detection algorithms that are reliable and accurate, providing real-time output, a stable and robust representation of the environment, and estimates independent of lighting and weather conditions. Initial systems relied on a single exteroceptive sensor (e.g. radar or laser for ACC, a camera for LDW) in addition to proprioceptive sensors such as wheel speed and yaw rate sensors. Current systems, however, such as ACC operating over the entire speed range or autonomous braking for collision avoidance, require multiple sensors, since no single sensor can meet these requirements on its own. This has led the community to combine sensors in order to exploit the benefits of each.

Pedestrian and vehicle detection are among the major thrusts in assessing situational criticalities and remain an active area of research; ADAS are their most prominent use case. Vehicles should be equipped with sensing capabilities able to detect and act on objects in dangerous situations in which the driver would not be able to avoid a collision. A full ADAS or autonomous vehicle would include not only pedestrian and vehicle detection but also tracking, orientation, intent analysis, and collision prediction. The system presented here detects obstacles using a probabilistic occupancy grid built from a multi-resolution disparity map. Obstacle classification is based on an AdaBoost SoftCascade trained on Aggregate Channel Features. A final stage of tracking and fusion guarantees stability and robustness of the result.
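
The occupancy-grid component can be sketched generically with a Bayesian log-odds update. The multi-resolution disparity front end that would produce the observation masks is not reproduced here, and the log-odds increments and threshold are illustrative.

```python
import numpy as np

def update_grid(log_odds, hit_mask, free_mask, l_hit=0.85, l_free=-0.4):
    """Bayesian log-odds update of a probabilistic occupancy grid.

    hit_mask / free_mask flag cells observed occupied / free in the
    current frame; repeated consistent observations build confidence."""
    log_odds[hit_mask] += l_hit
    log_odds[free_mask] += l_free
    np.clip(log_odds, -10.0, 10.0, out=log_odds)   # avoid saturation
    return log_odds

def occupancy(log_odds):
    """Convert log-odds back to occupancy probabilities."""
    return 1.0 / (1.0 + np.exp(-log_odds))

grid = np.zeros((200, 200))               # prior p = 0.5 everywhere
hits = np.zeros_like(grid, bool)
hits[120:125, 60:65] = True               # a persistently detected obstacle
for _ in range(5):
    update_grid(grid, hits, ~hits)
obstacles = occupancy(grid) > 0.65        # threshold into an obstacle mask
print(obstacles.sum())
```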

Relevance:

30.00%

Publisher:

Abstract:

The Internet of Things (IoT) can be defined as a “network of networks” composed of billions of uniquely identified physical Smart Objects (SO), organized in an Internet-like structure. Smart Objects can be items equipped with sensors, consumer devices (e.g., smartphones, tablets, or wearable devices), and enterprise assets that are connected both to the Internet and to each other. The birth of the IoT, with its communication paradigms, can be considered an enabling factor for the creation of so-called Smart Cities. A Smart City uses Information and Communication Technologies (ICT) to enhance the quality, performance and interactivity of urban services, ranging from traffic management and pollution monitoring to government services and energy management. This thesis focuses on multi-hop data dissemination in IoT and Smart City scenarios. The proposed multi-hop techniques, mostly based on probabilistic forwarding, are used for different purposes: from improving the performance of unicast protocols for Wireless Sensor Networks (WSNs) to efficient data dissemination within Vehicular Ad-hoc NETworks (VANETs).
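
The core probabilistic-forwarding decision can be sketched in a few lines as a generic gossip rule (not the thesis' exact protocols); in practice p_forward would be tuned to node density.

```python
import random

class GossipNode:
    """Minimal probabilistic-forwarding (gossip) node, a sketch of the
    general idea behind multi-hop dissemination schemes: rebroadcast
    each newly seen message with probability p_forward, which damps
    the broadcast storm of naive flooding in dense WSNs/VANETs."""

    def __init__(self, p_forward=0.6, seed=0):
        self.p_forward = p_forward
        self.seen = set()
        self.rng = random.Random(seed)

    def on_receive(self, msg_id):
        """Return True if this node should rebroadcast the message."""
        if msg_id in self.seen:            # duplicate: drop, never rebroadcast
            return False
        self.seen.add(msg_id)
        return self.rng.random() < self.p_forward
```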

Relevance:

30.00%

Publisher:

Abstract:

A new approach to optimisation is introduced based on a precise probabilistic statement of what is ideally required of an optimisation method. It is convenient to express the formalism in terms of the control of a stationary environment. This leads to an objective function for the controller which unifies the objectives of exploration and exploitation, thereby providing a quantitative principle for managing this trade-off. This is demonstrated using a variant of the multi-armed bandit problem. This approach opens new possibilities for optimisation algorithms, particularly by using neural network or other adaptive methods for the adaptive controller. It also opens possibilities for deepening understanding of existing methods. The realisation of these possibilities requires research into practical approximations of the exact formalism.
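
For a concrete feel of a probabilistic treatment of this trade-off on the multi-armed bandit, here is Thompson sampling, a standard scheme in which exploration and exploitation both fall out of sampling from the current beliefs. It illustrates the trade-off the paper formalises but is not the paper's controller.

```python
import numpy as np

def thompson_bernoulli(true_p, horizon=5000, seed=0):
    """Thompson sampling on a Bernoulli bandit: keep a Beta posterior
    per arm, sample a plausible success rate from each posterior, and
    play the arm whose sample is largest. Uncertain arms get explored
    because their samples are widely spread; good arms get exploited."""
    rng = np.random.default_rng(seed)
    k = len(true_p)
    wins, losses = np.ones(k), np.ones(k)     # Beta(1, 1) priors
    total = 0.0
    for _ in range(horizon):
        arm = np.argmax(rng.beta(wins, losses))   # sample beliefs, act greedily
        reward = rng.random() < true_p[arm]
        total += reward
        wins[arm] += reward
        losses[arm] += 1 - reward
    return total / horizon

print(thompson_bernoulli([0.3, 0.5, 0.7]))    # approaches the best arm's 0.7
```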

Relevance:

30.00%

Publisher:

Abstract:

Principal component analysis (PCA) is one of the most popular techniques for processing, compressing and visualising data, although its effectiveness is limited by its global linearity. While nonlinear variants of PCA have been proposed, an alternative paradigm is to capture data complexity by a combination of local linear PCA projections. However, conventional PCA does not correspond to a probability density, and so there is no unique way to combine PCA models. Previous attempts to formulate mixture models for PCA have therefore to some extent been ad hoc. In this paper, PCA is formulated within a maximum-likelihood framework, based on a specific form of Gaussian latent variable model. This leads to a well-defined mixture model for probabilistic principal component analysers, whose parameters can be determined using an EM algorithm. We discuss the advantages of this model in the context of clustering, density modelling and local dimensionality reduction, and we demonstrate its application to image compression and handwritten digit recognition.
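
The pivotal point, that PPCA endows each local model with a proper Gaussian density N(mu, W Wᵀ + σ²I), which is exactly what makes a mixture well defined, can be sketched directly. The EM training loop is omitted and the function names are illustrative.

```python
import numpy as np
from scipy.stats import multivariate_normal

def ppca_density(X, W, mu, sigma2):
    """PPCA defines a proper Gaussian density N(mu, W W^T + sigma^2 I);
    this is what allows local PCA models to be combined in a mixture."""
    C = W @ W.T + sigma2 * np.eye(W.shape[0])
    return multivariate_normal(mu, C).pdf(X)

def responsibilities(X, components, weights):
    """E-step responsibilities r_nk proportional to pi_k p_k(x_n) for a
    PPCA mixture; `components` is a list of (W, mu, sigma2) triples."""
    dens = np.column_stack([pi_k * ppca_density(X, *comp)
                            for pi_k, comp in zip(weights, components)])
    return dens / dens.sum(axis=1, keepdims=True)
```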

Relevance:

30.00%

Publisher:

Abstract:

Magnification factors specify the extent to which the area of a small patch of the latent (or `feature') space of a topographic mapping is magnified on projection to the data space, and are of considerable interest in both neuro-biological and data analysis contexts. Previous attempts to consider magnification factors for the self-organizing map (SOM) algorithm have been hindered because the mapping is only defined at discrete points (given by the reference vectors). In this paper we consider the batch version of SOM, for which a continuous mapping can be defined, as well as the Generative Topographic Mapping (GTM) algorithm of Bishop et al. (1997) which has been introduced as a probabilistic formulation of the SOM. We show how the techniques of differential geometry can be used to determine magnification factors as continuous functions of the latent space coordinates. The results are illustrated here using a problem involving the identification of crab species from morphological data.
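
Numerically, the area magnification of a smooth latent-to-data mapping at a point is √det(JᵀJ), with J the mapping's Jacobian. Below is a small sketch using a finite-difference Jacobian, a numerical stand-in for the closed-form differential-geometry results of the paper; the toy mapping is invented.

```python
import numpy as np

def magnification_factor(mapping, x, eps=1e-5):
    """Area magnification sqrt(det(J^T J)) of a smooth latent->data
    mapping at latent point x, via a finite-difference Jacobian."""
    x = np.asarray(x, float)
    y0 = mapping(x)
    J = np.column_stack([(mapping(x + eps * e) - y0) / eps
                         for e in np.eye(x.size)])   # J[i, j] = dy_i / dx_j
    return np.sqrt(np.linalg.det(J.T @ J))

# Toy 2-D latent -> 3-D data mapping; magnification grows away from the origin.
f = lambda x: np.array([x[0], x[1], x[0] ** 2 + x[1] ** 2])
print(magnification_factor(f, [0.0, 0.0]))   # 1.0
print(magnification_factor(f, [1.0, 1.0]))   # 3.0
```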

Relevance:

30.00%

Publisher:

Abstract:

Principal component analysis (PCA) is a ubiquitous technique for data analysis and processing, but one which is not based upon a probability model. In this paper we demonstrate how the principal axes of a set of observed data vectors may be determined through maximum-likelihood estimation of parameters in a latent variable model closely related to factor analysis. We consider the properties of the associated likelihood function, giving an EM algorithm for estimating the principal subspace iteratively, and discuss the advantages conveyed by the definition of a probability density function for PCA.
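
When only the maximum-likelihood point estimates are needed, the solution is available in closed form: σ² is the average of the discarded eigenvalues of the sample covariance, and W spans the leading principal subspace. A minimal sketch, taking the arbitrary rotation R = I:

```python
import numpy as np

def ppca_ml(X, q):
    """Closed-form maximum-likelihood PPCA fit: sigma^2 is the mean of
    the discarded covariance eigenvalues and the columns of W span the
    q-dimensional principal subspace (rotation R taken as identity)."""
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / len(X)                      # sample covariance
    evals, evecs = np.linalg.eigh(S)
    evals, evecs = evals[::-1], evecs[:, ::-1]  # sort descending
    sigma2 = evals[q:].mean()                   # ML noise variance
    W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
    return W, sigma2

X = np.random.default_rng(0).normal(size=(500, 10))
W, sigma2 = ppca_ml(X, q=2)
print(W.shape, sigma2)
```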
