906 results for Timed and Probabilistic Automata
Abstract:
Researchers at the University of Reading have developed, over many years, simple mobile robots that explore an environment they perceive through simple ultrasonic sensors. Information from these sensors has allowed the robots to learn the simple task of moving around while avoiding dynamic obstacles using a static set of fuzzy automata, the choice of which has been criticised due to its arbitrary nature. This paper considers how a dynamic set of automata can overcome this criticism. In addition, a new reinforcement learning function is outlined which is scalable to both different numbers and different types of sensors. The innovations compare favourably with earlier work.
Abstract:
A large volume of visual content remains inaccessible until effective and efficient indexing and retrieval of such data is achieved. In this paper, we introduce the DREAM system, a knowledge-assisted, semantic-driven, context-aware visual information retrieval system applied in the film post-production domain. We mainly focus on the automatic labelling and topic-map-related aspects of the framework. The use of context-related collateral knowledge, represented by a novel probabilistic visual keyword co-occurrence matrix, has been proven effective via the experiments conducted during system evaluation. The automatically generated semantic labels are fed into the Topic Map Engine, which automatically constructs ontological networks using Topic Maps technology, dramatically enhancing the indexing and retrieval performance of the system towards an even higher semantic level.
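The abstract does not say how the co-occurrence matrix is constructed; as a rough, hypothetical illustration only, such a matrix can be estimated from per-shot keyword annotations as conditional co-occurrence probabilities. The input format and the normalisation below are assumptions, not the paper's method:

```python
from collections import Counter
from itertools import combinations

# Hypothetical per-shot keyword annotations (assumed input format).
shots = [
    {"street", "car", "night"},
    {"street", "car"},
    {"office", "desk", "night"},
]

pair_counts = Counter()
keyword_counts = Counter()
for keywords in shots:
    keyword_counts.update(keywords)
    for a, b in combinations(sorted(keywords), 2):
        pair_counts[(a, b)] += 1
        pair_counts[(b, a)] += 1

def cooccurrence_prob(a, b):
    """Estimate P(b | a): how often keyword b appears in a shot containing a."""
    return pair_counts[(a, b)] / keyword_counts[a] if keyword_counts[a] else 0.0

print(cooccurrence_prob("street", "car"))    # 1.0 in this toy data
print(cooccurrence_prob("night", "street"))  # 0.5
```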
Abstract:
In this work a hybrid technique that combines probabilistic and optimization-based methods is presented. The method is applied, both in simulation and by means of real-time experiments, to the heating unit of a Heating, Ventilation and Air Conditioning (HVAC) system. It is shown that the addition of the probabilistic approach improves fault diagnosis accuracy.
Abstract:
In this paper, we evaluate the Probabilistic Occupancy Map (POM) pedestrian detection algorithm on the PETS 2009 benchmark dataset. POM is a multi-camera generative detection method, which estimates ground plane occupancy from multiple background subtraction views. Occupancy probabilities are iteratively estimated by fitting a synthetic model of the background subtraction to the binary foreground motion. Furthermore, we test the integration of this algorithm into a larger framework designed for understanding human activities in real environments. We demonstrate accurate detection and localization on the PETS dataset, despite suboptimal calibration and foreground motion segmentation input.
Abstract:
A new probabilistic neural network (PNN) learning algorithm based on forward constrained selection (PNN-FCS) is proposed. An incremental learning scheme is adopted such that, at each step, new neurons, one for each class, are selected from the training samples, and the weights of the neurons are estimated so as to minimize the overall misclassification error rate. In this manner, only the most significant training samples are used as neurons. It is shown by simulation that the resultant PNN-FCS networks achieve classification performance comparable to other types of classifiers, but with much smaller model sizes than conventional PNNs.
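The abstract leaves the selection procedure unspecified; as a hedged sketch of the general idea, a standard PNN scores each class with a sum of Gaussian kernels centred on that class's neurons, and a forward-selection variant greedily keeps, per step, the one candidate sample per class that most reduces training misclassification. The kernel width and step count below are arbitrary assumptions, not values from the paper:

```python
import numpy as np

def pnn_scores(neurons, neuron_labels, X, classes, sigma=0.5):
    """Class scores: sum of Gaussian kernels centred on each class's neurons."""
    scores = np.zeros((len(X), len(classes)))
    for j, c in enumerate(classes):
        centres = neurons[neuron_labels == c]
        if len(centres):
            d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
            scores[:, j] = np.exp(-d2 / (2 * sigma ** 2)).sum(1)
    return scores

def forward_select(X, y, n_steps=3, sigma=0.5):
    """Greedy forward selection: per step, add one sample per class,
    keeping the candidate that most reduces training misclassification."""
    classes = np.unique(y)
    chosen = np.zeros(len(X), dtype=bool)
    for _ in range(n_steps):
        for c in classes:
            best_i, best_err = None, np.inf
            for i in np.where((y == c) & ~chosen)[0]:
                trial = chosen.copy()
                trial[i] = True
                s = pnn_scores(X[trial], y[trial], X, classes, sigma)
                err = (classes[s.argmax(1)] != y).mean()
                if err < best_err:
                    best_i, best_err = i, err
            if best_i is not None:
                chosen[best_i] = True
    return chosen

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.repeat([0, 1], 50)
neurons = forward_select(X, y)
print(neurons.sum(), "neurons selected out of", len(X))
```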
Abstract:
Based on the idea of an important cluster, a new multi-level probabilistic neural network (MLPNN) is introduced. The MLPNN uses an incremental constructive approach, i.e. it grows level by level. The construction algorithm of the MLPNN is designed so that the classification accuracy increases monotonically, ensuring that the classification accuracy of the MLPNN is higher than or equal to that of the traditional PNN. Numerical examples are included to demonstrate the effectiveness of the proposed new approach.
Abstract:
Public water supplies in England and Wales are provided by around 25 private-sector companies, regulated by an economic regulator (Ofwat) and an environmental regulator (Environment Agency). As part of the regulatory process, companies are required periodically to review their investment needs to maintain safe and secure supplies, and this involves an assessment of the future balance between water supply and demand. The water industry and regulators have developed an agreed set of procedures for this assessment. Climate change has been incorporated into these procedures since the late 1990s, although it has been taken increasingly seriously over time, and considering climate change has been an effective legal requirement since the 2003 Water Act. In the most recent assessment, in 2009, companies were explicitly required to plan for a defined amount of climate change, taking into account climate change uncertainty. A “medium” climate change scenario was defined, together with “wet” and “dry” extremes, based on scenarios developed from a number of climate models. The water industry and its regulators are now gearing up to exploit the new UKCP09 probabilistic climate change projections, but these pose significant practical and conceptual challenges. This paper outlines how the procedures for incorporating climate change information into water resources planning have evolved, and explores the issues currently facing the industry in adapting to climate change.
Abstract:
The problem of a manipulator operating in a noisy workspace and required to move from an initial fixed position P0 to a final position Pf is considered. However, Pf is corrupted by noise, giving rise to P̂f, which may be obtained by sensors. The use of learning automata is proposed to tackle this problem. An automaton is placed at each joint of the manipulator, which moves according to the action chosen by the automaton (forward, backward, stationary) at each instant. Rewarding or penalising the automata simultaneously avoids the inverse kinematics computations that would be necessary if the distance of each joint from the final position had to be calculated. Three variable-structure learning algorithms are used: the discretized linear reward-penalty (DLR-P), the linear reward-penalty (LR-P) and a nonlinear scheme. Each algorithm is tested separately with two (forward, backward) and three (forward, backward, stationary) actions.
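For context, the continuous linear reward-penalty (LR-P) update referred to here is the standard variable-structure scheme: on reward, probability mass moves toward the chosen action; on penalty, it is redistributed toward the remaining actions. A minimal sketch (the learning rates a and b are free parameters, not values from the paper):

```python
import numpy as np

def lr_p_update(p, action, rewarded, a=0.1, b=0.1):
    """Linear reward-penalty (LR-P) update of the action-probability vector.

    On reward, probability mass moves toward the chosen action;
    on penalty, it is redistributed toward the other actions.
    The updated vector always sums to 1.
    """
    r = len(p)
    q = np.empty_like(p)
    if rewarded:
        q[:] = (1 - a) * p
        q[action] = p[action] + a * (1 - p[action])
    else:
        q[:] = b / (r - 1) + (1 - b) * p
        q[action] = (1 - b) * p[action]
    return q

p = np.full(3, 1 / 3)                  # forward, backward, stationary
p = lr_p_update(p, action=0, rewarded=True)
print(p, p.sum())                      # mass shifted to action 0; sum == 1
```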
Abstract:
Scoring rules are an important tool for evaluating the performance of probabilistic forecasting schemes. A scoring rule is called strictly proper if its expectation is optimal if and only if the forecast probability represents the true distribution of the target. In the binary case, strictly proper scoring rules allow for a decomposition into terms related to the resolution and the reliability of a forecast. This fact is particularly well known for the Brier score. In this article, this result is extended to forecasts for finite-valued targets. Both resolution and reliability are shown to have a positive effect on the score. It is demonstrated that resolution and reliability are directly related to forecast attributes that are desirable on grounds independent of the notion of scores. This finding can be considered an epistemological justification of measuring forecast quality by proper scoring rules. A link is provided to the original work of DeGroot and Fienberg, extending their concepts of sufficiency and refinement. The relation to the conjectured sharpness principle of Gneiting et al. is elucidated.
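For the binary case, the decomposition in question is the classical Murphy decomposition of the Brier score, BS = reliability − resolution + uncertainty. A minimal sketch estimating the three terms by binning forecast probabilities (the bin count is arbitrary, and with finitely many bins the identity holds only approximately):

```python
import numpy as np

def brier_decomposition(f, o, n_bins=10):
    """Murphy decomposition of the Brier score for binary outcomes.

    f: forecast probabilities in [0, 1]; o: observed outcomes in {0, 1}.
    Returns (reliability, resolution, uncertainty); the Brier score is
    approximately reliability - resolution + uncertainty.
    """
    f, o = np.asarray(f, float), np.asarray(o, float)
    bins = np.minimum((f * n_bins).astype(int), n_bins - 1)
    obar = o.mean()
    rel = res = 0.0
    for k in range(n_bins):
        mask = bins == k
        if mask.any():
            n_k = mask.sum()
            f_k, o_k = f[mask].mean(), o[mask].mean()
            rel += n_k * (f_k - o_k) ** 2   # calibration error within the bin
            res += n_k * (o_k - obar) ** 2  # spread of bin climatologies
    n = len(f)
    return rel / n, res / n, obar * (1 - obar)

rng = np.random.default_rng(0)
f = rng.uniform(size=1000)
o = (rng.uniform(size=1000) < f).astype(int)   # well-calibrated toy forecasts
rel, res, unc = brier_decomposition(f, o)
print(rel, res, unc, rel - res + unc)          # ~ mean((f - o) ** 2)
```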
Abstract:
Several methods are examined which produce forecasts for time series in the form of probability assignments. The necessary concepts are presented, addressing questions such as how to assess the performance of a probabilistic forecast. Special attention is given to a particular class of models, cluster weighted models (CWMs). CWMs, originally proposed for deterministic forecasts, can be employed for probabilistic forecasting with little modification. Two examples are presented. The first involves estimating the state of (numerically simulated) dynamical systems from noise-corrupted measurements, a problem also known as filtering. There is an optimal solution to this problem, called the optimal filter, to which the considered time series models are compared. (The optimal filter requires the dynamical equations to be known.) In the second example, we aim at forecasting the chaotic oscillations of an experimental bronze spring system. Both examples demonstrate that the considered time series models, and especially the CWMs, provide useful probabilistic information about the underlying dynamical relations. In particular, they provide more than just an approximation to the conditional mean.
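The abstract gives no model details; a simplified stand-in for a CWM-style probabilistic forecast is to fit a Gaussian mixture to joint (x, y) = (current value, next value) samples and read off the conditional density p(y | x). A sketch under that simplification (the toy series and the number of mixture components are assumptions):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Toy series: noisy logistic map; forecast y = s[t+1] from x = s[t].
rng = np.random.default_rng(1)
s = np.empty(2000)
s[0] = 0.4
for t in range(len(s) - 1):
    s[t + 1] = np.clip(3.9 * s[t] * (1 - s[t]) + 0.01 * rng.normal(), 0.0, 1.0)

gmm = GaussianMixture(n_components=8, random_state=0)
gmm.fit(np.column_stack([s[:-1], s[1:]]))      # joint (x, y) samples

def predictive_density(x, y_grid):
    """p(y | x) implied by the joint two-dimensional Gaussian mixture."""
    mu, cov, w = gmm.means_, gmm.covariances_, gmm.weights_
    resp = np.empty(len(w)); m_cond = np.empty(len(w)); v_cond = np.empty(len(w))
    for m in range(len(w)):
        vxx, vxy, vyy = cov[m][0, 0], cov[m][0, 1], cov[m][1, 1]
        resp[m] = w[m] * np.exp(-0.5 * (x - mu[m, 0]) ** 2 / vxx) / np.sqrt(vxx)
        m_cond[m] = mu[m, 1] + vxy / vxx * (x - mu[m, 0])   # conditional mean
        v_cond[m] = vyy - vxy ** 2 / vxx                    # conditional variance
    resp /= resp.sum()                  # mixture weights given the observed x
    dens = np.zeros_like(y_grid)
    for m in range(len(w)):
        dens += resp[m] * np.exp(-0.5 * (y_grid - m_cond[m]) ** 2 / v_cond[m]) \
                / np.sqrt(2 * np.pi * v_cond[m])
    return dens

y_grid = np.linspace(0, 1, 201)
print(predictive_density(0.5, y_grid).argmax() / 200)  # mode near 3.9*0.5*0.5 = 0.975
```

The point of such a model, as the abstract stresses, is that it returns a full predictive density rather than only a conditional mean.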
Abstract:
Proper scoring rules provide a useful means to evaluate probabilistic forecasts. Independently of scoring rules, it has been argued that reliability and resolution are desirable forecast attributes. The mathematical expectation value of the score allows for a decomposition into reliability- and resolution-related terms, demonstrating a relationship between scoring rules and reliability/resolution. A similar decomposition holds for the empirical (i.e. sample average) score over an archive of forecast–observation pairs. This empirical decomposition, though, provides a too optimistic estimate of the potential score (i.e. the optimum score which could be obtained through recalibration), showing that a forecast assessment based solely on the empirical resolution and reliability terms will be misleading. The differences between the theoretical and empirical decompositions are investigated, and specific recommendations are given on how to obtain better estimators of reliability and resolution in the case of the Brier and Ignorance scoring rules.
Abstract:
Interest in attributing the risk of damaging weather-related events to anthropogenic climate change is increasing [1]. Yet climate models used to study the attribution problem typically do not resolve the weather systems associated with damaging events [2] such as the UK floods of October and November 2000. Occurring during the wettest autumn in England and Wales since records began in 1766 [3,4], these floods damaged nearly 10,000 properties across that region, disrupted services severely, and caused insured losses estimated at £1.3 billion (refs 5, 6). Although the flooding was deemed a ‘wake-up call’ to the impacts of climate change at the time [7], such claims are typically supported only by general thermodynamic arguments that suggest increased extreme precipitation under global warming, but fail [8,9] to account fully for the complex hydrometeorology [4,10] associated with flooding. Here we present a multi-step, physically based ‘probabilistic event attribution’ framework showing that it is very likely that global anthropogenic greenhouse gas emissions substantially increased the risk of flood occurrence in England and Wales in autumn 2000. Using publicly volunteered distributed computing [11,12], we generate several thousand seasonal-forecast-resolution climate model simulations of autumn 2000 weather, both under realistic conditions, and under conditions as they might have been had these greenhouse gas emissions and the resulting large-scale warming never occurred. Results are fed into a precipitation-runoff model that is used to simulate severe daily river runoff events in England and Wales (proxy indicators of flood events). The precise magnitude of the anthropogenic contribution remains uncertain, but in nine out of ten cases our model results indicate that twentieth-century anthropogenic greenhouse gas emissions increased the risk of floods occurring in England and Wales in autumn 2000 by more than 20%, and in two out of three cases by more than 90%.
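The quantitative claims at the end correspond to the standard risk-ratio formulation used in probabilistic event attribution: compare the exceedance probability of a damage threshold in a factual ensemble against a counterfactual one. A minimal sketch with entirely made-up ensemble values (the distributions and threshold below are illustrative assumptions, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(0)
# Made-up peak-runoff ensembles standing in for the factual
# (with greenhouse gas forcing) and counterfactual (without) simulations.
factual = rng.gamma(shape=4.0, scale=1.1, size=5000)
counterfactual = rng.gamma(shape=4.0, scale=1.0, size=5000)

threshold = 8.0                           # runoff level proxying a flood event
p1 = (factual > threshold).mean()         # occurrence probability with forcing
p0 = (counterfactual > threshold).mean()  # occurrence probability without

risk_ratio = p1 / p0                      # > 1: emissions increased the risk
far = 1.0 - p0 / p1                       # fraction of attributable risk
print(f"P1={p1:.3f}  P0={p0:.3f}  risk ratio={risk_ratio:.2f}  FAR={far:.2f}")
```

Statements such as "increased the risk by more than 20%" correspond to a risk ratio exceeding 1.2 in the relevant fraction of sampled model versions.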
Abstract:
This is one of the first papers in which arguments are given for treating code-switching and borrowing as similar phenomena. It is argued that distinguishing the two phenomena is theoretically undesirable and empirically very problematic. A probabilistic account of code-switching and a hierarchy of switched constituents (similar to hierarchies of borrowability) are proposed, which account for the fact that some constituents are more likely to be borrowed/switched than others. It is argued that the same kinds of constraints apply to both code-switching and borrowing.
Abstract:
This volume is a serious attempt to open up the subject of European philosophy of science to real thought and provide the structural basis for the interdisciplinary development of its specialist fields, but also to provoke reflection on the idea of ‘European philosophy of science’. These efforts should foster a contemporaneous reflection on what might be meant by philosophy of science in Europe and European philosophy of science, and on how awareness of it could assist philosophers in interpreting and motivating their research through a stronger collective identity. The overarching aim is to set the background for a collaborative project organising, systematising, and ultimately forging an identity for European philosophy of science by creating research structures and developing research networks across Europe to promote its development.
Abstract:
The probabilistic projections of climate change for the United Kingdom (UK Climate Impacts Programme) show a trend towards hotter and drier summers. This suggests an expected increase in cooling demand for buildings, a requirement that conflicts with reducing building energy needs and the related CO2 emissions. Though passive design is used to reduce the thermal loads of a building, a supplementary cooling system is often necessary. For such mixed-mode strategies, indirect evaporative cooling is investigated as a low-energy option in the context of a warmer and drier UK climate. Analysis of the climate projections shows an increase in wet-bulb depression, a good indication of the cooling potential of an evaporative cooler. Modelling a mixed-mode building at two different locations showed that such a building was capable of maintaining adequate thermal comfort in future probable climates. Comparing the control climate to the scenario climate, an increase in the median evaporative cooling load is evident. The shift is greater for London than for Glasgow, with a 71.6% and 3.3% increase in the median annual cooling load respectively. The study shows evaporative cooling should continue to function as an effective low-energy cooling technique in future warming climates.
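Wet-bulb depression (dry-bulb minus wet-bulb temperature) bounds the temperature drop an evaporative cooler can deliver, which is why its projected increase matters here. As a rough illustration only, Stull's (2011) empirical fit estimates wet-bulb temperature from dry-bulb temperature and relative humidity; the paper's own calculation method is not stated, and the example conditions are invented:

```python
import math

def wet_bulb_stull(t_c, rh_pct):
    """Approximate wet-bulb temperature (deg C) from dry-bulb temperature
    (deg C) and relative humidity (%), using Stull's (2011) empirical fit
    (valid roughly for 5-99% RH and -20 to 50 deg C)."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

t_dry, rh = 28.0, 40.0             # hypothetical warm, dry summer hour
depression = t_dry - wet_bulb_stull(t_dry, rh)
print(f"wet-bulb depression: {depression:.1f} K")  # larger => more cooling potential
```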