Abstract:
Several methods are examined that make it possible to produce forecasts for time series in the form of probability assignments. The necessary concepts are presented, addressing questions such as how to assess the performance of a probabilistic forecast. Particular attention is given to a class of models known as cluster weighted models (CWMs). CWMs, originally proposed for deterministic forecasts, can be employed for probabilistic forecasting with little modification. Two examples are presented. The first involves estimating the state of (numerically simulated) dynamical systems from noise-corrupted measurements, a problem also known as filtering. This problem has an optimal solution, called the optimal filter, against which the considered time series models are compared. (The optimal filter requires the dynamical equations to be known.) In the second example, we aim at forecasting the chaotic oscillations of an experimental bronze spring system. Both examples demonstrate that the considered time series models, and especially the CWMs, provide useful probabilistic information about the underlying dynamical relations. In particular, they provide more than just an approximation to the conditional mean.
Abstract:
This volume is a serious attempt to open up the subject of European philosophy of science to real thought, to provide the structural basis for the interdisciplinary development of its specialist fields, and to provoke reflection on the idea of ‘European philosophy of science’. These efforts should foster a contemporaneous reflection on what might be meant by philosophy of science in Europe and European philosophy of science, and on how awareness of it could help philosophers interpret and motivate their research through a stronger collective identity. The overarching aim is to set the background for a collaborative project organising, systematising and ultimately forging an identity for European philosophy of science by creating research structures and developing research networks across Europe to promote its development.
Abstract:
The probabilistic projections of climate change for the United Kingdom (UK Climate Impacts Programme) show a trend towards hotter and drier summers. This suggests an expected increase in cooling demand for buildings – a requirement that conflicts with reducing building energy needs and related CO2 emissions. Though passive design is used to reduce the thermal loads of a building, a supplementary cooling system is often necessary. For such mixed-mode strategies, indirect evaporative cooling is investigated as a low-energy option in the context of a warmer and drier UK climate. Analysis of the climate projections shows an increase in wet-bulb depression, which provides a good indication of the cooling potential of an evaporative cooler. Modelling a mixed-mode building at two different locations showed that such a building was capable of maintaining adequate thermal comfort in probable future climates. Comparing the control climate to the scenario climate, an increase in the median evaporative cooling load is evident. The shift is greater for London than for Glasgow, with respective increases of 71.6% and 3.3% in the median annual cooling load. The study shows that evaporative cooling should continue to function as an effective low-energy cooling technique in future, warmer climates.
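The link between wet-bulb depression and cooling potential can be sketched with the standard wet-bulb-effectiveness model for an indirect evaporative cooler; the effectiveness value and the temperatures below are illustrative assumptions, not figures from the study.

```python
# Minimal sketch of the wet-bulb-effectiveness model commonly used for
# indirect evaporative coolers. The effectiveness (0.6) and the example
# temperatures are placeholders, not values from the paper.

def wet_bulb_depression(t_dry_bulb, t_wet_bulb):
    """Dry-bulb minus wet-bulb temperature (K): the driving potential."""
    return t_dry_bulb - t_wet_bulb

def supply_temperature(t_dry_bulb, t_wet_bulb, effectiveness=0.6):
    """Supply air temperature: T_supply = T_db - eff * (T_db - T_wb)."""
    return t_dry_bulb - effectiveness * wet_bulb_depression(t_dry_bulb, t_wet_bulb)

# A hotter, drier scenario widens the depression, so the cooler removes
# more heat even though the supply air itself is warmer.
control = supply_temperature(28.0, 21.0)   # depression 7 K
scenario = supply_temperature(31.0, 22.0)  # depression 9 K
```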
Abstract:
The Chartered Institution of Building Services Engineers (CIBSE) produced a technical memorandum (TM36) presenting research on future climate impacts on building energy use and thermal comfort. One climate projection for each of four CO2 emissions scenarios was used in TM36, providing a deterministic outlook. As part of the UK Climate Impacts Programme (UKCIP), probabilistic climate projections are being studied in relation to building energy simulation techniques. Including uncertainty in climate projections is considered an important advance in climate impacts modelling and is included in the latest UKCIP data (UKCP09). Incorporating the stochastic nature of these new climate projections in building energy modelling requires a significant increase in data handling and careful statistical interpretation of the results to provide meaningful conclusions. This paper compares the results of building energy simulations applying deterministic and probabilistic climate data, based on two case-study buildings: (i) a mixed-mode office building with exposed thermal mass and (ii) a mechanically ventilated, light-weight office building. Building (i) represents an energy-efficient design that provides passive and active measures to maintain thermal comfort. Building (ii) relies entirely on mechanical means for heating and cooling, with its light-weight construction raising concern over increased cooling loads in a warmer climate. Devising an effective probabilistic approach highlighted greater uncertainty in predicting building performance, depending on the type of building modelled and the performance factors under consideration. Results indicate that the range of calculated quantities depends not only on the building type but also, strongly, on the performance parameters of interest. Uncertainty is likely to be particularly marked with regard to thermal comfort in naturally ventilated buildings.
Abstract:
Long-term exposure of skylarks to a fictitious insecticide and of wood mice to a fictitious fungicide were modelled probabilistically in a Monte Carlo simulation. Within the same simulation, the consequences of exposure to pesticides for reproductive success were modelled using the toxicity-exposure-linking rules developed by R.S. Bennett et al. (2005) and the interspecies extrapolation factors suggested by R. Luttik et al. (2005). We built models to reflect a range of scenarios and as a result were able to show how exposure to a pesticide might alter the number of individuals engaged in any given phase of the breeding cycle at any given time, and to predict the numbers of new adults at the season's end.
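The general shape of such a Monte Carlo exposure simulation can be sketched as follows; the function name, rates and probabilities are invented placeholders, not the cited toxicity-exposure-linking rules.

```python
import random

# Illustrative Monte Carlo sketch of exposure affecting breeding success.
# All parameters (exposure probability, success probabilities) are invented
# for illustration; they do not come from Bennett et al. or Luttik et al.

def simulate_breeding(n_pairs, p_exposure, p_success_unexposed,
                      p_success_exposed, seed=0):
    """Return the number of pairs that breed successfully in one attempt."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(n_pairs):
        exposed = rng.random() < p_exposure
        p_success = p_success_exposed if exposed else p_success_unexposed
        if rng.random() < p_success:
            successes += 1
    return successes
```

Repeating such runs over many sampled parameter sets yields the distributions of individuals per breeding phase that the study reports.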
Abstract:
A detailed spectrally-resolved extraterrestrial solar spectrum (ESS) is important for line-by-line radiative transfer modeling in the near-infrared (near-IR). Very few observationally-based high-resolution ESS are available in this spectral region; consequently, the theoretically-calculated ESS by Kurucz has been widely adopted. We present the CAVIAR (Continuum Absorption at Visible and Infrared Wavelengths and its Atmospheric Relevance) ESS, which is derived by applying the Langley technique to calibrated observations from a ground-based high-resolution Fourier transform spectrometer (FTS) in atmospheric windows from 2000 to 10000 cm⁻¹ (1–5 μm). There is good agreement between the strengths and positions of solar lines in the CAVIAR and the satellite-based ACE-FTS (Atmospheric Chemistry Experiment-FTS) ESS in the spectral region where they overlap, and good agreement with other ground-based FTS measurements in two near-IR windows. However, there are significant differences in structure between the CAVIAR ESS and spectra from semi-empirical models. In addition, we found a difference of up to 8% in the absolute (and hence the wavelength-integrated) irradiance between the CAVIAR ESS and that of Thuillier et al., which was based on measurements from the Atmospheric Laboratory for Applications and Science satellite and other sources. In many spectral regions this difference is significant, as the coverage-factor k = 2 (95% confidence limit) uncertainties of the two sets of observations do not overlap. Since the total solar irradiance is relatively well constrained, if the CAVIAR ESS is correct, this would indicate an integrated “loss” of solar irradiance of about 30 W m⁻² in the near-IR that would have to be compensated by an increase at other wavelengths.
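The Langley technique mentioned above can be sketched in a few lines: regress the log of the measured signal against airmass and extrapolate to zero airmass to recover the extraterrestrial value. The data below are synthetic, not CAVIAR observations.

```python
import math

# Sketch of the Langley technique: ln V = ln V0 - tau * m, so a straight-line
# fit of ln(signal) against airmass m gives the extraterrestrial signal V0
# at the intercept (m = 0). V0 and tau below are synthetic placeholders.

def langley_fit(airmasses, signals):
    """Least-squares fit of ln(V) = ln(V0) - tau*m; returns (V0, tau)."""
    ys = [math.log(v) for v in signals]
    n = len(airmasses)
    mean_x = sum(airmasses) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(airmasses, ys))
             / sum((x - mean_x) ** 2 for x in airmasses))
    intercept = mean_y - slope * mean_x
    return math.exp(intercept), -slope

# Synthetic clear-sky series with V0 = 2.0 and optical depth tau = 0.1.
m = [1.0, 1.5, 2.0, 3.0, 5.0]
v = [2.0 * math.exp(-0.1 * mi) for mi in m]
v0, tau = langley_fit(m, v)
```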
Abstract:
There are several scoring rules that one can choose from in order to score probabilistic forecasting models or estimate model parameters. Whilst it is generally agreed that proper scoring rules are preferable, there is no clear criterion for preferring one proper scoring rule above another. This manuscript compares and contrasts some commonly used proper scoring rules and provides guidance on scoring rule selection. In particular, it is shown that the logarithmic scoring rule prefers erring with more uncertainty, the spherical scoring rule prefers erring with lower uncertainty, whereas the other scoring rules are indifferent to either option.
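The two named rules are easy to state for a categorical forecast, and a toy example shows that they need not even agree on which of two imperfect forecasts is better. The forecast vectors below are invented for illustration; this is not a reproduction of the paper's analysis.

```python
import math

# Logarithmic and spherical scores for a discrete forecast p over categories,
# with the outcome given as an index. Both are proper scoring rules, yet they
# can rank the same pair of imperfect forecasts differently, as shown below.

def log_score(p, outcome):
    """Logarithmic score: ln of the probability assigned to the outcome."""
    return math.log(p[outcome])

def spherical_score(p, outcome):
    """Spherical score: p(outcome) divided by the L2 norm of the forecast."""
    return p[outcome] / math.sqrt(sum(q * q for q in p))

outcome = 0
forecast_a = [0.40, 0.30, 0.30]  # flatter, higher-uncertainty forecast
forecast_b = [0.45, 0.55, 0.00]  # sharper forecast, mass misplaced on class 1

log_prefers_b = log_score(forecast_b, outcome) > log_score(forecast_a, outcome)
spherical_prefers_a = (spherical_score(forecast_a, outcome)
                       > spherical_score(forecast_b, outcome))
```

Here the logarithmic rule ranks forecast B higher while the spherical rule ranks forecast A higher, which is exactly the kind of divergence that makes the choice of rule consequential.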
Abstract:
Three wind gust estimation (WGE) methods implemented in the numerical weather prediction (NWP) model COSMO-CLM are evaluated with respect to their forecast quality using skill scores. Two methods estimate gusts locally from the mean wind speed and the turbulence state of the atmosphere, while the third considers the mixing-down of high momentum within the planetary boundary layer (WGE Brasseur). One hundred and fifty-eight windstorms from the last four decades are simulated, and the results are compared with gust observations at 37 stations in Germany. Skill scores reveal that the local WGE methods show overall better behaviour, whilst WGE Brasseur performs less well except in mountain regions. The WGE method introduced here, based on turbulent kinetic energy (TKE), permits a probabilistic interpretation using statistical characteristics of gusts at observational sites for an assessment of uncertainty. The WGE TKE formulation has the advantage of a ‘native’ interpretation of wind gusts as the result of the local appearance of TKE. The inclusion of a probabilistic WGE TKE approach in NWP models thus has several advantages over other methods, as it has the potential to estimate the uncertainties of gusts at observational sites.
Abstract:
Infrared polarization and intensity imagery provide complementary and discriminative information for image understanding and interpretation. In this paper, a novel fusion method is proposed that effectively merges this information using various combination rules. It makes use of both low-frequency and high-frequency image components from the support value transform (SVT) and applies fuzzy logic in the combination process. The images to be fused (both infrared polarization and intensity images) are first decomposed into low-frequency component images and support value image sequences by the SVT. The low-frequency component images are then combined using a fuzzy combination rule blending three sub-combination methods: (1) region feature maximum, (2) region feature weighted average and (3) pixel value maximum; the support value image sequences are merged using a fuzzy combination rule fusing two sub-combination methods: (1) pixel energy maximum and (2) region feature weighting. Using two newly defined features as variables, i.e. the low-frequency difference feature for the low-frequency component images and the support-value difference feature for the support value image sequences, trapezoidal membership functions are proposed and developed to tune the fuzzy fusion process. Finally, the fused image is obtained by inverse SVT operations. Experimental results from both visual inspection and quantitative evaluation indicate the superiority of the proposed method over its counterparts in the fusion of infrared polarization and intensity images.
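A trapezoidal membership function of the kind used to tune such fuzzy rules can be sketched as below; the breakpoints (a, b, c, d) are generic placeholders, not the values fitted to the paper's difference features.

```python
# Generic trapezoidal membership function: 0 outside [a, d], 1 on the
# plateau [b, c], and linear on the ramps. The breakpoints are placeholders.

def trapezoidal_membership(x, a, b, c, d):
    """Degree of membership of x in a trapezoidal fuzzy set (a < b <= c < d)."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)   # rising ramp
    return (d - x) / (d - c)       # falling ramp
```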
Abstract:
We generalize the popular ensemble Kalman filter to an ensemble transform filter, in which the prior distribution can take the form of a Gaussian mixture or a Gaussian kernel density estimator. The design of the filter is based on a continuous formulation of the Bayesian filter analysis step. We call the new filter algorithm the ensemble Gaussian-mixture filter (EGMF). The EGMF is implemented for three simple test problems (Brownian dynamics in one dimension, Langevin dynamics in two dimensions and the three-dimensional Lorenz-63 model). It is demonstrated that the EGMF is capable of tracking systems with non-Gaussian uni- and multimodal ensemble distributions. Copyright © 2011 Royal Meteorological Society
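For orientation, the filter that the EGMF generalizes is the standard stochastic ensemble Kalman filter; a minimal scalar sketch of its analysis step is given below. This is background only, under the simplifying assumptions of a one-dimensional state observed directly; it is not the EGMF algorithm itself.

```python
import random

# Minimal stochastic EnKF analysis step: scalar state, direct observation
# (observation operator H = 1). Purely illustrative background for the EGMF.

def enkf_update(ensemble, y_obs, obs_var, seed=0):
    """Return the analysis ensemble after assimilating one observation."""
    rng = random.Random(seed)
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    gain = var / (var + obs_var)  # scalar Kalman gain
    # Each member assimilates its own perturbed copy of the observation.
    return [x + gain * (y_obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]
```

The EGMF replaces the single Gaussian implicit in this update with a Gaussian mixture or kernel density prior, which is what lets it track non-Gaussian and multimodal ensemble distributions.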
Abstract:
Many applications, such as intermittent data assimilation, lead to a recursive application of Bayesian inference within a Monte Carlo context. Popular data assimilation algorithms include sequential Monte Carlo methods and ensemble Kalman filters (EnKFs). These methods differ in the way Bayesian inference is implemented. Sequential Monte Carlo methods rely on importance sampling combined with a resampling step, while EnKFs utilize a linear transformation of Monte Carlo samples based on the classic Kalman filter. While EnKFs have proven to be quite robust even for small ensemble sizes, they are not consistent since their derivation relies on a linear regression ansatz. In this paper, we propose another transform method, which does not rely on any a priori assumptions on the underlying prior and posterior distributions. The new method is based on solving an optimal transportation problem for discrete random variables. © 2013, Society for Industrial and Applied Mathematics
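The importance-sampling-plus-resampling step that the text contrasts with the EnKF's linear transform can be sketched as follows, assuming a Gaussian likelihood and a scalar state; the numbers and function name are illustrative.

```python
import math
import random

# Sketch of one sequential Monte Carlo assimilation step: weight particles
# by a Gaussian likelihood of the observation, then resample. Illustrative
# only; the paper's proposed method replaces this with an optimal-transport
# transform rather than resampling.

def resample_step(particles, y_obs, obs_var, seed=0):
    """Importance weighting followed by multinomial resampling."""
    rng = random.Random(seed)
    weights = [math.exp(-0.5 * (y_obs - x) ** 2 / obs_var) for x in particles]
    # Draw a new, equally weighted ensemble in proportion to the weights.
    return rng.choices(particles, weights=weights, k=len(particles))
```

Resampling like this introduces sampling noise, which is one motivation for deterministic transform alternatives such as the one proposed in the paper.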
Abstract:
Two recent works have adapted the Kalman–Bucy filter to an ensemble setting. In the first formulation, the ensemble of perturbations is updated by the solution of an ordinary differential equation (ODE) in pseudo-time, while the mean is updated as in the standard Kalman filter. In the second formulation, the full ensemble is updated in the analysis step as the solution of a single set of ODEs in pseudo-time. Neither requires matrix inversions, except for the frequently diagonal observation error covariance. We analyse the behaviour of the ODEs involved in these formulations. We demonstrate that they stiffen for large magnitudes of the ratio of background error to observational error variance, and that using the integration scheme proposed in both formulations can lead to failure. A numerical integration scheme is proposed that is both stable and not computationally expensive. We develop transform-based alternatives for these Bucy-type approaches so that the integrations are computed in ensemble space, where the variables are weights (of dimension equal to the ensemble size) rather than model variables. Finally, the performance of our ensemble transform Kalman–Bucy implementations is evaluated using three models: the 3-variable Lorenz 1963 model, the 40-variable Lorenz 1996 model and a medium-complexity atmospheric general circulation model known as SPEEDY. The results from all three models are encouraging and warrant further exploration of these assimilation techniques.
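The stiffness issue can be illustrated in the simplest scalar case. In pseudo-time the analysis variance obeys dp/ds = -p²/r on s ∈ [0, 1], whose exact solution p(1) = p₀r/(p₀ + r) reproduces the Kalman update; the code below (a sketch, with illustrative numbers rather than the paper's scheme) shows explicit Euler failing for a large background-to-observation variance ratio.

```python
# Scalar sketch of the pseudo-time ODE behind an ensemble Kalman-Bucy
# analysis step: dp/ds = -p^2 / r with p(0) = p0. For p0/r >> 1 the ODE is
# stiff, and a coarse explicit Euler step produces a nonphysical negative
# variance, while a fine step recovers the exact Kalman update.

def euler_variance(p0, r, n_steps):
    """Integrate dp/ds = -p^2/r over s in [0, 1] with explicit Euler."""
    p = p0
    dt = 1.0 / n_steps
    for _ in range(n_steps):
        p = p + dt * (-p * p / r)
    return p

def exact_variance(p0, r):
    """Exact solution at s = 1, identical to the Kalman analysis variance."""
    return p0 * r / (p0 + r)

p0, r = 100.0, 1.0                    # variance ratio of 100: stiff regime
coarse = euler_variance(p0, r, 1)     # single step: blows through zero
fine = euler_variance(p0, r, 10000)   # fine steps: tracks the exact answer
```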
Abstract:
We propose and demonstrate a fully probabilistic (Bayesian) approach to the detection of cloudy pixels in thermal infrared (TIR) imagery observed from satellite over oceans. Using this approach, we show how to exploit the prior information and the fast forward modelling capability that are typically available in the operational context to obtain improved cloud detection. The probability of clear sky for each pixel is estimated by applying Bayes' theorem, and we describe how to apply Bayes' theorem to this problem in general terms. Joint probability density functions (PDFs) of the observations in the TIR channels are needed; the PDFs for clear conditions are calculable from forward modelling and those for cloudy conditions have been obtained empirically. Using analysis fields from numerical weather prediction as prior information, we apply the approach to imagery representative of imagers on polar-orbiting platforms. In comparison with the established cloud-screening scheme, the new technique decreases both the rate of failure to detect cloud contamination and the false-alarm rate by one quarter. The rate of occurrence of cloud-screening-related errors of >1 K in area-averaged SSTs is reduced by 83%. Copyright © 2005 Royal Meteorological Society.
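The Bayes'-theorem calculation at the core of this approach is a two-class posterior: a prior probability of clear sky combined with the per-class PDF values of the observed TIR radiances. The sketch below uses invented PDF values and prior; only the formula is taken from the description.

```python
# Two-class Bayes' theorem: posterior probability of clear sky given the
# prior and the observation PDFs under clear and cloudy conditions.
# The numerical values below are illustrative placeholders.

def posterior_clear(prior_clear, pdf_obs_given_clear, pdf_obs_given_cloudy):
    """P(clear | obs) = P(clear) f(obs|clear) / sum over both classes."""
    num = prior_clear * pdf_obs_given_clear
    den = num + (1.0 - prior_clear) * pdf_obs_given_cloudy
    return num / den

p = posterior_clear(prior_clear=0.7,
                    pdf_obs_given_clear=0.5,
                    pdf_obs_given_cloudy=0.05)
```

In the operational setting described, the prior comes from NWP analysis fields and the clear-sky PDF from fast forward modelling, while the cloudy PDF is empirical.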
Abstract:
The aim of this article is to improve the communication of the probabilistic flood forecasts generated by hydrological ensemble prediction systems (HEPS) by understanding perceptions of different methods of visualizing probabilistic forecast information. This study focuses on inter-expert communication and accounts for differences in visualization requirements based on the information content necessary for individual users. The perceptions of the expert group addressed in this study are important because they are the designers and primary users of existing HEPS. Nevertheless, they have sometimes resisted the release of uncertainty information to the general public because of doubts about whether it can be communicated in ways that would be readily understood by nonexperts. In this article, we explore the strengths and weaknesses of existing HEPS visualization methods and thereby formulate some wider recommendations about best practice for HEPS visualization and communication. We suggest that specific training on probabilistic forecasting would foster the use of probabilistic forecasts in a wider range of applications. A case study exercise showed that there is no overarching agreement between experts on how to display probabilistic forecasts or on what they consider the essential information that should accompany plots and diagrams. In this article, we propose a list of minimum properties that, if consistently displayed with probabilistic forecasts, would make the products more easily understandable. Copyright © 2012 John Wiley & Sons, Ltd.
Abstract:
Hydrological ensemble prediction systems (HEPS) have in recent years been increasingly used by European hydrometeorological agencies for the operational forecasting of floods. The most obvious advantage of HEPS is that more of the uncertainty in the modelling system can be assessed. In addition, ensemble prediction systems generally have better skill than deterministic systems, both in terms of mean forecast performance and in the potential forecasting of extreme events. Research efforts have so far mostly been devoted to improving the physical and technical aspects of the model systems, such as increased resolution in time and space and better descriptions of physical processes. Developments like these are certainly needed; however, in this paper we argue that other areas of HEPS need urgent attention. This was also the outcome of a group exercise and a survey conducted among operational forecasters within the European Flood Awareness System (EFAS) to identify the top priorities for improvement of their own system. The priorities turned out to span a range of areas, the most popular being to include verification and assessment of past forecast performance, to adopt a multi-model approach for hydrological modelling, to increase forecast skill in the medium range (>3 days) and to place more focus on education and training in the interpretation of forecasts. In light of limited resources, we suggest a simple model that classifies the identified priorities in terms of their cost and complexity, in order to decide in which order to tackle them. This model is then used to create an action plan of short-, medium- and long-term research priorities, with the ultimate goal of optimally improving EFAS in particular and spurring the development of operational HEPS in general.