871 results for probabilistic refinement calculus


Relevance:

20.00%

Publisher:

Abstract:

We present a growth analysis model that combines large amounts of environmental data with limited amounts of biological data and apply it to Corbicula japonica. The model uses the maximum-likelihood method with the Akaike information criterion, which provides an objective criterion for model selection. An adequate distribution for describing a single cohort is selected from available probability density functions, which are expressed by location and scale parameters. Daily relative increase rates of the location parameter are expressed by a multivariate logistic function with environmental factors for each day and categorical variables indicating animal ages as independent variables. Daily relative increase rates of the scale parameter are expressed by an equation describing the relationship with the daily relative increase rate of the location parameter. Corbicula japonica grows to a modal shell length of 0.7 mm during the first year in Lake Abashiri. Compared with the attainable maximum size of about 30 mm, the growth of juveniles is extremely slow because it is less susceptible to environmental factors until the second winter. The extremely slow growth in Lake Abashiri could reflect geographical genetic variation within C. japonica.
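A minimal sketch of the model-selection step described above, assuming synthetic shell-length data and a hypothetical set of candidate distributions: each candidate is fitted by maximum likelihood and the one with the lowest AIC is retained.

```python
# Sketch only: AIC-based selection of a cohort distribution (not the authors' code).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
shell_lengths = rng.lognormal(mean=-0.4, sigma=0.3, size=200)  # hypothetical cohort (mm)

candidates = {
    "normal": stats.norm,
    "lognormal": stats.lognorm,
    "gamma": stats.gamma,
}

def aic(dist, data):
    params = dist.fit(data)                      # maximum-likelihood fit
    loglik = np.sum(dist.logpdf(data, *params))  # log-likelihood at the MLE
    k = len(params)                              # number of fitted parameters
    return 2 * k - 2 * loglik

scores = {name: aic(dist, shell_lengths) for name, dist in candidates.items()}
print(scores, "-> selected:", min(scores, key=scores.get))
```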

Relevance:

20.00%

Publisher:

Abstract:

Bayesian-formulated neural networks are implemented using the hybrid Monte Carlo method for probabilistic fault identification in cylindrical shells. Each of the 20 nominally identical cylindrical shells is divided into three substructures. Holes of (12±2) mm in diameter are introduced in each of the substructures and vibration data are measured. Modal properties and the Coordinate Modal Assurance Criterion (COMAC) are utilized to train the two modal-property neural networks. These COMAC values are calculated by taking the natural-frequency vector to be an additional mode. Modal energies are calculated by determining the integrals of the real and imaginary components of the frequency response functions over bandwidths of 12% of the natural frequencies. The modal energies and the Coordinate Modal Energy Assurance Criterion (COMEAC) are used to train the two frequency-response-function neural networks. The averages of the two sets of trained networks (COMAC and COMEAC, as well as modal properties and modal energies) form two committees of networks. The COMEAC and the COMAC are found to be better identification data than the modal properties and modal energies used directly. The committee approach is observed to give lower standard deviations than the individual methods. The main advantage of the Bayesian formulation is that it gives damage identities together with their respective confidence intervals.
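A minimal sketch of the COMAC computation used above as identification data, assuming the standard coordinate-wise formula and synthetic mode shapes for an undamaged and a damaged state:

```python
# Sketch only: standard COMAC between two sets of mode shapes (assumed data).
import numpy as np

def comac(phi_a, phi_b):
    """phi_a, phi_b: (n_coords, n_modes) mode-shape matrices for two states."""
    num = np.abs(phi_a * phi_b).sum(axis=1) ** 2
    den = (phi_a ** 2).sum(axis=1) * (phi_b ** 2).sum(axis=1)
    return num / den  # near 1.0: coordinate unaffected; low values flag damage

rng = np.random.default_rng(1)
phi_undamaged = rng.normal(size=(30, 5))   # hypothetical: 30 coordinates, 5 modes
phi_damaged = phi_undamaged + 0.3 * rng.normal(size=(30, 5)) * (rng.random((30, 1)) < 0.2)
print(np.argsort(comac(phi_undamaged, phi_damaged))[:5])  # coordinates most likely damaged
```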

Relevance:

20.00%

Publisher:

Abstract:

This paper presents some developments in query expansion and document representation of our spoken document retrieval system and shows how various retrieval techniques affect performance for different sets of transcriptions derived from a common speech source. Modified document representations are used, which combine several query-expansion techniques, knowledge-based on the one hand and statistics-based on the other. Taken together, these techniques can improve Average Precision by over 19% relative to a system similar to that which we presented at TREC-7. These new experiments have also confirmed that the degradation of Average Precision due to a word error rate (WER) of 25% is quite small (3.7% relative) and can be reduced to almost zero (0.2% relative). The overall improvement of the retrieval system can also be observed for seven different sets of transcriptions from different recognition engines with WERs ranging from 24.8% to 61.5%. We hope to repeat these experiments when larger document collections become available, in order to evaluate the scalability of these techniques.
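A minimal sketch of one statistics-based expansion technique of the kind combined above, blind relevance feedback: terms weighted highly in the top-ranked documents of a first retrieval pass are appended to the query. The documents, query and parameters are illustrative assumptions, not the TREC collection used in the paper.

```python
# Sketch only: blind relevance feedback with TF-IDF ranking (assumed toy data).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "the parliament debated the new budget proposal",
    "the budget deficit worries european finance ministers",
    "the football match ended in a late penalty",
]
query = "budget"

vec = TfidfVectorizer()
D = vec.fit_transform(docs)

# First-pass ranking, then append the highest-weighted terms of the top documents.
first_pass = cosine_similarity(vec.transform([query]), D).ravel().argsort()[::-1]
top_doc_terms = np.asarray(D[first_pass[:2]].sum(axis=0)).ravel()
expansion = [vec.get_feature_names_out()[i] for i in top_doc_terms.argsort()[::-1][:3]]
expanded_query = query + " " + " ".join(expansion)

second_pass = cosine_similarity(vec.transform([expanded_query]), D).ravel().argsort()[::-1]
print(expanded_query, second_pass)
```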

Relevance:

20.00%

Publisher:

Abstract:

Demodulation is an ill-posed problem whenever both carrier and envelope signals are broadband and unknown. Here, we approach this problem using the methods of probabilistic inference. The new approach, called Probabilistic Amplitude Demodulation (PAD), is computationally challenging but improves on existing methods in a number of ways. In contrast to previous approaches to demodulation, it satisfies five key desiderata: PAD has soft constraints because it is probabilistic; PAD is able to adjust automatically to the signal because it learns parameters; PAD is user-steerable because the solution can be shaped by user-specific prior information; PAD is robust to broadband noise because this is modeled explicitly; and PAD's solution is self-consistent, empirically satisfying a Carrier Identity property. Furthermore, the probabilistic view naturally encompasses noise and uncertainty, allowing PAD to cope with missing data and return error bars on carrier and envelope estimates. Finally, we show that when PAD is applied to a bandpass-filtered signal, the stop-band energy of the inferred carrier is minimal, making PAD well-suited to sub-band demodulation. © 2006 IEEE.
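A minimal sketch of the generative view underlying this kind of model, under assumed parameters and a simple smoothing filter: a slowly varying positive envelope multiplies a broadband carrier and noise is added. The paper's contribution is inferring envelope and carrier back from the observed product, which is not shown here.

```python
# Sketch only: forward (generative) model of amplitude demodulation (assumed parameters).
import numpy as np

rng = np.random.default_rng(2)
n = 2000
# Slow latent process -> strictly positive envelope via a softplus-style link.
slow = np.convolve(rng.normal(size=n), np.ones(200) / 200, mode="same")
envelope = np.log1p(np.exp(4.0 * slow))
carrier = rng.normal(size=n)                              # broadband carrier
signal = envelope * carrier + 0.05 * rng.normal(size=n)   # the observed data
print(signal[:5])
```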

Relevance:

20.00%

Publisher:

Abstract:

This paper explores the current state of the art in performance indicators and the use of probabilistic approaches in climate change impact studies. It presents a critical review of recent publications in this field, focussing on (1) metrics for energy use for heating and cooling, emissions, overheating and high-level performance aspects, and (2) the uptake of uncertainty and risk analysis. This is followed by a case study, which is used to explore some of the contextual issues around the broader uptake of climate change impact studies in practice. The work concludes that probabilistic predictions of the impact of climate change are feasible, but only under strict and explicitly stated assumptions. © 2011 Elsevier B.V. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

This article investigates how to use the UK probabilistic climate-change projections (UKCP09) in rigorous building energy analysis. Two office buildings (deep plan and shallow plan) are used as case studies to demonstrate the application of UKCP09. Three different methods for reducing the computational demands are explored: statistical reduction (Finkelstein-Schafer [F-S] statistics), simplification using degree-day theory, and the use of metamodels. The first method, which is based on an established technique, can be used as a reference because it provides the most accurate information. However, the weather files must be selected automatically on the basis of the F-S statistic, since thousands of weather files created by the UKCP09 weather generator need to be processed. A combination of the second (degree-day theory) and third (metamodels) methods requires only a relatively small number of simulation runs, but still provides valuable information for further uncertainty and sensitivity analyses. The article also demonstrates how grid computing can be used to speed up the calculation for many independent EnergyPlus models by harnessing the processing power of idle desktop computers. © 2011 International Building Performance Simulation Association (IBPSA).
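A minimal sketch of F-S-based selection of a representative weather year, assuming synthetic daily temperatures stand in for UKCP09 weather-generator output and only one weather variable is used (practical selection typically weights several):

```python
# Sketch only: Finkelstein-Schafer statistic for choosing a representative year (assumed data).
import numpy as np

rng = np.random.default_rng(3)
years = rng.normal(loc=10, scale=5, size=(100, 365))  # 100 candidate years of daily means
pooled = np.sort(years.ravel())                       # long-term empirical distribution

def fs_statistic(daily, pooled):
    """Mean absolute difference between the year's ECDF and the long-term ECDF."""
    daily = np.sort(daily)
    ecdf_year = np.arange(1, daily.size + 1) / daily.size
    ecdf_pool = np.searchsorted(pooled, daily, side="right") / pooled.size
    return np.mean(np.abs(ecdf_year - ecdf_pool))

scores = np.array([fs_statistic(y, pooled) for y in years])
print("representative year index:", scores.argmin())
```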

Relevance:

20.00%

Publisher:

Abstract:

Active control has been shown to be a feasible technology for suppressing thermoacoustic instability in continuous combustion systems, and the design of the control strategy depends substantially on the reliability of the flame model. In this paper, refinement of the G-equation flame model for the dynamics of lean premixed combustion is investigated. Specifically, the dynamics between the flame speed S_u and the equivalence ratio phi are proposed on the basis of numerical calculations and physical explanations. Finally, the developed model is tested on one set of experimental data.
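For reference, the kinematic G-equation on which such models are built tracks the flame front as the level set G(x, t) = 0, advected by the flow velocity u and propagating normal to itself at the flame speed S_u (standard form; the paper's refinement concerns how S_u responds to the equivalence ratio phi):

```latex
\frac{\partial G}{\partial t} + \mathbf{u} \cdot \nabla G = S_u \, \lvert \nabla G \rvert
```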

Relevance:

20.00%

Publisher:

Abstract:

We propose a novel model for the spatio-temporal clustering of trajectories based on motion, which applies to challenging street-view video sequences of pedestrians captured by a mobile camera. A key contribution of our work is the introduction of novel probabilistic region trajectories, motivated by the non-repeatability of the segmentation of frames in a video sequence. Hierarchical image segments are obtained using a state-of-the-art hierarchical segmentation algorithm and connected across adjacent frames in a directed acyclic graph. The region trajectories and measures of confidence are extracted from this graph using a dynamic-programming-based optimisation. Our second main contribution is a Bayesian framework with a twofold goal: to learn, from video features, the Random Forests classifier of motion patterns that is optimal in a maximum-likelihood sense, and to construct a single graph from region trajectories of different frames, lengths and hierarchical levels. Finally, we demonstrate the use of Isomap for effective spatio-temporal clustering of the region trajectories of pedestrians. We support our claims with experimental results on new and existing challenging video sequences. © 2011 IEEE.
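A minimal sketch of the final clustering step, assuming fixed-length motion descriptors stand in for the probabilistic region trajectories extracted in the paper: the descriptors are embedded with Isomap and then clustered.

```python
# Sketch only: Isomap embedding followed by clustering of trajectory descriptors (assumed data).
import numpy as np
from sklearn.manifold import Isomap
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
# Hypothetical descriptors: 60 trajectories, each summarised by a 20-D motion feature.
descriptors = np.vstack([
    rng.normal(loc=0.0, scale=0.5, size=(30, 20)),  # e.g. pedestrians moving one way
    rng.normal(loc=2.0, scale=0.5, size=(30, 20)),  # e.g. pedestrians moving the other way
])

embedding = Isomap(n_neighbors=8, n_components=2).fit_transform(descriptors)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embedding)
print(labels)
```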