957 results for Probability distributions


Relevance: 60.00%

Abstract:

Scene flow methods estimate the three-dimensional motion field for points in the world, using multi-camera video data. Such methods combine multi-view reconstruction with motion estimation. This paper describes an alternative formulation for dense scene flow estimation that provides reliable results using only two cameras, by fusing stereo and optical flow estimation into a single coherent framework. Internally, the proposed algorithm generates probability distributions for optical flow and disparity. Taking the uncertainty in the intermediate stages into account enables more reliable estimation of the 3D scene flow than previous methods achieve. To handle the aperture problems inherent in the estimation of optical flow and disparity, a multi-scale method along with a novel region-based technique is used within a regularized solution. This combined approach both preserves discontinuities and prevents over-regularization – two problems commonly associated with basic multi-scale approaches. Experiments with synthetic and real test data demonstrate the strength of the proposed approach.
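The key idea above is that the intermediate flow and disparity estimates are carried as full probability distributions rather than point estimates. A minimal sketch of that idea, assuming (for illustration only) independent per-pixel distributions over the same set of candidate matches fused by an elementwise product; the paper's actual fusion couples flow and disparity inside a regularized scene-flow estimate:

```python
import numpy as np

def fuse_distributions(p_flow, p_disparity):
    """Fuse two per-pixel probability distributions over candidate matches
    by elementwise product and renormalisation (assumes independence)."""
    fused = p_flow * p_disparity
    total = fused.sum()
    if total == 0:
        # fall back to a uniform distribution when the evidence conflicts
        return np.full_like(fused, 1.0 / fused.size)
    return fused / total

p_flow = np.array([0.1, 0.6, 0.2, 0.1])  # hypothetical flow likelihoods
p_disp = np.array([0.2, 0.5, 0.2, 0.1])  # hypothetical disparity likelihoods
print(fuse_distributions(p_flow, p_disp))
```

Carrying the full distribution this far means a weak, ambiguous flow estimate can still be disambiguated by a confident disparity estimate, and vice versa.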

Relevance: 60.00%

Abstract:

Background: The eliciting dose (ED) for a peanut allergic reaction in 5% of the peanut-allergic population, the ED05, is 1.5 mg of peanut protein. This ED05 was derived from oral food challenges (OFC) that use graded, incremental doses administered at fixed time intervals. Individual patients’ threshold doses were used to generate population dose-distribution curves using probability distributions, from which the ED05 was then determined. It is important to clinically validate that this dose is predictive of the allergenic response in a further unselected group of peanut-allergic individuals. Methods/Aims: This is a multi-centre study involving three national-level referral and teaching centres (Cork University Hospital, Ireland; Royal Children’s Hospital, Melbourne, Australia; and Massachusetts General Hospital, Boston, USA). The study is now in progress and will continue until each centre has recruited 125 participants. A total of 375 participants, aged 1–18 years, will be recruited during routine allergy appointments at the centres. The aim is to assess the precision of the predicted ED05 using a single dose (6 mg peanut = 1.5 mg of peanut protein) in the form of a cookie. Validated Food Allergy Quality of Life Questionnaires (FAQLQ) will be self-administered prior to the OFC and 1 month after challenge to assess the impact of a single-dose OFC on food allergy-related quality of life. Serological and cell-based in vitro studies will be performed. Conclusion: The validation of the ED05 threshold for allergic reactions in peanut-allergic subjects has potential value for public health measures. The single-dose OFC, based upon the statistical dose-distribution analysis of past challenge trials, promises an efficient approach to identifying the most highly sensitive patients within any given food-allergic population.
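To make the dose-distribution idea concrete: an ED05 is the 5th percentile of a parametric distribution fitted to individual threshold doses. The sketch below assumes a simple log-normal fit to fully observed thresholds with hypothetical dose values; the published ED05 came from interval-censored dose-distribution modelling of challenge data, which is more involved:

```python
import math
from statistics import NormalDist

def ed05_lognormal(threshold_doses_mg):
    """Estimate the dose eliciting reactions in 5% of the population by
    fitting a log-normal distribution to individual threshold doses."""
    logs = [math.log(d) for d in threshold_doses_mg]
    mu = sum(logs) / len(logs)
    sigma = math.sqrt(sum((x - mu) ** 2 for x in logs) / (len(logs) - 1))
    z05 = NormalDist().inv_cdf(0.05)  # 5th percentile of the standard normal
    return math.exp(mu + sigma * z05)

# hypothetical threshold doses (mg peanut protein) from challenge outcomes
doses = [3.0, 10.0, 30.0, 100.0, 300.0, 25.0, 75.0, 150.0]
print(round(ed05_lognormal(doses), 2))  # ≈ 3.4 with these made-up doses
```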

Relevance: 60.00%

Abstract:

We have analysed the electronic wave functions from an ab initio simulation of the ionic liquid (room temperature molten salt) dimethyl imidazolium chloride ([dmim][Cl] or [C1mim][Cl]) using localized Wannier orbitals. This allows us to assign electron density to individual ions. The probability distributions of the ionic dipole moments for an isolated ion and for ions in solution are compared. The liquid environment is found to polarize the cation by about 0.7 D and to increase the amplitude of the fluctuations in the dipole moments of both cation and anion. The relative changes in nuclear and electronic contributions are shown. The implications for classical force fields are discussed.

Relevance: 60.00%

Abstract:

This paper discusses the relations between extended incidence calculus and assumption-based truth maintenance systems (ATMSs). We first prove that managing labels for statements (nodes) in an ATMS is equivalent to producing incidence sets of these statements in extended incidence calculus. We then demonstrate that the justification set for a node is functionally equivalent to the implication relation set for the same node in extended incidence calculus. As a consequence, extended incidence calculus can provide justifications for an ATMS, because implication relation sets are discovered by the system automatically. We also show that extended incidence calculus provides a theoretical basis for constructing a probabilistic ATMS by associating proper probability distributions with assumptions. In this way, we can not only produce labels for all nodes in the system, but also calculate the probability of any such node. The nogood environments can also be obtained automatically. Therefore, extended incidence calculus and the ATMS are equivalent in carrying out inferences at both the symbolic level and the numerical level. This extends a result due to Laskey and Lehner.

Relevance: 60.00%

Abstract:

A scale-invariant feature transform (SIFT) based mean shift algorithm is presented for object tracking in real scenarios. SIFT features are used to establish correspondences for the regions of interest across frames. Meanwhile, mean shift is applied to conduct a similarity search via color histograms. The probability distributions from these two measurements are evaluated in an expectation–maximization scheme so as to achieve maximum-likelihood estimation of similar regions. This mutual support mechanism leads to consistent tracking performance even when one of the two measurements becomes unstable. Experimental work demonstrates that the proposed mean shift/SIFT strategy improves on the tracking performance of the classical mean shift and SIFT tracking algorithms in complicated real scenarios.
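The mean-shift half of this scheme is easy to illustrate: each update moves the tracking window to the weighted centroid of pixel positions, with weights coming from colour-histogram back-projection. A minimal sketch with hypothetical weights, omitting the SIFT correspondences and the EM coupling described above:

```python
import numpy as np

def mean_shift_step(positions, weights):
    """One mean-shift update: move the window centre to the weighted
    centroid of pixel positions (weights from histogram back-projection)."""
    w = np.asarray(weights, dtype=float)
    return (positions * w[:, None]).sum(axis=0) / w.sum()

pos = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
w = [1.0, 3.0, 1.0, 3.0]  # hypothetical back-projection weights
print(mean_shift_step(pos, w))  # centre is pulled toward the x = 1 pixels
```

Iterating this step until the centre stops moving gives the classical mean-shift tracker that the paper augments with SIFT.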

Relevance: 60.00%

Abstract:

In this paper, an analysis of radio channel characteristics for single- and multiple-antenna bodyworn systems for use in body-to-body communications is presented. The work was based on an extensive measurement campaign conducted at 2.45 GHz, representative of an indoor sweep-and-search scenario for fire and rescue personnel. Using maximum-likelihood estimation in conjunction with the Akaike information criterion (AIC), five candidate probability distributions were investigated, and from these the kappa-mu distribution was found to best describe the small-scale fading observed in the body-to-body channels. Additional channel parameters such as the autocorrelation and the cross-correlation coefficient between fading signal envelopes were also analyzed. Low cross-correlation and small differences in mean signal levels between potential dual-branch diversity receivers suggested that the prospect of successfully implementing diversity in this type of application is extremely good. Moreover, using selection combining, maximal-ratio combining, and equal-gain combining, up to 8.69 dB of diversity gain can be made available when four spatially separated antennas are used at the receiver. Additional improvements in the combined envelopes, through lower level-crossing rates and fade durations at low signal levels, were also observed.
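The model-selection step above follows a standard pattern: fit each candidate distribution by maximum likelihood, then pick the one with the lowest AIC. A minimal sketch using hypothetical log-likelihood values (not the paper's data) to show the mechanics:

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: 2k - 2 ln(L). Lower is better."""
    return 2 * n_params - 2 * log_likelihood

# hypothetical maximised log-likelihoods for candidate fading models;
# k is the number of free parameters in each distribution
candidates = {
    "Rayleigh":   {"logL": -1520.4, "k": 1},
    "Rice":       {"logL": -1498.7, "k": 2},
    "Nakagami-m": {"logL": -1497.9, "k": 2},
    "kappa-mu":   {"logL": -1489.2, "k": 3},
    "Weibull":    {"logL": -1510.3, "k": 2},
}
scores = {name: aic(c["logL"], c["k"]) for name, c in candidates.items()}
best = min(scores, key=scores.get)
print(best)  # with these hypothetical values, kappa-mu wins despite extra k
```

The AIC's 2k term penalises the extra parameter of the kappa-mu distribution, so it is selected only when the likelihood improvement is large enough to justify the added flexibility.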

Relevance: 60.00%

Abstract:

We introduce a novel graph class we call universal hierarchical graphs (UHG), whose topology appears in numerous problems representing, e.g., temporal, spatial or general process structures of systems. For this graph class we show that we can naturally assign two probability distributions, one for nodes and one for edges, which lead directly to definitions of entropy and joint entropy and, hence, mutual information, establishing an information theory for this graph class. Furthermore, we provide some results on the conditions under which these constrained probability distributions maximize the corresponding entropy. We also demonstrate that these entropic measures can be computed efficiently, which is a prerequisite for any large-scale practical application, and show some numerical examples.
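Once node and edge distributions are assigned, the information-theoretic quantities follow the textbook definitions. A sketch with hypothetical distributions (not the paper's UHG construction), computing entropy and mutual information via I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log2 p_i, with 0 log 0 := 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint, px, py):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint (edge) distribution."""
    return entropy(px) + entropy(py) - entropy([p for row in joint for p in row])

# hypothetical marginal node distributions and joint edge distribution
px = [0.5, 0.5]
py = [0.5, 0.5]
joint = [[0.4, 0.1], [0.1, 0.4]]
print(round(mutual_information(joint, px, py), 4))  # → 0.2781
```

A uniform joint distribution would give zero mutual information; the concentration on the diagonal here is what produces the positive value.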

Relevance: 60.00%

Abstract:

The recent adiabatic saddle-point method of Shearer et al. [Phys. Rev. A 84, 033409 (2011)] is applied to study strong-field photodetachment of H⁻ by few-cycle linearly polarized laser pulses with frequencies near the two-photon detachment threshold. The behavior of the saddle points in the complex-time plane for a range of laser parameters is explored. A detailed analysis of the influence of laser intensities (2×10¹¹ – 6.5×10¹¹ W/cm²), mid-infrared laser wavelengths (1800–2700 nm), and various values of the carrier-envelope phase (CEP) on (i) three-dimensional detachment probability distributions, (ii) photoelectron angular distributions (PADs), (iii) energy spectra, and (iv) momentum distributions is presented. Examination of the probability distributions and PADs reveals main lobes and jetlike structures. Bifurcation phenomena in the probability distributions and PADs are also observed as the wavelength and intensity increase. Our simulations show that the (i) probability distributions, (ii) PADs, and (iii) energy spectra are extremely sensitive to the CEP, and thus measuring such distributions provides a useful tool for determining this phase. The symmetry properties of the electron momentum distributions are also found to be strongly correlated with the CEP, and this provides an additional robust method for measuring the CEP of a laser pulse. Our calculations further show that, for a three-cycle pulse, inclusion of all eight saddle points is required in the evaluation of the transition amplitude to yield an accurate description of the photodetachment process. This is in contrast to recent results for a five-cycle pulse.

Relevance: 60.00%

Abstract:

Peatlands are a key component of the global carbon cycle. Chronologies of peatland initiation are typically based on compiled basal peat radiocarbon (14C) dates and frequency histograms of binned calibrated age ranges. However, such compilations are problematic because poor quality 14C dates are commonly included and because frequency histograms of binned age ranges introduce chronological artefacts that bias the record of peatland initiation. Using a published compilation of 274 basal 14C dates from Alaska as a case study, we show that nearly half the 14C dates are inappropriate for reconstructing peatland initiation, and that the temporal structure of peatland initiation is sensitive to sampling biases and treatment of calibrated 14C dates. We present revised chronologies of peatland initiation for Alaska and the circumpolar Arctic based on summed probability distributions of calibrated 14C dates. These revised chronologies reveal that northern peatland initiation lagged abrupt increases in atmospheric CH4 concentration at the start of the Bølling–Allerød interstadial (Termination 1A) and the end of the Younger Dryas chronozone (Termination 1B), suggesting that northern peatlands were not the primary drivers of the rapid increases in atmospheric CH4. Our results demonstrate that subtle methodological changes in the synthesis of basal 14C ages lead to substantially different interpretations of temporal trends in peatland initiation, with direct implications for the role of peatlands in the global carbon cycle.
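The summed-probability approach replaces binned histograms with the sum of each date's full calendar-age probability density. A minimal sketch of the summation step, using hypothetical Gaussian calendar-age PDFs as stand-ins; a real analysis would first calibrate each 14C age against a calibration curve (e.g. IntCal) to obtain its non-Gaussian PDF:

```python
import numpy as np

def summed_probability(date_pdfs, grid):
    """Sum normalised calendar-age PDFs on a common calendar grid;
    each date contributes exactly one unit of probability mass."""
    spd = np.zeros_like(grid, dtype=float)
    for pdf in date_pdfs:
        spd += pdf / pdf.sum()
    return spd

grid = np.arange(10000, 12001, 10)  # cal yr BP, hypothetical window
# two hypothetical calendar-age PDFs standing in for calibrated dates
p1 = np.exp(-0.5 * ((grid - 10500) / 60.0) ** 2)
p2 = np.exp(-0.5 * ((grid - 11400) / 80.0) ** 2)
spd = summed_probability([p1, p2], grid)
print(round(spd.sum(), 6))  # total mass equals the number of dates: 2.0
```

Because each date contributes unit mass spread over its whole uncertainty range, the SPD avoids the artefacts that arise when a wide calibrated range is collapsed into a single histogram bin.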

Relevance: 60.00%

Abstract:

The liquid state structure of the ionic liquid, 1-ethyl-3-methylimidazolium acetate, and the solute/solvent structure of glucose dissolved in the ionic liquid at a 1:6 molar ratio have been investigated at 323 K by molecular dynamics simulations and neutron diffraction experiments using H/D isotopically substituted materials. Interactions between hydrogen-bond donating cation sites and polar, directional hydrogen-bond accepting acetate anions are examined. Ion-ion radial distribution functions for the neat ionic liquid, calculated from both MD and derived from the empirical potential structure refinement model to the experimental data, show the alternating shell-structure of anions around the cation, as anticipated. Spatial probability distributions reveal the main anion-to-cation features as in-plane interactions of anions with imidazolium ring hydrogens and cation-cation planar stacking. Interestingly, the presence of the polarised hydrogen-bond acceptor anion leads to increased anion-anion tail-tail structuring within each anion shell, indicating the onset of hydrophobic regions within the anion regions of the liquid.
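The ion-ion radial distribution functions discussed above are computed by binning pair distances and normalising against the ideal-gas expectation. A minimal single-species sketch for a cubic periodic box, with randomly placed hypothetical coordinates in place of real MD ion positions:

```python
import numpy as np

def rdf(positions, box, r_max, n_bins=50):
    """Radial distribution function g(r) for one particle type in a
    cubic periodic box of side `box`, for r up to r_max <= box / 2."""
    n = len(positions)
    rho = n / box ** 3
    edges = np.linspace(0, r_max, n_bins + 1)
    counts = np.zeros(n_bins)
    for i in range(n - 1):
        d = positions[i + 1:] - positions[i]
        d -= box * np.round(d / box)  # minimum-image convention
        r = np.linalg.norm(d, axis=1)
        counts += np.histogram(r, bins=edges)[0]
    shell_vol = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    # normalise unique-pair counts by the ideal-gas expectation
    return 2 * counts / (n * rho * shell_vol)

rng = np.random.default_rng(0)
pos = rng.uniform(0, 10.0, size=(200, 3))  # hypothetical ion coordinates
g = rdf(pos, box=10.0, r_max=5.0)
print(g.shape)
```

For an uncorrelated (ideal-gas-like) configuration g(r) fluctuates around 1; structured liquids such as the ionic liquid above show the alternating peaks and troughs of the anion shells instead.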

Relevance: 60.00%

Abstract:

The process of accounting for heterogeneity has made significant advances in statistical research, primarily in the framework of stochastic analysis and the development of multiple-point statistics (MPS). Among MPS techniques, the direct sampling (DS) method is tested to determine its ability to delineate heterogeneity from aerial magnetics data in a regional sandstone aquifer intruded by low-permeability volcanic dykes in Northern Ireland, UK. The use of two two-dimensional bivariate training images aids in creating spatial probability distributions of heterogeneities of hydrogeological interest, despite relatively ‘noisy’ magnetics data (i.e. including hydrogeologically irrelevant urban noise and regional geologic effects). These distributions are incorporated into a hierarchy system where previously published density-function and upscaling methods are applied to derive regional distributions of the equivalent hydraulic conductivity tensor K. Several K models, each determined by a different stochastic realisation of MPS dyke locations, are computed within groundwater flow models and evaluated by comparing modelled heads with field observations. Results show a significant improvement in model calibration when compared to a simplistic homogeneous and isotropic aquifer model that does not account for the dyke occurrence evidenced by airborne magnetic data. The best model is obtained when normal- and reverse-polarity dykes are computed separately within MPS simulations and when a probability threshold of 0.7 is applied. The presented stochastic approach also provides improvement when compared to a previously published deterministic anisotropic model based on the unprocessed (i.e. noisy) airborne magnetics. This demonstrates the potential of coupling MPS to airborne geophysical data for regional groundwater modelling.

Relevance: 60.00%

Abstract:

Semi-qualitative probabilistic networks (SQPNs) merge two important graphical model formalisms: Bayesian networks and qualitative probabilistic networks. They provide a very general modeling framework by allowing the combination of numeric and qualitative assessments over a discrete domain, and can be compactly encoded by exploiting the same factorization of joint probability distributions that lies behind Bayesian networks. This paper explores the computational complexity of semi-qualitative probabilistic networks, taking polytree-shaped networks as its main target. We show that the inference problem is coNP-complete for binary polytrees with multiple observed nodes. We also show that inferences can be performed in linear time if there is a single observed node, which is a relevant practical case. Because our proof is constructive, we obtain an efficient linear-time algorithm for SQPNs under such assumptions. To the best of our knowledge, this is the first exact polynomial-time algorithm for SQPNs. Together, these results provide a clear picture of the inferential complexity of polytree-shaped SQPNs.

Relevance: 60.00%

Abstract:

A credal network is a graph-theoretic model that represents imprecision in joint probability distributions. An inference in a credal net aims at computing an interval for the probability of an event of interest. Algorithms for inference in credal networks can be divided into exact and approximate. The selection of an algorithm is based on a trade-off between the time one is willing to spend on a particular calculation and the quality of the computed values. This paper presents an algorithm, called IDS, that combines exact and approximate methods for computing inferences in polytree-shaped credal networks. The algorithm provides a way to trade time against precision when making inferences in credal nets.
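The interval-valued inference described above can be illustrated in its simplest form: when a prior is only known to lie in an interval, the posterior is bounded by evaluating Bayes' rule at the extreme points of the credal set. A one-node sketch with hypothetical numbers (real credal-net algorithms like IDS handle this over whole polytrees):

```python
def posterior_interval(prior_bounds, likelihood_pos, likelihood_neg):
    """Bounds on P(H | e) when P(H) lies in an interval: evaluate Bayes'
    rule at each extreme point of the (one-dimensional) credal set."""
    posteriors = []
    for p in prior_bounds:
        num = likelihood_pos * p
        den = num + likelihood_neg * (1 - p)
        posteriors.append(num / den)
    return min(posteriors), max(posteriors)

# hypothetical: P(H) in [0.2, 0.4], P(e|H) = 0.9, P(e|not H) = 0.1
lo, hi = posterior_interval((0.2, 0.4), 0.9, 0.1)
print(round(lo, 4), round(hi, 4))  # → 0.6923 0.8571
```

In multiply connected or polytree-shaped nets the number of extreme points grows combinatorially, which is exactly why exact/approximate trade-offs like the one IDS offers matter.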

Relevance: 60.00%

Abstract:

Hidden Markov models (HMMs) are widely used models for sequential data. As with other probabilistic graphical models, they require the specification of precise probability values, which can be too restrictive for some domains, especially when data are scarce or costly to acquire. We present a generalized version of HMMs, whose quantification can be done by sets of, instead of single, probability distributions. Our models have the ability to suspend judgment when there is not enough statistical evidence, and can serve as a sensitivity analysis tool for standard non-stationary HMMs. Efficient inference algorithms are developed to address standard HMM usage such as the computation of likelihoods and most probable explanations. Experiments with real data show that the use of imprecise probabilities leads to more reliable inferences without compromising efficiency.

Relevance: 60.00%

Abstract:

Credal nets are probabilistic graphical models which extend Bayesian nets to cope with sets of distributions. This feature makes the model particularly suited for the implementation of classifiers and knowledge-based systems. When working with sets of (instead of single) probability distributions, the identification of the optimal option can be based on different criteria, some of which may lead to multiple choices. Yet, most of the inference algorithms for credal nets are designed to compute only the bounds of the posterior probabilities. This prevents some of the existing criteria from being used. To overcome this limitation, we present two simple transformations for credal nets which make it possible to compute decisions based on the maximality and E-admissibility criteria without any modification to the inference algorithms. We also prove that these decision problems have the same complexity as standard inference, being NP^PP-hard for general credal nets and NP-hard for polytrees.