961 results for Joint probability
Abstract:
A sequence of moments obtained from statistical trials encodes a classical probability distribution. However, it is well known that an incompatible set of moments arises in the quantum scenario when correlation outcomes associated with measurements on spatially separated entangled states are considered. This feature, viz., the incompatibility of moments with a joint probability distribution, is reflected in the violation of Bell inequalities. Here, we focus on sequential measurements on a single quantum system and investigate whether moments and joint probabilities are compatible with each other. By considering sequential measurements of a dichotomic dynamical observable at three different times, we explicitly demonstrate that the moments and the probabilities are inconsistent with each other. Experimental results using a nuclear magnetic resonance system are reported to corroborate these theoretical observations, viz., the incompatibility of the three-time joint probabilities with those extracted from the moment sequence when sequential measurements on a single-qubit system are considered.
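For context, a standard way to see the tension (not the paper's notation, offered only as a hedged illustration): for $\pm1$-valued observables $Q_1, Q_2, Q_3$, any genuine three-time joint distribution reconstructed from the full set of moments takes the form

$$
P(q_1,q_2,q_3)=\frac{1}{8}\Big[1+\sum_{i}q_i\langle Q_i\rangle+\sum_{i<j}q_iq_j\langle Q_iQ_j\rangle+q_1q_2q_3\langle Q_1Q_2Q_3\rangle\Big],\qquad q_i=\pm1,
$$

and compatibility requires every one of the eight reconstructed values to be non-negative and to coincide with the directly measured sequential probabilities; the incompatibility reported above is a failure of exactly this requirement.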
Abstract:
Modeling of the joint probability density function of the mixture fraction and progress variable with a given covariance value is studied. This modeling is validated using experimental and direct numerical simulation (DNS) data. A very good agreement with experimental data of turbulent stratified flames and DNS data of a lifted hydrogen jet flame is obtained. The effect of using this joint pdf modeling to calculate the mean reaction rate with a flamelet closure in Reynolds averaged Navier-Stokes (RANS) calculation of stratified flames is studied. The covariance effect is observed to be large within the flame brush. The results obtained from RANS calculations using this modeling for stratified jet- and rod-stabilized V-flames are discussed and compared to the measurements as a posteriori validation for the joint probability density function model with the flamelet closure. The agreement between the computed and measured values of flame and turbulence quantities is found to be good. © 2012 Copyright Taylor and Francis Group, LLC.
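As a hedged sketch of how such a model is typically used (a generic presumed-pdf flamelet form, not necessarily the paper's exact notation), the mean reaction rate follows from integrating the flamelet rate against the modelled joint pdf of mixture fraction $Z$ and progress variable $c$:

$$
\overline{\dot{\omega}}=\int_{0}^{1}\!\!\int_{0}^{1}\dot{\omega}_{\mathrm{fl}}(Z,c)\;\tilde{p}(Z,c)\,\mathrm{d}Z\,\mathrm{d}c,
$$

where $\tilde{p}(Z,c)$ is constructed to reproduce the prescribed means, variances and, crucially here, the covariance of $Z$ and $c$.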
Abstract:
Joint quantum measurements of noncommuting observables are possible if one accepts an increase in the measured variances. A necessary condition for a joint measurement to be possible is that a joint probability distribution exists for the measurement outcomes. This suggests a link with Bell inequalities, as these are satisfied if and only if a joint probability distribution for all involved observables exists. We investigate the connections between Bell inequalities and conditions for joint quantum measurements to be possible. Mermin's inequality for the three-particle Greenberger-Horne-Zeilinger state turns out to be equivalent to the condition for a joint measurement on two out of the three quantum systems to exist. Gisin's Bell inequality for three coplanar measurement directions, meanwhile, is shown to be less strict than the condition for the corresponding joint measurement.
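For reference, one common form of the inequality discussed (standard in the literature, quoted here only as context): for dichotomic observables $A_i, B_i, C_i$ ($i=1,2$) on the three particles, any model admitting a joint probability distribution obeys

$$
\big|\langle A_1B_2C_2\rangle+\langle A_2B_1C_2\rangle+\langle A_2B_2C_1\rangle-\langle A_1B_1C_1\rangle\big|\le 2,
$$

whereas the Greenberger-Horne-Zeilinger state can push the left-hand side up to 4.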
Abstract:
This thesis discusses various aspects of the integrity monitoring of GPS applied to civil aircraft navigation in different phases of flight, including en route, terminal, non-precision approach and precision approach. The thesis covers four major topics: the probability problem of the GPS navigation service, risk analysis of aircraft precision approach and landing, theoretical analysis of Receiver Autonomous Integrity Monitoring (RAIM) techniques and RAIM availability, and GPS integrity monitoring at a ground reference station. Particular attention is paid to the mathematical aspects of the GPS integrity monitoring system. The research builds upon the stringent integrity requirements defined by the civil aviation community and investigates the capability and performance of practical integrity monitoring systems with rigorous mathematical and statistical concepts and approaches. The major contributions of this research are:
• Rigorous integrity and continuity risk analysis for aircraft precision approach. Based on the joint probability density function of the affecting components, the integrity and continuity risks of aircraft precision approach with DGPS were computed, advancing the conventional method of allocating the risk probability.
• A theoretical study of RAIM test power. This is the first theoretical study of RAIM test power based on probability and statistical theory, resulting in a new set of RAIM criteria.
• Development of a GPS integrity monitoring and DGPS quality control system based on a GPS reference station. A prototype GPS integrity monitoring and DGPS correction prediction system has been developed and tested, based on the AUSNAV GPS base station on the roof of the QUT ITE Building.
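A minimal numerical sketch of the kind of joint-probability computation mentioned in the first contribution, with an assumed bivariate Gaussian joint density and purely illustrative numbers (the thesis' actual error models, limits and thresholds are not reproduced here):

```python
# Hedged illustration: integrity risk as the joint probability that the vertical
# position error (VPE) exceeds the alert limit while the monitor's test statistic
# (TS) stays below its detection threshold. All distributions and numbers are
# assumptions for this sketch, not values from the thesis.
import numpy as np

AL = 6.0      # vertical alert limit [m] (illustrative)
T = 3.0       # detection threshold on the test statistic (illustrative)

mean = np.array([0.0, 0.0])            # [VPE, TS]
cov = np.array([[9.0, 4.0],            # joint pdf of the affecting components:
                [4.0, 9.0]])           # correlated position error and test statistic

rng = np.random.default_rng(0)
vpe, ts = rng.multivariate_normal(mean, cov, size=2_000_000).T

# P(|VPE| > AL and TS < T): hazardous error that the monitor fails to flag.
integrity_risk = np.mean((np.abs(vpe) > AL) & (ts < T))
print(f"estimated integrity risk: {integrity_risk:.2e}")
```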
Abstract:
This article introduces a “pseudo-classical” notion of modelling non-separability. This form of non-separability can be viewed as lying between separability and quantum-like non-separability. Non-separability is formalized in terms of the non-factorizability of the underlying joint probability distribution. One decision criterion for determining the non-factorizability of the joint distribution is based on the rank of a matrix; another approach is based on the chi-square goodness-of-fit test. This pseudo-classical notion of non-separability is discussed in terms of quantum games and concept combinations in human cognition.
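A short sketch of the two decision criteria just mentioned, applied to a hypothetical 2x2 joint distribution (the counts and tolerance below are illustrative, not the article's data):

```python
# Factorizability check: p(a, b) = p(a) p(b) holds exactly when the joint
# probability matrix has rank 1; with finite counts, independence can instead
# be tested with a chi-square test. The counts are made up for illustration.
import numpy as np
from scipy.stats import chi2_contingency

counts = np.array([[40, 10],
                   [15, 35]])

P = counts / counts.sum()                       # empirical joint distribution
rank = np.linalg.matrix_rank(P, tol=1e-10)
print("rank:", rank, "(rank 1 <=> factorizable)")

chi2, pval, dof, expected = chi2_contingency(counts)
print(f"chi-square = {chi2:.2f}, p = {pval:.4f} (small p => non-factorizable)")
```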
Abstract:
This paper presents a robust stochastic framework for the incorporation of visual observations into conventional estimation, data fusion, navigation and control algorithms. The representation combines Isomap, a non-linear dimensionality reduction algorithm, with expectation maximization, a statistical learning scheme. The joint probability distribution of this representation is computed offline based on existing training data. The training phase of the algorithm results in a non-linear and non-Gaussian likelihood model of natural features conditioned on the underlying visual states. This generative model can be used online to instantiate likelihoods corresponding to observed visual features in real-time. The instantiated likelihoods are expressed as a Gaussian mixture model and are conveniently integrated within existing non-linear filtering algorithms. Example applications based on real visual data from heterogeneous, unstructured environments demonstrate the versatility of the generative models.
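A compact sketch of the offline training stage described above, using scikit-learn stand-ins for the two components (Isomap, and a Gaussian mixture fitted by expectation maximization); the feature data, dimensionalities and component counts are assumptions, not the paper's:

```python
import numpy as np
from sklearn.manifold import Isomap
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
features = rng.normal(size=(500, 64))     # stand-in for training visual descriptors

# Non-linear dimensionality reduction of the raw visual features.
embedding = Isomap(n_neighbors=10, n_components=3).fit_transform(features)

# EM-fitted mixture: a non-Gaussian likelihood model over the embedded features,
# computed offline from the training data.
gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0)
gmm.fit(embedding)

# Online use: instantiate the likelihood of a newly observed (embedded) feature,
# ready to be plugged into a non-linear filter as a Gaussian-mixture likelihood.
print("log-likelihood of first feature:", gmm.score_samples(embedding[:1]))
```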
Abstract:
In this paper, we present the application of a non-linear dimensionality reduction technique for the learning and probabilistic classification of hyperspectral images. Hyperspectral imaging spectroscopy is an emerging technique for geological investigations from airborne or orbital sensors. It gives much greater information content per pixel than a normal colour image, which should greatly help with the autonomous identification of natural and man-made objects in unfamiliar terrains by robotic vehicles. However, the large information content of such data makes the interpretation of hyperspectral images time-consuming and user-intensive. We propose the use of Isomap, a non-linear manifold learning technique, combined with Expectation Maximisation in graphical probabilistic models for learning and classification. Isomap is used to find the underlying manifold of the training data. This low-dimensional representation of the hyperspectral data facilitates the learning of a Gaussian Mixture Model representation, whose joint probability distributions can be calculated offline. The learnt model is then applied to the hyperspectral image at runtime, and data classification can be performed.
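A hedged sketch of the runtime classification step (everything here is illustrative: the embedded spectra are simulated, and equal class priors are assumed):

```python
# After Isomap embedding, one Gaussian mixture per surface class is learnt
# offline; at runtime each pixel is assigned to the class whose mixture gives
# the highest likelihood.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
class_a = rng.normal(loc=0.0, size=(300, 3))   # stand-ins for embedded training
class_b = rng.normal(loc=3.0, size=(300, 3))   # spectra of two classes
pixels = rng.normal(loc=1.5, scale=1.5, size=(10, 3))

models = [GaussianMixture(n_components=3, random_state=0).fit(x)
          for x in (class_a, class_b)]
log_lik = np.column_stack([m.score_samples(pixels) for m in models])
print("predicted class per pixel:", log_lik.argmax(axis=1))
```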
Abstract:
The question of under what conditions conceptual representation is compositional remains debatable within cognitive science. This paper presents a well-developed mathematical apparatus for a probabilistic representation of concepts, drawing upon methods developed in quantum theory, and proposes a formal test that can determine whether a specific conceptual combination is compositional or not. This test examines a joint probability distribution modeling the combination, asking whether or not it is factorizable. Empirical studies indicate that some combinations should be considered non-compositional.
Abstract:
Standard Monte Carlo (sMC) simulation models have been widely used in architecture, engineering and construction (AEC) industry research to address system uncertainties. Although the benefits of probabilistic simulation analyses over deterministic methods are well documented, the sMC simulation technique is quite sensitive to the probability distributions of the input variables. This phenomenon becomes highly pronounced when the region of interest within the joint probability distribution (a function of the input variables) is small. In such cases, the standard Monte Carlo approach is often impractical from a computational standpoint. In this paper, a comparative analysis of standard Monte Carlo simulation to Markov Chain Monte Carlo with subset simulation (MCMC/ss) is presented. The MCMC/ss technique constitutes a more complex simulation method (relative to sMC), wherein a structured sampling algorithm is employed in place of completely randomized sampling. Consequently, gains in computational efficiency can be made. The two simulation methods are compared via theoretical case studies.
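A small, hypothetical illustration of the computational issue (the limit-state function and all numbers are assumptions, not from the paper):

```python
# Standard Monte Carlo on a rare event: the estimator's coefficient of variation
# scales like sqrt((1 - p) / (N p)), so a small region of interest in the joint
# distribution demands enormous sample sizes. Subset simulation reaches the same
# level with a handful of levels of a few thousand conditional (MCMC) samples each.
import numpy as np

rng = np.random.default_rng(1)

def g(x):
    # Failure when g(x) < 0; a rare event for two independent standard normals.
    return 4.5 - (x[:, 0] + x[:, 1]) / np.sqrt(2.0)

N = 4_000_000
x = rng.standard_normal((N, 2))
p_hat = np.mean(g(x) < 0.0)                       # true value is about 3.4e-6
cov_hat = np.sqrt((1.0 - p_hat) / (N * p_hat))
print(f"sMC estimate: {p_hat:.2e}, c.o.v. ~ {cov_hat:.2f} with N = {N:,}")

# Samples sMC would need for a 10% coefficient of variation at this level:
print(f"N required for 10% c.o.v.: ~{(1.0 - p_hat) / (p_hat * 0.10**2):.1e}")
```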
Abstract:
Aim: To quantify the consequences of major threats to biodiversity, such as climate and land-use change, it is important to use explicit measures of species persistence, such as extinction risk. The extinction risk of metapopulations can be approximated through simple models, providing a regional snapshot of the extinction probability of a species. We evaluated the extinction risk of three species under different climate change scenarios in three different regions of the Mexican cloud forest, a highly fragmented habitat that is particularly vulnerable to climate change.
Location: Cloud forests in Mexico.
Methods: Using Maxent, we estimated the potential distribution of cloud forest for three different time horizons (2030, 2050 and 2080) and their overlap with protected areas. Then, we calculated the extinction risk of three contrasting vertebrate species for two scenarios: (1) climate change only (all suitable areas of cloud forest through time) and (2) climate and land-use change (only suitable areas within a currently protected area), using an explicit patch-occupancy approximation model and calculating the joint probability of all populations becoming extinct when the number of remaining patches was less than five.
Results: Our results show that the extent of environmentally suitable areas for cloud forest in Mexico will sharply decline in the next 70 years. We discovered that if all habitat outside protected areas is transformed, then only species with small area requirements are likely to persist. With habitat loss through climate change only, high dispersal rates are sufficient for persistence, but this requires protection of all remaining cloud forest areas.
Main conclusions: Even if high dispersal rates mitigate the extinction risk of species due to climate change, the synergistic impacts of changing climate and land use further threaten the persistence of species with higher area requirements. Our approach for assessing the impacts of threats on biodiversity is particularly useful when there is little time or data for detailed population viability analyses. © 2013 John Wiley & Sons Ltd.
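As a hedged illustration of the final calculation step (assuming, for simplicity, independent local extinctions; the patch-occupancy model itself is not reproduced here): if $p_i$ is the probability that population $i$ is extinct at the chosen time horizon, the joint probability that all $n$ remaining populations are extinct is

$$
P_{\mathrm{ext}}=\prod_{i=1}^{n}p_i,
$$

and in the analysis above this joint probability was evaluated once fewer than five patches remained.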
Abstract:
Conceptual combination performs a fundamental role in creating the broad range of compound phrases utilised in everyday language. While the systematicity and productivity of language provide a strong argument in favour of assuming compositionality, this very assumption is still regularly questioned in both cognitive science and philosophy. This article provides a novel probabilistic framework for assessing whether the semantics of conceptual combinations are compositional, and so can be considered as a function of the semantics of the constituent concepts, or not. Rather than adjudicating between different grades of compositionality, the framework presented here contributes formal methods for determining a clear dividing line between compositional and non-compositional semantics. Compositionality is equated with a joint probability distribution modelling how the constituent concepts in the combination are interpreted. Marginal selectivity is emphasised as a pivotal probabilistic constraint for the application of the Bell/CH and CHSH systems of inequalities (referred to collectively as Bell-type). Non-compositionality is then equated with either a failure of marginal selectivity, or, in the presence of marginal selectivity, with a violation of Bell-type inequalities. In both non-compositional scenarios, the conceptual combination cannot be modelled using a joint probability distribution with variables corresponding to the interpretation of the individual concepts. The framework is demonstrated by applying it to an empirical scenario of twenty-four non-lexicalised conceptual combinations.
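A small sketch of the Bell-type test on made-up numbers (the four joint distributions below are hypothetical and chosen to violate the bound; they are not the article's empirical data):

```python
# Four pairwise joint distributions p(A_i = a, B_j = b), a, b in {+1, -1}, for the
# two ways (i, j = 1, 2) each constituent concept can be interpreted. Index 0 maps
# to +1 and index 1 to -1.
import numpy as np

joint = {
    (1, 1): np.array([[0.45, 0.05], [0.05, 0.45]]),
    (1, 2): np.array([[0.45, 0.05], [0.05, 0.45]]),
    (2, 1): np.array([[0.45, 0.05], [0.05, 0.45]]),
    (2, 2): np.array([[0.05, 0.45], [0.45, 0.05]]),
}

sign = np.array([[+1, -1], [-1, +1]])          # product a*b for each cell

def correlation(p):
    return float(np.sum(sign * p))

# Marginal selectivity: the marginal of A_i must not depend on which B_j it is
# paired with (and vice versa); here we check the A-marginals.
for i in (1, 2):
    m1 = joint[(i, 1)].sum(axis=1)
    m2 = joint[(i, 2)].sum(axis=1)
    print(f"A_{i} marginal stable across contexts:", np.allclose(m1, m2))

E = {k: correlation(p) for k, p in joint.items()}
S = abs(E[(1, 1)] + E[(1, 2)] + E[(2, 1)] - E[(2, 2)])
print(f"CHSH value S = {S:.2f} (S <= 2 is required for a compositional joint model)")
```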
Abstract:
Guo and Nixon proposed a feature selection method based on maximizing I(x; Y), the multidimensional mutual information between the feature vector x and the class variable Y. Because computing I(x; Y) can be difficult in practice, Guo and Nixon proposed an approximation of I(x; Y) as the criterion for feature selection. We show that Guo and Nixon's criterion originates from approximating the joint probability distributions in I(x; Y) by second-order product distributions. We remark on the limitations of the approximation and discuss computationally attractive alternatives for computing I(x; Y).
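For reference (standard definition; the specific form of Guo and Nixon's approximation is deliberately not reproduced here), the quantity being approximated is

$$
I(\mathbf{x};Y)=\sum_{\mathbf{x},y}p(\mathbf{x},y)\log\frac{p(\mathbf{x},y)}{p(\mathbf{x})\,p(y)},
$$

whose direct estimation requires the full joint distribution over the feature vector $\mathbf{x}=(x_1,\dots,x_n)$ and the class $Y$; this quickly becomes impractical as $n$ grows, hence the appeal of approximations built from lower-order (e.g. pairwise) distributions.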
Abstract:
We report a measurement of the top quark mass $M_t$ in the dilepton decay channel $t\bar{t}\to b\ell'^{+}\nu_{\ell'}\,\bar{b}\ell^{-}\bar{\nu}_{\ell}$. Events are selected with a neural network which has been directly optimized for statistical precision in the top quark mass using neuroevolution, a technique modeled on biological evolution. The top quark mass is extracted from per-event probability densities that are formed by the convolution of leading-order matrix elements and detector resolution functions. The joint probability is the product of the probability densities from 344 candidate events in 2.0 fb$^{-1}$ of $p\bar{p}$ collisions collected with the CDF II detector, yielding a measurement of $M_t = 171.2 \pm 2.7\,(\mathrm{stat.}) \pm 2.9\,(\mathrm{syst.})~\mathrm{GeV}/c^2$.
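Schematically (a generic matrix-element likelihood, written here only to make the joint probability explicit; the details of the actual analysis are in the paper), the per-event densities and their product are

$$
P_i(x_i\,|\,M_t)\;\propto\;\int \mathrm{d}y\;\big|\mathcal{M}_{\mathrm{LO}}(y;M_t)\big|^{2}\,W(x_i\,|\,y),
\qquad
\mathcal{L}(M_t)=\prod_{i=1}^{344}P_i(x_i\,|\,M_t),
$$

with $W(x\,|\,y)$ the detector resolution (transfer) functions mapping parton-level configurations $y$ to measured quantities $x$, and $M_t$ extracted from the maximum of $\mathcal{L}$.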
Abstract:
A technique is developed to study random vibration of nonlinear systems. The method is based on the assumption that the joint probability density function of the response variables and input variables is Gaussian. It is shown that this method is more general than the statistical linearization technique in that it can handle non-Gaussian excitations and amplitude-limited responses. As an example, a bilinear hysteretic system under white-noise excitation is analyzed. The prediction of various response statistics by this technique is in good agreement with other available results.
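One standard consequence of such a joint-Gaussian assumption (offered as context, not necessarily the paper's exact derivation): for zero-mean jointly Gaussian variables $x$ and $y$ and a differentiable nonlinearity $g$,

$$
\mathrm{E}\!\left[x\,g(y)\right]=\mathrm{E}[xy]\;\mathrm{E}\!\left[g'(y)\right],
$$

so expectations of nonlinear response terms reduce to functions of the second moments, which is what makes the technique a generalization of statistical linearization.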