896 results for Joint probability
Abstract:
The emergence of uncorrelated growing networks is proved when nodes are removed either uniformly or under the preferential survival rule recently observed in the World Wide Web evolution. To this end, the rate equation for the joint probability of degrees is derived, and stationary symmetrical solutions are obtained by passing to the continuum limit. When a uniformly random removal of extant nodes and linear preferential attachment of new nodes are at work, we prove that the only stationary solution corresponds to uncorrelated networks for any removal rate r ∈ (0,1). In the more general case of preferential survival of nodes, uncorrelated solutions are also obtained. These results generalize the uncorrelatedness displayed by the (undirected) Barabási-Albert network model to models with uniformly random and selective (against low degrees) removal of nodes.
Abstract:
Using a scaling assumption, we propose a phenomenological model aimed at describing the joint probability distribution of two magnitudes A and T characterizing the spatial and temporal scales of a set of avalanches. The model also describes the correlation function of a sequence of such avalanches. As an example we study the joint distribution of amplitudes and durations of the acoustic emission signals observed in martensitic transformations [Vives et al., preceding paper, Phys. Rev. B 52, 12 644 (1995)].
Abstract:
In this work, we propose a copula-based method to generate synthetic gene expression data that accounts for the marginal and joint probability distribution features captured from real data. Our method allows us to implant significant genes in the synthetic dataset in a controlled manner, making it possible to test new detection algorithms under more realistic environments.
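The copula idea above can be sketched as follows: a Gaussian copula separates the dependence structure from the marginals, so synthetic columns can reproduce both. This is a minimal illustration, not the authors' implementation — the function name `synthesize`, the use of Pearson correlation for the dependence structure, and the empirical-quantile inversion are all simplifying assumptions.

```python
# Minimal sketch of Gaussian-copula synthetic data generation
# (hypothetical function name; the paper's actual copula family and
# estimation procedure may differ).
import numpy as np
from scipy.special import ndtr  # standard normal CDF

def synthesize(real, n_samples, seed=0):
    """real: (n_obs, n_genes) expression matrix -> (n_samples, n_genes)."""
    rng = np.random.default_rng(seed)
    # Dependence structure, simplified here to the Pearson correlation.
    corr = np.corrcoef(real, rowvar=False)
    # Sample correlated standard normals and map them to uniforms:
    # these uniforms carry the joint (copula) structure.
    z = rng.multivariate_normal(np.zeros(real.shape[1]), corr, size=n_samples)
    u = ndtr(z)
    # Invert each empirical marginal so every synthetic column matches
    # the corresponding real marginal distribution.
    return np.column_stack([np.quantile(real[:, j], u[:, j])
                            for j in range(real.shape[1])])
```

Because the marginals come from empirical quantiles, implanting a "significant" gene amounts to replacing one column of `real` with a chosen distribution before synthesis.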
Abstract:
Exercises and solutions in PDF
Abstract:
Exercises and solutions in LaTeX
Abstract:
A means of assessing, monitoring and controlling aggregate emissions from multi-instrument Emissions Trading Schemes is proposed. The approach allows contributions from different instruments with different forms of emissions targets to be integrated. Where Emissions Trading Schemes are helping meet specific national targets, the approach allows the entry requirements of new participants to be calculated and set at a level that will achieve these targets. The approach is multi-levelled, and may be extended downwards to support pooling of participants within instruments, or upwards to embed Emissions Trading Schemes within a wider suite of policies and measures with hard and soft targets. Aggregate emissions from each instrument are treated stochastically. Emissions from the scheme as a whole are then the joint probability distribution formed by integrating the emissions from its instruments. Because a Bayesian approach is adopted, qualitative and semi-qualitative data from expert opinion can be used where quantitative data is not currently available, or is incomplete. This approach helps government retain sufficient control over emissions trading scheme targets to allow them to meet their emissions reduction obligations, while minimising the need for retrospectively adjusting existing participants’ conditions of entry. This maintains participant confidence, while providing the necessary policy levers for good governance.
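The core quantitative step above — treating each instrument's aggregate emissions stochastically and forming the scheme-wide distribution from their combination — can be sketched with a Monte Carlo sum. All distributions and numbers below are illustrative; the paper's actual Bayesian formulation, which also folds in expert opinion, is more elaborate.

```python
# Monte Carlo sketch: scheme-wide emissions as the distribution of the
# sum of per-instrument emissions (hypothetical instruments and numbers).
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
# Annual emissions (Mt CO2e) of three hypothetical instruments.
instruments = [
    rng.normal(50.0, 5.0, n),      # cap-and-trade sector
    rng.lognormal(3.0, 0.3, n),    # baseline-and-credit sector
    rng.normal(20.0, 2.0, n),      # pooled voluntary participants
]
# Each draw of the sum is a draw from the joint (scheme-wide) distribution.
total = np.sum(instruments, axis=0)
target = 95.0
p_exceed = (total > target).mean()  # probability the national target is missed
```

The probability `p_exceed` is the kind of quantity that would drive entry requirements for new participants: tighten entry conditions until it falls below an acceptable risk level.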
Abstract:
The evidence provided by modelled assessments of future climate impact on flooding is fundamental to water resources and flood risk decision making. Impact models usually rely on climate projections from global and regional climate models (GCM/RCMs). However, challenges in representing precipitation events at catchment-scale resolution mean that decisions must be made on how to appropriately pre-process the meteorological variables from GCM/RCMs. Here the impacts on projected high flows of differing ensemble approaches and application of Model Output Statistics to RCM precipitation are evaluated while assessing climate change impact on flood hazard in the Upper Severn catchment in the UK. Various ensemble projections are used together with the HBV hydrological model with direct forcing and also compared to a response surface technique. We consider an ensemble of single-model RCM projections from the current UK Climate Projections (UKCP09); multi-model ensemble RCM projections from the European Union's FP6 ‘ENSEMBLES’ project; and a joint probability distribution of precipitation and temperature from a GCM-based perturbed physics ensemble. The ensemble distribution of results shows that flood hazard in the Upper Severn is likely to increase compared to present conditions, but the study highlights the differences between the results from different ensemble methods and the strong assumptions made in using Model Output Statistics to produce the estimates of future river discharge. The results underline the challenges in using the current generation of RCMs for local climate impact studies on flooding. Copyright © 2012 Royal Meteorological Society
Abstract:
Our knowledge of stratospheric O3-N2O correlations is extended, and their potential for model-measurement comparison assessed, using data from the Atmospheric Chemistry Experiment (ACE) satellite and the Canadian Middle Atmosphere Model (CMAM). ACE provides the first comprehensive data set for the investigation of interhemispheric, interseasonal, and height-resolved differences of the O3-N2O correlation structure. By subsampling the CMAM data, the representativeness of the ACE data is evaluated. In the middle stratosphere, where the correlations are not compact and therefore mainly reflect the data sampling, joint probability density functions provide a detailed picture of key aspects of transport and mixing, but also trace polar ozone loss. CMAM captures these important features, but exhibits a displacement of the tropical pipe into the Southern Hemisphere (SH). Below about 21 km, the ACE data generally confirm the compactness of the correlations, although chemical ozone loss tends to destroy the compactness during late winter/spring, especially in the SH. This allows a quantitative comparison of the correlation slopes in the lower and lowermost stratosphere (LMS), which exhibit distinct seasonal cycles that reveal the different balances between diabatic descent and horizontal mixing in these two regions in the Northern Hemisphere (NH), reconciling differences found in aircraft measurements, and the strong role of chemical ozone loss in the SH. The seasonal cycles are qualitatively well reproduced by CMAM, although their amplitude is too weak in the NH LMS. The correlation slopes allow a "chemical" definition of the LMS, which is found to vary substantially in vertical extent with season.
Abstract:
We propose and demonstrate a fully probabilistic (Bayesian) approach to the detection of cloudy pixels in thermal infrared (TIR) imagery observed from satellite over oceans. Using this approach, we show how to exploit the prior information and the fast forward modelling capability that are typically available in the operational context to obtain improved cloud detection. The probability of clear sky for each pixel is estimated by applying Bayes' theorem, and we describe how to apply Bayes' theorem to this problem in general terms. Joint probability density functions (PDFs) of the observations in the TIR channels are needed; the PDFs for clear conditions are calculable from forward modelling and those for cloudy conditions have been obtained empirically. Using analysis fields from numerical weather prediction as prior information, we apply the approach to imagery representative of imagers on polar-orbiting platforms. In comparison with the established cloud-screening scheme, the new technique decreases both the rate of failure to detect cloud contamination and the false-alarm rate by one quarter. The rate of occurrence of cloud-screening-related errors of >1 K in area-averaged SSTs is reduced by 83%. Copyright © 2005 Royal Meteorological Society.
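The per-pixel Bayes' theorem computation above can be sketched concretely. Here both joint PDFs are taken as bivariate Gaussians purely for illustration (in the paper, the clear-sky PDF comes from forward modelling and the cloudy-sky PDF is empirical); the function name and all channel values are hypothetical.

```python
# Sketch of per-pixel Bayesian clear-sky probability from two thermal
# infrared (TIR) channels, with Gaussian joint PDFs standing in for the
# modelled/empirical PDFs of the paper.
import numpy as np

def p_clear(y, mu_clear, cov_clear, mu_cloud, cov_cloud, prior_clear):
    """Posterior P(clear | y) via Bayes' theorem; y is a 2-vector of
    brightness temperatures (K)."""
    def gauss(y, mu, cov):
        # Bivariate Gaussian density evaluated at y.
        d = y - mu
        return (np.exp(-0.5 * d @ np.linalg.solve(cov, d))
                / np.sqrt((2 * np.pi) ** len(y) * np.linalg.det(cov)))
    num = gauss(y, mu_clear, cov_clear) * prior_clear
    den = num + gauss(y, mu_cloud, cov_cloud) * (1.0 - prior_clear)
    return num / den
```

In the operational setting, `prior_clear` would come from numerical weather prediction analysis fields rather than a fixed number.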
Abstract:
In this paper, we present a study on a deterministic partially self-avoiding walk (tourist walk), which provides a novel method for texture feature extraction. The method is able to explore an image on all scales simultaneously. Experiments were conducted using different dynamics of the tourist walk. A new histogram-based strategy to extract information from its joint probability distribution is presented. The promising results are discussed and compared to the best-known methods for texture description reported in the literature. (C) 2009 Elsevier Ltd. All rights reserved.
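The basic tourist-walk rule can be sketched as follows. A walker on the image grid repeatedly moves to the 8-neighbor whose intensity is closest to the current pixel's, excluding pixels visited in the last `mu` steps (the memory). This is a simplified sketch: the function name is hypothetical, and the paper's feature extraction (joint histograms over many walks and memory sizes) is omitted.

```python
# Minimal sketch of the deterministic partially self-avoiding
# (tourist) walk on a grayscale image.
import numpy as np

def tourist_walk(img, start, mu, max_steps):
    h, w = img.shape
    path = [start]
    for _ in range(max_steps):
        y, x = path[-1]
        # Memory: positions visited in the last `mu` steps are forbidden.
        recent = set(path[-mu:]) if mu > 0 else set()
        cands = [(y + dy, x + dx)
                 for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                 if (dy, dx) != (0, 0)
                 and 0 <= y + dy < h and 0 <= x + dx < w
                 and (y + dy, x + dx) not in recent]
        if not cands:          # walker is trapped: the walk ends
            break
        # Deterministic rule: closest-intensity neighbor wins.
        path.append(min(cands,
                        key=lambda p: abs(int(img[p]) - int(img[y, x]))))
    return path
```

Different memory sizes `mu` make the walk sensitive to texture structure at different scales, which is what lets the method probe all scales of an image.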
Abstract:
Canalizing genes possess such broad regulatory power, and their action sweeps across such a wide swath of processes, that the full set of affected genes is not highly correlated under normal conditions. When not active, the controlling gene will not be predictable to any significant degree by its subject genes, either alone or in groups, since their behavior will be highly varied relative to the inactive controlling gene. When the controlling gene is active, its behavior is not well predicted by any one of its targets, but can be very well predicted by groups of genes under its control. To investigate this question, we introduce in this paper the concept of intrinsically multivariate predictive (IMP) genes, and present a mathematical study of IMP in the context of binary genes with respect to the coefficient of determination (CoD), which measures the predictive power of a set of genes with respect to a target gene. A set of predictor genes is said to be IMP for a target gene if all properly contained subsets of the predictor set are bad predictors of the target but the full predictor set predicts the target with great accuracy. We show that the logic of prediction, predictive power, covariance between predictors, and the entropy of the joint probability distribution of the predictors jointly affect the appearance of IMP genes. In particular, we show that high predictive power, small covariance among predictors, a large entropy of the joint probability distribution of predictors, and certain logics, such as XOR in the 2-predictor case, are factors that favor the appearance of IMP. The IMP concept is applied to characterize the behavior of the gene DUSP1, which exhibits control over a central, process-integrating signaling pathway, thereby providing preliminary evidence that IMP can be used as a criterion for discovery of canalizing genes.
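The XOR example mentioned above is easy to verify numerically. For binary genes, the CoD of a predictor set is (ε0 − ε)/ε0, where ε0 is the error of the best constant predictor and ε the error of the optimal predictor given the set. The function name and encoding below are illustrative, not the paper's notation.

```python
# Sketch of the coefficient of determination (CoD) for binary targets,
# illustrating the XOR case: each predictor alone has CoD 0, while the
# pair is fully predictive (IMP behavior).
from itertools import product

def cod(joint, predictors, target):
    """joint: dict mapping full binary tuples to probabilities;
    predictors: gene indices used; target: index of the predicted gene."""
    # Error of the best constant predictor of the target.
    p1 = sum(p for x, p in joint.items() if x[target] == 1)
    eps0 = min(p1, 1 - p1)
    # Error of the optimal predictor given the chosen predictor genes:
    # for each predictor configuration, the best guess errs with
    # probability min(P(config, y=0), P(config, y=1)).
    eps = 0.0
    configs = {tuple(x[i] for i in predictors) for x in joint}
    for c in configs:
        p_c1 = sum(p for x, p in joint.items()
                   if tuple(x[i] for i in predictors) == c and x[target] == 1)
        p_c0 = sum(p for x, p in joint.items()
                   if tuple(x[i] for i in predictors) == c and x[target] == 0)
        eps += min(p_c0, p_c1)
    return (eps0 - eps) / eps0

# XOR target: uniform X1, X2 and Y = X1 ^ X2, encoded as (x1, x2, y).
xor = {(a, b, a ^ b): 0.25 for a, b in product((0, 1), repeat=2)}
```

With this joint distribution, `cod(xor, [0], 2)` and `cod(xor, [1], 2)` are both 0 while `cod(xor, [0, 1], 2)` is 1: the defining signature of an IMP predictor set.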
Abstract:
In this paper we study the accumulated claim in some fixed time period, dropping the classical assumption of mutual independence between the variables involved. Two basic models are considered: Model 1 assumes that any pair of claims is equally correlated, which means that the corresponding square-integrable sequence is an exchangeable one. Model 2 states that the correlations between adjacent claims are the same. Recurrence and explicit expressions for the joint probability generating function are derived, and the impact of the dependence parameter (correlation coefficient) in both models is examined. The Markov binomial distribution is obtained as a particular case under the assumptions of Model 2. (C) 2007 Elsevier B.V. All rights reserved.
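The Markov binomial special case of Model 2 can be sketched directly: binary claims form a stationary two-state Markov chain in which adjacent claims have correlation rho, and the distribution of the total number of claims follows by dynamic programming. The parameterization below (p11 = p + rho(1 − p), p01 = p(1 − rho)) is one standard choice, not necessarily the paper's.

```python
# Sketch of the Markov binomial distribution: distribution of the sum
# S_n of n binary claims from a stationary Markov chain with marginal
# claim probability p and lag-1 correlation rho.
def markov_binomial(n, p, rho):
    p11 = p + rho * (1 - p)          # P(claim | previous claim)
    p01 = p * (1 - rho)              # P(claim | no previous claim)
    # f[state][k] = P(X_t = state, S_t = k) after t claims.
    f = {0: {0: 1 - p}, 1: {1: p}}   # stationary start: P(X_1 = 1) = p
    for _ in range(n - 1):
        g = {0: {}, 1: {}}
        for s, row in f.items():
            stay, move = (1 - p01, p01) if s == 0 else (1 - p11, p11)
            for k, pr in row.items():
                g[0][k] = g[0].get(k, 0.0) + pr * stay
                g[1][k + 1] = g[1].get(k + 1, 0.0) + pr * move
        f = g
    dist = {}
    for row in f.values():
        for k, pr in row.items():
            dist[k] = dist.get(k, 0.0) + pr
    return dist   # {k: P(S_n = k)}
```

Setting rho = 0 makes both transition probabilities equal to p, recovering the ordinary binomial distribution, which is a quick sanity check on the parameterization.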
Abstract:
The characterization of soil CO2 emissions (FCO2) is important for the study of the global carbon cycle. This phenomenon presents great variability in space and time, a characteristic that makes attempts at modeling and forecasting FCO2 challenging. Although spatial estimates have been performed in several studies, the association of these estimates with the uncertainties inherent in the estimation procedures is not considered. This study aimed to evaluate the local, spatial, local-temporal and spatial-temporal uncertainties of short-term FCO2 after the harvest period in a sugar cane area. The FCO2 was sampled on a 60m×60m grid containing 127 points with minimum separation distances from 0.5 to 10m between points. The FCO2 was evaluated 7 times within a total period of 10 days. The variability of FCO2 was described by descriptive statistics and variogram modeling. To calculate the uncertainties, 300 realizations made by sequential Gaussian simulation were considered. Local uncertainties were evaluated using the probability of values exceeding certain critical thresholds, while spatial uncertainties were evaluated by considering the probability that regions with high probability values jointly exceed the adopted limits. Using the daily uncertainties, the local-temporal and spatial-temporal uncertainties (Ftemp) were obtained. The daily and mean emissions showed a variability structure that was described by spherical and Gaussian models. The differences between the daily maps were related to variations in the magnitude of FCO2, covering mean values ranging from 1.28±0.11μmolm-2s-1 (F197) to 1.82±0.07μmolm-2s-1 (F195). The Ftemp showed low spatial uncertainty coupled with high local uncertainty estimates. The average emission showed great spatial uncertainty of the simulated values.
The evaluation of uncertainties associated with the knowledge of temporal and spatial variability is an important tool for understanding many phenomena over time, such as the quantification of greenhouse gases or the identification of areas with high crop productivity. © 2013 Elsevier B.V.
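The local-uncertainty computation described above reduces to a simple operation on the stack of simulated maps: the probability of exceeding a critical threshold at each grid node is the fraction of realizations that exceed it there. The random lognormal fields below are only a stand-in for actual sequential Gaussian simulation output; the threshold and grid size are illustrative.

```python
# Sketch of local uncertainty from an ensemble of simulated FCO2 maps
# (random stand-ins for the 300 sequential Gaussian simulation
# realizations of the study).
import numpy as np

rng = np.random.default_rng(0)
realizations = rng.lognormal(mean=0.4, sigma=0.2,
                             size=(300, 20, 20))   # (realization, y, x)
threshold = 1.6                                    # critical FCO2 value
# Local uncertainty: per-node probability of exceeding the threshold.
p_local = (realizations > threshold).mean(axis=0)
# A crude spatial summary: fraction of nodes "risky" at the 80% level.
frac_risky = (p_local > 0.8).mean()
```

A proper spatial-uncertainty assessment would additionally ask for the joint probability that whole regions exceed the limit simultaneously, which requires the realizations themselves rather than just `p_local`.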
Abstract:
Semi-qualitative probabilistic networks (SQPNs) merge two important graphical model formalisms: Bayesian networks and qualitative probabilistic networks. They provide a very general modeling framework by allowing the combination of numeric and qualitative assessments over a discrete domain, and can be compactly encoded by exploiting the same factorization of joint probability distributions that underlies Bayesian networks. This paper explores the computational complexity of inference in semi-qualitative probabilistic networks, taking polytree-shaped networks as its main target. We show that the inference problem is coNP-complete for binary polytrees with multiple observed nodes. We also show that inferences can be performed in time linear in the number of nodes if there is a single observed node. Because our proof is constructive, we obtain an efficient linear-time algorithm for SQPNs under such assumptions. To the best of our knowledge, this is the first exact polynomial-time algorithm for SQPNs. Together these results provide a clear picture of the inferential complexity in polytree-shaped SQPNs.
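The factorization that makes this compact encoding possible is the same one behind Bayesian networks: the joint probability is a product of per-node conditionals. A tiny binary polytree A → C ← B makes this concrete (all numbers and names below are illustrative, not from the paper).

```python
# Sketch of the joint-distribution factorization exploited by Bayesian
# networks (and hence SQPNs): P(A,B,C) = P(A) * P(B) * P(C|A,B) on a
# binary polytree A -> C <- B.
from itertools import product

p_a = {0: 0.7, 1: 0.3}
p_b = {0: 0.6, 1: 0.4}
# Conditional table for C given its two parents.
p_c_given = {(a, b): {0: 0.9 - 0.3 * (a + b), 1: 0.1 + 0.3 * (a + b)}
             for a in (0, 1) for b in (0, 1)}

def joint(a, b, c):
    # Chain-rule factorization: three small tables encode the full joint.
    return p_a[a] * p_b[b] * p_c_given[(a, b)][c]

# Inference by summing the factorized joint: marginal P(C = 1).
p_c1 = sum(joint(a, b, 1) for a, b in product((0, 1), repeat=2))
```

The point of the factorization is scale: n binary nodes need 2^n joint entries explicitly, but only small per-node tables here, and polytree structure is what lets inference over this factorization run in linear time in the single-observation case the paper analyzes.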