960 results for Probability Metrics
Abstract:
The present notes provide a detailed review of existing results in dissipative kinetic theory that make use of the contraction properties of two main families of probability metrics: optimal mass transport and Fourier-based metrics. The first part of the notes is devoted to a self-contained summary and presentation of the properties of both probability metrics, including new aspects of the relationships between them and other metrics in wide use in probability theory. These results are of independent interest, with potential use in other contexts in Partial Differential Equations and Probability Theory. The second part of the notes presents the asymptotic behavior of Inelastic Maxwell Models differently from the treatment in the existing literature and shows a new example of application: heating by a particle bath. We show how, starting from the contraction properties in probability metrics, one can deduce existence, uniqueness and asymptotic stability in classical spaces. A global strategy with this aim is set up and applied to two dissipative models.
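As an aside to this entry: the two families of metrics named in the abstract can be illustrated numerically. The sketch below (not taken from the notes) compares two sample distributions using the 1-D Wasserstein distance from SciPy and a Fourier-based metric of Toscani type, d_s(f, g) = sup_k |f^(k) − g^(k)| / |k|^s; the sample data, the exponent s = 2 and the frequency grid are assumptions made purely for illustration.

```python
# Illustrative sketch: comparing two empirical distributions with an
# optimal-transport (Wasserstein-1) distance and a Fourier-based metric of
# Toscani type, d_s(f, g) = sup_k |f^(k) - g^(k)| / |k|^s (sup approximated
# over a finite grid of frequencies). Data and s = 2 are arbitrary choices.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
f_samples = rng.normal(0.0, 1.0, 5_000)   # stand-in for a kinetic density f
g_samples = rng.normal(0.0, 1.2, 5_000)   # stand-in for a second density g

# Optimal mass transport distance (W1) between the empirical measures.
w1 = wasserstein_distance(f_samples, g_samples)

# Empirical characteristic functions on a grid of nonzero frequencies.
k = np.linspace(0.05, 10.0, 200)
phi_f = np.exp(1j * np.outer(k, f_samples)).mean(axis=1)
phi_g = np.exp(1j * np.outer(k, g_samples)).mean(axis=1)

s = 2.0
d_s = np.max(np.abs(phi_f - phi_g) / k**s)  # sup over the sampled frequencies

print(f"W1 distance          : {w1:.4f}")
print(f"Fourier metric d_{s:g} : {d_s:.4f}")
```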
Abstract:
* Research supported by NATO GRANT CRG 900 798 and by Humboldt Award for U.S. Scientists.
Abstract:
We quantify the long-time behavior of a system of (partially) inelastic particles in a stochastic thermostat by means of the contractivity of a suitable metric in the set of probability measures. Existence, uniqueness, boundedness of moments and regularity of a steady state are derived from this basic property. The solutions of the kinetic model are proved to converge exponentially as t → ∞ to this diffusive equilibrium in this metric, which metrizes the weak convergence of measures. We then prove a uniform-in-time bound on Sobolev norms of the solution, provided the initial data have a finite norm in the corresponding Sobolev space. These results are then combined, using interpolation inequalities, to obtain exponential convergence to the diffusive equilibrium in the strong L¹-norm, as well as in various Sobolev norms.
Abstract:
Traditionally, the Iowa Department of Transportation has used the Iowa Runoff Chart and single-variable regional-regression equations (RREs) from a U.S. Geological Survey report (published in 1987) as the primary methods to estimate annual exceedance-probability discharge (AEPD) for small (20 square miles or less) drainage basins in Iowa. With the publication of new multi- and single-variable RREs by the U.S. Geological Survey (published in 2013), the Iowa Department of Transportation needs to determine which methods of AEPD estimation provide the best accuracy and the least bias for small drainage basins in Iowa. Twenty-five streamgages with drainage areas less than 2 square miles (mi²) and 55 streamgages with drainage areas between 2 and 20 mi² were selected for the comparisons, which used two evaluation metrics. Estimates of AEPDs calculated for the streamgages using the expected moments algorithm/multiple Grubbs-Beck test analysis method were compared to estimates of AEPDs calculated from the 2013 multivariable RREs; the 2013 single-variable RREs; the 1987 single-variable RREs; the TR-55 rainfall-runoff model; and the Iowa Runoff Chart. For the 25 streamgages with drainage areas less than 2 mi², results of the comparisons seem to indicate that the best overall accuracy and the least bias may be achieved by using the TR-55 method for flood regions 1 and 3 (published in 2013) and by using the 1987 single-variable RREs for flood region 2 (published in 2013). For drainage basins with areas between 2 and 20 mi², results of the comparisons seem to indicate that the best overall accuracy and the least bias may be achieved by using the 1987 single-variable RREs for the Southern Iowa Drift Plain landform region and for flood region 3 (published in 2013), by using the 2013 multivariable RREs for the Iowan Surface landform region, and by using the 2013 or 1987 single-variable RREs for flood region 2 (published in 2013). For all other landform or flood regions in Iowa, use of the 2013 single-variable RREs may provide the best overall accuracy and the least bias. An examination was conducted to understand why the 1987 single-variable RREs seem to provide better accuracy and less bias than either the 2013 multi- or single-variable RREs. A comparison of 1-percent annual exceedance-probability regression lines for hydrologic regions 1–4 from the 1987 single-variable RREs and for flood regions 1–3 from the 2013 single-variable RREs indicates that the 1987 single-variable regional-regression lines generally have steeper slopes and lower discharges than the 2013 single-variable regional-regression lines for corresponding areas of Iowa. The combination of the definition of hydrologic regions, the lower discharges, and the steeper slopes of regression lines associated with the 1987 single-variable RREs seems to provide better accuracy and less bias than the 2013 multi- or single-variable RREs, particularly for drainage areas less than 2 mi² and also for some drainage areas between 2 and 20 mi². The 2013 multi- and single-variable RREs are considered to provide better accuracy and less bias for larger drainage areas. Results of this study indicate that additional research is needed to address the curvilinear relation between drainage area and AEPDs for areas of Iowa.
Abstract:
We assess the predictive ability of three VPIN metrics on the basis of two highly volatile market events in China, and examine the association between VPIN and toxicity-induced volatility through conditional probability analysis and multiple regression. We examine the dynamic relationship between VPIN and high-frequency liquidity using Vector Auto-Regression models, Granger causality tests, and impulse response analysis. Our results suggest that Bulk Volume VPIN has the best risk-warning effect among the major VPIN metrics. VPIN has a positive association with market volatility induced by toxic information flow. Most importantly, we document a positive feedback effect between VPIN and high-frequency liquidity, in which a negative liquidity shock boosts VPIN, which, in turn, leads to further liquidity drain. Our study provides empirical evidence of an intrinsic game between informed traders and market makers when facing toxic information in the high-frequency trading world.
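A hedged sketch of the VAR / Granger-causality / impulse-response workflow named in this abstract, using statsmodels on synthetic series; the series names (vpin, liquidity), the lag order and the toy data-generating process are placeholders, not the paper's data or specification.

```python
# Sketch of a VAR / Granger-causality / impulse-response analysis on synthetic
# data. Column names, lag order and the feedback coefficients are assumptions.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
n = 1_000
liquidity = np.zeros(n)
vpin = np.zeros(n)
for t in range(1, n):
    # Toy feedback: a liquidity drop raises VPIN, and high VPIN drains liquidity.
    liquidity[t] = 0.6 * liquidity[t - 1] - 0.3 * vpin[t - 1] + rng.normal(scale=0.1)
    vpin[t] = 0.5 * vpin[t - 1] - 0.4 * liquidity[t - 1] + rng.normal(scale=0.1)

df = pd.DataFrame({"vpin": vpin, "liquidity": liquidity})

model = VAR(df)
res = model.fit(maxlags=5, ic="aic")          # lag order chosen by AIC

# Does VPIN Granger-cause liquidity, and vice versa?
print(res.test_causality("liquidity", ["vpin"], kind="f").summary())
print(res.test_causality("vpin", ["liquidity"], kind="f").summary())

# Impulse responses over 10 periods, e.g. response of VPIN to a liquidity shock
# (plotting requires matplotlib).
irf = res.irf(10)
irf.plot(impulse="liquidity", response="vpin")
```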
Abstract:
The occupant impact velocity (OIV) and acceleration severity index (ASI) are competing measures of crash severity used to assess occupant injury risk in full-scale crash tests involving roadside safety hardware, e.g. guardrail. Delta-V, or the maximum change in vehicle velocity, is the traditional metric of crash severity for real world crashes. This study compares the ability of the OIV, ASI, and delta-V to discriminate between serious and non-serious occupant injury in real world frontal collisions. Vehicle kinematics data from event data recorders (EDRs) were matched with detailed occupant injury information for 180 real world crashes. Cumulative probability of injury risk curves were generated using binary logistic regression for belted and unbelted data subsets. By comparing the available fit statistics and performing a separate ROC curve analysis, the more computationally intensive OIV and ASI were found to offer no significant predictive advantage over the simpler delta-V.
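A minimal sketch of the analysis pattern this abstract describes: binary logistic regression of a serious-injury indicator on a crash-severity metric, with ROC AUC used to compare predictors. The data below are synthetic placeholders, not the matched EDR cases.

```python
# Binary logistic regression of serious-injury outcome on a severity metric,
# with predictors compared by ROC AUC. All data here are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 180
delta_v = rng.uniform(5, 60, n)                      # km/h, hypothetical range
oiv = delta_v + rng.normal(scale=8, size=n)          # correlated stand-in for OIV
p_injury = 1 / (1 + np.exp(-(0.12 * delta_v - 4)))   # assumed "true" risk curve
serious = rng.binomial(1, p_injury)

aucs = {}
for name, x in [("delta-V", delta_v), ("OIV", oiv)]:
    model = LogisticRegression().fit(x.reshape(-1, 1), serious)
    risk = model.predict_proba(x.reshape(-1, 1))[:, 1]  # cumulative injury-risk curve
    aucs[name] = roc_auc_score(serious, risk)

print(aucs)  # similar AUCs would suggest no predictive advantage of OIV over delta-V
```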
Abstract:
Geometrical dependencies are investigated to obtain an analytical representation of the probability density function (pdf) of the travel time between a random point and a known or another random point in Tchebyshev's metric. In the most common case, a rectangular service area, the pdf of this random variable depends directly on the position of the server. Two approaches are introduced for the exact analytical calculation of the pdf: an ad-hoc approach, useful for 'manually' solving a specific case, and a superposition approach, an algorithmic method for the general case. The main concept of each approach is explained, and a short comparison is given to demonstrate their validity.
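The quantity studied here can also be checked by simulation. The sketch below estimates, by Monte Carlo, the pdf of the Chebyshev-metric travel time from a fixed server to a uniformly random point in a rectangle; the rectangle dimensions, server position and unit speed are arbitrary assumptions for illustration.

```python
# Monte Carlo approximation of the pdf of the Chebyshev-metric distance
# d = max(|x - x0|, |y - y0|) from a fixed server at (x0, y0) to a uniform
# random point in an a-by-b rectangle, interpreted as travel time at unit speed.
import numpy as np
import matplotlib.pyplot as plt

a, b = 4.0, 2.0          # rectangle sides (arbitrary)
x0, y0 = 1.0, 0.5        # server position (the pdf depends on it, as the abstract notes)
rng = np.random.default_rng(3)

x = rng.uniform(0, a, 1_000_000)
y = rng.uniform(0, b, 1_000_000)
d = np.maximum(np.abs(x - x0), np.abs(y - y0))   # Chebyshev travel time

plt.hist(d, bins=200, density=True)
plt.xlabel("Chebyshev travel time")
plt.ylabel("estimated pdf")
plt.show()
```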
Abstract:
Amphibians have been declining worldwide, and our understanding of the threats they face could be improved by using mark-recapture models to estimate vital rates of natural populations. Recently, the consequences of marking amphibians have been under discussion, and the effects of toe clipping on survival are debated, although it is still the most common technique for individually identifying amphibians. The passive integrated transponder (PIT tag) is an alternative technique, but comparisons among marking techniques in free-ranging populations are still lacking. We compared these two marking techniques using mark-recapture models to estimate apparent survival and recapture probability in a neotropical population of the blacksmith tree frog, Hypsiboas faber. We tested the effects of marking technique and number of toe pads removed while controlling for sex. Survival was similar among groups, although it decreased slightly from individuals with one toe pad removed, to individuals with two and three toe pads removed, and finally to PIT-tagged individuals. No sex differences were detected. Recapture probability increased slightly with the number of toe pads removed and was lowest for PIT-tagged individuals. Sex was an important predictor of recapture probability, with males being nearly five times more likely to be recaptured. Potential negative effects of both techniques may include reduced locomotion and high stress levels. We recommend the use of covariates in models to better understand the effects of marking techniques on frogs. The effect of the technique on the results should be accounted for, because most techniques may reduce survival. Based on our results, but also on the logistical and cost issues associated with PIT tagging, we suggest the use of toe clipping with anurans like the blacksmith tree frog.
Abstract:
The structure of probability currents is studied for the dynamical network after consecutive contraction on two-state, nonequilibrium lattice systems. This procedure allows us to investigate the transition rates between configurations on small clusters and highlights some relevant effects of lattice symmetries on the elementary transitions that are responsible for entropy production. A method is suggested to estimate the entropy production for different levels of approximations (cluster sizes) as demonstrated in the two-dimensional contact process with mutation.
Abstract:
We propose an alternative fidelity measure (namely, a measure of the degree of similarity) between quantum states and benchmark it against a number of properties of the standard Uhlmann-Jozsa fidelity. This measure is a simple function of the linear entropy and the Hilbert-Schmidt inner product between the given states and is thus, in comparison, not as computationally demanding. It also features several remarkable properties such as being jointly concave and satisfying all of Jozsa's axioms. The trade-off, however, is that it is supermultiplicative and does not behave monotonically under quantum operations. In addition, metrics for the space of density matrices are identified and the joint concavity of the Uhlmann-Jozsa fidelity for qubit states is established.
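The abstract does not state the measure's formula. A fidelity-like quantity built from the Hilbert-Schmidt inner product and the linear entropies is commonly written G(ρ, σ) = Tr(ρσ) + √(1 − Tr ρ²) √(1 − Tr σ²); the sketch below compares that assumed form with the Uhlmann-Jozsa fidelity F(ρ, σ) = (Tr √(√ρ σ √ρ))² on random qubit states, as an illustration only.

```python
# Numerical comparison of the Uhlmann-Jozsa fidelity with a fidelity-like
# measure built from the Hilbert-Schmidt inner product Tr(rho sigma) and the
# linear entropies 1 - Tr(rho^2), 1 - Tr(sigma^2). The explicit formula in
# alt_fidelity is an assumption about the measure's form, not taken from the paper.
import numpy as np
from scipy.linalg import sqrtm

def random_qubit_state(rng):
    """Random 2x2 density matrix."""
    g = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

def uhlmann_jozsa(rho, sigma):
    s = sqrtm(rho)
    return np.real(np.trace(sqrtm(s @ sigma @ s))) ** 2

def alt_fidelity(rho, sigma):
    hs = np.real(np.trace(rho @ sigma))
    lin_r = max(0.0, 1 - np.real(np.trace(rho @ rho)))
    lin_s = max(0.0, 1 - np.real(np.trace(sigma @ sigma)))
    return hs + np.sqrt(lin_r) * np.sqrt(lin_s)

rng = np.random.default_rng(5)
for _ in range(3):
    rho, sigma = random_qubit_state(rng), random_qubit_state(rng)
    print(f"F = {uhlmann_jozsa(rho, sigma):.4f}   alternative = {alt_fidelity(rho, sigma):.4f}")
```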
Abstract:
Two experiments were conducted on the nature of expert perception in the sport of squash. In the first experiment, ten expert and fifteen novice players attempted to predict the direction and force of squash strokes from either a film display (occluded at variable time periods before and after the opposing player had struck the ball) or a matched point-light display (containing only the basic kinematic features of the opponent's movement pattern). Experts outperformed the novices under both display conditions, and the same basic time windows that characterised expert and novice pick-up of information in the film task also persisted in the point-light task. This suggests that the experts' perceptual advantage is directly related to their superior pick-up of essential kinematic information. In the second experiment, the vision of six expert and six less skilled players was occluded by remotely triggered liquid-crystal spectacles at quasi-random intervals during simulated match play. Players were required to complete their current stroke even when the display was occluded and their prediction performance was assessed with respect to whether they moved to the correct half of the court to match the direction and depth of the opponent's stroke. Consistent with experiment 1, experts were found to be superior in their advance pick-up of both directional and depth information when the display was occluded during the opponent's hitting action. However, experts also remained better than chance, and clearly superior to less skilled players, in their prediction performance under conditions where occlusion occurred before any significant pre-contact preparatory movement by the opposing player was visible. This additional source of expert superiority is attributable to their superior attunement to the information contained in the situational probabilities and sequential dependences within their opponent's pattern of play.
Abstract:
The phenomenon of probability backflow, previously quantified for a free nonrelativistic particle, is considered for a free particle obeying Dirac's equation. It is known that probability backflow can occur in the opposite direction to the momentum; that is to say, there exist positive-energy states in which the particle certainly has a positive momentum in a given direction, but for which the component of the probability flux vector in that direction is negative. It is shown that the maximum possible amount of probability that can flow backwards, over a given time interval of duration T, depends on the dimensionless parameter ε = √(4h/(mc²T)), where m is the mass of the particle and c is the speed of light. At ε = 0, the nonrelativistic value of approximately 0.039 for this maximum is recovered. Numerical studies suggest that the maximum decreases monotonically as ε increases from 0, and show that it depends on the size of m, h, and T, unlike the nonrelativistic case.
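For a sense of scale, the dimensionless parameter as reconstructed above can be evaluated numerically. Whether h denotes Planck's constant or the reduced constant is not resolved by the abstract, so the reduced constant and an electron mass are assumed below purely for illustration.

```python
# Numerical illustration of the dimensionless parameter from the abstract,
# epsilon = sqrt(4*h / (m * c**2 * T)), evaluated for an electron. The choice of
# the reduced Planck constant here is an assumption for illustration only.
import numpy as np

hbar = 1.054_571_817e-34      # J s
m_e = 9.109_383_7015e-31      # kg, electron mass
c = 2.997_924_58e8            # m/s

def epsilon(T):
    """Dimensionless backflow parameter for a particle of mass m_e over time T (seconds)."""
    return np.sqrt(4 * hbar / (m_e * c**2 * T))

for T in (1e-21, 1e-18, 1e-15, 1e-12):
    print(f"T = {T:.0e} s  ->  epsilon = {epsilon(T):.3e}")
# As T grows, epsilon -> 0, the limit in which the nonrelativistic maximum (~0.039) is recovered.
```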
Abstract:
We present a review of perceptual image quality metrics and their application to still image compression. The review describes how image quality metrics can be used to guide an image compression scheme and outlines the advantages, disadvantages and limitations of a number of quality metrics. We examine a broad range of metrics ranging from simple mathematical measures to those which incorporate full perceptual models. We highlight some variation in the models for luminance adaptation and the contrast sensitivity function and discuss what appears to be a lack of a general consensus regarding the models which best describe contrast masking and error summation. We identify how the various perceptual components have been incorporated in quality metrics, and identify a number of psychophysical testing techniques that can be used to validate the metrics. We conclude by illustrating some of the issues discussed throughout the paper with a simple demonstration. (C) 1998 Elsevier Science B.V. All rights reserved.
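As a concrete instance of the "simple mathematical measures" end of the range this review covers, the sketch below computes PSNR between an image and a degraded version; the image data are synthetic arrays, and no perceptual modelling is involved.

```python
# Minimal example of a simple mathematical image-quality measure (PSNR), of the
# kind the review contrasts with full perceptual models. Images are synthetic.
import numpy as np

def psnr(original, degraded, peak=255.0):
    """Peak signal-to-noise ratio in dB for 8-bit-style image arrays."""
    mse = np.mean((original.astype(np.float64) - degraded.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)

rng = np.random.default_rng(11)
img = rng.integers(0, 256, size=(256, 256)).astype(np.float64)
compressed = np.clip(img + rng.normal(scale=5.0, size=img.shape), 0, 255)

print(f"PSNR = {psnr(img, compressed):.2f} dB")   # no perceptual weighting involved
```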
Abstract:
In his study of the 'time of arrival' problem in the nonrelativistic quantum mechanics of a single particle, Allcock [1] noted that the direction of the probability flux vector is not necessarily the same as that of the mean momentum of a wave packet, even when the packet is composed entirely of plane waves with a common direction of momentum. Packets can be constructed, for example for a particle moving under a constant force, in which probability flows for a finite time in the opposite direction to the momentum. A similar phenomenon occurs for the Dirac electron. The maximum amount of probability backflow that can occur over a given time interval can be calculated in each case.
Abstract:
The aim of this study was to investigate the frequency of axillary metastasis in women with tubular carcinoma (TC) of the breast. Women who underwent axillary dissection for TC in the Western Sydney area (1984-1995) were identified retrospectively through a search of computerized records. A centralized pathology review was performed and tumours were classified as pure tubular (22) or mixed tubular (nine), on the basis of the invasive component containing 90 per cent or more, or 75-90 per cent tubule formation respectively. A Medline search of the literature was undertaken to compile a collective series (20 studies with a total of 680 patients) to address the frequency of nodal involvement in TC. A quantitative meta-analysis was used to combine the results of these studies. The overall frequency of nodal metastasis was five of 31 (16 per cent); one of 22 pure tubular and four of nine mixed tumours (P = 0.019). None of the tumours with a diameter of 10 mm or less (n = 16) had nodal metastasis compared with five of 15 larger tumours (P = 0.018). The meta-analysis of 680 women showed an overall frequency of nodal metastasis in TC of 13.8 (95 per cent confidence interval 9.3-18.3) per cent. The frequency of nodal involvement was 6.6 (1.7-11.4) per cent in pure TC (n = 244) and 25.0 (12.5-37.6) per cent in mixed TC (n = 149). A case may be made for observing the clinically negative axilla in women with a small TC (10 mm or less in diameter).