958 results for Probabilistic generalization
Abstract:
Prior probabilities represent a core element of the Bayesian probabilistic approach to relatedness testing. This letter comments on the commentary 'Use of prior odds for missing persons identifications' by Budowle et al. (2011), published recently in this journal. Contrary to Budowle et al. (2011), we argue that the concept of prior probabilities (i) is not endowed with the notion of objectivity, (ii) is not a matter of computation and (iii) does not require new guidelines edited by the forensic DNA community - as long as probability is properly considered as an expression of personal belief. Please see related article: http://www.investigativegenetics.com/content/3/1/3
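The odds form of Bayes' rule that underlies this debate can be sketched in a few lines. This is an illustrative example only; the prior odds and likelihood ratio below are hypothetical numbers, not figures from the letter or from Budowle et al. (2011).

```python
# Illustrative sketch of Bayesian updating in odds form, as used in
# relatedness testing: posterior odds = prior odds x likelihood ratio.
# All numeric values below are hypothetical.

def posterior_odds(prior_odds, likelihood_ratio):
    """Bayes' rule in odds form."""
    return prior_odds * likelihood_ratio

def odds_to_probability(odds):
    return odds / (1.0 + odds)

# A personal prior of 1:1000 combined with a DNA likelihood ratio of 100,000:
post = posterior_odds(1 / 1000, 100_000)
print(odds_to_probability(post))  # ≈ 0.990
```

The point at issue in the letter is precisely that the prior odds fed into such a calculation express personal belief rather than an objectively computable quantity.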
The hematology laboratory in blood doping (BD): 2014 update on the Athlete Biological Passport (ABP)
Abstract:
Introduction: Blood doping (BD) is the use of erythropoiesis-stimulating agents (ESAs) and/or transfusion to increase aerobic performance in athletes. Direct toxicologic techniques are insufficient to unmask sophisticated doping protocols. The hematological module of the ABP (World Anti-Doping Agency) associates decision-support technology and expert assessment to indirectly detect the hematological effects of BD. Methods: The ABP module is based on blood parameters, under strict pre-analytical and analytical rules for collection, storage and transport at 2-12°C, with internal and external QC. Accuracy, reproducibility and interlaboratory harmonization fulfill forensic standards. Blood samples are collected in competition and out of competition. Primary parameters for longitudinal monitoring are: - hemoglobin (HGB); - reticulocyte percentage (RET%); - OFF score, an indicator of suppressed erythropoiesis, calculated as [HGB(g/L) − 60·√RET%]. Statistical calculation predicts individual expected limits by probabilistic inference. Secondary parameters are RBC, HCT, MCHC, MCH, MCV, RDW and IRF. ABP profiles flagged as atypical are reviewed by experts in hematology, pharmacology, sports medicine or physiology, and classified as: - normal - suspect (to target) - likely due to BD - likely due to pathology. Results: Thousands of athletes worldwide are currently monitored. Since 2010, at least 35 athletes have been sanctioned and others are being prosecuted on the sole basis of an abnormal ABP, with a 240% increase in positivity to direct tests for ESAs, thanks to improved targeting of suspicious athletes (WADA data). Specific doping scenarios have been identified by the experts (Table and Figure). Figure. Typical HGB and RET profiles in two highly suspicious athletes. A. Sample 2: simultaneous increases in HGB and RET (likely ESA stimulation) in a male. B. Samples 3, 6 and 7: "OFF" picture, with high HGB and low RET, in a female. Sample 10: normal HGB and increased RET (ESA or blood withdrawal).
Conclusions: ABP is a powerful tool for indirect doping detection, based on the recognition of specific, unphysiological changes triggered by blood doping. The effect of factors of heterogeneity, such as sex and altitude, must also be considered. Schumacher YO, et al. Drug Test Anal 2012, 4:846-853. Sottas PE, et al. Clin Chem 2011, 57:969-976.
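The OFF-score formula cited in the abstract can be computed directly. The sample values below are invented for illustration; they are not reference limits or WADA thresholds.

```python
import math

# Minimal sketch of the OFF score described in the abstract:
# OFF = HGB (g/L) - 60 * sqrt(RET%).
# The input values are hypothetical examples, not decision limits.

def off_score(hgb_g_per_l: float, ret_pct: float) -> float:
    return hgb_g_per_l - 60.0 * math.sqrt(ret_pct)

# An "OFF" picture: high hemoglobin with suppressed reticulocytes.
print(off_score(170.0, 0.25))  # 170 - 60*0.5 = 140.0
```

A high OFF value flags the combination of elevated HGB and low RET% that the expert panel associates with recent withdrawal of erythropoietic stimulation.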
Abstract:
In a series of three experiments, participants made inferences about which of a pair of objects scored higher on a criterion. The first experiment was designed to contrast the prediction of Probabilistic Mental Model theory (Gigerenzer, Hoffrage, & Kleinbölting, 1991) concerning sampling procedure with the hard-easy effect. The experiment failed to support the theory's prediction that a particular pair of randomly sampled item sets would differ in percentage correct; but the observation that German participants performed practically as well on comparisons between U.S. cities (many of which they did not even recognize) as on comparisons between German cities (about which they knew much more) ultimately led to the formulation of the recognition heuristic. Experiment 2 was a second, this time successful, attempt to unconfound item difficulty and sampling procedure. In Experiment 3, participants' knowledge and recognition of each city was elicited, and how often this could be used to make an inference was manipulated. Choices were consistent with the recognition heuristic in about 80% of the cases when it discriminated and people had no additional knowledge about the recognized city (and in about 90% when they had such knowledge). The frequency with which the heuristic could be used affected the percentage correct, mean confidence, and overconfidence as predicted. The size of the reference class, which was also manipulated, modified these effects in meaningful and theoretically important ways.
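The recognition heuristic itself is a simple decision rule and can be sketched in code. The city names and the recognition set below are hypothetical examples, not the experimental materials.

```python
# Sketch of the recognition heuristic: when exactly one of two objects
# is recognized, infer that the recognized one scores higher on the
# criterion. The names and recognition set are invented for illustration.

def recognition_heuristic(a, b, recognized):
    """Return the inferred higher-scoring object, or None when the
    heuristic does not discriminate (both or neither recognized)."""
    if (a in recognized) == (b in recognized):
        return None  # heuristic cannot be applied
    return a if a in recognized else b

recognized = {"Munich", "Berlin"}
print(recognition_heuristic("Munich", "Bielefeld", recognized))  # Munich
print(recognition_heuristic("Munich", "Berlin", recognized))     # None
```

Experiment 3 manipulated exactly how often the first branch (non-discrimination) versus the second branch (a usable recognition cue) occurred across item pairs.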
Abstract:
Quantitative study that aimed to identify the mean total cost (MTC) of connecting, maintaining and disconnecting a patient-controlled analgesia (PCA) pump in the management of pain. The non-probabilistic sample corresponded to the observation of 81 procedures in 17 units of the Central Institute of the Clinics Hospital, Faculty of Medicine, University of São Paulo. We calculated the MTC by multiplying the time spent by nurses by the unit cost of direct labor, adding the cost of materials and medications/solutions. The MTC of connecting was R$ 107.91; of maintenance, R$ 110.55; and of disconnecting, R$ 4.94. The results will support discussions about the need to transfer funds from the Unified Health System to hospital units that perform this analgesic therapy technique, and will contribute to cost management aimed at efficient and effective decision-making in the allocation of available resources.
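The costing rule described above (nurse time times the unit cost of direct labor, plus materials and medications/solutions) is simple arithmetic. The component figures below are hypothetical; only the total is chosen to match the order of magnitude reported for connecting.

```python
# Illustrative reconstruction of the study's costing rule:
# MTC = (nurse time x unit cost of direct labour) + materials + medications.
# The individual component values are invented, not the study's data.

def mean_total_cost(minutes, labour_cost_per_minute, materials, medications):
    return minutes * labour_cost_per_minute + materials + medications

# e.g. 30 minutes of nursing time at R$ 1.20/min plus supplies:
print(mean_total_cost(30, 1.20, 55.0, 16.91))  # ≈ R$ 107.91
```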
Abstract:
Sobriety checkpoints are not usually randomly located by traffic authorities. As such, information provided by non-random alcohol tests cannot be used to infer the characteristics of the general driving population. In this paper a case study is presented in which the prevalence of alcohol-impaired driving is estimated for the general population of drivers. A stratified probabilistic sample was designed to represent vehicles circulating in non-urban areas of Catalonia (Spain), a region characterized by its complex transportation network and dense traffic around the metropolis of Barcelona. Random breath alcohol concentration tests were performed during spring 2012 on 7,596 drivers. The estimated prevalence of alcohol-impaired drivers was 1.29%, which is roughly a third of the rate obtained in non-random tests. Higher rates were found on weekends (1.90% on Saturdays, 4.29% on Sundays) and especially at night. The rate is higher for men (1.45%) than for women (0.64%) and the percentage of positive outcomes shows an increasing pattern with age. In vehicles with two occupants, the proportion of alcohol-impaired drivers is estimated at 2.62%, but when the driver was alone the rate drops to 0.84%, which might reflect the socialization of drinking habits. The results are compared with outcomes in previous surveys, showing a decreasing trend in the prevalence of alcohol-impaired drivers over time.
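A stratified estimate of the kind used in such roadside surveys is a weighted average of per-stratum positive rates, with weights proportional to each stratum's share of circulating traffic. The strata, weights and counts below are invented for illustration and are not the Catalonia survey data.

```python
# Sketch of a stratified prevalence estimate: weight each stratum's
# positive rate by its share of traffic. All figures are hypothetical.

def stratified_prevalence(strata):
    """strata: list of (weight, positives, tested); weights sum to 1."""
    return sum(w * pos / n for w, pos, n in strata)

strata = [
    (0.70, 40, 5000),   # weekday traffic
    (0.20, 30, 1500),   # Saturday
    (0.10, 45, 1000),   # Sunday
]
print(f"{stratified_prevalence(strata):.4f}")  # overall weighted rate
```

Because non-random checkpoints oversample high-risk strata (weekend nights, targeted locations), an unweighted positive rate from such tests overstates the population prevalence, which is why the random stratified design yields the lower 1.29% figure.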
Abstract:
Objective: To analyze the determinants of emergency contraception non-use among women in unplanned and ambivalent pregnancies. Method: Cross-sectional study with a probabilistic sample of 366 pregnant women from 12 primary health care units in the city of São Paulo, Brazil. A multinomial logistic regression was performed, comparing three groups: women who used emergency contraception to prevent ongoing pregnancies (reference); women who made no use of emergency contraception but used other contraceptive methods; and women who made no use of any contraceptive methods at all. Results: Cohabitation with a partner was the common determinant of emergency contraception non-use. Lack of pregnancy risk awareness, ambivalent pregnancies and no previous use of emergency contraception also contributed to emergency contraception non-use. Conclusion: In contrast to what is reported in the literature, knowledge of emergency contraception and of the fertile period was not associated with its use.
Abstract:
The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P), along with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, very elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information such as Bayesian updating, combination of likelihoods, and robust M-estimation functions are simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turns out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
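Two of the compositional-data notions mentioned above, the centered log-ratio (clr) transform and the Aitchison distance, are easy to compute on the finite simplex; the Aitchison distance is simply the Euclidean distance between clr images. The three-part composition below is an arbitrary example.

```python
import math

# Minimal sketch of the centered log-ratio (clr) transform and the
# Aitchison distance on the simplex. The compositions are arbitrary
# illustrative values.

def clr(x):
    """Centered log-ratio: log of each part over the geometric mean."""
    g = math.exp(sum(math.log(v) for v in x) / len(x))  # geometric mean
    return [math.log(v / g) for v in x]

def aitchison_distance(x, y):
    """Euclidean distance between clr images."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(clr(x), clr(y))))

x = [0.2, 0.3, 0.5]
print(aitchison_distance(x, x))                    # 0.0
print(aitchison_distance(x, [0.1, 0.4, 0.5]) > 0)  # True
```

The clr components always sum to zero, which is the finite-dimensional shadow of the centering that the A2(P) construction generalizes to arbitrary spaces.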
Abstract:
The visual cortex in each hemisphere is linked to the opposite hemisphere by axonal projections that pass through the splenium of the corpus callosum. Visual-callosal connections in humans and macaques are found along the V1/V2 border where the vertical meridian is represented. Here we identify the topography of V1 vertical midline projections through the splenium within six human subjects with normal vision using diffusion-weighted MR imaging and probabilistic diffusion tractography. Tractography seed points within the splenium were classified according to their estimated connectivity profiles to topographic subregions of V1, as defined by functional retinotopic mapping. First, we report a ventral-dorsal mapping within the splenium with fibers from ventral V1 (representing the upper visual field) projecting to the inferior-anterior corner of the splenium and fibers from dorsal V1 (representing the lower visual field) projecting to the superior-posterior end. Second, we also report an eccentricity gradient of projections from foveal-to-peripheral V1 subregions running in the anterior-superior to posterior-inferior direction, orthogonal to the dorsal-ventral mapping. These results confirm and add to a previous diffusion MRI study (Dougherty et al., 2005) which identified a dorsal/ventral mapping of human splenial fibers. These findings yield a more detailed view of the structural organization of the splenium than previously reported and offer new opportunities to study structural plasticity in the visual system.
Abstract:
We start with a generalization of the well-known three-door problem: the n-door problem. The solution of this new problem leads us to a beautiful representation system for real numbers in (0,1] as alternating series, known in the literature as Pierce expansions. A closer look at Pierce expansions will take us to some metrical properties of sets defined through the Pierce expansions of their elements. Finally, these metrical properties will enable us to present 'strange' sets, similar to the classical Cantor set.
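In one common reading of the n-door generalization, the host opens n−2 non-winning doors and the always-switching player wins with probability (n−1)/n; a short Monte-Carlo check illustrates this. The simulation setup is an assumption about the game's rules, not taken from the paper.

```python
import random

# Monte-Carlo sketch of an n-door Monty Hall game in which the host
# opens n-2 non-winning doors and the player always switches.
# Switching then wins exactly when the first pick missed the prize,
# i.e. with probability (n-1)/n.

def play_switch(n, rng):
    prize = rng.randrange(n)
    first_pick = rng.randrange(n)
    return first_pick != prize  # switching wins iff the first pick was wrong

def win_rate(n, trials=100_000, seed=0):
    rng = random.Random(seed)
    return sum(play_switch(n, rng) for _ in range(trials)) / trials

print(round(win_rate(3), 2))   # ≈ 0.67
print(round(win_rate(10), 2))  # ≈ 0.90
```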
Abstract:
The objective of this study consists in quantifying in money terms the potential reduction in usage of public health care outlets associated with the tenure of double (public plus private) insurance. In order to address the problem, a probabilistic model for visits to physicians is specified and estimated using data from the Catalonian Health Survey. Also, a model for the marginal cost of a visit to a physician is estimated using data from a representative sample of fee-for-service payments from a major insurer. Combining the estimates from the two models, it is possible to quantify in money terms the cost/savings of alternative policies which bear an impact on the adoption of double insurance by the population. The results suggest that the private sector absorbs an important volume of demand which would be redirected to the public sector if consumers ceased to hold double insurance.
Abstract:
Scoring rules that elicit an entire belief distribution through the elicitation of point beliefs are time-consuming and demand considerable cognitive effort. Moreover, the results are valid only when agents are risk-neutral or when one uses probabilistic rules. We investigate a class of rules in which the agent has to choose an interval and is rewarded (deterministically) on the basis of the chosen interval and the realization of the random variable. We formulate an efficiency criterion for such rules and present a specific interval scoring rule. For single-peaked beliefs, our rule gives information about both the location and the dispersion of the belief distribution. These results hold for all concave utility functions.
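The general shape of a deterministic interval rule can be sketched as a hit reward minus a width penalty. The specific functional form below is an illustrative assumption, not the rule proposed in the paper.

```python
# Hedged sketch of a deterministic interval scoring rule of the general
# kind discussed: the agent reports [lower, upper] and is paid a reward
# when the realization lands inside, minus a penalty growing with the
# interval's width. This exact form is an assumption for illustration.

def interval_score(lower, upper, realization, width_penalty=0.5):
    hit = 1.0 if lower <= realization <= upper else 0.0
    return hit - width_penalty * (upper - lower)

print(interval_score(0.4, 0.6, 0.55))  # hit:  ≈ 0.9
print(interval_score(0.4, 0.6, 0.80))  # miss: ≈ -0.1
```

The width penalty is what makes the report informative: a very wide interval almost always "hits" but earns little, so the optimal interval's location and length reveal the location and dispersion of a single-peaked belief.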
Abstract:
The first generation models of currency crises have often been criticized because they predict that, in the absence of very large triggering shocks, currency attacks should be predictable and lead to small devaluations. This paper shows that these features of first generation models are not robust to the inclusion of private information. In particular, this paper analyzes a generalization of the Krugman-Flood-Garber (KFG) model, which relaxes the assumption that all consumers are perfectly informed about the level of fundamentals. In this environment, the KFG equilibrium of zero devaluation is only one of many possible equilibria. In all the other equilibria, the lack of perfect information delays the attack on the currency past the point at which the shadow exchange rate equals the peg, giving rise to unpredictable and discrete devaluations.
Abstract:
Recently, kernel-based Machine Learning methods have gained great popularity in many data analysis and data mining fields: pattern recognition, biocomputing, speech and vision, engineering, remote sensing etc. The paper describes the use of kernel methods to approach the processing of large datasets from environmental monitoring networks. Several typical problems of the environmental sciences and their solutions provided by kernel-based methods are considered: classification of categorical data (soil type classification), mapping of environmental and pollution continuous information (pollution of soil by radionuclides), mapping with auxiliary information (climatic data from Aral Sea region). The promising developments, such as automatic emergency hot spot detection and monitoring network optimization are discussed as well.
Abstract:
In this paper we consider an insider with privileged information that is affected by an independent noise vanishing as the revelation time approaches. At this time, information is available to every trader. Our financial markets are based on Wiener space. In probabilistic terms we obtain an infinite-dimensional extension of Jacod's theorem to cover cases of progressive enlargement of filtrations. The application of this result gives the semimartingale decomposition of the original Wiener process under the progressively enlarged filtration. As an application we prove that if the rate at which the additional noise in the insider's information vanishes is slow enough, then there is no arbitrage and the additional utility of the insider is finite.
Abstract:
We exhibit and characterize an entire class of simple adaptive strategies, in the repeated play of a game, having the Hannan-consistency property: in the long run, the player is guaranteed an average payoff as large as the best-reply payoff to the empirical distribution of play of the other players; i.e., there is no "regret." Smooth fictitious play (Fudenberg and Levine [1995]) and regret-matching (Hart and Mas-Colell [1998]) are particular cases. The motivation and application of this work come from the study of procedures whose empirical distribution of play is, in the long run, (almost) a correlated equilibrium. The basic tool for the analysis is a generalization of Blackwell's [1956a] approachability strategy for games with vector payoffs.
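The action-selection step of regret-matching, one of the particular cases named above, can be sketched compactly: play each action with probability proportional to its positive cumulative regret. The cumulative-regret vector below is an invented example, not data from the paper.

```python
import random

# Sketch of the regret-matching action rule (Hart and Mas-Colell):
# choose each action with probability proportional to its positive
# cumulative regret; uniform when no action has positive regret.
# The regret vector used below is an illustrative example.

def regret_matching_step(cum_regret, rng):
    positive = [max(r, 0.0) for r in cum_regret]
    total = sum(positive)
    if total == 0.0:
        return rng.randrange(len(cum_regret))  # no regret: play uniformly
    pick, acc = rng.random() * total, 0.0
    for action, p in enumerate(positive):
        acc += p
        if pick < acc:
            return action
    return len(positive) - 1

rng = random.Random(1)
cum_regret = [3.0, 1.0, 0.0]
counts = [0, 0, 0]
for _ in range(10_000):
    counts[regret_matching_step(cum_regret, rng)] += 1
print([c / 10_000 for c in counts])  # roughly [0.75, 0.25, 0.0]
```

Updating `cum_regret` each period from realized versus counterfactual payoffs, and replaying this rule, is what drives the empirical distribution of play toward the set of correlated equilibria.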