999 results for Negative probability


Relevance:

70.00%

Publisher:

Abstract:

Two stochastic models have been fitted to daily rainfall data for an interior station of Brazil. Of these two models, the results show that the truncated negative probability model describes the data better than the Markov chain probability model. The Kolmogorov-Smirnov test is applied to assess the significance of each model's fit. © 1983 Springer-Verlag.
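As a sketch of the Markov chain approach mentioned above (not the authors' fitted model, and using a hypothetical wet/dry record rather than the Brazilian station data), a first-order chain for daily rainfall occurrence can be estimated by counting transitions:

```python
def fit_markov_chain(wet_days):
    """Estimate first-order Markov transition probabilities
    P(wet tomorrow | today's state) from a 0/1 wet-day sequence."""
    counts = {0: [0, 0], 1: [0, 0]}  # counts[today][tomorrow]
    for today, tomorrow in zip(wet_days, wet_days[1:]):
        counts[today][tomorrow] += 1
    return {state: row[1] / sum(row) if sum(row) else 0.0
            for state, row in counts.items()}

# Hypothetical wet/dry record (1 = wet day)
record = [0, 0, 1, 1, 1, 0, 1, 0, 0, 1, 1, 0]
p_wet = fit_markov_chain(record)
# p_wet[0] = P(wet | dry), p_wet[1] = P(wet | wet)
```

The fitted chain can then generate synthetic wet/dry sequences whose run-length statistics are compared with the observed record.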

Relevance:

60.00%

Publisher:

Abstract:

This paper studies several topics related to the concept of "fractional" that are not directly related to Fractional Calculus, but can help the reader pursue new research directions. We introduce the concepts of non-integer positional number systems, fractional sums, fractional powers of a square matrix, tolerant computing and FracSets, negative probabilities, fractional-delay discrete-time linear systems, and the fractional Fourier transform.
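One of the listed topics, non-integer positional number systems, can be illustrated with a small sketch (not from the paper itself): a greedy digit expansion in base φ, the golden ratio, where 2 has the exact representation φ¹ + φ⁻²:

```python
PHI = (1 + 5 ** 0.5) / 2  # golden ratio, used here as an irrational radix

def to_base_phi(value, n_digits=12):
    """Greedy digit expansion of a non-negative number in base phi,
    a simple example of a non-integer positional number system."""
    k = 0
    while PHI ** (k + 1) <= value:   # highest power of phi not exceeding value
        k += 1
    digits, rem = {}, float(value)
    for e in range(k, k - n_digits, -1):
        if PHI ** e <= rem + 1e-12:  # greedy: take the digit if it fits
            digits[e] = 1
            rem -= PHI ** e
        else:
            digits[e] = 0
    return digits, rem

# 2 = phi^1 + phi^-2 exactly, since phi + (2 - phi) = 2
digits, rem = to_base_phi(2)
```

The greedy rule guarantees no two consecutive 1-digits appear, the base-φ analogue of a canonical representation.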

Relevance:

40.00%

Publisher:

Abstract:

Background: Smear-negative pulmonary tuberculosis (SNPTB) accounts for 30% of pulmonary tuberculosis (PTB) cases reported annually in developing nations. Polymerase chain reaction (PCR) may provide an alternative for the rapid detection of Mycobacterium tuberculosis (MTB); however, little data are available regarding the clinical utility of PCR in SNPTB in settings with a high burden of TB/HIV co-infection.

Methods: To evaluate the performance of the PCR dot-blot in parallel with pretest probability (clinical suspicion) in patients suspected of having SNPTB, a prospective study of 213 individuals with clinical and radiological suspicion of SNPTB was carried out from May 2003 to May 2004 in a TB/HIV reference hospital. Respiratory specialists classified the pretest probability of active disease as high, intermediate, or low. Expectorated sputum was examined by direct microscopy (Ziehl-Neelsen staining), culture (Lowenstein-Jensen) and PCR dot-blot. The gold standard was culture positivity combined with the clinical definition of PTB.

Results: Among smear-negative and HIV-positive subjects, active PTB was diagnosed in 28.4% (43/151) and 42.2% (19/45), respectively. In the high, intermediate and low pretest probability categories, active PTB was diagnosed in 67.4% (31/46), 24% (6/25) and 7.5% (6/80), respectively. PCR had a sensitivity of 65% (95% CI: 50%-78%) and a specificity of 83% (95% CI: 75%-89%). There was no difference in the sensitivity of PCR in relation to HIV status. PCR sensitivity was 69% in patients not previously treated for TB and 43% in those treated in the past; the corresponding specificities were 85% and 80%. A high pretest probability, when used as a diagnostic test, had a sensitivity of 72% (95% CI: 57%-84%) and a specificity of 86% (95% CI: 78%-92%). Using the PCR dot-blot in parallel with a high pretest probability as a diagnostic test, the sensitivity, specificity, and positive and negative predictive values were 90%, 71%, 75% and 88%, respectively. Among subjects not previously treated for TB, this approach had sensitivity, specificity, and positive and negative predictive values of 91%, 79%, 81% and 90%; among HIV-positive subjects, the corresponding values were 90%, 65%, 72% and 88%.

Conclusion: The PCR dot-blot associated with a high clinical suspicion may make an important contribution to the diagnosis of SNPTB, mainly in patients not previously treated who are attended at a TB/HIV reference hospital.
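The relationship between a test's sensitivity/specificity and its predictive values depends on prevalence, via Bayes' rule. The sketch below illustrates that back-calculation; note that the predictive values reported in the abstract come from the study's own 2x2 counts, so plugging in the overall 28.4% prevalence does not reproduce them exactly:

```python
def predictive_values(sens, spec, prev):
    """Positive and negative predictive values from sensitivity,
    specificity and disease prevalence, via Bayes' rule."""
    tp = sens * prev               # true-positive mass
    fp = (1 - spec) * (1 - prev)   # false-positive mass
    fn = (1 - sens) * prev         # false-negative mass
    tn = spec * (1 - prev)         # true-negative mass
    return tp / (tp + fp), tn / (tn + fn)

# Parallel PCR + high pretest probability (sens 90%, spec 71%, per the
# abstract) evaluated at the reported smear-negative prevalence of 28.4%
ppv, npv = predictive_values(0.90, 0.71, 0.284)
```

A perfect test (sensitivity = specificity = 1) gives PPV = NPV = 1 at any prevalence, which is a useful sanity check.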

Relevance:

40.00%

Publisher:

Abstract:

This article provides an importance sampling algorithm for computing the probability of ruin with recuperation of a spectrally negative Lévy risk process with light-tailed downward jumps. Ruin with recuperation corresponds to the following double passage event: for some t ∈ (0,∞), the risk process starting at level x ∈ [0,∞) falls below the null level during the period [0,t] and returns above the null level at the end of the period t. The proposed Monte Carlo estimator is logarithmically efficient, as t, x → ∞, when y = t/x is constant and below a certain bound.
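The double passage event can be checked path by path. The sketch below is a plain (not importance-sampled) Monte Carlo estimate on a Gaussian random-walk surrogate of the risk process; the paper's actual contribution, an efficient estimator for the Lévy process itself, is not reproduced here, and the drift and horizon values are illustrative:

```python
import random

def ruin_recuperation_prob(x, t, n_paths, drift=0.2, seed=1):
    """Plain Monte Carlo estimate of the double passage event:
    a path started at level x falls below 0 during [0, t] and is
    back above 0 at time t (Gaussian random-walk surrogate)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_paths):
        level, dipped = x, False
        for _ in range(t):
            level += drift + rng.gauss(0, 1)  # premium drift plus noise
            if level < 0:
                dipped = True                 # ruin occurred at some point
        if dipped and level > 0:              # ...and the process recuperated
            hits += 1
    return hits / n_paths

est = ruin_recuperation_prob(x=2.0, t=50, n_paths=2000)
```

Because the event becomes rare as x grows, the relative error of this naive estimator blows up, which is precisely why the paper develops an importance sampling scheme.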

Relevance:

30.00%

Publisher:

Abstract:

The stochastic simulation algorithm was introduced by Gillespie and, in a different form, by Kurtz. There have been many attempts at accelerating the algorithm without deviating from the behavior of the simulated system. The crux of the explicit τ-leaping procedure is the use of Poisson random variables to approximate the number of occurrences of each type of reaction event during a carefully selected time period, τ. This method is acceptable provided the leap condition, namely that no propensity function changes "significantly" during any time-step, is met. With this method, species numbers can artificially become negative. Several recent papers have demonstrated methods that avoid this situation. One such method classifies as critical those reactions in danger of sending species populations negative; at most one of these critical reactions is allowed to occur in the next time-step. We argue that the criticality of a reactant species and its dependent reaction channels should instead be related to the probability of the species number becoming negative. This way, only reactions that, if fired, have a high probability of driving a reactant population negative are labeled critical. The firings of the remaining reaction channels can be approximated using Poisson random variables, thus speeding up the simulation while maintaining its accuracy. In implementing this revised criticality selection we make use of the probability distribution from which the random variable describing the change in species number is drawn. We give several numerical examples to demonstrate the effectiveness of our new method.
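A minimal sketch of the idea, for a single decay reaction A → ∅ rather than a full reaction network, and with an illustrative criticality threshold: the leap count is Poisson, and the reaction is flagged critical exactly when the Poisson tail probability of overshooting the current population is non-negligible, echoing the probability-based criticality rule described above:

```python
import math, random

def poisson_sample(lam, rng):
    """Knuth's Poisson sampler (adequate for moderate lam)."""
    limit, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= limit:
            return k
        k += 1

def poisson_tail(lam, n):
    """P(K > n) for K ~ Poisson(lam), via a stable running term."""
    term = math.exp(-lam)  # P(K = 0)
    cdf = term
    for k in range(1, n + 1):
        term *= lam / k    # P(K = k) from P(K = k-1), no overflow
        cdf += term
    return max(0.0, 1.0 - cdf)

def tau_leap_decay(n0, c, tau, steps, crit_prob=1e-3, seed=0):
    """Explicit tau-leaping for A -> 0 with propensity a(n) = c*n.
    The reaction is critical when P(K > n), the chance a Poisson
    leap drives the population negative, exceeds crit_prob; a
    critical step fires at most one reaction."""
    rng = random.Random(seed)
    n = n0
    for _ in range(steps):
        if n == 0:
            break
        lam = c * n * tau  # expected firings during the leap
        if poisson_tail(lam, n) > crit_prob:
            k = 1 if rng.random() < min(lam, 1.0) else 0  # at most one firing
        else:
            k = min(poisson_sample(lam, rng), n)          # full Poisson leap
        n -= k
    return n

final = tau_leap_decay(n0=1000, c=0.1, tau=0.5, steps=20)
```

With these parameters each leap removes about 5% of the population, so the count stays safely positive and the critical branch is rarely taken; shrinking n0 or enlarging τ makes the criticality test bite.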

Relevance:

30.00%

Publisher:

Abstract:

Background: Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult owing to species biology and behavioural characteristics. The design of robust sampling programmes should be based on an underlying statistical distribution that is sufficiently flexible to capture variations in the spatial distribution of the target species.

Results: Comparisons are made of the accuracy of four probability-of-detection sampling models, namely the negative binomial model [1], the Poisson model [1], the double logarithmic model [2] and the compound model [3], for the detection of insects over a broad range of insect densities. Although the double log and negative binomial models performed well under specific conditions, it is shown that, of the four models examined, the compound model performed best over a broad range of insect spatial distributions and densities. In particular, this model predicted well the number of samples required when insect density was high and clumped within experimental storages.

Conclusions: This paper reinforces the need for effective sampling programmes designed to detect insects over a broad range of spatial distributions. The compound model is robust over a broad range of insect densities and leads to substantial improvement in detection probabilities within highly variable systems such as grain storage.
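For the negative binomial model specifically, the probability-of-detection calculation has a simple closed form: a sample unit contains no insects with probability (k/(k+μ))^k, where μ is the mean count and k the dispersion parameter, so the number of units needed for a given detection confidence follows directly. A sketch (the densities chosen here are illustrative, not the paper's data):

```python
import math

def nb_zero_prob(mu, k):
    """P(X = 0) for a negative binomial count with mean mu
    and dispersion parameter k (small k = strong clumping)."""
    return (k / (k + mu)) ** k

def samples_needed(mu, k, confidence=0.95):
    """Number of independent sample units required so that at
    least one unit contains an insect with the given confidence,
    under a negative binomial count model."""
    p0 = nb_zero_prob(mu, k)
    return math.ceil(math.log(1 - confidence) / math.log(p0))

n_clumped = samples_needed(mu=0.5, k=0.2)   # strongly clumped population
n_random = samples_needed(mu=0.5, k=1e6)    # near-Poisson (random) population
```

The comparison makes the paper's point concrete: at the same mean density, a clumped population requires roughly twice as many sample units as a randomly dispersed one.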

Relevance:

30.00%

Publisher:

Abstract:

TWIK-related K+ channel TREK1, a background leak K+ channel, has been strongly implicated as the target of several general and local anesthetics. Here, using the whole-cell and single-channel patch-clamp techniques, we investigated the effect of lidocaine, a local anesthetic, on the human (h) TREK1 channel heterologously expressed in human embryonic kidney 293 cells by an adenoviral-mediated expression system. Lidocaine, at clinical concentrations, produced reversible, concentration-dependent inhibition of hTREK1 current, with an IC50 value of 180 μM, by reducing the single-channel open probability and stabilizing the closed state. We have identified a strategically placed, unique aromatic couplet (Tyr352 and Phe355) in the vicinity of the protein kinase A phosphorylation site, Ser348, in the C-terminal domain (CTD) of hTREK1 that is critical for the action of lidocaine. Furthermore, the phosphorylation state of Ser348 was found to have a regulatory role in lidocaine-mediated inhibition of hTREK1. Interestingly, we observed strong intersubunit negative cooperativity (Hill coefficient = 0.49) and half-of-sites saturation binding stoichiometry (half-reaction order) for the binding of lidocaine to hTREK1. Studies with the heterodimer of wild-type (wt) hTREK1 and the Δ119 C-terminal deletion mutant (hTREK1(wt)-Δ119) revealed that a single CTD of hTREK1 is capable of mediating partial inhibition by lidocaine, but complete inhibition requires cooperative interaction between both CTDs upon lidocaine binding. Based on our observations, we propose a model that explains the unique kinetics and provides a plausible paradigm for the inhibitory action of lidocaine on hTREK1.
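The reported IC50 (180 μM) and Hill coefficient (0.49) plug directly into the standard Hill equation for concentration-dependent block; a coefficient below 1 gives the shallow dose-response curve characteristic of negative cooperativity. A small sketch (the equation is the textbook form, not a model fitted by the authors here):

```python
def fractional_block(conc_um, ic50_um=180.0, hill=0.49):
    """Fractional inhibition of hTREK1 current at a given lidocaine
    concentration (uM), from the Hill equation with the reported
    IC50 (180 uM) and Hill coefficient (0.49)."""
    return conc_um ** hill / (ic50_um ** hill + conc_um ** hill)

at_ic50 = fractional_block(180.0)    # 50% block at the IC50, by construction
tenfold = fractional_block(1800.0)   # shallow curve: 10x the IC50 still
                                     # falls well short of complete block
```

The shallowness is the kinetic signature discussed in the abstract: with h = 0.49, even a tenfold excess over the IC50 blocks only about three quarters of the current.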

Relevance:

30.00%

Publisher:

Abstract:

The effects of the initial height on the temporal persistence probability of steady-state height fluctuations in up-down symmetric linear models of surface growth are investigated. We study the (1 + 1)-dimensional Family model and the (1 + 1)- and (2 + 1)-dimensional larger curvature (LC) model. Both the Family and LC models have up-down symmetry, so the positive and negative persistence probabilities in the steady state, averaged over all values of the initial height h(0), are equal to each other. However, these two probabilities are not equal if one considers a fixed nonzero value of h(0). Plots of the positive persistence probability for negative initial height versus time exhibit power-law behavior if the magnitude of the initial height is larger than the interface width at saturation. By symmetry, the negative persistence probability for positive initial height exhibits the same behavior. The persistence exponent that describes this power-law decay decreases as the magnitude of the initial height is increased. The dependence of the persistence probability on the initial height, the system size, and the discrete sampling time is found to exhibit scaling behavior.
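Persistence probabilities of this kind are typically measured by simulation: the fraction of realizations whose fluctuation has not yet crossed zero by time t. The sketch below does this for a plain random walk started at a fixed positive height, a toy stand-in for the interface-height models studied in the paper (the walk is not the Family or LC dynamics):

```python
import random

def positive_persistence(h0, n_walks, t_max, seed=2):
    """Fraction of random walks started at height h0 > 0 that have
    stayed strictly positive up to each time t; a toy analogue of
    the persistence probability for a fixed initial height."""
    rng = random.Random(seed)
    surviving = [0] * t_max
    for _ in range(n_walks):
        h = h0
        for t in range(t_max):
            h += rng.gauss(0, 1)
            if h <= 0:
                break                # first zero crossing ends persistence
            surviving[t] += 1
    return [s / n_walks for s in surviving]

p = positive_persistence(h0=5.0, n_walks=2000, t_max=100)
```

By construction the curve is non-increasing, and raising h0 above the typical fluctuation scale delays the decay, mirroring the initial-height dependence described in the abstract.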

Relevance:

30.00%

Publisher:

Abstract:

The effect of changing the Be doping concentration in the GaAs layer on the integrated photosensitivity of negative-electron-affinity GaAs photocathodes is investigated. Two GaAs samples, one with a monolayer structure and one with a multilayer structure, are grown by molecular beam epitaxy. The former has a constant Be concentration of 1 × 10^19 cm^-3, while the latter comprises four layers with Be doping concentrations of 1 × 10^19, 7 × 10^18, 4 × 10^18, and 1 × 10^18 cm^-3 from the bottom to the surface. Negative-electron-affinity GaAs photocathodes are fabricated by activating the sample surfaces with alternating input of Cs and O in a high-vacuum system. Spectral response results measured by the on-line spectral response measurement system show that the integrated photosensitivity of the photocathode with the multilayer structure is enhanced by at least 50% compared with that of the monolayer structure. This is attributed to the improvement in crystal quality and the increase in the surface escape probability. Different stress situations are observed on the GaAs samples with the monolayer and multilayer structures, respectively.

Relevance:

30.00%

Publisher:

Abstract:

The class of all Exponential-Polynomial-Trigonometric (EPT) functions is classical and equal to the Euler-d'Alembert class of solutions of linear differential equations with constant coefficients. The class of non-negative EPT functions defined on [0,∞) was discussed in Hanzon and Holland (2010), of which EPT probability density functions are an important subclass. EPT functions can be represented as ce^(Ax)b, where A is a square matrix, b a column vector and c a row vector, and the triple (A, b, c) is the minimal realization of the EPT function. The minimal triple is unique only up to a basis transformation. Here the class of 2-EPT probability density functions on R is defined and shown to be closed under a variety of operations. The class is also generalised to include mixtures with a pointmass at zero. This class coincides with the class of probability density functions with rational characteristic functions. It is illustrated that the Variance Gamma density is a 2-EPT density under a parameter restriction. A discrete 2-EPT process is a process which has stochastically independent 2-EPT random variables as increments. It is shown that the distribution of the minimum and maximum of such a process is an EPT density mixed with a pointmass at zero. The Laplace transforms of these distributions correspond to the discrete-time Wiener-Hopf factors of the discrete-time 2-EPT process. A distribution of daily log-returns, observed over the period 1931-2011 for a prominent US index, is approximated with a 2-EPT density function. Without the non-negativity condition, it is illustrated how this problem is transformed into a discrete-time rational approximation problem. The rational approximation software RARL2 is used to carry out this approximation. The non-negativity constraint is then imposed via a convex optimisation procedure after the unconstrained approximation.

Sufficient and necessary conditions are derived to characterise infinitely divisible EPT and 2-EPT functions. Infinitely divisible 2-EPT density functions generate 2-EPT Lévy processes. An asset's log returns can be modelled as a 2-EPT Lévy process. Closed-form pricing formulae are then derived for European options with specific times to maturity. Formulae for discretely monitored Lookback options and 2-period Bermudan options are also provided. Certain Greeks, including Delta and Gamma, of these options are also computed analytically. MATLAB scripts are provided for calculations involving 2-EPT functions. Numerical option pricing examples illustrate the effectiveness of the 2-EPT approach to financial modelling.
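The representation f(x) = c·e^(Ax)·b is directly computable once a matrix exponential is available. The sketch below (not the thesis's MATLAB code) evaluates the Erlang-2 density λ²xe^(−λx) from its minimal triple, using a truncated Taylor series for e^(Ax), which is adequate for small, well-scaled matrices:

```python
import math

def mat_exp(A, x, terms=60):
    """e^(A*x) for a small square matrix via a truncated Taylor
    series: sum of (Ax)^k / k! (fine for small, well-scaled A*x)."""
    n = len(A)
    M = [[A[i][j] * x for j in range(n)] for i in range(n)]
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # I
    power = [row[:] for row in result]                              # (Ax)^0/0!
    for k in range(1, terms):
        # power <- power * M / k, i.e. (Ax)^k / k!
        power = [[sum(power[i][l] * M[l][j] for l in range(n)) / k
                  for j in range(n)] for i in range(n)]
        result = [[result[i][j] + power[i][j] for j in range(n)]
                  for i in range(n)]
    return result

def ept(c, A, b, x):
    """Evaluate the EPT function f(x) = c * e^(Ax) * b."""
    E = mat_exp(A, x)
    Eb = [sum(E[i][j] * b[j] for j in range(len(b))) for i in range(len(A))]
    return sum(c[i] * Eb[i] for i in range(len(c)))

# Erlang-2 density lam^2 * x * exp(-lam*x) as a minimal EPT triple
lam = 1.5
A = [[-lam, 1.0], [0.0, -lam]]   # Jordan block at -lam
b = [0.0, 1.0]
c = [lam * lam, 0.0]
val = ept(c, A, b, 2.0)
```

Any basis change T maps (A, b, c) to (TAT⁻¹, Tb, cT⁻¹) without changing f, which is the non-uniqueness of the minimal triple mentioned above.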

Relevance:

30.00%

Publisher:

Abstract:

In regression analysis of counts, a lack of simple and efficient algorithms for posterior computation has made Bayesian approaches appear unattractive and thus underdeveloped. We propose a lognormal and gamma mixed negative binomial (NB) regression model for counts and present efficient closed-form Bayesian inference; unlike conventional Poisson models, the proposed approach has two free parameters to accommodate two different kinds of random effects, and allows the incorporation of prior information, such as sparsity in the regression coefficients. By placing a gamma distribution prior on the NB dispersion parameter r, and connecting a lognormal distribution prior with the logit of the NB probability parameter p, efficient Gibbs sampling and variational Bayes inference are both developed. The closed-form updates are obtained by exploiting conditional conjugacy via both a compound Poisson representation and a Polya-Gamma distribution-based data augmentation approach. The proposed Bayesian inference can be implemented routinely, while being easily generalizable to more complex settings involving multivariate dependence structures. The algorithms are illustrated using real examples. Copyright 2012 by the author(s)/owner(s).
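The (r, p) parametrization used above has a convenient generative form: NB(r, p) is a Poisson distribution whose rate is itself gamma distributed, which is also the representation that makes the conjugate updates tractable. A minimal sampling sketch (illustrating the mixture only, not the paper's Gibbs or variational algorithms):

```python
import math, random

def sample_nb(r, p, rng):
    """Draw from NB(r, p) via its gamma-Poisson mixture:
    lambda ~ Gamma(shape=r, scale=p/(1-p)), then X | lambda ~ Poisson(lambda).
    The resulting count has mean r*p/(1-p)."""
    lam = rng.gammavariate(r, p / (1.0 - p))
    # Knuth's Poisson sampler (adequate for moderate lambda)
    limit, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= limit:
            return k
        k += 1

rng = random.Random(7)
r, p = 2.0, 0.4
draws = [sample_nb(r, p, rng) for _ in range(20000)]
mean = sum(draws) / len(draws)   # should be close to r*p/(1-p) = 4/3
```

The extra gamma layer is what gives the NB its overdispersion relative to a plain Poisson model, the flexibility the abstract refers to.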

Relevance:

30.00%

Publisher:

Abstract:

Multiple cue probability learning (MCPL) involves learning to predict a criterion based on a set of novel cues when feedback is provided in response to each judgment made. But to what extent does MCPL require controlled attention and explicit hypothesis testing? The results of two experiments show that this depends on cue polarity. Learning about cues that predict positively is aided by automatic cognitive processes, whereas learning about cues that predict negatively is especially demanding of controlled attention and hypothesis-testing processes. In the studies reported here, negative, but not positive, cue learning was related to individual differences in working memory capacity, both on measures of overall judgment performance and in modelling of the implicit learning process. However, the introduction of a novel method to monitor participants' explicit beliefs about a set of cues on a trial-by-trial basis revealed that participants were engaged in explicit hypothesis testing about both positive and negative cues, and explicit beliefs about both types of cues were linked to working memory capacity. Taken together, our results indicate that while people are engaged in explicit hypothesis testing during cue learning, explicit beliefs are applied to judgment only when cues are negative. © 2012 Elsevier Inc.

Relevance:

30.00%

Publisher:

Abstract:

Multiple-cue probability learning (MCPL) involves learning to predict a criterion when outcome feedback is provided for multiple cues. A great deal of research suggests that working memory capacity (WMC) is involved in a wide range of tasks that draw on higher level cognitive processes. In three experiments, we examined the role of WMC in MCPL by introducing measures of working memory capacity, as well as other task manipulations. While individual differences in WMC positively predicted performance in some kinds of multiple-cue tasks, performance on other tasks was entirely unrelated to these differences. Performance on tasks that contained negative cues was correlated with working memory capacity, as well as measures of explicit knowledge obtained in the learning process. When the relevant cues predicted positively, however, WMC became irrelevant. The results are discussed in terms of controlled and automatic processes in learning and judgement. © 2011 The Experimental Psychology Society.