54 results for Survival probability

at Indian Institute of Science - Bangalore - India


Relevance: 100.00%

Abstract:

We derive a very general expression for the survival probability and the first passage time distribution of a particle executing Brownian motion in full phase space with an absorbing boundary condition at a point in position space, valid irrespective of the statistical nature of the dynamics. The expression, together with Jensen's inequality, naturally leads to a lower bound on the actual survival probability and to an approximate first passage time distribution. These are expressed in terms of the position-position, velocity-velocity, and position-velocity variances, so knowledge of these variances enables one to compute a lower bound on the survival probability and consequently the first passage time distribution. As examples, we compute these for a Gaussian Markovian process and, in the non-Markovian case, for an exponentially decaying friction kernel and for a power-law friction kernel. Our analysis shows that the survival probability decays exponentially at long times, irrespective of the nature of the dynamics, with an exponent equal to the transition state rate constant.
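As a rough illustration of the quantity being bounded (not the paper's derivation), the sketch below estimates the survival probability S(t) of an underdamped Brownian particle with an absorbing point in position space by direct simulation; the Langevin discretization and all parameter values are our own choices for the example.

```python
import numpy as np

# Monte Carlo estimate of the survival probability S(t) for Brownian motion
# in full phase space: dx = v dt, dv = -gamma*v dt + sqrt(2*gamma*kT/m) dW,
# with an absorbing boundary at x = a. Parameters are illustrative only.
rng = np.random.default_rng(0)
n_traj, n_steps, dt = 5000, 2000, 1e-3
gamma, kT, m, a = 1.0, 1.0, 1.0, 2.0

x = np.zeros(n_traj)                 # all trajectories start at x = 0, v = 0
v = np.zeros(n_traj)
alive = np.ones(n_traj, dtype=bool)
survival = np.empty(n_steps)

for i in range(n_steps):
    dW = np.sqrt(dt) * rng.standard_normal(n_traj)
    v[alive] += -gamma * v[alive] * dt + np.sqrt(2 * gamma * kT / m) * dW[alive]
    x[alive] += v[alive] * dt
    alive &= x < a                   # absorb trajectories that crossed x = a
    survival[i] = alive.mean()       # fraction still alive approximates S(t)

print("estimated S(t = 2):", survival[-1])
```

The first passage time distribution then follows as -dS/dt, e.g. via np.gradient(survival, dt).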

Relevance: 100.00%

Abstract:

We report numerical and analytic results for the spatial survival probability for fluctuating one-dimensional interfaces with Edwards-Wilkinson or Kardar-Parisi-Zhang dynamics in the steady state. Our numerical results are obtained from analysis of steady-state profiles generated by integrating a spatially discretized form of the Edwards-Wilkinson equation to long times. We show that the survival probability exhibits scaling behavior in its dependence on the system size and the "sampling interval" used in the measurement for both "steady-state" and "finite" initial conditions. Analytic results for the scaling functions are obtained from a path-integral treatment of a formulation of the problem in terms of one-dimensional Brownian motion. A "deterministic approximation" is used to obtain closed-form expressions for survival probabilities from the formally exact analytic treatment. The resulting approximate analytic results provide a fairly good description of the numerical data.
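A minimal numerical sketch of the procedure described, assuming a simple Euler discretization of the Edwards-Wilkinson equation dh/dt = nu d2h/dx2 + eta on a periodic lattice; the parameters and the precise definition of the sampling window are our own choices.

```python
import numpy as np

# Integrate a spatially discretized Edwards-Wilkinson equation to long times,
# then measure the spatial survival probability: the chance that the height
# fluctuation h(x) - <h> keeps its sign over a window of l lattice spacings.
rng = np.random.default_rng(1)
L, nu, dt, n_steps = 512, 1.0, 0.05, 50_000

h = np.zeros(L)
for _ in range(n_steps):
    lap = np.roll(h, 1) - 2.0 * h + np.roll(h, -1)        # discrete Laplacian
    h += nu * lap * dt + np.sqrt(dt) * rng.standard_normal(L)

def spatial_survival(h, l):
    """Fraction of sites whose height fluctuation keeps its sign over the
    next l lattice spacings (periodic boundaries)."""
    u = h - h.mean()
    ok = np.ones(L, dtype=bool)
    for shift in range(1, l + 1):
        ok &= np.sign(np.roll(u, -shift)) == np.sign(u)
    return ok.mean()

print([spatial_survival(h, l) for l in (2, 8, 32)])
```

Averaging over many independent steady-state profiles, and repeating for several system sizes L, would expose the size and sampling-interval scaling discussed in the abstract.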

Relevance: 70.00%

Abstract:

The properties of the generalized survival probability, that is, the probability of not crossing an arbitrary location R during relaxation, have been investigated experimentally (via scanning tunneling microscope observations) and numerically. The results confirm that the generalized survival probability decays exponentially with a time constant tau_s(R). The distance dependence of the time constant is shown to be tau_s(R) = tau_s0 exp[-R/w(T)], where w^2(T) is the material-dependent mean-squared width of the step fluctuations. The result reveals the dependence on the physical parameters of the system inherent in the prior prediction of the time constant scaling with R/L^alpha, with L the system size and alpha the roughness exponent. The survival behavior is also analyzed using a contrasting concept, the generalized inside survival S_in(t, R), which involves fluctuations to an arbitrary location R further from the average. Numerical simulations of the inside survival probability also show an exponential time dependence, and the extracted time constant empirically shows (R/w)^lambda behavior, with lambda varying from 0.6 to 0.8 as the sampling conditions are changed. The experimental data show similar behavior, and can be well fit with lambda = 1.0 for T = 300 K, and 0.5
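Given measured survival curves S(t; R), extracting tau_s(R) and then w reduces to two log-linear fits; a small sketch with synthetic data standing in for the STM measurements:

```python
import numpy as np

# Step 1: fit log S(t) vs t for each R to get tau_s(R).
# Step 2: fit log tau_s vs R to the form tau_s(R) = tau_s0 * exp(-R / w).
# Synthetic curves (with known tau_s0, w) stand in for the measurements.
tau_s0_true, w_true = 10.0, 1.5
t = np.linspace(0.1, 5.0, 50)
Rs = np.array([0.5, 1.0, 1.5, 2.0])

taus = []
for R in Rs:
    S = np.exp(-t / (tau_s0_true * np.exp(-R / w_true)))  # synthetic S(t; R)
    slope, _ = np.polyfit(t, np.log(S), 1)                # log-linear fit
    taus.append(-1.0 / slope)

b, a = np.polyfit(Rs, np.log(taus), 1)                    # log tau_s vs R
print("tau_s0 =", np.exp(a), " w =", -1.0 / b)            # recovers 10, 1.5
```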

Relevance: 60.00%

Abstract:

We show that data from recent experiments on the kinetics of DNA escape from alpha-hemolysin nanopores [M. Wiggin, C. Tropini, C. T. Cossa, N. N. Jetha, and A. Marziali, Biophys. J. 95, 5317 (2008)] may be rationalized by a model of chain dynamics based on the anomalous diffusion of a particle moving in a harmonic well in the presence of a delta function sink. The experiments of Wiggin et al. found, among other things, that the occasional occurrence of unusually long escape times in the distribution of chain trapping events led to nonexponential decays in the survival probability, S(t), of the DNA molecules within the nanopore. Wiggin et al. ascribed this nonexponentiality to the existence of a distribution of trapping potentials, which they suggested was the result of stochastic interactions between the bases of the DNA and the amino acids located on the surface of the nanopore. Based on this idea, they showed that the experimentally determined S(t) could be well fit, in both the short- and long-time regimes, by a function of the form (1 + t/tau)^(-alpha) (the so-called Becquerel function). In our model, S(t) is found to be given by a Mittag-Leffler function at short times and by a generalized Mittag-Leffler function at long times. By a suitable choice of certain parameter values, these functions are found to fit the experimental S(t) even better than the Becquerel function. Anomalous diffusion of DNA within the trap prior to escape over a barrier of fixed height may therefore provide a second, plausible explanation of the data, and may offer fresh perspectives on similar trapping and escape problems.
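For a concrete sense of the two fitting forms, the sketch below evaluates the Becquerel function against a one-parameter Mittag-Leffler decay; the Mittag-Leffler function is computed from its truncated power series, which is adequate only for moderate arguments, and the parameter values are illustrative rather than fitted.

```python
import numpy as np
from math import gamma

def mittag_leffler(z, a, n_terms=80):
    """E_a(z) = sum_k z^k / Gamma(a*k + 1), truncated power series.
    Reliable only for moderate |z|; not a production implementation."""
    return sum(z**k / gamma(a * k + 1) for k in range(n_terms))

tau, alpha = 1.0, 0.6
t = np.linspace(0.0, 3.0, 7)
becquerel = (1 + t / tau) ** (-alpha)                      # (1 + t/tau)^-alpha
ml = np.array([mittag_leffler(-(ti / tau) ** alpha, alpha) for ti in t])
print(np.c_[t, becquerel, ml])                             # compare the decays
```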

Relevance: 60.00%

Abstract:

Barrierless chemical reactions have often been modeled as Brownian motion on a one-dimensional harmonic potential energy surface with a position-dependent reaction sink or window located near the minimum of the surface. This simple (but highly successful) description leads to a nonexponential survival probability only at small to intermediate times, with exponential decay in the long-time limit. However, in several reactive events involving proteins and glasses, the reactions are found to exhibit strongly nonexponential (power-law) decay kinetics even at long times. In order to address such reactions, we introduce here a model of barrierless chemical reaction in which the motion along the reaction coordinate undergoes dispersive diffusion. A complete analytical solution of the model can be obtained only in the frequency domain, but an asymptotic solution is obtained in the long-time limit. In this case, the asymptotic long-time decay of the survival probability is a power law of the Mittag-Leffler functional form. When the barrier height is increased, the decay of the survival probability still remains nonexponential, in contrast to the ordinary Brownian motion case where the rate is given by the Smoluchowski limit of the well-known Kramers expression. Interestingly, the reaction under dispersive diffusion is shown to exhibit a strong dependence on the initial state of the system, thus predicting a strong dependence on the excitation wavelength for photoisomerization reactions in a dispersive medium. The theory also predicts a fractional viscosity dependence of the rate, which is often observed in reactions occurring in complex environments.
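For reference, the one-parameter Mittag-Leffler decay interpolates between a stretched exponential at short times and a power law at long times; this standard asymptotic behavior (for 0 < alpha < 1, not specific to this paper) is

```latex
S(t) = E_\alpha\!\left[-(t/\tau)^\alpha\right] \simeq
\begin{cases}
\exp\!\left[-\,(t/\tau)^\alpha / \Gamma(1+\alpha)\right], & t \ll \tau, \\
(t/\tau)^{-\alpha} / \Gamma(1-\alpha), & t \gg \tau,
\end{cases}
```

which makes explicit how the long-time survival probability remains a power law rather than an exponential.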

Relevance: 60.00%

Abstract:

We study the relaxation of a degenerate two-level system interacting with a heat bath, assuming a random-matrix model for the system-bath interaction. For times larger than the duration of a collision and smaller than the Poincaré recurrence time, the survival probability of still finding the system at time t in the same state in which it was prepared at t = 0 is exactly calculated.

Relevance: 60.00%

Abstract:

Non-exponential electron transfer kinetics in complex systems are often analyzed in terms of a quenched, static disorder model. In this work we present an alternative analysis in terms of a simple dynamic disorder model in which the solvent is characterized by highly non-exponential dynamics. We consider both low- and high-barrier reactions. For the former, the main result is a simple analytical expression for the survival probability of the reactant. In this case, electron transfer at long times is controlled by solvent polarization relaxation, in agreement with the analyses of Rips and Jortner and of Nadler and Marcus. The short-time dynamics is also non-exponential, but for different reasons. The high-barrier reactions, on the other hand, show an interesting dynamic dependence on the electronic coupling element, V_el.

Relevance: 60.00%

Abstract:

We apply our model for soft gluon resummation in the infrared region to total cross-sections. The model aims to probe large-distance interactions in QCD. Our ansatz for an effective coupling for gluons and quarks in the infrared region follows an inverse power law which is singular but integrable. In the context of an eikonal formalism with QCD mini-jets, we study total hadronic cross-sections for protons, pions, and photons. We estimate the total inelastic cross-section at the LHC, compare with recent measurements, and update previous results for the survival probability.

Relevance: 60.00%

Abstract:

We consider a quantum particle, moving on a lattice with a tight-binding Hamiltonian, which is subjected to measurements to detect its arrival at a particular chosen set of sites. The projective measurements are made at regular time intervals tau, and we consider the evolution of the wave function until the time a detection occurs. We study the probability of its first detection at some time and, conversely, the probability of its not being detected (i.e., surviving) up to that time. We propose a general perturbative approach for understanding the dynamics which maps the evolution operator, consisting of unitary transformations followed by projections, to one described by a non-Hermitian Hamiltonian. For some examples of a particle moving on one- and two-dimensional lattices with one or more detection sites, we use this approach to find exact expressions for the survival probability and find excellent agreement with direct numerical results. A mean-field model with hopping between all pairs of sites and detection at one site is solved exactly. For the one- and two-dimensional systems, the survival probability is shown to have a power-law decay with time, where the power depends on the initial position of the particle. Finally, we show an interesting and nontrivial connection between the dynamics of the particle in our model and the evolution of a particle under a non-Hermitian Hamiltonian with a large absorbing potential at some sites.
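The stroboscopic protocol is easy to reproduce numerically: evolve unitarily for a time tau under the tight-binding Hamiltonian, project out the detector site, and record the surviving norm. In the sketch below, the chain length, tau, and the initial and detection sites are illustrative choices.

```python
import numpy as np
from scipy.linalg import expm

# Survival probability after n measurements is ||(P U)^n psi0||^2, where
# U = exp(-i H tau) is the unitary between measurements and P projects onto
# the "not detected" subspace (amplitude at the detector site removed).
L, tau, d, x0, n_meas = 50, 0.5, 25, 10, 200

H = -(np.eye(L, k=1) + np.eye(L, k=-1))    # nearest-neighbor hopping chain
U = expm(-1j * tau * H)
P = np.eye(L)
P[d, d] = 0.0                              # projective detection at site d

psi = np.zeros(L, dtype=complex)
psi[x0] = 1.0                              # particle starts localized at x0
survival = []
for _ in range(n_meas):
    psi = P @ (U @ psi)                    # unitary step, then projection
    survival.append(np.vdot(psi, psi).real)

print("S after", n_meas, "measurements:", survival[-1])
```

The first-detection probability at step n is the drop survival[n-1] - survival[n], and fitting log survival against log n would expose the power-law decay reported in the abstract.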

Relevance: 30.00%

Abstract:

We calculate the probability of large rapidity gaps in high energy hadronic collisions using a model based on QCD mini-jets and soft gluon emission down into the infrared region. Comparing with other models, we find remarkable agreement among most predictions.

Relevance: 30.00%

Abstract:

During the early stages of operation, high-tech startups need to overcome the liability of newness and manage a high degree of uncertainty. Several high-tech startups fail due to an inability to deal with skeptical customers, underdeveloped markets, and limited resources while selling an offering that has no precedent. This paper leverages the principles of effectuation (a logic of entrepreneurial decision making under uncertainty) to explain the journey from creation to survival of high-tech startups in an emerging economy. Based on the 99tests.com case study, this paper suggests that early-stage high-tech startups in emerging economies can increase their probability of survival by adopting the principles of effectuation.

Relevance: 20.00%

Abstract:

An expression is derived for the probability that the determinant of an n x n matrix over a finite field vanishes; from this it is deduced that, for a fixed field of order q, this probability tends to 1 - prod_{k>=1} (1 - q^(-k)) as n tends to infinity.
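A quick consistency check of the product formula, assuming a prime field GF(p) so that arithmetic mod p is available directly; the rank computation below is plain Gaussian elimination over GF(p).

```python
import numpy as np

# The probability that a random n x n matrix over GF(p) is singular is
# 1 - prod_{k=1..n} (1 - p^-k); compare against a Monte Carlo estimate.
def rank_mod_p(A, p):
    A = A.copy() % p
    n, r = A.shape[0], 0
    for c in range(n):
        piv = next((i for i in range(r, n) if A[i, c] != 0), None)
        if piv is None:
            continue
        A[[r, piv]] = A[[piv, r]]                     # swap pivot row up
        A[r] = (A[r] * pow(int(A[r, c]), -1, p)) % p  # scale pivot to 1
        for i in range(n):
            if i != r and A[i, c] != 0:
                A[i] = (A[i] - A[i, c] * A[r]) % p    # eliminate column c
        r += 1
    return r

p, n, trials = 3, 4, 20_000
rng = np.random.default_rng(2)
singular = sum(rank_mod_p(rng.integers(0, p, (n, n)), p) < n
               for _ in range(trials))
exact = 1.0 - np.prod([1.0 - p**-k for k in range(1, n + 1)])
print("Monte Carlo:", singular / trials, " exact:", exact)
```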

Relevance: 20.00%

Abstract:

The statistical minimum-risk pattern recognition problem, in which the classification costs are random variables of unknown statistics, is considered. Using medical diagnosis as a possible application, the problem of learning the optimal decision scheme is studied for a two-class, two-action case as a first step. This reduces to the problem of learning the optimum threshold (for taking the appropriate action) on the a posteriori probability of one class. A recursive procedure for updating an estimate of the threshold is proposed. The estimation procedure does not require knowledge of the actual class labels of the sample patterns in the design set. The adaptive scheme of using the present threshold estimate to take action on the next sample is shown to converge, in probability, to the optimum. The results of a computer simulation study of three learning schemes demonstrate the theoretically predictable salient features of the adaptive scheme.
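A hypothetical miniature of this kind of adaptive scheme (the paper's actual update rule and cost model are not reproduced here): treat the optimal threshold as a ratio of unknown mean regrets, estimate those means recursively from observed random costs, and use the current estimate to act on each incoming sample.

```python
import numpy as np

# Two-class, two-action toy: the minimum-risk rule takes action 1 when the
# posterior q = P(class 1 | x) exceeds theta* = E[d0] / (E[d0] + E[d1]),
# where d0, d1 are the (random, unknown-mean) regrets of the two possible
# wrong actions. Running sums give a recursive threshold estimate that
# converges to theta* by the law of large numbers. The cost distributions
# and the form of theta* here are invented for illustration.
rng = np.random.default_rng(4)
sum0 = sum1 = 0.0
theta = 0.5                                   # initial threshold guess
for _ in range(5000):
    q = rng.uniform()                         # posterior for the next pattern
    action = int(q > theta)                   # decide with current estimate
    d0, d1 = rng.exponential(2.0), rng.exponential(1.0)  # observed costs
    sum0, sum1 = sum0 + d0, sum1 + d1
    theta = sum0 / (sum0 + sum1)              # updated threshold estimate

print("learned threshold:", theta, " (theta* = 2/3)")
```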

Relevance: 20.00%

Abstract:

Hydrologic impacts of climate change are usually assessed by downscaling General Circulation Model (GCM) output of large-scale climate variables to local-scale hydrologic variables. Such an assessment is characterized by uncertainty resulting from the ensembles of projections generated with multiple GCMs, known as intermodel or GCM uncertainty. Ensemble averaging with the assignment of weights to GCMs based on model evaluation is one method of addressing such uncertainty and is used in the present study for regional-scale impact assessment. GCM outputs of large-scale climate variables are downscaled to subdivisional-scale monsoon rainfall. Weights are assigned to the GCMs on the basis of model performance and model convergence, which are evaluated with the Cumulative Distribution Functions (CDFs) generated from the downscaled GCM output (for both the 20th Century [20C3M] and future scenarios) and observed data. The ensemble averaging approach, with the assignment of weights to GCMs, is characterized by the uncertainty caused by partial ignorance, which stems from the nonavailability of the outputs of some of the GCMs for a few scenarios (in the Intergovernmental Panel on Climate Change [IPCC] data distribution center for Assessment Report 4 [AR4]). This uncertainty is modeled with imprecise probability, i.e., the probability being represented as an interval gray number. Furthermore, the CDF generated with one GCM is entirely different from that generated with another, and therefore the use of multiple GCMs results in a band of CDFs. Representing this band of CDFs with a single-valued weighted-mean CDF may be misleading. Such a band of CDFs can only be represented with an envelope that contains all the CDFs generated with a number of GCMs. An imprecise CDF represents such an envelope, which not only contains the CDFs generated with all the available GCMs but also, to an extent, accounts for the uncertainty resulting from the missing GCM output. This concept of imprecise probability is also validated in the present study. The imprecise CDFs of monsoon rainfall are derived for three 30-year time slices (2020s, 2050s, and 2080s) under the A1B, A2, and B1 scenarios. The model is demonstrated with the prediction of monsoon rainfall in the Orissa meteorological subdivision, which shows a possible decreasing trend in the future.
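One way to read the "envelope of CDFs" idea in code: build an empirical CDF from each GCM's downscaled rainfall sample and take pointwise lower and upper bounds as the interval-valued (imprecise) CDF. The synthetic samples below stand in for actual downscaled GCM output.

```python
import numpy as np

# Pointwise envelope of empirical CDFs from several models: the lower and
# upper bounding curves play the role of the imprecise CDF. Gamma-distributed
# synthetic rainfall samples stand in for downscaled GCM output.
rng = np.random.default_rng(3)
samples = [rng.gamma(shape=k, scale=100.0, size=300) for k in (8, 9, 10, 11)]

grid = np.linspace(0.0, 2500.0, 200)                 # rainfall grid (mm)
cdfs = np.array([[(s <= x).mean() for x in grid] for s in samples])
lower, upper = cdfs.min(axis=0), cdfs.max(axis=0)    # imprecise CDF bounds

i = grid.searchsorted(1000.0)
print("P(rainfall <= 1000 mm) lies in [%.3f, %.3f]" % (lower[i], upper[i]))
```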

Relevance: 20.00%

Abstract:

We consider the problem of detecting statistically significant sequential patterns in multineuronal spike trains. These patterns are characterized by ordered sequences of spikes from different neurons with specific delays between spikes. We have previously proposed a data-mining scheme to efficiently discover such patterns when they occur often enough in the data. Here we propose a method to determine the statistical significance of such repeating patterns. The novelty of our approach is that we use a compound null hypothesis that includes not only models of independent neurons but also models in which neurons have weak dependencies. The strength of interaction among the neurons is represented in terms of certain pairwise conditional probabilities, and we specify our null hypothesis by putting an upper bound on all such conditional probabilities. We construct a probabilistic model that captures the counting process and use it to derive a test of significance for rejecting such a compound null hypothesis. The structure of our null hypothesis also allows us to rank-order different significant patterns. We illustrate the effectiveness of our approach using spike trains generated with a simulator.
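A simplified stand-in for the flavor of such a test (not the paper's compound-null construction): if the per-window probability of the pattern under the null is bounded above by p0, then the occurrence count over N windows is stochastically dominated by a Binomial(N, p0) variable, and its upper tail gives a conservative p-value.

```python
from scipy.stats import binom

# Conservative significance test: under a null bounding the per-window
# pattern probability by p0, an observed count in N windows is compared
# against the upper tail of Binomial(N, p0). All numbers are examples.
p0, N, observed = 0.01, 10_000, 140

p_value = binom.sf(observed - 1, N, p0)   # P[Binom(N, p0) >= observed]
print("conservative p-value:", p_value)
```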