980 results for Stochastic process


Relevance:

100.00%

Publisher:

Abstract:

Chinese Acad Sci, ISCAS Lab Internet Software Technologies

Relevance:

100.00%

Publisher:

Abstract:

Power-law distributions, i.e., Lévy flights, have been observed in various economic, biological, and physical systems in the high-frequency regime. These distributions can be successfully explained via the gradually truncated Lévy flight (GTLF). In general, these systems converge to a Gaussian distribution in the low-frequency regime. In the present work, we develop a model for the physical basis of the cut-off length in GTLF and its variation with the time interval between successive observations. We observe that the GTLF automatically approaches a Gaussian distribution in the low-frequency regime. We apply the present method to analyze time series from several physical and financial systems. The agreement between the experimental results and the theoretical curves is excellent. The present method can be applied to analyze time series in a variety of fields, which in turn provides a basis for the development of further microscopic models for the system. © 2000 Elsevier Science B.V. All rights reserved.
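The aggregation effect this abstract describes can be demonstrated with a short simulation. The sketch below is not taken from the paper: it idealises the gradual truncation as a hard cap on increment size, with an arbitrary tail exponent and cut-off, and compares the excess kurtosis of single (high-frequency) increments against sums over longer (low-frequency) intervals.

```python
# Illustrative sketch (not from the paper): heavy-tailed increments with a hard cap
# as a crude stand-in for the gradual truncation. Single increments (high frequency)
# keep the fat tails; sums over longer intervals (low frequency) approach a Gaussian,
# which the excess kurtosis makes visible. Tail exponent and cut-off are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

def truncated_levy_increments(n, alpha=1.5, cutoff=10.0):
    """Symmetric Pareto-tailed increments with |x| capped at `cutoff`."""
    u = rng.uniform(size=n)
    signs = rng.choice([-1.0, 1.0], size=n)
    x = signs * (u ** (-1.0 / alpha) - 1.0)   # heavy-tailed magnitudes
    return np.clip(x, -cutoff, cutoff)

high_freq = truncated_levy_increments(100_000)
low_freq = truncated_levy_increments(100_000 * 50).reshape(-1, 50).sum(axis=1)

for name, x in [("high-frequency", high_freq), ("low-frequency", low_freq)]:
    z = (x - x.mean()) / x.std()
    print(f"{name}: excess kurtosis = {np.mean(z**4) - 3.0:.2f}")
```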

Relevance:

100.00%

Publisher:

Abstract:

Background: Using univariate and multivariate variance components linkage analysis methods, we studied possible genotype × age interaction in cardiovascular phenotypes related to the aging process from the Framingham Heart Study. Results: We found evidence for genotype × age interaction for fasting glucose and systolic blood pressure. Conclusions: There is polygenic genotype × age interaction for fasting glucose and systolic blood pressure, and quantitative trait locus × age interaction for a linkage signal for systolic blood pressure phenotypes located on chromosome 17 at 67 cM.

Relevance:

100.00%

Publisher:

Abstract:

While molecular and cellular processes, such as Brownian motion, chemical reaction networks, and gene regulatory networks, are often modeled as stochastic processes, there have been few attempts to program a molecular-scale process to physically implement a stochastic process. DNA has been used as a substrate for programming molecular interactions, but its applications have been restricted to deterministic functions, and unfavorable properties such as slow processing, thermal annealing, aqueous solvents, and difficult readout limit them to proof-of-concept purposes. To date, it has remained unknown whether there exists a molecular process that can be programmed to implement stochastic processes for practical applications.

In this dissertation, a fully specified Resonance Energy Transfer (RET) network between chromophores is accurately fabricated via DNA self-assembly, and the exciton dynamics in the RET network physically implement a stochastic process, specifically a continuous-time Markov chain (CTMC), which maps directly to the physical geometry of the chromophore network. Excited by a light source, a RET network generates random samples in the temporal domain in the form of fluorescence photons, which can be detected by a photon detector. The intrinsic sampling distribution of a RET network is derived as a phase-type distribution configured by its CTMC model. The conclusion is that the exciton dynamics in a RET network implement a general and important class of stochastic processes that can be directly and accurately programmed and used for practical applications in photonics and optoelectronics. Several approaches to using RET networks exist, with broad potential applications. As an entropy source that can directly generate samples from virtually arbitrary distributions, RET networks can benefit applications that rely on generating random samples, such as (1) fluorescent taggants and (2) stochastic computing.
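For readers unfamiliar with phase-type sampling, the generic sketch below (not the dissertation's network; the rate matrix and initial distribution are arbitrary illustrative values) simulates a small CTMC with an absorbing state and records the absorption times, which is the class of distributions the text says a RET network realises physically.

```python
# Generic phase-type sampling sketch: simulate a small continuous-time Markov chain
# until it hits an absorbing "emission" state and record the hitting time. The
# sub-generator S and the initial distribution alpha are arbitrary illustrative values.
import numpy as np

rng = np.random.default_rng(1)

S = np.array([[-3.0,  1.0,  1.0],     # rates between transient states (diagonal = -total rate)
              [ 0.5, -2.0,  1.0],
              [ 0.2,  0.3, -1.5]])
exit_rates = -S.sum(axis=1)           # rate of jumping from each transient state to absorption
alpha = np.array([1.0, 0.0, 0.0])     # start in state 0 (e.g. the initially excited chromophore)

def sample_phase_type():
    """Draw one absorption time of the CTMC defined by (alpha, S)."""
    state = rng.choice(len(alpha), p=alpha)
    t = 0.0
    while True:
        rate_out = -S[state, state]
        t += rng.exponential(1.0 / rate_out)
        # Jump probabilities: other transient states first, absorbing state last.
        probs = np.append(np.delete(S[state], state), exit_rates[state]) / rate_out
        target = rng.choice(len(probs), p=probs)
        if target == len(probs) - 1:              # absorbed: the "photon" is emitted at time t
            return t
        state = target if target < state else target + 1

samples = np.array([sample_phase_type() for _ in range(10_000)])
print(f"empirical mean absorption time ≈ {samples.mean():.3f}")
print(f"analytic mean (alpha (-S)^-1 1) = {alpha @ np.linalg.solve(-S, np.ones(3)):.3f}")
```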

When RET networks between chromophores are used to implement fluorescent taggants with temporally coded signatures, the taggant design is not constrained by the number of resolvable dyes and has a significantly larger coding capacity than spectrally or lifetime-coded fluorescent taggants. Meanwhile, the taggant detection process becomes highly efficient, and Maximum Likelihood Estimation (MLE)-based taggant identification guarantees high accuracy even with only a few hundred detected photons.

In addition, RET-based sampling units (RSUs) can be constructed to accelerate probabilistic algorithms with wide applications in machine learning and data analytics. Because probabilistic algorithms often rely on iteratively sampling from parameterized distributions, they can be inefficient in practice on the deterministic hardware of traditional computers, especially for high-dimensional and complex problems. As an efficient universal sampling unit, the proposed RSU can be integrated into a processor or GPU as a specialized functional unit, or organized as a discrete accelerator, to bring substantial speedups and power savings.
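As a hypothetical illustration of the kind of workload such a sampling unit targets (the interface and example are mine, not from the dissertation), the sketch below runs a Gibbs sampler whose inner loop repeatedly draws from parameterized Gaussians; in the proposed design, that sampling primitive would be served by a hardware RSU rather than a software PRNG.

```python
# Hypothetical sketch of an RSU-style workload: a Gibbs sampler that repeatedly draws
# from parameterized conditional distributions. `draw_gaussian` is an ordinary software
# PRNG call here; the dissertation proposes backing such calls with hardware sampling.
import numpy as np

rng = np.random.default_rng(8)

def draw_gaussian(mean, std):
    """Stand-in sampling primitive; this is the call an RSU would serve in hardware."""
    return rng.normal(mean, std)

def gibbs_bivariate_normal(rho, n_samples=5000):
    """Gibbs sampling for a standard bivariate normal with correlation rho."""
    x = y = 0.0
    out = np.empty((n_samples, 2))
    cond_std = np.sqrt(1.0 - rho**2)
    for i in range(n_samples):
        x = draw_gaussian(rho * y, cond_std)   # x | y ~ N(rho*y, 1 - rho^2)
        y = draw_gaussian(rho * x, cond_std)   # y | x ~ N(rho*x, 1 - rho^2)
        out[i] = x, y
    return out

samples = gibbs_bivariate_normal(rho=0.8)
print("empirical correlation ≈", np.corrcoef(samples.T)[0, 1].round(2))
```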

Relevance:

80.00%

Publisher:

Abstract:

We study a stochastic process describing the onset of spreading dynamics of an epidemic in a population composed of individuals of three classes: susceptible (S), infected (I), and recovered (R). The stochastic process is defined by local rules and involves the following cyclic process: S -> I -> R -> S (SIRS). The open process S -> I -> R (SIR) is studied as a particular case of the SIRS process. The epidemic process is analyzed at different levels of description: by a stochastic lattice gas model and by a birth-and-death process. By means of Monte Carlo simulations and dynamical mean-field approximations we show that the SIRS stochastic lattice gas model exhibits a line of critical points separating two phases: an absorbing phase, where the lattice is completely full of S individuals, and an active phase, where S, I and R individuals coexist and which may or may not display population cycles. The critical line, which corresponds to the onset of epidemic spreading, is shown to belong to the directed percolation universality class. By considering the birth-and-death process we analyze the role of noise in stabilizing the oscillations. (C) 2009 Elsevier B.V. All rights reserved.
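A minimal sketch of an SIRS stochastic lattice gas in the spirit described above (the local update rules and rate values here are generic choices, not necessarily the paper's) is:

```python
# Minimal sketch of an SIRS stochastic lattice gas (generic local rules and arbitrary
# rates, not necessarily the paper's): S -> I by contact with infected nearest
# neighbours, I -> R spontaneously, R -> S as immunity wanes.
import numpy as np

rng = np.random.default_rng(2)

L = 64
S_, I_, R_ = 0, 1, 2
lattice = np.full((L, L), S_)
lattice[L // 2, L // 2] = I_                # seed one infected site

b, c, w = 0.6, 0.2, 0.1                     # infection, recovery, and waning probabilities

for _ in range(200 * L * L):                # random sequential single-site updates
    i, j = rng.integers(L, size=2)
    site = lattice[i, j]
    if site == S_:
        nbrs = [lattice[(i + 1) % L, j], lattice[(i - 1) % L, j],
                lattice[i, (j + 1) % L], lattice[i, (j - 1) % L]]
        if rng.random() < b * nbrs.count(I_) / 4:
            lattice[i, j] = I_
    elif site == I_ and rng.random() < c:
        lattice[i, j] = R_
    elif site == R_ and rng.random() < w:
        lattice[i, j] = S_

for label, value in (("S", S_), ("I", I_), ("R", R_)):
    print(label, "density:", (lattice == value).mean().round(3))
```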

Relevance:

80.00%

Publisher:

Abstract:

Industrial producers face the task of optimizing the production process in an attempt to achieve the desired quality, such as mechanical properties, with the lowest energy consumption. In industrial carbon fiber production, the fibers are processed in bundles (batches) containing several thousand filaments, and consequently the energy optimization is a stochastic process, as it involves uncertainty, imprecision, and randomness. This paper presents a stochastic optimization model to reduce energy consumption for a given range of desired mechanical properties. Several processing condition sets are developed, and for each set of conditions, 50 samples of fiber are analyzed for their tensile strength and modulus. The energy consumption during production of the samples is carefully monitored on the processing equipment. Then, five standard distribution functions are examined to determine which best describes the distribution of mechanical properties of the filaments. The Kolmogorov-Smirnov test is used to verify the goodness of fit and correlation statistics. To estimate the parameters of the selected distribution (Weibull), the maximum likelihood, least squares, and genetic algorithm methods are compared. A set of factors including the sample size, the confidence level, and the relative error of the estimated parameters is used for evaluating the tensile strength and modulus properties. The energy consumption and N2 gas cost are modeled by the convex hull method. Finally, to optimize carbon fiber production quality, energy consumption, and total cost, mixed-integer linear programming is used. The results show that, using the stochastic optimization models, we are able to predict the production quality within a given range and minimize the energy consumption of the industrial process.
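One of the steps described above, fitting a Weibull distribution to filament strengths by maximum likelihood and checking it with a Kolmogorov-Smirnov test, can be sketched as follows (the strength values are synthetic stand-ins, not data from the paper):

```python
# Illustrative sketch of one analysis step: maximum-likelihood fit of a Weibull
# distribution to filament tensile strengths, followed by a Kolmogorov-Smirnov
# goodness-of-fit test. The 50 strength values below are synthetic stand-ins.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
strengths = rng.weibull(5.0, size=50) * 4.5   # synthetic tensile strengths (GPa)

# Two-parameter Weibull fit (location fixed at zero) by maximum likelihood.
shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)
print(f"Weibull shape ≈ {shape:.2f}, scale ≈ {scale:.2f} GPa")

# Kolmogorov-Smirnov test against the fitted distribution.
ks_stat, p_value = stats.kstest(strengths, "weibull_min", args=(shape, loc, scale))
print(f"KS statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")
```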

Relevance:

70.00%

Publisher:

Abstract:

We study the regret of optimal strategies for online convex optimization games. Using von Neumann's minimax theorem, we show that the optimal regret in this adversarial setting is closely related to the behavior of the empirical minimization algorithm in a stochastic process setting: it is equal to the maximum, over joint distributions of the adversary's action sequence, of the difference between a sum of minimal expected losses and the minimal empirical loss. We show that the optimal regret has a natural geometric interpretation, since it can be viewed as the gap in Jensen's inequality for a concave functional--the minimizer over the player's actions of expected loss--defined on a set of probability distributions. We use this expression to obtain upper and lower bounds on the regret of an optimal strategy for a variety of online learning problems. Our method provides upper bounds without the need to construct a learning algorithm; the lower bounds provide explicit optimal strategies for the adversary. Peter L. Bartlett, Alexander Rakhlin
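Written out schematically (the notation here is mine, not the paper's), the identity described above reads:

```latex
% Notation is mine, not the paper's: \ell is the loss, a_t the player's action at
% round t, z_1,\dots,z_n the adversary's sequence, and the supremum ranges over
% joint distributions P of that sequence.
\[
  \mathcal{R}_n \;=\; \sup_{P}\;
  \mathbb{E}_{z_{1:n}\sim P}\!\left[
    \sum_{t=1}^{n} \inf_{a_t}\,
      \mathbb{E}\bigl[\ell(a_t, z_t)\,\big|\, z_{1:t-1}\bigr]
    \;-\; \inf_{a}\, \sum_{t=1}^{n} \ell(a, z_t)
  \right]
\]
```

Here the first term collects the per-round minimal expected losses given the past, and the second is the minimal empirical loss of a single fixed action over the whole sequence, matching the verbal statement above.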

Relevance:

70.00%

Publisher:

Abstract:

The quick detection of an abrupt unknown change in the conditional distribution of a dependent stochastic process has numerous applications. In this paper, we pose a minimax robust quickest change detection problem for cases where there is uncertainty about the post-change conditional distribution. Our minimax robust formulation is based on the popular Lorden criterion of optimal quickest change detection. Under a condition on the set of possible post-change distributions, we show that the widely known cumulative sum (CUSUM) rule is asymptotically minimax robust under our Lorden minimax robust formulation as a false alarm constraint becomes more strict. We also establish general asymptotic bounds on the detection delay of misspecified CUSUM rules (i.e. CUSUM rules that are designed with post-change distributions that differ from those of the observed sequence). We exploit these bounds to compare the delay performance of asymptotically minimax robust, asymptotically optimal, and other misspecified CUSUM rules. In simulation examples, we illustrate that asymptotically minimax robust CUSUM rules can provide better detection delay performance at greatly reduced computational effort compared to competing generalised likelihood ratio procedures.
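For concreteness, a generic CUSUM rule (not the paper's specific robust design; the densities, threshold, and change point below are arbitrary) accumulates the log-likelihood ratio between an assumed post-change and pre-change density and stops at the first threshold crossing:

```python
# Generic CUSUM sketch: accumulate the log-likelihood ratio between assumed
# post-change and pre-change densities and raise an alarm when the statistic
# crosses the threshold h. All densities and parameter values are arbitrary.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

def cusum_stopping_time(x, pre, post, h):
    """Return the first index at which the CUSUM statistic reaches h (or None)."""
    w = 0.0
    for k, xk in enumerate(x):
        w = max(0.0, w + post.logpdf(xk) - pre.logpdf(xk))
        if w >= h:
            return k
    return None

# Example: mean shift from N(0,1) to N(1,1) occurring at sample 200.
pre, post = stats.norm(0, 1), stats.norm(1, 1)
x = np.concatenate([pre.rvs(200, random_state=rng), post.rvs(300, random_state=rng)])
print("alarm raised at sample", cusum_stopping_time(x, pre, post, h=5.0))
```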

Relevance:

70.00%

Publisher:

Abstract:

A time series, from a narrow point of view, is a sequence of observations of a stochastic process made at discrete and equally spaced time intervals. Its future behavior can be predicted by identifying, fitting, and confirming a mathematical model. In this paper, time series analysis is applied to problems concerning runway-induced vibrations of an aircraft. A simple mathematical model based on this technique is fitted to obtain the impulse response coefficients of the aircraft system, considered as a whole, for a particular type of operation. Using this model, the output, which is the aircraft response, can be obtained with less computation time for any runway profile as the input.
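The prediction step described in the last sentence amounts to a discrete convolution of the runway profile with the identified impulse response coefficients; a sketch with synthetic coefficients and profile (not values from the paper) is:

```python
# Sketch of the prediction step: once impulse response coefficients have been
# identified, the aircraft response to any runway profile is a discrete convolution
# of the profile with those coefficients. Both arrays here are synthetic.
import numpy as np

rng = np.random.default_rng(5)

h = 0.8 ** np.arange(30) * np.sin(0.5 * np.arange(30))   # hypothetical impulse response coefficients
runway_profile = rng.normal(scale=0.01, size=500)         # hypothetical runway elevation samples (m)

# Causal convolution: response[k] = sum_j h[j] * profile[k - j]
response = np.convolve(runway_profile, h)[: len(runway_profile)]
print("peak predicted response:", np.abs(response).max())
```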

Relevance:

70.00%

Publisher:

Abstract:

A von Mises truss with stochastically varying material properties is investigated for snap-through instability. The variability of the snap-through load is calculated analytically as a function of the material property variability, represented as a stochastic process. Bounds are established that are independent of knowledge of the complete correlation structure, which is seldom available from experimental data. Two processes are considered to represent the material property variability, and the results are presented graphically.
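A crude Monte Carlo sketch of the same question (not the paper's analytical treatment) propagates random variation in the elastic modulus through the classical shallow-truss snap-through load; the closed-form expression, geometry, and 10% coefficient of variation used below are textbook-style assumptions of mine, not taken from the paper.

```python
# Crude Monte Carlo sketch: propagate random variation in the elastic modulus E
# through the classical shallow von Mises truss snap-through load,
#   P_cr ≈ 2 E A alpha^3 / (3 * sqrt(3)),
# to see the spread of the critical load. All numerical values are arbitrary.
import numpy as np

rng = np.random.default_rng(6)

A = 1.0e-4            # bar cross-sectional area (m^2)
alpha = 0.1           # initial inclination of the bars (rad), shallow truss
E_nom, cov = 200e9, 0.10
E = rng.lognormal(mean=np.log(E_nom), sigma=cov, size=100_000)

P_cr = 2.0 * E * A * alpha**3 / (3.0 * np.sqrt(3.0))
lo, hi = np.percentile(P_cr, [5, 95])
print(f"snap-through load: mean = {P_cr.mean():.1f} N, 5-95% range = [{lo:.1f}, {hi:.1f}] N")
```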

Relevance:

70.00%

Publisher:

Abstract:

Image segmentation is formulated as a stochastic process whose invariant distribution is concentrated at points of the desired region. By choosing multiple seed points, different regions can be segmented. The algorithm is based on the theory of time-homogeneous Markov chains and has been largely motivated by the technique of simulated annealing. The method proposed here has been found to perform well on both clean and noisy real-world images, while being computationally far less expensive than stochastic optimisation techniques.
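A generic illustration of this idea (not the paper's exact algorithm; the acceptance rule, cooling schedule, and toy image are arbitrary) grows a region from a seed with a simulated-annealing-style Markov chain:

```python
# Generic illustration: grow a region from a seed with randomized accept/reject moves
# whose acceptance probability favours pixels similar to the seeded region, cooled
# over time in the spirit of simulated annealing.
import numpy as np

rng = np.random.default_rng(7)

def stochastic_region_grow(image, seed, steps=20_000, t0=0.2):
    """Return a boolean mask grown from `seed` by randomized accept/reject moves."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    for step in range(steps):
        temp = t0 * (1.0 - step / steps) + 1e-6            # cooling schedule
        ys, xs = np.nonzero(mask)
        k = rng.integers(len(ys))
        dy, dx = rng.choice([-1, 0, 1], size=2)
        y, x = (ys[k] + dy) % h, (xs[k] + dx) % w          # propose a neighbour of the region
        if mask[y, x]:
            continue
        cost = abs(float(image[y, x]) - float(image[mask].mean()))
        if rng.random() < np.exp(-cost / temp):            # accept similar pixels more often
            mask[y, x] = True
    return mask

# Toy image: a bright square on a dark noisy background, seeded inside the square.
img = rng.normal(0.1, 0.05, size=(64, 64))
img[20:44, 20:44] += 0.8
print("segmented pixels:", stochastic_region_grow(img, seed=(32, 32)).sum())
```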