894 results for random transform
Abstract:
A general formalism for stochastic choice is presented. The rationalizability and recoverability (identification) problems are discussed. For the identification issue, parametric examples are analyzed by means of techniques from mathematical tomography (Radon transforms).
Abstract:
"July 1976."
Abstract:
Efficient automatic protein classification is of central importance in genomic annotation. As an independent way to check the reliability of the classification, we propose a statistical approach to test if two sets of protein domain sequences coming from two families of the Pfam database are significantly different. We model protein sequences as realizations of Variable Length Markov Chains (VLMC) and we use the context trees as a signature of each protein family. Our approach is based on a Kolmogorov-Smirnov-type goodness-of-fit test proposed by Balding et al. [Limit theorems for sequences of random trees (2008), DOI: 10.1007/s11749-008-0092-z]. The test statistic is a supremum over the space of trees of a function of the two samples; its computation grows, in principle, exponentially fast with the maximal number of nodes of the potential trees. We show how to transform this problem into a max-flow problem over a related graph, which can be solved using the Ford-Fulkerson algorithm in time polynomial in that number. We apply the test to 10 randomly chosen protein domain families from the seed of the Pfam-A database (high-quality, manually curated families). The test shows that the distributions of context trees coming from different families are significantly different. We emphasize that this is a novel mathematical approach to validate the automatic clustering of sequences in any context. We also study the performance of the test via simulations on Galton-Watson related processes.
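The abstract above reduces the computation of the test statistic to a max-flow problem solvable by the Ford-Fulkerson method. As a purely illustrative sketch (the graph, node names, and capacities below are hypothetical, not the construction from the paper), here is a minimal Edmonds-Karp implementation, i.e. Ford-Fulkerson with breadth-first search for the augmenting paths:

```python
from collections import deque, defaultdict

def max_flow(capacity, source, sink):
    """Edmonds-Karp: Ford-Fulkerson with BFS augmenting paths, O(V * E^2)."""
    flow = defaultdict(lambda: defaultdict(int))
    total = 0
    while True:
        # Breadth-first search for a shortest augmenting path in the residual graph.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v in capacity[u]:
                if v not in parent and capacity[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return total  # no augmenting path left: the flow is maximal
        # Bottleneck residual capacity along the path, then augment.
        bottleneck = float("inf")
        v = sink
        while parent[v] is not None:
            u = parent[v]
            bottleneck = min(bottleneck, capacity[u][v] - flow[u][v])
            v = u
        v = sink
        while parent[v] is not None:
            u = parent[v]
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
            v = u
        total += bottleneck

# Hypothetical capacities (node -> node -> capacity), purely illustrative.
capacity = defaultdict(lambda: defaultdict(int))
for u, v, c in [("s", "a", 3), ("s", "b", 2), ("a", "b", 1), ("a", "t", 2), ("b", "t", 3)]:
    capacity[u][v] += c
    capacity[v][u] += 0  # make the reverse residual edge visible to the BFS

print(max_flow(capacity, "s", "t"))  # prints 5 for this toy graph
```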
Abstract:
The Random Parameter model was proposed to explain the structure of the covariance matrix in problems where most, but not all, of the eigenvalues of the covariance matrix can be explained by Random Matrix Theory. In this article, we explore the scaling properties of the model, as observed in the multifractal structure of the simulated time series. We use the Wavelet Transform Modulus Maxima technique to obtain the dependence of the multifractal spectrum on the parameters of the model. The model shows a scaling structure compatible with the stylized facts for a reasonable choice of the parameter values.
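A simplified sketch of the Wavelet Transform Modulus Maxima idea mentioned above, assuming a Ricker (Mexican hat) analyzing wavelet and skipping the usual chaining of maxima lines across scales; the synthetic random-walk series, the scales, and the q values are stand-ins, not the model or parameters from the paper:

```python
import numpy as np

def ricker(points, a):
    """Ricker (Mexican hat) wavelet sampled on `points` points at scale `a`."""
    t = np.arange(points) - (points - 1) / 2.0
    x = t / a
    return (1.0 - x**2) * np.exp(-x**2 / 2.0) * (2.0 / (np.sqrt(3.0 * a) * np.pi**0.25))

def cwt_modulus(signal, scales):
    """Modulus of a continuous wavelet transform, computed by direct convolution."""
    out = np.empty((len(scales), len(signal)))
    for i, a in enumerate(scales):
        wavelet = ricker(min(10 * int(a), len(signal)), a)
        out[i] = np.abs(np.convolve(signal, wavelet, mode="same"))
    return out

def wtmm_tau(signal, scales, qs):
    """Scaling exponents tau(q) from the WTMM partition function (simplified)."""
    modulus = cwt_modulus(signal, scales)
    log_z = np.empty((len(qs), len(scales)))
    for j, row in enumerate(modulus):
        # Local maxima of the wavelet modulus along the time axis at this scale.
        mid = row[1:-1]
        maxima = mid[(mid > row[:-2]) & (mid > row[2:])]
        maxima = maxima[maxima > 1e-12]
        for i, q in enumerate(qs):
            log_z[i, j] = np.log(np.sum(maxima**q))  # partition function Z(q, a)
    # tau(q) is the slope of log Z(q, a) against log a.
    log_a = np.log(scales)
    return np.array([np.polyfit(log_a, log_z[i], 1)[0] for i in range(len(qs))])

# Illustrative run on a synthetic random walk standing in for the model's output.
rng = np.random.default_rng(0)
series = np.cumsum(rng.standard_normal(4096))
print(wtmm_tau(series, scales=np.array([4.0, 8.0, 16.0, 32.0, 64.0]),
               qs=np.array([-2.0, -1.0, 1.0, 2.0, 3.0])))
# For a monofractal signal, tau(q) is close to linear in q; multifractality bends it.
```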
Abstract:
Gamma ray tomography experiments have been carried out to detect spatial patterns in the porosity of a 0.27 m diameter column packed with steel Raschig rings of three different sizes (12.6, 37.9, and 76 mm), using a first-generation CT system (Chen et al., 1998). A fast Fourier transform tomographic reconstruction algorithm has been used to calculate the spatial variation over the column cross section. Cross-sectional gas porosity and solids holdup distributions were determined. The values of cross-sectional average gas porosity were epsilon = 0.849, 0.938, and 0.966 for the 12.6, 37.9, and 76 mm rings, respectively. Radial holdup variation within the packed bed has been determined. The variation of the circumferentially averaged gas holdup in the radial direction indicates that the porosity in the column wall region is somewhat higher than that in the bulk region, due to the effect of the column wall.
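The paper's reconstruction is an FFT-based algorithm applied to gamma-ray projections; as a rough stand-in, the sketch below uses filtered back projection from scikit-image on a synthetic phantom and then computes a cross-sectional average and a circumferentially averaged radial profile. The phantom, its values, and the geometry are hypothetical:

```python
import numpy as np
from skimage.transform import radon, iradon

# Synthetic porosity phantom standing in for the packed-column cross section
# (values near 1 in the gas, lower where solids sit); purely illustrative numbers.
n = 128
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
r = np.sqrt(x**2 + y**2)
phantom = np.where(r < 0.95, 0.9, 0.0)
phantom[(np.abs(x) < 0.1) & (np.abs(y) < 0.3)] = 0.5  # a "packing" feature

# Forward projections (sinogram), then reconstruction by filtered back projection.
theta = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(phantom, theta=theta)
reconstruction = iradon(sinogram, theta=theta)

# Cross-sectional average inside the column, and a circumferentially averaged radial profile.
inside = r < 0.95
print("cross-sectional average:", reconstruction[inside].mean())
edges = np.linspace(0.0, 0.95, 20)
profile = [reconstruction[(r >= lo) & (r < hi)].mean() for lo, hi in zip(edges[:-1], edges[1:])]
print("radial profile:", np.round(profile, 3))
```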
Abstract:
We study discrete-time models in which death benefits can depend on a stock price index, the logarithm of which is modeled as a random walk. Examples of such benefit payments include put and call options, barrier options, and lookback options. Because the distribution of the curtate-future-lifetime can be approximated by a linear combination of geometric distributions, it suffices to consider curtate-future-lifetimes with a geometric distribution. In binomial and trinomial tree models, closed-form expressions for the expectations of the discounted benefit payment are obtained for a series of options. They are based on results concerning geometric stopping of a random walk, in particular also on a version of the Wiener-Hopf factorization.
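A brute-force counterpart to the closed-form expressions described above: assuming a geometric curtate-future-lifetime K with P(K = k) = p(1 - p)^k, a benefit paid at time K, and an illustrative binomial tree, the expectation E[v^K payoff(S_K)] can be approximated by truncating the sum over k. All parameters below are hypothetical:

```python
from math import comb

def expected_discounted_benefit(S0, u, d, prob_up, v, p, payoff, kmax=400):
    """
    Approximate E[v**K * payoff(S_K)] by truncating the sum over k at kmax.

    Assumptions (illustrative, not the paper's exact setup):
      - S_k = S0 * u**(number of up moves) * d**(k - number of up moves),
        with up-move probability prob_up (a binomial tree),
      - K is geometric: P(K = k) = p * (1 - p)**k for k = 0, 1, 2, ...,
      - the benefit payoff(S_K) is paid at time K and discounted by v**K.
    """
    total = 0.0
    for k in range(kmax + 1):
        # E[payoff(S_k)]: average over the binomial number of up moves.
        e_payoff = sum(
            comb(k, j) * prob_up**j * (1 - prob_up)**(k - j)
            * payoff(S0 * u**j * d**(k - j))
            for j in range(k + 1)
        )
        total += p * (1 - p)**k * v**k * e_payoff
    return total

# Hypothetical parameters; the death benefit is a put option max(100 - S_K, 0).
value = expected_discounted_benefit(
    S0=100.0, u=1.05, d=1 / 1.05, prob_up=0.5, v=1 / 1.03, p=0.05,
    payoff=lambda s: max(100.0 - s, 0.0),
)
print(round(value, 4))
```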
Abstract:
In many situations probability models are more realistic than deterministic models. Several phenomena occurring in physics are studied as random phenomena changing with time and space, and stochastic processes originated from the needs of physicists. Let X(t) be a random variable, where t is a parameter assuming values from a set T. The collection of random variables {X(t), t ∈ T} is called a stochastic process. We denote the state of the process at time t by X(t), and the collection of all possible values that X(t) can assume is called the state space.
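A minimal concrete example of the definition above, with T = {0, 1, 2, ...} and the integers as state space: one simulated realization of a simple random walk (parameters chosen only for illustration):

```python
import numpy as np

# A minimal illustration of a stochastic process {X(t), t in T}: a simple
# random walk with T = {0, 1, 2, ...} and state space the set of integers.
rng = np.random.default_rng(42)  # seed chosen only for reproducibility

def random_walk(n_steps):
    steps = rng.choice([-1, 1], size=n_steps)       # +1 or -1, equally likely
    return np.concatenate(([0], np.cumsum(steps)))  # X(0) = 0

path = random_walk(10)
print(path)  # one realization: the values X(0), X(1), ..., X(10)
```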
Abstract:
This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. We therefore present a suitable integration scheme that handles the stiffening of the differential equations involved without adding computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of the state space. Advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation with deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models: an M-member ensemble detaches into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reverted by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model. The RAW filter is an improvement on the widely used Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding distortion of the mean value of the filtered quantity. Using statistical significance tests at both the local and field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time stepping scheme; hence, no retuning of the parameterizations is required. It is also found that the accuracy of medium-term forecasts is increased by using the RAW filter.
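Since the RAW filter is a small modification of the Robert-Asselin time filter, a minimal sketch of leapfrog time stepping with the RAW filter on a toy oscillator may help fix ideas; the test problem, time step, and filter parameters below are illustrative and unrelated to the SPEEDY implementation:

```python
import numpy as np

def f(state):
    """Toy oscillator dx/dt = y, dy/dt = -x (illustrative test problem only)."""
    x, y = state
    return np.array([y, -x])

def leapfrog_raw(state0, dt, n_steps, nu=0.2, alpha=0.53):
    """
    Leapfrog time stepping with the Robert-Asselin-Williams (RAW) filter.
    alpha = 1.0 recovers the classical Robert-Asselin filter; alpha around 0.53
    is the value suggested by Williams to damp the spurious computational mode
    while leaving the physical mode almost untouched.
    """
    prev = np.asarray(state0, dtype=float)
    curr = prev + dt * f(prev)                    # start-up step (forward Euler)
    trajectory = [prev, curr]
    for _ in range(n_steps - 1):
        new = prev + 2.0 * dt * f(curr)           # leapfrog step
        d = 0.5 * nu * (prev - 2.0 * curr + new)  # Robert-Asselin displacement
        curr = curr + alpha * d                   # RAW: filter the middle level ...
        new = new + (alpha - 1.0) * d             # ... and partially the newest level
        prev, curr = curr, new
        trajectory.append(new)
    return np.array(trajectory)

# Illustrative run; the exact solution is (cos t, -sin t) for state0 = (1, 0).
traj = leapfrog_raw(np.array([1.0, 0.0]), dt=0.1, n_steps=200)
print(traj[-1], "vs exact:", np.cos(20.0), -np.sin(20.0))
```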
Abstract:
In order to validate the reported precision of space‐based atmospheric composition measurements, validation studies often focus on measurements in the tropical stratosphere, where natural variability is weak. The scatter in tropical measurements can then be used as an upper limit on single‐profile measurement precision. Here we introduce a method of quantifying the scatter of tropical measurements which aims to minimize the effects of short‐term atmospheric variability while maintaining large enough sample sizes that the results can be taken as representative of the full data set. We apply this technique to measurements of O3, HNO3, CO, H2O, NO, NO2, N2O, CH4, CCl2F2, and CCl3F produced by the Atmospheric Chemistry Experiment–Fourier Transform Spectrometer (ACE‐FTS). Tropical scatter in the ACE‐FTS retrievals is found to be consistent with the reported random errors (RREs) for H2O and CO at altitudes above 20 km, validating the RREs for these measurements. Tropical scatter in measurements of NO, NO2, CCl2F2, and CCl3F is roughly consistent with the RREs as long as the effect of outliers in the data set is reduced through the use of robust statistics. The scatter in measurements of O3, HNO3, CH4, and N2O in the stratosphere, while larger than the RREs, is shown to be consistent with the variability simulated in the Canadian Middle Atmosphere Model. This result implies that, for these species, stratospheric measurement scatter is dominated by natural variability, not random error, which provides added confidence in the scientific value of single‐profile measurements.
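The robust statistics mentioned above can be as simple as replacing the sample standard deviation by a scaled median absolute deviation when estimating scatter; a minimal sketch with synthetic data (the noise level and outliers are made up, not ACE-FTS values):

```python
import numpy as np

def robust_scatter(x):
    """Outlier-resistant scatter: 1.4826 * median absolute deviation, which
    matches the standard deviation for Gaussian data."""
    x = np.asarray(x, dtype=float)
    return 1.4826 * np.median(np.abs(x - np.median(x)))

# Illustrative only: Gaussian "measurement noise" contaminated by a few outliers.
rng = np.random.default_rng(1)
sample = rng.normal(0.0, 1.0, size=500)
sample[:5] += 25.0                                    # a handful of gross outliers
print("standard deviation:", sample.std())            # inflated by the outliers
print("robust scatter    :", robust_scatter(sample))  # stays close to 1
```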
Abstract:
Many applications, such as intermittent data assimilation, lead to a recursive application of Bayesian inference within a Monte Carlo context. Popular data assimilation algorithms include sequential Monte Carlo methods and ensemble Kalman filters (EnKFs). These methods differ in the way Bayesian inference is implemented. Sequential Monte Carlo methods rely on importance sampling combined with a resampling step, while EnKFs utilize a linear transformation of Monte Carlo samples based on the classic Kalman filter. While EnKFs have proven to be quite robust even for small ensemble sizes, they are not consistent since their derivation relies on a linear regression ansatz. In this paper, we propose another transform method, which does not rely on any a priori assumptions on the underlying prior and posterior distributions. The new method is based on solving an optimal transportation problem for discrete random variables.
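A minimal sketch of the ensemble-transform-by-optimal-transport idea described above, under stated assumptions: importance weights from a (here assumed Gaussian) likelihood define the posterior marginal, the discrete optimal transport problem is solved as a linear program with SciPy, and the coupling yields a deterministic linear transformation of the samples. The toy ensemble, observation, and likelihood are hypothetical, and this is not claimed to be the paper's exact algorithm:

```python
import numpy as np
from scipy.optimize import linprog

def optimal_transport_transform(ensemble, weights):
    """
    Deterministic ensemble transform via discrete optimal transport (a sketch).

    ensemble: (M, d) prior samples, each carrying weight 1/M.
    weights:  (M,) posterior importance weights (nonnegative, summing to one).
    Solves for the coupling T >= 0 minimizing sum_ij T_ij ||x_i - x_j||^2
    subject to sum_j T_ij = weights[i] and sum_i T_ij = 1/M, and returns the
    transformed samples x_j^a = M * sum_i T_ij x_i.
    """
    M = ensemble.shape[0]
    cost = np.sum((ensemble[:, None, :] - ensemble[None, :, :]) ** 2, axis=-1)
    A_eq = np.zeros((2 * M, M * M))
    for i in range(M):
        A_eq[i, i * M:(i + 1) * M] = 1.0  # row i: particle i's posterior mass w_i is split up
        A_eq[M + i, i::M] = 1.0           # column i: each new particle receives total mass 1/M
    b_eq = np.concatenate([weights, np.full(M, 1.0 / M)])
    res = linprog(cost.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    T = res.x.reshape(M, M)
    return M * T.T @ ensemble

# Toy scalar example with a Gaussian likelihood for a single observation.
rng = np.random.default_rng(3)
prior = rng.normal(0.0, 1.0, size=(20, 1))
obs, obs_var = 1.0, 0.5**2
logw = -0.5 * (obs - prior[:, 0]) ** 2 / obs_var
w = np.exp(logw - logw.max())
w /= w.sum()
posterior = optimal_transport_transform(prior, w)
print(prior.mean(), "->", posterior.mean())  # the mean moves toward the observation
```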
Abstract:
We demonstrate that a distributed Raman amplification scheme based on a random distributed feedback (DFB) fiber laser enables bidirectional second-order Raman pumping without increasing the relative intensity noise (RIN) of the signal. This extends the reach of 10 × 116 Gb/s DP-QPSK WDM transmission up to 7915 km, compared with conventional Raman amplification schemes. Moreover, this scheme gives the longest maximum transmission distance among all the Raman amplification schemes presented in this paper, whilst maintaining a relatively uniform and symmetric signal power distribution, and is also adjustable in order to be highly compatible with different nonlinearity compensation techniques, including mid-link optical phase conjugation (OPC) and nonlinear Fourier transform (NFT).
Abstract:
We prove that a random Hilbert scheme that parametrizes the closed subschemes with a fixed Hilbert polynomial in some projective space is irreducible and nonsingular with probability greater than $0.5$. To consider the set of nonempty Hilbert schemes as a probability space, we transform this set into a disjoint union of infinite binary trees, reinterpreting Macaulay's classification of admissible Hilbert polynomials. Choosing discrete probability distributions with infinite support on the trees establishes our notion of random Hilbert schemes. To bound the probability that random Hilbert schemes are irreducible and nonsingular, we show that at least half of the vertices in the binary trees correspond to Hilbert schemes with unique Borel-fixed points.
Abstract:
The Fourier transform infrared (FT-IR) signature of dry samples of DNA and DNA-polypeptide complexes, as studied by IR microspectroscopy using a diamond attenuated total reflection (ATR) objective, has revealed important discriminatory characteristics relative to the PO2(-) vibrational stretchings. However, DNA IR marks that provide information on the sample's richness in hydrogen bonds have not been resolved in the spectral profiles obtained with this objective. Here we investigated the performance of an all-reflecting objective (ARO) for analysis of the FT-IR signal of hydrogen bonds in DNA samples differing in base richness types (salmon testis vs. calf thymus). The results obtained using the ARO indicate prominent band peaks at the spectral region representative of the vibration of nitrogenous base hydrogen bonds and of NH and NH2 groups. The band areas at this spectral region differ in agreement with the DNA base richness type when using the ARO. A peak assigned to adenine was more evident in the AT-rich salmon DNA using either the ARO or the ATR objective. It is concluded that, for the discrimination of DNA IR hydrogen bond vibrations associated with varying base type proportions, the use of an ARO is recommended.
Biased Random-Key Genetic Algorithms for the Winner Determination Problem in Combinatorial Auctions.
Abstract:
In this paper, we address the problem of picking a subset of bids in a general combinatorial auction so as to maximize the overall profit under the first-price model. This winner determination problem assumes that a single bidding round is held to determine both the winners and the prices to be paid. We introduce six variants of biased random-key genetic algorithms for this problem. Three of them use a novel initialization technique that makes use of solutions of intermediate linear programming relaxations of an exact mixed integer linear programming model as initial chromosomes of the population. An experimental evaluation compares the effectiveness of the proposed algorithms with the standard mixed integer linear programming formulation, a specialized exact algorithm, and the best-performing heuristics proposed for this problem. The proposed algorithms are competitive and offer strong results, mainly for large-scale auctions.
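A bare-bones sketch of the biased random-key genetic algorithm idea for winner determination, with a greedy decoder on a hypothetical toy auction; it does not reproduce the paper's six variants or their LP-relaxation warm start:

```python
import random

# Hypothetical instance: each bid is (set_of_items, price). Purely illustrative.
BIDS = [({1, 2}, 8.0), ({2, 3}, 6.0), ({3, 4}, 7.0), ({1, 4}, 5.0), ({5}, 3.0)]

def decode(keys, bids=BIDS):
    """Greedy decoder: visit bids in the order given by their random keys and
    accept a bid whenever none of its items has been sold yet. Returns the
    total revenue of the resulting feasible bid selection."""
    sold = set()
    revenue = 0.0
    for idx in sorted(range(len(bids)), key=lambda i: keys[i]):
        items, price = bids[idx]
        if not (items & sold):
            sold |= items
            revenue += price
    return revenue

def brkga(n_generations=50, pop_size=30, elite=0.2, mutants=0.1, rho=0.7, seed=0):
    """A bare-bones biased random-key GA: the elite survive, mutants are random,
    and offspring inherit each key from an elite parent with probability rho."""
    rng = random.Random(seed)
    n = len(BIDS)
    pop = [[rng.random() for _ in range(n)] for _ in range(pop_size)]
    for _ in range(n_generations):
        pop.sort(key=decode, reverse=True)
        n_elite, n_mut = int(elite * pop_size), int(mutants * pop_size)
        nxt = pop[:n_elite] + [[rng.random() for _ in range(n)] for _ in range(n_mut)]
        while len(nxt) < pop_size:
            e = rng.choice(pop[:n_elite])
            o = rng.choice(pop[n_elite:])
            nxt.append([e[k] if rng.random() < rho else o[k] for k in range(n)])
        pop = nxt
    return max(decode(c) for c in pop)

print(brkga())  # best revenue found; the optimum for this toy instance is 18.0
```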
Abstract:
Isosorbide succinate moieties were incorporated into poly(L-lactide) (PLLA) backbone in order to obtain a new class of biodegradable polymer with enhanced properties. This paper describes the synthesis and characterization of four types of low molecular weight copolymers. Copolymer I was obtained from monomer mixtures of L-lactide, isosorbide, and succinic anhydride; II from oligo(L-lactide) (PLLA), isosorbide, and succinic anhydride; III from oligo(isosorbide succinate) (PIS) and L-lactide; and IV from transesterification reactions between PLLA and PIS. MALDI-TOFMS and 13C-NMR analyses gave evidence that co-oligomerization was successfully attained in all cases. The data suggested that the product I is a random co-oligomer and the products II-IV are block co-oligomers.