986 results for spherically invariant random process
Abstract:
Imprinted inactivation of the paternal X chromosome in marsupials is the primordial mechanism of dosage compensation for X-linked genes between females and males in therians. In eutherian mammals, X chromosome inactivation (XCI) evolved into a random process in cells of the embryo proper, where either the maternal or the paternal X can be inactivated. However, species such as mouse and bovine maintained imprinted XCI exclusively in extraembryonic tissues. The existence of imprinted XCI in humans remains controversial, with studies based on the analysis of only one or two X-linked genes in different extraembryonic tissues. Here we readdress this issue in human term placenta by performing a robust analysis of allele-specific expression of 22 X-linked genes, including XIST, using 27 SNPs in transcribed regions. We show that XCI is random in the human placenta, and that this organ is arranged in relatively large patches of cells with either the maternal or the paternal X inactive. In addition, this analysis indicated heterogeneous maintenance of gene silencing along the inactive X, which, combined with the extensive mosaicism found in the placenta, can explain the lack of agreement among previous studies. Our results illustrate the differences in the XCI mechanism between humans and mice, and highlight the importance of addressing imprinted XCI in other species in order to understand the evolution of dosage compensation in placental mammals.
Abstract:
Polynomial Chaos Expansion (PCE) is widely recognized as a flexible tool to represent different types of random variables and processes, but applications to real, experimental data are still limited. In this article, PCE is used to represent the random time-evolution of metal corrosion growth in marine environments. The PCE coefficients are determined so as to represent the data of 45 corrosion coupons tested by Jeffrey and Melchers (2001) at Taylors Beach, Australia. The accuracy of the representation and possibilities for model extrapolation are considered in the study. Results show that reasonably accurate smooth representations of the corrosion process can be obtained; the accuracy is limited by the fact that a smooth model is used to represent non-smooth corrosion data. Random corrosion leads to time-variant reliability problems, due to resistance degradation over time. Such problems are not trivial to solve, especially under random process loading. Two example problems are solved herein, showing how the developed PCE representations can be employed in the reliability analysis of structures subject to marine corrosion. Monte Carlo simulation is used to solve the resulting time-variant reliability problems, and an accurate, more computationally efficient solution is also presented.
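As an illustration of the kind of representation involved, the sketch below fits a one-dimensional Hermite polynomial chaos expansion to corrosion-depth samples at a fixed exposure time. The sample data and the fit-by-empirical-quantiles choice are illustrative assumptions, not the article's procedure.

```python
# A minimal sketch (not the article's implementation): the corrosion depth at
# a fixed exposure time is represented as a 1-D Hermite polynomial chaos
# expansion d(xi) ~ sum_k c_k He_k(xi), with xi ~ N(0, 1). Coefficients are
# fitted by regression against empirical quantiles; the data are placeholders.
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from scipy.stats import norm

rng = np.random.default_rng(0)
# Placeholder for 45 coupon measurements (mm), sorted as order statistics
depths = np.sort(rng.gamma(shape=4.0, scale=0.05, size=45))

# Map each order statistic to its standard-normal quantile (isoprobabilistic transform)
p = (np.arange(1, depths.size + 1) - 0.5) / depths.size
xi = norm.ppf(p)

# Least-squares fit of probabilists' Hermite polynomials He_0..He_3
order = 3
V = hermevander(xi, order)                  # Vandermonde matrix of He_k(xi)
coeffs, *_ = np.linalg.lstsq(V, depths, rcond=None)

# The fitted expansion can now be sampled cheaply, e.g. inside a
# Monte Carlo time-variant reliability loop.
xi_mc = rng.standard_normal(100_000)
d_mc = hermevander(xi_mc, order) @ coeffs
print(f"mean depth: {d_mc.mean():.4f} mm, std: {d_mc.std():.4f} mm")
```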
Abstract:
In most real-life environments, mechanical or electronic components are subjected to vibrations. Some of these components may have to pass qualification tests to verify that they can withstand the fatigue damage they will encounter during their operational life. In order to conduct a reliable test, the environmental excitations can be taken as a reference to synthesize the test profile; this procedure is referred to as “test tailoring”. For cost and feasibility reasons, accelerated qualification tests are usually performed: the duration of the original excitation, which acts on the component for its entire life-cycle, typically hundreds or thousands of hours, is reduced. In particular, the “Mission Synthesis” procedure makes it possible to quantify the damage induced by the environmental vibration through two functions: the Fatigue Damage Spectrum (FDS) quantifies the fatigue damage, while the Maximum Response Spectrum (MRS) quantifies the maximum stress. A new random Power Spectral Density (PSD) can then be synthesized with the same amount of induced damage but a specified duration, in order to conduct accelerated tests. In this work, the Mission Synthesis procedure is applied to so-called Sine-on-Random vibrations, i.e. excitations composed of random vibrations superimposed on deterministic contributions in the form of sine tones, typically due to rotating parts of the system (e.g. helicopters, engine-mounted components, …). In fact, proper test tailoring should preserve not only the accumulated fatigue damage but also the “nature” of the excitation (in this case, the sinusoidal components superimposed on the random process) in order to obtain reliable results. The classic time-domain approach is taken as a reference for the comparison of different methods for the FDS calculation in the presence of Sine-on-Random vibrations. A methodology to compute a Sine-on-Random specification based on a mission FDS is then presented.
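For orientation, a minimal sketch of a frequency-domain FDS computation for the purely random part of the excitation follows. It assumes the narrow-band (Lalanne-type) damage formula and an SDOF response model; the sine-tone contribution, which is the article's actual subject, is omitted, and K and C are set to 1 since only relative damage matters when matching profiles.

```python
# A minimal sketch under the stated assumptions, not the article's method.
import numpy as np
from scipy.special import gamma

def fds_from_psd(f_psd, psd, f_n, T, Q=10.0, b=8.0, K=1.0, C=1.0):
    """Fatigue Damage Spectrum of a stationary Gaussian random vibration.

    f_psd : frequency axis of the base-acceleration PSD [Hz]
    psd   : one-sided acceleration PSD [(m/s^2)^2/Hz]
    f_n   : natural frequencies at which the FDS is evaluated [Hz]
    T     : exposure duration [s]; Q: dynamic amplification; b: Basquin exponent
    """
    fds = np.empty_like(f_n)
    for i, fn in enumerate(f_n):
        # |H(f)|^2 from base acceleration to relative displacement of an SDOF system
        H2 = 1.0 / ((2 * np.pi) ** 4 * ((fn**2 - f_psd**2) ** 2 + (fn * f_psd / Q) ** 2))
        z_rms = np.sqrt(np.trapz(psd * H2, f_psd))
        # Narrow-band damage: fn*T cycles with Rayleigh-distributed amplitudes
        fds[i] = fn * T * (K * np.sqrt(2.0) * z_rms) ** b / C * gamma(1.0 + b / 2.0)
    return fds

# Example: flat 0.04 g^2/Hz PSD between 20 and 2000 Hz, 100 h of exposure
f = np.linspace(20, 2000, 2000)
psd = np.full_like(f, 0.04 * 9.81**2)
print(fds_from_psd(f, psd, f_n=np.array([50.0, 200.0, 800.0]), T=100 * 3600))
```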
Abstract:
Radioactive decay is a random process, and the estimation of all associated measurements is governed by statistical laws. Count-rate profiles are always "noisy" when short periods, such as one second per measurement, are used. The filters applied, and the corrections subsequently made in current gamma-ray spectrometric data processing, are not sufficient to remove or considerably reduce the noise originating in the spectrum. Two statistical methods that act directly on the collected data, that is, on the spectra, have been suggested in the literature to remove and minimize this residual noise: Noise-Adjusted Singular Value Decomposition (NASVD) and Maximum Noise Fraction (MNF). These methods produce a significant noise reduction. In this work they were implemented within the processing environment of the Oasis Montaj software and applied to the area comprising blocks I and II of the airborne geophysical survey covering the western portion of the Tapajós Mineral Province, between the states of Pará and Amazonas. Data filtered with the NASVD and MNF techniques and unfiltered data were processed with the parameters and constants provided by the company Lasa Engenharia e Prospecções S.A., and then compared. The comparison between profiles and maps produced promising results, with a gain in the resolution of the products.
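For illustration, a minimal NASVD-style sketch follows (not the Oasis Montaj implementation). It assumes the standard scheme of scaling each channel by the square root of its mean count to equalize the Poisson noise before truncating the SVD; the synthetic spectra are placeholders.

```python
# A minimal sketch of NASVD-style spectral smoothing under the assumptions above.
import numpy as np

def nasvd_filter(spectra, n_components=8):
    """spectra: (n_measurements, n_channels) array of raw gamma-ray counts."""
    mean = spectra.mean(axis=0)
    scale = np.sqrt(np.maximum(mean, 1e-12))   # Poisson noise ~ sqrt(mean count)
    scaled = spectra / scale                   # noise-adjusted spectra
    U, s, Vt = np.linalg.svd(scaled, full_matrices=False)
    s[n_components:] = 0.0                     # discard noise-dominated components
    return (U * s) @ Vt * scale                # reconstruct and undo the scaling

# Example with synthetic noisy spectra (256 channels, 1000 one-second samples)
rng = np.random.default_rng(1)
truth = 50 * np.exp(-0.5 * ((np.arange(256) - 120) / 15) ** 2) + 5
raw = rng.poisson(truth, size=(1000, 256)).astype(float)
smooth = nasvd_filter(raw)
```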
Abstract:
Background: Exposure of cells to environmental stress conditions can lead to the interruption of several intracellular processes, in particular those performed by macromolecular complexes such as the spliceosome. Results: During nucleotide sequencing of cDNA libraries constructed using RNA isolated from B. emersonii cells submitted to heat shock and cadmium stress, a large number of ESTs with retained introns was observed. Among the 6,350 ESTs obtained through sequencing of stress cDNA libraries, 181 ESTs presented putative introns (2.9%), while sequencing of cDNA libraries from unstressed B. emersonii cells revealed only 0.2% of ESTs containing introns. These data indicate an enrichment of ESTs with introns in B. emersonii stress cDNA libraries. Among the 85 genes corresponding to the ESTs that retained introns, 19 showed more than one intron and three showed three introns, with intron lengths ranging from 55 to 333 nucleotides. Canonical splicing junctions were observed in most of these introns, and junction sequences were very similar to those found in introns from genes previously characterized in B. emersonii, suggesting that the inhibition of splicing during stress is a random process. Confirming our observations, analyses of gpx3 and hsp70 mRNAs by Northern blot and S1 protection assays revealed a strong inhibition of intron splicing in cells submitted to cadmium stress. Conclusion: The data indicate that environmental stresses, particularly cadmium treatment, inhibit intron processing in B. emersonii, revealing a new adaptive response to cellular exposure to this heavy metal.
Abstract:
Due to the several kinds of services that use the Internet and data network infrastructures, present-day networks are characterized by a diversity of traffic types with complex statistical properties, such as complex temporal correlation and non-Gaussian distributions. The complex temporal correlation of network traffic may be characterized by Short-Range Dependence (SRD) and Long-Range Dependence (LRD). Models such as fGN (Fractional Gaussian Noise) may capture LRD but not SRD. This work presents two methods for traffic generation that synthesize approximate realizations of the self-similar fGN random process with SRD. The first employs the IDWT (Inverse Discrete Wavelet Transform) and the second the IDWPT (Inverse Discrete Wavelet Packet Transform). The variance map concept was developed, which allows the LRD and SRD behaviors to be associated directly with the wavelet transform coefficients. The developed methods are extremely flexible and allow the generation of Gaussian time series with complex statistical behaviors.
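A minimal sketch of the IDWT-based idea is given below, assuming the PyWavelets library and a simplified "variance map" in which the per-scale detail-coefficient variances follow the fGN power law only; the SRD refinements the work introduces are omitted, and the Haar wavelet is used so the dyadic coefficient lengths reconstruct exactly.

```python
# A minimal sketch under the stated assumptions: draw independent Gaussian
# wavelet coefficients whose per-scale variances follow Var(d_j) ~ 2^(j(2H-1)),
# then reconstruct the time series with the inverse DWT.
import numpy as np
import pywt

def synth_fgn_idwt(n_levels=12, H=0.8, wavelet="haar", seed=0):
    rng = np.random.default_rng(seed)
    # Approximation coefficients at the coarsest level (length 1 in a dyadic tree)
    coeffs = [rng.standard_normal(1)]
    for j in range(n_levels, 0, -1):            # coarse (j = n_levels) to fine (j = 1)
        n_j = 2 ** (n_levels - j)               # number of detail coefficients at scale j
        sigma_j = 2.0 ** (j * (2 * H - 1) / 2)  # std dev from the fGN scaling law
        coeffs.append(sigma_j * rng.standard_normal(n_j))
    return pywt.waverec(coeffs, wavelet)        # inverse DWT back to the time domain

x = synth_fgn_idwt()                            # 4096-sample approximate fGN realization
print(len(x), x.std())
```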
Abstract:
We develop a test of evolutionary change that incorporates a null hypothesis of homogeneity, which encompasses time invariance in the variance and autocovariance structure of residuals from estimated econometric relationships. The test framework is based on examining whether shifts in the spectral decomposition between two frames of data are significant. Rejection of the null hypothesis points not only to weak nonstationarity but to shifts in the structure of the second-order moments of the limiting distribution of the random process. This would indicate that the second-order properties of any underlying attractor set have changed in a statistically significant way, pointing to the presence of evolutionary change. The test's applicability to a real-world macroeconomic problem is demonstrated by applying it to the Australian Building Society Deposits (ABSD) model.
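As a rough illustration of a two-frame spectral comparison (not the authors' test statistic), the sketch below flags frequencies where Welch spectral estimates of residuals differ significantly between frames, using the approximate F distribution of the ratio of two independent spectral estimates; the data are synthetic.

```python
# A minimal, illustrative two-frame spectral comparison under the assumptions above.
import numpy as np
from scipy.signal import welch
from scipy.stats import f as f_dist

def spectral_shift_check(resid1, resid2, nperseg=64, alpha=0.05):
    f1, p1 = welch(resid1, nperseg=nperseg)
    f2, p2 = welch(resid2, nperseg=nperseg)
    K = len(resid1) // nperseg                 # rough segment count (ignores overlap)
    ratio = p1 / p2                            # ~ F(2K, 2K) under homogeneity
    lo = f_dist.ppf(alpha / 2, 2 * K, 2 * K)
    hi = f_dist.ppf(1 - alpha / 2, 2 * K, 2 * K)
    return f1, (ratio < lo) | (ratio > hi)     # frequencies flagged as shifted

rng = np.random.default_rng(2)
frame1 = rng.standard_normal(512)
frame2 = np.convolve(rng.standard_normal(512), [1.0, 0.8], mode="same")  # changed dynamics
freqs, flagged = spectral_shift_check(frame1, frame2)
print(f"{flagged.mean():.0%} of frequencies flagged")
```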
Abstract:
This paper addresses the investment decisions of 373 large Brazilian firms from 1997 to 2004 in the presence of financial constraints, using panel data. A Bayesian econometric model with ridge regression was used to handle multicollinearity problems among the variables in the model. Prior distributions are assumed for the parameters, classifying the model into random or fixed effects. We used a Bayesian approach to estimate the parameters, considering normal and Student-t distributions for the errors, and assumed that the initial values of the lagged dependent variable are not fixed but generated by a random process. The recursive predictive density criterion was used for model comparison. Twenty models were tested, and the results indicated that multicollinearity does influence the values of the estimated parameters. Controlling for capital intensity, financial constraints are found to be more important for capital-intensive firms, probably due to their lower profitability indexes, higher fixed costs and higher degree of property diversification.
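As a loose illustration of ridge-regularized Bayesian estimation (not the authors' model, which also considers Student-t errors and a recursive predictive density criterion), the sketch below fits a Gaussian Bayesian ridge regression to an investment equation with a lagged dependent variable; all variable names and data are illustrative placeholders.

```python
# A minimal sketch using scikit-learn's BayesianRidge (Gaussian errors,
# gamma hyperpriors on the precisions); synthetic, illustrative data.
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(3)
n = 373
cash_flow = rng.standard_normal(n)
sales = rng.standard_normal(n)
invest_lag = rng.standard_normal(n)          # lagged dependent variable
invest = 0.4 * invest_lag + 0.3 * cash_flow + 0.1 * sales + 0.2 * rng.standard_normal(n)

X = np.column_stack([invest_lag, cash_flow, sales])
model = BayesianRidge().fit(X, invest)
print(model.coef_, model.sigma_.diagonal())  # posterior means and variances
```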
Abstract:
BACKGROUND: The synthesis of published research in systematic reviews is essential when providing evidence to inform clinical and health policy decision-making. However, the validity of systematic reviews is threatened if journal publications represent a biased selection of all studies that have been conducted (dissemination bias). To investigate the extent of dissemination bias, we conducted a systematic review that determined the proportion of studies published as peer-reviewed journal articles and investigated factors associated with full publication in cohorts of studies (i) approved by research ethics committees (RECs) or (ii) included in trial registries. METHODS AND FINDINGS: Four bibliographic databases were searched for methodological research projects (MRPs) without limitations on publication year, language or study location. The searches were supplemented by handsearching the references of included MRPs. We estimated the proportion of studies published using prediction intervals (PIs) and a random-effects meta-analysis. Pooled odds ratios (ORs) were used to express associations between study characteristics and journal publication. Seventeen MRPs (23 publications) evaluated cohorts of studies approved by RECs; the proportion of published studies had a PI between 22% and 72%, and the weighted pooled proportion when combining estimates would be 46.2% (95% CI 40.2%-52.4%, I2 = 94.4%). Twenty-two MRPs (22 publications) evaluated cohorts of studies included in trial registries; the PI of the proportion published ranged from 13% to 90%, and the weighted pooled proportion would be 54.2% (95% CI 42.0%-65.9%, I2 = 98.9%). REC-approved studies with statistically significant results (compared with those without statistically significant results) were more likely to be published (pooled OR 2.8; 95% CI 2.2-3.5). Phase III trials were also more likely to be published than phase II trials (pooled OR 2.0; 95% CI 1.6-2.5). The probability of publication within two years after study completion ranged from 7% to 30%. CONCLUSIONS: A substantial proportion of the studies approved by RECs or included in trial registries remains unpublished. Due to the large heterogeneity, a prediction of the publication probability for a future study is very uncertain. Non-publication of research is not a random process; for example, it is associated with the direction of study findings. Our findings suggest that the dissemination of research findings is biased.
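For orientation, a minimal sketch of the kind of random-effects pooling of proportions described is given below, using the DerSimonian-Laird method on logit-transformed proportions; the counts are illustrative placeholders, not data from the review.

```python
# A minimal DerSimonian-Laird pooling sketch under the assumptions above.
import numpy as np

def pool_proportions_dl(events, totals):
    p = events / totals
    y = np.log(p / (1 - p))                    # logit-transformed proportions
    v = 1 / events + 1 / (totals - events)     # approximate within-study variances
    w = 1 / v                                  # fixed-effect weights
    q = np.sum(w * (y - np.sum(w * y) / w.sum()) ** 2)            # Cochran's Q
    tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))
    w_star = 1 / (v + tau2)                    # random-effects weights
    mu = np.sum(w_star * y) / w_star.sum()
    se = np.sqrt(1 / w_star.sum())
    inv = lambda x: 1 / (1 + np.exp(-x))       # back-transform to a proportion
    return inv(mu), inv(mu - 1.96 * se), inv(mu + 1.96 * se)

events = np.array([40, 120, 55, 200, 33])      # published studies per cohort (placeholder)
totals = np.array([90, 260, 100, 380, 80])     # cohort sizes (placeholder)
print(pool_proportions_dl(events, totals))     # pooled proportion and 95% CI
```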
Abstract:
We present simple procedures for the prediction of a real-valued sequence. The algorithms are based on a combination of several simple predictors. We show that if the sequence is a realization of a bounded stationary and ergodic random process, then the average of squared errors converges, almost surely, to that of the optimum, given by the Bayes predictor. We offer an analogous result for the prediction of stationary Gaussian processes.
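A minimal sketch of one standard way to combine simple predictors under squared loss, the exponentially weighted average forecaster, is given below; the article's combination scheme may differ in its details, and the experts here are illustrative.

```python
# A minimal exponentially weighted average forecaster for squared loss.
import numpy as np

def ewa_forecast(experts, y, B=1.0, eta=None):
    """experts: (n_experts, T) predictions; y: (T,) sequence bounded by B."""
    n, T = experts.shape
    eta = eta if eta is not None else 1.0 / (8 * B**2)  # exp-concavity constant
    log_w = np.zeros(n)
    preds = np.empty(T)
    for t in range(T):
        w = np.exp(log_w - log_w.max())                 # numerically stable weights
        preds[t] = w @ experts[:, t] / w.sum()          # weighted average prediction
        log_w -= eta * (experts[:, t] - y[t]) ** 2      # penalize each expert's loss
    return preds

rng = np.random.default_rng(4)
y = np.clip(np.cumsum(rng.standard_normal(500)) * 0.02, -1, 1)
experts = np.vstack([np.roll(y, 1), np.full_like(y, y.mean()), 0.5 * np.roll(y, 1)])
print(np.mean((ewa_forecast(experts, y) - y) ** 2))     # average squared error
```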
Abstract:
Previous indirect evidence suggests that impulses towards pro-social behavior are diminished when an external authority is responsible for an outcome. The responsibility-alleviation effect states that a shift of responsibility to an external authority dampens internal impulses toward honesty, loyalty, or generosity. In a gift-exchange experiment, we find that subjects respond with more generosity (higher effort) when a wage is determined by a random process than when it is assigned by a third party, indicating that even a slight shift in perceived responsibility for the final payoffs can change behavior. Responsibility-alleviation is a factor in economic environments featuring substantial personal interaction.
Abstract:
We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments in the theory of prediction of individual sequences. We show that if the sequence is a realization of a stationary and ergodic random process, then the average number of mistakes converges, almost surely, to that of the optimum, given by the Bayes predictor.
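For illustration, a minimal randomized forecaster over a fixed set of simple experts is sketched below (randomized weighted majority); the article's construction, which attains the Bayes-optimal rate for stationary ergodic sources, is more elaborate.

```python
# A minimal randomized weighted majority sketch; experts and data are illustrative.
import numpy as np

def randomized_weighted_majority(experts, y, eta=0.1, seed=0):
    """experts: (n_experts, T) 0/1 predictions; y: (T,) 0/1 outcomes."""
    rng = np.random.default_rng(seed)
    n, T = experts.shape
    w = np.ones(n)
    mistakes = 0
    for t in range(T):
        pick = rng.choice(n, p=w / w.sum())              # sample an expert at random
        mistakes += int(experts[pick, t] != y[t])
        w *= np.exp(-eta * (experts[:, t] != y[t]))      # downweight wrong experts
    return mistakes / T                                  # average number of mistakes

rng = np.random.default_rng(5)
y = (rng.random(1000) < 0.7).astype(int)                 # biased coin flips
experts = np.vstack([np.ones(1000, int), np.zeros(1000, int), rng.integers(0, 2, 1000)])
print(randomized_weighted_majority(experts, y))
```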
Abstract:
It is well known that the first successful valuation of a stock option was done by solving a deterministic partial differential equation (PDE) of the parabolic type with some complementary conditions specific to the option. In this approach, the randomness in the option value process is eliminated through a no-arbitrage argument. An alternative approach is to construct a replicating portfolio for the option. From this viewpoint the payoff function for the option is a random process which, under a new probability measure, turns out to be of a special type, a martingale. Accordingly, the value of the replicating portfolio (equivalently, of the option) is calculated as an expectation, with respect to this new measure, of the discounted value of the payoff function. Since the expectation is, by definition, an integral, its calculation can be made simpler by resorting to powerful methods already available in the theory of analytic functions. In this paper we use precisely two of those techniques to find the well-known value of a European call option.
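For reference, the sketch below evaluates the discounted risk-neutral expectation described, which for a European call reduces to the Black-Scholes formula, and cross-checks it against a direct Monte Carlo estimate of the same expectation; the parameter values are illustrative.

```python
# The discounted risk-neutral expectation E*[e^{-rT} max(S_T - K, 0)] in closed
# form (Black-Scholes) and by direct Monte Carlo simulation of S_T.
import numpy as np
from scipy.stats import norm

def bs_call(S0, K, r, sigma, T):
    d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

rng = np.random.default_rng(6)
S0, K, r, sigma, T = 100.0, 105.0, 0.05, 0.2, 1.0
# Terminal price under the risk-neutral measure: geometric Brownian motion
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * rng.standard_normal(1_000_000))
print(bs_call(S0, K, r, sigma, T), np.exp(-r * T) * np.maximum(ST - K, 0).mean())
```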