987 results for Random process
Abstract:
This paper addresses the time-variant reliability analysis of structures with random resistance or random system parameters. It deals with the problem of a random load process crossing a random barrier level. The implications of approximating the arrival rate of the first overload by an ensemble-crossing rate are studied. The error involved in this so-called "ensemble-crossing rate" approximation is described in terms of load process and barrier distribution parameters, and in terms of the number of load cycles. Existing results are reviewed, and significant improvements involving load process bandwidth, mean-crossing frequency and time are presented. The paper shows that the ensemble-crossing rate approximation can be accurate enough for problems where load process variance is large in comparison to barrier variance, but especially when the number of load cycles is small. This includes important practical applications like random vibration due to impact loadings and earthquake loading. Two application examples are presented, one involving earthquake loading and one involving a frame structure subject to wind and snow loadings. (C) 2007 Elsevier Ltd. All rights reserved.
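The approximation the abstract analyses can be illustrated numerically. The sketch below (all parameter values are invented for illustration, not taken from the paper) compares a Monte Carlo estimate of the first-overload probability for a discrete sequence of independent load cycles against a random barrier with the ensemble-crossing-rate approximation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters: per-cycle load S ~ N(0, 1), random barrier R ~ N(3, 0.6)
mu_r, sig_r = 3.0, 0.6
n_cycles = 10
n_sim = 200_000

R = rng.normal(mu_r, sig_r, n_sim)             # one barrier draw per realization
S = rng.normal(0.0, 1.0, (n_sim, n_cycles))    # independent load cycles

# "Exact" Monte Carlo first-overload probability: any cycle exceeds the barrier
pf_exact = np.mean((S > R[:, None]).any(axis=1))

# Ensemble-crossing-rate approximation: use the unconditional per-cycle
# crossing probability nu, ignoring that cycles share the same random barrier
nu = np.mean(S > R[:, None])
pf_approx = 1.0 - (1.0 - nu) ** n_cycles

# The approximation overestimates the failure probability; the gap grows
# with barrier variance and with the number of cycles
print(pf_exact, pf_approx)
```

With few cycles and load variance dominating the barrier variance, the two estimates stay close, which is the regime the paper identifies as favourable.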
Abstract:
Using the network random generation models from Gustedt (2009)[23], we simulate and analyze several characteristics (such as the number of components, the degree distribution and the clustering coefficient) of the generated networks. This is done for a variety of distributions (fixed value, Bernoulli, Poisson, binomial) that are used to control the parameters of the generation process. These parameters include, in particular, the size of newly appearing sets of objects, the number of contexts in which new elements appear initially, the number of objects that are shared with `parent` contexts, and the time period inside which a context may serve as a parent context (aging). The results show that these models allow the generation process to be fine-tuned so that the graphs adopt properties found in real-world graphs. (C) 2011 Elsevier B.V. All rights reserved.
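A minimal growth process using the parameter types the abstract lists (new-object count per context, objects shared with a parent, and an aging window limiting eligible parents) can be sketched as follows; this is an illustrative toy, not Gustedt's exact model, and the constants `AGING`, `NEW_MEAN` and `N_SHARED` are invented:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(5)

AGING = 10      # how many recent contexts may serve as a parent
NEW_MEAN = 3    # Poisson mean for newly appearing objects per context
N_SHARED = 2    # objects shared with the chosen parent context

contexts = []   # each context is a set of object ids
next_obj = 0
for _ in range(200):
    ctx = set()
    if contexts:  # inherit a few objects from a parent inside the aging window
        window = contexts[-AGING:]
        parent = window[rng.integers(len(window))]
        k = min(N_SHARED, len(parent))
        ctx.update(int(o) for o in rng.choice(sorted(parent), size=k, replace=False))
    n_new = rng.poisson(NEW_MEAN)          # newly appearing objects
    ctx.update(range(next_obj, next_obj + n_new))
    next_obj += n_new
    if ctx:
        contexts.append(ctx)

# One generated-network characteristic: object "degree" = number of
# contexts containing the object (analogous to the analysed degree distribution)
degree = Counter(o for ctx in contexts for o in ctx)
```

Varying the distributions behind `n_new` and the sharing step is what lets such models tune the resulting degree distribution and clustering.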
Abstract:
When a coherent light beam is scattered from a colloidal medium, a random grainy image known as a speckle pattern appears in the observation plane. The time evolution of this interference image carries information about the ensemble-averaged dynamics of the scattering particles. The aim of this work was to evaluate the use of dynamic speckles as an alternative tool for monitoring frozen foams formulated with glucose and fructose syrups. Ice creams, after preparation and packing, were stored at -18 degrees C. Changes in the properties of the products were analyzed via the speckle phenomenon at three room temperatures (20 degrees C, 25 degrees C and 30 degrees C), minute by minute, for 50 min. Two moments were identified at which sample activity reached significant levels. These instants were associated, respectively, with the melting of ice crystals and with the dissipation of air bubbles into the food matrix, causing motion of diverse structures. As expected, ice crystal melting occurred first in formulations containing fructose syrup, but for the same samples air losses were delayed. The speckle methodology proved satisfactory for observing the temporal evolution of transient processes, opening good prospects for future applications in food research. (C) 2010 Elsevier Ltd. All rights reserved.
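A common activity descriptor in dynamic speckle analysis is the inertia moment of the co-occurrence matrix of a time-history speckle pattern (THSP). The sketch below builds a synthetic THSP with an AR(1) temporal model (the construction and all parameters are illustrative, not the authors' experimental setup) and shows that the inertia moment grows with sample activity:

```python
import numpy as np

rng = np.random.default_rng(7)

def thsp(activity, n_pix=64, n_t=200, levels=64):
    """Toy time-history speckle pattern: rows = pixels, columns = time.
    Low 'activity' keeps intensities temporally correlated (AR(1) blend)."""
    x = rng.standard_normal((n_pix, n_t))
    for k in range(1, n_t):
        x[:, k] = (1 - activity) * x[:, k - 1] + activity * x[:, k]
    x = (x - x.min()) / (x.max() - x.min())        # normalize to [0, 1]
    return (x * (levels - 1)).astype(int)

def inertia_moment(m, levels=64):
    """Inertia moment of the co-occurrence matrix of successive columns."""
    com = np.zeros((levels, levels))
    for k in range(m.shape[1] - 1):
        np.add.at(com, (m[:, k], m[:, k + 1]), 1)  # count intensity transitions
    com /= com.sum()
    i, j = np.indices(com.shape)
    return float((com * (i - j) ** 2).sum())

im_low = inertia_moment(thsp(0.1))    # slow dynamics -> small inertia moment
im_high = inertia_moment(thsp(0.9))   # fast dynamics -> large inertia moment
```

The two "moments of significant activity" the study reports correspond to peaks in exactly this kind of time-resolved activity measure.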
Abstract:
We investigate here a modification of the discrete random pore model [Bhatia SK, Vartak BJ, Carbon 1996;34:1383], by including an additional rate constant which takes into account the different reactivity of the initial pore surface having attached functional groups and hydrogens, relative to the subsequently exposed surface. It is observed that the relative initial reactivity has a significant effect on the conversion and structural evolution, underscoring the importance of initial surface chemistry. The model is tested against experimental data on chemically controlled char oxidation and steam gasification at various temperatures. It is seen that the variations of the reaction rate and surface area with conversion are better represented by the present approach than earlier random pore models. The results clearly indicate the improvement of model predictions in the low conversion region, where the effect of the initially attached functional groups and hydrogens is more significant, particularly for char oxidation. It is also seen that, for the data examined, the initial surface chemistry is less important for steam gasification as compared to the oxidation reaction. Further development of the approach must also incorporate the dynamics of surface complexation, which is not considered here.
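For reference, the classical (unmodified) random pore model relates the chemically controlled reaction rate to conversion through the structural parameter psi; for psi > 2 the rate passes through a maximum at positive conversion. The sketch below uses illustrative values of k and psi and does not reproduce the paper's additional initial-surface rate constant:

```python
import numpy as np

def rpm_rate(X, k=1.0, psi=4.0):
    """Reaction rate vs. conversion X in the classical random pore model
    (Bhatia-Perlmutter form); k and psi are illustrative values only."""
    X = np.asarray(X, dtype=float)
    return k * (1.0 - X) * np.sqrt(1.0 - psi * np.log(1.0 - X))

X = np.linspace(0.0, 0.95, 200)
rate = rpm_rate(X)
X_peak = X[np.argmax(rate)]   # for psi > 2 the rate peaks at X > 0
```

The paper's modification changes the low-conversion portion of this curve, where the initially attached functional groups and hydrogens dominate the reactivity.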
Abstract:
A new conceptual model for soil pore-solid structure is formalized. Soil pore-solid structure is proposed to comprise spatially abutting elements each with a value which is its membership to the fuzzy set "pore," termed porosity. These values have a range between zero (all solid) and unity (all pore). Images are used to represent structures in which the elements are pixels and the value of each is a porosity. Two-dimensional random fields are generated by allocating each pixel a porosity by independently sampling a statistical distribution. These random fields are reorganized into other pore-solid structural types by selecting parent points which have a specified local region of influence. Pixels of larger or smaller porosity are aggregated about the parent points and within the region of interest by controlled swapping of pixels in the image. This creates local regions of homogeneity within the random field. This is similar to the process known as simulated annealing. The resulting structures are characterized using one- and two-dimensional variograms and functions describing their connectivity. A variety of examples of structures created by the model is presented and compared. Extension to three dimensions presents no theoretical difficulties and is currently under development.
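The reorganization step described above can be sketched in a few lines: generate an independent random porosity field, pick one parent point with a square region of influence (both choices are illustrative assumptions), and swap pixels so that high porosities aggregate locally while the global porosity histogram is preserved:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 32
field = rng.uniform(0.0, 1.0, (n, n))     # independent random porosity field
orig_sorted = np.sort(field.ravel())      # swaps must preserve this histogram

# One hypothetical parent point with a square region of influence
ci, cj, radius = n // 2, n // 2, 6
region = np.zeros((n, n), dtype=bool)
region[ci - radius:ci + radius, cj - radius:cj + radius] = True

# Controlled swapping: exchange the lowest porosities inside the region
# with the highest porosities outside it, aggregating "pore" locally
inside, outside = field[region], field[~region]
order_in = np.argsort(inside)             # ascending: lowest inside first
order_out = np.argsort(outside)[::-1]     # descending: highest outside first
for k in range(min(inside.size, outside.size) // 2):
    i, o = order_in[k], order_out[k]
    if inside[i] < outside[o]:            # swap only if it raises local porosity
        inside[i], outside[o] = outside[o], inside[i]
field[region], field[~region] = inside, outside
```

Because only swaps are performed, the marginal distribution of porosity is untouched; only the spatial arrangement (and hence the variograms and connectivity functions) changes.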
Abstract:
Objectives This prospective study evaluated the association of obesity and hypertension with left atrial (LA) volume over 10 years. Background Although left atrial enlargement (LAE) is an independent risk factor for atrial fibrillation, stroke, and death, little information is available about determinants of LA size in the general population. Methods Participants (1,212 men and women, age 25 to 74 years) originated from a sex- and age-stratified random sample of German residents of the Augsburg area (MONICA S3). Left atrial volume was determined by standardized echocardiography at baseline and again after 10 years. Left atrial volume was indexed to body height (iLA). Left atrial enlargement was defined as iLA >= 35.7 and >= 33.7 ml/m in men and women, respectively. Results At baseline, the prevalence of LAE was 9.8%. Both obesity and hypertension were independent predictors of LAE, obesity (odds ratio [OR]: 2.4; p < 0.001) being numerically stronger than hypertension (OR: 2.2; p < 0.001). Adjusted mean values for iLA were significantly lower in normal-weight hypertensive patients (25.4 ml/m) than in obese normotensive individuals (27.3 ml/m; p = 0.016). The highest iLA was found in the obese hypertensive subgroup (30.0 ml/m; p < 0.001 vs. all other groups). This group also presented with the highest increase in iLA (+6.0 ml/m) and the highest incidence (31.6%) of LAE upon follow-up. Conclusions In the general population, obesity appears to be the most important risk factor for LAE. Given the increasing prevalence of obesity, early interventions, especially in young obese individuals, are essential to prevent premature onset of cardiac remodeling at the atrial level. (J Am Coll Cardiol 2009; 54: 1982-9) (C) 2009 by the American College of Cardiology Foundation
Abstract:
This paper presents a new approach to the LU decomposition method for the simulation of stationary and ergodic random fields. The approach overcomes the size limitations of LU and is suitable for any size simulation. The proposed approach can facilitate fast updating of generated realizations with new data, when appropriate, without repeating the full simulation process. Based on a novel column partitioning of the L matrix, expressed in terms of successive conditional covariance matrices, the approach presented here demonstrates that LU simulation is equivalent to the successive solution of kriging residual estimates plus random terms. Consequently, it can be used for the LU decomposition of matrices of any size. The simulation approach is termed conditional simulation by successive residuals because, at each step, a small set (group) of random variables is simulated with an LU decomposition of a matrix of updated conditional covariance of residuals. The simulated group is then used to estimate residuals without the need to solve large systems of equations.
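The classic LU simulation the approach builds on can be sketched directly: factor the covariance matrix as C = L L^T and map independent normals through L. Because L is lower-triangular, each simulated value is a linear (kriging-type) combination of the previously simulated values plus an independent residual, which is the successive-residual view the abstract describes. The 1-D grid and exponential covariance below are illustrative choices:

```python
import numpy as np

# Exponential covariance on a small 1-D grid (illustrative model and size)
x = np.arange(20, dtype=float)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 5.0)

L = np.linalg.cholesky(C)          # lower-triangular factor, C = L L^T
rng = np.random.default_rng(2)
z = rng.standard_normal((20, 50_000))
y = L @ z                          # unconditional realizations with covariance C

C_emp = np.cov(y)                  # empirical covariance approaches C
print(np.max(np.abs(C_emp - C)))
```

The size limitation the paper removes arises because this direct factorization needs the full covariance matrix of all grid nodes at once; the successive-residual partitioning factors only small conditional covariance blocks at each step.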
Abstract:
8th International Workshop on Multiple Access Communications (MACOM2015), Helsinki, Finland.
Abstract:
This work presents the results of an investigation of processes in the melting zone during Electron Beam Welding (EBW) through analysis of the secondary current in the plasma. The studies show that the spectrum of the secondary emission signal during steel welding has a pronounced periodic component at a frequency of around 15–25 kHz. The signal contains quasi-periodic sharp peaks (impulses). These impulses have stochastically varying amplitude and follow each other in series, at random intervals between series. The impulses carry a considerable current (up to 0.5 A). It was established that during electron-beam welding with focal-spot scanning these impulses follow each other almost periodically. It was shown that the probability of occurrence of these high-frequency perturbations increases with the concentration of energy in the interaction zone. The paper also presents hypotheses for the mechanism of the formation of the high-frequency oscillations in the secondary current signal in the plasma.
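Detecting such a periodic component in a noisy impulse-laden signal is a standard spectral task. The sketch below synthesizes a toy secondary-current signal (the 200 kHz sampling rate, the 20 kHz component and the impulse statistics are invented for illustration, not the authors' data) and locates the spectral peak:

```python
import numpy as np

rng = np.random.default_rng(6)

fs = 200_000                        # hypothetical sampling rate, Hz
t = np.arange(0, 0.05, 1.0 / fs)    # 50 ms record

# Synthetic signal: ~20 kHz periodic component, random-amplitude impulses
# (up to 0.5 A, echoing the abstract) and broadband noise
carrier = 0.1 * np.sin(2 * np.pi * 20_000 * t)
impulses = np.zeros_like(t)
idx = rng.choice(t.size, 40, replace=False)
impulses[idx] = rng.uniform(0.1, 0.5, 40)
signal = carrier + impulses + 0.02 * rng.standard_normal(t.size)

spec = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
peak = freqs[np.argmax(spec[1:]) + 1]   # skip the DC bin
print(peak)                             # near the 20 kHz component
```

The narrowband component dominates the spectrum even though the impulses carry far more peak current, which is why spectral analysis exposes the quasi-periodic structure.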
Abstract:
In this paper we explore the effect of bounded rationality on the convergence of individual behavior toward equilibrium. In the context of a Cournot game with a unique and symmetric Nash equilibrium, firms are modeled as adaptive economic agents through a genetic algorithm. Computational experiments show that (1) there is remarkable heterogeneity across identical but boundedly rational agents; (2) such individual heterogeneity is not simply a consequence of the random elements contained in the genetic algorithm; (3) the more rational agents are in terms of memory abilities and pre-play evaluation of strategies, the less heterogeneous they are in their actions. In the limiting case of full rationality, the outcome converges to the standard result of uniform individual behavior.
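The full-rationality benchmark the agents converge to can be checked with a simple gradient-adjustment dynamic (this is not the paper's genetic algorithm, just the equilibrium computation; the demand and cost numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Linear Cournot market: inverse demand P = a - b*Q, constant marginal cost c
a, b, c, n_firms = 100.0, 1.0, 10.0, 5
q_nash = (a - c) / (b * (n_firms + 1))   # symmetric Nash quantity (= 15 here)

# Each firm nudges its quantity in the direction of its own marginal profit
q = rng.uniform(0.0, a / b, n_firms)     # heterogeneous starting quantities
for _ in range(5000):
    Q = q.sum()
    grad = a - c - b * Q - b * q         # d(profit_i)/d(q_i) for each firm
    q = np.clip(q + 0.01 * grad, 0.0, a / b)

print(q)  # all firms end up at the same (Nash) quantity
```

Under bounded rationality the paper finds persistent heterogeneity instead of this uniform outcome, which is exactly what makes the benchmark a useful reference point.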
Abstract:
Counting labelled planar graphs, and typical properties of random labelled planar graphs, have received much attention recently. We start the process here of extending these investigations to graphs embeddable on any fixed surface S. In particular we show that the labelled graphs embeddable on S have the same growth constant as for planar graphs, and the same holds for unlabelled graphs. Also, if we pick a graph uniformly at random from the graphs embeddable on S which have vertex set {1, . . . , n}, then with probability tending to 1 as n → ∞, this random graph either is connected or consists of one giant component together with a few nodes in small planar components.
Abstract:
The patterns of genetic variation of samples of Candida spp. isolated from patients infected with human immunodeficiency virus in Vitória, state of Espírito Santo, Brazil, were examined. Thirty-seven strains were isolated from different anatomical sites obtained from different infection episodes of 11 patients infected with the human immunodeficiency virus (HIV). These samples were subjected to randomly amplified polymorphic DNA (RAPD) analysis using 9 different primers. Reproducible and complex DNA banding patterns were obtained. The experiments indicated evidence of a dynamic process of yeast colonization in HIV-infected patients, and also that certain primers are efficient in the identification of species of the Candida genus. Thus, we conclude that RAPD analysis may be useful in providing genotypic characters for Candida species typing in epidemiological investigations, and also for the rapid identification of pathogenic fungi.
Abstract:
In this research, we analyse the contact-specific mean of the final cooperation probability, distinguishing on the one hand between contacts with household reference persons and with other eligible household members, and on the other hand between first and later contacts. Data come from two Swiss Household Panel surveys. The interviewer-specific variance is higher for first contacts, especially in the case of the reference person. For later contacts with the reference person, the contact-specific variance dominates. This means that interaction effects and situational factors are decisive. The contact number has negative effects on the performance of contacts with the reference person, but positive effects in the case of other persons. Time elapsed since the previous contact also has negative effects in the case of reference persons. The result of the previous contact has strong effects, especially in the case of the reference person. These findings call for a quick completion of the household grid questionnaire, assigning the best interviewers to conducting the first contact. While obtaining refusals has negative effects, obtaining other contact results has only weak effects on the interviewer's subsequent contact outcome. Using the same interviewer for contacts has no positive effects.
Abstract:
One of the key aspects in 3D-image registration is the computation of the joint intensity histogram. We propose a new approach to compute this histogram using uniformly distributed random lines to sample stochastically the overlapping volume between two 3D-images. The intensity values are captured from the lines at evenly spaced positions, taking an initial random offset different for each line. This method provides us with an accurate, robust and fast mutual information-based registration. The interpolation effects are drastically reduced, due to the stochastic nature of the line generation, and the alignment process is also accelerated. The results obtained show that the introduced method performs better than the classic computation of the joint histogram.
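The sampling idea can be sketched as follows: cast random lines through two aligned toy volumes, read both intensities at evenly spaced positions along each line with a random per-line offset, accumulate the joint histogram, and compute mutual information from it. The line-generation scheme and all parameters here are simplified assumptions, not the authors' exact construction:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two toy aligned volumes: the second is a noisy remap of the first
vol_a = rng.integers(0, 16, (32, 32, 32))
vol_b = np.clip(vol_a + rng.integers(-1, 2, vol_a.shape), 0, 15)

def joint_hist_random_lines(a, b, n_lines=2000, step=1.0, bins=16):
    """Joint histogram from samples along random lines with a random
    initial offset per line (a sketch of the stochastic sampling idea)."""
    hist = np.zeros((bins, bins))
    shape = np.array(a.shape, dtype=float)
    for _ in range(n_lines):
        p0 = rng.uniform(0.0, shape)        # random point inside the volume
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)              # random direction
        t = rng.uniform(0.0, step)          # random initial offset
        while True:
            idx = np.floor(p0 + t * d).astype(int)
            if (idx < 0).any() or (idx >= a.shape).any():
                break                       # left the overlapping volume
            hist[a[tuple(idx)], b[tuple(idx)]] += 1
            t += step                       # evenly spaced positions
    return hist

def mutual_information(hist):
    p = hist / hist.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

h = joint_hist_random_lines(vol_a, vol_b)
mi = mutual_information(h)                  # positive for correlated volumes
```

Because each line carries its own random offset, sample positions do not lock onto the voxel grid, which is what reduces the interpolation artefacts mentioned in the abstract.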
Abstract:
The space subdivision in cells resulting from a process of random nucleation and growth is a subject of interest in many scientific fields. In this paper, we deduce the expected value and variance of these distributions while assuming that the space subdivision process is in accordance with the premises of the Kolmogorov-Johnson-Mehl-Avrami model. We have not imposed restrictions on the time dependency of nucleation and growth rates. We have also developed an approximate analytical cell size probability density function. Finally, we have applied our approach to the distributions resulting from solid phase crystallization under isochronal heating conditions.
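As a concrete special case of the KJMA framework, constant nucleation rate I and growth rate G in three dimensions give the classical Avrami kinetics, from which a mean cell size follows; the sketch below uses unit rates for illustration (the paper itself treats general time-dependent rates):

```python
import numpy as np

# Constant-rate KJMA (Avrami) case in 3-D: the extended-volume argument
# gives the transformed fraction X(t) = 1 - exp(-(pi/3) * I * G**3 * t**4)
I_nuc, G = 1.0, 1.0                 # nucleation rate per volume, growth rate
t = np.linspace(0.0, 2.0, 2001)
X = 1.0 - np.exp(-(np.pi / 3.0) * I_nuc * G**3 * t**4)

# Actual nuclei only form in untransformed material, so the final cell
# density is N = integral of I*(1 - X) dt, and the mean cell volume is 1/N
dt = t[1] - t[0]
N = np.sum(I_nuc * (1.0 - X)) * dt  # simple Riemann sum over the transformation
mean_cell_volume = 1.0 / N
```

The variance and the full cell-size density derived in the paper refine this expected-value calculation by tracking how the nucleation instants and impingement geometry spread cell sizes around 1/N.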