178 results for Random Number Generation
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
The generalized Birnbaum-Saunders distribution pertains to a class of lifetime models including both lighter and heavier tailed distributions. This model adapts well to lifetime data, even when outliers exist, and has other good theoretical properties and application perspectives. However, statistical inference tools may not exist in closed form for this model. Hence, simulation and numerical studies are needed, which require a random number generator. Three different ways to generate observations from this model are considered here. These generators are compared by utilizing a goodness-of-fit procedure as well as their effectiveness in predicting the true parameter values by using Monte Carlo simulations. This goodness-of-fit procedure may also be used as an estimation method. The quality of this estimation method is studied here. Finally, through a real data set, the generalized and classical Birnbaum-Saunders models are compared by using this estimation method.
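As a concrete illustration of the generation problem mentioned above, one standard way to produce Birnbaum-Saunders observations is to transform a draw from a symmetric distribution; replacing the normal kernel by another symmetric law (for example, Student-t) yields a generalized BS variate. The sketch below illustrates only this transformation approach, not any of the three specific generators compared in the paper; the function name rgbs and its arguments are hypothetical.

```python
import numpy as np

def rgbs(n, alpha, beta, kernel="normal", df=4, rng=None):
    """Draw n observations from a (generalized) Birnbaum-Saunders law via
    the usual transformation of a symmetric variate:
        T = (beta / 4) * (alpha * Z + sqrt((alpha * Z)**2 + 4))**2,
    where Z is standard normal (classical BS) or, for the generalized
    version, a draw from another symmetric distribution (here Student-t)."""
    rng = np.random.default_rng() if rng is None else rng
    if kernel == "normal":
        z = rng.standard_normal(n)
    elif kernel == "t":
        z = rng.standard_t(df, size=n)
    else:
        raise ValueError("kernel must be 'normal' or 't'")
    return (beta / 4.0) * (alpha * z + np.sqrt((alpha * z) ** 2 + 4.0)) ** 2

# quick check: the median of BS(alpha, beta) is beta (Z = 0 maps to T = beta)
sample = rgbs(100_000, alpha=0.5, beta=2.0)
print(np.median(sample))  # should be close to 2.0
```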
Abstract:
Background Airway bypass is a bronchoscopic lung-volume reduction procedure for emphysema whereby transbronchial passages into the lung are created to release trapped air, supported with paclitaxel-coated stents to ease the mechanics of breathing. The aim of the EASE (Exhale airway stents for emphysema) trial was to evaluate safety and efficacy of airway bypass in people with severe homogeneous emphysema. Methods We undertook a randomised, double-blind, sham-controlled study in 38 specialist respiratory centres worldwide. We recruited 315 patients who had severe hyperinflation (ratio of residual volume [RV] to total lung capacity of >= 0.65). By computer using a random number generator, we randomly allocated participants (in a 2:1 ratio) to either airway bypass (n=208) or sham control (107). We divided investigators into team A (masked), who completed pre-procedure and post-procedure assessments, and team B (unmasked), who only did bronchoscopies without further interaction with patients. Participants were followed up for 12 months. The 6-month co-primary efficacy endpoint required 12% or greater improvement in forced vital capacity (FVC) and 1 point or greater decrease in the modified Medical Research Council dyspnoea score from baseline. The composite primary safety endpoint incorporated five severe adverse events. We did Bayesian analysis to show the posterior probability that airway bypass was superior to sham control (success threshold, 0.965). Analysis was by intention to treat. This study is registered with ClinicalTrials.gov, number NCT00391612. Findings All recruited patients were included in the analysis. At 6 months, no difference between treatment arms was noted with respect to the co-primary efficacy endpoint (30 of 208 for airway bypass vs 12 of 107 for sham control; posterior probability 0.749, below the Bayesian success threshold of 0.965). The 6-month composite primary safety endpoint was 14.4% (30 of 208) for airway bypass versus 11.2% (12 of 107) for sham control (judged non-inferior, with a posterior probability of 1.00 [Bayesian success threshold >0.95]). Interpretation Although our findings showed safety and transient improvements, no sustainable benefit was recorded with airway bypass in patients with severe homogeneous emphysema.
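The 2:1 computer randomisation mentioned above can be illustrated with a generic blocked-allocation sketch. This is only an illustration of 2:1 allocation driven by a pseudo-random number generator, not the EASE trial's actual randomisation scheme; all names and the block size are hypothetical.

```python
import random

def allocate_2_to_1(n_participants, seed=12345):
    """Toy 2:1 blocked allocation: each block of three holds two
    'treatment' slots and one 'control' slot, shuffled with a
    pseudo-random number generator."""
    rng = random.Random(seed)
    allocations = []
    while len(allocations) < n_participants:
        block = ["treatment", "treatment", "control"]
        rng.shuffle(block)
        allocations.extend(block)
    return allocations[:n_participants]

groups = allocate_2_to_1(315)
print(groups.count("treatment"), groups.count("control"))  # roughly 2:1
```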
Abstract:
The Birnbaum-Saunders (BS) model is a positively skewed statistical distribution that has received great attention in recent decades. A generalized version of this model was derived based on symmetrical distributions on the real line, named the generalized BS (GBS) distribution. The R package named gbs was developed to analyze data from GBS models. This package contains probabilistic and reliability indicators and random number generators for GBS distributions. Parameter estimates for censored and uncensored data can also be obtained by means of likelihood methods from the gbs package. Goodness-of-fit and diagnostic methods were also implemented in this package in order to check the suitability of the GBS models. In this article, the capabilities and features of the gbs package are illustrated by using simulated and real data sets. Shape and reliability analyses for GBS models are presented. A simulation study for evaluating the quality and sensitivity of the estimation method developed in the package is provided and discussed. (C) 2008 Elsevier B.V. All rights reserved.
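The gbs package itself is an R library, and its API is not reproduced here. As a rough, package-independent illustration of parameter estimation for the classical BS case, the sketch below implements the well-known modified moment estimator; it is not the likelihood machinery of gbs, and the function name is hypothetical.

```python
import numpy as np

def bs_moment_estimates(t):
    """Modified moment estimates for the classical Birnbaum-Saunders
    distribution BS(alpha, beta), using the sample arithmetic mean s and
    the reciprocal r of the sample mean of 1/T:
        beta_hat  = sqrt(s * r)
        alpha_hat = sqrt(2 * (sqrt(s / r) - 1))"""
    t = np.asarray(t, dtype=float)
    s = t.mean()
    r = 1.0 / np.mean(1.0 / t)
    return np.sqrt(2.0 * (np.sqrt(s / r) - 1.0)), np.sqrt(s * r)

# sanity check on simulated classical BS(0.5, 2.0) data (normal kernel)
rng = np.random.default_rng(3)
z = rng.standard_normal(50_000)
t = (2.0 / 4.0) * (0.5 * z + np.sqrt((0.5 * z) ** 2 + 4.0)) ** 2
print(bs_moment_estimates(t))   # roughly (0.5, 2.0)
```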
Abstract:
Using the random network generation models from Gustedt (2009) [23], we simulate and analyze several characteristics (such as the number of components, the degree distribution, and the clustering coefficient) of the generated networks. This is done for a variety of distributions (fixed value, Bernoulli, Poisson, binomial) that are used to control the parameters of the generation process. These parameters are, in particular, the size of newly appearing sets of objects, the number of contexts in which new elements appear initially, the number of objects that are shared with `parent` contexts, and the time period inside which a context may serve as a parent context (aging). The results show that these models allow one to fine-tune the generation process such that the generated graphs exhibit properties as found in real-world graphs. (C) 2011 Elsevier B.V. All rights reserved.
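For intuition only, the following toy sketch mimics the kind of parameterized object-context growth process described above: new object sets appear, some objects are inherited from a recently created parent context, and an aging window limits which contexts may act as parents. It is a loose illustration, not the generation model of Gustedt (2009); all names and distributions are hypothetical.

```python
import random

def grow_contexts(steps, new_size, shared, aging):
    """Toy object-context growth process: at each step a new context is
    created with new_size() fresh objects plus shared() objects copied
    from a randomly chosen recent ('parent') context that is at most
    `aging` steps old. Returns the list of contexts (object lists)."""
    contexts, next_obj = [], 0
    for _ in range(steps):
        fresh = list(range(next_obj, next_obj + new_size()))
        next_obj += len(fresh)
        inherited = []
        recent = contexts[-aging:]            # contexts still eligible as parents
        if recent:
            parent = random.choice(recent)
            inherited = random.sample(parent, min(shared(), len(parent)))
        contexts.append(fresh + inherited)
    return contexts

random.seed(0)
contexts = grow_contexts(
    steps=200,
    new_size=lambda: 1 + random.randint(0, 3),   # size of newly appearing object sets
    shared=lambda: random.randint(0, 2),          # objects shared with the parent context
    aging=10,                                     # time window for serving as a parent
)
# project to a graph (objects linked when they co-occur in a context) to
# measure components, degree distribution, clustering, and so on
print(len(contexts), sum(len(c) for c in contexts))
```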
Abstract:
Consider a random medium consisting of N points randomly distributed so that there is no correlation among the distances separating them. This is the random link model, which is the high-dimensionality limit (mean-field approximation) for the Euclidean random point structure. In the random link model, at discrete time steps, a walker moves to the nearest point that has not been visited in the last mu steps (memory), producing a deterministic partially self-avoiding walk (the tourist walk). We have analytically obtained the distribution of the number n of points explored by the walker with memory mu=2, as well as the joint distribution of the transient and period. This result enables us to explain the abrupt change in the exploratory behavior between the cases mu=1 (memoryless walker, driven by extreme value statistics) and mu=2 (walker with memory, driven by combinatorial statistics). In the mu=1 case, the mean number of newly visited points in the thermodynamic limit (N >> 1) is just <n> = e = 2.72..., while in the mu=2 case, the mean number <n> of visited points grows proportionally to N^(1/2). This result also allows us to establish an equivalence between the random link model with mu=2 and the random map (uncorrelated back and forth distances) with mu=0, and to explain the abrupt change between the probabilities for null transient time and subsequent ones.
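A minimal simulation sketch of the tourist walk on the random link model follows, assuming i.i.d. symmetric uniform distances; it illustrates the deterministic partially self-avoiding dynamics and the detection of the cycle (transient plus period), not the analytical derivation of the paper. All names are illustrative.

```python
import numpy as np

def tourist_walk(dist, start=0, mu=2, max_steps=10_000):
    """Deterministic partially self-avoiding ('tourist') walk: from the
    current point, move to the nearest point not visited in the last mu
    steps; stop when the dynamics enters a cycle (transient + period)."""
    n = dist.shape[0]
    trajectory = [start]
    seen = {}                                   # dynamical state -> first time seen
    for step in range(max_steps):
        current = trajectory[-1]
        window = trajectory[-mu:] if mu > 0 else []
        state = (current, tuple(window))
        if state in seen:                        # deterministic dynamics => a repeat means a cycle
            transient = seen[state]
            return trajectory, transient, step - transient
        seen[state] = step
        forbidden = set(window) | {current}
        nearest = min((j for j in range(n) if j not in forbidden),
                      key=lambda j: dist[current, j])
        trajectory.append(nearest)
    return trajectory, None, None

# random link model: i.i.d. symmetric uniform distances, no correlations
rng = np.random.default_rng(42)
n = 200
d = np.triu(rng.random((n, n)), 1)
d = d + d.T
np.fill_diagonal(d, np.inf)

walk, transient, period = tourist_walk(d, mu=2)
print(len(set(walk)), "distinct points visited; sqrt(N) is about", round(n ** 0.5))
```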
Abstract:
In this paper we determine the local and global resilience of random graphs G(n,p) (p >> n^(-1)) with respect to the property of containing a cycle of length at least (1 - alpha)n. Roughly speaking, given alpha > 0, we determine the smallest r_g(G, alpha) with the property that almost surely every subgraph of G = G(n,p) having more than r_g(G, alpha)|E(G)| edges contains a cycle of length at least (1 - alpha)n (global resilience). We also obtain, for alpha < 1/2, the smallest r_l(G, alpha) such that any H subset of G having deg_H(v) larger than r_l(G, alpha) deg_G(v) for all v in V(G) contains a cycle of length at least (1 - alpha)n (local resilience). The results above are in fact proved in the more general setting of pseudorandom graphs.
Abstract:
Efficient automatic protein classification is of central importance in genomic annotation. As an independent way to check the reliability of the classification, we propose a statistical approach to test if two sets of protein domain sequences coming from two families of the Pfam database are significantly different. We model protein sequences as realizations of Variable Length Markov Chains (VLMC) and we use the context trees as a signature of each protein family. Our approach is based on a Kolmogorov-Smirnov-type goodness-of-fit test proposed by Balding et al. [Limit theorems for sequences of random trees (2008), DOI: 10.1007/s11749-008-0092-z]. The test statistic is a supremum, over the space of trees, of a function of the two samples; its computation grows, in principle, exponentially fast with the maximal number of nodes of the potential trees. We show how to transform this problem into a max-flow problem over a related graph, which can be solved using a Ford-Fulkerson algorithm in time polynomial in that number. We apply the test to 10 randomly chosen protein domain families from the seed of the Pfam-A database (high-quality, manually curated families). The test shows that the distributions of context trees coming from different families are significantly different. We emphasize that this is a novel mathematical approach to validate the automatic clustering of sequences in any context. We also study the performance of the test via simulations on Galton-Watson related processes.
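The paper's reduction of the supremum statistic to a flow problem is not reproduced here; the sketch below only illustrates the generic max-flow building block (the Edmonds-Karp, BFS-based variant of Ford-Fulkerson) that such a reduction relies on, on a hypothetical toy network.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp (BFS-based Ford-Fulkerson) maximum flow.
    `capacity` is a dict of dicts: capacity[u][v] = edge capacity."""
    residual = {u: dict(cs) for u, cs in capacity.items()}
    for u, cs in capacity.items():               # add reverse edges with capacity 0
        for v in cs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual.get(u, {}).items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # find the bottleneck and push flow along the path
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck

# toy network: s -> {a, b} -> t
caps = {"s": {"a": 3, "b": 2}, "a": {"t": 2}, "b": {"t": 2}}
print(max_flow(caps, "s", "t"))  # 4
```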
Abstract:
We consider the problem of interaction neighborhood estimation from the partial observation of a finite number of realizations of a random field. We introduce a model selection rule to choose estimators of conditional probabilities among natural candidates. Our main result is an oracle inequality satisfied by the resulting estimator. We then use this selection rule in a two-step procedure to evaluate the interacting neighborhoods: the selection rule chooses a small prior set of possible interacting points, and a cutting step then removes the irrelevant points from this prior set. We also prove that Ising models satisfy the assumptions of the main theorems, without restrictions on the temperature, on the structure of the interaction graph, or on the range of the interactions. This therefore provides a large class of applications for our results. We give a computationally efficient procedure for these models. We finally show the practical efficiency of our approach in a simulation study.
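As a toy illustration of the kind of penalized selection involved (not the paper's selection rule or its oracle bounds), the sketch below scores candidate interaction neighborhoods of a binary random field by an empirical conditional log-likelihood minus a BIC-like complexity penalty; all names and the synthetic data are hypothetical.

```python
import numpy as np

def conditional_loglik(samples, site, neighborhood):
    """Empirical log-likelihood of the value observed at `site` given the
    configuration observed on the candidate `neighborhood`.
    `samples`: array of shape (n_samples, n_sites) with entries in {-1, +1}."""
    configs = samples[:, list(neighborhood)]
    loglik = 0.0
    for cfg in {tuple(row) for row in configs}:
        block = samples[np.all(configs == cfg, axis=1), site]
        for value in (-1, +1):
            count = np.sum(block == value)
            if count:
                loglik += count * np.log(count / block.size)
    return loglik

def penalized_score(samples, site, neighborhood, penalty=0.5):
    """Likelihood reward minus a BIC-like cost per candidate neighbor
    (a crude stand-in for a model selection rule)."""
    n = samples.shape[0]
    return conditional_loglik(samples, site, neighborhood) - penalty * len(neighborhood) * np.log(n)

# synthetic field: site 0 copies site 1 with probability 0.9, sites 2-4 are noise
rng = np.random.default_rng(0)
x = rng.choice([-1, 1], size=(500, 5))
x[:, 0] = np.where(rng.random(500) < 0.9, x[:, 1], -x[:, 1])
for cand in ([], [1], [2], [1, 2]):
    print(cand, round(penalized_score(x, 0, cand), 1))
# the true interacting neighborhood [1] should score best
```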
Abstract:
Background: The in vitro culture of insulinomas provides an attractive tool to study cell proliferation and insulin synthesis and secretion. However, only a few human beta cell lines have been described, with long-term passage resulting in loss of insulin secretion. Therefore, we set out to establish and characterize human insulin-releasing cell lines. Results: We generated ex-vivo primary cultures from two independent human insulinomas and from a human nesidioblastosis, all of which were cultured up to passage number 20. All cell lines secreted human insulin and C-peptide. These cell lines expressed neuroendocrine and islets markers, confirming the expression profile found in the biopsies. Although all beta cell lineages survived an anchorage independent culture, none of them were able to invade an extracellular matrix substrate. Conclusion: We have established three human insulin-releasing cell lines which maintain antigenic characteristics and insulin secretion profiles of the original tumors. These cell lines represent valuable tools for the study of molecular events underlying beta cell function and dysfunction.
Abstract:
The adaptive process in motor learning was examined in terms of the effects of varying amounts of constant practice performed before random practice. Participants pressed five response keys sequentially, the last one coincident with the lighting of a final visual stimulus provided by a complex coincident timing apparatus. Different visual stimulus speeds were used during the random practice. 33 children (M age=11.6 yr.) were randomly assigned to one of three experimental groups: constant-random, constant-random 33%, and constant-random 66%. The constant-random group practiced constantly until reaching a performance stabilization criterion of three consecutive trials within 50 msec. of error. The other two groups had additional constant practice of 33% and 66%, respectively, of the number of trials needed to achieve the stabilization criterion. All three groups performed 36 trials under random practice; in the adaptation phase, they practiced at a visual stimulus speed different from that adopted in the stabilization phase. Global performance measures were absolute, constant, and variable errors, and movement pattern was analyzed by relative timing and overall movement time. There was no group difference in relation to global performance measures and overall movement time. However, differences between the groups were observed in movement pattern, since the constant-random 66% group changed its relative timing performance in the adaptation phase.
Abstract:
A Raman scattering study on multiple phase generation in silicon submitted to successive Vickers microindentation cycles, in different crystallographic orientations, was performed. The microindentations were performed on a virgin single-crystal (100)-oriented surface, in the [001] and [011] directions. The results indicated that the formation of multiple phases by cyclic microindentation may depend on the crystallographic direction and on the number of successive cycles: the onset of several different structural phases was detected after the third cycle for the [001] direction and only after 15 cycles for the [011] direction, indicating that there is a crystallographic orientation dependence for multiple phase generation. (C) 2007 Elsevier B.V. All rights reserved.
Abstract:
This communication proposes a simple way to introduce fibers into finite element modelling. This is a promising formulation to deal with fiber-reinforced composites by the finite element method (FEM), as it allows the consideration of short or long fibers placed arbitrarily inside a continuum domain (matrix). The most important feature of the formulation is that no additional degree of freedom is introduced into the pre-existing finite element numerical system to consider any distribution of fiber inclusions. In other words, the size of the system of equations used to solve a non-reinforced medium is the same as the one used to solve the reinforced counterpart. Another important characteristic is the reduced work required by the user to introduce fibers, avoiding `rebar` elements, node-by-node geometrical definitions or even complex mesh generation. An additional characteristic of the technique is the possibility of representing unbounded stresses at the end of fibers using a finite number of degrees of freedom. Further studies are required for non-linear applications in which localization may occur. Throughout the text the linear formulation is presented and the bounded connection between fibers and continuum is considered. Four examples are presented, including non-linear analysis, to validate and show the capabilities of the formulation. Copyright (c) 2007 John Wiley & Sons, Ltd.
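A minimal one-dimensional sketch of the no-extra-degree-of-freedom embedding idea is given below. This is an illustration of the general concept under simplifying assumptions (one linear bar element, one embedded fiber segment), not the paper's formulation: the fiber end displacements are interpolated by the continuum element's shape functions, so the fiber stiffness is condensed onto the existing nodal unknowns and the system size does not change.

```python
import numpy as np

def continuum_stiffness(E, A, L):
    """Stiffness of a single 2-node linear bar element of length L."""
    return (E * A / L) * np.array([[1.0, -1.0], [-1.0, 1.0]])

def embed_fiber(K, E_f, A_f, x_a, x_b, L):
    """Add the stiffness of a fiber segment lying between x_a and x_b inside
    a single 2-node continuum element of length L, without adding degrees of
    freedom: the fiber end displacements are written in terms of the nodal
    displacements via the shape functions N(x) = [1 - x/L, x/L]."""
    T = np.array([[1 - x_a / L, x_a / L],
                  [1 - x_b / L, x_b / L]])          # u_fiber = T @ u_nodes
    k_f = continuum_stiffness(E_f, A_f, abs(x_b - x_a))
    return K + T.T @ k_f @ T                         # same 2x2 system as before

# one continuum element (matrix) of length 1.0 with an embedded stiffer fiber
K = continuum_stiffness(E=1.0, A=1.0, L=1.0)
K = embed_fiber(K, E_f=10.0, A_f=0.1, x_a=0.2, x_b=0.8, L=1.0)

# clamp node 0, apply a unit force at node 1, and solve the unchanged-size system
u1 = 1.0 / K[1, 1]
print(u1)  # smaller than 1.0: the embedded fiber stiffens the response
```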
Abstract:
This paper addresses the time-variant reliability analysis of structures with random resistance or random system parameters. It deals with the problem of a random load process crossing a random barrier level. The implications of approximating the arrival rate of the first overload by an ensemble-crossing rate are studied. The error involved in this so-called "ensemble-crossing rate" approximation is described in terms of load process and barrier distribution parameters, and in terms of the number of load cycles. Existing results are reviewed, and significant improvements involving load process bandwidth, mean-crossing frequency and time are presented. The paper shows that the ensemble-crossing rate approximation can be accurate enough for problems where load process variance is large in comparison to barrier variance, but especially when the number of load cycles is small. This includes important practical applications like random vibration due to impact loadings and earthquake loading. Two application examples are presented, one involving earthquake loading and one involving a frame structure subject to wind and snow loadings. (C) 2007 Elsevier Ltd. All rights reserved.
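The distinction between conditioning on the random barrier and the ensemble-crossing approximation can be illustrated with a toy Monte Carlo sketch for i.i.d. Gaussian load peaks and a Gaussian barrier; the numbers below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n_cycles = 50              # number of load cycles
n_mc = 100_000             # Monte Carlo samples

# random barrier (resistance) and i.i.d. load peaks per cycle
R = rng.normal(5.0, 1.0, n_mc)                  # barrier ~ N(5, 1)
S = rng.normal(3.0, 1.5, (n_mc, n_cycles))      # loads   ~ N(3, 1.5)

# conditional estimate: the barrier stays fixed along each load history
p_conditional = np.mean(np.any(S > R[:, None], axis=1))

# ensemble-crossing approximation: barrier effectively re-randomised each cycle
p_cycle = np.mean(S > rng.normal(5.0, 1.0, (n_mc, n_cycles)))
p_ensemble = 1.0 - (1.0 - p_cycle) ** n_cycles

print(f"conditional P_f ~ {p_conditional:.4f}")
print(f"ensemble    P_f ~ {p_ensemble:.4f}  (error grows with barrier variance and cycle count)")
```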
Abstract:
Gamma ray tomography experiments have been carried out to detect spatial patterns in the porosity in a 0.27 m diameter column packed with steel Raschig rings of different sizes (12.6, 37.9, and 76 mm), using a first-generation CT system (Chen et al., 1998). A fast Fourier transform tomographic reconstruction algorithm has been used to calculate the spatial variation over the column cross section. Cross-sectional gas porosity and solid holdup distributions were determined. The values of cross-sectional average gas porosity were epsilon=0.849, 0.938 and 0.966 for the 12.6, 37.9, and 76 mm rings, respectively. Radial holdup variation within the packed bed has been determined. The variation of the circumferentially averaged gas holdup in the radial direction indicates that the porosity in the column wall region is somewhat higher than that in the bulk region, due to the effect of the column wall. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
Modern Integrated Circuit (IC) design is characterized by a strong trend of Intellectual Property (IP) core integration into complex system-on-chip (SOC) architectures. These cores require thorough verification of their functionality to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug; however, due to state explosion, their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as the functional coverage definition, must be carefully planned to guarantee its efficacy. In general, static input space optimization methodologies have shown better efficiency and results than, for instance, Coverage Directed Verification (CDV) techniques, although they act on different facets of the monitored system and are not exclusive. This work presents a constrained-random simulation-based functional verification methodology where, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test case scenarios are removed from the input space. For this purpose, a tool to automatically generate PD-based stimuli sources was developed. Additionally, we developed a second tool to generate functional coverage models that fit exactly to the PD-based input space. Together, the input stimulus and coverage model enhancements resulted in a notable increase in testbench efficiency compared to testbenches with traditional stimulation and coverage scenarios: 22% simulation time reduction when generating stimuli with our PD-based stimuli sources (still with a conventional coverage model), and 56% simulation time reduction when combining our stimuli sources with their corresponding, automatically generated, coverage models.
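The PD-based tools described above are not reproduced here; the generic sketch below only illustrates the underlying constrained-random idea: invalid and irrelevant parameter combinations are pruned from the input space before sampling, and the coverage bins are derived from exactly the same pruned space. All parameter names and constraints are hypothetical.

```python
import itertools
import random

# parameter domains for a toy bus-transaction stimulus
domains = {
    "burst_len": [1, 4, 8, 16],
    "mode":      ["read", "write"],
    "align":     [0, 2, 4],
}

def is_valid(tx):
    """Constraints that prune invalid / irrelevant scenarios up front,
    so the generator never spends simulation time on them."""
    if tx["mode"] == "read" and tx["align"] == 2:
        return False                  # e.g. unaligned reads unsupported
    if tx["burst_len"] == 16 and tx["align"] != 0:
        return False                  # e.g. long bursts must be aligned
    return True

# the pruned input space doubles as the functional coverage model:
# every remaining combination is a coverage bin to be hit
input_space = []
for combo in itertools.product(*domains.values()):
    tx = dict(zip(domains, combo))
    if is_valid(tx):
        input_space.append(tx)
coverage = {tuple(sorted(tx.items())): 0 for tx in input_space}

rng = random.Random(7)
for _ in range(100):                  # constrained-random stimulus loop
    tx = rng.choice(input_space)
    coverage[tuple(sorted(tx.items()))] += 1

hit = sum(1 for c in coverage.values() if c)
print(f"{hit}/{len(coverage)} coverage bins hit")
```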