985 results for Discrete Variables
Abstract:
Discrete stochastic simulations are a powerful tool for understanding the dynamics of chemical kinetics when there are small-to-moderate numbers of certain molecular species. In this paper we introduce delays into the stochastic simulation algorithm, thus mimicking the delays associated with transcription and translation. We then show that this process may explain the observed sustained oscillations in expression levels of hes1 mRNA and Hes1 protein more faithfully than continuous deterministic models.
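The delayed-SSA idea described above can be sketched in a few lines. This is a minimal illustration on a hypothetical one-species birth-death model, not the paper's hes1 model: a production event fires at a constant rate but its product appears only after a fixed delay, while degradation completes immediately. All rates and the delay are made-up values.

```python
import random

def delay_ssa(k_prod=1.0, k_deg=0.1, delay=5.0, t_end=50.0, seed=1):
    """Stochastic simulation with a fixed production delay (hypothetical
    parameters): production fires at rate k_prod but its product only
    appears `delay` time units later; degradation fires at rate
    k_deg * n and completes immediately."""
    rng = random.Random(seed)
    t, n = 0.0, 0
    pending = []                          # completion times of delayed productions
    while t < t_end:
        a = k_prod + k_deg * n            # total propensity (never zero since k_prod > 0)
        t_next = t + rng.expovariate(a)
        # a previously scheduled product may appear before the next reaction
        due = [tc for tc in pending if tc <= t_next and tc < t_end]
        if due:
            t = min(due)                  # jump to the earliest completion
            pending.remove(t)
            n += 1
            continue
        t = t_next
        if t >= t_end:
            break
        if rng.random() < k_prod / a:
            pending.append(t + delay)     # schedule the delayed appearance
        else:
            n -= 1                        # instantaneous degradation
    return n
```

For this linear system the long-run mean is k_prod / k_deg regardless of the delay, which gives a quick sanity check on the sketch.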
Abstract:
Biochemical reactions underlying genetic regulation are often modelled as a continuous-time, discrete-state Markov process, and the evolution of the associated probability density is described by the so-called chemical master equation (CME). However, the CME is typically difficult to solve, since the state-space involved can be very large or even countably infinite. Recently, a finite state projection (FSP) method that truncates the state-space was suggested and shown to be effective on an example model of the Pap-pili epigenetic switch. However, in this example both the model and the final time at which the solution was computed were relatively small. Presented here is a Krylov FSP algorithm based on a combination of state-space truncation and inexact matrix-vector product routines. This allows larger-scale models to be studied and solutions for larger final times to be computed in a realistic execution time. Additionally, the new method computes the solution at intermediate times at virtually no extra cost, since it is derived from Krylov-type methods for computing matrix exponentials. For the purpose of comparison, the new algorithm is applied to the model of the Pap-pili epigenetic switch on which the original FSP was first demonstrated. The method is also applied to a more sophisticated model of regulated transcription. Numerical results indicate that the new approach is significantly faster and extendable to larger biological models.
Abstract:
Biologists are increasingly conscious of the critical role that noise plays in cellular functions such as genetic regulation, often in connection with fluctuations in small numbers of key regulatory molecules. This has inspired the development of models that capture the fundamentally discrete and stochastic nature of cellular biology, most notably the Gillespie stochastic simulation algorithm (SSA). The SSA simulates a temporally homogeneous, discrete-state, continuous-time Markov process, and of course the corresponding probabilities and numbers of each molecular species must all remain positive. While accurately serving this purpose, the SSA can be computationally inefficient due to very small time stepping, so faster approximations such as the Poisson and binomial τ-leap methods have been suggested. This work places these leap methods in the context of numerical methods for the solution of stochastic differential equations (SDEs) driven by Poisson noise. This allows analogues of Euler-Maruyama, Milstein and even higher-order methods to be developed through Itô-Taylor expansions, as well as similar derivative-free Runge-Kutta approaches. Numerical results demonstrate that these novel methods compare favourably with existing techniques for simulating biochemical reactions, capturing crucial properties such as the mean and variance more accurately.
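The Poisson τ-leap mentioned above trades the SSA's one-reaction-at-a-time exactness for speed: over each leap of length τ, the number of firings of each channel is drawn as a Poisson variate with mean propensity × τ. A minimal sketch on a hypothetical birth-death system (all rates are assumptions, and the non-negativity clamp is the crudest possible fix for the positivity issue the abstract notes):

```python
import math, random

def _poisson(rng, lam):
    # Knuth's product-of-uniforms Poisson sampler; fine for the small means here
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def tau_leap(k_prod=1.0, k_deg=1.0, tau=0.01, t_end=1.0, seed=0):
    """Poisson tau-leap sketch for a birth-death system: each channel's
    firing count over a leap is Poisson(propensity * tau), instead of
    simulating every individual reaction as the exact SSA does."""
    rng = random.Random(seed)
    n, t = 0, 0.0
    while t < t_end:
        births = _poisson(rng, k_prod * tau)
        deaths = _poisson(rng, k_deg * n * tau)
        n = max(0, n + births - deaths)   # clamp so molecule counts stay non-negative
        t += tau
    return n
```

With these rates the exact mean at time t is (k_prod/k_deg)(1 - e^{-t}), so averaging many replicates checks the leap's accuracy.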
Abstract:
In this paper we present a sequential Monte Carlo algorithm for Bayesian sequential experimental design applied to generalised non-linear models for discrete data. The approach is computationally convenient in that the information of newly observed data can be incorporated through a simple re-weighting step. We also consider a flexible parametric model for the stimulus-response relationship, together with a newly developed hybrid design utility that can produce more robust estimates of the target stimulus in the presence of substantial model and parameter uncertainty. The algorithm is applied to hypothetical clinical trial or bioassay scenarios. In the discussion, potential generalisations of the algorithm are suggested that could extend its applicability to a wide variety of scenarios.
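The "simple re-weighting step" is the heart of such sequential Monte Carlo schemes. A generic sketch (the function names and setup are ours, not the paper's): when a new observation arrives, each particle's weight is multiplied by the likelihood of that observation under the particle's parameter value, then the weights are renormalised.

```python
import math

def smc_reweight(particles, weights, loglike):
    """One SMC reweighting step: new weight ∝ old weight × likelihood of
    the new datum under each particle; log-space keeps it stable."""
    logw = [math.log(w) + loglike(p) for p, w in zip(particles, weights)]
    mx = max(logw)                          # subtract max before exponentiating
    w = [math.exp(l - mx) for l in logw]
    s = sum(w)
    return [x / s for x in w]
```

For example, two equally weighted particles where the second explains the new datum three times better end up with weights 0.25 and 0.75.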
Abstract:
Inspection of solder joints has been a critical process in the electronic manufacturing industry to reduce manufacturing cost, improve yield, and ensure product quality and reliability. This paper proposes the use of a Log-Gabor filter bank, the Discrete Wavelet Transform and the Discrete Cosine Transform for feature extraction from solder joint images on Printed Circuit Boards (PCBs). A distance measure based on the Mahalanobis Cosine metric is also presented for classification of five different types of solder joints. The experimental results show that this methodology achieved high accuracy and well-generalised performance, and can be an effective method to reduce cost and improve quality in PCB production.
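One common reading of a "Mahalanobis Cosine" metric (our hedged interpretation, not necessarily the paper's exact construction) is cosine distance computed in a whitened feature space: map both feature vectors through a whitening matrix, e.g. the inverse square root of the training covariance, then take one minus their cosine similarity.

```python
def mahalanobis_cosine_distance(x, y, whiten):
    """Cosine distance after Mahalanobis-style whitening: `whiten` is a
    matrix (e.g. inverse square root of the feature covariance) applied
    to both vectors before the cosine similarity is taken."""
    def matvec(M, v):
        return [sum(m * vi for m, vi in zip(row, v)) for row in M]
    u, w = matvec(whiten, x), matvec(whiten, y)
    dot = sum(a * b for a, b in zip(u, w))
    norm = lambda v: sum(a * a for a in v) ** 0.5
    return 1.0 - dot / (norm(u) * norm(w))   # 0 = same direction, 1 = orthogonal
```

With identity whitening this reduces to the ordinary cosine distance, which gives a quick check of the sketch.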
Abstract:
Local climate is a critical element in the design of buildings. In this paper, ten years of historical weather data for all eight of Australia's capital cities are analyzed to characterize the variation profiles of climatic variables, using the method of descriptive statistics. The pattern of cumulative distribution and/or the profile of percentage distribution is used to graphically illustrate the similarities and differences between the study locations. It is found that although the weather variables vary between locations, outside the extreme parts there is often a good, nearly linear relation between a weather variable and its cumulative percentage over the majority of the middle part. The implications of these extreme parts, and of the slopes of the middle parts, for building design are also discussed.
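The "slope of the middle part" can be estimated directly from sorted data. A small sketch (the 10%/90% cut-offs are our assumption, not the paper's): sort the weather variable, discard the extreme tails, and take the ratio of the change in cumulative fraction to the change in the variable's value across the remaining middle portion.

```python
def cumulative_profile_slope(values, lo=0.1, hi=0.9):
    """Slope of the cumulative-percentage curve over the middle portion
    of a weather variable: tails below `lo` and above `hi` (fractions)
    are excluded; assumes the middle values are not all identical."""
    xs = sorted(values)
    n = len(xs)
    i0, i1 = int(lo * n), int(hi * n) - 1
    # change in cumulative fraction divided by change in variable value
    return (hi - lo) / (xs[i1] - xs[i0])
```

For uniformly spread data the estimate recovers the uniform density, matching the near-linear middle section the abstract describes.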
Abstract:
Advances in safety research (efforts to improve the collective understanding of motor vehicle crash causes and contributing factors) rest upon the pursuit of numerous lines of research inquiry. The research community has focused considerable attention on analytical methods development (negative binomial models, simultaneous equations, etc.), on better experimental designs (before-after studies, comparison sites, etc.), on improving exposure measures, and on model specification improvements (additive terms, non-linear relations, etc.). One might logically seek to know which lines of inquiry might provide the most significant improvements in understanding crash causation and/or prediction. It is the contention of this paper that the exclusion of important variables (causal or surrogate measures of causal variables) causes omitted variable bias in model estimation, and that this is an important and neglected line of inquiry in safety research. In particular, spatially related variables are often difficult to collect and are omitted from crash models, yet they offer significant opportunities to better understand contributing factors and/or causes of crashes. This study examines the role of important variables (other than Average Annual Daily Traffic (AADT)) that are generally omitted from intersection crash prediction models. In addition to the geometric and traffic regulatory information of the intersection, the proposed model includes many spatial factors, such as local influences of weather, sun glare, proximity to drinking establishments, and proximity to schools, representing a mix of potential environmental and human factors that are theoretically important but rarely used. Results suggest that these variables, in addition to AADT, have significant explanatory power, and that their exclusion leads to omitted variable bias. Evidence is provided that variable exclusion overstates the effect of minor road AADT by as much as 40% and that of major road AADT by 14%.
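Omitted variable bias is easy to demonstrate on synthetic data. The coefficients below are hypothetical, not the paper's: y depends on x1 and on a correlated variable x2; regressing y on x1 alone inflates the apparent effect of x1, analogous to the overstated AADT effects the abstract reports.

```python
import random

def omitted_variable_bias_demo(n=10000, seed=1):
    """Synthetic illustration of omitted variable bias: the one-variable
    OLS slope of y on x1 absorbs the effect of the omitted, correlated
    x2 and drifts toward 1.0 + 2.0*0.8 = 2.6 instead of the true 1.0."""
    rng = random.Random(seed)
    x1 = [rng.gauss(0, 1) for _ in range(n)]
    x2 = [0.8 * a + rng.gauss(0, 0.6) for a in x1]            # omitted, correlated with x1
    y = [1.0 * a + 2.0 * b + rng.gauss(0, 1) for a, b in zip(x1, x2)]
    # one-variable OLS slope: cov(x1, y) / var(x1)
    mx, my = sum(x1) / n, sum(y) / n
    cov = sum((a - mx) * (c - my) for a, c in zip(x1, y)) / n
    var = sum((a - mx) ** 2 for a in x1) / n
    return cov / var
```

The estimated slope lands near 2.6 rather than the true 1.0: the entire gap is bias from leaving x2 out of the model.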
Abstract:
In this paper, the method of separating variables is effectively implemented for solving a time-fractional telegraph equation (TFTE) in two and three dimensions. We discuss and derive the analytical solution of the TFTE in two and three dimensions with nonhomogeneous Dirichlet boundary conditions. The method can be extended to other kinds of boundary conditions.
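A common form of the TFTE (our hedged reconstruction; the paper's exact exponents and constants may differ) replaces the integer time derivatives of the classical telegraph equation with Caputo fractional derivatives:

```latex
\frac{\partial^{2\alpha} u}{\partial t^{2\alpha}}
  + 2a\,\frac{\partial^{\alpha} u}{\partial t^{\alpha}}
  = c^{2}\,\nabla^{2} u,
  \qquad \tfrac{1}{2} < \alpha \le 1 .
```

Separating variables with $u(\mathbf{x},t) = X(\mathbf{x})\,T(t)$ then splits the problem into a spatial eigenvalue problem $\nabla^{2}X + \lambda X = 0$ under the Dirichlet data, and a fractional ordinary differential equation $D_t^{2\alpha}T + 2a\,D_t^{\alpha}T + c^{2}\lambda\,T = 0$, whose solutions are expressed through Mittag-Leffler-type functions.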
Abstract:
This paper establishes practical stability results for an important range of approximate discrete-time filtering problems involving mismatch between the true system and the approximating filter model. Practical stability is established in the sense of an asymptotic bound on the amount of bias introduced by the model approximation. Our analysis applies to a wide range of estimation problems and justifies the common practice of approximating intractable infinite dimensional nonlinear filters by simpler computationally tractable filters.
Abstract:
Objectives: This study examines the hypothesis that a past history of heart interventions will moderate the relationship between psychosocial factors (stressful life events, social support, perceived stress, having a current partner, having a past diagnosis of depression or anxiety over the past 3 years, time pressure, education level, and the mental health index) and the presence of chest pain in a sample of older women. Design: Longitudinal survey over a 3-year period. Methods: The sample was taken from a prospective cohort study of 10,432 women initially aged between 70 and 75 years, who were surveyed in 1996 and then again in 1999. Two groups of women were identified: those reporting to have heart disease but no past history of heart interventions (i.e., coronary artery bypass graft/angioplasty) and those reporting to have heart disease with a past history of heart interventions. Results: Binary logistic regression analysis was used to show that for the women with self-reported coronary heart disease but without a past history of heart intervention, feelings of time pressure as well as the number of stressful life events experienced in the 12 months prior to 1996 were independent risk factors for the presence of chest pain, even after accounting for a range of traditional risk factors. In comparison, for the women with self-reported coronary heart disease who did report a past history of heart interventions, a diagnosis of depression in the previous 3 years was the significant independent risk factor for chest pain even after accounting for traditional risk factors. Conclusion: The results indicate that it is important to consider a history of heart interventions as a moderator of the associations between psychosocial variables and the frequency of chest pain in older women. Statement of Contribution: What is already known on this subject? 
Psychological factors have been shown to be independent predictors of a range of health outcomes in individuals with coronary heart disease, including the presence of chest pain. Most research has been conducted with men or with small samples of women; however, the evidence does suggest that these relationships exist in women as well as in men. What does this study add? Most studies have looked at overall relationships between psychological variables and health outcomes. The few studies that have looked at moderators have mainly examined gender as a moderator. To our knowledge, this is the first published study to examine a history of heart interventions as a moderator of the relationship between psychological variables and the presence of chest pain.
Abstract:
Evaluating the validity of formative variables has presented ongoing challenges for researchers. In this paper we use global criterion measures to compare and critically evaluate two alternative formative measures of System Quality. One model is based on the ISO-9126 software quality standard, and the other is based on a leading information systems research model. We find that despite both models having a strong provenance, many of the items appear to be non-significant in our study. We examine the implications of this by evaluating the quality of the criterion variables we used, and the performance of PLS when evaluating formative models with a large number of items. We find that our respondents had difficulty distinguishing between global criterion variables measuring different aspects of overall System Quality. Also, because formative indicators “compete with one another” in PLS, it may be difficult to develop a set of measures which are all significant for a complex formative construct with a broad scope and a large number of items. Overall, we suggest that there is cautious evidence that both sets of measures are valid and largely equivalent, although questions still remain about the measures, the use of criterion variables, and the use of PLS for this type of model evaluation.
Abstract:
In this paper we investigate the distribution of the product of Rayleigh distributed random variables. Starting from the Mellin-Barnes inversion formula and using the saddle point approach, we obtain an upper bound for the product distribution. The accuracy of this tail approximation increases as the number of random variables in the product increases.
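A Monte Carlo companion to such an analytical tail bound is straightforward to sketch (all parameters below are hypothetical): estimate P(R₁ · … · Rₙ > x) for independent Rayleigh(σ) variables, each sampled by the inverse transform R = σ√(−2 ln U).

```python
import math, random

def rayleigh_product_tail(n_vars=4, x=10.0, sigma=1.0, trials=20000, seed=0):
    """Monte Carlo estimate of the tail probability of a product of
    independent Rayleigh(sigma) random variables."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        prod = 1.0
        for _ in range(n_vars):
            u = 1.0 - rng.random()        # in (0, 1], keeps the log finite
            prod *= sigma * math.sqrt(-2.0 * math.log(u))
        if prod > x:
            hits += 1
    return hits / trials
```

Estimates like these are the natural benchmark against which a saddle-point upper bound can be checked numerically, e.g. by confirming the bound sits above the simulated tail for a range of x.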