257 results for Stochastic Matrix


Relevance:

20.00%

Publisher:

Abstract:

Background: Biochemical systems with relatively low numbers of components must be simulated stochastically in order to capture their inherent noise. Although there has recently been considerable work on discrete stochastic solvers, there is still a need for numerical methods that are both fast and accurate. The Bulirsch-Stoer method is an established method for solving ordinary differential equations that possesses both of these qualities.

Results: In this paper, we present the Stochastic Bulirsch-Stoer method, a new numerical method for simulating discrete chemical reaction systems, inspired by its deterministic counterpart. It achieves excellent efficiency because it is based on an approach of high deterministic order, allowing larger stepsizes and therefore fast simulations. We compare it with the Euler τ-leap and two more recent τ-leap methods on a number of example problems, and find that, as well as being very accurate, our method is the most robust, in terms of efficiency, of all the methods considered. It is best suited to problems with larger populations that would be too slow to simulate using Gillespie's stochastic simulation algorithm; for such problems, it is likely to achieve higher weak order in the moments.

Conclusions: The Stochastic Bulirsch-Stoer method is a novel stochastic solver that can be used for fast and accurate simulations. Crucially, compared with other similar methods, it better retains its high accuracy as the timesteps are increased. The Stochastic Bulirsch-Stoer method is thus both computationally efficient and robust. These are key properties for any stochastic numerical method, since a typical study must run many thousands of simulations.
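The abstract does not spell out the Stochastic Bulirsch-Stoer scheme itself, but the Euler τ-leap it is benchmarked against is standard. Below is a minimal sketch of that baseline, applied to a simple birth-death system X → X+1 (rate k1) and X → X−1 (rate k2·X); the reaction system and rate constants are illustrative assumptions, not taken from the paper.

```python
# Euler tau-leap for a birth-death process: fire Poisson-distributed
# numbers of each reaction over fixed steps of length tau.
import numpy as np

def euler_tau_leap(x0, k1, k2, tau, t_end, rng):
    """Advance a birth-death system with fixed-step Poisson leaps."""
    x, t, path = x0, 0.0, [(0.0, x0)]
    while t < t_end:
        birth_rate = k1            # propensity of X -> X+1
        death_rate = k2 * x        # propensity of X -> X-1
        births = rng.poisson(birth_rate * tau)
        deaths = rng.poisson(death_rate * tau)
        x = max(x + births - deaths, 0)   # clamp to avoid negative populations
        t += tau
        path.append((t, x))
    return path

rng = np.random.default_rng(0)
trajectory = euler_tau_leap(x0=50, k1=10.0, k2=0.1, tau=0.05, t_end=10.0, rng=rng)
```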

Relevance:

20.00%

Publisher:

Abstract:

Since we still know very little about stem cells in their natural environment, it is useful to explore their dynamics through modelling and simulation, as well as experimentally. Most models of stem cell systems are based on deterministic differential equations that ignore the natural heterogeneity of stem cell populations. This is not appropriate at the level of individual cells and niches, where randomness is more likely to affect dynamics. In this paper, we introduce a fast stochastic method for simulating, over time, a metapopulation of stem cell niche lineages, that is, many sub-populations that together form a heterogeneous metapopulation. By selecting a common limiting timestep, our method ensures that the entire metapopulation is simulated synchronously. This is important, as it allows us to introduce interactions between separate niche lineages, which would otherwise be impossible. We extend our method to enable the coupling of many lineages into niche groups, where differentiated cells are pooled within each niche group. Using this method, we explore the dynamics of the haematopoietic system from a demand control system perspective. We find that coupling niche lineages together allows the organism to regulate blood cell numbers as closely as possible to the homeostatic optimum. Furthermore, coupled lineages respond better than uncoupled ones to random perturbations, here the loss of some myeloid cells. This could imply that it is advantageous for an organism to connect its niche lineages into groups. Our results suggest that a potentially fruitful empirical direction will be to understand how stem cell descendants communicate with the niche, and how cancer may arise as a result of a failure of such communication.
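A minimal sketch of the "common limiting timestep" idea as described above: each lineage proposes a safe stepsize, the minimum over all lineages is adopted, and every lineage is advanced by that shared step so cross-lineage interactions can be evaluated at common time points. The single-reaction lineage model (cell division at rate r per cell) and the stepsize heuristic are illustrative assumptions, not the paper's model.

```python
import numpy as np

def propose_tau(n_cells, r, eps=0.03):
    """Crude stepsize control: bound the expected relative change by eps."""
    propensity = r * n_cells
    return eps * max(n_cells, 1) / max(propensity, 1e-12)

def step_metapopulation(pops, rates, rng):
    """Advance every lineage synchronously by the common (minimum) timestep."""
    tau = min(propose_tau(n, r) for n, r in zip(pops, rates))
    new_pops = [n + rng.poisson(r * n * tau) for n, r in zip(pops, rates)]
    return new_pops, tau

rng = np.random.default_rng(1)
pops, rates, t = [20, 35, 50], [0.1, 0.08, 0.12], 0.0
while t < 5.0:
    pops, tau = step_metapopulation(pops, rates, rng)
    t += tau
```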

Relevance:

20.00%

Publisher:

Abstract:

Background: Matrix metalloproteinase-2 (MMP-2) is an endopeptidase that facilitates extracellular matrix remodeling and molecular regulation, and is implicated in tumor metastasis. Type I collagen (Col I) regulates the activation of MMP-2 through both transcriptional and post-transcriptional means; however, gaps remain in our understanding of the involvement of collagen-binding β1 integrins in collagen-stimulated MMP-2 activation.

Methods: Three β1 integrin siRNAs were used to elucidate the involvement of β1 integrins in the Col I-induced MMP-2 activation mechanism. β1 integrin knockdown was analyzed by quantitative RT-PCR, Western blot and FACS analysis. Adhesion assays and collagen gel contraction were used to test the biological effects of β1 integrin abrogation. MMP-2 activation levels were monitored by gelatin zymography.

Results: All three β1 integrin siRNAs were efficient at β1 integrin knockdown, and FACS analysis revealed commensurate reductions of integrins α2 and α3, which are heterodimeric partners of β1, but not of αV, which is not. All three β1 integrin siRNAs inhibited adhesion and collagen gel contraction; however, only the siRNA showing the greatest magnitude of β1 knockdown inhibited Col I-induced MMP-2 activation and reduced the accompanying upregulation of MT1-MMP, suggesting a dose-response threshold effect. Re-transfection with codon-swapped β1 integrin overcame the reduction in Col I-induced MMP-2 activation, confirming the β1 integrin target specificity. MMP-2 activation induced by TPA or concanavalin A (Con A) was not inhibited by β1 integrin siRNA knockdown.

Conclusion: Together, the data reveal that strong abrogation of β1 integrin is required to block MMP-2 activation induced by Col I, which may have implications for the therapeutic targeting of β1 integrin.

Relevance:

20.00%

Publisher:

Abstract:

In order to simulate stiff biochemical reaction systems, an explicit exponential Euler scheme is derived for multidimensional, non-commutative stochastic differential equations with a semilinear drift term. The scheme is of strong order one half and A-stable in mean square. The combination of this scheme with the projection method shows good performance in numerical experiments dealing with an alternative formulation of the chemical Langevin equation for a human ether-à-go-go-related gene (hERG) ion channel model.
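A minimal sketch of one common exponential Euler variant for a semilinear SDE dX = (AX + f(X)) dt + g(X) dW, shown in the scalar case for readability: the stiff linear part A is integrated exactly via exp(Ah), and the remainder is handled explicitly. The drift split, the test functions f and g, and the coefficients are illustrative assumptions; the paper treats the multidimensional non-commutative case.

```python
import numpy as np

def exponential_euler(x0, A, f, g, h, t_end, rng):
    """Explicit exponential Euler: exact propagation of the linear part."""
    x, t = x0, 0.0
    E = np.exp(A * h)                  # exact propagator of the linear term
    while t < t_end:
        dW = rng.normal(0.0, np.sqrt(h))
        x = E * (x + f(x) * h + g(x) * dW)
        t += h
    return x

rng = np.random.default_rng(2)
# Stiff linear decay A = -50, mild nonlinearity and noise (assumed for illustration)
x_final = exponential_euler(x0=1.0, A=-50.0, f=lambda x: np.sin(x),
                            g=lambda x: 0.1 * x, h=0.01, t_end=1.0, rng=rng)
```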

Relevance:

20.00%

Publisher:

Abstract:

We present an algorithm for multiarmed bandits that achieves almost optimal performance in both stochastic and adversarial regimes without prior knowledge about the nature of the environment. Our algorithm is based on augmenting the EXP3 algorithm with a new control lever in the form of exploration parameters that are tailored individually for each arm. The algorithm simultaneously applies the “old” control lever, the learning rate, to control the regret in the adversarial regime, and the new control lever to detect and exploit gaps between the arm losses. This secures problem-dependent “logarithmic” regret when gaps are present without compromising the worst-case performance guarantee in the adversarial regime. We show that the algorithm can exploit both the usual expected gaps between the arm losses in the stochastic regime and deterministic gaps between the arm losses in the adversarial regime. The algorithm retains the “logarithmic” regret guarantee in the stochastic regime even when some observations are contaminated by an adversary, as long as on average the contamination does not reduce the gap by more than a half. Our results for the stochastic regime are supported by experimental validation.
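For orientation, here is a minimal sketch of the standard EXP3 algorithm that the paper augments. The paper's per-arm exploration parameters would replace the single uniform exploration term gamma below; this plain version, with a toy stochastic environment, is an illustrative assumption rather than the authors' algorithm.

```python
import numpy as np

def exp3(n_arms, T, loss_fn, eta, gamma, rng):
    """Plain EXP3: exponential weights plus a uniform exploration mixture."""
    weights = np.ones(n_arms)
    for t in range(T):
        probs = (1 - gamma) * weights / weights.sum() + gamma / n_arms
        arm = rng.choice(n_arms, p=probs)
        loss = loss_fn(arm, t)                   # observed loss in [0, 1]
        est = np.zeros(n_arms)
        est[arm] = loss / probs[arm]             # importance-weighted loss estimate
        weights *= np.exp(-eta * est)
        weights /= weights.max()                 # rescale to avoid underflow
    return weights / weights.sum()

rng = np.random.default_rng(3)
means = np.array([0.3, 0.5, 0.6])                # arm 0 is best (assumed toy setup)
final_probs = exp3(3, 5000, lambda a, t: float(rng.random() < means[a]),
                   eta=0.05, gamma=0.01, rng=rng)
```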

Relevance:

20.00%

Publisher:

Abstract:

This article describes a maximum likelihood method for estimating the parameters of the standard square-root stochastic volatility model and a variant of the model that includes jumps in equity prices. The model is fitted to data on the S&P 500 Index and the prices of vanilla options written on the index, for the period 1990 to 2011. The method is able to estimate both the parameters of the physical measure (associated with the index) and the parameters of the risk-neutral measure (associated with the options), including the volatility and jump risk premia. The estimation is implemented using a particle filter whose efficacy is demonstrated under simulation. The computational load of this estimation method, which previously has been prohibitive, is managed by the effective use of parallel computing using graphics processing units (GPUs). The empirical results indicate that the parameters of the models are reliably estimated and consistent with values reported in previous work. In particular, both the volatility risk premium and the jump risk premium are found to be significant.
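A minimal sketch of the kind of bootstrap particle filter such an estimation rests on, for a square-root stochastic volatility model: v advances by a full-truncation Euler step of dv = κ(θ − v)dt + σ√v dW, and daily returns are weighted as r ~ N(0, v·dt). The discretization, parameter values, and placeholder data are illustrative assumptions, not the paper's exact filter or the S&P 500 data.

```python
import numpy as np

def particle_filter(returns, kappa, theta, sigma, dt, n_particles, rng):
    """Return the filter's log-likelihood of the observed returns."""
    v = np.full(n_particles, theta)            # start particles at the mean level
    loglik = 0.0
    for r in returns:
        vp = np.maximum(v, 0.0)                # full truncation keeps v nonnegative
        v = v + kappa * (theta - vp) * dt + sigma * np.sqrt(vp * dt) * rng.normal(size=n_particles)
        vp = np.maximum(v, 0.0)
        # Weight each particle by the return likelihood, then resample
        var = vp * dt + 1e-12
        w = np.exp(-0.5 * r**2 / var) / np.sqrt(2 * np.pi * var)
        loglik += np.log(w.mean() + 1e-300)
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        v = v[idx]
    return loglik

rng = np.random.default_rng(4)
fake_returns = 0.01 * rng.normal(size=250)     # placeholder data, not the index
ll = particle_filter(fake_returns, kappa=3.0, theta=0.04, sigma=0.4,
                     dt=1/252, n_particles=2000, rng=rng)
```

In a maximum likelihood setting, this log-likelihood would be evaluated over a grid or within an optimizer for each candidate parameter vector, which is what makes GPU parallelism attractive.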

Relevance:

20.00%

Publisher:

Abstract:

The application of decellularized extracellular matrices to aid tissue regeneration in reconstructive surgery and regenerative medicine has been promising. Several decellularization protocols for removing cellular material from natural tissues such as heart valves are currently in use. This paper evaluates the feasibility of extending this methodology to load-bearing joint tissues, with respect to desirable properties such as stiffness, porosity and the ability to recover adequately after deformation to facilitate physiological function. Two decellularization protocols, trypsin and Triton X-100, were evaluated for their effects on bovine articular cartilage using biomechanical, biochemical and microstructural techniques. These analyses revealed that decellularization with trypsin resulted in severe loss of mechanical stiffness, including a deleterious collapse of the collagen architecture that in turn significantly compromised the porosity of the construct. In contrast, Triton X-100 detergent treatment yielded samples that retained mechanical stiffness relative to normal intact cartilage, but the resulting construct contained remnant cellular constituents. We conclude that both of these common decellularization protocols are inadequate for producing constructs that can serve as effective replacements and scaffolds to regenerate articular joint tissue.

Relevance:

20.00%

Publisher:

Abstract:

Extracellular matrix (ECM) materials are widely used in cartilage tissue engineering. However, current ECM materials are unsatisfactory for clinical practice, as most are derived from allogeneic or xenogeneic tissue. This study was designed to develop a novel autologous ECM scaffold for cartilage tissue engineering. The autologous bone marrow mesenchymal stem cell-derived ECM (aBMSC-dECM) membrane was collected and fabricated into a three-dimensional porous scaffold via cross-linking and freeze-drying techniques. Articular chondrocytes were seeded into the aBMSC-dECM scaffold and an atelocollagen scaffold, respectively. In vitro culture and in vivo implantation in a nude mouse model were performed to evaluate the engineered cartilage. The results showed that the aBMSC-dECM scaffold had a good microstructure and biocompatibility. After 4 weeks of in vitro culture, the engineered cartilage in the aBMSC-dECM scaffold group formed thicker cartilage tissue with a more homogeneous structure and higher expression of cartilaginous genes and proteins than the atelocollagen scaffold group. Furthermore, after 3 weeks of in vivo implantation, the engineered cartilage based on the aBMSC-dECM scaffold showed better cartilage formation in terms of volume and homogeneity, cartilage matrix content, and compressive modulus. These results indicate that the aBMSC-dECM scaffold could be a successful novel candidate scaffold for cartilage tissue engineering.

Relevance:

20.00%

Publisher:

Abstract:

The contemporary methodology for modelling organism growth is based on continuous trajectories, which makes it ill-suited to the stepwise growth of crustacean populations. Growth models for fish normally assume a continuous function, but a different type of model is needed for crustaceans, which must moult in order to grow: the periodic shedding of the exoskeleton makes crustacean growth a discontinuous process. This stepwise growth through the moulting process makes growth estimation more complex. Stochastic approaches can be used to model discontinuous growth, commonly known as "jumps" (Figure 1); however, a stochastic growth model must produce only positive jumps. In view of this, we introduce a subordinator, a special case of a Lévy process. A subordinator is a non-decreasing Lévy process, which assists in modelling crustacean growth and gives a better understanding of the individual variability and stochasticity in moulting periods and increments. We develop the parameter estimation methods and illustrate them with a dataset from laboratory experiments. The motivating dataset is from the ornate rock lobster, Panulirus ornatus, which is found between Australia and Papua New Guinea. Because sex affects growth (Munday et al., 2004), we estimate the growth parameters separately for each sex. Since all hard parts are shed at moulting, exact age determination of a lobster is challenging; however, the growth parameters for the moult process can be estimated from tank data through (i) inter-moult periods and (ii) moult increments. We derive a joint density composed of two functions, one for moult increments and one for the time intervals between moults, which we take to be conditionally independent given pre-moult length and the inter-moult periods, by the Markov property. Hence, the parameters in each function can be estimated separately. Subsequently, we integrate both functions through a Monte Carlo method, and thereby obtain a population mean for crustacean growth (e.g., the red curve in Figure 1).
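A minimal sketch of stepwise growth driven by a simple non-decreasing Lévy process, here a compound Poisson subordinator: inter-moult periods are drawn as exponential waiting times and moult increments as positive (gamma) jumps, so the path can only increase. The choice of distributions and all parameter values are illustrative assumptions, not the fitted lobster model.

```python
import numpy as np

def simulate_moult_growth(l0, rate, incr_shape, incr_scale, t_end, rng):
    """Return (times, lengths) for one individual's stepwise growth path."""
    t, length = 0.0, l0
    times, lengths = [0.0], [l0]
    while True:
        t += rng.exponential(1.0 / rate)             # waiting time to next moult
        if t > t_end:
            break
        length += rng.gamma(incr_shape, incr_scale)  # positive jump: growth only
        times.append(t)
        lengths.append(length)
    return times, lengths

rng = np.random.default_rng(5)
times, lengths = simulate_moult_growth(l0=30.0, rate=2.0, incr_shape=4.0,
                                       incr_scale=0.5, t_end=5.0, rng=rng)
```

Averaging many such simulated paths at a grid of time points is one way to obtain the Monte Carlo population mean described above.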

Relevance:

20.00%

Publisher:

Abstract:

Objective: To discuss generalized estimating equations as an extension of generalized linear models, by commenting on the paper by Ziegler and Vens, "Generalized Estimating Equations: Notes on the Choice of the Working Correlation Matrix". Methods: An international group of experts was invited to comment on this paper. Results: The discussants took several perspectives. Econometricians established parallels to the generalized method of moments (GMM); statisticians discussed model assumptions and the issue of missing data; applied statisticians commented on practical aspects of data analysis. Conclusions: In general, careful modelling of the correlation is encouraged when considering estimation efficiency and other implications, and a comparison of the choice of instruments in GMM with that in generalized estimating equations (GEE) would be worthwhile. Some theoretical drawbacks of GEE still need to be addressed and require careful analysis of the data; this applies particularly when data are missing at random.
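To make the "working correlation" choice concrete, here is a minimal sketch of fitting a GEE with an exchangeable working correlation using statsmodels. The simulated clustered data (a shared group-level noise term inducing within-cluster correlation) are an illustrative assumption.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n_groups, group_size = 50, 4
groups = np.repeat(np.arange(n_groups), group_size)
x = rng.normal(size=n_groups * group_size)
# Shared group-level noise induces within-cluster correlation
y = (1.0 + 2.0 * x
     + np.repeat(rng.normal(size=n_groups), group_size)
     + rng.normal(size=x.size))

exog = sm.add_constant(x)
model = sm.GEE(y, exog, groups=groups,
               family=sm.families.Gaussian(),
               cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(result.summary())
```

Swapping `cov_struct` for, say, `sm.cov_struct.Independence()` changes the working correlation without changing the marginal mean model, which is precisely the efficiency trade-off the discussants debate.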

Relevance:

20.00%

Publisher:

Abstract:

Summary: Interim analysis is important in large clinical trials for ethical and cost reasons. Sometimes an interim analysis must be performed earlier than planned; in that case, stochastic curtailment methods are useful for examining the data for early stopping while controlling the inflation of type I and type II errors. We consider a three-arm randomized study of treatments to reduce perioperative blood loss following major surgery. Owing to slow accrual, an unplanned interim analysis was required by the study team to determine whether the study should continue. We distinguish two cases: when all treatments are under direct comparison, and when one of the treatments is a control. We used simulations to study the operating characteristics of five different stochastic curtailment methods, and also considered the influence of the timing of the interim analyses on the type I error and power of the test. We found that type I error and power can differ considerably between the methods. The analysis for the perioperative blood loss trial was carried out at approximately a quarter of the planned sample size. We found little evidence that the active treatments are better than placebo and recommended closure of the trial.
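A minimal sketch of one common stochastic curtailment quantity, conditional power via the B-value formulation: with information fraction t and interim statistic Z_t, set B(t) = Z_t√t, so that under drift θ the conditional power is Φ((B(t) + θ(1−t) − z_α)/√(1−t)). This is a standard textbook construction, not necessarily one of the five methods the paper compares; the interim Z-value used is an illustrative assumption.

```python
from scipy.stats import norm

def conditional_power(z_interim, info_frac, theta, alpha=0.025):
    """Probability of crossing the final boundary given the interim result."""
    b = z_interim * info_frac ** 0.5             # B-value at information time t
    z_alpha = norm.ppf(1 - alpha)
    return norm.cdf((b + theta * (1 - info_frac) - z_alpha) / (1 - info_frac) ** 0.5)

# Interim look at a quarter of the planned information, as in the trial above;
# the interim Z of 0.5 is an assumed value for illustration.
z, t = 0.5, 0.25
cp = conditional_power(z, t, theta=z / t ** 0.5)  # extrapolate the current trend
print(f"conditional power under current trend: {cp:.3f}")
```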

Relevance:

20.00%

Publisher:

Abstract:

James (1991, Biometrics 47, 1519-1530) constructed unbiased estimating functions for estimating the two parameters of the von Bertalanffy growth curve from tag-recapture data. This paper provides unbiased estimating functions for a class of growth models that incorporate stochastic components and explanatory variables. A simulation study using seasonal growth models indicates that the proposed method works well, while the least-squares methods commonly used in the literature may produce substantially biased estimates. The proposed model and method are also applied to real data from tagged rock lobsters to assess a possible seasonal effect on growth.
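For context, a minimal sketch of the von Bertalanffy tag-recapture setting: the expected growth increment between tagging and recapture is (L∞ − L1)(1 − e^(−KΔt)). Shown is the classical least-squares (Fabens-style) fit, the kind of approach the paper argues can be substantially biased; the unbiased estimating functions themselves are not reproduced here, and the data are simulated as an illustrative assumption.

```python
import numpy as np
from scipy.optimize import curve_fit

def vb_increment(X, Linf, K):
    """Expected length increment between tagging and recapture."""
    L1, dt = X
    return (Linf - L1) * (1 - np.exp(-K * dt))

rng = np.random.default_rng(7)
L1 = rng.uniform(20, 80, size=200)               # lengths at tagging
dt = rng.uniform(0.2, 2.0, size=200)             # years at liberty
true_Linf, true_K = 100.0, 0.3
incr = vb_increment((L1, dt), true_Linf, true_K) + rng.normal(0, 1.5, size=200)

# Naive least-squares fit of the increment model
(Linf_hat, K_hat), _ = curve_fit(vb_increment, (L1, dt), incr, p0=(90.0, 0.2))
```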

Relevance:

20.00%

Publisher:

Abstract:

The paper studies stochastic approximation as a technique for bias reduction. The proposed method does not require approximating the bias explicitly, nor does it rely on having independent identically distributed (i.i.d.) data. The method always removes the leading bias term, under very mild conditions, as long as auxiliary samples from distributions with given parameters are available. Expectation and variance of the bias-corrected estimate are given. Examples in sequential clinical trials (non-i.i.d. case), curved exponential models (i.i.d. case) and length-biased sampling (where the estimates are inconsistent) are used to illustrate the applications of the proposed method and its small sample properties.
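A minimal sketch in the spirit of the approach described above: since auxiliary samples can be drawn at any given parameter value, the bias of an estimator can be estimated by simulation and removed iteratively, without an explicit bias formula. The exponential-rate example (the MLE 1/x̄ has bias λ/(n−1)) and the simple fixed-point iteration are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def simulated_bias(lam, n, n_rep, rng):
    """Monte Carlo estimate of E[1/xbar] - lam when data ~ Exp(rate=lam)."""
    samples = rng.exponential(1.0 / lam, size=(n_rep, n))
    return (1.0 / samples.mean(axis=1)).mean() - lam

rng = np.random.default_rng(8)
data = rng.exponential(1.0 / 2.0, size=20)       # true rate 2.0, small sample
mle = 1.0 / data.mean()

theta = mle
for _ in range(10):                              # fixed-point bias correction
    theta = mle - simulated_bias(theta, n=len(data), n_rep=4000, rng=rng)
print(f"MLE: {mle:.3f}, bias-corrected: {theta:.3f}")
```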

Relevance:

20.00%

Publisher:

Abstract:

Purification of drinking water is routinely achieved by conventional coagulants and disinfection procedures. However, there are instances, such as flood events, when turbidity reaches extreme levels, while natural organic matter (NOM) may be an issue throughout the year. Consequently, there is a need for technologies that can effectively treat water of high turbidity during flood events and NOM content year round. Our hypothesis was that pebble matrix filtration offered a relatively cheap, simple and reliable means to clarify such challenging water. A laboratory-scale pebble matrix filter (PMF) column was therefore used to evaluate turbidity and NOM pre-treatment performance on 2013 Brisbane River flood water. Since the high turbidity was only a seasonal, short-term problem, the general applicability of pebble matrix filters for NOM removal was also investigated. A 1.0 m deep bed of pebbles (the matrix), partly in-filled with either sand or crushed glass and topped with a layer of granular activated carbon (GAC), was tested. Turbidity was measured as a surrogate for suspended solids (SS), whereas total organic carbon (TOC) and UV absorbance at 254 nm were measured as surrogate parameters for NOM. Experiments using natural flood water showed that, without the addition of any chemical coagulants, PMF columns achieved at least 50% turbidity reduction when the source water contained moderate hardness levels; for harder water samples, above 85% turbidity reduction was obtained. The ability to remove 50% of turbidity without chemical coagulants may represent significant cost savings to water treatment plants, with added environmental benefits from reduced sludge formation. A TOC reduction of 35-47% and a UV-254 nm reduction of 24-38% were also observed. Beyond turbidity removal during flood periods, the ability of the pebble matrix filter to remove NOM throughout the year may reduce disinfection by-product (DBP) formation potential and coagulant demand at water treatment plants. Final head losses were remarkably low, reaching only 11 cm at a filtration velocity of 0.70 m/h.

Relevance:

20.00%

Publisher:

Abstract:

Stochastic volatility models are of fundamental importance to the pricing of derivatives. One of the most commonly used models of stochastic volatility is the Heston model, in which the price and volatility of an asset evolve as a pair of coupled stochastic differential equations. The computation of asset prices and volatilities involves the simulation of many sample trajectories with conditioning; the problem is treated using the method of particle filtering. While the simulation of a shower of particles is computationally expensive, each particle behaves independently, making such simulations ideal for massively parallel heterogeneous computing platforms. In this paper, we present our portable OpenCL implementation of the Heston model and discuss its performance and efficiency characteristics on a range of architectures, including Intel CPUs, Nvidia GPUs, and Intel Many Integrated Core (MIC) accelerators.
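A minimal sketch of the coupled Heston dynamics that each simulated trajectory follows: dS = μS dt + √v S dW₁, dv = κ(θ − v) dt + ξ√v dW₂ with corr(dW₁, dW₂) = ρ. The full-truncation Euler discretization and all parameter values are illustrative assumptions; because the paths are mutually independent, the workload is embarrassingly parallel, which is what makes GPU and MIC offload attractive.

```python
import numpy as np

def heston_paths(s0, v0, mu, kappa, theta, xi, rho, dt, n_steps, n_paths, rng):
    """Simulate n_paths independent Heston trajectories; return final prices."""
    s = np.full(n_paths, s0)
    v = np.full(n_paths, v0)
    for _ in range(n_steps):
        z1 = rng.normal(size=n_paths)
        z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.normal(size=n_paths)
        vp = np.maximum(v, 0.0)                  # full truncation keeps v usable
        s *= np.exp((mu - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1)
        v += kappa * (theta - vp) * dt + xi * np.sqrt(vp * dt) * z2
    return s

rng = np.random.default_rng(9)
final_prices = heston_paths(s0=100.0, v0=0.04, mu=0.05, kappa=2.0, theta=0.04,
                            xi=0.5, rho=-0.7, dt=1/252, n_steps=252,
                            n_paths=10000, rng=rng)
```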