974 results for Stochastic Method
Abstract:
Measurement of individual emission sources (e.g., animals or pen manure) within intensive livestock enterprises is necessary to test emission calculation protocols and to identify targets for decreased emissions. In this study, a vented, fabric-covered large chamber (4.5 × 4.5 m, 1.5 m high; encompassing greater spatial variability than a smaller chamber) in combination with on-line analysis (nitrous oxide [N2O] and methane [CH4] via Fourier Transform Infrared Spectroscopy; 1 analysis min⁻¹) was tested as a means to isolate and measure emissions from beef feedlot pen manure sources. An exponential model relating chamber concentrations to ambient gas concentrations, air exchange (e.g., due to poor sealing with the surface; the model becomes linear when exchange ≈ 0 m³ s⁻¹), and chamber dimensions allowed data to be fitted with high confidence. Alternating manure source emission measurements using the large chamber and the backward Lagrangian stochastic (bLS) technique (5-mo period; bLS validated via tracer gas release, recovery 94-104%) produced comparable N2O and CH4 emission values (no significant difference at P < 0.05). Greater precision of individual measurements was achieved via the large chamber than via the bLS (mean ± standard error of variance components: bLS half-hour measurements, 99.5 ± 325 mg CH4 s⁻¹ and 9.26 ± 20.6 mg N2O s⁻¹; large-chamber measurements, 99.6 ± 64.2 mg CH4 s⁻¹ and 8.18 ± 0.3 mg N2O s⁻¹). The large-chamber design is suitable for measurement of emissions from manure on pen surfaces, isolating these emissions from surrounding emission sources, including enteric emissions. © American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America.
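The abstract describes the chamber model only qualitatively; the standard mass balance for a vented chamber is consistent with that description (exponential approach to a steady state, reducing to a linear rise when the exchange rate is ≈ 0 m³ s⁻¹). A minimal sketch, with the emission rate and all other numerical values invented for illustration:

```python
import numpy as np

def chamber_conc(t, E, Q, V, C_a, C0):
    """Concentration inside a vented chamber above a constant source.

    Mass balance: V * dC/dt = E + Q * (C_a - C), so
      C(t) = C_a + E/Q + (C0 - C_a - E/Q) * exp(-Q*t/V)   if Q > 0
      C(t) = C0 + (E/V) * t                               if Q = 0 (linear)
    E: emission rate (mg/s), Q: air-exchange rate (m^3/s),
    V: chamber volume (m^3), C_a: ambient conc., C0: conc. at t=0 (mg/m^3).
    """
    t = np.asarray(t, dtype=float)
    if Q == 0.0:
        return C0 + (E / V) * t
    Cinf = C_a + E / Q                      # steady-state concentration
    return Cinf + (C0 - Cinf) * np.exp(-Q * t / V)

# With negligible leakage (Q ~ 0) the trace is linear and the emission
# rate follows directly from the accumulation slope: E = V * dC/dt.
V, C_a, C0 = 4.5 * 4.5 * 1.5, 1.9, 1.9      # chamber dimensions from the abstract
t = np.arange(0, 1800, 60.0)                # one reading per minute for 30 min
true_E = 0.05                               # illustrative emission rate (mg/s)
slope = np.polyfit(t, chamber_conc(t, true_E, 0.0, V, C_a, C0), 1)[0]
E_est = V * slope
```

In practice one would fit both E and Q to the observed curve; the linear special case above is the limit the abstract mentions.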
Abstract:
Since we still know very little about stem cells in their natural environment, it is useful to explore their dynamics through modelling and simulation, as well as experimentally. Most models of stem cell systems are based on deterministic differential equations that ignore the natural heterogeneity of stem cell populations. This is not appropriate at the level of individual cells and niches, when randomness is more likely to affect dynamics. In this paper, we introduce a fast stochastic method for simulating a metapopulation of stem cell niche lineages, that is, many sub-populations that together form a heterogeneous metapopulation, over time. By selecting the common limiting timestep, our method ensures that the entire metapopulation is simulated synchronously. This is important, as it allows us to introduce interactions between separate niche lineages, which would otherwise be impossible. We expand our method to enable the coupling of many lineages into niche groups, where differentiated cells are pooled within each niche group. Using this method, we explore the dynamics of the haematopoietic system from a demand control system perspective. We find that coupling together niche lineages allows the organism to regulate blood cell numbers as closely as possible to the homeostatic optimum. Furthermore, coupled lineages respond better than uncoupled ones to random perturbations, here the loss of some myeloid cells. This could imply that it is advantageous for an organism to connect together its niche lineages into groups. Our results suggest that a potentially fruitful empirical direction will be to understand how stem cell descendants communicate with the niche and how cancer may arise as a result of a failure of such communication.
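The synchronous-timestep idea can be illustrated with a minimal tau-leap sketch; the division/differentiation rates and the shared-pool bookkeeping below are invented for illustration and are not the paper's model:

```python
import numpy as np

def simulate(n_lineages=5, t_end=10.0, b=1.0, d=0.8, eps=0.05, seed=42):
    """Synchronous tau-leap over a metapopulation of stem-cell niche lineages.

    Each lineage holds S stem cells with per-capita division rate b and
    differentiation rate d (illustrative values).  A single limiting
    timestep, set by the fastest lineage, keeps the whole metapopulation
    synchronous -- the property that makes cross-lineage coupling
    (e.g. a shared pool of differentiated cells) possible."""
    rng = np.random.default_rng(seed)
    S = np.full(n_lineages, 10)      # stem cells per lineage
    pool = 0                         # differentiated cells pooled per niche group
    t = 0.0
    while t < t_end:
        rates = (b + d) * S
        dt = eps / max(rates.max(), 1e-9)   # common limiting timestep
        births = rng.poisson(b * S * dt)    # divisions per lineage this step
        diffs = rng.poisson(d * S * dt)     # differentiations per lineage
        S = np.maximum(S + births - diffs, 0)
        pool += int(diffs.sum())
        t += dt
    return S, pool
```

Because every lineage advances by the same dt, a feedback rule (e.g. lowering d when the pooled count is above a homeostatic target) could be evaluated once per step across all lineages.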
Abstract:
Partial differential equations (PDEs) with multiscale coefficients are very difficult to solve due to the wide range of scales in the solutions. In the thesis, we propose some efficient numerical methods for both deterministic and stochastic PDEs based on the model reduction technique.
For the deterministic PDEs, the main purpose of our method is to derive an effective equation for the multiscale problem. An essential ingredient is to decompose the harmonic coordinate into a smooth part and a highly oscillatory part whose magnitude is small. Such a decomposition plays a key role in our construction of the effective equation. We show that the solution to the effective equation is smooth and can be resolved on a regular coarse mesh grid. Furthermore, we provide an error analysis and show that the solution to the effective equation plus a correction term is close to the original multiscale solution.
For the stochastic PDEs, we propose the model-reduction-based data-driven stochastic method and a multilevel Monte Carlo method. In the multiquery setting, and under the assumption that the ratio of the smallest scale to the largest scale is not too small, we propose the multiscale data-driven stochastic method. We construct a data-driven stochastic basis and solve the coupled deterministic PDEs to obtain the solutions. For tougher problems, we propose the multiscale multilevel Monte Carlo method. We apply the multilevel scheme to the effective equations and assemble the stiffness matrices efficiently on each coarse mesh grid. In both methods, the Karhunen-Loève (KL) expansion plays an important role in extracting the main parts of some stochastic quantities.
For both the deterministic and stochastic PDEs, numerical results are presented to demonstrate the accuracy and robustness of the methods. We also show the reduction in computational cost in the numerical examples.
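The multilevel Monte Carlo structure described above — coarse levels sampled heavily, fine levels sparsely, with coupled corrections between adjacent levels — can be sketched on a toy problem (a one-dimensional random integrand standing in for a multiscale PDE solve on mesh level l):

```python
import numpy as np

def P_level(omega, l):
    """Trapezoid-rule approximation, with 2**l sub-intervals, of
    P(omega) = integral_0^1 exp(omega * x) dx  (a toy 'solve on mesh level l')."""
    x = np.linspace(0.0, 1.0, 2**l + 1)
    v = np.exp(np.outer(omega, x))
    return ((v[:, :-1] + v[:, 1:]) / 2 * np.diff(x)).sum(axis=1)

def mlmc(levels=4, N0=2000, seed=1):
    """Multilevel Monte Carlo: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}],
    with many cheap samples on coarse meshes and few on fine ones."""
    rng = np.random.default_rng(seed)
    est = 0.0
    for l in range(levels + 1):
        N = max(N0 // 2**l, 50)            # fewer samples on finer levels
        omega = rng.standard_normal(N)
        Y = P_level(omega, l)
        if l > 0:
            Y -= P_level(omega, l - 1)     # coupled coarse/fine correction
        est += Y.mean()
    return est
```

The key point the thesis exploits is that the level corrections P_l − P_{l−1} have small variance, so the fine (expensive) levels need only a handful of samples; here the exact answer is ∫₀¹ e^{x²/2} dx ≈ 1.195.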
Abstract:
We discuss and test the potential usefulness of single-column models (SCMs) for the testing of stochastic physics schemes that have been proposed for use in general circulation models (GCMs). We argue that although single-column tests cannot be definitive in exposing the full behaviour of a stochastic method in the full GCM, and although there are differences between SCM testing of deterministic and stochastic methods, SCM testing nonetheless remains a useful tool. It is necessary to consider an ensemble of SCM runs produced by the stochastic method. These can be usefully compared to deterministic ensembles describing initial-condition uncertainty, and also to combinations of these (with structural model changes) into poor man's ensembles. The proposed methodology is demonstrated using an SCM experiment recently developed by the GCSS (GEWEX Cloud System Study) community, simulating the transitions between active and suppressed periods of tropical convection.
Abstract:
Background In order to provide insights into the complex biochemical processes inside a cell, modelling approaches must find a balance between achieving an adequate representation of the physical phenomena and keeping the associated computational cost within reasonable limits. This issue is particularly stressed when spatial inhomogeneities have a significant effect on the system's behaviour. In such cases, a spatially-resolved stochastic method can better portray the biological reality, but the corresponding computer simulations can in turn be prohibitively expensive. Results We present a method that incorporates spatial information by means of tailored, probability-distributed time-delays. These distributions can be directly obtained from a single in silico experiment or from a suitable set of in vitro experiments, and are subsequently fed into a delay stochastic simulation algorithm (DSSA), achieving a good compromise between computational costs and a much more accurate representation of spatial processes such as molecular diffusion and translocation between cell compartments. Additionally, we present a novel alternative approach based on delay differential equations (DDE) that can be used in scenarios of high molecular concentrations and low noise propagation. Conclusions Our proposed methodologies accurately capture and incorporate certain spatial processes into temporal stochastic and deterministic simulations, increasing their accuracy at low computational costs. This is of particular importance given that time spans of cellular processes are generally larger (possibly by several orders of magnitude) than those achievable by current spatially-resolved stochastic simulators. Hence, our methodology allows users to explore cellular scenarios under the effects of diffusion and stochasticity in time spans that were, until now, simply unfeasible. Our methodologies are supported by theoretical considerations on the different modelling regimes, i.e. spatial vs.
delay-temporal, as indicated by the corresponding Master Equations and presented elsewhere.
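The delay-SSA idea — standard stochastic simulation steps, but with products released only after a probability-distributed delay — can be sketched as follows; the rates, the Gaussian delay distribution and the single-species system are illustrative, not taken from the paper:

```python
import heapq
import random

def delay_ssa(t_end=200.0, k_prod=1.0, k_decay=0.1, seed=3,
              delay=lambda: max(random.gauss(5.0, 1.0), 0.0)):
    """Minimal delay-SSA sketch: molecules are produced at rate k_prod but
    become visible only after a sampled delay (standing in for, e.g.,
    diffusion into a compartment), and each decays at rate k_decay.
    Pending molecules are checked at each reaction time, a simplification
    adequate for a sketch."""
    random.seed(seed)
    t, n = 0.0, 0
    pending = []                      # min-heap of scheduled arrival times
    while t < t_end:
        t += random.expovariate(k_prod + k_decay * n)
        while pending and pending[0] <= t:   # release matured molecules
            heapq.heappop(pending)
            n += 1
        if random.random() < k_prod / (k_prod + k_decay * n):
            heapq.heappush(pending, t + delay())   # product appears later
        elif n > 0:
            n -= 1                                 # decay of one molecule
    return n
```

The delay distribution is exactly where the paper's tailored, experiment-derived distributions would be plugged in; the stationary count here fluctuates around k_prod / k_decay = 10.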
Abstract:
Successful prediction of groundwater flow and solute transport through highly heterogeneous aquifers has remained elusive due to the limitations of methods to characterize hydraulic conductivity (K) and generate realistic stochastic fields from such data. As a result, many studies have suggested that the classical advective-dispersive equation (ADE) cannot reproduce such transport behavior. Here we demonstrate that when high-resolution K data are used with a fractal stochastic method that produces K fields with adequate connectivity, the classical ADE can accurately predict solute transport at the macrodispersion experiment (MADE) site in Mississippi. This development provides great promise to accurately predict contaminant plume migration, design more effective remediation schemes, and reduce environmental risks. Key Points:
- Non-Gaussian transport behavior at the MADE site is unraveled
- The ADE can reproduce tracer transport in heterogeneous aquifers with no calibration
- A new fractal method generates heterogeneous K fields with adequate connectivity
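The abstract does not specify the fractal algorithm; a generic spectral-synthesis sketch of a fractal ln K field (power-law Fourier spectrum with random phases, a standard fractional-Brownian-motion construction) illustrates the kind of long-range-correlated conductivity field involved:

```python
import numpy as np

def fractal_lnK(n=1024, hurst=0.3, sigma=1.0, seed=7):
    """Spectral synthesis of a 1-D fractal (fBm-like) ln K field:
    Fourier amplitudes follow the power law f**-(hurst + 0.5), phases
    are random.  A generic sketch, not the paper's specific method."""
    rng = np.random.default_rng(seed)
    f = np.fft.rfftfreq(n, d=1.0)
    amp = np.zeros_like(f)
    amp[1:] = f[1:] ** -(hurst + 0.5)          # power-law spectrum
    phase = rng.uniform(0.0, 2.0 * np.pi, f.size)
    field = np.fft.irfft(amp * np.exp(1j * phase), n)
    field *= sigma / field.std()               # scale to target ln K variance
    return field

lnK = fractal_lnK()
K = np.exp(lnK)    # conductivity field; the power-law spectrum produces
                   # long-range correlation, hence connected high-K pathways
```

It is exactly this long-range correlation (connectivity of high-K zones) that the paper argues lets the classical ADE reproduce the observed non-Gaussian plumes.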
Abstract:
The study of separation phenomena has become increasingly important to many branches of industry and science. Given current computational capacity, it is possible to model and analyse chromatographic phenomena at the microscopic level. Network models have been used increasingly to represent chromatographic separation processes, since they can capture the topological and morphological aspects of the different adsorbent materials available on the market. In this work we develop a three-dimensional network model representing a chromatographic column at the microscopic level, in which the phenomena of adsorption, desorption and axial dispersion are modelled by a stochastic method. Different approaches to steric hindrance were also employed, and the results obtained were compared with experimental data. A two-dimensional network model is then used to represent a batch adsorption system, retaining the modelling of adsorption and desorption, and is subsequently compared with real systems. For both modelled systems the equilibrium constants, a fundamental parameter of adsorption systems, were analysed, and adsorption isotherms were obtained and analysed. We conclude that, for network models, adsorption and desorption alone suffice to produce outlet profiles similar to those observed experimentally, and that axial dispersion has less influence than the kinetic phenomena in question.
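A minimal sketch of the network idea — molecules walking through a column of sites while stochastically adsorbing and desorbing, with the breakthrough curve read off from exit times — using illustrative probabilities rather than the thesis parameters, and a linear isotherm with no steric hindrance:

```python
import numpy as np

def breakthrough(n_sites=50, n_mol=5000, p_ads=0.3, p_des=0.1,
                 n_steps=5000, seed=11):
    """Stochastic column sketch: each mobile molecule advances one site
    per step; per step a mobile molecule adsorbs with probability p_ads
    and an adsorbed one desorbs with probability p_des.  Parameters are
    illustrative, not fitted to any experimental system."""
    rng = np.random.default_rng(seed)
    pos = np.zeros(n_mol, dtype=int)
    adsorbed = np.zeros(n_mol, dtype=bool)
    exit_t = np.full(n_mol, -1)              # step at which each molecule elutes
    for t in range(n_steps):
        active = exit_t < 0
        u = rng.random(n_mol)
        flip = np.where(adsorbed, u < p_des, u < p_ads) & active
        adsorbed ^= flip                     # adsorb/desorb events
        pos += (~adsorbed) & active          # mobile molecules move one site
        exit_t[(pos >= n_sites) & active] = t
    return exit_t

t_exit = breakthrough()
mean_rt = t_exit[t_exit >= 0].mean()         # mean retention time (steps)
```

With these values the mobile fraction equilibrates at p_des / (p_ads + p_des) = 0.25, so the mean retention time is roughly n_sites / 0.25 ≈ 200 steps; a histogram of exit_t gives the simulated outlet profile.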
Abstract:
The estimation of kinetic parameters in chemical and chromatographic processes using computer-assisted numerical techniques has improved efficiency and fostered understanding of the phenomena involved. In the first part of this work, the biodiesel production process via esterification is modelled computationally: the stochastic optimization method Random Restricted Window (R2W) is fitted to experimental data for biodiesel production from the esterification of lauric acid with anhydrous ethanol in the presence of the niobic acid catalyst (Nb2O5). In the second part, the batch adsorption chromatography process is modelled: data from the HASHIM, CHASE and IKM2 kinetic models are correlated with experimental data for the adsorption of amoxicillin on chitosan, and experimental data for the adsorption of Bovine Serum Albumin (BSA) on Streamline DEAE are correlated with a new application of the R2W method through the implementation of a reversible kinetic model. The kinetic constants for each of the above processes are estimated by minimizing the sum-of-squared-residuals function.
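The R2W algorithm itself is not specified in the abstract; the sketch below is one plausible reading of a "restricted window" stochastic search — candidates drawn from a window that follows the current best point and shrinks over time — applied to a toy kinetic-constant fit by least squares:

```python
import numpy as np

def r2w_like(residuals, lo, hi, n_iter=200, n_cand=50, shrink=0.95, seed=5):
    """Random-search sketch in the spirit of a 'restricted window' method:
    candidates are drawn uniformly from a window centred on the current
    best point, and the window shrinks each iteration.  This is an
    interpretation of the R2W idea, not the authors' exact algorithm."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    best = (lo + hi) / 2
    best_obj = np.sum(residuals(best) ** 2)   # sum of squared residuals
    width = hi - lo
    for _ in range(n_iter):
        cand = best + (rng.random((n_cand, lo.size)) - 0.5) * width
        cand = np.clip(cand, lo, hi)
        objs = np.array([np.sum(residuals(c) ** 2) for c in cand])
        i = objs.argmin()
        if objs[i] < best_obj:
            best, best_obj = cand[i], objs[i]
        width *= shrink                        # restrict the window
    return best, best_obj

# Toy usage: recover a first-order rate constant k from synthetic data
t_obs = np.linspace(0, 10, 20)
y_obs = np.exp(-0.7 * t_obs)                   # hypothetical concentrations
res = lambda p: y_obs - np.exp(-p[0] * t_obs)
(k,), obj = r2w_like(res, lo=[0.0], hi=[5.0])
```

The objective matches the abstract's criterion (minimum of the squared-residuals function); only the window-update rule is an assumption.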
Abstract:
Biofuels are increasingly promoted worldwide as a means for reducing greenhouse gas (GHG) emissions from transport. However, current regulatory frameworks and most academic life cycle analyses adopt a deterministic approach in determining the GHG intensities of biofuels and thus ignore the inherent risk associated with biofuel production. This study aims to develop a transparent stochastic method for evaluating UK biofuels that determines both the magnitude and uncertainty of GHG intensity on the basis of current industry practices. Using wheat ethanol as a case study, we show that the GHG intensity could span a range of 40-110 gCO2e MJ⁻¹ when land use change (LUC) emissions and various sources of uncertainty are taken into account, as compared with a regulatory default value of 44 gCO2e MJ⁻¹. This suggests that the current deterministic regulatory framework underestimates wheat ethanol GHG intensity and thus may not be effective in evaluating transport fuels. Uncertainties in determining the GHG intensity of UK wheat ethanol include limitations of available data at a localized scale, and significant scientific uncertainty of parameters such as soil N2O and LUC emissions. Biofuel policies should be robust enough to incorporate the currently irreducible uncertainties and flexible enough to be readily revised when better science is available. © 2013 IOP Publishing Ltd.
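The stochastic evaluation amounts to propagating input distributions through the GHG accounting by Monte Carlo; a minimal sketch with invented distributions (not the paper's UK wheat-ethanol data):

```python
import numpy as np

def ghg_intensity_mc(n=100_000, seed=9):
    """Monte Carlo sketch of a stochastic GHG-intensity calculation:
    each input term is a distribution instead of a point value, and the
    output is the distribution of total gCO2e per MJ.  All distributions
    and ranges here are invented for illustration."""
    rng = np.random.default_rng(seed)
    cultivation = rng.normal(25.0, 5.0, n)            # fertiliser, soil N2O, ...
    processing = rng.normal(15.0, 3.0, n)             # ethanol-plant energy
    luc = rng.triangular(0.0, 10.0, 40.0, n)          # land-use change
    total = cultivation + processing + luc
    return np.percentile(total, [5, 50, 95])          # gCO2e per MJ
```

Reporting a 5th-95th percentile range rather than a single default value is exactly the contrast the paper draws with the deterministic regulatory framework.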
Abstract:
Guided by geological theory, the author analysed factual information and applied advanced technologies, including logging reinterpretation, prediction of fractal-based fracture network systems and stochastic modeling, to the low-permeability sandstone reservoirs in the Shengli oilfield. A new technology suitable for precise geological research and 3D heterogeneity modeling was formed through studies of precise stratigraphic correlation, the relation between tectonic evolution and fracture distribution, the control and modification of reservoirs by diagenesis, logging-interpretation mathematical models, reservoir heterogeneity, and so on. The main research achievements are as follows: (1) Proposed four categories of low-permeability reservoirs: preferable, general, unusual and super-low permeability; (2) Discussed ten geological features of the low-permeability reservoirs in the Shengli area; (3) Classified the turbidite fan of the Es_3 member of Area 3 in the Bonan oilfield into nine types of lithological facies, and established the facies sequences and patterns; (4) Recognized that the main diagenetic processes were compaction, cementation and dissolution, with percent compaction up to 50%~90%; (5) Divided the pore space in the Es_3 member reservoir into secondary pores with dissolved carbonate cement and residual intergranular pores that were strongly compacted and cemented; (6) Established a logging-interpretation mathematical model guided by facies-controlled modeling theory; (7) Predicted the fracture distribution in barriers using a fractal method; (8) Constructed the reservoir structural model by a deterministic method and the 3D model of reservoir parameters by a stochastic method; (9) Applied permeability magnitudes and directions to describe the fractures' effect on fluid flow, and presented four different fracture configurations and their influence on permeability; (10) Developed 3D modeling technology for low-permeability sandstone reservoirs.
The research provided a reliable geological foundation for the establishment and modification of development plans in low-permeability sandstone reservoirs, improved development effectiveness and produced more reserves, providing technical support for the stable and sustained development of the Shengli oilfield.
Abstract:
Motivation: We study a stochastic method for approximating the set of local minima in partial RNA folding landscapes associated with a bounded-distance neighbourhood of folding conformations. The conformations are limited to RNA secondary structures without pseudoknots. The method aims at exploring partial energy landscapes pL induced by folding simulations and their underlying neighbourhood relations. It combines an approximation of the number of local optima devised by Garnier and Kallel (2002) with a run-time estimation for identifying sets of local optima established by Reeves and Eremeev (2004).
Results: The method is tested on nine sequences of length between 50 nt and 400 nt, which allows us to compare the results with data generated by RNAsubopt and subsequent barrier tree calculations. On the nine sequences, the method captures on average 92% of local minima with settings designed for a target of 95%. The run-time of the heuristic can be estimated by O(n²·D·ν·ln ν), where n is the sequence length, ν is the number of local minima in the partial landscape pL under consideration and D is the maximum number of steepest-descent steps in attraction basins associated with pL.
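A simplified stand-in for this approach — random-restart steepest descent plus an equal-basin estimate of the total number of local minima — can be sketched on a toy one-dimensional landscape (this is not the Garnier-Kallel estimator itself, which fits the basin-size distribution rather than assuming equal basins):

```python
import numpy as np

def steepest_descent(f, x):
    """Follow strictly decreasing neighbours (x-1, x+1) to a local minimum."""
    while True:
        nbrs = [y for y in (x - 1, x + 1) if 0 <= y < len(f)]
        best = min(nbrs, key=lambda y: f[y])
        if f[best] >= f[x]:
            return x
        x = best

def estimate_n_minima(f, n_samples=500, seed=13):
    """Sample random starts, descend, and infer the total number of local
    minima N from the number of distinct ones seen: under an equal-basin
    assumption, E[distinct] = N * (1 - (1 - 1/N)**m).  Solved by bisection."""
    rng = np.random.default_rng(seed)
    seen = {steepest_descent(f, int(rng.integers(len(f))))
            for _ in range(n_samples)}
    k, m = len(seen), n_samples
    lo, hi = float(k), 100.0 * k
    for _ in range(60):
        mid = (lo + hi) / 2
        if mid * (1 - (1 - 1 / mid) ** m) < k:
            lo = mid
        else:
            hi = mid
    return k, lo

rng = np.random.default_rng(0)
f = rng.random(300)                    # rugged toy landscape, ~100 local minima
true_minima = sum(1 for i in range(300)
                  if (i == 0 or f[i] < f[i - 1]) and (i == 299 or f[i] < f[i + 1]))
k, est = estimate_n_minima(f)
```

Comparing the estimate to the enumerated count plays the role that RNAsubopt plus barrier trees play as ground truth in the paper.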
Abstract:
Traditional heuristic approaches to the Examination Timetabling Problem normally utilize a stochastic method during optimization for the selection of the next examination to be considered for timetabling within the neighbourhood search process. This paper presents a technique whereby the stochastic method has been augmented with information from a weighted list gathered during the initial adaptive construction phase, with the purpose of intelligently directing examination selection. In addition, a Reinforcement Learning technique has been adapted to identify the most effective portions of the weighted list in terms of facilitating the greatest potential for overall solution improvement. The technique is tested against the 2007 International Timetabling Competition datasets, with solutions generated within a time frame specified by the competition organizers. The results generated are better than those of the competition winner on seven of the twelve datasets, while being competitive on the remaining five. This paper also shows experimentally how using reinforcement learning has improved upon our previous technique.
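The selection mechanism can be sketched as a two-stage draw — a list segment chosen in proportion to a learned value, then an exam from that segment — with a simple reinforcement update; the segmentation scheme and reward rule here are invented for illustration, not the paper's:

```python
import random

class WeightedSelector:
    """Sketch of stochastic exam selection guided by a weighted list, with
    a reinforcement-learning update over list segments."""

    def __init__(self, exams, n_segments=4, seed=21):
        self.rng = random.Random(seed)
        self.exams = exams                    # ordered by construction-phase weight
        self.n_segments = n_segments
        self.seg_weight = [1.0] * n_segments  # learned value of each segment

    def pick(self):
        # choose a segment in proportion to its learned value,
        # then an exam uniformly within that segment
        seg = self.rng.choices(range(self.n_segments),
                               weights=self.seg_weight)[0]
        size = len(self.exams) // self.n_segments
        lo = seg * size
        hi = len(self.exams) if seg == self.n_segments - 1 else lo + size
        return seg, self.rng.choice(self.exams[lo:hi])

    def reward(self, seg, improved, lr=0.1):
        # reinforce segments whose selections improved the timetable
        target = 1.0 if improved else 0.0
        self.seg_weight[seg] += lr * (target - self.seg_weight[seg])
        self.seg_weight[seg] = max(self.seg_weight[seg], 1e-3)
```

In the search loop, each neighbourhood move would call pick(), attempt the move, and feed back reward(seg, improved) so that effective portions of the weighted list are sampled more often.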