873 results for Multi-scale modeling
Abstract:
Unsaturated water flow in soil is commonly modelled using Richards’ equation, which requires the hydraulic properties of the soil (e.g., porosity, hydraulic conductivity, etc.) to be characterised. Naturally occurring soils, however, are heterogeneous in nature, that is, they are composed of a number of interwoven homogeneous soils, each with its own set of hydraulic properties. When the length scale of these soil heterogeneities is small, numerical solution of Richards’ equation is computationally impractical due to the immense effort and refinement required to mesh the actual heterogeneous geometry. A classic way forward is to use a macroscopic model, where the heterogeneous medium is replaced with a fictitious homogeneous medium that attempts to give the average flow behaviour at the macroscopic scale (i.e., at a scale much larger than the scale of the heterogeneities). Using homogenisation theory, a macroscopic equation can be derived that takes the form of Richards’ equation with effective parameters. A disadvantage of the macroscopic approach, however, is that it fails in cases where the assumption of local equilibrium does not hold. This limitation has seen the introduction of two-scale models that include, at each point in the macroscopic domain, an additional flow equation at the scale of the heterogeneities (microscopic scale). This report outlines a well-known two-scale model and contributes to the literature a number of important advances in its numerical implementation. These include the use of an unstructured control volume finite element method and image-based meshing techniques that allow irregular micro-scale geometries to be treated, and the use of an exponential time integration scheme that permits both scales to be resolved simultaneously in a completely coupled manner. Numerical comparisons against a classical macroscopic model confirm that only the two-scale model correctly captures the important features of the flow for a range of parameter values.
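For reference, the governing equation that both the macroscopic and the two-scale models build on can be written in its standard mixed form (standard notation; a reference sketch, not the report's exact formulation):

```latex
% Richards' equation, mixed form (standard notation; illustrative only)
\frac{\partial \theta(h)}{\partial t}
  = \nabla \cdot \bigl( K(h)\,\nabla h \bigr)
  + \frac{\partial K(h)}{\partial z}
```

where theta is the volumetric water content, h the pressure head, K(h) the unsaturated hydraulic conductivity, and z the vertical coordinate. In the homogenised macroscopic model, theta and K are replaced by effective parameters; in the two-scale model, a flow equation of this form is also solved on the micro-scale geometry at each macroscopic point.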
Abstract:
Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for the nonlinearities and uncertainties associated with the multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better-informed decision making.
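A minimal sketch of how scale-specific variability can be layered across a four-tiered spatial hierarchy is given below. The tier names, sizes and variances are purely illustrative assumptions; the paper's actual model structure, priors and semi-parametric components are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed four-tier hierarchy (illustrative): sectors > reefs > sites > transects.
# Each tier contributes its own variance component on the logit scale.
n_sectors, n_reefs, n_sites, n_transects = 3, 5, 4, 10
sigma = {"sector": 0.4, "reef": 0.3, "site": 0.2, "transect": 0.1}
grand_mean = 0.0  # overall coral cover on the logit scale (illustrative)

records = []
for s in range(n_sectors):
    a_sector = rng.normal(0.0, sigma["sector"])
    for r in range(n_reefs):
        a_reef = rng.normal(0.0, sigma["reef"])
        for k in range(n_sites):
            a_site = rng.normal(0.0, sigma["site"])
            for t in range(n_transects):
                eps = rng.normal(0.0, sigma["transect"])
                logit_cover = grand_mean + a_sector + a_reef + a_site + eps
                cover = 1.0 / (1.0 + np.exp(-logit_cover))  # back-transform to (0, 1)
                records.append((s, r, k, t, cover))

print(f"{len(records)} simulated transect-level cover observations")
```

Uncertainty at each tier is obtained by placing priors on the variance components and fitting to observed cover; the multi-reef scale is where the abstract reports conclusions becoming highly uncertain.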
Abstract:
Epigenetic changes correspond to heritable modifications of chromosome structure that do not involve alteration of the DNA sequence but do affect gene expression. These mechanisms play an important role in normal cell differentiation, but aberrant changes are also associated with several diseases, including cancer and neural disorders. Consequently, despite intensive studies in recent years, the contribution of epigenetic modifications remains largely unquantified due to overall system complexity and insufficient data. Computational models can provide powerful auxiliary tools to experimentation, not least because they can span scales from the sub-cellular through cell populations (or networks of genes). In this paper, the challenges in developing realistic cross-scale models are discussed and illustrated with respect to current work.
Abstract:
A numerical approach for coupling the temperature and concentration fields using a micro/macro dual-scale model for a solidification problem is presented. The dual-scale modeling framework is implemented on a hybrid explicit-implicit solidification scheme. The advantage of this model lies in the more accurate consideration of microsegregation occurring at the micro-scale using a subgrid model. The model is applied to the case of solidification of a Pb-40% Sn alloy in a rectangular cavity. The present simulation results are compared with the corresponding experimental results reported in the literature, showing improvement in macrosegregation predictions. Subsequently, a comparison of the macrosegregation predictions of the present method with those of a parameter model is performed, showing similar trends.
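As an indication of the kind of microsegregation relation a subgrid model resolves, the classical Scheil-Gulliver solute balance is shown below for orientation; the subgrid model used in the paper is not necessarily this one:

```latex
% Scheil-Gulliver microsegregation relation (illustrative only)
C_s^{*} = k\,C_0\,(1 - f_s)^{\,k-1}
```

where C_s^{*} is the solid composition at the solid-liquid interface, C_0 the nominal alloy composition (here 40% Sn), f_s the local solid fraction, and k the solute partition coefficient.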
Abstract:
Approximate deconvolution modeling is a very recent approach to large eddy simulation of turbulent flows. It has been applied to compressible flows with success. Here, a premixed flame which forms in the wake of a flameholder has been selected to examine the subgrid-scale modeling of the reaction rate by this new method, because a previous plane two-dimensional simulation of this wake flame, using a wrinkling function and artificial flame thickening, had revealed discrepancies when compared with experiment. The present simulation is of the temporal evolution of a round wake-like flow at two Reynolds numbers, Re = 2000 and 10,000, based on wake defect velocity and wake diameter. A Fourier-spectral code has been used. The reaction is single-step and irreversible, and the rate follows an Arrhenius law. The reference simulation at the lower Reynolds number is fully resolved. At Re = 10,000, subgrid-scale contributions are significant. It was found that subgrid-scale modeling in the present simulation agrees more closely with the unresolved subgrid-scale effects observed in experiment. Specifically, the highest contributions appeared in thin folded regions created by vortex convection. The wrinkling function approach had not captured subgrid-scale effects in these regions.
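For orientation, approximate deconvolution modeling typically approximates the inverse of the filter G by a truncated van Cittert series; the standard formulation is sketched below, though details of the present implementation may differ:

```latex
% Approximate deconvolution (truncated van Cittert series), standard form
u^{\star} = Q_N\,\bar{u}, \qquad
Q_N = \sum_{\nu=0}^{N} (I - G)^{\nu} \approx G^{-1}
```

The filtered reaction rate is then estimated by evaluating the Arrhenius law on the deconvolved fields, $\overline{\dot{\omega}} \approx \overline{\dot{\omega}(u^{\star})}$, rather than through a wrinkling function or artificial flame thickening.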
Abstract:
A closed, trans-scale formulation of damage evolution based on statistical microdamage mechanics is summarized in this paper. The dynamic function of damage bridges the mesoscopic and macroscopic evolution of damage. Spallation in an aluminium plate is studied with this formulation. It is found that the damage evolution is governed by several dimensionless parameters, i.e., the imposed Deborah numbers De* and De, the Mach number M and the damage number S. In particular, the most critical mode of the macroscopic damage evolution, i.e., damage localization, is determined by the Deborah number De*. The Deborah number De* reflects the coupling and competition between the macroscopic loading and the microdamage growth. Therefore, our results reveal the multi-scale nature of spallation. In fact, the damage localization results from the nonlinearity of the microdamage growth. In addition, the dependence of the damage rate on the imposed Deborah numbers De* and De, the Mach number M and the damage number S is discussed.
Abstract:
Problems involving coupled multiple space and time scales offer a real challenge for conventional frameworks of either particle or continuum mechanics. In this paper, four case studies (shear band formation in bulk metallic glasses, spallation resulting from stress waves, interaction between a probe tip and a sample, and the simulation of nanoindentation with molecular statistical thermodynamics) are provided to illustrate three levels of trans-scale problems (problems due to various physical mechanisms at the macro-level, problems due to micro-structural evolution at the macro/micro-level, and problems due to the coupling of atoms/molecules and a finite-size body at the micro/nano-level) and their formulations. Accordingly, non-equilibrium statistical mechanics, coupled trans-scale equations and their simultaneous solution, and trans-scale algorithms based on atomic/molecular interaction are suggested as the three possible modes of trans-scale mechanics.
Abstract:
A main method of predicting turbulent flows is to solve the LES equations, referred to here as the traditional LES method. The traditional LES method resolves the motions of large eddies of size larger than the filtering scale Delta_n while modeling unresolved scales smaller than Delta_n. Hughes et al. argued that many shortcomings of traditional LES approaches were associated with their inability to successfully differentiate between large and small scales. One may expect that a priori scale separation would be better, because it can predict scale interaction well compared with a posteriori scale separation. To this end, a multi-scale method was suggested to perform scale-separated computation. The primary elements of the multi-scale method are: 1) a space average is used to differentiate scales; 2) the basic equations include the large-scale equations and fluctuation equations; 3) the large-scale equations and fluctuation equations are coupled through turbulent stress terms. We use the multi-scale equations with n = 2, i.e., the large and small scale (LSS) equations, to simulate the 3-D evolution of a channel flow and a planar mixing layer flow. Some interesting results are given.
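A schematic of the n = 2 decomposition described above, written in standard incompressible notation as an illustration (the exact form of the large and small scale equations is given in the paper):

```latex
% Two-level (large/small scale) decomposition, schematic form
u_i = \bar{u}_i + u_i', \qquad
\frac{\partial \bar{u}_i}{\partial t}
  + \frac{\partial (\bar{u}_i \bar{u}_j)}{\partial x_j}
  = -\frac{1}{\rho}\frac{\partial \bar{p}}{\partial x_i}
  + \nu\,\nabla^2 \bar{u}_i
  - \frac{\partial \tau_{ij}}{\partial x_j},
\qquad
\tau_{ij} = \overline{u_i u_j} - \bar{u}_i \bar{u}_j
```

The companion fluctuation equations for $u_i'$ follow by subtracting the large-scale equations from the full Navier-Stokes equations, and the two sets are coupled through the turbulent stress $\tau_{ij}$.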
Abstract:
G-protein coupled receptors (GPCRs) form a large family of proteins and are very important drug targets. They are membrane proteins, which makes computational prediction of their structure challenging. Homology modeling is further complicated by the low sequence similarity across the GPCR superfamily.
In this dissertation, we analyze the conserved inter-helical contacts of recently solved crystal structures, and we develop a unified sequence-structural alignment of the GPCR superfamily. We use this method to align 817 human GPCRs, 399 of which are non-olfactory. This alignment can be used to generate high-quality homology models for the 817 GPCRs.
To refine the resulting GPCR homology models, we developed the Trihelix sampling method. We use a multi-scale approach to simplify the problem by treating the transmembrane helices as rigid bodies. In contrast to Monte Carlo structure prediction methods, the Trihelix method performs a complete local sampling using discretized coordinates for the transmembrane helices. We validate the method on existing structures and apply it to predict the structure of the lactate receptor, HCAR1. For this receptor, we also build the extracellular loops by taking into account constraints from three disulfide bonds. Docking of lactate and 3,5-dihydroxybenzoic acid shows likely involvement of three Arg residues on different transmembrane helices in binding a single ligand molecule.
Protein structure prediction relies on accurate force fields. We next present an effort to improve the quality of charge assignment for large atomic models. In particular, we introduce the formalism of the polarizable charge equilibration scheme (PQEQ) and we describe its implementation in the molecular simulation package LAMMPS. PQEQ allows fast on-the-fly charge assignment even for reactive force fields.
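For context, conventional charge equilibration (QEq) assigns atomic charges by minimizing an electrostatic energy of the standard form below; PQEQ extends this picture with polarizable core-shell charge pairs, and the exact functional used in the dissertation is not reproduced here:

```latex
% Standard charge-equilibration (QEq) energy (illustrative form)
E(\{q\}) = \sum_i \Bigl( \chi_i q_i + \tfrac{1}{2}\,\eta_i q_i^2 \Bigr)
         + \sum_{i<j} J_{ij}\, q_i q_j,
\qquad \text{minimized subject to } \sum_i q_i = Q_{\mathrm{tot}}
```

where $\chi_i$ is an atomic electronegativity, $\eta_i$ an atomic hardness, and $J_{ij}$ a shielded Coulomb interaction between sites; solving this constrained minimization at each step is what makes on-the-fly charge assignment possible.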
Abstract:
In the recent history of psychology and cognitive neuroscience, the notion of habit has been reduced to a stimulus-triggered response probability correlation. In this paper we use a computational model to present an alternative theoretical view (with some philosophical implications), where habits are seen as self-maintaining patterns of behavior that share properties in common with self-maintaining biological processes, and that inhabit a complex ecological context, including the presence and influence of other habits. Far from mechanical automatisms, this organismic and self-organizing concept of habit can overcome the dominant atomistic and statistical conceptions, and the high-temporal-resolution effects of situatedness, embodiment and sensorimotor loops emerge as playing a more central, subtle and complex role in the organization of behavior. The model is based on a novel "iterant deformable sensorimotor medium (IDSM)," designed such that trajectories taken through sensorimotor space increase the likelihood that, in the future, similar trajectories will be taken. We couple the IDSM to the sensors and motors of a simulated robot, and show that under certain conditions, the IDSM forms self-maintaining patterns of activity that operate across the IDSM, the robot's body, and the environment. We present various environments and the resulting habits that form in them. The model acts as an abstraction of habits at a much needed sensorimotor "meso-scale" between microscopic neuron-based models and macroscopic descriptions of behavior. Finally, we discuss how this model and extensions of it can help us understand aspects of behavioral self-organization, historicity and autonomy that remain out of the scope of contemporary representationalist frameworks.
Abstract:
The paper develops the basis for a self-consistent, operationally useful, reactive pollutant dispersion model for application in urban environments. The model addresses the multi-scale nature of the physical and chemical processes and the interaction between the different scales. The methodology builds on existing techniques of source apportionment in pollutant dispersion and on reduction techniques for detailed chemical mechanisms.
Abstract:
Most of the fields in China are in the middle-late development phase or are mature fields, and it is becoming more and more difficult to develop the remaining oil/gas. Therefore, it is important to enhance oil/gas recovery in order to maintain production. Fine-scale modeling is a key to improving recovery, and incorporation of geological, seismic and well-log data into 3D earth modeling is essential to build such models. In the Ken71 field, well-log, cross-well seismic and 3D seismic data are available. A key issue is to build a 3D earth model with these multi-scale data for oil field development.

In this dissertation, studies on sequential Gaussian-Bayesian simulation have been conducted, and its comparison with cokriging and sequential Gaussian simulation has been performed. The realizations generated by sequential Gaussian-Bayesian simulation have higher vertical resolution than those generated by the other methods, and smaller differences between these realizations and the true case are observed. With field data, it is demonstrated that incorporating well-log, cross-well seismic and 3D seismic data into a 3D fine-scale model is reliable. In addition, the advantages of sequential Gaussian-Bayesian simulation and the conditions for input data are demonstrated. In the Ken71 field, the impedance difference between sandstone and shale is small, so it would be difficult to identify sandstone in the reservoir with traditional impedance inversion. After comparison of different inversion techniques, stochastic hill-climbing inversion was applied. With this method, shale-content inversion is performed using 3D seismic data. Then, the inverted shale-content results and well-log data are incorporated into the 3D models. This demonstrates a procedure for building fine-scale models using multi-scale seismic data, especially the 3D seismic amplitude volume.

The models generated through sequential Gaussian-Bayesian simulation have several advantages, including: (1) higher vertical resolution compared with 3D inverted acoustic impedance (AI); (2) lateral variation consistent with the 3D inverted AI; (3) greater reliability due to the integration of cross-well seismic data. It is observed that the precision of the model depends on the 3D inversion.
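A minimal sketch of plain sequential Gaussian simulation in one dimension is given below for illustration; the Gaussian-Bayesian variant used in the dissertation additionally conditions on cross-well and 3D seismic data through a Bayesian updating step, which is not shown here:

```python
import numpy as np

rng = np.random.default_rng(1)

def cov(h, sill=1.0, corr_len=10.0):
    """Exponential covariance model (assumed for illustration)."""
    return sill * np.exp(-np.abs(h) / corr_len)

# 1-D grid and hard (well) data in normal-score space: location -> value
grid = np.arange(50, dtype=float)
data = {5.0: 1.2, 25.0: -0.8, 40.0: 0.5}

path = rng.permutation([x for x in grid if x not in data])  # random visiting order
sim = dict(data)

for x0 in path:
    xs = np.array(sorted(sim))                 # conditioning locations so far
    zs = np.array([sim[x] for x in xs])
    C = cov(xs[:, None] - xs[None, :])         # data-to-data covariances
    c0 = cov(xs - x0)                          # data-to-target covariances
    w = np.linalg.solve(C, c0)                 # simple-kriging weights (zero mean)
    mean = w @ zs                              # kriging mean
    var = max(cov(0.0) - w @ c0, 1e-12)        # kriging variance
    sim[x0] = rng.normal(mean, np.sqrt(var))   # draw and add to conditioning set

realization = np.array([sim[x] for x in grid])
print(realization.round(2))
```

Each simulated node is added back to the conditioning set, which is what gives the realizations their spatial continuity; repeating the loop with different random paths and seeds produces multiple equally probable realizations.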
Abstract:
Pre-stack seismic inversion has become a focus of research owing to the exploration and exploitation of oil fields and the development of seismic technology. Pre-stack seismic inversion has the advantage of making full use of amplitude-versus-offset information compared with post-stack methods. In this dissertation, three parameters, P-wave velocity, S-wave velocity and density, are estimated from multi-angle P-wave reflection data based on the Zoeppritz and Aki & Richards equations. The three parameters are inverted simultaneously from the pre-stack multi-angle P-wave data, based on a rock-physics model and aimed at minimizing the residual difference between the model simulation and the observed data. In order to improve the stability of the inversion and the resolution of thin beds, several techniques were employed, such as a multi-scale wavelet transform and the addition of Bayesian soft constraints and hard constraints (horizons, structure and so on) to the inversion process. As a result, the uncertainty of the result is reduced, the reliability and precision are improved, and the physical significance of the parameters becomes clearer. To meet the fundamental requirements of pre-stack inversion, research in rock physics was carried out covering the simulation and inversion of S-wave velocity, the influence of pore fluids on geophysical parameters, and the selection and analysis of sensitive parameters. The difference between elastic wave equation modeling and the Zoeppritz equation method is also compared. A series of key techniques for pre-stack seismic inversion and reservoir description were developed, such as attribute optimization and fluid factors. All the techniques mentioned above are assembled into a workflow for simultaneous pre-stack seismic inversion of the three parameters based on rock physics and model simulation. The new method and technology have been applied in many areas with various types of reservoirs and have yielded results of both geological and economic significance, which proves them to be valid and rational. This study will promote pre-stack inversion technology and its application in the exploration of subtle reservoirs, and it has good prospects for development and application.
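For reference, the Aki & Richards linearization of the Zoeppritz P-wave reflection coefficient, on which this kind of three-parameter AVO inversion is commonly based, can be written in the standard form:

```latex
% Aki & Richards linearized P-wave reflectivity (standard three-term form)
R_{pp}(\theta) \approx
  \frac{1}{2}\!\left(1 - 4\frac{V_s^2}{V_p^2}\sin^2\theta\right)\!\frac{\Delta\rho}{\rho}
  + \frac{1}{2\cos^2\theta}\,\frac{\Delta V_p}{V_p}
  - 4\frac{V_s^2}{V_p^2}\sin^2\theta\,\frac{\Delta V_s}{V_s}
```

where $\theta$ is the (average) angle of incidence and $\Delta V_p/V_p$, $\Delta V_s/V_s$ and $\Delta\rho/\rho$ are the fractional contrasts across the interface; fitting $R_{pp}(\theta)$ at multiple angles is what allows the three parameters to be inverted simultaneously.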
Abstract:
The study of 3D visualization technology for engineering geology and its application to engineering is a cross-disciplinary subject that involves geosciences, computer science, software and information technology. As an important part of the secondary theme of the National Basic Research Program of China (973 Program), Study of Multi-Scale Structure and Occurrence Environment of Complicated Geological Engineering Mass (No. 2002CB412701), the dissertation addresses key problems in 3D geological modeling, the integrated application of multi-format geological data, effective modeling methods for complex approximately layered geological masses, and applications of 3D virtual reality information management technology. The main research findings are listed below:

An integrated application method for multi-format geological data is proposed, which solves the integrated application of drill holes, engineering geology plan drawings, sectional drawings and cutting drawings, as well as exploratory trench sketches. Its application can provide as much fundamental data as possible for 3D geological modeling.

A 3D surface construction method combining Laplace interpolation points with original points is proposed, so that the deformation of the 3D model and the crossing error between the upper and lower surfaces of the model, which result from a lack of data when constructing a laminated stratum, can be eliminated.

A 3D modeling method for approximately layered geological masses is proposed, which solves the problems of general modeling methods based on sections or on points and faces when constructing terrain and concordant strata.

The 3D geological model of the VII dam site of the Xiangjiaba hydropower station has been constructed. The applications of the 3D geological model to the auto-plotting of sectional drawings and the conversion of numerical analysis models are also discussed.

A 3D virtual reality integrated information platform is developed, whose most important characteristic is that it is a software platform providing the functions of 3D virtual reality fly-through and multi-format data management simultaneously. Therefore, the platform can load different 3D models so as to satisfy different engineering demands.

The relics of Aigong Cave of the Longyou Stone Caves are reconstructed, and the reinforcement plans of the 1# and 2# caves in Phoenix Hill are also represented. This intuitive expression provides decision makers and designers with a very good working environment.

The basic framework and specific functions of a 3D geological information system are proposed.

The main research findings of the dissertation have been successfully applied to several important engineering projects, such as the Xiangjiaba hydropower station, a military airport and the Longyou Stone Caves.
Abstract:
BACKGROUND: Computer simulations are of increasing importance in modeling biological phenomena. Their purpose is to predict behavior and guide future experiments. The aim of this project is to model the early immune response to vaccination with an agent-based immune response simulation that incorporates realistic biophysics and intracellular dynamics, and which is sufficiently flexible to accurately model the multi-scale nature and complexity of the immune system, while maintaining the high performance critical to scientific computing. RESULTS: The Multiscale Systems Immunology (MSI) simulation framework is an object-oriented, modular simulation framework written in C++ and Python. The software implements a modular design that allows for flexible configuration of components and initialization of parameters, thus allowing simulations to be run that model processes occurring over different temporal and spatial scales. CONCLUSION: MSI addresses the need for a flexible and high-performing agent-based model of the immune system.
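A minimal sketch of the kind of modular, per-scale configuration the abstract describes is given below. The class names, parameters and update rule are hypothetical illustrations in Python, not the MSI framework's actual API:

```python
# Hypothetical sketch of a modular agent-based immune simulation configuration;
# names and parameters are illustrative, not the MSI framework's API.
import random
from dataclasses import dataclass, field

@dataclass
class AgentConfig:
    cell_type: str
    count: int
    step_minutes: float         # temporal scale on which this agent type updates
    motility_um_per_min: float  # biophysical parameter (random-walk motility)

@dataclass
class Simulation:
    configs: list
    agents: list = field(default_factory=list)

    def initialize(self):
        for cfg in self.configs:
            self.agents += [{"cfg": cfg,
                             "pos": [random.uniform(0, 100) for _ in range(3)]}
                            for _ in range(cfg.count)]

    def step(self, minute):
        for a in self.agents:
            if minute % a["cfg"].step_minutes == 0:   # per-type time scale
                sigma = a["cfg"].motility_um_per_min * a["cfg"].step_minutes
                a["pos"] = [p + random.gauss(0, sigma) for p in a["pos"]]

sim = Simulation(configs=[
    AgentConfig("dendritic_cell", count=50,  step_minutes=5.0, motility_um_per_min=2.0),
    AgentConfig("t_cell",         count=500, step_minutes=1.0, motility_um_per_min=10.0),
])
sim.initialize()
for minute in range(60):   # one simulated hour
    sim.step(minute)
print(f"{len(sim.agents)} agents stepped over one simulated hour")
```

Separating the per-type configuration from the update loop is one simple way to let different cell populations evolve on their own temporal and spatial scales within a single simulation.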