193 results for Spatio-numerical modelling
Abstract:
It is shown that the observed difference in sediment transporting efficiency of the swash uprush, compared with the downrush, could be mainly due to greater bed shear stress for a given velocity in the more abruptly accelerated uprush. The bed shear stress generated by an arbitrary free-stream velocity time series is modelled in terms of usual wave boundary layer models plus a phase lead φ(τ) of the bed shear stress over the free-stream velocity at the peak frequency. With this approach, the total transport amounts in uprush and downrush can be modelled satisfactorily with the same sediment transport formula, without the need for different uprush and downrush coefficients. While the adaptation of sediment transport formulae from steady flow can thus yield the right total amounts of sediment moved, the timing of the instantaneous sediment transport rates is probably not accurately modelled, due to the highly unsteady nature of the swash and the presence of pre-suspended sediment in the uprush. Nevertheless, the proposed method is a useful intermediate step until we have a complete understanding of sediment transport under very rapid accelerations and of the relative contribution of pre-suspended sediment to the onshore sediment transport in swash zones. (C) 2002 Published by Elsevier Science B.V.
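The phase-lead construction described above can be sketched numerically: advance the free-stream velocity by the lead time before applying a quadratic friction law, so the stress peak arrives before the velocity peak. A minimal sketch, in which the density rho, friction factor f_w and lead time are illustrative values, not the paper's fitted ones:

```python
import numpy as np

# Sketch: bed shear stress from a free-stream velocity series with a phase
# lead, via tau(t) = 0.5 * rho * f_w * |u(t + lead)| * u(t + lead).
# rho, f_w and lead are assumed values for illustration only.
def bed_shear_stress(u, t, rho=1025.0, f_w=0.02, lead=0.1):
    u_led = np.interp(t + lead, t, u)      # velocity advanced by the phase lead
    return 0.5 * rho * f_w * np.abs(u_led) * u_led

t = np.linspace(0.0, 2.0, 201)
u = np.sin(2.0 * np.pi * t)                # toy free-stream velocity series
tau = bed_shear_stress(u, t)
# The stress maximum precedes the velocity maximum, mimicking the extra
# stress available to the abruptly accelerated uprush.
print(t[np.argmax(tau)], t[np.argmax(u)])
```

With a shared transport formula driven by this stress, the uprush then moves more sediment than the downrush for the same peak velocity, which is the paper's central point.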
Abstract:
Ab initio calculations have been performed to determine the energetics of oxygen atoms adsorbed onto graphene planes and the possible reaction path extracting carbon atoms in the form of carbon monoxide. From the energetics it is confirmed that this reaction path will not significantly contribute to the gasification of well ordered carbonaceous chars. Modelling results which explore this limit are presented. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
The flow field and the energy transport near thermoacoustic couples are simulated using a 2D full Navier-Stokes solver. The thermoacoustic couple plate is maintained at a constant temperature; plate lengths both short and long compared with the particle displacement lengths of the acoustic standing waves are tested. Also investigated are the effects of plate spacing and the amplitude of the standing wave. Results are examined in the form of energy vectors, particle paths, and overall entropy generation rates. These show that a net heat-pumping effect appears only near the edges of thermoacoustic couple plates, within about a particle displacement distance from the ends. A heat-pumping effect can be seen even on the shortest plates tested when the plate spacing exceeds the thermal penetration depth. It is observed that energy dissipation near the plate increases quadratically as the plate spacing is reduced. The results also indicate that there may be a larger-scale vortical motion outside the plates which disappears as the plate spacing is reduced. (C) 2002 Acoustical Society of America.
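The thermal penetration depth against which the plate spacing is compared is the standard quantity δκ = sqrt(2κ/ω). A quick calculation, assuming a typical thermal diffusivity for air at room temperature (not a value taken from the paper):

```python
import math

# delta_kappa = sqrt(2 * kappa / omega): the boundary-layer scale that the
# plate spacing is compared against. kappa for air is an assumed value.
def thermal_penetration_depth(kappa, frequency_hz):
    omega = 2.0 * math.pi * frequency_hz
    return math.sqrt(2.0 * kappa / omega)

kappa_air = 2.2e-5                       # m^2/s, thermal diffusivity of air (~300 K)
delta = thermal_penetration_depth(kappa_air, 500.0)
print(f"{delta * 1e3:.3f} mm")           # roughly a tenth of a millimetre at 500 Hz
```

Spacings of this order are where the simulated dissipation grows quickly, consistent with the quadratic increase reported above.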
Abstract:
The authors previously showed that granules can coalesce either by Type I coalescence (the granules coalesce by viscous dissipation in the surface liquid layer before their surfaces touch) or by Type II coalescence (the granules are slowed to a halt during rebound, after their surfaces have made contact) (AIChE J. 46 (3) (2000) 529). Based on this coalescence mechanism, a new coalescence kernel for population balance modelling of granule growth is presented. The kernel is piecewise constant: only collisions satisfying the conditions for one of the two coalescence types are successful. One constant rate is assigned to each type of coalescence, and a rate of zero to the case of rebound. As the conditions for Types I and II coalescence depend on granule and binder properties, the coalescence kernel is physically based. Simulation results for a variety of binder and granule materials show good agreement with experimental data. (C) 2002 Elsevier Science Ltd. All rights reserved.
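The structure of such a kernel can be sketched directly: one constant rate per coalescence type and zero for rebound. The use of a viscous Stokes number as the discriminating variable, and the threshold and rate values below, are illustrative assumptions, not the paper's conditions or fitted constants:

```python
# Hypothetical sketch of a two-regime coalescence kernel: one constant rate
# per coalescence type, zero for rebound. The Stokes-number thresholds and
# the rates k1, k2 are illustrative stand-ins.
def coalescence_rate(stokes, st_type1=0.1, st_type2=1.0, k1=1.0e-3, k2=5.0e-4):
    """Return the kernel value for a collision with the given Stokes number."""
    if stokes < st_type1:        # Type I: absorbed in the surface liquid layer
        return k1
    if stokes < st_type2:        # Type II: halted during rebound
        return k2
    return 0.0                   # rebound: unsuccessful collision

print(coalescence_rate(0.05), coalescence_rate(0.5), coalescence_rate(2.0))
```

Because the regime boundaries move with binder viscosity and granule properties, the same two constants can describe different materials, which is what makes the kernel physically based.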
Abstract:
This paper presents results on the simulation of the solid state sintering of copper wires using Monte Carlo techniques based on elements of lattice theory and cellular automata. The initial structure is superimposed onto a triangular, two-dimensional lattice, where each lattice site corresponds to either an atom or vacancy. The number of vacancies varies with the simulation temperature, while a cluster of vacancies is a pore. To simulate sintering, lattice sites are picked at random and reoriented in terms of an atomistic model governing mass transport. The probability that an atom has sufficient energy to jump to a vacant lattice site is related to the jump frequency, and hence the diffusion coefficient, while the probability that an atomic jump will be accepted is related to the change in energy of the system as a result of the jump, as determined by the change in the number of nearest neighbours. The jump frequency is also used to relate model time, measured in Monte Carlo Steps, to the actual sintering time. The model incorporates bulk, grain boundary and surface diffusion terms and includes vacancy annihilation on the grain boundaries. The predictions of the model were found to be consistent with experimental data, both in terms of the microstructural evolution and in terms of the sintering time. (C) 2002 Elsevier Science B.V. All rights reserved.
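The acceptance step described above is Metropolis-style: a jump that raises the system energy (by reducing the number of nearest-neighbour bonds) is accepted with a Boltzmann probability. A minimal sketch, in which the bond energy and temperature are illustrative, not the paper's values:

```python
import math
import random

# Sketch of a Metropolis acceptance rule for an atom jumping to a vacant
# site: dn_bonds is the change in nearest-neighbour bonds caused by the
# jump. bond_energy and kT are assumed values for illustration.
def accept_jump(dn_bonds, bond_energy=1.0, kT=0.5, rng=random.random):
    dE = -bond_energy * dn_bonds       # gaining bonds lowers the energy
    if dE <= 0.0:
        return True                    # downhill or neutral: always accept
    return rng() < math.exp(-dE / kT)  # uphill: Boltzmann acceptance factor

print(accept_jump(2))                  # gaining two bonds: always accepted
print(accept_jump(-2, rng=lambda: 0.5))  # losing two bonds: usually rejected
```

Repeating this rule over randomly picked lattice sites, with jump frequencies tied to the diffusion coefficients, is what lets Monte Carlo steps be mapped to physical sintering time.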
Abstract:
Over the last 7 years, a method has been developed in Brazil to analyse building energy performance using computer simulation. The method combines analysis of building design plans and documentation, walk-through visits, electric and thermal measurements, and the use of an energy simulation tool (the DOE-2.1E code). The method was used to model more than 15 office buildings (more than 200 000 m(2)), located between 12.5° and 27.5° South latitude. The paper describes the basic methodology with data for one building, and presents additional results for six other cases. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
The prediction of tillering is poor or absent in existing sorghum crop models even though fertile tillers contribute significantly to grain yield. The objective of this study was to identify general quantitative relationships underpinning tiller dynamics of sorghum for a broad range of assimilate availabilities. Emergence, phenology, leaf area development and fertility of individual main culms and tillers were quantified weekly in plants grown at one of four plant densities ranging from two to 16 plants m(-2). On any given day, a tiller was considered potentially fertile (a posteriori) if its number of leaves continued to increase thereafter. The dynamics of potentially fertile tiller number per plant varied greatly with plant density, but could generally be described by three determinants that were stable across plant densities: tiller emergence rate aligned with leaf ligule appearance rate; cessation of tiller emergence occurred at a stable leaf area index; and the rate of decrease in potentially fertile tillers was linearly related to the ratio of realized to potential leaf area growth. Realized leaf area growth is the measured increase in leaf area, whereas potential leaf area growth is the estimated increase in leaf area if all potentially fertile tillers were to continue to develop. Procedures to predict this ratio, by estimating realized leaf area per plant from intercepted radiation and potential leaf area per plant from the number and type of developing axes, are presented. While it is suitable for modelling tiller dynamics in grain sorghum, this general framework needs to be validated by testing it in different environments and for other cultivars. (C) 2002 Annals of Botany Company.
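The third determinant can be written down as a simple rule: tiller loss is driven by the shortfall of realized relative to potential leaf-area growth, with no loss when growth keeps pace. The linear form below, and the slope k, are illustrative assumptions, not the fitted relationship:

```python
# Hypothetical sketch: rate of loss of potentially fertile tillers as a
# linear function of the realized/potential leaf-area growth ratio.
# The slope k is an assumed value, not the paper's estimate.
def tiller_loss_rate(realized_growth, potential_growth, k=3.0):
    """Potentially fertile tillers lost per plant per day."""
    ratio = realized_growth / potential_growth
    return max(0.0, k * (1.0 - ratio))

print(tiller_loss_rate(1.0, 1.0))   # growth keeps pace: no tillers are lost
print(tiller_loss_rate(0.5, 1.0))   # growth falls short: tillers senesce
```

In the framework above, realized growth would come from intercepted radiation and potential growth from the number and type of developing axes, making the ratio predictable from canopy state.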
Abstract:
Crop modelling has evolved over the last 30 or so years in concert with advances in crop physiology, crop ecology and computing technology. Now that crop modelling has reached a respectable degree of acceptance, it is appropriate to review briefly the course of its development and to project what its major contributions might be in the future. Two major opportunities are envisioned for increased modelling activity in the future. One is a continuing central, heuristic role: supporting scientific investigation, facilitating decision making by crop managers, and aiding education. These heuristic activities will also extend to the broader system-level issues of environmental and ecological aspects of crop production. The second opportunity is as a prime contributor to understanding and advancing the genetic regulation of plant performance and plant improvement. Physiological dissection and modelling of traits provide an avenue by which crop modelling could contribute to enhancing the integration of molecular genetic technologies in crop improvement. Crown Copyright (C) 2002 Published by Elsevier Science B.V. All rights reserved.
Abstract:
We present the first mathematical model of the transmission dynamics of Schistosoma japonicum. The work extends Barbour's classic model of schistosome transmission. It allows for the mammalian host heterogeneity characteristic of the S. japonicum life cycle, and solves the problem of under-specification of Barbour's model by the use of Chinese data we are collecting on human-bovine transmission in the Poyang Lake area of Jiangxi Province in China. The model predicts that in the lake/marshland areas of the Yangtze River basin: (1) once-yearly mass chemotherapy of humans is little better than twice-yearly mass chemotherapy in reducing human prevalence. Depending on the heterogeneity of prevalence within the population, targeted treatment of high-prevalence groups, with lower overall coverage, can be more effective than mass treatment with higher overall coverage. Treatment confers a short-term benefit only, with prevalence rising to endemic levels once chemotherapy programs are stopped; (2) depending on the relative contributions of bovines and humans, bovine treatment can benefit humans almost as much as human treatment. Like human treatment, bovine treatment confers a short-term benefit. A combination of human and bovine treatment will dramatically reduce human prevalence and maintain the reduction for a longer period of time than treatment of a single host, although human prevalence rises once treatment ceases; (3) assuming 75% coverage of bovines, a bovine vaccine which acts on worm fecundity must have about 75% efficacy to reduce the reproduction rate below one and ensure mid-term reduction and long-term elimination of the parasite.
Such a vaccination program should be accompanied by an initial period of human treatment to instigate a short-term reduction in prevalence, following which the reduction is enhanced by vaccine effects; (4) if the bovine vaccine is only 45% efficacious (the level of current prototype vaccines), it will lower the endemic prevalence but will not result in elimination. If it is accompanied by an initial period of human treatment and by a 45% improvement in human sanitation or a 30% reduction in contaminated water contact by humans, elimination is then possible. (C) 2002 Elsevier Science B.V. All rights reserved.
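The two-host structure can be sketched as a Barbour-style prevalence model in which human and bovine compartments are coupled only through snail infection. Every rate below is an illustrative placeholder, not a parameter fitted to the Poyang Lake data:

```python
# Toy Barbour-style two-host model: prevalences in humans (Ph), bovines (Pb)
# and snails (y), stepped with forward Euler. All rates are illustrative.
def step(Ph, Pb, y, dt=0.01,
         ah=0.3, ab=0.5,               # snail-to-host transmission rates
         bh=0.2, bb=0.8,               # host-to-snail contamination rates
         gh=0.05, gb=0.05, gs=0.5):    # host recovery and snail loss rates
    dPh = ah * y * (1.0 - Ph) - gh * Ph
    dPb = ab * y * (1.0 - Pb) - gb * Pb
    dy = (bh * Ph + bb * Pb) * (1.0 - y) - gs * y
    return Ph + dt * dPh, Pb + dt * dPb, y + dt * dy

Ph, Pb, y = 0.3, 0.3, 0.1
for _ in range(100_000):               # run toward the endemic equilibrium
    Ph, Pb, y = step(Ph, Pb, y)
print(round(Ph, 2), round(Pb, 2), round(y, 2))
```

In this toy version, lowering the bovine contamination rate bb (a crude stand-in for bovine treatment) also lowers the human equilibrium prevalence, mirroring the qualitative logic of prediction (2).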
Abstract:
Activated sludge flocculation was modelled using population balances. The model followed the dynamics of activated sludge flocculation, providing a good approximation of the change in mean floc size with time. Increasing the average velocity gradient decreased the final floc size. The breakage rate coefficient and collision efficiency also varied with the average velocity gradient. A power-law relationship was found for the increase in breakage rate coefficient with increasing average velocity gradient. Further investigation will be conducted to determine the relationship between the collision efficiency and particle size, to provide a better approximation of dynamic changes in the floc size distribution during flocculation. (C) 2002 Published by Elsevier Science B.V.
Abstract:
A technique based on laser light diffraction is shown to be successful in collecting on-line experimental data. Time series of floc size distributions (FSD) under different shear rates (G) and calcium additions were collected. The steady-state mass mean diameter decreased with increasing shear rate G and increased when calcium additions exceeded 8 mg/l. A so-called population balance model (PBM) was used to describe the experimental data. This kind of model describes both aggregation and breakage through birth and death terms. A discretised PBM was used since analytical solutions of the integro-partial differential equations do not exist. Despite the complexity of the model, only two parameters need to be estimated: the aggregation rate and the breakage rate. The model seems, however, to lack flexibility. Also, the description of the floc size distribution (FSD) in time is not accurate.
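The two-parameter structure (one aggregation rate, one breakage rate, entering through birth and death terms) can be illustrated with a toy discretisation on a small grid of floc sizes. The grid, the erosion-style breakage and both rate values below are assumptions for illustration, not the model or parameters estimated from the diffraction data:

```python
import numpy as np

# Toy discretised PBM: bin i holds flocs of i+1 primary particles. A constant
# aggregation rate ka drives birth/death of aggregates; an erosion-style
# breakage rate kb sheds one primary particle per event. Values illustrative.
def pbm_step(n, ka=1.0e-4, kb=1.0e-2, dt=0.1):
    K = len(n)
    dn = np.zeros(K)
    for i in range(K):
        for j in range(K):
            if i + j + 1 >= K:
                continue                     # ignore out-of-grid aggregates
            rate = ka * n[i] * n[j]
            dn[i] -= rate                    # death of the colliding floc
            dn[i + j + 1] += 0.5 * rate      # birth of the aggregate
    for i in range(1, K):
        r = kb * n[i]                        # breakage: size s -> (s-1) + 1
        dn[i] -= r
        dn[i - 1] += r
        dn[0] += r
    return n + dt * dn

sizes = np.arange(1, 13)
n = np.zeros(12)
n[0] = 1000.0                                # start from primary particles only
for _ in range(4000):
    n = pbm_step(n)
mean_size = np.sum(n * sizes) / np.sum(n)    # mass stays constant as flocs grow
print(f"steady-state mean floc size: {mean_size:.1f} primary particles")
```

Raising kb (more breakage, as under a higher shear rate G) drives the steady-state mean size down, which is the qualitative behaviour both abstracts report.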
Abstract:
We focus on mixtures of factor analyzers as a method for model-based density estimation from high-dimensional data, and hence for the clustering of such data. This approach enables a normal mixture model to be fitted to a sample of n data points of dimension p, where p is large relative to n. The number of free parameters is controlled through the dimension of the latent factor space. Working in this reduced space allows a model for each component-covariance matrix with complexity lying between that of the isotropic and full covariance structure models. We illustrate the use of mixtures of factor analyzers in a practical example that considers the clustering of cell lines on the basis of gene expressions from microarray experiments. (C) 2002 Elsevier Science B.V. All rights reserved.
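The parameter-count argument can be made concrete: a factor-analytic component covariance Σ = ΛΛᵀ + Ψ, with a p × q loading matrix Λ and diagonal Ψ, needs far fewer free parameters than a full covariance when q ≪ p. A rough sketch with illustrative dimensions (not those of the microarray data, and ignoring the small identifiability correction for rotations of Λ):

```python
import numpy as np

# Factor-analytic component covariance: Sigma = Lambda @ Lambda.T + diag(psi).
# p, q and the random Lambda, psi are illustrative choices.
def factor_cov(Lambda, psi):
    return Lambda @ Lambda.T + np.diag(psi)

p, q = 50, 3                                 # data dimension, latent factors
rng = np.random.default_rng(0)
Lambda = rng.standard_normal((p, q))
psi = rng.uniform(0.5, 1.5, p)               # diagonal noise variances
Sigma = factor_cov(Lambda, psi)              # a valid p x p covariance

free_fa = p * q + p                          # loadings plus diagonal noise
free_full = p * (p + 1) // 2                 # full symmetric covariance
print(free_fa, free_full)                    # 200 vs 1275 per component
```

With several mixture components, this reduction is what makes the normal mixture fittable when p is large relative to n.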
Abstract:
We compare Bayesian methodology utilizing the freeware BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three-category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling, as implemented by BUGS, to fit subject-specific Bayesian generalized linear models, where the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty detecting the correct model when the additive genetic effect was low (between 10 and 20%) or moderate (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%), even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented symptoms associated with osteoarthritis occurring in joints of the hand.
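Both packages are fitting versions of the ACE decomposition, whose expected twin-pair correlations can be written down directly. The variance fractions below are illustrative, not estimates from the osteoarthritis data:

```python
# ACE model: additive genetic (a2), common environment (c2) and unique
# environment variance fractions imply the expected MZ and DZ correlations.
def twin_correlations(a2, c2):
    r_mz = a2 + c2             # MZ twins share all genes and the common environment
    r_dz = 0.5 * a2 + c2       # DZ twins share half their segregating genes
    return r_mz, r_dz

# Falconer-style back-solve from observed correlations:
def falconer(r_mz, r_dz):
    a2 = 2.0 * (r_mz - r_dz)
    c2 = r_mz - a2
    return a2, c2

print(twin_correlations(0.5, 0.2))   # (0.7, 0.45)
print(falconer(0.7, 0.45))           # recovers a2 ~ 0.5, c2 ~ 0.2
```

The simulation scenarios above correspond to sweeping a2 over roughly 0.1 to 0.5, which is exactly the range where the resulting MZ/DZ contrast becomes hard to resolve from binary data.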
Abstract:
Conceptual modelling is an activity undertaken during information systems development work to build a representation of selected semantics about some real-world domain. Ontological theories have been developed to account for the structure and behavior of the real world in general. In this paper, I discuss why ontological theories can be used to inform conceptual modelling research, practice, and pedagogy. I provide examples from my research to illustrate how a particular ontological theory has enabled me to improve my understanding of certain conceptual modelling practices and grammars. I also describe how some colleagues and I have used this theory to generate several counter-intuitive, sometimes surprising predictions about widely advocated conceptual modelling practices - predictions that were subsequently supported in empirical research we undertook. Finally, I discuss several possibilities and pitfalls I perceive to be associated with using ontological theories to underpin research on conceptual modelling.