961 results for experimental modelling


Relevance: 20.00%

Abstract:

This paper addresses robust model-order reduction of a high-dimensional nonlinear partial differential equation (PDE) model of a complex biological process. Starting from a nonlinear, distributed-parameter model of the process, validated against experimental data from an existing pilot-scale BNR activated sludge plant, we developed a state-space model with 154 state variables. A general algorithm for robustly reducing the nonlinear PDE model is presented, and based on an investigation of five state-of-the-art model-order reduction techniques, we are able to reduce the original model to one with only 30 states without incurring pronounced modelling errors. The singular perturbation approximation balanced truncation technique is found to give the lowest modelling errors in low frequency ranges and hence is deemed most suitable for controller design and other real-time applications. (C) 2002 Elsevier Science Ltd. All rights reserved.
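The reduction step described here is standard enough to sketch. Below is a minimal illustration, assuming a stable linear state-space surrogate (A, B, C) of the kind obtained by linearising the PDE model; it balances the system via its gramians and then applies either plain balanced truncation or the singular perturbation (DC-matching) approximation the abstract favours for low-frequency fidelity. This is a generic textbook implementation, not the authors' code.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balance(A, B, C):
    """Balancing transform and Hankel singular values.
    Assumes A stable and the realization minimal (SPD gramians)."""
    Wc = solve_continuous_lyapunov(A, -B @ B.T)    # controllability gramian
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)  # observability gramian
    Lc, Lo = cholesky(Wc, lower=True), cholesky(Wo, lower=True)
    U, s, Vt = svd(Lo.T @ Lc)
    T = Lc @ Vt.T / np.sqrt(s)                     # state transform x = T z
    Tinv = (U / np.sqrt(s)).T @ Lo.T
    return T, Tinv, s                              # s: Hankel singular values

def reduce_model(A, B, C, r, match_dc=True):
    """Keep the r most controllable/observable balanced states."""
    T, Tinv, _ = balance(A, B, C)
    Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T
    A11, A12, A21, A22 = Ab[:r, :r], Ab[:r, r:], Ab[r:, :r], Ab[r:, r:]
    B1, B2, C1, C2 = Bb[:r], Bb[r:], Cb[:, :r], Cb[:, r:]
    if not match_dc:                               # plain balanced truncation
        return A11, B1, C1
    # Singular perturbation approximation: put the discarded (fast)
    # states at quasi-steady state, which preserves the DC gain and
    # lowers low-frequency error (a feedthrough term -C2 A22^-1 B2 u
    # also appears; omitted here for brevity).
    X = np.linalg.solve(A22, np.column_stack([A21, B2]))  # A22^-1 [A21 B2]
    return (A11 - A12 @ X[:, :r],
            B1 - A12 @ X[:, r:],
            C1 - C2 @ X[:, :r])
```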

Relevance: 20.00%

Abstract:

Over the last seven years, a method has been developed in Brazil to analyse building energy performance using computer simulation. The method combines analysis of building design plans and documentation, walk-through visits, electric and thermal measurements, and the use of an energy simulation tool (the DOE-2.1E code). The method was used to model more than 15 office buildings (more than 200,000 m²) located between 12.5° and 27.5° south latitude. The paper describes the basic methodology with data for one building and presents additional results for six other cases. (C) 2002 Elsevier Science Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

The prediction of tillering is poor or absent in existing sorghum crop models, even though fertile tillers contribute significantly to grain yield. The objective of this study was to identify general quantitative relationships underpinning tiller dynamics of sorghum for a broad range of assimilate availabilities. Emergence, phenology, leaf area development and fertility of individual main culms and tillers were quantified weekly in plants grown at one of four plant densities ranging from two to 16 plants m⁻². On any given day, a tiller was considered potentially fertile (a posteriori) if its number of leaves continued to increase thereafter. The dynamics of potentially fertile tiller number per plant varied greatly with plant density, but could generally be described by three determinants that were stable across plant densities: tiller emergence rate aligned with leaf ligule appearance rate; cessation of tiller emergence occurred at a stable leaf area index; and the rate of decrease in potentially fertile tillers was linearly related to the ratio of realized to potential leaf area growth. Realized leaf area growth is the measured increase in leaf area, whereas potential leaf area growth is the estimated increase in leaf area if all potentially fertile tillers were to continue to develop. Procedures to predict this ratio, by estimating realized leaf area per plant from intercepted radiation and potential leaf area per plant from the number and type of developing axes, are presented. While it is suitable for modelling tiller dynamics in grain sorghum, this general framework needs to be validated in different environments and for other cultivars. (C) 2002 Annals of Botany Company.
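The third determinant lends itself to a one-line rule. Here is a hedged sketch with a hypothetical linear relationship and placeholder coefficients; the paper reports the relationship's existence, not these values.

```python
def tiller_decline_rate(realized_growth, potential_growth,
                        slope=-2.0, intercept=2.0):
    """Potentially fertile tillers lost per plant per day (hypothetical
    coefficients). The rate falls to zero when realized leaf-area
    growth (supply) keeps pace with potential growth (demand)."""
    ratio = realized_growth / potential_growth
    return max(0.0, slope * ratio + intercept)

# Supply meeting only half of potential demand -> tillers are shed.
print(tiller_decline_rate(realized_growth=0.5, potential_growth=1.0))  # 1.0
```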

Relevance: 20.00%

Abstract:

Crop modelling has evolved over the last 30 or so years in concert with advances in crop physiology, crop ecology and computing technology. Having reached a respectable degree of acceptance, it is appropriate to review briefly the course of developments in crop modelling and to project what the major contributions of crop modelling might be in the future. Two major opportunities are envisioned for increased modelling activity. One is a continuing central, heuristic role in supporting scientific investigation, facilitating decision making by crop managers, and aiding education; these heuristic activities will also extend to the broader system-level issues of environmental and ecological aspects of crop production. The second is as a prime contributor to understanding and advancing the genetic regulation of plant performance and plant improvement. Physiological dissection and modelling of traits provide an avenue by which crop modelling could contribute to enhancing the integration of molecular genetic technologies in crop improvement. Crown Copyright (C) 2002 Published by Elsevier Science B.V. All rights reserved.

Relevance: 20.00%

Abstract:

We present the first mathematical model of the transmission dynamics of Schistosoma japonicum. The work extends Barbour's classic model of schistosome transmission. It allows for the mammalian host heterogeneity characteristic of the S. japonicum life cycle, and solves the problem of under-specification of Barbour's model by using Chinese data we are collecting on human-bovine transmission in the Poyang Lake area of Jiangxi Province, China. The model predicts that in the lake/marshland areas of the Yangtze River basin: (1) once-yearly mass chemotherapy of humans is little better than twice-yearly mass chemotherapy in reducing human prevalence; depending on the heterogeneity of prevalence within the population, targeted treatment of high-prevalence groups, with lower overall coverage, can be more effective than mass treatment with higher overall coverage, and treatment confers a short-term benefit only, with prevalence rising to endemic levels once chemotherapy programs are stopped; (2) depending on the relative contributions of bovines and humans, bovine treatment can benefit humans almost as much as human treatment; like human treatment, bovine treatment confers a short-term benefit, and a combination of human and bovine treatment will dramatically reduce human prevalence and maintain the reduction for longer than treatment of a single host, although human prevalence rises once treatment ceases; (3) assuming 75% coverage of bovines, a bovine vaccine which acts on worm fecundity must have about 75% efficacy to reduce the reproduction rate below one and ensure mid-term reduction and long-term elimination of the parasite; such a vaccination program should be accompanied by an initial period of human treatment to instigate a short-term reduction in prevalence, following which the reduction is enhanced by vaccine effects; (4) if the bovine vaccine is only 45% efficacious (the level of current prototype vaccines), it will lower the endemic prevalence but will not result in elimination; if it is accompanied by an initial period of human treatment and by a 45% improvement in human sanitation or a 30% reduction in contaminated water contact by humans, elimination is then possible. (C) 2002 Elsevier Science B.V. All rights reserved.
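As a rough illustration of the model class being extended, here is a hedged sketch of a Barbour-style two-definitive-host prevalence model, with humans and bovines coupled through a snail reservoir. The gain/loss structure is the standard one; all rate constants are hypothetical placeholders, not the fitted Poyang Lake values.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, state, ah, ab, bh, bb, gh, gb, mu):
    Ph, Pb, y = state                       # human, bovine, snail prevalence
    dPh = ah * y * (1 - Ph) - gh * Ph       # human infection minus recovery
    dPb = ab * y * (1 - Pb) - gb * Pb       # bovine infection minus recovery
    dy = (bh * Ph + bb * Pb) * (1 - y) - mu * y  # snail infection and death
    return [dPh, dPb, dy]

pars = (0.4, 0.6, 0.2, 0.5, 0.2, 0.3, 1.0)  # hypothetical rate constants
sol = solve_ivp(rhs, (0.0, 50.0), [0.3, 0.2, 0.01], args=pars)
print("endemic prevalences (human, bovine, snail):", sol.y[:, -1].round(3))
```

Treatment and vaccination scenarios like those in the abstract can be explored by lowering the appropriate prevalence or transmission parameters for a period and watching whether the system returns to its endemic equilibrium.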

Relevance: 20.00%

Abstract:

Activated sludge flocculation was modelled using population balances. The model followed the dynamics of activated sludge flocculation, providing a good approximation of the change in mean floc size with time. Increasing the average velocity gradient decreased the final floc size. The breakage rate coefficient and collision efficiency also varied with the average velocity gradient, with a power-law relationship found for the increase in breakage rate coefficient with increasing average velocity gradient. Further investigation will be conducted to determine the relationship between the collision efficiency and particle size, to provide a better approximation of dynamic changes in the floc size distribution during flocculation. (C) 2002 Published by Elsevier Science B.V.
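The population balance approach can be sketched compactly. The following is a minimal sectional model on a volume-doubling grid, with same-size aggregation and binary breakage, and with the breakage coefficient tied to the average velocity gradient G through a power law as the abstract reports; the kernel choices and all constants are illustrative assumptions, not the fitted model.

```python
import numpy as np
from scipy.integrate import solve_ivp

N_CLASSES, G = 10, 50.0              # size classes; velocity gradient (1/s)
k_agg = 1e-5                         # aggregation rate constant (illustrative)
k_brk = 1e-4 * G**1.5                # breakage coefficient: power law in G

def pbe(t, N):
    dN = np.zeros_like(N)
    for i in range(N_CLASSES - 1):   # aggregation: class i + i -> class i+1
        pairs = 0.5 * k_agg * N[i] ** 2
        dN[i] -= 2 * pairs           # two particles consumed per collision
        dN[i + 1] += pairs           # one doubled-volume floc formed
    for i in range(1, N_CLASSES):    # breakage: class i -> two of class i-1
        dN[i] -= k_brk * N[i]
        dN[i - 1] += 2 * k_brk * N[i]
    return dN

N0 = np.zeros(N_CLASSES)
N0[0] = 1e4                          # start from primary particles only
sol = solve_ivp(pbe, (0.0, 300.0), N0)
v = 2.0 ** np.arange(N_CLASSES)      # relative class volumes
print("mass-mean floc volume:", (v**2 @ sol.y[:, -1]) / (v @ sol.y[:, -1]))
```

Raising G strengthens the breakage term relative to aggregation, so the steady-state mass-mean floc size falls, qualitatively matching the reported behaviour.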

Relevance: 20.00%

Abstract:

We consider mixtures of factor analyzers as a method for model-based density estimation from high-dimensional data, and hence for the clustering of such data. This approach enables a normal mixture model to be fitted to a sample of n data points of dimension p, where p is large relative to n. The number of free parameters is controlled through the dimension of the latent factor space; working in this reduced space allows a model for each component-covariance matrix with complexity lying between that of the isotropic and full covariance structure models. We illustrate the use of mixtures of factor analyzers in a practical example that considers the clustering of cell lines on the basis of gene expressions from microarray experiments. (C) 2002 Elsevier Science B.V. All rights reserved.
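The parameter-count argument is easy to make concrete. A small sketch using the standard formulas for free covariance parameters per mixture component; the microarray-scale dimensions are arbitrary.

```python
def cov_params(p, q=None, structure="full"):
    """Free covariance parameters per mixture component."""
    if structure == "full":
        return p * (p + 1) // 2
    if structure == "mfa":       # Sigma = Lambda Lambda^T + Psi, Psi diagonal
        return p * q + p - q * (q - 1) // 2   # less rotational freedom
    if structure == "isotropic":
        return 1
    raise ValueError(structure)

p = 1000                                      # e.g. genes on a microarray
print(cov_params(p))                          # full covariance: 500500
print(cov_params(p, q=5, structure="mfa"))    # q = 5 factors:   5990
print(cov_params(p, structure="isotropic"))   # isotropic:       1
```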

Relevance: 20.00%

Abstract:

We compare Bayesian methodology, implemented in the freeware package BUGS (Bayesian Inference Using Gibbs Sampling), with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three-category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling, as implemented by BUGS, to fit subject-specific Bayesian generalized linear models in which the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty detecting the correct model when the additive genetic effect was low (between 10 and 20%) or of moderate range (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%), even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented symptoms associated with osteoarthritis occurring in joints of the hand.
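For readers less familiar with the ACE decomposition behind the simulations, a small sanity check: the expected monozygotic and dizygotic twin correlations implied by given additive genetic (a²) and common environment (c²) variance proportions. These are the standard formulas, not code from the paper.

```python
def twin_correlations(a2, c2):
    """Expected phenotypic correlations under an ACE model, where a2
    and c2 are the additive genetic and common environment proportions
    of variance (unique environment e2 = 1 - a2 - c2)."""
    r_mz = a2 + c2          # MZ pairs share all genes plus C
    r_dz = 0.5 * a2 + c2    # DZ pairs share half their genes on average
    return r_mz, r_dz

# The hard-to-detect scenario: modest C (20%) with large A (50%).
print(twin_correlations(a2=0.50, c2=0.20))    # (0.70, 0.45)
```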

Relevance: 20.00%

Abstract:

Conceptual modelling is an activity undertaken during information systems development work to build a representation of selected semantics about some real-world domain. Ontological theories have been developed to account for the structure and behavior of the real world in general. In this paper, I discuss why ontological theories can be used to inform conceptual modelling research, practice, and pedagogy. I provide examples from my research to illustrate how a particular ontological theory has enabled me to improve my understanding of certain conceptual modelling practices and grammars. I also describe how some colleagues and I have used this theory to generate several counter-intuitive, sometimes surprising predictions about widely advocated conceptual modelling practices, predictions that subsequently were supported in empirical research we undertook. Finally, I discuss several possibilities and pitfalls I perceive to be associated with using ontological theories to underpin research on conceptual modelling.

Relevance: 20.00%

Abstract:

This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing tradition of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment, the last being a relatively new idea that allows individual assessment of predictions. The integrated framework comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process in which non-parametric methods, such as decision trees and generalized additive models, are used to identify important variables and their modelling relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches. The paper is motivated by a medical problem in which interest centres on developing a risk stratification system for morbidity among 1,710 cardiac patients, given a suite of demographic, clinical and preoperative variables. Although the methods are applied specifically to this case study, they can be applied in any field, irrespective of the type of response.
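The two-step modelling stage can be sketched in a few lines. Below is a hedged illustration with scikit-learn, using a decision tree as the non-parametric screen and logistic regression as the final parametric model; the synthetic data and the importance cut-off are placeholders, not the paper's clinical pipeline.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))        # stand-in for the covariate suite
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=500) > 0).astype(int)

# Step one: non-parametric screen for important variables.
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)
keep = tree.feature_importances_ > 0.05       # ad hoc cut-off

# Step two: parsimonious parametric predictive model on the kept set.
final = LogisticRegression(max_iter=1000)
print("variables kept:", np.flatnonzero(keep))
print("CV accuracy:", cross_val_score(final, X[:, keep], y, cv=5).mean())
```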

Relevance: 20.00%

Abstract:

The extent to which density-dependent processes regulate natural populations is the subject of an ongoing debate. We contribute evidence to this debate showing that density-dependent processes influence the population dynamics of the ectoparasite Aponomma hydrosauri (Acari: Ixodidae), a tick species that infests reptiles in Australia. The first piece of evidence comes from an unusually long-term dataset on the distribution of ticks among individual hosts. If density-dependent processes are influencing either host mortality or vital rates of the parasite population, and those distributions can be approximated with negative binomial distributions, then general host-parasite models predict that the aggregation coefficient of the parasite distribution will increase with the average intensity of infections. We fit negative binomial distributions to the frequency distributions of ticks on hosts, and find that the estimated aggregation coefficient k increases with increasing average tick density. This pattern indirectly implies that one or more vital rates of the tick population must be changing with increasing tick density, because mortality rates of the tick's main host, the sleepy lizard, Tiliqua rugosa, are unaffected by changes in tick burdens. Our second piece of evidence is a re-analysis of experimental data on the attachment success of individual ticks to lizard hosts using generalized linear modelling. The probability of successful engorgement decreases with increasing numbers of ticks attached to a host. This is direct evidence of a density-dependent process that could lead to an increase in the aggregation coefficient of tick distributions described earlier. The population-scale increase in the aggregation coefficient is indirect evidence of a density-dependent process or processes sufficiently strong to produce a population-wide pattern, and thus also likely to influence population regulation. The direct observation of a density-dependent process is evidence of at least part of the responsible mechanism.
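The indirect test turns on estimating the negative binomial aggregation coefficient k for each sample of tick counts. A minimal sketch, using method-of-moments and maximum-likelihood estimates on simulated counts; the field data themselves are not reproduced here.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize_scalar

def k_moments(counts):
    m, v = counts.mean(), counts.var(ddof=1)
    return m**2 / (v - m)             # valid only when overdispersed (v > m)

def k_mle(counts):
    m = counts.mean()
    nll = lambda k: -stats.nbinom.logpmf(counts, k, k / (k + m)).sum()
    return minimize_scalar(nll, bounds=(1e-3, 100.0), method="bounded").x

rng = np.random.default_rng(1)
for mean_burden, k_true in [(5.0, 0.5), (20.0, 1.5)]:   # k rising with mean
    c = rng.negative_binomial(k_true, k_true / (k_true + mean_burden), 500)
    print(mean_burden, round(k_moments(c), 2), round(k_mle(c), 2))
```

An increase in the estimated k with mean burden, as in the second simulated sample, is the population-scale pattern the paper interprets as indirect evidence of density dependence.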

Relevance: 20.00%

Abstract:

Crushing and grinding are the most energy-intensive part of the mineral recovery process, and a major part of rock size reduction occurs in tumbling mills. Empirical models for the power draw of tumbling mills do not consider the effect of lifters. Discrete element modelling was used to investigate the effect of lifter condition on the power draw of a tumbling mill. Results obtained with the PFC3D code show that lifter condition has a significant influence on the power draw and on the mode of energy consumption in the mill. Relatively high lifters will consume less power than low lifters under otherwise identical conditions. The fraction of the power consumed as friction increases as the height of the lifters decreases, leaving less power for the high-intensity comminution caused by impacts. The fraction of the power used to overcome frictional resistance is determined by the material's coefficient of friction; based on the modelled results, the effective coefficient of friction for the in situ mill appears to be close to 0.1. (C) 2003 Elsevier Science Ltd. All rights reserved.
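The friction-versus-impact energy split behind this result can be illustrated with simple bookkeeping. A hedged sketch that tallies Coulomb sliding work against normal (impact) dissipation over a batch of contacts; the arrays are made-up stand-ins for DEM contact output, not PFC3D results.

```python
import numpy as np

rng = np.random.default_rng(2)
n_contacts = 10_000
normal_force = rng.uniform(0.0, 1e3, n_contacts)   # N, per contact
slip = rng.uniform(0.0, 1e-3, n_contacts)          # m slid per timestep
impact_energy = rng.uniform(0.0, 0.5, n_contacts)  # J, normal dissipation
mu = 0.1                                           # effective friction coeff.

friction_energy = mu * normal_force * slip         # Coulomb sliding work (J)
total = friction_energy.sum() + impact_energy.sum()
print(f"friction fraction of dissipated power: {friction_energy.sum()/total:.2f}")
```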

Relevance: 20.00%

Abstract:

Low concentrate density from wet drum magnetic separators in dense medium circuits can cause operating difficulties through an inability to obtain the required circulating medium density and, indirectly, high medium solids losses. The literature is almost silent on the processes controlling concentrate density; however, the common name for the region through which concentrate is discharged, the squeeze pan gap, implies that some extrusion process is thought to be at work. There is no model of magnetics recovery in a wet drum magnetic separator which includes as inputs all significant machine and operating variables. A series of trials, in both factorial experiments and single-variable experiments, was done using a purpose-built rig featuring a small industrial-scale (700 mm lip length, 900 mm drum diameter) wet drum magnetic separator. A substantial dataset of 191 trials was generated in this work. The results of the factorial experiments were used to identify the variables having a significant effect on magnetics recovery. It is proposed, based both on the experimental observations of the present work and on observations reported in the literature, that the process controlling magnetic separator concentrate density is one of drainage. Such a process should be definable by an initial moisture, a drainage rate and a drainage time, the latter being set by the volumetric flowrate and the volume within the drainage zone. The magnetics can be characterised by an experimentally derived ultimate drainage moisture. A model based on these concepts and containing adjustable parameters was developed, fitted to a randomly chosen 80% of the data, and validated by application to the remaining 20%. The model is shown to be a good fit to data over concentrate solids contents from 40% to 80% solids, and for both magnetite and ferrosilicon feeds. (C) 2003 Elsevier Science B.V. All rights reserved.
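One way to read the proposed drainage description is as a relaxation of concentrate moisture toward the magnetics' ultimate drainage moisture over the residence time in the drainage zone. The exponential form and every number below are assumptions for illustration; the paper's fitted model may differ.

```python
import numpy as np

def concentrate_moisture(m_initial, m_ultimate, drain_rate,
                         zone_volume, volumetric_flow):
    """Moisture fraction of the discharged concentrate."""
    t_drain = zone_volume / volumetric_flow      # drainage time (s)
    return m_ultimate + (m_initial - m_ultimate) * np.exp(-drain_rate * t_drain)

m = concentrate_moisture(m_initial=0.60, m_ultimate=0.20, drain_rate=0.5,
                         zone_volume=0.02, volumetric_flow=0.004)
print(f"concentrate solids content: {1 - m:.0%}")
```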

Relevance: 20.00%

Abstract:

Today, the standard approach for the kinetic analysis of dynamic PET studies is compartment models, in which the tracer and its metabolites are confined to a few well-mixed compartments. We examine whether the standard model is suitable for modern PET data or whether theories including more physiologic realism can advance the interpretation of dynamic PET data. A more detailed microvascular theory is developed for intravascular tracers in single-capillary and multiple-capillary systems. The microvascular models, which account for concentration gradients in capillaries, are validated and compared with the standard model in a pig liver study. Methods: Eight pigs underwent a 5-min dynamic PET study after ¹⁵O-carbon monoxide inhalation. Throughout each experiment, hepatic arterial blood and portal venous blood were sampled, and flow was measured with transit-time flow meters. The hepatic dual-inlet concentration was calculated as the flow-weighted inlet concentration. Dynamic PET data were analyzed with a traditional single-compartment model and 2 microvascular models. Results: Microvascular models provided a better fit to the tissue activity of an intravascular tracer than did the compartment model; in particular, the early dynamic phase after a tracer bolus injection was much improved. The regional hepatic blood flow estimates provided by the microvascular models (1.3 ± 0.3 mL min⁻¹ mL⁻¹ for the single-capillary model and 1.14 ± 0.14 mL min⁻¹ mL⁻¹ for the multiple-capillary model; mean ± SEM, mL of blood min⁻¹ mL of liver tissue⁻¹) were in agreement with the total blood flow measured by flow meters and normalized to liver weight (1.03 ± 0.12 mL min⁻¹ mL⁻¹). Conclusion: Compared with the standard compartment model, the 2 microvascular models provide a superior description of tissue activity after an intravascular tracer bolus injection. The microvascular models include only parameters with a clear-cut physiologic interpretation and are applicable to capillary beds in any organ. In this study, the microvascular models were validated for the liver and provided quantitative regional flow estimates in agreement with flow measurements.
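For contrast with the microvascular models, the traditional single-compartment analysis of an intravascular tracer can be sketched as fitting a convolution of the dual-inlet input with a single exponential. A minimal illustration on synthetic curves, not the pig data.

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0.0, 300.0, 151)          # s; a 5-min dynamic study
c_in = t * np.exp(-t / 30.0)              # synthetic dual-inlet input curve

def compartment(t, F, V):
    """C_t(t) = F * [exp(-(F/V) t) * C_in](t), discrete convolution."""
    dt = t[1] - t[0]
    irf = F * np.exp(-(F / V) * t)        # impulse response of the model
    return np.convolve(c_in, irf)[: t.size] * dt

rng = np.random.default_rng(3)
c_obs = compartment(t, 0.02, 0.25) + rng.normal(0.0, 0.1, t.size)
(F, V), _ = curve_fit(compartment, t, c_obs, p0=(0.01, 0.5))
print(f"flow: {F * 60:.2f} mL min^-1 mL^-1 (simulated truth: 1.20)")
```

The microvascular models replace the single exponential impulse response with transit-time distributions that resolve concentration gradients along the capillary, which is what improves the early-phase fit described above.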