160 results for Discrete Element Modelling


Relevance: 20.00%

Abstract:

The finite element method is used to simulate coupled problems that describe the related physical and chemical processes of ore body formation and mineralization in geological and geochemical systems. The main purpose of this paper is to illustrate simulation results for different types of modelling problems in pore-fluid saturated rock masses. The simulation results presented here aim: (1) to give a better understanding of the processes and mechanisms of ore body formation and mineralization in the upper crust of the Earth; (2) to demonstrate the usefulness and applicability of the finite element method in dealing with a wide range of coupled problems in geological and geochemical systems; and (3) to qualitatively establish a set of showcase problems against which any numerical method or computer package can reasonably be validated. (C) 2002 Published by Elsevier Science B.V.
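To make the setting concrete, here is a minimal sketch of a 1-D finite element solver for a single steady diffusion equation, a toy stand-in for one field of such a coupled pore-fluid problem; the mesh, material values, and boundary conditions are invented for illustration and are not the paper's model.

```python
import numpy as np

def fem_1d_diffusion(n_elem=50, length=1.0, k=1.0, q=1.0, T0=0.0, T1=0.0):
    """Solve -d/dx(k dT/dx) = q on [0, length] with fixed end values."""
    n = n_elem + 1
    h = length / n_elem
    K = np.zeros((n, n))                                  # global stiffness
    f = np.zeros(n)                                       # global load
    ke = (k / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])   # element stiffness
    fe = (q * h / 2.0) * np.ones(2)                       # element load
    for e in range(n_elem):
        idx = [e, e + 1]
        K[np.ix_(idx, idx)] += ke
        f[idx] += fe
    for node, val in ((0, T0), (n - 1, T1)):              # Dirichlet ends
        K[node, :] = 0.0
        K[node, node] = 1.0
        f[node] = val
    return np.linalg.solve(K, f)

print(fem_1d_diffusion()[25])   # mid-domain value; exact answer is q/(8k) = 0.125
```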

Relevance: 20.00%

Abstract:

The principle of using induction rules based on spatial environmental data to model a soil map has previously been demonstrated. Whilst the general pattern of classes of large spatial extent, and of those with a close association with geology, was delineated, small classes and the detailed spatial pattern of the map were less well rendered. Here we examine several strategies to improve the quality of the soil map models generated by rule induction. Terrain attributes that are better suited to landscape description at a resolution of 250 m are introduced as predictors of soil type. A map sampling strategy is developed. Classification error is reduced by using boosting rather than cross-validation to improve the model. Further, the benefit of incorporating the local spatial context of each environmental variable into the rule induction is examined. The best model was achieved by sampling in proportion to the spatial extent of the mapped classes, boosting the decision trees, and using spatial contextual information extracted from the environmental variables.
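As a rough sketch of the winning strategy (boosted decision trees on terrain attributes plus a local spatial-context feature), the following uses scikit-learn on synthetic grids; the variables, grid size, and neighbourhood definition are assumptions, not the paper's data.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
elevation = rng.random((100, 100))          # stand-in 250 m terrain grids
slope = rng.random((100, 100))
soil_map = rng.integers(0, 5, (100, 100))   # mapped soil classes 0..4

def with_context(grid):
    """Stack each variable with its 3x3 neighbourhood mean (spatial context)."""
    return np.stack([grid, uniform_filter(grid, size=3)], axis=-1)

X = np.concatenate([with_context(elevation), with_context(slope)],
                   axis=-1).reshape(-1, 4)
y = soil_map.ravel()

# Sampling in proportion to the spatial extent of each class is just a
# uniform random sample over all grid cells.
sample = rng.choice(len(y), size=2000, replace=False)

clf = AdaBoostClassifier(DecisionTreeClassifier(max_depth=5),
                         n_estimators=50, random_state=0)
clf.fit(X[sample], y[sample])
print(clf.score(X[sample], y[sample]))      # resubstitution accuracy only
```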

Relevance: 20.00%

Abstract:

In contrast to curative therapies, preventive therapies are administered to largely healthy individuals over long periods. The risk-benefit and cost-benefit ratios are more likely to be unfavourable, making treatment decisions difficult. Drug trials provide insufficient information for treatment decisions, as they are conducted on highly selected populations over short durations, estimate only relative benefits of treatment and offer little information on risks and costs. Epidemiological modelling is a method of combining evidence from observational epidemiology and clinical trials to assist in clinical and health policy decision-making. It can estimate absolute benefits, risks and costs of long-term preventive strategies, and thus allow their precise targeting to individuals for whom they are safest and most cost-effective. Epidemiological modelling also allows explicit information about risks and benefits of therapy to be presented to patients, facilitating informed decision-making.
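The distinction between relative and absolute benefit is easy to illustrate numerically; in the hypothetical sketch below, the same relative risk reduction from a trial implies very different absolute benefit, and number needed to treat, at different baseline risks.

```python
# All numbers are hypothetical, purely to show the arithmetic.
def absolute_benefit(baseline_risk, relative_risk_reduction):
    arr = baseline_risk * relative_risk_reduction   # absolute risk reduction
    nnt = 1.0 / arr                                 # number needed to treat
    return arr, nnt

for baseline in (0.20, 0.02):                       # high-risk vs low-risk individual
    arr, nnt = absolute_benefit(baseline, 0.30)     # 30% RRR from a trial
    print(f"baseline {baseline:.0%}: ARR {arr:.1%}, NNT {nnt:.0f}")
```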

Relevance: 20.00%

Abstract:

This paper presents an agent-based approach to modelling individual driver behaviour under the influence of real-time traffic information. The driver behaviour models developed in this study are based on a behavioural survey of drivers which was conducted on a congested commuting corridor in Brisbane, Australia. Commuters' responses to travel information were analysed and a number of discrete choice models were developed to determine the factors influencing drivers' behaviour and their propensity to change route and adjust travel patterns. Based on the results obtained from the behavioural survey, the agent behaviour parameters which define driver characteristics, knowledge and preferences were identified and their values determined. A case study implementing a simple agent-based route choice decision model within a microscopic traffic simulation tool is also presented. Driver-vehicle units (DVUs) were modelled as autonomous software components that can each be assigned a set of goals to achieve and a database of knowledge comprising certain beliefs, intentions and preferences concerning the driving task. Each DVU provided route choice decision-making capabilities, based on perception of its environment, that were similar to the described intentions of the driver it represented. The case study clearly demonstrated the feasibility of the approach and the potential to develop more complex driver behavioural dynamics based on the belief-desire-intention agent architecture. (C) 2002 Elsevier Science Ltd. All rights reserved.
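A minimal sketch of such a driver-vehicle unit, deciding whether to change route via a binary logit choice model driven by its beliefs about travel times, might look as follows; the class, parameter values, and utility form are hypothetical, not the paper's specification.

```python
import math
import random

class DriverVehicleUnit:
    """Toy DVU holding beliefs and a preference parameter for switching."""
    def __init__(self, switch_propensity):
        self.beliefs = {"current_route_min": 30.0}   # believed travel time
        self.switch_propensity = switch_propensity   # driver preference

    def perceive(self, broadcast_travel_time):
        """Update beliefs from real-time traffic information."""
        self.beliefs["alternative_min"] = broadcast_travel_time

    def choose_route(self):
        # Utility difference: minutes saved, scaled by propensity to switch.
        saving = (self.beliefs["current_route_min"]
                  - self.beliefs["alternative_min"])
        p_switch = 1.0 / (1.0 + math.exp(-self.switch_propensity * saving))
        return "alternative" if random.random() < p_switch else "current"

dvu = DriverVehicleUnit(switch_propensity=0.4)
dvu.perceive(broadcast_travel_time=22.0)   # broadcast says 22 min via detour
print(dvu.choose_route())
```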

Relevance: 20.00%

Abstract:

Functional magnetic resonance imaging (FMRI) analysis methods can be quite generally divided into hypothesis-driven and data-driven approaches. The former are utilised in the majority of FMRI studies, where a specific haemodynamic response is modelled utilising knowledge of event timing during the scan, and is tested against the data using a t test or a correlation analysis. These approaches often lack the flexibility to account for variability in haemodynamic response across subjects and brain regions, which is of specific interest in high-temporal-resolution event-related studies. Current data-driven approaches attempt to identify components of interest in the data, but do not utilise any physiological information for the discrimination of these components. Here we present a hypothesis-driven approach that is an extension of Friman's maximum correlation modelling method (NeuroImage 16, 454-464, 2002), specifically focused on discriminating the temporal characteristics of event-related haemodynamic activity. Test analyses, on both simulated and real event-related FMRI data, are presented.
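As a much-simplified stand-in for this kind of flexible temporal modelling, the sketch below fits a voxel time course with a small basis (a gamma-shaped response plus its temporal derivative) and reports the correlation of the best-fitting combination; it illustrates the general idea only and is not Friman's algorithm.

```python
import numpy as np

t = np.arange(0.0, 200.0, 2.0)                  # scan times (s), TR = 2 s

def hrf(ts, peak=6.0):
    """Crude gamma-shaped haemodynamic response (zero before onset)."""
    return (ts / peak) ** 2 * np.exp(-(ts - peak) / 4.0) * (ts > 0)

onsets = np.arange(10.0, 190.0, 30.0)           # assumed event timings
basis1 = sum(hrf(t - o) for o in onsets)        # canonical-like regressor
basis2 = np.gradient(basis1)                    # derivative: latency shifts
X = np.column_stack([basis1, basis2])

rng = np.random.default_rng(1)
y = 1.0 * basis1 + 0.5 * basis2 + rng.normal(0.0, 0.5, t.size)  # fake voxel

beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # best linear combination
print(np.corrcoef(X @ beta, y)[0, 1])           # maximised correlation
```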

Relevance: 20.00%

Abstract:

A two-component survival mixture model is proposed to analyse a set of ischaemic stroke-specific mortality data. The survival experience of stroke patients after index stroke may be described by a subpopulation of patients in the acute condition and another subpopulation of patients in the chronic phase. To adjust for the inherent correlation of observations due to random hospital effects, a mixture model of two survival functions with random effects is formulated. Assuming a Weibull hazard in both components, an EM algorithm is developed for the estimation of fixed effect parameters and variance components. A simulation study is conducted to assess the performance of the two-component survival mixture model estimators. Simulation results confirm the applicability of the proposed model in a small sample setting. Copyright (C) 2004 John Wiley & Sons, Ltd.
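A toy EM iteration for a two-component Weibull mixture (ignoring censoring and the random hospital effects the paper adds) can be sketched as follows; the simulated data and all starting values are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(2)
t = np.concatenate([
    weibull_min.rvs(0.8, scale=2.0, size=300, random_state=rng),   # "acute"
    weibull_min.rvs(2.5, scale=20.0, size=300, random_state=rng),  # "chronic"
])

pi = 0.5
params = [np.array([1.0, 1.0]), np.array([1.0, 10.0])]  # (shape, scale)

def nll(log_p, w):
    shape, scale = np.exp(log_p)        # log-parameterised for positivity
    return -np.sum(w * weibull_min.logpdf(t, shape, scale=scale))

for _ in range(50):
    # E-step: posterior probability each observation is from component 1
    f1 = weibull_min.pdf(t, params[0][0], scale=params[0][1])
    f2 = weibull_min.pdf(t, params[1][0], scale=params[1][1])
    r = pi * f1 / (pi * f1 + (1 - pi) * f2)
    # M-step: update mixing weight and weighted Weibull fits
    pi = r.mean()
    for j, w in enumerate((r, 1 - r)):
        res = minimize(nll, np.log(params[j]), args=(w,), method="Nelder-Mead")
        params[j] = np.exp(res.x)

print(round(pi, 3), [p.round(2) for p in params])
```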

Relevance: 20.00%

Abstract:

Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been successfully used for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as an experiment analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
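As a concrete instance of one listed method, a quantitative (position-specific scoring) matrix reduces prediction to summing per-position scores over a peptide; the matrix below is random, purely to show the mechanics, and the 9-mer is arbitrary.

```python
import numpy as np

AMINO = "ACDEFGHIKLMNPQRSTVWY"
rng = np.random.default_rng(3)
pssm = rng.normal(0.0, 1.0, (9, len(AMINO)))   # hypothetical trained weights

def score_peptide(pep):
    """Sum per-position scores; a higher total suggests a binder."""
    assert len(pep) == 9
    return sum(pssm[i, AMINO.index(aa)] for i, aa in enumerate(pep))

print(score_peptide("SIINFEKLV"))   # arbitrary 9-mer for illustration
```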

Relevance: 20.00%

Abstract:

Carbon monoxide, the chief killer in fires, and other species are modelled for a series of enclosure fires. The conditions emulate building fires where CO is formed in the rich, turbulent, nonpremixed flame and is transported frozen to lean mixtures by the ceiling jet which is cooled by radiation and dilution. Conditional moment closure modelling is used and computational domain minimisation criteria are developed which reduce the computational cost of this method. The predictions give good agreement for CO and other species in the lean, quenched-gas stream, holding promise that this method may provide a practical means of modelling real, three-dimensional fire situations. (c) 2005 The Combustion Institute. Published by Elsevier Inc. All rights reserved.
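The central object in conditional moment closure is the conditional mean of a reactive scalar given mixture fraction, Q(eta) = <Y | Z = eta>; the sketch below estimates such a conditional mean by binning synthetic samples, illustrating the conditioning idea only, not the paper's closure or fire model.

```python
import numpy as np

rng = np.random.default_rng(4)
Z = rng.beta(2, 5, 10000)                    # mixture fraction samples
Y_co = 0.08 * Z * (1 - Z) + rng.normal(0.0, 0.002, Z.size)  # fake CO mass fraction

bins = np.linspace(0.0, 1.0, 21)             # 20 mixture-fraction bins
which = np.digitize(Z, bins) - 1
Q = np.array([Y_co[which == b].mean() if np.any(which == b) else np.nan
              for b in range(len(bins) - 1)])
print(np.round(Q, 4))                        # conditional mean <Y_CO | Z>
```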

Relevance: 20.00%

Abstract:

Pollution by polycyclic aromatic hydrocarbons (PAHs) is widespread due to unsuitable disposal of industrial waste. PAHs are mostly defined as priority pollutants by environmental protection authorities worldwide. Phenanthrene, a typical PAH, was selected as the target in this paper. The PAH-degrading mixed culture, named ZM, was collected from a petroleum-contaminated river bed. This culture was injected into phenanthrene solutions at different concentrations to quantify the biodegradation process. Results show near-complete removal of phenanthrene within three days of biodegradation if the initial phenanthrene concentration is low. When the initial concentration is high, the removal rate is increased but 20%-40% of the phenanthrene remains at the end of the experiment. The biomass shows a peak on the third day due to the combined effects of microbial growth and decay. Another peak is evident for cases with a high initial concentration, possibly due to production of an intermediate metabolite. The pH generally decreased during biodegradation because of the production of organic acid. Two phenomenological models were designed to simulate the phenanthrene biodegradation and biomass growth. A relatively simple model that does not consider the intermediate metabolite and its inhibition of phenanthrene biodegradation cannot fit the observed data. A modified Monod model that considers an intermediate metabolite (organic acid) and its inhibitory effect reasonably depicts the experimental results.
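A modified Monod system of the kind described, with substrate S (phenanthrene), biomass X, and an inhibitory intermediate metabolite I, can be sketched as three ODEs; all parameter values below are invented for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

mu_max, Ks, Y, kd = 0.8, 5.0, 0.5, 0.05   # 1/day, mg/L, -, 1/day (assumed)
Ki, alpha, kI = 2.0, 0.3, 0.1             # inhibition const., yield, decay

def rhs(t, y):
    S, X, I = y
    mu = mu_max * S / (Ks + S) * Ki / (Ki + I)   # Monod x metabolite inhibition
    dS = -mu * X / Y                             # substrate consumption
    dX = mu * X - kd * X                         # growth minus decay
    dI = -alpha * dS - kI * I                    # metabolite produced from S
    return [dS, dX, dI]

sol = solve_ivp(rhs, (0.0, 10.0), [50.0, 1.0, 0.0])
print(sol.y[:, -1])   # final S, X, I after 10 days
```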

Relevance: 20.00%

Abstract:

Light is generally regarded as the most likely cue used by zooplankton to regulate their vertical movements through the water column. However, the way in which light is used by zooplankton as a cue is not well understood. In this paper we present a mathematical model of diel vertical migration which produces vertical distributions of zooplankton that vary in space and time. The model is used to predict the patterns of vertical distribution which result when animals are assumed to adopt one of three commonly proposed mechanisms for vertical swimming. First, we assume zooplankton tend to swim towards a preferred intensity of light. We then assume zooplankton swim in response to either the rate of change in light intensity or the relative rate of change in light intensity. The model predicts that for all three mechanisms movement is fastest at sunset and sunrise and populations are primarily influenced by eddy diffusion at night in the absence of a light stimulus. Daytime patterns of vertical distribution differ between the three mechanisms and the reasons for the predicted differences are discussed. Swimming responses to properties of the light field are shown to be adequate for describing diel vertical migration where animals congregate in near surface waters during the evening and reside at deeper depths during the day. However, the model is unable to explain how some populations halt their ascent before reaching surface waters or how populations re-congregate in surface waters a few hours before sunrise, a phenomenon which is sometimes observed in the field. The model results indicate that other exogenous or endogenous factors besides light may play important roles in regulating vertical movement.
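A sketch of the first mechanism (swimming toward a preferred light intensity) in a 1-D water column with eddy diffusion is given below; the grid, speeds, light model, and attenuation are hypothetical, and the crude explicit scheme is for illustration only, not the paper's numerics.

```python
import numpy as np

nz, dz, dt = 100, 1.0, 10.0              # 100 m column, 1 m cells, 10 s steps
z = np.arange(nz) * dz                   # depth, positive downward
K = 1e-3                                 # eddy diffusivity (m^2/s)
w_max, I_pref, k_atten = 0.01, 1.0, 0.1  # swim speed, preferred light, attenuation

P = np.ones(nz) / nz                     # initially uniform population

def surface_light(t_sec):
    """Crude half-sinusoid daylight cycle starting at dawn."""
    return max(0.0, 100.0 * np.sin(2.0 * np.pi * t_sec / 86400.0))

for step in range(6 * 360):              # six hours of 10 s steps
    I = surface_light(step * dt) * np.exp(-k_atten * z)
    # Mechanism 1: swim toward the preferred isolume; no light cue, no swimming,
    # so only eddy diffusion acts at night, as in the text.
    w = np.where(I > 0.0, w_max * np.tanh(I - I_pref), 0.0)
    dP = -np.gradient(w * P, dz) + K * np.gradient(np.gradient(P, dz), dz)
    P = np.clip(P + dt * dP, 0.0, None)

print(z[np.argmax(P)])                   # depth of the population maximum
```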

Relevance: 20.00%

Abstract:

Izenman and Sommer (1988) used a non-parametric kernel density estimation technique to fit a seven-component model to the paper thickness of the 1872 Hidalgo stamp issue of Mexico. They observed an apparent conflict when fitting a normal mixture model with three components with unequal variances. This conflict is examined further by investigating the most appropriate number of components when fitting a normal mixture of components with equal variances.
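The question of the most appropriate number of equal-variance components can be posed concretely, for example by fitting tied-covariance Gaussian mixtures and comparing an information criterion; the sketch below does this on synthetic data standing in for the stamp thicknesses (the paper's actual selection procedure may differ).

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(0.072, 0.004, 300),   # fake thicknesses (mm)
                    rng.normal(0.080, 0.004, 200),
                    rng.normal(0.090, 0.004, 100)]).reshape(-1, 1)

bics = []
for k in range(1, 8):
    gm = GaussianMixture(n_components=k, covariance_type="tied",  # equal variances
                         random_state=0).fit(x)
    bics.append(gm.bic(x))
print(1 + int(np.argmin(bics)))   # BIC-preferred number of components
```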

Relevance: 20.00%

Abstract:

In order to analyse the effect of modelling assumptions in a formal, rigorous way, a syntax of modelling assumptions has been defined. This syntax enables us to represent modelling assumptions as transformations acting on the set of model equations. The notions of syntactical correctness and semantical consistency of sets of modelling assumptions are defined, and methods for checking them are described. A simple example shows how different modelling assumptions act on the model equations, and their effect on the differential index of the resulting model is also indicated.
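The core idea, model equations as data and an assumption as a transformation acting on them, can be sketched directly; below, a hypothetical "steady-state" assumption substitutes zero for the time derivative in a toy balance equation using sympy. This is an illustration of the concept, not the paper's formal syntax.

```python
import sympy as sp

t = sp.symbols("t")
x = sp.Function("x")(t)
F_in, F_out, V = sp.symbols("F_in F_out V", positive=True)

# Toy mass balance: V * dx/dt = F_in - F_out * x
equations = [sp.Eq(V * sp.Derivative(x, t), F_in - F_out * x)]

def assume_steady_state(eqs):
    """Transformation: replace every time derivative by zero."""
    return [eq.subs(sp.Derivative(x, t), 0) for eq in eqs]

for eq in assume_steady_state(equations):
    print(eq)   # Eq(0, F_in - F_out*x(t)): the ODE became algebraic
```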

Relevance: 20.00%

Abstract:

This investigation focused on the finite element analyses of elastic and plastic properties of aluminium/alumina composite materials with ultrafine microstructure. The commonly used unit cell model was used to predict the elastic properties. By combining the unit cell model with an indentation model, coupled with experimental indentation measurements, the plastic properties of the composites and the associated strengthening mechanism within the metal matrix material were investigated. The grain size of the matrix material was found to be an important factor influencing the mechanical properties of the composites studied. (C) 1997 Elsevier Science S.A.
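As a back-of-envelope stand-in for the unit cell homogenisation, the Voigt and Reuss bounds bracket the composite Young's modulus such a model would predict; the property values and volume fraction below are typical handbook numbers, not the paper's measurements.

```python
# Voigt (uniform strain) and Reuss (uniform stress) bounds for a
# two-phase aluminium/alumina composite. Values are assumed, not the paper's.
E_al, E_alumina = 70.0, 380.0      # Young's moduli (GPa), typical handbook values
vf = 0.15                          # alumina volume fraction (assumed)

E_voigt = vf * E_alumina + (1 - vf) * E_al            # upper bound
E_reuss = 1.0 / (vf / E_alumina + (1 - vf) / E_al)    # lower bound
print(f"E between {E_reuss:.1f} and {E_voigt:.1f} GPa")
```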