225 results for structural modelling
Abstract:
Environmental processes have been modelled for decades. However, the need for integrated assessment and modelling (IAM) has grown as the extent and severity of environmental problems in the 21st Century worsen. The scale of IAM is not restricted to the global level, as in climate change models, but includes local and regional models of environmental problems. This paper discusses various definitions of IAM and identifies five different types of integration that are needed for the effective solution of environmental problems. The future is then depicted in the form of two brief scenarios: one optimistic and one pessimistic. The current state of IAM is then briefly reviewed. The issues of complexity and validation in IAM are recognised as more complex than in traditional disciplinary approaches. Communication is identified as a central issue, both internally among team members and externally with decision-makers, stakeholders and other scientists. Finally, it is concluded that the process of integrated assessment and modelling is as important as the product for any particular project. By learning to work together and recognise the contribution of all team members and participants, it is believed that we will have a strong scientific and social basis to address the environmental problems of the 21st Century. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
Field and laboratory observations have shown that a relatively low beach groundwater table enhances beach accretion. These observations have led to the beach dewatering technique (artificially lowering the beach water table) for combating beach erosion. Here we present a process-based numerical model that simulates the interacting processes of wave motion on the beach, coastal groundwater flow, swash sediment transport and beach profile changes. Results of model simulations demonstrate that the model replicates the accretionary effects of a low beach water table on beach profile changes and has the potential to become a tool for assessing the effectiveness of beach dewatering systems. (C) 2002 Elsevier Science Ltd. All rights reserved.
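To make the groundwater side of such a model concrete, here is a minimal sketch of one explicit finite-difference step of the one-dimensional Boussinesq equation for the beach water table, dh/dt = (K/ne) d/dx (h dh/dx). All parameter values, the grid, and the lowered-boundary "drain" condition are illustrative assumptions; the paper's coupled wave/groundwater/sediment scheme is far more elaborate.

```python
import numpy as np

K, ne = 1e-3, 0.3                 # hydraulic conductivity (m/s), porosity (assumed)
dx, dt = 0.5, 10.0                # grid spacing (m), time step (s)
h = np.full(100, 2.0)             # water-table elevation above datum (m)
h[0] = 1.2                        # lowered boundary, e.g. a dewatering drain

for _ in range(1000):
    # flux h * dh/dx evaluated at cell faces, with h averaged onto the faces
    flux = K * 0.5 * (h[1:] + h[:-1]) * np.diff(h) / dx
    # explicit update of interior nodes; boundary values are held fixed
    h[1:-1] += dt / ne * np.diff(flux) / dx

print(h[:5])                      # water table relaxing toward the drain
```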
Abstract:
We present finite element simulations of reactive mineral-carrying fluids mixing and mineralization in pore-fluid saturated hydrothermal/sedimentary basins. In particular, we explore the mixing of reactive sulfide and sulfate fluids and the relevant patterns of mineralization for lead, zinc and iron minerals in the regime of temperature-gradient-driven convective flow. Since mineralization and ore body formation may last a long period of time in a hydrothermal basin, it is commonly assumed in geochemistry that the mineral solutions are in, or near, an equilibrium state. Therefore, the mineralization rate of a particular kind of mineral can be expressed as the product of the pore-fluid velocity and the equilibrium concentration of that mineral. Using this expression for the mineralization rate, the potential of the modern mineralization theory is illustrated by means of finite element studies of reactive mineral-carrying fluid mixing problems in materially homogeneous and inhomogeneous porous rock basins.
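Taking the stated rate law at face value, a minimal sketch of evaluating it at finite-element nodes might look as follows; the arrays, units and values are hypothetical, and the full theory couples this rate to the convective flow solution.

```python
import numpy as np

def mineralization_rate(velocity, c_eq):
    """Mineralization rate as the product of pore-fluid velocity magnitude
    and the mineral's equilibrium concentration (as stated in the abstract).

    velocity : (n, 2) array of pore-fluid velocity vectors [m/s]
    c_eq     : (n,)   array of equilibrium concentrations [kg/m^3]
    """
    speed = np.linalg.norm(velocity, axis=1)   # |u| at each node
    return speed * c_eq                        # [kg m^-2 s^-1]

# hypothetical values at three finite-element nodes
u = np.array([[1e-8, 2e-8], [5e-9, 1e-8], [2e-8, 0.0]])
c = np.array([0.04, 0.05, 0.03])
print(mineralization_rate(u, c))
```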
Abstract:
The principle of using induction rules based on spatial environmental data to model a soil map has previously been demonstrated. Whilst the general pattern of classes of large spatial extent, and those with close association with geology, were delineated, small classes and the detailed spatial pattern of the map were less well rendered. Here we examine several strategies to improve the quality of the soil map models generated by rule induction. Terrain attributes that are better suited to landscape description at a resolution of 250 m are introduced as predictors of soil type. A map sampling strategy is developed. Classification error is reduced by using boosting rather than cross-validation to improve the model. Further, the benefit of incorporating the local spatial context for each environmental variable into the rule induction is examined. The best model was achieved by sampling in proportion to the spatial extent of the mapped classes, boosting the decision trees, and using spatial contextual information extracted from the environmental variables.
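As an illustration of the boosted-tree strategy that produced the best model, here is a minimal sketch using scikit-learn's AdaBoost in place of the authors' rule-induction software; the terrain attributes, class labels and data are invented for illustration only.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# hypothetical training table: terrain attributes sampled at 250 m cells,
# e.g. slope, wetness index, curvature, relief
X = rng.normal(size=(1000, 4))
y = rng.integers(0, 5, size=1000)        # mapped soil class labels

# sampling in proportion to the spatial extent of each class is achieved by
# simple random sampling over the map: larger classes contribute more cells,
# so no reweighting is applied here

model = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=5),  # boosted decision trees
    n_estimators=50,
)
model.fit(X, y)
print(model.score(X, y))                 # training accuracy of the ensemble
```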
Abstract:
In contrast to curative therapies, preventive therapies are administered to largely healthy individuals over long periods. The risk-benefit and cost-benefit ratios are more likely to be unfavourable, making treatment decisions difficult. Drug trials provide insufficient information for treatment decisions, as they are conducted on highly selected populations over short durations, estimate only relative benefits of treatment and offer little information on risks and costs. Epidemiological modelling is a method of combining evidence from observational epidemiology and clinical trials to assist in clinical and health policy decision-making. It can estimate absolute benefits, risks and costs of long-term preventive strategies, and thus allow their precise targeting to individuals for whom they are safest and most cost-effective. Epidemiological modelling also allows explicit information about risks and benefits of therapy to be presented to patients, facilitating informed decision-making.
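The core arithmetic behind such targeting can be shown in a few lines: combining a trial's relative risk with an individual's absolute baseline risk yields the absolute risk reduction and number needed to treat. The numbers below are illustrative, not taken from the paper.

```python
def absolute_benefit(baseline_risk, relative_risk):
    """Absolute risk reduction (ARR) and number needed to treat (NNT)."""
    arr = baseline_risk * (1.0 - relative_risk)
    nnt = 1.0 / arr
    return arr, nnt

# e.g. a hypothetical preventive drug with relative risk 0.75 over 5 years:
for risk in (0.02, 0.10, 0.30):          # low-, medium-, high-risk individuals
    arr, nnt = absolute_benefit(risk, 0.75)
    print(f"baseline {risk:.0%}: ARR = {arr:.1%}, NNT = {nnt:.0f}")
```

The same relative benefit translates into a far larger absolute benefit (and smaller NNT) for high-risk individuals, which is why such modelling allows preventive therapy to be targeted precisely.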
Abstract:
The thermal ecology and structural habitat use of two closely related sympatric lizards, Carlia vivax (de Vis) and Lygisaurus foliorum de Vis, were examined in an open sclerophyll forest in subtropical Australia. Comparable mean body temperatures (T-b) and habitat temperatures (T-hab) at the point of capture were recorded for both species. However, sex-related differences in the thermal variables for C. vivax, with females displaying higher temperatures than males, resulted in some significant differences in T-b and T-hab between the species. Variation in T-b and T-hab within and between species was unrelated to time of capture. The difference in T-hab within C. vivax suggested that females were selecting warmer thermal environments than males. Both C. vivax and L. foliorum used most structural features of their habitat randomly, as indicated by a similarity in canopy, shrub, ground, log and litter cover and litter depth between habitat surveys and random surveys. However, C. vivax displayed a preference for ground vegetation (height
Abstract:
This paper presents an agent-based approach to modelling individual driver behaviour under the influence of real-time traffic information. The driver behaviour models developed in this study are based on a behavioural survey of drivers which was conducted on a congested commuting corridor in Brisbane, Australia. Commuters' responses to travel information were analysed and a number of discrete choice models were developed to determine the factors influencing drivers' behaviour and their propensity to change route and adjust travel patterns. Based on the results obtained from the behavioural survey, the agent behaviour parameters which define driver characteristics, knowledge and preferences were identified and their values determined. A case study implementing a simple agent-based route choice decision model within a microscopic traffic simulation tool is also presented. Driver-vehicle units (DVUs) were modelled as autonomous software components that can each be assigned a set of goals to achieve and a database of knowledge comprising certain beliefs, intentions and preferences concerning the driving task. Each DVU provided route choice decision-making capabilities, based on perception of its environment, that were similar to the described intentions of the driver it represented. The case study clearly demonstrated the feasibility of the approach and the potential to develop more complex driver behavioural dynamics based on the belief-desire-intention agent architecture. (C) 2002 Elsevier Science Ltd. All rights reserved.
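A minimal sketch of what one such agent decision rule could look like, assuming a binary logit model of route switching in response to travel information; the coefficients, attributes and class names are hypothetical, not those estimated in the Brisbane survey.

```python
import math
import random

def switch_probability(delay_saved_min, familiarity, beta0=-1.5,
                       beta_delay=0.25, beta_fam=0.8):
    """Logit probability that a driver diverts to an alternative route."""
    v = beta0 + beta_delay * delay_saved_min + beta_fam * familiarity
    return 1.0 / (1.0 + math.exp(-v))

class DriverVehicleUnit:
    """Autonomous agent holding a driver's knowledge and preferences."""
    def __init__(self, familiarity):
        self.familiarity = familiarity      # 1 if the alternative route is known

    def decide(self, delay_saved_min):
        p = switch_probability(delay_saved_min, self.familiarity)
        return random.random() < p          # stochastic route-choice decision

dvu = DriverVehicleUnit(familiarity=1)
print(dvu.decide(delay_saved_min=10.0))     # True -> divert to alternative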
Abstract:
The development of structure perpendicular to and in the plane of the interface has been studied for mesoporous silicate films self-assembled at the air/water interface. The use of constrained X-ray and neutron specular reflectometry has enabled a detailed study of the structural development perpendicular to the interface during the pre-growth phase. Off-specular neutron reflectometry and grazing incidence X-ray diffraction have enabled the in-plane structure to be probed with excellent time resolution. The growth mechanism under the surfactant to silicate source ratios used in this work is clearly due to the self-assembly of micellar and molecular species at the air/liquid interface, resulting in the formation of a planar mesoporous film that is tens of microns thick. (C) 2003 Elsevier Science B.V. All rights reserved.
Abstract:
Allergy is a major cause of morbidity worldwide. The number of characterized allergens and related information is increasing rapidly creating demands for advanced information storage, retrieval and analysis. Bioinformatics provides useful tools for analysing allergens and these are complementary to traditional laboratory techniques for the study of allergens. Specific applications include structural analysis of allergens, identification of B- and T-cell epitopes, assessment of allergenicity and cross-reactivity, and genome analysis. In this paper, the most important bioinformatic tools and methods with relevance to the study of allergy have been reviewed.
Abstract:
Functional magnetic resonance imaging (FMRI) analysis methods can be quite generally divided into hypothesis-driven and data-driven approaches. The former are utilised in the majority of FMRI studies, where a specific haemodynamic response is modelled utilising knowledge of event timing during the scan, and is tested against the data using a t test or a correlation analysis. These approaches often lack the flexibility to account for variability in haemodynamic response across subjects and brain regions, which is of specific interest in high-temporal-resolution event-related studies. Current data-driven approaches attempt to identify components of interest in the data, but do not yet utilise any physiological information for the discrimination of these components. Here we present a hypothesis-driven approach that is an extension of Friman's maximum correlation modelling method (NeuroImage 16, 454-464, 2002), specifically focused on discriminating the temporal characteristics of event-related haemodynamic activity. Test analyses, on both simulated and real event-related FMRI data, will be presented.
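For orientation, here is a minimal sketch of the hypothesis-driven baseline the authors extend: correlating a modelled haemodynamic regressor with a voxel time series. The HRF shape, event timings and data are illustrative stand-ins, not Friman's maximum correlation method itself.

```python
import numpy as np
from scipy.stats import gamma

n, tr = 200, 2.0                          # volumes, repetition time (s)

# simple two-gamma HRF sampled at the TR (illustrative shape only)
hrf_t = np.arange(0, 32, tr)
hrf = gamma.pdf(hrf_t, 6) - 0.35 * gamma.pdf(hrf_t, 16)

events = np.zeros(n)
events[::20] = 1.0                        # hypothetical onsets every 40 s
regressor = np.convolve(events, hrf)[:n]  # modelled haemodynamic response

# synthetic voxel: scaled response plus noise
voxel = 0.5 * regressor + np.random.default_rng(0).normal(size=n)
r = np.corrcoef(regressor, voxel)[0, 1]   # the test statistic follows from r, n
print(f"correlation r = {r:.2f}")
```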
Abstract:
Quantifying mass and energy exchanges within tropical forests is essential for understanding their role in the global carbon budget and how they will respond to perturbations in climate. This study reviews ecosystem process models designed to predict the growth and productivity of temperate and tropical forest ecosystems. Temperate forest models were included because of the small number of tropical forest models. The review provides a multiscale assessment enabling potential users to select a model suited to the scale and type of information they require in tropical forests. Process models are reviewed in relation to their input and output parameters, minimum spatial and temporal units of operation, maximum spatial extent and time period of application for each organizational level of modelling. Organizational levels included leaf-tree, plot-stand, regional and ecosystem levels, with model complexity decreasing as the time-step and spatial extent of model operation increase. All ecosystem models are simplified versions of reality and are typically aspatial. Remotely sensed data sets and derived products may be used to initialize, drive and validate ecosystem process models. At the simplest level, remotely sensed data are used to delimit location, extent and changes over time of vegetation communities. At a more advanced level, remotely sensed data products have been used to estimate key structural and biophysical properties associated with ecosystem processes in tropical and temperate forests. Combining ecological models and image data enables the development of carbon accounting systems that will contribute to understanding greenhouse gas budgets at biome and global scales.
Abstract:
Exponential and sigmoidal functions have been suggested to describe the bulk density profiles of crusts. The present work aims to evaluate these conceptual models using high resolution X-radiography. Repacked seedbeds from two soil materials, air-dried or prewetted by capillary rise, were subjected to simulated rain, which resulted in three types of structural crusts, namely, slaking, infilling, and coalescing. Bulk density distributions with depth were generated using high-resolution (70 µm), calibrated X-ray images of slices from the resin-impregnated crusted seedbeds. The bulk density decreased progressively with depth, which supports the suggestion that a crust should be considered as a nonuniform layer. For the slaking and the coalescing crusts, the exponential function underestimated the strong change in bulk density across the morphologically defined transition between the crust and the underlying material; the sigmoidal function provided a better description. Neither of these crust models effectively described the shape of the bulk density profiles through the whole seedbed. Below the infilling and slaking crusts, bulk density increased linearly with depth as a result of slumping. In the coalescing crusted seedbed, the whole seedbed collapsed uniformly, and most of the bulk density change within the crust could be ascribed to slumping (0.33 g cm⁻³) rather than to crusting (0.12 g cm⁻³). Finally, (i) X-radiography appears to be a unique tool to generate high resolution bulk density profiles, and (ii) in structural crusts, bulk density profiles could be modeled using the existing exponential and sigmoidal crusting models, provided a slumping model is coupled with them.
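The two candidate profile shapes, and the linear slumping term the authors suggest coupling to them, can be sketched as simple functions; parameter names and values below are illustrative, and the paper's exact parameterisations may differ in detail.

```python
import numpy as np

def exponential_crust(z, rho_sub, drho, k):
    """Bulk density rising exponentially from the surface toward rho_sub."""
    return rho_sub - drho * np.exp(-z / k)

def sigmoidal_crust(z, rho_sub, drho, z0, s):
    """Logistic transition across the crust/subsoil boundary at depth z0."""
    return rho_sub - drho / (1.0 + np.exp((z - z0) / s))

def with_slumping(rho, z, a):
    """Add a linear increase of bulk density with depth below the crust."""
    return rho + a * z

z = np.linspace(0, 30, 7)                   # depth in mm (hypothetical)
print(sigmoidal_crust(z, rho_sub=1.45, drho=0.25, z0=5.0, s=1.5))
```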
Abstract:
A two-component survival mixture model is proposed to analyse a set of ischaemic stroke-specific mortality data. The survival experience of stroke patients after index stroke may be described by a subpopulation of patients in the acute condition and another subpopulation of patients in the chronic phase. To adjust for the inherent correlation of observations due to random hospital effects, a mixture model of two survival functions with random effects is formulated. Assuming a Weibull hazard in both components, an EM algorithm is developed for the estimation of fixed effect parameters and variance components. A simulation study is conducted to assess the performance of the two-component survival mixture model estimators. Simulation results confirm the applicability of the proposed model in a small sample setting. Copyright (C) 2004 John Wiley & Sons, Ltd.
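To illustrate the E-step/M-step alternation for a two-component Weibull mixture, here is a heavily simplified sketch that ignores censoring, covariates and the random hospital effects of the actual model; the data and starting values are synthetic.

```python
import numpy as np
from scipy.stats import weibull_min
from scipy.optimize import minimize

# synthetic survival times from an "acute" and a "chronic" component
t = np.concatenate([weibull_min.rvs(0.8, scale=2.0, size=300, random_state=1),
                    weibull_min.rvs(2.5, scale=10.0, size=300, random_state=2)])

def weighted_fit(t, w, start):
    """M-step: weighted Weibull MLE via numerical optimisation."""
    nll = lambda p: -np.sum(w * weibull_min.logpdf(t, p[0], scale=p[1]))
    return minimize(nll, start, bounds=[(0.1, 10), (0.1, 100)]).x

p, th1, th2 = 0.5, np.array([1.0, 1.0]), np.array([1.0, 5.0])
for _ in range(50):
    f1 = weibull_min.pdf(t, th1[0], scale=th1[1])
    f2 = weibull_min.pdf(t, th2[0], scale=th2[1])
    g = p * f1 / (p * f1 + (1 - p) * f2)     # E-step: responsibilities
    p = g.mean()                             # M-step: mixing proportion
    th1 = weighted_fit(t, g, th1)            # M-step: component 1 (shape, scale)
    th2 = weighted_fit(t, 1 - g, th2)        # M-step: component 2 (shape, scale)

print(p, th1, th2)
```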
Abstract:
Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been successfully used for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, computer models must be treated as experiments analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
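Of the methods listed, the quantitative matrix is the simplest to sketch: each residue at each peptide position contributes an additive (log-scale) binding score. The matrix below is random, standing in for a real allele-specific model, and the peptide and any classification cutoff are arbitrary.

```python
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"                # the 20 standard amino acids
rng = np.random.default_rng(42)
Q = rng.normal(size=(9, len(AA)))          # hypothetical 9-mer scoring matrix

def qm_score(peptide, matrix):
    """Sum the per-position residue coefficients for a 9-mer peptide."""
    return sum(matrix[i, AA.index(aa)] for i, aa in enumerate(peptide))

score = qm_score("SIINFEKLV", Q)           # classify as binder if above a cutoff
print(f"score = {score:.2f}")
```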