983 results for Predictive modelling


Relevance: 30.00%

Abstract:

While the standard models of concentration addition and independent action predict the overall toxicity of multicomponent mixtures reasonably well, interactions may limit their predictive capability when a few compounds dominate a mixture. This study was conducted to test whether statistically significant systematic deviations from concentration addition (i.e. synergism/antagonism, dose ratio- or dose level-dependency) occur when two taxonomically unrelated species, the earthworm Eisenia fetida and the nematode Caenorhabditis elegans, were exposed to a full range of mixtures of the similarly acting neonicotinoid pesticides imidacloprid and thiacloprid. The effect of the mixtures on C. elegans was described significantly better (p<0.01) by a dose level-dependent deviation from the concentration addition model than by the reference model alone, while the reference model description of the effects on E. fetida could not be significantly improved. These results highlight that deviations from concentration addition are possible even with similarly acting compounds, but that the nature of such deviations is species dependent. For improving ecological risk assessment of simple mixtures, this implies that the concentration addition model may need to be used in a probabilistic context, rather than in its traditional deterministic manner.
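
Under concentration addition, the reference model tested here, the toxic units of the components sum to one at the mixture's effect concentration. A minimal sketch, with invented EC50 values rather than data from this study:

```python
# Concentration addition (CA) for a mixture: the predicted mixture EC50 is
# the concentration at which the summed toxic units equal 1.
# The EC50 values used below are illustrative placeholders, not measured data.

def ca_mixture_ec50(fractions, ec50s):
    """Predicted EC50 of a mixture under concentration addition.

    fractions: proportion of each compound in the mixture (sums to 1)
    ec50s: single-compound EC50 values, all in the same units
    """
    assert abs(sum(fractions) - 1.0) < 1e-9
    return 1.0 / sum(p / e for p, e in zip(fractions, ec50s))

# 50:50 mixture of two compounds with hypothetical EC50s of 2 and 8 mg/kg
ec50_mix = ca_mixture_ec50([0.5, 0.5], [2.0, 8.0])
print(round(ec50_mix, 2))  # 3.2
```

Deviations such as synergism show up as observed mixture EC50s systematically below this CA prediction.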

Relevance: 30.00%

Abstract:

Design management research usually deals with the processes within the professional design team and yet, in the UK, the volume of total project information produced by specialist trade contractors equals or exceeds that produced by the design team. There is a need to understand the scale of this production task and to plan and manage it accordingly. The model of the process on which the plan is based, while generic, must be sufficiently robust to cover the majority of instances. An approach using design elements, in sufficient depth to enable the development of tools for a predictive model of the process, is described. The starting point is that each construction element and its components have a generic sequence of design activities. Specific requirements tailor the element's application to the building. Then there are constraints produced by the interaction with other elements. The selection of a component within one element may therefore impose a set of constraints that affect the choice of other design elements. Thus, a design decision can be seen as an interrelated element-constraint-element (ECE) sub-net. To illustrate this approach, an example of the process within precast concrete cladding is used.
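
The element-constraint-element idea can be sketched as a small compatibility check: choosing a component for one element rules out certain components in linked elements. The element and component names below are invented for illustration, not taken from the paper:

```python
# A toy ECE sub-net: a chosen component may rule out components of other
# elements. All names and the single constraint are hypothetical examples.

constraints = {
    # hypothetical: precast panels assumed too heavy for a timber frame
    ("cladding", "precast_panel"): {("frame", "timber")},
}

def feasible(choices, ruled_out=constraints):
    """Return True if no chosen component rules out another chosen one."""
    picked = set(choices.items())
    for choice in picked:
        if ruled_out.get(choice, set()) & picked:
            return False
    return True

print(feasible({"cladding": "precast_panel", "frame": "steel"}))   # True
print(feasible({"cladding": "precast_panel", "frame": "timber"}))  # False
```

A predictive tool would walk such a net to sequence the design activities each selection triggers.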


Relevance: 30.00%

Abstract:

Several studies have highlighted the importance of the cooling period for oil absorption in deep-fat fried products. Specifically, it has been established that the largest proportion of the oil that ends up in the food is sucked into the porous crust region after the fried product is removed from the oil bath, stressing the importance of this time interval. The main objective of this paper was to develop a predictive mechanistic model that can be used to understand the principles behind post-frying cooling oil absorption kinetics, and that can also help identify the key parameters affecting the final oil intake of the fried product. The model was developed for two different geometries, an infinite slab and an infinite cylinder, and was divided into two main sub-models, one describing the immersion frying period itself and the other the post-frying cooling period. The immersion frying period was described by a transient moving-front model that considered the movement of the crust/core interface, whereas post-frying cooling oil absorption was treated as a pressure-driven flow mediated by capillary forces. A key element in the model was the hypothesis that oil suction would only begin once a positive pressure driving force had developed. The mechanistic model was based on measurable physical and thermal properties and process parameters, with no need for empirical data fitting, and can be used to study oil absorption in any deep-fat fried product that satisfies the assumptions made.
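
As a much simpler illustration of capillary pressure-driven suction (not the paper's full moving-front model), the classical Lucas-Washburn relation gives the depth to which a wetting liquid penetrates a pore; all property values below are assumptions chosen only to be oil-like:

```python
import math

# Simplified Lucas-Washburn capillary penetration: depth reached by oil drawn
# into a crust pore of radius r after time t, once a positive pressure driving
# force exists. A textbook sketch with illustrative values, not the paper's
# mechanistic model.

def washburn_depth(t, r=1e-6, gamma=0.03, theta=0.0, mu=0.05):
    """Penetration depth (m).

    t: time since suction began (s); r: pore radius (m);
    gamma: surface tension (N/m); theta: contact angle (rad);
    mu: oil dynamic viscosity (Pa s).
    """
    return math.sqrt(gamma * r * math.cos(theta) * t / (2.0 * mu))

for t in (1.0, 10.0, 60.0):
    print(f"t = {t:5.1f} s  depth = {washburn_depth(t) * 1e3:.3f} mm")
```

The square-root-of-time scaling is the signature of capillary-dominated uptake; the paper's model additionally couples this to the cooling-driven pressure history.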

Relevance: 30.00%

Abstract:

This paper investigates the properties of implied volatility series calculated from options on Treasury bond futures traded on LIFFE. We demonstrate that the use of near-maturity, at-the-money options to calculate implied volatilities causes less mis-pricing and is therefore superior to a weighted average measure encompassing all relevant options. We also demonstrate that, whilst a set of macroeconomic variables has some predictive power for implied volatilities, we are not able to earn excess returns by trading on the basis of these predictions once we allow for typical investor transaction costs.
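
An implied volatility of the kind studied here is backed out of a futures option price by inverting an option pricing formula; a minimal bisection sketch using the Black (1976) model for futures options, with illustrative inputs rather than LIFFE data:

```python
import math

# Back an implied volatility out of a futures option price by bisection on
# the Black (1976) formula. All numerical inputs are illustrative.

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black76_call(F, K, T, r, sigma):
    """Black-76 call on a futures price F, strike K, maturity T, rate r."""
    d1 = (math.log(F / K) + 0.5 * sigma ** 2 * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return math.exp(-r * T) * (F * norm_cdf(d1) - K * norm_cdf(d2))

def implied_vol(price, F, K, T, r, lo=1e-6, hi=5.0, tol=1e-8):
    """Bisection works because the call price is monotone in sigma."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if black76_call(F, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Near-maturity at-the-money option: recover the vol that priced it
p = black76_call(F=100.0, K=100.0, T=0.25, r=0.04, sigma=0.20)
print(round(implied_vol(p, 100.0, 100.0, 0.25, 0.04), 4))  # 0.2
```

Repeating this inversion day by day produces the implied volatility series whose predictability the paper examines.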

Relevance: 30.00%

Abstract:

Data assimilation is predominantly used for state estimation: combining observational data with model predictions to produce an updated model state that most accurately approximates the true system state whilst keeping the model parameters fixed. This updated model state is then used to initiate the next model forecast. Even with perfect initial data, inaccurate model parameters will lead to the growth of prediction errors. To generate reliable forecasts we need good estimates of both the current system state and the model parameters. This paper presents research into data assimilation methods for morphodynamic model state and parameter estimation. First, we focus on state estimation and describe implementation of a three-dimensional variational (3D-Var) data assimilation scheme in a simple 2D morphodynamic model of Morecambe Bay, UK. The assimilation of observations of bathymetry derived from SAR satellite imagery and a ship-borne survey is shown to significantly improve the predictive capability of the model over a 2-year run. Here, the model parameters are set by manual calibration; this is laborious and is found to produce different parameter values depending on the type and coverage of the validation dataset. The second part of this paper considers the problem of model parameter estimation in more detail. We explain how, by employing the technique of state augmentation, it is possible to use data assimilation to estimate uncertain model parameters concurrently with the model state. This approach removes inefficiencies associated with manual calibration and enables more effective use of observational data. We outline the development of a novel hybrid sequential 3D-Var data assimilation algorithm for joint state-parameter estimation and demonstrate its efficacy using an idealised 1D sediment transport model.
The results of this study are extremely positive and suggest that there is great potential for the use of data assimilation-based state-parameter estimation in coastal morphodynamic modelling.
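
The state augmentation idea can be sketched in a toy linear setting: a scalar state with an unknown constant drift is stacked with that drift into one augmented state, and a single filter estimates both from observations of the state alone. A plain Kalman filter is used below for brevity; the paper itself develops a hybrid sequential 3D-Var scheme:

```python
import random

# State augmentation sketch: x evolves as x_{k+1} = x_k + b with unknown
# drift b. Augmenting to z = (x, b) gives a linear system
# z_{k+1} = F z_k, F = [[1, 1], [0, 1]], observed through H = [1, 0],
# so one Kalman filter estimates state and parameter jointly.
# All numbers here are synthetic.

def kalman_augmented(obs, r_var=0.25):
    x, b = 0.0, 0.0                       # initial augmented state estimate
    P = [[1.0, 0.0], [0.0, 1.0]]          # initial covariance
    for y in obs:
        # forecast step: z <- F z, P <- F P F^T
        x = x + b
        P = [[P[0][0] + P[0][1] + P[1][0] + P[1][1], P[0][1] + P[1][1]],
             [P[1][0] + P[1][1],                     P[1][1]]]
        # analysis step with observation of x only
        s = P[0][0] + r_var
        k0, k1 = P[0][0] / s, P[1][0] / s
        innov = y - x
        x, b = x + k0 * innov, b + k1 * innov
        P = [[(1 - k0) * P[0][0],       (1 - k0) * P[0][1]],
             [P[1][0] - k1 * P[0][0],   P[1][1] - k1 * P[0][1]]]
    return x, b

random.seed(1)
true_b = 0.5
ys = [true_b * k + random.gauss(0.0, 0.5) for k in range(1, 101)]
x_est, b_est = kalman_augmented(ys)
print(round(b_est, 2))  # close to the true drift 0.5
```

The same augmentation pattern carries over to morphodynamic models, where the "drift" is replaced by uncertain sediment transport parameters.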

Relevance: 30.00%

Abstract:

Biomass burning impacts vegetation dynamics, biogeochemical cycling, atmospheric chemistry, and climate, with sometimes deleterious socio-economic impacts. Under future climate projections it is often expected that the risk of wildfires will increase. Our ability to predict the magnitude and geographic pattern of future fire impacts rests on our ability to model fire regimes, either using well-founded empirical relationships or process-based models with good predictive skill. A large variety of models exist today and it is still unclear which type of model or degree of complexity is required to model fire adequately at regional to global scales. This is the central question underpinning the creation of the Fire Model Intercomparison Project (FireMIP), an international project to compare and evaluate existing global fire models against benchmark data sets for present-day and historical conditions. In this paper we summarise the current state of the art in fire regime modelling and model evaluation, and outline what lessons may be learned from FireMIP.

Relevance: 30.00%

Abstract:

Credit scoring modelling is one of the leading formal tools for supporting the granting of credit. Its core objective is to generate a score by means of which potential clients can be ranked in order of their probability of default. A critical factor is whether a credit scoring model is accurate enough to classify clients correctly as good or bad payers. In this context the concept of bootstrap aggregating (bagging) arises. The basic idea is to generate multiple classifiers by obtaining the predicted values from models fitted to several replicated datasets, and then to combine them into a single predictive classification in order to improve classification accuracy. In this paper we propose a new bagging-type variant procedure, which we call poly-bagging, consisting of combining predictors over a succession of resamplings. The study is motivated by credit scoring modelling. The proposed poly-bagging procedure was applied to several artificial datasets and to a real credit-granting dataset, with up to three successions of resamplings. We observed better classification accuracy for the two-bagged and three-bagged models in all considered setups. These results strongly indicate that the poly-bagging approach may improve the modelling performance measures while keeping a flexible and straightforward bagging-type structure that is easy to implement.
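
The building block of poly-bagging, a single bagging pass of bootstrap resampling, fitting and majority voting, can be sketched as follows; the threshold "scorer" and the tiny dataset are toy stand-ins for a credit scorecard, and the succession of resamplings in the full procedure is not reproduced here:

```python
import random

# One bagging pass over a toy credit-style dataset: each of b bootstrap
# resamples fits a weak threshold classifier, and predictions are made by
# majority vote. Data and scorer are hypothetical illustrations.

random.seed(7)
data = [(x, 1 if x > 5 else 0) for x in range(11)]  # (feature, good payer?)

def fit_threshold(sample):
    """Pick the cut-off that best separates the bootstrap sample."""
    best_t, best_acc = 0, -1.0
    for t in range(11):
        acc = sum((x > t) == bool(y) for x, y in sample) / len(sample)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def bag(train, fit, b=25):
    """Bagging: b bootstrap resamples -> list of fitted classifiers."""
    return [fit(random.choices(train, k=len(train))) for _ in range(b)]

def predict(models, x):
    votes = sum(x > t for t in models)
    return 1 if votes * 2 >= len(models) else 0

models = bag(data, fit_threshold)
print(predict(models, 8), predict(models, 2))  # 1 0
```

Poly-bagging, as proposed in the paper, repeats this resample-and-combine step over successive rounds rather than stopping after one.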

Relevance: 30.00%

Abstract:

In recent years, predictive habitat distribution models, derived by combining multivariate statistical analyses with Geographic Information System (GIS) technology, have been recognised for their utility in conservation planning. The size and spatial arrangement of suitable habitat can influence the long-term persistence of some faunal species. In southwestern Victoria, Australia, populations of the rare swamp antechinus (Antechinus minimus maritimus) are threatened by further fragmentation of suitable habitat. In the current study, a spatially explicit habitat suitability model was developed for A. minimus that incorporated a measure of vegetation structure. Models were generated using logistic regression with species presence or absence as the dependent variable and landscape variables, extracted from both GIS data layers and multi-spectral digital imagery, as the predictors. The most parsimonious model, based on the Akaike Information Criterion, was spatially extrapolated in the GIS. Probability of species presence was used as an index of habitat suitability. A negative association between A. minimus presence and both elevation and habitat complexity was found, suggesting a preference for relatively low altitudes and a vegetation structure of low vertical complexity. The predictive performance of the selected model was high (91%), indicating a good fit of the model to the data. The proportion of the study area predicted as suitable habitat for A. minimus (probability of occurrence ≥ 0.5) was 11.7%. Habitat suitability maps not only provide baseline information about the spatial arrangement of potentially suitable habitat for a species, but they also help to refine the search for other populations, making them an important conservation tool.
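
A stripped-down analogue of such a model, logistic regression of presence/absence on a single rescaled elevation predictor with the fitted probability used as the suitability index, might look like this; the data are synthetic and the real model used several GIS-derived predictors:

```python
import math
import random

# Toy habitat-suitability model: logistic regression of species presence (1)
# or absence (0) on elevation rescaled to 0-1. Synthetic data built with a
# negative elevation effect, echoing the association reported for A. minimus.

random.seed(3)
elev = [random.random() for _ in range(300)]
pres = [1 if random.random() < 1 / (1 + math.exp(-(3.0 - 6.0 * e))) else 0
        for e in elev]

def fit_logistic(xs, ys, lr=1.0, steps=3000):
    """Plain gradient ascent on the Bernoulli log-likelihood."""
    b0 = b1 = 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1 / (1 + math.exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

b0, b1 = fit_logistic(elev, pres)

def suitability(e):
    """Fitted probability of presence, used as the suitability index."""
    return 1 / (1 + math.exp(-(b0 + b1 * e)))

print(f"slope = {b1:.2f}; suitability at e=0.1: {suitability(0.1):.2f}, "
      f"at e=0.9: {suitability(0.9):.2f}")
```

Mapping `suitability` over every grid cell of a GIS layer is what turns the fitted model into a habitat suitability map.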

Relevance: 30.00%

Abstract:

Laser shock peening (LSP) is an innovative surface treatment technique for metal alloys that greatly improves their fatigue, corrosion and wear resistance. The finite element method has been widely applied to simulate LSP, providing theoretically predictive assessment and supporting optimal parametric design. In the current work, 3-D numerical modelling approaches are further developed, combining explicit dynamic analysis, static equilibrium analysis algorithms and different plasticity models for high strain rates exceeding 10^6 s^-1. To verify the proposed methods, 3-D static and dynamic FEA of AA7075-T7351 rods subjected to two-sided laser shock peening are performed using the FEA package ABAQUS. The dynamic and residual stress fields, shock wave propagation and surface deformation of the treated metal obtained from the different material modelling approaches show good agreement.

Relevance: 30.00%

Abstract:

Wildlife managers are often faced with the difficult task of determining the distribution of species, and their preferred habitats, at large spatial scales. This task is even more challenging when the species of concern is in low abundance and/or the terrain is largely inaccessible. Spatially explicit distribution models, derived from multivariate statistical analyses and implemented in a geographic information system (GIS), can be used to predict the distributions of species and their habitats, thus making them a useful conservation tool. We present two such models: one for a dasyurid, the Swamp Antechinus (Antechinus minimus), and the other for a ground-dwelling bird, the Rufous Bristlebird (Dasyornis broadbenti), both of which are rare species occurring in the coastal heathlands of south-western Victoria. Models were generated using generalized linear modelling (GLM) techniques with species presence or absence as the dependent variable and a series of landscape variables derived from GIS layers and high-resolution imagery as the predictors. The most parsimonious model, based on the Akaike Information Criterion, for each species was then extrapolated spatially in a GIS. Probability of species presence was used as an index of habitat suitability. Because habitat fragmentation is thought to be one of the major threats to these species, an assessment of the spatial distribution of suitable habitat across the landscape is vital in prescribing management actions to prevent further habitat fragmentation.
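
Selecting the most parsimonious GLM by the Akaike Information Criterion amounts to trading fit against parameter count; a minimal sketch, in which the candidate models and log-likelihood values are invented for illustration:

```python
# AIC-based model selection: AIC = 2k - 2 ln L, lower is better.
# Candidate names, parameter counts k and log-likelihoods are hypothetical.

def aic(log_lik, k):
    return 2 * k - 2 * log_lik

candidates = {
    "elevation only":            aic(-120.4, 2),
    "elevation + veg structure": aic(-112.9, 3),
    "all predictors":            aic(-111.8, 6),
}
best = min(candidates, key=candidates.get)
print(best)  # elevation + veg structure
```

Here the middle model wins: the full model fits slightly better but the extra parameters cost more AIC than they recover, which is exactly the parsimony trade-off the criterion encodes.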

Relevance: 30.00%

Abstract:

Building design decisions are commonly based on issues pertaining to construction cost, and consideration of energy performance is made only within the context of the initial project budget. Even where energy is elevated to greater importance, operating energy is seen as the focus and embodied energy is nearly always ignored. For the first time, a large sample of buildings has been assembled and analysed in a single study to improve the understanding of the relationship between energy and cost performance over their full life cycle. Thirty recently completed buildings in Melbourne, Australia, have been studied to explore the accuracy of initial embodied energy prediction based on capital cost at various levels of model detail. The embodied energy of projects, elemental groups, elements and selected items of work is correlated against capital cost and the strength of the relationship is computed. The relationship between initial embodied energy and capital cost generally declines as the predictive model assumes more detail, although elemental modelling may provide the best solution on balance.
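
The strength of the energy-cost relationship at each level of detail can be measured as a Pearson correlation; a minimal sketch in which the project figures are invented, not the study's data:

```python
import math

# Pearson correlation between capital cost and initial embodied energy,
# the statistic computed at each level of model detail.
# The five project figures below are hypothetical.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

cost_m = [12.0, 25.5, 8.3, 40.1, 18.7]        # capital cost, $m
energy_gj = [48e3, 96e3, 30e3, 170e3, 80e3]   # embodied energy, GJ
print(round(pearson_r(cost_m, energy_gj), 3))
```

Repeating the calculation at project, elemental-group, element and work-item level is what reveals the decline in correlation with increasing model detail.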

Relevance: 30.00%

Abstract:

Aims: To detail and validate a simulation model that describes the dynamics of cannabis use, including its probable causal relationships with schizophrenia, road traffic accidents (RTA) and heroin/poly-drug use (HPU).

Methods: A Markov model with 17 health-states was constructed. Annual cycles were used to simulate the initiation of cannabis use, progression in use, reduction and complete remission. The probabilities of transition between health-states were derived from observational data. Following 10-year-old Australian children for 90 years, the model estimated age-specific prevalence for cannabis use. By applying the relative risks according to the extent of cannabis use, the age-specific prevalence of schizophrenia and HPU, and the annual RTA incidence and fatality rate were also estimated. Predictive validity of the model was tested by comparing modelled outputs with data from other credible sources. Sensitivity and scenario analyses were conducted to evaluate technical validity and face validity.
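
The annual-cycle structure can be sketched with a much smaller cohort model; the three states and transition probabilities below are invented stand-ins for the paper's 17-state matrix:

```python
# A stripped-down annual-cycle Markov cohort model in the spirit of the
# 17-state cannabis model: three states and hypothetical annual transition
# probabilities, iterated year by year from age 10 to age 65.

P = {  # annual transition probabilities; each row sums to 1
    "never": {"never": 0.95, "use": 0.05, "remit": 0.00},
    "use":   {"never": 0.00, "use": 0.85, "remit": 0.15},
    "remit": {"never": 0.00, "use": 0.03, "remit": 0.97},
}

def step(dist):
    """One annual cycle: redistribute the cohort along the transition rows."""
    new = {s: 0.0 for s in P}
    for s, frac in dist.items():
        for t, p in P[s].items():
            new[t] += frac * p
    return new

dist = {"never": 1.0, "use": 0.0, "remit": 0.0}  # cohort of 10-year-olds
for year in range(55):                            # follow to age 65
    dist = step(dist)
print({s: round(f, 3) for s, f in dist.items()})
```

Age-specific prevalence falls out of the state occupancies at each cycle, and relative risks applied to the "use" states would yield the downstream schizophrenia, RTA and HPU estimates.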

Results: The estimated cannabis use prevalence in individuals aged 10-65 years was 12.2%, which comprised 27.4% weekly and 18.0% daily users. The modelled prevalence and age profile were comparable to the reported cross-sectional data. The model also provided good approximations to the prevalence of schizophrenia (modelled: 4.75/1,000 persons vs observed: 4.6/1,000 persons), HPU (3.2/1,000 vs 3.1/1,000) and the RTA fatality rate (8.1 per 100,000 vs 8.2 per 100,000). Sensitivity analyses and scenario analysis produced expected and explainable trends.

Conclusions: The validated model provides a valuable tool to assess the likely effectiveness and cost-effectiveness of interventions designed to affect patterns of cannabis use. It can be updated as new data becomes available and/or applied to other countries.

Relevance: 30.00%

Abstract:

This thesis presents Relation Based Modelling as an extension of the Feature Based Modelling approach to student modelling. Relation Based Modelling dynamically creates new terms, allowing the instructional designer to specify a set of primitives and operators from which the modelling system will create the necessary elements. Focal modelling is a new technique devised to manipulate and coordinate the addition of new terms. The thesis also presents an evaluation of student modelling systems based on predictive accuracy.

Relevance: 30.00%

Abstract:

This work explores how machine learning techniques can be used to build effective student modelling systems with constrained development and operational overheads, by integrating top-down and bottom-up initiatives. Emphasis is placed on feature-based modelling.