967 results for Original model


Relevance:

70.00%

Publisher:

Abstract:

Agro-hydrological models have been widely used for optimizing resource use and minimizing environmental consequences in agriculture. SMCRN is a recently developed, sophisticated model that simulates crop response to nitrogen fertilizer for a wide range of crops, together with the associated leaching of nitrate from arable soils. In this paper, we describe improvements to this model obtained by replacing the existing approximate hydrological cascade algorithm with a new, simple and explicit algorithm for the basic soil water flow equation, which not only enhanced the model's performance in hydrological simulation but was also essential for extending its application to situations where capillary flow is important. As a result, the updated SMCRN model can be used for more accurate study of water dynamics in the soil-crop system. The success of the update was demonstrated by simulation results: the updated model consistently outperformed the original model in drainage simulations and in predicting the time course of soil water content in different layers of the soil-wheat system. Tests of the updated SMCRN model against data from four field crop experiments showed that crop nitrogen offtakes and soil mineral nitrogen in the top 90 cm were in good agreement with the measured values, indicating that the model can make more reliable predictions of nitrogen fate in the crop-soil system and thus provides a useful platform for assessing the impacts of nitrogen fertilizer on crop yield and nitrogen leaching in different production systems. (C) 2010 Elsevier B.V. All rights reserved.
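
The "simple and explicit algorithm" itself is not reproduced in the abstract; as a generic illustration only, the sketch below advances a layered soil water profile by one explicit finite-difference step of a diffusion-form water flow equation, assuming a constant diffusivity and no-flux boundaries (all numbers are invented).

```python
# Minimal sketch (not the SMCRN algorithm): one explicit finite-difference
# time step for 1D vertical soil water movement, theta_t = D * theta_zz,
# with an assumed constant diffusivity D and no-flux boundaries.

def explicit_step(theta, D, dz, dt):
    """Advance volumetric water content one time step (explicit scheme)."""
    # Stability requires dt <= dz**2 / (2 * D) for this explicit scheme.
    n = len(theta)
    new = theta[:]
    for i in range(n):
        up = theta[i - 1] if i > 0 else theta[i]        # no-flux at surface
        down = theta[i + 1] if i < n - 1 else theta[i]  # no-flux at bottom
        new[i] = theta[i] + D * dt / dz ** 2 * (up - 2 * theta[i] + down)
    return new

profile = [0.40, 0.30, 0.20, 0.20, 0.20]  # wet top layer, drier below
profile = explicit_step(profile, D=1e-7, dz=0.1, dt=3600.0)
```

The no-flux boundaries make the scheme conservative: total water in the column is unchanged while the wet top layer drains downward.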

Relevance:

70.00%

Publisher:

Abstract:

Trust is a fundamental issue in multiagent systems, especially in agent-mediated e-commerce. The trust model plays an important role in determining with whom and how to interact in open and dynamic environments. To this end, many trust models have been developed. Building on the confidence-reputation model proposed by other researchers, an improved trust model is discussed. The original model was enhanced in two respects: (1) when evaluating the trustworthiness of target agents, the reliability of each witness is taken into account, and witness reports are aggregated using Dempster-Shafer evidence theory; and (2) the ontological property of trust is considered, which allows trust to be calculated more precisely. A case study is provided to show the effectiveness of the improved model.
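
The first enhancement can be pictured with a toy calculation. In the sketch below (mass values and reliability figures are invented, and the frame is reduced to {trust, distrust}), each witness report is first discounted by the witness's own reliability and the reports are then fused with Dempster's rule of combination.

```python
# Sketch only: discounting a witness report by its reliability, then
# fusing reports with Dempster's rule over the frame {trust, distrust}.
# Masses: m['T'] (trust), m['D'] (distrust), m['TD'] (ignorance).

def discount(m, reliability):
    """Shafer discounting: scale evidence by the witness's reliability."""
    r = reliability
    return {'T': r * m['T'], 'D': r * m['D'],
            'TD': 1.0 - r * (m['T'] + m['D'])}

def combine(m1, m2):
    """Dempster's rule of combination for the two-hypothesis frame."""
    conflict = m1['T'] * m2['D'] + m1['D'] * m2['T']
    k = 1.0 - conflict                       # normalisation constant
    t = (m1['T'] * m2['T'] + m1['T'] * m2['TD'] + m1['TD'] * m2['T']) / k
    d = (m1['D'] * m2['D'] + m1['D'] * m2['TD'] + m1['TD'] * m2['D']) / k
    return {'T': t, 'D': d, 'TD': m1['TD'] * m2['TD'] / k}

w1 = discount({'T': 0.8, 'D': 0.1, 'TD': 0.1}, reliability=0.9)
w2 = discount({'T': 0.6, 'D': 0.2, 'TD': 0.2}, reliability=0.5)
fused = combine(w1, w2)
```

Discounting shifts mass from the witness's verdict to ignorance, so an unreliable witness contributes less to the fused belief.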

Relevance:

70.00%

Publisher:

Abstract:

BACKGROUND: Depression is widely considered to be an independent and robust predictor of Coronary Heart Disease (CHD); however, it is seldom considered in the context of formal risk assessment. We assessed whether the addition of depression to the Framingham Risk Equation (FRE) improved its accuracy for predicting 10-year CHD in a sample of women.

DESIGN: A prospective, longitudinal design comprising an age-stratified, population-based sample of Australian women collected between 1993 and 2011 (n=862).

METHODS: Clinical depressive disorder was assessed using the Structured Clinical Interview for Diagnostic and Statistical Manual of Mental Disorders (SCID-I/NP), using retrospective age-of-onset data. A composite measure of CHD included non-fatal myocardial infarction, unstable angina, coronary intervention or cardiac death. Cox proportional-hazards regression models were conducted, and overall accuracy was assessed using area under the receiver operating characteristic (ROC) curve analysis.

RESULTS: ROC curve analyses revealed that the addition of baseline depression status to the FRE model improved its overall accuracy (AUC: 0.77, Specificity: 0.70, Sensitivity: 0.75) when compared to the original FRE model (AUC: 0.75, Specificity: 0.73, Sensitivity: 0.67). However, when calibrated against the original model, the predicted number of events generated by the augmented version marginally over-estimated the true number observed.

CONCLUSIONS: The addition of a depression variable to the FRE improves the overall accuracy of the model for predicting 10-year CHD events in women; however, it may over-estimate the number of events that actually occur. This model now requires validation in larger samples, as it could form the basis of a new CHD risk equation for women.

Relevance:

70.00%

Publisher:

Abstract:

A branch and bound algorithm is proposed to solve the [image omitted]-norm model reduction problem for continuous- and discrete-time linear systems, with convergence to the global optimum in finite time. The lower and upper bounds in the optimization procedure are described by linear matrix inequalities (LMIs). Two methods are also proposed to reduce the convergence time of the branch and bound algorithm: the first uses the Hankel singular values as a sufficient condition to stop the algorithm, giving the method fast convergence to the global optimum; the second assumes that the reduced model is in controllable or observable canonical form. The [image omitted]-norm of the error between the original model and the reduced model is considered. Examples illustrate the application of the proposed method.
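
The paper's bounds come from LMIs; purely to illustrate how paired lower and upper bounds drive a branch and bound search to a global optimum, here is a toy one-dimensional version in which upper bounds come from sampled points and lower bounds from a Lipschitz estimate (function, interval and constant are all invented).

```python
# Toy illustration (not the LMI formulation of the paper): branch and
# bound by interval bisection. Sampled points give upper bounds, a
# Lipschitz estimate gives lower bounds; intervals whose lower bound
# cannot beat the incumbent are pruned.
import heapq

def branch_and_bound(f, a, b, lipschitz, tol=1e-6):
    """Globally minimise f on [a, b] given a Lipschitz constant."""
    mid = (a + b) / 2
    best_x, best_val = mid, f(mid)           # incumbent (upper bound)
    heap = [(best_val - lipschitz * (b - a) / 2, a, b)]
    while heap:
        lower, lo, hi = heapq.heappop(heap)
        if lower >= best_val - tol:          # bounds have met: prune
            continue
        mid = (lo + hi) / 2
        for l, h in ((lo, mid), (mid, hi)):  # branch: bisect interval
            m = (l + h) / 2
            val = f(m)
            if val < best_val:
                best_x, best_val = m, val    # tighter upper bound
            heapq.heappush(heap, (val - lipschitz * (h - l) / 2, l, h))
    return best_x, best_val

# Minimise (x^2 - 2)^2 on [0, 2]; the global minimum is at sqrt(2).
x, val = branch_and_bound(lambda x: (x * x - 2) ** 2, 0.0, 2.0, lipschitz=16.0)
```

The search terminates when every remaining interval's lower bound is within `tol` of the incumbent, which mirrors the abstract's finite-time convergence of lower and upper bounds.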

Relevance:

70.00%

Publisher:

Abstract:

We investigate the nonequilibrium roughening transition of a one-dimensional restricted solid-on-solid model by directly sampling the stationary probability density of a suitable order parameter as the surface adsorption rate varies. The shapes of the probability density histograms suggest a typical Ginzburg-Landau scenario for the phase transition of the model, and estimates of the "magnetic" exponent seem to confirm its mean-field critical behavior. We also found that the flipping times between the metastable phases of the model scale exponentially with the system size, signaling the breaking of ergodicity in the thermodynamic limit. Incidentally, we discovered that a closely related model not considered before also displays a phase transition with the same critical behavior as the original model. Our results support the usefulness of off-critical histogram techniques in the investigation of nonequilibrium phase transitions. We also briefly discuss in the appendix a good and simple pseudo-random number generator used in our simulations.
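
The appendix's generator itself is not given in the abstract; for illustration only, here is a textbook linear congruential generator (with the common Numerical Recipes constants), showing the kind of simple, reproducible pseudo-random number source such simulations rely on.

```python
# Illustration only, not the paper's generator: a textbook linear
# congruential generator, x_{n+1} = (a * x_n + c) mod m, returned as
# a float in [0, 1). Constants are the widely used Numerical Recipes
# choice a = 1664525, c = 1013904223, m = 2**32.

class LCG:
    def __init__(self, seed):
        self.m = 2 ** 32
        self.a = 1664525
        self.c = 1013904223
        self.state = seed % self.m

    def random(self):
        self.state = (self.a * self.state + self.c) % self.m
        return self.state / self.m

rng = LCG(seed=42)
sample = [rng.random() for _ in range(5)]
```

Seeded generators like this make simulation runs exactly reproducible, which matters when estimating quantities such as flipping times between metastable phases.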

Relevance:

70.00%

Publisher:

Abstract:

Hippocampal place cells in the rat undergo experience-dependent changes when the rat runs stereotyped routes. One such change, the backward shift of the place field center of mass, has been linked by previous modeling efforts to spike-timing-dependent plasticity (STDP). However, these models did not account for the termination of the place field shift and they were based on an abstract implementation of STDP that ignores many of the features found in cortical plasticity. Here, instead of the abstract STDP model, we use a calcium-dependent plasticity (CaDP) learning rule that can account for many of the observed properties of cortical plasticity. We use the CaDP learning rule in combination with a model of metaplasticity to simulate place field dynamics. Without any major changes to the parameters of the original model, the present simulations account both for the initial rapid place field shift and for the subsequent slowing down of this shift. These results suggest that the CaDP model captures the essence of a general cortical mechanism of synaptic plasticity, which may underlie numerous forms of synaptic plasticity observed both in vivo and in vitro.
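
The defining feature of calcium-dependent plasticity rules is that the sign of the weight change is set by the calcium level relative to two thresholds. The sketch below is schematic only (thresholds and rates are invented, and the metaplasticity component that adjusts the thresholds is omitted); it is not the paper's model.

```python
# Schematic CaDP-style rule (made-up numbers, metaplasticity omitted):
# calcium below both thresholds leaves the synapse unchanged, moderate
# calcium depresses it (LTD), high calcium potentiates it (LTP).

THETA_D, THETA_P = 0.35, 0.55   # depression / potentiation thresholds

def omega(ca):
    """Direction of plasticity as a function of calcium level."""
    if ca < THETA_D:
        return 0.0               # below both thresholds: no change
    if ca < THETA_P:
        return -1.0              # intermediate calcium: LTD
    return 1.0                   # high calcium: LTP

def update_weight(w, ca, rate=0.1):
    """One plasticity update; sliding-threshold metaplasticity omitted."""
    return w + rate * omega(ca)

w_low = update_weight(0.5, ca=0.2)    # unchanged
w_mid = update_weight(0.5, ca=0.45)   # depressed
w_high = update_weight(0.5, ca=0.8)   # potentiated
```
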

Relevance:

70.00%

Publisher:

Abstract:

nlcheck is a simple diagnostic tool that can be used after fitting a model to quickly check the linearity assumption for a given predictor. nlcheck categorizes the predictor into bins, refits the model including dummy variables for the bins, and then performs a joint Wald test for the added parameters. Alternatively, nlcheck can use linear splines for the adaptive model. Support for discrete variables is also provided. Optionally, nlcheck also displays a graph of the adjusted linear predictions from the original model and the adaptive model.
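
nlcheck itself is a Stata module; as a rough Python analogue of the same idea (bin the predictor and look for systematic departures from the fitted line), the sketch below computes binned residual means after a simple linear fit. This is a diagnostic sketch, not nlcheck's joint Wald test.

```python
# Binned-residual linearity check (analogue of nlcheck's binning idea,
# not its Wald test): fit a line, then average the residuals within
# bins of the predictor. Systematic sign patterns signal nonlinearity.
import statistics

def linear_fit(x, y):
    """Closed-form least-squares slope and intercept."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sxx
    return slope, my - slope * mx

def binned_residual_means(x, y, n_bins=3):
    slope, intercept = linear_fit(x, y)
    resid = [b - (intercept + slope * a) for a, b in zip(x, y)]
    order = sorted(range(len(x)), key=lambda i: x[i])
    size = len(x) // n_bins
    return [statistics.fmean([resid[i] for i in order[k * size:(k + 1) * size]])
            for k in range(n_bins)]

xs = [float(i) for i in range(12)]
ys = [v * v for v in xs]          # a quadratic, so a line misfits
means = binned_residual_means(xs, ys)
```

For this quadratic example the outer bins sit above the fitted line and the middle bin below it, exactly the pattern a linearity check is meant to expose.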

Relevance:

70.00%

Publisher:

Abstract:

The aim of the study presented was to implement a process model to simulate the dynamic behaviour of a pilot-scale process for anaerobic two-stage digestion of sewage sludge. The implementation was undertaken to support experimental investigations of the anaerobic two-stage digestion process. The model concept, implemented in the simulation software package MATLAB(TM)/Simulink(R), is a derivative of the IWA Anaerobic Digestion Model No. 1 (ADM1), which was developed by the IWA task group for mathematical modelling of anaerobic processes. In the present study the original model concept has been adapted and applied to replicate a two-stage digestion process. Testing procedures, including balance checks and 'benchmarking' tests, were carried out to verify the accuracy of the implementation. These combined measures ensured a faultless model implementation without numerical inconsistencies. Parameters for both the thermophilic and the mesophilic process stages have been estimated successfully using data from lab-scale experiments described in the literature. Due to the high number of parameters in the structured model, it was necessary to develop a customised procedure that limited the range of parameters to be estimated. The accuracy of the optimised parameter sets has been assessed against experimental data from pilot-scale experiments. Under these conditions, the model predicted the dynamic behaviour of a two-stage digestion process at pilot scale reasonably well. (C) 2004 Elsevier Ltd. All rights reserved.
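
The actual implementation is the full ADM1 in MATLAB/Simulink; the sketch below only illustrates the two-stage reactor-in-series structure, with a single lumped substrate, first-order kinetics and explicit Euler integration (all rate constants and retention times are invented).

```python
# Illustrative only (the study implements the full ADM1): two CSTRs in
# series with first-order degradation of one lumped substrate, so the
# effluent of the first (thermophilic) stage feeds the second
# (mesophilic) stage. Integrated with explicit Euler.

def two_stage_digester(s_in, k1, k2, hrt1, hrt2, dt=0.01, t_end=200.0):
    """Substrate concentrations in two completely mixed stages in series."""
    s1 = s2 = s_in
    t = 0.0
    while t < t_end:
        s1 += dt * ((s_in - s1) / hrt1 - k1 * s1)   # stage 1 mass balance
        s2 += dt * ((s1 - s2) / hrt2 - k2 * s2)     # stage 2 mass balance
        t += dt
    return s1, s2

# Invented parameters: feed 10 g/L, rate constants per day, HRTs in days.
s1, s2 = two_stage_digester(s_in=10.0, k1=0.5, k2=0.2, hrt1=3.0, hrt2=15.0)
```

At steady state each stage obeys s_out = s_feed / (1 + k * HRT), so the run above settles near s1 = 4 and s2 = 1, a simple balance check of the kind the abstract mentions.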

Relevance:

70.00%

Publisher:

Abstract:

Multiresolution Triangular Mesh (MTM) models are widely used to improve the performance of large terrain visualization by replacing the original model with a simplified one. MTM models, which consist of both original and simplified data, are commonly stored in spatial database systems due to their size. The relatively slow access speed of disks makes data retrieval the bottleneck of such terrain visualization systems. Existing spatial access methods proposed to address this problem rely on main-memory MTM models, which leads to significant overhead during query processing. In this paper, we approach the problem from a new perspective and propose a novel MTM called the direct mesh that is designed specifically for secondary storage. It supports existing indexing methods natively and requires no modification to the MTM structure. Experimental results, based on two real-world data sets, show an average performance improvement of 5-10 times over existing methods.

Relevance:

70.00%

Publisher:

Abstract:

In a special issue of this journal commemorating the 50th anniversary of W. Arthur Lewis's (The Manchester School, Vol. 28 (1954), No. 2, pp. 139-191) seminal paper, the Lewis model is treated as a model of labour market dualism (Fields, The Manchester School, Vol. 72 (2004), No. 6, pp. 724-735). This interpretation is flawed for a number of reasons. First, it overemphasizes the role ascribed by Lewis to intersectoral earnings differentials in his original model. Second, it fails to acknowledge that a major shortcoming of the model was its inability to account for the widening intersectoral earnings differential observed across a wide range of developing economies. For Lewis himself this was one of the 'major theoretical puzzles of the period' (1979, p. 150). Third, it ignores Lewis's subsequent revision of the model (Lewis, The Manchester School, Vol. 47 (1979), No. 3, pp. 211-229) that, ironically, incorporates a dual labour market to resolve this puzzle. However, for Lewis the critical issue was dualism within the modern sector, not, as Fields understands it, labour market dualism between the modern and traditional sectors. Fields's appreciation of the contribution of the Lewis model to understanding the process of wage determination in developing economies is therefore misplaced.

Relevance:

70.00%

Publisher:

Abstract:

The soil-plant-moisture subsystem is an important component of the hydrological cycle. Over the last 20 or so years, a number of computer models of varying complexity have represented this subsystem with differing degrees of success. The aim of the present work has been to improve and extend an existing model. The new model is less site-specific, thus allowing for the simulation of a wide range of soil types and profiles. Several processes not included in the original model are simulated by the inclusion of new algorithms, covering macropore flow, hysteresis and plant growth. Changes have also been made to the infiltration, water uptake and water flow algorithms. Using field data from various sources, regression equations have been derived which relate parameters in the suction-conductivity-moisture content relationships to easily measured soil properties such as particle-size distribution data. Independent tests have been performed on laboratory data produced by Hedges (1989). The parameters found by regression for the suction relationships were then used in equations describing the infiltration and macropore processes. An extensive literature review produced a new model for calculating plant growth from actual transpiration, which was itself partly determined by the root densities and leaf area indices derived by the plant growth model. The new infiltration model uses intensity/duration curves to disaggregate daily rainfall inputs into hourly amounts. The final model has been calibrated and tested against field data, and its performance compared to that of the original model. Simulations have also been carried out to investigate the effects of various parameters on infiltration, macropore flow, actual transpiration and plant growth. Qualitative comparisons have been made between these results and data given in the literature.
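
The disaggregation step can be illustrated in a few lines. The profile below is a made-up triangular storm shape, not the model's intensity/duration curves; the point is only that hourly amounts are taken proportional to a unit profile and sum back to the daily total.

```python
# Sketch of daily-to-hourly rainfall disaggregation (the profile is an
# invented triangular storm shape, not the model's intensity/duration
# curves): spread a daily total over hours in proportion to a profile,
# preserving the daily total exactly.

def disaggregate_daily(total_mm, profile):
    """Split a daily total into hourly amounts proportional to profile."""
    weight = sum(profile)
    return [total_mm * p / weight for p in profile]

# 24 hourly weights rising to a midday peak and falling off again.
profile = [max(0, 12 - abs(h - 12)) for h in range(24)]
hourly = disaggregate_daily(18.0, profile)
```
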

Relevance:

70.00%

Publisher:

Abstract:

In this paper we investigate whether consideration of store-level heterogeneity in marketing mix effects improves the accuracy of the marketing mix elasticities, fit, and forecasting accuracy of the widely applied SCAN*PRO model of store sales. Models with continuous and discrete representations of heterogeneity, estimated using hierarchical Bayes (HB) and finite mixture (FM) techniques, respectively, are empirically compared to the original model, which does not account for store-level heterogeneity in marketing mix effects, and is estimated using ordinary least squares (OLS). The empirical comparisons are conducted in two contexts: Dutch store-level scanner data for the shampoo product category, and an extensive simulation experiment. The simulation investigates how between- and within-segment variance in marketing mix effects, error variance, the number of weeks of data, and the number of stores impact the accuracy of marketing mix elasticities, model fit, and forecasting accuracy. Contrary to expectations, accommodating store-level heterogeneity does not improve the accuracy of marketing mix elasticities relative to the homogeneous SCAN*PRO model, suggesting that little may be lost by employing the original homogeneous SCAN*PRO model estimated using ordinary least squares. Improvements in fit and forecasting accuracy are also fairly modest. We pursue an explanation for this result, since research in other contexts has shown clear advantages from assuming some type of heterogeneity in market response models. In an Afterthought section, we comment on the controversial nature of our result, distinguishing factors inherent to household-level data and associated models vs. general store-level data and associated models vs. the unique SCAN*PRO model specification.
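
As background to the elasticity comparisons, recall that SCAN*PRO is a multiplicative model, so a marketing-mix elasticity appears as an exponent. The toy numbers below (invented, not the study's data) recover a price elasticity as the OLS slope of log sales on log price.

```python
# Background sketch (synthetic numbers, not the SCAN*PRO estimates):
# in a multiplicative model sales = a * price^b, the exponent b is the
# price elasticity, recoverable as the slope of log sales on log price.
import math

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return num / sum((a - mx) ** 2 for a in x)

prices = [1.0, 1.2, 1.5, 2.0, 2.5]
sales = [100.0 * p ** -2.0 for p in prices]     # true elasticity: -2
elasticity = slope([math.log(p) for p in prices],
                   [math.log(s) for s in sales])
```

The heterogeneity question in the abstract is whether estimating such slopes separately by store (or segment) beats pooling all stores into one OLS regression.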

Relevance:

70.00%

Publisher:

Abstract:

1. Active engagement with practitioners is a crucial component of model-based decision-making in conservation management; it can assist with data acquisition, improve models and help narrow the 'knowing-doing' gap.
2. We worked with practitioners of one of the worst invasive species in Australia, the cane toad Rhinella marina, to revise a model that estimates the effectiveness of landscape barriers to contain spread. The original model predicted that the invasion could be contained by managing artificial watering points on pastoral properties, but was initially met with scepticism by practitioners, in part due to a lack of engagement during model development.
3. We held a workshop with practitioners and experts in cane toad biology. Using structured decision-making, we elicited concerns about the original model, revised its structure, updated relevant input data, added an economic component and found the most cost-effective location for a barrier across a range of fixed budgets and management scenarios. We then conducted scenario analyses to test the sensitivity of management decisions to model revisions.
4. We found that toad spread could be contained for all of the scenarios tested. Our modelling suggests a barrier could cost $4.5 M (2015 AUD) over 50 years for the most likely landscape scenario. The incorporation of practitioner knowledge into the model was crucial. As well as improving engagement, when we incorporated practitioner concerns (particularly regarding the effects of irrigation and dwellings on toad spread), we found a different location for the optimal barrier compared to a previously published study (Tingley et al. 2013).
5. Synthesis and applications. Through engagement with practitioners, we turned an academic modelling exercise into a decision-support tool that integrated local information, and considered more realistic scenarios and constraints. Active engagement with practitioners led to productive revisions of a model that estimates the effectiveness of a landscape barrier to contain spread of the invasive cane toad R. marina. Benefits also include greater confidence in model predictions, improving our assessment of the cost and feasibility of containing the spread of toads.

Relevance:

60.00%

Publisher:

Abstract:

This paper addresses the following problem: given two or more business process models, create a process model that is the union of the process models given as input. In other words, the behavior of the produced process model should encompass that of the input models. The paper describes an algorithm that produces a single configurable process model from an arbitrary collection of process models. The algorithm works by extracting the common parts of the input process models, creating a single copy of them, and appending the differences as branches of configurable connectors. This way, the merged process model is kept as small as possible, while still capturing all the behavior of the input models. Moreover, analysts are able to trace back from which original model(s) a given element in the merged model comes. The algorithm has been prototyped and tested against process models taken from several application domains.
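
For strictly sequential models, the common-parts-plus-configurable-branches idea can be sketched with a longest-common-subsequence match (task names are invented; the actual algorithm handles full process graphs, not just sequences).

```python
# Sketch of the merging idea for the special case of two sequential
# process models: shared fragments are kept once, and differing
# fragments become the branches of a configurable ('XOR') connector.
from difflib import SequenceMatcher

def merge_models(a, b):
    """Merge two task sequences into one configurable sequence."""
    merged = []
    for tag, i1, i2, j1, j2 in SequenceMatcher(None, a, b).get_opcodes():
        if tag == 'equal':
            merged.extend(a[i1:i2])            # common part, single copy
        else:
            # Differences become alternative branches (possibly empty).
            merged.append(('XOR', tuple(a[i1:i2]), tuple(b[j1:j2])))
    return merged

m1 = ['receive order', 'check stock', 'ship', 'invoice']
m2 = ['receive order', 'check credit', 'ship', 'invoice']
merged = merge_models(m1, m2)
```

Because each branch of the connector records which input it came from (first slot vs second), an analyst can trace every element of the merged model back to its source model, as the abstract describes.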

Relevance:

60.00%

Publisher:

Abstract:

As order dependencies between process tasks can get complex, it is easy to make mistakes in process model design, especially behavioral ones such as deadlocks. Notions such as soundness formalize behavioral errors, and tools exist that can identify such errors. However, these tools do not provide assistance with the correction of the process models. Error correction can be very challenging, as the intentions of the process modeler are not known and there may be many ways in which an error can be corrected. We present a novel technique for automatic error correction in process models based on simulated annealing. Using this technique, a number of alternative process models are identified that resolve one or more errors in the original model. The technique has been implemented and validated on a sample of industrial process models. The tests show that at least one sound solution can be found for each input model and that the response times are short.
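
The abstract applies simulated annealing to process models; the skeleton below shows only the annealing loop itself, on an invented stand-in problem (a bit string whose 1-bits count as "errors"), not on real process models.

```python
# Generic simulated-annealing skeleton (the toy "model" and its error
# count are invented): propose a local change, always accept
# improvements, and accept deteriorations with a probability that
# shrinks as the temperature cools.
import math
import random

def anneal(state, errors, neighbour, t0=2.0, cooling=0.95, steps=400):
    """Minimise errors(state) by simulated annealing."""
    rng = random.Random(7)                  # fixed seed: reproducible run
    best = current = state
    t = t0
    for _ in range(steps):
        candidate = neighbour(current, rng)
        delta = errors(candidate) - errors(current)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = candidate             # accept the move
        if errors(current) < errors(best):
            best = current                  # track the best model seen
        t *= cooling                        # cool down gradually
    return best

def flip_one_bit(s, rng):
    """Local change: flip one randomly chosen bit."""
    i = rng.randrange(len(s))
    return s[:i] + (1 - s[i],) + s[i + 1:]

def errs(s):
    return sum(s)                           # every 1-bit is one "error"

solution = anneal((1, 1, 0, 1, 1, 1, 0, 1), errs, flip_one_bit)
```

Early on, the high temperature lets the search escape local minima by accepting occasional worsening edits; as it cools, the loop becomes greedy and settles on a low-error state, which is the mechanism the correction technique builds on.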