944 results for Hazard-Based Models


Relevance: 90.00%

Abstract:

In this article, we study some relevant information divergence measures, viz. Rényi divergence and Kerridge's inaccuracy measures. These measures are extended to conditionally specified models and are used to characterize some bivariate distributions using the concepts of weighted and proportional hazard rate models. Moreover, some bounds are obtained for these measures using the likelihood ratio order.
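Both measures have simple closed forms for discrete distributions; a minimal sketch, with p and q invented for illustration (the paper works with hazard-rate characterizations, not this toy case):

```python
import math

def renyi_divergence(p, q, alpha=0.5):
    """Rényi divergence of order alpha (alpha > 0, alpha != 1) between
    two discrete distributions given as probability lists."""
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1.0)

def kerridge_inaccuracy(p, q):
    """Kerridge's inaccuracy: the expected surprise of q under the
    true distribution p (equals Shannon entropy when q == p)."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

p, q = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
print(renyi_divergence(p, q))      # > 0 whenever p != q
print(kerridge_inaccuracy(p, q))   # >= entropy of p (Gibbs inequality)
```

The divergence is zero exactly when p equals q, which is what makes it usable for the characterization results described above.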

Relevance: 90.00%

Abstract:

Understanding how biological visual systems perform object recognition is one of the ultimate goals in computational neuroscience. Among the biological models of recognition, the main distinctions are between feedforward and feedback models and between object-centered and view-centered ones. From a computational viewpoint, the different recognition tasks, for instance categorization and identification, are very similar, representing different trade-offs between specificity and invariance. Thus the different tasks do not strictly require different classes of models. The focus of the review is on feedforward, view-based models that are supported by psychophysical and physiological data.

Relevance: 90.00%

Abstract:

Baylis & Driver (Nature Neuroscience, 2001) have recently presented data on the response of neurons in macaque inferotemporal cortex (IT) to various stimulus transformations. They report that neurons can generalize over contrast and mirror reversal, but not over figure-ground reversal. This finding is taken to demonstrate that "the selectivity of IT neurons is not determined simply by the distinctive contours in a display, contrary to simple edge-based models of shape recognition", citing our recently presented model of object recognition in cortex (Riesenhuber & Poggio, Nature Neuroscience, 1999). In this memo, I show that the main effects of the experiment can be obtained by performing the appropriate simulations in our simple feedforward model. This suggests that the contributions to IT cell tuning of the explicit edge-assignment processes postulated by Baylis & Driver (2001) might be smaller than expected.

Relevance: 90.00%

Abstract:

LiDAR (Light Detection and Ranging) technology, based on scanning the territory with an airborne laser rangefinder, allows the construction of Digital Surface Models (DSM) by simple interpolation, as well as of Digital Terrain Models (DTM) by identifying and removing the objects present on the terrain (buildings, bridges or trees). The Geomatics Laboratory of the Politecnico di Milano (Como Campus) developed a LiDAR data filtering algorithm based on interpolation with bilinear and bicubic splines, with Tychonov regularisation in a least-squares approach. However, in many cases still more refined and complex models are needed, in which distinguishing between buildings and vegetation becomes mandatory. This may be the case for some hydrological risk prevention models, where the vegetation is not needed, or for the three-dimensional modelling of urban centres, where the vegetation is a problematic factor. (...)
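The abstract does not reproduce the filtering algorithm, but its core ingredient, least squares with Tychonov (ridge) regularisation, can be sketched on the simplest possible case, a straight-line fit; the data and the regularisation weight below are invented:

```python
def tikhonov_fit(xs, ys, lam):
    """Fit y = a*x + b by least squares with Tychonov (ridge)
    regularisation on the coefficients: minimise
    sum (a*x + b - y)^2 + lam * (a^2 + b^2)."""
    n = len(xs)
    sxx = sum(x * x for x in xs)
    sx = sum(xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sy = sum(ys)
    # Normal equations (A^T A + lam*I) [a, b]^T = A^T y, solved directly (2x2)
    m11, m12, m21, m22 = sxx + lam, sx, sx, n + lam
    det = m11 * m22 - m12 * m21
    a = (m22 * sxy - m12 * sy) / det
    b = (m11 * sy - m21 * sxy) / det
    return a, b

# lam = 0 recovers plain least squares; lam > 0 shrinks the coefficients,
# which is what stabilises the spline fit against noisy LiDAR returns.
print(tikhonov_fit([0, 1, 2, 3], [0, 2, 4, 6], 0.0))   # (2.0, 0.0)
```

In the actual algorithm the unknowns are spline coefficients rather than a slope and intercept, but the regularised normal equations have the same shape.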

Relevance: 90.00%

Abstract:

In recent years, several experiments have shown that individuals exhibit authentic reciprocal behaviour in anonymous one-shot interactions. As reciprocity has been shown to be relevant in several economic fields, there have also been several attempts to model reciprocal behaviour. I review the intention-based models of reciprocity and present an example of teacher management in the public sector in which the government offers an incentive scheme to implement a program. The incentive scheme has a prisoner's dilemma structure. In both the simultaneous and the sequential game, equilibrium results may differ from those predicted by standard theory.
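A toy best-reply computation can illustrate how an intention-based kindness term changes the equilibrium of a prisoner's dilemma; the payoffs and the reciprocity term are illustrative, not the paper's specification:

```python
# Material payoffs (row player, column player); prisoner's dilemma structure.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def best_reply(opponent_action, kindness_weight=0.0):
    """Best reply with a crude intention-based reciprocity term: the
    player adds kindness_weight times the opponent's payoff whenever the
    opponent was kind (played C). A sketch, not Rabin's full model."""
    def utility(own, opp):
        mine, theirs = PAYOFF[(own, opp)]
        bonus = kindness_weight * theirs if opp == 'C' else 0.0
        return mine + bonus
    return max(['C', 'D'], key=lambda a: utility(a, opponent_action))

print(best_reply('C', 0.0))  # 'D': with purely material payoffs, defection dominates
print(best_reply('C', 1.0))  # 'C': a strong enough reciprocity motive sustains cooperation
```

This is the mechanism behind the abstract's claim that equilibrium results may differ from standard theory: the same material game supports cooperation once intentions enter the utility function.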

Relevance: 90.00%

Abstract:

Despite the many models developed for phosphorus concentration prediction at differing spatial and temporal scales, there has been little effort to quantify uncertainty in their predictions. Model prediction uncertainty quantification is desirable for informed decision-making in river-systems management. An uncertainty analysis of a process-based model, the integrated catchment model of phosphorus (INCA-P), within the generalised likelihood uncertainty estimation (GLUE) framework is presented. The framework is applied to the Lugg catchment (1,077 km²), a River Wye tributary on the England–Wales border. Daily discharge and monthly phosphorus (total reactive and total), for a limited number of reaches, are used to initially assess the uncertainty and sensitivity of 44 model parameters identified as being most important for discharge and phosphorus predictions. This study demonstrates that parameter homogeneity assumptions (spatial heterogeneity is treated as land use type fractional areas) can achieve higher model fits than a previously expertly calibrated parameter set. The model is capable of reproducing the hydrology, but a threshold Nash-Sutcliffe coefficient of determination (E, or R²) of 0.3 is not achieved when simulating observed total phosphorus (TP) data in the upland reaches or total reactive phosphorus (TRP) in any reach. Despite this, the model reproduces the general dynamics of TP and TRP in the point-source-dominated lower reaches. This paper discusses why this application of INCA-P fails to find any parameter sets that simultaneously describe all observed data acceptably. The discussion focuses on the uncertainty of readily available input data, and on whether such process-based models should be used when there is insufficient data to support their many parameters.
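The Nash-Sutcliffe threshold and the GLUE screening step described above can be sketched as follows (the observations and the runs list are invented):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency E: 1 - SSE / variance of the observations.
    E = 1 is a perfect fit; E <= 0 is no better than the observed mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst

def glue_behavioural(runs, threshold=0.3):
    """GLUE screening: keep the parameter sets whose simulation clears
    the efficiency threshold (the 'behavioural' runs)."""
    return [params for params, e in runs if e >= threshold]

obs = [1.0, 2.0, 4.0, 3.0]
print(nash_sutcliffe(obs, obs))        # 1.0 (perfect fit)
print(nash_sutcliffe(obs, [2.5] * 4))  # 0.0 (no better than the mean)
```

The paper's finding that no parameter set clears E = 0.3 for TRP corresponds to `glue_behavioural` returning an empty list for those reaches.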

Relevance: 90.00%

Abstract:

We assessed the vulnerability of blanket peat to climate change in Great Britain using an ensemble of 8 bioclimatic envelope models. We used 4 published models that ranged from simple threshold models, based on total annual precipitation, to Generalised Linear Models (GLMs, based on mean annual temperature). In addition, 4 new models were developed which included measures of water deficit as threshold, classification tree, GLM and generalised additive models (GAM). Models that included measures of both hydrological conditions and maximum temperature provided a better fit to the mapped peat area than models based on hydrological variables alone. Under UKCIP02 projections for high (A1FI) and low (B1) greenhouse gas emission scenarios, 7 out of the 8 models showed a decline in the bioclimatic space associated with blanket peat. Eastern regions (Northumbria, North York Moors, Orkney) were shown to be more vulnerable than higher-altitude, western areas (Highlands, Western Isles and Argyle, Bute and The Trossachs). These results suggest a long-term decline in the distribution of actively growing blanket peat, especially under the high emissions scenario, although it is emphasised that existing peatlands may well persist for decades under a changing climate. Observational data from long-term monitoring and manipulation experiments, in combination with process-based models, are required to explore the nature and magnitude of climate change impacts on these vulnerable areas more fully.
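A threshold model of the simplest kind in the ensemble can be sketched as a pair of climatic cut-offs; the threshold values below are invented, not those of the published models:

```python
def in_peat_envelope(annual_precip_mm, max_temp_c,
                     precip_threshold=1000.0, temp_threshold=15.0):
    """Threshold-style bioclimatic envelope: a grid cell lies inside the
    blanket-peat bioclimatic space when it is wet enough and its maximum
    temperature stays below a ceiling. Thresholds are illustrative."""
    return annual_precip_mm >= precip_threshold and max_temp_c <= temp_threshold

# One wet cool cell, one too dry, one too warm:
cells = [(1800, 12.0), (900, 14.0), (1600, 17.0)]
print([in_peat_envelope(p, t) for p, t in cells])  # [True, False, False]
```

Projected warming shrinks the set of cells satisfying the temperature condition, which is the mechanism behind the decline in bioclimatic space reported above.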

Relevance: 90.00%

Abstract:

Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested in numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define from the literature the key factors in assessing a model's quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes, and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.

Relevance: 90.00%

Abstract:

Integrated simulation models can be useful tools in farming system research. This chapter reviews three commonly used approaches, i.e. linear programming, system dynamics and agent-based models. Applications of each approach are presented, and strengths and drawbacks discussed. We argue that, despite some challenges, mainly related to the integration of different approaches, model validation and the representation of human agents, integrated simulation models contribute important insights to the analysis of farming systems. They help to unravel the complex and dynamic interactions and feedbacks among bio-physical, socio-economic and institutional components across scales and levels in farming systems. In addition, they can provide a platform for integrative research, and can support transdisciplinary research by functioning as learning platforms in participatory processes.

Relevance: 90.00%

Abstract:

Earthworms are important organisms in soil communities and so are used as model organisms in environmental risk assessments of chemicals. However, current risk assessments of soil invertebrates are based on short-term laboratory studies of limited ecological relevance, supplemented if necessary by site-specific field trials, which are sometimes challenging to apply across the whole agricultural landscape. Here, we investigate whether population responses to environmental stressors and pesticide exposure can be accurately predicted by combining energy budget and agent-based models (ABMs), based on knowledge of how individuals respond to their local circumstances. A simple energy budget model was implemented within each earthworm Eisenia fetida in the ABM, based on a priori parameter estimates. From broadly accepted physiological principles, simple algorithms specify how energy acquisition and expenditure drive life cycle processes. Each individual allocates energy between maintenance, growth and/or reproduction under varying conditions of food density, soil temperature and soil moisture. When simulating published experiments, good model fits were obtained to experimental data on individual growth, reproduction and starvation. Using the energy budget model as a platform, we developed methods to identify which of the physiological parameters in the energy budget model (rates of ingestion, maintenance, growth or reproduction) are primarily affected by pesticide applications, producing four hypotheses about how toxicity acts. We tested these hypotheses by comparing model outputs with published toxicity data on the effects of copper oxychloride and chlorpyrifos on E. fetida. Both growth and reproduction were directly affected in experiments in which sufficient food was provided, whilst maintenance was targeted under food limitation.
Although we only incorporate toxic effects at the individual level, we show how ABMs can readily extrapolate to larger scales by providing good model fits to field population data. The ability of the presented model to fit the available field and laboratory data for E. fetida demonstrates the promise of the agent-based approach in ecology, by showing how biological knowledge can be used to make ecological inferences. Further work is required to extend the approach to populations of more ecologically relevant species studied at the field scale. Such a model could help extrapolate from laboratory to field conditions, from one set of field conditions to another, or from species to species.
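The energy allocation rule described above (maintenance paid first, the surplus going to growth or reproduction, and shortfall meaning starvation) can be sketched as one daily time step; the numbers and the adult/juvenile switch are illustrative, not the published parameterisation:

```python
def allocate_energy(assimilated, maintenance_cost, is_adult):
    """One time step of a minimal earthworm energy budget: maintenance is
    paid first; any surplus goes to growth (juveniles) or reproduction
    (adults); a shortfall is met by losing body mass (starvation)."""
    surplus = assimilated - maintenance_cost
    if surplus < 0:
        return {'growth': 0.0, 'reproduction': 0.0, 'mass_loss': -surplus}
    if is_adult:
        return {'growth': 0.0, 'reproduction': surplus, 'mass_loss': 0.0}
    return {'growth': surplus, 'reproduction': 0.0, 'mass_loss': 0.0}

print(allocate_energy(5.0, 2.0, is_adult=True))
# {'growth': 0.0, 'reproduction': 3.0, 'mass_loss': 0.0}
print(allocate_energy(1.0, 2.0, is_adult=False))
# {'growth': 0.0, 'reproduction': 0.0, 'mass_loss': 1.0}
```

In the ABM each individual runs such an update under its local food density, temperature and moisture, and a pesticide hypothesis is encoded by perturbing one of the rates (ingestion, maintenance, growth or reproduction).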

Relevance: 90.00%

Abstract:

The potential risk of agricultural pesticides to mammals typically depends on internal concentrations within individuals, and these are determined by the amount ingested and by absorption, distribution, metabolism, and excretion (ADME). Pesticide residues ingested depend, amongst other things, on individual spatial choices, which determine how much and when feeding sites and areas of pesticide application overlap, and can be calculated using individual-based models (IBMs). Internal concentrations can be calculated using toxicokinetic (TK) models, which are quantitative representations of ADME processes. Here we provide a population model for the wood mouse (Apodemus sylvaticus) in which TK submodels were incorporated into an IBM representation of individuals making choices about where to feed. This allows us to estimate the contribution of individual spatial choice and TK processes to risk. We compared the risk predicted by four IBMs: (i) “AllExposed-NonTK”: assuming no spatial choice so all mice have 100% exposure, no TK, (ii) “AllExposed-TK”: identical to (i) except that TK processes are included, where individuals vary because they have different temporal patterns of ingestion in the IBM, (iii) “Spatial-NonTK”: individual spatial choice, no TK, and (iv) “Spatial-TK”: individual spatial choice with TK. The TK parameters for the hypothetical pesticides used in this study were selected such that a conventional risk assessment would fail. Exposures were standardised using risk quotients (RQ; exposure divided by LD50 or LC50). We found that, for the exposed sub-population, including either spatial choice or TK reduced the RQ by 37–85%, and for the total population the reduction was 37–94%. However, spatial choice and TK together had little further effect in reducing RQ.
The reasons for this are that when the proportion of time spent in the treated crop (PT) approaches 1, TK processes dominate and spatial choice has very little effect; conversely, if PT is small, spatial choice dominates and TK makes little contribution to exposure reduction. The latter situation means that a short time spent in the pesticide-treated field mimics exposure from a small gavage dose, whereas TK only makes a substantial difference when the dose is consumed over a longer period. We concluded that a combined TK-IBM is most likely to bring added value to the risk assessment process when the temporal pattern of feeding, the time spent in the exposed area and the TK parameters are at an intermediate level; for instance, wood mice in foliar spray scenarios spending more time in crop fields because of better plant cover.
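The interplay between dose timing and elimination can be sketched with a one-compartment toxicokinetic update and the RQ standardisation described above; the elimination rate and the dose schedules are invented:

```python
import math

def internal_conc(doses, k_elim, dt=1.0):
    """One-compartment toxicokinetics: repeated dosing with first-order
    elimination. doses[i] is the amount ingested in time step i; returns
    the internal amount just after the last dose."""
    c = 0.0
    for dose in doses:
        c = c * math.exp(-k_elim * dt) + dose
    return c

def risk_quotient(exposure, ld50):
    """RQ = exposure divided by LD50 (or LC50), as in the paper."""
    return exposure / ld50

# The same total dose is less risky when spread over time, because
# elimination acts between ingestion events (the TK effect):
bolus = internal_conc([10.0], k_elim=0.5)
spread = internal_conc([2.0] * 5, k_elim=0.5)
print(risk_quotient(bolus, ld50=20.0) > risk_quotient(spread, ld50=20.0))  # True
```

This is the mechanism behind the conclusion above: TK only changes the picture when intake is spread over time, i.e. when PT and the feeding pattern are at an intermediate level.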

Relevance: 90.00%

Abstract:

Human Body Thermoregulation Models have been widely used in the fields of human physiology and thermal comfort studies. However, there are few studies on evaluation methods for these models. This paper summarises the existing evaluation methods and critically analyses their flaws. Based on that, a method for evaluating the accuracy of Human Body Thermoregulation Models is proposed. The new evaluation method contributes to the development of Human Body Thermoregulation Models and validates their accuracy both statistically and empirically. The accuracy of different models can be compared with the new method. Furthermore, the new method is not only suitable for the evaluation of Human Body Thermoregulation Models, but can also, in principle, be applied to evaluating the accuracy of population-based models in other research fields.

Relevance: 90.00%

Abstract:

Distributed energy and water balance models require time-series surfaces of the meteorological variables involved in hydrological processes. Most hydrological GIS-based models apply simple interpolation techniques to extrapolate the point-scale values registered at weather stations to the watershed scale. In mountainous areas, where the monitoring network covers the complex terrain heterogeneity ineffectively, simple geostatistical methods for spatial interpolation are not always representative enough, and algorithms that explicitly or implicitly account for the features creating strong local gradients in the meteorological variables must be applied. Originally developed as a meteorological pre-processing tool for a complete hydrological model (WiMMed), MeteoMap has become an independent piece of software. The individual interpolation algorithms used to approximate the spatial distribution of each meteorological variable were carefully selected taking into account both the specific variable being mapped and the common lack of input data in Mediterranean mountainous areas. They include corrections with height for both rainfall and temperature (Herrero et al., 2007), and topographic corrections for solar radiation (Aguilar et al., 2010). MeteoMap is GIS-based freeware, available upon registration. Input data include weather station records and topographic data, and the output consists of tables and maps of the meteorological variables at hourly, daily, predefined rainfall event duration or annual scales. It offers its own pre- and post-processing tools, including video outlook, map printing and the possibility of exporting the maps to images or ASCII ArcGIS formats. This study presents the user-friendly interface of the software and shows some case studies with applications to hydrological modeling.
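A generic sketch of the kind of height correction described for temperature: reduce station readings to sea level with a lapse rate, interpolate horizontally, then lift the result to the target elevation. This is not MeteoMap's actual algorithm, and the stations below are invented:

```python
def interpolate_temperature(stations, target_xy, target_z, lapse_rate=-0.0065):
    """Inverse-distance-squared interpolation with a height correction.
    stations: list of (x, y, z, temperature) tuples; lapse_rate in
    degrees C per metre (the standard atmospheric value by default)."""
    num = den = 0.0
    for x, y, z, temp in stations:
        t_sea = temp - lapse_rate * z                      # reduce to sea level
        d2 = (x - target_xy[0]) ** 2 + (y - target_xy[1]) ** 2
        w = 1.0 / max(d2, 1e-9)                            # IDW weight, guarded
        num += w * t_sea
        den += w
    return num / den + lapse_rate * target_z               # lift to target height

# Two stations at 100 m and 500 m; interpolate to a ridge at 1000 m:
stations = [(0, 0, 100, 14.0), (10, 0, 500, 11.5)]
print(round(interpolate_temperature(stations, (5, 0), 1000), 2))  # 8.2
```

Plain IDW would return roughly the station average (about 12.8 °C) at the ridge; the lapse-rate correction is what recovers the colder high-altitude value, which is why such algorithms matter where the network undersamples the relief.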

Relevance: 90.00%

Abstract:

Starting from the idea that economic systems fall within complexity theory, in which many agents interact with each other without central control and these interactions can change the future behaviour of the agents and of the entire system, similar to a chaotic system, we extend the model of Russo et al. (2014) to carry out three experiments focusing on the interaction between Banks and Firms in an artificial economy. The first experiment concerns Relationship Banking where, according to the literature, the interaction over time between Banks and Firms can produce mutual benefits, mainly due to the reduction of the information asymmetry between them. The second experiment is related to information heterogeneity in the credit market, where the larger the bank, the higher its visibility in the credit market, increasing the number of consultations for new loans. Finally, the third experiment is about the effects on the credit market of the heterogeneity of the prices that Firms face in the goods market.

Relevance: 90.00%

Abstract:

The search for better performance in structural systems has led to more refined models, involving the analysis of a growing number of details, which must be correctly formulated in order to define a representative model of the real system. Representative models demand great detailing of the project and the search for new evaluation and analysis techniques. Model updating is one of these techniques; it can be used to improve the predictive capabilities of computer-based models. This paper presents an FRF-based finite element model updating procedure whose updating variables are physical parameters of the model. It includes damping effects in the updating procedure, assuming both proportional and non-proportional damping mechanisms. The updating parameters are defined at the element level or over macro regions of the model, so the parameters are adjusted locally, facilitating the physical interpretation of the model adjustment. Different tests with simulated and experimental data are discussed, aiming at evaluating the characteristics and potential of the methodology.
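The residual being minimised in FRF-based updating can be illustrated on a single-degree-of-freedom system; the paper's procedure updates element-level physical parameters iteratively, whereas this sketch merely recovers a stiffness by grid search over synthetic data:

```python
def frf_magnitude(omega, k, m, c):
    """Receptance FRF magnitude of a single-DOF system,
    |H(w)| = |1 / (k - m*w^2 + i*c*w)| (viscous damping c)."""
    return abs(1.0 / complex(k - m * omega ** 2, c * omega))

def update_stiffness(measured, omegas, m, c, k_grid):
    """Toy FRF-based updating: pick the stiffness that minimises the
    squared FRF residual over the measured frequency lines."""
    def residual(k):
        return sum((frf_magnitude(w, k, m, c) - h) ** 2
                   for w, h in zip(omegas, measured))
    return min(k_grid, key=residual)

# Synthetic 'measured' FRF from a true stiffness of 1200, then recover it:
omegas = [5.0, 10.0, 20.0, 40.0]
measured = [frf_magnitude(w, 1200.0, 1.0, 2.0) for w in omegas]
print(update_stiffness(measured, omegas, 1.0, 2.0,
                       [800.0, 1000.0, 1200.0, 1400.0]))  # 1200.0
```

In the actual procedure the residual is driven to zero by gradient- or sensitivity-based iteration over many element-level parameters (stiffness, mass and damping), which is what permits the local physical interpretation described above.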