941 results for Intention-based models


Relevance: 90.00%

Publisher:

Abstract:

In spite of the movement to turn political science into a true science, many mathematical methods that are now staples of physics, biology, and even economics remain uncommon in political science, especially in the study of civil war. This study applies one such method, ordinary differential equations (ODEs), to model civil war according to what one might call the capabilities school of thought, which roughly holds that civil wars end only when one side's ability to make war falls far enough to make peace truly attractive. I construct several ODE-based models and test them all to see which best predicts the instantaneous capabilities of both sides of the Sri Lankan civil war in the period from 1990 to 1994, given parameters and initial conditions. The best-performing model yields very accurate predictions of state military capabilities and reasonable short-term predictions of cumulative deaths. Analysis of the model reveals how strongly rebel finances affect the sustainability of an insurgency; most notably, the number of troops required to put down the Tamil Tigers falls by nearly a full order of magnitude when Tiger foreign funding is stopped. The study thus demonstrates that accurate foresight can come from relatively simple dynamical models, and points to the great potential of advanced and currently unconventional non-statistical mathematical methods in political science.
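
The abstract does not reproduce the equations. A minimal sketch of a capabilities-style ODE model in this spirit, with purely illustrative coefficients and a hypothetical foreign-funding term (none of it taken from the study), could look like this:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical capabilities model: S = state capability, R = rebel capability.
# Each side attrits the other; rebels regenerate in proportion to external funding.
# None of these coefficients come from the paper -- they are placeholders, and a
# fuller model would also prevent capabilities from going negative.
def capabilities(t, y, a, b, c, funding):
    S, R = y
    dS = -a * R                      # state losses driven by rebel capability
    dR = -b * S + c * funding * R    # rebel losses offset by funded recruitment
    return [dS, dR]

for funding in (1.0, 0.0):           # with and without foreign funding
    sol = solve_ivp(capabilities, (0, 36), [100.0, 40.0],
                    args=(0.02, 0.01, 0.03, funding), dense_output=True)
    S, R = sol.sol(np.linspace(0, 36, 200))
    print(f"funding={funding}: rebel capability after 36 months = {R[-1]:.1f}")
```

In this toy setup, cutting the funding term is enough to turn a growing insurgent capability into a shrinking one, which is the qualitative effect the abstract describes.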

Relevance: 90.00%

Publisher:

Abstract:

State-of-the-art process-based models have been shown to be applicable to the simulation and prediction of coastal morphodynamics. On annual to decadal time scales, however, these models may show limitations in reproducing complex natural morphological evolution patterns, such as the movement of bars and tidal channels, e.g. the observed decadal migration of the Medem Channel in the Elbe Estuary, German Bight. Here a morphodynamic model is shown to simulate the hydrodynamics and sediment budgets of the domain to some extent, but it fails to adequately reproduce the pronounced channel migration owing to an insufficient representation of bank erosion processes. In order to allow long-term simulations of the domain, a nudging method is introduced to update the model-predicted bathymetries with observations. The model-predicted bathymetry is nudged towards the true state in annual time steps. A sensitivity analysis of the user-defined correlation length scale, used to define the background error covariance matrix during the nudging procedure, suggests that the optimal error correlation length is similar to the grid cell size, here 80-90 m. In addition, spatially heterogeneous correlation lengths produce more realistic channel depths than spatially homogeneous ones. Consecutive application of the nudging method compensates for the (stand-alone) model prediction errors and corrects the channel migration pattern, with a Brier skill score of 0.78. The nudging method proposed in this study serves as an analytical approach to update model predictions towards a predefined 'true' state for the spatio-temporal interpolation of incomplete morphological data in long-term simulations.
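
As a rough illustration of this kind of update, and not the authors' implementation, a one-dimensional nudge of a modelled bed profile towards sparse soundings with a Gaussian error correlation of length L could be sketched as follows (L is set near the reported 80-90 m optimum; the grid, profile and observations are hypothetical):

```python
import numpy as np

def nudge_bathymetry(z_model, z_obs, x, L=85.0, alpha=1.0):
    """Relax a modelled bed level towards sparse observations.

    z_model : modelled bed elevations at grid positions x (m)
    z_obs   : dict {position: observed elevation}
    L       : error correlation length (m), here ~ one grid cell
    alpha   : nudging strength in (0, 1]
    """
    z_new = z_model.copy()
    for xo, zo in z_obs.items():
        w = np.exp(-0.5 * ((x - xo) / L) ** 2)        # Gaussian error correlation
        innovation = zo - np.interp(xo, x, z_model)   # obs minus model at the sounding
        z_new += alpha * w * innovation               # spread the correction spatially
    return z_new

x = np.arange(0.0, 2000.0, 85.0)          # grid spacing comparable to L
z_model = -5.0 + 0.002 * x                # hypothetical modelled profile
obs = {425.0: -6.5, 1275.0: -3.0}         # hypothetical soundings
print(nudge_bathymetry(z_model, obs, x)[:5].round(2))
```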

Relevance: 90.00%

Publisher:

Abstract:

Within the regression framework, we show how different levels of nonlinearity influence the instantaneous firing rate prediction of single neurons. Nonlinearity can be achieved in several ways. In particular, we can enrich the predictor set with basis expansions of the input variables (enlarging the number of inputs) or train a separate simple model for each region of the data domain. Spline-based models are popular within the first category; kernel smoothing methods fall into the second. Whereas the first choice is useful for globally characterizing complex functions, the second is very handy for temporal data and is able to include inner-state subject variations. Interactions among stimuli are also considered. We compare state-of-the-art firing rate prediction methods with more sophisticated spline-based nonlinear methods: multivariate adaptive regression splines and sparse additive models. We also study the impact of kernel smoothing. Finally, we explore the combination of various local models in an incremental learning procedure. Our goal is to demonstrate that appropriate nonlinearity treatment can greatly improve the results. We test our hypothesis on both synthetic data and real neuronal recordings in cat primary visual cortex, giving a plausible explanation of the results from a biological perspective.
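
A minimal sketch of the kernel-smoothing idea mentioned here, using a Nadaraya-Watson estimator on synthetic tuning-curve data (not the paper's models or recordings):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, bandwidth=0.2):
    """Kernel-smoothed estimate of a firing rate as a function of a stimulus.
    A Gaussian kernel is used; the bandwidth controls the degree of nonlinearity."""
    d = x_eval[:, None] - x_train[None, :]
    w = np.exp(-0.5 * (d / bandwidth) ** 2)        # kernel weights per evaluation point
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(0)
stim = rng.uniform(0, 2 * np.pi, 500)                            # hypothetical stimulus phase
rate = np.exp(1.5 * np.cos(stim)) + rng.normal(0, 0.3, 500)      # noisy tuning curve (toy)
grid = np.linspace(0, 2 * np.pi, 8)
print(nadaraya_watson(stim, rate, grid).round(2))                # smoothed rate estimates
```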

Relevance: 90.00%

Publisher:

Abstract:

We propose a level-set-based variational approach that incorporates shape priors into edge-based and region-based models. The evolution of the active contour depends on both local and global information and has been implemented using an efficient narrow-band technique. For each boundary pixel we compute its dynamics according to its gray level, its neighborhood, and geometric properties established by the training shapes. We also propose a criterion for shape alignment based on an affine transformation using an image normalization procedure. Finally, we illustrate the benefits of our approach on liver segmentation from CT images.
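
To make the ingredients concrete, a heavily simplified single update step of such an evolution, combining an edge-stopped curvature term, a narrow-band restriction and a pull towards an aligned shape prior, might be sketched as follows (the weights and the exact form of the prior term are assumptions, not the paper's formulation):

```python
import numpy as np

def level_set_step(phi, g, phi_prior, dt=0.2, mu=0.3, band=2.0):
    """One heavily simplified level-set update.

    phi       : current level set (signed distance function)
    g         : edge indicator in [0, 1], small on strong image edges
    phi_prior : signed distance function of the aligned training shape
    mu        : assumed weight of the shape-prior term
    """
    gy, gx = np.gradient(phi)                      # gradient along rows, columns
    norm = np.sqrt(gx ** 2 + gy ** 2) + 1e-8
    nyy, _ = np.gradient(gy / norm)
    _, nxx = np.gradient(gx / norm)
    curvature = nxx + nyy                          # divergence of the unit normal
    update = (g * curvature * norm + mu * (phi_prior - phi)) * dt
    update[np.abs(phi) > band] = 0.0               # narrow-band restriction
    return phi + update
```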

Relevance: 90.00%

Publisher:

Abstract:

Over the last decade, wind power forecasting has played a fundamental role in the exploitation of this renewable resource, since it reduces the impact that the fluctuating nature of wind has on the various agents involved in its integration, such as the system operator or electricity market agents. The high wind penetration levels recently reached by some countries have highlighted the need to improve forecasts during events in which the power generated by a wind farm, or a group of farms, varies substantially over a relatively short time (on the order of a few hours). These events, known as ramps, have no single cause: they may be driven by meteorological processes occurring at very different spatio-temporal scales, from the passage of large frontal systems at the macroscale to local convective processes such as thunderstorms. In addition, the wind-to-power conversion process itself plays a relevant role in the occurrence of ramps owing, among other factors, to the nonlinear relationship imposed by the turbine power curve, the misalignment of the machine with respect to the wind, and the aerodynamic interaction between turbines. This work addresses the application of statistical models to very short-term ramp forecasting and investigates the relationship between this type of event and macroscale atmospheric processes. The models are used to generate point forecasts from the stochastic modelling of a power time series produced by a wind farm, for prediction horizons from one to six hours. As a first step, a methodology was developed to characterise ramps in time series. The so-called ramp function is based on the wavelet transform and provides an index at each time step that characterises ramp intensity in terms of the power gradients experienced over a given range of time scales. Three types of predictive models were implemented in order to assess the role that model complexity plays in performance: linear autoregressive (AR) models, varying-coefficient models (VCMs) and models based on artificial neural networks (ANNs). The models were trained by minimising the mean squared error, and the configuration of each was determined by cross-validation. To analyse the contribution of the macroscale state of the atmosphere to ramp forecasting, a methodology was proposed to extract, from the output of meteorological models, information relevant for explaining the occurrence of these events. The methodology is based on principal component analysis (PCA) for the synthesis of the atmospheric data and on mutual information (MI) to estimate the nonlinear dependence between two signals. It was applied to reanalysis data generated with a general circulation model (GCM) in order to produce exogenous variables that were subsequently introduced into the predictive models. The case studies considered correspond to two wind farms located in Spain. The results show that modelling the power series yielded a notable improvement over the reference predictive model (persistence) and that adding macroscale information produced further improvements of the same order. These improvements were larger for ramp-down events. The results also indicate different degrees of connection between the macroscale and ramp occurrence at the two wind farms considered.

Abstract

One of the main drawbacks of wind energy is that it exhibits intermittent generation that depends greatly on environmental conditions. Wind power forecasting has proven to be an effective tool for facilitating wind power integration from both the technical and the economic perspective. Indeed, system operators and energy traders benefit from the use of forecasting techniques, because reducing the inherent uncertainty of wind power allows them to adopt optimal decisions. Wind power integration imposes new challenges as higher wind penetration levels are attained. Wind power ramp forecasting is an example of such a recent topic of interest. The term ramp refers to a large and rapid variation (1-4 hours) observed in the wind power output of a wind farm or portfolio. Ramp events can be driven by a broad range of meteorological processes that occur at different temporal and spatial scales, from the passage of large-scale frontal systems to local processes such as thunderstorms and thermally driven flows. Ramp events may also be conditioned by features of the wind-to-power conversion process, such as yaw misalignment, wind turbine shut-down and the aerodynamic interaction between the turbines of a wind farm (the wake effect). This work is devoted to wind power ramp forecasting, with special focus on the connection between the global scale and ramp events observed at the wind farm level, within a point-forecasting framework. Time-series-based models were implemented for very short-term prediction, characterised by prediction horizons of up to six hours ahead. As a first step, a methodology to characterise ramps within a wind power time series was proposed. The so-called ramp function is based on the wavelet transform and provides a continuous index related to the ramp intensity at each time step; the underlying idea is that ramps are characterised by high power-output gradients evaluated over different time scales. A number of state-of-the-art time-series models were considered, namely linear autoregressive (AR) models, varying-coefficient models (VCMs) and artificial neural networks (ANNs). This allowed us to gain insight into how model complexity contributes to the accuracy of wind power time-series modelling. The models were trained on a mean squared error criterion and the final set-up of each model was determined through cross-validation. In order to investigate the contribution of the global scale to wind power ramp forecasting, a methodology was proposed to identify features in raw atmospheric data that are relevant for explaining wind power ramp events. It is based on two techniques: principal component analysis (PCA) for atmospheric data compression and mutual information (MI) for assessing nonlinear dependence between variables. The methodology was applied to reanalysis data generated with a general circulation model (GCM).
This allowed for the elaboration of explanatory variables meaningful for ramp forecasting, which were used as exogenous variables by the forecasting models. The study covered two wind farms located in Spain. All the models outperformed the reference model (persistence) during both ramp and non-ramp situations. Adding atmospheric information had a noticeable impact on forecasting performance, especially during ramp-down events. The results also suggested different levels of connection between ramp occurrence at the wind farm level and the global scale.
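
The thesis builds its ramp function on the wavelet transform. As a rough reading of the same idea (and only that), a multi-scale gradient index over an hourly power series could be sketched as:

```python
import numpy as np

def ramp_index(power, scales=(1, 2, 4, 6)):
    """Continuous ramp-intensity index for a wind power series (hourly values).

    For each time step, take the power change over several time scales (in hours)
    and keep the one with the largest magnitude. This is a simplified multi-scale
    gradient with the same intent as the wavelet-based ramp function, not a
    reimplementation of it.
    """
    power = np.asarray(power, dtype=float)
    idx = np.zeros_like(power)
    for s in scales:
        diff = np.zeros_like(power)
        diff[s:] = power[s:] - power[:-s]                 # change over the last s hours
        idx = np.where(np.abs(diff) > np.abs(idx), diff, idx)
    return idx  # positive: ramp-up, negative: ramp-down

p = np.array([10, 11, 12, 30, 55, 60, 58, 30, 12, 10], dtype=float)  # toy series (MW)
print(ramp_index(p).round(1))
```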

Relevance: 90.00%

Publisher:

Abstract:

One of the fundamental aspects of adapting teaching to the European Higher Education Area is the shift from models based on teacher instruction to models based on student learning. In this work we present an educational experience developed with the case method, with a clearly multidisciplinary character. The experience was developed in the teaching of the analysis and verification of safety rails, a multidisciplinary field that presents great difficulties in its teaching. The use of the case method has given good results in the competences achieved by the students.

Relevance: 90.00%

Publisher:

Abstract:

This paper describes the impact of electric mobility on the transmission grid in the Flanders region (Belgium), using micro-simulation activity-based models. These models provide temporal and spatial estimates of the energy and power demanded by electric vehicles (EVs) in different mobility zones. The increment in load demand due to electric mobility is added to the background load demand in these mobility areas, and the effects on the transmission substations are analyzed. From this information, the total storage capacity per zone is evaluated and some strategies for an EV aggregator are proposed, allowing the aggregator to fulfill bids on the electricity markets.
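
A toy illustration of the zonal aggregation step (the load profile, fleet size, charging power and substation rating below are all assumed for the example and do not come from the paper):

```python
import numpy as np

# Add zonal EV charging demand to the background load and flag hours in which an
# assumed transmission substation rating would be exceeded.
hours = np.arange(24)
background_mw = 80 + 25 * np.sin((hours - 6) / 24 * 2 * np.pi)   # toy zone profile
evs_charging = 100 * np.array([2, 1, 1, 0, 0, 1, 3, 6, 8, 6, 4, 3,
                               3, 3, 4, 5, 8, 12, 14, 12, 9, 6, 4, 3])
ev_mw = evs_charging * 0.007        # ~7 kW per charging vehicle (assumed)

total_mw = background_mw + ev_mw
limit_mw = 105.0                    # assumed substation rating
print(f"peak zonal load: {total_mw.max():.1f} MW")
print("hours over the substation limit:", hours[total_mw > limit_mw])
```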

Relevance: 90.00%

Publisher:

Abstract:

Carbon (C) and nitrogen (N) process-based models are important tools for estimating and reporting greenhouse gas emissions and changes in soil C stocks. There is a need for continuous evaluation, development and adaptation of these models to improve scientific understanding, national inventories and the assessment of mitigation options across the world. To date, much of the information needed to describe processes such as transpiration, photosynthesis, plant growth and maintenance, above- and below-ground carbon dynamics, decomposition and nitrogen mineralization in ecosystem models remains inaccessible to the wider community, being stored within model source code or held internally by modelling teams. Here we describe the Global Research Alliance Modelling Platform (GRAMP), a web-based modelling platform to link researchers with appropriate datasets, models and training material. It will provide access to model source code and an interactive platform for researchers to form a consensus on existing methods and to synthesize new ideas, which will help to advance progress in this area. The platform will eventually support a variety of models, but to trial the platform and test its architecture and functionality, it was piloted with variants of the DNDC model. The intention is to form a worldwide collaborative network (a virtual laboratory) via an interactive website with access to models and best-practice guidelines; appropriate datasets for testing, calibrating and evaluating models; on-line tutorials; and links to modelling and data-provider research groups and their associated publications. A graphical user interface has been designed to view the model development tree and to access all of the above functions.

Relevance: 90.00%

Publisher:

Abstract:

In recent years, the computer vision community has shown great interest in depth-based applications, thanks to the performance and flexibility of the new generation of RGB-D imagery. In this paper, we present an efficient background subtraction algorithm based on the fusion of multiple region-based classifiers that processes depth and color data provided by RGB-D cameras. Foreground objects are detected by combining a region-based foreground prediction (based on depth data) with different background models (based on a Mixture of Gaussians algorithm) that provide color and depth descriptions of the scene at pixel and region level. The information given by these modules is fused in a mixture-of-experts fashion to improve the foreground detection accuracy. The main contributions of the paper are the region-based models of both background and foreground, built from the depth and color data. Results obtained on different database sequences demonstrate that the proposed approach leads to higher detection accuracy than existing state-of-the-art techniques.
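
As a rough pixel-level stand-in for this kind of pipeline (not the paper's region-based, mixture-of-experts algorithm), one could run an OpenCV MOG2 background model on each stream and fuse the two masks with an assumed weighting:

```python
import cv2
import numpy as np

# One MOG2 background model on the colour stream and one on the depth stream,
# fused with a simple weighted vote. Thresholds and weights are assumptions.
bg_color = cv2.createBackgroundSubtractorMOG2(history=300, varThreshold=16,
                                              detectShadows=True)
bg_depth = cv2.createBackgroundSubtractorMOG2(history=300, varThreshold=12,
                                              detectShadows=False)

def foreground_mask(color_frame, depth_frame):
    """color_frame: HxWx3 uint8, depth_frame: HxW uint16 from an RGB-D camera."""
    depth_8u = cv2.convertScaleAbs(depth_frame, alpha=255.0 / 4000.0)  # ~4 m range assumed
    m_color = bg_color.apply(color_frame)
    m_depth = bg_depth.apply(depth_8u)
    fg_color = (m_color == 255).astype(np.float32)   # ignore the MOG2 shadow label (127)
    fg_depth = (m_depth == 255).astype(np.float32)
    fused = 0.4 * fg_color + 0.6 * fg_depth          # crude expert weighting (assumed)
    return (fused > 0.5).astype(np.uint8) * 255
```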

Relevance: 90.00%

Publisher:

Abstract:

Proper management of supply chains is fundamental to the overall system performance of forest-based activities. Efficient management techniques usually rely on decision support software, which needs to generate fast and effective outputs from the set of possibilities. In order to do this, it is necessary to provide accurate models representative of the dynamic interactions of the systems. Given the nature of forest-based supply chains, event-based models are better suited to describing their behaviour. This work proposes the modelling and simulation of a forest-based supply chain, in particular the biomass supply chain, through the SimPy framework. This Python-based tool allows the modelling of discrete-event systems using constructs such as events, processes and resources. The developed model was used to assess the impact of changes in the daily working plan in three situations. First, as a control case, the deterministic behaviour was simulated. Second, a machine delay was introduced and its implications for plan accomplishment were analysed. Finally, to better address real operating conditions, stochastic processing and driving times were simulated. The obtained results validate the SimPy simulation environment as a framework for modelling supply chains in general and the biomass problem in particular.
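
A minimal SimPy sketch in this spirit (the chipper/truck setup, process times and working-day figures are invented for illustration; this is not the authors' model):

```python
import random
import simpy

# A chipper processes piles at roadside and loads are hauled to a plant.
RANDOM_DELAY = True          # toggle the "machine delay" scenario

def haul_cycle(env, name, chipper, log):
    with chipper.request() as req:          # wait for the chipper to be free
        yield req
        chip_time = 0.5
        if RANDOM_DELAY:
            chip_time += random.expovariate(1 / 0.2)   # stochastic breakdown delay
        yield env.timeout(chip_time)        # chipping (hours)
    yield env.timeout(random.uniform(0.8, 1.2))        # drive to plant
    yield env.timeout(0.25)                            # unload
    log.append((name, env.now))

def daily_plan(env, chipper, log, loads=10):
    for i in range(loads):
        env.process(haul_cycle(env, f"load-{i}", chipper, log))
        yield env.timeout(0.6)              # planned dispatch interval

random.seed(1)
env = simpy.Environment()
chipper = simpy.Resource(env, capacity=1)
deliveries = []
env.process(daily_plan(env, chipper, deliveries))
env.run(until=12)                           # a 12-hour working day
print(f"{len(deliveries)} of 10 planned loads delivered by hour 12")
```

Toggling RANDOM_DELAY reproduces, in miniature, the kind of comparison between a deterministic plan and a plan disturbed by machine delays that the abstract describes.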

Relevance: 90.00%

Publisher:

Abstract:

Community-based models for injury prevention have become an accepted part of the overall injury control strategy. This systematic review of the scientific literature examines the evidence for their effectiveness in reducing pedestrian injury in children 0-14 years of age. A comprehensive search of the literature was performed using the following study selection criteria: community-based intervention study; target population of children under 14 years; outcome measure of either pedestrian injury rates or observed child pedestrian or vehicle driver behaviour; and use of a community control or an historical control in the study design. Quality assessment and data abstraction were guided by a standardized procedure and performed independently by two authors. Data synthesis was in tabular and text form, meta-analysis not being possible owing to the discrepancy in methods and measures between the studies.

Relevance: 90.00%

Publisher:

Abstract:

We present unified, systematic derivations of schemes in the two known measurement-based models of quantum computation. The first model (introduced by Raussendorf and Briegel, [Phys. Rev. Lett. 86, 5188 (2001)]) uses a fixed entangled state, adaptive measurements on single qubits, and feedforward of the measurement results. The second model (proposed by Nielsen, [Phys. Lett. A 308, 96 (2003)] and further simplified by Leung, [Int. J. Quant. Inf. 2, 33 (2004)]) uses adaptive two-qubit measurements that can be applied to arbitrary pairs of qubits, and feedforward of the measurement results. The underlying principle of our derivations is a variant of teleportation introduced by Zhou, Leung, and Chuang, [Phys. Rev. A 62, 052316 (2000)]. Our derivations unify these two measurement-based models of quantum computation and provide significantly simpler schemes.
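
The teleportation identity at the heart of these derivations can be checked numerically. The following NumPy sketch verifies one common convention of the one-qubit step (the Rz phase convention is an assumption, and this illustrates only the basic principle, not the paper's general constructions): measuring qubit 1 of CZ(|psi> ⊗ |+>) in the basis {(|0> ± e^{it}|1>)/√2} leaves X^s · H · Rz(t)|psi> on qubit 2.

```python
import numpy as np

rng = np.random.default_rng(3)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)                       # random input state

plus = np.array([1.0, 1.0]) / np.sqrt(2)
CZ = np.diag([1.0, 1.0, 1.0, -1.0])
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
t = 0.7
Rz = np.diag([1.0, np.exp(-1j * t)])             # phase convention assumed here

state = CZ @ np.kron(psi, plus)                  # entangle the input with the ancilla
for s, sign in enumerate((+1, -1)):              # the two measurement outcomes
    bra = np.array([1.0, sign * np.exp(-1j * t)]) / np.sqrt(2)
    out = np.kron(bra, np.eye(2)) @ state        # project qubit 1, keep qubit 2
    out /= np.linalg.norm(out)
    expected = np.linalg.matrix_power(X, s) @ H @ Rz @ psi   # X^s byproduct operator
    assert np.allclose(out, expected)
print("X^s · H · Rz(t)|psi> recovered for both measurement outcomes")
```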

Relevance: 90.00%

Publisher:

Abstract:

Study Objective: Community-based models for injury prevention have become an accepted part of the overall injury control strategy. This systematic review of the scientific literature examines the evidence for their effectiveness in reducing injury due to inadequate car seat restraint use in children 0-16 years of age. Methods: A comprehensive search of the literature was performed using the following study selection criteria: community-based intervention study; target population of children aged 0-16 years; outcome measure of either injury rates due to motor vehicle crashes or observed changes in child restraint use; and use of a community control or historical control in the study design. Quality assessment and data abstraction were guided by a standardized procedure and performed independently by two authors. Data synthesis was in tabular and text form, meta-analysis not being possible owing to the discrepancy in methods and measures between the studies. Results: This review found eight studies that met all the inclusion criteria. In the studies that measured injury outcomes, significant reductions in the risk of motor vehicle occupant injury (33-55%) were reported in the study communities. For the studies reporting observed car seat restraint use, the community-based programs were successful in increasing toddler restraint use in children aged 1-5 years by up to 11%; child booster seat use in children aged 4-8 years by up to 13%; rear restraint use in children aged 0-15 years by 8%; restraint use in pre-school-aged children in a high-risk community by 50%; and restraint use in children aged 5-11 years by 44%. Conclusion: While this review highlights that there is some evidence to support the effectiveness of community-based programs in promoting car restraint use and/or reducing motor vehicle occupant injury, limitations in the evaluation methodologies of the studies require the results to be interpreted with caution. There is clearly a need for further high-quality program evaluation research to develop an evidence base. (C) 2004 Elsevier Ltd. All rights reserved.

Relevance: 90.00%

Publisher:

Abstract:

An appreciation of the physical mechanisms that cause the observed complexity of seismicity is fundamental to understanding the temporal behaviour of faults and of single slip events. Numerical simulation of fault slip can provide insight into fault processes by allowing exploration of the parameter spaces that influence the microscopic and macroscopic physics of these processes. Particle-based models such as the Lattice Solid Model have previously been used for the simulation of stick-slip dynamics of faults, although mainly in two dimensions. Recent increases in computing power and the ability to exploit parallel computer systems have made it possible to extend particle-based fault simulations to three dimensions. In this paper a particle-based numerical model of a rough planar fault embedded between two elastic blocks in three dimensions is presented. A very simple friction law, with no rate dependence and no spatial heterogeneity in the intrinsic coefficient of friction, is used in the model. To simulate earthquake dynamics the model is sheared parallel to the fault plane with a constant velocity at the driving edges. Spontaneous slip occurs on the fault when the shear stress is large enough to overcome the frictional forces on the fault. Slip events with a wide range of sizes are observed. Investigation of the temporal evolution and spatial distribution of slip during each event shows a high degree of variability between events. In some of the larger events highly complex slip patterns are observed.
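
For intuition only, the basic stick-slip mechanism can be caricatured with a single spring-loaded block and static/kinetic friction thresholds (all values assumed; this is not the 3-D particle-based Lattice Solid Model of the paper, and the event-size complexity it reports emerges only when many interacting particles on a rough fault are simulated):

```python
import numpy as np

k, v_drive, dt = 1.0, 0.01, 0.1          # spring stiffness, plate velocity, time step
f_static, f_kinetic = 1.0, 0.6           # friction thresholds (arbitrary units)

x_plate, x_block = 0.0, 0.0
events = []                              # (time, slip size)
for step in range(20000):
    x_plate += v_drive * dt
    shear = k * (x_plate - x_block)      # elastic loading of the "fault"
    if shear > f_static:                 # slip: relax stress down to the kinetic level
        slip = (shear - f_kinetic) / k
        x_block += slip
        events.append((step * dt, slip))

sizes = np.array([s for _, s in events])
print(f"{len(events)} slip events, mean slip {sizes.mean():.3f}")
```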

Relevance: 90.00%

Publisher:

Abstract:

This paper compares the UK/US exchange rate forecasting performance of linear and nonlinear models based on monetary fundamentals with that of a random walk (RW) model. Structural breaks are identified and taken into account. The exchange rate forecasting framework is also used to assess the relative merits of the official Simple Sum and the weighted Divisia measures of money. Overall, there are four main findings. First, the majority of the models with fundamentals are able to beat the RW model in forecasting the UK/US exchange rate. Second, the most accurate forecasts of the UK/US exchange rate are obtained with a nonlinear model. Third, taking structural breaks into account reveals that the Divisia aggregate performs better than its Simple Sum counterpart. Finally, Divisia-based models provide more accurate forecasts than Simple Sum-based models, provided they are constructed within a nonlinear framework.
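
The benchmark comparison itself is easy to sketch. The example below measures the one-step-ahead RMSE of a driftless random walk against a plain AR(1) on simulated log exchange-rate changes (simulated data and a simple AR(1), not the paper's fundamentals-based or nonlinear models):

```python
import numpy as np

rng = np.random.default_rng(7)
n, phi = 500, 0.4
d = np.zeros(n)                              # simulated log exchange-rate changes
for t in range(1, n):
    d[t] = phi * d[t - 1] + rng.normal(scale=0.01)
log_fx = np.cumsum(d)

split = 400
train_d = d[1:split]
phi_hat = (train_d[1:] @ train_d[:-1]) / (train_d[:-1] @ train_d[:-1])  # OLS AR(1) slope

rw_forecast = log_fx[split - 1:-1]                        # no-change (random walk) forecast
ar_forecast = log_fx[split - 1:-1] + phi_hat * d[split - 1:-1]
actual = log_fx[split:]

def rmse(forecast):
    return np.sqrt(np.mean((actual - forecast) ** 2))

print(f"RW RMSE:    {rmse(rw_forecast):.5f}")
print(f"AR(1) RMSE: {rmse(ar_forecast):.5f}")
```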