794 results for Multi-sector New Keynesian DSGE models


Relevance: 40.00%

Abstract:

When designing systems that are complex, dynamic and stochastic in nature, simulation is generally recognised as one of the best design support technologies and a valuable aid in strategic and tactical decision making. A simulation model consists of a set of rules that define how a system changes over time, given its current state. Unlike analytical models, a simulation model is not solved but run, and the changes of system state can be observed at any point in time. This provides insight into system dynamics rather than just predicting the output of a system for specific inputs. Simulation is not a decision-making tool but a decision-support tool, allowing better informed decisions to be made.

Due to the complexity of the real world, a simulation model can only be an approximation of the target system. The essence of the art of simulation modelling is abstraction and simplification: only those characteristics that are important for the study and analysis of the target system should be included in the model. The purpose of simulation is either to better understand the operation of a target system or to make predictions about a target system's performance. It can be viewed as an artificial "white room" that allows one to gain insight and to test new theories and practices without disrupting the daily routine of the focal organisation. What one can expect to gain from a simulation study is well summarised by FIRMA (2000): if the theory that has been framed about the target system holds, and if this theory has been adequately translated into a computer model, the model allows you to answer questions such as:

· Which kind of behaviour can be expected under arbitrarily given parameter combinations and initial conditions?
· Which kind of behaviour will a given target system display in the future?
· Which state will the target system reach in the future?
The required accuracy of the simulation model very much depends on the type of question one is trying to answer. To respond to the first question, the simulation model needs to be an explanatory model, which requires less data accuracy. In comparison, the simulation model required to answer the latter two questions has to be predictive in nature and therefore needs highly accurate input data to achieve credible outputs. Even then, these predictions show trends rather than giving precise and absolute predictions of target system performance. The numerical results of a simulation experiment are, on their own, most often not very useful and need to be rigorously analysed with statistical methods. These results then need to be considered in the context of the real system and interpreted in a qualitative way to make meaningful recommendations or compile best-practice guidelines. One needs a good working knowledge of the behaviour of the real system to fully exploit the understanding gained from simulation experiments.

The goal of this chapter is to prepare the newcomer for what we think is a valuable addition to the toolset of analysts and decision makers. We give a summary of information gathered from the literature and of the experience we have gained first-hand over the last five years while obtaining a better understanding of this exciting technology. We hope that this will help you avoid some of the pitfalls that we have unwittingly encountered. Section 2 is an introduction to the different types of simulation used in Operational Research and Management Science, with a clear focus on agent-based simulation. In Section 3 we outline the theoretical background of multi-agent systems and their elements to prepare you for Section 4, where we discuss how to develop a multi-agent simulation model. Section 5 outlines a simple example of a multi-agent system. Section 6 provides a collection of resources for further study, and finally in Section 7 we conclude the chapter with a short summary.
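The run-and-observe character of a simulation model described in this abstract can be sketched with a minimal stochastic simulation. The queue, arrival and service probabilities below are hypothetical illustrations, not taken from the chapter: the state (queue length) evolves step by step according to simple stochastic rules, and can be observed at any point in time rather than solved analytically.

```python
import random

def simulate_queue(arrival_p, service_p, steps, seed=0):
    """Minimal discrete-time stochastic simulation: a single queue whose
    state (queue length) changes over time according to a set of rules."""
    rng = random.Random(seed)  # fixed seed for reproducible experiments
    queue = 0
    trajectory = []
    for _ in range(steps):
        if rng.random() < arrival_p:                # a customer may arrive
            queue += 1
        if queue > 0 and rng.random() < service_p:  # one may be served
            queue -= 1
        trajectory.append(queue)  # observe the system state at this step
    return trajectory

traj = simulate_queue(arrival_p=0.3, service_p=0.5, steps=1000)
```

Running the model repeatedly with different parameter combinations and seeds is what answers the first question above: which kinds of behaviour the system can display under given conditions.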

Relevance: 40.00%

Abstract:

In this contribution, a system identification procedure for a two-input Wiener model suitable for the analysis of the disturbance behavior of integrated nonlinear circuits is presented. The identified block model comprises two linear dynamic blocks and one static nonlinear block, which are determined using a parameterized approach. To characterize the linear blocks, a correlation analysis using a white-noise input is adopted in combination with a model reduction scheme. After the linear blocks have been characterized, a linear set of equations is set up from the output spectrum under single-tone excitation at each input; its solution gives the coefficients of the nonlinear block. By this data-based black-box approach, the distortion behavior of a nonlinear circuit under the influence of an interfering signal at an arbitrary input port can be determined. Such an interfering signal can be, for example, an electromagnetic interference signal which couples conductively into the port under consideration. © 2011 Author(s).
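The two-stage idea — correlation analysis for the linear dynamics, then a linear set of equations for the nonlinear coefficients — can be sketched for a simplified single-input Wiener model. All signals and coefficients below are hypothetical, and the paper's single-tone spectral fit is replaced by a time-domain least-squares fit for brevity; by Bussgang's theorem, the input-output cross-correlation of a Wiener system driven by Gaussian white noise is proportional to the linear block's impulse response.

```python
import numpy as np

rng = np.random.default_rng(0)
h = np.array([1.0, 0.5, 0.25])   # true linear block (FIR), hypothetical
a = np.array([1.0, 0.0, 0.2])    # true static nonlinearity: y = a1*v + a2*v^2 + a3*v^3

N = 20000
u = rng.standard_normal(N)                 # white-noise excitation
v = np.convolve(u, h)[:N]                  # hidden linear-block output
y = a[0] * v + a[1] * v**2 + a[2] * v**3   # measured Wiener-model output

# Stage 1: correlation analysis. R_uy(k) is proportional to h(k) for
# Gaussian input, so normalising by R_uy(0) recovers h up to scale.
R = np.array([np.mean(y[k:] * u[:N - k]) for k in range(len(h))])
h_est = R / R[0]

# Stage 2: with the linear block fixed, the output is linear in the
# polynomial coefficients, giving a linear set of equations (least squares).
v_est = np.convolve(u, h_est)[:N]
Phi = np.column_stack([v_est, v_est**2, v_est**3])
a_est, *_ = np.linalg.lstsq(Phi, y, rcond=None)
```

The scale ambiguity between the two blocks is resolved here by fixing h(0) = 1 and letting the polynomial coefficients absorb the gain.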

Relevance: 40.00%

Abstract:

Experiments with ultracold atoms in optical lattices have become a versatile testing ground for studying diverse quantum many-body Hamiltonians. A single-band Bose-Hubbard (BH) Hamiltonian was first proposed to describe these systems in 1998, and its associated quantum phase transition was subsequently observed in 2002. Over the years, there has been rapid progress in experimental realizations of more complex lattice geometries, leading to more exotic BH Hamiltonians with contributions from excited bands and modified tunneling and interaction energies. There have also been interesting theoretical insights and experimental studies on "unconventional" Bose-Einstein condensates in optical lattices, and predictions of rich orbital physics in higher bands. In this thesis, I present our results on several multi-band BH models and emergent quantum phenomena. In particular, I study optical lattices with two local minima per unit cell and show that the low-energy states of a multi-band BH Hamiltonian with only pairwise interactions are equivalent to an effective single-band Hamiltonian with strong three-body interactions. I also propose a second method to create three-body interactions in ultracold gases of bosonic atoms in an optical lattice. In this case, this is achieved by a careful cancellation of two contributions in the pairwise interaction between the atoms, one proportional to the zero-energy scattering length and a second proportional to the effective range. I subsequently study the physics of Bose-Einstein condensation in the second band of a double-well 2D lattice and show that the collision-aided decay rate of the condensate to the ground band is smaller than the tunneling rate between neighboring unit cells. Finally, I propose a numerical method using the discrete variable representation for constructing real-valued Wannier functions localized in a unit cell of an optical lattice. The developed numerical method is general and can be applied to a wide array of optical lattice geometries in one, two or three dimensions.
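The band structure underlying such multi-band BH models can be illustrated with the standard plane-wave diagonalization of a 1D lattice V(x) = V0 sin²(k_L x). This is a generic textbook calculation, not the thesis's DVR construction; the depth V0 = 10 recoil energies is an arbitrary choice. For a deep lattice the lowest band is much narrower than the gap to the first excited band, which is why a single- or few-band BH description applies.

```python
import numpy as np

def bands(V0, nq=41, nmax=5):
    """Bloch bands of V(x) = V0*sin^2(k_L x) in recoil units E_R.
    Basis e^{i(q+2n)k_L x}; V = V0/2 - (V0/4)(e^{2i k_L x} + e^{-2i k_L x}),
    so V couples plane waves with n differing by 1."""
    qs = np.linspace(-1, 1, nq)          # quasimomentum in units of k_L
    ns = np.arange(-nmax, nmax + 1)
    E = []
    for q in qs:
        H = np.diag((q + 2 * ns) ** 2 + V0 / 2.0)    # kinetic + mean potential
        H += -V0 / 4.0 * (np.eye(len(ns), k=1) + np.eye(len(ns), k=-1))
        E.append(np.linalg.eigvalsh(H))              # sorted band energies
    return np.array(E)                   # shape (nq, 2*nmax + 1)

E = bands(10.0)
width0 = E[:, 0].max() - E[:, 0].min()   # lowest-band width (~4J)
gap01 = E[:, 1].min() - E[:, 0].max()    # gap to the first excited band
```

The lowest-band width is proportional to the tunneling energy J of the corresponding single-band BH model, while the band gap sets the energy scale of the excited-band contributions discussed in the thesis.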

Relevance: 40.00%

Abstract:

We present a new radiation scheme for the Oxford Planetary Unified Model System for Venus, suitable for the solar and thermal bands. This new and fast radiative parameterization uses a different approach in the two main radiative wavelength bands: solar radiation (0.1–5.5 μm) and thermal radiation (1.7–260 μm). The solar radiation calculation is based on the delta-Eddington approximation (two-stream type) with an adding-layer method. For the thermal radiation case, a code based on an absorptivity/emissivity formulation is used. The new radiative transfer formulation is intended to be computationally light, to allow its incorporation in 3D global circulation models, while still allowing the effect of atmospheric conditions on radiative fluxes to be calculated. This will allow us to investigate dynamical-radiative-microphysical feedbacks. The model's flexibility can also be used to explore uncertainties in the Venus atmosphere, such as the optical properties of the deep atmosphere or the cloud amount. Results for radiative cooling and heating rates and for the global-mean radiative-convective equilibrium temperature profiles under different atmospheric conditions are presented and discussed. This new scheme works on an atmospheric column and can easily be implemented in 3D Venus global circulation models. © 2014 Elsevier Ltd. All rights reserved.
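The adding-layer step mentioned above can be sketched for two homogeneous layers. Given each layer's reflectance R and transmittance T (assumed symmetric from above and below; the numerical values are hypothetical), the standard adding formulas sum the geometric series of multiple reflections between the layers:

```python
def add_layers(R1, T1, R2, T2):
    """Combine two plane-parallel layers by the adding method.
    The series over inter-layer bounces, R1*R2 + (R1*R2)^2 + ...,
    sums to the factor 1/(1 - R1*R2)."""
    denom = 1.0 - R1 * R2
    R = R1 + T1 * R2 * T1 / denom   # reflected directly, or after bounces
    T = T1 * T2 / denom             # transmitted through both layers
    return R, T

# Hypothetical conservative (non-absorbing) layers, so R + T = 1 for each
R, T = add_layers(0.3, 0.7, 0.2, 0.8)
```

A useful check on any implementation is that combining two conservative layers again yields R + T = 1, since the adding method conserves energy; a full solar scheme applies this pairwise combination down the whole column of delta-Eddington layers.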

Relevance: 40.00%

Abstract:

Many geological formations consist of crystalline rocks that have very low matrix permeability but allow flow through an interconnected network of fractures. Understanding the flow of groundwater through such rocks is important when considering the disposal of radioactive waste in underground repositories. A specific area of interest is the conditioning of fracture transmissivities on measured values of pressure in these formations: the process by which the values of fracture transmissivities in a model are adjusted to obtain a good fit of the calculated pressures to the measured pressure values. While there are existing methods to condition transmissivity fields on transmissivity, pressure and flow measurements for a continuous porous medium, there is little literature on conditioning fracture networks. Conditioning fracture transmissivities on pressure or flow values is a complex problem because the measurements are not linearly related to the fracture transmissivities, and because each measurement depends on all the fracture transmissivities in the network. We present a new method for conditioning fracture transmissivities on measured pressure values based on the calculation of certain basis vectors; each basis vector represents the change to the log-transmissivity of the fractures in the network that results in a unit increase in the pressure at one measurement point whilst keeping the pressure at the remaining measurement points constant. The fracture transmissivities are updated by adding a linear combination of the basis vectors, with coefficients obtained by minimizing an error function. A mathematical summary of the method is given. The algorithm is implemented in the existing finite-element code ConnectFlow, developed and marketed by Serco Technical Services, which models groundwater flow in a fracture network. Results of the conditioning are shown for a number of simple test problems as well as for a realistic large-scale test case.
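The basis-vector idea can be sketched under a local linearization, p ≈ p0 + J·Δ(log T), where J is a sensitivity matrix. The network size, sensitivities and pressure mismatches below are entirely hypothetical; each basis vector b_i is then the minimum-norm log-transmissivity change satisfying J·b_i = e_i, i.e. a unit pressure increase at one measurement point and no change at the others.

```python
import numpy as np

# Hypothetical linearized sensitivities: J[i, j] = d(pressure_i)/d(log T_j)
rng = np.random.default_rng(1)
J = rng.standard_normal((3, 8))   # 3 pressure measurement points, 8 fractures

# Columns of B are the basis vectors: minimum-norm solutions of J @ b_i = e_i
B = J.T @ np.linalg.inv(J @ J.T)

# Measured minus calculated pressures at the measurement points (hypothetical)
p_mismatch = np.array([0.4, -1.2, 0.7])

# Update: a linear combination of basis vectors whose coefficients remove the
# mismatch to first order (the trivial minimizer of the squared pressure error)
delta_logT = B @ p_mismatch
```

In the actual nonlinear problem the forward model must be re-solved and the step repeated, since J itself changes with the transmissivities; the sketch shows only one linearized update.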

Relevance: 40.00%

Abstract:

Ligand-protein docking is an optimization problem based on predicting the position of a ligand with the lowest binding energy in the active site of the receptor. Molecular docking problems are traditionally tackled with single-objective, as well as multi-objective, approaches to minimize the binding energy. In this paper, we propose a novel multi-objective formulation that considers the Root Mean Square Deviation (RMSD) of the ligand coordinates and the binding (intermolecular) energy as two objectives for evaluating the quality of ligand-protein interactions. To determine the kind of Pareto front approximations that can be obtained, we have selected a set of representative multi-objective algorithms: NSGA-II, SMPSO, GDE3 and MOEA/D. Their performance has been assessed by applying two main quality indicators intended to measure the convergence and diversity of the fronts. In addition, a comparison with LGA, a reference single-objective evolutionary algorithm for molecular docking (AutoDock), is carried out. In general, SMPSO shows the best overall results in terms of energy and RMSD (values lower than 2 Å for successful docking results). This new multi-objective approach shows an improvement over ligand-protein docking predictions that could be promising in in silico docking studies for selecting new anticancer compounds for therapeutic targets that are multidrug resistant.
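The Pareto fronts these algorithms approximate are defined by non-dominance over the two objectives. A minimal sketch of that filter follows; the (energy, RMSD) pairs are hypothetical pose scores, not data from the paper.

```python
def pareto_front(points):
    """Return the non-dominated points for a minimization problem.
    p dominates q if p <= q in every objective and p < q in at least one."""
    front = []
    for p in points:
        dominated = any(
            all(o <= v for o, v in zip(q, p)) and any(o < v for o, v in zip(q, p))
            for q in points if q != p
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical (binding energy, RMSD) pairs for candidate ligand poses
poses = [(-9.1, 1.8), (-8.5, 0.9), (-9.4, 3.2), (-7.0, 2.5), (-8.0, 1.0)]
front = pareto_front(poses)
```

The surviving poses form the trade-off curve between the two objectives; quality indicators for convergence and diversity are then computed on such fronts.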

Relevance: 40.00%

Abstract:

The U.S. Nuclear Regulatory Commission implemented a safety goal policy in response to the 1979 Three Mile Island accident. This policy addresses the question “How safe is safe enough?” by specifying quantitative health objectives (QHOs) for comparison with results from nuclear power plant (NPP) probabilistic risk analyses (PRAs) to determine whether proposed regulatory actions are justified based on potential safety benefit. Lessons learned from recent operating experience—including the 2011 Fukushima accident—indicate that accidents involving multiple units at a shared site can occur with non-negligible frequency. Yet risk contributions from such scenarios are excluded by policy from safety goal evaluations—even for the nearly 60% of U.S. NPP sites that include multiple units. This research develops and applies methods for estimating risk metrics for comparison with safety goal QHOs using models from state-of-the-art consequence analyses to evaluate the effect of including multi-unit accident risk contributions in safety goal evaluations.
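At its core, the comparison described above is a frequency-weighted summation of scenario risks against a QHO threshold. The scenario frequencies and conditional consequences below are entirely hypothetical illustrations; the 5×10⁻⁷ per year figure is the NRC's individual early-fatality QHO (0.1% of the background accident-fatality risk), quoted here as an assumption for the sketch.

```python
# Hypothetical contributions to mean individual early-fatality risk:
# each scenario is (frequency per site-year, conditional fatality probability)
single_unit = [(2e-5, 1e-3), (5e-6, 4e-3)]   # single-unit accident scenarios
multi_unit  = [(1e-6, 6e-3)]                 # concurrent multi-unit scenario

def total_risk(scenarios):
    """Frequency-weighted sum of conditional consequences."""
    return sum(f * c for f, c in scenarios)

risk_single = total_risk(single_unit)               # current evaluation scope
risk_all = total_risk(single_unit + multi_unit)     # with multi-unit scenarios
QHO = 5e-7  # assumed early-fatality quantitative health objective, per year
```

The point of the sketch is structural: including multi-unit scenarios can only increase the computed risk metric, so the policy question is whether that increment changes the outcome of the QHO comparison.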

Relevance: 40.00%

Abstract:

This thesis attempts to find the least-cost strategy for reducing CO2 emissions by replacing coal with other energy sources for electricity generation, in the context of the EPA's proposed regulation on CO2 emissions from existing coal-fired power plants. An ARIMA model is built to forecast coal consumption for electricity generation, and the associated CO2 emissions, in Michigan from 2016 to 2020. CO2 emission reduction costs are calculated under three emission reduction scenarios: reduction to 17%, 30% and 50% below the 2005 emission level. The impacts of the Production Tax Credit (PTC) and of the intermittency of renewable energy are also discussed. The results indicate that in most cases natural gas will be the best alternative to coal for electricity generation to realize the CO2 reduction goals; if the PTC for wind power continues after 2015, a combined natural gas and wind approach could be the best strategy under the least-cost criterion.
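The forecasting step can be illustrated with a deliberately simplified stand-in for the thesis's ARIMA model: an AR(1) recursion projecting coal consumption, converted to CO2 with an emission factor. Every number here (initial consumption, mean, persistence, emission factor) is hypothetical.

```python
def ar1_forecast(last, mu, phi, steps):
    """AR(1) point forecast: x_{t+1} = mu + phi * (x_t - mu).
    For |phi| < 1 the path contracts geometrically toward the mean mu."""
    out = []
    x = last
    for _ in range(steps):
        x = mu + phi * (x - mu)
        out.append(x)
    return out

coal_2015 = 16.0        # hypothetical coal use, million short tons
coal_path = ar1_forecast(coal_2015, mu=14.0, phi=0.8, steps=5)  # 2016-2020
EMISSION_FACTOR = 1.87  # hypothetical metric tons CO2 per short ton of coal
co2_path = [c * EMISSION_FACTOR for c in coal_path]
```

A full ARIMA(p,d,q) fit would estimate the orders and coefficients from the historical series (e.g. with a statistics package) rather than assuming them; the abatement-cost comparison then prices each scenario's gap between the forecast emissions and the 2005-based target.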

Relevance: 40.00%

Abstract:

In multi-unit organisations, such as a bank and its branches or a national body delivering publicly funded health or education services through local operating units, the need arises to incentivize the units to operate efficiently. In such instances, it is generally accepted that units found to be inefficient can be encouraged to make efficiency savings. However, units that are found to be efficient need to be incentivized in a different manner. It has been suggested that efficient units could be incentivized by some reward compatible with the level by which their attainment exceeds that of the best of the rest, normally referred to as "super-efficiency". A recent approach to this issue (Varmaz et al., 2013) used Data Envelopment Analysis (DEA) models to measure the super-efficiency of the whole system of operating units with and without the involvement of each unit in turn, in order to provide incentives. We identify shortcomings in this approach and use it as a starting point to develop a new DEA-based system for incentivizing operating units to operate efficiently for the benefit of the aggregate system of units. Data from a small German retail bank are used to illustrate our method.
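In the simplest single-input, single-output case, super-efficiency reduces to comparing each unit's output/input ratio with the best ratio among the *remaining* units. The branch data below are hypothetical, and a real DEA model with multiple inputs and outputs would solve a linear program per unit rather than compute ratios directly.

```python
def super_efficiency(units):
    """units: list of (input, output) pairs.
    Each unit's score is its own ratio divided by the best ratio of the
    other units; a score above 1 means it beats the best of the rest."""
    ratios = [out / inp for inp, out in units]
    scores = []
    for i, r in enumerate(ratios):
        best_rest = max(r2 for j, r2 in enumerate(ratios) if j != i)
        scores.append(r / best_rest)
    return scores

# Hypothetical bank branches: (staff cost, transactions processed)
branches = [(10.0, 50.0), (8.0, 48.0), (12.0, 48.0)]
scores = super_efficiency(branches)
```

Excluding the evaluated unit from its own reference set is exactly what distinguishes super-efficiency from ordinary DEA efficiency, where efficient units are all capped at a score of 1 and become indistinguishable.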

Relevance: 40.00%

Abstract:

This document aims to show how brand creation should be carried out through the use of community-based strategic mechanisms and marketing. The objective of the study is to find suitable mechanisms for the development and creation of a brand, focusing on an analysis of the main practices and models developed in the marketing field, examining the impact that the brand can generate in the community in which the organisation is embedded, and establishing a direct connection with consumers' way of life. The document shows that the marketing strategies applied by each company serve to build a close and strong relationship with all the agents involved in building a brand, principally the customers, since the most effective way to establish long-term relationships is to focus exclusively on the needs developed by consumers and, on that basis, to adjust the organisation's values (mission, vision, organisational culture, objectives). These community strategies are also influenced by various factors internal and external to the organisation, which must be taken into account when choosing the appropriate strategy. The strategic mechanisms that companies develop can change significantly from one commercial sector to another; the importance of the needs to be met, and the final consumer, must be evaluated from a community perspective, understanding community as the set of social and commercial groups directly or indirectly related to the company.

From the research carried out on the strategies that companies should apply, it is concluded that brands reflect the image the company transmits to its buyers, establishing an emotional relationship between consumers and the brand, as well as stimulating supply of and demand for the business. Through the theoretical and conceptual information gathered, it is hoped to clarify how brand creation can be carried out through the correct use of community-based strategic and marketing mechanisms.

Relevance: 40.00%

Abstract:

Organisations currently face strong pressure to adapt to a competitive world, with declining profits and constant uncertainty in their cash flow. These circumstances force organisations into continuous improvement, seeking new ways to manage their processes and resources. For service organisations in the telecommunications sector, one of the most important competitive advantages to obtain is productivity, because their earnings depend directly on the number of activities each employee can carry out. The challenge is to do more with less, and with better quality. To achieve this, the need to manage human resources effectively arises, and this is where compensation systems take on an important role. The objective of this work is to design and apply a variable-remuneration model for a professional services company in the telecommunications sector, and thereby to contribute to the study of performance management and human talent in Colombia. The work documents the design and application of the variable-remuneration model in a telecommunications project in Colombia. The design drew on trends in remuneration programmes and on performance management theories to arrive at an integrated model that enables sustained long-term growth and motivates the organisation's most important resource: its human talent. The application also allowed problems and successes in the implementation of such models to be documented.

Relevance: 40.00%

Abstract:

Models based on species distributions are widely used and serve important purposes in ecology, biogeography and conservation. Their continuous predictions of environmental suitability are commonly converted into a binary classification of predicted (or potential) presences and absences, whose accuracy is then evaluated through a number of measures that have been the subject of recent reviews. We propose four additional measures that analyse observation-prediction mismatch from a different angle, namely from the perspective of the predicted rather than the observed area, and that add to the existing toolset of model evaluation methods. We explain how these measures can complement the view provided by the existing measures, allowing further insights into distribution model predictions. We also describe how they can be particularly useful when using models to forecast the spread of diseases or of invasive species, and to predict modifications in species' distributions under climate and land-use change.
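The shift of perspective described above can be illustrated with a confusion matrix built from the binary classification. The paper's four measures are not listed in this abstract, so the example uses a standard pair as a stand-in: sensitivity asks what fraction of observed presences the model captures (observed-area perspective), while positive predictive value asks what fraction of the predicted area is actually occupied (predicted-area perspective). The observation and prediction vectors are hypothetical.

```python
def confusion(observed, predicted):
    """observed, predicted: sequences of 0/1 (absence/presence).
    Returns true positives, false positives, false negatives."""
    tp = sum(o and p for o, p in zip(observed, predicted))
    fp = sum((not o) and p for o, p in zip(observed, predicted))
    fn = sum(o and (not p) for o, p in zip(observed, predicted))
    return tp, fp, fn

obs  = [1, 1, 1, 0, 0, 0, 0, 1]   # hypothetical observed presences/absences
pred = [1, 1, 0, 1, 1, 0, 0, 1]   # hypothetical binarized model predictions

tp, fp, fn = confusion(obs, pred)
sensitivity = tp / (tp + fn)   # observed-area perspective: presences captured
ppv = tp / (tp + fp)           # predicted-area perspective: predictions realized
```

The two numbers can diverge sharply for over-predicting models, which is precisely when predicted-area measures, as for forecasting disease or invasion spread, add information that observed-area measures miss.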