936 results for ECONOMIC MODELS


Relevance: 30.00%

Abstract:

Revisions of US macroeconomic data are not white noise. They are persistent, correlated with real-time data, and highly variable (around 80% of the volatility observed in US real-time data). Their business-cycle effects are examined in an estimated DSGE model extended with both real-time and final data. Bayesian estimation shows that the roles of both habit formation and price indexation fall significantly in the extended model. The results show that revision shocks to both output and inflation are expansionary because they occur when real-time published data are too low and the Fed reacts by cutting interest rates. Consumption revisions, by contrast, are countercyclical, as consumption habits mirror the observed reduction in real-time consumption. In turn, revisions of the three variables account for 9.3% of output fluctuations in the long-run variance decomposition.
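The three stylized facts the abstract lists (persistence, correlation with real-time data, sizable volatility) can be reproduced in a toy simulation. All coefficients below are illustrative assumptions, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 5000

# Real-time series: a simple AR(1) stand-in for published output growth.
real_time = np.zeros(T)
for t in range(1, T):
    real_time[t] = 0.5 * real_time[t - 1] + rng.normal()

# Revisions that are persistent AND negatively correlated with real-time data,
# so they are not white noise (coefficients are illustrative assumptions).
rev = np.zeros(T)
for t in range(1, T):
    rev[t] = 0.6 * rev[t - 1] - 0.3 * real_time[t] + rng.normal(scale=0.5)

final = real_time + rev  # "final" data = real-time data plus its revision

def autocorr1(x):
    """First-order autocorrelation; non-zero means not white noise."""
    x = x - x.mean()
    return float(np.dot(x[1:], x[:-1]) / np.dot(x, x))

print("revision persistence:", round(autocorr1(rev), 2))
print("corr(revision, real-time):", round(float(np.corrcoef(rev, real_time)[0, 1]), 2))
print("relative volatility:", round(float(rev.std() / real_time.std()), 2))
```

With these assumed coefficients, the simulated revisions show all three qualitative features: positive autocorrelation, correlation with the real-time series, and volatility of the same order as the real-time data.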

Relevance: 30.00%

Abstract:

Angler creel surveys and economic impact models were used to evaluate potential expansion of aquatic vegetation in Lakes Murray and Moultrie, South Carolina. (PDF contains 4 pages.)

Relevance: 30.00%

Abstract:

Optimal management in a multi-cohort Beverton-Holt model with any number of age classes and imperfect selectivity is equivalent to finding the optimal fish lifespan via the chosen fallow cycles. The optimal policy differs in two main ways from the optimal-lifespan rule under perfect selectivity. First, weight gain is valued in terms of the whole population structure. Second, the cost of waiting is the interest rate adjusted for the increase in the pulse length. This point is especially relevant for assessing the role of selectivity: imperfect selectivity reduces both the optimal lifespan and the optimal pulse length. We illustrate our theoretical findings with a numerical example; results obtained using global numerical methods select the optimal pulse length predicted by the optimal-lifespan rule.
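A minimal numerical sketch of pulse fishing in an age-structured Beverton-Holt model. All parameters, the age-2 selectivity cutoff, and the harvest rule below are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

# Illustrative parameters only (not calibrated to any real fishery).
A = 6                                            # number of age classes
w = np.array([0.1, 0.4, 0.8, 1.1, 1.3, 1.4])     # weight at age
s = 0.8                                          # natural survival rate
alpha, beta = 10.0, 2.0                          # Beverton-Holt recruitment

def simulate_yield(pulse, T=600):
    """Average annual yield when all fish of age >= 2 are harvested
    every `pulse` years (a stylized perfect-selectivity pulse rule)."""
    n = np.ones(A)
    total = 0.0
    for t in range(T):
        if t % pulse == 0:
            if t >= T // 2:                      # count yield after burn-in
                total += float((n[2:] * w[2:]).sum())
            n[2:] = 0.0
        spawn = float((n * w).sum())
        recruits = alpha * spawn / (1.0 + beta * spawn)
        n[1:] = s * n[:-1]                       # survivors age one class
        n[0] = recruits
    return total / (T - T // 2)

yields = {p: simulate_yield(p) for p in range(1, 8)}
best = max(yields, key=yields.get)
print("optimal pulse length:", best)
```

Scanning candidate pulse lengths and picking the yield-maximizing one mirrors, in miniature, the global numerical search the abstract describes.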

Relevance: 30.00%

Abstract:

Global warming of the oceans is expected to alter the environmental conditions that determine the growth of a fishery resource. Most climate change studies are based on models and scenarios that focus on economic growth, or they concentrate on simulating the potential losses or costs to fisheries due to climate change. However, analyses that address model optimization problems to better understand the complex dynamics of climate change and marine ecosystems are still lacking. In this paper, a simple algorithm to compute transitional dynamics is presented in order to quantify the effect of climate change on the European sardine fishery. The model results indicate that global warming will not necessarily lead to a monotonic decrease in expected biomass levels. Our results show that if the resource is exploited optimally, then in the short run increases in the surface temperature of the fishing ground are compatible with higher expected biomass and economic profit.
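The non-monotonicity result can be illustrated with a stylized surplus-production model in which intrinsic growth is hump-shaped in sea-surface temperature. The functional form, peak temperature, and harvest rate below are assumptions made purely for illustration:

```python
import numpy as np

def equilibrium_biomass(sst, K=1000.0, h=0.1):
    """Equilibrium biomass of a logistic stock harvested at rate h, where
    intrinsic growth r peaks at an assumed optimal temperature of 16 C."""
    r = 0.6 * np.exp(-((sst - 16.0) / 3.0) ** 2)
    # r*B*(1 - B/K) = h*B  =>  B* = K*(1 - h/r), truncated at zero
    return float(max(K * (1.0 - h / r), 0.0))

for sst in [14.0, 15.0, 16.0, 17.0, 18.0]:
    print(sst, round(equilibrium_biomass(sst), 1))
```

Starting from a fishing ground colder than the growth optimum, warming first raises and only later lowers equilibrium biomass, which is the qualitative point of the abstract.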

Relevance: 30.00%

Abstract:

According to the Millennium Ecosystem Assessment’s chapter “Coastal Systems” (Agardy and Alder 2005), 40% of the world population falls within 100 km of the coast. Agardy and Alder report that population densities in coastal regions are three times those of inland regions and demographic forecasts suggest a continued rise in coastal populations. These high population levels can be partially traced to the abundance of ecosystem services provided in the coastal zone. While populations benefit from an abundance of services, population pressure also degrades existing services and leads to increased susceptibility of property and human life to natural hazards. In the face of these challenges, environmental administrators on the coast must pursue agendas which reflect the difficult balance between private and public interests. These decisions include maintaining economic prosperity and personal freedoms, protecting or enhancing the existing flow of ecosystem services to society, and mitigating potential losses from natural hazards. (PDF contains 5 pages)

Relevance: 30.00%

Abstract:

In three essays we examine user-generated product ratings with aggregation. While recommendation systems have been studied extensively, this simple type of recommendation system has been neglected, despite its prevalence in the field. We develop a novel theoretical model of user-generated ratings. This model improves upon previous work in three ways: it considers rational agents and allows them to abstain from rating when rating is costly; it incorporates rating aggregation (such as averaging ratings); and it considers the effect of multiple simultaneous raters on rating strategies. In the first essay we provide a partial characterization of equilibrium behavior. In the second essay we test this theoretical model in the laboratory, and in the third we apply established behavioral models to the data generated in the lab. This study provides clues to the prevalence of extreme-valued ratings in field implementations. We show theoretically that in equilibrium, ratings distributions do not represent the value distributions of sincere ratings. Indeed, we show that if rating strategies follow a set of regularity conditions, then in equilibrium the rate at which players participate is increasing in the extremity of agents' valuations of the product. This theoretical prediction is realized in the lab. We also find that human subjects show a disproportionate predilection for sincere rating, and that when they do send insincere ratings, they are almost always in the direction of exaggeration. Both sincere and exaggerated ratings occur with great frequency despite the fact that such rating strategies are not in subjects' best interest. We therefore apply the behavioral concepts of quantal response equilibrium (QRE) and cursed equilibrium (CE) to the experimental data. Together, these theories explain the data significantly better than does a theory of rational, Bayesian behavior, accurately predicting key comparative statics. However, the theories fail to predict the high rates of sincerity, and it is clear that a better theory is needed.
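The prediction that participation rises with the extremity of valuations can be mimicked with a stylized cutoff strategy. The uniform valuations, cutoff level, and noise below are assumptions for illustration, not the essays' equilibrium:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
v = rng.uniform(0.0, 1.0, N)            # private valuations of the product

# Stylized rule: rate only when the valuation is extreme enough to justify
# the cost of rating, with idiosyncratic noise in the cutoff.
extremity = np.abs(v - 0.5)
participate = extremity + rng.normal(0.0, 0.05, N) > 0.25

# Participation rate by extremity quintile
edges = np.linspace(0.0, 0.5, 6)
rates = [float(participate[(extremity >= lo) & (extremity < hi)].mean())
         for lo, hi in zip(edges[:-1], edges[1:])]
print([round(r, 2) for r in rates])     # rises with extremity
```

Because moderate valuations rarely clear the cutoff, the pool of submitted ratings over-represents extreme valuations, matching the prevalence of extreme-valued ratings in the field.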

Relevance: 30.00%

Abstract:

An economic air pollution control model, which determines the least cost of reaching various air quality levels, is formulated. The model takes the form of a general, nonlinear, mathematical programming problem. Primary contaminant emission levels are the independent variables. The objective function is the cost of attaining various emission levels and is to be minimized subject to constraints that given air quality levels be attained.

The model is applied to a simplified statement of the photochemical smog problem in Los Angeles County in 1975, with emissions specified by a two-dimensional vector: total reactive hydrocarbon (RHC) and nitrogen oxide (NOx) emissions. Air quality, also two-dimensional, is measured by the expected number of days per year that nitrogen dioxide (NO2) and mid-day ozone (O3) exceed standards in Central Los Angeles.

The minimum cost of reaching various emission levels is found by a linear programming model. The base or "uncontrolled" emission levels are those that will exist in 1975 with the present new car control program and with the degree of stationary source control existing in 1971. Controls, basically "add-on devices", are considered here for used cars, aircraft, and existing stationary sources. It is found that with these added controls, Los Angeles County emission levels, (1300 tons/day RHC, 1000 tons/day NOx) in 1969 and (670 tons/day RHC, 790 tons/day NOx) at the base 1975 level, can be reduced to 260 tons/day RHC (minimum RHC program) and 460 tons/day NOx (minimum NOx program).

"Phenomenological" or statistical air quality models provide the relationship between air quality and emissions. These models estimate the relationship using atmospheric monitoring data taken at one (yearly) emission level and certain simple physical assumptions (e.g., that emissions are reduced proportionately at all points in space and time). For NO2 (concentrations assumed proportional to NOx emissions), it is found that standard violations in Central Los Angeles (55 in 1969) can be reduced to 25, 5, and 0 days per year by controlling emissions to 800, 550, and 300 tons/day, respectively. A probabilistic model reveals that RHC control is much more effective than NOx control in reducing Central Los Angeles ozone. The 150 days per year of ozone violations in 1969 can be reduced to 75, 30, 10, and 0 days per year by abating RHC emissions to 700, 450, 300, and 150 tons/day, respectively (at the 1969 NOx emission level).

The control cost-emission level and air quality-emission level relationships are combined in a graphical solution of the complete model to find the cost of various air quality levels. Best possible air quality levels with the controls considered here are 8 O3 and 10 NO2 violations per year (minimum ozone program) or 25 O3 and 3 NO2 violations per year (minimum NO2 program) with an annualized cost of $230,000,000 (above the estimated $150,000,000 per year for the new car control program for Los Angeles County motor vehicles in 1975).
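With a single emissions constraint, the least-cost linear program reduces to ranking controls by cost-effectiveness. The sketch below uses the abstract's 1975 base RHC level (670 tons/day) and minimum-RHC target (260 tons/day), but the control categories' capacities and unit costs are hypothetical:

```python
# (name, max tons/day removable, $M per ton/day removed) -- illustrative numbers
options = [
    ("used-car devices",          200, 0.4),
    ("aircraft controls",          60, 0.9),
    ("stationary-source add-ons", 250, 0.6),
]

def least_cost(base, target):
    """Greedy solution of the single-constraint LP: take tons from the
    cheapest sources first until emissions fall from base to target."""
    need = base - target
    plan, cost = [], 0.0
    for name, cap, unit in sorted(options, key=lambda o: o[2]):
        take = min(cap, max(need, 0.0))
        if take > 0:
            plan.append((name, take))
            cost += take * unit
            need -= take
    if need > 1e-9:
        raise ValueError("target not attainable with listed controls")
    return plan, cost

plan, cost = least_cost(base=670, target=260)
print(plan, round(cost, 1))
```

Sweeping `target` over a range of emission levels traces out the control cost-emission level curve that the thesis combines with the air quality models.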

Relevance: 30.00%

Abstract:

This book section aims to synthesise the results of the surveys related to the LVFRP by developing different strategies to implement a sustainable and participative co-management model.

Relevance: 30.00%

Abstract:

In this paper we propose a simple method of characterizing countervailing incentives in adverse selection problems. The key element in our characterization consists of analyzing properties of the full-information problem. This allows solving the principal's problem without using optimal control theory. Our methodology can also be applied to different economic settings: health economics, monopoly regulation, labour contracts, limited liability, and environmental regulation.

Relevance: 30.00%

Abstract:

International fisheries agencies recommend exploitation paths that satisfy two features. First, for precautionary reasons, exploitation paths should avoid high fishing mortality in fisheries where the biomass is depleted to a degree that jeopardises the stock's capacity to produce the Maximum Sustainable Yield (MSY). Second, for economic and social reasons, captures should be as stable (smooth) as possible over time. In this article we show that a conflict between these two interests may occur when seeking optimal exploitation paths using an age-structured bioeconomic approach. Our results show that this conflict can be overcome by using non-constant discount factors that value future stocks according to their relative intertemporal scarcity.
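A two-period sketch shows how the discount factor governs the smoothness of captures. Log utility and the specific numbers are assumptions for illustration:

```python
# Split a fixed stock S over two periods: max log(c1) + d*log(c2)
# s.t. c1 + c2 = S. The first-order condition gives c1 = S/(1+d), c2 = d*S/(1+d).
def split(S, d):
    c1 = S / (1.0 + d)
    return c1, S - c1

S = 100.0
print(split(S, 0.9))   # constant discounting: captures front-loaded
print(split(S, 1.0))   # higher factor when future stock is scarce: smooth path
```

Raising the weight on periods in which the stock is relatively scarce pushes the path toward equal (smooth) captures, which is the mechanism the abstract appeals to.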

Relevance: 30.00%

Abstract:

Professionals who are responsible for coastal environmental and natural resource planning and management have a need to become conversant with new concepts designed to provide quantitative measures of the environmental benefits of natural resources. These amenities range from beaches to wetlands to clean water and other assets that normally are not bought and sold in everyday markets. At all levels of government — from federal agencies to townships and counties — decisionmakers are being asked to account for the costs and benefits of proposed actions. To non-specialists, the tools of professional economists are often poorly understood and sometimes inappropriate for the problem at hand. This handbook is intended to bridge this gap. The most widely used organizing tool for dealing with natural and environmental resource choices is benefit-cost analysis — it offers a convenient way to carefully identify and array, quantitatively if possible, the major costs, benefits, and consequences of a proposed policy or regulation. The major strength of benefit-cost analysis is not necessarily the predicted outcome, which depends upon assumptions and techniques, but the process itself, which forces an approach to decision-making that is based largely on rigorous and quantitative reasoning. However, a major shortfall of benefit-cost analysis has been the difficulty of quantifying both benefits and costs of actions that impact environmental assets not normally, nor even regularly, bought and sold in markets. Failure to account for these assets, to omit them from the benefit-cost equation, could seriously bias decisionmaking, often to the detriment of the environment. Economists and other social scientists have put a great deal of effort into addressing this shortcoming by developing techniques to quantify these non-market benefits. 
The major focus of this handbook is on introducing and illustrating concepts of environmental valuation, among them Travel Cost models and Contingent Valuation. These concepts, combined with advances in natural sciences that allow us to better understand how changes in the natural environment influence human behavior, aim to address some of the more serious shortcomings in the application of economic analysis to natural resource and environmental management and policy analysis. Because the handbook is intended for non-economists, it addresses basic concepts of economic value such as willingness-to-pay and other tools often used in decision making such as cost-effectiveness analysis, economic impact analysis, and sustainable development. A number of regionally oriented case studies are included to illustrate the practical application of these concepts and techniques.
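As a concrete instance of the Travel Cost method mentioned above, the sketch below fits a linear visitation-demand curve to invented zonal survey data and converts it into a consumer-surplus estimate. All numbers are made up for illustration:

```python
import numpy as np

# Hypothetical zonal survey: trips per 1000 residents vs. travel cost per trip.
cost = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
trips = np.array([40.0, 33.0, 26.0, 18.0, 12.0, 5.0])

b, a = np.polyfit(cost, trips, 1)          # demand: trips = a + b*cost, b < 0
cs_per_zone = trips ** 2 / (2.0 * abs(b))  # surplus triangle under linear demand
print("slope:", round(float(b), 2),
      "total consumer surplus:", round(float(cs_per_zone.sum()), 1))
```

The fitted slope measures how sharply visitation falls as the implicit "price" of a trip rises; the area under the demand curve above the current cost is the non-market value the handbook's valuation chapters are after.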

Relevance: 30.00%

Abstract:

Housing stock models can be useful tools in helping to assess the environmental and socio-economic impacts of retrofits to residential buildings; however, existing housing stock models are not able to quantify the uncertainties that arise in the modelling process from various sources, thus limiting the role that they can play in helping decision makers. This paper examines the different sources of uncertainty involved in housing stock models and proposes a framework for handling these uncertainties. This framework involves integrating probabilistic sensitivity analysis with a Bayesian calibration process in order to quantify uncertain parameters more accurately. The proposed framework is tested on a case study building, and suggestions are made on how to expand the framework for retrofit analysis at an urban-scale. © 2011 Elsevier Ltd.
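A minimal version of the probabilistic sensitivity step might look as follows. The heat-demand formula and input distributions are invented for illustration, not the paper's building model:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 20_000

# Uncertain inputs for a crude degree-day style heat-demand formula.
u_value = rng.normal(1.5, 0.3, N)     # wall U-value, W/(m2 K)
area = rng.normal(90.0, 10.0, N)      # heated floor area, m2
setpoint = rng.normal(19.0, 1.0, N)   # indoor set-point temperature, C

demand = u_value * area * (setpoint - 8.0) * 0.8   # notional kWh/yr

# First-order sensitivity: correlation of each input with the output.
corrs = {name: float(np.corrcoef(x, demand)[0, 1])
         for name, x in [("U-value", u_value), ("area", area),
                         ("setpoint", setpoint)]}
print({k: round(c, 2) for k, c in corrs.items()})
```

Under these assumed distributions the U-value dominates the output variance; this is the kind of input ranking that a sensitivity analysis feeds into the subsequent Bayesian calibration.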

Relevance: 30.00%

Abstract:

The case for energy policy modelling is strong in Ireland, where stringent EU climate targets are projected to be overshot by 2015. Policy targets aiming to deliver greenhouse gas and renewable energy goals have been set, but it is unclear what savings are to be achieved and from which sectors. Concurrently, the growth of personal mobility has caused an astonishing increase in CO2 emissions from private cars in Ireland, a 37% rise between 2000 and 2008, and while there have been improvements in the efficiency of car technology, there was no decrease in the energy intensity of the car fleet in the same period. This thesis increases the capacity for evidence-based policymaking in Ireland by developing techno-economic transport energy models and using them to analyse historical trends and to project possible future scenarios. A central focus of this thesis is to understand the effect of the car fleet's evolving technical characteristics on energy demand. A car stock model is developed to analyse this question from three angles. Firstly, analysis of car registration and activity data between 2000 and 2008 examines the trends which brought about the surge in energy demand. Secondly, the car stock is modelled into the future and is used to populate a baseline "no new policy" scenario, looking at the impact of recent (2008-2011) policy and purchasing developments on projected energy demand and emissions. Thirdly, a range of technology efficiency, fuel switching and behavioural scenarios is developed up to 2025 in order to indicate the emissions abatement and renewable energy penetration potential of alternative policy packages. In particular, an ambitious car fleet electrification target for Ireland is examined. The car stock model's functionality is extended by linking it with other models: LEAP-Ireland, a bottom-up energy demand model for all energy sectors in the country; Irish TIMES, a linear optimisation energy system model; and COPERT, a pollution model. The methodology is also adapted to analyse trends in freight energy demand in a similar way. Finally, this thesis addresses the gap in the representation of travel behaviour in linear energy systems models. A novel methodology is developed and case studies for Ireland and California are presented using the TIMES model.
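The core bookkeeping of a car stock model can be sketched in a few lines. The survival curve, sales figures, and efficiency-improvement rate below are illustrative assumptions, not the thesis's Irish data:

```python
import numpy as np

years = np.arange(2008, 2026)
sales = np.full(len(years), 100_000)            # new registrations per year
g_co2 = 160.0 * 0.98 ** np.arange(len(years))   # new-car gCO2/km, -2%/yr

def survival(age):
    """Share of a cohort still on the road at a given age (assumed curve)."""
    return np.exp(-((age / 13.0) ** 2))

fleet_intensity = []
for i in range(len(years)):
    ages = i - np.arange(i + 1)                 # age of each cohort alive now
    stock = sales[: i + 1] * survival(ages)
    fleet_intensity.append(float((stock * g_co2[: i + 1]).sum() / stock.sum()))

print(round(fleet_intensity[0], 1), "->", round(fleet_intensity[-1], 1))
```

Because old cohorts linger in the stock, fleet-average intensity falls more slowly than new-car intensity: the lag between technology improvements and fleet energy intensity that the thesis analyses.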

Relevance: 30.00%

Abstract:

While there is growing interest in measuring the size and scope of local spillovers, it is well understood that such spillovers cannot be distinguished from unobservable local attributes using solely the observed location decisions of individuals or firms. We propose an empirical strategy for recovering estimates of spillovers in the presence of unobserved local attributes for a broadly applicable class of equilibrium sorting models. Our approach relies on an IV strategy derived from the internal logic of the sorting model itself. We show practically how the strategy is implemented, provide intuition for our instruments, discuss the role of effective choice-set variation in identifying the model, and carry out a series of Monte Carlo simulations to demonstrate performance in small samples. © 2007 The Author(s). Journal compilation Royal Economic Society 2007.

Relevance: 30.00%

Abstract:

We discuss a general approach to dynamic sparsity modeling in multivariate time series analysis. Time-varying parameters are linked to latent processes that are thresholded to induce zero values adaptively, providing natural mechanisms for dynamic variable inclusion/selection. We discuss Bayesian model specification, analysis and prediction in dynamic regressions, time-varying vector autoregressions, and multivariate volatility models using latent thresholding. Application to a topical macroeconomic time series problem illustrates some of the benefits of the approach in terms of statistical and economic interpretations as well as improved predictions. Supplementary materials for this article are available online. © 2013 Copyright Taylor and Francis Group, LLC.
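The latent thresholding mechanism can be sketched for a single time-varying regression coefficient. The process parameters and the threshold below are illustrative assumptions, not the article's estimated model:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 300

# Latent AR(1) process for a time-varying coefficient.
phi, mu, sig = 0.98, 0.3, 0.05
b = np.zeros(T)
for t in range(1, T):
    b[t] = mu + phi * (b[t - 1] - mu) + rng.normal(scale=sig)

# Thresholding: the effective coefficient is shrunk exactly to zero when the
# latent process is small, giving dynamic variable inclusion/exclusion.
threshold = 0.25
beta = np.where(np.abs(b) > threshold, b, 0.0)

x = rng.normal(size=T)
y = beta * x + rng.normal(scale=0.1, size=T)   # simulated response

print("share of periods with the predictor excluded:",
      round(float((beta == 0).mean()), 2))
```

As the latent process drifts above and below the threshold, the predictor enters and leaves the model adaptively; in the Bayesian treatment the latent path and threshold are inferred from the data rather than fixed as here.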