950 results for General allocation model
Abstract:
A new general linear model (GLM) beamformer method is described for processing magnetoencephalography (MEG) data. A standard nonlinear beamformer is used to determine the time course of neuronal activation for each point in a predefined source space. A Hilbert transform gives the envelope of oscillatory activity at each location in any chosen frequency band (not necessary in the case of sustained (DC) fields), enabling the general linear model to be applied and a volumetric T-statistic image to be determined. The new method is illustrated by a two-source simulation (sustained field and 20 Hz) and is shown to provide accurate localization. The method is also shown to accurately localize the increasing and decreasing gamma activity to the temporal and frontal lobes, respectively, in the case of a scintillating scotoma. The new method brings the advantages of the general linear model to the analysis of MEG data and should prove useful for the localization of changing patterns of activity across all frequency ranges, including DC (sustained fields). © 2004 Elsevier Inc. All rights reserved.
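An illustrative sketch of the envelope-plus-GLM step described above (not the paper's actual implementation; the design matrix, variable names, and noise assumptions are invented for demonstration):

```python
import numpy as np
from scipy.signal import hilbert

def envelope_glm_tstat(source_ts, regressor):
    """For each source location, take the Hilbert envelope of the
    (band-filtered) beamformer time course, fit a GLM with the given
    regressor plus an intercept, and return the regressor's t-statistic."""
    n_src, n_t = source_ts.shape
    X = np.column_stack([regressor, np.ones(n_t)])   # design matrix
    xtx_inv = np.linalg.inv(X.T @ X)
    dof = n_t - X.shape[1]
    tstats = np.empty(n_src)
    for i in range(n_src):
        env = np.abs(hilbert(source_ts[i]))          # oscillatory envelope
        beta = xtx_inv @ X.T @ env                   # OLS estimate
        resid = env - X @ beta
        sigma2 = resid @ resid / dof                 # residual variance
        tstats[i] = beta[0] / np.sqrt(sigma2 * xtx_inv[0, 0])
    return tstats
```

Stacking the per-location t-statistics over a source grid gives the volumetric T-statistic image the abstract refers to.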
Abstract:
Macroeconomic policy makers are typically concerned with several indicators of economic performance. We thus propose to tackle the design of macroeconomic policy using Multicriteria Decision Making (MCDM) techniques. More specifically, we employ Multiobjective Programming (MP) to seek so-called efficient policies. The MP approach is combined with a computable general equilibrium (CGE) model. We chose a CGE model because such models have the dual advantage of being consistent with standard economic theory while allowing one to measure the effects of a specific policy with real data. Applying the proposed methodology to Spain (via the 1995 Social Accounting Matrix), we first quantified the trade-offs between two specific policy objectives, growth and inflation, when designing fiscal policy. We then constructed a frontier of efficient policies involving real growth and inflation. In doing so, we found that policy in 1995 Spain displayed some degree of inefficiency with respect to these two policy objectives. We then offer two sets of policy recommendations that, ostensibly, could have helped Spain at the time. The first deals with efficiency independent of the importance given to growth and inflation by policy makers (we label this set general policy recommendations). The second depends on which policy objective policy makers see as more important: increasing growth or controlling inflation (we label this set objective-specific recommendations).
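The notion of an efficient policy used above can be sketched as a simple non-dominance filter on simulated (growth, inflation) outcomes; the policy names and numbers below are invented, and the actual study derives the frontier from a CGE model rather than a finite list:

```python
def efficient_policies(policies):
    """Keep policies not dominated on (growth up, inflation down):
    a policy is efficient if no other policy offers at least as much
    growth with no more inflation, and is strictly better in one."""
    efficient = []
    for name, (growth, infl) in policies.items():
        dominated = any(
            g >= growth and i <= infl and (g > growth or i < infl)
            for other, (g, i) in policies.items() if other != name
        )
        if not dominated:
            efficient.append(name)
    return efficient

# Hypothetical policy simulations: (real growth %, inflation %)
sims = {"P1": (2.0, 3.0), "P2": (2.5, 3.5), "P3": (1.8, 3.6), "P4": (2.5, 3.2)}
print(efficient_policies(sims))   # P2 and P3 are dominated
```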
Abstract:
Intermediate-complexity general circulation models are a fundamental tool for investigating the role of internal and external variability within the general circulation of the atmosphere and ocean. The model used in this thesis is an intermediate-complexity atmospheric general circulation model (SPEEDY) coupled to a state-of-the-art modelling framework for the ocean (NEMO). We assess to what extent the model allows a realistic simulation of the most prominent natural mode of variability at interannual time scales: the El Niño-Southern Oscillation (ENSO). To a good approximation, the model represents the ENSO-induced Sea Surface Temperature (SST) pattern in the equatorial Pacific, despite a cold-tongue-like bias. The model underestimates (overestimates) the typical ENSO spatial variability during the winter (summer) seasons. The mid-latitude response to ENSO reveals that the typical poleward stationary Rossby wave train is reasonably well represented. The spectral decomposition of ENSO features a spectrum that lacks periodicity at high frequencies and is overly periodic at interannual timescales. We then implemented an idealised transient mean-state change in the SPEEDY model. A warmer climate is simulated by an alteration of the parametrized radiative fluxes corresponding to doubled carbon dioxide absorptivity. Results indicate that the globally averaged surface air temperature increases by 0.76 K. Regionally, the induced signal on the SST field features a significant warming over the central-western Pacific and an El Niño-like warming in the subtropics. In general, the model features a weakening of the tropical Walker circulation and a poleward expansion of the local Hadley cell. This response is also detected in a poleward rearrangement of the tropical convective rainfall pattern. The model setting implemented here provides valid theoretical support for future studies on climate sensitivity and forced modes of variability under mean-state changes.
Abstract:
In this paper we analyze productivity and welfare losses from capital misallocation in a general equilibrium model of occupational choice and endogenous financial intermediation. We study the effects of borrowing and lending, insurance, and risk sharing on the optimal allocation of resources. We find that financial markets, together with general equilibrium effects, have a large impact on entrepreneurs' entry and firm-size decisions. Efficiency gains are increasing in the quality of financial markets, particularly in their ability to alleviate a financing constraint by providing insurance against idiosyncratic risk.
Abstract:
Cost allocation is an inescapable problem in nearly every organization and in nearly every facet of accounting. Within large corporations there are several different types of units, such as profit-making business units and non-profit service units. In order to evaluate the performance of the business units and to fund the operations of the service units, the expenses of service production need to be allocated to the business units benefiting from the services. The objective of this thesis was to find good and fair allocation factors for the costs of corporate-wide IT services. In order to reach this objective, the cost allocation process was studied in general and an overview of the cost structure was established. All possible cost driver candidates were mapped and their strengths and weaknesses were weighed. The cost allocation problem was handled separately for each organizational division of the corporate IT department: infrastructure, administrative systems, the sales system and e-business. The emphasis was on the two largest cost groups: infrastructure costs and sales system costs. As a result of the study, an allocation model is presented. It contains a categorization of the costs, the selected cost drivers and the cost distributions for the current year.
Abstract:
The main objective of this Master's thesis is to develop a cost allocation model for a leading food industry company in Finland. The goal is to develop an allocation method for fixed overhead expenses produced in a specific production unit and to create a plausible tracking system for product costs. The second objective is to construct an allocation model and modify it to suit other units as well. Costs, activities, drivers and appropriate allocation methods are studied. The thesis begins with a literature review of existing activity-based costing (ABC) theory and an inspection of cost information, followed by interviews with officials to get a general view of the requirements for the model to be constructed. Familiarization with the company began with its existing cost accounting methods. The main proposals for a new allocation model emerged from the interviews and were used to set targets for developing the new allocation method. As a result of this thesis, an Excel-based model is created from the theoretical and empirical data. The new system handles overhead costs in more detail, improving cost awareness and transparency in cost allocations and enhancing products' cost structure. The improved cost awareness is achieved by selecting the best possible cost drivers for this situation. Capacity changes are also taken into consideration; for example, the use of practical or normal capacity instead of theoretical capacity is recommended. Finally, some recommendations for further development are made concerning capacity handling and cost collection.
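Driver-based overhead allocation of the kind such a model implements can be sketched in a few lines; the cost pools, drivers, and figures below are invented for illustration and are not from the thesis:

```python
def allocate_overhead(cost_pools, driver_usage):
    """Allocate each overhead cost pool to products in proportion to
    each product's consumption of that pool's cost driver."""
    allocated = {}
    for pool, total_cost in cost_pools.items():
        usage = driver_usage[pool]            # product -> driver units
        total_units = sum(usage.values())
        for product, units in usage.items():
            share = total_cost * units / total_units
            allocated[product] = allocated.get(product, 0.0) + share
    return allocated

# Hypothetical pools: machine hours drive maintenance cost,
# number of setups drives setup cost.
pools = {"maintenance": 9000.0, "setup": 3000.0}
usage = {
    "maintenance": {"A": 600, "B": 300},   # machine hours
    "setup":       {"A": 10,  "B": 20},    # number of setups
}
print(allocate_overhead(pools, usage))     # A: 7000.0, B: 5000.0
```

Capacity handling enters through the denominator: dividing each pool by practical or normal capacity rather than actual usage keeps the cost of idle capacity from being loaded onto products.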
Abstract:
The Mara River Basin (MRB) is endowed with pristine biodiversity, socio-cultural heritage and natural resources. The purpose of my study is to develop and apply an integrated water resource allocation framework for the MRB based on hydrological processes, water demand and economic factors. The basin was partitioned into twelve sub-basins and the rainfall-runoff processes were modeled using the Soil and Water Assessment Tool (SWAT), with satisfactory Nash-Sutcliffe efficiencies of 0.68 for calibration and 0.43 for validation at the Mara Mines station. The impact and uncertainty of climate change on the hydrology of the MRB were assessed using SWAT and three scenarios of statistically downscaled outputs from twenty Global Circulation Models. Results predicted the wet season getting wetter and the dry season getting drier, with a general increasing trend of annual rainfall through 2050. Three blocks of water demand (environmental, normal and flood) were estimated from consumptive water use by humans, wildlife, livestock, tourism, irrigation and industry. Water demand projections suggest that human consumption is expected to surpass irrigation as the highest-demand sector by 2030. Monthly water volumes were estimated at the current minimum reliability in three blocks: reserve (>95%), normal (80-95%) and flood (40% for more than 5 months in a year). The assessment of water price and marginal productivity showed that current water use hardly responds to a change in the price or productivity of water. Finally, a water allocation model was developed and applied to investigate the optimum monthly allocation among sectors and sub-basins by maximizing the use value and hydrological reliability of water. Model results demonstrated that the status of the reserve and normal volumes can be improved to 'low' or 'moderate' by updating the existing reliability to meet prevailing demand. Flow volumes and rates for four scenarios of reliability are presented. Results showed that the water allocation framework can be used as a comprehensive tool in the management of the MRB and could possibly be extended to similar watersheds.
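The optimal-allocation step can be caricatured as a small linear program: maximize total use value subject to total availability and sectoral demand caps. A minimal sketch with invented sectors and figures (the study's actual model is richer, adding sub-basins and reliability constraints):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical monthly allocation across three sectors.
value_per_unit = np.array([5.0, 3.0, 2.0])   # human, irrigation, livestock
demand_cap = np.array([40.0, 80.0, 30.0])    # maximum sectoral demand
available = 100.0                            # total monthly volume

res = linprog(
    c=-value_per_unit,                       # linprog minimizes, so negate
    A_ub=np.ones((1, 3)), b_ub=[available],  # total volume constraint
    bounds=[(0, cap) for cap in demand_cap], # per-sector demand caps
)
print(res.x)  # sectors filled in order of marginal value: 40, 60, 0
```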
Abstract:
This article develops a weighted least squares version of Levene's test of homogeneity of variance for a general design, available both for univariate and multivariate situations. When the design is balanced, the univariate and two common multivariate test statistics turn out to be proportional to the corresponding ordinary least squares test statistics obtained from an analysis of variance of the absolute values of the standardized mean-based residuals from the original analysis of the data. The constant of proportionality is simply a design-dependent multiplier (which does not necessarily tend to unity). Explicit results are presented for randomized block and Latin square designs and are illustrated for factorial treatment designs and split-plot experiments. The distribution of the univariate test statistic is close to a standard F-distribution, although it can be slightly underdispersed. For a complex design, the test assesses homogeneity of variance across blocks, treatments, or treatment factors and offers an objective interpretation of the residual plot.
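In the simplest one-way layout the construction the article generalizes reduces exactly to an ANOVA of absolute residuals; a sketch with made-up data, cross-checked against SciPy's built-in Levene test:

```python
import numpy as np
from scipy import stats

def levene_via_anova(*groups):
    """One-way Levene test computed as an ANOVA F-test of the absolute
    residuals |y - group mean| -- the device the article extends to
    general designs (up to a design-dependent multiplier)."""
    abs_resid = [np.abs(np.asarray(g) - np.mean(g)) for g in groups]
    return stats.f_oneway(*abs_resid)

g1 = [8.9, 9.1, 9.0, 8.8, 9.2]     # small spread
g2 = [7.0, 11.0, 9.0, 6.5, 11.5]   # same mean, large spread
stat, p = levene_via_anova(g1, g2)
ref_stat, ref_p = stats.levene(g1, g2, center='mean')  # built-in check
```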
Abstract:
Robust decision making implies welfare costs, or robustness premia, when the approximating model is the true data-generating process. To examine the importance of these premia at the aggregate level, we employ a simple two-sector dynamic general equilibrium model with human capital and introduce an additional form of precautionary behavior. The latter arises from the robust decision maker's ability to reduce the effects of model misspecification by allocating time and existing human capital to this end. We find that the extent of the robustness premia critically depends on the productivity of time relative to that of human capital. When the relative efficiency of time is low, despite transitory welfare costs, there are gains from following robust policies in the long run. In contrast, high relative productivity of time implies misallocation costs that remain even in the long run. Finally, depending on the technology used to reduce model uncertainty, we find that while increasing the fear of model misspecification leads to a net increase in precautionary behavior, investment and output can fall.
Abstract:
UK regional policy has been advocated as a means of reducing regional disparities and stimulating national growth. However, there is limited understanding of the interregional and national effects of such a policy. This paper uses an interregional computable general equilibrium model to identify the national impact of a policy-induced regional demand shock under alternative labour market closures. Our simulation results suggest that regional policy operating solely on the demand side has significant national impacts. Furthermore, the effects on the non-target region are particularly sensitive to the treatment of the regional labour market.
Abstract:
The aim of the paper is to identify the added value from using general equilibrium techniques to consider the economy-wide impacts of increased efficiency in household energy use. We take as an illustrative case study the effect of a 5% improvement in household energy efficiency on the UK economy. This impact is measured through simulations that use models that have increasing degrees of endogeneity but are calibrated on a common data set. That is to say, we calculate rebound effects for models that progress from the most basic partial equilibrium approach to a fully specified general equilibrium treatment. The size of the rebound effect on total energy use depends upon: the elasticity of substitution of energy in household consumption; the energy intensity of the different elements of household consumption demand; and the impact of changes in income, economic activity and relative prices. A general equilibrium model is required to capture these final three impacts.
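The rebound effect itself is simple arithmetic once the engineering (potential) saving and the realized saving are known; a sketch with invented numbers, since the paper's figures come from its model simulations:

```python
def rebound_effect(efficiency_gain, actual_energy_change):
    """Rebound (%) = 100 * (1 - actual saving / engineering saving).
    efficiency_gain: potential proportional energy saving with behaviour
    held fixed (e.g. 0.05 for a 5% efficiency improvement).
    actual_energy_change: realized proportional change in energy use
    from the simulation (negative = saving)."""
    actual_saving = -actual_energy_change
    return 100.0 * (1.0 - actual_saving / efficiency_gain)

# Hypothetical: a 5% household efficiency gain that, once price and
# income responses feed back, cuts total energy use by only 3%.
print(rebound_effect(0.05, -0.03))   # 40% rebound
```

A value of 0% means the full engineering saving is realized; values above 100% (backfire) mean energy use actually rises.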
Abstract:
This paper presents a general equilibrium model in which nominal government debt pays an inflation risk premium. The model predicts that the inflation risk premium will be higher in economies which are exposed to unanticipated inflation through nominal asset holdings. In particular, the inflation risk premium is higher when government debt is primarily nominal, steady-state inflation is low, and when cash and nominal debt account for a large fraction of consumers' retirement portfolios. These channels do not appear to have been highlighted in previous models or tested empirically. Numerical results suggest that the inflation risk premium is comparable in magnitude to standard representative agent models. These findings have implications for management of government debt, since the inflation risk premium makes it more costly for governments to borrow using nominal rather than indexed debt. Simulations of an extended model with Epstein-Zin preferences suggest that increasing the share of indexed debt would enable governments to permanently lower taxes by an amount that is quantitatively non-trivial.