906 results for R15 - Econometric and Input Output Models
Abstract:
The increasing intensity of global competition has led organizations to utilize various types of performance measurement tools for improving the quality of their products and services. Data envelopment analysis (DEA) is a methodology for evaluating and measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. All the data in the conventional DEA with input and/or output ratios assume the form of crisp numbers. However, the observed values of data in real-world problems are sometimes expressed as interval ratios. In this paper, we propose two new models: general and multiplicative non-parametric ratio models for DEA problems with interval data. The contributions of this paper are fourfold: (1) we consider input and output data expressed as interval ratios in DEA; (2) we address the gap in the DEA literature for problems not suitable or difficult to model with crisp values; (3) we propose two new DEA models for evaluating the relative efficiencies of DMUs with interval ratios; and (4) we present a case study involving 20 banks with three interval ratios to demonstrate the applicability and efficacy of the proposed models in a setting where the traditional indicators are mostly financial ratios. © 2011 Elsevier Inc.
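The paper's interval-ratio models are not reproduced in the abstract, but the crisp CCR multiplier model they generalize can be sketched as a linear program: maximize the weighted outputs of the evaluated DMU subject to a normalized weighted input and the usual ratio constraints. The function name and the tiny two-DMU data set below are illustrative assumptions, not the paper's case study:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o (multiplier form).

    X: (n, m) inputs, Y: (n, s) outputs, rows = DMUs.
    max u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0 for all j,  u, v >= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    # decision vector z = [u_1..u_s, v_1..v_m]
    c = np.concatenate([-Y[o], np.zeros(m)])          # minimize -u.y_o
    A_ub = np.hstack([Y, -X])                          # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]   # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Two DMUs, one input, one output: DMU 0 produces 2 per unit of input,
# DMU 1 produces 1 per unit of input.
X = np.array([[1.0], [1.0]])
Y = np.array([[2.0], [1.0]])
print(ccr_efficiency(X, Y, 0), ccr_efficiency(X, Y, 1))
```

In the paper's setting each crisp observation would be replaced by an interval ratio, yielding lower- and upper-bound efficiency scores rather than a single number.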
Abstract:
This dissertation proposes statistical methods to formulate, estimate and apply complex transportation models. Two main problems are part of the analyses conducted and presented in this dissertation. The first method solves an econometric problem and is concerned with the joint estimation of models that contain both discrete and continuous decision variables. The use of ordered models along with a regression is proposed and their effectiveness is evaluated with respect to unordered models. Procedures to calculate and optimize the log-likelihood functions of both discrete-continuous approaches are derived, and the difficulties associated with the estimation of unordered models are explained. Numerical approximation methods based on the Genz algorithm are implemented in order to solve the multidimensional integral associated with the unordered modeling structure. The problems deriving from the lack of smoothness of the probit model around the maximum of the log-likelihood function, which makes the optimization and the calculation of standard deviations very difficult, are carefully analyzed. A methodology to perform out-of-sample validation in the context of a joint model is proposed. Comprehensive numerical experiments have been conducted on both simulated and real data. In particular, the discrete-continuous models are estimated and applied to vehicle ownership and use models on data extracted from the 2009 National Household Travel Survey. The second part of this work offers a comprehensive statistical analysis of free-flow speed distribution; the method is applied to data collected on a sample of roads in Italy. A linear mixed model that includes speed quantiles in its predictors is estimated. Results show that there is no road effect in the analysis of free-flow speeds, which is particularly important for model transferability.
A very general framework to predict random effects with few observations and incomplete access to model covariates is formulated and applied to predict the distribution of free-flow speed quantiles. The speed distribution of most road sections is successfully predicted; jack-knife estimates are calculated and used to explain why some sections are poorly predicted. Finally, this work contributes to the literature in transportation modeling by proposing econometric model formulations for discrete-continuous variables, more efficient methods for the calculation of multivariate normal probabilities, and random effects models for free-flow speed estimation that take into account the survey design. All methods are rigorously validated on both real and simulated data.
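The multidimensional integrals mentioned above are multivariate normal probabilities. SciPy's `multivariate_normal.cdf` is itself based on Genz's quasi-Monte Carlo method, so the quantity being approximated can be illustrated directly; this is a sketch of the computation, not the dissertation's implementation:

```python
import numpy as np
from scipy.stats import multivariate_normal

# P(Z1 <= 0, Z2 <= 0) for independent standard normals is exactly 0.25.
cov = np.array([[1.0, 0.0], [0.0, 1.0]])
p = multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([0.0, 0.0])
print(p)

# With correlation rho the closed form is 1/4 + arcsin(rho)/(2*pi),
# which lets us check the quasi-Monte Carlo approximation.
rho = 0.5
cov = np.array([[1.0, rho], [rho, 1.0]])
p_corr = multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([0.0, 0.0])
print(p_corr)
```

In higher dimensions no closed form exists, which is exactly where Genz-type numerical approximation becomes necessary for probit-style likelihoods.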
Abstract:
The principal aim of this paper is to measure the amount by which the profit of a multi-input, multi-output firm deviates from maximum short-run profit, and then to decompose this profit gap into components that are of practical use to managers. In particular, our interest is in the measurement of the contribution of unused capacity, along with measures of technical inefficiency, and allocative inefficiency, in this profit gap. We survey existing definitions of capacity and, after discussing their shortcomings, we propose a new ray economic capacity measure that involves short-run profit maximisation, with the output mix held constant. We go on to describe how the gap between observed profit and maximum profit can be calculated and decomposed using linear programming methods. The paper concludes with an empirical illustration, involving data on 28 international airline companies. The empirical results indicate that these airline companies achieve profit levels which are on average US$815m below potential levels, and that 70% of the gap may be attributed to unused capacity. (C) 2002 Elsevier Science B.V. All rights reserved.
Abstract:
We show how polarization measurements on the output fields generated by parametric down conversion will reveal a violation of multiparticle Bell inequalities, in the regime of both low- and high-output intensity. In this case, each spatially separated system, upon which a measurement is performed, is comprised of more than one particle. In view of the formal analogy with spin systems, the proposal provides an opportunity to test the predictions of quantum mechanics for spatially separated higher spin states. Here, quantum behavior may be demonstrated even where measurements are performed on systems of large quantum (particle) number. Our proposal applies to both vacuum-state signal and idler inputs, and also to the quantum-injected parametric amplifier as studied by De Martini. The effect of detector inefficiencies is included, and weaker Bell-Clauser-Horne inequalities are derived to enable realistic tests of local hidden variables with auxiliary assumptions for the multiparticle situation.
Abstract:
In previous work we have applied the environmental multi-region input-output (MRIO) method proposed by Turner et al (2007) to examine the ‘CO2 trade balance’ between Scotland and the Rest of the UK. In McGregor et al (2008) we construct an interregional economy-environment input-output (IO) and social accounting matrix (SAM) framework that allows us to investigate methods of attributing responsibility for pollution generation in the UK at the regional level. This facilitates analysis of the nature and significance of environmental spillovers and the existence of an environmental ‘trade balance’ between regions. While the existence of significant data problems means that the quantitative results of this study should be regarded as provisional, we argue that the use of such a framework allows us to begin to consider questions such as the extent to which a devolved authority like the Scottish Parliament can and should be responsible for contributing to national targets for reductions in emissions levels (e.g. the UK commitment to the Kyoto Protocol) when it is limited in the way it can control emissions, particularly with respect to changes in demand elsewhere in the UK. However, while such analysis is useful in terms of accounting for pollution flows in the single time period that the accounts relate to, it is limited when the focus is on modelling the impacts of any marginal change in activity. This is because a conventional demand-driven IO model assumes an entirely passive supply-side in the economy (i.e. all supply is infinitely elastic) and is further restricted by the assumption of universal Leontief (fixed proportions) technology implied by the use of the A and multiplier matrices. In this paper we argue that where analysis of marginal changes in activity is required, a more flexible interregional computable general equilibrium approach that models behavioural relationships in a more realistic and theory-consistent manner is more appropriate and informative.
To illustrate our analysis, we compare the results of introducing a positive demand stimulus in the UK economy using both IO and CGE interregional models of Scotland and the rest of the UK. In the case of the latter, we demonstrate how more theory consistent modelling of both demand and supply side behaviour at the regional and national levels affect model results, including the impact on the interregional CO2 ‘trade balance’.
Abstract:
The regional economic impact of biofuel production depends upon a number of interrelated factors: the specific biofuels feedstock and production technology employed; the sector’s embeddedness to the rest of the economy, through its demand for local resources; the extent to which new activity is created. These issues can be analysed using multisectoral economic models. Some studies have used (fixed price) Input-Output (IO) and Social Accounting Matrix (SAM) modelling frameworks, whilst a nascent Computable General Equilibrium (CGE) literature has also begun to examine the regional (and national) impact of biofuel development. This paper reviews, compares and evaluates these approaches for modelling the regional economic impacts of biofuels.
Abstract:
In an input-output context the impact of any particular industrial sector is commonly measured in terms of the output multiplier for that industry. Although such measures are routinely calculated and often used to guide regional industrial policy, the behaviour of such measures over time is an area that has attracted little academic study. The output multipliers derived from any one table will have a distribution; for some industries the multiplier will be relatively high, for some it will be relatively low. The recent publication of consistent input-output tables for the Scottish economy makes it possible to examine trends in this distribution over the ten-year period 1998-2007. This is done by comparing the means and other summary measures of the distributions, the histograms and the cumulative densities. The results indicate a tendency for the multipliers to increase over the period. A Markov chain modelling approach suggests that this drift is a slow but long term phenomenon which appears not to tend to an equilibrium state. The prime reason for the increase in the output multipliers is traced to a decline in the relative importance of imported (both from the rest of the UK and the rest of the world) intermediate inputs used by Scottish industries. This suggests that models calibrated on the set of tables might have to be interpreted with caution.
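As a minimal illustration of the quantity being tracked, Type I output multipliers are the column sums of the Leontief inverse (I - A)^-1. The two-sector coefficients below are hypothetical, not taken from the Scottish tables:

```python
import numpy as np

def output_multipliers(A):
    """Type I output multipliers: column sums of the Leontief inverse."""
    n = A.shape[0]
    L = np.linalg.inv(np.eye(n) - A)   # Leontief inverse (I - A)^-1
    return L.sum(axis=0)

# Hypothetical domestic technical coefficients for a 2-sector economy.
A = np.array([[0.15, 0.25],
              [0.20, 0.05]])
m = output_multipliers(A)
print(m)

# Replacing imported intermediate inputs with domestic ones raises the
# domestic coefficients, and with them every output multiplier.
A_dom = A + 0.10
print(output_multipliers(A_dom))
```

The second computation mirrors the mechanism the abstract identifies: a declining share of imported intermediates raises the domestic coefficients and hence the multipliers.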
Abstract:
This paper analyzes the flow of intermediate inputs across sectors by adopting a network perspective on sectoral interactions. I apply these tools to show how fluctuations in aggregate economic activity can be obtained from independent shocks to individual sectors. First, I characterize the network structure of input trade in the U.S. On the demand side, a typical sector relies on a small number of key inputs and sectors are homogeneous in this respect. However, in their role as input-suppliers sectors do differ: many specialized input suppliers coexist alongside general purpose sectors functioning as hubs to the economy. I then develop a model of intersectoral linkages that can reproduce these connectivity features. In a standard multisector setup, I use this model to provide analytical expressions linking aggregate volatility to the network structure of input trade. I show that the presence of sectoral hubs - by coupling production decisions across sectors - leads to fluctuations in aggregates.
Abstract:
What determines which inputs are initially considered and eventually adopted in the production of new or improved goods? Why are some inputs much more prominent than others? We model the evolution of input linkages as a process where new producers first search for potentially useful inputs and then decide which ones to adopt. A new product initially draws a set of 'essential suppliers'. The search stage is then confined to the network neighborhood of the latter, i.e., to the inputs used by the essential suppliers. The adoption decision is driven by a tradeoff between the benefits accruing from input variety and the costs of input adoption. This has important implications for the number of forward linkages that a product (input variety) develops over time. Input diffusion is fostered by network centrality: an input that is initially represented in many network neighborhoods is subsequently more likely to be adopted. This mechanism also delivers a power law distribution of forward linkages. Our predictions continue to hold when varieties are aggregated into sectors. We can thus test them, using detailed sectoral US input-output tables. We show that initial network proximity of a sector in 1967 significantly increases the likelihood of adoption throughout the subsequent four decades. The same is true for rapid productivity growth in an input-producing sector. Our empirical results highlight two conditions for new products to become central nodes: initial network proximity to prospective adopters, and technological progress that reduces their relative price. Semiconductors met both conditions.
Abstract:
This thesis surveys temporal and stochastic software reliability models and examines a few of the models in practice. The theoretical part covers the central definitions and metrics used to describe and assess software reliability, as well as the descriptions of the models themselves. Two groups of software reliability models are presented. The first group consists of hazard (risk) based models. The second group comprises models based on fault 'seeding' and tagging. The empirical part contains the descriptions and results of the experiments. The experiments were carried out using three models from the first group: the Jelinski-Moranda model, the first geometric model, and a simple exponential model. The purpose of the experiments was to study how the distribution of the input data affects the performance of the models, and how sensitive the models are to changes in the amount of input data. The Jelinski-Moranda model proved the most sensitive to the distribution, owing to convergence problems, and the first geometric model the most sensitive to changes in the amount of data.
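A sketch of the first of those models, assuming the standard Jelinski-Moranda formulation: with N initial faults and per-fault hazard phi, the i-th inter-failure time is exponential with rate phi * (N - i + 1), and N can be estimated by profiling the likelihood over integer candidates. The simulated data and helper names are illustrative:

```python
import numpy as np

def jm_loglik(N, phi, t):
    """Jelinski-Moranda log-likelihood for inter-failure times t.

    Hazard before the i-th failure (1-based): phi * (N - i + 1).
    """
    i = np.arange(1, len(t) + 1)
    k = N - i + 1
    return np.sum(np.log(phi * k) - phi * k * t)

def jm_fit(t, N_max=200):
    """Profile-likelihood fit: for each candidate N the MLE of phi is
    closed-form (phi = n / sum(k_i t_i)), so we just scan integer N."""
    n = len(t)
    best = None
    for N in range(n, N_max + 1):
        k = N - np.arange(1, n + 1) + 1
        phi = n / np.sum(k * t)            # closed-form MLE given N
        ll = jm_loglik(N, phi, t)
        if best is None or ll > best[2]:
            best = (N, phi, ll)
    return best

# Simulated failure data with N = 30 faults and phi = 0.02.
rng = np.random.default_rng(0)
N_true, phi_true, n_obs = 30, 0.02, 25
t = np.array([rng.exponential(1 / (phi_true * (N_true - i)))
              for i in range(n_obs)])
N_hat, phi_hat, _ = jm_fit(t)
print(N_hat, phi_hat)
```

The convergence problems the abstract mentions show up here too: when the failure data do not thin out clearly, the profiled likelihood can keep increasing in N, which is why the scan is capped at `N_max`.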
Abstract:
The pulp and paper industry is currently facing broad structural changes due to global shifts in demand and supply. These changes have significant impacts on national economies worldwide. Planted forests (especially eucalyptus) and recovered paper have quickly increased their importance as raw material for paper and paperboard production. Although advances in information and communication technologies could reduce the demand for communication papers, and the growth of paper consumption has indeed flattened in developed economies, particularly in North America and Western Europe, the consumption is increasing on a global scale. Moreover, the focal point of production and consumption is moving from the Western world to the rapidly growing markets of Southeast Asia. This study analyzes how the so-called megatrends (globalization, technological development, and increasing environmental awareness) affect the pulp and paper industry’s external environment, and seeks reliable ways to incorporate the impact of the megatrends on the models concerning the demand, trade, and use of paper and pulp. The study expands current research in several directions and points of view, for example, by applying and incorporating several quantitative methods and different models. As a result, the thesis makes a significant contribution to better understand and measure the impacts of structural changes on the pulp and paper industry. It also provides some managerial and policy implications.
Abstract:
The aim of this paper is to demonstrate that, even if Marx's solution to the transformation problem can be modified, his basic conclusions remain valid. The alternative solution proposed here is based on the constraint of a common general profit rate in both spaces and a money wage level that is determined simultaneously with prices.
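The abstract does not state the equations, so the sketch below uses the standard price-of-production system with wages paid out of output, p = (1 + r) pA + wl, as a stand-in: given a uniform profit rate r and money wage w, prices solve a linear system, and each sector can be checked to earn the common rate. All numbers are hypothetical, not the paper's specification:

```python
import numpy as np

# Hypothetical 2-sector economy: A[i, j] = input of good i per unit of
# good j; l[j] = direct labor per unit of good j.
A = np.array([[0.2, 0.3],
              [0.1, 0.4]])
l = np.array([0.5, 0.6])
r, w = 0.25, 1.0                  # uniform profit rate, money wage

# Prices of production: p = (1 + r) * p @ A + w * l
# => p = w * l @ inv(I - (1 + r) * A)
p = w * l @ np.linalg.inv(np.eye(2) - (1 + r) * A)
print(p)

# Each sector's markup over material input costs equals the common rate r.
print((p - w * l) / (p @ A) - 1)
```

The point of the check in the last line is precisely the paper's constraint: whatever the sector, the same general profit rate r is earned on the value of inputs advanced.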
Abstract:
The aim of this paper is to demonstrate that, even if Marx's solution to the transformation problem can be modified, his basic conclusions remain valid.
Abstract:
In a recent paper, Bai and Perron (1998) considered theoretical issues related to the limiting distribution of estimators and test statistics in the linear model with multiple structural changes. In this companion paper, we consider practical issues for the empirical applications of the procedures. We first address the problem of estimation of the break dates and present an efficient algorithm to obtain global minimizers of the sum of squared residuals. This algorithm is based on the principle of dynamic programming and requires at most least-squares operations of order O(T^2) for any number of breaks. Our method can be applied to both pure and partial structural-change models. Second, we consider the problem of forming confidence intervals for the break dates under various hypotheses about the structure of the data and the errors across segments. Third, we address the issue of testing for structural changes under very general conditions on the data and the errors. Fourth, we address the issue of estimating the number of breaks. We present simulation results pertaining to the behavior of the estimators and tests in finite samples. Finally, a few empirical applications are presented to illustrate the usefulness of the procedures. All methods discussed are implemented in a GAUSS program available upon request for non-profit academic use.
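The dynamic-programming step can be sketched for the simplest case of pure mean shifts: precompute the sum of squared residuals of every possible segment (a table of order O(T^2), matching the operation count above), then recurse on the minimal total SSR and backtrack the break dates. This is an illustrative reimplementation, not the authors' GAUSS program:

```python
import numpy as np

def segment_ssr(y):
    """ssr[i, j]: sum of squared residuals of a mean-only fit to y[i..j]."""
    T = len(y)
    ssr = np.full((T, T), np.inf)
    for i in range(T):
        s = ss = 0.0
        for j in range(i, T):
            s += y[j]; ss += y[j] ** 2
            ssr[i, j] = ss - s * s / (j - i + 1)   # sum (y - mean)^2
    return ssr

def break_dates(y, m):
    """Dynamic-programming global minimizer of total SSR with m breaks.

    Returns the 0-based last index of each of the first m segments."""
    T = len(y)
    ssr = segment_ssr(y)
    # cost[k, t]: minimal SSR of fitting k+1 segments to y[0..t]
    cost = np.full((m + 1, T), np.inf)
    arg = np.zeros((m + 1, T), dtype=int)
    cost[0] = ssr[0]
    for k in range(1, m + 1):
        for t in range(k, T):
            # candidate b = end of segment k, ranging over k-1 .. t-1
            cand = cost[k - 1, k - 1:t] + ssr[k:t + 1, t]
            j = int(np.argmin(cand))
            cost[k, t] = cand[j]
            arg[k, t] = j + k - 1          # end index of the k-th segment
    # backtrack the optimal partition
    breaks, t = [], T - 1
    for k in range(m, 0, -1):
        b = int(arg[k, t])
        breaks.append(b)
        t = b
    return sorted(breaks)

# Noiseless mean-shift series with true breaks after indices 29 and 59.
y = np.r_[np.zeros(30), 5 * np.ones(30), np.full(30, 10.0)]
print(break_dates(y, 2))  # → [29, 59]
```

For regressions with covariates the per-segment SSR would come from a least-squares fit instead of a mean, but the recursion over segment boundaries is the same.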