856 results for C33 - Models with Panel Data
Abstract:
The paper focuses on the recent pattern of government consumption expenditure in developing countries and estimates the determinants that have influenced government expenditure. Using a panel data set for 111 developing countries from 1984 to 2004, this study finds evidence that political and institutional variables, as well as governance variables, significantly influence government expenditure. Among other results, the paper finds new evidence for Wagner's law, which states that people's demand for services and willingness to pay are income-elastic, so that the expansion of the public economy is driven by a nation's greater economic affluence (Cameron 1978). Corruption is found to be influential in explaining the public expenditure of developing countries. By contrast, the size of the economy and fractionalization are found to have a significant negative association with government expenditure. In addition, the study finds evidence that public expenditure shrinks significantly under military dictatorship compared with other forms of governance.
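The abstract does not spell out its estimator, but the kind of specification it describes is a country fixed-effects panel regression of the government expenditure share on income and governance variables. Below is a minimal sketch of such a regression; the file name, column names and the use of the `linearmodels` package are illustrative assumptions, not the paper's actual data or code.

```python
# Minimal sketch: country fixed-effects panel regression of government
# consumption on income and governance variables (hypothetical columns).
import pandas as pd
from linearmodels.panel import PanelOLS  # pip install linearmodels

df = pd.read_csv("gov_expenditure_panel.csv")    # hypothetical file
df = df.set_index(["country", "year"])           # entity-time MultiIndex

# Wagner's law: a positive, significant coefficient on log GDP per capita
# is the income-elasticity evidence discussed in the abstract.
model = PanelOLS.from_formula(
    "gov_exp_share ~ 1 + log_gdp_pc + corruption + fractionalization"
    " + military_regime + EntityEffects",
    data=df,
)
print(model.fit(cov_type="clustered", cluster_entity=True))
```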
Abstract:
This paper explains how the Armington-Krugman-Melitz supermodel developed by Dixon and Rimmer can be parameterized, and demonstrates that only two kinds of additional information are required in order to extend a standard trade model to include Melitz-type monopolistic competition and heterogeneous firms. Further, it is shown how specifying too much additional information leads to violations of the model constraints, necessitating adjustment and reconciliation of the data. Once a Melitz-type model is parameterized, a Krugman-type model can also be parameterized using the calibrated values in the Melitz-type model without any additional data. Sample code for the General Algebraic Modeling System (GAMS) has also been prepared to promote the innovative supermodel in the AGE community.
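The GAMS sample code mentioned in the abstract is not reproduced here. As a purely illustrative stand-in, the sketch below simulates the two extra Melitz ingredients a standard Armington model lacks: a Pareto distribution of firm productivities and CES monopolistic competition with a constant markup. All parameter values and the entry cutoff are assumptions chosen only for illustration.

```python
# Monte-Carlo illustration of the two extra Melitz ingredients: a Pareto
# productivity distribution (shape a) and CES monopolistic competition
# (elasticity sigma). Parameter values are illustrative, not calibrated.
import numpy as np

rng = np.random.default_rng(0)
sigma, a, phi_min, wage, n_firms = 4.0, 4.6, 1.0, 1.0, 100_000

phi = (rng.pareto(a, n_firms) + 1.0) * phi_min   # classical Pareto draws
phi_star = 1.5                                    # illustrative entry cutoff
phi = phi[phi >= phi_star]                        # only productive firms survive

markup = sigma / (sigma - 1.0)
prices = markup * wage / phi                      # constant-markup pricing
P = (prices ** (1.0 - sigma)).sum() ** (1.0 / (1.0 - sigma))  # CES price index
print(f"{phi.size} surviving firms, CES price index = {P:.4f}")
```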
Abstract:
Services in smart environments aim to improve the quality of people's lives. Among the most important issues when developing such environments are testing and validating these services. These tasks usually entail high costs and burdensome or unfeasible real-world testing. In such cases, artificial societies may be used to simulate the smart environment (i.e. the physical environment, equipment and humans). With this aim, the CHROMUBE methodology guides test engineers when modeling human beings. Such models reproduce behaviors that are highly similar to the real ones. Originally, these models are based on automata whose transitions are governed by random variables. The automaton's structure and the probability distribution function of each random variable are determined by a manual trial-and-error process. This paper presents an alternative extension of the methodology that avoids this manual process. It is based on learning human behavior patterns automatically from sensor data using machine learning techniques. The presented approach has been tested on a real scenario, where this extension has produced highly accurate human behavior models.
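One simple way to learn behaviour automata from sensor data, in the spirit of the extension described above, is to estimate a first-order Markov chain over observed activity states. The sketch below does this for two hypothetical activity sequences; the activity labels and the add-one smoothing are illustrative choices, not CHROMUBE's actual procedure.

```python
# Minimal sketch: estimate a first-order Markov model of daily activities from
# sensor-derived event sequences (hypothetical activity labels).
import numpy as np

sequences = [
    ["sleep", "kitchen", "work", "kitchen", "tv", "sleep"],
    ["sleep", "kitchen", "work", "work", "tv", "sleep"],
]
states = sorted({s for seq in sequences for s in seq})
idx = {s: i for i, s in enumerate(states)}

counts = np.zeros((len(states), len(states)))
for seq in sequences:
    for a, b in zip(seq, seq[1:]):
        counts[idx[a], idx[b]] += 1

# Row-normalise to transition probabilities (add-one smoothing for unseen moves).
transition = (counts + 1.0) / (counts + 1.0).sum(axis=1, keepdims=True)
print(states)
print(transition.round(2))
```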
Abstract:
Improving the knowledge of demand evolution over time is a key aspect in the evaluation of transport policies and in forecasting future investment needs. It becomes even more critical for toll roads, which in recent decades have become an increasingly common device for funding road projects. However, the literature on demand elasticity estimates for toll roads is sparse and leaves some important aspects to be analyzed in greater detail. In particular, previous research on traffic analysis does not often disaggregate heavy vehicle demand from the total volume, so the specific behavioral patterns of this traffic segment are not taken into account. Furthermore, GDP is the socioeconomic variable most commonly chosen to explain road freight traffic growth over time. This paper seeks to determine the variables that better explain the evolution of heavy vehicle demand on toll roads over time. To that end, we present a dynamic panel data methodology aimed at identifying the key socioeconomic variables that explain the behavior of road freight traffic over the years. The results show that, despite the usual practice, GDP may not constitute a suitable explanatory variable for heavy vehicle demand. Rather, considering only the GDP of those sectors with a high impact on transport demand, such as construction or industry, leads to more consistent results. The methodology is applied to Spanish toll roads for the 1990–2011 period. This is an interesting case in the international context, as road freight demand has experienced an even greater reduction in Spain than elsewhere since the beginning of the economic crisis in 2008.
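The abstract does not detail its dynamic panel estimator. A common textbook approach for such models is an Anderson-Hsiao style instrumental-variables regression in first differences, sketched below with hypothetical file and column names; in practice a GMM estimator such as Arellano-Bond would often be preferred.

```python
# Sketch of an Anderson-Hsiao style dynamic panel estimate for heavy-vehicle
# traffic on toll roads: first-difference the model and instrument the lagged
# dependent variable with its second lag in levels.
import pandas as pd
from linearmodels.iv import IV2SLS  # pip install linearmodels

df = pd.read_csv("toll_road_panel.csv").sort_values(["road", "year"])
df["d_ln_q"] = df.groupby("road")["ln_heavy_traffic"].diff()
df["d_ln_q_lag"] = df.groupby("road")["d_ln_q"].shift(1)           # endogenous regressor
df["ln_q_lag2"] = df.groupby("road")["ln_heavy_traffic"].shift(2)  # instrument
df["d_ln_gdp_ci"] = df.groupby("road")["ln_gdp_construction_industry"].diff()

est = IV2SLS.from_formula(
    "d_ln_q ~ 1 + d_ln_gdp_ci + [d_ln_q_lag ~ ln_q_lag2]",
    data=df.dropna(),
).fit(cov_type="robust")
print(est)
```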
Abstract:
Tolls have increasingly become a common mechanism to fund road projects in recent decades. Therefore, improving knowledge of demand behavior constitutes a key aspect for stakeholders dealing with the management of toll roads. However, the literature concerning demand elasticity estimates for interurban toll roads is still limited due to their relatively scarce number in the international context. Furthermore, existing research has left some aspects to be investigated, among them the choice of GDP as the most common socioeconomic variable to explain traffic growth over time. This paper intends to determine the variables that better explain the evolution of light vehicle demand on toll roads over the years. To that end, we establish a dynamic panel data methodology aimed at identifying the key socioeconomic variables explaining changes in light vehicle demand over time. The results show that, despite some usefulness, GDP does not constitute the most appropriate explanatory variable, while other parameters such as employment or GDP per capita lead to more stable and consistent results. The methodology is applied to Spanish toll roads for the 1990–2011 period, which constitutes a very interesting case for studying variations in toll road use, as road demand has experienced a significant decrease since the beginning of the economic crisis in 2008.
Abstract:
This paper deals with the determinants of labour out-migration from agriculture across 149 EU regions over the 1990–2008 period. The central aim is to shed light on the role played by payments from the Common Agricultural Policy (CAP) in this important adjustment process. Using static and dynamic panel data estimators, we show that standard neoclassical drivers, like relative income and the relative labour share, represent significant determinants of the intersectoral migration of agricultural labour. Overall, CAP payments contributed significantly to job creation in agriculture, although the magnitude of the economic effect was rather moderate. We also find that pillar I subsidies exerted an effect approximately two times greater than that of pillar II payments.
Abstract:
In this paper we propose a range of dynamic data envelopment analysis (DEA) models which allow information on costs of adjustment to be incorporated into the DEA framework. We first specify a basic dynamic DEA model predicated on a number of simplifying assumptions. We then outline a number of extensions to this model to accommodate asymmetric adjustment costs, non-static output quantities, non-static input prices, non-static costs of adjustment, technological change, quasi-fixed inputs and investment budget constraints. The new dynamic DEA models provide valuable extra information relative to the standard static DEA models: they identify an optimal path of adjustment for the input quantities, and provide a measure of the potential cost savings that result from recognising the costs of adjusting input quantities towards the optimal point. The new models are illustrated using data relating to a chain of 35 retail department stores in Chile. The empirical results illustrate the wealth of information that can be derived from these models, and clearly show that static models overstate potential cost savings when adjustment costs are non-zero.
Abstract:
Standard factorial designs sometimes may be inadequate for experiments that aim to estimate a generalized linear model, for example, for describing a binary response in terms of several variables. A method is proposed for finding exact designs for such experiments that uses a criterion allowing for uncertainty in the link function, the linear predictor, or the model parameters, together with a design search. Designs are assessed and compared by simulation of the distribution of efficiencies relative to locally optimal designs over a space of possible models. Exact designs are investigated for two applications, and their advantages over factorial and central composite designs are demonstrated.
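As a point of reference, the benchmark in such comparisons is typically D-optimality evaluated at assumed parameter values (the "locally optimal" criterion; the paper's criterion additionally allows for uncertainty in the link function, linear predictor and parameters). The sketch below evaluates the log-determinant of the Fisher information of a replicated 2^2 factorial under a two-factor logistic model with assumed coefficients; the parameter values are illustrative only.

```python
# Sketch: D-optimality objective for an exact design under a two-factor logistic
# model, evaluated at assumed parameter values.
import numpy as np

beta = np.array([0.5, 1.0, -1.5])           # assumed intercept and slopes

def log_det_information(design):
    """design: (n, 2) array of factor settings for the n runs."""
    X = np.column_stack([np.ones(len(design)), design])
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    W = np.diag(p * (1.0 - p))               # GLM weights for a binary response
    M = X.T @ W @ X                          # Fisher information of the design
    return np.linalg.slogdet(M)[1]

factorial = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]] * 3)  # replicated 2^2
print(log_det_information(factorial))
```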
Abstract:
Deformable models are an attractive approach to recognizing objects which have considerable within-class variability such as handwritten characters. However, there are severe search problems associated with fitting the models to data which could be reduced if a better starting point for the search were available. We show that by training a neural network to predict how a deformable model should be instantiated from an input image, such improved starting points can be obtained. This method has been implemented for a system that recognizes handwritten digits using deformable models, and the results show that the search time can be significantly reduced without compromising recognition performance. © 1997 Academic Press.
Abstract:
Data Envelopment Analysis (DEA) is a nonparametric method for measuring the efficiency of a set of decision making units such as firms or public sector agencies, first introduced into the operational research and management science literature by Charnes, Cooper, and Rhodes (CCR) [Charnes, A., Cooper, W.W., Rhodes, E., 1978. Measuring the efficiency of decision making units. European Journal of Operational Research 2, 429–444]. The original DEA models were applicable only to technologies characterized by positive inputs/outputs. In subsequent literature there have been various approaches to enable DEA to deal with negative data. In this paper, we propose a semi-oriented radial measure, which permits the presence of variables which can take both negative and positive values. The model is applied to data on a notional effluent processing system to compare the results with those yielded by two alternative methods for dealing with negative data in DEA: The modified slacks-based model suggested by Sharp et al. [Sharp, J.A., Liu, W.B., Meng, W., 2006. A modified slacks-based measure model for data envelopment analysis with ‘natural’ negative outputs and inputs. Journal of Operational Research Society 57 (11) 1–6] and the range directional model developed by Portela et al. [Portela, M.C.A.S., Thanassoulis, E., Simpson, G., 2004. A directional distance approach to deal with negative data in DEA: An application to bank branches. Journal of Operational Research Society 55 (10) 1111–1121]. A further example explores the advantages of using the new model.
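For readers unfamiliar with the CCR model referenced above, the sketch below solves the standard input-oriented CCR envelopment linear program for each DMU with SciPy. The data are a tiny made-up example with strictly positive values; the negative-data approaches compared in the abstract (SORM, the modified slacks-based measure and the range directional model) modify this basic formulation.

```python
# Sketch: input-oriented CCR envelopment LP for each DMU, solved with scipy.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 1.0], [4.0, 3.0]])   # inputs, one row per DMU
Y = np.array([[1.0], [1.0], [1.0]])                  # outputs, one row per DMU

def ccr_efficiency(k):
    n_dmu, n_in = X.shape
    c = np.r_[1.0, np.zeros(n_dmu)]          # minimise theta over (theta, lambdas)
    A_in = np.c_[-X[k], X.T]                 # sum_j lam_j x_ij <= theta * x_ik
    A_out = np.c_[np.zeros(Y.shape[1]), -Y.T]  # sum_j lam_j y_rj >= y_rk
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(n_in), -Y[k]],
                  bounds=[(None, None)] + [(0.0, None)] * n_dmu,
                  method="highs")
    return res.x[0]

print([round(ccr_efficiency(k), 3) for k in range(X.shape[0])])
```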
Abstract:
Foley [J. Opt. Soc. Am. A 11 (1994) 1710] has proposed an influential psychophysical model of masking in which mask components in a contrast gain pool are raised to an exponent before summation and divisive inhibition. We tested this summation rule in experiments in which contrast detection thresholds were measured for a vertical 1 c/deg (or 2 c/deg) sine-wave component in the presence of a 3 c/deg (or 6 c/deg) mask that had either a single component oriented at -45° or a pair of components oriented at ±45°. Contrary to the predictions of Foley's model 3, we found that for masks of moderate contrast and above, threshold elevation was predicted by linear summation of the mask components in the inhibitory stage of the contrast gain pool. We built this feature into two new models, referred to as the early adaptation model and the hybrid model. In the early adaptation model, contrast adaptation controls a threshold-like nonlinearity on the output of otherwise linear pathways that provide the excitatory and inhibitory inputs to a gain control stage. The hybrid model involves nonlinear and nonadaptable routes to excitatory and inhibitory stages as well as an adaptable linear route. With only six free parameters, both models provide excellent fits to the masking and adaptation data of Foley and Chen [Vision Res. 37 (1997) 2779] but unlike Foley and Chen's model, are able to do so with only one adaptation parameter. However, only the hybrid model is able to capture the features of Foley's (1994) pedestal plus orthogonal fixed mask data. We conclude that (1) linear summation of inhibitory components is a feature of contrast masking, and (2) that the main aftereffect of spatial adaptation on contrast increment thresholds can be assigned to a single site. © 2002 Elsevier Science Ltd. All rights reserved.
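The point at issue above is how mask components are combined in the divisive (inhibitory) stage of a contrast gain-control model. The schematic sketch below contrasts the two summation rules: raising each mask contrast to an exponent before summing (Foley-style) versus summing the contrasts linearly before applying the exponent, as the abstract's data favour. The functional form and parameter values are illustrative, not the fitted models from the paper.

```python
# Schematic contrast gain-control response comparing two mask-summation rules:
# exponent-then-sum versus sum-then-exponent. Illustrative parameters only.
import numpy as np

p, q, Z = 2.4, 2.0, 0.1

def response(c_target, mask_contrasts, linear_sum=False):
    excitation = c_target ** p
    masks = np.asarray(mask_contrasts, dtype=float)
    inhibition = masks.sum() ** q if linear_sum else (masks ** q).sum()
    return excitation / (Z + inhibition)

# One oblique mask component vs. a pair: the two rules diverge for the pair.
for linear_sum in (False, True):
    print(linear_sum,
          response(0.05, [0.2], linear_sum),
          response(0.05, [0.2, 0.2], linear_sum))
```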
Abstract:
Despite the increased attention on the impacts of globalisation, there has been little empirical investigation into the impact of multinational firms on the domestic labour market and in particular on wage inequality. This is in spite of a rapid increase in foreign direct investment (FDI) at around the same time as rising inequality. Using UK panel data, this paper tests whether inward flows of FDI have contributed to increasing wage inequality. Even after controlling for the two most common explanations of wage inequality, technology and trade, we find that FDI has a significant effect upon wage inequality, with the overall impact of FDI explaining on average 11% of wage inequality. © 2003 Elsevier B.V. All rights reserved.
Abstract:
Using panel data pertaining to large Polish (non-financial) firms, this paper examines the determinants of employment change during the period 1996-2002. Paying particular attention to the asymmetry hypothesis, we investigate the impact of own wages, outside wages, output growth, regional characteristics and sectoral affiliation on the evolution of employment. In keeping with the 'right to manage' model, we find that employment dynamics are not affected negatively by alternative wages. Furthermore, in contrast to the early transition period, we find evidence that employment levels respond to positive sales growth (in all but state firms). The early literature (e.g. Kollo, 1998) found that labour hoarding lowered employment elasticities in the presence of positive demand shocks. Our findings suggest that inherited labour hoarding may no longer be a factor. We argue that the present pattern of employment adjustment is better explained by the role of insiders. This tentative conclusion hinges on the contrasting behaviour of state and privatised companies and the similar behaviour of privatised and new private companies. We conclude that the lower responsiveness of employment to both positive and negative changes in revenue in state firms is consistent with the proposition that rent sharing by insiders is stronger in the state sector.
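A simple way to operationalise the asymmetry hypothesis mentioned above is to split revenue growth into its positive and negative parts and let each enter the employment equation with its own coefficient. The sketch below does this with statsmodels; the file name and column names are hypothetical and the specification is only indicative of the approach, not the paper's exact model.

```python
# Sketch of an asymmetry test: separate coefficients for positive and negative
# sales growth in an employment-growth equation (hypothetical columns).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("polish_firms_panel.csv")            # hypothetical file
df["sales_up"] = df["sales_growth"].clip(lower=0.0)   # positive part of growth
df["sales_down"] = df["sales_growth"].clip(upper=0.0) # negative part of growth

fit = smf.ols(
    "emp_growth ~ sales_up + sales_down + own_wage + outside_wage"
    " + C(sector) + C(region) + C(ownership)",
    data=df,
).fit(cov_type="HC1")
print(fit.summary())
```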
Abstract:
Over the last few years Data Envelopment Analysis (DEA) has been gaining increasing popularity as a tool for measuring the efficiency and productivity of Decision Making Units (DMUs). Conventional DEA models assume non-negative inputs and outputs. However, in many real applications, some inputs and/or outputs can take negative values. Recently, Emrouznejad et al. [6] introduced a Semi-Oriented Radial Measure (SORM) for modelling DEA with negative data. This paper points out some issues in target setting with SORM models and introduces a modified SORM approach. An empirical study in the banking sector demonstrates the applicability of the proposed model. © 2014 Elsevier Ltd. All rights reserved.
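The core of a semi-oriented radial measure is a data transformation: a variable that can take both signs is decomposed into two non-negative parts, which then enter the DEA model as separate measures. A minimal illustration of that decomposition is sketched below with made-up profit figures.

```python
# Sketch of the decomposition behind a semi-oriented radial measure:
# x = x_plus - x_minus with both parts non-negative.
import numpy as np

profit = np.array([12.0, -3.0, 0.0, 7.5, -1.2])   # illustrative signed output
profit_plus = np.maximum(profit, 0.0)
profit_minus = np.maximum(-profit, 0.0)

assert np.allclose(profit, profit_plus - profit_minus)
print(profit_plus, profit_minus)
```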
Abstract:
Data envelopment analysis (DEA) is one of the most widely used methods for measuring the efficiency and productivity of decision-making units (DMUs). The need for large computing resources in terms of memory and CPU time is inevitable in DEA for large-scale data sets, especially with negative measures. In recent years, a wide range of studies has been conducted in the area of combined artificial neural network and DEA methods. In this study, a supervised feed-forward neural network is proposed to evaluate the efficiency and productivity of large-scale data sets with negative values, in contrast to the corresponding DEA method. Results indicate that the proposed network has some computational advantages over the corresponding DEA models; therefore, it can be considered a useful tool for measuring the efficiency of DMUs with (large-scale) negative data.
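A hedged sketch of the surrogate idea described above: exact DEA scores are computed (by linear programming) for a subsample of DMUs, and a feed-forward network is then trained to predict efficiency for the remaining units directly from their, possibly negative, inputs and outputs. Here the exact scores are replaced by random stand-in values so the snippet runs on its own; in a real application they would come from a DEA solver.

```python
# Sketch of a feed-forward surrogate for DEA efficiency scores.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
Z = rng.normal(size=(5000, 6))               # stacked inputs/outputs per DMU (can be negative)
train_scores = rng.uniform(0.3, 1.0, 500)    # stand-in for exact DEA scores of a subsample

surrogate = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
)
surrogate.fit(Z[:500], train_scores)
approx_scores = surrogate.predict(Z[500:])   # fast approximate efficiencies for the rest
print(approx_scores[:5].round(3))
```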