1000 results for Finance -- Econometric models
Application of DEA to the analysis of profits in a forward vertically integrated system
Abstract:
In this paper, three DEA models are designed for a production system whose components are arranged in series and vertically integrated forward. The first model seeks to optimize the profits of the aggregate system, as well as their improvement in each of the subsystems. The second model adds to this objective transfer constraints on the specific resources associated with each subsystem, and the third model estimates the interval of variation for the transfer prices of the intermediate inputs between the two subsystems. The models have been programmed and simulated in the GAMS software, using data generated by a Cobb-Douglas production function for the intermediate inputs and the final outputs.
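As a rough illustration of the kind of optimization involved, the sketch below solves a single-stage DEA profit-maximization model (variable returns to scale) on data generated from a Cobb-Douglas technology. The abstract's three models are two-stage (serial) versions with transfer-price constraints, solved in GAMS; the dimensions, prices and the use of SciPy here are illustrative assumptions, not the authors' code.

```python
# Single-stage DEA profit model: max p'y - w'x over the VRS technology
# spanned by the observed (X, Y). A simplified stand-in for the paper's
# two-stage serial models; all numbers below are invented.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, s = 30, 2, 1                            # DMUs, inputs, outputs
X = rng.uniform(1.0, 10.0, size=(m, n))       # observed inputs
Y = (2.0 * X[0] ** 0.4 * X[1] ** 0.5).reshape(s, n)   # Cobb-Douglas outputs
Y *= rng.uniform(0.9, 1.0, size=(s, n))       # add some inefficiency
w = np.array([1.0, 1.5])                      # input prices (assumed)
p = np.array([3.0])                           # output price (assumed)

# variables z = [x (m), y (s), lam (n)]; minimize w'x - p'y
c = np.concatenate([w, -p, np.zeros(n)])
A_ub = np.zeros((s + m, m + s + n))
A_ub[:s, m:m + s] = np.eye(s); A_ub[:s, m + s:] = -Y   # y <= Y @ lam
A_ub[s:, :m] = -np.eye(m);     A_ub[s:, m + s:] = X    # X @ lam <= x
A_eq = np.zeros((1, m + s + n)); A_eq[0, m + s:] = 1.0 # VRS: sum(lam) = 1
res = linprog(c, A_ub=A_ub, b_ub=np.zeros(s + m),
              A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
frontier = -res.fun                           # maximal attainable profit

for j in range(3):                            # compare a few DMUs
    observed = p @ Y[:, j] - w @ X[:, j]
    print(f"DMU {j}: observed profit {observed:.2f}, frontier {frontier:.2f}")
```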
Abstract:
The construction of a performance index for the evaluation of investment portfolios rests on the correct definition of the risk measure to be used. The aim of this paper is to propose a performance measure suited to the evaluation of guaranteed investment fund portfolios. The particular features of this type of fund make it necessary to define a measure that captures the specific risk characteristics of such portfolios. Starting from the portfolio insurance strategy, a new risk measure based on downside risk is defined. We propose as the downside risk measure that part of the total risk of a portfolio of securities which is eliminated by the portfolio insurance strategy and, by contrast, as the upside risk measure that part of the total risk which does not disappear under the portfolio insurance strategy. The sum of upside risk and downside risk is therefore the total risk. Building on the upside risk measure and the C.A.P.M. asset pricing model, a specific performance measure for evaluating guaranteed investment funds is proposed.
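A toy version of the proposed decomposition, on simulated returns: downside risk is measured as the part of total risk removed by a floor (portfolio insurance) strategy, upside risk as the part that remains, so the two sum to total risk. The floor level, the use of variance as the risk measure and the normal returns are assumptions of this sketch, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(1)
r = rng.normal(0.05, 0.20, size=10_000)   # uninsured portfolio returns
floor = 0.0                               # guaranteed minimum return (assumed)
r_insured = np.maximum(r, floor)          # payoff under portfolio insurance

total_risk = r.var()
upside_risk = r_insured.var()             # risk the insurance does not remove
downside_risk = total_risk - upside_risk  # risk eliminated by the insurance

print(f"total {total_risk:.4f} = upside {upside_risk:.4f} "
      f"+ downside {downside_risk:.4f}")
```

A CAPM-style performance ratio built on the upside-risk component would then follow the same logic as the measure the abstract proposes.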
Abstract:
In this paper we assess the usefulness of a measure of efficiency in the generation of sales for predicting future operating income, under the hypothesis that if the efficiency measure is able to capture the permanent component of earnings, it should be useful for predicting future earnings over and above current earnings. To this end, in a first stage we use Data Envelopment Analysis (DEA) to determine the relative inefficiency of firms in using the resources at their disposal to generate the maximum possible level of sales. The inputs included in the DEA model (staff costs, consumption of raw materials and other supplies, depreciation, and other operating expenses) are obtained from information contained in the profit and loss account. In the second stage, the inefficiency measure is introduced as an explanatory variable in a regression model in which the dependent variable is operating income in the immediately following year. The results of the empirical study indicate that the relative inefficiency measure provided by the DEA model has informational content for predicting future operating income, over and above current and past operating income.
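The two-stage design can be miniaturized as follows: stage one computes an output-oriented DEA score treating sales as the single output of the four expense inputs; stage two regresses next-year operating income on current income and the inefficiency score. The CRS orientation, the synthetic data and the plain OLS estimator are assumptions of this sketch.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
n, m = 50, 4                          # firms; inputs: staff costs, materials,
X = rng.uniform(1, 10, size=(m, n))   #   depreciation, other operating costs
sales = X.prod(axis=0) ** 0.2 * rng.uniform(0.7, 1.0, n)  # output: sales

def phi(j):
    """Output-oriented CRS DEA: max phi s.t. Y@lam >= phi*y_j, X@lam <= x_j."""
    c = np.concatenate([[-1.0], np.zeros(n)])          # variables [phi, lam]
    A_ub = np.zeros((1 + m, 1 + n))
    A_ub[0, 0] = sales[j]; A_ub[0, 1:] = -sales        # phi*y_j <= Y @ lam
    A_ub[1:, 1:] = X                                   # X @ lam <= x_j
    b_ub = np.concatenate([[0.0], X[:, j]])
    return linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None)).x[0]

ineff = np.array([phi(j) for j in range(n)]) - 1.0     # 0 = efficient firm
income_t = 0.3 * sales * rng.uniform(0.8, 1.2, n)      # operating income, t
income_t1 = 0.8 * income_t - 0.5 * ineff + rng.normal(0, 0.1, n)  # year t+1
Z = np.column_stack([np.ones(n), income_t, ineff])
beta, *_ = np.linalg.lstsq(Z, income_t1, rcond=None)   # stage-two OLS
print("intercept, income_t, inefficiency:", np.round(beta, 3))
```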
Abstract:
This paper explores the formation of strategic groups in the Spanish banking sector, starting from an alternative definition of a strategic group (SG). An SG is defined as a set of firms able to respond in the same way to disturbances. The capacity to respond, or to adapt the competitive strategy, is defined in terms of the marginal relations (MR) between the strategic variables (the slope of the frontier at a point), and serves as the basis for examining the presence of mobility barriers. Data Envelopment Analysis (DEA) is the tool used to compute the MR, although its use raises two problems that are solved in this paper: the multiplicity of MR for firms located on the frontier (strategic leaders), and the impossibility of finding MR for firms located below it (strategic followers).
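One textbook way to obtain a marginal relation from DEA, sketched below, is to solve the CCR multiplier problem and read the frontier slope off the ratio of the optimal output weights; for firms on the frontier the optimal weights are generally not unique, which is precisely the multiplicity problem the abstract refers to. The data, dimensions and the CCR formulation are assumptions of this illustration, not the paper's exact procedure.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
n = 20
X = rng.uniform(1, 5, size=(1, n))    # one input (e.g. total assets)
Y = rng.uniform(1, 5, size=(2, n))    # two strategic outputs (e.g. loans, deposits)

def output_weights(j):
    """CCR multipliers: max u'y_j  s.t.  v'x_j = 1,  u'y_k - v'x_k <= 0 all k."""
    c = np.concatenate([-Y[:, j], [0.0]])        # variables [u1, u2, v]
    A_ub = np.column_stack([Y.T, -X.T])          # one row per firm k
    A_eq = np.array([[0.0, 0.0, X[0, j]]])       # normalization v'x_j = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=(1e-6, None))
    return res.x[:2]                             # one optimal weight vector

u = output_weights(0)
print("marginal relation dy2/dy1 at firm 0 = -u1/u2 =", round(-u[0] / u[1], 3))
```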
Abstract:
This paper assesses empirically the importance of size discrimination and disaggregate data for deciding where to locate a start-up concern. We compare three econometric specifications using Catalan data: a multinomial logit with 4 and 41 alternatives (provinces and comarques, respectively) in which firm size is the main covariate; a conditional logit with 4 and 41 alternatives including attributes of the sites as well as size-site interactions; and a Poisson model on the comarques and the full spatial choice set (942 municipalities) with site-specific variables. Our results suggest that if these two issues are ignored, conclusions may be misleading. We provide evidence that large and small firms behave differently and conclude that Catalan firms tend to choose between comarques rather than between municipalities. Moreover, labour-intensive firms seem more likely to be located in the city of Barcelona. Keywords: Catalonia, industrial location, multinomial response model. JEL: C250, E30, R00, R12
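The first of the three specifications can be reproduced in miniature: a multinomial logit of location choice on (log) firm size, with four alternatives standing in for the provinces. The data are simulated from a random-utility model, since the Catalan micro data are not available here; the conditional logit and Poisson specifications are not shown.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 500
size = rng.lognormal(mean=3.0, sigma=1.0, size=n)   # firm size (employees)
beta = np.array([0.0, 0.2, 0.4, 0.6])               # assumed size effects
util = np.log(size)[:, None] * beta + rng.gumbel(size=(n, 4))
choice = util.argmax(axis=1)                        # chosen province, 0..3

X = sm.add_constant(np.log(size))
fit = sm.MNLogit(choice, X).fit(disp=0)             # multinomial logit
print(fit.summary())
```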
Abstract:
Information sharing in oligopoly has been analyzed by assuming that firms behave as a sole economic agent. In this paper I assume that ownership and management are separated. Managers are allowed to falsely report their costs to owners and rivals. Under such circumstances, if owners want to achieve information sharing they must use managerial contracts that implement truthful cost reporting by managers as a dominant strategy. I show that, contrary to the classical result, without the inclusion of message-dependent payments in managerial contracts there will be no information sharing. On the other hand, with the inclusion of such publicly observable payments and credible ex-ante commitment by owners not to modify these payments, there will be perfect information sharing without the need for third parties. Keywords: Information sharing, Delegation, Managerial contracts. JEL classification numbers: D21, D82, L13, L21
Abstract:
I consider the problem of assigning agents to objects where each agent must pay the price of the object he gets and prices must sum to a given number. The objective is to select an assignment-price pair that is envy-free with respect to the true preferences. I prove that the proposed mechanism implements the set of envy-free allocations both in Nash and in strong Nash equilibrium. The distinguishing feature of the mechanism is that it treats the announced preferences as the true ones and selects an allocation that is envy-free with respect to the announced preferences.
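A small worked instance of the underlying allocation problem, under announced valuations: pick the total-value-maximizing assignment, then solve a linear program for prices that sum to the given number and leave no agent envious. This is a standard construction for computing one envy-free outcome, not the paper's mechanism; the valuation matrix and the total R are invented.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment, linprog

V = np.array([[9.0, 4.0, 2.0],        # V[i, j]: agent i's value for object j
              [6.0, 7.0, 3.0],
              [5.0, 6.0, 8.0]])
R = 12.0                              # prices must sum to R
n = V.shape[0]

rows, cols = linear_sum_assignment(-V)          # efficient (max-value) matching
# envy-freeness: V[i, own] - p[own] >= V[i, j] - p[j] for every i, j
A_ub, b_ub = [], []
for i in range(n):
    for j in range(n):
        if j == cols[i]:
            continue
        row = np.zeros(n)
        row[cols[i]] = 1.0; row[j] = -1.0       # p_own - p_j <= V_own - V_j
        A_ub.append(row); b_ub.append(V[i, cols[i]] - V[i, j])
res = linprog(np.zeros(n), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              A_eq=np.ones((1, n)), b_eq=[R], bounds=(None, None))
print("assignment:", list(zip(rows.tolist(), cols.tolist())),
      "prices:", np.round(res.x, 2))
```

Envy constraints involve only price differences, so the sum-to-R normalization can always be met by translating any envy-free price vector.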
Abstract:
This paper analyzes the linkages between the credibility of a target zone regime, the volatility of the exchange rate, and the width of the band within which the exchange rate is allowed to fluctuate. These three concepts should be related, since the band width induces a trade-off between credibility and volatility: narrower bands should give the exchange rate less scope to fluctuate, but may make agents perceive a larger probability of realignment, which by itself should increase the volatility of the exchange rate. We build a model in which this trade-off is made explicit. The model is used to understand the reduction in volatility experienced by most EMS countries after their target zones were widened in August 1993. As a natural extension, the model also rationalizes the existence of non-official, implicit target zones (or fear of floating) suggested by some authors.
Abstract:
Actual tax systems do not follow the normative recommendations of the theory of optimal taxation, for two reasons. First, there are the informational difficulties of knowing or estimating all the relevant elasticities and parameters. Second, there are the political complexities that would arise if a new tax implementation departed too much from current systems, which are perceived as somewhat egalitarian. Hence an ex-novo overhaul of the tax system might simply be non-viable. In contrast, a small marginal tax reform could be politically more palatable and economically simpler to implement. The goal of this paper is to evaluate, as a step prior to any tax reform, the marginal welfare cost of the current tax system in Spain. We do this by using a computational general equilibrium model calibrated to a point-in-time micro database. The simulation results show that the Spanish tax system gives rise to a considerable marginal excess burden, of the order of 0.50 money units for each additional money unit collected through taxes.
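For the headline figure, recall the definition being used: the marginal excess burden is the welfare loss over and above the extra revenue raised, per unit of that revenue. The numbers below are invented purely to illustrate the arithmetic behind a figure of 0.50, not taken from the paper's CGE results.

```python
extra_revenue = 1.00   # one additional money unit collected
welfare_loss = 1.50    # assumed equivalent-variation loss caused by the change
meb = (welfare_loss - extra_revenue) / extra_revenue
print(f"marginal excess burden = {meb:.2f} per extra unit of revenue")  # 0.50
```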
Abstract:
We analyze the effects of uncertainty and private information on horizontal mergers. Firms face uncertain demands or costs and receive private signals. They may decide to merge, sharing their private information. If the uncertainty parameters are independent and the signals are perfect, uncertainty generates an informational advantage only for the merging firms, increasing merger incentives and decreasing free-riding effects. Thus, mergers become more profitable and stable. These results generalize to the case of correlated parameters if the correlation is not very severe, and to perfect correlation if the firms receive noisy signals. From the normative point of view, mergers are socially less harmful than in deterministic markets and may even be welfare enhancing. If the signals are instead publicly observed, uncertainty does not necessarily give more incentives to merge, and mergers are not always less socially harmful.
Abstract:
We use structural methods to assess equilibrium models of bidding with data from first-price auction experiments. We identify conditions to test the Nash equilibrium models for homogeneous and for heterogeneous constant relative risk aversion when bidders' private valuations are independent and uniformly drawn. The outcomes of our study indicate that behavior may have been affected by the procedure used to conduct the experiments, and that the usual Nash equilibrium model for heterogeneous constant relative risk averse bidders does not consistently explain the observed overbidding. From an empirical standpoint, our analysis shows the possible drawbacks of overlooking the homogeneity hypothesis when testing symmetric equilibrium models of bidding, and it puts into perspective the sensitivity of structural inferences to the available information.
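For reference, the homogeneous constant relative risk aversion benchmark behind such tests is explicit when valuations are i.i.d. uniform on [0, 1]: with n bidders and utility u(x) = x**rho, the symmetric Nash bid is b(v) = (n - 1) / (n - 1 + rho) * v, so rho = 1 is risk neutrality and rho < 1 produces overbidding. The sketch below simulates bids and recovers rho by least squares; it is a toy exercise, not the authors' structural estimator.

```python
import numpy as np

rng = np.random.default_rng(5)
n_bidders, rho_true = 4, 0.5                     # assumed experiment design
v = rng.uniform(size=1_000)                      # private valuations
slope = (n_bidders - 1) / (n_bidders - 1 + rho_true)
b = slope * v + rng.normal(0, 0.01, size=v.size) # observed bids (noisy)

slope_hat = (v @ b) / (v @ v)                    # OLS through the origin
rho_hat = (n_bidders - 1) * (1 - slope_hat) / slope_hat
print(f"estimated rho = {rho_hat:.3f} (true {rho_true})")
```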
Abstract:
The paper presents a foundation model for Marxian theories of the breakdown of capitalism based on a new falling-rate-of-profit mechanism. All of these theories are based on one or more of "the historical tendencies": a rising capital-wage bill ratio, a rising capitalist share and a falling rate of profit. The model is a foundation in the sense that it generates these tendencies in the context of a model with a constant subsistence wage. The newly discovered generating mechanism is based on neo-classical reasoning for a model with land. It is non-Ricardian in that land-augmenting technical progress can be unboundedly rapid. Finally, since the model has no steady state, it is necessary to use a new technique, Chaplygin's method, to prove the result.
Abstract:
Labour market reforms very often face opposition from employed workers, because such reforms normally reduce their wages. Product market regulations, likewise, are regularly biased towards benefiting firms too much. As a result, many frictions remain in both labour and product markets that hinder the optimal functioning of the economy. These issues have recently received a lot of attention in the economics literature, and scholars have been looking for politically viable reforms in both markets. However, despite its potential importance, virtually no research has been done on the interaction between reforms in product and labour markets. We find that when reforms are combined, opposition to them decreases considerably. This is because complementarities exist between the two, and the gains in total welfare can be distributed more evenly over the interest groups. Moreover, the interaction of reforms offers a way out of the so-called 'sclerosis' effect.
Abstract:
From the classical gold standard up to the current ERM2 arrangement of the European Union, target zones have been a widely used exchange rate regime in contemporary history. This paper presents a benchmark model that rationalizes the choice of target zones over the alternative regimes: the fixed rate, the free float and the managed float. It is shown that the monetary authority may gain efficiency by reducing the volatility of both the exchange rate and the interest rate at the same time. Furthermore, the model is consistent with some known stylized facts in the empirical literature that previous models were not able to produce, namely the positive relation between the exchange rate and the interest rate differential, the degree of non-linearity of the function linking the exchange rate to fundamentals, and the shape of the exchange rate's stochastic distribution.