980 results for Stochastic Frontier Models


Relevance: 80.00%

Abstract:

This study analyses productive efficiency and the effects of concentration on banking costs, focusing on the Portuguese banking industry. The multiproduct character of the banking firm suggests the use of multiproduct cost functional forms such as the Fourier. Introducing structure and homogeneity variables allows the multiproduct banking activity to be associated with a single-product functional form (Cobb-Douglas type). The sample covers 22 banks that operated in Portugal from 1995 to 2001, on a non-consolidated basis and with a panel data structure. Inefficiency is studied through the stochastic frontier approach (SFA) for the two specifications selected. To analyse concentration, we introduce binary variables intended to capture the effects over the four years following a concentration operation. For both the SFA and the concentration analysis, the results are sensitive to the functional specification adopted. Summing up, the concentration process in the banking industry seems to be justified by the possibility of reducing X-inefficiency.
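The abstract names the estimation method without detail; as a reference point, here is a minimal sketch of a normal/half-normal stochastic cost frontier estimated by maximum likelihood, the textbook form of the SFA the abstract invokes. Everything below is an illustrative assumption (simulated data, a simple Cobb-Douglas specification with one output and one input price), not the paper's data or exact model:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Simulated illustrative data: ln C = b0 + b1*ln y + b2*ln w + v + u
rng = np.random.default_rng(0)
n = 200
ln_y = rng.normal(4.0, 0.5, n)        # log output
ln_w = rng.normal(1.0, 0.2, n)        # log input price
v = rng.normal(0.0, 0.1, n)          # symmetric noise
u = np.abs(rng.normal(0.0, 0.2, n))  # one-sided cost inefficiency, u >= 0
ln_c = 0.5 + 0.8 * ln_y + 0.3 * ln_w + v + u

X = np.column_stack([np.ones(n), ln_y, ln_w])

def neg_loglik(theta):
    """Normal/half-normal cost frontier: eps = v + u, so the skew term uses +eps."""
    beta, ln_sv, ln_su = theta[:3], theta[3], theta[4]
    sv, su = np.exp(ln_sv), np.exp(ln_su)
    sigma = np.sqrt(sv**2 + su**2)
    lam = su / sv
    eps = ln_c - X @ beta
    ll = (np.log(2.0 / sigma) + norm.logpdf(eps / sigma)
          + norm.logcdf(eps * lam / sigma))   # '+' because costs lie above the frontier
    return -ll.sum()

res = minimize(neg_loglik, x0=np.array([0.0, 1.0, 0.0, -1.0, -1.0]), method="BFGS")
print("beta:", res.x[:3], "sigma_v, sigma_u:", np.exp(res.x[3:]))
```

For a cost frontier the composed error is ε = v + u with u ≥ 0, which is why the skewness term enters with a plus sign; a production frontier would use ε = v − u and Φ(−ελ/σ) instead.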

Relevance: 80.00%

Abstract:

Copyright © 2013 Springer Netherlands.

Relevance: 80.00%

Abstract:

Driven by the current pressure on resources induced by budgetary cuts, the Portuguese Ministry of Health has been imposing changes in the management model and organization of NHS hospitals. The most recent change is the creation of Hospital Centres, which result from administrative mergers of existing hospitals. In less than 10 years the number of hospitals fell from around 90 to around 50, exclusively through mergers and without any change in the number of existing physical institutions. According to the political discourse, one of the main goals of this measure is to create synergies and greater efficiency in the use of available resources. However, the merger of hospitals has been a political decision taken without support from, or evaluation of, the first experiments. The aim of this study is to measure the results of this policy by looking for economies of scale, namely through reductions in expenditure, as expected and sought by the Ministry of Health. The data cover 7 years (2003-2009) and 75 hospitals, a number that declined over the period owing to the numerous mergers already mentioned.
This work uses stochastic frontier analysis with a translog cost function to examine the gains from mergers, decomposed into technical efficiency and economies of scale. These effects were analysed for three specific hospital centres, using a longitudinal approach to compare the pre-merger period (2003-2006) with the post-merger period (2007-2009). To measure inpatient activity, hospital production volume and length of stay are considered, as in Vita (1990) and Schuffham et al. (1996). For outpatient services, the number of consultations and the number of emergency episodes are considered (Vita, 1990; Fournier and Mitchell, 1992; Carreira, 1999). Total variable cost, comprising hospitals' total annual costs except fixed assets, is the dependent variable explained by the variables above. Based on the literature, the expected results point to benefits from the mergers, namely a reduction in total expenditure and in the number of duplicated services. The results extracted from our data point in the same direction, and thus to the existence of some economies of scale, though only for small hospitals with more complementary services.
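For reference, a translog variable-cost frontier of the kind described takes the following generic form; the notation is mine (illustrative), with the y_k being the outputs listed above (inpatient volume, length of stay, consultations, emergencies):

```latex
\ln C_{it} = \alpha_0 + \sum_{k}\beta_{k}\ln y_{k,it}
  + \tfrac{1}{2}\sum_{k}\sum_{l}\beta_{kl}\,\ln y_{k,it}\,\ln y_{l,it}
  + v_{it} + u_{it}, \qquad u_{it}\ge 0,
```

with overall economies of scale typically read off the inverse of the sum of the cost-output elasticities,

```latex
SE_{it} = \Bigl(\sum_{k}\frac{\partial \ln C_{it}}{\partial \ln y_{k,it}}\Bigr)^{-1},
```

so that SE > 1 indicates unexploited scale economies, the quantity at stake in the merger evaluation.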

Relevance: 80.00%

Abstract:

A Work Project, presented as part of the requirements for the award of a Master's Degree in Economics from the NOVA – School of Business and Economics.

Relevance: 80.00%

Abstract:

A Work Project, presented as part of the requirements for the award of a Master's Degree in Economics from the NOVA – School of Business and Economics.

Relevance: 80.00%

Abstract:

This work evaluates the efficiency of the health system of each OECD country and identifies whether or not health systems changed in terms of quality and performance after the financial crisis. Health system performance was estimated with a fixed-effects estimator and with stochastic frontier analysis. The results suggest that many of the countries that the crisis affected most are more efficient than the OECD average; some of them even reached the top decile of the efficiency ranking. Finally, we analyze the stochastic frontier efficiency scores together with other health indicators to evaluate the health systems' overall adjustments arising from the crisis.

Relevance: 80.00%

Abstract:

In a series of papers (Tang, Chin and Rao, 2008; Tang, Petrie and Rao, 2006 & 2007), we have tried to improve on a mortality-based health status indicator, namely age-at-death (AAD), and its associated health inequality indicators that measure the distribution of AAD. The main contribution of these papers is to propose a frontier method to separate avoidable and unavoidable mortality risks. This has facilitated the development of a new indicator of health status, the Realization of Potential Life Years (RePLY). The RePLY measure is based on the concept of a "frontier country" that, by construction, has the lowest mortality risks for each age-sex group among all countries. The mortality rates of the frontier country are used as a proxy for the unavoidable mortality rates, and the residual between the observed and the unavoidable mortality rates is considered the avoidable mortality rate. In this approach, however, countries at different levels of development are benchmarked against the same frontier country without considering their heterogeneity. The main objective of the current paper is to control for national resources in estimating (conditional) unavoidable and avoidable mortality risks for individual countries. This allows us to construct a new indicator of health status – the Realization of Conditional Potential Life Years (RCPLY). The paper presents empirical results from a dataset of life tables for 167 countries for the year 2000, compiled and updated by the World Health Organization. Measures of national average health status and health inequality based on RePLY and RCPLY are presented and compared.
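The frontier construction is simple to state in code: for every age-sex group, take the lowest mortality risk observed across countries as the unavoidable component and treat any excess as avoidable. A minimal sketch on made-up numbers (not WHO life tables) follows; the resource-conditional frontier behind RCPLY would replace the raw column minimum with a frontier estimated conditional on national resources, which this sketch does not attempt:

```python
import numpy as np

# Rows: countries; columns: age-sex groups. Toy mortality risks, not WHO data.
rates = np.array([
    [0.004, 0.010, 0.030],   # country A
    [0.006, 0.012, 0.045],   # country B
    [0.009, 0.020, 0.080],   # country C
])

# "Frontier country": lowest observed risk in each age-sex group across all countries.
unavoidable = rates.min(axis=0)

# Avoidable risk = observed risk minus the frontier (unavoidable) risk.
avoidable = rates - unavoidable

print("unavoidable:", unavoidable)
print("avoidable:\n", avoidable)
```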

Relevance: 80.00%

Abstract:

Population viability analyses (PVA) are increasingly used in metapopulation conservation plans. Two major types of models are commonly used to assess vulnerability and to rank management options: population-based stochastic simulation models (PSM, such as RAMAS or VORTEX) and stochastic patch occupancy models (SPOM). While the former rely on explicit intrapatch dynamics and interpatch dispersal to predict population levels in space and time, the latter are based on spatially explicit metapopulation theory, where the probability of patch occupancy is predicted from patch area and isolation (patch topology). We applied both approaches to a European tree frog (Hyla arborea) metapopulation in western Switzerland in order to evaluate the concordance of the two models and their applications to conservation. Although some quantitative discrepancies appeared in terms of network occupancy and equilibrium population size, the two approaches were largely concordant regarding the ranking of patch values and sensitivities to parameters, which is encouraging given the differences in the underlying paradigms and input data.
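A stochastic patch occupancy model can be sketched in a few lines. The version below follows the common incidence-function formulation (colonization probability increasing in connectivity, extinction probability decreasing in patch area); the patch network and all parameter values are invented for illustration and are not the Hyla arborea data or the exact SPOM used in the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy patch network (areas and coordinates invented), not the Hyla arborea network.
n = 5
area = np.array([2.0, 0.5, 1.2, 3.1, 0.8])
xy = rng.uniform(0.0, 10.0, size=(n, 2))
dist = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)

alpha, e, x, y = 1.0, 0.15, 0.5, 1.0  # dispersal range, extinction scale, area exponent, colonization scale
occupied = np.ones(n, dtype=bool)     # start with all patches occupied

for t in range(100):
    # Connectivity S_i: area-weighted, distance-discounted sum over occupied patches j != i.
    kernel = np.exp(-alpha * dist) * area
    np.fill_diagonal(kernel, 0.0)
    S = kernel @ occupied.astype(float)

    col = S**2 / (S**2 + y**2)                 # colonization probability (incidence function)
    ext = np.minimum(1.0, e / area**x)         # extinction probability shrinks with patch area

    u = rng.random(n)
    occupied = np.where(occupied, u > ext, u < col)

print("final occupancy pattern:", occupied.astype(int))
```

Averaging occupancy over many such runs gives the per-patch occupancy probabilities that a SPOM uses to rank patches, which is the comparison the study makes against PSM output.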

Relevance: 80.00%

Abstract:

We discuss some practical issues related to the use of the Parameterized Expectations Approach (PEA) for solving non-linear stochastic dynamic models with rational expectations. This approach has been applied in models of macroeconomics, financial economics, economic growth, contract theory, etc. It turns out to be a convenient algorithm, especially when there is a large number of state variables and stochastic shocks in the conditional expectations. We discuss practical issues arising in the application of the algorithm, together with a Fortran program implementing it that is available through the internet. We examine these issues in a battery of six examples.
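For readers who have not seen the PEA in action, here is a compact sketch for the standard stochastic growth model, the canonical test case: the conditional expectation in the Euler equation is parameterized as an exponentiated polynomial in the states, the model is simulated, and the coefficients are updated by regression until a fixed point is reached. The model, parameter values and damping scheme are illustrative assumptions, not taken from the paper (whose own implementation is the Fortran program it mentions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stochastic growth model (illustrative parameter values).
alpha, beta, gamma, delta = 0.33, 0.95, 1.0, 0.10
rho, sigma = 0.90, 0.01
T = 5000

# Technology shock: ln(theta_t) follows an AR(1).
ln_th = np.zeros(T)
for t in range(1, T):
    ln_th[t] = rho * ln_th[t - 1] + sigma * rng.normal()
theta = np.exp(ln_th)

# Parameterized expectation: E_t[...] ~ exp(psi0 + psi1*ln k_t + psi2*ln theta_t).
psi = np.zeros(3)
k_ss = (alpha * beta / (1.0 - beta * (1.0 - delta))) ** (1.0 / (1.0 - alpha))

for it in range(500):
    k = np.empty(T + 1)
    k[0] = k_ss
    c = np.empty(T)
    for t in range(T):
        e = np.exp(psi[0] + psi[1] * np.log(k[t]) + psi[2] * ln_th[t])
        resources = theta[t] * k[t] ** alpha + (1.0 - delta) * k[t]
        c[t] = min((beta * e) ** (-1.0 / gamma), 0.99 * resources)  # Euler choice, kept feasible
        k[t + 1] = resources - c[t]
    # Realized value of the term inside the conditional expectation.
    z = c[1:] ** (-gamma) * (alpha * theta[1:] * k[1:T] ** (alpha - 1.0) + 1.0 - delta)
    X = np.column_stack([np.ones(T - 1), np.log(k[:T - 1]), ln_th[:T - 1]])
    psi_new, *_ = np.linalg.lstsq(X, np.log(z), rcond=None)
    if np.max(np.abs(psi_new - psi)) < 1e-6:
        break
    psi = 0.5 * psi + 0.5 * psi_new  # damped fixed-point update

print("fixed-point coefficients psi:", psi)
```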

Relevance: 80.00%

Abstract:

This paper estimates a translog stochastic frontier production function for a panel of 150 mixed Catalan farms over the period 1989-1993, in order to measure and explain variation in technical inefficiency scores with a one-stage approach. The model uses gross value added as the aggregate output measure. Total employment, fixed capital, current assets, specific costs and overhead costs enter the model as inputs. Stochastic frontier estimates are compared with those obtained with a linear programming method using a two-stage approach. The translog stochastic frontier specification appears to be an appropriate representation of the data: technical change was rejected and the technical inefficiency effects were statistically significant. Mean technical efficiency over the period analysed was estimated at 64.0%. Farm inefficiency levels were found to be significantly (at the 5% level) and positively correlated with the number of economic size units.
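The one-stage approach mentioned here typically means estimating the frontier and the inefficiency determinants jointly, in the spirit of Battese and Coelli (1995); in generic notation (mine, not the paper's):

```latex
% Production frontier with composed error (v symmetric noise, u >= 0 inefficiency):
y_{it} = f(x_{it};\beta)\,\exp(v_{it} - u_{it}),
% Inefficiency modelled jointly with the frontier (one-stage approach):
u_{it} = z_{it}'\delta + w_{it} \;\ge\; 0,
% Technical efficiency score reported for each farm-year:
TE_{it} = \exp(-u_{it}) \in (0,1].
```

A mean score of 64.0% then reads as output being, on average, 36% below the frontier attainable with the same inputs.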

Relevance: 80.00%

Abstract:

This article investigates the main sources of heterogeneity in regional efficiency. We estimate a translog stochastic frontier production function for the Spanish regions over the period 1964-1996, in order to measure and explain changes in technical efficiency. Our results confirm that regional inefficiency is significantly and positively correlated with the ratio of public capital to private capital. The proportion of service industries in private capital, the proportion of public capital devoted to transport infrastructure, industrial specialization, and spatial spillovers from transport infrastructure in neighbouring regions all contributed significantly to improving regional efficiency.

Relevance: 80.00%

Abstract:

We show that the Heston volatility or equivalently the Cox-Ingersoll-Ross process is Malliavin differentiable and give an explicit expression for the derivative. This result assures the applicability of Malliavin calculus in the framework of the Heston stochastic volatility model and the Cox-Ingersoll-Ross model for interest rates.
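For concreteness, the process in question is the square-root diffusion, written here in its standard form (standard notation, not quoted from the paper):

```latex
% CIR short-rate / Heston variance process:
dv_t = \kappa(\theta - v_t)\,dt + \sigma\sqrt{v_t}\,dW_t, \qquad v_0 > 0.
```

The paper's claim is that v_t, viewed as a functional of the Brownian motion W, is Malliavin differentiable, with an explicit expression for the derivative; the square-root coefficient fails the Lipschitz conditions behind the standard differentiability criteria, which is what makes the result non-trivial.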

Relevance: 80.00%

Abstract:

Extreme-times techniques, generally applied to nonequilibrium statistical mechanical processes, are also useful for a better understanding of financial markets. We present a detailed study of the mean first-passage time for the volatility of return time series. The empirical results extracted from daily data of major indices seem to follow the same law regardless of the index, thus suggesting a universal pattern. The empirical mean first-passage time to a certain level L differs markedly from that of the Wiener process, showing dissimilar behavior depending on whether L is higher or lower than the average volatility. All of this indicates more complex dynamics, in which a reverting force drives volatility toward its mean value. We therefore present the mean first-passage time expressions of the most common stochastic volatility models whose approach is comparable to the random diffusion description. We discuss asymptotic approximations of these models and confront them with the empirical results, finding good agreement with the exponential Ornstein-Uhlenbeck model.
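The quantity studied, the mean first-passage time of volatility to a level L, is easy to estimate by Monte Carlo under the exponential Ornstein-Uhlenbeck model that the abstract singles out. All parameter values below are invented for illustration and are not fitted to any index:

```python
import numpy as np

rng = np.random.default_rng(2)

# Exponential Ornstein-Uhlenbeck volatility: sigma_t = m * exp(Y_t), dY = -a*Y dt + k dW.
a, k, m = 0.01, 0.10, 0.15      # mean reversion rate, noise amplitude, volatility scale
dt = 1.0                         # one trading day
L = 0.25                         # target volatility level (above the average scale m)
n_paths, max_steps = 500, 50000

fpt = np.full(n_paths, np.nan)
for p in range(n_paths):
    y = 0.0                      # start Y at its stationary mean
    for t in range(1, max_steps):
        y += -a * y * dt + k * np.sqrt(dt) * rng.normal()
        if m * np.exp(y) >= L:   # first crossing of the volatility level L
            fpt[p] = t * dt
            break

print(f"estimated MFPT to L={L}: {np.nanmean(fpt):.1f} days "
      f"({np.isnan(fpt).sum()} paths never crossed)")
```

Scanning L above and below m reproduces the kind of asymmetry the abstract describes, in contrast with the symmetric behavior of the Wiener process.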

Relevance: 80.00%

Abstract:

Preface. The starting point for this work, and eventually the subject of the whole thesis, was the question of how to estimate the parameters of affine stochastic volatility jump-diffusion models. These models are very important for contingent claim pricing: their major advantage, the availability of analytical solutions for characteristic functions, has made them the models of choice for many theoretical constructions and practical applications. At the same time, estimating the parameters of stochastic volatility jump-diffusion models is not a straightforward task, since the variance process is not observable. Several estimation methodologies deal with the estimation of latent variables. One appeared particularly interesting: it proposes an estimator that, in contrast to the other methods, requires neither discretization nor simulation of the process, namely the continuous empirical characteristic function (ECF) estimator based on the unconditional characteristic function. However, the procedure had been derived only for stochastic volatility models without jumps, and so it became the subject of my research.

This thesis consists of three parts, each written as an independent and self-contained article. At the same time, the questions answered by the second and third parts arise naturally from the issues investigated and the results obtained in the first.

The first chapter is the theoretical foundation of the thesis. It proposes an estimation procedure for stochastic volatility models with jumps in both the asset price and variance processes, based on the joint unconditional characteristic function of the stochastic process. The major analytical result of this part, and of the whole thesis, is a closed-form expression for the joint unconditional characteristic function of stochastic volatility jump-diffusion models. The empirical part of the chapter suggests that, besides stochastic volatility, jumps in both the mean and the volatility equation are relevant for modelling returns of the S&P500 index, chosen as a general representative of the stock asset class.

Hence, the next question is which jump process to use to model S&P500 returns. The decision about the jump process in the framework of affine jump-diffusion models boils down to defining the intensity of the compound Poisson process, a constant or some function of the state variables, and to choosing the distribution of the jump size. While the jump in the variance process is usually assumed to be exponential, at least three distributions of the jump size are currently used for the asset log-prices: normal, exponential and double exponential. The second part of this thesis shows that normal jumps in the asset log-returns should be used if the S&P500 index is to be modelled by a stochastic volatility jump-diffusion model. This is a surprising result: the exponential distribution has fatter tails, and for this reason either an exponential or a double-exponential jump size was expected to provide the best fit of the stochastic volatility jump-diffusion models to the data.

The idea of testing the efficiency of the continuous ECF estimator on simulated data had already appeared when the first estimation results of the first chapter were obtained: in the absence of a benchmark or any ground for comparison, there is no reason to be sure that our parameter estimates coincide with the true parameters of the models. The conclusion of the second chapter provides one more reason for such a test. The third part of this thesis therefore concentrates on estimating the parameters of stochastic volatility jump-diffusion models from asset price time series simulated from various "true" parameter sets. The goal is to show that the continuous ECF estimator based on the joint unconditional characteristic function is capable of recovering the true parameters, and the third chapter shows that it indeed is.

Once it is clear that the continuous ECF estimator based on the unconditional characteristic function works, the next question naturally arises: can the computational effort be reduced without affecting the efficiency of the estimator, or can the efficiency be improved without dramatically increasing the computational burden? The efficiency of the continuous ECF estimator depends on the number of dimensions of the joint unconditional characteristic function used in its construction. Theoretically, the more dimensions there are, the more efficient the estimation procedure; in practice, however, this relationship is not so straightforward, owing to increasing computational difficulties. The second chapter, for example, in addition to the choice of the jump process, discusses the possibility of using the marginal, i.e. one-dimensional, unconditional characteristic function in the estimation instead of the joint, bi-dimensional, one. As a result, the preference for one or the other depends on the model to be estimated; the computational effort can thus be reduced in some cases without affecting the efficiency of the estimator. Improving the estimator's efficiency by increasing its dimensionality faces more difficulties. The third chapter, in addition to what was discussed above, compares the performance of estimators with bi- and three-dimensional unconditional characteristic functions on simulated data. It shows that the theoretical efficiency of the continuous ECF estimator based on the three-dimensional unconditional characteristic function is not attainable in practice, at least for the moment, owing to limitations of the computing power and optimization toolboxes available to the general public. Thus, the continuous ECF estimator based on the joint, bi-dimensional, unconditional characteristic function has every reason to exist and to be used for estimating the parameters of stochastic volatility jump-diffusion models.
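The estimation principle behind the continuous ECF estimator, minimizing an integrated weighted distance between the empirical and the model characteristic functions, can be shown on a toy example. The sketch below fits a plain Gaussian, not the thesis's stochastic volatility jump-diffusion model (whose joint unconditional characteristic function is the closed-form object derived in the first chapter); the grid, weight function and all numbers are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Toy data: a Gaussian sample standing in for the (far richer) SVJD case.
x = rng.normal(1.0, 2.0, size=1000)

t_grid = np.linspace(-2.0, 2.0, 81)   # grid for the characteristic function argument
dt = t_grid[1] - t_grid[0]
w = np.exp(-t_grid**2)                # weight damping the tails of the integrand

# Empirical characteristic function evaluated on the grid.
ecf = np.exp(1j * np.outer(t_grid, x)).mean(axis=1)

def objective(params):
    mu, log_sig = params
    sig = np.exp(log_sig)
    model_cf = np.exp(1j * t_grid * mu - 0.5 * (sig * t_grid) ** 2)  # Gaussian CF
    # Discretized integrated weighted squared distance |ecf - cf|^2.
    return np.sum(np.abs(ecf - model_cf) ** 2 * w) * dt

res = minimize(objective, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
print("estimated mu, sigma:", res.x[0], np.exp(res.x[1]))
```

In the thesis the same idea is applied to the joint (bi- or three-dimensional) unconditional characteristic function, so the grid and the integral become multi-dimensional, which is exactly where the computational burden discussed in the third chapter comes from.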