854 results for mean-variance frontiers


Relevance: 80.00%

Abstract:

Despite the large size of the Brazilian debt market, as well as the wide diversity of its bonds, the picture that emerges is of a market that has not yet completed its transition from the role it performed during the megainflation years, namely providing a liquid asset with positive real returns. This unfinished transition is currently placing the market under severe stress, as fears of a possible default by the next administration grow larger. This paper analyzes several aspects of the management of the domestic public debt. The causes of the extremely large and fast growth of the domestic public debt during President Cardoso's seven years in office are discussed in Section 2. Section 3 computes Value at Risk and Cash Flow at Risk measures for the domestic public debt. Rollover risk is introduced into a mean-variance framework in Section 4. Section 5 discusses a few issues concerning the overlap between debt management and monetary policy. Finally, Section 6 wraps up with policy discussion and recommendations.
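The Value at Risk computation mentioned in Section 3 can be illustrated with a minimal historical-simulation sketch (the returns series below is simulated, not the paper's debt data):

```python
import numpy as np

def historical_var(returns, alpha=0.95):
    """One-period historical Value at Risk: the loss threshold
    exceeded with probability (1 - alpha)."""
    losses = -np.asarray(returns, dtype=float)
    return float(np.quantile(losses, alpha))

# Hypothetical daily returns of a debt portfolio.
rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.01, size=1000)
var_95 = historical_var(returns, 0.95)
```

Cash Flow at Risk follows the same quantile logic, applied to projected cash flows rather than returns.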

Relevance: 80.00%

Abstract:

This work analyses the performance of regularised portfolio-optimisation models using financial assets from the Brazilian market. In particular, we regularise the portfolios by constraining the norm of the asset weights, as in DeMiguel et al. (2009). Additionally, we analyse the performance of portfolios that incorporate information about the structure of groups of assets with similar characteristics, as proposed by Fernandes, Rocha and Souza (2011). While the covariance matrix used in the analyses is estimated from sample data, the expected returns are obtained through reverse optimisation of the market equilibrium portfolio proposed by Black and Litterman (1992). The out-of-sample empirical analysis for the period between January 2010 and October 2014 indicates that, in line with earlier studies, penalising the norms of the weights can (depending on the chosen norm and the strength of the constraint) deliver better performance in terms of Sharpe ratio and mean return than portfolios obtained with the traditional Markowitz model. Moreover, including information about the asset groups can also benefit the computation of optimal portfolios, both relative to traditional methods and relative to the cases that do not use the group structure.
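A minimal sketch of the norm-constrained minimum-variance idea of DeMiguel et al. (2009), here with a toy two-asset covariance matrix rather than Brazilian market data:

```python
import numpy as np
from scipy.optimize import minimize

def norm_constrained_minvar(cov, delta=1.5):
    """Minimum-variance weights subject to an L1-norm budget ||w||_1 <= delta,
    in the spirit of DeMiguel et al. (2009). delta = 1 forbids short sales."""
    n = cov.shape[0]
    cons = [
        {"type": "eq", "fun": lambda w: np.sum(w) - 1.0},       # full investment
        {"type": "ineq", "fun": lambda w: delta - np.abs(w).sum()},  # norm budget
    ]
    w0 = np.full(n, 1.0 / n)
    res = minimize(lambda w: w @ cov @ w, w0, constraints=cons)
    return res.x

# Illustrative covariance matrix (not estimated from market data).
cov = np.array([[0.04, 0.01], [0.01, 0.09]])
w = norm_constrained_minvar(cov, delta=1.5)
```

Tightening `delta` toward 1 shrinks short positions to zero, which is what regularises the portfolio.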

Relevance: 80.00%

Abstract:

Portfolio theory is a field of study devoted to investigating how investors allocate resources. The purpose of this process is to reduce risk through diversification and thus secure a return. Nevertheless, the classical mean-variance (MV) model has been criticized over its parameters: the variance and covariance estimates are sensitive to the market and to parameter-estimation error. To reduce estimation errors, Bayesian models offer more flexible modeling, able to incorporate quantitative and qualitative information about the behavior of the market. With this in mind, the present study formulates a new matrix model that uses Bayesian inference to replace the covariance matrix in the MV model, called MCB (Bayesian Covariance model). To evaluate the model, several hypotheses were analyzed using an ex post facto design and sensitivity analysis. The benchmarks used as references were: (1) the classical mean-variance model, (2) the Bovespa market index, and (3) 94 investment funds. The returns earned over the period May 2002 to December 2009 demonstrate the superiority of the MCB over the classical MV model and the Bovespa Index, albeit at the cost of slightly more diversifiable risk than the MV. A robustness analysis of the model over the time horizon found returns close to the Bovespa index while taking less risk than the market. Finally, with respect to Mao's index, the model performed satisfactorily in both return and risk, especially at longer maturities. Closing considerations and suggestions for further work are offered.

Relevance: 80.00%

Abstract:

In this paper, the optimal reactive power planning problem under risk is presented. The classical mixed-integer nonlinear model for reactive power planning is expanded into a two-stage stochastic model that accounts for risk. The new model considers uncertainty in the demand load. Risk is quantified by a factor introduced into the objective function and is identified with the variance of the random variables. Finally, numerical results illustrate the performance of the proposed model, which is applied to the IEEE 30-bus test system to determine the optimal amount and location of reactive power expansion.
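The mean-plus-variance objective can be sketched with a toy scenario model; this is a simplified stand-in for the paper's mixed-integer nonlinear formulation, and all quantities (installation cost, shortfall penalty, demand scenarios) are illustrative:

```python
import numpy as np

def mean_variance_cost(x, demands, install_cost=1.0, penalty=5.0, beta=0.5):
    """Two-stage objective: first-stage cost of installing x units of
    support capacity, plus the mean and variance (risk factor beta) of the
    second-stage shortfall cost across demand scenarios."""
    shortfall = penalty * np.maximum(demands - x, 0.0)
    return install_cost * x + shortfall.mean() + beta * shortfall.var()

demands = np.array([0.8, 1.0, 1.2, 1.5])   # hypothetical load scenarios
grid = np.linspace(0.0, 2.0, 201)
best_x = grid[np.argmin([mean_variance_cost(x, demands) for x in grid])]
```

Raising `beta` penalises dispersion of the scenario costs, pushing the solution toward covering the worst-case demand.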

Relevance: 80.00%

Abstract:

We propose a novel class of models for functional data exhibiting skewness or other shape characteristics that vary with spatial or temporal location. We use copulas so that the marginal distributions and the dependence structure can be modeled independently. Dependence is modeled with a Gaussian or t-copula, so that there is an underlying latent Gaussian process. We model the marginal distributions using the skew t family. The mean, variance, and shape parameters are modeled nonparametrically as functions of location. A computationally tractable inferential framework for estimating heterogeneous asymmetric or heavy-tailed marginal distributions is introduced. This framework provides a new set of tools for increasingly complex data collected in medical and public health studies. Our methods were motivated by and are illustrated with a state-of-the-art study of neuronal tracts in multiple sclerosis patients and healthy controls. Using the tools we have developed, we were able to find those locations along the tract most affected by the disease. However, our methods are general and highly relevant to many functional data sets. In addition to the application to one-dimensional tract profiles illustrated here, higher-dimensional extensions of the methodology could have direct applications to other biological data including functional and structural MRI.
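The copula construction can be sketched as follows, using a Gaussian copula with skew-normal marginals as a simplified stand-in for the paper's Gaussian/t-copula with skew-t marginals:

```python
import numpy as np
from scipy import stats

def gaussian_copula_skewnorm(corr, skew, size, rng):
    """Sample vectors with Gaussian-copula dependence and skew-normal
    marginals: the dependence structure and the marginal shapes are
    specified independently, as in the copula framework."""
    L = np.linalg.cholesky(corr)
    z = rng.standard_normal((size, corr.shape[0])) @ L.T   # latent Gaussian process
    u = stats.norm.cdf(z)                                  # Gaussian -> uniforms
    return stats.skewnorm.ppf(u, a=skew)                   # uniforms -> skewed marginals

corr = np.array([[1.0, 0.7], [0.7, 1.0]])
rng = np.random.default_rng(1)
x = gaussian_copula_skewnorm(corr, skew=4.0, size=2000, rng=rng)
```

In the paper the shape parameters additionally vary smoothly with location along the tract; here they are held fixed for brevity.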

Relevance: 80.00%

Abstract:

In this paper, we show statistical analyses of several types of traffic sources in a 3G network, namely voice, video and data sources. For each traffic source type, measurements were collected in order to, on the one hand, gain better understanding of the statistical characteristics of the sources and, on the other hand, enable forecasting traffic behaviour in the network. The latter can be used to estimate service times and quality of service parameters. The probability density function, mean, variance, mean square deviation, skewness and kurtosis of the interarrival times are estimated by Wolfram Mathematica and Crystal Ball statistical tools. Based on evaluation of packet interarrival times, we show how the gamma distribution can be used in network simulations and in evaluation of available capacity in opportunistic systems. As a result, from our analyses, shape and scale parameters of gamma distribution are generated. Data can be applied also in dynamic network configuration in order to avoid potential network congestions or overflows. Copyright © 2013 John Wiley & Sons, Ltd.
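Fitting a gamma distribution to interarrival times can be sketched with simulated data (the paper uses measured 3G traffic and commercial statistical tools instead):

```python
import numpy as np
from scipy import stats

# Hypothetical packet interarrival times (seconds).
rng = np.random.default_rng(42)
samples = rng.gamma(shape=2.0, scale=0.05, size=5000)

# Method-of-moments estimates: mean = k*theta, variance = k*theta^2.
m, v = samples.mean(), samples.var()
k_hat, theta_hat = m * m / v, v / m

# Maximum-likelihood fit for comparison (location fixed at 0).
k_ml, _, theta_ml = stats.gamma.fit(samples, floc=0)
```

The estimated shape and scale parameters are what would feed a network simulation or a capacity evaluation.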

Relevance: 80.00%

Abstract:

The purpose of this study is to investigate the effects of predictor-variable correlations and patterns of missingness with dichotomous and/or continuous data in small samples when missing data are multiply imputed. Missing predictor data are multiply imputed under three different multivariate models: the multivariate normal model for continuous data, the multinomial model for dichotomous data, and the general location model for mixed dichotomous and continuous data. Following the multiple-imputation process, Type I error rates of the regression coefficients obtained with logistic regression analysis are estimated under various conditions of correlation structure, sample size, type of data, and pattern of missing data. The distributional properties of the average mean, variance, and correlations among the predictor variables are assessed after the multiple-imputation process. For continuous predictor data under the multivariate normal model, Type I error rates are generally within the nominal values with samples of size n = 100. Smaller samples of size n = 50 result in more conservative estimates (i.e., lower than the nominal value). Correlation and variance estimates of the original data are retained after multiple imputation with less than 50% missing continuous predictor data. For dichotomous predictor data under the multinomial model, Type I error rates are generally conservative, which is due in part to the sparseness of the data. The correlation structure of the predictor variables is not well retained in multiply imputed data from small samples with more than 50% missing data under this model. For mixed continuous and dichotomous predictor data, the results are similar to those found under the multivariate normal model for continuous data and under the multinomial model for dichotomous data.
Across all data types, including a fully observed variable alongside the variables subject to missingness in the multiple-imputation process and subsequent statistical analysis produced liberal (larger than nominal) Type I error rates under a specific pattern of missing data. It is suggested that future studies focus on the effects of multiple imputation in multivariate settings with more realistic data characteristics and a variety of multivariate analyses, assessing both Type I error and power.
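The multiple-imputation step can be sketched for a single continuous predictor; this is a simplified stand-in for the multivariate normal, multinomial, and general location models used in the study:

```python
import numpy as np

def multiply_impute(x, m=5, rng=None):
    """Draw m completed copies of x, filling NaNs with draws from a normal
    distribution fitted to the observed values (a toy imputation model)."""
    rng = rng or np.random.default_rng(0)
    obs = x[~np.isnan(x)]
    mu, sigma = obs.mean(), obs.std(ddof=1)
    completed = []
    for _ in range(m):
        xi = x.copy()
        xi[np.isnan(xi)] = rng.normal(mu, sigma, size=np.isnan(x).sum())
        completed.append(xi)
    return completed

x = np.array([1.0, 2.0, np.nan, 4.0, np.nan, 3.0])
imputations = multiply_impute(x, m=5)
pooled_mean = np.mean([xi.mean() for xi in imputations])  # Rubin's rule for the mean
```

A proper multiple-imputation procedure would also draw the imputation-model parameters from their posterior, so that between-imputation variance reflects parameter uncertainty.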

Relevance: 80.00%

Abstract:

The study investigates the role of credit risk in a continuous-time stochastic asset allocation model, since the traditional dynamic framework does not provide credit-risk flexibility. The general model of the study extends the traditional dynamic efficiency framework by explicitly deriving the optimal value function for the infinite-horizon stochastic control problem via a weighted volatility measure of market and credit risk. The model's optimal strategy is then compared with that obtained from a benchmark Markowitz-type dynamic optimization framework, to determine which specification adequately reflects the optimal terminal investment returns and strategy under credit and market risks. The paper shows that an investor's optimal terminal return is lower than typically indicated under the traditional mean-variance framework during periods of elevated credit risk. Hence I conclude that, while the traditional dynamic mean-variance approach may indicate the ideal, in the presence of credit risk it does not accurately reflect the observed optimal returns, terminal wealth, and portfolio selection strategies.

Relevance: 80.00%

Abstract:

The study of the reliability of components and systems is of great importance in several fields of engineering, and very particularly in computing. When analysing the lifetimes of the elements of a sample, one must take into account the elements that do not fail during the experiment, as well as those that fail for reasons other than the one under study. New types of sampling have therefore arisen to cover these cases. The most general of them, censored sampling, is the one we consider in this work: both the time until the component fails and the censoring time are random variables. Under the hypothesis that both times are exponentially distributed, Professor Hurt studied the asymptotic behaviour of the maximum-likelihood estimator of the reliability function. Bayesian methods seem attractive for reliability analysis because they incorporate the prior information that is normally available in real problems. We therefore consider two Bayes estimators of the reliability of an exponential distribution: the mean and the mode of the posterior distribution. We have calculated the asymptotic expansion of the mean, variance and mean square error of both estimators when the censoring distribution is exponential, and we have also obtained the asymptotic distribution of the estimators for the more general case of a Weibull censoring distribution. Two types of large-sample confidence intervals have been proposed for each estimator. The results have been compared with those of the maximum-likelihood estimator and with those of two non-parametric estimators, the product-limit and a Bayesian one; one of our estimators shows superior behaviour.
Finally, we have verified by simulation that our estimators are robust against the assumed censoring distribution, and that one of the proposed confidence intervals remains valid with small samples. This study has also confirmed the better behaviour of one of our estimators. SETTING OUT AND SUMMARY OF THE THESIS: When we study the lifetime of components it is necessary to take into account the elements that do not fail during the experiment, or those that fail for reasons which it is desirable to exclude from consideration. The model of random censorship is very useful for analysing these data. In this model the time to failure and the censoring time are random variables. We obtain two Bayes estimators of the reliability function of an exponential distribution based on randomly censored data. We have calculated the asymptotic expansion of the mean, variance and mean square error of both estimators when the censoring distribution is exponential. We have also obtained the asymptotic distribution of the estimators for the more general case of a Weibull censoring distribution. Two large-sample confidence bands have been proposed for each estimator. The results have been compared with those of the maximum-likelihood estimator, and with those of two non-parametric estimators: product-limit and Bayesian. One of our estimators has the best behaviour. Finally, we have shown by simulation that our estimators are robust against the assumed censoring distribution, and that one of our intervals does well in small-sample situations.
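For the exponential model with a conjugate Gamma(a, b) prior on the failure rate, the posterior mean of the reliability function has a closed form; the sketch below (with illustrative data and prior hyperparameters) computes it for randomly censored observations:

```python
import numpy as np

def bayes_reliability(times, censored, t, a=1.0, b=1.0):
    """Posterior mean of R(t) = exp(-lambda * t) for exponential lifetimes
    under random censoring, with a Gamma(a, b) prior on the failure rate.
    censored[i] is True when observation i was censored (no failure seen).
    Posterior is Gamma(a + d, b + T), so E[exp(-lambda t) | data] has a
    closed form via the gamma moment generating function."""
    times = np.asarray(times, dtype=float)
    d = np.count_nonzero(~np.asarray(censored))   # observed failures
    T = times.sum()                               # total time on test
    return ((b + T) / (b + T + t)) ** (a + d)

# Illustrative censored sample, not data from the thesis.
times = [1.2, 0.7, 2.5, 0.4, 1.9]
censored = [False, False, True, False, True]
r = bayes_reliability(times, censored, t=1.0)
```

The maximum-likelihood counterpart would plug the point estimate d/T into exp(-lambda*t); the Bayes estimate shrinks toward the prior when data are scarce.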

Relevance: 80.00%

Abstract:

In this work, an optimal policy based on mean-variance analysis is derived for the multi-period tracking error (ERM), defined as the difference between the capital accumulated by the chosen portfolio and that accumulated by a benchmark portfolio. The methodology of Li-Ng in [24] was applied to obtain the analytical solution, yielding a generalisation of the single-period case introduced by Roll in [38]. A portfolio from the Brazilian stock market was then selected based on the correlation factor, adopting the São Paulo stock exchange index IBOVESPA as the benchmark and the SELIC base interest rate as the fixed-income asset. Two cases were considered: a portfolio composed only of risky assets (case I), and a portfolio with a risk-free asset indexed to the SELIC together with the assets of case I (case II).
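The single-period tracking-error problem of Roll that this work generalises can be sketched numerically (illustrative inputs, not the IBOVESPA/SELIC data):

```python
import numpy as np
from scipy.optimize import minimize

def tracking_error_portfolio(mu, cov, w_bench, gain=0.02):
    """Single-period tracking-error problem (Roll 1992): choose w to
    minimise the variance of the active weights w - w_bench, subject to a
    target expected active return `gain` and full investment."""
    cons = [
        {"type": "eq", "fun": lambda w: w.sum() - 1.0},
        {"type": "eq", "fun": lambda w: (w - w_bench) @ mu - gain},
    ]
    obj = lambda w: (w - w_bench) @ cov @ (w - w_bench)
    res = minimize(obj, w_bench.copy(), constraints=cons)
    return res.x

# Hypothetical expected returns, covariances, and benchmark weights.
mu = np.array([0.10, 0.12, 0.08])
cov = np.diag([0.04, 0.09, 0.02])
w_bench = np.array([1 / 3, 1 / 3, 1 / 3])
w = tracking_error_portfolio(mu, cov, w_bench)
```

The multi-period ERM version replaces this one-shot variance with the variance of cumulative wealth differences over the horizon.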

Relevance: 80.00%

Abstract:

The length of the holiday trip is a tourist decision with fundamental implications for tourism organisations, yet it has received little attention in the literature, and the few existing studies have focused on coastal destinations even though inland tourism is emerging as an important alternative in some countries. This paper analyses the determinants of the length of the tourist trip, distinguishing between the types of destination chosen (coastal and inland) and proposing several hypotheses about the influence of destination-related individual characteristics, personal constraints, and sociodemographic characteristics. As a novelty for this type of decision, the methodology estimates a truncated negative binomial model, which avoids the estimation biases of regression models and the restrictive mean-variance equality assumption of the Poisson model. The empirical application, carried out in Spain on a sample of 1,600 individuals, shows, on the one hand, that the negative binomial model is more adequate than the Poisson model for this type of analysis. On the other hand, the determinants of trip length for both destination types are hotel and own-apartment accommodation, time constraints, the tourist's age, and the way the trip is organised; the size of the city of residence and the attribute of "low prices" are distinctive for coastal destinations, while rented-apartment accommodation is distinctive for inland destinations.
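The mean-variance equality restriction that motivates the negative binomial choice is easy to check on count data; the sketch below uses simulated trip durations (not the Spanish survey data):

```python
import numpy as np

# Hypothetical trip durations in nights, shifted by one to mimic the
# strictly positive counts a truncated model targets.
rng = np.random.default_rng(7)
nights = rng.negative_binomial(n=2, p=0.3, size=3000) + 1

# A Poisson model forces variance == mean; a dispersion ratio well above 1
# signals overdispersion and favours the negative binomial.
m, v = nights.mean(), nights.var()
dispersion = v / m
```

In practice this check is run on the observed durations before committing to the count model.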

Relevance: 80.00%

Abstract:

The aim of this work is to approach the asset allocation problem (portfolio analysis) from a Bayesian perspective. To this end, the theoretical analysis of the classical mean-variance model is reviewed and its shortcomings, which compromise its effectiveness in real cases, are identified. Curiously, its greatest deficiency is not related to the model itself but to its inputs, in particular the expected returns computed from historical data. To overcome this deficiency, the Bayesian approach (the Black-Litterman model) treats the expected return as a random variable, builds a prior distribution (based on the CAPM) and a likelihood (based on the investor's views of the market), and finally applies Bayes' theorem to obtain the posterior distribution. The new expected return that emerges from the posterior distribution replaces the earlier estimate computed from historical data. The results show that the Bayesian model produces conservative and intuitive results relative to the classical mean-variance model.
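The Bayesian blending step can be sketched with the standard Black-Litterman posterior-mean formula (illustrative two-asset inputs; `delta` and `tau` are conventional placeholder values, not the thesis's calibration):

```python
import numpy as np

def black_litterman_mean(cov, w_mkt, P, q, omega, delta=2.5, tau=0.05):
    """Black-Litterman posterior expected returns: reverse-optimised
    equilibrium returns pi (the CAPM-based prior) blended with investor
    views (P, q, omega) via Bayes' theorem."""
    pi = delta * cov @ w_mkt                      # reverse optimisation (prior)
    tS_inv = np.linalg.inv(tau * cov)
    O_inv = np.linalg.inv(omega)
    post_cov = np.linalg.inv(tS_inv + P.T @ O_inv @ P)
    return post_cov @ (tS_inv @ pi + P.T @ O_inv @ q)

cov = np.array([[0.04, 0.01], [0.01, 0.09]])
w_mkt = np.array([0.6, 0.4])
P = np.array([[1.0, -1.0]])       # view: asset 1 outperforms asset 2 ...
q = np.array([0.02])              # ... by 2%
omega = np.array([[0.0001]])      # high confidence in the view
mu_bl = black_litterman_mean(cov, w_mkt, P, q, omega)
```

The resulting `mu_bl` replaces the historical-mean estimate as the expected-return input to the mean-variance optimiser.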

Relevance: 80.00%

Abstract:

This thesis presents research within empirical financial economics with a focus on liquidity and full-scale portfolio optimisation (FSO) in the stock market. The discussion of liquidity focuses on measurement issues, including TAQ data processing and the measurement of systematic liquidity factors. Furthermore, a framework for treating the two topics in combination is provided. The liquidity part of the thesis gives a conceptual background to liquidity and discusses several different approaches to liquidity measurement. It contributes to liquidity measurement by providing detailed guidelines on the data processing needed to apply TAQ data to liquidity research. The main focus, however, is the derivation of systematic liquidity factors. The principal-component approach to systematic liquidity measurement is refined by the introduction of moving and expanding estimation windows, allowing for time-varying liquidity covariances between stocks. Under several liquidity specifications, this improves the ability to explain stock liquidity and returns, compared with static-window PCA and market-average approximations of systematic liquidity. The highest ability to explain stock returns is obtained when using inventory cost as the liquidity measure and moving-window PCA as the systematic liquidity derivation technique. Systematic factors in this setting also have a strong ability to explain cross-sectional liquidity variation. Portfolio optimisation in the FSO framework is tested in two empirical studies. These contribute to the assessment of FSO by expanding its applicability to stock indexes and individual stocks, by considering a wide selection of utility-function specifications, and by showing explicitly how the full-scale optimum can be identified using either grid search or the heuristic search algorithm of differential evolution.
The studies show that relative to mean-variance portfolios, FSO performs well in these settings and that the computational expense can be mitigated dramatically by application of differential evolution.

Relevance: 80.00%

Abstract:

Since the seminal works of Markowitz (1952), Sharpe (1964), and Lintner (1965), numerous studies on portfolio selection and performance measurement have been based on the mean-variance framework. However, several researchers (e.g., Arditti (1967, 1971), Samuelson (1970), and Rubinstein (1973)) argue that the higher moments cannot be neglected unless there is reason to believe that (i) the asset returns are normally distributed and the investor's utility function is quadratic, or (ii) the empirical evidence demonstrates that higher moments are irrelevant to the investor's decision. Based on this argument, this dissertation investigates the impact of higher moments of return distributions on three issues concerning 14 international stock markets. First, portfolio selection with skewness is determined using polynomial goal programming, in which investor preferences for skewness can be incorporated. The empirical findings suggest that the return distributions of international stock markets are not normally distributed, and that incorporating skewness into an investor's portfolio decision causes a major change in the construction of the optimal portfolio. The evidence also indicates that an investor will trade expected portfolio return for skewness. Moreover, when short sales are allowed, investors are better off, as they attain higher expected return and skewness simultaneously. Second, the performance of international stock markets is evaluated using two types of performance measures: (i) the two-moment performance measures of Sharpe (1966) and Treynor (1965), and (ii) the higher-moment performance measures of Prakash and Bear (1986) and Stephens and Proffitt (1991). The empirical evidence indicates that higher moments of return distributions are significant and relevant to the investor's decision; thus, the higher-moment performance measures are more appropriate for evaluating the performance of international stock markets.
The evidence also indicates that the various measures provide vastly different performance rankings of the markets, albeit in the same direction. Finally, the inter-temporal stability of the international stock markets is investigated using the Parhizgari and Prakash (1989) algorithm for the Sen and Puri (1968) test, which accounts for non-normality of return distributions. The empirical findings provide strong evidence for stability in international stock market movements. However, when the Anderson test, which assumes normality of return distributions, is employed, stability in the correlation structure is rejected. This suggests that non-normality of the return distribution is an important factor that cannot be ignored when investigating the inter-temporal stability of international stock markets.
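The higher-moment quantity at stake, portfolio skewness, can be computed directly; the sketch below contrasts a symmetric and a right-skewed simulated asset (illustrative data, not the 14 markets):

```python
import numpy as np

def portfolio_skewness(weights, returns):
    """Sample skewness of portfolio returns: the third standardised
    moment that the mean-variance framework ignores."""
    port = returns @ weights
    dev = port - port.mean()
    return float(np.mean(dev ** 3) / np.mean(dev ** 2) ** 1.5)

rng = np.random.default_rng(5)
sym = rng.normal(0.01, 0.05, 5000)                           # symmetric asset
skewed = rng.lognormal(-3.2, 0.6, 5000) - np.exp(-3.2 + 0.18)  # right-skewed, mean ~ 0
returns = np.column_stack([sym, skewed])

s_equal = portfolio_skewness(np.array([0.5, 0.5]), returns)
s_skewed = portfolio_skewness(np.array([0.0, 1.0]), returns)
```

Polynomial goal programming trades off this skewness against mean and variance rather than maximising any single moment.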

Relevance: 80.00%

Abstract:

The role of steroid hormones in the behavior of vertebrates has been described in terms of organizational and activational effects. These actions occur in different periods of ontogenetic development, such as the fetal, early postnatal, and pubertal periods (organizational effects), or modify the expression of behavioral patterns throughout life (activational effects). Studies of brain lateralization of hand use in human and non-human primates have shown that sexual hormones seem to participate in the strengthening of handedness that begins in the pubertal period and stabilizes in adulthood. The aim of this study was to investigate whether the strength of hand use in adult male common marmosets (Callithrix jacchus) is stable (organizational effect) or whether androgen variations can affect its stability (activational effect). The preferential use of one hand was studied in 14 common marmosets in two contexts: (1) spontaneously holding food and directing it to the mouth (feeding episodes), and (2) forced food-reaching tests in which the animal had to reach food through a central hole in a cover plate that allowed the use of only one hand. Records were made during 5 sessions of 20 bouts each at baseline, totaling 100 episodes before the two treatments. First, a GnRH antagonist was used: a single subcutaneous injection of 100 µg of Cetrotide (cetrorelix acetate; Baxter Oncology GmbH, Germany) (n = 10). Second, a single injection of 0.2 mg of GnRH (Sigma-Aldrich) (n = 8) was used. After the injections, 20 successful hand-use episodes were recorded on the 1st, 2nd, 7th, 15th, and 30th days, totaling 100 episodes for each context after both treatments. Fecal samples for measuring extracted fecal androgens were collected on all data-collection days across the baseline and experimental periods.
Statistical analyses used mixed models, Tukey tests to compare mean values after the two treatments, and Levene tests to compare mean variances, all at p < 0.05. In the baseline phase, 6 animals preferentially used the right hand, 5 the left, and 3 were ambidextrous. Mean handedness indices in the baseline phase differed from those after both treatments starting on the 7th day. The mean variance of the handedness index for spontaneous and forced activities did not differ before and after either treatment, but the mean values for the GnRH index were higher than those observed for its antagonist. These findings suggest that androgens have an activational effect on handedness in adult male C. jacchus.