975 results for Mean-variance analysis


Relevance: 100.00%

Abstract:

The growth of the incisor teeth of rats under the influence of colchicine (doses of 25, 50, 100 and 200 μg/kg) injected for 10 and 18 days is evaluated using a multivariate analysis of variance, which allowed a global view of the results and showed that: there are differences in tooth growth between the control group (untreated rats) and the colchicine-treated groups in the measurements made on the 4th, 7th and 10th days of the experiment; there is no difference in tooth growth between the groups treated for 10 and 18 days, except in the measurements made on the 7th day; the colchicine dose has no influence in either the 10-day or the 18-day treatment group, an effect of dose being observed only on the 7th day; and there was no significant interaction between treatment and day of measurement, showing the similarity of the groups throughout the experiment.
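
As an illustration of the multivariate analysis of variance used in this study, the following is a minimal sketch with statsmodels; the data file, column names (growth_d4, growth_d7, growth_d10, dose, duration) and the layout of the measurements are assumptions for illustration, not the study's actual data.

```python
# Minimal MANOVA sketch (assumed data layout, not the study's dataset):
# one row per rat, with tooth-growth measurements on days 4, 7 and 10,
# the colchicine dose (0 for controls) and the treatment duration in days.
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

df = pd.read_csv("tooth_growth.csv")  # hypothetical file

# Test whether dose and duration affect the vector of growth measurements.
manova = MANOVA.from_formula(
    "growth_d4 + growth_d7 + growth_d10 ~ C(dose) * C(duration)", data=df
)
print(manova.mv_test())  # Wilks' lambda, Pillai's trace, etc.
```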

Relevance: 100.00%

Abstract:

In this paper, we consider the stochastic optimal control problem of discrete-time linear systems subject to Markov jumps and multiplicative noises under two criteria. The first is an unconstrained mean-variance trade-off performance criterion over time, and the second is a minimum-variance criterion over time with constraints on the expected output. We present explicit conditions for the existence of an optimal control strategy for these problems, generalizing previous results in the literature. We conclude the paper with a numerical example of a multi-period portfolio selection problem with regime switching, in which the aim is to minimize the sum of the variances of the portfolio over time under the restriction of keeping the expected value of the portfolio greater than minimum values specified by the investor.
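
A schematic statement of the second criterion, as described in the abstract (the symbols below are illustrative, not the paper's own notation):

```latex
\min_{u_0,\dots,u_{T-1}} \; \sum_{t=1}^{T} \operatorname{Var}(V_t)
\qquad \text{subject to} \qquad
\mathbb{E}(V_t) \ge \varepsilon_t, \quad t = 1,\dots,T,
```

where V_t is the portfolio value at time t, u_t the control (allocation) applied at time t, and ε_t the minimum expected value specified by the investor.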

Relevance: 100.00%

Abstract:

This article investigates the performance of a recently presented model called Full-Scale Optimisation, which is used for financial investment advice. The investor's preferences for expected risk and return are entered into the model, and a recommended portfolio is produced. This model is theoretically more accurate than the mainstream investment advice model, Mean-Variance Optimisation, as it makes fewer assumptions. Our investigation of the model's performance is broader with respect to investor preferences, and more general with respect to investment type, than previous studies. It shows that Full-Scale Optimisation is more widely applicable than previously known.
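
Full-Scale Optimisation, as commonly described in the literature, searches over portfolio weights to maximize expected utility evaluated on the full empirical return distribution rather than on its mean and variance alone. The sketch below illustrates the idea for two assets with a power utility function; the return series, grid resolution and utility choice are assumptions for illustration, not the article's data or specification.

```python
# Full-scale optimisation sketch: brute-force search over two-asset weights,
# scoring each candidate by average utility over the empirical return history.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical monthly returns for two assets (stand-ins for real data).
returns = rng.normal([0.006, 0.003], [0.04, 0.02], size=(240, 2))

def crra_utility(wealth, gamma=5.0):
    """Power (CRRA) utility; gamma is the risk-aversion coefficient."""
    return wealth ** (1.0 - gamma) / (1.0 - gamma)

best_w, best_u = None, -np.inf
for w in np.linspace(0.0, 1.0, 101):           # weight on asset 1
    weights = np.array([w, 1.0 - w])
    period_wealth = 1.0 + returns @ weights    # per-period wealth relatives
    u = crra_utility(period_wealth).mean()     # expected utility over history
    if u > best_u:
        best_w, best_u = weights, u

print("recommended weights:", best_w)
```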

Relevance: 100.00%

Abstract:

This paper surveys asset allocation methods that extend the traditional approach. An important feature of the traditional approach is that it measures the risk-return trade-off in terms of the mean and variance of final wealth. However, there are other important features, concerning the investor's wealth, information, and horizon, that are not always made explicit: the investor makes a single portfolio choice based only on the mean and variance of her final financial wealth, and she knows the relevant parameters of that computation. The paper first describes traditional portfolio choice based on four basic assumptions, while the remaining sections relax those assumptions. Each section describes the corresponding equilibrium implications in terms of portfolio advice and asset pricing.
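
For reference, the traditional single-period mean-variance portfolio choice can be written in the standard textbook form (the notation is the usual one, not necessarily the paper's):

```latex
\max_{w} \; w^{\top}\mu \;-\; \frac{\gamma}{2}\, w^{\top}\Sigma\, w
\qquad \text{subject to} \qquad w^{\top}\mathbf{1} = 1,
```

where w is the vector of portfolio weights, μ and Σ are the mean vector and covariance matrix of asset returns, and γ is the investor's risk aversion.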

Relevance: 100.00%

Abstract:

This doctoral thesis consists of three chapters dealing with large portfolio selection and risk measurement. The first chapter addresses the problem of estimation error in large portfolios within the mean-variance framework. The second chapter explores the importance of currency risk for portfolios of domestic assets, and studies the links between the stability of large-portfolio weights and currency risk. Finally, under the assumption that the decision maker is pessimistic, the third chapter derives the risk premium, a measure of pessimism, and proposes a methodology for estimating the derived measures.

The first chapter improves optimal portfolio choice within the mean-variance framework of Markowitz (1952). This is motivated by the very disappointing results obtained when the mean and variance are replaced by their sample estimates. The problem is amplified when the number of assets is large and the sample covariance matrix is singular or nearly singular. In this chapter, we examine four regularization techniques for stabilizing the inverse of the covariance matrix: ridge, spectral cut-off, Landweber-Fridman and LARS Lasso. Each of these methods involves a tuning parameter that must be selected. The main contribution of this part is to derive a purely data-driven method for selecting the regularization parameter optimally, i.e. so as to minimize the expected loss of utility. Specifically, a cross-validation criterion that takes the same form for all four regularization methods is derived. The resulting regularized rules are then compared with the rule that uses the data directly and with the naive 1/N strategy, in terms of expected utility loss and Sharpe ratio. These performances are measured in-sample and out-of-sample for different sample sizes and numbers of assets. The simulations and the empirical illustration show mainly that regularizing the covariance matrix significantly improves the data-based Markowitz rule and outperforms the naive portfolio, especially in cases where the estimation error problem is severe.

In the second chapter, we investigate the extent to which optimal and stable portfolios of domestic assets can reduce or eliminate currency risk. To this end, we use monthly returns on 48 US industries over the period 1976-2008. To deal with the instability problems inherent in large portfolios, we adopt the spectral cut-off regularization method. This yields a family of optimal and stable portfolios, allowing investors to choose different percentages of principal components (or degrees of stability). Our empirical tests are based on an international asset pricing model (IAPM) in which currency risk is decomposed into two factors, representing the currencies of industrialized countries on the one hand and those of emerging countries on the other. Our results indicate that currency risk is priced and time-varying for stable minimum-risk portfolios. Moreover, these strategies lead to a significant reduction in exposure to currency risk, while the contribution of the currency risk premium remains unchanged on average. Optimal portfolio weights are an alternative to market-capitalization weights. This chapter therefore complements the literature showing that the risk premium matters at the industry level and at the national level in most countries.

In the last chapter, we derive a risk premium measure for rank-dependent preferences and propose a measure of the degree of pessimism, given a distortion function. The measures introduced generalize the risk premium derived under expected utility theory, which is frequently violated in both experimental and real-world settings. Within the broad family of preferences considered, particular attention is given to the CVaR (conditional value at risk). This risk measure is increasingly used in portfolio construction and is recommended as a complement to the VaR (value at risk) used since 1996 by the Basel Committee. In addition, we provide the statistical framework needed for inference on the proposed measures. Finally, the properties of the proposed estimators are assessed through a Monte Carlo study and an empirical illustration using daily US stock market returns over the period 2000-2011.
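
As a minimal illustration of the covariance-regularization idea discussed in the first chapter, the sketch below applies a ridge-type correction to the sample covariance before forming minimum-variance weights. The simulated returns, the fixed ridge parameter and the use of a plain minimum-variance rule (rather than the thesis's utility-based rule with cross-validated tuning) are simplifying assumptions.

```python
# Ridge-regularized minimum-variance weights vs. the naive 1/N rule (sketch).
import numpy as np

rng = np.random.default_rng(1)
T, N = 120, 48                       # moderate sample relative to asset count
returns = rng.normal(0.005, 0.05, size=(T, N))  # hypothetical monthly returns

S = np.cov(returns, rowvar=False)    # sample covariance (ill-conditioned here)
tau = 0.1                            # ridge (tuning) parameter, assumed fixed;
                                     # the thesis selects it by cross-validation
S_ridge = S + tau * np.eye(N)        # ridge regularization of the covariance

ones = np.ones(N)
w_mv = np.linalg.solve(S_ridge, ones)
w_mv /= w_mv.sum()                   # regularized minimum-variance weights

w_naive = ones / N                   # 1/N benchmark

for name, w in [("regularized", w_mv), ("1/N", w_naive)]:
    port = returns @ w
    print(f"{name:12s} mean={port.mean():.4f}  std={port.std():.4f}")
```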

Relevance: 100.00%

Abstract:

The “case for property” in the mixed-asset portfolio is a topic of continuing interest to practitioners and academics. Such an analysis is typically performed over a fixed period of time, and the optimum allocation to property is inferred from the weight assigned to property through mean-variance analysis. It is well known, however, that the parameters used in the portfolio analysis problem are unstable through time. Thus, the weight proposed for property in one period is unlikely to be that found in another. Consequently, in order to assess the case for property more thoroughly, the impact of property in the mixed-asset portfolio is evaluated on a rolling basis over a long period of time. In this way we test whether the inclusion of property significantly improves the performance of an existing equity/bond portfolio all of the time. The main findings are that the inclusion of direct property in an existing equity/bond portfolio leads to increases or decreases in return, depending on the relative performance of property compared with the other asset classes. However, including property in the mixed-asset portfolio always leads to reductions in portfolio risk. Consequently, adding property to an equity/bond portfolio can lead to significant increases in risk-adjusted performance. Thus, if the decision to include direct property in the mixed-asset portfolio is based upon its diversification benefits, the answer is yes, there is a “case for property” all the time!
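
A rolling evaluation of the kind described above can be sketched as follows; the asset names, simulated return series, window length and the equal weighting within each asset mix are illustrative assumptions, not the paper's data or allocation rule.

```python
# Rolling comparison of an equity/bond mix with and without property (sketch).
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
rets = pd.DataFrame(
    rng.normal([0.007, 0.004, 0.006], [0.045, 0.015, 0.020], size=(240, 3)),
    columns=["equity", "bond", "property"],   # hypothetical monthly returns
)

without = rets[["equity", "bond"]].mean(axis=1)   # equal-weight equity/bond
with_prop = rets.mean(axis=1)                     # add property, re-weight

window = 60  # five-year rolling windows
for name, series in [("equity/bond", without), ("with property", with_prop)]:
    mean = series.rolling(window).mean()
    vol = series.rolling(window).std()
    risk_adj = (mean / vol).dropna()
    print(f"{name:14s} avg rolling risk={vol.mean():.4f} "
          f"avg risk-adjusted return={risk_adj.mean():.3f}")
```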

Relevance: 100.00%

Abstract:

The study investigates the role of credit risk in a continuous-time stochastic asset allocation model, since the traditional dynamic framework does not provide flexibility with respect to credit risk. The general model of the study extends the traditional dynamic efficiency framework by explicitly deriving the optimal value function for the infinite-horizon stochastic control problem via a weighted volatility measure of market and credit risk. The model's optimal strategy was then compared to that obtained from a benchmark Markowitz-type dynamic optimization framework to determine which specification adequately reflects the optimal terminal investment returns and strategy under credit and market risks. The paper shows that an investor's optimal terminal return is lower than typically indicated under the traditional mean-variance framework during periods of elevated credit risk. Hence I conclude that, while the traditional dynamic mean-variance approach may indicate the ideal, in the presence of credit risk it does not accurately reflect the observed optimal returns, terminal wealth and portfolio selection strategies.

Relevance: 100.00%

Abstract:

Purpose - The aim of this paper is to present a synthetic chart based on the non-central chi-square statistic that is operationally simpler and more effective than the joint X̄ and R charts in detecting assignable cause(s). The chart also assists in identifying which parameter (the mean or the variance) changed when an assignable cause occurs. Design/methodology/approach - The approach is based on the non-central chi-square statistic, and the steady-state average run length (ARL) of the developed chart is evaluated using a Markov chain model. Findings - The proposed chart always detects process disturbances faster than the joint X̄ and R charts, and it allows the process to be monitored with a single chart rather than two separate charts. Originality/value - The most important advantage of the proposed chart is that practitioners can monitor the process by looking at only one chart instead of two charts separately.
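
The Markov-chain evaluation of the ARL can be sketched in general terms as below; the transition-probability values are placeholders rather than the paper's chart parameters, and the computation uses the standard expected first-passage-time result ARL = (I - Q)^{-1} 1 over the transient (non-signalling) states.

```python
# Average run length (ARL) from a Markov chain model of a control chart (sketch).
# Q is the transition matrix restricted to the transient (non-signalling) states;
# the entries below are placeholders, not the paper's chart parameters.
import numpy as np

Q = np.array([
    [0.90, 0.05, 0.02],
    [0.10, 0.80, 0.05],
    [0.05, 0.10, 0.70],
])
n = Q.shape[0]

# Expected number of samples before absorption (a chart signal), per start state.
arl = np.linalg.solve(np.eye(n) - Q, np.ones(n))
print("ARL by starting state:", arl)

# Steady-state ARL: weight the per-state ARLs by the chart's initial-state
# probabilities (here a placeholder distribution).
start = np.array([0.6, 0.3, 0.1])
print("steady-state ARL:", start @ arl)
```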

Relevance: 100.00%

Abstract:

Background: Dermatomyositis (DM) and polymyositis (PM) are rare systemic autoimmune rheumatic diseases with high fatality rates. There have been few population-based mortality studies of dermatomyositis and polymyositis in the world, and none have been conducted in Brazil. The objective of the present study was to employ multiple-cause-of-death methodology in the analysis of trends in mortality related to dermatomyositis and polymyositis in the state of Sao Paulo, Brazil, between 1985 and 2007. Methods: We analyzed mortality data from the Sao Paulo State Data Analysis System, selecting all death certificates on which DM or PM was listed as a cause of death. The variables sex, age and underlying, associated or total mentions of causes of death were studied using mortality rates, proportions and historical trends. Statistical analyses were performed using chi-square and Kruskal-Wallis H tests, analysis of variance and linear regression. A p value less than 0.05 was regarded as significant. Results: Over the 23-year period, there were 318 DM-related deaths and 316 PM-related deaths. Overall, DM/PM was designated as an underlying cause in 55.2% of cases and as an associated cause in 44.8%; among the 634 total deaths, females accounted for 71.5%. During the study period, age- and gender-adjusted DM mortality rates did not change significantly, although PM as an underlying cause and total mentions of PM trended lower (p < 0.05). The mean ages at death were 47.76 +/- 20.81 years for DM and 54.24 +/- 17.94 years for PM (p = 0.0003). For DM/PM, respectively, as underlying causes, the principal associated causes of death were as follows: pneumonia (in 43.8%/33.5%); respiratory failure (in 34.4%/32.3%); interstitial pulmonary diseases and other pulmonary conditions (in 28.9%/17.6%); and septicemia (in 22.8%/15.9%). For DM/PM, respectively, as associated causes, the principal underlying causes of death were the following: respiratory disorders (in 28.3%/26.0%); circulatory disorders (in 17.4%/20.5%); neoplasms (in 16.7%/13.7%); infectious and parasitic diseases (in 11.6%/9.6%); and gastrointestinal disorders (in 8.0%/4.8%). Of the 318 DM-related deaths, 36 involved neoplasms, compared with 20 of the 316 PM-related deaths (p = 0.03). Conclusions: Our study using multiple causes of death found that DM/PM was identified as the underlying cause of death in only 55.2% of the deaths, indicating that both diseases are underestimated in the primary mortality statistics. We observed a predominance of deaths among women and older individuals, as well as a trend toward stability in the mortality rates. We have confirmed that the risk of death is greater when either disease is accompanied by neoplasm, albeit to a lesser degree in individuals with PM. The investigation of the underlying and associated causes of death related to DM/PM broadens knowledge of the natural history of both diseases and could help integrate mortality data for use in the evaluation of control measures for DM/PM.
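
The comparison of neoplasm involvement between the two diseases (36 of 318 DM-related deaths versus 20 of 316 PM-related deaths) corresponds to a standard chi-square test on a 2x2 table; a minimal SciPy sketch is shown below. The abstract does not state which test variant or correction the authors used, so the result may differ slightly from the reported p = 0.03.

```python
# Chi-square test for neoplasm involvement in DM- vs PM-related deaths (sketch),
# using the counts reported in the abstract: 36/318 for DM and 20/316 for PM.
from scipy.stats import chi2_contingency

table = [
    [36, 318 - 36],   # DM: deaths with / without neoplasm involvement
    [20, 316 - 20],   # PM: deaths with / without neoplasm involvement
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}, dof={dof}")
```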

Relevance: 100.00%

Abstract:

The scope of this paper is to adapt the standard mean-variance model of Harry Markowitz's theory, creating a simulation tool to find the optimal configuration of a portfolio aggregator and to calculate its profitability and risk. Currently, there is a deep discussion within the power systems community about the structure and architecture of the future electric system. In this environment, policy makers and electric utilities are finding new approaches to access the electricity market, which creates challenging new positions and calls for innovative strategies and methodologies. Decentralized power generation is gaining relevance in liberalized markets, and small and medium-size electricity consumers are also becoming producers (“prosumers”). In this scenario, an electric aggregator is an entity that joins a group of electricity clients, customers, producers and “prosumers” together as a single purchasing unit to negotiate the purchase and sale of electricity. The aggregator conducts research on electricity prices and contract terms and conditions in order to obtain better energy prices for its clients, and allows small and medium customers to benefit from improved market prices.

Relevance: 100.00%

Abstract:

Over time the demand for quantitative portfolio management has increased among financial institutions, but there is still a lack of practical tools. In 2008, the EDHEC Risk and Asset Management Research Centre conducted a survey of European investment practices. It revealed that the majority of asset and fund management companies, pension funds and institutional investors do not use more sophisticated models to compensate for the flaws of Markowitz mean-variance portfolio optimization. Furthermore, tactical asset allocation managers employ a variety of methods to estimate the return and risk of assets, but also need sophisticated portfolio management models to outperform their benchmarks. Recent developments in portfolio management suggest that new innovations are slowly gaining ground, but still need to be studied carefully. This thesis tries to provide a practical tactical asset allocation (TAA) application of the Black–Litterman (B–L) approach and an unbiased evaluation of the B–L model's qualities. The mean-variance framework, issues related to asset allocation decisions, and return forecasting are examined carefully to uncover issues affecting active portfolio management. European fixed-income data is employed in an empirical study that tries to reveal whether a B–L model-based TAA portfolio is able to outperform its strategic benchmark. The tactical asset allocation utilizes a Vector Autoregressive (VAR) model to create return forecasts from lagged values of asset classes as well as economic variables. The sample data (31.12.1999–31.12.2012) is divided in two: the in-sample data is used for calibrating a strategic portfolio, and the out-of-sample period is used for testing the tactical portfolio against the strategic benchmark. Results show that the B–L model-based tactical asset allocation outperforms the benchmark portfolio in terms of risk-adjusted return and mean excess return. The VAR model is able to pick up changes in investor sentiment, and the B–L model adjusts portfolio weights in a controlled manner. The TAA portfolio shows promise especially in moderately shifting the allocation toward riskier assets when the market is turning bullish, without overweighting high-beta investments. Based on the findings of this thesis, the Black–Litterman model offers a good platform for active asset managers to quantify their views on investments and implement their strategies. The B–L model shows potential and offers interesting research avenues. However, the success of tactical asset allocation is still highly dependent on the quality of the input estimates.
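
The core of the Black–Litterman approach is the combination of equilibrium (prior) returns with investor views into posterior expected returns. The sketch below implements the standard published formula with small illustrative inputs; the covariance matrix, the view, and the confidence parameters are assumptions, not the thesis's data.

```python
# Black-Litterman posterior expected returns (standard formula, sketch inputs).
import numpy as np

Sigma = np.array([[0.0016, 0.0004],      # covariance of two asset classes
                  [0.0004, 0.0009]])     # (hypothetical values)
pi = np.array([0.004, 0.003])            # equilibrium (prior) expected returns
tau = 0.05                               # scales uncertainty in the prior

# One view: asset 1 will outperform asset 2 by 0.2% per period.
P = np.array([[1.0, -1.0]])
q = np.array([0.002])
Omega = np.array([[0.0001]])             # uncertainty (variance) of the view

tS_inv = np.linalg.inv(tau * Sigma)
Om_inv = np.linalg.inv(Omega)
# Posterior mean: precision-weighted blend of the prior and the views.
mu_bl = np.linalg.solve(tS_inv + P.T @ Om_inv @ P,
                        tS_inv @ pi + P.T @ Om_inv @ q)
print("posterior expected returns:", mu_bl)
```

The posterior returns would then feed a mean-variance step to produce the tactical weights; in the thesis the views come from the VAR forecasts rather than being set by hand as here.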

Relevance: 100.00%

Abstract:

Decision theory is the study of models of judgement involved in, and leading to, deliberate and (usually) rational choice. In real estate investment there are normative models for the allocation of assets. These asset allocation models suggest an optimum allocation between the respective asset classes based on the investors' judgements of performance and risk. Real estate is selected, like other assets, on the basis of some criteria, commonly its marginal contribution to the production of a mean-variance efficient multi-asset portfolio, subject to the investor's objectives and capital rationing constraints. However, decisions are made relative to current expectations and current business constraints. Whilst a decision maker may believe in the required optimum exposure levels as dictated by an asset allocation model, the final decision may, and often will, be influenced by factors outside the parameters of the mathematical model. This paper discusses investors' perceptions of and attitudes toward real estate and highlights the important difference between theoretical exposure levels and pragmatic business considerations. It develops a model to identify the “soft” parameters in decision making which influence the optimal allocation for that asset class. This “soft” information may relate to behavioural issues such as the tendency to mirror competitors, a desire to meet weight-of-money objectives, a desire to retain the status quo, and many other non-financial considerations. The paper aims to establish the place of property in multi-asset portfolios in the UK and to examine the asset allocation process in practice, with a view to understanding the decision-making process and to examining investors' perceptions based on a historical analysis of market expectations, a comparison with historical data, and an analysis of actual performance.

Relevance: 100.00%

Abstract:

Modern Portfolio Theory (MPT) has been advocated as a more rational approach to the construction of real estate portfolios. The application of MPT can now be achieved with relative ease using the powerful facilities of modern spreadsheets, and does not necessarily require specialist software. This capability is provided by an add-in tool, called an Optimiser or Solver, now found in several spreadsheets. The value of using this kind of more sophisticated analysis feature of spreadsheets is increasingly difficult to ignore. This paper examines the use of the spreadsheet Optimiser for handling asset allocation problems. Using the Markowitz mean-variance approach, the paper introduces the necessary calculations and shows, by means of an elementary example implemented in Microsoft Excel, how the Optimiser may be used. Emphasis is placed on understanding the inputs and outputs of the portfolio optimisation process, and the danger of treating the Optimiser as a black box is discussed.
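
The elementary optimisation that a spreadsheet Solver performs in this setting can be mirrored in a few lines of Python; the return and covariance figures below are illustrative placeholders, and scipy.optimize.minimize stands in for the spreadsheet Optimiser.

```python
# Minimum-variance portfolio for a target return -- the same problem a
# spreadsheet Optimiser/Solver would be given (illustrative inputs).
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.08, 0.05, 0.07])          # expected annual returns (assumed)
Sigma = np.array([[0.04, 0.01, 0.02],      # covariance matrix (assumed)
                  [0.01, 0.02, 0.01],
                  [0.02, 0.01, 0.03]])
target = 0.06                              # required portfolio return

constraints = [
    {"type": "eq", "fun": lambda w: w.sum() - 1.0},    # fully invested
    {"type": "eq", "fun": lambda w: w @ mu - target},  # hit the target return
]
bounds = [(0.0, 1.0)] * 3                  # long-only, like typical cell limits

result = minimize(lambda w: w @ Sigma @ w, x0=np.ones(3) / 3,
                  bounds=bounds, constraints=constraints)
print(f"weights: {result.x.round(3)}, variance: {result.fun:.5f}")
```

Treating such a routine as a black box carries the same dangers the paper warns about for the spreadsheet Optimiser: the output is only as good as the return and covariance inputs and the constraints supplied.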

Relevance: 100.00%

Abstract:

Background: Although oral lichen planus has been classified by the World Health Organization (WHO) as a potentially malignant disorder, this classification is still the target of much controversy. Aim: To evaluate the cell proliferation rate in oral lichen planus, comparing it to the rates observed in epithelial dysplasia and oral squamous cell carcinoma, seeking indications of a potential for malignant transformation. Material and Methods: Twenty-four cases of each lesion were submitted to the streptavidin-biotin and AgNOR techniques to evaluate the immunohistochemical expression of PCNA and the mean number of NORs per nucleus, respectively. Results: Positivity for PCNA was observed in 58.33% of oral lichen planus cases, 83.33% of epithelial dysplasia cases and 91.67% of oral squamous cell carcinoma cases. The chi-squared test showed that the number of cases positive for PCNA was significantly lower in oral lichen planus than in oral squamous cell carcinoma (p<0.05). No statistically significant difference was observed between oral lichen planus and epithelial dysplasia (p>0.05) or between epithelial dysplasia and oral squamous cell carcinoma (p>0.05). The mean NORs/nucleus in oral lichen planus, epithelial dysplasia and oral squamous cell carcinoma were 1.74 +/- 0.32, 2.42 +/- 0.62 and 2.41 +/- 0.61, respectively. Analysis of variance (ANOVA) revealed a statistically significant difference between oral lichen planus and the other studied lesions (p<0.05). Conclusion: The cell proliferation rate in oral lichen planus was lower than in oral epithelial dysplasia and oral squamous cell carcinoma, which might explain its lower rate of malignant transformation.
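
The comparison of mean NORs per nucleus across the three lesion types is a one-way ANOVA; a minimal sketch is shown below. The per-case values are simulated from the means and standard deviations reported in the abstract (24 cases per group) and are not the study's raw data, so the resulting F statistic is illustrative only.

```python
# One-way ANOVA on mean NORs/nucleus across the three lesion groups (sketch).
# Per-case values are simulated from the abstract's reported means/SDs;
# they are stand-ins for the study's raw measurements.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(3)
lichen_planus = rng.normal(1.74, 0.32, size=24)
dysplasia = rng.normal(2.42, 0.62, size=24)
carcinoma = rng.normal(2.41, 0.61, size=24)

f_stat, p_value = f_oneway(lichen_planus, dysplasia, carcinoma)
print(f"F={f_stat:.2f}, p={p_value:.4f}")
```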