982 results for Dynamic Asset Allocation
Abstract:
Dynamic asset rating (DAR) is one of a number of techniques that could be used to facilitate low carbon electricity network operation. Previous work has looked at this technique from an asset perspective. This paper instead takes a network perspective by proposing a dynamic network rating (DNR) approach. The models available for use with DAR are discussed and compared using measured load and weather data from a trial network area within Milton Keynes in the central area of the U.K. The paper then uses the most appropriate model to investigate, through a network case study, the potential gains of dynamic rating over static rating for the different network assets: transformers, overhead lines, and cables. This will inform the network operator of the potential DNR gains on an 11-kV network with all assets present and highlight the limiting assets within each season.
Abstract:
Dynamic asset rating is one of a number of techniques that could be used to facilitate low carbon electricity network operation. This paper focuses on distribution-level transformer dynamic rating in this context. The models available for use with dynamic asset rating are discussed and compared using measured load and weather conditions from a trial network area within Milton Keynes. The paper then uses the most appropriate model to investigate, through simulation, the potential gains of dynamic rating over static rating under two transformer cooling methods, to understand the potential gain to the network operator.
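To illustrate the kind of gain at stake, the sketch below computes a steady-state transformer load rating from a hot-spot temperature model in the style of IEC 60076-7. All parameter values are illustrative assumptions for a small ONAN distribution transformer, not the models or trial data used in the papers above:

```python
# Steady-state hot-spot model in the style of IEC 60076-7 (illustrative
# parameters for an ONAN distribution transformer; not values from the paper).
D_TOP_OIL_RATED = 55.0   # rated top-oil temperature rise (K)
D_HOTSPOT_RATED = 23.0   # rated hot-spot-to-top-oil gradient (K)
LOSS_RATIO = 5.0         # ratio of load losses to no-load losses, R
X, Y = 0.8, 1.3          # oil and winding exponents
HOTSPOT_LIMIT = 98.0     # maximum permissible hot-spot temperature (degC)

def hotspot(ambient: float, k: float) -> float:
    """Steady-state hot-spot temperature (degC) at per-unit load factor k."""
    top_oil_rise = D_TOP_OIL_RATED * ((1 + LOSS_RATIO * k**2) / (1 + LOSS_RATIO)) ** X
    return ambient + top_oil_rise + D_HOTSPOT_RATED * k**Y

def dynamic_rating(ambient: float) -> float:
    """Largest load factor keeping the hot spot at its limit, by bisection."""
    lo, hi = 0.0, 3.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if hotspot(ambient, mid) < HOTSPOT_LIMIT:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Cooler weather raises the usable rating above the static nameplate value.
print(round(dynamic_rating(10.0), 3))  # above 1.0 at 10 degC ambient
print(round(dynamic_rating(30.0), 3))  # below 1.0 at 30 degC ambient
```

With these parameters the static rating is recovered at the 20 degC design ambient (the hot spot sits exactly at its limit for k = 1), and the weather dependence of the rating is what a DAR scheme exploits.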
Abstract:
The central product of the DRAMA (Dynamic Re-Allocation of Meshes for parallel Finite Element Applications) project is a library comprising a variety of tools for dynamic re-partitioning of unstructured Finite Element (FE) applications. The input to the DRAMA library is the computational mesh, and corresponding costs, partitioned into sub-domains. The core library functions then perform a parallel computation of a mesh re-allocation that will re-balance the costs based on the DRAMA cost model. We discuss the basic features of this cost model, which allows a general approach to load identification, modelling and imbalance minimisation. Results from crash simulations are presented which show the necessity for multi-phase/multi-constraint partitioning components.
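The re-balancing idea can be pictured with a toy version of the cost model: given per-element computational costs partitioned into sub-domains, reassign elements so the per-domain totals even out. The greedy sketch below is illustrative only and is not the DRAMA library's algorithm, which must also respect mesh connectivity and communication/migration costs:

```python
import heapq

# Greedy cost re-balancing: assign each element cost to the currently least
# loaded sub-domain (largest costs first). Ignores mesh connectivity, which
# a real re-partitioner must preserve; illustrative only.
def rebalance(costs: list[float], n_domains: int) -> list[float]:
    heap = [(0.0, d) for d in range(n_domains)]   # (load, domain id)
    loads = [0.0] * n_domains
    for c in sorted(costs, reverse=True):
        load, d = heapq.heappop(heap)             # least loaded domain
        loads[d] = load + c
        heapq.heappush(heap, (loads[d], d))
    return loads

costs = [9, 7, 6, 5, 5, 4, 3, 3, 2, 1]            # hypothetical element costs
loads = rebalance(costs, n_domains=3)
print(loads)   # near-equal totals; perfect balance here is 15 per domain
```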
Abstract:
This thesis studies the field of asset price bubbles. It comprises three independent chapters, each of which directly or indirectly analyses the existence or implications of asset price bubbles. The type of bubble assumed in each chapter is consistent with rational expectations; the price bubbles investigated here are therefore known in the literature as rational bubbles. The following describes the three chapters.

Chapter 1: This chapter attempts to explain the recent US housing price bubble by developing a heterogeneous agent endowment economy asset pricing model with risky housing, endogenous collateral and defaults. Investment in housing is subject to an idiosyncratic risk and some mortgages are defaulted in equilibrium. We analytically derive the leverage, or endogenous loan-to-value ratio, which comes from a limited participation constraint in a one-period mortgage contract with monitoring costs. Our results show that low values of housing investment risk produce a credit easing effect, encouraging excess leverage and generating credit-driven rational price bubbles in the housing good. Conversely, high values of housing investment risk produce a credit crunch characterized by tight borrowing constraints, low leverage and low house prices. Furthermore, the leverage ratio was found to be procyclical and the rate of defaults countercyclical, consistent with empirical evidence.

Chapter 2: It is widely believed that financial assets have considerable persistence and are susceptible to bubbles. However, identification of this persistence and of potential bubbles is not straightforward. This chapter tests for price bubbles in the United States housing market accounting for long memory and structural breaks. The intuition is that the presence of long memory negates price bubbles, while the presence of breaks could artificially induce bubble behaviour. Hence, we use procedures, namely semi-parametric Whittle and parametric ARFIMA procedures, that are consistent under a variety of residual biases to estimate the value of the long memory parameter, d, of the log rent-price ratio. We find that the semi-parametric estimation procedures, robust to non-normality and heteroskedastic errors, detect far more bubble regions than the parametric ones. A structural break was identified in the mean and trend of all the series which, when accounted for, removed bubble behaviour in a number of regions. Importantly, the United States housing market showed evidence of rational bubbles at both the aggregate and regional levels.

In the third and final chapter, we attempt to answer the following question: to what extent should individuals participate in the stock market and hold risky assets over their lifecycle? We answer this question by employing a lifecycle consumption-portfolio choice model with housing, labour income and time-varying predictable returns, where the agents are constrained in the level of their borrowing. We first analytically characterize and then numerically solve for the optimal allocation to the risky asset, comparing the return predictability case with that of IID returns. We successfully resolve the puzzles and find equity holdings and participation rates close to the data. We also find that return predictability substantially alters both the level of risky portfolio allocation and the rate of stock market participation. High realizations of the factor (dividend-price ratio) and high persistence of the factor process, indicative of stock market bubbles, raise the amount of wealth invested in risky assets and the level of stock market participation, respectively. Conversely, rare disasters were found to bring down these rates, the change being severe for investors in the later years of the lifecycle. Furthermore, investors facing time-varying returns (return predictability) hedged background risks significantly better than those facing IID returns.
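The semi-parametric Whittle estimation of d mentioned in Chapter 2 can be sketched with the standard local Whittle estimator (Robinson-type), here applied to simulated white noise rather than the thesis's rent-price data; the bandwidth and grid are illustrative choices:

```python
import numpy as np

# Local Whittle estimator of the long-memory parameter d, minimising the
# concentrated objective over a grid. Applied to white noise (true d = 0);
# bandwidth m and grid are illustrative, not the thesis's settings.
def local_whittle_d(x: np.ndarray, m: int) -> float:
    n = len(x)
    freqs = 2 * np.pi * np.arange(1, m + 1) / n            # Fourier frequencies
    fft = np.fft.fft(x - x.mean())
    periodogram = np.abs(fft[1:m + 1]) ** 2 / (2 * np.pi * n)

    def objective(d):
        return np.log(np.mean(freqs ** (2 * d) * periodogram)) \
               - 2 * d * np.mean(np.log(freqs))

    grid = np.arange(-0.49, 0.99, 0.001)                   # admissible d values
    return float(min(grid, key=objective))

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)          # white noise: no long memory
d_hat = local_whittle_d(x, m=300)
print(round(d_hat, 3))                 # close to 0 for white noise
```

For a genuinely long-memory series (0 < d < 0.5) the estimate would be positive; values of d at or above 0.5 indicate non-stationarity, which is where the bubble-testing logic of the chapter comes in.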
Abstract:
Reconfigurable computing has experienced considerable expansion in the last few years, due in part to the fast run-time partial reconfiguration features offered by recent SRAM-based Field Programmable Gate Arrays (FPGAs), which allow the real-time implementation of dynamic resource allocation strategies, with multiple independent functions from different applications sharing the same logic resources in the space and temporal domains. However, when the sequence of reconfigurations to be performed is not predictable, the efficient management of the available logic space becomes the greatest challenge posed to these systems. Resource allocation decisions have to be made concurrently with system operation, taking into account function priorities and optimizing the space currently available. As a consequence of the unpredictability of this allocation procedure, the logic space becomes fragmented, with many small areas of free resources that fail to satisfy most requests and so remain unused. A rearrangement of the currently running functions is therefore necessary to obtain enough contiguous space to implement incoming functions, avoiding the spreading of their components and the resulting degradation of system performance. A novel active relocation procedure for Configurable Logic Blocks (CLBs) is presented here, able to carry out online rearrangements, defragmenting the available FPGA resources without disturbing currently running functions.
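The fragmentation problem can be pictured with a one-dimensional toy model in which running functions occupy contiguous spans of CLB columns; the compaction below only illustrates the idea of rearranging to recover one contiguous free region, and is not the paper's relocation procedure:

```python
# Toy 1-D model of FPGA column allocation: each running function occupies a
# contiguous span of columns; compaction slides them together so the free
# columns become one contiguous region. Purely illustrative.
def compact(placements: list[tuple[str, int, int]], total: int):
    """placements: (name, start_col, width); returns new placements + free span."""
    cursor = 0
    moved = []
    for name, _, width in sorted(placements, key=lambda p: p[1]):
        moved.append((name, cursor, width))   # slide each function left
        cursor += width
    return moved, (cursor, total - cursor)    # (first free column, free width)

placements = [("f1", 0, 3), ("f2", 5, 2), ("f3", 9, 4)]   # fragmented layout
moved, free = compact(placements, total=16)
print(moved)   # functions packed from column 0
print(free)    # (9, 7): columns 9..15 form one free region
```

In the fragmented layout no single free gap is wider than 4 columns; after compaction a 7-column function would fit, which is the gain the paper's online defragmentation targets.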
Abstract:
This paper studies the performance of two different Risk Parity strategies, one from Maillard (2008) and a "naïve" one already used by market practitioners, against traditional strategies. The tests compare different regions (US, UK, Germany and Japan) from 1991 to 2013, and use different volatility measures. The main findings are that Risk Parity outperforms any traditional strategy, and that the "true" version (Maillard) delivers considerably better results than the "naïve" one when using historical volatility, while with EWMA volatility there are significant differences between the two.
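The "naïve" risk parity referred to above is commonly implemented as inverse-volatility weighting, which ignores correlations. A minimal sketch with made-up volatilities (the common construction, not code or data from the paper):

```python
import numpy as np

# "Naive" risk parity as inverse-volatility weighting: each asset's weight is
# proportional to 1/sigma_i, so correlations are ignored. The volatilities
# below are made-up illustrative numbers, not figures from the paper.
def inverse_vol_weights(vols: np.ndarray) -> np.ndarray:
    inv = 1.0 / vols
    return inv / inv.sum()

vols = np.array([0.05, 0.10, 0.20])    # e.g. bonds, credit, equities (hypothetical)
w = inverse_vol_weights(vols)
print(np.round(w, 4))                  # lower-vol assets get larger weights

# Each asset then contributes the same stand-alone risk w_i * sigma_i.
print(np.round(w * vols, 4))
```

The "true" version instead equalises contributions to total portfolio risk using the full covariance matrix, which is why the two can diverge when volatilities are estimated differently (historical vs. EWMA).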
Abstract:
This study proposes a systematic model that is able to fit the Global Macro Investing universe. The Analog Model tests the possibility of capturing the likelihood of an optimal investment allocation based on similarity across different periods in history. Instead of observing Macroeconomic data, the model uses financial markets’ variables to classify unknown short-term regimes. This methodology is particularly relevant considering that asset classes and investment strategies react differently to specific macro environment shifts.
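The "analog" idea of matching the current period to the most similar period in history can be sketched as a nearest-neighbour lookup over market variables; the feature set, period labels and values below are entirely hypothetical, not the model's actual variables:

```python
import math

# Nearest-neighbour "analog" lookup: match the current period to the
# historical period whose market variables are closest in Euclidean
# distance. Features and values are invented for illustration.
def closest_analog(current: list[float], history: dict[str, list[float]]) -> str:
    def dist(v: list[float]) -> float:
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(current, v)))
    return min(history, key=lambda period: dist(history[period]))

# Hypothetical features: [equity 3m return, 10y yield change, vol index level]
history = {
    "1998-Q3": [-0.12, -0.50, 38.0],
    "2004-Q2": [0.03, 0.40, 15.0],
    "2017-Q1": [0.05, 0.10, 11.0],
}
print(closest_analog([0.04, 0.15, 12.0], history))   # matches "2017-Q1"
```

In practice the features would need scaling to comparable units before distances are meaningful; the point is only that regimes are classified by similarity rather than by observed macroeconomic data.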
Abstract:
Financial crises have happened in the past and will continue to do so in the future. In the most recent 2008 crisis, global equities (as measured by the MSCI ACWI index) lost a staggering 54.2% in USD on the year. During such periods, wealth preservation rises to the top of most investors' concerns. The purpose of this paper is to develop a strategy that protects the investment during bear markets and significant market corrections, generates capital appreciation, and can support Millennium BCP's Wealth Management Unit in its asset allocation procedures. This strategy extends the Dual Momentum approach introduced by Gary Antonacci (2014) in two ways. First, the investable set of securities in the equities space increases from two to four: besides the US, it comprises the Japanese, European (excl. UK) and EM equity indices. Secondly, it adds a volatility filter as well as three indicators related to the business cycle and the state of the economy, which are relevant to deciding on the strategy's exposure to equities. Overall, the results attest to the resiliency of the strategy before, during and after historical financial crashes, as it drastically reduces the downside exposure and consistently outperforms the benchmark index by providing higher mean returns with lower variance.
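A stripped-down version of the selection rule behind such a Dual Momentum strategy can be sketched as follows; the index names, trailing returns and safe asset are hypothetical, and the paper's volatility and business-cycle filters are omitted:

```python
# Minimal dual momentum in the spirit of Antonacci (2014): relative momentum
# picks the strongest equity index; absolute momentum switches to a safe
# asset when the winner's trailing return does not beat the risk-free rate.
# Inputs below are hypothetical, not the paper's data.
def dual_momentum(trailing_returns: dict[str, float], tbill_return: float) -> str:
    best = max(trailing_returns, key=trailing_returns.get)   # relative momentum
    # Absolute momentum filter: require the winner to beat the risk-free return.
    return best if trailing_returns[best] > tbill_return else "BONDS"

signals = {"US": 0.12, "JAPAN": 0.05, "EUROPE_EX_UK": 0.08, "EM": -0.02}
print(dual_momentum(signals, tbill_return=0.01))   # picks "US"

bear = {k: v - 0.20 for k, v in signals.items()}   # all trailing returns negative
print(dual_momentum(bear, tbill_return=0.01))      # retreats to "BONDS"
```

The absolute momentum leg is what provides the downside protection in bear markets; the paper layers additional filters on top of this basic rule.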
Abstract:
Since the financial crisis, risk-based portfolio allocations have gained a great deal in popularity. This increase in popularity is primarily due to the fact that they make no assumptions as to the expected returns of the assets in the portfolio. These portfolios implicitly put risk management at the heart of asset allocation, hence their recent appeal. This paper serves as a comparison of four well-known risk-based portfolio allocation methods: minimum variance, maximum diversification, inverse volatility and equally weighted risk contribution. Empirical backtests are performed over rising interest rate periods from 1953 to 2015. Additionally, I compare these portfolios to simpler allocation methods, such as equally weighted and a 60/40 asset allocation mix. This paper helps answer the question of whether these portfolios can survive in a rising interest rate environment.
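Of the four methods above, the equally weighted risk contribution (ERC) portfolio has no closed form in general. A minimal sketch, using a made-up two-asset covariance matrix and a simple damped fixed-point iteration on the ERC condition (one of several ways to solve it, not necessarily the paper's):

```python
import numpy as np

# Equal risk contribution (ERC) weights by damped fixed-point iteration on
# the condition w_i * (Sigma w)_i = const for all i. The covariance matrix
# is an illustrative example, not data from the paper.
def erc_weights(cov: np.ndarray, n_iter: int = 500) -> np.ndarray:
    n = cov.shape[0]
    w = np.full(n, 1.0 / n)                       # start from equal weights
    for _ in range(n_iter):
        marginal = cov @ w                        # (Sigma w)_i
        target = w @ marginal / n                 # desired common contribution
        w_new = target / marginal
        w_new /= w_new.sum()
        w = 0.5 * (w + w_new)                     # damping to avoid oscillation
    return w

cov = np.array([[0.04, 0.01],
                [0.01, 0.09]])
w = erc_weights(cov)
rc = w * (cov @ w)                                # risk contributions
print(np.round(w, 4))                             # ~[0.6, 0.4] for this case
print(np.round(rc / rc.sum(), 4))                 # ~[0.5, 0.5]: equal contributions
```

In the two-asset case ERC coincides with inverse-volatility weighting (here sigma = 0.2 and 0.3, giving weights 0.6 and 0.4); with more assets and non-trivial correlations the two diverge.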
Abstract:
This paper surveys asset allocation methods that extend the traditional approach. An important feature of the traditional approach is that it measures the risk and return tradeoff in terms of the mean and variance of final wealth. However, there are also other important features that are not always made explicit in terms of the investor's wealth, information, and horizon: the investor makes a single portfolio choice based only on the mean and variance of her final financial wealth, and she knows the relevant parameters in that computation. First, the paper describes traditional portfolio choice based on four basic assumptions, while the remaining sections relax those assumptions. Each section describes the corresponding equilibrium implications in terms of portfolio advice and asset pricing.
Abstract:
In this work I prove that, in assignment markets with more than two sides, agents from different sectors may fail to be complementary, while agents from the same sector may fail to be substitutes. Shapley (1962) proved that this can never happen when the assignment market has only two sides. Nevertheless, I show that there exist sufficient conditions that guarantee substitutability and complementarity between agents in these types of markets. Moreover, I prove that, when the goods in the market are homogeneous, Shapley's (1962) result holds.
Abstract:
The present study deals with the analysis and mapping of Swiss franc interest rates. Interest rates depend on time and maturity, defining the term structure of the interest rate curves (IRC). In the present study, IRC are considered in a two-dimensional feature space: time and maturity. Exploratory data analysis includes a variety of tools widely used in econophysics and geostatistics. Geostatistical models and machine learning algorithms (multilayer perceptron and Support Vector Machines) were applied to produce interest rate maps. IR maps can be used for visualisation and pattern perception, to develop and explore economic hypotheses, to produce dynamic asset-liability simulations, and for financial risk assessment. The feasibility of applying the interest rate mapping approach to IRC forecasting is considered as well.
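To give a minimal flavour of the mapping step, the sketch below interpolates a rate surface over (time, maturity) with inverse distance weighting, a far simpler interpolator than the geostatistical and machine learning models used in the study; the sample points are invented:

```python
# Inverse distance weighting over the (time, maturity) plane: the rate at a
# query point is a weighted average of observed rates, weights ~ 1/distance^p.
# Sample points are invented; the study itself uses geostatistics, MLPs and SVMs.
def idw(points, query, power=2.0):
    """points: list of ((time, maturity), rate); returns interpolated rate."""
    num = den = 0.0
    for (t, m), rate in points:
        d2 = (t - query[0]) ** 2 + (m - query[1]) ** 2
        if d2 == 0:
            return rate                      # exact hit on an observation
        w = 1.0 / d2 ** (power / 2)
        num += w * rate
        den += w
    return num / den

points = [((0.0, 1.0), 1.2), ((0.0, 10.0), 2.5),
          ((1.0, 1.0), 1.0), ((1.0, 10.0), 2.3)]
print(round(idw(points, (0.5, 5.5)), 3))     # between the four observed rates
```

A real IRC map would first rescale time and maturity to comparable units; the models in the study additionally capture anisotropy and non-linear structure that plain IDW cannot.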
Abstract:
The aim of the study is first to present theoretically how budgeting is used in a company's strategic management, and then to test how the case company applies budgeting in its strategy work. The results of the case study strongly suggest that the roles in which budgets are applied are similar to those described in the theoretical part: the planning, implementation and control roles were all found in the case company. Budget bias related to bonuses could not be objectively identified in the case company. However, it emerged that the company exhibited budget bias in the form of systematic understatement of investment return estimates. Simons' theoretical model for analysing the performance of company management is in use in the case company, because the company prepares a long-term strategic investment budget. Investment proposals are evaluated by means of the return on capital, market share development and the present value method. The case study revealed that unit managers want more decision-making authority and risk-taking, especially when the investment budget is prepared.
Abstract:
University support foundations hold a fairly large amount of investable assets. This thesis examines the financial position these support foundations have had, and the kind of investment activity and asset allocation that can be found in them. The study covers eight Finnish university support foundations. The research material consists of the foundations' financial statements for the years 2001-2005. In addition, representatives of the participating foundations and two asset managers were interviewed for the study. The thesis uses both quantitative and qualitative research methods. The foundations' investment activities were examined through their investment and operating strategies, the organisation of asset management, asset allocation, and future outlook. Based on the results it can be observed, for example, that the foundations' investment activity is characterised by a requirement for safe investing that secures a certain return, together with as long an investment horizon as possible. However, some differences in investment activity were also found between the foundations.