985 results for Portfolio Analysis
Abstract:
The most liquid stocks in the IBOVESPA index reflect the behavior of stocks in general, as well as the relationship of macroeconomic variables to their behavior, and are among the most traded in the Brazilian capital market. Accordingly, factors that affect the most liquid companies are reflected in the behavior of macroeconomic variables, and the reverse is also true: fluctuations in macroeconomic factors such as the IPCA, GDP, the SELIC rate and the exchange rate also affect the most liquid stocks. The study proposes an analysis of the relationship between macroeconomic variables and the behavior of the most liquid stocks in the IBOVESPA index, corroborating studies that seek to understand the influence of macroeconomic factors on stock prices and contributing empirically to the formation of investment portfolios. The work covers the period from 2008 to 2014. The results indicate that portfolios formed to protect invested capital should contain assets negatively correlated with the variables studied, which makes it possible to compose a portfolio with reduced risk.
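A minimal illustrative sketch of the screening idea in this abstract, using synthetic data and hypothetical tickers rather than the study's dataset: compute each stock's correlation with the macro series and equally weight the names whose average correlation is negative.

```python
# Illustrative sketch only: synthetic returns, not the study's data or exact method.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_months = 84  # 2008-2014, monthly

# Hypothetical monthly returns for liquid IBOVESPA names and macro variables.
stocks = pd.DataFrame(rng.normal(0, 0.05, (n_months, 4)),
                      columns=["PETR4", "VALE3", "ITUB4", "ABEV3"])
macro = pd.DataFrame(rng.normal(0, 0.01, (n_months, 3)),
                     columns=["IPCA", "SELIC", "FX"])

# Correlation of each stock with each macro variable.
corr = pd.concat([stocks, macro], axis=1).corr().loc[stocks.columns, macro.columns]

# Keep stocks whose average correlation with the macro variables is negative
# and weight them equally, as a crude low-risk screen.
avg_corr = corr.mean(axis=1)
selected = avg_corr[avg_corr < 0].index
print(corr.round(2))
print("Equal-weight candidates:", list(selected))
```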
Abstract:
This thesis examines the role of market power in the banking market. The emphasis is on risk-taking, economies of scale, the economic efficiency of the market and the transmission of shocks. The first chapter presents an open-economy dynamic stochastic general equilibrium model with a monopolistically competitive banking market. Following Krugman's (1979, 1980) hypothesis on the relationship between economies of scale and exports, banks must pay a transaction cost to trade abroad, which decreases as the volume of their domestic activities increases. This gives banks an incentive to reduce their domestic margin in order to benefit more from the foreign market. The model is solved and simulated for various degrees of concentration in the banking market. The results indicate that two opposing forces, economies of scale and market power, confront each other when the market becomes more concentrated. Concentration also allows banks to expand their foreign activities, which in turn makes them more vulnerable to external shocks. The second chapter develops a similar framework in which banks face credit risk. This risk is partially insured by collateral provided by the entrepreneurs and can be limited through financial effort. The model is solved and simulated for various degrees of concentration in the banking market. The results show that greater market power reduces the size of the financial market and of output at the steady state, but gives banks an incentive to take fewer risks. Moreover, economies with a highly concentrated banking market are less sensitive to certain shocks, since higher margins initially give banks room to manoeuvre in the event of negative shocks. This moderating effect disappears when banks can enter and exit the market freely. A further extension with economies of scale shows that, under certain conditions, a moderately concentrated market is optimal for the economy. The third chapter uses a mean-variance portfolio model to represent a bank holding market power. The return on deposits and assets can vary with the quantity traded, which modifies the bank's portfolio choice. The bank tends to choose a portfolio with lower variance when it can obtain a higher return on an asset. Market power over deposits yields a similar result when market power is moderate, but the variance eventually increases once a certain level is reached. The results are robust to different demand functions.
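A hedged sketch of the kind of mean-variance exercise described in the third chapter, not the thesis's calibrated model: the expected return on one asset declines with the quantity held, as a crude stand-in for market power, and the bank picks the allocation that maximizes a mean-variance objective. All numbers are illustrative assumptions.

```python
# Toy mean-variance chooser with a quantity-dependent return (illustrative only).
import numpy as np

mu_base = np.array([0.03, 0.05])          # baseline expected returns on two assets
cov = np.array([[0.0004, 0.0001],
                [0.0001, 0.0016]])        # return covariance
gamma = 4.0                               # risk aversion
slope = 0.02                              # demand slope: return falls as holdings rise

best = None
for w1 in np.linspace(0, 1, 101):
    w = np.array([w1, 1 - w1])
    mu = mu_base.copy()
    mu[1] -= slope * w[1]                 # market power: larger position -> lower marginal return
    utility = w @ mu - 0.5 * gamma * w @ cov @ w
    if best is None or utility > best[0]:
        best = (utility, w, w @ cov @ w)

print("weights:", best[1].round(2), "portfolio variance:", round(float(best[2]), 6))
```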
Abstract:
Master's dissertation, Universidade de Brasília, Faculdade de Direito, Programa de Pós-Graduação Stricto Sensu em Direito, 2016.
Abstract:
A methodology for the analysis of building energy retrofits has been developed for a diverse set of buildings at the Royal Botanic Gardens (RBG), Kew in southwest London, UK. The methodology requires selection of appropriate building simulation tools dependent on the nature of the principal energy demand. This has involved the development of a stand-alone model to simulate the heat flow in botanical glasshouses, as well as stochastic simulation of electricity demand for buildings with high equipment density and occupancy-led operation. Application of the methodology to the buildings at RBG Kew illustrates the potential reduction in energy consumption at the building scale achievable from the application of retrofit measures deemed appropriate for heritage buildings and the potential benefit to be gained from onsite generation and supply of energy. © 2014 Elsevier Ltd.
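A small sketch, under assumed parameters, of what an occupancy-led stochastic electricity demand simulation could look like; this is illustrative and not the paper's model.

```python
# Illustrative occupancy-led stochastic demand sketch (all parameters assumed).
import numpy as np

rng = np.random.default_rng(3)
periods = 48                                  # half-hourly periods in a day
base_load_kw = 2.0                            # always-on equipment
per_occupant_kw = 0.15                        # marginal load per active occupant

# Occupancy profile: Poisson counts concentrated in working hours.
hours = np.arange(periods)
mean_occupancy = np.where((hours >= 18) & (hours < 36), 25, 2)
occupants = rng.poisson(mean_occupancy)

demand_kw = base_load_kw + per_occupant_kw * occupants
print("Daily electricity demand:", round(demand_kw.sum() * 0.5, 1), "kWh")
```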
Abstract:
This thesis provides a complete analysis of the Standard Capital Requirements given by Solvency II for a real insurance portfolio. We analyze the investment portfolio of BPI Vida e Pensões, an insurance company affiliated with the Portuguese bank BPI, at the security, sub-portfolio and asset-class levels. Using the Standard Formula from EIOPA, Total SCR amounts to 239M€. This value is mostly explained by Market and Default Risk, with the former driven by Spread and Concentration Risks. Following the methodology of Leblanc (2011), we examine the Marginal Contribution of an asset to the SCR, which allows the risk of each security to be evaluated given its characteristics and its interactions within the portfolio. The top contributors to the SCR are Corporate Bonds and Term Deposits. Exploring the composition of the portfolio further, our results show that slight changes in the allocation of Term and Cash Deposits have severe impacts on total Concentration and Default Risk, respectively. Diversification effects are also highly relevant, representing savings of 122M€. Finally, Solvency II represents an opportunity for portfolio optimization: by constructing efficient frontiers, we find that as the target expected return increases, allocations shift from Term Deposits/Commercial Paper to Eurozone/Peripheral and finally to Equities.
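A toy sketch of the marginal-contribution idea, in the spirit of Leblanc (2011) but not the EIOPA Standard Formula itself: per-module charges are aggregated with a correlation matrix and each asset's Euler-style contribution is approximated by a finite difference. All figures are made up.

```python
# Toy SCR aggregation and marginal contributions (not the Standard Formula).
import numpy as np

# Hypothetical module charges (EUR M) generated by each asset:
# rows = assets, columns = [spread, concentration, default] risk charges.
exposures = np.array([[40.0, 25.0,  5.0],   # corporate bonds
                      [ 5.0, 30.0, 20.0],   # term deposits
                      [10.0,  5.0,  2.0]])  # equities
corr = np.array([[1.00, 0.50, 0.25],
                 [0.50, 1.00, 0.25],
                 [0.25, 0.25, 1.00]])       # illustrative inter-module correlations

def scr(expo):
    charges = expo.sum(axis=0)              # total charge per module
    return float(np.sqrt(charges @ corr @ charges))

base = scr(exposures)
eps = 1e-4
for i, name in enumerate(["Corporate bonds", "Term deposits", "Equities"]):
    bumped = exposures.copy()
    bumped[i] *= 1 + eps                    # scale asset i's charges slightly
    marginal = (scr(bumped) - base) / eps   # Euler-style marginal contribution
    print(f"{name}: marginal contribution ~ {marginal:.1f}")

print("Total SCR:", round(base, 1),
      " Sum of charges:", round(float(exposures.sum()), 1),
      " Diversification benefit:", round(float(exposures.sum()) - base, 1))
```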
Abstract:
This paper aims to bring more information to the critical question "how are the IT areas of insurance companies defining and delivering their strategic initiative portfolios?" and to draw conclusions based on the collected data. To reach these interpretations, it comprises a theoretical investigation of the theme, the delineation of a strategy for the research methodology and a presentation of conclusions based on the findings. In this last part, the study concludes that the organization examined did not apply a sufficient number of best practices, answering the critical question with "the company is not mature on this subject".
Abstract:
Master's degree in Actuarial Science
Abstract:
Services in the form of business services or IT-enabled (Web) Services have become a corporate asset of high interest in striving towards the agile organisation. However, while the design and management of a single service is widely studied and well understood, little is known about how a set of services can be managed. This gap motivated this paper, in which we explore the concept of Service Portfolio Management. In particular, we propose a Service Portfolio Management Framework that explicates service portfolio goals, tasks, governance issues, methods and enablers. The Service Portfolio Management Framework is based upon a thorough analysis and consolidation of existing, well-established portfolio management approaches. From an academic point of view, the Service Portfolio Management Framework can be positioned as an extension of portfolio management conceptualisations in the area of service management. Based on the framework, possible directions for future research are provided. From a practical point of view, the Service Portfolio Management Framework provides an organisation with a novel approach to managing its emerging service portfolios.
Abstract:
Inquiries into return predictability are traditionally limited to the conditional mean, while the literature on portfolio selection is replete with moment-based analyses considering up to the fourth moment. This paper develops a distribution-based framework for both return prediction and portfolio selection. More specifically, a time-varying return distribution is modeled through quantile regressions and copulas, using quantile regressions to extract information in the marginal distributions and copulas to capture the dependence structure. A preference function which captures higher moments is proposed for portfolio selection. An empirical application highlights the additional information provided by the distributional approach which cannot be captured by traditional moment-based methods.
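A minimal sketch of the distribution-based approach described above, with synthetic data and a Gaussian copula chosen purely as a simplifying assumption: quantile regressions provide conditional marginal quantiles, and the copula ties them into joint return scenarios.

```python
# Illustrative quantile-regression + copula sketch (synthetic data, assumed copula).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 500
predictor = rng.normal(size=n)                       # e.g. a lagged return predictor
returns = pd.DataFrame({
    "x": predictor,
    "r1": 0.02 * predictor + rng.normal(0, 0.05, n),
    "r2": -0.01 * predictor + rng.normal(0, 0.04, n),
})

taus = np.linspace(0.05, 0.95, 19)

def conditional_quantiles(col, x_new):
    # One quantile regression per tau gives the conditional marginal distribution
    # (sorted to avoid quantile crossing in this toy example).
    q = [smf.quantreg(f"{col} ~ x", returns).fit(q=t).predict({"x": [x_new]})[0]
         for t in taus]
    return np.sort(np.array(q))

q1 = conditional_quantiles("r1", x_new=0.5)
q2 = conditional_quantiles("r2", x_new=0.5)

# Gaussian copula: draw correlated uniforms, then map through the fitted quantiles
# (tails are clamped to the outer quantiles by np.interp).
rho = returns[["r1", "r2"]].corr().iloc[0, 1]
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=1000)
u = norm.cdf(z)
sim_r1 = np.interp(u[:, 0], taus, q1)
sim_r2 = np.interp(u[:, 1], taus, q2)
print("Simulated mean returns:", round(sim_r1.mean(), 4), round(sim_r2.mean(), 4))
```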
Abstract:
Trivium is a bit-based stream cipher in the final portfolio of the eSTREAM project. In this paper, we apply the approach of Berbain et al. to Trivium-like ciphers and perform new algebraic analyses on them, namely Trivium and its reduced versions: Trivium-N, Bivium-A and Bivium-B. In doing so, we answer an open question in the literature. We demonstrate a new algebraic attack on Bivium-A. This attack requires less time and memory than previous techniques which use the F4 algorithm to recover Bivium-A's initial state. Though our attacks on Bivium-B, Trivium and Trivium-N are worse than exhaustive keysearch, the systems of equations which are constructed are smaller and less complex compared to previous algebraic analysis. Factors which can affect the complexity of our attack on Trivium-like ciphers are discussed in detail.
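For context, a short sketch of the Trivium state-update and output functions as given in the public specification; these are the relations from which algebraic equation systems for Trivium-like ciphers are typically built. This is an illustration of the cipher's structure, not the attack described in the abstract.

```python
# Trivium keystream generation per the public specification (illustrative).
def trivium_keystream(key_bits, iv_bits, nbits):
    # 288-bit internal state split into registers of 93, 84 and 111 bits.
    assert len(key_bits) == 80 and len(iv_bits) == 80
    s = key_bits + [0] * 13 + iv_bits + [0] * 4 + [0] * 108 + [1, 1, 1]
    out = []
    for i in range(4 * 288 + nbits):
        t1 = s[65] ^ s[92]
        t2 = s[161] ^ s[176]
        t3 = s[242] ^ s[287]
        if i >= 4 * 288:                      # keystream only after the 1152 warm-up rounds
            out.append(t1 ^ t2 ^ t3)
        t1 ^= (s[90] & s[91]) ^ s[170]        # nonlinear (AND) feedback terms
        t2 ^= (s[174] & s[175]) ^ s[263]
        t3 ^= (s[285] & s[286]) ^ s[68]
        s = [t3] + s[:92] + [t1] + s[93:176] + [t2] + s[177:287]
    return out

print(trivium_keystream([0] * 80, [0] * 80, 16))
```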
Abstract:
Stream ciphers are common cryptographic algorithms used to protect the confidentiality of frame-based communications such as mobile phone conversations and Internet traffic. Stream ciphers are well suited to encrypting these types of traffic, as they have the potential to encrypt quickly and securely and have low error propagation. The main objective of this thesis is to determine whether structural features of keystream generators affect the security provided by stream ciphers. These structural features pertain to the state-update and output functions used in keystream generators. Using linear sequences as keystream to encrypt messages is known to be insecure, so modern keystream generators use nonlinear sequences as keystream. The nonlinearity can be introduced through a keystream generator's state-update function, output function, or both. The first contribution of this thesis relates to nonlinear sequences produced by the well-known Trivium stream cipher. Trivium is one of the stream ciphers selected in the final portfolio of eSTREAM, a multi-year European project run under ECRYPT. Trivium's structural simplicity makes it a popular cipher to cryptanalyse, but to date there are no attacks in the public literature which are faster than exhaustive keysearch. Algebraic analyses are performed on the Trivium stream cipher, which uses a nonlinear state-update and a linear output function to produce keystream. Two algebraic investigations are performed: an examination of the sliding property in the initialisation process, and algebraic analyses of Trivium-like stream ciphers using a combination of the algebraic techniques previously applied separately by Berbain et al. and Raddum. For certain iterations of Trivium's state-update function, we examine the sets of slid pairs, looking in particular to form chains of slid pairs. No chains exist for a small number of iterations, which has implications for the period of keystreams produced by Trivium. Secondly, using our combination of the methods of Berbain et al. and Raddum, we analyse Trivium-like ciphers and improve on previous analyses with regard to forming systems of equations for these ciphers. Using these new systems of equations, we successfully recover the initial state of Bivium-A. The attack complexity for Bivium-B and Trivium is, however, worse than exhaustive keysearch. We also show that the selection of stages used as input to the output function and the size of the registers used in the construction of the system of equations affect the success of the attack. The second contribution of this thesis is the examination of state convergence. State convergence is an undesirable characteristic in keystream generators for stream ciphers, as it implies that the effective session key size of the stream cipher is smaller than the designers intended. We identify methods which can be used to detect state convergence. As a case study, the Mixer stream cipher, which uses nonlinear state-update and output functions to produce keystream, is analysed. Mixer is found to suffer from state convergence because the state-update function used in its initialisation process is not one-to-one. Several other stream ciphers known to suffer from state convergence are also discussed, and from our analysis of these ciphers three mechanisms which can cause state convergence are identified. The effect state convergence can have on stream cipher cryptanalysis is examined; we show that state convergence can have a positive effect if the goal of the attacker is to recover the initial state of the keystream generator. The third contribution of this thesis is the examination of the distributions of bit patterns in the sequences produced by nonlinear filter generators (NLFGs) and linearly filtered nonlinear feedback shift registers. We show that the selection of stages used as input to a keystream generator's output function can affect the distribution of bit patterns in the sequences produced by these keystream generators, and that the effect differs for nonlinear filter generators and linearly filtered nonlinear feedback shift registers. In the case of NLFGs, the keystream sequences produced when the output function takes inputs from consecutive register stages are less uniform than sequences produced by NLFGs whose output functions take inputs from unevenly spaced register stages. The opposite is true for keystream sequences produced by linearly filtered nonlinear feedback shift registers.
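A toy sketch, not taken from the thesis, of a nonlinear filter generator: an LFSR whose output is a nonlinear function of selected stages. Running it with consecutive versus unevenly spaced filter taps lets one tabulate the bit-pattern frequencies discussed in the third contribution. The feedback taps and filter function are arbitrary illustrative choices.

```python
# Toy nonlinear filter generator and bit-pattern tabulation (illustrative only).
from collections import Counter

def nlfg_keystream(state, lfsr_taps, filter_taps, n):
    out = []
    for _ in range(n):
        x = [state[t] for t in filter_taps]
        out.append((x[0] & x[1]) ^ x[2])       # toy nonlinear filter function
        fb = 0
        for t in lfsr_taps:
            fb ^= state[t]                     # linear feedback
        state = [fb] + state[:-1]              # shift the register
    return out

state0 = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]        # 10-stage register, nonzero seed
lfsr_taps = [9, 6]                             # illustrative feedback positions

for name, ftaps in [("consecutive taps", [0, 1, 2]), ("spread taps", [0, 4, 9])]:
    ks = nlfg_keystream(list(state0), lfsr_taps, ftaps, 5000)
    patterns = Counter(tuple(ks[i:i + 3]) for i in range(len(ks) - 2))
    print(name, dict(sorted(patterns.items())))
```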
Abstract:
Trivium is a bit-based stream cipher in the final portfolio of the eSTREAM project. In this paper, we apply the algebraic attack approach of Berbain et al. to Trivium-like ciphers and perform new analyses on them. We demonstrate a new algebraic attack on Bivium-A. This attack requires less time and memory than previous techniques to recover Bivium-A's initial state. Though our attacks on Bivium-B, Trivium and Trivium-N are worse than exhaustive keysearch, the systems of equations which are constructed are smaller and less complex than in previous algebraic analyses. We also answer an open question posed by Berbain et al. on the feasibility of applying their technique to Trivium-like ciphers. Factors which can affect the complexity of our attack on Trivium-like ciphers are discussed in detail. Analyses of Bivium-B and Trivium-N are omitted from this manuscript; the full paper is available on the IACR ePrint Archive.
Abstract:
The top-k retrieval problem aims to find the optimal set of k documents from a number of relevant documents given the user’s query. The key issue is to balance the relevance and diversity of the top-k search results. In this paper, we address this problem using Facility Location Analysis taken from Operations Research, where the locations of facilities are optimally chosen according to some criteria. We show how this analysis technique is a generalization of state-of-the-art retrieval models for diversification (such as the Modern Portfolio Theory for Information Retrieval), which treat the top-k search results like “obnoxious facilities” that should be dispersed as far as possible from each other. However, Facility Location Analysis suggests that the top-k search results could be treated like “desirable facilities” to be placed as close as possible to their customers. This leads to a new top-k retrieval model where the best representatives of the relevant documents are selected. In a series of experiments conducted on two TREC diversity collections, we show that significant improvements can be made over the current state-of-the-art through this alternative treatment of the top-k retrieval problem.
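A hedged sketch of the "desirable facilities" reading of top-k retrieval, not the paper's exact model: k documents (facilities) are chosen greedily so that the remaining relevant documents (customers) sit close to some selected document, weighted by toy relevance scores.

```python
# Greedy facility-location-style selection of k documents (illustrative only).
import numpy as np

rng = np.random.default_rng(2)
docs = rng.normal(size=(50, 8))                      # toy document vectors
relevance = rng.uniform(0.2, 1.0, size=50)           # toy relevance scores
k = 5

def assignment_cost(selected):
    # Relevance-weighted distance of every document to its nearest selected facility.
    sel = docs[selected]                             # (m, 8)
    d = np.linalg.norm(docs[:, None, :] - sel[None, :, :], axis=2).min(axis=1)
    return float((relevance * d).sum())

selected = []
for _ in range(k):                                   # greedy k-median-style selection
    best = min((i for i in range(len(docs)) if i not in selected),
               key=lambda i: assignment_cost(selected + [i]))
    selected.append(best)

print("top-k (facility-location greedy):", selected)
```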