989 results for "expected shortfall portfolio optimization"


Relevance: 30.00%

Abstract:

Information technology has become heavily embedded in business operations. As business needs change over time, IT applications are expected to continue providing the required support. Whether the existing IT applications are still fit for the business purpose for which they were intended, or whether new IT applications should be introduced, is a strategic decision for business, IT and business-aligned IT. In this paper, we present a method which aims to analyse business functions and IT roles, and to evaluate business-aligned IT from both social and technical perspectives. The method introduces a set of techniques that systematically supports the evaluation of existing IT applications in relation to their technical capabilities for maximising business value. Furthermore, we discuss the evaluation process and results, which are illustrated and validated through a real-life case study of a UK borough council, followed by a discussion of implications for researchers and practitioners.

Relevance: 30.00%

Abstract:

This thesis examines three different, but related, problems in the broad area of portfolio management for long-term institutional investors, focusing mainly on the case of pension funds. The first idea (Chapter 3) is the application of a novel numerical technique – robust optimization – to a real-world pension scheme (the Universities Superannuation Scheme, USS) for the first time. The corresponding empirical results are supported by many robustness checks and several benchmarks, such as the Bayes-Stein and Black-Litterman models, which are also applied for the first time in a pension ALM framework, the Sharpe and Tint model, and the actual USS asset allocations. The second idea, presented in Chapter 4, is the investigation of whether the selection of the portfolio construction strategy matters in the SRI industry, an issue of great importance for long-term investors. This study applies a variety of optimal and naïve portfolio diversification techniques to the same SRI-screened universe, and gives some answers to the question of which portfolio strategies tend to create superior SRI portfolios. Finally, the third idea (Chapter 5) compares the performance of a real-world pension scheme (USS) before and after the recent major changes in the pension rules under different dynamic asset allocation strategies and the fixed-mix portfolio approach, and quantifies the redistributive effects between various stakeholders. Although this study deals with a specific pension scheme, the methodology can be applied by other major pension schemes in countries such as the UK and USA that have changed their rules.
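The Bayes-Stein approach mentioned in this abstract shrinks noisy sample mean returns toward a common target before optimization. A minimal sketch, assuming a Jorion-style shrinkage intensity and a grand-mean target (the thesis's exact estimator may differ):

```python
import numpy as np

def bayes_stein_means(returns):
    """Shrink sample mean returns toward the grand mean (Jorion-style).

    returns: T x N matrix of asset returns. The shrinkage intensity grows
    with estimation noise and shrinks as the sample size T increases."""
    T, N = returns.shape
    mu = returns.mean(axis=0)
    sigma = np.cov(returns, rowvar=False)
    grand = mu.mean()                       # shrinkage target: grand mean
    diff = mu - grand
    sigma_inv = np.linalg.pinv(sigma)
    # Jorion-style shrinkage intensity (illustrative choice)
    w = (N + 2) / ((N + 2) + T * diff @ sigma_inv @ diff)
    w = np.clip(w, 0.0, 1.0)
    return (1 - w) * mu + w * grand

rng = np.random.default_rng(0)
R = rng.normal(0.01, 0.05, size=(120, 5))   # 120 months, 5 assets
mu_bs = bayes_stein_means(R)
```

Each shrunk mean lies between its sample estimate and the grand mean, which is what stabilises the downstream asset allocation.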

Relevance: 30.00%

Abstract:

The aim of this study was to develop a fast capillary electrophoresis method for the determination of benzoate and sorbate ions in commercial beverages. In the method development, the pH and constituents of the background electrolyte were selected using the effective mobility versus pH curves. As the high resolution obtained experimentally for sorbate and benzoate in the studies presented in the literature is not in agreement with that expected from the published ionic mobility values, a procedure to determine these values was carried out. The salicylate ion was used as the internal standard. The background electrolyte was composed of 25 mmol L⁻¹ tris(hydroxymethyl)aminomethane and 12.5 mmol L⁻¹ 2-hydroxyisobutyric acid, at pH 8.1. Separation was conducted in a fused-silica capillary (32 cm total length, 8.5 cm effective length, 50 µm I.D.), with a short-end injection configuration and direct UV detection at 200 nm for benzoate and salicylate and 254 nm for sorbate ions. The run time was only 28 s. Figures of merit of the proposed method include good linearity (R² > 0.999), limits of detection of 0.9 and 0.3 mg L⁻¹ for benzoate and sorbate, respectively, inter-day precision better than 2.7% (n = 9) and recovery in the range 97.9-105%. Beverage samples were prepared by simple dilution with deionized water (1:11, v/v). Concentrations in the range of 197-401 mg L⁻¹ for benzoate and 28-144 mg L⁻¹ for sorbate were found in soft drinks and tea.

Relevance: 30.00%

Abstract:

This study contributes a rigorous diagnostic assessment of state-of-the-art multiobjective evolutionary algorithms (MOEAs) and highlights key advances that the water resources field can exploit to better discover the critical tradeoffs constraining our systems. This study provides the most comprehensive diagnostic assessment of MOEAs for water resources to date, exploiting more than 100,000 MOEA runs and trillions of design evaluations. The diagnostic assessment measures the effectiveness, efficiency, reliability, and controllability of ten benchmark MOEAs for a representative suite of water resources applications addressing rainfall-runoff calibration, long-term groundwater monitoring (LTM), and risk-based water supply portfolio planning. The suite of problems encompasses a range of challenging problem properties including (1) many-objective formulations with 4 or more objectives, (2) multi-modality (or false optima), (3) nonlinearity, (4) discreteness, (5) severe constraints, (6) stochastic objectives, and (7) non-separability (also called epistasis). The applications are representative of the dominant problem classes that have shaped the history of MOEAs in water resources and that will be dominant foci in the future. Recommendations are provided for which modern MOEAs should serve as tools and benchmarks in the future water resources literature.
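A standard way such diagnostic assessments score MOEA approximation sets is the hypervolume indicator: the measure of the objective-space region dominated by a front, relative to a reference point. A minimal two-objective sketch for minimisation (the reference point and front are illustrative; the study's actual metrics are not specified here):

```python
def hypervolume_2d(front, ref):
    """Hypervolume dominated by a 2-D Pareto front (minimisation),
    relative to reference point ref. Points not strictly better than
    ref in both objectives contribute nothing."""
    pts = sorted(p for p in front if p[0] < ref[0] and p[1] < ref[1])
    hv, prev_y = 0.0, ref[1]
    for x, y in pts:             # sweep left to right, stacking slabs
        if y < prev_y:           # skip dominated points
            hv += (ref[0] - x) * (prev_y - y)
            prev_y = y
    return hv

front = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)]
hv = hypervolume_2d(front, (4.0, 4.0))       # 6.0
```

Larger hypervolume means better convergence and spread, which makes it a common headline statistic when comparing effectiveness and reliability across many MOEA runs.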

Relevance: 30.00%

Abstract:

In this paper we apply the theory of decision making with expected utility and non-additive priors to the choice of an optimal portfolio. This theory describes the behavior of a rational agent who is averse to pure 'uncertainty' (as well as, possibly, to 'risk'). We study the agent's optimal allocation of wealth between a safe and an uncertain asset. We show that there is a range of prices at which the agent neither buys nor sells short the uncertain asset. In contrast, the standard theory of expected utility predicts that there is exactly one such price. We also provide a definition of an increase in uncertainty aversion and show that it causes the range of prices to increase.
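The no-trade price range can be illustrated with the closely related maxmin (multiple-priors) representation of uncertainty aversion: a risk-neutral agent buys only below the lowest expected payoff over her set of priors and sells short only above the highest. A hedged sketch (payoffs and priors are made up for illustration; the paper itself works with non-additive priors):

```python
def no_trade_interval(payoffs, priors):
    """Price interval at which an uncertainty-averse (maxmin) agent
    neither buys nor short-sells the uncertain asset.

    payoffs: payoff of the asset in each state of the world.
    priors:  the set of probability vectors the agent entertains."""
    expectations = [sum(p * x for p, x in zip(prior, payoffs))
                    for prior in priors]
    return min(expectations), max(expectations)

payoffs = [0.0, 100.0]
priors = [(0.6, 0.4), (0.5, 0.5), (0.4, 0.6)]   # multiple priors
lo, hi = no_trade_interval(payoffs, priors)      # ≈ (40.0, 60.0)
```

With a single prior the interval collapses to one price, which is exactly the contrast with standard expected utility that the abstract describes; a larger set of priors (more uncertainty aversion) widens the interval.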

Relevance: 30.00%

Abstract:

Considering the first three moments and allowing short sales, the efficient portfolio set for n risky assets and a riskless one is found, supposing that agents like odd moments and dislike even ones. Analytical formulas for the solution surface are obtained, and important geometric properties provide insights into its shape in the three-dimensional space defined by the moments. A special duality result is needed and proved. The methodology is general, comprising situations in which, for instance, the investor trades a negative skewness for a higher expected return. Computation of the optimum portfolio weights is feasible in most cases.
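The first three portfolio moments follow directly from the weights and the asset (co)moment arrays. A minimal sketch with an illustrative two-asset example (coskewness set to zero here; the paper's analytical surface is not reproduced):

```python
import numpy as np

def portfolio_moments(w, mu, cov, coskew):
    """First three moments of the portfolio return for weights w.
    coskew[i, j, k] = E[(r_i - mu_i)(r_j - mu_j)(r_k - mu_k)]."""
    mean = w @ mu                                   # odd moment: liked
    var = w @ cov @ w                               # even moment: disliked
    skew = np.einsum('i,j,k,ijk->', w, w, w, coskew)  # odd moment: liked
    return mean, var, skew

w = np.array([0.5, 0.5])
mu = np.array([0.10, 0.20])
cov = np.array([[0.04, 0.0], [0.0, 0.09]])
coskew = np.zeros((2, 2, 2))
m, v, s = portfolio_moments(w, mu, cov, coskew)
```

An efficient set in this setting collects the weight vectors that cannot be improved in any liked moment without worsening a disliked one.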

Relevance: 30.00%

Abstract:

The aim of this work is to verify whether, when higher-order moments (skewness and kurtosis) are taken into account in the allocation of a carry-trade portfolio, there are gains relative to the traditional allocation that prioritises only the first two moments (mean and variance). The research hypothesis is that carry-trade currencies have returns with a non-Normal distribution, and that their higher-order moments follow a dynamics that can be modelled with a model of the GARCH family, in this case IC-GARCHSK. This model consists of one equation for each conditional moment of the independent components, namely: the return, the variance, the skewness and the kurtosis. Another hypothesis is that an investor with a CARA (constant absolute risk aversion) utility function can have it approximated by a 4th-order Taylor expansion. The strategy of the work is to model the dynamics of the moments of the series of natural logarithms of the daily returns of some carry-trade currencies through the IC-GARCHSK model, and to estimate the optimal portfolio allocation dynamically, so as to maximise the investor's utility function. The results show that there are indeed gains when higher-order moments are taken into account, since the opportunity cost of this portfolio was lower than that of a portfolio built using only mean and variance as criteria.
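The 4th-order Taylor expansion of CARA utility mentioned above has a closed form in the first four central moments of wealth: E[-e^(-γW)] ≈ -e^(-γμ)(1 + γ²σ²/2 - γ³m₃/6 + γ⁴m₄/24). A direct sketch of that approximation:

```python
import math

def cara_expected_utility(mu, var, m3, m4, gamma):
    """4th-order Taylor approximation of E[-exp(-gamma * W)] around mu.
    var, m3, m4 are the 2nd, 3rd and 4th central moments of wealth W."""
    u = math.exp(-gamma * mu)
    return -u * (1.0
                 + gamma**2 * var / 2.0   # variance penalised
                 - gamma**3 * m3 / 6.0    # skewness rewarded
                 + gamma**4 * m4 / 24.0)  # kurtosis penalised
```

The signs make the preference structure explicit: utility rises with mean and skewness (odd moments) and falls with variance and kurtosis (even moments), which is what feeding the IC-GARCHSK conditional moments into the allocation exploits.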

Relevance: 30.00%

Abstract:

This work analyses the performance of regularised portfolio optimisation models using financial assets from the Brazilian market. In particular, we regularise the portfolios through constraints on the norm of the asset weights, as in DeMiguel et al. (2009). Additionally, we also analyse the performance of portfolios that take into account information about the structure of groups of assets with similar characteristics, as proposed by Fernandes, Rocha and Souza (2011). While the covariance matrix employed in the analyses is estimated from sample data, the expected returns are obtained through the reverse optimisation of the market equilibrium portfolio proposed by Black and Litterman (1992). The out-of-sample empirical analysis for the period between January 2010 and October 2014 indicates that, in line with previous studies, penalising the norms of the weights can lead (depending on the norm chosen and the intensity of the constraint) to better performance in terms of Sharpe ratio and mean return, relative to portfolios obtained via the traditional Markowitz model. Moreover, the inclusion of information about asset groups can also benefit the computation of optimal portfolios, both relative to traditional methods and relative to the cases that do not use the group structure.
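The norm-regularisation idea can be sketched with the 2-norm case, which admits a closed form: penalising ||w||² in the minimum-variance problem shrinks the weights toward the equal-weight portfolio. This is an illustrative variant in the spirit of DeMiguel et al. (2009), not the exact constraint set or data used in the study:

```python
import numpy as np

def min_variance_ridge(cov, eta=0.0):
    """Minimum-variance weights with a 2-norm (ridge) penalty eta:
    solves min_w w'Σw + eta * ||w||²  s.t.  sum(w) = 1.
    As eta grows, weights shrink toward the 1/n equal-weight portfolio."""
    n = cov.shape[0]
    ones = np.ones(n)
    a = np.linalg.solve(cov + eta * np.eye(n), ones)
    return a / (ones @ a)

cov = np.array([[0.04, 0.0], [0.0, 0.01]])
w_plain = min_variance_ridge(cov)             # ≈ [0.2, 0.8]
w_reg = min_variance_ridge(cov, eta=100.0)    # ≈ [0.5, 0.5]
```

The penalty is equivalent to shrinking the sample covariance toward the identity, which is one reason norm-regularised portfolios tend to be less sensitive to estimation error.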

Relevance: 30.00%

Abstract:

In the last decade mobile wireless communications have witnessed an explosive growth in the user's penetration rate and their widespread deployment around the globe. It is expected that this tendency will continue to increase with the convergence of fixed Internet wired networks with mobile ones and with the evolution to the full IP architecture paradigm. Therefore mobile wireless communications will be of paramount importance in the development of the information society of the near future. A research topic of particular relevance in telecommunications nowadays is the design and implementation of 4th-generation mobile communication systems. 4G networks will be characterized by the support of multiple radio access technologies in a core network fully compliant with the Internet Protocol (all-IP paradigm). Such networks will sustain the stringent quality of service (QoS) requirements and the expected high data rates from the type of multimedia applications to be available in the near future. The approach followed in the design and implementation of the mobile wireless networks of the current generation (2G and 3G) has been the stratification of the architecture into a communication protocol model composed of a set of layers, in which each one encompasses some set of functionalities. In such a layered protocol model, communication is only allowed between adjacent layers and through specific interface service points. This modular concept eases the implementation of new functionalities, as the behaviour of each layer in the protocol stack is not affected by the others. However, the fact that lower layers in the protocol stack model do not utilize information available from upper layers, and vice versa, degrades the achievable performance. This is particularly relevant if multiple antenna systems, in a MIMO (Multiple Input Multiple Output) configuration, are implemented.
MIMO schemes introduce another degree of freedom for radio resource allocation: the space domain. Contrary to the time and frequency domains, radio resources mapped into the spatial domain cannot be assumed to be completely orthogonal, due to the interference resulting from users transmitting in the same frequency sub-channel and/or time slots but in different spatial beams. Therefore, the availability of information regarding the state of radio resources, from lower to upper layers, is of fundamental importance in achieving the levels of QoS expected from those multimedia applications. In order to match application requirements and the constraints of the mobile radio channel, in the last few years researchers have proposed a new paradigm for the layered communications architecture: the cross-layer design framework. In general terms, the cross-layer design paradigm refers to a protocol design in which the dependence between protocol layers is actively exploited, by breaking the stringent rules which restrict communication to adjacent layers in the original reference model, and allowing direct interaction among different layers of the stack. An efficient management of the set of available radio resources demands the implementation of efficient and low-complexity packet schedulers which prioritize users' transmissions according to inputs provided from lower as well as upper layers in the protocol stack, fully compliant with the cross-layer design paradigm. Specifically, efficiently designed packet schedulers for 4G networks should maximize the available capacity, through consideration of the limitations imposed by the mobile radio channel, and comply with the set of QoS requirements from the application layer. The IEEE 802.16e standard, also known as Mobile WiMAX, seems to comply with the specifications of 4G mobile networks.
The scalable architecture, low-cost implementation and high data throughput enable efficient data multiplexing and low data latency, attributes essential to enabling broadband data services. Also, the connection-oriented approach of its medium access layer is fully compliant with the quality of service demands of such applications. Therefore, Mobile WiMAX seems to be a promising candidate for 4G mobile wireless networks. This thesis proposes the investigation, design and implementation of packet scheduling algorithms for the efficient management of the set of available radio resources in the time, frequency and spatial domains of Mobile WiMAX networks. The proposed algorithms combine input metrics from the physical layer and QoS requirements from upper layers, according to the cross-layer design paradigm. The proposed schedulers are evaluated by means of system-level simulations, conducted in a system-level simulation platform implementing the physical and medium access control layers of the IEEE 802.16e standard.
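A classic example of a cross-layer packet-scheduling metric is proportional fair, which combines each user's instantaneous achievable rate (channel state reported by the physical layer) with the average throughput already delivered (history kept above the MAC). This is a standard textbook scheduler, not necessarily one of the algorithms proposed in the thesis:

```python
def proportional_fair_pick(inst_rates, avg_throughputs):
    """Pick the user maximising instantaneous rate / average throughput.
    Users with good current channels are favoured, but starving users
    (low average throughput) see their metric grow until they are served."""
    metric = [r / max(t, 1e-9)                 # guard against divide-by-zero
              for r, t in zip(inst_rates, avg_throughputs)]
    return max(range(len(metric)), key=lambda i: metric[i])
```

For example, a user with a slightly worse channel but a much lower served throughput wins the slot, which is the fairness/capacity trade-off cross-layer schedulers are designed to strike.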

Relevance: 30.00%

Abstract:

Image restoration attempts to enhance images corrupted by noise and blurring effects. Iterative approaches can better control the restoration algorithm in order to find a compromise that restores high detail in smoothed regions without increasing the noise. Techniques based on Projections Onto Convex Sets (POCS) have been extensively used in the context of image restoration, projecting the solution onto hyperspaces until some convergence criterion is reached. It is expected that an enhanced image can be obtained at the end of an unknown number of projections. The number of convex sets and their combinations allow several POCS-based image restoration algorithms to be designed. Here, we address two convex sets: Row-Action Projections (RAP) and Limited Amplitude (LA). Although RAP and LA have already been used in the image restoration domain, the former has a relaxation parameter (λ) that strongly depends on the characteristics of the image to be restored, i.e., wrong values of λ can lead to poor restoration results. In this paper, we propose a hybrid Particle Swarm Optimization (PSO)-POCS image restoration algorithm, in which the λ value is obtained by PSO and then used to restore images by the POCS approach. Results showed that the proposed PSO-based restoration algorithm outperformed the widely used Wiener and Richardson-Lucy image restoration algorithms.
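The two projections named in the abstract are simple to state: RAP is a relaxed projection onto one hyperplane (one "row" of the degradation model) controlled by the relaxation parameter λ, and LA clips values to a valid amplitude range. A minimal sketch of both convex-set projections (the actual degradation model and image data are not specified here):

```python
import numpy as np

def row_action_projection(x, a, b, lam=1.0):
    """Relaxed projection of x onto the hyperplane a·x = b.
    lam is the relaxation parameter: 1.0 gives the exact projection,
    values in (0, 2) give under-/over-relaxed steps."""
    return x + lam * (b - a @ x) / (a @ a) * a

def limited_amplitude(x, lo=0.0, hi=255.0):
    """Projection onto the convex set of images with bounded pixel values."""
    return np.clip(x, lo, hi)

# one POCS sweep: alternate the two projections
x = np.array([0.0, 0.0])
x = row_action_projection(x, np.array([1.0, 1.0]), 2.0)
x = limited_amplitude(x)
```

The sensitivity to λ mentioned in the abstract is visible here: too small a λ makes each row-action step barely move the estimate, while too large a λ can overshoot the hyperplane, which is what motivates tuning it with PSO.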

Relevance: 30.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 30.00%

Abstract:

The study of robust design methodologies and techniques has become a topical area in design optimization in nearly all engineering and applied science disciplines over the last 10 years, due to the inevitable imprecision or uncertainty that exists in real-world design problems. To develop a fast optimizer for robust designs, a methodology based on polynomial chaos and the tabu search algorithm is proposed. In the methodology, polynomial chaos is employed as a stochastic response surface model of the objective function to efficiently evaluate the robust performance parameter, while a mechanism that assigns expected fitness only to promising solutions is introduced in the tabu search algorithm to minimize the requirement for determining robust metrics of intermediate solutions. The proposed methodology is applied to the robust design of a practical inverse problem with satisfactory results.
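A polynomial-chaos response surface can be fitted by least squares in an orthogonal basis, after which robust metrics (mean and variance of the objective under the uncertain input) are read directly off the coefficients. A one-dimensional sketch with a standard-normal input and the probabilists' Hermite basis (illustrative, not the paper's construction):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from math import factorial

def pce_mean_var(f, degree=4, n_samples=2000, seed=0):
    """Least-squares polynomial-chaos surrogate of a scalar function of
    one standard-normal input. For the probabilists' Hermite basis,
    mean = c0 and variance = sum_k k! * c_k^2 (k >= 1)."""
    x = np.random.default_rng(seed).standard_normal(n_samples)
    V = hermevander(x, degree)              # V[i, k] = He_k(x_i)
    c, *_ = np.linalg.lstsq(V, f(x), rcond=None)
    mean = c[0]
    var = sum(factorial(k) * c[k] ** 2 for k in range(1, degree + 1))
    return mean, var

m, v = pce_mean_var(lambda x: x ** 2)       # exact answer: mean 1, var 2
```

Once fitted, evaluating the surrogate's moments costs almost nothing, which is what makes the robust performance parameter cheap enough to query inside an optimizer such as tabu search.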

Relevance: 30.00%

Abstract:

An economic-statistical model is developed for variable-parameters (VP) X̄ charts in which all design parameters vary adaptively, that is, each of the design parameters (sample size, sampling interval and control-limit width) varies as a function of the most recent process information. The cost function due to controlling the process quality through a VP X̄ chart is derived. During the optimization of the cost function, constraints are imposed on the expected times to signal when the process is in and out of control. In this way, the required statistical properties can be assured. Through a numerical example, the proposed economic-statistical design approach for VP X̄ charts is compared to the economic design for VP X̄ charts and to the economic-statistical and economic designs for fixed-parameters (FP) X̄ charts in terms of the operating cost and the expected times to signal. From this example, it is possible to assess the benefits provided by the proposed model. Varying some input parameters, their effect on the optimal cost and on the optimal values of the design parameters was analysed.
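The adaptive rule behind VP charts can be sketched as a switch between a relaxed and a tightened design, depending on where the last standardised sample mean fell relative to a warning limit. The parameter values below are illustrative placeholders, not the optimized values produced by the model:

```python
def vp_chart_parameters(z_prev, warning=1.0,
                        relaxed=(5, 1.0, 3.0), tightened=(25, 0.1, 2.5)):
    """Variable-parameters X-bar chart rule (illustrative values):
    if the previous standardised sample mean falls inside the central
    region, use the relaxed design (small sample, long interval, wide
    limits); otherwise tighten all three design parameters.
    Returns (sample size n, sampling interval h, limit width k)."""
    return relaxed if abs(z_prev) <= warning else tightened
```

For example, a sample landing near the centre line keeps sampling cheap, while one in the warning zone triggers a larger sample taken sooner with narrower limits; the economic-statistical design chooses these parameter pairs to minimize cost subject to the expected-time-to-signal constraints.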

Relevance: 30.00%

Abstract:

Alternative sampling procedures are compared to the pure random search method. It is shown that the efficiency of the algorithm can be improved with respect to the expected number of steps to reach an ε-neighborhood of the optimal point.
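The efficiency measure used in such comparisons is the number of draws needed before a sampled point lands in an ε-neighborhood of the optimum. A minimal sketch of that stopping count for pure random search (the sampler, optimum and tolerance are illustrative):

```python
import random

def steps_to_epsilon(sample, optimum, eps, max_steps=100_000, seed=0):
    """Number of independent draws from `sample` until a point falls
    within an eps-neighborhood of `optimum` (or max_steps if never).
    This is the quantity whose expectation alternative sampling
    procedures try to reduce relative to pure random search."""
    random.seed(seed)
    for step in range(1, max_steps + 1):
        if abs(sample() - optimum) <= eps:
            return step
    return max_steps

steps = steps_to_epsilon(lambda: random.uniform(0.0, 1.0), 0.5, 0.05)
```

Averaging this count over many seeds estimates the expected number of steps, giving a direct way to compare pure random search against the alternative sampling procedures.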