933 results for Return-based pricing kernel


Relevance:

100.00%

Publisher:

Abstract:

This research aimed to examine the process of allocating indirect costs in three food-industry companies in the city of Campina Grande, State of Paraíba, comparing these companies' allocation methods with the relevant literature (Chapter I). To better understand the relationship between the theory and practice of indirect cost allocation, a literature review is presented, covering basic concepts and the classification of these costs, as well as comments on the allocation process in view of four basic purposes of cost accounting: inventory valuation, price setting, performance evaluation, and analysis for decision-making (Chapter II). Given the scarcity of empirical data in this area, the restriction of the research to three companies, and the need for a deeper description of current allocation practices, the case study method was adopted (Chapter III). The cases studied are then presented, together with a description of the allocation methods adopted by the companies (Chapter IV). The results obtained made it possible to analyze the allocation methods investigated against the literature reviewed (Chapter V). Finally, a summary is presented, leading to the main conclusions and including recommendations and suggestions for further studies in this area of knowledge (Chapter VI).
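
The thesis itself reports no code, but the allocation logic it examines is easy to illustrate. The sketch below is a hypothetical example, not drawn from the studied firms: it allocates a pool of indirect costs to product lines in proportion to a single cost driver (machine hours), the simplest of the approaches discussed in the cost-accounting literature.

```python
# Minimal illustration of single-driver indirect cost allocation.
# All figures and product names are hypothetical; the thesis publishes no firm data.

indirect_cost_pool = 120_000.00  # total overhead for the period (BRL)

# Cost-driver consumption per product line (machine hours).
machine_hours = {"biscuits": 1_500, "pasta": 900, "snacks": 600}

total_hours = sum(machine_hours.values())
rate_per_hour = indirect_cost_pool / total_hours  # predetermined overhead rate

allocation = {product: hours * rate_per_hour
              for product, hours in machine_hours.items()}

for product, cost in allocation.items():
    print(f"{product:>8}: {cost:>10,.2f}")

# Allocated amounts sum back to the original pool.
assert abs(sum(allocation.values()) - indirect_cost_pool) < 1e-6
```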

Relevance:

100.00%

Publisher:

Abstract:

The Forward Premium Puzzle (FPP) is the name given to the empirical observation of a negative relation between future changes in the spot rate and the forward premium. Modeling this forward bias as a risk premium and under weak assumptions on the behavior of the pricing kernel, we characterize the potential bias that is present in the regressions where the FPP is observed, and we identify the necessary and sufficient conditions that the pricing kernel has to satisfy to account for the predictability of exchange rate movements. Next, we estimate the pricing kernel applying two methods: i) one, due to Araújo et al. (2005), that exploits the fact that the pricing kernel is a serial-correlation common feature of asset prices, and ii) a traditional principal component analysis used as a procedure to generate a statistical factor model. Then, using in-sample and out-of-sample exercises, we are able to show that the same kernel that explains the Equity Premium Puzzle (EPP) accounts for the FPP in all our data sets. This suggests that the quest for an economic model that generates a pricing kernel which solves the EPP may double its prize by simultaneously accounting for the FPP.
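
The abstract gives no implementation details, but the second estimation route it mentions, a principal-component statistical factor model, can be sketched as follows. The snippet uses placeholder return data and simply takes the first principal component of excess returns as a candidate factor for a linear pricing-kernel proxy, then reports the resulting pricing errors; it illustrates the general technique, not the authors' estimator or data.

```python
import numpy as np

rng = np.random.default_rng(0)
# Placeholder data: T monthly observations of N asset excess returns.
T, N = 120, 10
excess_returns = rng.normal(0.005, 0.04, size=(T, N))

# Statistical factor: first principal component of the demeaned return panel.
demeaned = excess_returns - excess_returns.mean(axis=0)
_, _, vt = np.linalg.svd(demeaned, full_matrices=False)
factor = demeaned @ vt[0]                      # time series of the first PC

# Candidate linear kernel m_t = 1 - b * (f_t - mean(f)).  The unconditional
# Euler equation for excess returns, E[m_t * R_t] = 0, becomes
# mean(R) = b * Cov(R, f); b is chosen by least squares on these N moments.
mu = excess_returns.mean(axis=0)
cov_rf = demeaned.T @ (factor - factor.mean()) / T
b = float(cov_rf @ mu / (cov_rf @ cov_rf))

kernel = 1.0 - b * (factor - factor.mean())    # estimated pricing-kernel proxy
pricing_errors = (excess_returns * kernel[:, None]).mean(axis=0)
print("b =", round(b, 3), " mean |pricing error| =", np.abs(pricing_errors).mean().round(5))
```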

Relevance:

100.00%

Publisher:

Abstract:

This dissertation consists of three essays. The first essay analyzes the publicly available information on the credit portfolio risk of Brazilian banks and is divided into two chapters. The first chapter examines the limitations of the public information disclosed by banks and by the Central Bank when compared with the managerial information available internally to the banks. It concludes that there is room for greater transparency in disclosure, something that has been happening gradually in Brazil through new rules related to Pillar 3 of Basel II and to the release of more detailed information by the Central Bank (Bacen), such as the "Top50" reports. The second part of the first essay shows the discrepancy between the accounting non-performing loan ratio (NPL) and the probability of default (PD), and also discusses the relation between provisions and expected loss. Using migration matrices and a simulation based on overlapping vintages of large banks' credit portfolios, it concludes that the NPL ratio understates the PD and that the provisions set aside by banks are lower than the expected loss of the National Financial System (SFN). The second essay relates risk management to price discrimination. A model is developed consisting of a Cournot duopoly in a retail credit market in which banks can practice third-degree price discrimination. In this model, potential borrowers can be of two types, low or high risk, with low-risk borrowers having more elastic demand. According to the model, if the cost of observing a customer's type is high, the banks' strategy is not to discriminate (a pooling equilibrium); but if this cost is sufficiently low, it is optimal for banks to charge different rates to each group. It is argued that the Basel II Accord acted as an exogenous shock that shifted the equilibrium towards greater discrimination. The third essay is divided into two chapters. The first discusses the application of the concepts of subjective probability and Knightian uncertainty to VaR models and the importance of assessing "model risk", which comprises estimation, specification, and identification risk. The essay proposes that the "four elements" methodology of operational risk (internal data, external data, business environment, and scenarios) be extended to the measurement of other risks (market risk and credit risk). The second part of this last essay deals with applying the scenario-analysis element to the measurement of conditional volatility on dates of relevant economic announcements, specifically on the days of Copom meetings.
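
The essay's comparison of the NPL ratio with the PD rests on rating migration matrices. The sketch below uses a made-up three-state matrix rather than any figure from the dissertation, and shows how a one-year PD per rating grade is read off the matrix and how a multi-year cumulative PD follows from matrix powers.

```python
import numpy as np

# Hypothetical annual migration matrix over states A (good), B (watch), D (default).
# Rows are current ratings, columns are ratings one year ahead; default is absorbing.
P = np.array([
    [0.90, 0.08, 0.02],   # from A
    [0.15, 0.75, 0.10],   # from B
    [0.00, 0.00, 1.00],   # from D (absorbing)
])

ratings = ["A", "B", "D"]
default_col = ratings.index("D")

# One-year PD per grade is simply the default column of the matrix.
pd_1y = P[:, default_col]
print("1-year PD:", dict(zip(ratings, pd_1y.round(4))))

# Cumulative PD over a longer horizon comes from powers of the matrix.
horizon = 3
P_h = np.linalg.matrix_power(P, horizon)
pd_hy = P_h[:, default_col]
print(f"{horizon}-year cumulative PD:", dict(zip(ratings, pd_hy.round(4))))
```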

Relevance:

100.00%

Publisher:

Abstract:

We study cash allocation ability as a possible explanatory factor that allows equity fund managers to produce high levels of adjusted returns (returns not explained by the risk factors they are exposed to). To do so, we examine the non-indexed Brazilian equity fund industry over the period from January 2006 to February 2015, evaluating cash allocation ability by the level and effectiveness of cash deployment, using return-based and holdings-based approaches on a database of monthly invested assets and returns. We find that even though market timing is a rare skill in the industry, the flexibility to hold high levels of cash played a significant role in the results of outperforming managers.
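
A common return-based way to test for the timing component mentioned here is the Treynor-Mazuy regression, in which a positive coefficient on the squared market excess return signals successful timing (raising exposure, i.e. cutting cash, ahead of up-markets). The sketch below is a generic version of that test on placeholder data, not the paper's own specification or sample.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 110                                  # monthly observations, e.g. Jan/2006 to Feb/2015
mkt_excess = rng.normal(0.008, 0.06, T)  # market excess return (e.g. index minus risk-free)
# Placeholder fund excess return with a small timing component built in.
fund_excess = 0.002 + 0.9 * mkt_excess + 0.8 * mkt_excess**2 + rng.normal(0, 0.02, T)

# Treynor-Mazuy: r_fund = alpha + beta * r_mkt + gamma * r_mkt^2 + eps
X = np.column_stack([np.ones(T), mkt_excess, mkt_excess**2])
coef, *_ = np.linalg.lstsq(X, fund_excess, rcond=None)
alpha, beta, gamma = coef
print(f"alpha={alpha:.4f}  beta={beta:.2f}  gamma={gamma:.2f}")
# gamma > 0 suggests the manager increased market exposure before up-markets.
```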

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this study is to analyze the durations of pension funds' fixed-income portfolios, which are paradoxically short relative to the long-term objectives inherent to retirement saving, and the possible effects that the retention incentives present in instituted group plans, such as the instituting entity's contributions and vesting rules on withdrawal, have on the lengthening of these portfolios. To overcome the difficulty of directly observing the maturities of the analyzed funds' portfolios, a duration index was proposed based on the Returns-Based Style Analysis developed by SHARPE (1992), using the principal components of Anbima's Constant Duration Indices (IDkA) to assess the sensitivity of the funds' monthly returns to the real and nominal yield curves. The results show no evidence that funds receiving resources exclusively from instituted plans have longer durations than those receiving resources from individual plans and endorsed ("averbados") group plans. On the other hand, funds classified by Anbima as "Previdência Data Alvo" (target-date) stand out, presenting higher duration indices than the average of funds classified as "Previdência Renda Fixa" or "Previdência Balanceado" and a positive correlation between their duration indices and the fund's target year, which suggests that policies acting on the information set of the agents, investors and managers alike, can change investment allocation. Information alone is enough to improve allocation.
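
Sharpe's (1992) Returns-Based Style Analysis, on which the proposed duration index rests, is a constrained regression of fund returns on benchmark returns with non-negative weights that sum to one. The sketch below runs that optimization on placeholder fund and index series; the benchmark labels and data are illustrative, not the study's actual IDkA inputs.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
T, K = 60, 4
# Placeholder monthly returns of style benchmarks (labels are hypothetical).
benchmarks = rng.normal(0.009, 0.015, size=(T, K))
true_w = np.array([0.10, 0.25, 0.15, 0.50])
fund = benchmarks @ true_w + rng.normal(0, 0.002, T)   # fund to be "styled"

def tracking_variance(w):
    # Sharpe's objective: variance of the return not explained by the style mix.
    return np.var(fund - benchmarks @ w)

cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)   # weights sum to 1
bounds = [(0.0, 1.0)] * K                                   # long-only weights
w0 = np.full(K, 1.0 / K)
res = minimize(tracking_variance, w0, bounds=bounds, constraints=cons, method="SLSQP")

weights = res.x
print("estimated style weights:", weights.round(3))
# A duration index could then weight each benchmark's duration by these loadings.
```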

Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

100.00%

Publisher:

Abstract:

The developmental processes and functions of an organism are controlled by its genes and the proteins derived from these genes. The identification of key genes and the reconstruction of gene networks can provide a model to help us understand the regulatory mechanisms for the initiation and progression of biological processes or functional abnormalities (e.g. diseases) in living organisms. In this dissertation, I have developed statistical methods to identify the genes and transcription factors (TFs) involved in biological processes, constructed their regulatory networks, and also evaluated some existing association methods to find robust methods for coexpression analyses. Two kinds of data sets were used for this work: genotype data and gene expression microarray data. On the basis of these data sets, this dissertation has two major parts, comprising six chapters. The first part deals with developing association methods for rare variants using genotype data (chapters 4 and 5). The second part deals with developing and/or evaluating statistical methods to identify genes and TFs involved in biological processes, and constructing their regulatory networks using gene expression data (chapters 2, 3, and 6). For the first part, I have developed two methods to find the groupwise association of rare variants with given diseases or traits. The first method is based on kernel machine learning and can be applied to both quantitative and qualitative traits. Simulation results showed that the proposed method has improved power over the existing weighted sum (WS) method in most settings. The second method uses multiple phenotypes to select a few top significant genes. It then finds the association of each gene with each phenotype while controlling for population stratification by adjusting the data for ancestry using principal components. This method was applied to GAW 17 data and was able to find several disease risk genes. For the second part, I have worked on three problems. The first problem involved the evaluation of eight gene association methods; a comprehensive comparison of these methods, with further analysis, clearly demonstrates their distinct and common performance. For the second problem, an algorithm named the bottom-up graphical Gaussian model was developed to identify the TFs that regulate pathway genes and reconstruct their hierarchical regulatory networks. This algorithm has produced very significant results, and it is the first report to produce such hierarchical networks for these pathways. The third problem dealt with developing another algorithm, called the top-down graphical Gaussian model, that identifies the network governed by a specific TF. The networks produced by this algorithm proved to be highly accurate.
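
Both network algorithms described here build on the graphical Gaussian model, in which conditional dependence between genes is read from the inverse covariance (precision) matrix. The sketch below shows that core step on simulated expression data; it is a generic illustration of the technique, not the dissertation's bottom-up or top-down algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_genes = 200, 6
# Simulated expression data with a simple chain dependence g0 -> g1 -> g2.
X = rng.normal(size=(n_samples, n_genes))
X[:, 1] += 0.8 * X[:, 0]
X[:, 2] += 0.8 * X[:, 1]

# Graphical Gaussian model: partial correlations from the precision matrix.
cov = np.cov(X, rowvar=False)
precision = np.linalg.inv(cov)
d = np.sqrt(np.diag(precision))
partial_corr = -precision / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)

# Edges of the conditional-independence graph: |partial correlation| above a cutoff.
threshold = 0.2
edges = [(i, j) for i in range(n_genes) for j in range(i + 1, n_genes)
         if abs(partial_corr[i, j]) > threshold]
print("inferred edges:", edges)   # expect (0, 1) and (1, 2), but not (0, 2)
```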

Relevance:

100.00%

Publisher:

Abstract:

This chapter provides a detailed discussion of the evidence on housing and mortgage lending discrimination, as well as the potential impacts of such discrimination on minority outcomes like homeownership and neighborhood environment. The paper begins by discussing conceptual issues surrounding empirical analyses of discrimination, including explanations for why discrimination takes place, definitions of the different forms of discrimination, and the appropriate interpretation of observed racial and ethnic differences in treatment or outcomes. Next, the paper reviews evidence on housing market discrimination, starting with evidence of segregation and price differences in the housing market, followed by direct evidence of discrimination by real estate agents in paired-testing studies. Finally, mortgage market discrimination and barriers in access to mortgage credit are discussed. This discussion begins with an assessment of the role credit barriers play in explaining racial and ethnic differences in homeownership and follows with analyses of underwriting and the price of credit based on administrative and private-sector data sources, including analyses of the subprime market. The paper concludes that housing discrimination has declined, especially in the market for owner-occupied housing, and does not appear to play a large role in limiting the neighborhood choices of minority households or the concentration of minorities in central cities. On the other hand, the patterns of racial centralization and the lower homeownership rates of African-Americans appear to be related to each other, and lower minority homeownership rates are in part attributable to barriers in the market for mortgage credit. The paper presents considerable evidence of racial and ethnic differences in mortgage underwriting, as well as additional evidence suggesting these differences may be attributable to differential provision of coaching, assistance, and support by loan officers. At this point, innovation in loan products, the shift towards risk-based pricing, and the growth of the subprime market have not mitigated the role credit barriers play in explaining racial and ethnic differences in homeownership. Further, the growth of the subprime lending industry appears to have segmented the mortgage market geographically, increasing the cost of relying on local or neighborhood sources of mortgage credit and affecting the integrity of many low-income minority neighborhoods through increased foreclosure rates.

Relevance:

100.00%

Publisher:

Abstract:

There is currently great interest in the natural gas market. There are many reasons why this fuel is positioned as one of the most important in the global energy landscape. Besides filling the gap left by coal and oil, it is a much cleaner alternative that could be developed further at the domestic and industrial levels as well as in transport. The natural gas industry is changing rapidly, mainly because of the emergence of unconventional gas and its extraction techniques, which is changing the economics of gas production as well as the dynamics and flows of LNG across the globe. The purpose of this study is to assess the state of the natural gas sector and market worldwide and thereby highlight the main regions that set the general price trend across the globe. In addition, this work presents the forecasts expected for the coming years, together with a summary of the trends followed so far. Particular attention is paid to the move towards hub-based systems, which began in the US and reached the United Kingdom and continental Europe at the beginning of the 21st century. This is the model that Spain intends to adopt in order to achieve greater competitiveness, flexibility, and liquidity in prices and in the gas system, gradually building the structure towards a single European market, the final objective set by the member states' institutions. However, putting this new model into operation requires a series of changes to the system, such as amending the Hydrocarbons Law, designating a market operator, drawing up a set of rules to regulate the market, and fostering market liquidity. Once the regulatory change takes place, the liquidity of the Spanish system will increase, creating opportunities for new ways to balance gas portfolios and new strategies to manage risk. Nevertheless, before the legislative changes take effect, one of the models proposed in the "Gas Target Model", the so-called "Implicit Capacity Allocation Model", would be implemented. Introducing this model would be a first step towards the integration of a gas market without the need for legislative change, and would serve as a springboard towards the "Market Area Model", which would be the best fit for the Spanish gas system and would connect it broadly with the rest of the European markets. Regarding the formation of the new hub-based model, the study concludes that the new situation must be exploited to the fullest and the hub implemented as soon as possible, in order to give the system greater competition and liquidity. In addition, the Spanish system should take advantage of its large capacity and modern infrastructure to turn the country into the gas gateway of south-western Europe, thereby enhancing the security of supply of the member states. Another conclusion that can be drawn from the report is the need to increase the penetration of gas in Spain and to encourage its consumption over other fossil fuels such as coal and oil. This would position natural gas as the main back-up energy for renewables and would allow the price per kilowatt-hour of natural gas to fall. Studying and analyzing the dynamics at work in the world gas industry is essential in order to anticipate and plan the best strategies for the changes that will gradually reshape the gas sector and market.
ABSTRACT There is a great deal of focus on the natural gas market at the moment. Whether you view natural gas as bridging the gap between coal/oil and an altogether cleaner solution yet to be determined, or as a destination fuel which will be used not only for heating and gas fired generation but also as transportation fuel, there is no doubt that natural gas will have an increasingly important role to play in the global energy landscape. The natural gas industry is changing rapidly, as shale gas exploration changes the economics of gas production and LNG connects regions across the globe. The purpose of this study is to outline the present state of the global gas industry, highlighting the differing models around the world. This study will pay particular attention to the move towards hub-based pricing that has taken hold first in the US and over the past decade across the UK and Continental Europe. In the coming years the Spanish model will move towards hub-based pricing. As gas market regulatory change takes hold, liquidity in the Spanish gas market will increase, bringing with it new ways to balance gas portfolios and placing an increasing focus on managing price risk. This study will in turn establish the links between the changes that have taken place in other markets as a way to better understand how the Spanish market will evolve in the coming years.

Relevance:

100.00%

Publisher:

Abstract:

The evolution of smartphones equipped with digital cameras is driving a growing demand for ever more complex applications that require real-time computer vision algorithms. Since video signals keep growing in size while the performance of single-core processors has stagnated, new computer vision algorithms must be parallel in order to run on multiple processors and be computationally scalable. One of the most interesting classes of processors today is found in graphics cards (GPUs), devices that offer a high degree of parallelism, excellent numerical performance, and growing versatility, which makes them attractive for scientific computing. This thesis explores two computer vision applications of great computational complexity that cannot be executed in real time on traditional processors. As the thesis shows, however, parallelizing their subtasks and implementing them on a GPU yields the desired result of execution at interactive frame rates. A technique for the fast evaluation of functions of arbitrary complexity, especially suited to GPU use, is also proposed. First, the thesis studies the synthesis of virtual images from only two distant, non-parallel cameras, in contrast to the usual 3D TV configuration of close, parallel cameras, using colour and depth information. Employing modified median filters to build a virtual depth map, together with inverse projections, these techniques are shown to be adequate for free choice of viewpoint. It is also shown that encoding depth information with respect to a global reference system is highly detrimental and should be avoided. Second, a moving-object detection system based on kernel (local-function) density estimation is proposed. Such techniques are well suited to modelling complex scenes with multimodal backgrounds, but have seen little use because of their high computational cost. The proposed system, implemented in real time on a GPU, includes proposals for dynamic estimation of the kernel bandwidths, selective updating of the background model, updating of the positions of the foreground model's reference samples using a multi-region particle filter, and automatic selection of regions of interest to reduce computational cost. The results, evaluated on several databases and compared with other state-of-the-art algorithms, demonstrate the great versatility and quality of the proposal. Finally, a method is proposed for approximating arbitrary functions with continuous piecewise linear functions, especially suited to GPU implementation through the use of the texture filtering units, which are normally not used for numerical computation. The proposal includes a rigorous mathematical analysis of the approximation error as a function of the number of samples used, as well as a method for obtaining a quasi-optimal partition of the function's domain to minimize the error.
ABSTRACT The evolution of smartphones, all equipped with digital cameras, is driving a growing demand for ever more complex applications that need to rely on real-time computer vision algorithms. However, video signals are only increasing in size, whereas the performance of single-core processors has somewhat stagnated in the past few years. Consequently, new computer vision algorithms will need to be parallel to run on multiple processors and be computationally scalable. One of the most promising classes of processors nowadays can be found in graphics processing units (GPU). These are devices offering a high degree of parallelism, excellent numerical performance and increasing versatility, which makes them interesting to run scientific computations. In this thesis, we explore two computer vision applications with a high computational complexity that precludes them from running in real time on traditional uniprocessors. However, we show that by parallelizing subtasks and implementing them on a GPU, both applications attain their goals of running at interactive frame rates. In addition, we propose a technique for fast evaluation of arbitrarily complex functions, specially designed for GPU implementation. First, we explore the application of depth-image-based rendering techniques to the unusual configuration of two convergent, wide-baseline cameras, in contrast to the narrow-baseline, parallel cameras usually used in 3D TV. By using a backward mapping approach with a depth inpainting scheme based on median filters, we show that these techniques are adequate for free viewpoint video applications. In addition, we show that referring depth information to a global reference system is ill-advised and should be avoided. Then, we propose a background subtraction system based on kernel density estimation techniques. These techniques are very adequate for modelling complex scenes featuring multimodal backgrounds, but have not been so popular due to their huge computational and memory complexity. The proposed system, implemented in real time on a GPU, features novel proposals for dynamic kernel bandwidth estimation for the background model, selective update of the background model, update of the position of reference samples of the foreground model using a multi-region particle filter, and automatic selection of regions of interest to reduce computational cost. The results, evaluated on several databases and compared to other state-of-the-art algorithms, demonstrate the high quality and versatility of our proposal. Finally, we propose a general method for the approximation of arbitrarily complex functions using continuous piecewise linear functions, specially formulated for GPU implementation by leveraging their texture filtering units, normally unused for numerical computation. Our proposal features a rigorous mathematical analysis of the approximation error as a function of the number of samples, as well as a method to obtain a quasi-optimal partition of the domain of the function to minimize approximation error.
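
The last contribution, approximating an arbitrary function with a continuous piecewise linear interpolant so that GPU texture filtering can evaluate it in hardware, is easy to prototype on the CPU. The sketch below is a plain NumPy stand-in for the thesis' GPU method: it tabulates a sample function on a uniform grid, evaluates it by linear interpolation, and reports the maximum approximation error as the number of samples grows. The target function and grid sizes are arbitrary choices for illustration.

```python
import numpy as np

def f(x):
    # Example target function standing in for an "arbitrarily complex" function.
    return np.exp(-x) * np.sin(4.0 * x)

lo, hi = 0.0, 2.0
queries = np.linspace(lo, hi, 10_001)        # dense evaluation points
exact = f(queries)

for n_samples in (8, 16, 32, 64, 128):
    grid = np.linspace(lo, hi, n_samples)    # uniform partition of the domain
    table = f(grid)                          # values stored in the "texture"
    approx = np.interp(queries, grid, table) # hardware-style linear interpolation
    err = np.max(np.abs(approx - exact))
    print(f"{n_samples:4d} samples  max error = {err:.2e}")

# For a twice-differentiable function, the max error of uniform piecewise linear
# interpolation shrinks roughly like O(h^2): about a factor of 4 per doubling.
```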

Relevance:

100.00%

Publisher:

Abstract:

Recently, the EU energy debate has been dominated by the discussion on energy prices and the competitiveness of the European industry. According to the latest estimates of the International Energy Agency, gas prices in the US are one-quarter of those in Europe. Moreover, prices of imported gas vary across the EU member states. Some EU policy-makers hope that the completion of the internal energy market and the transition to hub-based pricing will solve these discrepancies. Julian Wieczorkiewicz asks in this Commentary whether the abolition of oil-indexation will constitute a cure-all for the above-mentioned problems.

Relevance:

100.00%

Publisher:

Abstract:

We estimate the 'fundamental' component of euro area sovereign bond yield spreads, i.e. the part of bond spreads that can be justified by country-specific economic factors, euro area economic fundamentals, and international influences. The yield spread decomposition is achieved using a multi-market, no-arbitrage affine term structure model with a unique pricing kernel. More specifically, we use the canonical representation proposed by Joslin, Singleton, and Zhu (2011) and introduce, next to standard spanned factors, a set of unspanned macro factors, as in Joslin, Priebsch, and Singleton (2013). The model is applied to yield curve data from Belgium, France, Germany, Italy, and Spain over the period 2005-2013. Overall, our results show that economic fundamentals are the dominant drivers of sovereign bond spreads. Nevertheless, shocks unrelated to the fundamental component of the spread have played an important role in the dynamics of bond spreads since the intensification of the sovereign debt crisis in the summer of 2011.
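
In term structure models of the Joslin-Singleton-Zhu type, the spanned factors are typically linear combinations (often principal components) of the observed yields, while unspanned macro factors enter the state dynamics but not the yield-pricing equation. The sketch below extracts three spanned factors from a placeholder panel of yields; it illustrates only that preliminary step on invented data, not the paper's full no-arbitrage estimation.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 108
maturities = np.array([1, 2, 3, 5, 7, 10])     # years to maturity

# Placeholder yield panel (percent): a level plus slope structure, plus noise.
level = 3.0 + np.cumsum(rng.normal(0, 0.08, T))
slope = rng.normal(0, 0.5, T)
yields = (level[:, None]
          + slope[:, None] * (maturities / 10.0)
          + rng.normal(0, 0.05, (T, len(maturities))))

# Spanned factors: first three principal components of the demeaned yields.
demeaned = yields - yields.mean(axis=0)
U, S, Vt = np.linalg.svd(demeaned, full_matrices=False)
spanned = demeaned @ Vt[:3].T                  # T x 3 series (roughly level, slope, curvature)

explained = (S[:3] ** 2) / (S ** 2).sum()
print("variance share of the 3 spanned factors:", explained.round(3))
# Unspanned macro factors (e.g. activity, inflation) would be appended to the
# state vector for the dynamics but excluded from the yield-pricing loadings.
```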

Relevance:

100.00%

Publisher:

Abstract:

This paper provides the most comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two non-linear techniques, namely recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite-memory predictor. The two methodologies compete to find the best-fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were non-linear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation.
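
The kernel regression used here is a finite-memory predictor. A simple way to mimic that behaviour, shown below, is kernel ridge regression re-fit on a rolling window of lagged inflation, compared against a naive random-walk forecast. This is a generic stand-in on simulated data, not the authors' exact recursive algorithm, data set, or tuning.

```python
import numpy as np

def gaussian_kernel(A, B, gamma=2.0):
    # Pairwise RBF kernel between the row vectors of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def rolling_kernel_forecast(y, window=40, lags=3, lam=1e-2, gamma=2.0):
    """One-step-ahead forecasts from kernel ridge regression re-fit on a
    rolling window: a simple finite-memory kernel predictor."""
    preds, actuals = [], []
    for t in range(window + lags, len(y) - 1):
        # Lagged design matrix built from the most recent `window` observations.
        X = np.array([y[s - lags:s] for s in range(t - window, t)])
        target = y[t - window:t]
        K = gaussian_kernel(X, X, gamma)
        alpha = np.linalg.solve(K + lam * np.eye(window), target)
        x_new = y[t - lags:t][None, :]
        preds.append(float(gaussian_kernel(x_new, X, gamma) @ alpha))
        actuals.append(y[t])
    return np.array(preds), np.array(actuals)

rng = np.random.default_rng(5)
# Simulated monthly "inflation" series with a seasonal-style cycle plus noise.
inflation = 0.2 + 0.7 * np.sin(np.arange(240) / 12.0) + rng.normal(0, 0.1, 240)
pred, actual = rolling_kernel_forecast(inflation)
rw = inflation[np.arange(40 + 3, len(inflation) - 1) - 1]   # naive random-walk forecast
print("kernel RMSE:     ", np.sqrt(np.mean((pred - actual) ** 2)).round(4))
print("random-walk RMSE:", np.sqrt(np.mean((rw - actual) ** 2)).round(4))
```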

Relevance:

100.00%

Publisher:

Abstract:

This paper provides the most comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. We use non-linear, artificial intelligence techniques, namely recurrent neural networks, evolution strategies, and kernel methods in our forecasting experiment. In the experiment, these three methodologies compete to find the best-fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were non-linear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. There is evidence in the literature that evolutionary methods can be used to evolve kernels; hence, our future work should combine evolutionary and kernel methods to obtain the benefits of both.

Relevance:

100.00%

Publisher:

Abstract:

This paper provides the most comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two nonlinear techniques, namely recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite-memory predictor. The two methodologies compete to find the best-fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were nonlinear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Beyond its economic findings, our study is in the tradition of physicists' long-standing interest in the interconnections among statistical mechanics, neural networks, and related nonparametric statistical methods, and suggests potential avenues of extension for such studies. © 2010 Elsevier B.V. All rights reserved.