905 results for time history analysis
Abstract:
This dissertation contains four essays that share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of financial asset volatility and correlations. The first two chapters provide useful tools for univariate applications, while the last two chapters develop multivariate methodologies. In chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on the basis of a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains related to the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models. In chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model that allows for a time-varying intercept and is implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is joint work with David Veredas. We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data-generating processes, in the presence or absence of microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of the disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient, and simple alternative for measuring integrated covariances on the basis of noisy and asynchronous prices. Along these lines, a minimum variance portfolio application shows the superiority of this disentangled realized estimator in terms of numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities, which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short-selling constraints and transaction costs.
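As a point of reference for readers unfamiliar with these model classes, the sketch below contrasts a plain GARCH(1,1) variance recursion with a recursion driven by a realized measure. It is a generic Python illustration with placeholder coefficients, not the FloGARCH or Realized GARCH specifications estimated in the dissertation.

```python
import numpy as np

def garch_filter(returns, omega, alpha, beta):
    """Standard GARCH(1,1) conditional-variance filter (illustrative only)."""
    r = np.asarray(returns, dtype=float)
    h = np.empty_like(r)
    h[0] = r.var()  # simple initialization at the sample variance
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    return h

def realized_driven_filter(returns, realized_measure, omega, beta, gamma):
    """Variance recursion driven by a realized measure, in the spirit of
    Realized GARCH; this is not the FloGARCH specification from the thesis."""
    r = np.asarray(returns, dtype=float)
    rm = np.asarray(realized_measure, dtype=float)
    h = np.empty_like(r)
    h[0] = r.var()
    for t in range(1, len(r)):
        h[t] = omega + beta * h[t - 1] + gamma * rm[t - 1]
    return h
```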
Abstract:
This work seeks to establish which fundamental factors influence the movement of the COP/USD exchange rate at an hourly intraday frequency, in order to build a model that helps estimate the risk premium of the Colombian exchange rate. Based on Pantoja (2012), a VAR (vector autoregression) model was first applied to estimate the exchange-rate risk premium; it was found that this model is not the most adequate one for explaining the data series used, so a GARCH model is proposed to model the series. Fundamental factors such as WTI, the S&P 500 and the EUR/USD exchange rate were found to explain the premium.
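For illustration, the sketch below fits a VAR to a set of hourly return series and a GARCH(1,1) to the exchange-rate returns using statsmodels and the arch package. The file name and column names (copusd_ret, wti_ret, spx_ret, eurusd_ret) are hypothetical placeholders, not the data set used in the thesis.

```python
import pandas as pd
from statsmodels.tsa.api import VAR
from arch import arch_model

# Hypothetical hourly data: COP/USD returns and candidate drivers (WTI, S&P 500, EUR/USD).
df = pd.read_csv("copusd_hourly.csv", index_col=0, parse_dates=True)  # placeholder file

# VAR on the stationary (return) series, lag order chosen by AIC.
var_res = VAR(df[["copusd_ret", "wti_ret", "spx_ret", "eurusd_ret"]]).fit(maxlags=24, ic="aic")
print(var_res.summary())

# GARCH(1,1) on the exchange-rate returns as the alternative volatility model.
garch_res = arch_model(df["copusd_ret"] * 100, vol="GARCH", p=1, q=1).fit(disp="off")
print(garch_res.summary())
```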
Abstract:
The strategy of the mobile network market (the focus of this work) is based on consolidating the installed infrastructure and optimizing existing resources. The increasing competitiveness and aggressiveness of this market require mobile operators to continuously maintain and update their networks in order to minimize failures and provide the best experience for their subscribers. In this context, this dissertation presents a study aimed at assisting mobile operators in improving future network modifications. In overview, this dissertation compares several forecasting methods (mostly based on time series analysis) capable of supporting mobile operators in their network planning. Moreover, it presents several network indicators related to the most common bottlenecks.
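As an example of the kind of comparison described, the sketch below contrasts a seasonal-naive baseline with a Holt-Winters model on a network indicator series, scoring both by mean absolute error. The synthetic series and its weekly seasonality are assumptions for illustration; the dissertation's actual indicators and methods are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hypothetical hourly traffic indicator with weekly seasonality (period = 168 hours).
rng = np.random.default_rng(1)
idx = pd.date_range("2015-01-01", periods=6 * 168, freq="h")
y = pd.Series(100 + 10 * np.sin(2 * np.pi * np.arange(len(idx)) / 168)
              + rng.normal(0, 2, len(idx)), index=idx)

train, test = y[:-168], y[-168:]

# Baseline: repeat the last observed week.
naive = train[-168:].to_numpy()

# Holt-Winters with additive trend and weekly seasonality.
hw = ExponentialSmoothing(train, trend="add", seasonal="add", seasonal_periods=168).fit()
hw_fc = hw.forecast(168).to_numpy()

print("seasonal-naive MAE:", np.mean(np.abs(test.to_numpy() - naive)))
print("Holt-Winters   MAE:", np.mean(np.abs(test.to_numpy() - hw_fc)))
```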
Abstract:
One of the most complex and necessary topics in Operations Management courses is forecasting with time series models (TSM). To facilitate understanding and help students easily grasp demand forecasting, this project presents FOR TSM, a tool developed in MS Excel VBA®. The tool was designed with a Graphical User Interface (GUI) to explain fundamental concepts such as parameter selection, initialization values, calculation and analysis of performance measures and, finally, model selection.
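The core idea the tool teaches, choosing a smoothing parameter by minimizing a performance measure, can be sketched in a few lines. The Python below performs simple exponential smoothing with a grid search over alpha, using the mean absolute deviation (MAD) as the criterion; the demand series and grid are illustrative and not taken from FOR TSM.

```python
import numpy as np

def ses_forecast(demand, alpha):
    """One-step-ahead simple exponential smoothing, initialized at the first observation."""
    level = demand[0]
    forecasts = [level]
    for d in demand[:-1]:
        level = alpha * d + (1 - alpha) * level
        forecasts.append(level)
    return np.array(forecasts)

demand = np.array([120, 132, 101, 134, 90, 98, 110, 126, 108, 115], dtype=float)  # toy data

best_alpha, best_mad = None, np.inf
for alpha in np.arange(0.05, 1.0, 0.05):      # parameter selection by grid search
    mad = np.mean(np.abs(demand - ses_forecast(demand, alpha)))
    if mad < best_mad:
        best_alpha, best_mad = alpha, mad

print(f"best alpha = {best_alpha:.2f}, MAD = {best_mad:.2f}")
```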
Abstract:
Finding rare events in multidimensional data is an important detection problem that has applications in many fields, such as risk estimation in the insurance industry, finance, flood prediction, medical diagnosis, quality assurance, security, or safety in transportation. The occurrence of such anomalies is so infrequent that there is usually not enough training data to learn an accurate statistical model of the anomaly class. In some cases, such events may never have been observed, so the only information available is a set of normal samples and an assumed pairwise similarity function. Such a metric may only be known up to a certain number of unspecified parameters, which would either need to be learned from training data or fixed by a domain expert. Sometimes the anomalous condition may be formulated algebraically, such as a measure exceeding a predefined threshold, but nuisance variables may complicate the estimation of such a measure. Change detection methods used in time series analysis are not easily extendable to the multidimensional case, where discontinuities are not localized to a single point. On the other hand, in higher dimensions, data exhibit more complex interdependencies, and there is redundancy that could be exploited to adaptively model the normal data. In the first part of this dissertation, we review the theoretical framework for anomaly detection in images and previous anomaly detection work done in the context of crack detection and detection of anomalous components in railway tracks. In the second part, we propose new anomaly detection algorithms. The fact that curvilinear discontinuities in images are sparse with respect to the frame of shearlets allows us to pose this anomaly detection problem as a basis pursuit optimization. We therefore pose the problem of detecting curvilinear anomalies in noisy textured images as a blind source separation problem under sparsity constraints, and propose an iterative shrinkage algorithm to solve it. Taking advantage of the parallel nature of this algorithm, we describe how the method can be accelerated using graphics processing units (GPUs). We then propose a new method for finding defective components on railway tracks using cameras mounted on a train. We describe how to extract features and use a combination of classifiers to solve this problem. We then scale anomaly detection to bigger datasets with complex interdependencies. We show that the anomaly detection problem fits naturally in the multitask learning framework: the first task consists of learning a compact representation of the good samples, while the second task consists of learning the anomaly detector. Using deep convolutional neural networks, we show that it is possible to train a deep model with a limited number of anomalous examples. In sequential detection problems, the presence of time-variant nuisance parameters affects the detection performance. In the last part of this dissertation, we present a method for adaptively estimating the threshold of sequential detectors using Extreme Value Theory within a Bayesian framework. Finally, conclusions on the results obtained are provided, followed by a discussion of possible future work.
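The iterative shrinkage step mentioned above follows the standard ISTA pattern for L1-regularized least squares. The sketch below shows that pattern on a generic dictionary A; it does not implement the shearlet frame or the blind source separation formulation from the dissertation.

```python
import numpy as np

def soft_threshold(x, t):
    """Elementwise soft-thresholding operator."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam, n_iter=200):
    """Minimize 0.5*||A x - y||^2 + lam*||x||_1 by iterative shrinkage (ISTA)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)           # gradient of the quadratic data-fit term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Tiny synthetic demo: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 100))
x_true = np.zeros(100); x_true[[3, 40, 77]] = [2.0, -1.5, 1.0]
y = A @ x_true + 0.01 * rng.normal(size=50)
print(np.flatnonzero(np.abs(ista(A, y, lam=0.1)) > 0.5))
```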
Abstract:
There is increasing evidence of a causal link between airborne particles and ill health, and this study examined exposure to both airborne particles and the gas-phase contaminants of environmental tobacco smoke (ETS) in a bar. The work reported here used concurrent and continuous monitoring with real-time optical-scattering personal samplers to record particulate (PM10) concentrations at two internal locations. Very high particulate episodes were observed in the seating areas compared with the bar area. A photo-acoustic multi-gas analyser was used to record the gas phases (CO and CO2) at eight different locations throughout the bar and showed little spatial variation. This gave a clear indication of the problems associated with achieving acceptable indoor air quality in a public space and identified a fundamental problem with the simplistic design approach taken to ventilate the space. Both gaseous and particulate concentrations within the bar were below maximum recommended levels, although the time-series analysis illustrated the highly episodic nature of this exposure.
Abstract:
The agricultural sector has been one of the main sources of income for the Colombian economy; however, it lacks a solid financial derivatives market that would protect producers and exporters against the risk of price volatility. This proposal seeks to estimate convenience yields and theoretical prices for coffee futures in Colombia. For this purpose, the Colombian coffee market is first described, and the coffee price and its volatility are then modeled on the basis of variables such as weather and inventory levels. Finally, the bands within which the price would fluctuate if certain no-arbitrage conditions hold are estimated, following the methodology designed by Díaz and Vanegas (2001) and complemented by Cárcamo and Franco (2012). As an illustration, convenience yields are incorporated and a hypothetical case in a Colombian coffee market is presented.
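The no-arbitrage reasoning rests on the textbook cost-of-carry relation with a convenience yield, which the small Python function below evaluates. This is only the generic relation, not the Díaz-Vanegas / Cárcamo-Franco band methodology, and the parameter values in the usage line are hypothetical.

```python
import math

def futures_price(spot, r, storage_cost, convenience_yield, tau):
    """Cost-of-carry futures price with convenience yield: F = S * exp((r + u - y) * tau)."""
    return spot * math.exp((r + storage_cost - convenience_yield) * tau)

# Hypothetical inputs: spot price, annualized rates, 6-month maturity.
print(futures_price(spot=1.50, r=0.05, storage_cost=0.02, convenience_yield=0.03, tau=0.5))
```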
Abstract:
Master's dissertation presented to ISPA - Instituto Universitário
Abstract:
Spasticity is a common disorder in people with upper motor neuron injury, and the degree of involvement varies. The Modified Ashworth Scale (MAS) is the most widely used method to grade involvement, but it is a subjective evaluation. Mechanomyography (MMG) is an objective technique that quantifies muscle vibration during contraction and stretching events, so it may assess the level of spasticity accurately. This study aimed to investigate the correlation between spasticity levels determined by the MAS and the MMG signal in spastic and non-spastic muscles. In the experimental protocol, we evaluated 34 limbs of 22 volunteers of both genders, with a mean age of 39.91 ± 13.77 years. We assessed the levels of spasticity by MAS in flexor and extensor muscle groups of the knee and/or elbow, where one muscle group was the agonist and the other the antagonist. MMG signals were acquired simultaneously with the MAS assessment. We used custom MMG equipment, configured on the LabVIEW platform, to acquire and record the signals. Using MATLAB, the MMG signals were processed in the time domain (median energy) and the spectral domain (median frequency) for the three motion axes: X (transverse), Y (longitudinal) and Z (perpendicular). For bandwidth delimitation, we used a 3rd-order Butterworth filter operating in the 5-50 Hz range. Statistical tests including Spearman's correlation coefficient, the Kruskal-Wallis test and a linear correlation test were applied. In the time domain, the Kruskal-Wallis test showed differences in median energy (MMGME) between MAS groups. The linear correlation test showed a high linear correlation between MAS and MMGME for the agonist muscle group as well as for the antagonist group. The largest linear correlation occurred between MAS and MMGME for the Z axis of the agonist muscle group (R² = 0.9557), and the lowest correlation occurred on the X axis for the antagonist muscle group (R² = 0.8862). The Spearman correlation test also confirmed a high correlation for all axes in the time-domain analysis. In the spectral domain, the analysis showed an increase in median frequency (MMGMF) at higher MAS levels. The highest correlation coefficient between MAS and the MMGMF signal occurred on the Z axis for the agonist muscle group (R² = 0.4883), and the lowest value occurred on the Y axis for the antagonist group (R² = 0.1657). By means of the Spearman correlation test, the highest correlation occurred on the Y axis of the agonist group (0.6951; p < 0.001) and the lowest value on the X axis of the antagonist group (0.3592; p < 0.001). We conclude that there was a significantly high correlation between MMGME and MAS in both muscle groups. A significant correlation also occurred between MMGMF and MAS, although it was moderate for the agonist group and low for the antagonist group. Thus, MMGME proved to be the more appropriate descriptor to correlate with the degree of spasticity defined by the MAS.
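For a concrete picture of the processing chain, the sketch below applies a 3rd-order Butterworth band-pass over 5-50 Hz (the band used in the study) and computes a median frequency from the Welch power spectrum. The sampling rate and the use of scipy are assumptions; the original processing was done in MATLAB and its exact routines are not reproduced.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch

def bandpass(signal, fs, low=5.0, high=50.0, order=3):
    """3rd-order Butterworth band-pass matching the 5-50 Hz band used in the study."""
    b, a = butter(order, [low, high], btype="bandpass", fs=fs)
    return filtfilt(b, a, signal)

def median_frequency(signal, fs):
    """Frequency below which half of the total spectral power lies."""
    f, pxx = welch(signal, fs=fs)
    cum = np.cumsum(pxx)
    return f[np.searchsorted(cum, cum[-1] / 2.0)]

# Toy usage with a synthetic 1 kHz signal (sampling rate is an assumption).
fs = 1000.0
t = np.arange(0, 5.0, 1.0 / fs)
mmg = np.sin(2 * np.pi * 20 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)
print(median_frequency(bandpass(mmg, fs), fs))
```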
Quantification of sugars with an electronic tongue: multivariate calibration with sensor selection
Abstract:
This work focuses on the analysis of the major sugars in foods (glucose, fructose and sucrose) with a potentiometric electronic tongue, through multivariate calibration with sensor selection. The analysis of these compounds contributes to assessing the impact of sugars on health and their physiological effect, allows sensory attributes to be related, and supports food quality control and authenticity assessment. Although several analytical methodologies are routinely used to identify and quantify sugars in foods, these methods generally present various disadvantages, such as slow analyses, high consumption of chemical reagents and the need for destructive sample pre-treatments. It was therefore decided to apply a potentiometric electronic tongue, built with polymeric sensors selected on the basis of the sugar sensitivities obtained in previous work, to the analysis of sugars in foods, aiming to establish an analytical methodology and mathematical procedures for quantifying these compounds. For this purpose, analyses were performed on standard solutions of ternary mixtures of the sugars at different concentration levels and on dissolved honey samples, which had previously been analysed by HPLC to determine the reference sugar concentrations. An exploratory data analysis was then carried out to remove discordant sensors or observations by means of a principal component analysis. Next, multiple linear regression models with variable selection were built using the stepwise algorithm, and it was found that, although a good relationship between the sensor responses and the sugar concentrations could be established, the models did not show satisfactory predictive performance on test-set data. To overcome this problem, new approaches were tested by building and optimizing the parameters of a genetic algorithm for variable selection that could be applied to several regression tools, among them partial least squares regression. Good prediction results were obtained for the models built with partial least squares regression combined with the genetic algorithm, both for the standard solutions and for the honey solutions, with adjusted R² above 0.99 and RMSE below 0.5 obtained from the linear relationship between predicted and experimental values on test-set data. The multi-sensor system built proved to be a suitable tool for the analysis of sugars, when present in major concentrations, and an alternative to instrumental reference methods such as HPLC, since it reduces the analysis time and cost, requires minimal sample preparation and avoids polluting end products.
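A minimal sketch of the final modeling strategy (partial least squares with sensor selection) is given below in Python with scikit-learn, using a random subset search as a simple stand-in for the genetic algorithm and synthetic data in place of the electronic-tongue measurements.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# X: sensor potentials (one column per polymeric sensor), y: sugar concentrations.
# Shapes and values are synthetic; the real data come from the electronic tongue and HPLC.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 40))
y = rng.normal(size=(60, 3))                 # glucose, fructose, sucrose

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

best_score, best_subset = -np.inf, None
for _ in range(200):                          # random subset search as a stand-in for the GA
    subset = rng.choice(X.shape[1], size=15, replace=False)
    pls = PLSRegression(n_components=5).fit(X_tr[:, subset], y_tr)
    score = r2_score(y_te, pls.predict(X_te[:, subset]))
    if score > best_score:
        best_score, best_subset = score, subset

print(best_score, sorted(best_subset))
```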
Abstract:
As usage metrics continue to attain an increasingly central role in library system assessment and analysis, librarians tasked with system selection, implementation, and support are driven to identify metric approaches that simultaneously require less technical complexity and provide greater levels of data granularity. Such approaches allow systems librarians to present evidence-based claims about platform usage behaviors while reducing the resources necessary to collect such information, thereby representing a novel approach to real-time user analysis as well as a dual benefit in active and preventative cost reduction. As part of the DSpace implementation for the MD SOAR initiative, the Consortial Library Application Support (CLAS) division has begun test implementation of the Google Tag Manager analytics system in an attempt to collect custom analytical dimensions that track author- and university-specific download behaviors. Building on the work of Conrad, CLAS seeks to demonstrate that the GTM approach to custom analytics provides granular, metadata-based usage statistics in an approach that will prove extensible for additional statistical gathering in the future. This poster will discuss the methodology used to develop these custom tag approaches, the benefits of using the GTM model, and the risks and benefits associated with further implementation.
Abstract:
High population growth has fragmented rural landholdings, leading to low harvests and crop yields per acre per annum and creating surplus labour that may resort to migration as a coping mechanism in least developed countries, including Ethiopia. The main aim of the study is to assess trends and differentials of out-migration in south central Ethiopia. The Butajira demographic surveillance system database from 1987 to 2008 was used to conduct an event history analysis. There were 3.97 out-migrations per 100 person-years. The probability of out-migration was higher among males; teenagers and youth; those with completed primary or secondary-plus education; those not in a marital union; Christians; urbanites; and those living in rented or owned housing, compared with their respective counterparts. The higher chances of out-migration among these groups may have social and economic significance.
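Event history (survival) analysis of this kind is commonly fitted with a Cox proportional hazards model; the sketch below shows that pattern with the lifelines package. The file name and column names are hypothetical placeholders, and the surveillance data themselves are not reproduced here.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical episode data: follow-up time (years), out-migration indicator, and
# covariates mirroring those discussed in the abstract (assumed numerically coded).
df = pd.read_csv("butajira_episodes.csv")    # placeholder file
cols = ["duration", "out_migrated", "male", "age_group", "education", "married", "urban"]

cph = CoxPHFitter()
cph.fit(df[cols], duration_col="duration", event_col="out_migrated")
cph.print_summary()                          # hazard ratios by covariate
```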
Abstract:
Ethernet connections, which are widely used in many computer networks, can suffer from electromagnetic interference. Typically, a degradation of the data transmission rate can be perceived, as electromagnetic disturbances lead to corruption of data frames on the network media. In this paper a software-based measuring method is presented which allows a direct assessment of the effects on the link layer. The results can be linked directly to the physical interaction without the influence of software-related effects on higher protocol layers. This provides a simple tool for a quantitative analysis of the disturbance of an Ethernet connection based on time-domain data. An example shows how the data can be used for further investigation of the interference mechanisms and for the detection of intentional electromagnetic attacks. © 2015 Author(s).
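The measuring idea, observing link-layer behaviour directly in software, can be approximated on a Linux host by polling the kernel's per-interface counters, as in the sketch below. This is a generic stand-in assuming a Linux system and an interface named eth0; the paper's own tool and its data format are not described here.

```python
import time
from pathlib import Path

def read_counters(iface="eth0"):
    """Read the link-layer counters the Linux kernel exposes for one interface."""
    stats = Path(f"/sys/class/net/{iface}/statistics")
    return {p.name: int(p.read_text()) for p in stats.iterdir()}

# Poll once per second and report deltas for frame and error counters.
prev = read_counters()
for _ in range(10):
    time.sleep(1.0)
    cur = read_counters()
    delta = {k: cur[k] - prev[k] for k in ("rx_packets", "rx_errors", "rx_crc_errors")}
    print(delta)
    prev = cur
```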