18 results for deterministic volatility function
in the Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
Most financial and economic time-series display strong volatility around their trends. The difficulty in explaining this volatility has led economists to interpret it as exogenous, i.e., as the result of forces that lie outside the scope of the assumed economic relations. Consequently, it becomes hard or impossible to formulate short-run forecasts of asset prices or of the values of macroeconomic variables. However, many random-looking economic and financial series may, in fact, be subject to deterministic irregular behavior, which can be measured and modelled. We address the notion of endogenous volatility and exemplify the concept with a simple business-cycle model.
Abstract:
This article presents a Markov chain framework to characterize the behavior of the CBOE Volatility Index (VIX index). Two possible regimes are considered: high volatility and low volatility. The specification accounts for deviations from normality and the existence of persistence in the evolution of the VIX index. Since the time evolution of the VIX index seems to indicate that its conditional variance is not constant over time, I consider two different versions of the model. In the first one, the variance of the index is a function of the volatility regime, whereas the second version includes an autoregressive conditional heteroskedasticity (ARCH) specification for the conditional variance of the index.
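The regime-switching mechanism the abstract describes can be sketched as a small simulation: a two-state Markov chain (low/high volatility) governs the level of the index, while an ARCH(1) recursion drives the conditional variance, as in the second version of the model. All parameter values below are illustrative placeholders, not estimates from the article.

```python
import math
import random

def simulate_regime_arch(n, p_stay=(0.95, 0.90), mu=(15.0, 30.0),
                         omega=1.0, alpha=0.4, seed=42):
    """Simulate an index under a two-state Markov chain (state 0 = low
    volatility, state 1 = high volatility) with an ARCH(1) conditional
    variance. Parameters are illustrative, not the paper's estimates."""
    rng = random.Random(seed)
    state, eps_prev = 0, 0.0
    path, states = [], []
    for _ in range(n):
        # Markov transition: remain in the current regime with prob. p_stay[state]
        if rng.random() > p_stay[state]:
            state = 1 - state
        # ARCH(1): conditional variance driven by the previous innovation
        var = omega + alpha * eps_prev ** 2
        eps_prev = math.sqrt(var) * rng.gauss(0.0, 1.0)
        path.append(mu[state] + eps_prev)
        states.append(state)
    return path, states
```

A constant-variance variant (the first version of the model) would simply set the variance to a fixed value per regime instead of the ARCH recursion.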
Abstract:
Renal scintigraphy with 99mTc-dimercaptosuccinic acid (99mTc-DMSA) is performed with the aim of detecting cortical abnormalities related to urinary tract infection and accurately quantifying relative renal function (RRF). For this quantitative assessment, the Nuclear Medicine Technologist should draw regions of interest (ROI) around each kidney (KROI) and a peri-renal background (BKG) ROI, although controversy still exists about the BKG-ROI. The aim of this work was to evaluate the effect of the normalization procedure and of the number and location of the BKG-ROI on the RRF in 99mTc-DMSA scintigraphy.
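The quantity under study is conventionally computed by subtracting area-normalized background counts from each kidney ROI. The sketch below shows this standard background-corrected RRF calculation; the function and argument names are mine, and the normalization shown (scaling BKG counts to the KROI area) is one of the procedures whose effect the abstract evaluates.

```python
def relative_renal_function(left_counts, right_counts,
                            left_bkg, right_bkg,
                            left_bkg_px, right_bkg_px,
                            left_kroi_px, right_kroi_px):
    """Background-corrected relative renal function (%) from a 99mTc-DMSA
    image: BKG counts are scaled to the KROI area (in pixels) before
    subtraction, then each kidney's net counts are expressed as a
    percentage of the total net counts."""
    left_net = left_counts - left_bkg * (left_kroi_px / left_bkg_px)
    right_net = right_counts - right_bkg * (right_kroi_px / right_bkg_px)
    total = left_net + right_net
    return 100.0 * left_net / total, 100.0 * right_net / total
```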
Abstract:
Cerebral vascular disease is the primary cause of permanent disability in Portugal. Impaired stability is considered an important feature after stroke, as it is associated with a higher risk of falls and functional dependence. Physiotherapy intervention usually starts early after stroke in order to direct motor recovery and help patients improve their ability to perform activities of daily living (ADL). Purpose: to investigate the relationship between balance and functionality in acute stroke patients. Methods: 16 subjects (8 women and 8 men), mean age 63.62 ± 2.16 years, with unilateral ischemic stroke in the middle cerebral artery territory, who were admitted to the physiotherapy department of Fernando Fonseca Hospital in Portugal within the first month after stroke, were recruited to participate in this study. All subjects had no cognitive impairment according to the Mini-Mental State examination, no history of lower-extremity orthopedic problems, and no other disease that could interfere with treatment. All patients gave their informed consent to participate in this study. Subjects were assessed with the Modified Barthel Index (MBI) and the Berg Balance Scale (BBS).
Abstract:
This paper studies the persistence of volatility in the G7 stock markets using the GARCH, IGARCH and FIGARCH models. The data set consists of the daily returns of the S&P/TSX 60, CAC 40, DAX 30, MIB 30, NIKKEI 225, FTSE 100 and S&P 500 indexes over the period 1999-2009. The results show evidence of long memory in volatility, which is more pronounced in Germany, Italy and France. Japan, on the other hand, appears as the country where this phenomenon is least evident; persistence still prevails there, but with lower intensity.
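The baseline of the three models compared is the GARCH(1,1) variance recursion, which can be written out in a few lines. This is a generic sketch of the recursion, not the paper's estimated specification: persistence is measured by alpha + beta, IGARCH imposes alpha + beta = 1, and FIGARCH replaces the geometric decay with a hyperbolic (long-memory) one.

```python
def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance path of a GARCH(1,1):
        sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.
    Started at the unconditional variance when alpha + beta < 1."""
    if alpha + beta < 1.0:
        sigma2 = omega / (1.0 - alpha - beta)  # unconditional variance
    else:
        sigma2 = omega                          # IGARCH boundary: no finite mean
    path = []
    for r in returns:
        path.append(sigma2)
        sigma2 = omega + alpha * r * r + beta * sigma2
    return path
```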
Abstract:
In this paper we analyze the relationship between volatility in index futures markets and the number of open and closed positions. We observe that, although in general both positions are positively correlated with contemporaneous volatility, in the case of the S&P 500 only the number of open positions has an influence on volatility. Additionally, we observe a stronger positive relationship on days characterized by extreme movements, with these contracting movements dominating the market. Finally, our findings suggest that day-traders are not associated with an increase in volatility, whereas uninformed traders, both opening and closing their positions, are.
Abstract:
The financial literature and the financial industry often use zero coupon yield curves as input for testing hypotheses, pricing assets or managing risk, and assume that the provided data are accurate. We analyse the implications of the methodology and of the sample selection criteria used to estimate the zero coupon bond yield term structure for the resulting volatility of spot rates with different maturities. We obtain the volatility term structure using historical volatilities and EGARCH volatilities. As input for these volatilities we consider our own spot-rate estimation from GovPX bond data and three popular interest rate data sets: from the Federal Reserve Board, from the US Department of the Treasury (H15), and from Bloomberg. We find strong evidence that the resulting zero coupon bond yield volatility estimates, as well as the correlation coefficients among spot and forward rates, depend significantly on the data set. We observe relevant differences in economic terms when the volatilities are used to price derivatives.
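The EGARCH volatilities mentioned above come from Nelson's (1991) log-variance recursion. The sketch below implements the standard EGARCH(1,1) form as a minimal illustration; the parameter values are placeholders, not the paper's estimates.

```python
import math

def egarch11_variance(returns, omega, alpha, gamma, beta):
    """EGARCH(1,1) conditional variance path:
        ln sigma2_t = omega + beta * ln sigma2_{t-1}
                      + alpha * (|z_{t-1}| - E|z|) + gamma * z_{t-1},
    with standardized shock z_t = r_t / sigma_t and E|z| = sqrt(2/pi)
    under normality. The gamma term captures asymmetry: negative shocks
    can raise volatility more than positive ones of the same size."""
    e_abs_z = math.sqrt(2.0 / math.pi)
    log_s2 = omega / (1.0 - beta)  # start at the unconditional mean of ln sigma2
    path = []
    for r in returns:
        sigma2 = math.exp(log_s2)
        path.append(sigma2)
        z = r / math.sqrt(sigma2)
        log_s2 = omega + beta * log_s2 + alpha * (abs(z) - e_abs_z) + gamma * z
    return path
```

Because the recursion is written in logs, the variance stays positive without any parameter constraints, one reason EGARCH is popular for interest rate volatility.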
Abstract:
In this paper our aim is to gain a better understanding of the relationship between market volatility and industrial structure. As conflicting results have been documented regarding the relationship between market industry concentration and market volatility, this study investigates this relationship in the time series. We have found that this relationship is only significant and positive for Spain. Our results suggest that we cannot generalize across different countries that market industrial structure (concentration) is a significant factor in explaining market volatility.
Abstract:
The aim of this paper is to analyze the forecasting ability of the CARR model proposed by Chou (2005) using the S&P 500. We extend the data sample, allowing for the analysis of different stock market circumstances and propose the use of various range estimators in order to analyze their forecasting performance. Our results show that there are two range-based models that outperform the forecasting ability of the GARCH model. The Parkinson model is better for upward trends and volatilities which are higher and lower than the mean while the CARR model is better for downward trends and mean volatilities.
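One of the range estimators compared in studies of this kind is Parkinson's (1980), which uses only intraday high and low prices. A minimal sketch, under the assumption of a clean series of daily highs and lows:

```python
import math

def parkinson_volatility(highs, lows):
    """Parkinson (1980) range-based volatility estimator:
        sigma^2 = (1 / (4 ln 2)) * mean( ln(H_t / L_t)^2 ).
    Returns the per-period volatility (standard deviation) implied by
    the observed high/low ranges."""
    n = len(highs)
    s = sum(math.log(h / l) ** 2 for h, l in zip(highs, lows))
    return math.sqrt(s / (4.0 * math.log(2.0) * n))
```

Because the high/low range contains more information about intraday variation than close-to-close returns, range-based estimators such as this one can be considerably more efficient.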
Abstract:
Introduction - Cerebrovascular diseases, and among them cerebral vascular accidents, are one of the main causes of morbidity and disability in European Union countries. The clinical conditions resulting from these diseases include important limitations in the functional ability of these patients. Postural control dysfunctions are among the most common and devastating consequences of a stroke, interfering with function and autonomy, affecting different aspects of people's lives and contributing to a decreased quality of life. Neurological physiotherapy plays a central role in the recovery of movement and posture; however, it is necessary to study the efficacy of the techniques that physiotherapists use to treat these problems. Objectives - The aim of this study was to investigate the effects of a physiotherapy intervention program, based on oriented tasks and strengthening of the affected lower limb, on the balance and functionality of individuals who have suffered a stroke. In addition, our study aimed to investigate the effect of strength training of the affected lower limb on muscle tone.
Abstract:
A crucial method for investigating patients with coronary artery disease (CAD) is the calculation of the left ventricular ejection fraction (LVEF). It is, consequently, imperative to precisely estimate the value of LVEF, a process that can be done with myocardial perfusion scintigraphy. Therefore, the present study aimed to establish and compare the estimation performance of the quantitative parameters of the reconstruction methods filtered backprojection (FBP) and ordered-subset expectation maximization (OSEM). Methods: A beating-heart phantom with known values of end-diastolic volume, end-systolic volume, and LVEF was used. Quantitative gated SPECT/quantitative perfusion SPECT software was used to obtain these quantitative parameters in a semiautomatic mode. The Butterworth filter was used in FBP, with cutoff frequencies between 0.2 and 0.8 cycles per pixel combined with orders of 5, 10, 15, and 20. Sixty-three reconstructions were performed using 2, 4, 6, 8, 10, 12, and 16 OSEM subsets, combined with several iterations: 2, 4, 6, 8, 10, 12, 16, 32, and 64. Results: With FBP, the values of the end-diastolic, end-systolic, and stroke volumes rise as the cutoff frequency increases, whereas the value of LVEF diminishes. The same pattern is verified with the OSEM reconstruction. However, with OSEM there is a more precise estimation of the quantitative parameters, especially with the combinations 2 iterations × 10 subsets and 2 iterations × 12 subsets. Conclusion: The OSEM reconstruction presents better estimations of the quantitative parameters than does FBP. This study recommends the use of 2 iterations with 10 or 12 subsets for OSEM and a cutoff frequency of 0.5 cycles per pixel with orders 5, 10, or 15 for FBP as the best estimations for left ventricular volumes and ejection fraction quantification in myocardial perfusion scintigraphy.
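The quantity all these reconstructions are judged against is derived from the two ventricular volumes by a standard definition, which is worth stating explicitly:

```python
def ejection_fraction(edv_ml, esv_ml):
    """Left ventricular ejection fraction (%) from the end-diastolic
    volume (EDV) and end-systolic volume (ESV), both in mL:
        LVEF = (EDV - ESV) / EDV * 100.
    The numerator (EDV - ESV) is the stroke volume."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml
```

This makes the pattern reported above concrete: if a reconstruction inflates EDV and ESV by different amounts, the computed LVEF shifts even though the phantom's true ejection fraction is fixed.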
Abstract:
Aims - To compare reading performance in children with and without visual function anomalies and to identify the influence of abnormal visual function and other variables on reading ability. Methods - A cross-sectional study was carried out in 110 children of school age (6-11 years) with Abnormal Visual Function (AVF) and 562 children with Normal Visual Function (NVF). An orthoptic assessment (visual acuity, ocular alignment, near point of convergence and accommodation, stereopsis and vergences) and autorefraction were carried out. Oral reading was analyzed (a list of 34 words). The number of errors, accuracy (percentage of success) and reading speed (words per minute - wpm) were used as reading indicators. Sociodemographic information from parents (n=670) and teachers (n=34) was obtained. Results - Children with AVF had a higher number of errors (AVF=3.00 errors; NVF=1.00 errors; p<0.001), lower accuracy (AVF=91.18%; NVF=97.06%; p<0.001) and lower reading speed (AVF=24.71 wpm; NVF=27.39 wpm; p=0.007). Reading speed in the 3rd school grade was not statistically different between the two groups (AVF=31.41 wpm; NVF=32.54 wpm; p=0.113). Children with uncorrected hyperopia (p=0.003) and astigmatism (p=0.019) had worse reading performance. Children in the 2nd, 3rd, or 4th grades presented a lower risk of reading impairment when compared with the 1st grade. Conclusion - Children with AVF had reading impairment in the first school grade. It seems that reading abilities vary widely and that this disparity lessens in older children. The slow reading characteristics of children with AVF are similar to those of dyslexic children, which suggests the need for an eye evaluation before classifying a child as dyslexic.
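The three reading indicators named in the abstract follow directly from the word list, the error count, and the reading time. A minimal sketch, assuming the time is recorded in seconds (the abstract does not state the timing format):

```python
def reading_indicators(n_words, n_errors, seconds):
    """Reading indicators for a word-list task: error count, accuracy
    (% of words read correctly), and reading speed in words per minute.
    For this study's task, n_words would be 34."""
    accuracy = 100.0 * (n_words - n_errors) / n_words
    wpm = n_words / (seconds / 60.0)
    return n_errors, accuracy, wpm
```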
Abstract:
3D laser scanning is becoming a standard technology for generating building models of a facility's as-is condition. Since most constructions are built upon planar surfaces, recognizing them paves the way for automating the generation of building models. This paper introduces a new logarithmically proportional objective function that can be used in both heuristic and metaheuristic (MH) algorithms to discover planar surfaces in a point cloud without exploiting any prior knowledge about those surfaces. It can also adapt itself to the structural density of a scanned construction. In this paper, a metaheuristic method, the genetic algorithm (GA), is used to test the introduced objective function on a synthetic point cloud. The results obtained show that the proposed method is capable of finding all plane configurations of planar surfaces (with a wide variety of sizes) in the point cloud, with only a minor distance to the actual configurations. © 2014 IEEE.
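The core operations a GA needs for this task are a point-to-plane distance and a fitness to maximize. The sketch below is a hypothetical stand-in "in the spirit of" a logarithmically proportional objective, rewarding a candidate plane in proportion to the logarithm of its inlier count so that one huge surface cannot completely dominate smaller but genuine planar patches; the paper's actual function may differ.

```python
import math

def point_plane_distance(point, plane):
    """Unsigned distance from 3-D point (x, y, z) to the plane
    (a, b, c, d) defined by a*x + b*y + c*z + d = 0."""
    a, b, c, d = plane
    x, y, z = point
    return abs(a * x + b * y + c * z + d) / math.sqrt(a * a + b * b + c * c)

def log_proportional_fitness(plane, cloud, eps=0.05):
    """Hypothetical GA fitness: count points within eps of the candidate
    plane (inliers) and score the plane by log(1 + inliers). The log
    compresses the advantage of very large planes, which is the kind of
    size-robustness the abstract's objective function targets."""
    inliers = sum(1 for p in cloud if point_plane_distance(p, plane) < eps)
    return math.log1p(inliers)
```

A GA would encode candidate planes as (a, b, c, d) chromosomes and evolve them under this fitness, extracting one plane per converged run (or per niche).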
Abstract:
Background - Medical image perception research relies on visual data to study the diagnostic relationship between observers and medical images. A consistent method to assess visual function in participants in medical imaging research has not been developed, which represents a significant gap in existing research. Methods - Three visual assessment factors appropriate to observer studies were identified: visual acuity, contrast sensitivity, and stereopsis. A test was designed for each, and 30 radiography observers (mean age 31.6 years) participated in each test. Results - Mean binocular visual acuity for distance was 20/14 for all observers. The difference between observers who did and did not use corrective lenses was not statistically significant (P = .12). All subjects had normal values for near visual acuity and stereoacuity. Contrast sensitivity was better than population norms. Conclusion - All observers had normal visual function and could participate in medical imaging visual analysis studies. Evaluation protocols and population norms are provided. Further studies are necessary to fully understand the relationship between visual performance on tests and diagnostic accuracy in practice.