848 results for Intraday volatility


Relevance: 10.00%

Abstract:

How do trade and financial openness affect macroeconomic volatility? The existing literature, both empirical and theoretical, has not yet reached a consensus. This article studies the question with a micro-founded model of two symmetric countries with endogenous firm entry. The analysis is carried out for three economic regimes with different degrees of international integration: a closed economy, financial autarky, and full integration. Several levels of trade openness, in the form of home bias in demand, are considered, and the economy can be hit by shocks to labour productivity and to innovation. The model concludes that macroeconomic uncertainty, represented mainly by the volatility of consumption, output, and the international terms of trade, depends on the degree of openness and on the type of shock.

Relevance: 10.00%

Abstract:

In this paper we review level models of interest rates in Chile. Beyond the traditional level models of Chan, Karolyi, Longstaff and Sanders (1992) for the US and of Parisi (1998) for Chile, estimated by maximum likelihood, we allow the conditional volatility to incorporate unexpected information shocks (the GARCH model) and to be a function of the level of the interest rate (the TVP-LEVEL model), as in Brenner, Harjes and Kroner (1996). To this end we use market yields on recognition bonds instead of the average monthly PDBC auction yields, enlarging the size and frequency of the sample to four weekly yield series with different terms to maturity: 1, 5, 10 and 15 years. The main results can be summarised as follows: the volatility of unexpected rate changes depends positively on the level of rates, above all in the TVP-LEVEL model. We find evidence of mean reversion, such that interest-rate increments were not independent, contrary to what Brenner et al. obtained for the US. The level models are unable to fit the volatility adequately in comparison with a GARCH(1,1) model, and finally the TVP-LEVEL model does not outperform the GARCH(1,1) model.
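
For reference, the competing volatility specifications named above can be written in the discrete-time form standard in this literature. This is a hedged sketch in our own notation; the paper's exact parameterisation may differ.

    % Discretised level model of Chan-Karolyi-Longstaff-Sanders (1992):
    % the variance of rate changes rises with the level of the rate.
    \Delta r_t = \alpha + \beta r_{t-1} + \varepsilon_t , \qquad
    \mathrm{E}_{t-1}\!\left[\varepsilon_t^2\right] = \sigma^2 \, r_{t-1}^{2\gamma}

    % GARCH(1,1): volatility driven by unexpected information shocks.
    h_t = \omega + a\,\varepsilon_{t-1}^{2} + b\,h_{t-1}

    % TVP-LEVEL of Brenner-Harjes-Kroner (1996): both effects combined.
    \mathrm{E}_{t-1}\!\left[\varepsilon_t^2\right] = h_t \, r_{t-1}^{2\gamma}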

Relevance: 10.00%

Abstract:

In studies of the natural history of HIV-1 infection, the time scale of primary interest is the time since infection. Unfortunately, this time is very often unknown for HIV infection, and using the follow-up time instead is likely to yield biased results because of onset confounding. Laboratory markers such as the CD4 T-cell count carry important information about disease progression and can be used to predict the unknown date of infection. Previous work on this topic has used only one CD4 measurement or based the imputation on incident patients only. However, because of the considerable intrinsic variability in CD4 levels, and because incident cases differ from prevalent cases, back-calculation based on a single CD4 determination per person, or on characteristics of the incident sub-cohort, may give unreliable results. We therefore propose a methodology based on repeated individual CD4 T-cell marker measurements that uses both incident and prevalent cases to impute the unknown date of infection. Our approach jointly models the time since infection, the CD4 time path, and the drop-out process. The methodology has been applied to estimate the CD4 slope and impute the unknown date of infection in HIV patients from the Swiss HIV Cohort Study. A procedure based on the comparison of different slope estimates is proposed to assess the goodness of fit of the imputation. Simulation studies indicated that the imputation procedure works well despite the high intrinsic volatility of the CD4 marker.
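
As a much-simplified illustration of the back-calculation idea only, not the authors' joint model (which also handles the drop-out process), one can impute an infection date from repeated CD4 measurements under an assumed linear decline of the square root of CD4. Both population constants and the example data below are hypothetical.

    import numpy as np

    # Simplified sketch: assume sqrt(CD4) declines linearly from an assumed
    # population level at seroconversion, and impute the infection time t0
    # by least squares over repeated measurements. Both constants are
    # hypothetical, not estimates from the Swiss HIV Cohort Study.
    SQRT_CD4_AT_INFECTION = np.sqrt(600.0)   # assumed sqrt(CD4) at infection
    SLOPE_PER_YEAR = -1.6                    # assumed mean decline of sqrt(CD4)

    def impute_infection_time(visit_times, cd4_counts):
        """Closed-form least-squares t0 for sqrt(CD4) = a + b * (t - t0)."""
        y = np.sqrt(np.asarray(cd4_counts, dtype=float))
        t = np.asarray(visit_times, dtype=float)
        # Each visit implies t0_i = t_i - (y_i - a)/b; least squares averages them.
        return np.mean(t - (y - SQRT_CD4_AT_INFECTION) / SLOPE_PER_YEAR)

    # A patient seen at follow-up years 2, 3 and 4 with declining CD4 counts:
    print(impute_infection_time([2.0, 3.0, 4.0], [420, 370, 330]))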

Relevance: 10.00%

Abstract:

BACKGROUND: Preventive treatment may avert future cases of tuberculosis among asylum seekers. Its effectiveness depends in large part on treatment completion. METHODS: In a prospective cohort study, asylum seekers at two migration centres in the Swiss canton of Vaud were screened with the Interferon Gamma Release Assay (IGRA). Those with a positive IGRA were referred for medical examination. Individuals with active or past tuberculosis were excluded. Preventive treatment was offered to all participants with a positive IGRA but without active tuberculosis. Adherence was assessed during monthly follow-up. RESULTS: Of 393 adult migrants, 98 (24.9%) had a positive IGRA. Eleven did not attend the initial medical assessment. Of the 87 examined, eight presented with pulmonary disease (five of them received a full course of antituberculous therapy), two had a history of prior tuberculosis treatment, and two had contraindications to treatment. Preventive treatment was offered to 75 individuals (4 months of rifampicin in 74 and 9 months of isoniazid in one), of whom 60 (80%) completed the treatment. CONCLUSIONS: The vulnerability and volatility of this population make screening and adherence to treatment difficult. It nevertheless seems possible to obtain a high completion rate using a short course of treatment in a closely monitored population living in stable housing conditions.

Relevance: 10.00%

Abstract:

Executive Summary The unifying theme of this thesis is the pursuit of satisfactory ways to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields of economics and from broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we implement an idea from the field of fuzzy set theory in the optimal portfolio selection problem, while the third part is, to the best of our knowledge, the first empirical application of some general results on asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds. Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model that addresses some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation in support of these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one. Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that the realized returns have better distributional characteristics than the realized returns of portfolio strategies optimal with respect to a single performance measure. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance.
We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different; combined with a visual inspection, this allowed us to demonstrate that the proposed aggregation of performance measures leads to realized portfolio returns that first-order stochastically dominate those resulting from optimization with respect to a single measure such as the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e. the sequence of expected shortfalls over a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measures lay above the one corresponding to each individual measure, we were tempted to conclude that the algorithm we propose leads to a portfolio return distribution that second-order stochastically dominates those of virtually all performance measures considered. Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member-country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests an inherent weakness in any attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results on the bounds of the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
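
A hedged sketch of the second-order dominance check described above: compare absolute Lorenz curves, i.e. cumulative averages of the sorted returns across a grid of quantiles. The function names and toy return samples are ours, not the thesis's.

    import numpy as np

    # The "absolute Lorenz curve" L(q) is the cumulative average of the sorted
    # returns up to quantile q (a scaled expected shortfall). A second-order
    # stochastically dominates B when L_A(q) >= L_B(q) at every q.
    def absolute_lorenz(returns, grid):
        x = np.sort(np.asarray(returns, dtype=float))
        n = len(x)
        cum = np.cumsum(x) / n                       # L(k/n) = (sum of lowest k)/n
        idx = np.maximum(1, np.ceil(grid * n).astype(int)) - 1
        return cum[idx]

    def dominates_ssd(a, b, grid=np.linspace(0.01, 1.0, 100)):
        return bool(np.all(absolute_lorenz(a, grid) >= absolute_lorenz(b, grid)))

    rng = np.random.default_rng(0)
    aggregated = rng.normal(0.09, 0.10, 5000)   # stand-in: aggregated strategy
    single = rng.normal(0.08, 0.15, 5000)       # stand-in: single-measure strategy
    print(dominates_ssd(aggregated, single))    # should print True for these toys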

Relevance: 10.00%

Abstract:

Preface The starting point for this work, and eventually the subject of the whole thesis, was the question of how to estimate the parameters of affine stochastic volatility jump-diffusion models. These models are very important for contingent claim pricing. Their major advantage, the availability of analytical solutions for characteristic functions, has made them the models of choice for many theoretical constructions and practical applications. At the same time, estimating the parameters of stochastic volatility jump-diffusion models is not a straightforward task. The problem comes from the variance process, which is not observable. There are several estimation methodologies that deal with the estimation of latent variables. One appeared particularly interesting: it proposes an estimator that, in contrast to the other methods, requires neither discretization nor simulation of the process, namely the Continuous Empirical Characteristic Function (ECF) estimator based on the unconditional characteristic function. However, the procedure had been derived only for stochastic volatility models without jumps; thus, it became the subject of my research. This thesis consists of three parts, each written as an independent and self-contained article. At the same time, the questions answered by the second and third parts arise naturally from the issues investigated and the results obtained in the first. The first chapter is the theoretical foundation of the thesis. It proposes an estimation procedure for stochastic volatility models with jumps in both the asset price and the variance processes. The estimation procedure is based on the joint unconditional characteristic function of the stochastic process. The major analytical result of this part, and of the whole thesis, is a closed-form expression for the joint unconditional characteristic function of the stochastic volatility jump-diffusion models. The empirical part of the chapter suggests that, besides stochastic volatility, jumps in both the mean and the volatility equation are relevant for modelling returns of the S&P500 index, which was chosen as a general representative of the stock asset class. Hence, the next question is which jump process to use to model S&P500 returns. The decision about the jump process in the framework of affine jump-diffusion models boils down to defining the intensity of the compound Poisson process, either a constant or some function of the state variables, and to choosing the distribution of the jump size. While the jump in the variance process is usually assumed to be exponential, there are at least three distributions of the jump size currently used for the asset log-prices: normal, exponential, and double exponential. The second part of this thesis shows that normal jumps in the asset log-returns should be used if we are to model the S&P500 index by a stochastic volatility jump-diffusion model. This is a surprising result: the exponential distribution has fatter tails, and for this reason either an exponential or a double-exponential jump size was expected to provide the best fit of the stochastic volatility jump-diffusion models to the data. The idea of testing the efficiency of the Continuous ECF estimator on simulated data had already appeared when the first estimation results of the first chapter were obtained: in the absence of a benchmark, or any other ground for comparison, there is no reason to be sure that our parameter estimates coincide with the true parameters of the models.
The conclusion of the second chapter provides one more reason for such a test. Thus, the third part of this thesis concentrates on the estimation of the parameters of stochastic volatility jump-diffusion models from asset price time series simulated from various "true" parameter sets. The goal is to show that the Continuous ECF estimator based on the joint unconditional characteristic function is capable of recovering the true parameters, and the third chapter demonstrates that our estimator indeed has this ability. Once it is clear that the Continuous ECF estimator based on the unconditional characteristic function works, the next question arises naturally: can the computational effort be reduced without affecting the efficiency of the estimator, or can the efficiency of the estimator be improved without dramatically increasing the computational burden? The efficiency of the Continuous ECF estimator depends on the number of dimensions of the joint unconditional characteristic function used in its construction. Theoretically, the more dimensions there are, the more efficient the estimation procedure. In practice, however, the relationship is not so straightforward, owing to the increasing computational difficulties. The second chapter, for example, in addition to the choice of the jump process, discusses the possibility of using the marginal, i.e. one-dimensional, unconditional characteristic function in the estimation instead of the joint, bi-dimensional, one. As a result, the preference for one or the other depends on the model to be estimated; the computational effort can thus be reduced in some cases without affecting the efficiency of the estimator. Improving the estimator's efficiency by increasing its dimensionality faces more difficulties. The third chapter of this thesis, in addition to what was discussed above, compares the performance of the estimators with bi- and three-dimensional unconditional characteristic functions on simulated data. It shows that the theoretical efficiency of the Continuous ECF estimator based on the three-dimensional unconditional characteristic function is not attainable in practice, at least for the moment, given the limitations of the computing power and optimization toolboxes available to the general public. Thus, the Continuous ECF estimator based on the joint, bi-dimensional, unconditional characteristic function has every reason to exist and to be used for the estimation of the parameters of stochastic volatility jump-diffusion models.
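
As a hedged, much-simplified illustration of the ECF principle (minimise a weighted integrated squared distance between the model and empirical characteristic functions), the toy below fits a plain normal model, whose characteristic function is known in closed form; the thesis applies the same idea to the joint characteristic function of affine stochastic volatility jump-diffusions. All names and values are ours.

    import numpy as np
    from scipy.optimize import minimize

    def ecf(u, x):
        """Empirical characteristic function on the frequency grid u."""
        return np.mean(np.exp(1j * np.outer(u, x)), axis=1)

    def normal_cf(u, mu, sigma):
        # Closed-form CF of N(mu, sigma^2); the toy "model" being fitted.
        return np.exp(1j * u * mu - 0.5 * (sigma * u) ** 2)

    def objective(theta, x, u, w):
        mu, log_sigma = theta
        diff = normal_cf(u, mu, np.exp(log_sigma)) - ecf(u, x)
        return np.sum(w * np.abs(diff) ** 2)     # discretised weighted L2 distance

    rng = np.random.default_rng(1)
    x = rng.normal(0.05, 0.2, 2000)              # simulated "returns"
    u = np.linspace(-10.0, 10.0, 101)            # frequency grid
    w = np.exp(-u**2)                            # exponential weight function
    res = minimize(objective, x0=[0.0, np.log(0.1)], args=(x, u, w))
    print(res.x[0], np.exp(res.x[1]))            # should be close to (0.05, 0.2)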

Relevance: 10.00%

Abstract:

The quadrennial need study was developed to assist in identifying county highway financial needs (construction, rehabilitation, maintenance, and administration) and in distributing the road use tax fund (RUTF) among the counties in the state. During the period since the need study was first conducted using HWYNEEDS software, between 1982 and 1998, there have been large fluctuations in the level of funds distributed to individual counties. A study by Jim Cable (HR-363, 1993) found that one of the major factors behind these fluctuations is the quality and accuracy of the pavement condition data collected. In 1998, Center for Transportation Research and Education researchers (Maze and Smadi) completed a project studying the feasibility of using the automated pavement condition data collected for the Iowa Pavement Management Program (IPMP) on paved county roads as input to the HWYNEEDS software (TR-418). The automated condition data are objective and also more current, since they are collected in a two-year cycle compared with the 10-year cycle HWYNEEDS uses now. The study proved that using the automated condition data in HWYNEEDS would be feasible and beneficial in reducing fluctuations when applied to a pilot study area. Among the recommendations from TR-418, the researchers called for a full analysis and investigation of the HWYNEEDS methodology and parameters (for more information, please review the TR-418 project report). The study reported in this document builds on that work and covers the analysis and investigation of the HWYNEEDS computer program methodology and parameters. The underlying hypothesis is that, along with the IPMP automated condition data, some changes need to be made to HWYNEEDS parameters to accommodate the new data, which will stabilize the process of allocating resources and reduce fluctuations from one quadrennial need study to the next. Another objective of this research is to investigate gravel road needs and study the feasibility of developing a more objective approach to determining needs on the counties' gravel road networks. This study identifies new procedures by which the HWYNEEDS computer program is used to conduct the quadrennial need study on paved roads. A new procedure is also developed to determine gravel road needs outside the HWYNEEDS program. Recommendations are made for the new procedures and for changes to the current quadrennial need study, and future research areas are identified.

Relevance: 10.00%

Abstract:

In this work, the valuation methodology for a compound option written on a down-and-out call option, developed by Ericsson and Reneby (2003), is applied to derive a credit risk model. The firm is assumed to have a debt structure with two maturity dates, and the credit event takes place when the firm's asset value falls below a determined level called the barrier. An empirical application of the model is carried out for 105 firms of the Spanish continuous market. For each firm, its value at the date of analysis, its volatility, and the critical value are obtained, and from these the short- and long-term default probabilities, as well as the probability implicit in the two previous ones, are deduced. The results are compared with those obtained from the Geske model (1977).
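
For intuition only, here is a minimal structural-credit sketch in the spirit of the barrier idea above, not Ericsson and Reneby's compound/down-and-out formula: if the firm's asset value follows geometric Brownian motion, the probability that it ends below a barrier at the horizon has a simple closed form. All inputs are illustrative assumptions, not values from the Spanish sample.

    import numpy as np
    from scipy.stats import norm

    def default_probability(V0, B, mu, sigma, T):
        # P(V_T < B) under GBM with drift mu and volatility sigma: N(-d).
        d = (np.log(V0 / B) + (mu - 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
        return norm.cdf(-d)

    # Hypothetical firm: assets 100, barrier 60, one-year horizon.
    print(default_probability(V0=100.0, B=60.0, mu=0.06, sigma=0.25, T=1.0))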

Relevance: 10.00%

Abstract:

The purpose of this project was to determine the feasibility of using pavement condition data collected for the Iowa Pavement Management Program (IPMP) as input to the Iowa Quadrennial Need Study. The need study, conducted by the Iowa Department of Transportation (Iowa DOT) every four years, currently uses manually collected highway infrastructure condition data (roughness, rutting, cracking, etc.). Because of the Iowa DOT's 10-year data collection cycles, condition data for a given highway segment may be up to 10 years old. In some cases, the need study process has resulted in wide fluctuations in the funding allocated to individual Iowa counties from one study to the next. This volatility in funding levels makes it difficult for county engineers to plan and program road maintenance and improvements. One possible remedy is to use more current and less subjective infrastructure condition data. The IPMP was initially developed to satisfy the Intermodal Surface Transportation Efficiency Act (ISTEA) requirement that federal-aid-eligible highways be managed through a pavement management system. Currently, all metropolitan planning organizations (MPOs) in Iowa and 15 of Iowa's 18 RPAs participate in the IPMP. The core of this program is a statewide database of pavement condition and construction history information. The pavement data are collected by machine in two-year cycles. Using pilot areas, researchers examined the implications of using the automated data collected for the IPMP as input to the need study computer program, HWYNEEDS. The results show that using the IPMP automated data in HWYNEEDS is feasible and beneficial, resulting in less volatility in the level of total need between successive quadrennial need studies. In other words, the more current the data, the smaller the shift in total need.

Relevance: 10.00%

Abstract:

The article presents and discusses long-run series of per capita GDP and life expectancy for Italy and Spain (1861-2008). After refining the available estimates to make them comparable, and drawing on the most up-to-date research, the main changes in the international economy and in technological and socio-biological regimes are used as analytical frameworks to reassess the performance of the two countries; structural breaks are then searched for, and Granger causality between the two variables is investigated. The long-run convergence notwithstanding, significant cyclical differences between the two countries can be detected: Spain began to modernize later in GDP, with higher volatility in life expectancy until recent decades; by contrast, Italy showed a more stable pattern of life expectancy, following early breaks in per capita GDP, but also a negative GDP break in the last decades. Our series confirm that, whereas at the early stages of development differences in GDP tend to mirror those in life expectancy, this is no longer true at later stages, when, if anything, there seems to be a negative correlation between GDP and life expectancy. This finding is in line with the thesis of a non-monotonic relation between life expectancy and GDP and is supported by tests of Granger causality.
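
A hedged sketch of the kind of Granger-causality test mentioned above, run on synthetic stand-ins (the 1861-2008 Italian and Spanish series are not reproduced here). In statsmodels, grangercausalitytests asks whether the second column helps predict the first beyond the first column's own lags.

    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(2)
    n = 148                                  # same length as 1861-2008
    gdp_growth = rng.normal(0.02, 0.03, n)
    noise = rng.normal(0.0, 0.05, n)
    le_gain = np.empty(n)
    le_gain[0] = 0.2
    for t in range(1, n):                    # toy link: LE gains follow lagged GDP growth
        le_gain[t] = 0.2 + 0.5 * gdp_growth[t - 1] + noise[t]

    data = np.column_stack([le_gain, gdp_growth])
    results = grangercausalitytests(data, maxlag=3)   # prints F-tests per lag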

Relevance: 10.00%

Abstract:

We study the determinants of political myopia in a rational model of electoral accountability in which the key elements are informational frictions and uncertainty. We build a framework where political ability is ex ante unknown and policy choices are not perfectly observable. On the one hand, elections improve accountability and allow voters to retain well-performing incumbents. On the other, politicians invest too little in costly policies with future returns in an attempt to signal high ability and increase their reelection probability. Contrary to the conventional wisdom, uncertainty reduces political myopia and may, under some conditions, increase social welfare. We use the model to study how political rewards can be set so as to maximise social welfare, and the desirability of imposing a one-term limit on governments. The predictions of our theory are consistent with a number of stylised facts and with a new empirical observation documented in this paper: aggregate uncertainty, measured by economic volatility, is associated with better fiscal discipline in a panel of 20 OECD countries.

Relevance: 10.00%

Abstract:

Capillary electrophoresis has drawn considerable attention in the past few years, particularly in the field of chiral separations, because of its high separation efficiency. However, its routine use in therapeutic drug monitoring is hampered by its low sensitivity, due to the short optical path. We have developed a capillary zone electrophoresis (CZE) method using 2 mM hydroxypropyl-β-cyclodextrin as a chiral selector, which allows baseline separation of the enantiomers of mianserin (MIA), desmethylmianserin (DMIA), and 8-hydroxymianserin (OHMIA). Through the use of an on-column sample concentration step after liquid-liquid extraction from plasma, and with an internal standard, the quantitation limits were found to be 5 ng/mL for each enantiomer of MIA and DMIA and 15 ng/mL for each enantiomer of OHMIA. To our knowledge, this is the first published CE method sensitive enough, down to the low-nanogram range, for therapeutic monitoring of antidepressants. The variability of the assays, assessed by the coefficients of variation (CV) measured at two concentrations for each substance, ranged from 2 to 14% for the intraday (eight replicates) and from 5 to 14% for the interday (eight replicates) experiments. The deviations from the theoretical concentrations, which represent the accuracy of the method, were all within 12.5%. A linear response was obtained for all compounds within the range of concentrations used for the calibration curves (10-150 ng/mL for each enantiomer of MIA and DMIA and 20-300 ng/mL for each enantiomer of OHMIA). Good correlations were found between the [(R) + (S)]-MIA and DMIA concentrations measured in plasma samples of 20 patients by a nonchiral gas chromatography method and by CZE, and between the (R)- and (S)-concentrations of MIA and DMIA measured in plasma samples of 37 patients by a previously described chiral high-performance liquid chromatography method and by CZE. Finally, no interference was noted from more than 20 other psychotropic drugs. Thus, this method, which is both sensitive and selective, can be routinely used for therapeutic monitoring of the enantiomers of MIA and its metabolites. It could prove very useful given the demonstrated interindividual variability of the stereoselective metabolism of MIA.
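
A small sketch of the validation arithmetic reported above: precision as the coefficient of variation (CV) of replicate measurements, and accuracy as the relative deviation from the nominal concentration. The replicate values below are made up for illustration.

    import numpy as np

    def cv_percent(replicates):
        # Precision: sample standard deviation relative to the mean, in %.
        x = np.asarray(replicates, dtype=float)
        return 100.0 * x.std(ddof=1) / x.mean()

    def accuracy_percent(replicates, nominal):
        # Accuracy: deviation of the mean from the nominal concentration, in %.
        return 100.0 * (np.mean(replicates) - nominal) / nominal

    intraday = [48.2, 51.0, 49.5, 50.8, 47.9, 50.1, 49.0, 51.5]  # ng/mL, 8 replicates
    print(cv_percent(intraday), accuracy_percent(intraday, nominal=50.0))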

Relevance: 10.00%

Abstract:

A simple method for determining airborne monoethanolamine has been developed. Monoethanolamine determination has traditionally been difficult because of analytical separation problems. Even in recent, sophisticated methods this difficulty remains the major issue, often resulting in time-consuming sample preparation. Impregnated glass fiber filters were used for sampling. Desorption of monoethanolamine was followed by capillary GC analysis with nitrogen-phosphorus selective detection. Separation was achieved using a column specific to monoethanolamine (35% diphenyl and 65% dimethyl polysiloxane). The internal standard was quinoline. No derivatization steps were needed. The calibration range was 0.5-80 μg/mL with a good correlation (R(2) = 0.996). Averaged overall precisions and accuracies were 4.8% and -7.8% for intraday (n = 30), and 10.5% and -5.9% for interday (n = 72) experiments. Mean recovery from spiked filters was 92.8% for the intraday variation and 94.1% for the interday variation. Monoethanolamine on stored spiked filters was stable for at least 4 weeks at 5°C. The newly developed method was applied among professional cleaners; air concentrations (n = 4) were 0.42 and 0.17 mg/m(3) for personal and 0.23 and 0.43 mg/m(3) for stationary measurements. The method described here is simple, sensitive, and convenient, both in terms of sampling and analysis.