879 results for time varying parameter model
Abstract:
This paper proposes a test statistic for the null hypothesis of panel stationarity that allows for the presence of multiple structural breaks. Two different specifications are considered, depending on whether the structural breaks affect the individual effects and/or the time trend. The model is flexible enough to allow the number of breaks and their position to differ across individuals. The test is shown to have an exact limit distribution and good finite-sample performance. Its application to a typical panel data set of real per capita GDP gives support to the trend stationarity of these series.
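As a rough illustration of the kind of statistic involved, the sketch below computes a univariate KPSS-type stationarity statistic after allowing for a single known level break; it is not the paper's panel test (which pools such statistics across individuals and allows multiple, individually located breaks), and the function name, break handling, and lag choice are illustrative assumptions.

```python
import numpy as np

def kpss_with_break(y, break_idx, lags=4):
    """KPSS-type stationarity statistic for a single series, after removing
    an intercept, a linear trend, and a level shift at a known break date.
    Small values are consistent with stationarity around the broken
    deterministic component."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    X = np.column_stack([np.ones(T),                                   # intercept
                         np.arange(T),                                 # linear trend
                         (np.arange(T) >= break_idx).astype(float)])   # level shift
    e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]   # detrended residuals
    S = np.cumsum(e)                                   # partial sums
    gamma0 = e @ e / T
    # long-run variance with a Bartlett kernel
    lrv = gamma0 + 2.0 * sum((1 - j / (lags + 1)) * (e[j:] @ e[:-j]) / T
                             for j in range(1, lags + 1))
    return (S @ S) / (T ** 2 * lrv)
```

Note that critical values depend on the break specification, so the standard KPSS tables do not apply directly to this broken-trend variant.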
Abstract:
Long-run economic growth arouses great interest since it can shed light on the income path of an economy and help explain the large differences in income we observe across countries and over time. The neoclassical model has been followed by several endogenous growth models which, contrary to the former, predict that economies with similar preferences and technological levels do not necessarily tend to converge to similar per capita income levels. This paper attempts to show a possible mechanism through which macroeconomic disequilibria and inefficiencies, represented by budget deficits, may hinder human capital accumulation and therefore economic growth. Using a mixed education system, the deficit is characterized as a bug agent which may end up sharply reducing the resources devoted to education and training. The paper goes a step further than the literature on deficits by introducing a rich dynamic analysis of the effects of a deficit reduction on different economic aspects. Following a simple growth model and allowing for slight changes in the law of human capital accumulation, we reach a point where the deficit might sharply reduce human capital accumulation. On the other hand, a deficit reduction carried out over a long period, interpreting that reduction as more efficient management of the economy, may prove useful in inducing endogenous growth. Empirical evidence for a sample of countries seems to support the theoretical assumptions of the model: (1) evidence of an inverse relationship between deficit and human capital accumulation, and (2) presence of a strongly negative association between the size of the deficit in the economy and the rate of growth. These findings suggest a certain role for the budget deficit in economic growth.
Abstract:
Executive Summary: The unifying theme of this thesis is the pursuit of satisfactory ways to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we apply an idea from the field of fuzzy set theory to the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results in asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third one can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds. Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model addressing some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time- and state-separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation intended to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one. Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than realized returns from portfolio strategies that are optimal with respect to single performance measures. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance.
We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to portfolio realized returns that first-order stochastically dominate the ones that result from optimization only with respect to, for example, the Treynor ratio and Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e., the sequence of expected shortfalls over a range of quantiles. Because the plot of the absolute Lorenz curve for the aggregated performance measures lay above the one corresponding to each individual measure, we were tempted to conclude that the algorithm we propose leads to a portfolio return distribution that second-order stochastically dominates those obtained from virtually all individual performance measures considered. Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index, relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests an inherent weakness in the attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results about the bounds on the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
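As a sketch of the dominance check described above (not the authors' code), the following compares two samples of realized returns through their empirical absolute Lorenz curves, i.e., cumulative integrals of the quantile function (equivalently, minus the expected shortfall times the quantile level); the function names and grid size are illustrative assumptions.

```python
import numpy as np

def absolute_lorenz(returns, grid):
    """Empirical absolute Lorenz curve L(p) = integral from 0 to p of the
    quantile function, estimated from sorted realized returns."""
    x = np.sort(np.asarray(returns, dtype=float))
    n = len(x)
    cum = np.concatenate([[0.0], np.cumsum(x)]) / n   # L(k/n) = mean of k smallest
    return np.interp(grid * n, np.arange(n + 1), cum)

def second_order_dominates(ret_a, ret_b, n_grid=200):
    """Pointwise check: A second-order stochastically dominates B if its
    absolute Lorenz curve is nowhere below B's."""
    p = np.linspace(0.0, 1.0, n_grid)
    return bool(np.all(absolute_lorenz(ret_a, p) >= absolute_lorenz(ret_b, p)))
```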
Abstract:
Neuronal oscillations are an important aspect of EEG recordings. These oscillations are supposed to be involved in several cognitive mechanisms. For instance, oscillatory activity is considered a key component for the top-down control of perception. However, measuring this activity and its influence requires precise extraction of frequency components. This processing is not straightforward. Particularly, difficulties with extracting oscillations arise due to their time-varying characteristics. Moreover, when phase information is needed, it is of the utmost importance to extract narrow-band signals. This paper presents a novel method using adaptive filters for tracking and extracting these time-varying oscillations. This scheme is designed to maximize the oscillatory behavior at the output of the adaptive filter. It is then capable of tracking an oscillation and describing its temporal evolution even during low amplitude time segments. Moreover, this method can be extended in order to track several oscillations simultaneously and to use multiple signals. These two extensions are particularly relevant in the framework of EEG data processing, where oscillations are active at the same time in different frequency bands and signals are recorded with multiple sensors. The presented tracking scheme is first tested with synthetic signals in order to highlight its capabilities. Then it is applied to data recorded during a visual shape discrimination experiment for assessing its usefulness during EEG processing and in detecting functionally relevant changes. This method is an interesting additional processing step for providing alternative information compared to classical time-frequency analyses and for improving the detection and analysis of cross-frequency couplings.
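The paper's specific oscillation-maximizing adaptive scheme is not reproduced here; as a related classical baseline, the sketch below implements a normalized-LMS adaptive line enhancer, which likewise uses an adaptive filter to separate a predictable narrowband oscillation from broadband background (filter order, delay, and step size are illustrative assumptions).

```python
import numpy as np

def adaptive_line_enhancer(x, order=32, delay=1, mu=0.5, eps=1e-8):
    """Normalized-LMS adaptive line enhancer: a linear predictor driven by a
    delayed copy of the signal enhances the narrowband (oscillatory) part,
    which remains predictable across the decorrelation delay, while broadband
    noise ends up in the prediction error."""
    x = np.asarray(x, dtype=float)
    w = np.zeros(order)
    oscillation = np.zeros_like(x)
    residual = np.zeros_like(x)
    for t in range(order + delay, len(x)):
        u = x[t - delay - order + 1:t - delay + 1][::-1]   # delayed tap vector
        y = w @ u
        e = x[t] - y
        w += mu * e * u / (u @ u + eps)                    # NLMS weight update
        oscillation[t], residual[t] = y, e
    return oscillation, residual

# usage: a slowly drifting 10 Hz oscillation buried in noise
fs = 256.0
t = np.arange(20 * int(fs)) / fs
f_inst = 10.0 + 0.5 * np.sin(2 * np.pi * 0.05 * t)          # time-varying frequency
x = np.sin(2 * np.pi * np.cumsum(f_inst) / fs) + np.random.randn(len(t))
osc, noise = adaptive_line_enhancer(x, order=64, delay=3, mu=0.1)
```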
Abstract:
We present a continuous time random walk model for the scale-invariant transport found in a self-organized critical rice pile [K. Christensen et al., Phys. Rev. Lett. 77, 107 (1996)]. From our analytical results it is shown that the dynamics of the experiment can be explained in terms of Lévy flights for the grains and a long-tailed distribution of trapping times. Scaling relations for the exponents of these distributions are obtained. The predicted microscopic behavior is confirmed by means of a cellular automaton model.
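A minimal toy simulation of such a continuous-time random walk, assuming Pareto-tailed jump lengths and trapping times (the exact distributions and exponents of the rice-pile model are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ctrw(n_jumps, jump_tail=1.5, wait_tail=0.8):
    """Toy continuous-time random walk: heavy-tailed (Pareto) jump lengths
    with tail exponent `jump_tail` (Levy-flight-like below 2) and heavy-tailed
    trapping times with tail exponent `wait_tail` (infinite mean below 1)."""
    waits = rng.pareto(wait_tail, n_jumps) + 1.0
    jumps = rng.choice([-1.0, 1.0], n_jumps) * (rng.pareto(jump_tail, n_jumps) + 1.0)
    return np.cumsum(waits), np.cumsum(jumps)   # event times, positions

times, positions = simulate_ctrw(100_000)
# anomalous transport shows up in how |position| spreads with time,
# e.g. via quantiles of the displacement at increasing times
```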
Abstract:
This study aimed to investigate the behaviour of two indicators of influenza activity in the area of Barcelona and to evaluate the usefulness of modelling them to improve the detection of influenza epidemics. DESIGN: Descriptive time series study using the number of deaths from all causes registered by funeral services and reported cases of influenza-like illness. The study covered five influenza seasons, from week 45 of 1988 to week 44 of 1993. The weekly numbers of deaths and registered cases of influenza-like illness were processed using time series ARIMA model identification. SETTING: Six large towns in the province of Barcelona, each with more than 60,000 inhabitants and its own funeral services. MAIN RESULTS: For mortality, the proposed model was an autoregressive one of order 2 (ARIMA(2,0,0)), and for morbidity it was one of order 3 (ARIMA(3,0,0)). Finally, the two time series were analysed together to detect possible interactions between them. The joint study of the two series shows that the mortality series can be modelled separately from the reported morbidity series, but the morbidity series is influenced as much by the number of previously reported influenza cases as by the previously registered mortality. CONCLUSIONS: The model based on general mortality is useful for detecting epidemic influenza activity. However, because there is no absolute gold standard that defines the beginning of an epidemic, the final decision on when to declare an epidemic and recommend control measures should be taken after evaluating all the indicators included in the influenza surveillance programme.
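A minimal sketch of fitting the two models named in the results, using statsmodels and placeholder weekly series in place of the registry data:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# weekly_deaths and weekly_ili stand in for the registry series (not public here)
weekly_deaths = pd.Series(np.random.poisson(300, 260))   # placeholder data, 5 years of weeks
weekly_ili = pd.Series(np.random.poisson(150, 260))

mortality_fit = ARIMA(weekly_deaths, order=(2, 0, 0)).fit()   # AR(2), as reported for mortality
morbidity_fit = ARIMA(weekly_ili, order=(3, 0, 0)).fit()      # AR(3), as reported for morbidity

# One-step-ahead forecasts with prediction intervals: weeks whose observed counts
# exceed the upper band are candidate weeks of epidemic activity.
print(mortality_fit.get_forecast(steps=4).conf_int())
```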
Abstract:
Laboratory and greenhouse studies were conducted with an artificial dry diet to rear nymphs, and with an artificial plant as substrate for egg laying by the southern green stink bug, Nezara viridula (L.). The artificial diet was composed of: soybean protein (15 g); potato starch (7.5 g); dextrose (7.5 g); sucrose (2.5 g); cellulose (12.5 g); vitamin mixture (niacinamide 1 g, calcium pantothenate 1 g, thiamine 0.25 g, riboflavin 0.5 g, pyridoxine 0.25 g, folic acid 0.25 g, biotin 0.02 mL, vitamin B12 1 g - added to 1,000 mL of distilled water) (5.0 mL); soybean oil (20 mL); wheat germ (17.9 g); and water (30 mL). Nymphs showed normal feeding behavior when fed on the artificial diet. Nymphal development time was longer than or similar to that of nymphs fed on soybean pods. Total nymphal mortality was low (ca. 30%), both for nymphs reared on the artificial diet and for nymphs fed on soybean pods. At adult emergence, fresh body weights were significantly (P<0.01) lower on the artificial diet than on soybean pods. Despite the lower adult survivorship and fecundity on artificial plants than on soybean plants, it was demonstrated for the first time that a model simulating a natural plant can be used as a substrate for egg mass laying, in conjunction with the artificial diet.
Abstract:
We propose new methods for evaluating predictive densities. The methods include Kolmogorov-Smirnov and Cramér-von Mises-type tests for the correct specification of predictive densities robust to dynamic mis-specification. The novelty is that the tests can detect mis-specification in the predictive densities even if it appears only over a fraction of the sample, due to the presence of instabilities. Our results indicate that our tests are well sized and have good power in detecting mis-specification in predictive densities, even when it is time-varying. An application to density forecasts of the Survey of Professional Forecasters demonstrates the usefulness of the proposed methodologies.
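A simplified sketch of the underlying idea, assuming the predictive densities are supplied as CDF callables: compute probability integral transforms and test them for uniformity, plus a crude rolling-window variant to mimic sensitivity to instabilities (this is not the paper's statistic, whose critical values account for taking a supremum over subsamples).

```python
import numpy as np
from scipy import stats

def pit_uniformity_tests(y, cdf_forecasts):
    """Probability integral transforms of realized values under the predictive
    CDFs; under correct specification they are i.i.d. Uniform(0,1)."""
    z = np.array([F(yi) for yi, F in zip(y, cdf_forecasts)])
    ks = stats.kstest(z, 'uniform')
    cvm = stats.cramervonmises(z, 'uniform')
    return z, ks, cvm

def rolling_sup_ks(z, window=60):
    """Illustration of looking for mis-specification confined to a sub-sample:
    the largest KS statistic over rolling windows (its null distribution
    differs from that of the full-sample KS test)."""
    return max(stats.kstest(z[i:i + window], 'uniform').statistic
               for i in range(len(z) - window + 1))
```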
Abstract:
Two likelihood ratio (LR) approaches are presented to evaluate the strength of evidence of MDMA tablet comparisons. The first one is based on a more 'traditional' comparison of MDMA tablets by using distance measures (e.g., Pearson correlation distance or a Euclidean distance). In this approach, LRs are calculated using the distribution of distances between tablets of the same-batch and that of different-batches. The second approach is based on methods used in some other fields of forensic comparison. Here LRs are calculated based on the distribution of values of MDMA tablet characteristics within a specific batch and from all batches. The data used in this paper must be seen as examples to illustrate both methods. In future research the methods can be applied to other and more complex data. In this paper, the methods and their results are discussed, considering their performance in evidence evaluation and several practical aspects. With respect to evidence in favor of the correct hypothesis, the second method proved to be better than the first one. It is shown that the LRs in same-batch comparisons are generally higher compared to the first method and the LRs in different-batch comparisons are generally lower. On the other hand, for operational purposes (where quick information is needed), the first method may be preferred, because it is less time consuming. With this method a model has to be estimated only once in a while, which means that only a few measurements have to be done, while with the second method more measurements are needed because each time a new model has to be estimated.
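A minimal sketch of the first, distance-based approach, assuming samples of same-batch and different-batch distances are already available (kernel density estimation is used here purely for illustration of how the two distributions yield an LR):

```python
import numpy as np
from scipy.stats import gaussian_kde

def distance_based_lr(same_batch_dists, diff_batch_dists, observed_dist):
    """Score-based likelihood ratio for a distance between two tablet profiles:
    density of the observed distance under 'same batch' versus 'different batch',
    each estimated from reference comparisons."""
    f_same = gaussian_kde(np.asarray(same_batch_dists, dtype=float))
    f_diff = gaussian_kde(np.asarray(diff_batch_dists, dtype=float))
    return f_same(observed_dist)[0] / f_diff(observed_dist)[0]

# LR > 1 supports the same-batch hypothesis, LR < 1 the different-batch hypothesis.
```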
Abstract:
BACKGROUND: This study describes seasonality of congenital anomalies in Europe to provide a baseline against which to assess the impact of specific time varying exposures such as the H1N1 pandemic influenza, and to provide a comprehensive and recent picture of seasonality and its possible relation to etiologic factors. METHODS: Data on births conceived in 2000 to 2008 were extracted from 20 European Surveillance for Congenital Anomalies population-based congenital anomaly registries in 14 European countries. We performed Poisson regression analysis encompassing sine and cosine terms to investigate seasonality of 65,764 nonchromosomal and 12,682 chromosomal congenital anomalies covering 3.3 million births. Analysis was performed by estimated month of conception. Analyses were performed for 86 congenital anomaly subgroups, including a combined subgroup of congenital anomalies previously associated with influenza. RESULTS: We detected statistically significant seasonality in prevalence of anomalies previously associated with influenza, but the conception peak was in June (2.4% excess). We also detected seasonality in congenital cataract (April conceptions, 27%), hip dislocation and/or dysplasia (April, 12%), congenital hydronephrosis (July, 12%), urinary defects (July, 5%), and situs inversus (December, 36%), but not for nonchromosomal anomalies combined, chromosomal anomalies combined, or other anomalies analyzed. CONCLUSION: We have confirmed previously described seasonality for congenital cataract and hip dislocation and/or dysplasia, and found seasonality for congenital hydronephrosis and situs inversus which have not previously been studied. We did not find evidence of seasonality for several anomalies which had previously been found to be seasonal. Influenza does not appear to be an important factor in the seasonality of congenital anomalies.
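A small sketch of the kind of Poisson regression with sine and cosine terms described in the methods, using statsmodels and hypothetical monthly counts; the column names and birth-count offset are assumptions, and the real analysis covered 86 anomaly subgroups with registry denominators.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# one row per estimated month of conception: anomaly cases and total births (placeholder data)
df = pd.DataFrame({'month': np.tile(np.arange(1, 13), 9),
                   'cases': np.random.poisson(50, 108),
                   'births': np.random.poisson(30000, 108)})

X = sm.add_constant(pd.DataFrame({
    'sin': np.sin(2 * np.pi * df['month'] / 12),
    'cos': np.cos(2 * np.pi * df['month'] / 12),
}))
fit = sm.GLM(df['cases'], X, family=sm.families.Poisson(),
             offset=np.log(df['births'])).fit()

# amplitude and peak month of the fitted seasonal curve
amp = np.hypot(fit.params['sin'], fit.params['cos'])
peak_month = (np.arctan2(fit.params['sin'], fit.params['cos']) % (2 * np.pi)) * 12 / (2 * np.pi)
print(fit.summary(), amp, peak_month)
```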
Abstract:
OBJECTIVE: To estimate the effect of combined antiretroviral therapy (cART) on mortality among HIV-infected individuals after appropriate adjustment for time-varying confounding by indication. DESIGN: A collaboration of 12 prospective cohort studies from Europe and the United States (the HIV-CAUSAL Collaboration) that includes 62 760 HIV-infected, therapy-naive individuals followed for an average of 3.3 years. Inverse probability weighting of marginal structural models was used to adjust for measured confounding by indication. RESULTS: Two thousand and thirty-nine individuals died during the follow-up. The mortality hazard ratio was 0.48 (95% confidence interval 0.41-0.57) for cART initiation versus no initiation. In analyses stratified by CD4 cell count at baseline, the corresponding hazard ratios were 0.29 (0.22-0.37) for less than 100 cells/microl, 0.33 (0.25-0.44) for 100 to less than 200 cells/microl, 0.38 (0.28-0.52) for 200 to less than 350 cells/microl, 0.55 (0.41-0.74) for 350 to less than 500 cells/microl, and 0.77 (0.58-1.01) for 500 cells/microl or more. The estimated hazard ratio varied with years since initiation of cART from 0.57 (0.49-0.67) for less than 1 year since initiation to 0.21 (0.14-0.31) for 5 years or more (P value for trend <0.001). CONCLUSION: We estimated that cART halved the average mortality rate in HIV-infected individuals. The mortality reduction was greater in those with worse prognosis at the start of follow-up.
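A schematic of inverse probability weighting for a marginal structural model on simulated data (this is not the HIV-CAUSAL analysis, which handles time-varying treatment, censoring, and many more covariates); variable names and the data-generating step are illustrative.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
# placeholder person-interval data: measured confounders, treatment, outcome
df = pd.DataFrame({'cd4': rng.normal(350, 120, n), 'age': rng.normal(40, 10, n)})
df['cart'] = rng.binomial(1, 1 / (1 + np.exp((df['cd4'] - 350) / 100)))
df['event'] = rng.binomial(1, 0.02, n)

# 1. Treatment model: probability of starting cART given measured confounders
#    (large C approximates unpenalized maximum likelihood).
ps_model = LogisticRegression(C=1e6).fit(df[['cd4', 'age']], df['cart'])
ps = ps_model.predict_proba(df[['cd4', 'age']])[:, 1]

# 2. Stabilized inverse probability weights.
p_marg = df['cart'].mean()
w = np.where(df['cart'] == 1, p_marg / ps, (1 - p_marg) / (1 - ps))

# 3. Weighted (marginal structural) pooled logistic model for the event;
#    exp(coefficient) approximates the hazard ratio when events are rare.
msm = LogisticRegression(C=1e6).fit(df[['cart']], df['event'], sample_weight=w)
print('approx. hazard ratio:', np.exp(msm.coef_[0][0]))
```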
Abstract:
One signature of adaptive radiation is a high level of trait change early during the diversification process and a plateau toward the end of the radiation. Although the study of the tempo of evolution has historically been the domain of paleontologists, recently developed phylogenetic tools allow for the rigorous examination of trait evolution in a tremendous diversity of organisms. Enemy-driven adaptive radiation was a key prediction of Ehrlich and Raven's coevolutionary hypothesis [Ehrlich PR, Raven PH (1964) Evolution 18:586-608], yet has remained largely untested. Here we examine patterns of trait evolution in 51 North American milkweed species (Asclepias), using maximum likelihood methods. We study 7 traits of the milkweeds, ranging from seed size and foliar physiological traits to defense traits (cardenolides, latex, and trichomes) previously shown to impact herbivores, including the monarch butterfly. We compare the fit of simple random-walk models of trait evolution to models that incorporate stabilizing selection (Ornstein-Uhlenbeck process), as well as time-varying rates of trait evolution. Early bursts of trait evolution were implicated for 2 traits, while stabilizing selection was implicated for several others. We further modeled the relationship between trait change and species diversification while allowing rates of trait evolution to vary during the radiation. Species-rich lineages underwent a proportionately greater decline in latex and cardenolides relative to species-poor lineages, and the rate of trait change was most rapid early in the radiation. An interpretation of this result is that reduced investment in defensive traits accelerated diversification, and disproportionately so, early in the adaptive radiation of milkweeds.
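As a stripped-down illustration of comparing a random-walk (Brownian motion) model against a stabilizing-selection (Ornstein-Uhlenbeck) model by maximum likelihood, the sketch below works on a single trait trajectory and ignores the phylogenetic covariance structure that the actual comparative analysis requires; all function names and starting values are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def bm_negloglik(params, x, dt):
    """Brownian-motion (random-walk) increments: x[t+dt] ~ N(x[t], sigma^2 * dt)."""
    sigma = np.exp(params[0])
    return -norm.logpdf(np.diff(x), loc=0.0, scale=sigma * np.sqrt(dt)).sum()

def ou_negloglik(params, x, dt):
    """Ornstein-Uhlenbeck transitions: mean reversion toward mu at rate alpha."""
    alpha, mu, sigma = np.exp(params[0]), params[1], np.exp(params[2])
    mean = mu + (x[:-1] - mu) * np.exp(-alpha * dt)
    var = sigma**2 / (2 * alpha) * (1 - np.exp(-2 * alpha * dt))
    return -norm.logpdf(x[1:], loc=mean, scale=np.sqrt(var)).sum()

def compare_bm_ou(x, dt=1.0):
    """Fit both models and compare by AIC (lower is better)."""
    x = np.asarray(x, dtype=float)
    bm = minimize(bm_negloglik, [0.0], args=(x, dt))
    ou = minimize(ou_negloglik, [0.0, x.mean(), 0.0], args=(x, dt))
    return {'AIC_BM': 2 * 1 + 2 * bm.fun, 'AIC_OU': 2 * 3 + 2 * ou.fun}

# usage: compare_bm_ou(trait_values_along_a_lineage)
```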
Abstract:
BACKGROUND AND PURPOSE: The posterior circulation Acute Stroke Prognosis Early CT Score (pc-ASPECTS) applied to CT angiography source images (CTA-SI) predicts the functional outcome of patients in the Basilar Artery International Cooperation Study (BASICS). We assessed the diagnostic and prognostic impact of pc-ASPECTS applied to perfusion CT (CTP) in the BASICS registry population. METHODS: We applied pc-ASPECTS to CTA-SI and cerebral blood flow (CBF), cerebral blood volume (CBV), and mean transit time (MTT) parameter maps of BASICS patients with CTA and CTP studies performed. Hypoattenuation on CTA-SI, relative reduction in CBV or CBF, or relative increase in MTT were rated as abnormal. RESULTS: CTA and CTP were available in 27/592 BASICS patients (4.6%). The proportion of patients with any perfusion abnormality was highest for MTT (93%; 95% confidence interval [CI], 76%-99%), compared with 78% (58%-91%) for CTA-SI and CBF, and 46% (27%-67%) for CBV (P < .001). All 3 patients with a CBV pc-ASPECTS < 8 compared to 6/23 patients with a CBV pc-ASPECTS ≥ 8 had died at 1 month (RR 3.8; 95% CI, 1.9-7.6). CONCLUSION: CTP was performed in a minority of the BASICS registry population. Perfusion disturbances in the posterior circulation were most pronounced on MTT parameter maps. CBV pc-ASPECTS < 8 may indicate patients with high case fatality.
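For reference, the reported relative risk and confidence interval can be reproduced from the counts given in the abstract (3/3 deaths with CBV pc-ASPECTS < 8 versus 6/23 with ≥ 8) using a standard log-RR Wald interval:

```python
import math

a, n1 = 3, 3     # deaths / patients with CBV pc-ASPECTS < 8
c, n2 = 6, 23    # deaths / patients with CBV pc-ASPECTS >= 8

rr = (a / n1) / (c / n2)
se_log_rr = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)
lo, hi = (rr * math.exp(s * 1.96 * se_log_rr) for s in (-1, 1))
print(f"RR = {rr:.1f} (95% CI {lo:.1f}-{hi:.1f})")   # ~3.8 (1.9-7.6), matching the abstract
```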