Abstract:
For a given self-map f of M, a closed, smooth, connected and simply-connected manifold of dimension m ≥ 4, we provide an algorithm for estimating the values of the topological invariant D_r^m[f], which equals the minimal number of r-periodic points in the smooth homotopy class of f. Our results are based on the combinatorial scheme for computing D_r^m[f] introduced by G. Graff and J. Jezierski [J. Fixed Point Theory Appl. 13 (2013), 63–84]. An open-source implementation of the algorithm programmed in C++ is publicly available at http://www.pawelpilarczyk.com/combtop/.
Abstract:
OBJECTIVE: The aim of this study is to evaluate the survival rate in a cohort of Parkinson's disease patients with and without depression. METHODS: A total of 53 Parkinson's disease subjects were followed up from 2003 to 2008, of whom 21 were diagnosed as depressed. Mean follow-up time was 3.8 (SD = 1.5) years for the whole sample, and there was no significant difference in mean follow-up time between depressed and nondepressed Parkinson's disease patients. Survival curves were fitted using the Kaplan-Meier method. The log-rank test was used to compare survival probabilities according to the selected covariables. Multivariate analysis with Cox regression was performed to estimate the effect of predictive covariables on survival. RESULTS: The cumulative global survival of this sample was 83%, with nine deaths by the end of the study: five in the depressed and four in the nondepressed group. Of these deaths, 55.6% occurred in the first year of observation, and none occurred in the fourth or fifth year of follow-up. CONCLUSION: Our findings point toward an increased death risk in depressed Parkinson's disease patients.
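The Kaplan-Meier method mentioned above can be sketched in a few lines of plain Python. This is a minimal illustration of the estimator itself; the death times and censoring flags below are invented for the example and are not the study's actual data.

```python
# Minimal Kaplan-Meier sketch (pure Python, illustrative data only).

def kaplan_meier(times, events):
    """Return (time, survival) pairs; events[i] is True for a death,
    False for a censored observation."""
    at_risk = len(times)
    surv = 1.0
    curve = []
    # Walk through the distinct observation times in increasing order.
    for t in sorted(set(times)):
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei)
        if deaths:
            # Survival drops by the fraction of at-risk subjects who died at t.
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        # Both deaths and censored subjects leave the risk set after t.
        at_risk -= sum(1 for ti in times if ti == t)
    return curve

# Hypothetical 10-subject sample (years to death or censoring).
times  = [1, 1, 2, 3, 3, 4, 5, 5, 5, 5]
events = [True, True, True, False, True, False, False, False, False, False]
print(kaplan_meier(times, events))
```

At each death time the curve multiplies in the conditional survival probability (1 - deaths/at-risk), which is exactly how ties and censoring are handled in the standard product-limit estimator.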
Abstract:
The indexable Symbolic Aggregate approXimation (iSAX) is widely used in time series data mining. Its popularity arises from the fact that it greatly reduces time series size, is symbolic, allows lower bounding, and is space efficient. However, it requires setting two parameters, the symbolic length and the alphabet size, which limits the applicability of the technique. The optimal parameter values are highly application dependent. Typically, they are either set to a fixed value or experimentally probed for the best configuration. In this work we propose an approach to automatically estimate iSAX's parameters. The approach, AutoiSAX, not only discovers the best parameter setting for each time series in the database, but also finds the alphabet size for each iSAX symbol within the same word. It is based on simple and intuitive ideas from time series complexity and statistics. The technique can be smoothly embedded in existing data mining tasks as an efficient sub-routine. We analyze its impact on visualization interpretability, classification accuracy and motif mining. Our contribution aims to make iSAX a more general approach as it evolves towards a parameter-free method.
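The two parameters the abstract refers to can be seen in the plain SAX discretization step that iSAX builds on: z-normalize, reduce with Piecewise Aggregate Approximation (PAA), then map each segment mean to a symbol via Gaussian breakpoints. The word length (4) and alphabet size (4) below are arbitrary choices for illustration; these are precisely the values AutoiSAX would estimate.

```python
# Hedged sketch of SAX discretization; parameter values are illustrative.
import math

BREAKPOINTS = [-0.6745, 0.0, 0.6745]  # N(0,1) quartiles -> alphabet size 4

def sax(series, word_length):
    n = len(series)
    mean = sum(series) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    z = [(x - mean) / std for x in series]  # z-normalize
    # PAA: average each of word_length equal-width frames.
    frame = n // word_length
    paa = [sum(z[i * frame:(i + 1) * frame]) / frame for i in range(word_length)]
    # Map each PAA value to a letter by counting breakpoints below it.
    return "".join(chr(ord("a") + sum(v > b for b in BREAKPOINTS)) for v in paa)

series = [1, 2, 3, 4, 5, 6, 7, 8]
print(sax(series, 4))  # -> "abcd": a monotone ramp climbs through the alphabet
```

A larger alphabet gives finer amplitude resolution at the cost of space, which is why the best setting is so application dependent.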
Abstract:
Ever since the appearance of the ARCH model [Engle (1982a)], an impressive array of variance specifications belonging to the same class of models has emerged [e.g. Bollerslev's (1986) GARCH; Nelson's (1990) EGARCH]. This area has developed very successfully. Nevertheless, several empirical studies suggest that the performance of such models is not always adequate [Boulier (1992)]. In this paper we propose a new specification: the Quadratic Moving Average Conditional Heteroskedasticity (QMACH) model. Its statistical properties, such as kurtosis and symmetry, are studied, along with two estimators (Method of Moments and Maximum Likelihood). Two statistical tests are presented: the first tests for homoskedasticity and the second discriminates between the ARCH and QMACH specifications. A Monte Carlo study is presented in order to illustrate some of the theoretical results. An empirical study is undertaken for the DM-US exchange rate.
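The excess kurtosis mentioned above is already visible in the simplest member of this model class. The sketch below simulates an ARCH(1) process and measures its sample kurtosis; the parameter values are illustrative only and unrelated to the paper's QMACH specification.

```python
# Hedged sketch: ARCH(1) simulation; omega and alpha are made-up values.
import math
import random

def simulate_arch1(omega, alpha, n, seed=0):
    rng = random.Random(seed)
    eps = []
    var = omega / (1.0 - alpha)  # start at the unconditional variance
    for _ in range(n):
        e = math.sqrt(var) * rng.gauss(0.0, 1.0)
        eps.append(e)
        var = omega + alpha * e * e  # conditional variance recursion
    return eps

def kurtosis(x):
    m = sum(x) / len(x)
    m2 = sum((v - m) ** 2 for v in x) / len(x)
    m4 = sum((v - m) ** 4 for v in x) / len(x)
    return m4 / (m2 * m2)

eps = simulate_arch1(omega=0.2, alpha=0.5, n=50_000)
print(kurtosis(eps))  # well above the Gaussian value of 3: fat tails
```

For ARCH(1) with alpha = 0.5 the theoretical kurtosis is 3(1 - alpha^2)/(1 - 3 alpha^2) = 9, so fat tails arise even with Gaussian innovations.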
Abstract:
Notch proteins influence cell-fate decisions in many developmental systems. Gain-of-function studies have suggested a crucial role for Notch1 signaling at several stages during lymphocyte development, including the B/T, αβ/γδ and CD4/CD8 lineage choices. Here, we critically re-evaluate these conclusions in the light of recent studies that describe inducible and tissue-specific targeting of the Notch1 gene.
Abstract:
This paper provides evidence on the sources of co-movement in monthly US and UK stock price movements by investigating the role of macroeconomic and financial variables in a bivariate system with time-varying conditional correlations. Cross-country communality in response is uncovered, with changes in the US Federal Funds rate, UK bond yields and oil prices having similar negative effects in both markets. Other variables also play a role, especially for the UK market. These effects do not, however, explain the marked increase in cross-market correlations observed from around 2000, which we attribute to time variation in the correlations of shocks to these markets. A regime-switching smooth transition model captures this time variation well and shows that the correlations increased dramatically around 1999-2000. JEL classifications: C32, C51, G15. Keywords: international stock returns, DCC-GARCH model, smooth transition conditional correlation GARCH model, model evaluation.
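The smooth-transition idea the abstract relies on can be sketched in one function: the conditional correlation moves between two regimes via a logistic transition function. The regime correlations, transition speed and midpoint below are invented for illustration; in the paper these are parameters estimated inside an STCC-GARCH model, not fixed by hand.

```python
# Hedged sketch of a logistic smooth-transition correlation; all
# parameter values (0.3, 0.8, 1.5, 2000) are illustrative assumptions.
import math

def transition_correlation(t, rho_low, rho_high, gamma, c):
    """Logistic weight g in [0, 1]; the correlation shifts from rho_low
    to rho_high as the transition variable t passes the midpoint c,
    with gamma controlling how abrupt the shift is."""
    g = 1.0 / (1.0 + math.exp(-gamma * (t - c)))
    return (1.0 - g) * rho_low + g * rho_high

# Yearly snapshots, with the midpoint near 2000 as in the abstract.
for year in (1990, 2000, 2010):
    print(year, round(transition_correlation(year, 0.3, 0.8, 1.5, 2000), 3))
```

A large gamma makes the transition approach a hard regime switch, while a small gamma gives a slow drift between the two correlation levels.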
Abstract:
In this paper, we attempt to give a theoretical underpinning to the well-established empirical stylized fact that asset returns in general, and spot FOREX returns in particular, display predictable volatility characteristics. Adopting Moore and Roche's habit persistence version of the Lucas model, we find that both the innovation in the spot FOREX return and the FOREX return itself follow "ARCH"-style processes. Using impulse response functions (IRFs), we show that the baseline simulated FOREX series has "ARCH" properties at the quarterly frequency that match well the "ARCH" properties of the empirical monthly estimations: when we scale the x-axis to synchronize the monthly and quarterly responses, we find similar impulse responses to a one-unit shock in variance. The IRFs for the ARCH processes we estimate look alike, decaying in an approximately monotonic fashion. The Lucas two-country monetary model with habit can generate realistic conditional volatility in the spot FOREX return.
Abstract:
In this paper we show that the inclusion of unemployment-tenure interaction variates in Mincer wage equations is subject to serious pitfalls. These variates were designed to test whether or not the sensitivity to the business cycle of a worker's wage varies according to her tenure. We show that three canonical variates used in the literature - the minimum unemployment rate during a worker's time at the firm (min u), the unemployment rate at the start of her tenure (Su) and the current unemployment rate interacted with a new hire dummy (δu) - can all be significant and "correctly" signed even when each worker in the firm receives the same wage, regardless of tenure (equal treatment). In matched data the problem can be resolved by the inclusion in the panel of firm-year interaction dummies. In unmatched data where this is not possible, we propose a solution for min u and Su based on Solon, Barsky and Parker's (1994) two-step method. This method is sub-optimal because it ignores a large amount of cross-tenure variation in average wages and is only valid when the scaled covariances of firm wages and firm employment are acyclical. Unfortunately δu cannot be identified in unmatched data because a differential wage response to unemployment of new hires and incumbents will appear under both equal treatment and unequal treatment.
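The three variates described above are mechanical functions of a worker's spell at the firm and the aggregate unemployment series, which the sketch below makes concrete. The toy unemployment series and hire year are invented for the example.

```python
# Hedged sketch: constructing min u, Su and the new-hire dummy for one
# worker-year observation; all data below are illustrative assumptions.

def tenure_variates(unemp_by_year, hire_year, current_year):
    """Return (min_u, s_u, new_hire) for one worker-year."""
    spell = range(hire_year, current_year + 1)
    min_u = min(unemp_by_year[y] for y in spell)      # min u over the spell
    s_u = unemp_by_year[hire_year]                    # u at start of tenure
    new_hire = 1 if current_year == hire_year else 0  # dummy for the δu term
    return min_u, s_u, new_hire

# Toy unemployment-rate series (percent), invented for illustration.
unemp = {2005: 5.1, 2006: 4.6, 2007: 4.6, 2008: 5.8, 2009: 9.3}
print(tenure_variates(unemp, hire_year=2006, current_year=2009))  # (4.6, 4.6, 0)
```

Note that min u is non-increasing in tenure by construction, which is one reason such variates can pick up spurious tenure effects even under equal treatment.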
Abstract:
This paper uses forecasts from the European Central Bank's Survey of Professional Forecasters to investigate the relationship between inflation and inflation expectations in the euro area. We use theoretical structures based on the New Keynesian and Neoclassical Phillips curves to inform our empirical work. Given the relatively short data span of the Survey of Professional Forecasters and the need to control for many explanatory variables, we use dynamic model averaging in order to ensure a parsimonious econometric specification. We use both regression-based and VAR-based methods. We find no support for the backward-looking behavior embedded in the Neoclassical Phillips curve. Much more support is found for the forward-looking behavior of the New Keynesian Phillips curve, but most of this support is found after the beginning of the financial crisis.
Abstract:
‘Modern’ Phillips curve theories predict that inflation is an integrated, or near-integrated, process. However, inflation appears bounded above and below in developed economies and so cannot be ‘truly’ integrated; it is more likely stationary around a shifting mean. If agents believe inflation is integrated, as in the ‘modern’ theories, then they are making systematic errors concerning the statistical process of inflation. An alternative theory of the Phillips curve is developed that is consistent with the ‘true’ statistical process of inflation. It is demonstrated that United States inflation data are consistent with the alternative theory but not with the existing ‘modern’ theories.
Abstract:
This study evaluates the effect of the individual's household income on their health at the later stages of working life. A structural equation model is utilised in order to derive a composite and continuous index of the latent health status from qualitative health status indicators. The endogenous relationship between health status and household income status is taken into account by using IV estimators. The findings reveal a significant effect of individual household income on health before and after endogeneity is taken into account, and after controlling for a host of other factors which are known to influence health, including hereditary factors and the individual's locus of control. Importantly, it is also shown that the childhood socioeconomic position of the individual has long-lasting effects on health, as it appears to play a significant role in determining health during the later stages of working life.
Abstract:
The UK government introduced the Renewables Obligation (RO), a system of tradable quotas, to encourage the installation of renewable electricity capacity. Each unit of generation from renewables created a renewable obligation certificate (ROC). Electricity generators must either earn ROCs through their own production, purchase ROCs in the market, or pay the buy-out price to comply with the quota set by the RO. A unique aspect of this regulation is that all entities holding ROCs receive a share of the buy-out fund (the sum of all compliance purchases using the buy-out price). This set-up ensures that the difference between the market price for ROCs and the buy-out price should equal the expected share of the buy-out fund, as regulated entities arbitrage these two compliance options. The expected share of the buy-out fund depends on whether enough renewable generation is available to meet the quota. This analysis tests whether variables associated with renewable generation or electricity demand are correlated with, and thus can help predict, the price of ROCs.
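The arbitrage relation described above reduces to simple arithmetic: a certificate is worth the avoided buy-out payment plus the expected per-ROC share of the recycled buy-out fund. The numbers below are invented for illustration and are not actual RO prices.

```python
# Hedged sketch of the no-arbitrage pricing relation; all figures are
# illustrative assumptions, not actual scheme values.

def expected_roc_price(buyout_price, prob_shortfall, expected_recycle):
    """A ROC presented for compliance avoids the buy-out payment and,
    if the quota is expected to be missed (so some parties pay the
    buy-out), also earns an expected recycle payment per certificate."""
    return buyout_price + prob_shortfall * expected_recycle

# E.g. a 30 GBP/MWh buy-out price, an 80% chance the quota is missed,
# and an expected 10 GBP/MWh recycled per certificate in that event.
print(expected_roc_price(30.0, 0.8, 10.0))  # 38.0
```

When the quota is certain to be met, the recycle term vanishes and the ROC price collapses toward the buy-out price, which is why shortfall-related variables should help predict ROC prices.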