17 results for Random-Walk Hypothesis

in Helda - Digital Repository of the University of Helsinki


Relevance: 100.00%

Abstract:

In this thesis we deal with the concept of risk. The objective is to bring together and draw conclusions from normative information regarding quantitative portfolio management and risk assessment. The first essay concentrates on return dependency. We propose an algorithm for classifying markets into rising and falling. From the algorithm we derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. Serial dependency, however, behaves differently in bull and bear markets: it is strongly positive in rising markets, whereas in bear markets returns are closer to a random walk. Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly. The results also suggest that volatility is non-stationary from time to time. In the third essay we examine the impact of market microstructure on the error between estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates, and that lower sampling frequency and non-constant volatility increase the error variation between the estimated variance and the variance of the underlying process. From these essays we can conclude that volatility is not easily estimated, even from high-frequency data. Nor is it well behaved in terms of stability or dependency over time. Based on these observations, we recommend the use of simple, transparent methods that are likely to be more robust over differing volatility regimes than models with a complex parameter universe.
In analyzing long-term return dependency in the first moment we find that the Trend Switch Probability is a robust estimator. This is an interesting area for further research, with important implications for active asset allocation.
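As an illustration of the kind of statistic described above, the sketch below classifies a price series into rising and falling trends with a simple percentage-reversal rule and then computes the fraction of periods in which the trend label switches. Both the classification rule (`classify_trends`, with its 5% threshold) and the function names are hypothetical; the abstract does not specify the thesis's actual algorithm.

```python
import numpy as np

def classify_trends(prices, threshold=0.05):
    """Label each period as rising (+1) or falling (-1): a trend switches
    once the price has moved more than `threshold` against the running
    extreme. Illustrative rule only; assumes at least two prices."""
    states = np.empty(len(prices), dtype=int)
    state = 1 if prices[1] >= prices[0] else -1
    extreme = prices[0]
    for i, p in enumerate(prices):
        if state == 1:
            extreme = max(extreme, p)
            if p < extreme * (1 - threshold):   # reversal from the peak
                state, extreme = -1, p
        else:
            extreme = min(extreme, p)
            if p > extreme * (1 + threshold):   # reversal from the trough
                state, extreme = 1, p
        states[i] = state
    return states

def trend_switch_probability(states):
    """Fraction of periods in which the trend label switches sign."""
    switches = np.sum(states[1:] != states[:-1])
    return switches / (len(states) - 1)
```

A value near 0.5 would be consistent with trend labels that carry no memory, while a low value indicates persistent trends.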

Relevance: 90.00%

Abstract:

Ever since its introduction some fifty years ago, the rational expectations paradigm has dominated the way economic theory handles uncertainty. The main assertion made by John F. Muth (1961), seen by many as the father of the paradigm, is that the expectations of rational economic agents should essentially equal the predictions of the relevant economic theory, since rational agents should use the information available to them in an optimal way. This assumption often has important consequences for the results and interpretations of the models in which it is applied. Although the rational expectations assumption can be applied to virtually any economic theory, the focus of this thesis is on macroeconomic theories of consumption, especially the Rational Expectations–Permanent Income Hypothesis proposed by Robert E. Hall in 1978. The much-debated theory suggests that, assuming agents have rational expectations about their future income, consumption decisions should follow a random walk: the best forecast of the future consumption level is the current consumption level, and changes in consumption are therefore unforecastable. This thesis constructs an empirical test of the Rational Expectations–Permanent Income Hypothesis using Finnish Consumer Survey data as well as various Finnish macroeconomic data. The sample covers the years 1995–2010. Consumer survey data may be interpreted as directly representing household expectations, which makes it an interesting tool for this particular test. The variable to be predicted is the growth of total household consumption expenditure. The main empirical result is that the Consumer Confidence Index (CCI), a balance figure computed from the most important consumer survey responses, does have statistically significant predictive power over the change in total consumption expenditure. The history of consumption expenditure growth itself, however, fails to predict its own future values.
This indicates that the CCI contains some information that the history of consumption decisions does not, and that consumption decisions are not optimal in the theoretical context. However, when conditioned on various macroeconomic variables, the CCI loses its predictive ability. This finding suggests that the index is merely a (partial) summary of macroeconomic information and does not contain any significant private information on the consumption intentions of households that is not directly deducible from the objective economic variables. In conclusion, the Rational Expectations–Permanent Income Hypothesis is strongly rejected by the empirical results in this thesis. This result is in accordance with most earlier studies on the topic.
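A minimal version of the predictive test described above can be written as an OLS regression of consumption growth on its own lag and the lagged confidence index; under the Rational Expectations–Permanent Income Hypothesis both slope coefficients should be statistically zero. The function below is an illustrative sketch with hypothetical names, not the thesis's actual specification.

```python
import numpy as np

def predictive_regression(growth, cci):
    """Regress g_t = a + b*g_{t-1} + c*CCI_{t-1} + e_t by OLS.
    Under RE-PIH the slopes b and c should be statistically zero."""
    y = growth[1:]
    X = np.column_stack([np.ones(len(y)), growth[:-1], cci[:-1]])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    # classical OLS standard errors
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return beta, beta / se   # coefficients and their t-statistics
```

The thesis's finding corresponds to a significant t-statistic on the CCI term but not on lagged growth, with the CCI term losing significance once further macroeconomic controls are added.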

Relevance: 80.00%

Abstract:

This thesis studies the informational efficiency of the European Union emission allowance (EUA) market. In an efficient market, the market price is unpredictable and above-average profits are impossible in the long run. The main research problem is whether the EUA price follows a random walk. The method is an econometric analysis of the price series, comprising an autocorrelation coefficient test and a variance ratio test. The results reveal that the price series is autocorrelated and therefore not a random walk. To determine the extent of predictability, the price series is modelled with an autoregressive model. The conclusion is that the EUA price is autocorrelated only to a small degree and that the predictability cannot be exploited to make excess profits. The EUA market is therefore considered informationally efficient, although the price series does not fulfill the requirements of a random walk. A market review supports this conclusion, but it is clear that the maturing of the market is still in progress.
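The variance ratio test mentioned above compares the variance of q-period returns with q times the variance of one-period returns; under a random walk the ratio is close to 1. A minimal sketch in the style of Lo and MacKinlay follows (function name and interface are illustrative, not the thesis's code):

```python
import numpy as np

def variance_ratio(returns, q):
    """Variance of overlapping q-period returns divided by q times the
    one-period variance. Under a random walk the ratio is near 1;
    persistence pushes it above 1, mean reversion below."""
    r = np.asarray(returns) - np.mean(returns)
    var1 = np.mean(r ** 2)
    rq = np.convolve(r, np.ones(q), mode="valid")  # overlapping q-period sums
    varq = np.mean(rq ** 2) / q
    return varq / var1
```

A full test would also compute the asymptotic standard error of the ratio to judge significance, which is what distinguishes a small but significant autocorrelation, as found here, from an exploitable one.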

Relevance: 20.00%

Abstract:

The aim of this study was to describe school leadership on a practical level. By observing the daily behaviour of a principal minute by minute, the study tried to answer the following questions: how did the principals use their time, did they have time to develop their school after participating in the daily life of the school, and how did the previously studied challenges of modern leadership show in their practical work? Five principals in different areas of Helsinki were observed – two women and three men. The principals were chosen at random from three educational conferences. The main hypothesis of this research was that the work of the principal consists of solving daily problems and routines concerning pupils, teachers and other interest groups, and of writing all kinds of bureaucratic reports. This means that the school and its principal do not have enough resources to devote to visionary development of teaching and learning – in other words, pedagogical leadership – even though every principal has the best knowledge of his or her own school’s status quo and of the development needs it reveals. The research material was gathered by applying the Peer-Assisted Leadership method. The researcher shadowed each principal for four days, three hours at a time. After each shadowing period, any unclear situations were clarified with a short interview. After all the shadowing periods, the principals participated in a semi-structured interview covering the themes that emerged from the shadowing material. In addition, the principals evaluated their own leadership with a self-assessment questionnaire. The results gathered from the shadowing material showed that the actions of the principals were focused on bureaucratic work. The principals spent most of their time in the office (more than 50%), where they were sitting mainly at the computer.
They also spent a significant amount of time in the office meeting teachers and occasional visitors. The time spent building networks was relatively short, although the principals considered it an important domain of leadership according to their interviews. After the classification of the shadowing material, the activities of the principals were divided according to certain factors affecting them. The underlying factors were quality management, daily life management, strategic thinking and emotional intelligence. Through these factors the research showed that coping with the daily life of the school took about 40% of the principals’ time. Activities connected with emotional intelligence were observed in over 30% of the time, and activities requiring strategic thinking in over 20%. The activities which according to the criteria of the research constituted quality management took only 8% of the principals’ time. This result is congruent with previous studies showing that the work of school leaders is focused on something other than developing the quality of teaching and learning.
Keywords: distributed leadership, building community, network building, interaction, emotional intelligence, strategy, quality management

Relevance: 20.00%

Abstract:

Planar curves arise naturally as interfaces between two regions of the plane. An important part of statistical physics is the study of lattice models. This thesis is about the interfaces of 2D lattice models. The scaling limit is an infinite-system limit, taken by letting the lattice mesh decrease to zero. At criticality, the scaling limit of an interface is one of the SLE curves (Schramm–Loewner evolution), introduced by Oded Schramm. This family of random curves is parametrized by a real variable, which determines the universality class of the model. The first and second papers of this thesis study properties of SLEs. They contain two different methods for studying the whole SLE curve, which is, in fact, the most interesting object from the statistical physics point of view. These methods are applied to study two symmetries of SLE: reversibility and duality. The first paper uses an algebraic method and a representation of the Virasoro algebra to find common martingales for different processes, and in that way to confirm the symmetries for polynomial expected values of natural SLE data. In the second paper, a recursion is obtained for the same kind of expected values. The recursion is based on the stationarity of the law of the whole SLE curve under an SLE-induced flow. The third paper deals with one of the most central questions of the field and provides a framework of estimates for describing 2D scaling limits by SLE curves. In particular, it is shown that a weak estimate on the probability of an annulus crossing implies that a random curve arising from a statistical physics model will have scaling limits, and that those will be well described by Loewner evolutions with random driving forces.
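For context, chordal SLE with parameter $\kappa$ is defined through the Loewner differential equation; the following standard formulation (stated here for orientation, not taken from the thesis itself) fixes the notation:

```latex
\partial_t g_t(z) = \frac{2}{g_t(z) - W_t}, \qquad g_0(z) = z,
\qquad W_t = \sqrt{\kappa}\, B_t ,
```

where $B_t$ is a standard Brownian motion and $\kappa$ selects the universality class; the curve $\gamma$ is recovered as $\gamma(t) = \lim_{y \downarrow 0} g_t^{-1}(W_t + iy)$. Reversibility asks whether the law of $\gamma$ is invariant under reversing its direction, and duality relates the parameters $\kappa$ and $16/\kappa$.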

Relevance: 20.00%

Abstract:

What can the statistical structure of natural images teach us about the human brain? Even though the visual cortex is one of the most studied parts of the brain, surprisingly little is known about how exactly images are processed to leave us with a coherent percept of the world around us, so we can recognize a friend or drive on a crowded street without any effort. By constructing probabilistic models of natural images, the goal of this thesis is to understand the structure of the stimulus that is the raison d'être for the visual system. Following the hypothesis that the optimal processing has to be matched to the structure of that stimulus, we attempt to derive computational principles, features that the visual system should compute, and properties that cells in the visual system should have. Starting from machine learning techniques such as principal component analysis and independent component analysis we construct a variety of statistical models to discover structure in natural images that can be linked to receptive field properties of neurons in primary visual cortex such as simple and complex cells. We show that by representing images with phase invariant, complex cell-like units, a better statistical description of the visual environment is obtained than with linear simple cell units, and that complex cell pooling can be learned by estimating both layers of a two-layer model of natural images. We investigate how a simplified model of the processing in the retina, where adaptation and contrast normalization take place, is connected to the natural stimulus statistics. Analyzing the effect that retinal gain control has on later cortical processing, we propose a novel method to perform gain control in a data-driven way. Finally we show how models like those presented here can be extended to capture whole visual scenes rather than just small image patches.
By using a Markov random field approach we can model images of arbitrary size, while still being able to estimate the model parameters from the data.
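As a concrete starting point of the kind mentioned above, principal component analysis is typically used to whiten small image patches before estimating ICA or complex-cell models. The sketch below (hypothetical function name; not the thesis's code) whitens a patch matrix so that the retained components are uncorrelated with unit variance:

```python
import numpy as np

def pca_whiten(patches, n_components):
    """PCA-whiten a matrix of image patches (one patch per row), the
    usual preprocessing step before ICA. Returns the whitened data and
    the whitening matrix."""
    X = patches - patches.mean(axis=0)      # center each pixel dimension
    C = X.T @ X / len(X)                    # sample covariance
    eigvals, eigvecs = np.linalg.eigh(C)    # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_components]
    W = eigvecs[:, order] / np.sqrt(eigvals[order])  # V * Lambda^{-1/2}
    Z = X @ W                               # whitened components
    return Z, W
```

Dropping the trailing low-variance components at this stage also acts as a mild noise filter, which is one reason whitening with dimensionality reduction is the standard front end for ICA on natural images.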

Relevance: 20.00%

Abstract:

Time-dependent backgrounds in string theory provide a natural testing ground for physics concerning dynamical phenomena which cannot be reliably addressed in usual quantum field theories and cosmology. A good, tractable example to study is the rolling tachyon background, which describes the decay of an unstable brane in bosonic and supersymmetric Type II string theories. In this thesis I use boundary conformal field theory along with random matrix theory and Coulomb gas thermodynamics techniques to study open and closed string scattering amplitudes off the decaying brane. The calculation of the simplest example, the tree-level amplitude of n open strings, would give us the emission rate of the open strings. However, even this has remained unknown. I organize the open string scattering computations in a more coherent manner and argue how further progress can be made.

Relevance: 20.00%

Abstract:

Serum parathyroid hormone (PTH) and vitamin D are the major regulators of extracellular calcium homeostasis. The inverse association between PTH and vitamin D and the common age-related elevation of the PTH concentration are well-known phenomena. However, the confounding or modifying factors of this relationship and their impact on the response of PTH levels to vitamin D supplementation need further investigation. Clinical conditions such as primary hyperparathyroidism (PHPT), renal failure and vitamin D deficiency, characterized by an elevation of the PTH concentration, have been associated with impaired long-term health outcomes. Curative treatments for these conditions have also been shown to decrease the PTH concentration and to attenuate some of the adverse health effects. In PHPT it has been commonly held that hypercalcaemia, the other hallmark of the disease, is the key mediator of the adverse health outcomes. In chronic kidney disease, systemic vascular disease has been proposed to have the most important impact on general health. Some evidence also indicates that vitamin D may have significant extraskeletal actions. Moreover, the frank elevation of the PTH concentration seen in advanced PHPT and in end-stage renal failure has been suggested to be at least partly causally related to an increased risk of death as well as to cognitive dysfunction, although the exact mechanisms have remained unclear. Furthermore, the predictive value of elevated PTH in unselected older populations has been less well studied. The studies presented in this thesis investigated the impact of age and mobility on the responses of PTH levels to vitamin D deficiency and supplementation. In addition, the predictive value of PTH for long-term survival and cognitive decline was addressed in an unselected population of older people.
The hypothesis was that age and chronic immobility are related to a persistently blunted elevation of the PTH concentration, even in the presence of chronic vitamin D deficiency, and to attenuated responses of PTH to vitamin D supplementation. It was further hypothesized that a slightly elevated or even high-normal PTH concentration is an independent indicator of an increased risk of death and cognitive decline in the general aged population. The data of this thesis are based on three samples: a meta-analysis of published vitamin D supplementation trials, a randomized placebo-controlled six-month vitamin D supplementation trial, and a longitudinal prospective cohort study of a general aged population. Based on a PubMed search, a meta-analysis of 52 clinical trials with 6,290 adult participants was performed to evaluate the impact of age and immobility on the responses of PTH to 25-OHD levels and vitamin D supplementation. A total of 218 chronically immobile, very old inpatients were also enrolled into a vitamin D supplementation trial. Mortality data for these patients were also collected after a two-year follow-up. Finally, data from the Helsinki Aging Study, which followed three random age cohorts (75, 80 and 85 years) until death in almost all subjects, were used to evaluate the predictive value of PTH for long-term survival and cognitive decline. This series of studies demonstrated that in older people without overt renal failure or severe hypercalcaemia, serum 25-OHD and PTH were closely associated, but this relationship was also affected by age and immobility. Furthermore, a substantial proportion of old, chronically bedridden patients did not respond to vitamin D deficiency by elevating PTH, and the effect of a high-dose (1200 IU/d) six-month cholecalciferol supplementation on the PTH concentration was minor. This study demonstrated longitudinally for the first time that the blunted PTH response also persisted over time.
Even a subtle elevation of PTH to high-normal levels predicted impaired long-term health outcomes. Slightly elevated PTH concentrations indicated an increased risk of clinically significant cognitive decline and death during the last years of life in a general aged population. This association was independent of serum ionized calcium (Ca2+) and the estimated glomerular filtration rate (GFR). A slightly elevated PTH also indicated impaired two-year survival during the terminal years of frail elderly subjects, independently of Ca2+, GFR and 25-OHD levels. The interplay between PTH and vitamin D in the regulation of calcium homeostasis is more complex than has generally been considered. In addition to musculoskeletal health, parathyroid hormone is also related to the maintenance of other important domains of health in old age. Higher PTH concentrations, even within conventional laboratory reference ranges, seem to be an independent indicator of an increased risk of all-cause and cardiovascular mortality, over and above established cardiovascular risk factors, disturbances in mineral metabolism, and renal failure. Only limited and inconsistent evidence favours a vitamin D deficiency-related lack of neuroprotective effects over a causal association between PTH and impaired cognitive function. However, the causality of these associations remains unclear. The clinical implications of the observed relationships remain to be elucidated by future studies interfering with PTH concentrations, especially long-term interventions to reduce PTH.

Relevance: 20.00%

Abstract:

Hormone therapy (HT) is widely used to relieve climacteric symptoms and thereby increase the well-being of women. The benefits as well as the side effects of HT are well documented. The principal menopausal oral symptoms are dry mouth (DM) and a sensation of painful mouth (PM), which have various causes. Profile studies have indicated that HT users are more health-conscious than non-users. The hypothesis of the present study was that there are differences in oral health between women using HT and those not using it. A questionnaire study of 3,173 women of menopausal age (50-58 years) was conducted to investigate the prevalence of self-assessed sensations of PM and DM. Of the women participating in the questionnaire study, a random sample of 400 (200 using HT, 200 not using HT) was examined clinically in a 2-year follow-up study. Oral status was recorded according to WHO methods using the DMFT and CPITN indices. Saliva flow rates were measured; salivary total protein, albumin and immunoglobulin concentrations and selected periodontal micro-organisms were analysed; and panoramic tomography of the jaws was taken. The patients filled in a structured questionnaire on their systemic health, medication and health habits. According to our questionnaire study, there was no significant difference in the occurrence of self-assessed PM or DM between HT users and non-users. According to logistic regression analyses, climacteric complaints correlated significantly with the occurrence of PM (p=0.000) and DM (p=0.000) irrespective of the use of HT, indicating that PM and DM are associated with climacteric symptoms in general. There was no difference between the groups in DMFT index values at follow-up. The number of filled teeth (FT) showed a significant (p<0.05) increase in the HT group at follow-up. Periodontitis was diagnosed in 79% of HT users at baseline and in 71% at follow-up; the corresponding values for non-HT users were 80% and 76% (NS).
The mean numbers of ≥6 mm deep periodontal pockets were 0.9 ± 1.7 at baseline vs. 1.1 ± 2.1 two years later in the HT group, and 1.0 ± 1.7 vs. 1.2 ± 1.9, respectively, in the non-HT group. In a large Finnish national health survey, the prevalence of periodontitis among women of this age group was lower, but the prevalence of severe periodontitis seemed to be higher, than in our study. Salivary albumin, IgG and IgM concentrations decreased in the HT group during the 2-year follow-up (p<0.05), possibly indicating an improvement in epithelial integrity. No difference was found in any other salivary parameter or in the prevalence of the periodontal bacteria between or within the groups. In conclusion, the present findings showed that 50- to 58-year-old women living in Helsinki have fairly good oral and dental health. The occurrence of PM and DM seemed to be associated with climacteric symptoms in general, and the use of HT did not affect the oral symptoms studied.

Relevance: 20.00%

Abstract:

This paper investigates the clustering pattern in the Finnish stock market. Using trading volume and time as factors capturing the clustering pattern in the market, the models of Keim and Madhavan (1996) and Engle and Russell (1998) provide the framework for the analysis. The descriptive and parametric analyses provide evidence that an important determinant of the famous U-shaped pattern in the market is the rate of information arrival, as measured by large trading volumes and durations at the market open and close. Specifically: 1) the larger the trading volume, the greater the impact on prices both in the short and the long run, so prices will differ across quantities; 2) large trading volume is a non-linear function of price changes in the long run; 3) arrival times are positively autocorrelated, indicating a clustering pattern; and 4) information arrivals, as approximated by durations, are negatively related to trading flow.
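Point 3) above can be checked directly: positive autocorrelation in the durations between consecutive trades is the clustering that the autoregressive conditional duration framework of Engle and Russell (1998) is built to capture. A minimal sketch follows (illustrative names; not the paper's estimation code):

```python
import numpy as np

def duration_autocorrelation(timestamps, lag=1):
    """Autocorrelation at `lag` of the durations (times between
    consecutive trades). A significantly positive value indicates
    duration clustering."""
    d = np.diff(np.sort(timestamps))  # durations between events
    d = d - d.mean()
    return (d[lag:] @ d[:-lag]) / (d @ d)
```

For an unclustered (Poisson-like) arrival process this statistic hovers around zero; a full ACD analysis would go further and model the conditional expected duration itself.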

Relevance: 20.00%

Abstract:

Markov random fields (MRFs) are popular in image processing applications for describing spatial dependencies between image units. Here we review the theory and models of MRFs, with an application to improving forest inventory estimates. Typically, autocorrelation between study units is a nuisance in statistical inference, but we take advantage of the dependencies to smooth noisy measurements by borrowing information from the neighbouring units. We build a stochastic spatial model, which we estimate with a Markov chain Monte Carlo simulation method. The smoothed values are validated against another data set, increasing our confidence that the estimates are more accurate than the originals.
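The borrowing-from-neighbours idea can be sketched with a Gaussian MRF prior on a regular grid: each cell's estimate is a precision-weighted average of its own noisy measurement and the sum of its four neighbours, iterated to convergence. This Jacobi-style update is an illustrative simplification (hypothetical names and weights); the thesis estimates its model by MCMC.

```python
import numpy as np

def mrf_smooth(y, n_iter=50, lam=1.0):
    """Smooth a noisy 2-D field under a Gaussian MRF prior by repeatedly
    setting each cell to the mean of its conditional distribution given
    the data and its 4-neighbourhood (edges replicated)."""
    x = y.astype(float).copy()
    for _ in range(n_iter):
        # sum of the four neighbours, with replicated borders
        up = np.vstack([x[:1], x[:-1]])
        down = np.vstack([x[1:], x[-1:]])
        left = np.hstack([x[:, :1], x[:, :-1]])
        right = np.hstack([x[:, 1:], x[:, -1:]])
        neigh = up + down + left + right
        # conditional mean: precision-weighted data/neighbour average
        x = (y + lam * neigh) / (1 + 4 * lam)
    return x
```

The smoothing parameter `lam` plays the role of the prior precision: larger values pull each cell harder toward its neighbourhood at the cost of fidelity to the raw measurement.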

Relevance: 20.00%

Abstract:

Reorganizing a dataset so that its hidden structure can be observed is useful in any data analysis task. For example, detecting a regularity in a dataset helps us to interpret the data, compress the data, and explain the processes behind the data. We study datasets that come in the form of binary matrices (tables with 0s and 1s). Our goal is to develop automatic methods that bring out certain patterns by permuting the rows and columns. We concentrate on the following patterns in binary matrices: consecutive-ones (C1P), simultaneous consecutive-ones (SC1P), nestedness, k-nestedness, and bandedness. These patterns reflect specific types of interplay and variation between the rows and columns, such as continuity and hierarchies. Furthermore, their combinatorial properties are interlinked, which helps us to develop the theory of binary matrices and efficient algorithms. Indeed, we can detect all these patterns in a binary matrix efficiently, that is, in time polynomial in the size of the matrix. Since real-world datasets often contain noise and errors, we rarely witness perfect patterns. Therefore we also need to assess how far an input matrix is from a pattern: we count the number of flips (from 0s to 1s or vice versa) needed to bring out the perfect pattern in the matrix. Unfortunately, for most patterns it is an NP-complete problem to find the minimum distance to a matrix that has the perfect pattern, which means that the existence of a polynomial-time algorithm is unlikely. To find patterns in datasets with noise, we need methods that are noise-tolerant and work in practical time on large datasets. The theory of binary matrices gives rise to robust heuristics that perform well on synthetic data and discover easily interpretable structures in real-world datasets: dialectal variation in spoken Finnish, a division of European locations by the hierarchies found in mammal occurrences, and co-occurring groups in network data.
In addition to determining the distance from a dataset to a pattern, we need to determine whether the pattern is significant or a mere occurrence of random chance. To this end we use significance testing: we deem a dataset significant if it appears exceptional when compared to datasets generated from a certain null hypothesis. After detecting a significant pattern in a dataset, it is up to domain experts to interpret the results in terms of the application.
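As an example of the polynomial-time detectability noted above, a perfect nestedness check only requires sorting rows by their number of ones and verifying consecutive containment (illustrative function; the thesis treats this and the other patterns in far more generality):

```python
def is_nested(matrix):
    """A 0/1 matrix is nested if its rows, viewed as sets of 1-columns,
    can be ordered into a chain under inclusion. Sort the supports by
    size, largest first, and check that each contains the next."""
    supports = sorted(
        (frozenset(j for j, v in enumerate(row) if v) for row in matrix),
        key=len, reverse=True,
    )
    return all(b <= a for a, b in zip(supports, supports[1:]))
```

With noise, by contrast, the corresponding minimum-flip distance problem becomes NP-complete, which is exactly why the thesis turns to heuristics for real data.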