12 results for Asymptotic Normality

in Helda - Digital Repository of University of Helsinki


Relevance:

10.00%

Publisher:

Abstract:

We study a Hamiltonian describing a pendulum coupled with several anisochronous oscillators, giving a simple construction of unstable KAM tori and their stable and unstable manifolds for analytic perturbations. When the coupling takes place through an even trigonometric polynomial in the angle variables, we extend analytically the solutions of the equations of motion, order by order in the perturbation parameter, to a large neighbourhood of the real line representing time. Subsequently, we devise an asymptotic expansion for the splitting (matrix) associated with a homoclinic point. This expansion consists of contributions that are manifestly exponentially small in the limit of vanishing gravity, by a shift-of-contour argument. Hence, we infer a similar upper bound for the splitting itself. In particular, the derivation of the result does not call for a tree expansion with explicit cancellation mechanisms.
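
For orientation, a schematic form of this class of Hamiltonians is the following (a standard choice in the literature; the precise model in the paper may differ in detail): a pendulum with action I and angle \varphi, coupled to anisochronous rotators with actions A and angles \alpha through a perturbation of size \varepsilon,

    H(I, \varphi, A, \alpha) = \tfrac{1}{2} I^2 + g(\cos\varphi - 1) + \tfrac{1}{2}|A|^2 + \varepsilon f(\varphi, \alpha),

where f is an even trigonometric polynomial in the angles, as assumed in the abstract, and the splitting bound is exponentially small as the gravity g tends to zero.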

Relevance:

10.00%

Publisher:

Abstract:

Minimum Description Length (MDL) is an information-theoretic principle that can be used for model selection and other statistical inference tasks. There are various ways to use the principle in practice. One theoretically valid way is to use the normalized maximum likelihood (NML) criterion, but due to computational difficulties this approach has not been used very often. This thesis presents efficient floating-point algorithms that make it possible to compute the NML for multinomial, Naive Bayes and Bayesian forest models. None of the presented algorithms relies on asymptotic analysis, and for the first two model classes we also discuss how to compute exact rational-number solutions.
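
As an illustration of the kind of computation involved, the Python sketch below evaluates the NML normalizing constant C(L, n) of the multinomial model using the linear-time recurrence of Kontkanen and Myllymäki, C(K+2, n) = C(K+1, n) + (n/K) C(K, n). This is a floating-point sketch under the assumption that this recurrence is the one intended; it does not attempt the exact rational-number variant mentioned above.

    import math

    def multinomial_nml_norm(L, n):
        """Normalizing constant C(L, n) of the NML distribution for a
        multinomial model with L categories and sample size n."""
        if L == 1:
            return 1.0                      # C(1, n): one category only
        c_prev = 1.0                        # C(1, n)
        # C(2, n) = sum_k binom(n, k) (k/n)^k ((n-k)/n)^(n-k)
        c_curr = sum(
            math.comb(n, k) * (k / n) ** k * ((n - k) / n) ** (n - k)
            for k in range(n + 1)
        )
        for K in range(1, L - 1):           # build up to C(L, n)
            c_prev, c_curr = c_curr, c_curr + (n / K) * c_prev
        return c_curr

The NML code length of a data sequence x^n is then the maximized negative log-likelihood plus log C(L, n).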

Relevance:

10.00%

Publisher:

Abstract:

When ordinary nuclear matter is heated to a high temperature of ~ 10^12 K, it undergoes a deconfinement transition to a new phase, strongly interacting quark-gluon plasma. While the color-charged fundamental constituents of the nuclei, the quarks and gluons, are at low temperatures permanently confined inside color-neutral hadrons, in the plasma the color degrees of freedom become dominant over nuclear, rather than merely nucleonic, volumes.

Quantum Chromodynamics (QCD) is the accepted theory of the strong interactions and of the confinement of quarks and gluons inside hadrons. The theory was formulated in the early seventies, but deriving first-principles predictions from it remains a challenge, and novel methods of studying it are needed. One such method is dimensional reduction, in which the high-temperature dynamics of static observables of the full four-dimensional theory is described using a simpler three-dimensional effective theory that has only the static modes of the various fields as its degrees of freedom. A perturbatively constructed effective theory is known to provide a good description of the plasma at high temperatures, where asymptotic freedom makes the gauge coupling small. Numerical lattice simulations have, however, shown that the perturbatively constructed theory gives a surprisingly good description of the plasma all the way down to temperatures a few times the transition temperature. Near the critical temperature the effective theory nevertheless ceases to give a valid description of the physics, since it fails to respect the approximate center symmetry of the full theory. The symmetry plays a key role in the dynamics near the phase transition, so one expects that the regime of validity of the dimensionally reduced theories can be significantly extended towards the deconfinement transition by incorporating the center symmetry in them.

In the introductory part of the thesis, the status of dimensionally reduced effective theories of high-temperature QCD is reviewed, with emphasis on the phase structure of the theories. In the first research paper included in the thesis, the non-perturbative input required for computing the g^6 term in the weak-coupling expansion of the pressure of QCD is computed in the effective-theory framework at an arbitrary number of colors. The last two papers focus on the construction of center-symmetric effective theories, and the first non-perturbative studies of these theories are presented. Non-perturbative lattice simulations of a center-symmetric effective theory for SU(2) Yang-Mills theory show, in sharp contrast to the perturbative setup, that the effective theory accommodates a phase transition in the correct universality class of the full theory. This transition takes place at a value of the effective-theory coupling constant that is consistent with the full-theory coupling at the critical temperature.
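
Schematically, the perturbatively constructed dimensionally reduced theory (often called EQCD) is a three-dimensional gauge theory in which the temporal gauge field A_0 survives as an adjoint scalar. A generic form of its Lagrangian, offered as orientation rather than as the precise theories constructed in the papers, is

    \mathcal{L}_{3d} = \tfrac{1}{2} \mathrm{Tr}\, F_{ij} F_{ij} + \mathrm{Tr}\, (D_i A_0)^2 + m_3^2\, \mathrm{Tr}\, A_0^2 + \lambda_3\, (\mathrm{Tr}\, A_0^2)^2 + \dots,

where the parameters m_3, \lambda_3 and the three-dimensional gauge coupling are matched to the full theory at temperature T. Because A_0 does not transform properly under the center symmetry, this form breaks that symmetry explicitly, which is the defect the center-symmetric constructions of the last two papers are designed to remedy.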

Relevance:

10.00%

Publisher:

Abstract:

This thesis studies quantile residuals and uses different methodologies to develop test statistics that are applicable in evaluating linear and nonlinear time series models based on continuous distributions. Models based on mixtures of distributions are of special interest because it turns out that for those models traditional residuals, often referred to as Pearson's residuals, are not appropriate. As such models have become more and more popular in practice, especially with financial time series data, there is a need for reliable diagnostic tools that can be used to evaluate them. The aim of the thesis is to show how such diagnostic tools can be obtained and used in model evaluation. The quantile residuals considered here are defined in such a way that, when the model is correctly specified and its parameters are consistently estimated, they are approximately independent with a standard normal distribution. All the tests derived in the thesis are pure significance tests and are theoretically sound in that they properly take the uncertainty caused by parameter estimation into account.

In Chapter 2 a general framework based on the likelihood function and smooth functions of univariate quantile residuals is derived that can be used to obtain misspecification tests for various purposes. Three easy-to-use tests aimed at detecting non-normality, autocorrelation, and conditional heteroscedasticity in quantile residuals are formulated. It also turns out that these tests can be interpreted as Lagrange Multiplier or score tests, so that they are asymptotically optimal against local alternatives.

Chapter 3 extends the concept of quantile residuals to multivariate models. The framework of Chapter 2 is generalized, and tests aimed at detecting non-normality, serial correlation, and conditional heteroscedasticity in multivariate quantile residuals are derived from it. Score test interpretations are obtained for the serial correlation and conditional heteroscedasticity tests, and in a rather restricted special case for the normality test.

In Chapter 4 the tests are constructed using the empirical distribution function of quantile residuals. The so-called Khmaladze martingale transformation is applied in order to eliminate the uncertainty caused by parameter estimation. Various test statistics are considered, so that critical bounds for histogram-type plots as well as Quantile-Quantile and Probability-Probability type plots of quantile residuals are obtained. Chapters 2, 3, and 4 contain simulations and empirical examples which illustrate the finite-sample size and power properties of the derived tests and show how the tests and related graphical tools based on residuals are applied in practice.
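
The defining transformation is simple to state: if F_t is the model's fitted conditional distribution function for observation y_t, the quantile residual is r_t = \Phi^{-1}(F_t(y_t)), which is approximately iid N(0, 1) under correct specification. The Python sketch below computes these for a generic fitted model; the name cond_cdf is illustrative, and the correction for parameter-estimation uncertainty that the thesis' tests carry out is omitted.

    import numpy as np
    from scipy import stats

    def quantile_residuals(y, cond_cdf):
        """Quantile residuals r_t = Phi^{-1}( F(y_t | past; theta_hat) ).
        `cond_cdf` maps (t, y_t) to the fitted conditional CDF value."""
        u = np.array([cond_cdf(t, y_t) for t, y_t in enumerate(y)])
        u = np.clip(u, 1e-12, 1 - 1e-12)   # guard against exact 0 or 1
        return stats.norm.ppf(u)

    # Example with a hypothetical fitted two-component Gaussian mixture,
    # the model class for which Pearson's residuals are inappropriate:
    p, m1, s1, m2, s2 = 0.8, 0.0, 1.0, 0.0, 3.0
    mix_cdf = lambda t, y: (p * stats.norm.cdf(y, m1, s1)
                            + (1 - p) * stats.norm.cdf(y, m2, s2))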

Relevance:

10.00%

Publisher:

Abstract:

This doctoral dissertation in sociology examines how human heredity became a scientific, political and personal issue in twentieth-century Finland. The study focuses on the institutionalisation of rationales and technologies concerning heredity in the context of Finnish medicine and health care. The analysis concentrates specifically on the introduction and development of prenatal screening within maternity care. The data comprise medical articles, policy documents and committee reports, as well as popular guidebooks and health magazines. The study commences with an analysis of the early twentieth-century discussions on racial hygiene and ends with an analysis of the choices given to pregnant mothers and families at present. Freedom to choose, considered by geneticists and many others a guarantee of the ethicality of medical applications, is presented in this study as a historically, politically and scientifically constructed issue. New medical testing methods have generated new possibilities of governing life itself; however, they have also created new ethical problems. Drawing on recent historical data, the study illustrates how medical risk rationales on heredity have been asserted by the medical profession into Finnish health care. It also depicts the medical profession's ambivalence between maintaining the patient's autonomy and utilizing, for example, prenatal testing according to health policy interests. Personalized risk is discussed as a result of the empirical analysis. It is shown that increasing risk awareness among the public, as well as offering choices, has had unintended consequences. According to doctors, present-day parents often want to control risks more than is considered justified or acceptable. People's hopes of anticipating the health and normality of their future children have exceeded the limits offered by medicine. The individualization of the government of heredity is closely linked to a process termed depoliticization, a concept that refers to the disembedding of medical genetics from its social contexts. Prenatal screening is regarded as based on individual choice facilitated by neutral medical knowledge; however, prenatal screening within maternity care also has its basis in health policy aims and economic calculations. The methodological basis of the study lies in Michel Foucault's writings on the history of thought, as well as in science and technology studies.

Relevance:

10.00%

Publisher:

Abstract:

Body: The foundation for the formation of the knowledge and conception of gender identity among the transgendered

The purpose of this study is to increase the understanding of the experiential formation of the knowledge and conception of one's gender and the foundation of that experience. This study is based on a qualitative method and a phenomenological approach. The research material consists of Herculine Barbin's Herculine Barbin, Christine Jorgensen's Christine Jorgensen: A Personal Autobiography, Kate Bornstein's Gender Outlaw and Deirdre McCloskey's Crossing: A Memoir. The theoretical frame of reference for the study is Michel Henry's phenomenology of the body.

The most important relations for the formation of the knowledge and conception of gender identity, at which the sensing of the body is directed, are a human being's own subjective, organic and objective bodily form as well as other people and representatives of institutions. The concept of resistance reveals that gender division and the stereotypes and accountability related to it have a dual character in culture. As a resistance they contain the potential for triggering reflection about one's own gender. As an instrument they may function as a means of exercising power and, as such, of monitoring gender normality.

According to the research material, the sources of the knowledge and conception of gender identity among the transgendered are literature, medical articles and books, the internet, clerical and medical professionals, friends and relatives, and the peer group, that is, other transgendered people. The transgendered are not only users of gender knowledge; many of them are also active producers and contributors of gender knowledge and especially of knowledge about transgenderness. The problem is that this knowledge is unevenly distributed in society. The users of gender knowledge are mainly the transgendered, researchers of different disciplines specialized in gender issues, and medical and healthcare professionals specialized in gender adjustments. Therefore not everyone has sufficient knowledge to support their own or someone else's life as a gendered being in society, or the ability to achieve gender autonomy. The quality of this knowledge is also rather narrow from the point of view of gender multiplicity.

The feeling of strangeness and the resulting experience of estrangement have, like stereotypes, a dual character in culture. They may be the reason for people's social disadvantage or exclusion, but the experiences may just as well be a resource for people's gender maturity and culture. As a cultural resource in gender issues this would mean innovativeness in creating, upholding and changing cultural gender divisions, stereotypes and accountability customs. A transgendered person may then become a liminal figure who aspires to change the limits related to resistances in society. Transgenderness is not only a medical issue but, first and foremost, an issue bearing upon the human situation as a whole, or, in other words, related to the art of life. The subject of gender adjustment treatments is not only gender itself but the art of life as a gendered being. Transgenderness would then require multidisciplinary co-operation.

Relevance:

10.00%

Publisher:

Abstract:

Topics in Spatial Econometrics — With Applications to House Prices

Spatial effects in data occur when the geographical closeness of observations influences the relation between the observations. When two points on a map are close to each other, the observed values of a variable at those points tend to be similar, and the further away the two points are from each other, the less similar the observed values tend to be. Recent technical developments, such as geographical information systems (GIS) and global positioning systems (GPS), have brought about a renewed interest in spatial matters: for instance, it is possible to observe the exact location of an observation and combine it with other characteristics. Spatial econometrics integrates spatial aspects into econometric models and analysis. The thesis concentrates mainly on methodological issues, but the findings are illustrated by empirical studies on house price data.

The thesis consists of an introductory chapter and four essays. The introductory chapter presents an overview of topics and problems in spatial econometrics. It discusses spatial effects, spatial weights matrices, especially k-nearest neighbours weights matrices, and various spatial econometric models, as well as estimation methods and inference. Further, the problem of omitted variables, a few computational and empirical aspects, the bootstrap procedure and the spatial J-test are presented, and a discussion of hedonic house price models is included.

In the first essay a comparison is made between spatial econometrics and time series analysis. By restricting attention to unilateral spatial autoregressive processes, it is shown that a unilateral spatial autoregression can be defined which enjoys properties similar to those of an autoregression with time series. Through an empirical study on house price data, the second essay shows that it is possible to form coordinate-based, spatially autoregressive variables that are, at least to some extent, able to replace the spatial structure in a spatial econometric model. In the third essay a strategy for specifying a k-nearest neighbours weights matrix by applying the spatial J-test is suggested, studied and demonstrated. In the fourth and final essay the properties of the asymptotic spatial J-test are further examined. A simulation study shows that the spatial J-test can be used to distinguish between general spatial models with different k-nearest neighbours weights matrices, and a bootstrap spatial J-test is suggested to correct the size of the asymptotic test in small samples.
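
A minimal Python sketch of the k-nearest neighbours weights matrices that recur in the essays, assuming coordinate data and row standardization (both standard choices, not necessarily the exact conventions of the thesis):

    import numpy as np
    from scipy.spatial import cKDTree

    def knn_weights(coords, k):
        """Row-standardized k-nearest-neighbours weights matrix W:
        W[i, j] = 1/k if j is among the k nearest neighbours of i, else 0.
        coords is an (n, 2) array of observation locations (no duplicates
        assumed, so each point is its own nearest neighbour)."""
        n = coords.shape[0]
        tree = cKDTree(coords)
        _, idx = tree.query(coords, k=k + 1)   # k+1: drop the point itself
        W = np.zeros((n, n))
        for i in range(n):
            W[i, idx[i, 1:]] = 1.0 / k
        return W

In a spatial autoregressive model y = rho W y + X beta + eps, the matrix enters through the spatial lag W y of the dependent variable; the spatial J-test of the third and fourth essays compares models fitted with different candidate W.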

Relevance:

10.00%

Publisher:

Abstract:

The objective of this paper is to improve option risk monitoring by examining the information content of implied volatility and by introducing the calculation of a single-sum expected risk exposure similar to Value-at-Risk. The figure is calculated in two steps: first, the value of a portfolio of options is estimated for a number of different market scenarios; second, the information content of the estimated scenarios is summarized in a single-sum risk measure. This involves the use of probability theory and return distributions, which confronts the user with the problem of non-normality in the return distribution of the underlying asset. Here the hyperbolic distribution is used as one alternative for dealing with heavy tails. The results indicate that the information content of implied volatility is useful when predicting future large returns in the underlying asset. Further, the hyperbolic distribution provides a good fit to historical returns, enabling a more accurate definition of statistical intervals and extreme events.
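
The two-step calculation can be illustrated with a small Python sketch: scenarios for the underlying are generated, the option is revalued under each, and a lower percentile of the profit-and-loss distribution gives the single-sum figure. The lognormal scenario generator below is a placeholder; the paper's point is precisely that a heavy-tailed hyperbolic distribution fits observed returns better.

    import numpy as np
    from scipy import stats

    def bs_call(S, K, T, r, sigma):
        """Black-Scholes European call value, used here to revalue scenarios."""
        d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
        d2 = d1 - sigma * np.sqrt(T)
        return S * stats.norm.cdf(d1) - K * np.exp(-r * T) * stats.norm.cdf(d2)

    def scenario_var(S0, K, T, r, sigma, horizon, n_scen, alpha=0.01, rng=None):
        """Step 1: revalue the option under simulated market scenarios.
        Step 2: summarize the revaluations as a lower percentile (VaR-style)."""
        rng = rng or np.random.default_rng(0)
        z = rng.standard_normal(n_scen)
        S_h = S0 * np.exp((r - 0.5 * sigma**2) * horizon
                          + sigma * np.sqrt(horizon) * z)
        pnl = bs_call(S_h, K, T - horizon, r, sigma) - bs_call(S0, K, T, r, sigma)
        return -np.quantile(pnl, alpha)   # loss not exceeded with prob. 1-alpha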

Relevance:

10.00%

Publisher:

Abstract:

The likelihood ratio test of cointegration rank is the most widely used test for cointegration, yet many studies have shown that its finite-sample distribution is not well approximated by the limiting distribution. The article introduces bootstrap and fast double bootstrap (FDB) algorithms for the likelihood ratio test and evaluates them by Monte Carlo simulation experiments. It finds that the performance of the bootstrap test is very good. The more sophisticated FDB produces a further improvement in cases where the performance of the asymptotic test is very unsatisfactory and the ordinary bootstrap does not work as well as it might. Furthermore, the Monte Carlo simulations provide a number of guidelines on when the bootstrap and FDB tests can be expected to work well. Finally, the tests are applied to US interest rates and international stock price series. It is found that the asymptotic test tends to overestimate the cointegration rank, while the bootstrap and FDB tests choose the correct cointegration rank.
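
The basic bootstrap device can be sketched generically in Python. The callables simulate_null and lr_stat are placeholders for, respectively, resampling from the model estimated under the null (in the article, a restricted VECM) and recomputing the likelihood ratio statistic; the article's actual algorithms are more detailed.

    import numpy as np

    def bootstrap_pvalue(lr_obs, simulate_null, lr_stat, B=999, rng=None):
        """Parametric-bootstrap p-value for a likelihood ratio test:
        draw B samples from the null-estimated model, recompute the LR
        statistic on each, and compare with the observed statistic."""
        rng = rng or np.random.default_rng(0)
        lr_boot = np.array([lr_stat(simulate_null(rng)) for _ in range(B)])
        return (1 + np.sum(lr_boot >= lr_obs)) / (B + 1)

The fast double bootstrap refines this by drawing one second-level bootstrap sample per first-level sample and using it to adjust the p-value.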

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we examine the predictability of observed volatility smiles in three major European index options markets, utilising the historical return distributions of the respective underlying assets. The analysis applies the Black (1976) pricing model adjusted in accordance with the methodology proposed by Jarrow and Rudd in 1982: the expected future returns are adjusted for the third and fourth central moments, since these represent deviations from normality in the distributions of observed returns and are therefore considered one possible explanation for the existence of the smile. The results indicate that including the higher moments in the pricing model reduces the volatility smile to some extent, compared with the unadjusted Black-76 model. However, as the smile is partly a function of supply, demand and liquidity, and as such difficult to model, this modification does not appear sufficient to fully capture the characteristics of the smile.
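
Schematically, the Jarrow-Rudd adjustment augments the baseline price with two correction terms,

    C_{JR}(K) \approx C_{B76}(K) + \lambda_1 Q_3(K) + \lambda_2 Q_4(K),

where \lambda_1 and \lambda_2 measure the discrepancies in skewness and kurtosis between the assumed lognormal distribution and the distribution of observed returns, and Q_3, Q_4 are correction terms built from derivatives of the lognormal density evaluated at the strike K. This is the generic textbook form of the expansion; the paper's exact implementation and normalization may differ.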

Relevance:

10.00%

Publisher:

Abstract:

Bootstrap likelihood ratio tests of cointegration rank are commonly used because they tend to have rejection probabilities that are closer to the nominal level than the rejection probabilities of the corresponding asymptotic tests. The effect of bootstrapping the test on its power is largely unknown. We show that a new computationally inexpensive procedure can be applied to the estimation of the power function of the bootstrap test of cointegration rank. The bootstrap test is found to have a power function close to that of the level-adjusted asymptotic test. The bootstrap test estimates the level-adjusted power of the asymptotic test highly accurately. The bootstrap test may have low power to reject the null hypothesis of cointegration rank zero, or underestimate the cointegration rank. An empirical application to Euribor interest rates is provided as an illustration of the findings.
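
For orientation, the naive Monte Carlo estimate of a power function is simply a rejection frequency under an alternative data-generating process, as in the Python sketch below. For a bootstrap test, `reject` itself contains a bootstrap loop, so naive estimation nests one simulation loop inside another; that cost is exactly what the computationally inexpensive procedure of the paper avoids (the procedure itself is not reproduced here).

    import numpy as np

    def mc_power(reject, simulate_dgp, R=1000, rng=None):
        """Naive Monte Carlo power estimate: the fraction of R datasets
        drawn from an alternative DGP on which the test rejects."""
        rng = rng or np.random.default_rng(0)
        return np.mean([reject(simulate_dgp(rng)) for _ in range(R)])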

Relevance:

10.00%

Publisher:

Abstract:

Many economic events involve initial observations that deviate substantially from the long-run steady state. Initial conditions of this type have been found to affect the power of univariate unit root tests in diverse ways, whereas their impact on multivariate tests is largely unknown. This paper investigates the impact of the initial condition on tests for cointegration rank. We compare the local power of the widely used likelihood ratio (LR) test with the local power of a test based on the eigenvalues of the companion matrix. We find that the power of the LR test is increasing in the magnitude of the initial condition, whereas the power of the other test is decreasing. The behaviour of the tests is investigated in an application to price convergence.
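
For reference, the companion matrix whose eigenvalues the alternative test exploits is the standard one: a VAR(p), y_t = A_1 y_{t-1} + ... + A_p y_{t-p} + e_t, can be stacked into a first-order system with coefficient matrix

    \mathbb{A} = \begin{pmatrix}
      A_1 & A_2 & \cdots & A_{p-1} & A_p \\
      I   & 0   & \cdots & 0       & 0   \\
      0   & I   & \cdots & 0       & 0   \\
      \vdots &  & \ddots &         & \vdots \\
      0   & 0   & \cdots & I       & 0
    \end{pmatrix},

and the number of eigenvalues of \mathbb{A} equal to one corresponds to the number of unit roots in the system, which in turn determines the cointegration rank.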