966 results for penalized likelihood
Abstract:
We propose a unifying picture in which the notion of generalized entropy is related to information theory by means of a group-theoretical approach. The group structure comes from the requirement that an entropy be well defined with respect to the composition of independent systems, in the context of a recently proposed generalization of the Shannon-Khinchin axioms. We associate with each member of a large class of entropies a generalized information measure that satisfies the additivity property on a set of independent systems as a consequence of the underlying group law. We also show that Einstein's likelihood function emerges naturally as a byproduct of our informational interpretation of (generally nonadditive) entropies. These results confirm the adequacy of composable entropies in both physical and social-science contexts.
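A schematic rendering of the two central objects (our notation, not drawn verbatim from the abstract): for statistically independent systems A and B, a composable entropy obeys a group law

    S(A \cup B) = \Phi\big(S(A), S(B)\big),

where \Phi(x, y) is a formal group law, i.e. associative, symmetric, and satisfying \Phi(x, 0) = x. Tsallis entropy, for instance, composes with \Phi(x, y) = x + y + (1 - q)\, x y, which collapses to ordinary additivity as q \to 1. "Einstein's likelihood function" refers to the inversion of Boltzmann's principle, W = \exp(S / k_B), which reads the entropy of a macrostate as a probability weight.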
Abstract:
We evaluate the use of Generalized Empirical Likelihood (GEL) estimators in portfolio efficiency tests for asset pricing models in the presence of conditional information. Estimators from the GEL family have some optimal statistical properties, such as robustness to misspecification and better properties in finite samples. Unlike GMM, the bias of GEL estimators does not increase as more moment conditions are included, which is expected in conditional efficiency analysis. We find some evidence that estimators from the GEL class really do perform differently in small samples, where efficiency tests using GEL generate lower estimates compared to tests using the standard approach with GMM. With Monte Carlo experiments we see that GEL has better performance when distortions are present in the data, especially under heavy tails and Gaussian shocks.
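For orientation (a standard textbook definition, not quoted from this abstract): given a moment function g(z, \theta) with E[g(z, \theta_0)] = 0, a GEL estimator solves the saddle-point problem

    \hat{\theta}_{GEL} = \arg\min_{\theta} \; \sup_{\lambda} \; \frac{1}{n} \sum_{i=1}^{n} \rho\big(\lambda^{\top} g(z_i, \theta)\big),

where \rho is a concave carrier function: \rho(v) = \ln(1 - v) gives empirical likelihood, \rho(v) = -e^{v} exponential tilting, and \rho(v) = -v - v^2/2 the continuously updated GMM estimator. The choice of \rho drives the higher-order bias properties the abstract appeals to.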
Abstract:
Pspline uses xtmixed to fit a penalized spline regression and plots the smoothed function. Additional covariates can be specified to adjust the smooth and plot partial residuals.
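pspline itself is a Stata command built on xtmixed, but the computation it wraps is easy to sketch. Below is a minimal Python/numpy version of a penalized spline fit with a truncated-line basis and a ridge penalty on the knot coefficients (knot placement, basis, and smoothing parameter are illustrative choices, not the command's defaults):

    import numpy as np

    def pspline_fit(x, y, n_knots=20, lam=1.0):
        """Penalized spline fit: truncated-line basis, ridge penalty on knot terms."""
        knots = np.quantile(x, np.linspace(0, 1, n_knots + 2)[1:-1])
        # Design matrix: intercept, linear term, and one hinge function per knot.
        B = np.column_stack([np.ones_like(x), x] +
                            [np.maximum(x - k, 0.0) for k in knots])
        # Penalize only the knot coefficients; the linear part stays unpenalized.
        P = np.zeros(B.shape[1])
        P[2:] = lam
        beta = np.linalg.solve(B.T @ B + np.diag(P), B.T @ y)
        return lambda x_new: np.column_stack(
            [np.ones_like(x_new), x_new] +
            [np.maximum(x_new - k, 0.0) for k in knots]) @ beta

    # Example: recover a smooth curve from noisy observations.
    rng = np.random.default_rng(1)
    x = np.sort(rng.uniform(0, 10, 300))
    y = np.sin(x) + rng.normal(0, 0.3, x.size)
    smooth = pspline_fit(x, y)

In the mixed-model formulation that xtmixed exploits, the knot coefficients become random effects, so the smoothing parameter corresponds to a variance ratio and is estimated by REML rather than fixed by hand.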
Abstract:
Transportation Department, Office of Systems Engineering, Washington, D.C.
Abstract:
Thesis--Illinois.
Abstract:
Mode of access: Internet.
Abstract:
"November 1982."
Abstract:
Eviction from housing is an institutionalized social process affecting millions in the western world, but very little is understood about its impact on people’s lives. Guided by George Brown and Tirril Harris’s landmark sociological research on disruptive life events, together with evidence that home is an important ‘place’, this study aims to contribute to an understanding of eviction’s fallout by considering depression as a potential outcome. Taking advantage of unique data on all evictions in Sweden linked to longitudinal registers, this study uses penalized maximum likelihood logistic regressions to determine whether working-age adults facing imminent eviction in 2009 had a greater risk of depression in the following year than a control group randomly drawn from the Swedish population. Results indicate that imminent eviction is significantly associated with subsequent depression, even accounting for a range of social, economic, geographic and behavioral characteristics. Contrary to expectations, the findings do not differ significantly by gender. Recent mental illness is the only control variable that significantly moderates the association of interest, which remains significant regardless of illness history. The results provide grounds for treating eviction as a disruptive life event in its own right.
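In this literature, "penalized maximum likelihood logistic regression" typically means Firth's bias-reduced logistic regression; the abstract does not name the variant, so treat the following numpy sketch of Firth's modified-score iteration as an assumption about the method rather than a transcription of it:

    import numpy as np

    def firth_logistic(X, y, n_iter=50, tol=1e-8):
        """Firth-penalized logistic regression via modified-score Newton steps.
        X: (n, k) design matrix including an intercept column; y: (n,) 0/1 outcomes."""
        beta = np.zeros(X.shape[1])
        for _ in range(n_iter):
            p = 1.0 / (1.0 + np.exp(-X @ beta))
            W = p * (1.0 - p)
            info = (X * W[:, None]).T @ X            # Fisher information X'WX
            inv_info = np.linalg.inv(info)
            # Diagonal of the hat matrix H = W^(1/2) X (X'WX)^(-1) X' W^(1/2).
            h = W * np.einsum('ij,jk,ik->i', X, inv_info, X)
            # Firth's modified score: ordinary score plus a bias-correction term.
            score = X.T @ (y - p + h * (0.5 - p))
            step = inv_info @ score
            beta += step
            if np.max(np.abs(step)) < tol:
                break
        return beta

The Jeffreys-prior penalty keeps estimates finite even under separation, which matters when a rare exposure such as imminent eviction is compared against population controls.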
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
In simultaneous analyses of multiple data partitions, the trees relevant when measuring support for a clade are the optimal tree and the best tree lacking the clade (i.e., the most reasonable alternative). The parsimony-based method of partitioned branch support (PBS) forces each data set to arbitrate between the two relevant trees. This value is the amount each data set contributes to clade support in the combined analysis, and can be very different from the support apparent in separate analyses. The approach used in PBS can also be employed in a likelihood framework: a simultaneous analysis of all data retrieves the maximum likelihood tree, and the best tree without the clade of interest is also found. Each data set is fitted to the two trees and the log-likelihood difference calculated, giving partitioned likelihood support (PLS) for each data set. These calculations can be performed regardless of the complexity of the ML model adopted. The significance of PLS can be evaluated using a variety of resampling methods, such as the Kishino-Hasegawa test, the Shimodaira-Hasegawa test, or likelihood weights, although the appropriateness and assumptions of these tests remain debated.
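The PLS arithmetic itself is trivial once each partition's log-likelihood has been computed on the two trees in a phylogenetics package; a toy Python illustration with made-up partition names and values:

    # Per-partition log-likelihoods on the simultaneous-analysis ML tree and on
    # the best tree lacking the clade of interest (hypothetical numbers).
    lnL_ml  = {"mtDNA": -5421.7, "nuclear": -8310.2, "morphology": -612.9}
    lnL_alt = {"mtDNA": -5419.3, "nuclear": -8318.8, "morphology": -613.4}

    # PLS: each partition's contribution to clade support in the combined analysis.
    pls = {part: round(lnL_ml[part] - lnL_alt[part], 1) for part in lnL_ml}
    # mtDNA: -2.4 (conflicts with the clade); nuclear: +8.6; morphology: +0.5.
    # The values sum to the total log-likelihood difference between the two trees,
    # and a partition can oppose a clade it would support in a separate analysis.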
Abstract:
We present a novel method, called the transform likelihood ratio (TLR) method, for estimation of rare-event probabilities with heavy-tailed distributions. Via a simple transformation (change of variables) technique, the TLR method reduces the original rare-event probability estimation with heavy-tailed distributions to an equivalent one with light-tailed distributions. Once this transformation has been established, we estimate the rare-event probability via importance sampling, using the classical exponential change of measure or the standard likelihood ratio change of measure. In the latter case the importance sampling distribution is chosen from the same parametric family as the transformed distribution. We estimate the optimal parameter vector of the importance sampling distribution using the cross-entropy method. We prove the polynomial complexity of the TLR method for certain heavy-tailed models and demonstrate numerically its high efficiency for various heavy-tailed models previously thought to be intractable. We also show that the TLR method can be viewed as a universal tool in the sense that it not only provides a unified view of heavy-tailed simulation but can also be used efficiently in simulation with light-tailed distributions. We present extensive simulation results which support the efficiency of the TLR method.
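A minimal Python sketch of the transformation step for a Pareto tail probability (toy parameters; a single closed-form cross-entropy tilt stands in for the paper's iterative cross-entropy optimization):

    import numpy as np

    rng = np.random.default_rng(0)
    alpha, gamma, n = 1.5, 1e4, 100_000

    # Heavy-tailed target: X ~ Pareto(alpha) on [1, inf), so P(X > gamma) = gamma**-alpha.
    # Change of variables: Y = alpha*ln(X) is Exp(1), i.e. light-tailed, and the
    # rare event {X > gamma} becomes {Y > y_star}.
    y_star = alpha * np.log(gamma)

    # Importance sampling on the transformed problem with an exponential change of
    # measure: sample Y ~ Exp(rate=theta), theta < 1. For this family the
    # cross-entropy-optimal rate has the closed form 1/(1 + y_star).
    theta = 1.0 / (1.0 + y_star)
    y = rng.exponential(scale=1.0 / theta, size=n)
    w = np.exp(-(1.0 - theta) * y) / theta        # likelihood ratio Exp(1)/Exp(theta)
    estimate = np.mean(w * (y > y_star))

    print(estimate, gamma**-alpha)                # estimate vs. exact value 1e-6

Crude Monte Carlo with the same budget would almost never hit the event; the transformed and tilted estimator recovers the 10^-6 probability with a relative error of a few percent.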
Abstract:
In diagnosis and prognosis, we should avoid intuitive “guesstimates” and seek a validated numerical aid
Abstract:
In cell lifespan studies the exponential nature of cell survival curves is often interpreted as showing that the rate of death is independent of the age of the cells within the population. Here we present an alternative model where cells that die are replaced and the age and lifespan of the population pool are monitored until a steady state is reached. In our model newly generated individual cells are given a determined lifespan drawn from a number of known distributions, including the lognormal, which is frequently found in nature. For lognormal lifespans the analytic steady-state survival curve obtained can be well fitted by a single or double exponential, depending on the mean and standard deviation. Thus, experimental evidence for exponential lifespans of one and/or two populations cannot be taken as definitive evidence for time and age independence of cell survival. A related model for a dividing population in steady state is also developed. We propose that the common adoption of age-independent, constant rates of change in biological modelling may be responsible for significant errors, both of interpretation and of mathematical deduction. We suggest that additional mathematical and experimental methods must be used to resolve the relationship between time and behavioural changes by cells that are predominantly unsynchronized.
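A minimal Python simulation of the replacement model (all parameter values are illustrative): each slot in the pool is refilled on death with a fresh lognormal lifespan, and after a long burn-in the remaining lifetimes of the cells alive at the census define the steady-state survival curve.

    import numpy as np

    rng = np.random.default_rng(0)
    n_slots, burn_in = 10_000, 200.0
    mu, sigma = 1.0, 0.8              # lognormal lifespan parameters (illustrative)

    # Run each slot's renewal process past the burn-in time and record how much
    # lifespan the cell alive at the census has left.
    remaining = np.empty(n_slots)
    for i in range(n_slots):
        t = 0.0
        while True:
            life = rng.lognormal(mu, sigma)
            if t + life > burn_in:
                remaining[i] = t + life - burn_in
                break
            t += life

    # Steady-state survival curve of the pool: S(t) = P(remaining lifetime > t).
    ts = np.linspace(0.0, 20.0, 81)
    S = np.array([(remaining > t).mean() for t in ts])

    # A log-linear fit over the observed range can look convincingly exponential
    # even though no individual cell dies at a constant, age-independent rate.
    slope, intercept = np.polyfit(ts[S > 0], np.log(S[S > 0]), 1)

That fitted straight line summarizes the renewal equilibrium, not an age-independent hazard, which is exactly the misreading the abstract warns against.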