837 results for Out-of-sample


Relevance:

100.00%

Publisher:

Abstract:

This purely theoretical thesis covers aspects of two contemporary research fields: the non-equilibrium dynamics of quantum systems and the electronic properties of three-dimensional topological insulators. In the first part we investigate the non-equilibrium dynamics of closed quantum systems. Thanks to recent technologies, especially from the field of ultracold quantum gases, it is possible to realize such systems in the laboratory. The focus is on the influence of hydrodynamic slow modes on the thermalization process. Generic systems, either classical or quantum, are described in equilibrium by thermodynamics, which is characterized by an ensemble of maximal entropy constrained by macroscopically conserved quantities. We show that these conservation laws slow down thermalization and that the final equilibrium state can be approached only algebraically in time. When the conservation laws are violated, thermalization takes place exponentially in time. In a different study we calculate probability distributions of projective quantum measurements. Newly developed quantum microscopes provide the opportunity to realize new measurement protocols which go far beyond the conventional measurement of correlation functions. The second part of this thesis is dedicated to a new class of materials known as three-dimensional topological insulators. Here, too, new experimental techniques have made it possible to fabricate these materials at a high enough quality that their topological nature is revealed. However, their transport properties are not yet fully understood. Motivated by unusual experimental results for the optical conductivity, we have investigated the formation and thermal destruction of spatially localized electron- and hole-doped regions. These are caused by charged impurities which are introduced into the material in order to make the bulk insulating. Our theoretical results agree with the experiment and can explain the results semi-quantitatively. Furthermore, we study emergent length scales in the bulk as well as close to the conducting surface.
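
To make the headline contrast concrete, in schematic form (the exponent alpha and timescale tau are placeholders for whatever a specific observable and model yield, not results quoted from the thesis):

```latex
% Schematic relaxation of an observable O(t) toward its thermal value O_eq.
% The exponent alpha and timescale tau are illustrative placeholders.
\[
  \delta O(t) \equiv \langle O(t) \rangle - O_{\mathrm{eq}} \;\sim\;
  \begin{cases}
    t^{-\alpha}, & \text{conservation laws intact (hydrodynamic slow modes)},\\
    e^{-t/\tau}, & \text{conservation laws violated}.
  \end{cases}
\]
```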

Relevance:

100.00%

Publisher:

Abstract:

This PhD thesis contains three main chapters on macro finance, with a focus on the term structure of interest rates and the applications of state-of-the-art Bayesian econometrics. Except for Chapter 1 and Chapter 5, which set out the general introduction and conclusion, each of the chapters can be considered as a standalone piece of work. In Chapter 2, we model and predict the term structure of US interest rates in a data-rich environment. We allow the model dimension and parameters to change over time, accounting for model uncertainty and sudden structural changes. The proposed time-varying parameter Nelson-Siegel Dynamic Model Averaging (DMA) predicts yields better than standard benchmarks. DMA performs better since it incorporates more macro-finance information during recessions. The proposed method allows us to estimate plausible real-time term premia, whose countercyclicality weakened during the financial crisis. Chapter 3 investigates global term structure dynamics using a Bayesian hierarchical factor model augmented with macroeconomic fundamentals. More than half of the variation in the bond yields of seven advanced economies is due to global co-movement. Our results suggest that global inflation is the most important factor among global macro fundamentals. Non-fundamental factors are essential in driving global co-movements, and are closely related to sentiment and economic uncertainty. Lastly, we analyze asymmetric spillovers in global bond markets connected to diverging monetary policies. Chapter 4 proposes a no-arbitrage framework of term structure modeling with learning and model uncertainty. The representative agent considers parameter instability, as well as the uncertainty in learning speed and model restrictions. The empirical evidence shows that, apart from observational variance, parameter instability is the dominant source of predictive variance when compared with uncertainty in learning speed or model restrictions. When accounting for ambiguity aversion, the out-of-sample predictability of excess returns implied by the learning model can be translated into significant and consistent economic gains over the Expectations Hypothesis benchmark.
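
For readers unfamiliar with the two building blocks named here, the sketch below shows the standard Nelson-Siegel factor loadings and a forgetting-factor weight update in the spirit of Raftery-style dynamic model averaging. It is a minimal illustration, not the thesis's estimation code; the decay value lam=0.0609 is the one conventionally used in the Diebold-Li literature and is assumed here.

```python
import numpy as np

def nelson_siegel_loadings(maturities, lam=0.0609):
    """Level, slope and curvature loadings of the Nelson-Siegel curve."""
    m = np.asarray(maturities, dtype=float)
    slope = (1 - np.exp(-lam * m)) / (lam * m)
    curvature = slope - np.exp(-lam * m)
    return np.column_stack([np.ones_like(m), slope, curvature])

def dma_weights(pred_log_liks, alpha=0.99):
    """Recursive DMA weights: previous posteriors are flattened by a
    forgetting factor alpha, then updated with each model's
    one-step-ahead predictive log-likelihood (rows of pred_log_liks)."""
    T, K = pred_log_liks.shape
    w = np.full(K, 1.0 / K)                  # equal initial model weights
    weights = np.empty((T, K))
    for t in range(T):
        prior = w ** alpha
        prior /= prior.sum()
        post = prior * np.exp(pred_log_liks[t] - pred_log_liks[t].max())
        w = post / post.sum()
        weights[t] = w
    return weights
```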

Relevance:

100.00%

Publisher:

Abstract:

In a high-mobility two-dimensional electron gas (2DEG) realized in a GaAs/Al0.3Ga0.7As quantum well we observe changes in the Shubnikov-de Haas oscillations (SdHO) and in the Hall resistance for different sample geometries. For each sample geometry we observe a strong negative magnetoresistance that consists of a peak around zero magnetic field and a huge magnetoresistance at larger fields. The peak around zero magnetic field remains unchanged across the different geometries.


Relevance:

100.00%

Publisher:

Abstract:

Many exchange rate papers articulate the view that instabilities constitute a major impediment to exchange rate predictability. In this thesis we implement Bayesian and other techniques to account for such instabilities, and examine some of the main obstacles to exchange rate models' predictive ability. We first consider in Chapter 2 a time-varying parameter model in which fluctuations in exchange rates are related to short-term nominal interest rates ensuing from monetary policy rules, such as Taylor rules. Unlike existing exchange rate studies, the parameters of our Taylor rules are allowed to change over time, in light of the widespread evidence of shifts in fundamentals, for example in the aftermath of the Global Financial Crisis. Focusing on quarterly data from the crisis onwards, we detect forecast improvements upon a random walk (RW) benchmark for at least half, and for as many as seven out of 10, of the currencies considered. Results are stronger when we allow the time-varying parameters of the Taylor rules to differ between countries. In Chapter 3 we look closely at the role of time variation in parameters, and of other sources of uncertainty, in hindering exchange rate models' predictive power. We apply a Bayesian setup that incorporates the notion that the relevant set of exchange rate determinants, and their corresponding coefficients, change over time. Using statistical and economic measures of performance, we first find that predictive models which allow for sudden, rather than smooth, changes in the coefficients yield significant forecast improvements and economic gains at horizons beyond one month. At shorter horizons, however, our methods fail to forecast better than the RW, and we identify uncertainty in the coefficients' estimation, and uncertainty about the precise degree of coefficient variability to incorporate in the models, as the main factors obstructing predictive ability. Chapter 4 focuses on the problem of the time-varying predictive ability of economic fundamentals for exchange rates. It uses bootstrap-based methods to uncover the time-specific conditioning information for predicting fluctuations in exchange rates. Employing several metrics for the statistical and economic evaluation of forecasting performance, we find that our approach, based on pre-selecting and validating fundamentals across bootstrap replications, generates more accurate forecasts than the RW. The approach, known as bumping, robustly reveals parsimonious models with out-of-sample predictive power at the 1-month horizon, and outperforms alternative methods, including Bayesian, bagging, and standard forecast combinations. Chapter 5 exploits the predictive content of daily commodity prices for monthly commodity-currency exchange rates. It builds on the idea that the effect of daily commodity price fluctuations on commodity currencies is short-lived, and therefore harder to pin down at low frequencies. Using MIxed DAta Sampling (MIDAS) models, and Bayesian estimation methods to account for time variation in predictive ability, the chapter demonstrates the usefulness of suitably exploiting such short-lived effects to improve exchange rate forecasts. It further shows that the usual low-frequency predictors, such as money supply and interest rate differentials, typically receive little support from the data at the monthly frequency, whereas MIDAS models featuring daily commodity prices receive high posterior support. The chapter also introduces the random-walk Metropolis-Hastings technique as a new tool to estimate MIDAS regressions.
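
Since the chapter's headline tool is the random-walk Metropolis-Hastings sampler applied to MIDAS regressions, a generic sketch of both ingredients may help. This is the textbook algorithm together with the standard exponential Almon MIDAS weighting scheme, not the chapter's actual implementation; log_post stands for any user-supplied log-posterior.

```python
import numpy as np

def exp_almon_weights(theta1, theta2, n_lags):
    """Exponential Almon lag polynomial, the usual MIDAS weighting
    scheme: w_j proportional to exp(theta1*j + theta2*j**2)."""
    j = np.arange(1, n_lags + 1)
    w = np.exp(theta1 * j + theta2 * j ** 2)
    return w / w.sum()

def rw_metropolis(log_post, theta0, n_iter=20000, step=0.1, seed=0):
    """Random-walk Metropolis-Hastings: propose theta + step*N(0, I),
    accept with probability min(1, exp(lp_new - lp_old))."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    draws = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        proposal = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_post(proposal)
        if np.log(rng.uniform()) < lp_prop - lp:   # MH acceptance step
            theta, lp = proposal, lp_prop
        draws[i] = theta
    return draws
```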

Relevance:

100.00%

Publisher:

Abstract:

“Falling Out of the Sky” is a collection of poems, both formal and free verse, that explores an intimate familial landscape. In particular, these poems raise the question of what it means to be human through examinations of family mythology and its changes as bodies and memories become unreliable with time.

Relevance:

100.00%

Publisher:

Abstract:

Poverty is a global problem that affects people in different ways. The purpose of this article is to explore two major theories that address poverty and the possibility of overcoming it: the human capital and human capabilities approaches. The human capital approach focuses exclusively on the economic facet of poverty; in this perspective, poverty is defined as a lack of money and can be addressed by increasing the financial income of people living in poverty. The human capabilities approach views poverty as a multidimensional problem that extends beyond economics into areas such as health, education, and freedom. This approach is oriented towards social change and towards helping people living in poverty to discover and develop their potential. The author considers that the human capabilities approach more accurately captures the scope of poverty and the people affected by it, although, because of its broad range, it has been difficult to design and implement effective policies that address all facets of poverty.

Relevance:

100.00%

Publisher:

Abstract:

Background Many acute stroke trials have given neutral results. Sub-optimal statistical analyses may be failing to detect efficacy. Methods which take account of the ordinal nature of functional outcome data are more efficient. We compare sample size calculations for dichotomous and ordinal outcomes for use in stroke trials. Methods Data from stroke trials studying the effects of interventions known to positively or negatively alter functional outcome (Rankin Scale and Barthel Index) were assessed. Sample size was calculated using comparisons of proportions, means, medians (according to Payne), and ordinal data (according to Whitehead). The sample sizes obtained from each method were compared using Friedman two-way ANOVA. Results Fifty-five comparisons (54 173 patients) of active vs. control treatment were assessed. Estimated sample sizes differed significantly depending on the method of calculation (P < 0.00001). The ordering of the methods showed that the ordinal method of Whitehead and the comparison of means produced significantly lower sample sizes than the other methods. The ordinal data method on average reduced sample size by 28% (inter-quartile range 14–53%) compared with the comparison of proportions; however, a 22% increase in sample size was seen with the ordinal method for trials assessing thrombolysis. The comparison-of-medians method of Payne gave the largest sample sizes. Conclusions Choosing an ordinal rather than a binary method of analysis allows most trials to be, on average, smaller by approximately 28% for a given statistical power. Smaller trial sample sizes may help by reducing time to completion, complexity, and financial expense. However, ordinal methods may not be optimal for interventions which both improve functional outcome
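
For concreteness, the two headline calculations can be sketched as follows, using Whitehead's (1993) proportional-odds formula for ordinal outcomes against the usual two-proportion formula. The category proportions and odds ratio below are invented illustrative inputs, not data from the review.

```python
from math import log
from scipy.stats import norm

def n_ordinal_whitehead(mean_props, odds_ratio, alpha=0.05, power=0.9):
    """Total sample size for an ordinal outcome under proportional odds
    (Whitehead, 1993): N = 6*z^2 / [(log OR)^2 * (1 - sum(p_k^3))],
    where p_k are mean category proportions pooled across both arms."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return 6 * z ** 2 / ((log(odds_ratio) ** 2)
                         * (1 - sum(p ** 3 for p in mean_props)))

def n_binary(p1, p2, alpha=0.05, power=0.9):
    """Total sample size (two equal arms) for comparing two proportions."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return 2 * z ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2

# Invented 6-category outcome distribution and treatment effect:
props = [0.10, 0.15, 0.20, 0.25, 0.20, 0.10]
print(n_ordinal_whitehead(props, odds_ratio=0.7))  # ordinal calculation
print(n_binary(0.25, 0.32))                        # dichotomized version
```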

Relevance:

100.00%

Publisher:

Abstract:

In quantitative risk analysis, the problem of estimating small threshold exceedance probabilities and extreme quantiles arises ubiquitously in bio-surveillance, economics, natural disaster insurance, quality control schemes, etc. A useful way to assess extreme events is to estimate the probabilities of exceeding large threshold values, and the extreme quantiles, judged relevant by interested authorities. Such information regarding extremes serves as essential guidance to those authorities in decision-making processes. However, in such a context, data are usually skewed in nature, and the rarity of exceedances of large thresholds implies large fluctuations in the distribution's upper tail, precisely where accuracy is most desired. Extreme Value Theory (EVT) is a branch of statistics that characterizes the behavior of the upper or lower tails of probability distributions. However, existing EVT methods for the estimation of small threshold exceedance probabilities and extreme quantiles often lead to poor predictive performance when the underlying sample is not large enough or does not contain values in the distribution's tail. In this dissertation, we are concerned with an out-of-sample semiparametric (SP) method for the estimation of small threshold exceedance probabilities and extreme quantiles. The proposed SP method for interval estimation calls for the fusion, or integration, of a given data sample with external, computer-generated independent samples. Since more data are used, real as well as artificial, under certain conditions the method produces relatively short yet reliable confidence intervals for small exceedance probabilities and extreme quantiles.
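
The dissertation's semiparametric method fuses the observed sample with computer-generated samples; as baseline context, the sketch below shows the classical peaks-over-threshold EVT estimators that such methods seek to improve upon. Threshold choice and data are the user's; this is not the proposed SP method.

```python
import numpy as np
from scipy.stats import genpareto

def pot_tail_estimates(data, u, x=None, p=None):
    """Classical peaks-over-threshold estimation: fit a generalized
    Pareto distribution to excesses over threshold u, then estimate
    P(X > x) and/or the extreme quantile q with P(X > q) = p."""
    data = np.asarray(data, dtype=float)
    excess = data[data > u] - u
    zeta_u = excess.size / data.size              # empirical P(X > u)
    xi, _, sigma = genpareto.fit(excess, floc=0)  # excesses start at 0
    result = {}
    if x is not None:
        result["P(X>x)"] = zeta_u * genpareto.sf(x - u, xi, scale=sigma)
    if p is not None:                             # requires p < zeta_u
        result["quantile"] = u + genpareto.ppf(1 - p / zeta_u, xi, scale=sigma)
    return result
```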

Relevance:

100.00%

Publisher:

Abstract:

Background: Partially clonal organisms are very common in nature, yet the influence of partial asexuality on the temporal dynamics of genetic diversity remains poorly understood. Mathematical models accounting for clonality predict deviations only for extremely rare sex, and only towards a mean inbreeding coefficient F̄_IS < 0. Yet in partially clonal species both F_IS < 0 and F_IS > 0 are frequently observed, even in populations where there is evidence for a significant amount of sexual reproduction. Here, we studied the joint effects of partial clonality, mutation and genetic drift with a state-and-time-discrete Markov chain model describing the dynamics of F_IS over time under increasing rates of clonality. Results: Results of the mathematical model and simulations show that partial clonality slows down the asymptotic convergence to F_IS = 0. Thus, although clonality alone does not lead to departures from Hardy-Weinberg expectations once the final equilibrium state is reached, both negative and positive F_IS values can arise transiently, even at intermediate rates of clonality. More importantly, such "transient" departures from Hardy-Weinberg proportions may last long, as clonality tunes up the temporal variation of F_IS and reduces its rate of change over time, leading to a hyperbolic increase of the maximal time needed to reach the final mean value F̄_IS,∞ expected at equilibrium. Conclusion: Our results argue for a dynamical interpretation of F_IS in clonal populations. Negative values cannot be interpreted as unequivocal evidence for extremely scarce sex; they may also reflect intermediate rates of clonality in finite populations. Complementary observations (e.g. the frequency distribution of multilocus genotypes, population history) or time-series data may help to discriminate between the different possible conclusions on the extent of clonality when mean F̄_IS values deviating from zero and/or a large variation of F_IS over loci are observed.
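
A minimal individual-based version of the setting described above can be simulated directly; the sketch below tracks F_IS = 1 - Ho/He at a single locus under clonality rate c with infinite-alleles mutation. All parameter values are illustrative, and this is a toy re-implementation of the scenario, not the authors' Markov chain model.

```python
import numpy as np

def simulate_fis(N=500, c=0.8, mu=1e-3, generations=2000, seed=1):
    """Diploid population of N individuals at one locus; each offspring
    is clonal (copies one parent's genotype) with probability c, and
    sexual (one random allele from each of two parents) otherwise.
    Returns F_IS = 1 - Ho/He per generation."""
    rng = np.random.default_rng(seed)
    geno = np.zeros((N, 2), dtype=int)     # start monomorphic
    next_allele, fis = 1, []
    for _ in range(generations):
        parents = rng.integers(N, size=(N, 2))
        clonal = rng.random(N) < c
        new = np.empty_like(geno)
        new[clonal] = geno[parents[clonal, 0]]          # clonal copies
        sex = ~clonal
        k = sex.sum()
        new[sex, 0] = geno[parents[sex, 0], rng.integers(2, size=k)]
        new[sex, 1] = geno[parents[sex, 1], rng.integers(2, size=k)]
        mut = rng.random(new.shape) < mu                # infinite alleles
        new[mut] = np.arange(next_allele, next_allele + mut.sum())
        next_allele += mut.sum()
        geno = new
        ho = np.mean(geno[:, 0] != geno[:, 1])          # observed het.
        _, counts = np.unique(geno, return_counts=True)
        he = 1.0 - np.sum((counts / (2 * N)) ** 2)      # expected het.
        fis.append(1 - ho / he if he > 0 else 0.0)
    return np.array(fis)
```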

Relevance:

100.00%

Publisher:

Abstract:

This article contests Sean McMeekin’s claims concerning Russian culpability for the First World War. McMeekin maintains that Ottoman rearmament, particularly the purchase of several battleships released onto the global arms market by South American states, threatened to create a situation where the Russian Black Sea Fleet would be outclassed by its Ottoman opposite number. Rather than waiting for this to happen, the tsarist regime chose to go to war. Yet, contrary to McMeekin’s claims, the Ottoman naval expansion never assumed threatening dimensions because the Porte was unable to purchase battleships from Chile or Argentina. As a result, it provided no incentive for Russia to go to war in 1914.

Relevance:

100.00%

Publisher:

Abstract:

Background and Aim: The prevalence of alcohol use has increased globally. Out-of-school youth are a vulnerable group who may have missed opportunities to learn healthy behaviours in a formal school environment. The purpose of this study was to determine the risk perception, pattern of use, and correlates of alcohol use among out-of-school youth in Lagos, Nigeria. Methods: A cross-sectional study was conducted among 380 out-of-school youth in motor parks in Lagos State, Nigeria, using interviewer-administered questionnaires. Results: The lifetime prevalence of alcohol use was 61.1%, while 55.5% were current drinkers. Beer (57.3%) was the most consumed type of alcohol, followed by distilled spirits (29.8%). Using the CAGE scoring system, more than half (57.8%) of the current drinkers had a drinking problem. Almost three quarters (70.1%) had experienced at least one episode of alcohol intoxication within the past month. A considerable number of current drinkers (63.5%) desired to reduce their alcohol intake or stop drinking, while 45.5% had made unsuccessful attempts to do so within the past year. Only 28.9% had received assistance to quit or reduce their drinking, and of these, fewer than half (39.3%) received assistance from a professional or healthcare worker. Males were more likely to be current drinkers and to have experienced episodes of alcohol intoxication. Parental and peer drinking were associated with alcohol use but not with intoxication. Conclusions: It is important to design specific programmes to reduce alcohol use among out-of-school youth in these settings.
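
The CAGE instrument referenced here is a four-item screen (Cut down, Annoyed, Guilty, Eye-opener); the conventional cut-off of two or more affirmative answers, used to define "drinking problem" above, reduces to a trivial computation:

```python
def cage_score(cut_down, annoyed, guilty, eye_opener):
    """CAGE score: number of 'yes' answers to the four items; a score
    of 2 or more is the conventional cut-off for problem drinking."""
    return sum([cut_down, annoyed, guilty, eye_opener])

# Example respondent: answered 'yes' to Cut down and Guilty.
print(cage_score(True, False, True, False) >= 2)  # True -> flagged
```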

Relevance:

100.00%

Publisher:

Abstract:

We provide a comprehensive study of out-of-sample forecasts for the EUR/USD exchange rate based on multivariate macroeconomic models and forecast combinations. We use profit maximization measures based on directional accuracy and trading strategies in addition to standard loss minimization measures. When comparing predictive accuracy and profit measures, data snooping bias free tests are used. The results indicate that forecast combinations, in particular those based on principal components of forecasts, help to improve over benchmark trading strategies, although the excess return per unit of deviation is limited.
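
A stylized version of the principal-components forecast combination and the directional-accuracy criterion mentioned above (a sketch under simplifying assumptions, namely an in-sample OLS mapping on a demeaned forecast panel, not the paper's exact procedure):

```python
import numpy as np

def pc_combination(forecasts, realized, n_pc=1):
    """Combine K individual forecasts via their leading principal
    components: extract PC scores from the demeaned T x K panel of
    forecasts, then map them to the target by OLS."""
    F = np.asarray(forecasts, dtype=float)
    y = np.asarray(realized, dtype=float)
    Fc = F - F.mean(axis=0)
    _, _, Vt = np.linalg.svd(Fc, full_matrices=False)
    X = np.column_stack([np.ones(len(y)), Fc @ Vt[:n_pc].T])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ beta                               # combined forecast

def directional_accuracy(forecast, realized):
    """Share of periods in which the forecast has the correct sign."""
    return np.mean(np.sign(forecast) == np.sign(realized))
```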

Relevance:

100.00%

Publisher:

Abstract:

The presence of inhibitory substances in biological forensic samples has affected, and continues to affect, the quality of the data generated by DNA typing processes. Although the chemistries used during these procedures have been enhanced to mitigate the effects of such deleterious compounds, some challenges remain. Inhibitors can be components of the samples themselves, of the substrate where samples were deposited, or chemicals associated with the DNA purification step. Therefore, a thorough understanding of the extraction processes and their ability to handle the various types of inhibitory substances can help define the best analytical processing for any given sample. A series of experiments was conducted to establish the inhibition tolerance of quantification and amplification kits using common inhibitory substances, in order to determine whether current laboratory practices are optimal for identifying potential problems associated with inhibition. DART mass spectrometry was used to determine the amount of inhibitor carryover after sample purification, its correlation with the initial inhibitor input in the sample, and the overall effect on the results. Finally, a novel alternative for gathering investigative leads from samples that would otherwise be unsuitable for DNA typing, due to large amounts of inhibitory substances and/or environmental degradation, was tested. This included generating data associated with microbial peak signatures to identify the locations of clandestine human graves. Results demonstrate that the current methods for assessing inhibition are not necessarily accurate, as samples that appear inhibited during quantification can yield full DNA profiles, while those that do not indicate inhibition may suffer from lowered amplification efficiency or PCR artifacts. The extraction methods tested were able to remove >90% of the inhibitors from all samples, with the exception of phenol, which was present in variable amounts whenever the organic extraction approach was used. Although the results suggest that most inhibitors have minimal effect on downstream applications, analysts should exercise caution when selecting the best extraction method for particular samples, as casework DNA samples are often present in small quantities and can contain overwhelming amounts of inhibitory substances.
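
Commercial quantification kits typically flag inhibition through an internal PCR control (IPC) that amplifies late when inhibitors are present. A minimal sketch of that flagging logic follows; the one-cycle threshold is an assumed, illustrative value, and the abstract's point is precisely that such flags are not always reliable.

```python
def flag_inhibition(ipc_ct_sample, ipc_ct_control, max_ct_shift=1.0):
    """Flag a sample as possibly inhibited when its internal PCR
    control (IPC) crosses threshold more than max_ct_shift cycles
    later than the IPC of an uninhibited control reaction."""
    return (ipc_ct_sample - ipc_ct_control) > max_ct_shift

# Example: IPC Ct of 29.8 in the sample vs 27.5 in the clean control.
print(flag_inhibition(29.8, 27.5))  # True -> possible inhibition
```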