976 results for Probability and Statistics


Relevance: 100.00%

Publisher:

Abstract:

In Colombia, the incidence and mortality of end-stage chronic kidney disease have continued to rise over the last six years, despite the intervention strategies for prevention and control of the disease implemented at the national level. This work seeks to establish a baseline for the insured population in Colombia with respect to the survival of patients on renal replacement therapy (RRT).

Abstract:

We use asymptotic linearity to derive confidence intervals for large noncentrality parameters. These results enable us to measure the relevance of effects and interactions in multifactor models when the values of the F-test statistics are highly statistically significant. We show how to use our approach on two data sets as application examples.

Abstract:

This study sought to extend earlier work by Mulhern and Wylie (2004) to investigate a UK-wide sample of psychology undergraduates. A total of 890 participants from eight universities across the UK were tested on six broadly defined components of mathematical thinking relevant to the teaching of statistics in psychology - calculation, algebraic reasoning, graphical interpretation, proportionality and ratio, probability and sampling, and estimation. Results were consistent with Mulhern and Wylie's (2004) previously reported findings. Overall, participants across institutions exhibited marked deficiencies in many aspects of mathematical thinking. Results also revealed significant gender differences on calculation, proportionality and ratio, and estimation. Level of qualification in mathematics was found to predict overall performance. Analysis of the nature and content of errors revealed consistent patterns of misconceptions in core mathematical knowledge, likely to hamper the learning of statistics.

Abstract:

Previous research has demonstrated that students’ cognitions about statistics are related to their performance in statistics assessments. The purpose of this research is to examine the nature of the relationships between undergraduate psychology students’ previous experiences of maths, statistics and computing; their attitudes toward statistics; and assessment on a statistics course. Of the variables examined, the strongest predictor of assessment outcome was students’ attitude about their intellectual knowledge and skills in relation to statistics at the end of the statistics curriculum. This attitude was related to students’ perceptions of their maths ability at the beginning of the statistics curriculum. Interventions could be designed to change such attitudes with the aim of improving students’ learning of statistics.

Abstract:

In this thesis we attempt a probabilistic analysis of some physically realizable, though complex, storage and queueing models. It is essentially a mathematical study of the stochastic processes underlying these models. Our aim is an improved understanding of the behaviour of such models, which may widen their applicability. Different inventory systems with random lead times, server vacations, bulk demands, varying ordering levels, etc. are considered. We also study some finite and infinite capacity queueing systems with bulk service and server vacations, and obtain the transient solution in certain cases. Each chapter of the thesis is provided with its own introduction and some important references.

Abstract:

Matrix function approximation is a current focus of worldwide interest and finds application in a variety of areas of applied mathematics and statistics. In this thesis we focus on the approximation of A^(-α/2)b, where A ∈ ℝ^(n×n) is a large, sparse symmetric positive definite matrix and b ∈ ℝ^n is a vector. In particular, we focus on matrix function techniques for sampling from Gaussian Markov random fields in applied statistics and for the solution of fractional-in-space partial differential equations.

Gaussian Markov random fields (GMRFs) are multivariate normal random variables characterised by a sparse precision (inverse covariance) matrix. GMRFs are popular models in computational spatial statistics, as the sparse structure can be exploited, typically through the sparse Cholesky decomposition, to construct fast sampling methods. It is well known, however, that for sufficiently large problems, iterative methods for solving linear systems outperform direct methods. Fractional-in-space partial differential equations arise in models of processes undergoing anomalous diffusion. Unfortunately, as the fractional Laplacian is a non-local operator, numerical methods based on the direct discretisation of these equations typically require the solution of dense linear systems, which is impractical for fine discretisations.

In this thesis, novel applications of Krylov subspace approximations to matrix functions are investigated for both of these problems. Matrix functions arise when sampling from a GMRF by noting that the Cholesky decomposition A = LL^T is, essentially, a 'square root' of the precision matrix A. Therefore, we can replace the usual sampling method, which forms x = L^(-T)z, with x = A^(-1/2)z, where z is a vector of independent and identically distributed standard normal random variables. Similarly, the matrix transfer technique can be used to build solutions to the fractional Poisson equation of the form ϕ_n = A^(-α/2)b, where A is the finite difference approximation to the Laplacian. Hence both applications require the approximation of f(A)b, where f(t) = t^(-α/2) and A is sparse. We compare the Lanczos approximation, the shift-and-invert Lanczos approximation, the extended Krylov subspace method, rational approximations and the restarted Lanczos approximation for matrix functions of this form.

A number of new results are presented in this thesis. Firstly, we prove the convergence of the matrix transfer technique for the solution of the fractional Poisson equation and give conditions under which the finite difference discretisation can be replaced by other methods for discretising the Laplacian. We then investigate a number of methods for approximating matrix functions of the form A^(-α/2)b and investigate stopping criteria for these methods. In particular, we derive a new method for restarting the Lanczos approximation to f(A)b. We then apply these techniques to the problem of sampling from a GMRF and construct a full suite of methods for sampling conditioned on linear constraints and for approximating the likelihood. Finally, we consider the problem of sampling from a generalised Matérn random field, which combines our techniques for solving fractional-in-space partial differential equations with our method for sampling from GMRFs.
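The sampling step x = A^(-1/2)z can be sketched with a basic Lanczos approximation to f(A)b. This is a minimal illustration of the general idea, not the thesis's own implementation: the step count m, the shifted 1-D Laplacian used as a test precision matrix, and the absence of restarting or reorthogonalisation are all simplifying assumptions.

```python
import numpy as np

def lanczos_matfunc(A, b, f, m=50):
    """m-step Lanczos: build V_m and tridiagonal T_m with A V_m ~ V_m T_m,
    then approximate f(A) b by ||b|| * V_m f(T_m) e_1."""
    n = b.shape[0]
    V = np.zeros((n, m))
    alpha = np.zeros(m)
    beta = np.zeros(max(m - 1, 0))
    beta0 = np.linalg.norm(b)
    V[:, 0] = b / beta0
    for j in range(m):
        w = A @ V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]
        alpha[j] = V[:, j] @ w
        w -= alpha[j] * V[:, j]
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            if beta[j] < 1e-12:  # invariant subspace found; stop early
                m, V, alpha, beta = j + 1, V[:, :j + 1], alpha[:j + 1], beta[:j]
                break
            V[:, j + 1] = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    evals, evecs = np.linalg.eigh(T)          # T_m is small, so this is cheap
    fT_e1 = evecs @ (f(evals) * evecs[0, :])  # f(T_m) @ e_1
    return beta0 * (V @ fT_e1)

# Draw an approximate GMRF sample x = A^(-1/2) z for a simple SPD
# precision matrix (1-D Laplacian plus a diagonal shift).
rng = np.random.default_rng(0)
n = 200
A = 2.1 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
z = rng.standard_normal(n)
x = lanczos_matfunc(A, z, lambda t: t ** -0.5)
```

Only matrix-vector products with A are needed, which is what makes the approach attractive when A is sparse and a Cholesky factor would be expensive or dense.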

Abstract:

This paper considers two problems that frequently arise in dynamic discrete choice problems but have not received much attention with regard to simulation methods. The first problem is how to construct unbiased simulators of probabilities conditional on past history. The second is how to simulate a discrete transition probability model when the underlying dependent variable is really continuous. Both methods work well relative to reasonable alternatives in the application discussed. However, in both cases, for this application, simpler methods also provide reasonably good results.
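A crude frequency simulator illustrates the first problem in miniature: each indicator draw is an unbiased estimate of the transition probability conditional on the lagged value, so their Monte Carlo average is unbiased too. The AR(1) law for the continuous underlying variable, the bin edges and the parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def conditional_prob_simulator(x_prev, lo, hi, rho, sigma,
                               n_draws=10_000, rng=None):
    """Simulate P(x_t in [lo, hi) | x_{t-1} = x_prev) for the AR(1)
        x_t = rho * x_{t-1} + sigma * eps_t,  eps_t ~ N(0, 1).
    Each indicator has expectation equal to the bin probability, so
    the sample mean is an unbiased simulator of the conditional
    transition probability."""
    rng = rng or np.random.default_rng()
    x_next = rho * x_prev + sigma * rng.standard_normal(n_draws)
    return float(np.mean((x_next >= lo) & (x_next < hi)))

p_hat = conditional_prob_simulator(0.5, 0.0, 1.0, rho=0.9, sigma=0.3,
                                   rng=np.random.default_rng(1))
```

Discretising the continuous process into bins this way, rather than assuming a discrete model from the outset, is the situation the paper's second problem addresses.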

Abstract:

By the time students reach the middle years they have experienced many chance activities based on dice. Common among these are rolling one die to explore the relationship of frequency and theoretical probability, and rolling two dice and summing the outcomes to consider their probabilities. Although dice may be considered overused by some, the advantage they offer is a familiar context within which to explore much more complex concepts. If the basic chance mechanism of the device is understood, it is possible to enter quickly into an arena of more complex concepts. This is what happened with a two-hour activity engaged in by four classes of Grade 6 students in the same school. The activity targeted the concepts of variation and expectation. The teachers held extended discussions with their classes on variation and expectation at the beginning of the activity, with students contributing examples of the two concepts from their own experience. These notions are quite sophisticated for Grade 6, but the underlying concepts describe phenomena that students encounter every day. For example, time varies continuously; sporting results vary from game to game; the maximum temperature varies from day to day. However, there is an expectation about tomorrow’s maximum temperature based on the expert advice from the weather bureau. There may also be an expectation about a sporting result based on the participants’ previous results. It is this juxtaposition that makes life interesting. Variation hence describes the differences we see in phenomena around us. In a scenario displaying variation, expectation describes the effort to characterise or summarise the variation and perhaps make a prediction about the message arising from the scenario. The explicit purpose of the activity described here was to use the familiar scenario of rolling a die to expose these two concepts.
Because the students had previously experienced rolling physical dice, they knew instinctively about the variation that occurs across many rolls and about the theoretical expectation that each side should “come up” one-sixth of the time. They had observed instances of the concepts in action but had not consolidated the underlying terminology to describe them. As the two concepts are so fundamental to understanding statistics, we felt it would be useful to begin building them in the familiar environment of rolling a die. Because hand-held dice limit the explorations students can undertake, the classes used the software TinkerPlots (Konold & Miller, 2011) to simulate rolling a die multiple times.
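A few lines of code reproduce the essence of the classroom simulation. This Python sketch (not TinkerPlots itself) shows both concepts at once: the expectation that each face comes up about 1/6 of the time, and the variation in the observed proportions, which shrinks as the number of rolls grows.

```python
import random

def roll_frequencies(n_rolls, seed=None):
    """Simulate n_rolls of a fair die and return the proportion of
    rolls landing on each of the six faces."""
    rng = random.Random(seed)
    counts = [0] * 6
    for _ in range(n_rolls):
        counts[rng.randint(1, 6) - 1] += 1
    return [c / n_rolls for c in counts]

# Expectation: each proportion should be near 1/6 ~ 0.167.
# Variation: the proportions differ from 1/6 and from each other,
# and the spread narrows as the number of rolls increases.
for n in (60, 600, 6000):
    props = roll_frequencies(n, seed=1)
    print(n, [round(p, 3) for p in props])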

Abstract:

Individual movement is very versatile and inevitable in ecology. In this thesis, I investigate two kinds of movement, body-condition-dependent dispersal and small-range foraging movements resulting in quasi-local competition, and their causes and consequences at the individual, population and metapopulation levels.

Body-condition-dependent dispersal is a widely evident but barely understood phenomenon. In nature, diverse relationships between body condition and dispersal are observed. I develop the first models that study the evolution of dispersal strategies that depend on individual body condition. In a patchy environment where patches differ in environmental conditions, individuals born in rich (e.g. nutritious) patches are on average stronger than their conspecifics born in poorer patches. Body condition (strength) determines competitive ability such that stronger individuals win competitions with higher probability than weak individuals. Individuals compete for patches, so kin competition selects for dispersal. I determine the evolutionarily stable strategy (ESS) for different ecological scenarios. My models offer explanations for both the dispersal of strong individuals and the dispersal of weak individuals. Moreover, I find that within-family dispersal behaviour is not always reflected at the population level. This supports the fact that no consistent pattern is detected in data on body-condition-dependent dispersal, and it encourages the refining of empirical investigations.

Quasi-local competition refers to interactions between adjacent populations in which one population negatively affects the growth of the other. I model a metapopulation in a homogeneous environment where adults of different subpopulations compete for resources by spending part of their foraging time in the neighbouring patches, while their juveniles feed only on the resource in their natal patch. I show that spatial patterns (different population densities in the patches) are stable only if one age class depletes the resource heavily while mainly the other age class depends on it.

Abstract:

Whether a statistician wants to complement a probability model for observed data with a prior distribution and carry out fully probabilistic inference, or to base the inference only on the likelihood function, may be a fundamental question in theory; in practice it may well matter less if the likelihood contains much more information than the prior. Maximum likelihood inference can be justified as a Gaussian approximation at the posterior mode, using flat priors. However, in situations where parametric assumptions in standard statistical models would be too rigid, more flexible model formulation, combined with fully probabilistic inference, can be achieved using hierarchical Bayesian parametrization.

This work includes five articles, all of which apply probability modeling to various problems involving incomplete observation. Three of the papers apply maximum likelihood estimation and two of them hierarchical Bayesian modeling. Because maximum likelihood may be presented as a special case of Bayesian inference, but not the other way round, in the introductory part of this work we present a framework for probability-based inference using only Bayesian concepts. We also re-derive some results presented in the original articles using the toolbox equipped herein, to show that they are also justifiable under this more general framework. Here the assumption of exchangeability and de Finetti's representation theorem are applied repeatedly to justify the use of standard parametric probability models with conditionally independent likelihood contributions. It is argued that the same reasoning applies under sampling from a finite population.

The main emphasis is on probability-based inference under incomplete observation due to study design, illustrated using a generic two-phase cohort sampling design as an example. The alternative approaches presented for analysis of such a design are full likelihood, which utilizes all observed information, and conditional likelihood, which is restricted to a completely observed set, conditioning on the rule that generated that set. Conditional likelihood inference is also applied to a joint analysis of prevalence and incidence data, a situation subject to both left censoring and left truncation. Other topics covered are model uncertainty and causal inference using posterior predictive distributions. We formulate a non-parametric monotonic regression model for one or more covariates and a Bayesian estimation procedure, and apply the model in the context of optimal sequential treatment regimes, demonstrating that inference based on posterior predictive distributions is feasible in this case as well.

Abstract:

This work develops methods to account for shoot structure in models of coniferous canopy radiative transfer. Shoot structure, as it varies along the light gradient inside the canopy, affects the efficiency of light interception per unit needle area, foliage biomass, or foliage nitrogen. The clumping of needles in the shoot volume also causes a notable amount of multiple scattering of light within coniferous shoots. The effect of shoot structure on light interception is treated in the context of canopy-level photosynthesis and resource use models, and the phenomenon of within-shoot multiple scattering in the context of physical canopy reflectance models for remote sensing purposes.

Light interception. A method for estimating the amount of PAR (Photosynthetically Active Radiation) intercepted by a conifer shoot is presented. The method combines modelling of the directional distribution of radiation above the canopy, fish-eye photographs taken at shoot locations to measure canopy gap fraction, and geometrical measurements of shoot orientation and structure. Data on light availability, shoot and needle structure, and nitrogen content were collected from canopies of Pacific silver fir (Abies amabilis (Dougl.) Forbes) and Norway spruce (Picea abies (L.) Karst.). Shoot structure acclimated to the light gradient inside the canopy such that more shaded shoots have better light interception efficiency. Light interception efficiency of shoots varied about two-fold per needle area, about four-fold per needle dry mass, and about five-fold per nitrogen content. Comparison of fertilized and control stands of Norway spruce indicated that light interception efficiency is not greatly affected by fertilization.

Light scattering. The structure of coniferous shoots gives rise to multiple scattering of light between the needles of the shoot. Using geometric models of shoots, multiple scattering was studied by photon tracing simulations. Based on the simulation results, the dependence of the scattering coefficient of a shoot on the scattering coefficient of its needles is shown to follow a simple one-parameter model. The single parameter, termed the recollision probability, describes the level of clumping of the needles in the shoot, is wavelength independent, and can be connected to previously used clumping indices. By using the recollision probability to correct for within-shoot multiple scattering, canopy radiative transfer models that have used leaves as basic elements can use shoots as basic elements, and thus be applied to coniferous forests. Preliminary testing of this approach seems to explain, at least partially, why coniferous forests appear darker than broadleaved forests in satellite data.
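One commonly used form of such a one-parameter model sums the geometric series of repeated within-shoot interactions; whether this matches the exact parametrisation used in this work is an assumption, and the numerical values below are purely illustrative.

```python
def shoot_scattering(omega_needle, p):
    """Scattering coefficient of a shoot from the needle scattering
    coefficient omega_needle (0..1) and the recollision probability p:
    the chance that a photon scattered by a needle interacts with the
    shoot again instead of escaping. Summing escape after 1, 2, 3, ...
    interactions gives the geometric series
        omega * (1 - p) * sum_k (p * omega)**k
    which collapses to the closed form below."""
    return omega_needle * (1 - p) / (1 - p * omega_needle)

# Clumping (p > 0) makes the shoot scatter less than its needles:
print(shoot_scattering(0.5, 0.0))  # no clumping: equals the needle value
print(shoot_scattering(0.5, 0.5))  # clumped shoot is darker
```

Because p is wavelength independent, a single measured or simulated value corrects the within-shoot scattering across all bands, which is what lets leaf-based canopy reflectance models be reused for conifers.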

Abstract:

Planar curves arise naturally as interfaces between two regions of the plane. An important part of statistical physics is the study of lattice models. This thesis is about the interfaces of 2D lattice models. The scaling limit is an infinite-system limit taken by letting the lattice mesh decrease to zero. At criticality, the scaling limit of an interface is one of the SLE curves (Schramm-Loewner evolution), introduced by Oded Schramm. This family of random curves is parametrized by a real variable, which determines the universality class of the model. The first and second papers of this thesis study properties of SLEs. They contain two different methods to study the whole SLE curve, which is, in fact, the most interesting object from the statistical physics point of view. These methods are applied to study two symmetries of SLE: reversibility and duality. The first paper uses an algebraic method and a representation of the Virasoro algebra to find martingales common to different processes, and in that way to confirm the symmetries for polynomial expected values of natural SLE data. In the second paper, a recursion is obtained for the same kind of expected values. The recursion is based on the stationarity of the law of the whole SLE curve under an SLE-induced flow. The third paper deals with one of the most central questions of the field and provides a framework of estimates for describing 2D scaling limits by SLE curves. In particular, it is shown that a weak estimate on the probability of an annulus crossing implies that a random curve arising from a statistical physics model will have scaling limits and that those will be well described by Loewner evolutions with random driving forces.

Abstract:

In this thesis we deal with the concept of risk. The objective is to bring together, and draw conclusions from, some normative information regarding quantitative portfolio management and risk assessment.

The first essay concentrates on return dependency. We propose an algorithm for classifying markets into rising and falling. Given the algorithm, we derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. The serial dependency in bear and bull markets behaves differently, however: it is strongly positive in rising markets, whereas in bear markets it is closer to a random walk.

Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly. Results also suggest that volatility is non-stationary from time to time. In the third essay we examine the impact of market microstructure on the error between estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates, and that lower sampling frequency and non-constant volatility increase the error variation between the estimated variance and the variance of the underlying process.

From these essays we can conclude that volatility is not easily estimated, even from high-frequency data. It is neither very well behaved in terms of stability nor in terms of dependency over time. Based on these observations, we recommend the use of simple, transparent methods that are likely to be more robust over differing volatility regimes than models with a complex parameter universe. In analyzing long-term return dependency in the first moment we find that the Trend Switch Probability is a robust estimator. This is an interesting area for further research, with important implications for active asset allocation.
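The realized variance estimator examined in essays two and three is simply the sum of squared intraday log-returns. The sketch below is a deliberately idealised assumption (i.i.d. normal log-returns, no bid-ask bounce or other microstructure noise), under which the estimate should land near the true total variance; the essays' point is that autocorrelation from microstructure effects biases exactly this quantity.

```python
import numpy as np

def realized_variance(prices):
    """Sum of squared log-returns over the sampling grid. With frequent
    sampling and no microstructure noise this estimates the integrated
    variance of the underlying price process."""
    r = np.diff(np.log(prices))
    return float(np.sum(r ** 2))

# Idealised tick data: i.i.d. normal log-returns, no bid-ask bounce.
rng = np.random.default_rng(42)
sigma = 0.01                                   # per-tick volatility
log_returns = sigma * rng.standard_normal(1000)
prices = 100.0 * np.exp(np.concatenate(([0.0], np.cumsum(log_returns))))
rv = realized_variance(prices)                 # close to 1000 * sigma**2 = 0.1
```

Introducing negative autocorrelation into `log_returns` (as bid-ask bounce would) inflates `rv` above the variance of the underlying process, which is the bias the third essay quantifies.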

Abstract:

Financing trade between economic agents located in different countries is affected by many types of risks, resulting from incomplete information about the debtor, the problems of enforcing international contracts, or the prevalence of political and financial crises. Trade is important for economic development, and the availability of trade finance is essential, especially for developing countries. Relatively few studies treat the topic of political risk, particularly in the context of international lending. This thesis explores new ground to identify links between political risk and international debt defaults. The core hypothesis of the study is that the default probability of debt increases with increasing political risk in the country of the borrower.

The thesis consists of three essays that support the hypothesis from different angles of the credit evaluation process. The first essay takes the point of view of an international lender assessing the credit risk of a public borrower. The second investigates the creditworthiness assessment of companies. The obtained results are substantiated in the third essay, which deals with an extensive political risk survey among finance professionals in developing countries. The financial instruments of core interest are export-credit-guaranteed debt initiated between the Export Credit Agency of Finland and buyers in 145 countries between 1975 and 2006. Default events of the foreign credit counterparts are conditioned on country-specific macroeconomic variables, corporate-specific accounting information, and political risk indicators from various international sources.

Essay 1 examines debt issued to government-controlled institutions and conditions public default events on traditional macroeconomic fundamentals, in addition to selected political and institutional risk factors. Confirming previous research, the study finds country indebtedness and the GDP growth rate to be significant indicators of public default. Further, it is shown that public defaults respond to various political risk factors; however, the impact of the risk varies between countries at different stages of economic development. Essay 2 proceeds by investigating political risk factors as conceivable drivers of corporate default and uses traditional accounting variables together with new political risk indicators in the credit evaluation of private debtors. The study finds links between corporate default and leverage, as well as between corporate default and the general investment climate and measures of conflict in the debtor country. Essay 3 concludes the thesis by offering survey evidence on the impact of political risk on debt default, as perceived and experienced by 103 finance professionals in 38 developing countries.

Taken together, the results of the thesis suggest that various forms of political risk are associated with international debt defaults and continue to pose great concerns for both international creditors and borrowers in developing countries. The study provides new insights into the importance of variable selection in country risk analysis, and shows how political risk is actually perceived and experienced in the riskier, often lower-income countries of the global economy.

Abstract:

The objective of this paper is to improve option risk monitoring by examining the information content of implied volatility and by introducing the calculation of a single-sum expected risk exposure similar to Value-at-Risk. The figure is calculated in two steps. First, the value of a portfolio of options is estimated for a number of different market scenarios; second, the information content of the estimated scenarios is summarized into a single-sum risk measure. This involves the use of probability theory and return distributions, which confronts the user with the problem of non-normality in the return distribution of the underlying asset. Here the hyperbolic distribution is used as one alternative for dealing with heavy tails. Results indicate that the information content of implied volatility is useful when predicting future large returns in the underlying asset. Further, the hyperbolic distribution provides a good fit to historical returns, enabling a more accurate definition of statistical intervals and extreme events.
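The two-step procedure can be sketched as follows: step one revalues the position under simulated market scenarios, and step two compresses the resulting loss distribution into a single-sum figure (here the 99% loss quantile). Black-Scholes revaluation of a single call and lognormal scenarios are simplifying assumptions for this sketch, not the paper's setup; a heavy-tailed (e.g. hyperbolic) scenario distribution would be substituted at the scenario-generation step.

```python
import math
import numpy as np

def bs_call(S, K, T, r, sigma):
    """Black-Scholes value of a European call (standard formula)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def scenario_var(S0, K, T, r, sigma, horizon, alpha=0.99,
                 n_scen=5_000, rng=None):
    """Step 1: revalue the option under simulated scenarios for the
    underlying at the risk horizon. Step 2: summarize the losses into
    a single-sum risk measure, the alpha-quantile (a VaR-like figure).
    Scenarios are lognormal here purely for simplicity."""
    rng = rng or np.random.default_rng()
    v0 = bs_call(S0, K, T, r, sigma)
    z = rng.standard_normal(n_scen)
    S_h = S0 * np.exp((r - 0.5 * sigma**2) * horizon
                      + sigma * math.sqrt(horizon) * z)
    v_h = np.array([bs_call(s, K, T - horizon, r, sigma) for s in S_h])
    losses = v0 - v_h
    return float(np.quantile(losses, alpha))

# One-week 99% risk figure for an at-the-money six-month call.
var99 = scenario_var(100.0, 100.0, 0.5, 0.02, 0.25, horizon=1 / 52,
                     rng=np.random.default_rng(0))
```

Swapping the `standard_normal` draws for draws from a fitted heavy-tailed distribution changes only step one, which is what makes the two-step decomposition convenient for experimenting with non-normal return models.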