12 results for Random Variable

in Helda - Digital Repository of the University of Helsinki


Relevance:

30.00%

Publisher:

Abstract:

Planar curves arise naturally as interfaces between two regions of the plane. An important part of statistical physics is the study of lattice models. This thesis is about the interfaces of 2D lattice models. The scaling limit is an infinite-system limit, taken by letting the lattice mesh decrease to zero. At criticality, the scaling limit of an interface is one of the SLE curves (Schramm-Loewner evolution), introduced by Oded Schramm. This family of random curves is parametrized by a real variable, which determines the universality class of the model. The first and second papers of this thesis study properties of SLEs. They contain two different methods for studying the whole SLE curve, which is, in fact, the most interesting object from the statistical-physics point of view. These methods are applied to study two symmetries of SLE: reversibility and duality. The first paper uses an algebraic method and a representation of the Virasoro algebra to find martingales common to different processes and, in that way, to confirm the symmetries for polynomial expected values of natural SLE data. In the second paper, a recursion is obtained for the same kind of expected values. The recursion is based on the stationarity of the law of the whole SLE curve under an SLE-induced flow. The third paper deals with one of the most central questions of the field and provides a framework of estimates for describing 2D scaling limits by SLE curves. In particular, it is shown that a weak estimate on the probability of an annulus crossing implies that a random curve arising from a statistical-physics model has scaling limits, and that these are well described by Loewner evolutions with random driving forces.
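The Loewner description above lends itself to a short numerical illustration. The sketch below (mine, not from the thesis) approximates a chordal SLE trace: the driving function is a Brownian motion scaled by the square root of the κ parameter, treated as piecewise constant on each time step, and trace points are recovered by composing the incremental inverse (slit) maps of the discretized Loewner equation. The function name and discretization choices are illustrative.

```python
import numpy as np

def sle_trace(kappa, n_steps=1000, t_max=1.0, seed=0):
    """Approximate a chordal SLE_kappa trace by composing incremental
    inverse Loewner (slit) maps, with the Brownian driving function
    approximated as piecewise constant on each time step."""
    rng = np.random.default_rng(seed)
    dt = t_max / n_steps
    # driving function W_t = sqrt(kappa) * B_t, sampled on the time grid
    w = np.concatenate(
        [[0.0], np.cumsum(np.sqrt(kappa * dt) * rng.standard_normal(n_steps))]
    )
    trace = np.empty(n_steps + 1, dtype=complex)
    trace[0] = 0.0
    for n in range(1, n_steps + 1):
        z = complex(w[n])          # start at the tip's image W_{t_n}
        # apply the incremental inverse maps in reverse time order
        for k in range(n, 0, -1):
            u = z - w[k - 1]
            s = np.sqrt(u * u - 4 * dt)
            if s.imag < 0:         # branch mapping into the upper half-plane
                s = -s
            z = w[k - 1] + s
        trace[n] = z
    return trace
```

The loop is O(n²) in the number of steps; for small κ the approximate trace stays close to a simple curve, while larger κ gives increasingly space-filling behaviour.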

Relevance:

20.00%

Publisher:

Abstract:

Forestry has influenced forest-dwelling organisms for centuries in Fennoscandia. For example, in Finland ca. 30% of the threatened species are threatened because of forestry. Nowadays forest-management recommendations include practices aimed at maintaining biodiversity in harvesting, such as green-tree retention. However, the effects of these practices have been little studied. In variable retention, different numbers of trees are retained, varying from green-tree retention (at least a few live standing trees in clear-cuts) to thinning (only individual trees removed). I examined the responses of ground-dwelling spiders and carabid beetles to green-tree retention (with small and large tree groups), gap felling, and thinning aimed at an uneven age structure of trees. The impacts of these harvesting methods were compared with those of clear-cutting and uncut controls. I aimed to test the hypothesis that retaining more trees positively affects populations of those species of spiders and carabids that were present before harvesting. The data come from two studies. First, spiders were collected with pitfall traps in south-central Finland in 1995 (pre-treatment) and 1998 (post-treatment) in order to examine the effects of clear-cutting, green-tree retention (with 0.01-0.02-ha tree groups), gap felling (with three 0.16-ha openings in a 1-ha stand), thinning aimed at an uneven age structure of trees, and an uncut control. Second, spiders and carabids were caught with pitfall traps in eastern Finland in 1998-2001 (a pre-treatment year and three post-treatment years) in eleven 0.09-0.55-ha retention-tree groups and the clear-cuts adjacent to them. The original spider and carabid assemblages were better maintained after harvests that retained more trees. Thinning maintained forest spiders well. However, gap felling and large retention-tree groups maintained some forest spider and carabid species in the short term but negatively affected some species over time.
However, the use of small retention-tree groups was associated with negative effects on forest spider populations. Studies are needed on the long-term effects of variable retention on terrestrial invertebrates, especially studies directed at defining an appropriate retention-patch size and at assessing the importance for invertebrate populations of the structural diversity provided by variable retention. However, the aims of variable retention should be specified first. For example, are retention-tree groups intended to serve as life-boats or stepping-stones, or to create structural diversity? Does it suffice that some species are maintained, or do we want to preserve the most sensitive ones, and how are these best defined? Moreover, the ecological benefits and economic costs of modified logging methods should be compared with those of other approaches aimed at maintaining biodiversity.

Relevance:

20.00%

Publisher:

Abstract:

Spatial and temporal variation in the abundance of species can often be ascribed to spatial and temporal variation in the surrounding environment. Knowledge of how biotic and abiotic factors operate over different spatial and temporal scales in determining the distribution, abundance, and structure of populations lies at the heart of ecology. Most current ecological theory stems from studies carried out in the central parts of species' distributional ranges, whereas knowledge of how marginal populations function is inadequate. Understanding how marginal populations, living at the edge of their range, function is, however, key to advancing ecology and evolutionary biology as scientific disciplines. My thesis focuses on the factors affecting the dynamics of marginal populations of blue mussels (Mytilus edulis) living close to their tolerance limits with regard to salinity. The thesis aims to highlight the dynamics at the edge of the range and to contrast these with the dynamics in more central parts of the range, in order to understand the potential interplay between the central and marginal parts of the focal system. The objectives of the thesis are approached by studies on: (1) factors affecting regional patterns of the species, (2) long-term temporal dynamics of the focal species along a regional salinity gradient, (3) selective predation by increasing populations of roach (Rutilus rutilus) when feeding on their main food item, the blue mussel, (4) the primary and secondary effects of local wave-exposure gradients, and (5) the role of small-scale habitat heterogeneity as a determinant of large-scale pattern. The thesis shows that populations of blue mussels are largely determined by large-scale changes in seawater salinity, which affect mainly the recruitment success and longevity of local populations.
Contrary to the traditional view, the thesis strongly indicates that vertebrate predators affect the abundance and size structure of blue mussel populations, and that the role of these predators increases towards the margin, where populations are increasingly top-down controlled. The thesis also indicates that the positive role of biogenic habitat modifiers increases towards the marginal areas, where populations of blue mussels are largely recruitment-limited. The thesis further shows that local blue mussel populations are strongly dependent on high water turbulence, and that dense populations are therefore constrained to offshore habitats. Finally, the thesis suggests that the ongoing sedimentation of rocky shores is detrimental to the species, affecting recruitment success and post-recruit survival and pushing stable mussel beds towards offshore areas. Ongoing large-scale changes in the Baltic Sea, especially dilution processes with their attendant effects, are predicted to substantially contract the distributional range of the mussel, but also to affect more central populations. The thesis shows that in order to understand the functioning of marginal populations, research should (1) strive for multi-scale approaches in order to link ecosystem patterns with ecosystem processes, and (2) challenge prevailing tenets that originate from research carried out in central areas and may not be valid at the edge.

Relevance:

20.00%

Publisher:

Abstract:

Time-dependent backgrounds in string theory provide a natural testing ground for physics concerning dynamical phenomena that cannot be reliably addressed in usual quantum field theories and cosmology. A good, tractable example to study is the rolling-tachyon background, which describes the decay of an unstable brane in bosonic and supersymmetric Type II string theories. In this thesis I use boundary conformal field theory, along with random matrix theory and Coulomb-gas thermodynamics techniques, to study open- and closed-string scattering amplitudes off the decaying brane. The calculation of the simplest example, the tree-level amplitude of n open strings, would give the emission rate of the open strings; however, even this has remained unknown. I organize the open-string scattering computations in a more coherent manner and argue how further progress can be made.

Relevance:

20.00%

Publisher:

Abstract:

Detecting Earnings Management Using Neural Networks. In trying to balance relevant and reliable accounting data, generally accepted accounting principles (GAAP) allow, to some extent, the company management to use their judgment and make subjective assessments when preparing financial statements. The opportunistic use of this discretion in financial reporting is called earnings management. A considerable number of methods have been suggested for detecting accrual-based earnings management. The majority of these methods are based on linear regression. The problem with using linear regression is that a linear relationship between the dependent variable and the independent variables must be assumed. However, previous research has shown that the relationship between accruals and some of the explanatory variables, such as company performance, is non-linear. Neural networks are an alternative to linear regression that can handle non-linear relationships. The type of neural network used in this study is the feed-forward back-propagation neural network. Three neural network-based models are compared with four commonly used linear regression-based earnings management detection models. All seven models are based on the earnings management detection model presented by Jones (1991). The performance of the models is assessed in three steps. First, a random data set of companies is used. Second, the discretionary accruals from the random data set are ranked according to six different variables. The discretionary accruals in the highest and lowest quartiles for these six variables are then compared. Third, a data set containing simulated earnings management is used. Both expense and revenue manipulation, ranging between -5% and 5% of lagged total assets, are simulated. Furthermore, two neural network-based models and two linear regression-based models are applied to a data set containing financial-statement data from 110 failed companies.
Overall, the results show that the linear regression-based models, except for the model using a piecewise linear approach, produce biased estimates of discretionary accruals. The neural network-based model with the original Jones model variables and the neural network-based model augmented with ROA as an independent variable, however, perform well in all three steps. Especially in the second step, where the highest and lowest quartiles of ranked discretionary accruals are examined, the neural network-based model augmented with ROA as an independent variable outperforms the other models.
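For readers unfamiliar with the Jones (1991) model that all seven detection models build on, the following sketch fits its standard regression form to synthetic data; all numbers and variable names here are made up for illustration. Total accruals scaled by lagged assets are regressed on the inverse of lagged assets, the scaled change in revenues, and scaled gross property, plant, and equipment, and the residuals are interpreted as discretionary accruals.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
lag_assets = rng.uniform(50.0, 500.0, n)   # lagged total assets A_{t-1}
d_rev = rng.normal(0.0, 10.0, n)           # change in revenues
ppe = rng.uniform(10.0, 200.0, n)          # gross property, plant and equipment
# synthetic scaled total accruals with a known linear structure plus noise
ta = (5.0 / lag_assets + 0.1 * d_rev / lag_assets
      - 0.05 * ppe / lag_assets + rng.normal(0.0, 0.01, n))

# Jones (1991): TA/A = a*(1/A) + b1*(dREV/A) + b2*(PPE/A) + eps
X = np.column_stack([1.0 / lag_assets, d_rev / lag_assets, ppe / lag_assets])
coef, *_ = np.linalg.lstsq(X, ta, rcond=None)
discretionary = ta - X @ coef              # residuals = discretionary accruals
```

The linear-regression-based models in the study extend this basic form (e.g. with ROA or a piecewise-linear term), while the neural-network-based models replace the linear map with a feed-forward network fitted to the same inputs.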

Relevance:

20.00%

Publisher:

Abstract:

Markov random fields (MRFs) are popular in image-processing applications for describing spatial dependencies between image units. Here, we take a look at the theory and models of MRFs, with an application to improving forest-inventory estimates. Typically, autocorrelation between study units is a nuisance in statistical inference, but we take advantage of the dependencies to smooth noisy measurements by borrowing information from the neighbouring units. We build a stochastic spatial model, which we estimate with a Markov chain Monte Carlo simulation method. The smoothed values are validated against another data set, increasing our confidence that the estimates are more accurate than the originals.
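As a sketch of the borrowing-from-neighbours idea (the study itself fits a stochastic spatial model by MCMC; the deterministic Gauss-Seidel smoother below is a simpler stand-in with invented parameter values), each lattice site under a Gaussian MRF prior can be updated to a precision-weighted average of its own observation and its four neighbours:

```python
import numpy as np

def gmrf_smooth(y, lam=4.0, noise_var=1.0, n_iter=50):
    """Smooth a noisy lattice of measurements under a Gaussian MRF prior.
    Each site is repeatedly replaced by a precision-weighted average of
    its own observation and its 4-neighbourhood (Gauss-Seidel sweeps)."""
    x = y.copy().astype(float)
    h, w = x.shape
    for _ in range(n_iter):
        for i in range(h):
            for j in range(w):
                nbrs = []
                if i > 0: nbrs.append(x[i - 1, j])
                if i < h - 1: nbrs.append(x[i + 1, j])
                if j > 0: nbrs.append(x[i, j - 1])
                if j < w - 1: nbrs.append(x[i, j + 1])
                # combine the data term (1/noise_var) with the prior term (lam per neighbour)
                num = y[i, j] / noise_var + lam * sum(nbrs)
                den = 1.0 / noise_var + lam * len(nbrs)
                x[i, j] = num / den
    return x
```

Larger values of the (hypothetical) coupling parameter `lam` pull each site harder towards its neighbours, trading variance for bias exactly as the MRF prior intends.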

Relevance:

20.00%

Publisher:

Abstract:

Ever since its initial introduction some fifty years ago, the rational expectations paradigm has dominated the way economic theory handles uncertainty. The main assertion made by John F. Muth (1961), seen by many as the father of the paradigm, is that the expectations of rational economic agents should essentially equal the predictions of relevant economic theory, since rational agents should use the information available to them in an optimal way. This assumption often has important consequences for the results and interpretations of the models in which it is applied. Although the rational expectations assumption can be applied to virtually any economic theory, the focus in this thesis is on macroeconomic theories of consumption, especially the Rational Expectations–Permanent Income Hypothesis proposed by Robert E. Hall in 1978. This much-debated theory suggests that, assuming agents have rational expectations about their future income, consumption decisions should follow a random walk, and the best forecast of the future consumption level is the current consumption level. Consequently, changes in consumption are unforecastable. This thesis constructs an empirical test of the Rational Expectations–Permanent Income Hypothesis using Finnish Consumer Survey data as well as various Finnish macroeconomic data. The data sample covers the years 1995–2010. Consumer survey data may be interpreted as directly representing household expectations, which makes it an interesting tool for this particular test. The variable to be predicted is the growth of total household consumption expenditure. The main empirical result is that the Consumer Confidence Index (CCI), a balance figure computed from the most important consumer survey responses, does have statistically significant predictive power over the change in total consumption expenditure. The history of consumption expenditure growth itself, however, fails to predict its own future values.
This indicates that the CCI contains some information that the history of consumption decisions does not, and that consumption decisions are not optimal in the theoretical context. However, when conditioned on various macroeconomic variables, the CCI loses its predictive ability. This finding suggests that the index is merely a (partial) summary of macroeconomic information and does not contain any significant private information on the consumption intentions of households that is not directly deducible from the objective economic variables. In conclusion, the Rational Expectations–Permanent Income Hypothesis is strongly rejected by the empirical results of this thesis. This result is in line with most earlier studies on the topic.
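The predictability test described above can be sketched as an OLS regression of consumption growth on the lagged confidence index; everything below (the data-generating process, coefficient values, and index scale) is synthetic and illustrates only the mechanics, not the thesis's actual estimates. Under the Rational Expectations–Permanent Income Hypothesis the slope on the lagged index should be statistically indistinguishable from zero.

```python
import numpy as np

rng = np.random.default_rng(42)
T = 64                                   # e.g. quarterly observations
cci = rng.normal(0.0, 5.0, T)            # synthetic confidence index (balance figure)
# synthetic consumption growth that partly tracks last period's index
dc = 0.5 + 0.04 * np.roll(cci, 1) + rng.normal(0.0, 0.3, T)
dc[0] = 0.5                              # drop the wrapped-around first value

# OLS of consumption growth on a constant and the lagged index
X = np.column_stack([np.ones(T - 1), cci[:-1]])
y = dc[1:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
# classical (homoskedastic) t-statistic for the CCI coefficient
s2 = resid @ resid / (len(y) - 2)
se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
t_stat = beta[1] / se
```

A significant `t_stat` here plays the role of the thesis's first finding (the index predicts consumption growth); the second step of the thesis then adds macroeconomic controls to `X` and checks whether the index's coefficient survives.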

Relevance:

20.00%

Publisher:

Abstract:

Light scattering, or the scattering and absorption of electromagnetic waves, is an important tool in all remote-sensing observations. In astronomy, the light scattered or absorbed by a distant object can be the only source of information. In Solar-system studies, light-scattering methods are employed when interpreting observations of atmosphereless bodies such as asteroids, of planetary atmospheres, and of cometary or interplanetary dust. Our Earth is constantly monitored from artificial satellites at different wavelengths. In remote sensing of the Earth, light-scattering methods are not the only source of information: there is always the possibility of making in situ measurements. Satellite-based remote sensing is, however, superior in terms of speed and coverage, provided the scattered signal can be reliably interpreted. The optical properties of many industrial products play a key role in their quality. Especially for products such as paint and paper, the ability to obscure the background and to reflect light is of utmost importance. High-grade papers are evaluated based on their brightness, opacity, color, and gloss. In product development, there is a need for computer-based simulation methods that could predict the optical properties and could therefore be used to optimize quality while reducing material costs. With paper, for instance, pilot experiments with an actual paper machine can be very time- and resource-consuming. The light-scattering methods presented in this thesis rigorously solve the interaction of light with materials that have wavelength-scale structures. These methods are computationally demanding, so the speed and accuracy of the methods play a key role. Different implementations of the discrete-dipole approximation are compared in the thesis, and the results provide practical guidelines for choosing a suitable code.
In addition, a novel method is presented for the numerical computation of the orientation-averaged light-scattering properties of a particle, and the method is compared against existing techniques. The simulation of light scattering for various targets, and the possible problems arising from the finite size of the model target, are discussed in the thesis. Scattering by single particles and small clusters is considered, as well as scattering in particulate media and in continuous media with porosity or surface roughness. Various techniques for modeling the scattering media are presented, and the results are applied to optimizing the structure of paper. The same methods can, however, be applied in light-scattering studies of Solar-system regoliths or cometary dust, or in any remote-sensing problem involving light scattering in random media with wavelength-scale structures.
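To give a concrete flavour of the wavelength-scale quantities involved, here is a minimal quasistatic sketch: the Rayleigh (small-particle) limit built on the Clausius-Mossotti polarizability, written in Gaussian units. The thesis's actual computations use the full discrete-dipole approximation, which this sketch does not reproduce; the function name and arguments are illustrative.

```python
import numpy as np

def rayleigh_cross_sections(radius, wavelength, m):
    """Quasistatic (Rayleigh) scattering and absorption cross sections for a
    sphere much smaller than the wavelength, using the Clausius-Mossotti
    polarizability alpha = a^3 (m^2 - 1) / (m^2 + 2) in Gaussian units."""
    k = 2.0 * np.pi / wavelength                       # wavenumber
    alpha = radius**3 * (m**2 - 1.0) / (m**2 + 2.0)    # dipole polarizability
    c_sca = (8.0 * np.pi / 3.0) * k**4 * np.abs(alpha)**2
    c_abs = 4.0 * np.pi * k * np.imag(alpha)           # zero for a real refractive index m
    return c_sca, c_abs
```

The strong size dependence (scattering grows as the sixth power of the radius) is one reason wavelength-scale structure dominates the optics of media such as paper coatings and regolith grains.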