17 results for Markov random field (MRF)
in Helda - Digital Repository of the University of Helsinki
Abstract:
Markov random fields (MRFs) are popular in image processing applications for describing spatial dependencies between image units. Here, we review the theory and models of MRFs with an application to improving forest inventory estimates. Typically, autocorrelation between study units is a nuisance in statistical inference, but we take advantage of the dependencies to smooth noisy measurements by borrowing information from the neighbouring units. We build a stochastic spatial model, which we estimate with a Markov chain Monte Carlo simulation method. The smoothed values are validated against another data set, increasing our confidence that the estimates are more accurate than the originals.
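The borrowing-from-neighbours idea can be sketched in a few lines of Python. This is an illustrative Gaussian-MRF smoother using iterated conditional means, not the stochastic model or the MCMC estimator of the thesis; the function name, the 4-neighbourhood, and the parameters `beta` and `sigma2` are assumptions made purely for the sketch.

```python
import numpy as np

def gmrf_smooth(y, beta=2.0, sigma2=1.0, iters=50):
    """Iterated conditional means for a Gaussian MRF on a 2-D grid.

    Each unit's smoothed value is a precision-weighted average of its
    own noisy observation y[i, j] and the values of its 4-neighbours,
    i.e. information is 'borrowed' from neighbouring units.
    """
    x = y.astype(float).copy()
    for _ in range(iters):
        nb_sum = np.zeros_like(x)
        nb_cnt = np.zeros_like(x)
        for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)]:
            rolled = np.roll(x, shift, axis=axis)
            # mask out the wrap-around row/column at the grid border
            valid = np.ones_like(x, dtype=bool)
            if axis == 0:
                valid[0 if shift == 1 else -1, :] = False
            else:
                valid[:, 0 if shift == 1 else -1] = False
            nb_sum += np.where(valid, rolled, 0.0)
            nb_cnt += valid
        # conditional mean: weigh observation against neighbour consensus
        x = (y / sigma2 + beta * nb_sum) / (1.0 / sigma2 + beta * nb_cnt)
    return x
```

Increasing `beta` strengthens the spatial coupling and hence the smoothing; `beta = 0` returns the raw observations.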
Abstract:
What can the statistical structure of natural images teach us about the human brain? Even though the visual cortex is one of the most studied parts of the brain, surprisingly little is known about how exactly images are processed to leave us with a coherent percept of the world around us, so we can recognize a friend or drive on a crowded street without any effort. By constructing probabilistic models of natural images, the goal of this thesis is to understand the structure of the stimulus that is the raison d'être for the visual system. Following the hypothesis that the optimal processing has to be matched to the structure of that stimulus, we attempt to derive computational principles, features that the visual system should compute, and properties that cells in the visual system should have. Starting from machine learning techniques such as principal component analysis and independent component analysis, we construct a variety of statistical models to discover structure in natural images that can be linked to receptive field properties of neurons in primary visual cortex, such as simple and complex cells. We show that by representing images with phase-invariant, complex cell-like units, a better statistical description of the visual environment is obtained than with linear simple cell units, and that complex cell pooling can be learned by estimating both layers of a two-layer model of natural images. We investigate how a simplified model of the processing in the retina, where adaptation and contrast normalization take place, is connected to the natural stimulus statistics. Analyzing the effect that retinal gain control has on later cortical processing, we propose a novel method to perform gain control in a data-driven way. Finally we show how models like those presented here can be extended to capture whole visual scenes rather than just small image patches.
By using a Markov random field approach we can model images of arbitrary size, while still being able to estimate the model parameters from the data.
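The models described above build on PCA and ICA. As a minimal illustration of the whitening step that typically precedes ICA-style feature learning, here is a PCA-whitening sketch; the random Gaussian data stands in for flattened natural image patches, and all names are illustrative rather than the thesis's own code.

```python
import numpy as np

def pca_whiten(patches, n_components=16):
    """PCA-whiten flattened image patches.

    Projects onto the leading principal components and rescales each
    direction to unit variance, so the output has identity covariance.
    Whitening removes second-order correlations, leaving higher-order
    structure for ICA-style models to capture.
    """
    X = patches - patches.mean(axis=0)
    cov = X.T @ X / len(X)
    eigval, eigvec = np.linalg.eigh(cov)          # ascending eigenvalues
    order = np.argsort(eigval)[::-1][:n_components]
    W = eigvec[:, order] / np.sqrt(eigval[order])  # whitening matrix
    return X @ W
```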
Abstract:
Genetics, the science of heredity and variation in living organisms, has a central role in medicine, in breeding crops and livestock, and in studying fundamental topics of the biological sciences such as evolution and cell functioning. The field of genetics is currently developing rapidly because of recent advances in technologies by which molecular data can be obtained from living organisms. To extract the most information from such data, the analyses need to be carried out using statistical models that are tailored to take the particular genetic processes into account. In this thesis we formulate and analyze Bayesian models for genetic marker data of contemporary individuals. The major focus is on modeling the unobserved recent ancestry of the sampled individuals (say, for tens of generations or so), which is carried out by using explicit probabilistic reconstructions of the pedigree structures accompanied by the gene flows at the marker loci. For such a recent history, the recombination process is the major genetic force that shapes the genomes of the individuals, and it is included in the model by assuming that the recombination fractions between adjacent markers are known. The posterior distribution of the unobserved history of the individuals is studied conditionally on the observed marker data by using a Markov chain Monte Carlo (MCMC) algorithm. The example analyses consider estimation of the population structure, the relatedness structure (both at the level of whole genomes and at each marker separately), and haplotype configurations. For situations where the pedigree structure is partially known, an algorithm to create an initial state for the MCMC algorithm is given. Furthermore, the thesis includes an extension of the model for the recent genetic history to situations where a quantitative phenotype has also been measured from the contemporary individuals.
In that case the goal is to identify positions on the genome that affect the observed phenotypic values. This task is carried out within the Bayesian framework, where the number and the relative effects of the quantitative trait loci are treated as random variables whose posterior distribution is studied conditionally on the observed genetic and phenotypic data. In addition, the thesis contains an extension of a widely-used haplotyping method, the PHASE algorithm, to settings where genetic material from several individuals has been pooled together, and the allele frequencies of each pool are determined in a single genotyping.
Abstract:
This thesis consists of an introduction, four research articles and an appendix. The thesis studies relations between two different approaches to the continuum limit of models of two-dimensional statistical mechanics at criticality. The approach of conformal field theory (CFT) can be thought of as the algebraic classification of some basic objects in these models; it has been successfully used by physicists since the 1980s. The other approach, Schramm-Loewner evolutions (SLEs), is a recently introduced set of mathematical methods for studying random curves or interfaces occurring in the continuum limit of the models. The first and second included articles argue, on the basis of statistical mechanics, what a plausible relation between SLEs and conformal field theory would be. The first article studies multiple SLEs: several random curves simultaneously in a domain. The proposed definition is compatible with a natural commutation requirement suggested by Dubédat. The curves of a multiple SLE may form different topological configurations, "pure geometries". We conjecture a relation between the topological configurations and the CFT concepts of conformal blocks and operator product expansions. Example applications of multiple SLEs include crossing probabilities for percolation and the Ising model. The second article studies SLE variants that represent models with boundary conditions implemented by primary fields. The most well known of these, SLE(kappa, rho), is shown to be simple in terms of the Coulomb gas formalism of CFT. In the third article the space of local martingales for variants of SLE is shown to carry a representation of the Virasoro algebra. Finding this structure is guided by the relation of SLEs and CFTs in general, but the result is established in a straightforward fashion. This article, too, emphasizes multiple SLEs and proposes a possible way of treating pure geometries in terms of the Coulomb gas.
The fourth article states results of applications of the Virasoro structure to the open questions of SLE reversibility and duality. Proofs of the stated results are provided in the appendix. The objective is an indirect computation of certain polynomial expected values. Provided that these expected values exist, in generic cases they are shown to possess the desired properties, thus giving support for both reversibility and duality.
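The SLE curves underlying these articles can be simulated numerically. The sketch below discretizes the Loewner equation with a Brownian driving function, which is the standard textbook construction of SLE(kappa) rather than anything specific to the thesis; names and step sizes are illustrative.

```python
import numpy as np

def sle_trace(kappa=2.0, n=200, dt=1e-3, seed=0):
    """Approximate trace of SLE(kappa) via the discretized Loewner equation.

    Driving function: W_t = sqrt(kappa) * Brownian motion, held constant
    on each small time interval.  For a constant drive the Loewner
    equation solves in closed form as a slit map, and the curve's tip at
    time t_k is obtained by composing the k inverse slit maps, newest
    first, applied at the current driving value.
    """
    def sqrt_h(z):
        # square-root branch mapping into the closed upper half-plane
        r = np.sqrt(z + 0j)
        return r if r.imag >= 0 else -r

    rng = np.random.default_rng(seed)
    dW = np.sqrt(kappa * dt) * rng.standard_normal(n)
    W = np.concatenate([[0.0], np.cumsum(dW)])
    trace = np.empty(n, dtype=complex)
    for k in range(1, n + 1):
        z = complex(W[k], 0.0)
        for j in range(k, 0, -1):   # invert the incremental maps
            z = W[j - 1] + sqrt_h((z - W[j - 1]) ** 2 - 4.0 * dt)
        trace[k - 1] = z
    return trace
```

The parameter `kappa` selects the universality class; the computed tip positions stay in the closed upper half-plane by construction of the square-root branch.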
Abstract:
Planar curves arise naturally as interfaces between two regions of the plane. An important part of statistical physics is the study of lattice models, and this thesis is about the interfaces of 2D lattice models. The scaling limit is an infinite-system limit taken by letting the lattice mesh decrease to zero. At criticality, the scaling limit of an interface is one of the SLE curves (Schramm-Loewner evolutions), introduced by Oded Schramm. This family of random curves is parametrized by a real variable, which determines the universality class of the model. The first and second papers of this thesis study properties of SLEs. They contain two different methods for studying the whole SLE curve, which is, in fact, the most interesting object from the statistical physics point of view. These methods are applied to study two symmetries of SLE: reversibility and duality. The first paper uses an algebraic method and a representation of the Virasoro algebra to find common martingales for different processes and, in that way, to confirm the symmetries for polynomial expected values of natural SLE data. In the second paper, a recursion is obtained for the same kind of expected values. The recursion is based on the stationarity of the law of the whole SLE curve under an SLE-induced flow. The third paper deals with one of the most central questions of the field and provides a framework of estimates for describing 2D scaling limits by SLE curves. In particular, it is shown that a weak estimate on the probability of an annulus crossing implies that a random curve arising from a statistical physics model will have scaling limits and that those will be well described by Loewner evolutions with random driving forces.
Abstract:
Time-dependent backgrounds in string theory provide a natural testing ground for physics concerning dynamical phenomena which cannot be reliably addressed in usual quantum field theories and cosmology. A good, tractable example to study is the rolling tachyon background, which describes the decay of an unstable brane in bosonic and supersymmetric Type II string theories. In this thesis I use boundary conformal field theory along with random matrix theory and Coulomb gas thermodynamics techniques to study open and closed string scattering amplitudes off the decaying brane. Calculating the simplest example, the tree-level amplitude of n open strings, would give us the emission rate of the open strings; however, even this amplitude has so far remained unknown. I organize the open string scattering computations in a more coherent manner and argue how to make further progress.
Abstract:
The eddy covariance (EC) flux measurement technique is based on measuring the turbulent motions of air with accurate and fast instruments. For instance, measuring methane flux requires a fast methane gas analyser that samples the methane concentration at least ten times per second, in addition to a sonic anemometer that measures the three wind components at the same sampling rate. Previously, measuring methane flux with the EC technique was almost impossible due to the lack of sufficiently fast gas analysers; however, new instruments have been developed during the last decade, and methane EC-flux measurements have thus become more common. The performance of four methane gas analysers suitable for eddy covariance measurements is assessed in this thesis. The assessment and comparison were performed by analysing EC data obtained during summer 2010 (1 April - 26 October) at the Siikaneva fen. The four participating methane gas analysers are the TGA-100A (Campbell Scientific Inc., USA), RMT-200 (Los Gatos Research, USA), G1301-f (Picarro Inc., USA) and Prototype-7700 (LI-COR Biosciences, USA). The RMT-200 functioned most reliably throughout the measurement campaign, and the corresponding methane flux data had the smallest random error. In addition, methane fluxes calculated from the G1301-f and RMT-200 data agree remarkably well throughout the measurement campaign. The calculated cospectra and power spectra agree well with the corresponding temperature spectra. The Prototype-7700 functioned for only slightly over one month at the beginning of the measurement campaign, and thus its accuracy and long-term performance are difficult to assess.
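The core of the EC technique, the covariance between vertical wind and gas concentration fluctuations, can be sketched in a few lines. This is the generic textbook Reynolds decomposition, not the processing chain used in the thesis, and the names are illustrative.

```python
import numpy as np

def ec_flux(w, c):
    """Eddy covariance flux from high-frequency time series.

    w : vertical wind speed samples (m s-1), e.g. 10 Hz sonic data
    c : gas concentration samples at the same rate over the same period
    Reynolds decomposition: the flux is the mean product of the
    fluctuations, F = mean(w' * c'), with w' = w - mean(w) etc.
    """
    w = np.asarray(w, float)
    c = np.asarray(c, float)
    return np.mean((w - w.mean()) * (c - c.mean()))
```

In practice this raw covariance is followed by corrections (coordinate rotation, spectral corrections, density corrections) that are beyond this sketch.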
Abstract:
The earliest stages of human cortical visual processing can be conceived as the extraction of local stimulus features. However, more complex visual functions, such as object recognition, require the integration of multiple features. Recently, the neural processes underlying feature integration in the visual system have been under intensive study. A specialized mid-level stage preceding the object recognition stage has been proposed to account for the processing of contours, surfaces and shapes as well as configuration. This thesis consists of four experimental, psychophysical studies on human visual feature integration. Two of the studies used the classification image method, a recently developed psychophysical reverse-correlation technique in which visual noise is added to near-threshold stimuli. By investigating the relationship between the random features in the noise and the observer's perceptual decision on each trial, it is possible to estimate which features of the stimuli are critical for the task. The method allows the critical features used in a psychophysical task to be visualized directly as a spatial correlation map, yielding an effective "behavioral receptive field". Visual context is known to modulate the perception of stimulus features. Some of these interactions are quite complex, and it is not known whether they reflect early or late stages of perceptual processing. The first study investigated the mechanisms of collinear facilitation, where nearby collinear Gabor flankers increase the detectability of a central Gabor. The behavioral receptive field of the mechanism mediating the detection of the central Gabor stimulus was measured by the classification image method. The results show that collinear flankers increase the extent of the behavioral receptive field for the central Gabor in the direction of the flankers. The increased sensitivity at the ends of the receptive field suggests a low-level explanation for the facilitation.
The second study investigated how visual features are integrated into percepts of surface brightness. A novel variant of the classification image method with a brightness-matching task was used. Many theories assume that perceived brightness is based on the analysis of luminance border features; here, this assumption was directly tested for the first time. The classification images show that the perceived brightness of both an illusory Craik-O'Brien-Cornsweet stimulus and a real uniform step stimulus depends solely on the border. Moreover, the spatial tuning of the features remains almost constant when the stimulus size is changed, suggesting that brightness perception is based on the output of a single spatial frequency channel. The third and fourth studies investigated global form integration in random-dot Glass patterns. In these patterns, a global form can be immediately perceived, even if only a small proportion of the random dots are paired into dipoles according to a geometrical rule. In the third study the discrimination of orientation structure in highly coherent concentric and Cartesian (straight) Glass patterns was measured. The results showed that global form was discriminated more efficiently in concentric patterns. The fourth study investigated how form detectability depends on the global regularity of the Glass pattern. The local structure was either Cartesian or curved. It was shown that randomizing the local orientation deteriorated performance only with the curved pattern. The results support the idea that curved and Cartesian patterns are processed in at least partially separate neural systems.
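The classification image computation itself is simple: compare the average noise field on the two response classes. The sketch below is a generic, illustrative version with a simulated template-matching observer, not the stimuli or procedures of these studies.

```python
import numpy as np

def classification_image(noise_fields, responses):
    """Classification image: mean noise on 'yes' trials minus mean
    noise on 'no' trials.

    Pixels whose noise values correlate with the observer's decisions
    stand out, giving a spatial map of the features actually used in
    the task (an effective 'behavioral receptive field').
    """
    noise_fields = np.asarray(noise_fields, float)
    responses = np.asarray(responses)
    return (noise_fields[responses == 1].mean(axis=0)
            - noise_fields[responses == 0].mean(axis=0))
```

With a simulated observer who answers "yes" whenever the noise matches an internal template, the classification image recovers that template.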
Abstract:
Ozone (O3) is a reactive gas present in the troposphere in the range of parts per billion (ppb), i.e. molecules of O3 per 10⁹ molecules of air. Its strong oxidative capacity makes it a key element in tropospheric chemistry and a threat to the integrity of materials, including living organisms. Knowledge and control of O3 levels are an issue in relation to indoor air quality, building material endurance, human respiratory disorders, and plant performance. Ozone is also a greenhouse gas and its abundance is relevant to global warming. The interaction of the lower troposphere with vegetated landscapes results in O3 being removed from the atmosphere by reactions that lead to the oxidation of plant-related components. Details on the rate and pattern of removal over different landscapes, as well as the ultimate mechanisms by which this occurs, are not fully resolved. This thesis analysed the processes controlling the transfer of ozone at the air-plant interface. Improved knowledge of these processes benefits the prediction of both the atmospheric removal of O3 and its impact on vegetation. The study was based on the measurement and analysis of multi-year field measurements of O3 flux to Scots pine (Pinus sylvestris L.) foliage with a shoot-scale gas-exchange enclosure system. In addition, the analyses made use of simultaneous CO2 and H2O exchange, canopy-scale O3, CO2 and H2O exchange, foliage surface wetness, and environmental variables. All data were gathered at the SMEAR measuring station (southern Finland). Enclosure gas-exchange techniques such as those commonly used for measuring CO2 and water vapour can be applied to measuring ozone gas exchange in the field. Through analysis of the system dynamics, the disturbances and noise that occur can be identified. In the system used in this study, possible artefacts arising from the reactivity of ozone towards the system materials, in combination with low background concentrations, need to be taken into account.
The main artefact was the loss of ozone to the chamber walls, which was found to be highly variable. The level of wall loss was obtained from simultaneous and continuous measurements and was included in the formulation of the mass balance of the O3 concentration inside the chamber. The analysis of the field measurements in this study shows that the flux of ozone to Scots pine foliage is generated in about equal proportions by stomatally and non-stomatally controlled processes. Deposition to foliage and forest is sustained also during night and winter, when stomatal gas exchange is low or absent. The non-stomatal portion of the flux was analysed further. The pattern of the flux in time was found to be an overlap of the patterns of biological activity and the presence of wetness in the environment; this was seen both at the shoot and at the canopy scale. The presence of wetness enhanced the flux not only when liquid droplets were present but also when a moisture film existed on the plant surfaces. The existence of these films and their relation to the ozone sinks was determined by simultaneous measurements of leaf surface wetness and ozone flux. The results suggest that ozone reacts at the foliage surface and that the reaction rate is mediated by the presence of surface wetness. Alternative mechanisms were discussed, including nocturnal stomatal aperture and the emission of reactive volatile compounds. The prediction of the total flux could thus be based on a combination of a model of stomatal behaviour and a model of water absorption on the foliage surfaces. The concepts behind the division into stomatal and non-stomatal sinks were reconsidered. This study showed that it is theoretically possible for a sink located before or near the stomatal aperture to prevent or diminish the diffusion of ozone towards the intercellular air space of the mesophyll.
This obstacle to stomatal diffusion occurs only under certain conditions, which include a very low presence of reaction sites in the mesophyll and an extremely strong sink located on the outer surfaces or in the stomatal pore. The relevance, or existence, of this process in natural conditions would need to be assessed further. Potentially strong reactions were considered, including those of dissolved sulphate, volatile organic compounds, and apoplastic ascorbic acid. Information on the location and relative abundance of these compounds would be valuable. The highest total flux to the foliage and forest occurs when both plant activity and ambient moisture are high. The highest uptake into the interior of the foliage occurs at large stomatal apertures, provided that scavenging reactions located near the stomatal pore are weak or non-existent. The discussion covers the methodological developments of this study, the relevance of the different factors controlling ozone flux, the partitioning among its components, and the possible mechanisms of non-stomatal uptake.
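The chamber mass balance mentioned above can be sketched as a steady-state budget: inlet supply equals outlet loss plus wall loss plus foliage uptake. The function below is an illustrative simplification; the names, units, and the first-order wall-loss term are assumptions, not the thesis's exact formulation.

```python
def chamber_o3_flux(q, v_chamber, a_leaf, c_in, c_out, k_wall):
    """Steady-state O3 mass balance for a flow-through enclosure.

    q         : flow rate through the chamber (m3 s-1)
    v_chamber : chamber volume (m3)
    a_leaf    : enclosed foliage area (m2)
    c_in/out  : O3 concentration at inlet / outlet (ug m-3)
    k_wall    : first-order wall-loss rate (s-1), which must be
                measured separately (e.g. with an empty chamber)

    Steady state: q*c_in = q*c_out + k_wall*v_chamber*c_out + F*a_leaf,
    solved for the foliage flux F per unit leaf area.
    """
    f_total = q * (c_in - c_out)            # total O3 removal in chamber
    f_wall = k_wall * v_chamber * c_out     # loss to the chamber walls
    return (f_total - f_wall) / a_leaf
```

The sketch makes the thesis's point concrete: if `k_wall` is not measured and subtracted, the wall loss is silently attributed to the foliage.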
Abstract:
The aim of this thesis was to increase our knowledge of the effects of seed origin on the timing of height growth cessation and on the field performance of silver birch from different latitudes, with special attention paid to browsing damage by moose in young birch plantations. The effect of seed origin latitude and sowing time on the timing of height growth cessation of first-year seedlings was studied in a greenhouse experiment with seven seed origins (lat. 58°-67°N). Variation in the critical night length (CNL) for 50 % bud set within two latitudinally distant stands (60° and 67°N) was studied in three phytotron experiments. Browsing by moose on 5-11-year-old silver birch saplings from latitudinally different seed origins (53°-67°N) was studied in a field experiment in southern Finland. The yield and stem quality of 22-year-old silver birch trees of Baltic, Finnish and Russian origin (54°-63°N) and the effect of latitudinal seed transfers were studied in two provenance trials, at Tuusula in southern Finland and at Viitasaari in central Finland. The timing of height growth cessation depended systematically on the latitude of seed origin and on the sowing date. The more northern the seed origin, the earlier the growth cessation and the shorter the growth period. Later sowing dates delayed growth cessation but also shortened the growth period. The mean CNL of the southern ecotype was longer, 6.3 ± 0.2 h (95 % confidence interval), than that of the northern ecotype, 3.1 ± 0.3 h. The within-ecotype variance of the CNL was higher in the northern ecotype (0.484 h²) than in the southern ecotype (0.150 h²). Browsing by moose decreased with increasing latitude of seed origin and with sapling height. Origins transferred from more southern latitudes were more heavily browsed than the more northern native ones. Southern Finnish seed origins produced the highest volume per unit area in central Finland (lat. 63°11'N).
Estonian and north Latvian stand seed origins, and the southern Finnish plus-tree origins, were the most productive in southern Finland (lat. 60°21'N). Latitudinal seed transfer distance had a significant effect on survival, stem volume per hectare and the proportion of trees with a stem defect. The relationship of both survival and stem volume per hectare to the latitudinal seed transfer distance was curvilinear. Volume was increased by transferring seed from ca. 2 degrees of latitude to the south; a longer transfer from the south, or a transfer from the north, decreased the yield. The proportion of trees with a stem defect increased linearly with the latitudinal seed transfer distance from the south.
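The critical night length, defined above as the night length giving 50 % bud set, can be estimated from a dose-response curve. The sketch below uses simple linear interpolation and invented data points, purely for illustration; the thesis's actual estimation procedure may differ.

```python
import numpy as np

def critical_night_length(night_h, budset_frac):
    """Estimate the critical night length (CNL): the night length at
    which 50 % of seedlings have set bud.

    night_h     : night lengths tested (hours), increasing
    budset_frac : observed bud-set fractions at those night lengths,
                  assumed to increase with night length and to
                  bracket the 50 % level
    Returns the 50 % point by linear interpolation between the two
    night lengths that straddle it.
    """
    night_h = np.asarray(night_h, float)
    f = np.asarray(budset_frac, float)
    i = np.searchsorted(f, 0.5)        # first fraction >= 0.5
    x0, x1 = night_h[i - 1], night_h[i]
    y0, y1 = f[i - 1], f[i]
    return x0 + (0.5 - y0) * (x1 - x0) / (y1 - y0)
```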
Abstract:
Volatilization of ammonia (NH3) from animal manure is a major pathway for nitrogen (N) losses that cause eutrophication, acidification, and other environmental hazards. In this study, the effects of alternative techniques of manure treatment (aeration, separation, addition of peat) and application (broadcast spreading, band spreading, injection, incorporation by harrowing) on ammonia emissions in the field and on nitrogen uptake by ley or cereals were studied. The effect of a mixture of slurry and peat on soil properties was also investigated. The aim of the study was to find ways to improve the utilization of manure nitrogen and reduce its release to the environment. Injection into the soil or incorporation by harrowing clearly reduced ammonia volatilization from slurry more than did surface application onto a smaller area by band spreading, or reduction of the dry matter of the slurry by aeration or separation. Surface application showed low ammonia volatilization when pig slurry was applied to tilled bare clay soil or to spring wheat stands in the early growth stages. Apparently, the properties of both the slurry and the soil enabled the rapid infiltration and absorption of the slurry and its ammoniacal nitrogen by the soil. On ley, however, surface-applied cattle slurry lost about half of its ammoniacal nitrogen. The volatilization of ammonia from surface-applied peat manure was slow but proceeded over a long period of time. After rain or irrigation, the peat manure layer on the soil surface retarded evaporation. Incorporation was less important for the fertilizer effect of peat manure than for pig slurry, but both manures were more effective when incorporated. Peat manure applications increased soil organic matter content and aggregate stability, and stubble mulch tillage hastened the effect in the surface soil compared with ploughing. The apparent recovery of ammoniacal manure nitrogen in the crop yield was higher with injection and incorporation than with surface applications.
This was the case for leys as well as for spring cereals, even though ammonia losses from manures applied to cereals were relatively low with surface applications as well. The ammoniacal nitrogen of surface-applied slurry was obviously adsorbed by the very surface soil and remained mostly unavailable to plant roots in the dry soil. Supplementing manures with inorganic fertilizer nitrogen, which adds plant-available nitrogen to the soil at the start of growth, increased the overall recovery of applied nitrogen in crop yields.
Abstract:
Our present-day understanding of the fundamental constituents of matter and their interactions is based on the Standard Model of particle physics, which relies on quantum gauge field theories. On the other hand, the large-scale dynamical behaviour of spacetime is understood via Einstein's general theory of relativity. The merging of these two complementary aspects of nature, quantum and gravity, is one of the greatest goals of modern fundamental physics. Achieving it would help us understand the short-distance structure of spacetime, thus shedding light on the events in the singular states of general relativity, such as black holes and the Big Bang, where our current models of nature break down. The formulation of quantum field theories in noncommutative spacetime is an attempt to realize the idea of nonlocality at short distances, which our present understanding of these different aspects of Nature suggests, and consequently to find testable hints of the underlying quantum behaviour of spacetime. The formulation of noncommutative theories encounters various unprecedented problems, which derive from their peculiar inherent nonlocality. Arguably the most serious of these is the so-called UV/IR mixing, which makes the derivation of observable predictions especially hard by causing new tedious divergences, to which our previous well-developed renormalization methods for quantum field theories do not apply. In the thesis I review the basic mathematical concepts of noncommutative spacetime, different formulations of quantum field theories in this context, and the theoretical understanding of UV/IR mixing. In particular, I put forward new results to be published, which show that quantum electrodynamics in noncommutative spacetime, defined via the Seiberg-Witten map, also suffers from UV/IR mixing. Finally, I review some of the most promising ways to overcome the problem; the final solution remains a challenge for the future.
Abstract:
The efforts to combine quantum theory with general relativity have been great and marked by several successes. One field where progress has lately been made is the study of noncommutative quantum field theories, which arise as a low-energy limit in certain string theories. The idea of noncommutativity comes naturally when combining these two extremes and has profound implications for results widely accepted in traditional, commutative, theories. In this work I review the status of one of the most important connections in physics, the spin-statistics relation. The relation is deeply ingrained in our reality: it gives us the structure of the periodic table and is of crucial importance for the stability of all matter. The dramatic effects of the noncommutativity of space-time coordinates, mainly the loss of Lorentz invariance, call the spin-statistics relation into question. The spin-statistics theorem is first presented in its traditional setting, with a clarifying proof starting from minimal requirements. Next the notion of noncommutativity is introduced and its implications are studied. The discussion is essentially based on twisted Poincaré symmetry, the space-time symmetry of noncommutative quantum field theory. The controversial issue of microcausality in noncommutative quantum field theory is settled by showing, for the first time, that the light-wedge microcausality condition is compatible with the twisted Poincaré symmetry. The spin-statistics relation is considered both from the point of view of braided statistics and in the traditional Lagrangian formulation of Pauli, with the conclusion that Pauli's age-old theorem withstands even this test, dramatic as it is for the whole structure of space-time.