21 results for Rejection-sampling Algorithm


Relevance:

20.00%

Abstract:

This thesis examines the feasibility of a forest inventory method based on two-phase sampling for estimating forest attributes at the stand or substand level for forest management purposes. The method is based on multi-source forest inventory combining auxiliary data, consisting of remote sensing imagery or other geographic information, with field measurements. Auxiliary data are utilized as first-phase data covering all inventory units. Various methods were examined for improving the accuracy of the forest estimates. Pre-processing of auxiliary data in the form of correcting the spectral properties of aerial imagery was examined (I), as was the selection of aerial image features for estimating forest attributes (II). Various spatial units were compared for extracting image features in a remote-sensing-aided forest inventory utilizing very high resolution imagery (III). A number of data sources were combined and different weighting procedures were tested in estimating forest attributes (IV, V). Correction of the spectral properties of aerial images proved to be a straightforward and advantageous method for improving the correlation between the image features and the measured forest attributes. Testing different image features that can be extracted from aerial photographs (and other very high resolution images) showed that the images contain a wealth of relevant information that can be extracted only by utilizing the spatial organization of the image pixel values. Furthermore, careful selection of image features for the inventory task generally gives better results than inputting all extractable features into the estimation procedure. When the spatial units for extracting very high resolution image features were examined, an approach based on image segmentation generally showed advantages over a traditional sample-plot-based approach. Combining several data sources resulted in more accurate estimates than any of the individual data sources alone. The best combined estimate can be derived by weighting the estimates produced by the individual data sources by the inverses of their mean square errors. Although the plot-level estimation accuracy in a two-phase sampling inventory can be improved in many ways, forest estimates based mainly on single-view satellite and aerial imagery remain a relatively poor basis for making stand-level management decisions.
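
Written out in LaTeX (my notation, not a formula quoted from the thesis), the inverse-MSE weighting described above is the familiar inverse-variance combination of K single-source estimates:

$$\hat{y} = \frac{\sum_{k=1}^{K} \hat{y}_k / \mathrm{MSE}_k}{\sum_{k=1}^{K} 1 / \mathrm{MSE}_k}$$

where $\hat{y}_k$ is the estimate produced by data source $k$. Each source contributes in proportion to $1/\mathrm{MSE}_k$, so the least accurate data sources influence the combined estimate the least.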

Relevance:

20.00%

Abstract:

This thesis, which consists of an introduction and four peer-reviewed original publications, studies the problems of haplotype inference (haplotyping) and local alignment significance. The problems studied here belong to the broad area of bioinformatics and computational biology. The presented solutions are computationally fast and accurate, which makes them practical in high-throughput sequence data analysis. Haplotype inference is a computational problem where the goal is to estimate haplotypes from a sample of genotypes as accurately as possible. This problem is important because the direct measurement of haplotypes is difficult, whereas genotypes are easier to quantify. Haplotypes are the key players when studying, for example, the genetic causes of diseases. In this thesis, three methods are presented for the haplotype inference problem, referred to as HaploParser, HIT, and BACH. HaploParser is based on a combinatorial mosaic model and hierarchical parsing that together mimic recombinations and point mutations in a biologically plausible way. In this mosaic model, the current population is assumed to have evolved from a small founder population. Thus, the haplotypes of the current population are recombinations of the (implicit) founder haplotypes with some point mutations. HIT (Haplotype Inference Technique) uses a hidden Markov model for haplotypes, and efficient algorithms are presented to learn this model from genotype data. The model structure of HIT is analogous to the mosaic model of HaploParser with founder haplotypes. Therefore, it can be seen as a probabilistic model of recombinations and point mutations. BACH (Bayesian Context-based Haplotyping) utilizes a context tree weighting algorithm to efficiently sum over all variable-length Markov chains to evaluate the posterior probability of a haplotype configuration. Algorithms are presented that find haplotype configurations with high posterior probability. BACH is the most accurate method presented in this thesis and has performance comparable to the best available software for haplotype inference. Local alignment significance is a computational problem where one is interested in whether the local similarities in two sequences arise because the sequences are related or merely by chance. Similarity of sequences is measured by their best local alignment score, and from that a p-value is computed. This p-value is the probability of picking two sequences from the null model that have an equally good or better best local alignment score. Local alignment significance is used routinely, for example, in homology searches. In this thesis, a general framework is sketched that allows one to compute a tight upper bound for the p-value of a local pairwise alignment score. Unlike previous methods, the presented framework is not affected by so-called edge effects and can handle gaps (deletions and insertions) without troublesome sampling and curve fitting.
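
The thesis derives analytic upper bounds rather than sampling, but the quantity being bounded can be illustrated with a naive Monte Carlo estimate. The following Python sketch assumes an i.i.d. uniform null model over the DNA alphabet and simple linear gap penalties; the scoring parameters and the smith_waterman helper are illustrative choices of mine, not the thesis's method:

import random

def smith_waterman(a, b, match=1, mismatch=-1, gap=-2):
    """Best local alignment score with linear gap penalties."""
    prev = [0] * (len(b) + 1)
    best = 0
    for i in range(1, len(a) + 1):
        curr = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            # Local alignment: scores are clamped at zero.
            curr[j] = max(0, prev[j - 1] + s, prev[j] + gap, curr[j - 1] + gap)
            best = max(best, curr[j])
        prev = curr
    return best

def mc_pvalue(observed_score, m, n, trials=1000, alphabet="ACGT"):
    """Estimate P(best score >= observed_score) under an i.i.d. null model."""
    hits = sum(
        smith_waterman(
            "".join(random.choices(alphabet, k=m)),
            "".join(random.choices(alphabet, k=n)),
        ) >= observed_score
        for _ in range(trials)
    )
    return hits / trials

Estimating small p-values this way requires prohibitively many trials, and finite-length random sequences also exhibit the edge effects mentioned above, which is precisely what motivates an analytic framework.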

Relevance:

20.00%

Abstract:

There is little information available on 4-6-year-old children's hospital-related fears, and on coping with such fears, as expressed by the children themselves. However, previous data collected from parents and hospital personnel indicate that hospitalization is an anxiety-producing experience for young children. The purpose of this study was to describe the experience of hospital-related fears and of coping with them among 4-6-year-old children. The aim was to form a descriptive model of the subjective experience of hospital-related fears and coping strategies of 4-6-year-old children. The data were collected by interviewing 4-6-year-old children in hospital and kindergarten settings in Finland from 2004 to 2006. Ninety children were interviewed to describe hospital-related fear and the experience of fear, and 89 to describe their coping with the fear and the experience of coping. The children were chosen through purposive sampling. The data were gathered by semi-structured interview, supported by pictures. The data on hospital-related fears and on strategies for coping with them were reviewed by qualitative and quantitative methods. The experiences of hospital-related fears and of coping with these fears were analyzed using Colaizzi's method of phenomenological analysis. The results revealed that more than 90% of the children said they were afraid of at least one thing in hospital. Most of the fears could be categorized as fears of nursing interventions, fears of being a patient, and fears caused by the developmental stage of the child. Children interviewed in the hospital expressed substantially more fears than children interviewed in kindergarten. The children's meanings of hospital-related fears fell into four main clusters: 1) insecurity, 2) injury, 3) helplessness, and 4) rejection. The results also showed that children have plenty of coping strategies to deal with their fears, especially strategies in which the children themselves play an active role. The most often mentioned coping strategies were 1) the presence of parents and other family members, 2) the help of the personnel, 3) positive images and humour, 4) play, and 5) the child's own safety toy. The children interviewed in the hospital mentioned play, positive imagination and humour as their coping strategies statistically significantly more often than children interviewed in kindergarten. The meaning of coping with hospital fears consisted of six clusters: pleasure, security, care, understanding the meaning of the situation, participating, and protecting oneself. Being admitted to a hospital is an event which may increase the fears of a 4-6-year-old child. Children who have personal experience of being admitted to a hospital describe more fears than healthy children in kindergarten. For young children, hospital-related fear can be such a distressing experience that it is reflected in their feelings of security and their behaviour. Children can sometimes find it difficult to admit their fear. Children need the help of adults to express their hospital-related fears and the objects of those fears, and to cope with them. Personnel should be aware of children's fears and support them in the use of coping strategies. In addition to experiences of security and care, pre-school-aged children need active coping strategies that they can use themselves, regardless of the presence of parents or nurses. Most of all, children need the possibility to play and experience pleasure. Children can also be taught coping strategies which give them an active, positive role.

Relevance:

20.00%

Abstract:

Emissions of coal combustion fly ash through full-scale electrostatic precipitators (ESP) were studied under different coal combustion and operation conditions. Sub-micron fly-ash aerosol emissions from a power plant boiler and the ESP were determined, and from these the aerosol penetration was derived on the basis of electrical mobility measurements, giving an estimate of the size of the small particles that escape and the maximum extent to which they do so. The experiments indicate a maximum penetration of 4% to 20% of the small particles, counted on a number basis instead of the normally used mass basis, while the ESP is simultaneously operating at nearly 100% collection efficiency on a mass basis. Although the size range as such appears independent of the coal, of the boiler, and even of the device used for emission control, the maximum penetration level on the number basis depends on the ESP operating parameters. The measured emissions were stable during stable boiler operation for a given fired coal, and the emissions from different coals each appeared distinct, indicating that the sub-micron size distribution of the fly ash could be used as a specific characteristic for recognition, for instance for authentication, given an indication of known stable operation. Consequently, the results on the emissions suggest an optimum particle size range for environmental monitoring with respect to the probability of finding traces in the samples. The current work also embodies an authentication system for aerosol samples for post-inspection from any macroscopic sample piece. The system can comprise newly introduced devices, for use independently or in combination with each other, arranged to extend the sampling operation length and/or the tag selection diversity. The tag for the samples can be based on naturally occurring measures and/or added measures of authenticity in a suitable combination. The method has applications not only in the military domain but in civil industries as well. Besides aerosol samples, the system can be applied to ink for printing banknotes or other papers of monetary value, and also in filter manufacturing for marking fibrous filters.
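
A minimal numerical sketch, in Python, of the number-basis versus mass-basis distinction made above; the size bins and concentrations are invented for illustration only, and particle mass is taken as proportional to the cube of the diameter:

import numpy as np

d = np.array([0.05, 0.1, 0.3, 1.0, 3.0])        # particle diameter, um (assumed bins)
n_up = np.array([1e7, 5e6, 1e6, 1e5, 1e4])      # upstream number conc., 1/cm^3
n_down = np.array([1e6, 5e5, 2e4, 5e2, 1e1])    # downstream number conc., 1/cm^3

penetration = n_down / n_up                      # per-bin penetration through the ESP
mass_up, mass_down = n_up * d**3, n_down * d**3  # mass-proportional weights (~ d^3)

eff_number = 1 - n_down.sum() / n_up.sum()       # collection efficiency, number basis
eff_mass = 1 - mass_down.sum() / mass_up.sum()   # collection efficiency, mass basis
print(f"number basis {eff_number:.1%}, mass basis {eff_mass:.1%}")

With these invented numbers the mass-basis efficiency is above 99%, while roughly 10% of the particles escape on a number basis, which is the kind of discrepancy the abstract reports.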

Relevance:

20.00%

Abstract:

Volatile organic compounds (VOCs) affect atmospheric chemistry and thereby also participate in climate change in many ways. The long-lived greenhouse gases and tropospheric ozone are the most important radiative forcing components warming the climate, while aerosols are the most important cooling component. VOCs can have warming effects on the climate: they participate in tropospheric ozone formation and compete with the greenhouse gases for oxidants, thus, for example, lengthening the atmospheric lifetime of methane. Some VOCs, on the other hand, cool the atmosphere by taking part in the formation of aerosol particles. Some VOCs, in addition, have direct health effects, such as the carcinogenic benzene. VOCs are emitted into the atmosphere by various processes. Primary emissions of VOCs include biogenic emissions from vegetation, biomass burning and human activities. VOCs are also produced as secondary emissions from the reactions of other organic compounds. Globally, forests are the largest source of VOCs entering the atmosphere. This thesis focuses on measurements of emissions and concentrations of VOCs in one of the largest vegetation zones in the world, the boreal zone. An automated sampling system was designed and built for continuous VOC concentration and emission measurements with a proton transfer reaction mass spectrometer (PTR-MS). The system measured for one hour at a time in three-hourly cycles: 1) ambient volume mixing-ratios of VOCs in the Scots-pine-dominated boreal forest, 2) VOC fluxes above the canopy, and 3) VOC emissions from Scots pine shoots. In addition to the online PTR-MS measurements, we determined the composition and seasonality of the VOC emissions from a Siberian larch with adsorbent samples and GC-MS analysis. The VOC emissions from Siberian larch were reported for the first time in the literature. The VOC emissions were 90% monoterpenes (mainly sabinene), the rest being sesquiterpenes (mainly α-farnesene). The normalized monoterpene emission potentials were highest in late summer, rising again in late autumn. The normalized sesquiterpene emission potentials were also highest in late summer, but decreased towards the autumn. The emissions of mono- and sesquiterpenes from the deciduous Siberian larch, as well as the emissions of monoterpenes measured from the evergreen Scots pine, were well described by the temperature-dependent algorithm. In the Scots-pine-dominated forest, canopy-scale emissions of monoterpenes and oxygenated VOCs (OVOCs) were of the same magnitude. Methanol and acetone were the most abundant OVOCs emitted from the forest and also in the ambient air. Annually, methanol and acetone mixing ratios were of the order of 1 ppbv. The volume mixing-ratios of monoterpenes and of the sum of isoprene and 2-methyl-3-buten-2-ol (MBO) were an order of magnitude lower. The majority of the monoterpene and methanol emissions from the Scots-pine-dominated forest were explained by emissions from Scots pine shoots. The VOCs were divided into three classes based on the dynamics of their summer-time concentrations: 1) reactive compounds with local biological, anthropogenic or chemical sources (methanol, acetone, butanol and hexanal), 2) compounds whose emissions are only temperature-dependent (monoterpenes), and 3) long-lived compounds (benzene, acetaldehyde). Biogenic VOC (methanol, acetone, isoprene+MBO and monoterpene) volume mixing-ratios had clear diurnal patterns during summer. The ambient mixing ratios of other VOCs did not show this behaviour. During winter we did not observe systematic diurnal cycles for any of the VOCs. Different sources, removal processes and turbulent mixing explained the dynamics of the measured mixing-ratios qualitatively. However, quantitative understanding will require long-term emission measurements of the OVOCs and the use of comprehensive chemistry models.

Keywords: hydrocarbons, VOC, fluxes, volume mixing-ratio, boreal forest
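
The "temperature-dependent algorithm" referred to above is presumably the standard Guenther-type exponential formulation for terpene pool emissions; in LaTeX, with conventional (here assumed) symbols:

$$E = E_S \exp\big(\beta\,(T - T_S)\big)$$

where $E$ is the emission rate at temperature $T$, $E_S$ the normalized emission potential at the standard temperature $T_S$ (commonly 303 K), and $\beta$ an empirical temperature coefficient (a typical literature value for monoterpenes is about 0.09 K⁻¹). The "normalized emission potentials" reported above correspond to fitted values of $E_S$.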

Relevance:

20.00%

Abstract:

In this thesis we deal with the concept of risk. The objective is to bring together and draw conclusions from some normative information regarding quantitative portfolio management and risk assessment. The first essay concentrates on return dependency. We propose an algorithm for classifying markets into rising and falling. Given the algorithm, we derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. The serial dependency in bear and bull markets behaves differently, however: it is strongly positive in rising markets, whereas in bear markets returns are closer to a random walk. Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly. The results also suggest that volatility is non-stationary from time to time. In the third essay we examine the impact of market microstructure on the error between the estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates and that a lower sampling frequency and non-constant volatility increase the error variation between the estimated variance and the variance of the underlying process. From these essays we can conclude that volatility is not easily estimated, even from high-frequency data. It is neither very well behaved in terms of stability nor in terms of dependency over time. Based on these observations, we recommend the use of simple, transparent methods that are likely to be more robust over differing volatility regimes than models with a complex parameter universe. In analyzing long-term return dependency in the first moment we find that the Trend Switch Probability is a robust estimator. This is an interesting area for further research, with important implications for active asset allocation.
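
For reference, the realized variance studied in essays two and three is the sum of squared intraday returns over a day $t$; in LaTeX:

$$\mathrm{RV}_t = \sum_{i=1}^{n} r_{t,i}^2$$

where $r_{t,i}$ denotes the $i$-th of the $n$ intraday returns at the chosen sampling frequency. When returns are autocorrelated, the expectation of this sum no longer matches the variance of the underlying process, which is the bias examined in the third essay.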

Relevance:

20.00%

Abstract:

The Earth's climate is a highly dynamic and complex system in which atmospheric aerosols are increasingly recognized to play a key role. Aerosol particles affect the climate through a multitude of processes, directly by absorbing and reflecting radiation and indirectly by changing the properties of clouds. Because of this complexity, quantification of the effects of aerosols remains highly uncertain. Better understanding of the effects of aerosols requires more information on aerosol chemistry. Before the determination of aerosol chemical composition by the various available analytical techniques, aerosol particles must be reliably sampled and prepared. Indeed, sampling is one of the most challenging steps in aerosol studies, since all available sampling techniques harbor drawbacks. In this study, novel methodologies were developed for sampling and determining the chemical composition of atmospheric aerosols. In the particle-into-liquid sampler (PILS), aerosol particles grow in saturated water vapor and are then impacted into and dissolved in liquid water. Once in water, the aerosol sample can be transported and analyzed by various off-line or on-line techniques. In this study, PILS was modified and the sampling procedure was optimized to obtain less altered aerosol samples with good time resolution. A combination of denuders with different coatings was tested to adsorb gas-phase compounds before PILS. Mixtures of water with alcohols were introduced to increase the solubility of aerosols. The minimum sampling time required was determined by collecting samples off-line every hour and proceeding with liquid-liquid extraction (LLE) and analysis by gas chromatography-mass spectrometry (GC-MS). The laboriousness of LLE followed by GC-MS analysis prompted an evaluation of solid-phase extraction (SPE) for the extraction of aldehydes and acids in aerosol samples. These two compound groups are thought to be key for aerosol growth. Octadecylsilica, hydrophilic-lipophilic balance (HLB), and mixed-phase anion exchange (MAX) materials were tested for extraction. MAX proved to be efficient for acids, but no tested material offered sufficient adsorption for aldehydes. Thus, PILS samples were extracted only with MAX to guarantee good results for organic acids determined by high-performance liquid chromatography-mass spectrometry (HPLC-MS). On-line coupling of SPE with HPLC-MS is relatively easy, and here on-line coupling of PILS with HPLC-MS through the SPE trap produced some interesting data on relevant acids in atmospheric aerosol samples. A completely different approach to aerosol sampling, namely differential mobility analyzer (DMA)-assisted filter sampling, was employed in this study to provide information about the size-dependent chemical composition of aerosols and understanding of the processes driving aerosol growth from nano-sized clusters to climatically relevant particles (>40 nm). The DMA was set to sample particles with diameters of 50, 40, and 30 nm, and the aerosols were collected on Teflon or quartz fiber filters. To clarify the gas-phase contribution, zero gas-phase samples were collected by switching off the DMA every other 15-minute period. Gas-phase compounds were adsorbed equally well on both types of filter and were found to contribute significantly to the total compound mass. Gas-phase adsorption is especially significant during the collection of nanometer-sized aerosols and always needs to be taken into account. A further aim of this study was to determine the oxidation products of β-caryophyllene (the major sesquiterpene in the boreal forest) in aerosol particles. Since reference compounds are needed to verify the accuracy of analytical measurements, three oxidation products of β-caryophyllene were synthesized: β-caryophyllene aldehyde, β-nocaryophyllene aldehyde, and β-caryophyllinic acid. All three were identified for the first time in ambient aerosol samples, at relatively high concentrations, and their contribution to the aerosol mass (and probably growth) was concluded to be significant. The methodological and instrumental developments presented in this work enable a fuller understanding of the processes behind biogenic aerosol formation and provide new tools for more precise determination of biosphere-atmosphere interactions.

Relevance:

20.00%

Abstract:

The thesis presents a state-space model for a basketball league and a Kalman filter algorithm for estimating the state of the league. In the state-space model, each basketball team is associated with a rating that represents its strength compared to the other teams. The ratings are assumed to evolve in time following a stochastic process with independent Gaussian increments. The estimation of the team ratings is based on the observed game scores, which are assumed to depend linearly on the true strengths of the teams plus independent Gaussian noise. The team ratings are estimated using a recursive Kalman filter algorithm that produces least-squares-optimal estimates of the team strengths and predictions for the scores of future games. Additionally, if the Gaussianity assumption holds, the predictions given by the Kalman filter maximize the likelihood of the observed scores. The team ratings allow probabilistic inference about the ranking of the teams and their relative strengths, as well as about the teams' winning probabilities in future games. The predictions about the winners of the games are correct 65-70% of the time. The team ratings explain 16% of the random variation observed in the game scores. Furthermore, the winning probabilities given by the model are in agreement with the observed scores. The state-space model includes four independent parameters that involve the variances of the noise terms and the home-court advantage observed in the scores. The thesis presents the estimation of these parameters using the maximum likelihood method as well as other techniques. The thesis also gives various example analyses of the American professional basketball league, i.e., the National Basketball Association (NBA), and the regular seasons played from 2005 through 2010. Additionally, the 2009-2010 season is discussed in full detail, including the playoffs.
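
A minimal Python sketch of this kind of rating filter, under assumptions I have filled in: the observed quantity per game is taken to be the home-minus-away score difference, ratings follow a Gaussian random walk, and all parameter values are illustrative rather than the thesis's estimates:

import numpy as np

class RatingFilter:
    def __init__(self, n_teams, q=1.0, r=100.0, home_adv=3.0, p0=25.0):
        self.x = np.zeros(n_teams)           # rating means, zero-centred
        self.P = np.eye(n_teams) * p0        # rating covariance
        self.q, self.r, self.h = q, r, home_adv

    def predict(self):
        # Random-walk dynamics: ratings drift, uncertainty grows by Q = qI.
        self.P = self.P + self.q * np.eye(len(self.x))

    def update(self, home, away, score_diff):
        # Observation: score_diff = x[home] - x[away] + home advantage + noise.
        H = np.zeros(len(self.x))
        H[home], H[away] = 1.0, -1.0
        innovation = score_diff - (H @ self.x + self.h)
        S = H @ self.P @ H + self.r          # innovation variance (scalar)
        K = self.P @ H / S                   # Kalman gain (vector)
        self.x = self.x + K * innovation
        self.P = self.P - np.outer(K, H @ self.P)

Calling predict() before each game day and update() once per game (e.g. f.update(home=0, away=7, score_diff=12)) yields rating means and covariances from which rankings and winning probabilities for future games can be read off.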

Relevance:

20.00%

Abstract:

We show that the ratio of matched individuals to blocking pairs grows linearly with the number of propose–accept rounds executed by the Gale–Shapley algorithm for the stable marriage problem. Consequently, the participants can arrive at an almost stable matching even without full information about the problem instance; for each participant, knowing only its local neighbourhood is enough. In distributed-systems parlance, this means that if each person has only a constant number of acceptable partners, an almost stable matching emerges after a constant number of synchronous communication rounds. We apply our results to give a distributed (2 + ε)-approximation algorithm for maximum-weight matching in bicoloured graphs and a centralised randomised constant-time approximation scheme for estimating the size of a stable matching.
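
The effect of truncating the proposal process can be illustrated with a sequential simulation of synchronous propose-accept rounds. This Python sketch is not the thesis's distributed formulation, only the classic Gale–Shapley procedure with a round cap; the identifiers and data layout are mine:

def truncated_gale_shapley(men_prefs, women_prefs, max_rounds):
    """men_prefs / women_prefs: dict id -> preference list of opposite-side ids."""
    rank = {w: {m: i for i, m in enumerate(p)} for w, p in women_prefs.items()}
    next_choice = {m: 0 for m in men_prefs}   # next preference index to try
    engaged_to = {}                           # woman -> man currently holding her
    for _ in range(max_rounds):
        proposals = {}
        free_men = set(men_prefs) - set(engaged_to.values())
        for m in free_men:
            if next_choice[m] < len(men_prefs[m]):
                w = men_prefs[m][next_choice[m]]
                next_choice[m] += 1
                proposals.setdefault(w, []).append(m)
        if not proposals:
            break                             # no proposals left: matching is stable
        for w, suitors in proposals.items():
            if w in engaged_to:
                suitors = suitors + [engaged_to[w]]
            engaged_to[w] = min(suitors, key=lambda man: rank[w][man])
    return engaged_to                         # partial, "almost stable" matching

Counting the blocking pairs of the matching returned after k rounds, for growing k, exhibits the behaviour described above: the matching becomes almost stable well before the full process terminates.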

Relevance:

20.00%

Abstract:

We present a distributed 2-approximation algorithm for the minimum vertex cover problem. The algorithm is deterministic, and it runs in (Δ + 1)² synchronous communication rounds, where Δ is the maximum degree of the graph. For Δ = 3, we give a 2-approximation algorithm also for the weighted version of the problem.
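
The algorithm summarized above is deterministic and distributed, but the factor-2 guarantee itself is easiest to see in the classic centralised counterpart: take both endpoints of a greedily grown maximal matching. A minimal Python sketch, not the thesis's algorithm:

def vertex_cover_2approx(edges):
    """Endpoints of a greedily grown maximal matching form a 2-approximate cover."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))   # edge is still unmatched: take both endpoints
    return cover

Any vertex cover must contain at least one endpoint of every matched edge, so the returned set is at most twice the optimum; the challenge addressed in the work above is achieving such a guarantee deterministically with only local communication.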