13 results for classical integral transforms
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
This thesis studies gray-level distance transforms, particularly the Distance Transform on Curved Space (DTOCS). The transform is produced by calculating distances on a gray-level surface. The DTOCS is improved by defining more accurate local distances and by developing a faster transformation algorithm. The Optimal DTOCS enhances the locally Euclidean Weighted DTOCS (WDTOCS) with local distance coefficients which minimize the maximum error from the Euclidean distance in the image plane and produce more accurate global distance values. Convergence properties of the traditional mask operation, or sequential local transformation, and of the ordered propagation approach are analyzed and compared to the new, efficient priority pixel queue algorithm. The Route DTOCS algorithm developed in this work can be used to find and visualize shortest routes between two points, or two point sets, along a varying-height surface. In a digital image there can be several paths sharing the same minimal length, and the Route DTOCS visualizes them all. A single optimal path can be extracted from the route set using a simple backtracking algorithm. A new extension of the priority pixel queue algorithm produces the nearest neighbor transform, or Voronoi or Dirichlet tessellation, simultaneously with the distance map. The transformation divides the image into regions so that each pixel belongs to the region surrounding the reference point that is nearest according to the distance definition used. Applications and application ideas for the DTOCS and its extensions are presented, including obstacle avoidance, image compression and surface roughness evaluation.
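The priority-pixel-queue idea can be made concrete with a minimal sketch (Python with numpy and heapq assumed). The local step cost 1 + |gray-level difference| is an illustrative choice, not the exact DTOCS or Optimal DTOCS coefficients:

```python
import heapq
import numpy as np

def graylevel_distance_map(gray, seeds):
    """Dijkstra-style priority pixel queue propagation on a gray-level surface.

    gray  : 2-D array of gray values (the "height" surface)
    seeds : iterable of (row, col) reference points with distance 0
    Local step cost is 1 + |gray difference| between 8-neighbors, an
    illustrative stand-in for the DTOCS local distances.
    """
    dist = np.full(gray.shape, np.inf)
    heap = []
    for r, c in seeds:
        dist[r, c] = 0.0
        heapq.heappush(heap, (0.0, r, c))

    neighbors = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                 (0, 1), (1, -1), (1, 0), (1, 1)]
    while heap:
        d, r, c = heapq.heappop(heap)
        if d > dist[r, c]:          # stale queue entry, already improved
            continue
        for dr, dc in neighbors:
            nr, nc = r + dr, c + dc
            if 0 <= nr < gray.shape[0] and 0 <= nc < gray.shape[1]:
                nd = d + 1.0 + abs(float(gray[nr, nc]) - float(gray[r, c]))
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    heapq.heappush(heap, (nd, nr, nc))
    return dist
```

Recording, during the same sweep, which seed last relaxed each pixel yields the nearest neighbor (Voronoi) labelling mentioned in the abstract alongside the distance map.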
Abstract:
The objective of this thesis is to study wavelets and their role in turbulence applications. Under scrutiny in the thesis is the intermittency in turbulence models. Wavelets are used as a mathematical tool to study the intermittent activities that turbulence models produce. The first section gives a general introduction to wavelets and wavelet transforms as a mathematical tool. Moreover, the basic properties of turbulence are discussed and classical methods for modeling turbulent flows are explained. Wavelets are implemented to model the turbulence as well as to analyze turbulent signals. The model studied here is the GOY (Gledzer 1973, Ohkitani & Yamada 1989) shell model of turbulence, which is a popular model for explaining intermittency based on the cascade of kinetic energy. The goal is to introduce a better quantification method for the intermittency obtained in a shell model. Wavelets are localized in both space (time) and scale; therefore, they are suitable candidates for the study of the singular bursts that interrupt the calm periods of an energy flow through various scales. The study concerns two questions, namely the frequency of occurrence and the intensity of the singular bursts at various Reynolds numbers. The results indicate that singularities become more local as the Reynolds number increases. The singularities also become more local when the shell number is increased at a fixed Reynolds number. The study revealed that the singular bursts are more frequent at Re ~ 10^7 than in the other cases with lower Re. The intermittency of bursts for the cases with Re ~ 10^6 and Re ~ 10^5 was similar, but for the case with Re ~ 10^4 the bursts occurred after long waiting times in a different fashion, so that the behavior could not be scaled with the higher-Re cases.
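How a wavelet transform localizes such bursts in both time and scale can be illustrated with a short numpy-only sketch. The toy signal, the Mexican-hat wavelet and the scales are illustrative assumptions, not the GOY shell-model variables:

```python
import numpy as np

def mexican_hat(t, s):
    """Mexican-hat (Ricker) wavelet at scale s, L2-normalized."""
    x = t / s
    return (2.0 / (np.sqrt(3.0 * s) * np.pi ** 0.25)) * (1.0 - x ** 2) * np.exp(-x ** 2 / 2.0)

def cwt(signal, scales, dt=1.0):
    """Continuous wavelet transform by direct convolution (illustrative only)."""
    n = len(signal)
    t = (np.arange(n) - n // 2) * dt
    coeffs = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        psi = mexican_hat(t, s)
        coeffs[i] = np.convolve(signal, psi[::-1], mode="same") * dt
    return coeffs

# Toy intermittent signal: a calm background interrupted by short bursts.
rng = np.random.default_rng(0)
sig = 0.1 * rng.standard_normal(2048)
for pos in (500, 1300, 1700):
    sig[pos:pos + 5] += 3.0

coeffs = cwt(sig, scales=[2, 4, 8, 16, 32])
# Large |coefficients| at small scales mark where in time the bursts occur.
burst_locations = np.argwhere(np.abs(coeffs[0]) > 5 * np.abs(coeffs[0]).std())
```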
Abstract:
Cataloguing in progress
Abstract:
This thesis deals with distance transforms, which are a fundamental topic in image processing and computer vision. In this thesis, two new distance transforms for gray-level images are presented. As a new application for distance transforms, they are applied to gray-level image compression. The new distance transforms are both extensions of the well-known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modification, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been made to calculate a chessboard-like distance transform with integer numbers (DTOCS) and a real-valued distance transform (EDTOCS) on gray-level images. Both distance transforms, the DTOCS and EDTOCS, require only two passes over the gray-level image and are extremely simple to implement. Only two image buffers are needed: the original gray-level image and the binary image which defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed. For large neighborhoods and complicated images the two-pass distance algorithm has to be applied to the image more than once, typically 3-10 times. Different types of kernels can be adopted. It is important to notice that no other existing transform calculates the same kind of distance map as the DTOCS. All the other gray-weighted distance function algorithms (GRAYMAT etc.) find the minimum path joining two points by the smallest sum of gray levels or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way. The DTOCS gives a weighted version of the chessboard distance map. The weights are not constant, but the gray value differences of the original image. The difference between the DTOCS map and other distance transforms for gray-level images is shown. The difference between the DTOCS and EDTOCS is that the EDTOCS calculates these gray-level differences in a different way: it propagates local Euclidean distances inside a kernel. Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Commonly, distance transforms are used for feature extraction in pattern recognition and learning; their use in image compression is very rare. This thesis introduces a new application area for distance transforms. Three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points that are considered fundamental for the reconstruction of the image, are selected from the gray-level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as new control points, and the second group of methods compares the DTOCS distance to the binary-image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally. It is shown that the time complexity of the algorithms is independent of the number of control points, i.e. the compression ratio. Also a new morphological image decompression scheme is presented, the 8 kernels' method. Several decompressed images are presented. The best results are obtained using the Delaunay triangulation. The obtained image quality equals that of the DCT images with a 4 x 4
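The two-pass (forward/backward raster-scan) structure described above can be sketched briefly in Python. The local cost 1 + |gray difference| and the neighbor masks are illustrative assumptions in the spirit of the chessboard-kernel DTOCS, not the exact thesis definitions:

```python
import numpy as np

def dtocs_two_pass(gray, region, max_iters=10):
    """Sequential (raster-scan) gray-level distance transform sketch.

    gray   : 2-D gray-level image (the surface heights)
    region : boolean mask; True pixels get distance 0 and distances are
             propagated over the rest of the image
    Only two buffers are used: the gray image and the distance map built
    from the region mask. Complicated images may need a few repeated
    passes (typically 3-10) until nothing changes.
    """
    rows, cols = gray.shape
    dist = np.where(region, 0.0, np.inf)

    fwd = [(-1, -1), (-1, 0), (-1, 1), (0, -1)]   # neighbors already visited
    bwd = [(1, 1), (1, 0), (1, -1), (0, 1)]        # in forward / backward scans

    for _ in range(max_iters):
        changed = False
        for scan, row_order in ((fwd, range(rows)), (bwd, range(rows - 1, -1, -1))):
            col_order = range(cols) if scan is fwd else range(cols - 1, -1, -1)
            for r in row_order:
                for c in col_order:
                    for dr, dc in scan:
                        nr, nc = r + dr, c + dc
                        if 0 <= nr < rows and 0 <= nc < cols:
                            cand = dist[nr, nc] + 1.0 + abs(float(gray[r, c]) - float(gray[nr, nc]))
                            if cand < dist[r, c]:
                                dist[r, c] = cand
                                changed = True
        if not changed:
            break
    return dist
```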
Abstract:
Identification of the order of an autoregressive moving average (ARMA) model by the usual graphical method is subjective. Hence, there is a need to develop a technique that identifies the order without a graphical investigation of the series autocorrelations. To avoid subjectivity, this thesis focuses on determining the order of the ARMA model using Reversible Jump Markov Chain Monte Carlo (RJMCMC). The RJMCMC selects the model from a set of candidate models on the basis of goodness of fit, the standard deviation of the errors, and the acceptance frequency. Together with a deep analysis of the classical Box-Jenkins modeling methodology, the integration with MCMC algorithms is examined through parameter estimation and model fitting of ARMA models. This helps to verify how well the MCMC algorithms can treat ARMA models, by comparing the results with the graphical method. The MCMC approach was seen to produce better results than the classical time series approach.
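The thesis itself uses RJMCMC; as a much simpler, non-Bayesian illustration of automatic order selection (assuming statsmodels is available), candidate ARMA(p, q) models can be fitted and ranked by an information criterion instead of reading ACF/PACF plots:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import arma_generate_sample

# Simulate an ARMA(2, 1) series (illustrative parameters, given in
# statsmodels' lag-polynomial convention with the leading 1 included).
rng = np.random.default_rng(1)
ar_poly = np.array([1.0, -0.6, 0.2])
ma_poly = np.array([1.0, 0.4])
y = arma_generate_sample(ar_poly, ma_poly, nsample=500,
                         distrvs=rng.standard_normal)

# Fit every candidate order and rank by AIC instead of eyeballing plots.
results = []
for p in range(4):
    for q in range(4):
        fit = ARIMA(y, order=(p, 0, q)).fit()
        results.append((fit.aic, p, q))

best_aic, best_p, best_q = min(results)
print(f"selected order: ARMA({best_p}, {best_q}), AIC = {best_aic:.1f}")
```

This AIC ranking is only a stand-in for the reversible-jump sampler, which instead moves between model dimensions and reports posterior support for each (p, q).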
Abstract:
This thesis studies properties of transforms based on parabolic scaling, such as the curvelet, contourlet, shearlet and Hart-Smith transforms. Essentially, two different questions are considered: how these transforms can characterize Hölder regularity, and how the non-linear approximation of a piecewise smooth function converges. In the study of Hölder regularity, several theorems are presented that relate the regularity of a function f : R^2 → R to the decay properties of its transform. Of particular interest is the case where a function has lower regularity along some line segment than elsewhere. Theorems that give estimates for the direction and location of this line, and for the regularity of the function, are presented. Numerical demonstrations suggest that similar theorems would also hold for more general shapes of the segment of low regularity. Theorems related to uniform and pointwise Hölder regularity are presented as well. Although none of the theorems presented gives a full characterization of regularity, the sufficient and necessary conditions are very similar. Another theme of the thesis is the convergence of the non-linear M-term approximation of functions that are discontinuous along some curves and otherwise smooth. Under particular smoothness assumptions, it is well known that the squared L^2 approximation error is O(M^-2 (log M)^3) for curvelet, shearlet or contourlet bases. Here it is shown that, assuming higher smoothness, the log-factor can be removed, even if the function is still discontinuous.
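The mechanics of non-linear M-term approximation (keep the M largest-magnitude coefficients, zero the rest, measure the squared L^2 error) can be sketched with PyWavelets. A separable wavelet basis is used here purely for illustration; the O(M^-2 (log M)^3) rate above requires parabolic-scaling systems such as curvelets or shearlets, for which no standard library is assumed:

```python
import numpy as np
import pywt

def m_term_error(image, M, wavelet="haar", level=4):
    """Squared L2 error of the best M-term approximation in a wavelet basis."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    flat = np.abs(arr).ravel()
    if M < flat.size:
        threshold = np.partition(flat, flat.size - M)[flat.size - M]
        arr = np.where(np.abs(arr) >= threshold, arr, 0.0)   # keep ~M largest
    approx = pywt.waverec2(
        pywt.array_to_coeffs(arr, slices, output_format="wavedec2"), wavelet)
    approx = approx[:image.shape[0], :image.shape[1]]
    return float(np.sum((image - approx) ** 2))

# Piecewise smooth test image: smooth background plus a jump along a curve.
x, y = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
img = np.sin(3 * x) + np.cos(2 * y) + (y > 0.3 + 0.2 * np.sin(4 * x)).astype(float)

for M in (100, 400, 1600, 6400):
    print(M, m_term_error(img, M))
```

Plotting the printed errors against M on a log-log scale gives the empirical decay rate that the theorems in the thesis describe for parabolic-scaling transforms.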
Abstract:
The objective of this Bachelor's Thesis is to find out the role of social media in the B-to-B marketing environment of the information technology industry and to discover how IT firms utilize social media as a part of their customer reference marketing. To reach these objectives, the concepts of customer reference marketing and social media are defined. Customer reference marketing can be characterized as one of the most practically relevant but academically relatively overlooked ways in which a company can leverage its customers and delivered solutions and use them as references in its marketing activities. We cover the external and internal functions of customer references that contribute to the growth and performance of B-to-B firms. We also address the three mechanisms of customer reference marketing, which are 'status transfer', 'validation through testimonials' and 'demonstration of experience and prior performance'. The concept of social media stands for social interaction and the creation of user-based content that occurs exclusively through the Internet. Social media are excellent tools for networking because of fast and easy access, easy interaction and a vast range of multimedia features. The categorization of social media services is also outlined. The case company helps clarify the specific characteristics of social media usage as part of customer-reference-marketing activities. For IT firms, the best channels for utilizing social media in their customer reference marketing activities are content publishing and distribution services and networking services.
Abstract:
In this Thesis I discuss the dynamics of the quantum Brownian motion model in a harmonic potential. This paradigmatic model has an exact solution, making it possible to treat even the non-Markovian dynamics analytically. The issues covered in this Thesis are themed around decoherence. First, I consider decoherence as the mediator of the quantum-to-classical transition. I examine five different definitions of the nonclassicality of quantum states, and show how each definition gives qualitatively different times for the onset of classicality. In particular, I have found that all characterizations of nonclassicality, apart from the one based on the interference term in the Wigner function, result in a finite, rather than asymptotic, time for the emergence of classicality. Second, I examine the diverse effects that coupling to a non-Markovian, structured reservoir has on the system. By comparing different types of Ohmic reservoirs, I derive some general conclusions on the role of the reservoir spectrum in both the short-time and the thermalization dynamics. Finally, I apply these results to two schemes for decoherence control. Both methods are based on the non-Markovian properties of the dynamics.
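The Wigner-function-based view of nonclassicality mentioned above can be illustrated with a short QuTiP sketch that computes the Wigner negativity of a cat state. The state, grid and negativity measure are illustrative assumptions, not the specific quantum-Brownian-motion state or the exact definitions used in the thesis:

```python
import numpy as np
from qutip import coherent, wigner

# Even "cat" superposition of two coherent states (illustrative only).
N, alpha = 40, 2.0
psi = (coherent(N, alpha) + coherent(N, -alpha)).unit()

# Wigner function on a phase-space grid and its negativity,
# delta = integral of (|W| - W) dx dp, a common nonclassicality indicator:
# the interference fringes between the two coherent peaks drive W below zero.
xvec = np.linspace(-6, 6, 301)
W = wigner(psi, xvec, xvec)
dxdp = (xvec[1] - xvec[0]) ** 2
negativity = np.sum(np.abs(W) - W) * dxdp
print(f"Wigner negativity: {negativity:.3f}")   # > 0 signals nonclassicality
```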
Abstract:
We live in an age where rationalization and demands of efficiency taint every aspect of our lives, both as individuals and as a society. Even warfare cannot escape the increased speed of human interaction. Time is a resource to be managed; it has to be optimized, saved and won in military affairs as well. The purpose of this research paper is to analyze the dogmatic texts of military thought in search of answers to what the classics of strategy saw in the interrelations of temporality and warfare, and whether their thoughts remain meaningful in the contemporary conjuncture. Since the way a society functions is reflected in the way it conducts its wars, there naturally are differences between agrarian, industrial and information societies. Theorists of different eras emphasize things specific to their times, but warfare, like any human interaction, is always bounded by temporality. Not only is the pace of warfare dependent on the progress of the society, but time permeates warfare in all its aspects. This research paper focuses on two specific topics that arose from the texts themselves: how time should be managed and manipulated in warfare, and how to economize and “win” it from the enemy. A method in which lengthy quotations are used to illustrate the main points of the strategists has been chosen for this research paper. While Clausewitz is the most prominent source of quotations, thoughts from ancient India and China are represented as well to prove that the combination of the right force in the right place at the right time is still the way of the victorious. Tactics change in the course of time, but the principles of strategy remain unaltered and are only adapted to suit new situations. While ancient and pre-modern societies focused on finding auspicious moments for battle in the flow of kronos-time, based on divinities, portents and auguries, we can trace elements of the manipulation of time in warfare from the earliest surviving texts. While time as a fourth dimension of the battlespace emerged only in the modern era, it has had a profound meaning throughout the history of military thought. In the past, time could be squandered; today it always has to be won. This paper asks the question “why”.