Abstract:
This thesis gives an overview of the use of level set methods in the field of image science. The closely related fast marching method is discussed for comparison, and the narrow band and particle level set methods are also introduced. The level set method is a numerical scheme for representing, deforming and recovering structures in arbitrary dimensions. It approximates and tracks moving interfaces, dynamic curves and surfaces. The level set method does not define how or why a boundary is advancing the way it is, but simply represents and tracks the boundary. The principal idea of the level set method is to represent an N-dimensional boundary in N+1 dimensions. This gives the generality to represent even complex boundaries. Level set methods can be powerful tools for representing dynamic boundaries, but they can require a lot of computing power. In particular, the basic level set method carries a considerable computational burden. This burden can be alleviated with more sophisticated versions of the level set algorithm, such as the narrow band level set method, or with programmable hardware implementations. A parallel approach can also be used in suitable applications. It is concluded that these methods can be used in a broad range of image applications, such as computer vision and graphics and scientific visualization, and also to solve problems in computational physics. Level set methods, and methods derived from and inspired by them, will remain at the front line of image processing in the future.
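To make the idea of representing an N-dimensional boundary in N+1 dimensions concrete, the sketch below (not taken from the thesis; grid size, speed and step count are illustrative assumptions) stores a circle as the zero level set of a signed distance function and grows it outward with a simple upwind discretisation of the level set equation phi_t + F|grad phi| = 0.

```python
# Minimal sketch (not from the thesis): a circle stored as the zero level set
# of a signed distance function phi and grown outward with constant speed F,
# i.e. the level set equation phi_t + F*|grad phi| = 0 solved with a simple
# upwind scheme. Grid size, speed and step count are illustrative assumptions.
import numpy as np

n, F = 200, 1.0                               # grid resolution and normal speed
x = np.linspace(-1.0, 1.0, n)
X, Y = np.meshgrid(x, x)
phi = np.sqrt(X**2 + Y**2) - 0.3              # signed distance to a circle of radius 0.3
h = x[1] - x[0]                               # grid spacing
dt = 0.5 * h / F                              # time step respecting the CFL condition

for _ in range(40):
    # One-sided differences for the Godunov upwind approximation of |grad phi| (F > 0)
    dxm = (phi - np.roll(phi, 1, axis=1)) / h
    dxp = (np.roll(phi, -1, axis=1) - phi) / h
    dym = (phi - np.roll(phi, 1, axis=0)) / h
    dyp = (np.roll(phi, -1, axis=0) - phi) / h
    grad = np.sqrt(np.maximum(dxm, 0.0)**2 + np.minimum(dxp, 0.0)**2 +
                   np.maximum(dym, 0.0)**2 + np.minimum(dyp, 0.0)**2)
    phi -= dt * F * grad                      # advance the whole level set field

# The evolved boundary is where phi changes sign; it is recovered implicitly.
```

Because the whole field phi is updated rather than the curve itself, merging or splitting of the boundary needs no special handling, which is the generality the abstract refers to.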
Abstract:
The thesis presents an overview of the third generation of IP telephony. The architecture of 3G IP Telephony and its components are described. The main goal of the thesis is to investigate the interface between the Call Processing Server and Multimedia IP Networks. The interface functionality, the proposed protocol stack and a general description are presented in the thesis. To provide useful services, 3G IP Telephony requires a set of control protocols for connection establishment, capabilities exchange and conference control. The Session Initiation Protocol (SIP) and H.323 are two protocols that meet these needs. In the thesis these two protocols are investigated and compared in terms of complexity, extensibility, scalability, services, resource utilization and management.
Abstract:
The objective of this Master's thesis was to select the most suitable acquisition target from among several competitors for a supplier of wood handling machinery. First, the Finnish forest industry and the forest cluster that has grown out of its competence needs were introduced, mainly from the perspective of the company in question. Next, an overview of the company's products, competitors and customers was given. The acquisition process was described, and the general motives and critical success factors were presented. In addition, the analysis of competitors and the business environment was described as a prerequisite for the company's success. The woodworking machinery market was segmented and analysed from 1990 up to the present day in order to identify the segments with development potential, i.e. areas in which the company's market share could be increased. The characteristics of the candidates were compared against the motives for the acquisition. The companies' products and geographical locations were scored so that the most suitable companies would stand out. Three companies were selected for closer examination. The companies' products, financial position and global networks were compared with each other alongside other factors such as the state of the world economy. A financially stable and technically versatile company best matched the motives for the acquisition. The target's strengths were its location, products and services. In addition, the company fits the buyer's strategy and helps to meet customers' current and future needs.
Abstract:
This work compared the structural properties of papers produced with headboxes 1 and 2. Of the structural properties of paper, formation was the most important, followed by fibre orientation. The samples were selected such that the compared samples had the best formation (base sheet) achievable with the headbox in question and equal tensile strength ratios. The second selection approach was to compare papers produced under the same flow conditions with each other. The formation of the samples was measured by beta radiography. The image scanned from the phosphor imaging plate was analysed with image analysis. The advantage of the measurement was its high resolution, which enabled a versatile set of statistics to be calculated, for example the standard deviation, skewness and kurtosis. In addition, floc size distributions were determined in both the machine and cross directions. For determining fibre orientation, the paper sample was split into layers, the layers were scanned and the images were analysed with image processing software. In the streak and fibre orientation analysis, the orientation angle, the max/min value and the anisotropy were determined. The characteristic value of the streamline analysis was the vortex size. When the coefficient of variation was used as the measure, formation was better with headbox 1 under drag and rush conditions, whereas near equal jet-to-wire speeds formation was worse. The standard deviation was smaller with headbox 1, but the differences between the headboxes evened out close to equal jet-to-wire speed. The floc size was slightly larger with headbox 1 over the whole jet-to-wire ratio range. The streamline analysis showed that the local orientation variation and the two-sidedness of the surfaces of papers produced with headbox 1 were milder than with headbox 2.
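As an illustration of the statistics mentioned above (standard deviation, skewness, kurtosis and the coefficient of variation), the following sketch shows how they could be computed from a scanned basis weight map. It is not the analysis software used in the work; the function name and the synthetic example data are assumptions.

```python
# Minimal sketch (not from the thesis): formation statistics of a scanned
# basis-weight map, e.g. standard deviation, coefficient of variation,
# skewness and kurtosis of the local grammage values. Requires numpy and scipy.
import numpy as np
from scipy import stats

def formation_statistics(basis_weight_map: np.ndarray) -> dict:
    """basis_weight_map: 2-D array of local grammage values (g/m^2)."""
    values = basis_weight_map.ravel()
    mean = values.mean()
    std = values.std(ddof=1)
    return {
        "mean": mean,
        "std": std,
        "coefficient_of_variation": std / mean,   # formation measure used in the text
        "skewness": stats.skew(values),
        "kurtosis": stats.kurtosis(values),       # excess kurtosis
    }

# Example with synthetic data standing in for a beta-radiography scan
example = np.random.default_rng(0).normal(60.0, 3.0, size=(512, 512))
print(formation_statistics(example))
```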
Abstract:
This thesis deals with distance transforms, which are a fundamental issue in image processing and computer vision. In this thesis, two new distance transforms for gray level images are presented. As a new application for distance transforms, they are applied to gray level image compression. The new distance transforms are both extensions of the well known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modification, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been made to calculate a chessboard-like distance transform with integer numbers (DTOCS) and a real-valued distance transform (EDTOCS) on gray level images. Both distance transforms, the DTOCS and the EDTOCS, require only two passes over the gray level image and are extremely simple to implement. Only two image buffers are needed: the original gray level image and the binary image which defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed. For large neighborhoods and complicated images the two-pass distance algorithm has to be applied to the image more than once, typically 3 to 10 times. Different types of kernels can be adopted. It is important to notice that no other existing transform calculates the same kind of distance map as the DTOCS. All the other gray-weighted distance function algorithms, such as GRAYMAT, find the minimum path joining two points by the smallest sum of gray levels or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way. The DTOCS gives a weighted version of the chessboard distance map; the weights are not constant, but the gray value differences of the original image. The difference between the DTOCS map and other distance transforms for gray level images is shown. The difference between the DTOCS and the EDTOCS is that the EDTOCS calculates these gray level differences in a different way: it propagates local Euclidean distances inside a kernel. Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Commonly, distance transforms are used for feature extraction in pattern recognition and learning; their use in image compression is very rare. This thesis introduces a new application area for distance transforms. Three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points that are considered fundamental for the reconstruction of the image, are selected from the gray level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as new control points, and the second group of methods compares the DTOCS distance to the binary image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally. It is shown that the time complexity of the algorithms is independent of the number of control points, i.e. of the compression ratio. Also a new morphological image decompression scheme is presented, the 8 kernels' method. Several decompressed images are presented. The best results are obtained using the Delaunay triangulation. The obtained image quality equals that of DCT images with a 4 x 4
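The two-pass propagation can be pictured with a short sketch. The following is not the thesis implementation but a generic chamfer-style forward and backward raster scan; the local step cost |G(p) - G(q)| + 1 and the function name are assumptions for illustration, and, as noted above, the passes may have to be repeated for large neighborhoods and complicated images.

```python
# Minimal sketch (not the thesis implementation): a two-pass, chamfer-style
# propagation of a gray-weighted chessboard distance, in the spirit of the
# DTOCS described above. The local step cost |G(p) - G(q)| + 1 is an
# illustrative assumption; the thesis defines the exact weighting.
import numpy as np

def two_pass_gray_distance(gray: np.ndarray, region: np.ndarray) -> np.ndarray:
    """gray: 2-D gray-level image; region: boolean mask, True where the
    distance is computed (False pixels act as zero-distance seed points)."""
    h, w = gray.shape
    dist = np.where(region, np.inf, 0.0)
    # Causal neighbour offsets for the forward and backward raster scans (8-connectivity)
    fwd = [(-1, -1), (-1, 0), (-1, 1), (0, -1)]
    bwd = [(1, 1), (1, 0), (1, -1), (0, 1)]
    for offsets, rows, cols in ((fwd, range(h), range(w)),
                                (bwd, range(h - 1, -1, -1), range(w - 1, -1, -1))):
        for y in rows:
            for x in cols:
                if not region[y, x]:
                    continue
                for dy, dx in offsets:
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        step = abs(float(gray[y, x]) - float(gray[ny, nx])) + 1.0
                        if dist[ny, nx] + step < dist[y, x]:
                            dist[y, x] = dist[ny, nx] + step
    return dist
```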
Abstract:
The objectives of this research work, “Identification of the Emerging Issues in Recycled Fiber processing”, are to discover emerging research issues and to present new approaches for identifying promising research themes in recovered paper application and production. The proposed approach consists of identifying technological problems often encountered in wastepaper preparation processes, improving the quality of recovered paper, and increasing its proportion in the composition of paper and board. The source of information for the problem retrieval is scientific publications in which waste paper application and production were discussed. The study exploited several research methods to understand the changes related to the utilization of recovered paper. All the assembled data was carefully studied and categorized using the RefViz and CiteSpace software. Suggestions were made on the classes of these problems that need further investigation in order to propose emerging research trends in recovered paper.
Abstract:
Dirt counting and dirt particle characterisation of pulp samples is an important part of quality control in pulp and paper production. There is also a critical need for an automatic image analysis system that can handle dirt particle characterisation in various pulp samples. However, existing image analysis systems utilise a single threshold to segment the dirt particles in different pulp samples, which limits their precision. Based on this evidence, designing an automatic image analysis system that can overcome this deficiency is very useful. In this study, a further developed Niblack thresholding method is proposed; the method defines the threshold based on the number of segmented particles. In addition, Kittler thresholding is utilised. Both of these thresholding methods can determine the dirt count of the different pulp samples accurately when compared to visual inspection and the Digital Optical Measuring and Analysis System (DOMAS). In addition, the minimum resolution needed for acquiring a scanner image is defined. Among the dirt particle features considered, curl shows a sufficient difference to discriminate between bark and fibre bundles in different pulp samples. Three classifiers, k-Nearest Neighbour, Linear Discriminant Analysis and Multi-layer Perceptron, are utilised to categorize the dirt particles. Linear Discriminant Analysis and Multi-layer Perceptron are the most accurate in classifying the dirt particles segmented by Kittler thresholding with morphological processing. The results show that the dirt particles are successfully categorized as bark and fibre bundles.
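For reference, the classical Niblack rule that the proposed method builds on computes a local threshold T = m + k*s from the local mean m and standard deviation s in a sliding window. The sketch below shows this standard rule only (window size and k are illustrative assumptions); the method described in the abstract goes further by choosing the threshold based on the number of segmented particles, which is not shown here.

```python
# Minimal sketch of the standard Niblack rule (not the modified method of the
# thesis): local threshold T = local mean + k * local standard deviation.
# Window size and k are illustrative assumptions. Requires numpy and scipy.
import numpy as np
from scipy.ndimage import uniform_filter

def niblack_threshold(image: np.ndarray, window: int = 25, k: float = -0.2) -> np.ndarray:
    """Return a boolean mask of pixels darker than the local Niblack threshold."""
    img = image.astype(float)
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img**2, size=window)
    std = np.sqrt(np.maximum(mean_sq - mean**2, 0.0))
    threshold = mean + k * std
    return img < threshold        # dark dirt particles on a lighter pulp background
```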
Abstract:
The ability of the supplier firm to generate and utilise customer-specific knowledge has attracted increasing attention in the academic literature during the last decade. It has been argued that customer knowledge should be treated as a strategic asset, like any other intangible asset. Yet, at the same time, it has been shown that the management of customer-specific knowledge is challenging in practice, and that many firms are better at acquiring customer knowledge than at making use of it. This study examines customer knowledge processing in the context of key account management in large industrial firms. This focus was chosen because key accounts are demanding and complex. It is not unusual for a single key account relationship to constitute a complex web of relationships between the supplier and the key account, easily leading to the dispersion of customer-specific knowledge within the supplier firm. Although the importance of customer-specific knowledge generation has been widely acknowledged in the literature, surprisingly little attention has been paid to the processes through which firms generate, disseminate and use such knowledge internally for enhancing the relationships with their major, strategically important key account customers. This thesis consists of two parts. The first part comprises a theoretical overview and draws together the main findings of the study, whereas the second part consists of five complementary empirical research papers based on survey data gathered from large industrial firms in Finland. The findings suggest that the management of customer knowledge generated about and from key accounts is a three-dimensional process consisting of acquisition, dissemination and utilisation. It can be concluded from the results that customer-specific knowledge is a strategic asset, because the supplier's customer knowledge processing activities have a positive effect on the supplier's key account performance. Moreover, in examining the determinants of each phase separately, the study identifies a number of intra-organisational factors that facilitate the process in supplier firms. The main contribution of the thesis lies in linking the concept of customer knowledge processing to the previous literature on key account management. Moreover, given that this literature is mainly conceptual or case-based, a further contribution is to examine its consequences and determinants based on quantitative empirical data.
Abstract:
In dentistry, yttrium partially stabilized zirconia (ZrO2) has become one of the most attractive ceramic materials for prosthetic applications. The aim of this series of studies was to evaluate whether certain treatments used in the manufacturing process, such as sintering time, color shading or heat treatment of zirconia, affect the material properties. Another aim was to evaluate the load-bearing capacity and marginal fit of manually copy-milled custom-made versus prefabricated commercially available zirconia implant abutments. Mechanical properties such as flexural strength and surface microhardness were determined for green-stage milled and sintered yttrium partially stabilized zirconia after different sintering times, coloring processes and heat treatments. Scanning electron microscopy (SEM) was used for analyzing the possible changes in the surface structure of the zirconia material after reduced sintering time, coloring and heat treatments. A possible phase change from the tetragonal to the monoclinic phase was evaluated by X-ray diffraction analysis (XRD). The load-bearing capacity of different implant abutments was measured and the fit between abutment and implant replica was examined with SEM. The results of these studies showed that the shorter sintering time or the thermocycling did not affect the strength or surface microhardness of zirconia. Coloring of zirconia decreased strength compared to un-colored control zirconia, and some of the colored zirconia specimens also showed a decrease in surface microhardness. Coloring also affected the dimensions of zirconia: significantly decreased shrinkage was found for colored zirconia specimens during sintering. Heat treatment of zirconia did not seem to affect the material's mechanical properties, but when a thin coating of wash and glaze porcelain was fired on the tensile side of the disc, the flexural strength decreased significantly. Furthermore, it was found that thermocycling increased the monoclinic phase on the surface of the zirconia. Color shading or heat treatment did not seem to affect phase transformation, but small monoclinic peaks were detected on the surface of the heat-treated specimens with a thin coating of wash and glaze porcelain on the opposite side. Custom-made zirconia abutments showed a load-bearing capacity comparable to that of the prefabricated commercially available zirconia abutments. However, the fit of the custom-made abutments was less satisfactory than that of the commercially available abutments. These studies suggest that zirconia is a durable material and that treatments other than color shading used in the manufacturing process of the zirconia bulk material do not affect the material's strength. The decrease in strength and the dimensional changes after color shading need to be taken into account when fabricating zirconia substructures for fixed dental prostheses. Manually copy-milled custom-made abutments have acceptable load-bearing capacity, but the marginal accuracy has to be evaluated carefully.
Abstract:
Knowledge of the behaviour of cellulose, hemicelluloses, and lignin during wood and pulp processing is essential for understanding and controlling the processes. Determination of monosaccharide composition gives information about the structural polysaccharide composition of wood material and helps when determining the quality of fibrous products. In addition, monitoring of the acidic degradation products gives information on the extent of degradation of lignin and polysaccharides. This work describes two capillary electrophoretic methods developed for the analysis of monosaccharides and for the determination of aliphatic carboxylic acids from alkaline oxidation solutions of lignin and wood. Capillary electrophoresis (CE), in its many variants, is an alternative separation technique to chromatographic methods. In capillary zone electrophoresis (CZE) the fused silica capillary is filled with an electrolyte solution. An applied voltage generates a field across the capillary. The movement of the ions under the electric field is based on the charge and hydrodynamic radius of the ions. Carbohydrates contain hydroxyl groups that are ionised only in strongly alkaline conditions. After ionisation, the structures are suitable for electrophoretic analysis and identification through either indirect UV detection or electrochemical detection. The current work presents a new capillary zone electrophoretic method, relying on in-capillary reaction and direct UV detection at the wavelength of 270 nm. The method has been used for the simultaneous separation of neutral carbohydrates, including mono- and disaccharides and sugar alcohols. The in-capillary reaction produces negatively charged and UV-absorbing compounds. The optimised method was applied to real samples. The methodology is fast since no sample preparation other than dilution is required. A new method for aliphatic carboxylic acids in highly alkaline process liquids was also developed. The goal was to develop a method for the simultaneous analysis of the dicarboxylic acids, hydroxy acids and volatile acids that are oxidation and degradation products of lignin and wood polysaccharides. The CZE method was applied to three process cases. First, the fate of lignin under alkaline oxidation conditions was monitored by determining the level of carboxylic acids in process solutions. In the second application, the degradation of spruce wood by alkaline and catalysed alkaline oxidation was compared by determining carboxylic acids in the process solutions. In addition, the effectiveness of membrane filtration and preparative liquid chromatography in the enrichment of hydroxy acids from black liquor was evaluated by analysing the effluents with capillary electrophoresis.
Abstract:
Contemporary organisations have to embrace the notion of doing ‘more with less’. This challenges knowledge production within companies and public organisations, forcing them to reorganise their structures and rethink what knowledge production actually means in the context of innovation and how knowledge is actually produced among various professional groups within the organisation in their everyday actions. Innovations are vital for organisational survival, and ‘ordinary’ employees and customers are central but too often ignored producers of knowledge for contemporary organisations. Broader levels of participation and reflexive practices are needed. This dissertation discusses the missing links between innovation research conducted in the context of industrial management, arts, and culture; applied drama and theatre practices (specifically post-Boalian approaches); and learning, especially organising reflection, in organisational settings. This dissertation (1) explores and extends the role of research-based theatre (RBT) in organising reflection and reflexive practices in the context of practice-based innovation, (2) develops a reflexive model of RBT for investigating and developing practice-based organisational process innovations in order to contribute to the development of a tool for innovation management and analysis, and (3) operationalises this model within private- and public-sector organisations. The proposed novel reflexive model of research-based theatre for investigating and developing practice-based organisational process innovations extends existing methods and offers a different way of organising reflection and reflexive practices in the context of general innovation management. The model was developed through five participatory action research processes conducted in four different organisations. The results provide learning steps, a reflection path, for understanding complex organisational life, people, and relations amid renewal and change actions. The proposed model provides a new approach to organising and cultivating reflexivity in practice-based innovation activities via research-based theatre. The results can be utilised as a guideline when processing practice-based innovation within private or public organisations. The model helps innovation managers to construct, together with their employees, temporary communities where they can learn together by reflecting on their own and each other's experiences and to break down assumptions related to their own perspectives. The results include recommendations for practical development steps applicable in various organisations with regard to (i) the application of research-based theatre and (ii) related general innovation management. The dissertation thus contributes to the development of novel learning approaches in knowledge production. Keywords: practice-based innovation, research-based theatre, learning, reflection, mode 2b knowledge production
Abstract:
Presentation at the Nordic Perspectives on Open Access and Open Science seminar, Helsinki, October 15, 2013
Abstract:
Chaotic behaviour is one of the hardest problems that can occur in nonlinear dynamical systems with severe nonlinearities. It makes the system's responses unpredictable and causes them to behave similarly to noise, so in some applications it should be avoided. One approach to detecting chaotic behaviour is finding the Lyapunov exponent by examining the dynamical equation of the system, which requires a model of the system. The goal of this study is the diagnosis of chaotic behaviour by exploring only the data (signal), without using any dynamical model of the system. In this work two methods are tested on time series data collected from AMB (Active Magnetic Bearing) system sensors. The first method finds the largest Lyapunov exponent using the Rosenstein method. The second method is a 0-1 test for identifying chaotic behaviour. These two methods are used to detect whether the data is chaotic. Using the Rosenstein method requires finding the minimum embedding dimension, for which the Cao method is used. The Cao method gives not only the minimum embedding dimension but also the order of the nonlinear dynamical equation of the system, and it shows how much the system's signals are corrupted by noise. At the end of this research, a test called the runs test is introduced to show that the data is not excessively noisy.
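For orientation, the sketch below implements the standard Gottwald-Melbourne form of the 0-1 test on a scalar time series; it is not the thesis code, and the choice of random frequencies, the displacement cutoff and the logistic-map example are illustrative assumptions. K close to 1 suggests chaotic dynamics, K close to 0 regular dynamics.

```python
# Minimal sketch (not the thesis code) of the Gottwald-Melbourne 0-1 test for chaos.
import numpy as np

def zero_one_test(series: np.ndarray, n_freq: int = 20) -> float:
    x = np.asarray(series, dtype=float)
    x = x - x.mean()                                  # remove the mean to suppress the oscillatory term
    N = len(x)
    n_max = N // 10                                   # mean square displacement evaluated up to N/10
    j = np.arange(1, N + 1)
    ks = []
    for c in np.random.default_rng(0).uniform(0.1 * np.pi, 0.9 * np.pi, n_freq):
        p = np.cumsum(x * np.cos(j * c))              # translation variables p(n), q(n)
        q = np.cumsum(x * np.sin(j * c))
        n = np.arange(1, n_max + 1)
        M = np.array([np.mean((p[d:] - p[:-d])**2 + (q[d:] - q[:-d])**2) for d in n])
        ks.append(np.corrcoef(n, M)[0, 1])            # K_c: correlation of M(n) with n
    return float(np.median(ks))                       # median over the random frequencies

# Example: the logistic map in its chaotic regime should give K close to 1
x = np.empty(3000)
x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 3.9 * x[i - 1] * (1.0 - x[i - 1])
print(zero_one_test(x))
```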