40 results for post-processing method


Relevance:

80.00%

Publisher:

Abstract:

This thesis surveys the sludge produced in Southeastern Finland, the companies that produce it, and both the current treatment methods and those still in development. Of the industrial waste sludge, 85% comes from the forest industry. Sludge from municipal wastewater treatment plants is mostly used as a raw material for bioplants, while sludge from the forest industry is mostly incinerated. New circulation methods increase the recycling value of the waste by creating new products, but they still lack a full-scale plant. Political pressure from Europe and the policies of the Finnish government will drive the circular economy forward, thus broadening the processing options for waste slurries. The work is divided into two parts: the first contains the survey, and the second contains the experimental part and the operational methods based on the survey. In the experimental part, wet hard sheet waste sludge was dewatered with a shaking filter, and applications for waste sludge from a cellulose factory were considered. The results show that the wet hard sheet waste sludge can be dewatered to a total solids content high enough for the intended use. However, the cellulose waste sludge has too high a Cd content in almost all of the batches to be used for land improvement.

Relevance:

80.00%

Publisher:

Abstract:

Laser scribing is currently a growing material processing method in industry; its benefits are being studied, for example, for improving the efficiency of solar cells. Because of the high quality requirements of the fast scribing process, it is important to monitor the process in real time to detect possible defects, yet studies of real-time monitoring of laser scribing are scarce. Commonly used monitoring methods developed for other laser processes, such as laser welding, are too slow, and existing applications cannot be transferred to fast laser scribing monitoring. The aim of this thesis is to find a method for monitoring laser scribing with a high-speed camera and to evaluate the reliability and performance of the developed monitoring system experimentally. The laser used in the experiments is an IPG ytterbium pulsed fiber laser with a 20 W maximum average power, and the scan head optics is Scanlab's Hurryscan 14 II with an f100 telecentric lens. The camera was connected to the laser scanner with a camera adapter to follow the laser process, and a powerful, fully programmable industrial computer was chosen to execute the image processing and analysis. Algorithms for defect analysis, based on particle analysis, were developed using LabVIEW system design software. The performance of the algorithms was assessed by analyzing a non-moving image of the scribing line at a resolution of 960×20 pixels; the maximum analysis speed was 560 frames per second. The reliability of the algorithm was evaluated by imaging a scribing path with a variable number of defects at 2000 mm/s with the laser turned off, at an image analysis speed of 430 frames per second. The experiment was successful: the algorithms detected all defects in the scribing path. A final monitoring experiment was performed during a laser process.
However, it was challenging to make the active laser illumination work with the laser scanner because of the physical dimensions of the laser lens and the scanner; for reliable defect detection, the illumination system needs to be replaced.
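The particle-analysis idea behind such defect detection can be sketched outside LabVIEW as well. The snippet below is a hypothetical NumPy/SciPy re-sketch, not the thesis implementation: it thresholds a 960×20 frame of the scribing line and counts connected dark regions large enough to count as defects (the threshold and minimum area are illustrative assumptions).

```python
import numpy as np
from scipy import ndimage

def find_defects(frame, threshold=128, min_area=3):
    # Binarize: pixels darker than the threshold are candidate defects
    # (gaps in the bright scribed line).
    mask = frame < threshold
    # Particle analysis: label connected components, then keep only
    # those whose pixel count reaches min_area.
    labels, n = ndimage.label(mask)
    areas = np.asarray(ndimage.sum(mask, labels, index=range(1, n + 1)))
    return int(np.sum(areas >= min_area))

# Synthetic 960x20 frame: a bright line with two dark gaps (defects).
frame = np.full((20, 960), 255, dtype=np.uint8)
frame[8:12, 100:110] = 0   # defect 1
frame[8:12, 500:504] = 0   # defect 2
print(find_defects(frame))  # -> 2
```

In a real pipeline, each high-speed camera frame would pass through this routine, with the per-frame runtime determining the achievable analysis rate.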

Relevance:

80.00%

Publisher:

Abstract:

The overwhelming amount and unprecedented speed of publication in the biomedical domain make it difficult for life science researchers to acquire and maintain a broad view of the field and to gather all the information relevant to their research. In response to this problem, the BioNLP (Biomedical Natural Language Processing) research community has emerged, striving to assist life science researchers by developing modern natural language processing (NLP), information extraction (IE) and information retrieval (IR) methods that can be applied at large scale to scan the whole publicly available biomedical literature and to extract and aggregate the information found within, while automatically normalizing the variability of natural language statements. Among the different tasks, biomedical event extraction has recently received much attention within the BioNLP community. Biomedical event extraction is the identification of biological processes and interactions described in the biomedical literature and their representation as a set of recursive event structures. The 2009–2013 series of BioNLP Shared Tasks on Event Extraction has given rise to a number of event extraction systems, several of which have been applied at large scale (the full set of PubMed abstracts and PubMed Central Open Access full-text articles), leading to the creation of massive biomedical event databases, each containing millions of events. Since the top-ranking event extraction systems are based on machine learning and are trained on narrow-domain, carefully selected Shared Task training data, their performance drops when faced with the topically highly varied PubMed and PubMed Central documents. Specifically, false-positive predictions by these systems lead to the generation of incorrect biomolecular events, which are spotted by the end users.
This thesis proposes a novel post-processing approach, utilizing a combination of supervised and unsupervised learning techniques, that can automatically identify and filter out a considerable proportion of incorrect events from large-scale event databases, thus increasing their general credibility. The second part of the thesis is dedicated to a system we developed for hypothesis generation from large-scale event databases, which is able to discover novel biomolecular interactions among genes and gene products. We cast the hypothesis generation problem as supervised network topology prediction, i.e. predicting new edges in the network, as well as the types and directions of these edges, utilizing a set of features that can be extracted from large biomedical event networks. Routine machine learning evaluation results, as well as manual evaluation, suggest that the problem is indeed learnable. This work won the Best Paper Award at the 5th International Symposium on Languages in Biology and Medicine (LBM 2013).
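The network-topology-prediction idea can be illustrated with a toy sketch. The thesis trains a supervised model on features extracted from large event networks; the hypothetical code below (node names and scoring rule are illustrative, not from the thesis) computes two classic link-prediction features, common-neighbour count and Jaccard overlap, and ranks candidate non-edges by them, standing in for the learned ranking.

```python
from itertools import combinations

def edge_features(adj, u, v):
    """Topology features for a candidate edge (u, v): common-neighbour
    count and Jaccard overlap of the two neighbourhoods."""
    nu, nv = adj.get(u, set()), adj.get(v, set())
    common = len(nu & nv)
    union = len(nu | nv)
    return common, (common / union if union else 0.0)

# Toy event network over genes/gene-products (symmetric adjacency).
adj = {
    "A": {"B", "C", "D"},
    "B": {"A", "C"},
    "C": {"A", "B", "D", "E"},
    "D": {"A", "C"},
    "E": {"C"},
}
# Rank non-edges by their features; high-scoring pairs become
# hypothesis candidates for novel interactions.
non_edges = [(u, v) for u, v in combinations(adj, 2) if v not in adj[u]]
ranked = sorted(non_edges, key=lambda e: edge_features(adj, *e), reverse=True)
print(ranked[0])  # -> ('B', 'D'): both interact with A and C
```

A supervised version would feed such features, per candidate edge, into a classifier trained on known edges versus sampled non-edges, additionally predicting edge type and direction.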

Relevance:

80.00%

Publisher:

Abstract:

Water treatment using photocatalysis has gained extensive attention in recent years, as photocatalysis is a promising technology from the green chemistry point of view. The most widely studied and used photocatalyst for the decomposition of pollutants in water under ultraviolet irradiation is TiO2, because it is non-toxic, relatively cheap and highly active in various reactions. Within this thesis, unmodified and modified TiO2 materials (powders and thin films) were prepared. The physico-chemical properties of the photocatalytic materials were characterized with UV-visible spectroscopy, scanning electron microscopy (SEM), transmission electron microscopy (TEM), X-ray photoelectron spectroscopy (XPS), inductively coupled plasma optical emission spectroscopy (ICP-OES), ellipsometry, time-of-flight secondary ion mass spectrometry (ToF-SIMS), Raman spectroscopy, goniometry, diffuse reflectance measurements, thermogravimetric analysis (TGA) and nitrogen adsorption/desorption. The photocatalytic activity of the prepared samples in an aqueous environment was tested using model compounds such as phenol, formic acid and metazachlor; the purification of a real pulp and paper wastewater effluent was also studied. The concentrations of the chosen pollutants were measured with high-pressure liquid chromatography (HPLC), and the mineralization and oxidation of organic contaminants were monitored with total organic carbon (TOC) and chemical oxygen demand (COD) analyses. Titanium dioxide powders prepared via the sol-gel method and doped with dysprosium and praseodymium were photocatalytically active in the decomposition of metazachlor; the highest degradation rate was observed with Pr-TiO2 treated at 450 °C (8 h). Photocatalytic LED-based treatment of wastewater effluent from a plywood mill using commercially available TiO2 was demonstrated to be a promising post-treatment method (COD was reduced by 72% and TOC by 60% after 60 min of irradiation).
The TiO2 coatings prepared by the atomic layer deposition technique on aluminium foam were photocatalytically active in the degradation of formic acid and phenol, although a suppression of activity was observed. The photocatalytic activity of TiO2/SiO2 films doped with gold bipyramid-like nanoparticles was about two times higher than that of the reference, which was not the case when gold nanospheres were used.

Relevance:

40.00%

Publisher:

Abstract:

This study discusses the importance of establishing trust in the post-acquisition integration context and how the use of e-channels facilitates or inhibits this process. The objective is to analyze how the use of electronic communication channels influences the post-acquisition integration process in terms of trust establishment and overall integration efficiency, and to develop a framework as a result. Three sub-objectives are introduced: to find out the building blocks of trust in M&As, to analyse how the use of e-channels influences the process of trust establishment in the post-acquisition integration context, and to define the consequences trust and the use of e-channels have for the process. The theoretical background includes literature and theories relating to trust establishment in the post-acquisition integration context and to how the use of e-channels influences the process of trust development on a general level. The empirical research was conducted as a single case study based on key-informant interviews. The interview data were collected between October 2015 and January 2016; altogether nine interviews were realised, six with representatives of the acquiring firm and three with target-firm members. Thematic analysis was selected as the main method for analysing and processing the qualitative data. The study finds that trust plays an essential role in the post-acquisition integration context, facilitating the integration process in various ways. Hence, identifying the different building blocks of trust is important so that members of the organisations are better able to establish and maintain trust. In today's international business, the role of electronic communication channels has also increased significantly in importance, and it was confirmed that these channels pose both challenges and possibilities for the development of interpersonal trust.
One of the most important underlying factors influencing trust levels via e-communication channels is the user's comfort in using the different e-channels: without sufficient and meaningful training, communication conducted via these channels is inhibited in a number of ways. Hence, understanding the defining characteristics of e-communication, together with the risks and opportunities related to its use, can have far-reaching consequences for the post-acquisition integration process as a whole. The framework, based on the findings and existing theory, introduces the most central factors influencing trust establishment, together with the positive and negative consequences these have for the integration process. Moreover, organizational-level consistency and the existence of shared guidelines on the appropriate selection of communication channels according to the nature of the task at hand are seen as important.

Relevance:

30.00%

Publisher:

Abstract:

Total Quality Management (TQM) has become one of the most significant concepts in global business, where quality is an important competitive factor. This Master's thesis examines the modern concept of total quality management, which raises traditional quality thinking to a new level: modern quality management has grown to cover all areas of a company's operations. The objective of the work is the comprehensive improvement of quality and business performance in TietoEnator's Processing and Network Services business area. Before addressing the actual quality management concept, the thesis first introduces the traditional concept of quality on a general level and briefly discusses the ICT business and the standards related to it. Finally, the study presents prioritized improvement proposals and steps that will help the organization pursue the goals of the total quality management concept.

Relevance:

30.00%

Publisher:

Abstract:

The starting point of this profitability study was the desire of Yhtyneet Sahat Oy's Kaukas sawmill and Luumäki further-processing plant to determine the profitability of a pellet plant in the current market situation. The work is a techno-economic analysis, i.e. a feasibility study. The pelletizing process is technically simple and does not require high-technology equipment. The industry is quite new worldwide; in Finland the pellet market is still small and undeveloped, but it has grown in recent years, and the majority of domestic production is exported. The initial production values and the cost-structure definitions obtained in the investment calculation process form the basis for the actual profitability calculations. From the calculations, the most common financial indicators related to investments were derived, and the most sensitive variables were examined and discussed with the help of a sensitivity analysis.

Relevance:

30.00%

Publisher:

Abstract:

This thesis gives an overview of the use of level set methods in the field of image science. The similar fast marching method is discussed for comparison, and the narrow band and particle level set methods are also introduced. The level set method is a numerical scheme for representing, deforming and recovering structures in arbitrary dimensions; it approximates and tracks moving interfaces, dynamic curves and surfaces. The level set method does not define how or why a boundary is advancing the way it is; it simply represents and tracks the boundary. The principal idea of the level set method is to represent an N-dimensional boundary in N+1 dimensions, which gives the generality to represent even complex boundaries. Level set methods can be powerful tools for representing dynamic boundaries, but they can require a lot of computing power; especially the basic level set method carries a considerable computational burden. This burden can be alleviated with more sophisticated versions of the level set algorithm, such as the narrow band level set method, or with a programmable hardware implementation; a parallel approach can also be used in suitable applications. It is concluded that these methods can be used in quite a broad range of image applications, such as computer vision and graphics and scientific visualization, and also to solve problems in computational physics. Level set methods, and methods derived from and inspired by them, will remain at the front line of image processing in the future.
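The core of the basic level set scheme, advancing the implicit boundary φ = 0 under φ_t + F|∇φ| = 0, can be sketched in a few lines. The code below is a minimal first-order upwind implementation (grid size, speed and time step are illustrative choices, not taken from the thesis): it grows a circle of radius 10 outward at unit speed.

```python
import numpy as np

def evolve(phi, speed, dt, steps):
    """Advance phi_t + speed*|grad phi| = 0 with a first-order
    upwind scheme on a unit-spaced grid."""
    for _ in range(steps):
        # One-sided differences in each direction.
        dxm = phi - np.roll(phi, 1, axis=1)
        dxp = np.roll(phi, -1, axis=1) - phi
        dym = phi - np.roll(phi, 1, axis=0)
        dyp = np.roll(phi, -1, axis=0) - phi
        # Upwind gradient magnitude; the one-sided choice depends on
        # the sign of the speed.
        if speed > 0:
            grad = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                           np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
        else:
            grad = np.sqrt(np.minimum(dxm, 0)**2 + np.maximum(dxp, 0)**2 +
                           np.minimum(dym, 0)**2 + np.maximum(dyp, 0)**2)
        phi = phi - dt * speed * grad
    return phi

# phi0: signed distance to a circle of radius 10 (negative inside),
# i.e. the 2-D boundary represented as a 3-D surface's zero level.
y, x = np.mgrid[0:64, 0:64]
phi0 = np.sqrt((x - 32.0)**2 + (y - 32.0)**2) - 10.0
phi = evolve(phi0, speed=1.0, dt=0.5, steps=10)
# Outward motion at unit speed for time 5 grows the radius to about 15.
```

The narrow band variant mentioned above saves work by applying this update only in a thin band of grid points around the zero level, reinitializing the band as the interface moves.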

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a calibration method that can be utilized in the analysis of SEM images. The field of application of the developed method is the calculation of the surface potential distribution of a biased silicon edgeless detector. The suggested processing of the data collected by the SEM consists of several stages and takes into account different aspects affecting the SEM image. The calibration method does not claim to be precise, but it nevertheless captures the basic shape of the potential distribution when different biasing voltages are applied to the detector.

Relevance:

30.00%

Publisher:

Abstract:

Cutin and suberin are structural and protective polymers of plant surfaces. The epidermal cells of the aerial parts of plants are covered with an extracellular cuticular layer, which consists of the polyester cutin, highly resistant cutan, cuticular waxes and the polysaccharides that link the layer to the epidermal cells. A similar protective layer is formed by suberin, a polyaromatic-polyaliphatic biopolymer present particularly in the cell walls of the phellem layer of the periderm of the underground parts of plants (e.g. roots and tubers) and in the bark of trees. In addition, suberization is a major factor in wound healing and wound periderm formation regardless of the plant tissue. Knowledge of the composition and functions of cuticular and suberin polymers is important for understanding their physiological properties for the plants and their nutritional quality when these plants are consumed as foods. The aims of the practical work were to assess the chemical composition of the cuticular polymers of several northern berries and seeds and of the suberin of two potato varieties. Cutin and suberin were studied as isolated polymers and, after depolymerization, as soluble monomers and solid residues. Chemical and enzymatic depolymerization techniques were compared, and a new chemical depolymerization method was developed. Gas chromatography with mass spectrometric detection (GC-MS) was used to assess the monomer compositions. Polymer investigations were conducted with solid-state carbon-13 cross-polarization magic-angle-spinning nuclear magnetic resonance spectroscopy (13C CP-MAS NMR), Fourier transform infrared spectroscopy (FTIR) and microscopic analysis. Furthermore, the development of suberin over one year of post-harvest storage was investigated, and the cuticular layers of berries grown in the north and south of Finland were compared.
The results show that the amounts of isolated cuticular layers and cutin monomers, as well as the monomeric compositions, vary greatly between the berries. The monomer composition of the seeds was found to differ from that of the corresponding berry peels. The berry cutin monomers consisted mostly of long-chain aliphatic ω-hydroxy acids with various mid-chain functionalities (double bonds and epoxy, hydroxy and keto groups). Substituted α,ω-diacids predominated over ω-hydroxy acids in the potato suberin monomers, and slight differences were found between the varieties. The newly developed closed-tube chemical method was found to be suitable for cutin and suberin analysis and preferable to the solvent-consuming and laborious reflux method. Enzymatic hydrolysis with cutinase was less effective than chemical methanolysis and showed specificity towards α,ω-diacid bonds. According to 13C CP-MAS NMR and FTIR, the depolymerization residues contained significant amounts of aromatic structures, polysaccharides and possibly cutan-type aliphatic moieties. The cultivation location seems to have an effect on cuticular composition. The materials studied contained significant amounts of different types of biopolymers that could be utilized for several purposes, with or without further processing. The importance of the so-called waste material from industrial processing of berries and potatoes as a source of either dietary fiber or specialty chemicals should be investigated further in detail, and the evident impact of cuticular and suberin polymers, among other fiber components, on human health should be investigated in clinical trials. These by-product materials may be used as value-added fiber fractions in the food industry and as raw materials for specialty chemicals such as lubricants and emulsifiers, or as building blocks for novel polymers.

Relevance:

30.00%

Publisher:

Abstract:

Knowledge of the behaviour of cellulose, hemicelluloses and lignin during wood and pulp processing is essential for understanding and controlling the processes. Determination of the monosaccharide composition gives information about the structural polysaccharide composition of the wood material and helps in determining the quality of fibrous products. In addition, monitoring of the acidic degradation products gives information on the extent of degradation of lignin and polysaccharides. This work describes two capillary electrophoretic methods, one developed for the analysis of monosaccharides and one for the determination of aliphatic carboxylic acids in alkaline oxidation solutions of lignin and wood. Capillary electrophoresis (CE), in its many variants, is an alternative separation technique to chromatographic methods. In capillary zone electrophoresis (CZE), the fused-silica capillary is filled with an electrolyte solution and an applied voltage generates a field across the capillary; the movement of the ions under the electric field depends on the charge and hydrodynamic radius of the ions. Carbohydrates contain hydroxyl groups that are ionised only under strongly alkaline conditions. After ionisation, the structures are suitable for electrophoretic analysis and for identification through either indirect UV detection or electrochemical detection. The current work presents a new capillary zone electrophoretic method relying on an in-capillary reaction and direct UV detection at a wavelength of 270 nm. The method has been used for the simultaneous separation of neutral carbohydrates, including mono- and disaccharides and sugar alcohols. The in-capillary reaction produces negatively charged, UV-absorbing compounds. The optimised method was applied to real samples, and the methodology is fast since no sample preparation other than dilution is required. In addition, a new method for aliphatic carboxylic acids in highly alkaline process liquids was developed.
The goal was to develop a method for the simultaneous analysis of the dicarboxylic acids, hydroxy acids and volatile acids that are the oxidation and degradation products of lignin and wood polysaccharides. The CZE method was applied to three process cases. First, the fate of lignin under alkaline oxidation conditions was monitored by determining the levels of carboxylic acids in the process solutions. In the second application, the degradation of spruce wood under alkaline and catalysed alkaline oxidation was compared by determining the carboxylic acids in the process solutions. In addition, the effectiveness of membrane filtration and preparative liquid chromatography in the enrichment of hydroxy acids from black liquor was evaluated by analysing the effluents with capillary electrophoresis.

Relevance:

30.00%

Publisher:

Abstract:

Contemporary organisations have to embrace the notion of doing 'more with less'. This challenges knowledge production within companies and public organisations, forcing them to reorganise their structures and to rethink what knowledge production actually means in the context of innovation, and how knowledge is actually produced among the various professional groups within the organisation in their everyday actions. Innovations are vital for organisational survival, and 'ordinary' employees and customers are central but too often ignored producers of knowledge for contemporary organisations; broader levels of participation and reflexive practices are needed. This dissertation discusses the missing links between innovation research conducted in the context of industrial management, the arts and culture; applied drama and theatre practices (specifically post-Boalian approaches); and learning, especially the organising of reflection, in organisational settings. The dissertation (1) explores and extends the role of research-based theatre (RBT) in organising reflection and reflexive practices in the context of practice-based innovation, (2) develops a reflexive model of RBT for investigating and developing practice-based organisational process innovations, in order to contribute to the development of a tool for innovation management and analysis, and (3) operationalises this model within private- and public-sector organisations. The proposed novel reflexive model of research-based theatre for investigating and developing practice-based organisational process innovations extends existing methods and offers a different way of organising reflection and reflexive practices in the context of general innovation management. The model was developed through five participatory action research processes conducted in four different organisations. The results provide learning steps, a reflection path, for understanding complex organisational life, people and relations amid renewal and change actions.
The proposed model provides a new approach to organising and cultivating reflexivity in practice-based innovation activities via research-based theatre. The results can be utilised as a guideline when processing practice-based innovation within private or public organisations. The model helps innovation managers to construct, together with their employees, temporary communities in which they can learn together by reflecting on their own and each other's experiences and break down the assumptions related to their own perspectives. The results include recommendations for practical development steps applicable in various organisations with regard to (i) the application of research-based theatre and (ii) related general innovation management. The dissertation thus contributes to the development of novel learning approaches in knowledge production. Keywords: practice-based innovation, research-based theatre, learning, reflection, mode 2b knowledge production

Relevance:

30.00%

Publisher:

Abstract:

The purpose of the study is to examine and increase knowledge of customer knowledge processing in a B2B context from the sales perspective. Further objectives include identifying possible inhibiting and enabling factors in each phase of the process. The theoretical framework is based on the customer knowledge management literature. The study is qualitative, and the research method utilized is a case study. The empirical part was implemented in a case company by conducting in-depth interviews with the company's value-selling champions located internationally; the context was the maintenance business, and altogether 17 interviews were conducted. The empirical findings indicate that customer knowledge processing has not been clearly defined within the maintenance business line. The main factors inhibiting the acquisition of customer knowledge are lack of time and the vast amount of customer knowledge received; the enabling factors recognized are good customer relationships and the sales representatives' communication skills. Internal dissemination of knowledge is mainly inhibited by lack of time and by restrictions in the customer relationship management (CRM) systems, and enabled by the composition of the sales team and up-to-date customer knowledge. Utilization is inhibited by the lack of goals for utilizing the customer knowledge and by the low quality of the knowledge. Moreover, customer knowledge is not systematically updated or analysed, and the management of customer knowledge relies on the CRM system. As an implication of the study, it is suggested that the case company define customer knowledge processing in order to support the maintenance business process.

Relevance:

30.00%

Publisher:

Abstract:

The usage of digital content, such as video clips and images, has increased dramatically during the last decade, and local image features have been applied increasingly in various image and video retrieval applications. This thesis evaluates local features and applies them to image and video processing tasks. The results of the study show that 1) the performance of different local feature detector and descriptor methods varies significantly in object class matching, 2) local features can be applied to image alignment with results superior to the state of the art, 3) the local feature based shot boundary detection method produces promising results, and 4) the local feature based hierarchical video summarization method points to a promising new research direction. In conclusion, the thesis presents local features as a powerful tool in many applications; future work should concentrate on improving the quality of the local features.
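The local-feature-based shot boundary idea can be sketched as follows: descriptors are matched between each pair of consecutive frames, and a cut is declared wherever the number of matches collapses. The code below is a hypothetical thresholding rule operating on precomputed per-frame-pair match counts (the actual detector, descriptor and decision rule in the thesis may differ).

```python
def shot_boundaries(match_counts, ratio=0.3):
    """Detect shot boundaries from the number of local-feature matches
    between consecutive frame pairs: declare a boundary where the count
    drops below a fraction of the running average."""
    boundaries = []
    avg = match_counts[0]
    for i, m in enumerate(match_counts[1:], start=1):
        if avg > 0 and m < ratio * avg:
            boundaries.append(i)        # cut at frame-pair index i
        avg = 0.8 * avg + 0.2 * m       # exponential running average
    return boundaries

# Match counts for 9 consecutive frame pairs; the collapse at index 4
# marks a cut between shots.
counts = [210, 198, 205, 190, 12, 185, 200, 195, 188]
print(shot_boundaries(counts))  # -> [4]
```

In a full pipeline, the match counts would come from detecting and describing local features in each frame and matching descriptors between neighbours; the running average adapts the threshold to scenes with naturally few features.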