12 results for Simplified procedure
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
Summary: Calibration of a flatbed scanner and a digital image analysis method for quantifying root morphology
Abstract:
In this thesis, the cleaning of ceramic filter media was studied. The literature part covers mechanisms of fouling and dissolution of iron compounds, as well as methods for cleaning ceramic membranes fouled by iron deposits. Cleaning agents and different methods were examined more closely in the experimental part of the thesis. Pyrite is found in geologic strata. It is oxidized to form ferrous ions Fe(II) and ferric ions Fe(III). Fe(III) is further oxidized by hydrolysis to form ferric hydroxide. Hematite and goethite, for instance, are naturally occurring iron oxides and hydroxides. In contact with filter media, they can cause severe fouling, which common cleaning techniques are not competent enough to remove. Mechanisms for the dissolution of iron oxides include the ligand-promoted pathway and the proton-promoted pathway. The dissolution can also be reductive or non-reductive. The most efficient mechanism is the ligand-promoted reductive mechanism, which comprises two stages: the induction period and the autocatalytic dissolution. Reducing agents (such as hydroquinone and hydroxylamine hydrochloride), chelating agents (such as EDTA) and organic acids are used for the removal of iron compounds. Oxalic acid is the most effective known cleaning agent for iron deposits. Since formulations are often more effective than organic acids, reducing agents or chelating agents alone, the citrate-bicarbonate-dithionite system, among others, is well studied in the literature. Cleaning is also enhanced with ultrasound and backpulsing. In the experimental part, oxalic acid and nitric acid were studied alone and in combinations. Citric acid and ascorbic acid, among other chemicals, were also tested. Soaking experiments, experiments with ultrasound and experiments with alternative methods of applying the cleaning solution to the filter samples were carried out. Permeability and ISO brightness measurements were performed to examine the influence of the cleaning methods on the samples.
Inductively coupled plasma optical emission spectroscopy (ICP-OES) analysis of the solutions was carried out to determine the dissolved metals.
Abstract:
The building industry has a particular interest in using clinching as a joining method for frame constructions of light-frame housing. Normally many clinch joints are required in the joining of frames. In order to maximise the strength of the complete assembly, each clinch joint must be as sound as possible. Experimental testing is the main means of optimising a particular clinch joint. This includes shear strength testing and visual observation of joint cross-sections. The manufacturers of clinching equipment normally perform such experimental trials. Finite element analysis can also be used to optimise the tool geometry and the process parameter, X, which represents the thickness of the base of the joint. However, such procedures require dedicated software, a skilled operator, and test specimens in order to verify the finite element model. In addition, when using current technology, several hours' computing time may be necessary. The objective of the study was to develop a simple calculation procedure for rapidly establishing an optimum value of the parameter X for a given tool combination. It should be possible to use the procedure on a daily basis, without stringent demands on the skill of the operator or the equipment. It is also desirable that the procedure significantly decrease the number of shear strength tests required for verification. The experimental work involved tests to obtain an understanding of the behaviour of the sheets during clinching. The most notable observation concerned the stage of the process in which the upper sheet was initially bent, after which the deformation mechanism changed to shearing and elongation. The amount of deformation was measured relative to the original location of the upper sheet, and characterised as the C-measure. By understanding in detail the behaviour of the upper sheet, it was possible to estimate a bending line function for the surface of the upper sheet.
A procedure was developed which makes it possible to estimate the process parameter X for each tool combination with a fixed die. The procedure is based on equating the volume of material on the punch side with the volume of the die. Detailed information concerning the behaviour of material on the punch side is required, assuming that the volume of the die does not change during the process. The procedure was applied to shear strength testing of a sample material. The sample material was continuously hot-dip zinc-coated high-strength constructional steel, with a nominal thickness of 1.0 mm. The minimum Rp0.2 proof stress was 637 N/mm2. Such material has not yet been used extensively in light-frame housing, and little has been published on clinching of the material. The performance of the material is therefore of particular interest. Companies that use clinching on a daily basis stand to gain the greatest benefit from the procedure. By understanding the behaviour of sheets in different cases, it is possible to use data at an early stage for adjusting and optimising the process. In particular, the functionality of common tools can be increased, since it is possible to characterise the complete range of existing tools. The study increases and broadens the amount of basic information concerning the clinching process. New approaches and points of view are presented and used for generating new knowledge.
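The volume-balance idea behind the procedure can be illustrated with a minimal sketch. Note that the geometry here (a cylindrical displaced volume under the punch) and all names and numbers are illustrative assumptions, not the thesis's actual bending-line function or tool data; the real procedure uses detailed material-behaviour information.

```python
import math

def estimate_x(punch_radius_mm, total_sheet_thickness_mm, die_volume_mm3):
    """Toy estimate of the joint base thickness X from a volume balance.

    Assumption (hypothetical, for illustration only): the material displaced
    on the punch side is a cylinder of the punch cross-section, and it must
    exactly fill the fixed die volume.
    """
    # Penetration depth needed so the displaced cylinder fills the die.
    penetration = die_volume_mm3 / (math.pi * punch_radius_mm ** 2)
    # X is the residual thickness left at the base of the joint.
    return total_sheet_thickness_mm - penetration

# Two 1.0 mm sheets, invented punch radius and die volume.
x = estimate_x(punch_radius_mm=2.5, total_sheet_thickness_mm=2.0,
               die_volume_mm3=30.0)
```

Such a closed-form balance is what makes the procedure fast compared with finite element analysis: no iteration or dedicated software is needed once the punch-side behaviour is characterised.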
Abstract:
The central goal of food safety policy in the European Union (EU) is to protect consumer health by guaranteeing a high level of food safety throughout the food chain. This goal can in part be achieved by testing foodstuffs for the presence of various chemical and biological hazards. The aim of this study was to facilitate food safety testing by providing rapid and user-friendly methods for the detection of particular food-related hazards. Heterogeneous competitive time-resolved fluoroimmunoassays were developed for the detection of selected veterinary residues, that is, coccidiostat residues, in eggs and chicken liver. After a simplified sample preparation procedure, the immunoassays were performed either in a manual format with dissociation-enhanced measurement or in an automated format with pre-dried assay reagents and surface measurement. Although the assays were primarily designed for screening purposes providing only qualitative results, they could also be used in a quantitative mode. All the developed assays had good performance characteristics enabling reliable screening of samples at the concentration levels required by the authorities. A novel polymerase chain reaction (PCR)-based assay system was developed for the detection of Salmonella spp. in food. The sample preparation included a short non-selective pre-enrichment step, after which the target cells were collected with immunomagnetic beads and applied to PCR reaction vessels containing all the reagents required for the assay in dry form. The homogeneous PCR assay was performed with a novel instrument platform, GenomEra™, and the qualitative assay results were automatically interpreted based on end-point time-resolved fluorescence measurements and cut-off values. The assay was validated using various food matrices spiked with sub-lethally injured Salmonella cells at levels of 1-10 colony forming units (CFU)/25 g of food.
The main advantage of the system was the exceptionally short time to result; the entire process starting from the pre-enrichment and ending with the PCR result could be completed in eight hours. In conclusion, molecular methods using state-of-the-art assay techniques were developed for food safety testing. The combination of time-resolved fluorescence detection and ready-to-use reagents enabled sensitive assays easily amenable to automation. Consequently, together with the simplified sample preparation, these methods could prove to be applicable in routine testing.
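The automatic interpretation step described above reduces to comparing an end-point signal against a cut-off. A minimal sketch of that logic follows; the threshold and signal values are invented for illustration, since the GenomEra platform applies its own validated cut-offs.

```python
def interpret(signal, cutoff):
    """Qualitative call from an end-point time-resolved fluorescence signal.

    The cutoff value is a hypothetical placeholder, not an instrument spec.
    """
    return "positive" if signal >= cutoff else "negative"

# Two hypothetical end-point readings, one above and one below the cut-off.
calls = [interpret(s, cutoff=150.0) for s in (320.0, 42.0)]
```

Automating the call this way is what removes operator judgement from the result, which matters for a screening assay meant for routine testing.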
Abstract:
The increasing incidence of type 1 diabetes has led researchers on a quest to find the reason behind this phenomenon. The rate of increase is too great to be caused simply by changes in the genetic component, and many environmental factors are under investigation for their possible contribution. These studies require, however, the participation of those individuals most likely to develop the disease, and the approach chosen by many is to screen vast populations to find persons with increased genetic risk factors. The participating individuals are then followed for signs of disease development, and their exposure to suspected environmental factors is studied. The main purpose of this study was to find a suitable tool for easy and inexpensive screening of certain genetic risk markers for type 1 diabetes. The method should be applicable to using whole blood dried on sample collection cards as sample material, since the shipping and storage of samples in this format is preferred. However, the screening of vast sample libraries of extracted genomic DNA should also be possible, if such a need should arise, for example, when studying the effect of newly discovered genetic risk markers. The method developed in this study is based on homogeneous assay chemistry and an asymmetrical polymerase chain reaction (PCR). The generated single-stranded PCR product is probed by lanthanide-labelled, LNA (locked nucleic acid)-spiked, short oligonucleotides with exact complementary sequences. In the case of a perfect match, the probe is hybridised to the product. However, if even a single nucleotide difference occurs, the probe binds, instead of the PCR product, to a complementary quencher oligonucleotide labelled with a dabcyl moiety, causing the signal of the lanthanide label to be quenched. The method was applied to the screening of the well-known type 1 diabetes risk alleles of the HLA-DQB1 gene.
The method was shown to be suitable as an initial screening step including thousands of samples in the scheme used in the TEDDY (The Environmental Determinants of Diabetes in the Young) study to identify those individuals at increased genetic risk. The method was further developed into dry-reagent form to allow an even simpler approach to screening. The reagents needed in the assay were in dry format in the reaction vessel, and performing the assay required only the addition of the sample and, if necessary, water to rehydrate the reagents. This allows the assay to be successfully executed even by a person with minimal laboratory experience.
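The competitive hybridisation logic described above (matched probe stays on the product and fluoresces; mismatched probe is captured by the quencher oligo) can be modelled as a toy decision function. The sequences below are invented, strand orientation is ignored for brevity, and real probe binding is of course thermodynamic rather than exact string matching.

```python
def probe_signal(product, probe):
    """High signal only when the probe exactly complements a product region.

    Simplified model: per-base complement without reverse orientation;
    any mismatch means the probe is quenched instead.
    """
    complement = {"A": "T", "T": "A", "G": "C", "C": "G"}
    # Sequence the probe would pair with, base by base.
    target = "".join(complement[b] for b in probe)
    return "signal" if target in product else "quenched"

# Probe 'TACG' pairs with 'ATGC', present in this invented product strand.
result = probe_signal("GGATGCCA", "TACG")
```

The single-nucleotide discrimination that the LNA-spiked probes provide is what makes the assay usable for allele-level screening of HLA-DQB1.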
Abstract:
The human genome comprises roughly 20 000 protein coding genes. Proteins are the building material for cells and tissues, and proteins are functional compounds having an important role in many cellular responses, such as cell signalling. In multicellular organisms such as humans, cells need to communicate with each other in order to maintain the normal function of the tissues within the body. This complex signalling between and within cells is transferred by proteins and their post-translational modifications, one of the most important being phosphorylation. The work presented here concerns the development and use of tools for phosphorylation analysis. Mass spectrometers have become essential tools to study proteins and proteomes. In mass spectrometry oriented proteomics, proteins can be identified and their post-translational modifications can be studied. In this Ph.D. thesis the objectives were to improve the robustness of sample handling methods prior to mass spectrometry analysis of peptides and their phosphorylation status. The focus was to develop strategies that enable the acquisition of more MS measurements per sample, higher quality MS spectra and simplified and rapid enrichment procedures for phosphopeptides. Furthermore, an objective was to apply these methods to characterize the phosphorylation sites of phosphopeptides. In these studies a new MALDI matrix was developed which allowed more homogeneous, intense and durable signals to be acquired when compared to the traditional CHCA matrix. This new matrix, along with other matrices, was subsequently used to develop a new method that combines multiple spectra of identical peptides acquired in different matrices. With this approach it was possible to identify more phosphopeptides than with conventional LC/ESI-MS/MS methods, and to use 5 times less sample. Also, a phosphopeptide-affinity MALDI target was prepared to capture and immobilise phosphopeptides from a standard peptide mixture while maintaining their spatial orientation.
In addition, a new protocol utilizing commercially available conductive glass slides was developed that enabled fast and sensitive phosphopeptide purification. This protocol was applied to characterize the in vivo phosphorylation of a signalling protein, NFATc1. Evidence for 12 phosphorylation sites was found, many of them in multiply phosphorylated peptides.
Abstract:
The search for new renewable materials has intensified in recent years. Pulp and paper mill process streams contain a number of potential compounds which could be used in biofuel production and as raw materials in the chemical, food and pharmaceutical industries. Prior to utilization, these compounds require separation from other compounds present in the process stream. One feasible separation technique is membrane filtration but, to some extent, fouling still limits its implementation in pulp and paper mill applications. To mitigate fouling and its effects, foulants and their fouling mechanisms need to be well understood. This thesis evaluates fouling in the filtration of pulp and paper mill process streams by means of polysaccharide model substance filtrations and by development of a procedure to analyze and identify potential foulants, i.e. wood extractives and carbohydrates, from fouled membranes. The model solution filtration results demonstrate that each polysaccharide has its own fouling mechanism, which also depends on the membrane characteristics. Polysaccharides may foul the membranes by adsorption and/or by gel/cake layer formation on the membrane surface. Moreover, the polysaccharides interact, which makes fouling evaluation of certain compound groups very challenging. Novel methods to identify wood extractive and polysaccharide foulants are developed in this thesis. The results show that it is possible to extract and identify wood extractives from membranes fouled in the filtration of pulp and paper mill streams. The most effective solvent was found to be acetone:water (9:1 v/v), because it extracted both lipophilic extractives and lignans in high amounts from the fouled membranes and it was also non-destructive for the membrane materials. One hour of extraction was enough to extract wood extractives in high amounts from membrane samples with an area of 0.008 m2.
If only qualitative knowledge of wood extractives is needed, a simplified extraction procedure can be used. Adsorption was the main fouling mechanism in extractives-induced fouling, and dissolved fatty and resin acids were mostly the reason for the fouling; colloidal fouling was negligible. Both process water and membrane characteristics affected extractives-induced fouling. In general, the more hydrophilic regenerated cellulose (RC) membrane fouled less than the more hydrophobic polyethersulfone (PES) and polyamide (PA) membranes, independently of the process water used. Monosaccharide and uronic acid units could also be identified from the fouled synthetic polymeric membranes. It was impossible to analyze all monosaccharide units from the RC membrane because the analysis result obtained contained degraded membrane material. One of the fouling mechanisms of carbohydrates was adsorption. Carbohydrates were not potential adsorptive foulants to the same extent as wood extractives, because their amount in the fouled membranes was found to be significantly lower than the amount of wood extractives.
Abstract:
The aim of this study is to develop a suitable project control procedure for a target company that can be used in engineering, procurement, and construction or construction management contracts. This procedure comprises suitable project control software and a model for the use of the software in practice. The study is divided into two main sections. The theoretical part deals with project management, focusing on the cost and time dimensions of projects. The empirical part deals with the development of the project control procedure for the target company. This development takes place in two parts. In the first part, semi-structured interviews are used to find out the company's employees' demands and desires for the project control software to be used in the developed procedure. These demands and desires are compared against software available in the market and the most suitable one is chosen. The interview results show that the important factors are cost tracking, integration with other software, English language availability, references, a helpdesk, and no need for regular updates. The most suitable option is the CMPro5 cost control software. The chosen software is used in a pilot project, where its functions and use are analyzed. The project control procedure to be used in the future is developed on this basis. The five steps of the developed procedure include the employment of a cost engineer, whose task is to maintain the procedure in the target company.
Abstract:
This master's thesis examines the connection between chatter vibrations in turning and the structure of the lathe centre. The work is part of the VMAX project at Lappeenranta University of Technology, and its background is the effort to develop a new method for avoiding machining vibrations, based on adjusting the clamping force of the lathe tailstock during operation. Verifying the operation of this method was the first objective of the work. Implementing the method, however, places certain requirements on the structure of the centre used. The second objective of the work was to develop a prototype centre that fulfils these requirements. The research proceeded as follows. First, the problem was defined by studying the theoretical background of the work and related research from Lappeenranta University of Technology and elsewhere. Product catalogues of companies manufacturing lathe centres were also examined. Next came a preliminary design phase, in which the operation of the method was verified and concepts were created for developing the structure of the centre. After this preliminary phase, a design process for the centre prototype was carried out. Finally, the behaviour of the designed prototype structure was evaluated by computer modelling. As an additional result, a simplified calculation model based on the finite element method was derived during the research for determining the natural frequencies of the system. The accuracy of the calculation model was evaluated. The design process produced a prototype centre structure that fulfils all the requirements set by the operation of the method and by normal use. The results of the derived calculation model are also quite close to those given by 3D finite element modelling. The objectives of the study can therefore be said to have been achieved. However, since neither the prototype nor the calculation model has yet been experimentally verified, this is not entirely certain.