16 results for Electronic data processing -- Quality control
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
Increased awareness and evolving consumer habits have set more demanding standards for the quality and safety control of food products. The production of foodstuffs which fulfill these standards can be hampered by various low-molecular-weight contaminants. Such compounds include, for example, residues of antibiotics used in animals, and mycotoxins. The extremely small size of these compounds has hindered the development of analytical methods suitable for routine use, and the methods currently in use require expensive instrumentation and qualified personnel to operate them. There is a need for new, cost-efficient and simple assay concepts which can be used for field testing and are capable of processing large sample quantities rapidly. Immunoassays have been considered the gold standard for such rapid on-site screening methods. The introduction of directed antibody engineering and in vitro display technologies has facilitated the development of novel antibody-based methods for the detection of low-molecular-weight food contaminants. The primary aim of this study was to generate and engineer antibodies against low-molecular-weight compounds found in various foodstuffs. The three antigen groups selected as targets of antibody development cause food safety and quality defects in a wide range of products: 1) fluoroquinolones, a family of synthetic broad-spectrum antibacterial drugs used to treat a wide range of human and animal infections; 2) deoxynivalenol, a type B trichothecene mycotoxin and a widely recognized problem for crops and animal feeds globally; and 3) skatole (3-methylindole), one of the two compounds responsible for boar taint, found in the meat of monogastric animals. This study describes the generation and engineering of antibodies with versatile binding properties against low-molecular-weight food contaminants, and the subsequent development of immunoassays for the detection of the respective compounds.
Abstract:
The purpose of this master's thesis was to survey the impacts, needs and benefits related to EDI, and to prepare the deployment of the EDI Gateway module of the Oracle Applications ERP system into production use. Information for the needs assessment was gathered through discussions. New initiatives derived from commercial premises and developed for business-to-business commerce and the exploitation of internet technology were examined from an EDI perspective with a view to the future. The most up-to-date information for this thesis was also found on the internet. After this, it was possible to carry out a suitably broad but bounded EDI pilot project for creating an EDI concept. This thesis concentrated more on the effects of EDI on purchasing, and it was decided to apply EDI first to purchase orders. The benefits of EDI are difficult to measure numerically: a large volume of money or product units must be handled with an EDI partner sufficiently often. In the deployment phase of EDI, the main problems are application-related information technology problems. The knowledge gained from the surveys and the EDI pilot project can be exploited in further development. Additional measures are needed to create a fully functional system.
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
The Chinese welding industry is growing every year due to the rapid development of the Chinese economy. Increasingly, companies around the world are looking to use Chinese enterprises as their cooperation partners. However, the Chinese welding industry also has its weaknesses, such as relatively low quality and weak management. A modern, advanced welding management system appropriate for local socio-economic conditions is required to enable Chinese enterprises to further enhance their business development. The thesis researches the design and implementation of a new welding quality management system for China. This new system is called the 'welding production quality control management model in China' (WQMC). Constructed on the basis of analysis of a survey and in-company interviews, the welding management system comprises the following elements and perspectives: a 'localized congenital existing problem resolution strategies' (LCEPRS) database, a 'human factor designed training system' (HFDT) training strategy, the theory of modular design, ISO 3834 requirements, total welding management (TWM), and lean manufacturing (LEAN) theory. The methods used in the research are literature review, questionnaires, interviews, and the author's model design experiences and observations, i.e. the approach is primarily qualitative and phenomenological. The thesis describes the design and implementation of an HFDT strategy in Chinese welding companies. Such training is an effective way to increase employees' awareness of quality and of issues associated with quality assurance. The study identified widely existing problems in the Chinese welding industry and constructed a LCEPRS database that can be used in efforts to mitigate and avoid common problems. The work uses the theory of modular design, TWM and LEAN as tools for the implementation of the WQMC system.
Abstract:
The main objective of the study was to examine the use of computer-assisted auditing. The study is divided into a theoretical and an empirical part. The theoretical part reviews the audit process, presents the tools of computer-assisted auditing, and assesses, on the basis of the literature and other source material, the benefits produced and the risks caused by automatic data processing (ADP). In the empirical part, a questionnaire survey directed at auditors was used to examine how widespread the use of ADP is in the auditing profession, how auditors themselves perceive its benefits and drawbacks, and how they expect computer-assisted auditing to develop in the near future. The results are compared with those of earlier studies on the same topic. The comparison shows that the use and exploitation of computers in audit work has clearly increased. It should be noted that the introduction of ADP into auditing brings with it problems that must be recognised, but the additional benefits produced by ADP are so considerable that in the future, effective audit work will not be possible without computer-assisted methods.
Abstract:
The aim of this master's thesis is to research and analyze how purchase invoice processing can be automated and streamlined in a system renewal project. The impacts of workflow automation on invoice handling are studied in terms of time, cost and quality. Purchase invoice processing has a lot of potential for automation because of its labor-intensive and repetitive nature. As a case study combining both qualitative and quantitative methods, the topic is approached from a business process management point of view. The current process was first explored through interviews and workshop meetings to create a holistic understanding of the process at hand. Requirements for process streamlining were then researched focusing on specified vendors and their purchase invoices, which helped to identify the critical factors for successful invoice automation. To optimize the flow from invoice receipt to approval for payment, the invoice receiving process was outsourced and the automation functionalities of the new system were utilized in invoice handling. The quality of invoice data and the need for simple structured purchase order (PO) invoices were emphasized in the system testing phase. Hence, consolidated invoices containing references to multiple PO or blanket release numbers should be simplified in order to use automated PO matching. With non-PO invoices, it is important to receive the buyer reference details in an applicable invoice data field so that automation rules can be created to route invoices to a review and approval flow. At the beginning of the project, invoice processing was seen as ineffective both time- and cost-wise, and it required a lot of manual labor to carry out all tasks. According to the testing results, it was estimated that over half of the invoices could be automated within a year after system implementation. Processing times could be reduced remarkably, which would then result in savings of up to 40% in annual processing costs. Due to several advancements in the purchase invoice process, business process quality could also be perceived as improved.
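The routing logic described above can be sketched as a simple rule set: single-PO invoices go to automated matching, consolidated invoices are flagged for simplification, and non-PO invoices with a usable buyer reference enter a review and approval flow. The field names and queue names below are illustrative assumptions, not the case company's actual configuration.

```python
# Hypothetical sketch of the invoice routing rules described in the
# abstract; field and queue names are invented for illustration.

def route_invoice(invoice):
    """Return the handling queue for one parsed invoice record."""
    po_numbers = invoice.get("po_numbers", [])
    if len(po_numbers) == 1:
        return "automated-po-matching"   # simple structured PO invoice
    if len(po_numbers) > 1:
        return "manual-split"            # consolidated invoice: simplify first
    if invoice.get("buyer_reference"):
        return "review-and-approval"     # non-PO invoice with a usable reference
    return "manual-handling"             # missing data: fall back to manual work

print(route_invoice({"po_numbers": ["PO-1001"]}))  # automated-po-matching
print(route_invoice({}))                           # manual-handling
```

In practice such rules would be configured in the invoice automation system itself; the point of the sketch is only that the critical factor is clean, machine-readable invoice data in the expected fields.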
Abstract:
The objectives of this master's thesis were to understand the importance of bubbling fluidized bed (BFB) conditions and to find out how digital image processing and acoustic emission technology can help in monitoring bed quality. An acoustic emission (AE) measurement system and a bottom ash camera system were evaluated for acquiring information about bed conditions. The theory part of the study describes the fundamentals of a BFB boiler and evaluates the characteristics of a bubbling bed. Causes and effects of bed material coarsening are explained, and the ways and methods to monitor the behaviour of a BFB are determined. The study introduces the operating principles of AE technology and digital image processing. The empirical part of the study describes the experimental arrangement and results of a case study at an industrial BFB boiler. Sand consumption of the boiler was reduced by optimization of bottom ash handling and sand feeding. Furthermore, data from the AE measurement system and the bottom ash camera system was collected, and the feasibility of these two systems was evaluated. The particle size of bottom ash and the changes in particle size distribution were monitored during the test period. Neither of the systems evaluated was yet able to support bed quality control with sufficient accuracy or speed. Particle size distributions according to the bottom ash camera did not correspond to the results of manual sieving, and comprehensive interpretation of the collected AE data requires considerable experience. Both technologies do have potential, however, and with further research and development they may enable reliable, real-time information to be acquired about bed conditions. This information could help to maintain a disturbance-free combustion process and to optimize the bottom ash handling system.
Abstract:
The aim of this thesis was to compare control methods for grade changes on a paper machine. The comparison covered Metso Automation's IQGradeChange grade change software and grade changes performed manually by operators. To obtain a comprehensive research dataset, grade change data was collected from the paper machine over a period of seven months. The collected grade changes were analysed in a Matlab environment to determine grade change times. In addition, the change in production rate ((t/h)/min) between the old and the new grade was calculated for each grade change, in order to establish the extent of the grade change and the performance of the different grade change methods. During the trial period, data on a total of 130 grade changes was collected from the paper machine. Of these, 58 were performed with the IQGradeChange software and 72 were performed manually by operators. Of the 130 grade changes collected, 27 ended in a web break, so one of the tasks was to investigate the grade changes that ended in a break.
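The production-change metric mentioned above is a simple rate: the difference in production between the old and new grade divided by the duration of the grade change. A minimal sketch, with all numbers hypothetical:

```python
def production_change_rate(old_rate_tph, new_rate_tph, duration_min):
    """Production change across a grade change in (t/h)/min.

    old_rate_tph / new_rate_tph: production rate before/after the
    grade change in tonnes per hour; duration_min: duration of the
    grade change in minutes.
    """
    return (new_rate_tph - old_rate_tph) / duration_min

# Hypothetical grade change: 120 t/h -> 132 t/h over 15 minutes.
print(production_change_rate(120.0, 132.0, 15.0))  # 0.8 (t/h)/min
```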
Abstract:
Controlling the quality variables (such as basis weight, moisture, etc.) is a vital part of making top-quality paper or board. In this thesis, an advanced data assimilation tool is applied to the quality control system (QCS) of a paper or board machine. The functionality of the QCS is based on quality observations that are measured with a traversing scanner following a zigzag path. The basic idea is the following: the measured quality variable has to be separated into its machine direction (MD) and cross direction (CD) variations, because the QCS works separately in MD and CD. Traditionally this is done simply by assuming one scan of the zigzag path to be the CD profile and its mean value to be one point of the MD trend. In this thesis, a more advanced method is introduced. The fundamental idea is to use the signal's frequency components to represent the variation in both CD and MD. To get to the frequency domain, the Fourier transform is utilized. The frequency-domain representation, that is, the Fourier components, is then used as a state vector in a Kalman filter. The Kalman filter is a widely used data assimilation tool for combining noisy observations with a model. The observations here refer to the quality measurements and the model to the Fourier frequency components. By implementing the two-dimensional Fourier transform in the Kalman filter, we get an advanced tool for the separation of CD and MD components of the total variation or, more generally, for data assimilation. A piece of a paper roll is analyzed and this tool is applied to model the dataset. The results show that the Kalman filter algorithm is able to reconstruct the main features of the dataset from a zigzag path. Although the results were obtained with a very short sample of paper roll, the method shows great potential for later use as part of the quality control system.
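The core idea above can be illustrated in one dimension: treat the Fourier coefficients of a CD profile as the Kalman filter state, and update them one zigzag sample at a time. This is only a minimal sketch of the principle, not the thesis implementation; the dimensions, noise levels and synthetic "true" profile are all assumptions.

```python
import numpy as np

n_cd, n_coef = 64, 7                       # CD positions, low-order Fourier terms

def basis_row(pos):
    """Observation row: value of each Fourier basis function at a CD position."""
    ks = np.arange(1, (n_coef - 1) // 2 + 1)
    return np.concatenate(([1.0],
                           np.cos(2 * np.pi * ks * pos),
                           np.sin(2 * np.pi * ks * pos)))

true_c = np.zeros(n_coef)
true_c[0], true_c[1] = 50.0, 2.0           # mean level + one cosine component
rng = np.random.default_rng(0)

c = np.zeros(n_coef)                       # state estimate (Fourier coefficients)
P = np.eye(n_coef) * 10.0                  # state covariance
R = 0.1                                    # measurement noise variance

for t in range(500):                       # scanner traverses the web
    pos = (t % n_cd) / n_cd                # CD position of this zigzag sample
    H = basis_row(pos)
    y = H @ true_c + rng.normal(0, R**0.5) # noisy point measurement
    S = H @ P @ H + R                      # innovation variance (scalar)
    K = P @ H / S                          # Kalman gain
    c = c + K * (y - H @ c)                # state update
    P = P - np.outer(K, H) @ P             # covariance update

print(np.round(c[:2], 1))                  # close to [50., 2.]
```

The thesis works with the two-dimensional Fourier transform so that MD and CD variation are carried in one state vector; the one-dimensional version above only shows how point measurements along the scan path constrain the frequency components.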
The effects of real time control of welding parameters on weld quality in plasma arc keyhole welding
Abstract:
Joints intended for welding frequently show variations in geometry and position, for which it is unfortunately not possible to apply a single set of operating parameters to ensure constant quality. This difficulty stems from a number of factors, including inaccurate joint preparation and joint fit-up, tack welds, and thermal distortion of the workpiece. In plasma arc keyhole welding of butt joints, deviations in the gap width may cause weld defects such as an incomplete weld bead, excessive penetration and burn-through. Manual adjustment of welding parameters to compensate for variations in the gap width is very difficult, and unsatisfactory weld quality is often obtained. In this study, a control system for plasma arc keyhole welding was developed and used to study the effects of real-time control of welding parameters on gap tolerance during welding of austenitic stainless steel AISI 304L. The welding tests demonstrated the beneficial effect of real-time control on weld quality. Compared with welding using constant parameters, the maximum tolerable gap width with an acceptable weld quality was 47% higher when using real-time controlled parameters for a plate thickness of 5 mm. In addition, burn-through occurred only at significantly larger gap widths when parameters were controlled in real time. Increased gap tolerance enables joints to be prepared and fitted up less accurately, saving time and preparation costs for welding. In addition to the control system, a novel technique for back-face monitoring is described in this study. The test results showed that the technique can be successfully applied to penetration monitoring when welding non-magnetic materials. The results also imply that it is possible to measure the dimensions of the plasma efflux or weld root and use this information in a feedback control system, thus maintaining the required weld quality.
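The kind of compensation described above can be pictured as a feedback rule that derates heat input as the measured gap widens, so that wider gaps do not lead to burn-through. The sketch below is purely illustrative: the thesis does not publish its control law here, and the parameter choice (welding current), gain and numbers are hypothetical.

```python
# Illustrative proportional compensation rule, NOT the thesis
# controller: reduce welding current as the measured gap widens.

def adapt_current(base_current_a, gap_mm, nominal_gap_mm=0.0,
                  gain_a_per_mm=-16.0):
    """Wider gap -> less heat input, to avoid burn-through.

    All values (200 A base current, -16 A/mm gain) are made up
    for illustration.
    """
    return base_current_a + gain_a_per_mm * (gap_mm - nominal_gap_mm)

for gap in (0.0, 0.5, 1.0):
    print(f"gap {gap:.1f} mm -> current {adapt_current(200.0, gap):.0f} A")
```

A real controller would adjust several coupled parameters (current, travel speed, plasma gas flow) and use the back-face or efflux measurement as feedback rather than the gap width alone.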
Abstract:
The topic of this thesis is the simulation of a combination of several control and data assimilation methods, meant to be used for controlling the quality of paper in a paper machine. Paper making is a very complex process and the information obtained from the web is sparse. A paper web scanner can only measure a zigzag path on the web. An assimilation method is needed to produce estimates of the machine direction (MD) and cross direction (CD) profiles of the web. Quality control is based on these measurements. There is an increasing need for intelligent methods to assist in data assimilation. The target of this thesis is to study how such intelligent assimilation methods affect paper web quality. This work is based on a paper web simulator, which has been developed in the TEKES-funded MASI NoTes project. The simulator is a valuable tool for comparing different assimilation methods. The thesis contains a comparison of four assimilation methods: a first-order Bayesian model estimator, an ARMA model based on a higher-order Bayesian estimator, a Fourier transform based Kalman filter estimator and a simple block estimator. The last can be considered close to current operational methods. Of these methods, the Bayesian, ARMA and Kalman estimators all seem to have advantages over the commercial one. The Kalman and ARMA estimators seem to be best in overall performance.
Abstract:
This work is devoted to the analysis of signal variation in the cross direction (CD) and machine direction (MD) measurements from a paper web. The data comes from a real paper machine. The goal of the work is to reconstruct the basis weight structure of the paper and to predict its behaviour into the future. The resulting synthetic data is needed for the simulation of a paper web. The main idea used for describing the basis weight variation in the cross direction is the Empirical Orthogonal Functions (EOF) algorithm, which is closely related to the Principal Component Analysis (PCA) method. Signal forecasting in time is based on time-series analysis. The two principal mathematical procedures used in the work are Autoregressive-Moving Average (ARMA) modelling and the Ornstein–Uhlenbeck (OU) process.
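The two ingredients named above can be sketched on synthetic data: EOFs of the CD profiles are obtained from the singular value decomposition of the centred data matrix (which is what PCA computes), and a discretised OU process models how an EOF score evolves in time. The data, shapes and parameters below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(1)
n_scans, n_cd = 200, 50

# Synthetic basis weight anomalies: noise plus one dominant CD mode
# whose amplitude varies slowly in MD (scan index).
profiles = rng.normal(0.0, 0.2, (n_scans, n_cd))
profiles += np.outer(np.sin(np.linspace(0, 6, n_scans)),
                     np.cos(np.linspace(0, np.pi, n_cd)))

# EOF analysis: centre the data, take the right singular vectors.
anom = profiles - profiles.mean(axis=0)
_, s, vt = np.linalg.svd(anom, full_matrices=False)
eofs = vt                                  # rows are CD patterns (EOFs)
var_explained = s**2 / np.sum(s**2)
print(f"leading EOF explains {var_explained[0]:.0%} of the variance")

# Discretised OU process for one EOF score time series:
#   dX = -theta * X dt + sigma dW
theta, sigma, dt = 0.5, 0.3, 1.0
x = np.zeros(n_scans)
for t in range(1, n_scans):
    x[t] = x[t - 1] - theta * x[t - 1] * dt + sigma * np.sqrt(dt) * rng.normal()
```

With the injected dominant mode, the leading EOF captures most of the variance, and the OU path shows the mean-reverting behaviour used for forecasting a score in time.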
Abstract:
Neutral alpha-mannosidase and lysosomal MAN2B1 alpha-mannosidase belong to glycoside hydrolase family 38, which contains essential enzymes required for the modification and catabolism of asparagine-linked glycans on proteins. MAN2B1 catalyses lysosomal glycan degradation, while neutral alpha-mannosidase is most likely involved in the catabolism of cytosolic free oligosaccharides. These mannose-containing saccharides are generated during glycosylation or released from misfolded glycoproteins, which are detected by quality control in the endoplasmic reticulum. To characterise the biological function of human neutral alpha-mannosidase, I cloned the alpha-mannosidase cDNA and recombinantly expressed the enzyme. The purified enzyme trimmed the putative natural substrate Man9GlcNAc to Man5GlcNAc, whereas the reducing-end GlcNAc2 limited trimming to Man8GlcNAc2. Neutral alpha-mannosidase showed the highest enzyme activity at neutral pH and was activated by the cations Fe2+, Co2+ and Mn2+; Cu2+, in turn, had a strong inhibitory effect on alpha-mannosidase activity. Analysis of its intracellular localisation revealed that neutral alpha-mannosidase is cytosolic and colocalises with proteasomes. Further work showed that the overexpression of neutral alpha-mannosidase affected the cytosolic free oligosaccharide content and led to enhanced endoplasmic reticulum associated degradation and underglycosylation of secreted proteins. The second part of the study focused on MAN2B1 and the inherited lysosomal storage disorder alpha-mannosidosis. In this disorder, deficient MAN2B1 activity is associated with mutations in the MAN2B1 gene. The thesis reports the molecular consequences of 35 alpha-mannosidosis associated mutations, including 29 novel missense mutations. According to experimental analyses, the mutations fall into four groups: mutations that prevent transport to lysosomes are accompanied by a lack of proteolytic processing of the enzyme (groups 1 and 3), while the remaining mutations (groups 2 and 4) allow transport to lysosomes, although the mutated proteins are less efficiently processed to their mature form than is wild-type MAN2B1. Analysis of the effect of the mutations on a model structure of human lysosomal alpha-mannosidase provides insights into their structural consequences. Mutations that affect amino acids important for folding (prolines, glycines, cysteines) or domain interface interactions (arginines) arrest the enzyme in the endoplasmic reticulum, whereas surface mutations and changes that do not drastically alter residue volume are tolerated better. Descriptions of the mutations and clinical data are compiled in an alpha-mannosidosis database, which will be made available to the scientific community. This thesis provides a detailed insight into two ubiquitous human alpha-mannosidases. It demonstrates that neutral alpha-mannosidase is involved in the degradation of cytosolic oligosaccharides and suggests that the regulation of this alpha-mannosidase is important for maintaining the cellular homeostasis of N-glycosylation and glycan degradation. The study of alpha-mannosidosis associated mutations identifies multiple mechanisms by which these mutations are detrimental to MAN2B1 activity. The alpha-mannosidosis database will benefit both clinicians and scientific research on lysosomal alpha-mannosidosis.
Abstract:
Software systems are expanding and becoming increasingly present in everyday activities. A constantly evolving society demands that they deliver more functionality, are easy to use and work as expected. All these challenges increase the size and complexity of a system. People may not be aware of the presence of a software system until it malfunctions or fails to perform. The concept of being able to depend on the software is particularly significant when it comes to critical systems. Here the quality of a system is an essential issue, since any deficiencies may lead to considerable financial loss or endanger lives. Traditional development methods may not ensure a sufficiently high level of quality. Formal methods, on the other hand, allow us to achieve a high level of rigour and can be applied to develop a complete system or only a critical part of it. Such techniques, applied during system development starting at the early design stages, increase the likelihood of obtaining a system that works as required. However, formal methods are sometimes considered difficult to utilise in traditional developments. Therefore, it is important to make them more accessible and to reduce the gap between formal and traditional development methods. This thesis explores the usability of rigorous approaches by giving an insight into formal designs with the use of graphical notation. The understandability of formal modelling is increased by a compact representation of the development and the related design decisions. The central objective of the thesis is to investigate the impact that rigorous approaches have on the quality of developments. This requires establishing techniques for the evaluation of rigorous developments. Since we study various development settings and methods, specific measurement plans and a set of metrics need to be created for each setting. Our goal is to provide methods for collecting data and recording evidence of the applicability of rigorous approaches. This would support organisations in making decisions about integrating formal methods into their development processes. It is important to control software development, especially in its initial stages. Therefore, we focus on the specification and modelling phases, as well as the related artefacts, e.g. models, which have a significant influence on the quality of the final system. Since the application of formal methods may increase the complexity of a system, it may affect its maintainability, and thus its quality. Our goal is to leverage the quality of a system via metrics and measurements, as well as generic refinement patterns, which are applied to a model and a specification. We argue that these can facilitate the process of creating software systems by, for example, controlling complexity and providing modelling guidelines. Moreover, we regard them as additional mechanisms for quality control and improvement, also for rigorous approaches. The main contribution of this thesis is to provide metrics and measurements that help in assessing the impact of rigorous approaches on developments. We establish techniques for the evaluation of certain aspects of quality, based on the structural, syntactical and process-related characteristics of early-stage development artefacts, i.e. specifications and models. The presented approaches are applied to various case studies, and the results of the investigation are juxtaposed with the perception of domain experts. It is our aspiration to promote measurement as an indispensable part of the quality control process and a strategy for quality improvement.
Abstract:
The objective of the study was to create an operating model for the case company that engages workers, team leaders and supervisors in the continuous improvement of production, and to improve production feedback at the team level. The study was limited to a pilot team and to the products routed through the team's workstations. Before the study began, the company already had an electronic initiative system, but its use had declined following an organisational restructuring. The theoretical part of the study examined the culture and tools of continuous improvement, as well as the content and concepts of quality management, quality control tools and production metrics. Based on the theory, the study created a continuous improvement operating model that identifies and eliminates waste in the process by engaging the pilot team's workers with waste cards. In addition, an operating model was created for reporting and processing production improvement ideas. Production feedback was developed by creating a scorecard for the pilot team and by establishing a daily review practice in the company. The study also carried out improvement projects at the supervisory level, using the models and tools presented in the theoretical part. The result was an operating model that, relative to the number of employees, produces more improvement ideas and processes them more efficiently than the electronic initiative system. Through waste reporting with the waste cards, a total of 23.6 hours of waste time was identified and reported during a seven-week review period. With the team scorecard, the team's workers were able to follow their own team's performance against target values on a weekly basis, which was reflected, among other things, in an improvement in the team's performance level. The improvement projects improved the quality of the pilot team's operations and products, and the daily review practice engaged the team leaders in problem solving and in ensuring production performance.