996 results for Process Standardization


Relevance: 20.00%

Abstract:

Chemical-looping combustion (CLC) is a novel combustion technology with inherent separation of the greenhouse gas CO2. The technique typically employs a dual fluidized bed system in which a metal oxide serves as a solid oxygen carrier that transfers oxygen from the combustion air to the fuel. The oxygen carrier loops between the air reactor, where it is oxidized by air, and the fuel reactor, where it is reduced by the fuel. Air is therefore never mixed with the fuel, and the outgoing CO2 is not diluted by nitrogen, which makes it possible to collect the CO2 from the flue gases once the water vapor has been condensed. CLC is proposed as a promising and energy-efficient carbon capture technology, since it can increase power station efficiency while keeping the energy penalty of carbon capture low. This thesis presents the outcome of a comprehensive literature study on the current status of CLC development. In addition, a steady-state model of the CLC process, based on the conservation equations of mass and energy, was developed. The model was used to determine the process conditions and to calculate the reactor dimensions of a 100 MWth CLC system with bunsenite (NiO) as the oxygen carrier and methane (CH4) as the fuel. This study was carried out within the Oxygen Carriers and Their Industrial Applications research project (2008–2011), funded by the Tekes Functional Materials programme. I would like to thank Tekes and the participating companies for the funding, and all project partners for pleasant cooperation.
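For orientation only, the following sketch shows the kind of steady-state mass balance such a model builds on: the stoichiometric NiO circulation needed to carry oxygen for a 100 MWth methane feed. The reduction stoichiometry (CH4 + 4 NiO -> CO2 + 2 H2O + 4 Ni) and the heating value are textbook values; this is not the thesis model.

```python
# Sketch of a steady-state CLC mass balance (illustrative, not the thesis model).
# Assumed reduction stoichiometry: CH4 + 4 NiO -> CO2 + 2 H2O + 4 Ni

P_TH = 100e6          # thermal power [W]
LHV_CH4 = 802.3e3     # lower heating value of CH4 [J/mol]
M_NIO = 0.07469       # molar mass of NiO [kg/mol]

n_ch4 = P_TH / LHV_CH4    # fuel feed [mol/s]
n_nio = 4.0 * n_ch4       # stoichiometric NiO demand [mol/s]
m_nio = n_nio * M_NIO     # minimum oxygen-carrier circulation [kg/s]

print(f"CH4 feed:        {n_ch4:.1f} mol/s")
print(f"NiO circulation: {m_nio:.1f} kg/s (stoichiometric minimum)")
```

In a real design the circulation rate is set well above this stoichiometric minimum, since only part of the carrier's oxygen capacity is cycled per loop.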

Relevance: 20.00%

Abstract:

The objectives of this Master's Thesis were to find out what kind of knowledge management strategy would best fit an IT organization that uses the ITIL (Information Technology Infrastructure Library) framework for IT service management, and to create a knowledge management process model to support the chosen strategy. The empirical material for this research was collected through qualitative semi-structured interviews in the case organization, Stora Enso Corporate IT. The results of the interviews indicate that a codification knowledge management strategy would fit the case organization best. The knowledge management process model was created on the basis of earlier studies and the knowledge management literature. The model was evaluated in the interview research, and the results showed that the created process model is realistic and useful and corresponds to a real-life phenomenon.

Relevance: 20.00%

Abstract:

In a networked business environment, the visibility requirements towards supply operations and the customer interface have become stricter. The case company's master data is seen as an enabler for meeting those requirements; however, the current state of the master data and its quality are not considered good enough. The target of this thesis was to develop a process for managing master data quality as a continuous activity, and to find solutions for cleansing the current customer and supplier data to meet the quality requirements defined in that process. Building on the theory of master data management and data cleansing, a small amount of master data was analyzed and cleansed using a commercial data cleansing solution available on the market. This was conducted in cooperation with the vendor as a proof of concept, which demonstrated the solution's applicability to improving the quality of the current master data. Based on these findings and data management theory, recommendations and proposals for improving data quality were given. The results also revealed that the biggest reasons for poor data quality are the lack of data governance in the company and the restrictions of the current master data solutions.
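As a minimal sketch of what a rule-based cleansing step can look like (field names and the similarity threshold are hypothetical, not the commercial solution used in the proof of concept):

```python
# Minimal sketch of rule-based cleansing for customer master data
# (illustrative; field names and the threshold are hypothetical).
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Uppercase, trim, and collapse whitespace so equivalent records compare equal."""
    return " ".join(name.upper().split())

def near_duplicates(records: list[dict], threshold: float = 0.9) -> list[tuple[dict, dict]]:
    """Flag record pairs whose normalized names are suspiciously similar."""
    pairs = []
    for i, a in enumerate(records):
        for b in records[i + 1:]:
            ratio = SequenceMatcher(None, normalize(a["name"]), normalize(b["name"])).ratio()
            if ratio >= threshold:
                pairs.append((a, b))
    return pairs

customers = [
    {"id": 1, "name": "Acme  Oy"},
    {"id": 2, "name": "ACME OY"},
    {"id": 3, "name": "Beta Ltd"},
]
print(near_duplicates(customers))  # flags records 1 and 2 as likely duplicates
```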

Relevance: 20.00%

Abstract:

The optical, mechanical, and microstructural properties of MgF2 single layers grown by ion beam sputtering have been investigated by spectrophotometric measurements, film stress characterization, x-ray photoelectron spectroscopy (XPS), x-ray diffraction, and transmission electron microscopy. The deposition conditions, with or without fluorine reactive gas, were found to greatly influence the optical absorption and the stress of the films as well as their microstructure. Layers grown with fluorine compensation exhibit a regular columnar microstructure and a UV optical absorption that can be very low, either as deposited or after thermal annealing at very low temperatures. In contrast, layers grown without fluorine compensation exhibit a less regular microstructure and a high ultraviolet absorption that is particularly hard to cure. Calculations show that F centers are responsible for this absorption, even though all the films were found to be stoichiometric within the limits of XPS sensitivity. Analyzing our experimental curves against data taken from the literature, we propose possible diffusion mechanisms that could explain the behavior of the coatings.

Relevance: 20.00%

Abstract:

This thesis examines how design work for welded products is carried out and how welding is developed in a machine workshop. Companies know their own manufacturing problems, but starting development work is often difficult, or it is never started at all. The smallest companies often lack both the resources and the know-how to develop new production methods and products. The thesis gives advice on how to lower the threshold for starting development work and helps in structuring the design task. Product design is a demanding task, because it fixes the functionality of the product and most of its costs. Designers must be familiar with different manufacturing methods and with manufacturing-friendly design so that they can take manufacturability and assembly aspects into account. A good welded product contains as little weld as possible, makes use of modularization and standardization, and employs methods that replace welding. The thesis presents a "Design for Welding" model developed for manufacturing-friendly design, which goes through the special characteristics of a welded structure from the design point of view. The purpose of manufacturing-friendly design is to reduce order-specific design, welding, post-weld treatment, and machining.

Relevance: 20.00%

Abstract:

In the field of observational methodology the observer is obviously a central figure, and close attention should be paid to the process through which he or she acquires, applies, and maintains the required skills. Basic training in how to apply the operational definitions of categories and the coding rules, coupled with the opportunity to use the observation instrument in real-life situations, can have a positive effect on the degree of agreement achieved when intra- and inter-observer reliability is evaluated. Several authors, including Arias, Argudo, and Alonso (2009) and Medina and Delgado (1999), have put forward proposals for the process of basic and applied training in this context. Reid and De Master (1982) focus on the observer's performance and on how to maintain the acquired skills, arguing that periodic checks are needed after initial training because an observer may become less reliable over time due to the inherent complexity of category systems. The purpose of this subsequent training is to maintain acceptable levels of observer reliability. Various strategies can be used to this end, including providing feedback about the categories associated with a good reliability index, or offering re-training in how to apply those that yield lower indices. The aim of this study is to develop a performance-based index capable of assessing an observer's ability to produce reliable observations in conjunction with other observers.
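The abstract does not state which agreement statistic underlies the proposed index. Purely as an illustration, the sketch below computes Cohen's kappa, a standard chance-corrected measure of agreement between two observers coding the same events with one category system:

```python
# Cohen's kappa for two observers coding the same events
# (a standard agreement statistic, shown for illustration only).
from collections import Counter

def cohens_kappa(coder_a: list[str], coder_b: list[str]) -> float:
    n = len(coder_a)
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n       # observed agreement
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    cats = set(freq_a) | set(freq_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in cats)    # chance agreement
    return (p_o - p_e) / (1 - p_e)

a = ["pass", "shot", "pass", "dribble", "pass", "shot"]
b = ["pass", "shot", "dribble", "dribble", "pass", "pass"]
print(f"kappa = {cohens_kappa(a, b):.2f}")   # 0.48 for this toy coding
```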

Relevance: 20.00%

Abstract:

The aim of the thesis is to devise a framework for analyzing simulation games, in particular the introductory supply chain simulation games used in education and process development. The framework is then applied to three case examples: introductory supply chain simulation games used at Lappeenranta University of Technology. The theoretical part of the thesis studies simulation games in the context of education and training as well as process management. Simulation games can be seen as learning processes comprising a briefing, the micro cycle, and a debriefing, which includes observation and reflection as well as conceptualization. The micro cycle, i.e. the game itself, is defined through its elements and characteristics. Both the briefing and the debriefing ought to support the micro cycle, and the whole learning process needs to support the learning objectives of the simulation game. Based on the analysis of the case simulation games, suggestions are made on how to strengthen the debriefing and promote the long-term effects of the games. In addition, a framework for designing simulation games is suggested, and the characteristics of introductory supply chain simulation games are defined: they are designed for general purposes, are simple and operated manually, are multifunctional interplays, and last about 2.5 to 4 hours. Participants cooperate during a game run, while competition arises between different runs or game sessions.

Relevance: 20.00%

Abstract:

An efficient approach for organizing large ad hoc networks is to divide the nodes into multiple clusters and to designate, for each cluster, a clusterhead which is responsible for holding intercluster control information. The role of a clusterhead entails rights and duties. On the one hand, it has a dominant position over the other nodes because it manages the connectivity and has access to other nodes' sensitive information. On the other hand, the clusterhead role also has some associated costs. Hence, in order to prevent malicious nodes from taking control of the group in a fraudulent way, and to avoid selfish attacks from suitable nodes, the clusterhead needs to be elected in a secure way. In this paper we present a novel solution that guarantees the clusterhead is elected in a cheat-proof manner.
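The abstract does not reproduce the protocol itself. As a generic illustration of one cheat-proof pattern (commit-then-reveal, not the paper's specific solution), the sketch below has every node commit to a random nonce before any nonce is revealed, so no node can bias the value that selects the clusterhead:

```python
# Generic commit-then-reveal leader election (illustrative pattern only;
# NOT the specific protocol proposed in the paper).
import hashlib, os

nodes = ["n1", "n2", "n3", "n4"]

# Phase 1: every node broadcasts a commitment to a secret nonce.
nonces = {n: os.urandom(16) for n in nodes}
commitments = {n: hashlib.sha256(nonces[n]).hexdigest() for n in nodes}

# Phase 2: nonces are revealed and checked against the commitments.
for n in nodes:
    assert hashlib.sha256(nonces[n]).hexdigest() == commitments[n], f"{n} cheated"

# The clusterhead is derived from all nonces, so no single node controls it.
combined = hashlib.sha256(b"".join(nonces[n] for n in sorted(nodes))).digest()
clusterhead = nodes[int.from_bytes(combined, "big") % len(nodes)]
print("elected clusterhead:", clusterhead)
```

Because each node is bound to its nonce before seeing the others, a node that aborts or lies at reveal time is detected rather than able to steer the outcome.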

Relevance: 20.00%

Abstract:

The uncertainty of any analytical determination depends on both the analysis and the sampling. The uncertainty arising from sampling is usually not controlled, and methods for its evaluation are still little known. Pierre Gy's sampling theory is currently the most complete theory of sampling, and it also takes the design of the sampling equipment into account. Guides dealing with the practical issues of sampling also exist, published by international organizations such as EURACHEM, IUPAC (International Union of Pure and Applied Chemistry), and ISO (International Organization for Standardization). In this work, Gy's sampling theory was applied to several cases, including the analysis of chromite concentration estimated from SEM (scanning electron microscope) images and the estimation of the total uncertainty of a drug dissolution procedure. The results clearly show that Gy's sampling theory can be utilized in both of the above-mentioned cases and that the uncertainty estimates obtained are reliable. The variographic experiments introduced in Gy's sampling theory are particularly useful for analyzing the uncertainty of auto-correlated data sets, such as industrial process data and environmental discharges. The periodic behaviour of such processes can be observed by variographic analysis, as well as by fast Fourier transformation and auto-correlation functions. With variographic analysis, the uncertainty is estimated as a function of the sampling interval. This is advantageous when environmental or process data are analyzed, since the effect of the sampling interval on the overall uncertainty can then be estimated easily: if the sampling frequency is too high, unnecessary resources are used, whereas if it is too low, the uncertainty of the determination may be unacceptably high. Variographic methods can also be utilized to estimate the uncertainty of spectral data produced by modern instruments. Since spectral data are multivariate, methods such as principal component analysis (PCA) are needed in the analysis. Optimization of a sampling plan increases the reliability of the analytical process, which may ultimately have beneficial effects on the economics of chemical analysis.
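A minimal sketch of the variographic calculation may help. It implements the standard experimental variogram V(j) from Gy's theory on synthetic auto-correlated data, not the actual process data of the thesis; the uncertainty associated with a sampling interval j is then read from V(j):

```python
# Experimental variogram V(j) for an equidistant process series,
# as defined in Gy's sampling theory (illustrative sketch).
import numpy as np

def variogram(x: np.ndarray, max_lag: int) -> np.ndarray:
    """V(j) = sum((x[i+j] - x[i])^2) / (2 * (N - j)) for j = 1..max_lag."""
    n = len(x)
    return np.array([np.sum((x[j:] - x[:-j]) ** 2) / (2 * (n - j))
                     for j in range(1, max_lag + 1)])

# Synthetic auto-correlated "process data" with a 24-sample period.
rng = np.random.default_rng(0)
t = np.arange(200)
x = 10 + np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.3, t.size)

v = variogram(x, max_lag=48)
print("V(1)  =", round(v[0], 3))    # small: adjacent samples are similar
print("V(12) =", round(v[11], 3))   # peaks near the half-period of the cycle
```

The periodicity of the process shows up directly as oscillation in V(j), which is how variographic analysis reveals cyclic behaviour alongside FFT and auto-correlation methods.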

Relevance: 20.00%

Abstract:

The collaborative counselling scenario is a sociocultural space in which what is done and what is said condition the type of interaction that will take place among the participants, as well as the activity they carry out together. In this article we reflect on the role of teachers' and advisors' motivation in sustaining the advisory process, and on the possibilities the advisor has to create and maintain a context that is motivating for both the advisor and those being advised.

Relevance: 20.00%

Abstract:

This work presents the biochemical characterization of a lipase from a new strain of Bacillus sp. ITP-001, immobilized by a sol-gel process (IB). The biochemical characterization of IB showed increased hydrolytic activity, 526.63 U g-1 at pH 5.0 and 80 ºC, and thermal stability at 37 ºC. Enzymatic activity was stimulated by EDTA and by ions such as Fe3+, Mn2+, Zn2+, and Ca2+, as well as in various organic solvents. The kinetic parameters obtained for the IB were Km = 14.62 mM and Vmax = 0.102 mM min-1 g-1. The results of the biochemical characterization revealed the improved catalytic properties of IB.
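The reported parameters plug directly into the standard Michaelis-Menten rate law, v = Vmax[S]/(Km + [S]); a small sketch with illustrative substrate concentrations:

```python
# Michaelis-Menten rate for the immobilized biocatalyst, using the
# parameters reported in the abstract (substrate values are illustrative).
KM = 14.62     # mM
VMAX = 0.102   # mM min-1 g-1

def rate(s_mM: float) -> float:
    """v = Vmax * [S] / (Km + [S])"""
    return VMAX * s_mM / (KM + s_mM)

for s in (5.0, 14.62, 50.0):
    print(f"[S] = {s:6.2f} mM -> v = {rate(s):.4f} mM min-1 g-1")
# At [S] = Km the rate is Vmax/2, as expected.
```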

Relevance: 20.00%

Abstract:

An improved method based on reverse flow injection is proposed for determining the sulfate concentration in wet-process phosphoric acid (WPA). The effects of reagent composition, flow rate, temperature, acid concentration, and reaction coil length on the flow system, as well as the linear response range, are discussed in detail, and optimal conditions are established for determining sulfate in WPA samples. Baseline drift is avoided by a periodic washing step with EDTA in an alkaline medium. A linear response is observed within the range 20 - 360 mg L-1, given by the equation A = 0.0020C (mg L-1) + 0.0300, R² = 0.9991. The detection limit of the proposed method for sulfate analysis is 3 mg L-1, and the relative standard deviation (n = 12) of the sulfate absorbance peak is less than 1.60%. The method has a throughput of up to 29 samples per hour, and the results compare well with those obtained by the gravimetric method.
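Concentration follows from a measured absorbance by inverting the reported calibration line; a minimal sketch, with the linear range from the abstract used as a validity check:

```python
# Invert the reported calibration curve A = 0.0020*C + 0.0300 to obtain
# sulfate concentration from a measured absorbance (sketch).
SLOPE, INTERCEPT = 0.0020, 0.0300
LINEAR_RANGE = (20.0, 360.0)   # mg L-1, from the abstract

def sulfate_mg_per_L(absorbance: float) -> float:
    c = (absorbance - INTERCEPT) / SLOPE
    if not LINEAR_RANGE[0] <= c <= LINEAR_RANGE[1]:
        raise ValueError(f"{c:.0f} mg/L is outside the validated linear range")
    return c

print(sulfate_mg_per_L(0.430))   # -> 200.0 mg/L
```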

Relevance: 20.00%

Abstract:

The objective of the thesis was to create a performance measurement system for the logistics process of a company. A further goal was to make suggestions for improvements based on a description and analysis of the process and its current measures. The logistics process was described in detail, and its objectives were derived from the company strategy and goals. Suggestions for the performance measurement system and for process improvement were made on the basis of a current state analysis. As a result of the thesis, it was decided to take three new performance measures into use. In addition, several improvements to the ERP system were suggested to make the process smoother. Some of the improvements have already been added to the system, and the rest will be added in the near future.

Relevance: 20.00%

Abstract:

This research was motivated by the need to examine potential application areas of process intensification technologies at Neste Oil Oyj. In line with the company's interests, membrane reactor technology was chosen, and its applicability in the refining industry was investigated. Moreover, Neste Oil suggested a project related to CO2 capture from the flue gas stream of a fluid catalytic cracking (FCC) unit. The flue gas flow rate is 180 t/h, with approximately 14% CO2 by volume. A membrane-based absorption process (membrane contactor) was chosen as a potential technique for modelling CO2 capture from the FCC unit effluent. For the design of the membrane contactor, a mathematical model was developed to describe CO2 absorption from a gas mixture using an aqueous monoethanolamine (MEA) solution. According to the results of the literature survey, approximately 99% of the CO2 can be removed in a hollow fiber contactor under laminar flow conditions using a 20 cm long polyvinylidene fluoride (PVDF) membrane. Furthermore, the whole process was designed using the PRO/II simulation software, and the CO2 removal efficiency of the whole process was obtained as 97%. Technical and economic comparisons with existing MEA absorption processes were performed to determine the advantages and disadvantages of membrane contactor technology.
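For orientation, a back-of-the-envelope CO2 balance can be sketched from the figures in the abstract; the mean flue gas molar mass (about 30 g/mol, typical of an N2/CO2/H2O/O2 mixture) is my assumption, not a thesis value:

```python
# Rough CO2 balance for the FCC flue gas described in the abstract.
# The mean flue gas molar mass is an assumption, not a thesis value.
FLUE_GAS = 180_000.0   # kg/h, from the abstract
Y_CO2 = 0.14           # mole fraction CO2, from the abstract
REMOVAL = 0.97         # overall removal efficiency, from the abstract
M_GAS = 0.030          # assumed mean molar mass of flue gas [kg/mol]
M_CO2 = 0.044          # molar mass of CO2 [kg/mol]

n_total = FLUE_GAS / M_GAS           # total molar flow [mol/h]
m_co2_in = n_total * Y_CO2 * M_CO2   # CO2 entering [kg/h]
m_co2_cap = m_co2_in * REMOVAL       # CO2 captured [kg/h]

print(f"CO2 in flue gas: {m_co2_in / 1000:.1f} t/h")
print(f"CO2 captured:    {m_co2_cap / 1000:.1f} t/h at {REMOVAL:.0%} removal")
```

Under these assumptions the unit emits roughly 37 t/h of CO2, of which about 36 t/h would be captured at the simulated 97% removal efficiency.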

Relevance: 20.00%

Abstract:

The main objective of this thesis is to examine and model a configuration system and its related processes: when and where configuration information is created in the product development process, and how it is utilized in the order-delivery process. From the information point of view, these two processes are the essential parts of the whole configuration system. The empirical part of the work was done as constructive research inside a company that follows a mass customization approach. Data models and documentation were created for the different development stages of the configuration system. A base data model already existed for the new structures and the relations between them; this model was used as the basis for the later data modeling work. The data models include the different data structures, their key objects and attributes, and the relations between them. The representation of configuration rules for the to-be configuration system was defined as one of the key focus points. Further, it is examined how customer needs and requirements information can be integrated into the product development process. A requirements hierarchy and classification system is presented, and it is shown how individual requirement specifications can be connected to the physical design structure via features by developing the existing base data model further.
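As a hedged sketch of how requirements can be connected to the physical design structure via features (the class and attribute names below are hypothetical, not the thesis's actual data model):

```python
# Toy data model linking requirements -> features -> design structures
# (names and attributes are hypothetical, for illustration only).
from dataclasses import dataclass, field

@dataclass
class DesignStructure:
    item_id: str
    description: str

@dataclass
class Feature:
    name: str
    realized_by: list[DesignStructure] = field(default_factory=list)

@dataclass
class Requirement:
    req_id: str
    text: str
    satisfied_by: list[Feature] = field(default_factory=list)

frame = DesignStructure("DS-100", "reinforced frame assembly")
load_capacity = Feature("high load capacity", realized_by=[frame])
req = Requirement("R-42", "unit shall carry a 500 kg payload",
                  satisfied_by=[load_capacity])

# Trace a customer requirement down to the physical items that realize it.
for f in req.satisfied_by:
    for ds in f.realized_by:
        print(f"{req.req_id} -> {f.name} -> {ds.item_id}")
```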