169 results for agile methods
Abstract:
The aim of this work was to develop an analytical separation method for studying and analysing the reaction products formed between an oxidising agent and a solvent used in a certain manufacturing process. A further aim was to examine the safety of the process conditions. The literature part discusses various organic peroxides, their uses, and the issues to be considered when using them. It also reviews the most common analytical methods that have been applied to analysing different peroxides. These methods have mostly been used for liquid samples; gas and solid samples have been analysed more rarely. In the experimental part, an identification method for peroxide compounds was developed on the basis of the literature, and the process samples were examined. Iodometric titration and an HPLC-UV-MS method were chosen as the analytical methods, and test strips suitable for peroxide measurement were also used. The study showed that, based on the iodometric titration and the test strips, the samples contained small amounts of peroxides one week after the peroxide addition. Based on the HPLC-UV-MS analyses, the analysis of the samples was interfered with by cellulose, which was found in every sample.
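As a hedged illustration of the iodometric titration mentioned above (not taken from the thesis itself), the peroxide content of a sample is commonly computed from the thiosulfate titrant consumption; a minimal Python sketch of that standard calculation:

```python
# Illustrative sketch, not from the thesis: peroxide value from an
# iodometric titration using the common formula
#   PV (meq active O2 / kg) = (V_sample - V_blank) * N * 1000 / m
def peroxide_value(v_sample_ml: float, v_blank_ml: float,
                   normality: float, sample_mass_g: float) -> float:
    """Peroxide value in milliequivalents of active oxygen per kg of sample."""
    return (v_sample_ml - v_blank_ml) * normality * 1000.0 / sample_mass_g

# Hypothetical example: 1.8 mL of 0.01 N thiosulfate, 0.1 mL blank, 5 g sample
print(peroxide_value(1.8, 0.1, 0.01, 5.0))  # -> 3.4 meq/kg
```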
Abstract:
Statistical analyses of measurements that can be described by statistical models are essential in astronomy and in scientific inquiry in general. The sensitivity of such analyses, the modelling approaches, and the consequent predictions sometimes depends heavily on the exact techniques applied, and improvements therein can result in a significantly better understanding of the observed system of interest. In particular, optimising the sensitivity of statistical techniques in detecting the faint signatures of low-mass planets orbiting nearby stars is, together with improvements in instrumentation, essential for estimating the properties of the population of such planets, and in the race to detect Earth analogs, i.e. planets that could support liquid water and, perhaps, life on their surfaces. We review the developments in Bayesian statistical techniques applicable to the detection of planets orbiting nearby stars and to astronomical data analysis problems in general. We also discuss these techniques and demonstrate their usefulness by using various examples and detailed descriptions of the respective mathematics involved. We demonstrate the practical aspects of Bayesian statistical techniques by describing several algorithms and numerical techniques, as well as theoretical constructions, for the estimation of model parameters and for hypothesis testing. We also apply these algorithms to Doppler measurements of nearby stars to show how they can be used in practice to obtain as much information from the noisy data as possible. Bayesian statistical techniques are powerful tools for analysing and interpreting noisy data and should be preferred in practice whenever computational limitations are not too restrictive.
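To make the parameter-estimation side concrete, here is a minimal Metropolis-Hastings sketch in Python (an illustration under simplifying assumptions, not the algorithms of the thesis): it estimates the semi-amplitude of a circular-orbit radial-velocity signal with a known period from noisy synthetic Doppler data.

```python
# Minimal Metropolis-Hastings sketch (illustrative only): estimate the
# semi-amplitude K of a radial-velocity signal v(t) = K*sin(2*pi*t/P)
# from noisy data, with the period P assumed known.
import numpy as np

rng = np.random.default_rng(0)
P, K_true, sigma = 10.0, 3.0, 1.0          # assumed period, amplitude, noise (m/s)
t = np.sort(rng.uniform(0, 100, 60))
v = K_true * np.sin(2 * np.pi * t / P) + rng.normal(0, sigma, t.size)

def log_posterior(K):
    if not 0.0 < K < 100.0:                # uniform prior on (0, 100)
        return -np.inf
    resid = v - K * np.sin(2 * np.pi * t / P)
    return -0.5 * np.sum(resid**2) / sigma**2

chain, K = [], 1.0
logp = log_posterior(K)
for _ in range(20_000):
    K_prop = K + rng.normal(0, 0.2)        # Gaussian random-walk proposal
    logp_prop = log_posterior(K_prop)
    if np.log(rng.uniform()) < logp_prop - logp:
        K, logp = K_prop, logp_prop
    chain.append(K)

burned = np.array(chain[5_000:])
print(f"K = {burned.mean():.2f} +/- {burned.std():.2f} m/s")
```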
Abstract:
The papermaking industry has been continuously developing intelligent solutions to characterize the raw materials it uses, to control the manufacturing process in a robust way, and to guarantee the desired quality of the end product. Based on much improved imaging techniques and image-based analysis methods, it has become possible to look inside the manufacturing pipeline and propose more effective alternatives to human expertise. This study is focused on the development of image analysis methods for the pulping process of papermaking. Pulping starts with wood disintegration to form the fiber suspension, which is subsequently bleached, mixed with additives and chemicals, and finally dried and shipped to the papermaking mills. At each stage of the process it is important to analyze the properties of the raw material to guarantee the product quality. In order to evaluate the properties of fibers, the main component of the pulp suspension, a framework for fiber characterization based on microscopic images is proposed in this thesis as the first contribution. The framework allows computation of fiber length and curl index correlating well with the ground truth values. The bubble detection method, the second contribution, was developed in order to estimate the gas volume at the delignification stage of the pulping process based on high-resolution in-line imaging. The gas volume was estimated accurately and the solution enabled just-in-time process termination, whereas accurate estimation of bubble size categories remained challenging. As the third contribution of the study, optical flow computation was studied and the methods were successfully applied to pulp flow velocity estimation based on double-exposed images. Finally, a framework for classifying dirt particles in dried pulp sheets, including semisynthetic ground truth generation, feature selection, and a performance comparison of state-of-the-art classification techniques, was proposed as the fourth contribution. The framework was successfully tested on semisynthetic and real-world pulp sheet images. These four contributions assist in developing integrated factory-level vision-based process control.
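As an illustration of the fiber characterization idea (a sketch under assumed definitions, not the thesis code): given a fiber centerline extracted from a microscopic image, the contour length and a curl index can be computed as below, using the common definition curl = contour length / end-to-end distance - 1.

```python
# Illustrative sketch (assumed definitions, not the thesis implementation):
# fiber length and curl index from an ordered centerline polyline.
import numpy as np

def fiber_length_and_curl(centerline: np.ndarray, mm_per_px: float):
    """centerline: (N, 2) array of ordered (row, col) pixel coordinates."""
    steps = np.diff(centerline, axis=0)
    contour = np.sum(np.linalg.norm(steps, axis=1)) * mm_per_px
    end_to_end = np.linalg.norm(centerline[-1] - centerline[0]) * mm_per_px
    curl = contour / end_to_end - 1.0
    return contour, curl

# Example: a quarter-circle arc of radius 100 px at 0.005 mm/px
theta = np.linspace(0, np.pi / 2, 200)
arc = np.stack([100 * np.cos(theta), 100 * np.sin(theta)], axis=1)
length, curl = fiber_length_and_curl(arc, 0.005)
print(f"length = {length:.3f} mm, curl index = {curl:.3f}")
```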
Abstract:
Stochastic approximation methods for stochastic optimization are considered. The main stochastic approximation methods are reviewed: the stochastic quasi-gradient (SQG) algorithm, the Kiefer-Wolfowitz algorithm with adaptive rules, and the simultaneous perturbation stochastic approximation (SPSA) algorithm. A model and solution for the retailer's profit optimization problem are suggested, and an application of the SQG algorithm to optimization problems whose objective functions are given in the form of an ordinary differential equation is considered.
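A minimal SPSA sketch in Python (illustrative, following the standard formulation rather than any code from the work): two noisy function evaluations per iteration yield a gradient estimate regardless of the problem dimension.

```python
# Minimal SPSA sketch: minimize a noisy objective with only two function
# evaluations per iteration, using a simultaneous random perturbation.
import numpy as np

rng = np.random.default_rng(1)

def noisy_loss(x):
    return np.sum((x - 2.0) ** 2) + rng.normal(0, 0.1)  # true minimum at (2, 2)

x = np.zeros(2)
for k in range(1, 1001):
    a_k = 0.1 / k ** 0.602                   # commonly used gain exponents
    c_k = 0.1 / k ** 0.101
    delta = rng.choice([-1.0, 1.0], size=x.size)  # Rademacher perturbation
    g_hat = (noisy_loss(x + c_k * delta)
             - noisy_loss(x - c_k * delta)) / (2 * c_k * delta)
    x = x - a_k * g_hat                      # descent step with estimated gradient

print(x)  # close to [2, 2]
```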
Abstract:
Protein engineering aims to improve the properties of enzymes and affinity reagents by genetic changes. Typical engineered properties are affinity, specificity, stability, expression, and solubility. Because proteins are complex biomolecules, the effects of specific genetic changes are seldom predictable. Consequently, a popular strategy in protein engineering is to create a library of genetic variants of the target molecule and subject the population to a selection process that sorts the variants by the desired property. This technique, called directed evolution, is a central tool for refining protein-based products used in a wide range of applications from laundry detergents to anti-cancer drugs. New methods are continuously needed to generate larger gene repertoires and compatible selection platforms to shorten the development timeline for new biochemicals. In the first study of this thesis, primer extension mutagenesis was revisited to establish higher-quality gene variant libraries in Escherichia coli cells. In the second study, recombination was explored as a method to expand the number of screenable enzyme variants. A selection platform was developed to improve antigen binding fragment (Fab) display on filamentous phages in the third article and, in the fourth study, novel design concepts were tested with two differentially randomized recombinant antibody libraries. Finally, in the last study, the performance of the same antibody repertoire was compared in phage display selections as a genetic fusion to different phage capsid proteins and in different antibody formats, Fab vs. single-chain variable fragment (scFv), in order to identify the most suitable display platform for the library at hand. As a result of the studies, a novel gene library construction method, termed selective rolling circle amplification (sRCA), was developed. The method increases the mutagenesis frequency to close to 100% in the final library and the number of transformants over 100-fold compared to traditional primer extension mutagenesis. In the second study, Cre/loxP recombination was found to be an appropriate tool for resolving the DNA concatemer resulting from error-prone RCA (epRCA) mutagenesis into monomeric circular DNA units for higher-efficiency transformation into E. coli. Library selections against antigens of various sizes in the fourth study demonstrated that diversity placed closer to the antigen binding site of antibodies supports the generation of antibodies against haptens and peptides, whereas diversity at more peripheral locations is better suited for targeting proteins. The conclusion from a comparison of the display formats was that the truncated capsid protein three (p3Δ) of filamentous phage was superior to the full-length p3 and protein nine (p9) in obtaining a high number of uniquely specific clones. Especially for digoxigenin, a difficult hapten target, the antibody repertoire as scFv-p3Δ provided the clones with the highest binding affinity. This thesis on the construction, design, and selection of gene variant libraries contributes to the practical know-how in directed evolution and contains useful information for scientists in the field to support their undertakings.
Abstract:
This study focused on identifying various system boundaries and evaluating methods of estimating the energy performance of biogas production. First, the output-input ratio method used for evaluating energy performance at the different system boundaries was reviewed. Secondly, ways to assess the efficiency of biogas use and the parasitic energy demand were investigated. Thirdly, an approach for comparing biogas production to other energy production methods was evaluated. Data from an existing biogas plant, located in Finland, was used for the evaluation of the methods. The results indicate that calculating and comparing the output-input ratios (Rpr1, Rpr2, Rut, Rpl and Rsy) can be used in evaluating the performance of a biogas production system. In addition, the parasitic energy demand calculations (w) and the efficiency of utilizing the produced biogas (η) provide detailed information on the energy performance of the biogas plant. Furthermore, Rf and the energy output in relation to the total solid mass of feedstock (FO/TS) are useful in comparing biogas production with other energy recovery technologies. In conclusion, for biogas plants to be comparable, it is essential that their energy performance be calculated in a more consistent manner in the future.
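Since the abstract does not define the boundary-specific ratios, the following Python sketch only illustrates the generic pattern behind them; all helper names and figures are hypothetical assumptions, not values or definitions from the studied plant.

```python
# Hedged sketch: the generic pattern behind boundary-specific output-input
# ratios (R = energy output / energy input for a chosen system boundary),
# the parasitic demand w, and the utilization efficiency eta. Definitions
# below are assumptions for illustration only.
def output_input_ratio(output_mwh: float, input_mwh: float) -> float:
    return output_mwh / input_mwh

def parasitic_share(parasitic_mwh: float, produced_mwh: float) -> float:
    """w: share of produced energy consumed by the plant itself (assumed)."""
    return parasitic_mwh / produced_mwh

def utilization_efficiency(utilized_mwh: float, produced_mwh: float) -> float:
    """eta: fraction of produced biogas energy actually utilized (assumed)."""
    return utilized_mwh / produced_mwh

# Hypothetical plant-level example
print(output_input_ratio(4200.0, 1500.0))      # plant-boundary ratio ~2.8
print(parasitic_share(600.0, 4200.0))          # w ~0.14
print(utilization_efficiency(3500.0, 4200.0))  # eta ~0.83
```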
Abstract:
In today's logistics environment, there is a tremendous need for accurate cost information and cost allocation. Companies searching for the proper solution often come across activity-based costing (ABC) or one of its variations, which utilizes cost drivers to allocate the costs of activities to cost objects. The selection of appropriate cost drivers is essential for allocating costs accurately and reliably and for realizing the benefits of the costing system. The purpose of this study is to validate the transportation cost drivers of a Finnish wholesaler company and ultimately select the best possible driver alternatives for the company. The use of cost driver combinations as an alternative is also studied. The study is conducted as part of the case company's applied ABC project, using statistical research as the main research method supported by a theoretical, literature-based method. The main research tools featured in the study are simple and multiple regression analyses, which, together with a practicality analysis based on the literature and observations, form the basis for the advanced methods. The results suggest that the most appropriate cost driver alternatives are delivery drops and internal delivery weight. The use of cost driver combinations is not recommended, as they do not provide substantially better results while increasing measurement costs, complexity, and the load of use. The use of internal freight cost drivers is also questionable, as the results indicate a weakening trend in their cost allocation capabilities towards the end of the period. Therefore, more research on internal freight cost drivers should be conducted before taking them into use.
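A brief Python sketch of the validation approach (hypothetical data, not the case company's figures): each candidate driver is regressed against the period's transportation costs and judged by its coefficient of determination.

```python
# Illustrative sketch with hypothetical data: validate transportation cost
# drivers via ordinary least squares and compare R^2 values.
import numpy as np

# Monthly totals: delivery drops, delivery weight (t), transportation cost (EUR)
drops  = np.array([120, 135, 150, 110, 160, 145, 130, 155])
weight = np.array([310, 345, 400, 290, 420, 380, 335, 405])
cost   = np.array([24_500, 27_100, 30_200, 22_800, 31_900, 29_400, 26_300, 30_900])

def r_squared(X, y):
    X = np.column_stack([np.ones(len(y)), X])     # add intercept column
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares
    resid = y - X @ beta
    return 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)

print(f"drops alone:  R^2 = {r_squared(drops, cost):.3f}")
print(f"weight alone: R^2 = {r_squared(weight, cost):.3f}")
print(f"combination:  R^2 = {r_squared(np.column_stack([drops, weight]), cost):.3f}")
```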
Abstract:
Today, lean philosophy has gathered a lot of popularity and interest in many industries. This customer-oriented philosophy helps to understand customers' value creation, which can be used to improve efficiency. A comprehensive study of lean and lean methods in the service industry was conducted in this research. In the theoretical part, lean philosophy is studied at different levels, which helps to understand its diversity. To support lean, this research also presents the basic concepts of process management. Lastly, the theoretical part presents a development model to support process development in a systematic way. The empirical part of the study was carried out by taking experimental measurements during the service center's product return process and analyzing the resulting data. The measurements were used to map out factors that have a negative influence on the process flow. Several development propositions were discussed to remove these factors. Problems mainly occur due to challenges in controlling customers and due to the lack of responsibility and continuous improvement at the operational level. The development propositions concern such factors as changes in the service center's physical environment, standardization of work tasks, and training. These factors will remove waste in the product return process and support the idea of continuous improvement.
Abstract:
It can generally be said that a large share of software development projects fail. This is due to the time constraints placed on development, changing requirements, and rapidly changing technology. The Scrum method is one option for developing software in a constantly changing environment. The rules of the Scrum method are easy to learn, but effective mastery of the method requires practice. The means of knowledge management can, however, be used to make fuller exploitation of the Scrum method more effective. The study was carried out as a case study using qualitative methods.
Abstract:
Panel at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
The purpose of this thesis was to study the design of demand forecasting processes and the management of demand. In the literature review, different processes were identified and forecasting methods and techniques were examined. The role of the bullwhip effect in the supply chain was also identified, along with ways to manage it through information sharing. The empirical part of the study first describes the current situation and challenges in the case company. After that, a new way to handle demand is introduced through target budget creation, and it is shown how information sharing for five products and a few customers would bring benefits to the company. A new S&OP process, and an organization for it, were also created within this study.
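As a hedged illustration of the forecasting-technique side of such a review (not a method attributed to the thesis), a minimal simple exponential smoothing sketch in Python:

```python
# Illustrative sketch, not from the thesis: simple exponential smoothing,
# one of the basic statistical demand forecasting techniques.
def exponential_smoothing(demand, alpha=0.3):
    """Return one-step-ahead forecasts; alpha weights recent observations."""
    forecast = [demand[0]]                      # initialize with first actual
    for d in demand[:-1]:
        forecast.append(alpha * d + (1 - alpha) * forecast[-1])
    return forecast

history = [100, 96, 108, 115, 104, 99, 112]    # hypothetical demand series
print(exponential_smoothing(history))
```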
Abstract:
Presentation at "Soome-ugri keelte andmebaasid ja e-leksikograafia" (Databases of Finno-Ugric languages and e-lexicography) at Eesti Keele Instituut (Institute of the Estonian Language) in Tallinn on the 18th of November 2014.
Abstract:
Innovative gas cooled reactors, such as the pebble bed reactor (PBR) and the gas cooled fast reactor (GFR), offer higher efficiency and new application areas for nuclear energy. Numerical methods were applied and developed to analyse the specific features of these reactor types with fully three-dimensional calculation models. In the first part of this thesis, the discrete element method (DEM) was used for a physically realistic modelling of the packing of fuel pebbles in PBR geometries, and methods were developed for utilising the DEM results in subsequent reactor physics and thermal-hydraulics calculations. In the second part, the flow and heat transfer for a single gas cooled fuel rod of a GFR were investigated with computational fluid dynamics (CFD) methods. An in-house DEM implementation was validated and used for packing simulations, in which the effect of several parameters on the resulting average packing density was investigated. The restitution coefficient was found to have the most significant effect. The results can be utilised in further work to obtain a pebble bed with a specific packing density. The packing structures of selected pebble beds were also analysed in detail, and local variations in the packing density were observed, which should be taken into account especially in reactor core thermal-hydraulic analyses. Two open source DEM codes were used to produce stochastic pebble bed configurations to add realism and improve the accuracy of criticality calculations performed with the Monte Carlo reactor physics code Serpent. The Russian ASTRA criticality experiments were calculated. Pebble beds corresponding to the experimental specifications within measurement uncertainties were produced in DEM simulations and successfully exported into the subsequent reactor physics analysis. With the developed approach, two typical issues in Monte Carlo reactor physics calculations of pebble bed geometries were avoided. A novel method was developed and implemented as a MATLAB code to calculate porosities in the cells of a CFD calculation mesh constructed over a pebble bed obtained from DEM simulations. The code was further developed to distribute power and temperature data accurately between discrete-based reactor physics and continuum-based thermal-hydraulics models to enable coupled reactor core calculations. The developed method was also found useful for analysing sphere packings in general. CFD calculations were performed to investigate the pressure losses and heat transfer in three-dimensional air cooled smooth and rib-roughened rod geometries, housed inside a hexagonal flow channel representing a sub-channel of a single fuel rod of a GFR. The CFD geometry represented the test section of the L-STAR experimental facility at Karlsruhe Institute of Technology, and the calculation results were compared to the corresponding experimental results. Knowledge was gained of the adequacy of various turbulence models and of the modelling requirements and issues related to the specific application. The obtained pressure loss results were in relatively good agreement with the experimental data. Heat transfer in the smooth rod geometry was somewhat underpredicted, which can partly be explained by unaccounted heat losses and uncertainties. In the rib-roughened geometry, heat transfer was severely underpredicted by the realisable k-ε turbulence model used.
An additional calculation with a v²-f turbulence model showed a significant improvement in the heat transfer results, which is most likely due to the better performance of the model in separated flow problems. Further investigations are suggested before using CFD to draw conclusions about the heat transfer performance of rib-roughened GFR fuel rod geometries. It is suggested that the viewpoints of numerical modelling be included in the planning of experiments to ease the challenging model construction and simulations and to avoid introducing additional sources of uncertainty. To facilitate the use of advanced calculation approaches, multi-physical aspects of the experiments should also be considered and documented in reasonable detail.
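A hedged Python sketch of the porosity-mapping idea described above (the thesis implemented it as a MATLAB code; this is an independent illustration under simplifying assumptions): the solid fraction of each mesh cell laid over a sphere packing is estimated by Monte Carlo point sampling.

```python
# Hedged sketch, not the thesis code: porosity of an axis-aligned CFD mesh
# cell inside a packing of equal spheres, via Monte Carlo point sampling.
import numpy as np

rng = np.random.default_rng(2)

def cell_porosity(cell_min, cell_max, centers, radius, n_samples=20_000):
    """Porosity (void fraction) of a box cell given sphere centers (N, 3)
    and a common sphere radius."""
    pts = rng.uniform(cell_min, cell_max, size=(n_samples, 3))
    # A sampled point is solid if it lies inside any sphere.
    d2 = np.sum((pts[:, None, :] - centers[None, :, :]) ** 2, axis=2)
    solid = np.any(d2 <= radius ** 2, axis=1)
    return 1.0 - solid.mean()

# Hypothetical example: two pebbles inside a unit cell
centers = np.array([[0.3, 0.3, 0.3], [0.7, 0.7, 0.7]])
print(cell_porosity(np.zeros(3), np.ones(3), centers, radius=0.25))
```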
Abstract:
The aim of this work is to apply approximate Bayesian computation (ABC) in combination with Markov chain Monte Carlo (MCMC) methods in order to estimate the parameters of tuberculosis transmission. The methods are applied to the San Francisco data, and the results are compared with the outcomes of previous works. Moreover, a methodological idea aimed at reducing computational time is also described. Although this approach is shown to work appropriately, further analysis is needed to understand and test its behaviour in different cases. Some suggestions for its further enhancement are described in the corresponding chapter.
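A minimal ABC-MCMC sketch in Python (a toy illustration, not the tuberculosis transmission model of the thesis): proposals are accepted without any likelihood evaluation whenever the prior allows them and a simulated summary statistic falls within a tolerance of the observed one.

```python
# Toy ABC-MCMC sketch: infer the rate of a Poisson process without a
# likelihood, accepting proposals whose simulated summary is within eps
# of the observed summary (Marjoram-style acceptance, symmetric proposal).
import numpy as np

rng = np.random.default_rng(3)
observed = rng.poisson(4.0, 100)                  # synthetic "data", true rate 4
s_obs = observed.mean()                           # summary statistic

def simulate_summary(rate):
    return rng.poisson(rate, 100).mean()

def log_prior(rate):
    return 0.0 if 0.0 < rate < 20.0 else -np.inf  # uniform prior on (0, 20)

eps, rate = 0.2, float(s_obs)                     # start near the data
chain = []
for _ in range(20_000):
    prop = rate + rng.normal(0, 0.5)              # symmetric random-walk proposal
    # Accept iff the prior allows the proposal AND the simulation matches
    # (short-circuit keeps the simulator safe from negative rates).
    if (np.isfinite(log_prior(prop))
            and abs(simulate_summary(prop) - s_obs) < eps):
        rate = prop
    chain.append(rate)

posterior = np.array(chain[5_000:])
print(f"rate = {posterior.mean():.2f} +/- {posterior.std():.2f}")
```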