9 results for optimering
Abstract:
The analyses of the holding times in the normalizing furnace show that in most cases the holding times can be shortened, depending on whether the heat treatment is approved at a temperature 15 °C or 5 °C below the setpoint. If 5 °C is chosen, some furnace times may instead need to be extended. There is no reason to hold the plates in the furnace longer than required, since this can lead to grain growth and a brittle material. The investigation shows that the excess holding time increases with plate thickness. The temperature-deviation diagram for the heat treatment shows that, within the thickness range 30-50 mm, the difference between the plate temperature and the setpoint at the end of the holding time is largest. It should be kept in mind, however, that the pyrometer only measures the surface temperature; the core temperature is probably somewhat lower, and some additional holding time is then required to ensure a homogeneous temperature through the whole plate. Following a successful experiment in which a plate was levelled after being heated to 650 °C, a simulation was carried out in the software Steeltemp 2D. The simulation shows cooling curves from 600 °C, for a given analysis code, for different plate thicknesses. The temperature of the experimental plate when entering the levelling machine was 350 °C. The results presented in the Steeltemp diagram show that there is no risk of the plate temperature falling below 350 °C.
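The abstract reports the Steeltemp 2D results without reproducing the model. Purely as an illustration of the kind of check involved, the sketch below uses a simple lumped-capacitance cooling estimate, which is an assumption of this example and not the Steeltemp 2D method, with assumed steel properties, heat-transfer coefficient and transfer time, to test whether a plate cooling from 600 °C stays above 350 °C.

```python
# Hedged illustration only: a lumped-capacitance cooling estimate for a steel
# plate, NOT the Steeltemp 2D model used in the study. All property values,
# the heat-transfer coefficient and the transfer time are assumed.
import math

def plate_temperature(t_s, thickness_m, T0=600.0, T_amb=20.0,
                      h=25.0, rho=7850.0, cp=600.0):
    """Temperature (deg C) of a plate after t_s seconds of cooling.

    Lumped model: both faces exposed, so the characteristic length is
    half the thickness. h, rho and cp are assumed typical values.
    """
    L_c = thickness_m / 2.0            # characteristic length (m)
    tau = rho * cp * L_c / h           # time constant (s)
    return T_amb + (T0 - T_amb) * math.exp(-t_s / tau)

if __name__ == "__main__":
    # Check whether 30 mm and 50 mm plates stay above 350 deg C after an
    # assumed 10-minute transfer to the levelling machine.
    for thickness in (0.030, 0.050):
        T = plate_temperature(600.0, thickness)
        print(f"{thickness*1000:.0f} mm plate after 600 s: {T:.0f} °C "
              f"({'above' if T > 350.0 else 'below'} 350 °C)")
```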
Abstract:
Metabolomics is a rapidly growing research field that studies the response of biological systems to environmental factors, disease states and genetic modifications. It aims at measuring the complete set of endogenous metabolites, i.e. the metabolome, in a biological sample such as plasma or cells. Because metabolites are the intermediates and end products of biochemical reactions, metabolite compositions and metabolite levels in biological samples can provide a wealth of information on ongoing processes in a living system. Due to the complexity of the metabolome, metabolomic analysis poses a challenge to analytical chemistry. Adequate sample preparation is critical to accurate and reproducible analysis, and the analytical techniques must have high resolution and sensitivity to allow detection of as many metabolites as possible. Furthermore, as the information contained in the metabolome is immense, the data sets collected in metabolomic studies are very large. In order to extract the relevant information from such large data sets, efficient data processing and multivariate data analysis methods are needed. In the research presented in this thesis, metabolomics was used to study mechanisms of polymeric gene delivery to retinal pigment epithelial (RPE) cells. The aim of the study was to detect differences in metabolomic fingerprints between transfected cells and non-transfected controls, and thereafter to identify the metabolites responsible for the discrimination. The plasmid pCMV-β was introduced into RPE cells using the vector polyethyleneimine (PEI). The samples were analyzed using high performance liquid chromatography (HPLC) and ultra performance liquid chromatography (UPLC) coupled to a triple quadrupole (QqQ) mass spectrometer (MS). The software MZmine was used for raw data processing, and principal component analysis (PCA) was used for statistical data analysis. The results revealed differences in metabolomic fingerprints between transfected cells and non-transfected controls. However, reliable fingerprinting data could not be obtained because of low analysis repeatability. Therefore, no attempts were made to identify the metabolites responsible for the discrimination between sample groups. The repeatability and accuracy of the analyses can be improved by protocol optimization; in this study, however, optimization of the analytical methods was hindered by the very small number of samples available for analysis. In conclusion, this study demonstrates that obtaining reliable fingerprinting data is technically demanding, and the protocols need to be thoroughly optimized in order to reach the goal of gaining information on the mechanisms of gene delivery.
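As a small illustration of the fingerprinting workflow described above (a peak-intensity table in, PCA scores out), the following sketch applies PCA to a synthetic samples-by-features matrix of the kind MZmine can export; the random data, group sizes and labels are assumptions for the example, not data from the study.

```python
# Hedged sketch of the fingerprinting step: PCA on a (samples x features)
# peak-intensity matrix. The random data and group labels are placeholders,
# not study data, and the pretreatment choice is an assumption.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical intensity matrix: 6 transfected and 6 control samples,
# 200 detected metabolite features each.
transfected = rng.lognormal(mean=5.0, sigma=1.0, size=(6, 200))
controls    = rng.lognormal(mean=5.2, sigma=1.0, size=(6, 200))
X = np.vstack([transfected, controls])
groups = ["PEI/pCMV-beta"] * 6 + ["control"] * 6

# Autoscaling (unit variance per feature) is one common pretreatment choice.
X_scaled = StandardScaler().fit_transform(X)

pca = PCA(n_components=2)
scores = pca.fit_transform(X_scaled)

for label, (pc1, pc2) in zip(groups, scores):
    print(f"{label:15s} PC1={pc1:7.2f}  PC2={pc2:7.2f}")
print("explained variance ratio:", pca.explained_variance_ratio_)
```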
Abstract:
Company X develops a laboratory information system (LIS) called System Y. The information system has a two-tier database architecture consisting of a production database and a historical database. A database constitutes the backbone of an information system, which makes the design of the database very important; a poorly designed database can cause major problems within an organization. The two databases in System Y are poorly modeled, particularly the historical database. The cause of the poor modeling was unclear concepts. The unclear concepts have remained in the database and in the company organization and have caused a general confusion of concepts. The split database architecture itself has evolved into a bottleneck and is the cause of many problems during the development of System Y. Company X is investigating the possibility of integrating the historical database with the production database. The goal of our thesis is to conduct a consequence analysis of such an integration and its effects on System Y, and to create a new design for the integrated database. We also examine and describe the practical effects of the confusion of concepts on the conceptual design of a database. To achieve the goal of the thesis, five method steps have been performed: a preliminary study of the organization, a change analysis, a consequence analysis and an investigation of the conceptual design of the database. These method steps have helped identify the changes necessary for the organization, a new design proposal for an integrated database, the impact of the proposed design and a number of effects of the confusion of concepts on the database.
Abstract:
The annual energy savings of a traditional Swedish solar combisystem can be improved from just under 20 % for a reference system to over 25 %. All of the studied systems have 10 m² of solar collectors, equally sized tanks, and the same space-heating and domestic hot water loads; the differences lie solely in the system design. The work was carried out through measurements in a heating laboratory and through simulations. Within the area of optical design of solar energy systems, methods have been developed for analysis of the sky distribution and the asymmetric annual distribution of solar irradiation, optimization of solar collectors with reflectors, and optimization of solar cells with add-on reflectors. The program PRESIM, a graphical input data processor for the simulation program TRNSYS, has been further developed in line with user requests, but the conditions for continued development have deteriorated. An improved version, partly funded by the Swedish Energy Agency (Statens energimyndighet) and adapted to TRNSYS 15.0, will be released during 2000, but thereafter activity will be at a lower level.
Abstract:
This thesis deals with the control of stock in a warehouse, focusing on the placement of articles. The purpose of the thesis is to reduce the transport distance within the main warehouse during order picking. This is achieved by restructuring the article placement with regard to how frequently each article is picked and how much it weighs. The literature and the data collected from the company's business system have laid the foundation for the thesis; interviews and observations also contributed to the data collection. To fulfil the aim and produce useful results, two research questions were formulated: which attributes should determine an article's position in the warehouse, and how can a more effective storage structure be obtained? The authors have jointly produced a set of suggestions for future article placement in terms of picking frequency and weight. Initially, a situation analysis was conducted to identify known problems with the article placement and storage system. The problem identified was that article placement takes no account of picking frequency; articles were spread throughout the whole warehouse. To determine the most frequently picked articles, an ABC analysis was conducted. To take the additional criterion, weight, into account, a multi-criteria analysis was performed in combination with the ABC analysis. The results of the combined analysis provided the basis for drawing up concepts for future article placement. The proposal includes optimised placement in different zones of the most frequently picked articles, with weight as an additional criterion.
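As an illustration of the placement logic described above, the sketch below combines an ABC classification on picking frequency with a simple weight criterion to suggest storage zones; the articles, thresholds and zone names are illustrative assumptions, not the company's actual data or the exact multi-criteria model used in the thesis.

```python
# Hedged sketch: ABC classification on picking frequency combined with a
# weight criterion to suggest a zone. All data and thresholds are assumed.
from dataclasses import dataclass

@dataclass
class Article:
    article_id: str
    picks_per_year: int
    weight_kg: float

def abc_class(articles, a_share=0.8, b_share=0.95):
    """Classify articles as A/B/C by cumulative share of total picks."""
    total = sum(a.picks_per_year for a in articles)
    ranked = sorted(articles, key=lambda a: a.picks_per_year, reverse=True)
    classes, cumulative = {}, 0
    for art in ranked:
        cumulative += art.picks_per_year
        share = cumulative / total
        classes[art.article_id] = ("A" if share <= a_share
                                   else "B" if share <= b_share else "C")
    return classes

def suggest_zone(article, abc):
    """Combine the ABC class with a weight criterion (heavy items low/near entrance)."""
    heavy = article.weight_kg > 15.0          # assumed weight threshold
    if abc == "A":
        return "zone 1 (entrance), low level" if heavy else "zone 1 (entrance)"
    if abc == "B":
        return "zone 2, low level" if heavy else "zone 2"
    return "zone 3 (rear)"

articles = [Article("X100", 1200, 2.0), Article("X200", 900, 25.0),
            Article("X300", 150, 8.0),  Article("X400", 20, 40.0)]
classes = abc_class(articles)
for art in articles:
    print(art.article_id, classes[art.article_id], "->",
          suggest_zone(art, classes[art.article_id]))
```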
Abstract:
The use of multi-material structures in industry, especially the automotive industry, is increasing. To overcome the difficulties in joining these structures, adhesives have several benefits over traditional joining methods. Accurate simulation of the entire fracture process, including the adhesive layer, is therefore crucial. In this paper, material parameters of a previously developed meso-mechanical finite element (FE) model of a thin adhesive layer are optimized using the Strength Pareto Evolutionary Algorithm (SPEA2). The objective functions are defined as the error between experimental data and simulation data. The experimental data come from previously performed experiments in which an adhesive layer was loaded in monotonically increasing peel and shear. The two objective functions depend on 9 model parameters (decision variables) in total and are evaluated by running two FE simulations, one loading the adhesive layer in peel and the other in shear. The original study converted the two objective functions into a single function, which resulted in a single optimal solution. In this study, however, a Pareto front is obtained by employing the SPEA2 algorithm. Thus, more insight into the material model, the objective functions, the optimal solutions and the decision space is acquired from the Pareto front. We compare the results and show good agreement with the experimental data.
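To make the multi-objective idea concrete, the sketch below evaluates two error objectives per candidate parameter set and extracts the non-dominated (Pareto-optimal) candidates; the analytic objective functions stand in for the peel and shear FE simulations, and the code is a minimal Pareto filter, not the SPEA2 implementation used in the paper.

```python
# Hedged sketch of the multi-objective setup: two error objectives (peel and
# shear) per candidate parameter set, then a non-dominated (Pareto) filter.
# The analytic "objectives" are placeholders for the two FE simulations.
import random

def objectives(params):
    """Placeholder for the two FE-based error measures f_peel, f_shear."""
    f_peel  = sum((p - 0.3) ** 2 for p in params)
    f_shear = sum((p - 0.7) ** 2 for p in params)
    return f_peel, f_shear

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    evaluated = [(p, objectives(p)) for p in population]
    return [(p, f) for p, f in evaluated
            if not any(dominates(g, f) for _, g in evaluated if g != f)]

random.seed(1)
# 9 decision variables per candidate, as in the material model described above.
population = [[random.random() for _ in range(9)] for _ in range(200)]
front = pareto_front(population)
print(f"{len(front)} non-dominated candidates out of {len(population)}")
for _, (fp, fs) in sorted(front, key=lambda t: t[1][0])[:5]:
    print(f"f_peel={fp:.3f}  f_shear={fs:.3f}")
```

In SPEA2 itself, the non-dominated set is maintained in an external archive and fitness is assigned from dominance strength and density, but the trade-off curve it produces is the Pareto front illustrated here.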
Abstract:
The mechanical behaviour and performance of a ductile iron component are highly dependent on the local variations in solidification conditions during the casting process. Here we show a framework that combines a previously developed closed chain of simulations for cast components with a micro-scale Finite Element Method (FEM) simulation of the behaviour and performance of the microstructure. A casting process simulation, including modelling of solidification and mechanical material characterization, provides the basis for a macro-scale FEM analysis of the component. A critical region is identified, to which the micro-scale FEM simulation of a representative microstructure, generated using X-ray tomography, is applied. The mechanical behaviour of the different microstructural phases is determined using a surrogate-model-based optimisation routine and experimental data. It is discussed how the approach enables a link between solidification and microstructure models and simulations of both component and microstructural behaviour, and how it can contribute new understanding of the behaviour and performance of different microstructural phases and morphologies in industrial ductile iron components in service.
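As a hedged illustration of surrogate-model-based calibration in general, the sketch below samples candidate phase parameters, evaluates a stand-in for the expensive micro-scale FE error, fits a Gaussian-process surrogate and minimises it; the parameter ranges and the stub error function are assumptions and do not represent the routine or data used in this work.

```python
# Hedged sketch of a surrogate-based calibration loop: sample candidate
# parameters, evaluate an "expensive" model (stubbed by a cheap function),
# fit a surrogate and minimise the surrogate. NOT the routine from the paper.
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_error(params):
    """Stand-in for the micro-scale FE error vs. experimental data."""
    E, sigma_y = params
    return (E - 170.0) ** 2 / 100.0 + (sigma_y - 280.0) ** 2 / 1000.0

rng = np.random.default_rng(0)
bounds = np.array([[140.0, 220.0],     # assumed stiffness-parameter range
                   [200.0, 400.0]])    # assumed strength-parameter range

# Initial design: random samples of the two phase parameters.
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(20, 2))
y = np.array([expensive_error(x) for x in X])

surrogate = GaussianProcessRegressor(normalize_y=True).fit(X, y)

def surrogate_error(x):
    # Minimise the surrogate prediction instead of the expensive model.
    return float(surrogate.predict(x.reshape(1, -1))[0])

result = minimize(surrogate_error, x0=X[np.argmin(y)], bounds=bounds)
print("surrogate optimum:", result.x, "predicted error:", result.fun)
print("true error at that point:", expensive_error(result.x))
```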
Abstract:
Purpose: The purpose of this work is to increase the possibilities of designing building components for specific demands in order to increase the building's value, and to investigate how these possibilities can be affected by automating the production process. Method: The theoretical framework on which this study is based was collected through literature studies and was thereafter combined with the empirical material, which was retrieved through qualitative methods such as interviews and planned observations. A case study was made of the building Ormhuset in Jönköping. Findings: The objective of this work is to investigate the possibilities for designing roofs by using new automation methods in the production process of wooden roof structures. This study implies that parametric design can be used to generate new innovative shapes and designs that are optimised according to specific criteria. Furthermore, increased use of automation in the production process of wooden roof trusses results in cheaper roof trusses, regardless of their shapes. The generated optimised designs are therefore cheaper and easier to produce when more automation is used in the production process. Implications: If parametric design is used, almost any kind of shape can be generated and optimised. To ensure manufacturability of a design, an early connection between architect and manufacturer is important. Furthermore, increased use of automation can lead to easier and faster production of roof trusses, and investing in more automation can be relevant for companies with large production volumes. Using digital files to control the manufacturing machines saves time. There are alternative manufacturing methods for advanced wooden roof structures that are better suited for their production, although these cannot be rationalised in the same way as roof truss production. Constraints on increased automation are often a high investment cost and limited space. Limitations: If the study had been performed on another case than Ormhuset and with other respondents, the results might have differed but could be similar; this study is therefore not generally valid but only shows one possible outcome.
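To illustrate the parametric-design idea mentioned above, the sketch below drives a roof profile with a few parameters, sweeps a small design space and scores each variant against simple criteria; the profile, criteria and limits are illustrative assumptions and do not reproduce the Ormhuset case or any particular design tool.

```python
# Hedged sketch of parametric design: a roof profile driven by a few
# parameters, swept over a design space and scored against simple criteria.
# The profile, criteria and limits are illustrative assumptions only.
import itertools
import math

def roof_profile(span, ridge_height, curvature, n=9):
    """Return (x, z) points of a parametric curved roof profile."""
    return [(x, ridge_height * math.sin(math.pi * x / span) ** curvature)
            for x in [span * i / (n - 1) for i in range(n)]]

def member_length(profile):
    """Approximate material use as the developed length of the profile."""
    return sum(math.dist(a, b) for a, b in zip(profile, profile[1:]))

span = 24.0                                   # assumed span in metres
best = None
for ridge, curv in itertools.product([3.0, 4.0, 5.0], [1.0, 1.5, 2.0]):
    profile = roof_profile(span, ridge, curv)
    length = member_length(profile)
    feasible = ridge <= 4.5                   # assumed height restriction
    if feasible and (best is None or length < best[0]):
        best = (length, ridge, curv)

print(f"shortest feasible profile: {best[0]:.1f} m "
      f"(ridge {best[1]} m, curvature exponent {best[2]})")
```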