23 results for automated correlation optimized warping
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
This Master's thesis examines automated testing and ways of making user interface testing easier on the Symbian operating system. The thesis introduces Symbian and the challenges encountered in Symbian application development. It also covers testing strategies and methods as well as automated testing. Finally, a tool is presented that makes it easier to create test cases for functional and system testing. Graphical user interfaces pose unique challenges for software testing. They are often built from complex components and are continuously redesigned during software development. Capture-and-replay tools are commonly used for testing graphical user interfaces. Designing and implementing test cases for user interface testing requires considerable effort. Because graphical user interfaces make up a large share of the code, substantial resources could be saved by making test case creation easier. The project implemented in the practical part pursues this goal by making test script creation visual. As a result, the test scripting language itself does not need to be understood, and the tests are also easier to comprehend.
Abstract:
This Master's thesis presents general principles of software testing and verification and discusses the verification of smartphone software in more detail. The thesis also introduces the Symbian operating system used in smartphones. In the practical part of the work, a server running on the Symbian operating system was designed and implemented to monitor and record the use of system resources. Verification is an important and costly task in the smartphone software development cycle. Costs can be reduced by automating part of the verification process. The implemented server automates the monitoring of system resources by recording data about them to a file while the tests are run. When the tests are run again, the new results are compared against the reference recording. If the results are not within the error margins set by the user, the user is notified. Defining the error margins and the reference recording may prove difficult. However, if they are defined appropriately, the server provides testers with useful information about deviations in system resource consumption.
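The comparison against a reference recording within user-set error margins can be illustrated with a minimal sketch. The file format, metric names and tolerance model below are assumptions made for the example, not the format used in the thesis.

```python
# Minimal sketch of baseline comparison with user-set error margins.
# Metric names and the relative-tolerance model are illustrative assumptions.

def check_against_baseline(baseline, current, tolerances):
    """Return the deviations that exceed the allowed error margins.

    baseline, current: {metric_name: measured_value}
    tolerances: {metric_name: allowed relative deviation}, e.g. 0.10 for 10 %
    """
    violations = {}
    for metric, ref in baseline.items():
        value = current.get(metric)
        if value is None:
            continue  # metric not measured in this run
        allowed = tolerances.get(metric, 0.05) * abs(ref)
        if abs(value - ref) > allowed:
            violations[metric] = (ref, value)
    return violations

baseline = {"heap_kb": 1200, "cpu_pct": 35}
current = {"heap_kb": 1450, "cpu_pct": 36}
print(check_against_baseline(baseline, current, {"heap_kb": 0.10}))
# -> {'heap_kb': (1200, 1450)}  reported to the tester as a deviation
```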
Abstract:
Over 70% of the total costs of an end product are consequences of decisions made during the design process. A search for optimal cross-sections will often have only a marginal effect on the amount of material used if the geometry of a structure is fixed and if the cross-sectional characteristics of its elements are properly designed by conventional methods. In recent years, optimal geometry has become a central area of research in the automated design of structures. It is generally accepted that no single optimisation algorithm is suitable for all engineering design problems. An appropriate algorithm must therefore be selected individually for each optimisation situation. Modelling is the most time-consuming phase in the optimisation of steel and metal structures. In this research, the goal was to develop a method and a computer program that reduce the modelling and optimisation time in structural design. The program needed an optimisation algorithm suitable for various engineering design problems. Because finite element modelling is commonly used in the design of steel and metal structures, the interaction between a finite element tool and the optimisation tool needed a practical solution. The developed method and computer programs were tested with standard optimisation tests and practical design optimisation cases. Three generations of computer programs were developed. The programs combine an optimisation problem modelling tool and an FE-modelling program using three alternative methods. The modelling and optimisation were demonstrated in the design of a new boom construction and in the steel structures of flat and ridge roofs. This thesis demonstrates that the most time-consuming modelling work is significantly reduced, modelling errors are reduced, and the results are more reliable. A new selection rule for the evolution algorithm, which eliminates the need for constraint weight factors, is tested with optimisation cases of steel structures that include hundreds of constraints. The tested algorithm can be used nearly as a black box, without parameter settings or penalty factors for the constraints.
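The abstract names a selection rule that removes the need for constraint weight factors but does not spell it out. The sketch below shows one common penalty-free approach, a feasibility-dominance tournament rule, purely as an illustration of the idea; it is not claimed to be the rule developed in the thesis.

```python
# Illustrative sketch of a penalty-free selection rule for an evolutionary
# algorithm, following the common feasibility-dominance idea (no constraint
# weight factors needed). Not the specific rule developed in the thesis.

def total_violation(constraints):
    """Sum of positive constraint violations; 0 means feasible.
    constraints: iterable of g(x) values with g(x) <= 0 required."""
    return sum(max(0.0, g) for g in constraints)

def select(a, b):
    """Pairwise tournament between candidate designs a and b.
    Each candidate is (objective_value, constraint_values)."""
    fa, ga = a[0], total_violation(a[1])
    fb, gb = b[0], total_violation(b[1])
    if ga == 0 and gb == 0:          # both feasible: lower objective wins
        return a if fa <= fb else b
    if ga == 0 or gb == 0:           # feasible always beats infeasible
        return a if ga == 0 else b
    return a if ga <= gb else b      # both infeasible: smaller violation wins

# Example: mass [kg] as objective, stress/displacement constraints g(x) <= 0
design_1 = (850.0, [-12.0, -0.3])    # feasible
design_2 = (790.0, [5.0, -0.1])      # lighter but violates a constraint
print(select(design_1, design_2))    # -> design_1
```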
Abstract:
The productivity, quality and cost efficiency of welding work are critical for the metal industry today. Welding processes must become more effective, and this can be achieved through mechanization and automation. Such systems are always expensive and must pay back the investment. It is therefore important to optimize the required intelligence, and thereby the required automation level, so that a company obtains the best profit. This intelligence and automation level was previously classified in several different ways that are not useful for optimizing the automation or mechanization of welding. In this study, the intelligence of a welding system is defined in a new way: as the intelligence needed for the welding system to produce a weld of sufficient quality. A new way is developed to classify and select the internal intelligence level of a welding system needed to produce the weld efficiently. This classification includes the possible need for human work and its effect on the weld and its quality, but does not exclude any welding processes or methods. A completely new way is also developed to calculate the optimal intelligence level needed in welding. The target of this optimization is the best possible productivity and quality together with an economically optimal solution for several different cases. The optimization method is based on the product type, economical productivity, the batch size of products, quality, and the criteria of use. Intelligence classification and optimization have never before been based on the product to be made. It is now possible to find the best type of welding system needed to weld different types of products. This calculation process is a universal way of optimizing the required automation or mechanization level when improving the productivity of welding. This study helps industry to improve the productivity, quality and cost efficiency of welding workshops.
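The kind of economic comparison described, choosing an automation level from the product and its batch size, can be sketched as a simple cost-per-product calculation. The levels, cost figures and formula below are invented for the example and are not the calculation method developed in the thesis.

```python
# Purely illustrative: pick the mechanization/automation level with the lowest
# cost per accepted product for a given batch size. All numbers are made up.

options = {  # investment [EUR], cost per weld [EUR], reject rate
    "manual":     {"investment": 0,       "unit_cost": 40.0, "reject": 0.05},
    "mechanized": {"investment": 50_000,  "unit_cost": 18.0, "reject": 0.03},
    "automated":  {"investment": 250_000, "unit_cost": 6.0,  "reject": 0.01},
}

def cost_per_good_product(opt, batch_size):
    total = opt["investment"] + opt["unit_cost"] * batch_size
    good = batch_size * (1.0 - opt["reject"])
    return total / good

for batch in (100, 5_000, 100_000):
    best = min(options, key=lambda name: cost_per_good_product(options[name], batch))
    print(batch, "->", best)   # small batches favour manual work, large ones automation
```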
Abstract:
This study presents the information required to describe the machine and device resources in the turret punch press environment, which is needed for the development of an analysis method for automated production. The description of product and device resources and their interconnections is the starting point for method comparison, cost development, production planning and the performance of optimisation. The manufacturing method cannot be optimised unless the variables and their interdependence are known. Sheet metal parts in particular can be remarkably complex, and their automatic manufacture may be difficult or, with some automated equipment, even impossible if the manufacturing properties are not known. This thesis consists of three main elements, which constitute a triangulation. In the first phase of the triangulation, manufacturing on a turret punch press is examined in order to find the factors that affect the efficiency of production. In the second phase, the manufacturability of products on turret punch presses is examined through a set of laboratory tests. The third phase involves an examination of five industrial parts. The main findings of this study are: the full efficiency of machining at a high automation level cannot be achieved unless the raw materials used in production and the dependencies of the machine and tools are well known. Machine-specific manufacturability factors for turret punch presses were not taken into account in the industrial case samples. On the grounds of the performed tests and industrial case samples, the designer of a sheet metal product can directly influence the machining time, material loss, energy consumption and the number of tools required on a turret punch press by making decisions in the way presented in the hypothesis of this study. The sheet metal parts to be produced can be optimised for manufacture on a turret punch press when the material to be used and the available machine and tool options are known. This provides in-depth, machine- and tool-specific knowledge of the machine and tool properties. None of the optimisation starting points described here is a separate entity; instead, they are all connected to each other.
Abstract:
This MSc work was done in the BIOMECON project financed by Tekes. The prime target of the research was to develop methods for the separation and determination of carbohydrates (sugars), sugar acids and alcohols, and some other organic acids in hydrolyzed pulp samples by capillary electrophoresis (CE) using UV detection. Aspen, spruce and birch pulps are commonly used for paper production in Finland. Feedstock components in pulp predominantly consist of carbohydrates, organic acids, lignin, extractives and proteins. In this study, pulps were hydrolyzed in the analytical chemistry laboratories of the UPM Company and Lappeenranta University in order to convert them into sugars, acids, alcohols and organic acids. The foremost objective of this study was to identify and quantify the main and by-products in the pulp samples. For the method development and optimization, improved precision in capillary electrophoresis was achieved by calculating calibration data for 16 sugar analytes, namely D-(-)-fructose, D-(+)-xylose, D-(+)-mannose, D-(+)-cellobiose, D-(+)-glucose, D-(+)-raffinose, D-(-)-mannitol, sorbitol, rhamnose, sucrose, xylitol, galactose, maltose, arabinose, ribose and α-lactose monohydrate, and 16 organic acids, namely D-glucuronic, oxalic, acetic, propionic, formic, glycolic, malonic, maleic, citric, L-glutamic, tartaric, succinic, adipic, ascorbic, galacturonic and glyoxylic acid. In the carbohydrate and polyalcohol analyses, the experiments were performed with CE coupled to direct UV detection and positive separation polarity in a 36 mM disodium hydrogen phosphate electrolyte solution. For the acid analyses, CE coupled to indirect UV detection with negative polarity was used, with an electrolyte solution made of 2,3-pyridinedicarboxylic acid, Ca2+ and Mg2+ salts, and myristyltrimethylammonium hydroxide in water. Under the optimized conditions, the limits of detection, relative standard deviations and correlation coefficients of each compound were measured. The optimized conditions were used for the identification and quantification of the carbohydrates and acids produced by hydrolysis of the pulp. The concentrations of the analytes varied between 1 mg and 0.138 g per litre of hydrolysate.
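The calibration figures of merit mentioned above (correlation coefficient, limit of detection) can be computed from a linear fit of detector response against standard concentrations. The sketch below uses invented numbers and a common 3.3·σ/slope estimate for the detection limit; it is an illustration of the calculation, not data or the exact procedure from the thesis.

```python
# Minimal sketch: calibration line, correlation coefficient and an LOD estimate
# for one analyte. The numbers are made up for illustration only.
import numpy as np

conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])   # mg/L standard solutions
area = np.array([1.1, 2.3, 5.4, 11.2, 22.5])       # detector response (peak area)

slope, intercept = np.polyfit(conc, area, 1)        # least-squares calibration line
pred = slope * conc + intercept
r = np.corrcoef(conc, area)[0, 1]                   # correlation coefficient
sigma = np.std(area - pred, ddof=2)                 # residual standard deviation
lod = 3.3 * sigma / slope                           # common 3.3*sigma/slope rule

print(f"slope={slope:.4f}, r={r:.4f}, LOD = {lod:.2f} mg/L")
```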
Abstract:
Forest inventories are used to estimate forest characteristics and the condition of forests for many different applications: operational tree logging for the forest industry, forest health estimation, carbon balance estimation, land-cover and land-use analysis to avoid forest degradation, and so on. Recent inventory methods rely strongly on remote sensing data combined with field sample measurements, which are used to produce estimates covering the whole area of interest. Remote sensing data from satellites, aerial photographs or aerial laser scanning are used, depending on the scale of the inventory. To be applicable in operational use, forest inventory methods need to be easily adjusted to the local conditions of the study area at hand. All data handling and parameter tuning should be objective and automated as much as possible. The methods also need to be robust when applied to different forest types. Since there generally are no comprehensive direct physical models connecting the remote sensing data from different sources to the forest parameters being estimated, the mathematical estimation models are of a "black-box" type, connecting the independent auxiliary data to the dependent response data with arbitrary linear or nonlinear models. To avoid redundant complexity and over-fitting of the model, which is based on up to hundreds of possibly collinear variables extracted from the auxiliary data, variable selection is needed. To connect the auxiliary data to the inventory parameters being estimated, field work must be performed. In large study areas with dense forests, field work is expensive and should therefore be minimized. To achieve cost-efficient inventories, field work could partly be replaced with information from previously measured sites stored in databases. The work in this thesis is devoted to the development of automated, adaptive computation methods for aerial forest inventory. The mathematical model parameter definition steps are automated, and the cost-efficiency is improved by setting up a procedure that utilizes databases in the estimation of the characteristics of new areas.
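The variable-selection problem described, picking a sparse subset from hundreds of possibly collinear auxiliary variables, can be illustrated with a standard off-the-shelf approach. The thesis does not state which selection method it uses; cross-validated LASSO and the synthetic data below are only an example of the idea.

```python
# Illustrative variable selection for a "black-box" estimation model built on
# many collinear predictors. Cross-validated LASSO is one standard option; it
# is not claimed to be the method developed in the thesis.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n_plots, n_features = 200, 80                          # field plots x remote-sensing features
X = rng.normal(size=(n_plots, n_features))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=n_plots)    # deliberately collinear pair
volume = 3.0 * X[:, 0] - 2.0 * X[:, 5] + rng.normal(scale=0.5, size=n_plots)

model = LassoCV(cv=5).fit(X, volume)                   # penalty strength chosen by cross-validation
selected = np.flatnonzero(model.coef_ != 0.0)          # features kept in the sparse model
print("selected feature indices:", selected)
```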
Abstract:
The problem of software (SW) defects is becoming more and more topical because of the increasing amount of software and its growing complexity. The majority of these defects are found during the testing phase, which consumes about 40-50% of the development effort. Test automation makes it possible to reduce the cost of this process and to increase testing effectiveness. The first tools for automated testing appeared in the mid-1980s, and automation was applied to different kinds of software testing. It soon became obvious that automated testing can cause many problems, such as increased product cost, decreased reliability and even project failure. This thesis describes the automated testing process and its concept, lists the main problems, and gives an algorithm for selecting automated test tools. The work also presents an overview of the main automated test tools for embedded systems.
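The abstract mentions an algorithm for selecting automated test tools without describing it. The sketch below shows one generic way such a selection could be scored with weighted criteria; the criteria, weights and tools are invented for the example and are not the algorithm presented in the thesis.

```python
# Purely illustrative weighted-criteria scoring for test tool selection.
criteria_weights = {"target_platform_support": 0.4, "script_reuse": 0.3,
                    "license_cost": 0.2, "reporting": 0.1}

tools = {  # scores on a 0-5 scale for each criterion (invented values)
    "Tool A": {"target_platform_support": 5, "script_reuse": 3,
               "license_cost": 2, "reporting": 4},
    "Tool B": {"target_platform_support": 3, "script_reuse": 4,
               "license_cost": 5, "reporting": 2},
}

def weighted_score(scores):
    return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

best = max(tools, key=lambda name: weighted_score(tools[name]))
print({name: round(weighted_score(s), 2) for name, s in tools.items()}, "->", best)
```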
Abstract:
This Master's thesis defines an online production optimization method for a biofuel-fired power plant. The specification work is part of a further development project for MW Power's MultiPower CHP power plant concept. From the various existing optimization approaches, a method suitable for the purpose is selected, based on a plant model and a cost function, whose results are fed into the automation system as setpoints for PID controllers. The energy and mass balances of the plant are calculated from the process measurements, and their results are used as input data for the next optimization step. The objective function of the optimization is a cost function whose terms are the revenues and costs arising from operating the power plant. The process is optimized, taking into account the limit values given to the controllers, so that the total margin is maximized. As the plant accumulates operating time and history data, the optimization can be sped up by statistically searching the history data for a moment whose conditions match the current situation. The margin of that historical moment is compared with the margin obtained from the optimization of the cost function. The setpoints calculated by the method that gives the better margin are taken into use for process control. If neither the cost function calculation nor the history-data search yields an improving margin, the setpoints they calculate are not taken into use. Instead, the optimum is sought with a deterministic optimization algorithm that searches the neighbourhood of the current operating point for controller setpoints that give a better margin. The control system can also be implemented as predictive. In the practical part of the work, the power plant model is created with two different modelling programs, one describing the behaviour of the boiler and the other the power plant process. The process values obtained from the modelling are used as input data in the calculation of the operating margin. The margin is calculated on the basis of the cost function. The largest revenues are related to the sale of electricity and heat and to the production subsidy, and the largest costs are related to the repayment of the investment and the purchase of fuel. A sensitivity analysis is performed on the cost function, in which the change in the margin is followed as the technical values of the process are varied. The results are compared with the results of verification measurements performed at a reference power plant, and it is observed that the results are not fully consistent. The differences are due both to shortcomings in the modelling and to the rather short observation periods of the measurements. The practical implementation of the automated optimization system is initiated by defining the optimization approach to be adopted, the related control loops and the required input data. The project will be continued with the programming, testing and tuning of the system in a real power plant environment, and later with the implementation of predictive control.
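The selection logic among the cost-function optimum, the history-data match and the fallback local search lends itself to a short sketch. The margin model, setpoint names and search routine below are placeholders invented for illustration; they are not the plant model or cost function of the thesis.

```python
# Minimal sketch of the setpoint-selection logic described above. The margin
# function and setpoint structure are placeholders, not the actual plant model.

def choose_setpoints(current_margin, margin_fn,
                     optimized_setpoints, history_setpoints, local_search):
    candidates = {
        "cost_function": optimized_setpoints,
        "history_match": history_setpoints,
    }
    best_name, best_sp = max(candidates.items(), key=lambda kv: margin_fn(kv[1]))
    if margin_fn(best_sp) > current_margin:
        return best_name, best_sp           # improving margin -> take into use
    # neither method improves the margin: search around the current point
    return "local_search", local_search(current_margin, margin_fn)

def margin_fn(sp):                          # placeholder EUR/h margin model
    return 100.0 - (sp["drum_pressure"] - 60.0) ** 2

def local_search(current_margin, margin_fn):
    start = {"drum_pressure": 58.0}         # current operating point (placeholder)
    best = dict(start)
    for step in (-1.0, 1.0):                # crude one-variable neighbourhood search
        trial = {"drum_pressure": start["drum_pressure"] + step}
        if margin_fn(trial) > margin_fn(best):
            best = trial
    return best

print(choose_setpoints(95.0, margin_fn,
                       {"drum_pressure": 61.0}, {"drum_pressure": 64.0},
                       local_search))
# -> ('cost_function', {'drum_pressure': 61.0})
```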
Abstract:
Bacteria can exist as planktonic cells, the lifestyle in which single cells live in suspension, and as biofilms, which are surface-attached bacterial communities embedded in a self-produced matrix. Most antibiotics and methods for antimicrobial work have been developed for planktonic bacteria. However, the majority of bacteria in natural habitats live as biofilms. Biofilms develop high resistance towards conventional antibacterial treatments dauntingly fast, and there is thus a great need for effective anti-biofilm therapy. This thesis project set out to fill the void in anti-biofilm screening methods by developing a platform of assays that evaluate the effect that screened compounds have on the total biomass, viability and extracellular polysaccharide (EPS) layer of biofilms. Additionally, a new method for studying biofilms and their interactions with compounds in a continuous-flow system was developed using capillary electrochromatography (CEC). The screening platform was applied in a screening campaign using a small library of cinchona alkaloids. The assays were optimized to be statistically robust enough for screening. The first assay, based on crystal violet staining, measures total biofilm biomass; it was automated using a liquid-handling workstation to decrease the manual workload and signal variation. The second assay, based on resazurin staining, measures the viability of the biofilm; it was thoroughly optimized for the strain used, but is then a very simple and fast method for primary screening. The fluorescent resazurin probe is not toxic to the biofilms. In fact, it was also shown in this project that staining the biofilms with resazurin prior to staining with crystal violet had no effect on the latter, so the two can be used in sequence on the same screening plate. This sequential staining step was a major improvement in the use of reagents and consumables and also shortened the working time. As a third assay in the platform, a wheat germ agglutinin based assay was added to evaluate the effect a compound has on the EPS layer. Using this assay it was found that even if compounds have a clear effect on both biomass and viability, the EPS layer can be left untouched or even increased. This clearly demonstrates the importance of using several assays in order to find "true hits" in a screening setting. In the pilot screening for antimicrobial and anti-biofilm effects using the cinchona alkaloid library, one compound was found to have an antimicrobial effect against planktonic bacteria and to prevent biofilm formation at low micromolar concentrations. A higher concentration was needed to eradicate biofilms. It was also shown that the chemical space occupied by the active compound was slightly different from that of the rest of the cinchona alkaloids, as well as the rest of the compounds used for validatory screening during the optimization of the separate assays.
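The abstract states that the assays were optimized to be statistically robust enough for screening; the Z'-factor is the standard measure of that robustness and is shown below as an illustration. Both the use of this particular metric and the control values are assumptions made for the example.

```python
# Illustrative Z'-factor calculation for assay quality; >0.5 is usually
# considered good enough for screening. The control data are invented.
import statistics

def z_prime(positive_controls, negative_controls):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    mp, mn = statistics.mean(positive_controls), statistics.mean(negative_controls)
    sp, sn = statistics.stdev(positive_controls), statistics.stdev(negative_controls)
    return 1.0 - 3.0 * (sp + sn) / abs(mp - mn)

untreated = [1.00, 0.95, 1.05, 0.98]   # e.g. resazurin signal of untreated biofilm wells
killed = [0.10, 0.12, 0.09, 0.11]      # fully inhibited control wells
print(f"Z' = {z_prime(untreated, killed):.2f}")   # ~0.82 for this invented data
```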
Abstract:
The importance of efficient supply chain management has increased due to globalization and the blurring of organizational boundaries. Various supply chain management technologies have been identified as drivers of organizational profitability and financial performance. Organizations have historically concentrated heavily on the flow of goods and services, while less attention has been dedicated to the flow of money. As supply chains become more transparent and automated, new opportunities for financial supply chain management have emerged through information technology solutions and comprehensive financial supply chain management strategies. This research concentrates on the end part of the purchasing process: the handling of invoices. Efficient invoice processing can have an impact on an organization's working capital management and thus provide companies with better readiness to face the challenges related to cash management. Leveraging a process mining solution, the aim of this research was to examine the automated invoice handling process of four different organizations. The invoice data was collected from each organization's invoice processing system. The sample included all the invoices the organizations had processed during the year 2012. The main objective was to find out whether e-invoices are faster to process in an automated invoice processing solution than scanned invoices (after entry into the invoice processing solution). Other objectives included examining the longest lead times between process steps and the impact of manual process steps on cycle time. The processing of invoices from maverick purchases was also examined. Based on the results of the research and previous literature on the subject, suggestions for improving the process were proposed. The results indicate that scanned invoices were processed faster than e-invoices, mostly due to the more complex processing of e-invoices. It should be noted, however, that the manual tasks related to turning a paper invoice into electronic format through scanning are ignored in this research. The transitions with the longest lead times in the invoice handling process included both pre-automation steps and manual steps performed by humans. When the most common manual steps were examined in more detail, it was clear that these steps prolonged the process. Regarding invoices from maverick purchases, the evidence shows that these invoices were slower to process than invoices from purchases conducted through e-procurement systems and from preferred suppliers. Suggestions on how to improve the process included increasing invoice matching, reducing manual steps, and leveraging different value-added services such as an invoice validation service, mobile solutions and supply chain financing services. For companies that have already reaped all the process efficiencies, the next step is to engage in collaborative financial supply chain management strategies that can benefit the whole supply chain.
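The core of the analysis, deriving cycle times from an invoice event log and comparing channels, can be sketched in a few lines. The event log, field names and process steps below are invented for illustration and are not the studied data or the process mining tool used in the research.

```python
# Minimal sketch of event-log cycle-time analysis: per-invoice cycle time and a
# comparison of e-invoices vs. scanned invoices. All data here are invented.
from datetime import datetime
from statistics import median
from collections import defaultdict

log = [  # (invoice_id, channel, activity, timestamp)
    ("INV-1", "e-invoice", "received", "2012-03-01 08:00"),
    ("INV-1", "e-invoice", "approved", "2012-03-05 10:00"),
    ("INV-1", "e-invoice", "paid",     "2012-03-09 12:00"),
    ("INV-2", "scanned",   "received", "2012-03-01 09:00"),
    ("INV-2", "scanned",   "approved", "2012-03-03 15:00"),
    ("INV-2", "scanned",   "paid",     "2012-03-06 16:00"),
]

events = defaultdict(list)
channel = {}
for inv, ch, act, ts in log:
    events[inv].append(datetime.strptime(ts, "%Y-%m-%d %H:%M"))
    channel[inv] = ch

cycle_days = defaultdict(list)
for inv, stamps in events.items():
    days = (max(stamps) - min(stamps)).total_seconds() / 86400
    cycle_days[channel[inv]].append(days)

for ch, days in cycle_days.items():
    print(f"{ch}: median cycle time {median(days):.1f} days")
```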
Abstract:
Bioprocess technology is a multidisciplinary industry that combines knowledge of biology and chemistry with process engineering. It is a growing industry because its applications have an important role in the food, pharmaceutical, diagnostics and chemical industries. In addition, the current pressure to decrease our dependence on fossil fuels motivates new, innovative research into the replacement of petrochemical products. Bioprocesses are processes that utilize cells and/or their components in the production of desired products. Bioprocesses are already used to produce fuels and chemicals, especially ethanol and building-block chemicals such as carboxylic acids. In order to enable more efficient, sustainable and economically feasible bioprocesses, the raw materials must be cheap and the bioprocesses must be operated at optimal conditions. It is essential to measure the parameters that provide information about the process conditions and the main critical process parameters, including cell density, substrate concentrations and products. In addition to offline analysis methods, online monitoring tools are becoming increasingly important in the optimization of bioprocesses. Capillary electrophoresis (CE) is a versatile analysis technique with no limitations concerning polar solvents, analytes or samples. Its resolution and efficiency are high in optimized methods, creating great potential for rapid detection and quantification. This work demonstrates the potential and possibilities of CE as a versatile bioprocess monitoring tool. As part of this study a commercial CE device was modified for use as an online analysis tool for automated monitoring. The work describes three offline CE analysis methods for the determination of carboxylic, phenolic and amino acids present in bioprocesses, and an online CE analysis method for the monitoring of carboxylic acid production during bioprocesses. The detection methods were indirect and direct UV and laser-induced fluorescence. The results of this work can be used for the optimization of bioprocess conditions, for the development of more robust and tolerant microorganisms, and to study the dynamics of bioprocesses.
Abstract:
Presentation at a seminar organized by the KDK usability working group: How do users' expectations challenge our metadata practices? / How users' expectations challenge our metadata practices? 30.9.2014.