45 results for test data generation

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

This master’s thesis aims to study and present, based on the literature, how evolutionary algorithms are used to solve different search and optimisation problems in the area of software engineering. Evolutionary algorithms are methods that imitate the process of natural evolution. An artificial evolution process evaluates the fitness of each individual, where the individuals are candidate solutions. The next population of candidate solutions is formed from the good properties of the current population by applying mutation and crossover operations. Different kinds of evolutionary algorithm applications related to software engineering were sought in the literature, classified and presented, and the necessary basics of evolutionary algorithms were also covered. It was concluded that the majority of evolutionary algorithm applications related to software engineering concern software design or testing. For example, there were applications for classifying software production data, project scheduling, static task scheduling in parallel computing, allocating modules to subsystems, N-version programming, test data generation and generating an integration test order. Many applications were experimental rather than ready for real production use. There were also some Computer Aided Software Engineering tools based on evolutionary algorithms.
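The selection-crossover-mutation loop described above can be sketched as a minimal genetic algorithm. The example below evolves an integer test input toward a hypothetical branch condition `x == 100` using a branch-distance fitness; the condition, parameters and operators are illustrative assumptions, not taken from the thesis.

```python
import random

def fitness(x):
    # Branch distance for the hypothetical condition "x == 100":
    # higher (less negative) means closer to taking the branch.
    return -abs(x - 100)

def evolve(pop_size=20, generations=100, seed=0):
    rng = random.Random(seed)
    population = [rng.randint(0, 1000) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half of the population as parents.
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) // 2            # crossover: average of two parents
            if rng.random() < 0.3:          # mutation: small random step
                child += rng.randint(-10, 10)
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

best = evolve()
```

With elitist selection the best candidate is never lost, so the population drifts steadily toward inputs that satisfy the branch condition.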

Relevance:

90.00%

Publisher:

Abstract:

To enable a mathematically and physically sound execution of the fatigue test and a correct interpretation of its results, statistical evaluation methods are used to assist in the analysis of fatigue testing data. The main objective of this work is to develop step-by-step instructions for the statistical analysis of laboratory fatigue data. The scope of this project is to provide practical cases that answer the questions raised in the treatment of test data, applying the methods and formulae of the document IIW-XIII-2138-06 (Best Practice Guide on the Statistical Analysis of Fatigue Data). Generally, the questions in the data sheets involve several aspects: estimation of the necessary sample size, verification of the statistical equivalence of the collated sets of data, and determination of characteristic curves in different cases. The series of comprehensive examples given in this thesis demonstrates the various statistical methods and supports the development of a sound procedure for creating reliable calculation rules for fatigue analysis.
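A characteristic S-N curve of the kind mentioned above is typically obtained by regressing log cycles on log stress range and shifting the mean curve down by a multiple of the residual standard deviation. The sketch below illustrates the idea with fabricated data; the shift factor k = 2 is a placeholder, since the guide derives k from the sample size and the chosen survival and confidence levels.

```python
import math

# Hypothetical fatigue test results: stress range (MPa) -> cycles to failure.
stress = [200, 180, 160, 140, 120]
cycles = [1.2e5, 2.0e5, 3.5e5, 6.0e5, 1.1e6]

# Fit log N = a + b * log S by least squares (the usual S-N regression).
x = [math.log10(s) for s in stress]
y = [math.log10(c) for c in cycles]
n = len(x)
xm, ym = sum(x) / n, sum(y) / n
b = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y)) / sum((xi - xm) ** 2 for xi in x)
a = ym - b * xm

# Standard deviation of residuals around the mean curve.
resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
stdev = math.sqrt(sum(r ** 2 for r in resid) / (n - 2))

# Characteristic curve: shift the mean curve down by k standard deviations
# (k = 2 is purely illustrative here).
k = 2.0
def characteristic_cycles(s):
    return 10 ** (a + b * math.log10(s) - k * stdev)
```

The slope b comes out negative, as expected for S-N data, and the characteristic curve lies below the mean curve at every stress level.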

Relevance:

90.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

90.00%

Publisher:

Abstract:

Data mining, a widely discussed topic, has been studied in various fields. Its potential for refining decision-making, revealing patterns and creating valuable knowledge has won the attention of scholars and practitioners. However, few studies attempt to combine data mining and libraries, where data generation occurs all the time. This thesis aims to fill that gap, and explores the opportunities data mining creates for enhancing one of the most important elements of libraries: the reference service. In order to demonstrate the feasibility and applicability of data mining, the literature is reviewed to establish a critical understanding of data mining in libraries and the current status of library reference services. The literature review indicates that free online data resources, other than data generated on social media, are rarely considered in current library data mining initiatives; this motivates the present study to utilize free online resources. Furthermore, a natural match between data mining and libraries is established. It is explained by the data-rich reality of libraries and by viewing data mining as a kind of knowledge, an easy choice for libraries, and a sensible way to overcome reference service challenges. This natural match, especially the idea that data mining can support the library reference service, lays the main theoretical foundation for the empirical work in this study. Turku Main Library was selected as the case to answer the research question: is data mining feasible and applicable for improving the reference service? Daily visit data from 2009 to 2015 in Turku Main Library serves as the resource for data mining, and the corresponding weather conditions are collected from Weather Underground, a free online service.
Before the analysis, the collected dataset is cleansed and preprocessed to ensure the quality of the data mining. Multiple regression analysis is employed to mine the final dataset: hourly visits are the dependent variable, while the weather conditions, the Discomfort Index and the day of the week are the independent variables. Four seasonal models are established to predict visits in each season; patterns are identified per season and implications are drawn from them. In addition, library-climate points are generated by a clustering method, which simplifies the process of using weather data to forecast library visits for librarians. The data mining results are then interpreted from the perspective of improving the reference service. Finally, the results of the case study are presented to librarians to collect professional opinions on employing data mining to improve reference services. The opinions collected were positive, which implies that it is feasible to utilize data mining as a tool to enhance the library reference service.
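The regression step can be illustrated with a small sketch: hourly visits regressed on weather variables and day-of-week dummies via ordinary least squares. All numbers below are fabricated for illustration; they are not the Turku library data.

```python
import numpy as np

# Toy stand-in for the thesis data: hourly visits explained by temperature,
# a discomfort index and day-of-week dummies (all values fabricated).
rng = np.random.default_rng(0)
n = 200
temp = rng.uniform(-20, 25, n)           # outdoor temperature, deg C
discomfort = rng.uniform(50, 80, n)      # Discomfort Index
dow = rng.integers(0, 7, n)              # day of week, 0 = Monday

# Synthetic ground truth: warmer weather -> slightly fewer visits,
# weekdays -> slightly more visits, plus noise.
visits = 120 - 0.8 * temp - 0.5 * discomfort + 5 * (dow < 5) + rng.normal(0, 3, n)

# Design matrix: intercept, temperature, discomfort, 6 day dummies (Monday as base).
dummies = np.eye(7)[dow][:, 1:]
X = np.column_stack([np.ones(n), temp, discomfort, dummies])
coef, *_ = np.linalg.lstsq(X, visits, rcond=None)

predicted = X @ coef
```

Because the data are generated with a known temperature effect of -0.8 visits per degree, the fitted coefficient `coef[1]` recovers it closely, which is the basic sanity check one would also run on the real dataset.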

Relevance:

80.00%

Publisher:

Abstract:

The safe use of nuclear power plants (NPPs) requires a deep understanding of the functioning of the physical processes and systems involved. Studies on thermal hydraulics have been carried out in various separate effects and integral test facilities at Lappeenranta University of Technology (LUT), either to ensure the functioning of the safety systems of light water reactors (LWR) or to produce validation data for the computer codes used in safety analyses of NPPs. Several examples of safety studies on the thermal hydraulics of nuclear power plants are discussed. The studies address physical phenomena occurring in different processes in NPPs, such as rewetting of the fuel rods, emergency core cooling (ECC), natural circulation, small break loss-of-coolant accidents (SBLOCA), non-condensable gas release and transport, and passive safety systems. Studies on both VVER and advanced light water reactor (ALWR) systems are included. The set of cases includes separate effects tests for understanding and modeling a single physical phenomenon, separate effects tests to study the behavior of an NPP component or a single system, and integral tests to study the behavior of the whole system. The following steps can be found in the studies, though not necessarily all in the same study. Experimental studies as such have provided solutions to existing design problems. Experimental data have been created to validate a single model in a computer code. Validated models are used in various transient analyses of scaled facilities or NPPs. Integral test data are used to validate the computer codes as a whole, to see how the implemented models work together in a code. In the final stage, test results from the facilities are transferred to the NPP scale using computer codes.
Some of the experiments have confirmed the expected behavior of the system or procedure studied; in other experiments, unexpected phenomena have caused changes to the original design to avoid the recognized problems. This is the main motivation for experimental studies on the thermal hydraulics of NPP safety systems. Naturally, the behavior of new system designs has to be checked with experiments, but so does that of existing designs if they are applied in conditions that differ from what they were originally designed for. New procedures for existing reactors and new safety-related systems have been developed for new nuclear power plant concepts, and new experiments have been continuously needed.

Relevance:

80.00%

Publisher:

Abstract:

The objective of this master's thesis was to develop methods and guidelines for testing a frequency converter's embedded software during development. Suitable methods were sought through an extensive literature survey and by examining the company's testing practice. The methods found in the literature included test frameworks, simulation, and static and automated testing. The literature was also searched for methods that can ease or otherwise improve the testing process; among those studied were test data selection, test-driven development and improving testability. In addition, programming languages suitable for writing reusable tests were surveyed. Interviews and documentation gave a good picture of the company's prevailing testing practice and its problem areas. The problems identified were a lack of systematic structure in the testing process and a need for testing training for the designers. To improve the testing process, the adoption of a unit testing framework is proposed. In addition, testing training for the designers is estimated to have a large impact on the whole testing process. Methods for designing more comprehensive test cases are also presented.

Relevance:

80.00%

Publisher:

Abstract:

The purpose of this thesis is to reveal how laser cutting parameters influence the laser cutting of particleboard, HDF and MDF. The literature review introduces the basic principle of the CO2 laser, CO2 laser equipment and its usage in cutting wood-based materials. The experimental part focuses on the discussion and analysis of the test data and attempts to draw conclusions on the influence of various parameters, including laser power, focal length of the lens and cutting gas, on the cutting speed and kerf quality. The tested materials include various thicknesses of particleboard, HDF and MDF samples. A TRUMPF TLF2700 HQ laser was used for the experiments. To obtain valid data, the test samples had to be cut completely through without any bonding of wood fibre. The maximum cutting speed is linearly dependent on the laser power, provided that the other parameters are kept constant. For each thickness of a specific material type, there is a minimum laser power required for cutting. Normally, the top and bottom kerf widths increase as the laser power is raised. There may be a critical laser power that produces the minimum cross-sectional kerf width. A lens with a longer focal length may achieve a higher cutting speed; as the focal length increases, the top kerf width tends to increase while the bottom and cross-sectional kerf widths tend to do the opposite. Of the cutting gases, oxygen helps achieve the highest cutting speed. The pressure of nitrogen does not seem to have a strong influence on the cutting result. Generally, 2 bar air is preferable for higher cutting speed. For particleboard and MDF samples thicker than 12 mm, 2 bar argon can be used to reach a remarkably higher cutting speed than 5 bar argon. Generally, the 190.5 mm lens produces the smallest total kerf width. The kerf sides of thicker samples are darker than those of thinner ones, and the darkness tends to decrease as the laser power increases.
The 63.5 mm lens seemed to cause more darkening than the other lenses. Cutting gases at 5 bar produce less dark kerf sides than at 2 bar, and oxygen normally causes darker kerfs than the other gases. No distinct differences were found between nitrogen and argon.

Relevance:

80.00%

Publisher:

Abstract:

This master's thesis studies how the Eclipse environment can be used in test case generation. One of the main topics is to examine whether existing Eclipse components can improve symbol information so that more data is available for test case generation. The thesis first gives a short overview of software testing so that the reader understands which area of software engineering the thesis addresses, and then describes the test case generation process itself in more detail. Once the basics have been covered, the reader is introduced to the Eclipse environment: what it is, what it consists of, and what can be done with it. More detailed information is given about the Eclipse components that can be used to support test case generation. As an integration example, the thesis presents the integration of an existing test case generator into the Eclipse environment. Finally, the Eclipse-based solution is compared with the earlier solution in terms of symbol information and execution time. The result of the thesis is a prototype which demonstrated that a test case generator can be integrated into the Eclipse environment and that it can increase symbol information. This increase in information, however, also increased the required execution time, in some cases significantly. It was also noted that projects are currently under way to improve the performance of the Eclipse components used, which may improve the results in the future.

Relevance:

80.00%

Publisher:

Abstract:

This master's thesis investigates a start-up fault of line-started brushless synchronous motors, in which the motor builds up excitation only several seconds after the excitation is switched on, even though the excitation equipment operates normally. The delayed excitation is caused by the exciter being short-circuited through a thyristor branch that acts as the rotor's overvoltage protection, even though the thyristors of that branch are supposed to be in a non-conducting state when the exciter begins to supply field current. The reasons why the thyristors remain conducting after the excitation is switched on are investigated using measurements from motor start-up tests and start-up simulations performed with the SMT and FCSMEK calculation programs. The usefulness of these programs in predicting the start-up fault is assessed at the same time. The thesis presents the causes of the delayed excitation of two example machines, as well as changes to the rotor circuit and the start-up procedure that would probably allow the investigated start-up fault to be avoided in the future.

Relevance:

80.00%

Publisher:

Abstract:

The purpose of this work was to design and carry out thermal-hydraulic experiments dealing with overcooling transients of a VVER-440-type nuclear reactor pressure vessel. A sudden overcooling accident could have a negative effect on the mechanical strength of the pressure vessel; if part of the vessel is compromised, the high pressure inside a pressurized water reactor could cause the wall to fracture. Information on the heat transfer along the outside of the pressure vessel wall is necessary for stress analysis. Basic knowledge of the overcooling accident and of the heat transfer modes on the outside of the pressure vessel is presented as background information. A test facility was designed and built to study and measure heat transfer during specific overcooling scenarios. Two test series were conducted: the first concentrated on the very beginning of the transient and the second on steady-state heat transfer. Heat transfer coefficients are calculated from the test data using an inverse method, which yields better results in fast transients than direct calculation from the measurement results. The results show that the heat transfer rate varies considerably during the transient, being very high in the beginning and dropping to a steady state within a few minutes. The test results show that appropriate correlations can be used in future analyses.
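In its simplest form, such an inverse calculation recovers the heat transfer coefficient from the measured cooling rate of the wall. The lumped-capacitance sketch below only illustrates the principle, not the thesis's actual method; all temperatures and material values are fabricated.

```python
# Hypothetical temperature transient of a wall element cooling in the
# coolant flow (lumped-capacitance model; all numbers illustrative).
t = [0.0, 1.0, 2.0, 3.0, 4.0]              # time, s
T = [250.0, 218.0, 192.3, 171.8, 155.4]    # measured wall temperature, deg C
T_fluid = 50.0                             # coolant temperature, deg C
mc_per_A = 8000.0                          # (mass * specific heat) / area, J/(m^2 K)

# Inverse step: recover h from the measured cooling rate,
#   h = -(m c / A) * (dT/dt) / (T - T_fluid)
h = []
for i in range(1, len(t)):
    dTdt = (T[i] - T[i - 1]) / (t[i] - t[i - 1])       # finite-difference cooling rate
    T_mid = 0.5 * (T[i] + T[i - 1])                    # midpoint wall temperature
    h.append(-mc_per_A * dTdt / (T_mid - T_fluid))
```

The fabricated transient cools fastest at the start, so the recovered coefficient is highest in the first interval and decreases toward a steady value, matching the qualitative behavior reported in the abstract.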

Relevance:

80.00%

Publisher:

Abstract:

This work responds to the need to manage the quality of a high-pressure water mist nozzle with the tools of fluid mechanics. In addition to nozzle test data, the behavior of the flow inside the nozzle is studied with CFD calculations. The flow is modeled with a Navier-Stokes-based solution method. The theory part of the work covers fluid dynamics and its development in general, presents the basic theory and technical solutions used in nozzle calculations, and reviews the basics of computational fluid dynamics (CFD). The research part presents the processed nozzle test results and models the nozzle flow with a steady-state calculation method; the simulations use the SIMPLE flow solver of the OpenFOAM software together with the k-omega SST turbulence model. Flow simulations were carried out at all pressures that are actually used in nozzle testing. In addition, possible cavitation locations in the nozzle were identified and a cavitation-preventing nozzle geometry was designed. Temperature and impurities were also found to affect cavitation, and the effect of temperature was modeled. A model was created with which the challenges of nozzle design can be addressed by numerical calculation.

Relevance:

80.00%

Publisher:

Abstract:

Recent advances in Information and Communication Technology (ICT), especially those related to the Internet of Things (IoT), are facilitating smart regions. Among the many services a smart region can offer, remote health monitoring is a typical application of the IoT paradigm. It offers the ability to continuously monitor and collect health-related data from a person and transmit the data to a remote entity (for example, a healthcare service provider) for further processing and knowledge extraction. An IoT-based remote health monitoring system can be beneficial in rural areas of the smart region where people have limited access to regular healthcare services. The same system can be beneficial in urban areas where hospitals can be overcrowded and where it may take substantial time to obtain healthcare. However, such a system may generate a large amount of data. In order to realize an efficient IoT-based remote health monitoring system, it is imperative to study its network communication needs, in particular the bandwidth requirements and the volume of generated data. The thesis studies a commercial product for remote health monitoring in Skellefteå, Sweden. Based on the results obtained with the commercial product, the thesis identifies the key network-related requirements of a typical remote health monitoring system in terms of real-time event updates, bandwidth requirements and data generation. Furthermore, the thesis proposes IReHMo, an IoT-based remote health monitoring architecture. This architecture allows users to incorporate several types of IoT devices to extend the sensing capabilities of the system. Using IReHMo, the IoT communication protocols HTTP, MQTT and CoAP have been evaluated and compared against each other. The results show that CoAP is the most efficient protocol for transmitting small healthcare data items to the remote servers.
The combination of IReHMo and CoAP significantly reduced the required bandwidth as well as the volume of generated data (by up to 56 percent) compared to the commercial product. Finally, the thesis presents a scalability analysis to determine the feasibility of deploying the combination of IReHMo and CoAP at large scale in regions of northern Sweden.
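Why per-message protocol overhead dominates for small healthcare payloads can be shown with a back-of-the-envelope calculation. The overhead figures below are rough, illustrative assumptions (real header sizes depend on options, security and handshakes); they are not measurements from the thesis.

```python
# Rough, assumed per-message overhead (bytes) for the three protocols compared.
OVERHEAD = {"HTTP": 300, "MQTT": 30, "CoAP": 10}

def daily_volume(protocol, payload_bytes=50, interval_s=60):
    """Bytes generated per device per day for periodic sensor updates."""
    messages_per_day = 24 * 3600 // interval_s
    return messages_per_day * (payload_bytes + OVERHEAD[protocol])

for proto in ("HTTP", "MQTT", "CoAP"):
    print(proto, daily_volume(proto))
```

With a 50-byte payload every minute, the payload itself is fixed, so the ranking of total data volume is driven entirely by header overhead, which is why a compact binary protocol such as CoAP comes out ahead for this traffic pattern.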

Relevance:

80.00%

Publisher:

Abstract:

In this thesis, tool support is addressed for the combined disciplines of model-based testing and performance testing. Model-based testing (MBT) utilizes abstract behavioral models to automate test generation, thus decreasing the time and cost of test creation. MBT is a functional testing technique, focusing on output, behavior, and functionality. Performance testing, by contrast, is non-functional and is concerned with responsiveness and stability under various load conditions. MBPeT (Model-Based Performance evaluation Tool) is one such tool: it utilizes probabilistic models, representing dynamic real-world user behavior patterns, to generate synthetic workload against a system under test (SUT) and in turn carry out performance analysis based on key performance indicators (KPIs). Developed at Åbo Akademi University, the MBPeT tool currently comprises a downloadable command-line tool and a graphical user interface. The goal of this thesis project is twofold: 1) to extend the existing MBPeT tool by deploying it as a web-based application, thereby removing the requirement of local installation, and 2) to design a user interface for this web application that adds new user interaction paradigms to the existing feature set of the tool. All phases of the MBPeT process are realized via this single web deployment, including probabilistic model creation, test configuration, test session execution against a SUT with real-time monitoring of user-configurable metrics, and final test report generation and display. This web application (MBPeT Dashboard) is implemented in the Java programming language on top of the Vaadin framework for rich internet application development. The Vaadin framework handles the complicated web communication processes and front-end technologies, freeing developers to implement the business logic as well as the user interface in pure Java.
A number of experiments are run in a case study environment to validate the functionality of the newly developed Dashboard application as well as the scalability of the solution in handling multiple concurrent users. The results support a successful solution with regard to the functional and performance criteria defined, while improvements and optimizations are suggested to further increase both of these factors.
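The probabilistic user models that drive this kind of workload generation are often simple Markov chains over user actions. The sketch below shows the idea with a made-up three-state model; the state names and transition probabilities are illustrative and are not taken from MBPeT.

```python
import random

# A tiny probabilistic user model: states are user actions, and each edge
# carries the probability of the next action (all values illustrative).
MODEL = {
    "browse":   [("browse", 0.5), ("search", 0.3), ("checkout", 0.2)],
    "search":   [("browse", 0.6), ("search", 0.2), ("checkout", 0.2)],
    "checkout": [("exit", 1.0)],
}

def simulate_session(rng):
    """Walk the model from 'browse' until 'exit', producing one synthetic session."""
    state, trace = "browse", []
    while state != "exit":
        trace.append(state)
        r, cum = rng.random(), 0.0
        for nxt, p in MODEL[state]:
            cum += p
            if r < cum:
                state = nxt
                break
    return trace

rng = random.Random(1)
sessions = [simulate_session(rng) for _ in range(100)]
```

Replaying many such sessions concurrently against a system under test yields a synthetic workload whose mix of actions follows the model's probabilities, which is the core mechanism behind model-based performance evaluation.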

Relevance:

40.00%

Publisher:

Abstract:

Nowadays the variety of fuels used in power boilers is widening, and new boiler constructions and operating models have to be developed. This research and development is done in small pilot plants, where a faster analysis of the boiler mass and heat balance is needed in order to make the right decisions already during the test run. The barrier to determining the boiler balance during test runs is the long process of chemically analyzing the collected input and output matter samples. The present work concentrates on finding a way to determine the boiler balance without chemical analyses and on optimizing the test rig to obtain the best possible accuracy for the heat and mass balance of the boiler. The purpose of this work was to create an automatic boiler balance calculation method for the 4 MW CFB/BFB pilot boiler of Kvaerner Pulping Oy located in Messukylä, Tampere. The calculation was created in the data management computer of the pilot plant's automation system. It is implemented in a Microsoft Excel environment, which provides a good basis and functions for handling large databases and calculations without complex programming. The automation system of the pilot plant was reconstructed and updated by Metso Automation Oy during 2001, and the new system, MetsoDNA, has good data management properties, which is necessary for large calculations such as the boiler balance calculation. Two possible methods for calculating the boiler balance during a test run were found: either the fuel flow is determined and used to calculate the boiler's mass balance, or the unburned carbon loss is estimated and the mass balance is calculated on the basis of the boiler's heat balance. Both methods have their own weaknesses, so they were implemented in parallel in the calculation and the choice of method was left to the user. The user also needs to define the fuels used and some solid mass flows that are not measured automatically by the automation system.
A sensitivity analysis showed that the most essential values for an accurate boiler balance determination are the flue gas oxygen content, the boiler's measured heat output and the lower heating value of the fuel. The theoretical part of this work concentrates on the error management of these measurements and analyses, and on measurement accuracy and boiler balance calculation in theory. The empirical part concentrates on the creation of the balance calculation for the boiler in question and on describing the work environment.
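The second method (assume the unburned carbon loss, then back-calculate the fuel flow from the heat balance) can be sketched as follows. The loss fractions and fuel properties below are illustrative placeholders, not values from the pilot plant.

```python
def fuel_flow_from_heat_balance(heat_output_kw, lhv_kj_per_kg,
                                unburned_carbon_loss=0.01, other_losses=0.05):
    """Estimate fuel mass flow (kg/s) from the boiler heat balance.

    Sketch of the second method in the abstract: assume the unburned
    carbon loss (and lump the remaining losses together), then
    back-calculate the fuel flow from the measured heat output.
    The loss fractions here are illustrative assumptions.
    """
    efficiency = 1.0 - unburned_carbon_loss - other_losses
    return heat_output_kw / (lhv_kj_per_kg * efficiency)

# Example: a 4 MW pilot boiler burning a fuel with an assumed LHV of 19 MJ/kg.
m_fuel = fuel_flow_from_heat_balance(4000.0, 19000.0)
```

This also makes the sensitivity result above intuitive: the estimate scales directly with the measured heat output and inversely with the lower heating value, so errors in either propagate one-to-one into the calculated fuel flow and hence into the whole mass balance.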