73 results for Automated Test
Abstract:
Forest inventories are used to estimate forest characteristics and forest condition for many different applications: operational tree logging for the forest industry, forest health estimation, carbon balance estimation, and land-cover and land-use analysis to avoid forest degradation, among others. Recent inventory methods rely strongly on remote sensing data combined with field sample measurements, which are used to define estimates covering the whole area of interest. Remote sensing data from satellites, aerial photographs or airborne laser scanning are used, depending on the scale of the inventory. To be applicable in operational use, forest inventory methods must be easily adjustable to the local conditions of the study area at hand. All data handling and parameter tuning should be as objective and automated as possible. The methods also need to be robust when applied to different forest types. Since there are generally no comprehensive direct physical models connecting the remote sensing data from different sources to the forest parameters being estimated, the mathematical estimation models are of a "black-box" type, connecting the independent auxiliary data to the dependent response data with arbitrary linear or nonlinear models. To avoid redundant complexity and over-fitting of the model, which may be based on up to hundreds of possibly collinear variables extracted from the auxiliary data, variable selection is needed. To connect the auxiliary data to the inventory parameters being estimated, field work must be performed. In large study areas with dense forests, field work is expensive and should therefore be minimized. For cost-efficient inventories, field work could be partly replaced with information from previously measured sites stored in databases. The work in this thesis is devoted to the development of automated, adaptive computation methods for aerial forest inventory.
The mathematical model parameter definition steps are automated, and cost-efficiency is improved by setting up a procedure that utilizes databases in the estimation of the characteristics of a new area.
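The variable-selection step mentioned above can be illustrated with a minimal sketch. This is not the thesis's actual procedure, only a simple filter that ranks candidate auxiliary variables by the absolute Pearson correlation with the response and keeps the strongest ones; the variable names and synthetic data are hypothetical:

```python
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((v - mx) ** 2 for v in x) ** 0.5
    sy = sum((v - my) ** 2 for v in y) ** 0.5
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (sx * sy) if sx and sy else 0.0

def select_variables(features, response, k):
    """Rank candidate auxiliary variables by |correlation| with the
    response and keep the k strongest, discarding uninformative ones."""
    ranked = sorted(features, key=lambda name: -abs(pearson(features[name], response)))
    return ranked[:k]

# Synthetic example: two informative variables and one of pure noise.
random.seed(0)
resp = [random.random() for _ in range(50)]
feats = {
    "height_p90": [v + 0.05 * random.random() for v in resp],  # strongly related
    "density":    [0.8 * v + 0.3 * random.random() for v in resp],  # related
    "noise":      [random.random() for _ in range(50)],        # unrelated
}
print(select_variables(feats, resp, 2))
```

In practice a wrapper method (e.g. greedy forward selection scored by cross-validated error) would better handle collinear variables, since two highly correlated features can both score well under a pure correlation filter.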
Abstract:
This master's thesis focuses on the commissioning of an active magnetic bearing (AMB) system. The scope of the work is to test the existing procedures on old and new prototypes of an AMB system, and additionally to automate the necessary steps instead of tuning them by hand, because determining rotor clearances and finding effective rotor origins is time-consuming and error-prone. The final goal is a documented and largely automated step-by-step methodology for efficient commissioning of the system.
Abstract:
The goal of this master's thesis was to develop quality assurance equipment for tray-type paperboard packages. The literature part begins by presenting the manufacturing process of tray-type paperboard packages. It then moves on to quality issues, the most important topics being the quality deviations occurring in the manufacture of paperboard trays and machine vision. The research part begins by presenting the paperboard tray production line developed at Lappeenranta University of Technology. Based on a requirements list, an automatic quality control device suitable for this line is then designed, including a transfer device for the paperboard trays. The design process started with test imaging of paperboard trays using different cameras and lighting methods. The machine vision equipment was selected based on these test images. Concept sketches of the functions were then created and developed into the actual design. The result of the work is a design for building a machine-vision-based automatic quality control device.
Abstract:
In this thesis, the components important for testing work and for the organisational test process are identified and analysed. The work focuses on the testing activities of real-life software organisations, identifying the important test process components, observing testing work in practice, and analysing how the organisational test process could be developed. Software professionals from 14 different software organisations were interviewed to collect data on the organisational test process and testing-related factors. Additional data on organisational aspects was collected with a survey of 31 organisations. This data was further analysed with the Grounded Theory method to identify the important test process components and to observe how real-life test organisations develop their testing activities. The results indicate that test management at the project level is an important factor; the organisations do have sufficient test resources available, but these resources are not necessarily applied efficiently. In addition, organisations are in general reactive: they develop their process mainly to correct problems, not to enhance their efficiency or output quality. The results of this study allow organisations to gain a better understanding of their test processes and to develop towards better practices and a culture of preventing problems rather than reacting to them.
Abstract:
Machine learning provides tools for the automated construction of predictive models in data-intensive areas of engineering and science. The family of regularized kernel methods has in recent years become one of the mainstream approaches to machine learning, due to a number of advantages the methods share. The approach provides theoretically well-founded solutions to the problems of under- and overfitting, allows learning from structured data, and has been empirically demonstrated to yield high predictive performance on a wide range of application domains. Historically, the problems of classification and regression have received the majority of attention in the field. In this thesis we focus on another type of learning problem: learning to rank. In learning to rank, the aim is to learn, from a set of past observations, a ranking function that can order new objects according to how well they match some underlying criterion of goodness. As an important special case of the setting, we can recover the bipartite ranking problem, which corresponds to maximizing the area under the ROC curve (AUC) in binary classification. Ranking applications appear in a large variety of settings; examples encountered in this thesis include document retrieval in web search, recommender systems, information extraction and automated parsing of natural language. We consider the pairwise approach to learning to rank, where ranking models are learned by minimizing the expected probability of ranking any two randomly drawn test examples incorrectly. The development of computationally efficient kernel methods based on this approach has in the past proven to be challenging. Moreover, it is not clear which techniques for estimating the predictive performance of learned models are the most reliable in the ranking setting, or how these techniques can be implemented efficiently. The contributions of this thesis are as follows.
First, we develop RankRLS, a computationally efficient kernel method for learning to rank that is based on minimizing a regularized pairwise least-squares loss. In addition to training methods, we introduce a variety of algorithms for tasks such as model selection, multi-output learning, and cross-validation, based on computational shortcuts from matrix algebra. Second, we improve the fastest known training method for the linear version of the RankSVM algorithm, one of the most well-established methods for learning to rank. Third, we study the combination of the empirical kernel map and reduced set approximation, which allows the large-scale training of kernel machines using linear solvers, and propose computationally efficient solutions for cross-validation when using this approach. Next, we explore the problem of reliable cross-validation when using AUC as a performance criterion, through an extensive simulation study. We demonstrate that the proposed leave-pair-out cross-validation approach leads to more reliable performance estimation than commonly used alternatives. Finally, we present a case study on applying machine learning to information extraction from biomedical literature, which combines several of the approaches considered in the thesis. The thesis is divided into two parts: Part I provides the background for the research work and summarizes the most central results, while Part II consists of the five original research articles that are the main contribution of this thesis.
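The pairwise criterion described above, the probability of misordering a randomly drawn pair, can be made concrete with a minimal sketch. This is not the thesis's RankRLS implementation, only a direct count of misranked pairs; in the bipartite (binary-label) case the quantity computed equals 1 - AUC:

```python
from itertools import combinations

def pairwise_error(scores, targets):
    """Fraction of object pairs with different target values that the
    scoring function orders incorrectly (ties count as half an error).
    With binary targets this equals 1 - AUC."""
    errors, pairs = 0.0, 0
    for i, j in combinations(range(len(scores)), 2):
        if targets[i] == targets[j]:
            continue  # equally good objects impose no ordering constraint
        pairs += 1
        # the object with the higher target should receive the higher score
        hi, lo = (i, j) if targets[i] > targets[j] else (j, i)
        if scores[hi] < scores[lo]:
            errors += 1.0
        elif scores[hi] == scores[lo]:
            errors += 0.5
    return errors / pairs if pairs else 0.0

# Binary relevance labels and hypothetical model scores for five documents.
labels = [1, 0, 1, 0, 0]
scores = [0.9, 0.8, 0.3, 0.2, 0.1]
print(pairwise_error(scores, labels))  # one misranked pair out of six
```

A learner following the pairwise approach minimizes a convex surrogate of this count (a pairwise least-squares loss in RankRLS, a pairwise hinge loss in RankSVM), since the count itself is non-differentiable.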
Abstract:
This paper presents the design for a graphical parameter editor for Testing and Test Control Notation 3 (TTCN-3) test suites. This work was done in the context of OpenTTCN IDE, a TTCN-3 development environment built on top of the Eclipse platform. The design presented relies on an additional parameter editing tab added to the launch configurations for test campaigns. This parameter editing tab shows the list of editable parameters and allows opening editing components for the different parameters. Each TTCN-3 primitive type will have a specific editing component providing tools to ease modification of values of that type.
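The idea of a type-specific editing component per TTCN-3 primitive type can be sketched as a registry mapping type names to editor classes. This is purely illustrative: the actual OpenTTCN IDE is an Eclipse-based tool, and the class and type names below are assumptions, not its real API:

```python
# Hypothetical registry mapping TTCN-3 primitive type names to
# type-specific editing components (names are illustrative only).
EDITORS = {}

def editor_for(type_name):
    """Class decorator registering an editing component for a type."""
    def register(cls):
        EDITORS[type_name] = cls
        return cls
    return register

@editor_for("integer")
class IntegerEditor:
    def parse(self, text):
        return int(text)  # rejects non-integer input with ValueError

@editor_for("boolean")
class BooleanEditor:
    def parse(self, text):
        # TTCN-3 boolean literals are lowercase 'true' and 'false'
        if text not in ("true", "false"):
            raise ValueError("expected 'true' or 'false'")
        return text == "true"

def edit_parameter(type_name, raw_value):
    """Look up the editor for the parameter's type and validate the value,
    as the parameter editing tab would do before a test campaign launch."""
    editor = EDITORS[type_name]()
    return editor.parse(raw_value)

print(edit_parameter("integer", "42"))
print(edit_parameter("boolean", "true"))
```

The registry keeps the launch-configuration tab generic: adding support for another primitive type means registering one more editor class, without touching the tab itself.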
Abstract:
CHARGE syndrome, Sotos syndrome and 3p deletion syndrome are examples of rare inherited syndromes that have been recognized for decades but for which molecular diagnostics have only been made possible by recent advances in genomic research. Despite these advances, the development of diagnostic tests for rare syndromes has been hindered by diagnostic laboratories' limited funds for test development, and by their prioritization of tests for which a (relatively) high demand can be expected. In this study, molecular diagnostic tests for CHARGE syndrome and Sotos syndrome were developed and successfully translated into routine diagnostic testing in the laboratory of Medical Genetics (UTUlab). In the CHARGE syndrome group, a mutation was identified in 40.5% of the patients, and in the Sotos syndrome group in 34%, reflecting the use of the tests in routine differential diagnostics. In CHARGE syndrome, the low prevalence of structural aberrations was also confirmed. For 3p deletion syndrome, it was shown that small terminal deletions are not causative for the syndrome, and that array-based analysis provides a reliable estimate of the deletion size, although benign copy number variants complicate result interpretation. During the development of the tests, it became clear that finding an optimal molecular diagnostic strategy for a given syndrome is always a compromise between the sensitivity, specificity and feasibility of applying a new method. In addition, the clinical utility of a test should be considered before test development: sometimes a test performing well in the laboratory has limited utility for the patient, whereas a test performing poorly in the laboratory may have a great impact on the patient and their family. At present, the development of next-generation sequencing methods is changing the concept of molecular diagnostics of rare diseases from single tests towards whole-genome analysis.