959 results for automated software testing


Relevance:

30.00%

Publisher:

Abstract:

Background: Recent advances in medical and biological technology have stimulated the development of new testing systems that provide huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. In addition, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedural change. Results: This paper describes a formal approach to this challenge through the implementation of a genetic testing management system for a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH), in Brazil, a system that supports constant changes in human genome testing and can provide patients with updated results based on the most recent validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on a process algebra (ACP). The main difference between our approach and related work is that we combine two important aspects: 1) process scalability, achieved through the relational database implementation, and 2) process correctness, ensured by the process algebra. Furthermore, the software allows end users to define genetic tests without any knowledge of business process notation or process algebra. Conclusions: This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We demonstrate the feasibility and usability benefits of a rigorous approach that can specify, validate, and perform genetic testing through easy end-user interfaces.
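The abstract above combines a relational process repository with ACP-style control-flow operators. As a rough illustration only (the step names and trace semantics are hypothetical, not CEGH's actual model), the sketch below enumerates the event traces allowed by a workflow built from sequential, alternative, and merge (parallel) composition:

```python
# Illustrative sketch (hypothetical names): ACP-style composition of lab
# steps, enumerating the traces a workflow specification allows. A process
# is represented as a list of traces; a trace is a list of event names.

def seq(p, q):      # sequential composition: p . q
    return [a + b for a in p for b in q]

def alt(p, q):      # alternative composition: p + q
    return p + q

def interleave(a, b):
    # all interleavings of two traces (free merge, no communication)
    if not a: return [b]
    if not b: return [a]
    return [[a[0]] + t for t in interleave(a[1:], b)] + \
           [[b[0]] + t for t in interleave(a, b[1:])]

def par(p, q):      # merge composition: p || q
    return [t for a in p for b in q for t in interleave(a, b)]

# A toy genetic-testing workflow: extract DNA, then run either a
# sequencing or a PCR test, while result logging runs in parallel.
extract = [["extract_dna"]]
test    = alt([["sequence"]], [["pcr"]])
log     = [["log_result"]]

workflow = par(seq(extract, test), log)
```

Enumerating traces like this is only practical for finite, loop-free specifications; a real implementation would store the process terms in the database and check conformance symbolically.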

Relevance:

30.00%

Publisher:

Abstract:

The development of new procedures for quickly obtaining accurate information on the physiological potential of seed lots is essential for developing quality control programs for the seed industry. In this study, the effectiveness of an automated system of seedling image analysis (Seed Vigor Imaging System - SVIS) in determining the physiological potential of sun hemp seeds, and its relationship with electrical conductivity tests, was evaluated. SVIS evaluations were performed three and four days after sowing, and data on the vigor index and on the length and uniformity of seedling growth were collected. The electrical conductivity test was performed on replicates of 50 seeds placed in containers with 75 mL of deionised water at 25 ºC, with readings taken after 1, 2, 4, 8 and 16 hours of imbibition. Electrical conductivity measurements at 4 or 8 hours and the use of the SVIS on 3-day-old seedlings can effectively detect differences in vigor between sun hemp seed lots.

Relevance:

30.00%

Publisher:

Abstract:

The Seed Vigor Imaging System (SVIS®) software has been successfully used to evaluate seed physiological potential through automated analysis of scanned seedlings. In this research, the efficiency of this system was compared with other accepted tests for assessing cucumber (Cucumis sativus L.) seed vigor in distinct seed lots of the Supremo and Safira cultivars. Seeds were subjected to germination, traditional and saturated-salt accelerated aging, seedling emergence, and seedling length tests, as well as SVIS analyses (determination of vigor indices and seedling growth uniformity, and lengths of the primary root, hypocotyl, and whole seedling). It was also determined whether the choice of seedling growth/uniformity ratio affects the sensitivity of the SVIS®. The results showed that SVIS analyses consistently identified seed lot performance and produced information comparable to that from recommended seed vigor tests, demonstrating suitable sensitivity for a rapid and objective evaluation of the physiological potential of cucumber seeds. SVIS® analyses of four-day-old cucumber seedlings are more accurate, and the growth/uniformity ratio does not affect the precision of the results.

Relevance:

30.00%

Publisher:

Abstract:

The behavior of composed Web services depends on the results of the invoked services; unexpected behavior of one invoked service can threaten the correct execution of the entire composition. This paper proposes an event-based approach to black-box testing of Web service compositions based on event sequence graphs, which are extended with facilities to deal not only with service behavior under regular circumstances (i.e., where cooperating services work as expected) but also with behavior in undesirable situations (i.e., where cooperating services do not work as expected). Furthermore, the approach can be used independently of artifacts (e.g., Business Process Execution Language) or the type of composition (orchestration or choreography). A large case study, based on a commercial Web application, demonstrates the feasibility of the approach and analyzes its characteristics. Test generation and execution are supported by dedicated tools; in particular, the use of an enterprise service bus for test execution is noteworthy and differs from other approaches. The results of the case study suggest that the new approach can detect faults systematically, performing well even with complex and large compositions. Copyright © 2012 John Wiley & Sons, Ltd.
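To make the event-sequence-graph idea above concrete, here is a small sketch under stated assumptions: the event names, the greedy edge-coverage walk, and the treatment of absent event pairs as negative tests are all illustrative, not the paper's actual tooling:

```python
# Hedged sketch: an event sequence graph as an adjacency map, plus
# generation of positive test sequences that cover every edge. Any event
# pair absent from the graph is treated here as an undesirable situation
# to probe with a negative test (the paper's extension in spirit only).

ESG = {                       # "[" = entry pseudo-event, "]" = exit
    "[": ["login"],
    "login": ["search", "checkout"],
    "search": ["checkout", "search"],
    "checkout": ["]"],
}

def edge_covering_sequences(esg):
    """Greedily walk unvisited edges from the entry until all are covered."""
    uncovered = {(u, v) for u, vs in esg.items() for v in vs}
    sequences = []
    while uncovered:
        node, seq = "[", []
        while node != "]":
            nexts = [v for v in esg[node] if (node, v) in uncovered] or esg[node]
            uncovered.discard((node, nexts[0]))
            node = nexts[0]
            seq.append(node)
        sequences.append(seq[:-1])   # drop the exit pseudo-event
    return sequences

def faulty_pairs(esg):
    """Event pairs absent from the graph -> candidate negative tests."""
    events = [e for e in esg if e != "["]
    return [(u, v) for u in events for v in events if v not in esg.get(u, [])]
```

A real test harness would replay each positive sequence against the composition (e.g., through an enterprise service bus) and expect the faulty pairs to be rejected.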

Relevance:

30.00%

Publisher:

Abstract:

This work describes the development of a simulation tool for simulating the internal combustion engine (ICE), the transmission, and the vehicle dynamics. It is a control-oriented simulation tool, designed to perform both off-line (Software In the Loop) and on-line (Hardware In the Loop) simulation. In the first case, the simulation tool can be used to optimize Engine Control Unit (ECU) strategies (regarding, for example, fuel consumption or engine performance), while in the second case it can be used to test the control system. In recent years, the use of HIL simulation has proved very useful in the development and testing of control systems. Hardware In the Loop simulation is a technique in which actual vehicles, engines, or other components are replaced by a real-time simulation, based on a mathematical model running on a real-time processor. The processor reads the ECU output signals that would normally feed the actuators and, by using mathematical models, provides the signals that the actual sensors would produce. The simulation tool, designed entirely within Simulink, can simulate the engine alone, the transmission and vehicle dynamics alone, or the engine together with the vehicle and transmission dynamics, in the latter case allowing evaluation of the performance and operating conditions of the internal combustion engine once it is installed in a given vehicle. Furthermore, the simulation tool supports different levels of complexity: it is possible to use, for example, either a zero-dimensional or a one-dimensional model of the intake system (the latter only for off-line application, because of its higher computational cost).
Given these preliminary remarks, an important goal of this work is the development of a simulation environment that can be easily adapted to different engine types (single- or multi-cylinder, four-stroke or two-stroke, diesel or gasoline) and transmission architectures without reprogramming. In addition, the same simulation tool can be rapidly configured for both off-line and real-time applications. The Matlab-Simulink environment was adopted to achieve these objectives, since its graphical programming interface allows flexible and reconfigurable models to be built, and real-time simulation is possible with standard, off-the-shelf software and hardware platforms (such as dSPACE systems).
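The closed loop described above (controller output in, sensor signal out, fixed-rate plant model) can be sketched in a few lines. This is not the thesis' Simulink model: the zero-dimensional torque balance, the toy proportional "ECU", and every numeric value are assumptions for illustration:

```python
# Illustrative software-in-the-loop sketch: a zero-dimensional engine model
# stepped at a fixed rate. In a HIL setup the same plant step would run on a
# real-time processor wired to the real ECU's actuator and sensor I/O.

J   = 0.2     # crankshaft inertia [kg m^2] (assumed value)
K_T = 80.0    # torque per unit throttle command [Nm] (assumed)
B   = 0.05    # viscous load coefficient [Nm s/rad] (assumed)

def engine_step(omega, throttle, dt):
    """One Euler step of the torque balance J*domega/dt = K_T*u - B*omega."""
    domega = (K_T * throttle - B * omega) / J
    return omega + domega * dt

def ecu(omega, target=90.0, gain=0.002):
    """Toy ECU strategy: proportional throttle command, clamped to [0, 1]."""
    return min(1.0, max(0.0, gain * (target - omega)))

omega, dt = 0.0, 0.001            # engine speed [rad/s], 1 ms step
for _ in range(20000):            # 20 s of simulated time
    u = ecu(omega)                # read the "ECU output" (actuator command)
    omega = engine_step(omega, u, dt)   # model provides the "sensor" signal
```

The loop settles at the speed where controller torque balances the load, K_T·gain·(target − ω) = B·ω, i.e. ω ≈ 68.6 rad/s with these made-up constants; swapping `engine_step` for a one-dimensional intake model changes only the plant half of the loop.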

Relevance:

30.00%

Publisher:

Abstract:

Weak lensing experiments such as Euclid, a future ESA-accepted mission, aim to measure cosmological parameters with unprecedented accuracy. It is important to assess the precision that can be obtained in these measurements by applying analysis software to mock images that contain many of the noise sources present in real data. In this thesis, we present a method for simulating observations that produces realistic images of the sky according to the characteristics of the instrument and of the survey. We then use these images to test the performance of the Euclid mission. In particular, we concentrate on the precision of the photometric redshift measurements, which are key data for cosmic shear tomography. We calculate the fraction of the total observed sample that must be discarded to reach the required level of precision, equal to 0.05(1+z) for a galaxy with measured redshift z, for different ancillary ground-based observations. The results highlight the importance of u-band observations, especially to discriminate between low (z < 0.5) and high (z ~ 3) redshifts, and the need for good observing sites, with seeing FWHM < 1 arcsec. We then construct an optimal filter to detect galaxy clusters in photometric galaxy catalogues and test it on the COSMOS field, obtaining 27 lensing-confirmed detections. Applying this algorithm to mock Euclid data, we verify the possibility of detecting clusters with masses above 10^14.2 solar masses with a low rate of false detections.
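The precision requirement quoted above can be written down directly. The catalogue values below are invented for illustration; only the acceptance criterion |z_phot − z_true| < 0.05(1 + z) comes from the abstract:

```python
# Sketch of the photo-z precision cut: a galaxy is kept when its measured
# redshift is within 0.05*(1+z) of the true value; the rest of the sample
# would have to be discarded. Sample (z_phot, z_true) pairs are made up.

def accepted(z_phot, z_true, tol=0.05):
    return abs(z_phot - z_true) < tol * (1.0 + z_phot)

catalog = [(0.31, 0.30), (0.52, 0.49), (1.10, 1.45), (2.95, 3.02)]
kept = [pair for pair in catalog if accepted(*pair)]
discarded_fraction = 1.0 - len(kept) / len(catalog)
```

On this toy catalogue the third galaxy (a catastrophic outlier) fails the cut, so a quarter of the sample is discarded; the thesis computes the same fraction over simulated Euclid catalogues for each ancillary ground-based configuration.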

Relevance:

30.00%

Publisher:

Abstract:

Breast cancer is the most common cancer among women worldwide. Radiotherapy is commonly used after surgery to destroy any malignant cells remaining in the breast volume. Radiotherapy treatments must irradiate the target volume while limiting toxicity to the healthy tissues. In clinical practice, the parameters that define the radiotherapy treatment plan are selected manually using treatment simulation software. This trial-and-error process, in which the parameters are repeatedly modified and the treatment re-simulated and evaluated, can require many iterations and is therefore time-consuming. The study presented in this thesis focuses on the automatic generation of treatment plans that irradiate the whole breast volume using two approximately opposed beams tangential to the patient. In particular, we concentrated on the selection of the beam directions and the isocenter position. To this end, we investigated the effectiveness of a combinatorial approach in which a large number of candidate treatment plans are generated using different combinations of the two beam directions. The beam intensity profiles are optimized automatically by an algorithm called iCycle, developed at the Erasmus MC hospital in Rotterdam. From all the generated candidate plans, a subgroup with good coverage of the diseased breast volume is first selected; the plans among these with optimal sparing of the organs at risk (heart, lungs, and contralateral breast) are then considered.
These treatment plans are mathematically equivalent, so the best one is selected using a weighted sum whose weights were tuned to obtain, on average, plans with characteristics similar to clinically approved treatment plans. Compared with the manual process, this method not only considerably reduces the time needed to generate a treatment plan but also guarantees that the selected plans are optimal in sparing the organs at risk. Initially, the isocenter chosen in the clinic by the technician was used. In the final part of the study, the importance of the isocenter was assessed; the results show that, at least for a subgroup of patients, the isocenter position can make an important contribution to the quality of the treatment plan and could therefore be an additional parameter to optimize.
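The weighted-sum plan-selection step described above reduces, in essence, to scoring each candidate plan on its organ-at-risk criteria and taking the minimum. The sketch below is illustrative only: criterion names, dose values, and weights are invented, and the real tuning was done against clinically approved plans:

```python
# Hedged sketch: selecting one winner among Pareto-optimal candidate plans
# with a weighted sum of organ-at-risk criteria. All numbers are made up.

# Each candidate plan: mean doses (arbitrary units) to the organs at risk.
plans = {
    "beam_pair_A": {"heart": 2.1, "lungs": 5.0, "contralateral_breast": 1.2},
    "beam_pair_B": {"heart": 1.4, "lungs": 6.2, "contralateral_breast": 1.0},
    "beam_pair_C": {"heart": 3.0, "lungs": 4.1, "contralateral_breast": 1.9},
}

# Weights reflecting the relative importance of sparing each organ
# (in the study these were tuned against clinically approved plans).
weights = {"heart": 3.0, "lungs": 1.0, "contralateral_breast": 2.0}

def score(doses):
    """Lower is better: weighted sum of the organ-at-risk doses."""
    return sum(weights[organ] * dose for organ, dose in doses.items())

best = min(plans, key=lambda name: score(plans[name]))
```

Because every candidate is already Pareto-optimal, the weights encode only the clinical trade-off between organs; changing them moves the choice along the Pareto front rather than to a dominated plan.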

Relevance:

30.00%

Publisher:

Abstract:

This report presents a new automated optical test for the next generation of photonic integrated circuits (PICs), through the design and assessment of a test bed. After a brief analysis of the critical problems of current optical tests, the main test features are defined: automation and flexibility, a relaxed alignment procedure, faster overall testing, and data reliability. After studying various solutions, the test-bed components are defined as a lens array, a photo-detector array, and a software controller. Each device is studied and calibrated, and the spatial resolution and robustness against interference at the photo-detector array are examined. The software is programmed to manage both the PIC input and the photo-detector array output, as well as the data analysis. The test is validated by analysing a state-of-the-art 16-port PIC: the waveguide locations, current versus power, and time-spatial power distribution are measured, as well as the optical continuity of an entire path of the PIC. Complexity, alignment tolerance, and measurement time are also discussed.

Relevance:

30.00%

Publisher:

Abstract:

The main goal of this thesis is to facilitate the development of industrial automated systems by applying formal methods to ensure system reliability. A new formulation of the distributed diagnosability problem in terms of Discrete Event Systems theory and the automata framework is presented, which is then used to enforce the desired property of the system rather than just verifying it. This approach tackles the state explosion problem with modeling patterns and new algorithms aimed at the verification of the diagnosability property in the context of the distributed diagnosability problem. The concepts are validated with a newly developed software tool.

Relevance:

30.00%

Publisher:

Abstract:

EUMETSAT (www.eumetsat.int) is the European agency operating satellites that monitor climate, weather, and the Earth's environment. From the operations centre in Darmstadt (Germany), it controls meteorological satellites in geostationary and polar orbits that collect data for observing the atmosphere, the oceans, and the Earth's surface in a continuous 24/7 service. Centralized monitoring for the different programmes within the EUMETSAT operational environment is provided by GEMS (Generic Event Monitoring System). The software monitors different platforms, supports cross-monitoring of different operational sections, and is designed to be extensible to future missions. The current version of the GEMS MMI (Multi Media Interface), v3.6, uses standard Java Server Pages (JSP) with heavy use of Java code, and relies on ASCII files for data filtering and display. A direct consequence is, for example, that the displayed information is not updated automatically: the page must be reloaded. Further input for a new version of the GEMS MMI comes from various anomalous behaviours reported during daily use of the software. This thesis focuses on the definition of new requirements for a new version of the GEMS MMI (v4.4) by the EUMETSAT operations engineering and maintenance division. The supporting test activities were carried out at Solenix. The new software will provide a better web application, with faster response times, automatic information updates, full use of the GEMS database and its filtering capabilities, and mobile-phone applications to support on-call activities. The new version of GEMS will have a new Graphical User Interface (GUI) built with modern technologies.
For an operations environment such as EUMETSAT's, where the reliability of the technologies and the longevity of the chosen approach are of vital importance, not all currently available tools are suitable, and some need to be improved. At the same time, a modern interface, in terms of visual design, interactivity, and functionality, is important for the new GEMS MMI.

Relevance:

30.00%

Publisher:

Abstract:

Data sets describing the state of the earth's atmosphere are of great importance in the atmospheric sciences. Over the last decades, the quality and sheer amount of the available data have increased significantly, resulting in a rising demand for new tools capable of handling and analysing these large, multidimensional sets of atmospheric data. The interdisciplinary work presented in this thesis covers the development and application of practical software tools and efficient algorithms from the field of computer science, with the goal of enabling atmospheric scientists to analyse and gain new insights from these large data sets. For this purpose, our tools combine novel techniques with well-established methods from different areas such as scientific visualization and data segmentation. Three practical tools are presented in this thesis. Two of them are software systems (Insight and IWAL) for different types of processing and interactive visualization of data; the third is an efficient data segmentation algorithm implemented as part of Insight.

Insight is a toolkit for the interactive, three-dimensional visualization and processing of large sets of atmospheric data, originally developed as a testing environment for the novel segmentation algorithm. It provides a dynamic system for combining data from different sources at runtime, a variety of data processing algorithms, and several visualization techniques. Its modular architecture and flexible scripting support led to additional applications of the software, of which two examples are presented: the use of Insight as a WMS (web map service) server, and the automatic production of image sequences for the visualization of cyclone simulations.
The core application of Insight is the novel segmentation algorithm for the efficient detection and tracking of 3D features in large sets of atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging, and splitting events. Data segmentation usually leads to a significant reduction in the size of the considered data. This enables a practical visualization of the data, statistical analyses of the features and their events, and the manual or automatic detection of interesting situations for subsequent detailed investigation. The concepts of the novel algorithm, its technical realization, and several extensions for avoiding under- and over-segmentation are discussed. As example applications, this thesis covers the setup and results of segmenting upper-tropospheric jet streams and cyclones as full 3D objects. Finally, IWAL is presented: a web application that provides easy interactive access to meteorological data visualizations, aimed primarily at students. As a web application, it avoids the need to retrieve all input data sets and to install and operate complex visualization tools on a local machine. The main challenge in providing customizable visualizations to large numbers of simultaneous users was to find an acceptable trade-off between the available visualization options and the performance of the application. Besides the implementation details, benchmarks and the results of a user survey are presented.
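The detection half of the segmentation idea above can be sketched in its simplest form: threshold a 3D scalar field, then label connected components as features (a tracking step would then match labels across time steps). This is a generic illustration under assumed data, not the thesis' algorithm:

```python
# Sketch: threshold a sparse 3D field and flood-fill 6-connected
# components, assigning each feature a label. Field values are made up.

from collections import deque

def label_features(field, threshold):
    """field: dict {(x, y, z): value}. Returns {(x, y, z): feature_id}."""
    labels, next_id = {}, 0
    cells = {p for p, v in field.items() if v >= threshold}
    for seed in sorted(cells):
        if seed in labels:
            continue
        next_id += 1                      # new feature found: flood-fill it
        queue = deque([seed])
        labels[seed] = next_id
        while queue:
            x, y, z = queue.popleft()
            for dx, dy, dz in [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                               (0, -1, 0), (0, 0, 1), (0, 0, -1)]:
                n = (x + dx, y + dy, z + dz)
                if n in cells and n not in labels:
                    labels[n] = next_id
                    queue.append(n)
    return labels

# Two separated blobs in a toy wind-speed field -> two feature candidates.
field = {(0, 0, 0): 9.0, (1, 0, 0): 8.5,
         (5, 5, 5): 7.2, (5, 5, 6): 7.9, (2, 2, 2): 1.0}
labels = label_features(field, threshold=5.0)
```

Dense gridded data would use array-based labeling instead of a dictionary, and the thesis' extensions against under- and over-segmentation go well beyond a plain threshold.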

Relevance:

30.00%

Publisher:

Abstract:

Radiotherapy is a widely used technique for cancer treatment. Today, treatment is delivered mainly through intensity modulated radiotherapy (IMRT, the superposition of intensity-modulated fields), of which volumetric modulated arc therapy (VMAT, continuous irradiation along an uninterrupted arc) is a recent development. Plan generation requires experience and skill: a dosimetrist selects cost functions and objectives, and a treatment planning system (TPS) optimizes the arrangement of the intensity-modulated segments. If the physician judges the result unsatisfactory, the process starts over (trial and error). An alternative is automatic plan generation. Erasmus-iCycle, software developed at Erasmus MC (Rotterdam, The Netherlands), is a multicriteria optimization algorithm for radiotherapy plans, based on a wish list, for intensity optimization. Its output consists of Pareto-optimal intensity-modulated plans. Automatic generation guarantees greater consistency and higher quality with reduced working times. In this study, a procedure for the automatic generation of VMAT plans was developed and evaluated for lung carcinoma. A wish list was created through an iterative procedure on a small group of patients, in collaboration with medical physicists and oncologists, and then validated on a larger group of patients. In the large majority of cases, the automatic plans were judged by the oncologists to be better than the corresponding manually generated clinical IMRT plans. Only in a few cases was a quick, patient-specific manual calibration needed to satisfy all clinical requirements. For a subgroup of patients, the quality of the automatic VMAT plans was shown to be equivalent or superior to VMAT plans generated manually by an expert dosimetrist.
Overall, the study demonstrates that high-quality VMAT treatment plans can be generated automatically, with minimal human interaction. Clinical introduction of the automatic procedure at Erasmus MC has begun (October 2015).

Relevance:

30.00%

Publisher:

Abstract:

The multi-target screening method described in this work allows the simultaneous detection and identification of 700 drugs and metabolites in biological fluids, using a hybrid triple-quadrupole linear ion trap mass spectrometer, in a single analytical run. After standardization of the method, the retention times of the 700 compounds were determined, and transitions for each compound were selected by a "scheduled" survey MRM scan followed by information-dependent acquisition using the sensitive enhanced product ion scan of a Q TRAP hybrid instrument. The compounds in the analyzed samples were identified by searching the tandem mass spectrometry (MS/MS) spectra against the library we developed, which contains electrospray ionization MS/MS spectra of over 1,250 compounds. The multi-target screening method, together with the library, was included in a software program for routine screening and quantitation, achieving automated acquisition and library searching. With the help of this software, the time needed to evaluate and interpret the results could be drastically reduced. This new multi-target screening method has been successfully applied to the analysis of postmortem and traffic offense samples as well as proficiency testing, and it complements screening with immunoassays, gas chromatography-mass spectrometry, and liquid chromatography-diode-array detection. Other possible applications include clinical toxicology (intoxication cases), psychiatry (antidepressants and other psychoactive drugs), and forensic toxicology (drugs and driving, workplace drug testing, oral fluid analysis, and drug-facilitated sexual assault).
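The library-search step above can be illustrated with a standard spectral-matching technique. The sketch below uses synthetic spectra and a plain cosine similarity; the actual commercial library and its scoring algorithm are not reproduced here:

```python
# Hedged sketch: identify a compound by matching an acquired MS/MS spectrum
# against reference library spectra with cosine similarity. Peak lists are
# synthetic; real libraries use binned m/z values and tuned score functions.

import math

def cosine_match(query, reference):
    """query/reference: dict {mz: intensity}. Returns similarity in [0, 1]."""
    mzs = set(query) | set(reference)
    dot = sum(query.get(m, 0.0) * reference.get(m, 0.0) for m in mzs)
    nq = math.sqrt(sum(v * v for v in query.values()))
    nr = math.sqrt(sum(v * v for v in reference.values()))
    return dot / (nq * nr) if nq and nr else 0.0

library = {
    "compound_A": {182.1: 100.0, 105.0: 35.0, 82.1: 60.0},
    "compound_B": {286.1: 100.0, 201.1: 40.0, 165.1: 55.0},
}

acquired = {182.1: 95.0, 105.0: 30.0, 82.1: 66.0}   # unknown sample spectrum
best_hit = max(library, key=lambda name: cosine_match(acquired, library[name]))
```

In a routine screening workflow the match score would be combined with the scheduled retention-time window before a hit is reported.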

Relevance:

30.00%

Publisher:

Abstract:

Code duplication is common in current programming practice: programmers search for snippets of code, incorporate them into their projects, and then modify them to their needs. In today's practice, no automated scheme is in place to inform either party of distant changes to the code. As code snippets continue to evolve both on the side of the user and on the side of the author, both may wish to benefit from remote bug fixes or refinements: authors may be interested in the actual usage of their code snippets, and researchers could gather information on clone usage. We propose maintaining a link between software clones across repositories and outline how such links can be created and maintained.

Relevance:

30.00%

Publisher:

Abstract:

When reengineering legacy systems, it is crucial to assess whether the legacy behavior has been preserved, or how it changed due to the reengineering effort. Ideally, if a legacy system is covered by tests, running the tests on the new version can identify potential differences or discrepancies. However, writing tests for an unknown and large system is difficult owing to the lack of internal knowledge; it is especially difficult to bring the system into an appropriate state. Our solution is based on the acknowledgment that one of the few trustworthy pieces of information available when approaching a legacy system is the running system itself. Our approach reifies the execution traces and uses logic programming to express tests on them. It thereby eliminates the need to programmatically bring the system into a particular state, and hands the test writer a high-level abstraction mechanism for querying the trace. The resulting system, called TESTLOG, was used on several real-world case studies to validate our claims.
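The core idea above (reify the trace as data, then express a test as a declarative query over it) can be sketched without a logic engine. The event format and predicates below are hypothetical, not TESTLOG's actual Smalltalk/logic-programming machinery:

```python
# Minimal sketch: an execution trace reified as a list of (event, fields)
# records, queried by declarative predicates instead of driving the system
# into a particular state programmatically.

trace = [
    ("open",  {"file": "ledger.dat"}),
    ("write", {"file": "ledger.dat", "bytes": 128}),
    ("close", {"file": "ledger.dat"}),
]

def events(trace, name, **constraints):
    """All trace events with the given name whose fields match constraints."""
    return [(i, args) for i, (n, args) in enumerate(trace)
            if n == name and all(args.get(k) == v
                                 for k, v in constraints.items())]

def eventually_after(trace, first, then, **constraints):
    """Test predicate: every `first` event is later followed by a `then`."""
    firsts = events(trace, first, **constraints)
    thens = events(trace, then, **constraints)
    return all(any(j > i for j, _ in thens) for i, _ in firsts)

# The "test": every open of this file is eventually followed by a close.
ok = eventually_after(trace, "open", "close", file="ledger.dat")
```

Running the same query against a trace of the reengineered system would flag a behavioral discrepancy (e.g., a file left open) without any test fixture having to reconstruct that state.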