941 results for test case optimization
Abstract:
The self, roles and the ongoing coordination of human action. Trying to see ‘society’ as neither prison nor puppet theatre. The article argues that structural North American role sociology may be integrated with theories that emphasize ‘society’ as an ongoing process (for example, Giddens’ theory of structuration). This is possible if the concept of role is defined as a recurrence oriented to the action of others that stands out as a regularity in a societal process. This definition, however, makes it necessary to understand in a fundamental way what kind of social being the role actor is. This is done with the help of Hans Joas’ theory of creativity and Merleau-Ponty’s concept of ‘flesh’, arguing that Mead’s concept of the ‘I’ may be understood as an embodied, self-asserting I, which at least in reflexive modernity has the creative power to split Mead’s ‘me’ into a self-voiced subject-me and an other-voiced object-me. The embodied I communicating with the subject-me may be viewed as the role actor that is something other than the role played. But this kind of role actor creates new troubles, because it is hard to understand how such a self creates self-coherence by means of Mead’s concept of ‘the generalized other’. This trouble is handled by using Alain Touraine’s concept of the ‘subject’ and arguing that the generalized other dissolves in de-modernized modernity. In split modernity, self-coherence may instead be created by what the article calls the generalized subject. This concept denotes a kind of communicative, future-based evaluation, grounded in the ‘subject’ opposing the split powers of both the instrumentality of markets and of life-worlds trying to create ‘fundamentalist’ self-identities. This kind of self is communicative because it must also respect the other as ‘subject’. It exists only in the battle against the forces of the market or a community. It never constructs an ideal city or a higher type of individual. It creates and protects a clearing that is constantly being invaded, to use the words of the old Frenchman himself. As a kind of test case, the article also shows how Beck’s concept of individualization may be understood in a deeply social and role-sociological way.
Abstract:
Research on the inverted pendulum has gained momentum over the last decade in a number of robotics laboratories around the world; because of its unstable properties, it is a good example for control engineers to verify a control theory. To verify that the pendulum can balance, we can run simulations using a closed-loop controller such as the linear quadratic regulator or the proportional-integral-derivative method. The idea of robotic teleoperation is also gaining ground: controlling a robot at a distance, and doing so precisely. Designing a tool that takes the best advantage of human skills while keeping the error minimal is an interesting problem, and because the inverted pendulum is an unstable system it makes a compelling test case for exploring dynamic teleoperation. This thesis therefore focuses on the construction of a two-wheel inverted pendulum robot: which sensors we can use, how they must be integrated into the system, and how a human can control an inverted pendulum. The inverted pendulum robot developed employs sensors, actuators and controllers. The thesis starts by presenting an introduction to inverted pendulums and some background on related areas such as control theory. It continues by describing related work in this area. Then we describe the mathematical model of a two-wheel inverted pendulum and a simulation made in Matlab. We also focus on the construction of this type of robot and its working theory. Because this is a mobile robot, we address the theme of teleoperation, and the thesis finishes with a general conclusion and ideas for future work.
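As a rough illustration of the closed-loop balance check described above, the sketch below stabilizes a linearized cart-pole with a linear quadratic regulator. It is a minimal Python/SciPy stand-in, not the thesis' Matlab model or the two-wheel robot itself: the cart mass, pole mass, pole length and weighting matrices are assumed values chosen only for the example.

```python
# Hedged sketch: LQR balancing of a linearized cart-pole; all physical
# parameters below are illustrative assumptions, not the thesis' robot.
import numpy as np
from scipy.linalg import solve_continuous_are

M, m, l, g = 1.0, 0.2, 0.5, 9.81           # assumed cart mass, pole mass, pole length, gravity

# State x = [cart position, cart velocity, pole angle, pole angular rate],
# linearized about the upright equilibrium; u is the horizontal force on the cart.
A = np.array([[0, 1, 0, 0],
              [0, 0, -m * g / M, 0],
              [0, 0, 0, 1],
              [0, 0, (M + m) * g / (M * l), 0]])
B = np.array([[0.0], [1 / M], [0.0], [-1 / (M * l)]])

Q = np.diag([10.0, 1.0, 100.0, 1.0])        # penalize angle error most heavily
R = np.array([[0.1]])

P = solve_continuous_are(A, B, Q, R)        # solve the continuous-time Riccati equation
K = np.linalg.solve(R, B.T @ P)             # LQR gain: u = -K x

# Simple Euler simulation of the closed loop from a small initial tilt.
x, dt = np.array([0.0, 0.0, 0.1, 0.0]), 0.001
for _ in range(5000):
    u = -K @ x
    x = x + dt * (A @ x + (B @ u).ravel())
print("final state:", np.round(x, 4))       # should settle near the upright equilibrium
```

Running the loop for a few simulated seconds from a small initial tilt should show the state returning to the upright equilibrium, which is the kind of balance verification the abstract refers to.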
Abstract:
Formal methods and software testing are tools to obtain and control software quality. When used together, they provide mechanisms for software specification, verification and error detection. Even though formal methods allow software to be mathematically verified, they are not enough to assure that a system is free of faults; thus, software testing techniques are necessary to complement the verification and validation of a system. Model-based testing techniques allow tests to be generated from other software artifacts such as specifications and abstract models. Using formal specifications as the basis for test creation, we can generate better quality tests, because these specifications are usually precise and free of ambiguity. Fernanda Souza (2009) proposed a method to define test cases from B Method specifications. This method used information from the machine's invariant and the operation's precondition to define positive and negative test cases for an operation, using techniques based on equivalence class partitioning and boundary value analysis. However, the method proposed in 2009 was not automated and had conceptual deficiencies; for instance, it did not fit into a well-defined coverage criteria classification. We started our work with a case study that applied the method to an example of a B specification from industry. Based on this case study, we obtained input for improving the method. In our work we evolved the proposed method, rewriting it and adding characteristics to make it compatible with a test classification used by the community. We also improved the method to support specifications structured in different components, to use information from the operation's behavior in the test case generation process, and to use new coverage criteria. In addition, we implemented a tool to automate the method and submitted it to more complex case studies.
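The two techniques the method builds on can be illustrated with a toy example. The sketch below is not the tool described above; it merely applies equivalence class partitioning and boundary value analysis to an invented operation whose B-style precondition is assumed to constrain an integer parameter to a range, producing positive and negative test inputs.

```python
# Hedged sketch of equivalence class partitioning and boundary value analysis
# for a toy operation whose precondition is assumed to be
# "x : NAT & x >= lo & x <= hi". Not the tool described in the abstract.

def partition_tests(lo: int, hi: int):
    """Return positive/negative test inputs for the range precondition [lo, hi]."""
    positive = {
        "lower boundary": lo,
        "just above lower boundary": lo + 1,
        "nominal value": (lo + hi) // 2,
        "just below upper boundary": hi - 1,
        "upper boundary": hi,
    }
    negative = {
        "just below lower boundary": lo - 1,   # violates x >= lo
        "just above upper boundary": hi + 1,   # violates x <= hi
    }
    return positive, negative

pos, neg = partition_tests(0, 100)
for label, value in {**pos, **neg}.items():
    expected = "accepted" if 0 <= value <= 100 else "rejected"
    print(f"{label:30s} x = {value:4d} -> operation should be {expected}")
```

Each generated value is labeled with the class or boundary it exercises, mirroring the positive/negative split that the method derives from the machine's invariant and the operation's precondition.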
Abstract:
The Weyl-Wigner prescription for quantization on Euclidean phase spaces makes essential use of Fourier duality. The extension of this property to more general phase spaces requires the use of Kac algebras, which provide the necessary background for the implementation of Fourier duality on general locally compact groups. Kac algebras - and the duality they incorporate - are consequently examined as candidates for a general quantization framework extending the usual formalism. Using as a test case the simplest nontrivial phase space, the half-plane, it is shown how the structures present in the complete-plane case must be modified. Traces, for example, must be replaced by their noncommutative generalizations - weights - and the correspondence embodied in the Weyl-Wigner formalism is no longer complete. Provided the underlying algebraic structure is suitably adapted to each case, Fourier duality is shown to be indeed a very powerful guide to the quantization of general physical systems.
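For reference, the complete-plane correspondence that the half-plane test case modifies can be written, in one common textbook convention, as follows (this display is standard material, not taken from the paper itself):

```latex
% Standard Euclidean-phase-space Weyl-Wigner correspondence (one common convention);
% the paper examines how this structure must change on the half-plane.
\begin{align}
  A_W(q,p) &= \int_{-\infty}^{\infty} dy\;
      e^{\,i p y/\hbar}\,
      \Big\langle q - \tfrac{y}{2}\,\Big|\,\hat{A}\,\Big|\,q + \tfrac{y}{2}\Big\rangle ,\\
  W_\rho(q,p) &= \frac{1}{2\pi\hbar}\int_{-\infty}^{\infty} dy\;
      e^{-\,i p y/\hbar}\,
      \Big\langle q + \tfrac{y}{2}\,\Big|\,\hat{\rho}\,\Big|\,q - \tfrac{y}{2}\Big\rangle ,\\
  \operatorname{Tr}\big(\hat{\rho}\,\hat{A}\big)
      &= \int dq\,dp\; W_\rho(q,p)\, A_W(q,p).
\end{align}
```

On the half-plane, as the abstract notes, the trace in the last line must be replaced by its noncommutative generalization, a weight, and the correspondence is no longer complete.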
Abstract:
This paper develops a novel, fully analytic model for vibration analysis of solid-state electronic components. The model is as accurate as finite element models and numerically light enough to permit quick design trade-offs and statistical analysis. The paper shows the development of the model, a comparison with finite elements and an application to a common engineering problem. A gull-wing flat pack component was selected as the benchmark test case, although the presented methodology is applicable to a wide range of component packages. Results showed very good agreement between the presented method and finite elements and demonstrated the usefulness of the method in applying standard test data to a general application. © 2013 Elsevier Ltd.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
People's daily lives are surrounded by computing devices with increasing resources (sensors) and increasing processing power. How these devices communicate is still not natural, and this slows the growth of ubiquitous computing. This paper presents a way in which these devices can communicate using Jini technology and concepts of Service-Oriented Architecture (SOA), applying these concepts to a test case of ubiquitous computing. To conduct the test case, a fictitious system for managing a soccer championship was built, in which users can interact with each other and with the system in a simplified way and have access to real-time data about the championship during the event. This communication is performed by services built using Jini technology, which were based on key SOA concepts such as modularity and reusability.
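Jini itself is a Java technology, so the sketch below does not reproduce its API; it is only a minimal, language-agnostic illustration (written in Python, with invented names) of the register-and-lookup pattern behind the SOA-style services mentioned above.

```python
# Hedged illustration only: a tiny in-process stand-in for the publish/lookup
# idea underlying SOA-style service discovery. All names are made up and none
# of this corresponds to the Jini API used in the paper.

class ServiceRegistry:
    """Very small in-process stand-in for a lookup service."""
    def __init__(self):
        self._services = {}

    def register(self, name, service):
        self._services[name] = service          # publish a service under a name

    def lookup(self, name):
        return self._services[name]             # clients discover services by name

class ScoreService:
    """Hypothetical reusable service holding live championship scores."""
    def __init__(self):
        self._scores = {}

    def update(self, match, score):
        self._scores[match] = score

    def current(self, match):
        return self._scores.get(match, "0 x 0")

registry = ServiceRegistry()
registry.register("scores", ScoreService())     # the service is published once...
client_view = registry.lookup("scores")         # ...and any client can look it up
client_view.update("Team A vs Team B", "2 x 1")
print(client_view.current("Team A vs Team B"))
```

The point of the pattern is the modularity and reusability the abstract highlights: a service is published once, and any client that looks it up can reuse it without knowing its implementation.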
Abstract:
Pós-graduação em Engenharia Mecânica - FEG
Abstract:
The ability of nanoassisted laser desorption-ionization mass spectrometry (NALDI-MS) imaging to provide selective chemical monitoring with proper spatial distribution of lipid profiles from tumor tissues after plate imprinting has been tested. NALDI-MS imaging identified and mapped several potential lipid biomarkers in a murine model of melanoma tumor (inoculation of B16/F10 cells). It also confirmed that the in vivo treatment of tumor bearing mice with synthetic supplement containing phosphoethanolamine (PHO-S) promoted an accentuated decrease in relative abundance of the tumor biomarkers. NALDI-MS imaging is a matrix-free LDI protocol based on the selective imprinting of lipids in the NALDI plate followed by the removal of the tissue. It therefore provides good quality and selective chemical images with preservation of spatial distribution and less interference from tissue material. The test case described herein illustrates the potential of chemically selective NALDI-MS imaging for biomarker discovery.
Abstract:
The behavior of composed Web services depends on the results of the invoked services; unexpected behavior of one of the invoked services can threaten the correct execution of an entire composition. This paper proposes an event-based approach to black-box testing of Web service compositions based on event sequence graphs, which are extended by facilities to deal not only with service behavior under regular circumstances (i.e., where cooperating services are working as expected) but also with their behavior in undesirable situations (i.e., where cooperating services are not working as expected). Furthermore, the approach can be used independently of artifacts (e.g., Business Process Execution Language) or type of composition (orchestration/choreography). A large case study, based on a commercial Web application, demonstrates the feasibility of the approach and analyzes its characteristics. Test generation and execution are supported by dedicated tools. In particular, the use of an enterprise service bus for test execution is noteworthy and differs from other approaches. The results of the case study suggest that the new approach has the power to detect faults systematically, performing properly even with complex and large compositions. Copyright © 2012 John Wiley & Sons, Ltd.
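As a rough sketch of the event-based idea, the snippet below derives tests from a tiny, invented event sequence graph: entry-to-exit event sequences serve as tests for regular behavior, and event pairs not allowed by the graph suggest tests for undesirable situations. It is an illustration only, not the paper's tool or case study.

```python
# Hedged sketch of event-sequence-graph-based test derivation. The graph below
# is an invented example of a composed service interaction; "[" and "]" are
# entry/exit pseudo-events.

ESG = {                       # adjacency list: event -> events allowed to follow it
    "[": ["search"],
    "search": ["select", "]"],
    "select": ["pay"],
    "pay": ["confirm"],
    "confirm": ["]"],
    "]": [],
}

def complete_event_sequences(graph, node="[", path=None):
    """Enumerate all entry-to-exit event sequences (tests for regular behavior)."""
    path = (path or []) + [node]
    if node == "]":
        yield path
        return
    for nxt in graph[node]:
        yield from complete_event_sequences(graph, nxt, path)

def faulty_event_pairs(graph):
    """Event pairs not allowed by the graph (candidates for negative tests)."""
    events = [e for e in graph if e not in ("[", "]")]
    return [(a, b) for a in events for b in events if b not in graph[a]]

for seq in complete_event_sequences(ESG):
    print("positive test:", " -> ".join(seq[1:-1]))
for pair in faulty_event_pairs(ESG):
    print("negative test (unexpected pair):", pair)
```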
Abstract:
Interactive theorem provers (ITPs for short) are tools whose final aim is to certify proofs written by human beings. To reach that objective they have to fill the gap between the high-level language used by humans for communicating and reasoning about mathematics and the lower-level language that a machine is able to “understand” and process. The user perceives this gap in terms of missing features or inefficiencies. The developer tries to accommodate the user's requests without increasing the already high complexity of these applications. We believe that satisfactory solutions can only come from a strong synergy between users and developers. We devoted most of our PhD to designing and developing the Matita interactive theorem prover. The software was born in the computer science department of the University of Bologna as the result of composing together all the technologies developed by the HELM team (to which we belong) for the MoWGLI project. The MoWGLI project aimed at giving web access to the libraries of formalised mathematics of various interactive theorem provers, taking Coq as the main test case. The motivations for giving life to a new ITP are:
• to study the architecture of these tools, with the aim of understanding the source of their complexity;
• to exploit such knowledge to experiment with new solutions that, for backward compatibility reasons, would be hard (if not impossible) to test on a widely used system like Coq.
Matita is based on the Curry-Howard isomorphism, adopting the Calculus of Inductive Constructions (CIC) as its logical foundation. Proof objects are thus, to some extent, compatible with the ones produced with the Coq ITP, which is itself able to import and process the ones generated using Matita. Although the systems have a lot in common, they share no code at all, and even most of the algorithmic solutions are different. The thesis is composed of two parts, in which we respectively describe our experience as a user and as a developer of interactive provers. In particular, the first part is based on two different formalisation experiences:
• our internship in the Mathematical Components team (INRIA), which is formalising the finite group theory required to attack the Feit-Thompson Theorem. To tackle this result, which gives an effective classification of finite groups of odd order, the team adopts the SSReflect Coq extension, developed by Georges Gonthier for the proof of the four colour theorem;
• our collaboration on the D.A.M.A. project, whose goal is the formalisation of abstract measure theory in Matita leading to a constructive proof of Lebesgue's Dominated Convergence Theorem.
The most notable issues we faced, analysed in this part of the thesis, are the following: the difficulties arising when using “black box” automation in large formalisations; the impossibility for a user (especially a newcomer) to master the content of a library of already formalised results; the uncomfortable big-step execution of proof commands historically adopted in ITPs; and the difficult encoding of mathematical structures with a notion of inheritance in a type theory without subtyping, like CIC.
In the second part of the manuscript, many of these issues are analysed from the perspective of an ITP developer, describing the solutions we adopted in the implementation of Matita to solve these problems: integrated searching facilities to assist the user in handling large libraries of formalised results; a small-step execution semantics for proof commands; a flexible implementation of coercive subtyping allowing multiple inheritance with shared substructures; and automatic tactics, integrated with the searching facilities, that generate proof commands (and not only proof objects, which are usually kept hidden from the user), one of which is specifically designed to be user driven.
Abstract:
This thesis work highlights some issues related to exascale machines (systems delivering one exaflops of computing power) and to the evolution of the software that will run on them. It focuses primarily on the need for their development, since they are indispensable for studying larger scientific and technological problems, with particular attention to materials science, one of the fields that has benefited most from the use of supercomputers, and to one of the most widely used HPC codes in this context: Quantum ESPRESSO. On the software side, the first energy-efficiency measurements on a hybrid architecture, obtained with Quantum ESPRESSO on the EURORA prototype cluster, are presented. These are the first such measurements published in the context of materials science software and will serve as a baseline for future energy-efficiency-oriented optimizations. On exascale machines, in fact, energy efficiency will be one of the requirements for access, just as code scalability is a requirement today. Another very important aspect of exascale machines is the reduction of the number of communications, which lowers the energy cost of a parallel algorithm, since on these new systems moving data will cost more energy than computing it. For this reason, this work presents a strategy, and its implementation, to increase data locality in one of the most computationally expensive algorithms in Quantum ESPRESSO: the Fast Fourier Transform (FFT). To bring current software to an exascale machine, one must start testing the robustness of that software and its workflows on test cases that stress the machines currently available as much as possible. In this thesis, to test the workflow of Quantum ESPRESSO and WanT, a transport-calculation code, a scientifically relevant system was characterized, consisting of a PDI-FCN2 crystal used for building organic transistors (OFETs). Finally, an ideal device consisting of two gold electrodes with a single organic molecule in the middle was simulated.
Abstract:
This work focuses on the fluid-dynamic study of the cavitating multiphase flow in an injector for direct-injection (GDI) engine applications. The analysis was carried out with the CFD (Computational Fluid Dynamics) software Star-CCM+® developed by CD-adapco. The objective of this study is to investigate the reasons for the different behavior observed between the experimental characterization measurements of the injector and what is expected from the nominal values given by the injector specification, with particular reference to the flow-rate distribution among the different nozzle holes. This work is part of a pair of related theses and therefore also aims to provide data useful for the development of the companion study, which is devoted to identifying quality parameters of the non-reacting air-fuel mixture useful for predicting the formation of particulate produced by the combustion of a GDI engine. The thesis, consisting of 5 chapters, is structured as follows. Chapter 1 presents the motivations behind the work and reviews the state of the art of GDI technology. Chapter 2 provides the theoretical background: it reports the fundamentals of the cavitation process in the first part and the numerical models used in the analysis in the second. Chapter 3 describes the modeling and subsequent validation of the models through comparison with the test case 'Comprehensive hydraulic and flow field documentation in model throttle experiments under cavitation conditions' (E. Winklhofer, 2001). In choosing the models and the related parameters, the analysis was based on previous works found in the literature. A sensitivity study was then carried out to evaluate the stability of the solution to small variations in the parameter values. The choice of modeling parameters for the case of interest, the multihole injector, was initially based on the 'optimal' values obtained in the test case and is the subject of Chapter 4. That chapter also discusses the subsequent sensitivity analysis, carried out to understand the reasons for the imbalance between corresponding holes and for the greater development of the central jet with respect to the others. Chapter 5, after a brief summary of the main points covered in the thesis, draws conclusions on the analysis and outlines future developments.
Abstract:
The AEGISS (Ascertainment and Enhancement of Gastrointestinal Infection Surveillance and Statistics) project aims to use spatio-temporal statistical methods to identify anomalies in the space-time distribution of non-specific, gastrointestinal infections in the UK, using the Southampton area in southern England as a test-case. In this paper, we use the AEGISS project to illustrate how spatio-temporal point process methodology can be used in the development of a rapid-response, spatial surveillance system. Current surveillance of gastroenteric disease in the UK relies on general practitioners reporting cases of suspected food-poisoning through a statutory notification scheme, voluntary laboratory reports of the isolation of gastrointestinal pathogens and standard reports of general outbreaks of infectious intestinal disease by public health and environmental health authorities. However, most statutory notifications are made only after a laboratory reports the isolation of a gastrointestinal pathogen. As a result, detection is delayed and the ability to react to an emerging outbreak is reduced. For more detailed discussion, see Diggle et al. (2003). A new and potentially valuable source of data on the incidence of non-specific gastro-enteric infections in the UK is NHS Direct, a 24-hour phone-in clinical advice service. NHS Direct data are less likely than reports by general practitioners to suffer from spatially and temporally localized inconsistencies in reporting rates. Also, reporting delays by patients are likely to be reduced, as no appointments are needed. Against this, NHS Direct data sacrifice specificity. Each call to NHS Direct is classified only according to the general pattern of reported symptoms (Cooper et al, 2003). The current paper focuses on the use of spatio-temporal statistical analysis for early detection of unexplained variation in the spatio-temporal incidence of non-specific gastroenteric symptoms, as reported to NHS Direct. Section 2 describes our statistical formulation of this problem, the nature of the available data and our approach to predictive inference. Section 3 describes the stochastic model. Section 4 gives the results of fitting the model to NHS Direct data. Section 5 shows how the model is used for spatio-temporal prediction. The paper concludes with a short discussion.
Abstract:
In the context of expensive numerical experiments, a promising solution for alleviating the computational costs consists of using partially converged simulations instead of exact solutions. The gain in computational time is at the price of precision in the response. This work addresses the issue of fitting a Gaussian process model to partially converged simulation data for further use in prediction. The main challenge consists of the adequate approximation of the error due to partial convergence, which is correlated in both design variables and time directions. Here, we propose fitting a Gaussian process in the joint space of design parameters and computational time. The model is constructed by building a nonstationary covariance kernel that reflects accurately the actual structure of the error. Practical solutions are proposed for solving parameter estimation issues associated with the proposed model. The method is applied to a computational fluid dynamics test case and shows significant improvement in prediction compared to a classical kriging model.
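A minimal sketch of the joint-space idea is given below. It uses a stationary product kernel from scikit-learn as a stand-in, whereas the paper builds a custom nonstationary covariance that reflects the actual structure of the partial-convergence error; the toy simulator and all numerical values are invented for illustration.

```python
# Hedged sketch only: a Gaussian process fitted in the joint space of a design
# variable x and the computational time t spent on the simulation. Synthetic
# data; not the paper's nonstationary kernel or CFD test case.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

def partially_converged_response(x, t):
    """Toy simulator: true response plus an error that shrinks as t grows."""
    return np.sin(3 * x) + (1.0 / t) * rng.normal(size=np.shape(x))

# Training set: each row is (design variable, computational time spent).
x = rng.uniform(0, 2, size=40)
t = rng.uniform(1, 10, size=40)
X_joint = np.column_stack([x, t])
y = partially_converged_response(x, t)

# Anisotropic RBF over (x, t) plus a noise term; hyperparameters fitted by ML.
kernel = RBF(length_scale=[0.5, 5.0]) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_joint, y)

# Predict the (nearly) converged response by querying at a large time budget.
x_new = np.linspace(0, 2, 5)
X_pred = np.column_stack([x_new, np.full_like(x_new, 10.0)])
mean, std = gp.predict(X_pred, return_std=True)
for xi, m, s in zip(x_new, mean, std):
    print(f"x = {xi:.2f}  predicted = {m:+.3f} +/- {s:.3f}  (true {np.sin(3*xi):+.3f})")
```

Querying the fitted model at a large computational-time budget approximates the fully converged response, which is the kind of prediction the paper targets with its nonstationary kernel when comparing against a classical kriging model.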