916 results for automatic test case generation


Relevance:

100.00%

Publisher:

Abstract:

The immersed boundary method is a versatile tool for the investigation of flow-structure interaction. In a large number of applications, the immersed boundaries or structures are very stiff, and strong tangential forces on these interfaces induce a well-known, severe time-step restriction for explicit discretizations. This excessive stability constraint can be removed with fully implicit or suitable semi-implicit schemes, but at a seemingly prohibitive computational cost. While economical alternatives have been proposed recently for some special cases, there is a practical need for a computationally efficient approach that can be applied more broadly. In this context, we revisit a robust semi-implicit discretization introduced by Peskin in the late 1970s which has received renewed attention recently. This discretization, in which the spreading and interpolation operators are lagged, leads to a linear system of equations for the interface configuration at the future time when the interfacial force is linear. However, this linear system is large and dense, and thus challenging to solve efficiently. Moreover, while the same linear system or one of similar structure could potentially be used in Newton-type iterations, nonlinear and highly stiff immersed structures pose additional challenges to iterative methods. In this work, we address these problems and propose cost-effective computational strategies for solving Peskin's lagged-operators type of discretization. We do this by first constructing a sufficiently accurate approximation to the system's matrix, for which we obtain a rigorous error estimate. This matrix is computed expeditiously using a combination of pre-calculated values and interpolation. The availability of a matrix allows for more efficient matrix-vector products and facilitates the design of effective iterative schemes. We propose efficient iterative approaches to deal with both linear and nonlinear interfacial forces and simple or complex immersed structures with tethered or untethered points. One of these iterative approaches employs a splitting in which we first solve a linear problem for the interfacial force and then use a nonlinear iteration to find the interface configuration corresponding to this force. We demonstrate that the proposed approach is several orders of magnitude more efficient than the standard explicit method. In addition to considering the standard elliptical drop test case, we show both the robustness and efficacy of the proposed methodology with a 2D model of a heart valve. (C) 2009 Elsevier Inc. All rights reserved.
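To make the structure of this discretization concrete, the following schematic uses standard immersed boundary notation (an illustrative reconstruction, not an equation quoted from the paper): lagging the spreading operator S_n and the interpolation operator S*_n at the current time level, while treating a linear interfacial force F = AX implicitly, yields a linear system for the interface configuration X^{n+1}.

    % Schematic lagged-operator semi-implicit update (standard IB
    % notation, reconstructed for illustration): S_n spreads interface
    % forces to the fluid grid, S_n^* interpolates grid velocities back
    % to the interface, and L^{-1} denotes the discrete fluid solve.
    \[
    X^{n+1} = X^{n} + \Delta t\, \mathcal{S}^{*}_{n} \mathcal{L}^{-1}
              \bigl( b^{n} + \mathcal{S}_{n} A X^{n+1} \bigr)
    \;\;\Longrightarrow\;\;
    \bigl( I - \Delta t\, \mathcal{S}^{*}_{n} \mathcal{L}^{-1} \mathcal{S}_{n} A \bigr)\, X^{n+1}
    = X^{n} + \Delta t\, \mathcal{S}^{*}_{n} \mathcal{L}^{-1} b^{n}.
    \]

Because the fluid solve L^{-1} couples all interface points, the system matrix is dense, which is exactly the difficulty that the proposed matrix approximation and iterative schemes address.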

Relevance:

100.00%

Publisher:

Abstract:

The self, roles and the ongoing coordination of human action: trying to see 'society' as neither prison nor puppet theatre. The article argues that structural North American role sociology may be integrated with theories emphasizing 'society' as an ongoing process (e.g. Giddens' theory of structuration). This is possible if the concept of role is defined as a recurrence, oriented to the action of others, that stands out as a regularity in a societal process. But this definition makes it necessary to understand, in a fundamental way, what kind of social being the role-actor is. This is done with the help of Hans Joas' theory of creativity and Merleau-Ponty's concept of 'flesh', arguing that Mead's concept of the 'I' may be understood as an embodied, self-asserting I, which at least in reflexive modernity has the creative power to split Mead's 'me' into a self-voiced subject-me and an other-voiced object-me. The embodied I communicating with the subject-me may be viewed as that role-actor which is something other than the role played. But this kind of role-actor raises new troubles, because it is hard to understand how such a self creates self-coherence by using Mead's concept of 'the generalized other'. This trouble is handled by using Alain Touraine's concept of the 'subject' and arguing that the generalized other is dissolving in de-modernized modernity. In split modernity, self-coherence may instead be created by what the article calls the generalized subject. This concept denotes a kind of communicative, future-based evaluation grounded in the 'subject' opposing the split powers of both the instrumentality of markets and of life-worlds trying to create 'fundamentalist' self-identities. This kind of self is communicative because it must also respect the other as 'subject'. It exists only in the battle against the forces of the market or a community. It never constructs an ideal city or a higher type of individual. It creates and protects a clearing that is constantly being invaded, to use the words of the old Frenchman himself. As a kind of test case, the article also shows how Beck's concept of individualization may be understood in a deeply social and role-sociological way.

Relevance:

100.00%

Publisher:

Abstract:

A system built in terms of autonomous agents may require even greater correctness assurance than one which is merely reacting to the immediate control of its users. Agents make substantial decisions for themselves, so thorough testing is an important consideration. However, autonomy also makes testing harder: by their nature, autonomous agents may react in different ways to the same inputs over time because, for instance, they have changeable goals and knowledge. For this reason, we argue that testing of autonomous agents requires a procedure that caters for a wide range of test case contexts, and that can search for the most demanding of these test cases, even when they are not apparent to the agents' developers. In this paper, we address this problem, introducing and evaluating an approach to testing autonomous agents that uses evolutionary optimization to generate demanding test cases.
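A minimal sketch of such a search might look as follows (the encoding, fitness function, and operators below are illustrative assumptions, not the authors' implementation): test-case parameters are evolved so that fitness rewards the scenarios on which the agent performs worst.

    import random

    def run_agent(scenario):
        # Toy stand-in for executing the agent under test in a scenario
        # and scoring its performance (higher = better); here the agent
        # is pretended to do worse as parameters become jointly extreme.
        return 1.0 - sum(scenario) / len(scenario)

    def demand(scenario):
        # A demanding test case is one the agent handles badly, so the
        # fitness to maximize is the negated agent performance.
        return -run_agent(scenario)

    def evolve(pop_size=30, generations=50, n_params=5):
        pop = [[random.uniform(0, 1) for _ in range(n_params)]
               for _ in range(pop_size)]
        for _ in range(generations):
            scored = sorted(pop, key=demand, reverse=True)
            parents = scored[:pop_size // 2]          # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, n_params)   # one-point crossover
                child = a[:cut] + b[cut:]
                i = random.randrange(n_params)        # point mutation
                child[i] = min(1.0, max(0.0, child[i] + random.gauss(0, 0.1)))
                children.append(child)
            pop = parents + children
        return max(pop, key=demand)   # most demanding scenario found

    print(evolve())

The key design point is that the developer never has to anticipate the hard cases: the search discovers them by driving the fitness landscape toward agent failure.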

Relevance:

100.00%

Publisher:

Abstract:

Research on the inverted pendulum has gained momentum over the last decade in a number of robotics laboratories around the world; because of its unstable properties, it is a good example for control engineers to verify a control theory. To verify that the pendulum can balance, we can run simulations using a closed-loop control method such as the linear quadratic regulator (LQR) or the proportional-integral-derivative (PID) method. The idea of robotic teleoperation is also gaining ground: controlling a robot at a distance, and doing so precisely. Designing a tool that takes best advantage of human skills while keeping the error minimal is an interesting problem, and because the inverted pendulum is an unstable system, it makes a compelling test case for exploring dynamic teleoperation. This thesis therefore focuses on the construction of a two-wheel inverted pendulum robot: which sensors we can use, how they must be integrated into the system, and how a human can control an inverted pendulum. The inverted pendulum robot developed employs technologies such as sensors, actuators and controllers. This Master's thesis starts by presenting an introduction to inverted pendulums and some information about related areas such as control theory. It continues by describing related work in this area. Then we describe the mathematical model of a two-wheel inverted pendulum and a simulation made in Matlab. We also focus on the construction of this type of robot and its working theory. Because this is a mobile robot, we address the theme of teleoperation, and finally the thesis closes with a general conclusion and ideas for future work.
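As a small illustration of the closed-loop balancing idea mentioned above (the plant parameters below are illustrative, not the thesis's own model, and the sketch is in Python rather than Matlab), an LQR gain for a linearized cart-pole can be computed from the continuous algebraic Riccati equation:

    import numpy as np
    from scipy.linalg import solve_continuous_are

    # Linearized inverted-pendulum (cart-pole) model around the upright
    # equilibrium; numbers are illustrative, not taken from the thesis.
    # State: [x, x_dot, theta, theta_dot], input: horizontal force u.
    M, m, l, g = 0.5, 0.2, 0.3, 9.81
    A = np.array([[0, 1, 0, 0],
                  [0, 0, -m * g / M, 0],
                  [0, 0, 0, 1],
                  [0, 0, (M + m) * g / (M * l), 0]])
    B = np.array([[0], [1 / M], [0], [-1 / (M * l)]])

    # LQR: minimize the integral of x'Qx + u'Ru by solving the Riccati
    # equation for P, then apply the state feedback u = -K x.
    Q = np.diag([1.0, 1.0, 10.0, 1.0])   # penalize pole angle most
    R = np.array([[0.1]])
    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)

    # All closed-loop poles should have negative real parts (stable).
    print("gain K =", K)
    print("closed-loop poles =", np.linalg.eigvals(A - B @ K))

The same structure carries over to the two-wheel robot: only the A and B matrices change once the wheel dynamics are included in the model.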

Relevance:

100.00%

Publisher:

Abstract:

This article introduces an efficient method to generate structural models for medium-sized silicon clusters. Geometrical information obtained from previous investigations of small clusters is first sorted and then fed into our predictor algorithm in order to generate structural models for large clusters. The method predicts geometries whose binding energies are close (95%) to the corresponding ground-state values, at very low computational cost. These predictions can be used as very good initial guesses for any global optimization algorithm. As a test case, information from clusters of up to 14 atoms was used to predict good models for silicon clusters of up to 20 atoms. We believe that the new algorithm may enhance the performance of most optimization methods whenever some previous information is available. (C) 2003 Wiley Periodicals, Inc.
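A toy sketch of the general idea, growing candidate geometries for a larger cluster from a known smaller one and handing them to a global optimizer as initial guesses, might look as follows (the capping rule and bond length are illustrative assumptions; the paper's actual predictor algorithm is more elaborate):

    import numpy as np

    # Grow (n+1)-atom candidates from a known n-atom geometry by capping
    # it at outward-facing sites, using a typical Si-Si bond length.
    BOND = 2.35  # approximate Si-Si bond length in angstroms (assumed)

    def grow_candidates(coords, n_trials=20, seed=0):
        rng = np.random.default_rng(seed)
        center = coords.mean(axis=0)
        candidates = []
        for _ in range(n_trials):
            base = coords[rng.integers(len(coords))]   # random existing atom
            out = base - center
            out = out / (np.linalg.norm(out) + 1e-12)  # outward direction
            jitter = rng.normal(scale=0.3, size=3)     # explore nearby sites
            new_atom = base + BOND * out + jitter
            candidates.append(np.vstack([coords, new_atom]))
        return candidates

    # Example: grow Si5 starting guesses from a made-up Si4 geometry.
    si4 = np.array([[0.0, 0.0, 0.0], [2.35, 0.0, 0.0],
                    [1.17, 2.0, 0.0], [1.17, 0.7, 1.9]])
    candidates = grow_candidates(si4)
    print(len(candidates), candidates[0].shape)  # 20 Si5 initial guesses

Each candidate would then seed a local relaxation or a global optimization run (e.g. basin hopping), which is where the reported 95% binding-energy quality of the predictions pays off.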

Relevance:

100.00%

Publisher:

Abstract:

The Weyl-Wigner prescription for quantization on Euclidean phase spaces makes essential use of Fourier duality. The extension of this property to more general phase spaces requires the use of Kac algebras, which provide the necessary background for the implementation of Fourier duality on general locally compact groups. Kac algebras - and the duality they incorporate - are consequently examined as candidates for a general quantization framework extending the usual formalism. Using as a test case the simplest nontrivial phase space, the half-plane, it is shown how the structures present in the complete-plane case must be modified. Traces, for example, must be replaced by their noncommutative generalizations - weights - and the correspondence embodied in the Weyl-Wigner formalism is no longer complete. Provided the underlying algebraic structure is suitably adapted to each case, Fourier duality is shown to be indeed a very powerful guide to the quantization of general physical systems.
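For orientation, the Euclidean-phase-space version of the prescription can be written in one line (a standard textbook form, not a formula quoted from the paper): the operator assigned to a classical symbol f is built from its Fourier transform, which is where Fourier duality enters.

    % Weyl-Wigner quantization on the plane R^2 (standard form, shown
    % for orientation): the symbol f reaches its operator W(f) through
    % its Fourier transform f~, i.e. through Fourier duality.
    \[
    W(f) = \frac{1}{2\pi} \int_{\mathbb{R}^{2}} \tilde{f}(a,b)\,
           e^{\,i(aQ + bP)}\, da\, db,
    \qquad
    \tilde{f}(a,b) = \frac{1}{2\pi} \int_{\mathbb{R}^{2}} f(q,p)\,
           e^{-i(aq + bp)}\, dq\, dp.
    \]

On the half-plane the relevant group is no longer abelian, so Pontryagin duality is unavailable and the Kac-algebra generalization of Fourier duality has to take its place.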

Relevance:

100.00%

Publisher:

Abstract:

This paper develops a novel, fully analytic model for vibration analysis of solid-state electronic components. The model is as accurate as finite element models yet numerically light enough to permit quick design trade-offs and statistical analysis. The paper presents the development of the model, a comparison with finite elements, and an application to a common engineering problem. A gull-wing flat pack component was selected as the benchmark test case, although the presented methodology is applicable to a wide range of component packages. Results showed very good agreement between the presented method and finite elements and demonstrated how standard test data can be used in a general application. © 2013 Elsevier Ltd.

Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

100.00%

Publisher:

Abstract:

People's daily lives are surrounded by computing devices with ever more resources (sensors) and ever more processing power. How these devices communicate is still not natural, and this slows the growth of Ubiquitous Computing. This paper presents a way in which these devices can communicate using Jini technology and concepts of Service-Oriented Architecture (SOA), applying these concepts in a test case of Ubiquitous Computing. To conduct the test case, a fictitious system for the management of a soccer championship was built, in which users can interact with each other and with the system in a simplified way and have real-time access to championship data during the event. This communication is performed by services built with Jini technology and based on key SOA concepts such as modularity and reusability.
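The register/lookup pattern behind such a design can be sketched in a few lines (a language-neutral illustration in Python; Jini itself is a Java technology and its actual API is not shown here):

    # Language-neutral sketch of the register/lookup pattern underlying
    # service-oriented designs like the one described. This is NOT
    # Jini's actual Java API; all names here are illustrative.
    class Registry:
        def __init__(self):
            self._services = {}

        def register(self, interface, provider):
            # A device joins the federation by publishing a service.
            self._services.setdefault(interface, []).append(provider)

        def lookup(self, interface):
            # Clients discover providers by interface, not by address,
            # which keeps the modules decoupled and reusable.
            providers = self._services.get(interface, [])
            if not providers:
                raise LookupError(f"no provider for {interface}")
            return providers[0]

    registry = Registry()
    registry.register("ScoreService", lambda match: f"{match}: 2-1")
    print(registry.lookup("ScoreService")("Final"))

Modularity and reusability follow from the same decoupling: any device that implements the published interface can replace the provider without the clients changing.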

Relevance:

100.00%

Publisher:

Abstract:

Graduate Program in Mechanical Engineering - FEG

Relevance:

100.00%

Publisher:

Abstract:

The ability of nanoassisted laser desorption-ionization mass spectrometry (NALDI-MS) imaging to provide selective chemical monitoring, with proper spatial distribution, of lipid profiles from tumor tissues after plate imprinting has been tested. NALDI-MS imaging identified and mapped several potential lipid biomarkers in a murine model of melanoma tumor (inoculation of B16/F10 cells). It also confirmed that in vivo treatment of tumor-bearing mice with a synthetic supplement containing phosphoethanolamine (PHO-S) promoted a marked decrease in the relative abundance of the tumor biomarkers. NALDI-MS imaging is a matrix-free LDI protocol based on the selective imprinting of lipids on the NALDI plate followed by the removal of the tissue. It therefore provides good-quality, selective chemical images with preservation of spatial distribution and less interference from tissue material. The test case described herein illustrates the potential of chemically selective NALDI-MS imaging for biomarker discovery.

Relevance:

100.00%

Publisher:

Abstract:

In recent years, due to the rapid convergence of multimedia services, the Internet and wireless communications, there has been a growing trend of heterogeneity (in terms of channel bandwidths, mobility levels of terminals, and end-user quality-of-service (QoS) requirements) in emerging integrated wired/wireless networks. Moreover, in today's systems a multitude of users coexists within the same network, each with their own QoS requirement and bandwidth availability. In this framework, embedded source coding, which allows partial decoding at various resolutions, is an appealing technique for multimedia transmission. This dissertation covers my PhD research, mainly devoted to the study of embedded multimedia bitstreams in heterogeneous networks, developed at the University of Bologna, advised by Prof. O. Andrisano and Prof. A. Conti, and at the University of California, San Diego (UCSD), where I spent eighteen months as a visiting scholar, advised by Prof. L. B. Milstein and Prof. P. C. Cosman. In order to improve multimedia transmission quality over wireless channels, joint source and channel coding optimization is investigated in a 2D time-frequency resource block for an OFDM system. We show that knowing the order of diversity in the time and/or frequency domain can assist image (video) coding in selecting optimal channel code rates (source and channel code rates). Then, adaptive modulation techniques, aimed at maximizing the spectral efficiency, are investigated as another possible solution for improving multimedia transmissions. For both slow and fast adaptive modulation, the effects of imperfect channel estimation are evaluated, showing that the fast technique, optimal in ideal systems, might be outperformed by slow adaptive modulation when a real test case is considered. Finally, the effects of co-channel interference and approximated bit error probability (BEP) are evaluated in adaptive modulation techniques, providing new decision-region concepts and showing how the widely used BEP approximations lead to a substantial loss in overall performance.
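To make the rate-selection mechanism concrete (standard textbook forms, not results reproduced from the dissertation), a widely used closed-form BEP approximation for square M-QAM at SNR gamma, together with the adaptive-modulation rule it induces, is:

    % Widely used M-QAM BEP approximation and the rate-selection rule
    % it induces (standard textbook forms, shown for illustration).
    \[
    P_b(\gamma, M) \approx 0.2 \exp\!\left( \frac{-1.5\,\gamma}{M-1} \right),
    \qquad
    M^{\star}(\gamma) = \max\bigl\{ M : P_b(\gamma, M) \le P_b^{\mathrm{target}} \bigr\}.
    \]

Read this way, two of the abstract's themes are visible in one formula: an imperfect channel estimate perturbs gamma and can push M*(gamma) across a decision-region boundary, and the accuracy of the BEP approximation itself determines where those boundaries sit.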

Relevance:

100.00%

Publisher:

Abstract:

This thesis work highlights some issues concerning exascale machines (systems delivering one exaflops of computing power) and the evolution of the software that will run on them, focusing primarily on the need for their development, since they are indispensable for the study of larger scientific and technological problems. Particular attention is given to Materials Science, one of the fields that has advanced most thanks to the use of supercomputers, and to one of the most widely used HPC codes in this context: Quantum ESPRESSO. On the software side, the first energy-efficiency measurements on a hybrid architecture are presented for Quantum ESPRESSO, obtained on the EURORA prototype cluster. These are the first such measurements published in the Materials Science software context and will serve as a baseline for future optimizations based on energy efficiency. On exascale machines, in fact, energy efficiency will be one of the requirements for access, just as code scalability is a requirement today. Another very important aspect of exascale machines is reducing the number of communications, which lowers the energy cost of a parallel algorithm, since on these new systems moving data will cost more energy than computing on it. For this reason, this work presents a strategy, and its implementation, for increasing data locality in one of the most computationally expensive algorithms in Quantum ESPRESSO: the Fast Fourier Transform (FFT). To bring current software to an exascale machine, one must start testing the robustness of that software and its workflows on test cases that stress the machines currently available to the maximum. In this thesis, to test the workflow of Quantum ESPRESSO and WanT, a transport-calculation code, a scientifically relevant system was characterized, consisting of a PDI-FCN2 crystal used in the construction of organic OFET transistors. Finally, an ideal device was simulated, consisting of two gold electrodes with a single organic molecule at the center.
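As a small illustration of why communication dominates parallel FFTs (a generic decomposition sketch, not code from the thesis or from Quantum ESPRESSO), a multidimensional FFT factorizes into independent one-dimensional FFTs separated by a transpose; in a distributed-memory run that transpose becomes an all-to-all exchange, so improving data locality means reorganizing work around it:

    import numpy as np

    # A 2D FFT decomposed as row FFTs + transpose + row FFTs. On one
    # node the transpose is a cheap memory shuffle; on a distributed
    # machine it is an all-to-all communication step, which is why data
    # locality around it matters so much at exascale.
    def fft2_by_transpose(a):
        a = np.fft.fft(a, axis=1)   # 1D FFTs over locally stored rows
        a = a.T.copy()              # "transpose" = the communication step
        a = np.fft.fft(a, axis=1)   # 1D FFTs over the other dimension
        return a.T

    x = np.random.rand(8, 8)
    assert np.allclose(fft2_by_transpose(x), np.fft.fft2(x))

In three dimensions the same pattern repeats with two transposes, so any locality gain in the data layout is multiplied across every FFT call in the code.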

Relevance:

100.00%

Publisher:

Abstract:

This work focuses on the fluid-dynamic study of the cavitating multiphase flow in an injector for gasoline direct injection (GDI) engine applications. The analysis was carried out with the CFD (Computational Fluid Dynamics) software Star-CCM+® developed by CD-ADAPCO. The goal of this study is to investigate the reasons for the different behaviour observed between the experimental characterization measurements of the injector and what is expected from the nominal values given in the injector specification, with particular reference to the flow-rate distribution among the different nozzle holes. This work is one of a pair of related theses and therefore also aims to provide data useful to the other line of analysis, devoted to identifying quality parameters of the non-reacting air-fuel mixture useful for predicting the formation of particulate produced by combustion in a GDI engine. The thesis consists of five chapters, structured as follows. Chapter 1 presents the motivations behind the work and reviews the state of the art of GDI technology. Chapter 2 provides the theoretical background: its first part covers the fundamentals of the cavitation process, and its second part the numerical models used in the analysis. Chapter 3 describes the modelling and subsequent validation of the models by comparison with the test case 'Comprehensive hydraulic and flow field documentation in model throttle experiments under cavitation conditions' (E. Winklhofer, 2001). The choice of models and related parameters was based on previous work found in the literature. A sensitivity study was then carried out to evaluate the stability of the solution under small variations in the parameter values. The choice of model parameters for the case of interest, the multi-hole injector, was initially based on the 'optimal' values obtained in the test case and is the subject of Chapter 4. The chapter also discusses the subsequent sensitivity analysis, carried out to understand the reasons for the imbalance between corresponding holes and for the greater development of the central jet compared to the others. Chapter 5, after a brief summary of the main points of the thesis, draws conclusions from the analysis and outlines future developments.
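For background on the cavitation fundamentals covered in the second chapter (a standard relation in cavitation modelling, not an equation quoted from the thesis), bubble-dynamics-based mass-transfer models are commonly built on the Rayleigh-Plesset equation for a vapour bubble of radius R(t):

    % Rayleigh-Plesset equation (standard form, given for background):
    % p_v vapour pressure, p_inf local liquid pressure, rho_l and nu_l
    % liquid density and kinematic viscosity, sigma surface tension.
    \[
    R\ddot{R} + \frac{3}{2}\dot{R}^{2}
      = \frac{p_v - p_\infty}{\rho_l}
        - \frac{4\nu_l \dot{R}}{R}
        - \frac{2\sigma}{\rho_l R}.
    \]

Simplified mass-transfer models typically keep only the leading inertial balance, which gives the bubble growth rate \(\dot{R} = \sqrt{(2/3)\,|p_v - p_\infty|/\rho_l}\) used to drive the evaporation and condensation source terms.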

Relevance:

100.00%

Publisher:

Abstract:

The interest in automatic volume meshing for finite element analysis (FEA) has grown since the appearance of microfocus CT (μCT), due to its high resolution, which allows for the assessment of mechanical behaviour with high precision. Nevertheless, the basic meshing approach of generating one hexahedron per voxel produces jagged edges. To prevent this effect, smoothing algorithms have been introduced to enhance the topology of the mesh. However, whether smoothing also improves the accuracy of voxel-based meshes in clinical applications is still an open question. There is a trade-off between smoothing and the quality of elements in the mesh: distorted elements may be produced by excessive smoothing and reduce the accuracy of the mesh. In the present work, the influence of smoothing on the accuracy of voxel-based meshes in micro-FE was assessed. An accurate 3D model of a trabecular structure with known apparent mechanical properties was used as a reference model. Virtual CT scans of this reference model (with resolutions of 16, 32 and 64 μm) were then created and used to build voxel-based meshes of the microarchitecture. The effects of smoothing on the apparent mechanical properties of the voxel-based meshes, as compared to the reference model, were evaluated. Apparent Young's moduli of the smoothed voxel-based meshes were significantly closer to those of the reference model for the 16 and 32 μm resolutions. Improvements were not significant for the 64 μm resolution, due to loss of trabecular connectivity in the model. This study shows that smoothing offers a real benefit to voxel-based meshes used in micro-FE. It might also broaden voxel-based meshing to other biomechanical domains where it was not used previously due to lack of accuracy. As an example, this work will be used in the framework of the European project ContraCancrum, which aims at providing a patient-specific simulation of tumour development in brain and lungs for oncologists. For this type of clinical application, such a fast, automatic, and accurate generation of the mesh is of great benefit.
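To fix ideas about the baseline the paper starts from (a minimal generic sketch of one-hexahedron-per-voxel meshing, written in Python for illustration; this is the jagged baseline that smoothing post-processes, not the paper's own code):

    import numpy as np

    # Minimal one-hexahedron-per-voxel mesher: every filled voxel in a
    # 3D binary image becomes an 8-node brick, with corner nodes shared
    # between neighbouring voxels.
    def voxels_to_hexmesh(mask):
        nodes = {}        # (i, j, k) grid corner -> node index
        elements = []
        def node(ijk):
            return nodes.setdefault(ijk, len(nodes))
        for i, j, k in zip(*np.nonzero(mask)):
            # the 8 corners of voxel (i, j, k), in a consistent order
            corners = [(i, j, k), (i + 1, j, k),
                       (i + 1, j + 1, k), (i, j + 1, k),
                       (i, j, k + 1), (i + 1, j, k + 1),
                       (i + 1, j + 1, k + 1), (i, j + 1, k + 1)]
            elements.append([node(c) for c in corners])
        coords = np.array(sorted(nodes, key=nodes.get), dtype=float)
        return coords, np.array(elements)

    mask = np.zeros((3, 3, 3), dtype=bool)
    mask[1, 1, :] = True                  # a tiny "trabecula" of 3 voxels
    coords, elems = voxels_to_hexmesh(mask)
    print(coords.shape, elems.shape)      # 16 shared nodes, 3 hexahedra

Smoothing algorithms then relocate the boundary nodes of such a mesh to soften the stair-step surface, which is where the element-distortion trade-off studied in the paper arises.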