983 results for test cases generator


Relevance:

80.00%

Publisher:

Abstract:

The modelling of diffusive terms in particle methods is a delicate matter, and several models have been proposed in the literature to take such terms into account. The diffusion velocity method (DVM), originally designed for the diffusion of passive scalars, turns diffusive terms into convective ones by expressing them as a divergence involving a so-called diffusion velocity. In this paper, DVM is extended to the diffusion of vectorial quantities in the three-dimensional Navier–Stokes equations, in their incompressible, velocity–vorticity formulation. The integration of a large eddy simulation (LES) turbulence model is investigated and a general DVM formulation is proposed. Either with or without LES, a novel expression of the diffusion velocity is derived, which makes it easier to approximate and which highlights the analogy with the original formulation for scalar transport. From this statement, DVM is then analysed in one dimension, both analytically and numerically, on test cases that demonstrate its good behaviour.
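As background for the scalar case that the paper generalises, here is a minimal sketch of the classical one-dimensional diffusion velocity method, assuming the standard scalar formulation with diffusion velocity u_d = -nu * (dc/dx) / c; the paper's vectorial/LES extension is not reproduced. Particles of equal mass are convected with u_d, so their number density obeys the diffusion equation.

```python
# Minimal 1D diffusion-velocity-method (DVM) sketch for a passive scalar.
# Each particle carries equal mass, so the scalar c is the particle number
# density; moving particles with u_d = -nu * (dc/dx) / c makes that density
# obey dc/dt = nu * d2c/dx2. A smoother kernel density estimator would be
# used in practice; this histogram version only illustrates the idea.
import numpy as np

nu, dt, n_steps = 0.01, 1e-3, 200
rng = np.random.default_rng(0)
particles = rng.normal(0.0, 0.5, size=20_000)      # initial Gaussian cloud
grid = np.linspace(-3.0, 3.0, 121)

for _ in range(n_steps):
    c, edges = np.histogram(particles, bins=grid, density=True)
    centers = 0.5 * (edges[1:] + edges[:-1])
    dcdx = np.gradient(c, centers)
    c_reg = np.maximum(c, 1e-3 * c.max())           # regularise near-empty bins
    u_d = -nu * dcdx / c_reg                        # diffusion velocity
    particles += dt * np.interp(particles, centers, u_d)   # convect the particles
```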

Relevance:

80.00%

Publisher:

Abstract:

With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The increasing number of software configurations, input parameters, usage scenarios, supporting platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, this is not the case with software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging which leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing already existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and then presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared among a number of test cases that fail for the same reason resemble the faulty execution path, and hence the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers. Optimization techniques are devised to generate shorter and more logical sequence covers, and to select subsequences with a high likelihood of containing the root cause from the set of all possible common subsequences. A hybrid static/dynamic analysis approach is designed to trace the common subsequences back from the end to the root cause. A debugging tool is created to enable developers to use the approach and to integrate it with an existing Integrated Development Environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool's suggestions and their source code counterparts. Finally, a comparison between the developed approach and the state-of-the-art techniques shows that developers need only inspect a small number of lines in order to find the root cause of the fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both the algorithm's running time and the output subsequence length.
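To make the central idea concrete, here is a minimal sketch (not the thesis's optimized algorithm) of extracting a shared subsequence from the traces of two failing test cases with a classic longest-common-subsequence dynamic program; the trace contents and statement IDs are hypothetical.

```python
# Illustration of extracting commonality from failing-test traces: a classic
# dynamic-programming longest common subsequence (LCS) of two sequence covers
# (lists of executed statement IDs). The thesis describes a more efficient
# algorithm for whole sets of covers plus ranking optimizations; this sketch
# only shows the underlying notion of a shared suspicious subsequence.
from typing import Hashable, List

def lcs(a: List[Hashable], b: List[Hashable]) -> List[Hashable]:
    """Longest common subsequence of two execution traces."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]   # dp[i][j] = LCS length of a[:i], b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    out, i, j = [], m, n                         # backtrack to recover one LCS
    while i > 0 and j > 0:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

# Hypothetical statement-level traces of two failing test cases:
trace1 = ["f:12", "f:13", "g:4", "g:7", "f:20"]
trace2 = ["f:12", "g:4", "g:7", "h:2", "f:20"]
print(lcs(trace1, trace2))   # ['f:12', 'g:4', 'g:7', 'f:20'] -> candidate faulty path
```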

Relevance:

80.00%

Publisher:

Abstract:

This work addresses the application of SPEA2, a multi-objective optimization method, to the computation of a dosing schedule for the chemotherapeutic treatment of a tumour mass; a dosing schedule specifies the cytotoxic agent or agents, their doses, and the times at which they must be administered. The optimization problem solved here is multi-objective, since the dosing schedule to be computed must minimize not only the size of the tumour but also the toxicity remaining at the end of the treatment, its cost, etc. SPEA2 is a genetic algorithm that applies the Pareto criterion; what it computes is therefore an approximation of the Pareto front, from whose solutions the user can choose the "best" one. In the course of this research, SoT-Q was built, a software tool consisting of two main modules: an optimizer to compute optimal dosing schedules, and a simulator to apply those schedules to a (simulated) patient with a tumour mass; the simulator is based on a pharmacodynamic model that represents the tumour. In the future, once extensively tested and debugged, the SoT-Q program could assist oncologists in making decisions about chemotherapy treatments, or it could also serve as a teaching aid in the training of new health professionals. The results obtained were very good; in all the test cases used, both the size of the tumour and the toxicity remaining at the end of the treatment were reduced significantly; in some cases the reduction was of three orders of magnitude.
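For illustration, the sketch below shows only the Pareto-dominance filtering that underlies SPEA2-style selection (SPEA2 itself adds strength values, density estimation, and an external archive); the objective values are hypothetical (tumour size, remaining toxicity) pairs, not outputs of SoT-Q.

```python
# Minimal sketch of the Pareto-dominance notion behind SPEA2 (minimization).
# This only performs non-dominated filtering of candidate dosing schedules
# evaluated on two hypothetical objectives.
from typing import List, Tuple

def dominates(a: Tuple[float, ...], b: Tuple[float, ...]) -> bool:
    """a dominates b if it is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(objs: List[Tuple[float, ...]]) -> List[Tuple[float, ...]]:
    """Return the non-dominated subset (an approximation of the Pareto front)."""
    return [p for p in objs if not any(dominates(q, p) for q in objs if q != p)]

# Hypothetical (tumour_size, remaining_toxicity) values for candidate schedules:
candidates = [(0.8, 0.2), (0.5, 0.5), (0.6, 0.3), (0.9, 0.1), (0.7, 0.6)]
print(non_dominated(candidates))  # [(0.8, 0.2), (0.5, 0.5), (0.6, 0.3), (0.9, 0.1)]
```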

Relevance:

80.00%

Publisher:

Abstract:

Master's dissertation—Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, 2015.

Relevance:

80.00%

Publisher:

Abstract:

Production companies use raw materials to compose end-products. They often make different products with the same raw materials. This research focuses on producing, as cheaply as possible, two end-products that consist (partly) of the same raw materials. Each of the products has its own demand and quality requirements, consisting of quadratic constraints. The minimization of the costs, given the quadratic constraints, is a global optimization problem, which can be difficult because of possible local optima. Therefore, the multimodal character of the (bi-)blend problem is investigated. Standard optimization packages (solvers) in Matlab and GAMS were tested on their ability to solve the problem. In total, 20 test cases were generated or taken from the literature to test the solvers' effectiveness and efficiency in solving the problem. The research also gives insight into adjusting the quadratic constraints of the problem in order to obtain a robust formulation of the bi-blend problem.
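A minimal sketch of a bi-blend-type formulation is given below, assuming a hypothetical linear cost, a per-product demand, and a quadratic quality bound per product; all numbers are illustrative. A single local SLSQP solve is shown, whereas the multimodality discussed above is exactly why global solvers (e.g. those available in GAMS) or multi-start strategies are needed in general.

```python
# Sketch: two end-products blended from the same three raw materials, with a
# linear cost, linear demand, and a hypothetical quadratic quality constraint
# per product. Local solve only; a global strategy is needed for multimodality.
import numpy as np
from scipy.optimize import minimize

cost = np.array([2.0, 3.0, 5.0])        # price per unit of raw material
demand = np.array([10.0, 8.0])          # required amount of each end-product
Q = np.array([[1.0, 0.2, 0.0],
              [0.2, 1.0, 0.3],
              [0.0, 0.3, 1.0]])          # hypothetical quality interaction matrix
q_max = 60.0                             # quality bound per product

def objective(z):
    x = z.reshape(2, 3)                  # x[p, r]: amount of raw r in product p
    return float(cost @ x.sum(axis=0))   # total raw-material cost

cons = []
for p in range(2):
    cons.append({'type': 'eq',           # meet the demand of product p exactly
                 'fun': lambda z, p=p: z.reshape(2, 3)[p].sum() - demand[p]})
    cons.append({'type': 'ineq',         # quadratic quality bound: q_max - x^T Q x >= 0
                 'fun': lambda z, p=p: q_max - z.reshape(2, 3)[p] @ Q @ z.reshape(2, 3)[p]})

res = minimize(objective, np.full(6, 3.0), method='SLSQP',
               constraints=cons, bounds=[(0.0, None)] * 6)
print(res.x.reshape(2, 3), res.fun)
```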

Relevance:

80.00%

Publisher:

Abstract:

Security defects are common in large software systems because of their size and complexity. Although efficient development processes, testing, and maintenance policies are applied to software systems, a large number of vulnerabilities can still remain despite these measures. Some vulnerabilities stay in a system from one release to the next because they cannot be easily reproduced through testing. These vulnerabilities endanger the security of the systems. We propose vulnerability classification and prediction frameworks based on vulnerability reproducibility. The frameworks are effective in identifying the types and locations of vulnerabilities at an early stage, and in improving the security of the software in the next versions (referred to as releases). We expand an existing concept of software bug classification to vulnerability classification (easily reproducible and hard to reproduce) to develop a classification framework for differentiating between these vulnerabilities based on code fixes and textual reports. We then investigate the potential correlations between the vulnerability categories and classical software metrics, as well as some other runtime environmental factors of reproducibility, to develop a vulnerability prediction framework. The classification and prediction frameworks help developers adopt corresponding mitigation or elimination actions and develop appropriate test cases. Also, the vulnerability prediction framework helps security experts focus their effort on the top-ranked vulnerability-prone files. As a result, the frameworks decrease the number of attacks that exploit security vulnerabilities in the next versions of the software. To build the classification and prediction frameworks, different machine learning techniques (C4.5 Decision Tree, Random Forest, Logistic Regression, and Naive Bayes) are employed. The effectiveness of the proposed frameworks is assessed based on collected software security defects of Mozilla Firefox.
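As a pointer to the kind of prediction framework described, here is a minimal sketch of training a Random Forest (one of the techniques listed; the thesis also evaluates C4.5, Logistic Regression, and Naive Bayes) on per-file software metrics to flag vulnerability-prone files. The feature names and data are synthetic placeholders, not the Mozilla Firefox dataset used in the study.

```python
# Sketch: classifier on per-file metrics (synthetic data) ranking files by
# predicted vulnerability-proneness.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Columns: lines of code, cyclomatic complexity, past fixes, churn (all synthetic).
X = rng.random((500, 4)) * [5000, 50, 20, 300]
# Synthetic labels loosely tied to complexity, past fixes and churn.
y = (X[:, 1] + 0.005 * X[:, 2] * X[:, 3] + rng.normal(0, 5, 500) > 40).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
# Files can then be ranked by clf.predict_proba(X_test)[:, 1] so that security
# experts focus first on the top-ranked vulnerability-prone files.
```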

Relevance:

80.00%

Publisher:

Abstract:

The idea of spacecraft formations, flying in tight configurations with maximum baselines of a few hundred meters in low-Earth orbits, has generated widespread interest over the last several years. Nevertheless, controlling the movement of spacecraft in formation poses difficulties, such as high in-orbit computing demand and collision avoidance requirements, which escalate as the number of units in the formation increases and complicated nonlinear effects are imposed on the dynamics, together with uncertainty that may arise from a lack of knowledge of system parameters. These requirements have led to the need for reliable linear and nonlinear controllers in terms of relative and absolute dynamics. The objective of this thesis is, therefore, to introduce new control methods that allow spacecraft in formation, with circular/elliptical reference orbits, to efficiently execute safe autonomous manoeuvres. These controllers are distinguished from the bulk of the literature in that they merge guidance laws never before applied to spacecraft formation flying with collision avoidance capabilities in a single control strategy. For this purpose, three control schemes are presented: linear optimal regulation, linear optimal estimation, and adaptive nonlinear control. In general terms, the proposed control approaches command the dynamical performance of one or several followers with respect to a leader to asymptotically track a time-varying nominal trajectory (TVNT), while the threat of collision between the followers is reduced by repelling accelerations obtained from the collision avoidance scheme (CAS) during the periods of closest proximity. Linear optimal regulation is achieved through a Riccati-based tracking controller. Within this control strategy, the controller provides guidance and tracking toward a desired TVNT, optimizing fuel consumption via a Riccati procedure with a non-infinite cost function defined in terms of the desired TVNT, while repelling accelerations generated by the CAS ensure evasive actions between the elements of the formation. The relative dynamics model, suitable for circular and eccentric low-Earth reference orbits, is based on the Tschauner and Hempel equations, and includes a control input and a nonlinear term corresponding to the CAS repelling accelerations. Linear optimal estimation is built on the forward-in-time separation principle. This controller encompasses two stages: regulation and estimation. The first stage requires the design of a full state feedback controller using the state vector reconstructed by means of the estimator. The second stage requires the design of an additional dynamical system, the estimator, to obtain the states that cannot be measured, in order to approximately reconstruct the full state vector. The separation principle then states that an observer built for a known input can also be used to estimate the state of the system and to generate the control input. This allows the observer and the feedback to be designed independently, exploiting the advantages of linear quadratic regulator theory, in order to estimate the states of a dynamical system with model and sensor uncertainty. The relative dynamics is described with the linear system used in the previous controller, with a control input and nonlinearities entering via the repelling accelerations from the CAS during collision avoidance events. Moreover, sensor uncertainty is added to the control process by considering carrier-phase differential GPS (CDGPS) velocity measurement error.
An adaptive control law capable of delivering superior closed-loop performance when compared to certainty-equivalence (CE) adaptive controllers is finally presented. A novel noncertainty-equivalence controller based on the Immersion and Invariance paradigm for close-manoeuvring spacecraft formation flying in both circular and elliptical low-Earth reference orbits is introduced. The proposed control scheme achieves stabilization by immersing the plant dynamics into a target dynamical system (or manifold) that captures the desired dynamical behaviour. The key feature of this methodology is the addition of a new term to the classical certainty-equivalence control approach that, in conjunction with the parameter update law, is designed to achieve adaptive stabilization. This parameter has the ultimate task of shaping the manifold into which the adaptive system is immersed. The stability of the controller is proven via a Lyapunov-based analysis and Barbalat’s lemma. In order to evaluate the design of the controllers, test cases based on the physical and orbital features of the Prototype Research Instruments and Space Mission Technology Advancement (PRISMA) mission are implemented, extending the number of elements in the formation to scenarios with reconfigurations and on-orbit position switching in elliptical low-Earth reference orbits. An extensive analysis and comparison of the performance of the controllers in terms of total Δv and fuel consumption, with and without the effects of the CAS, is presented. These results show that the three proposed controllers allow the followers to asymptotically track the desired nominal trajectory and, additionally, that the simulations including the CAS show an effective decrease in collision risk during the manoeuvre.
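As a pointer to the kind of Riccati-based design mentioned above, the sketch below computes an LQR gain for the Clohessy-Wiltshire (circular reference orbit) relative dynamics. This is a simplification of the thesis's setting (Tschauner-Hempel equations, non-infinite cost function, CAS repelling accelerations are all omitted), and the weights are hypothetical.

```python
# Infinite-horizon LQR for Clohessy-Wiltshire relative dynamics (simplified
# stand-in for the Riccati-based tracking controller described in the thesis).
import numpy as np
from scipy.linalg import solve_continuous_are

n = 0.0011                               # mean motion of a low-Earth reference orbit [rad/s]
A = np.array([[0, 0, 0, 1, 0, 0],
              [0, 0, 0, 0, 1, 0],
              [0, 0, 0, 0, 0, 1],
              [3 * n**2, 0, 0, 0, 2 * n, 0],
              [0, 0, 0, -2 * n, 0, 0],
              [0, 0, -n**2, 0, 0, 0]])
B = np.vstack([np.zeros((3, 3)), np.eye(3)])
Q = np.diag([1, 1, 1, 1e2, 1e2, 1e2])    # state weights (position, velocity)
R = 1e6 * np.eye(3)                      # heavy thrust penalty -> low fuel use

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)          # feedback gain: u = -K (x - x_ref)

x_err = np.array([100.0, -50.0, 20.0, 0.0, 0.0, 0.0])  # relative state error [m, m/s]
u = -K @ x_err                                          # commanded acceleration [m/s^2]
print(K.shape, u)
```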

Relevance:

80.00%

Publisher:

Abstract:

In the traceless Oldroyd viscoelastic model, the viscoelastic extra stress tensor is decomposed into its traceless (deviatoric) and spherical parts, leading to a reformulation of the classical Oldroyd model. The equivalence of the two models is established by comparing model predictions for simple test cases. The new model is validated using several 2D benchmark problems. The structure and behavior of the new model are discussed, and its future use is envisioned from both theoretical and numerical perspectives.
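For reference, the deviatoric/spherical splitting referred to here is the standard tensor decomposition (stated for clarity; it is not a result specific to this paper):

```latex
\tau \;=\; \underbrace{\tau - \tfrac{1}{3}\operatorname{tr}(\tau)\,\mathbf{I}}_{\text{deviatoric (traceless) part}}
\;+\; \underbrace{\tfrac{1}{3}\operatorname{tr}(\tau)\,\mathbf{I}}_{\text{spherical part}},
\qquad
\operatorname{tr}\!\left(\tau - \tfrac{1}{3}\operatorname{tr}(\tau)\,\mathbf{I}\right) = 0 .
```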

Relevance:

80.00%

Publisher:

Abstract:

A traceless variant of the Johnson-Segalman viscoelastic model is presented. The viscoelastic extra stress tensor is decomposed into its traceless (deviatoric) and spherical parts, leading to a reformulation of the classical Johnson-Segalman model. The equivalence of the two models is established by comparing model predictions for simple test cases. The new model is validated using several 2D benchmark problems. The structure and behavior of the new model are discussed.

Relevance:

80.00%

Publisher:

Abstract:

The work presented in this thesis aims to contribute to innovation in the Urban Air Mobility and Delivery sector and represents a solid starting point for air logistics and its future scenarios. The dissertation focuses on the modeling, simulation, and control of a formation of multirotor aircraft for cooperative load transportation, with particular attention to environmental sustainability. First, a simulation and test environment is developed to assess technologies for suspended load stabilization. Starting from the mathematical model of two identical multirotors, formation-flight-keeping and collision-avoidance algorithms are analyzed. This approach guarantees both the safety of the vehicles within the formation and that of the payload, which in the very near future may consist of people. Afterwards, a mathematical model of the suspended load is implemented, as well as an active controller for its stabilization. The key focus of this part is the analysis and control of the payload's oscillatory motion, investigated through the decay of the load's kinetic energy. Finally, several test cases are introduced in order to understand which strategy is the most effective and safest for future applications in the field of air logistics.
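As a toy illustration of active load stabilization and kinetic energy decay, the sketch below simulates a planar pendulum hanging from a moving pivot whose commanded acceleration damps the swing; the model, gain, and numbers are hypothetical and much simpler than the multirotor-formation setup of the thesis.

```python
# Toy planar suspended-load model: pivot acceleration a = k * L * theta_dot * cos(theta)
# injects damping, so the load's kinetic energy decays over time.
import numpy as np

g, L, k = 9.81, 2.0, 3.0           # gravity, cable length, damping gain (hypothetical)
theta, theta_dot = 0.4, 0.0        # initial swing angle [rad] and rate [rad/s]
dt, t_end = 0.001, 15.0

for step in range(int(t_end / dt)):
    a = k * L * theta_dot * np.cos(theta)                     # pivot acceleration command
    theta_ddot = -(g / L) * np.sin(theta) - (a / L) * np.cos(theta)
    theta_dot += dt * theta_ddot                              # semi-implicit Euler step
    theta += dt * theta_dot
    if step % 5000 == 0:
        ke = 0.5 * (L * theta_dot) ** 2                       # kinetic energy per unit mass
        print(f"t={step * dt:5.1f}s  theta={theta:+.3f} rad  KE={ke:.4f} J/kg")
```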

Relevance:

40.00%

Publisher:

Abstract:

Introduction: Dengue is prevalent in many tropical and sub-tropical regions. The clinical diagnosis of dengue is still complex, and not much data are available. This work aimed at assessing the diagnostic accuracy of the tourniquet test in patients with suspected dengue infection and its positivity in different classifications of this disease as reported to the Information System for Notifiable Disease in Belo Horizonte, State of Minas Gerais, Brazil, between 2001 and 2006. Methods: Cross-sectional analysis of the diagnostic accuracy of the tourniquet test for dengue, using IgM-anti-DENV ELISA as the gold standard. Results: We selected 9,836 suspected cases, of which 41.1% were confirmed to be dengue. Classic dengue was present in 95.8%, dengue with complications in 2.5%, and dengue hemorrhagic fever in 1.7%. The tourniquet test was positive in 16.9% of classic dengue cases, 61.7% of dengue cases with complications, and 82.9% of cases of dengue hemorrhagic fever. The sensitivity and specificity of the tourniquet test were 19.1% and 86.4%, respectively. Conclusions: A positive tourniquet test can be a valuable tool to support the diagnosis of dengue where laboratory tests are not available. However, the absence of a positive test should not be read as the absence of infection. In addition, the tourniquet test was demonstrated to be an indicator of dengue severity.
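For reference, the reported accuracy figures follow the standard definitions, stated here for clarity (the underlying counts are in the study's data and are not reproduced):

```latex
\text{sensitivity} = \frac{TP}{TP + FN},
\qquad
\text{specificity} = \frac{TN}{TN + FP},
```

where TP, FN, TN, and FP are the true/false positive and negative counts of the tourniquet test judged against the IgM-anti-DENV ELISA reference.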

Relevance:

40.00%

Publisher:

Abstract:

A combination of Perl scripts and LaTeX files, this generates multiple multiple-choice class tests from a single set of questions. You input a list of questions and answers into a text file. The script then produces any number of class tests, together with master answer sheets, by scrambling the order of the questions and the answers. Includes a detailed README file, but best just to try it and see.
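Purely as an illustration of the scrambling idea (the tool itself consists of Perl scripts emitting LaTeX, none of whose code is shown here), a hypothetical Python sketch:

```python
# Shuffle question order and answer order per test version and record the
# answer key for the master sheet. The question bank below is hypothetical.
import random

questions = [  # (stem, [answers], index of correct answer)
    ("2 + 2 = ?", ["3", "4", "5"], 1),
    ("Capital of France?", ["Paris", "Rome", "Madrid"], 0),
]

def make_version(seed: int):
    rng = random.Random(seed)
    order = rng.sample(range(len(questions)), len(questions))   # scramble questions
    paper, key = [], []
    for qi in order:
        stem, answers, correct = questions[qi]
        perm = rng.sample(range(len(answers)), len(answers))    # scramble answers
        paper.append((stem, [answers[j] for j in perm]))
        key.append("ABC"[perm.index(correct)])                  # letter of correct answer
    return paper, key

paper, key = make_version(seed=1)
for i, (stem, answers) in enumerate(paper, 1):
    print(f"{i}. {stem}  " + "  ".join(f"({'ABC'[k]}) {a}" for k, a in enumerate(answers)))
print("Master answer key:", " ".join(key))
```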

Relevance:

40.00%

Publisher:

Abstract:

National Highway Traffic Safety Administration, Washington, D.C.