941 results for test case optimization


Relevance: 90.00%

Abstract:

This paper proposes two meta-heuristics (Genetic Algorithm and Evolutionary Particle Swarm Optimization) for solving a 15-bid case of Ancillary Services Dispatch in an Electricity Market. A Linear Programming approach is also included for comparison purposes. A test case based on the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services is used to demonstrate that meta-heuristics are suitable for solving this kind of optimization problem. Faster execution times and lower computational resource requirements are the most relevant advantages of the meta-heuristics when compared with the Linear Programming approach.
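As a rough illustration of the Genetic Algorithm side of such an approach (the bid data, the single-service structure, and all GA parameters below are invented for the sketch and are not the paper's 15-bid case):

```python
import random

# Hypothetical toy instance: 15 bids, each (capacity_MW, price_per_MW)
# for a single reserve service; the paper dispatches four services.
BIDS = [(10, 5.0), (20, 7.5), (15, 6.0), (5, 4.0), (25, 9.0),
        (10, 5.5), (30, 10.0), (5, 3.5), (20, 8.0), (15, 6.5),
        (10, 4.5), (25, 8.5), (5, 3.0), (20, 7.0), (15, 5.5)]
REQUIRED_MW = 60  # reserve requirement to be covered

def cost(selection):
    """Total cost of accepted bids, with a heavy penalty if the
    accepted capacity does not cover the requirement."""
    mw = sum(BIDS[i][0] for i, on in enumerate(selection) if on)
    c = sum(BIDS[i][0] * BIDS[i][1] for i, on in enumerate(selection) if on)
    return c + 1e6 * max(0, REQUIRED_MW - mw)

def genetic_dispatch(pop_size=40, generations=200, seed=1):
    """Minimal GA: bitstring individuals (bid accepted / rejected),
    truncation selection, one-point crossover, single-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in BIDS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(BIDS))   # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(len(BIDS))        # mutate one bit
            child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = genetic_dispatch()
```

The penalty term turns the capacity constraint into part of the fitness, which is the usual way bitstring GAs handle feasibility without a repair operator.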

Relevance: 90.00%

Abstract:

Modern, sophisticated telecommunication devices require ever more comprehensive testing to ensure quality. The number of test cases needed to ensure adequate coverage has grown rapidly, and this demand can no longer be met by manual testing alone. New agile development models also require all test cases to be executed in every iteration. This has led manufacturers to use test automation more than ever to achieve adequate testing coverage and quality. This thesis is divided into three parts. The first part presents the evolution of cellular networks and examines software testing, test automation and the influence of the development model on testing. The second part describes the process used to implement a test automation scheme for functional testing of the LTE core network MME element; agile development models and the Robot Framework test automation tool were used in the implementation. The third part presents two alternative models for integrating this test automation scheme into a continuous integration process. As a result, the test automation scheme for functional testing was implemented, and almost all new functional-level test cases can now be automated with it. In addition, two models for integrating the scheme into a wider continuous integration pipeline were introduced. The shift in testing from a traditional waterfall model to an agile development model also proved successful.

Relevance: 90.00%

Abstract:

Pertinent domestic and international developments involving tensions affecting religious or belief communities have increasingly occupied the international law agenda. Those who generate and thus shape international law jurisprudence are in the process of seeking answers to these questions. The need to reconceptualize the right to freedom of religion or belief therefore continues, as demands on that right challenge the boundaries of religious freedom in national and international law. This thesis aims to contribute to this process of “re-conceptualization” by exploring the notion of the collective dimension of freedom of religion or belief, with a view to advancing the protection of that right. Turkey provides a useful test case: its domestic legislation can be assessed against international standards, while lessons can be drawn for improving the standard of international review of the protection of the collective dimension of freedom of religion or belief. The right to freedom of religion or belief, as enshrined in international human rights documents, is unique in its formulation in that it protects the enjoyment of the rights “in community with others”.1 It cannot be realized in isolation; it crosses categories of human rights, with aspects that are individual, aspects that can be effectively realized only in an organized community of individuals, and aspects that belong to the field of economic, social and cultural rights, such as those related to religious or moral education.
This study centers on two primary questions: first, what is the scope and nature of the protection afforded to the collective dimension of freedom of religion or belief in international law, and, secondly, how does the protection of the collective dimension of freedom of religion or belief in Turkey compare with international standards? Section I explores the notion of the collective dimension of freedom of religion or belief and the scope of its protection in international law, with particular reference to the right to acquire legal personality and the autonomy of religious/belief communities. Section II, the case study on Turkey, constitutes the applied part of the thesis; here, the protection of the collective dimension is assessed with a view to evaluating the compliance of Turkish legislation and practice with international norms, as well as identifying how the standard of international review of the collective dimension of freedom of religion or belief can be improved.

Relevance: 90.00%

Abstract:

Accurate knowledge of species’ habitat associations is important for conservation planning and policy. Assessing habitat associations is a vital precursor to selecting appropriate indicator species for prioritising sites for conservation or assessing trends in habitat quality. However, much existing knowledge is based on qualitative expert opinion or local-scale studies, and may not remain accurate across different spatial scales or geographic locations. Data from biological recording schemes have the potential to provide objective measures of habitat association, with the ability to account for spatial variation. We used data on 50 British butterfly species from two different butterfly recording schemes as a test case to investigate the correspondence of data-derived measures of habitat association with expert opinion. One scheme collected large quantities of occurrence data (c. 3 million records) and the other, lower quantities of standardised monitoring data (c. 1400 sites). We used general linear mixed effects models to derive scores of association with broad-leaf woodland for both datasets and compared them with scores canvassed from experts. Scores derived from occurrence and abundance data both showed strongly positive correlations with expert opinion; however, only for occurrence data did these fall within the range of correlations between experts. Data-derived scores showed regional spatial variation in the strength of butterfly associations with broad-leaf woodland, with a significant latitudinal trend in 26% of species. Sub-sampling of the data suggested that a mean sample size of 5000 occurrence records per species is needed to gain an accurate estimation of habitat association, although habitat specialists are likely to be readily detected using several hundred records. Occurrence data from recording schemes can thus provide easily obtained, objective, quantitative measures of habitat association.
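One minimal, hypothetical sketch of what a data-derived association score could look like (the study itself fitted mixed-effects models; the species names and records below are invented):

```python
# Hypothetical occurrence records: (species, habitat_of_record).
records = [
    ("speckled_wood", "broadleaf_woodland"), ("speckled_wood", "broadleaf_woodland"),
    ("speckled_wood", "grassland"),
    ("common_blue", "grassland"), ("common_blue", "grassland"),
    ("common_blue", "broadleaf_woodland"),
]

def association_score(species, habitat, records):
    """Fraction of a species' records falling in `habitat`, minus the
    fraction of *all* records in that habitat, so 0 means 'no more
    associated than average', positive means attraction, negative
    avoidance.  (Only the simplest possible data-derived score; it
    ignores recording effort and spatial structure.)"""
    sp = [h for s, h in records if s == species]
    base = sum(1 for _, h in records if h == habitat) / len(records)
    return sum(1 for h in sp if h == habitat) / len(sp) - base

wood_score = association_score("speckled_wood", "broadleaf_woodland", records)
```

Subtracting the overall habitat frequency is what makes the score comparable across habitats of very different availability, which is the main advantage such quantitative scores have over raw record counts.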

Relevance: 90.00%

Abstract:

Testing is a core activity in system development, and it can be performed manually or automated. Test activities can be supported by Word documents and Excel sheets for documenting, executing and following up test cases, but there are also dedicated test tools designed to support and facilitate the testing process. This study describes manual testing and identifies strengths and weaknesses both of manual testing with the tool Microsoft Test Manager (MTM) and of manual testing using the test case and test log templates developed by the testers at Sogeti. The analysis of problems and strengths, of the literature, and of first-hand experience in creating, documenting and executing test cases points to the following strengths and weaknesses. The strengths of the test tool are that it gathers the needed functionality in one place, available when needed without opening other programs, which saves many activity steps. The main strengths of testing without tool support are that it is easy to learn, gives a good overview, allows text to be formatted as desired, and is flexible to changes during the execution of a test case. Weaknesses of testing with tool support include that it is difficult to get a good overview of the entire test case, that the text in the test steps cannot be formatted, and that the test steps cannot be modified during execution. It is also difficult to use some of the test design techniques of TMap, for example a checklist, in MTM. The weakness of testing without MTM is that the tester must perform many more activity steps to accomplish the same tasks, and there is more to remember because the documents the tester uses are not directly linked.
Altogether, the strengths of the test tool stand out when it comes to supporting the testing process.

Relevance: 90.00%

Abstract:

A system built in terms of autonomous agents may require even greater correctness assurance than one which is merely reacting to the immediate control of its users. Agents make substantial decisions for themselves, so thorough testing is an important consideration. However, autonomy also makes testing harder: by their nature, autonomous agents may react in different ways to the same inputs over time because, for instance, they have changeable goals and knowledge. For this reason, we argue that testing autonomous agents requires a procedure that caters for a wide range of test case contexts, and that can search for the most demanding of these test cases, even when they are not apparent to the agents’ developers. In this paper, we address this problem, introducing and evaluating an approach to testing autonomous agents that uses evolutionary optimization to generate demanding test cases.
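A toy sketch of the idea, with an invented agent whose narrow "fault region" stands in for the demanding test contexts the paper searches for (this is not the paper's actual algorithm, just the general evolutionary-search pattern):

```python
import random

def agent_stress(x):
    """Stand-in for 'how demanding input x is' for an agent under test:
    responses degrade sharply inside a narrow fault region (42..44)
    that the developers are assumed not to know about, with only a
    mild gradient elsewhere to guide the search."""
    if 42.0 <= x <= 44.0:
        return 100.0                                  # fault triggered
    return max(0.0, 10.0 - 0.2 * abs(x - 43.0))

def evolve_demanding_tests(pop_size=20, generations=100, seed=3):
    """Evolutionary search for the most demanding test input: keep the
    best half, mutate a quarter of them, and inject random immigrants
    to preserve diversity."""
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 100.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=agent_stress, reverse=True)
        elite = pop[:pop_size // 2]
        mutants = [min(100.0, max(0.0, x + rng.gauss(0.0, 5.0)))
                   for x in elite[:pop_size // 4]]
        immigrants = [rng.uniform(0.0, 100.0)
                      for _ in range(pop_size - len(elite) - len(mutants))]
        pop = elite + mutants + immigrants
    return pop[0]

worst_case = evolve_demanding_tests()
```

The random immigrants matter here: because the agent's behaviour can change over time, a purely exploitative search would overfit to one context, whereas continual exploration keeps probing for new demanding cases.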

Relevance: 90.00%

Abstract:

This article introduces an efficient method to generate structural models for medium-sized silicon clusters. Geometrical information obtained from previous investigations of small clusters is initially sorted and then introduced into our predictor algorithm in order to generate structural models for large clusters. The method predicts geometries whose binding energies are close (95%) to the corresponding ground-state value, at very low computational cost. These predictions can be used as a very good initial guess for any global optimization algorithm. As a test case, information from clusters up to 14 atoms was used to predict good models for silicon clusters up to 20 atoms. We believe that the new algorithm may enhance the performance of most optimization methods whenever some previous information is available. (C) 2003 Wiley Periodicals, Inc.
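The grow-from-known-geometry idea can be caricatured with a classical pair potential in place of the quantum-mechanical energies used for real silicon clusters (everything below, including the Lennard-Jones stand-in and the seed geometry, is illustrative only):

```python
import itertools
import random

def lj_energy(atoms):
    """Total Lennard-Jones energy in reduced units -- a cheap stand-in
    for the binding energy of a real silicon cluster."""
    e = 0.0
    for (x1, y1, z1), (x2, y2, z2) in itertools.combinations(atoms, 2):
        r2 = (x1 - x2) ** 2 + (y1 - y2) ** 2 + (z1 - z2) ** 2
        inv6 = 1.0 / r2 ** 3
        e += 4.0 * (inv6 * inv6 - inv6)
    return e

def grow_cluster(seed_atoms, n_target, trials=200, seed=0):
    """Grow a structural model atom by atom: each new atom is placed at
    the best of `trials` random candidate positions near the current
    cluster.  The result is meant only as an initial guess to feed a
    global optimizer, not as the ground state itself."""
    rng = random.Random(seed)
    atoms = list(seed_atoms)
    while len(atoms) < n_target:
        cx, cy, cz = atoms[rng.randrange(len(atoms))]
        best = None
        for _ in range(trials):
            cand = (cx + rng.uniform(-1.5, 1.5),
                    cy + rng.uniform(-1.5, 1.5),
                    cz + rng.uniform(-1.5, 1.5))
            e = lj_energy(atoms + [cand])
            if best is None or e < best[0]:
                best = (e, cand)
        atoms.append(best[1])
    return atoms

# A known small-cluster geometry (near-equilateral triangle at the
# LJ equilibrium distance) serves as the "previous information".
seed_geometry = [(0.0, 0.0, 0.0), (1.12, 0.0, 0.0), (0.56, 0.97, 0.0)]
model = grow_cluster(seed_geometry, 7)
```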

Relevance: 90.00%

Abstract:

In recent years, due to the rapid convergence of multimedia services, the Internet and wireless communications, there has been a growing trend towards heterogeneity (in terms of channel bandwidths, mobility levels of terminals, and end-user quality-of-service (QoS) requirements) in emerging integrated wired/wireless networks. Moreover, in today's systems a multitude of users coexists within the same network, each with its own QoS requirement and bandwidth availability. In this framework, embedded source coding, which allows partial decoding at various resolutions, is an appealing technique for multimedia transmission. This dissertation covers my PhD research, mainly devoted to the study of embedded multimedia bitstreams in heterogeneous networks, developed at the University of Bologna, advised by Prof. O. Andrisano and Prof. A. Conti, and at the University of California, San Diego (UCSD), where I spent eighteen months as a visiting scholar, advised by Prof. L. B. Milstein and Prof. P. C. Cosman. In order to improve multimedia transmission quality over wireless channels, joint source and channel coding optimization is investigated in a 2D time-frequency resource block for an OFDM system. We show that knowing the order of diversity in the time and/or frequency domain can assist image (video) coding in selecting optimal channel code rates (source and channel code rates). Adaptive modulation techniques, aimed at maximizing spectral efficiency, are then investigated as another possible solution for improving multimedia transmission. For both slow and fast adaptive modulation, the effects of imperfect channel estimation are evaluated, showing that the fast technique, optimal in ideal systems, may be outperformed by slow adaptive modulation when a real test case is considered.
Finally, the effects of co-channel interference and approximated bit error probability (BEP) are evaluated in adaptive modulation techniques, providing new decision-region concepts and showing how the widely used BEP approximations lead to a substantial loss in overall performance.
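The adaptive modulation idea can be sketched with the widely used closed-form M-QAM approximation BEP ≈ 0.2·exp(−1.5·SNR/(M−1)); the target BEP and constellation set below are arbitrary, and the dissertation's point is precisely that such approximations can be misleading in real systems:

```python
import math

def approx_bep_mqam(snr_linear, M):
    """Common closed-form approximation to the bit error probability
    of M-QAM on an AWGN channel: 0.2 * exp(-1.5 * SNR / (M - 1))."""
    return 0.2 * math.exp(-1.5 * snr_linear / (M - 1))

def pick_modulation(snr_db, target_bep=1e-3, orders=(64, 16, 4)):
    """Adaptive modulation sketch: choose the largest constellation
    whose approximate BEP meets the target at the given SNR; return
    None (no transmission) if even the smallest order fails."""
    snr = 10.0 ** (snr_db / 10.0)
    for M in orders:
        if approx_bep_mqam(snr, M) <= target_bep:
            return M
    return None
```

Scanning constellations from largest to smallest implements the spectral-efficiency maximization: the first order that satisfies the BEP constraint carries the most bits per symbol.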

Relevance: 90.00%

Abstract:

This thesis describes the development and testing of a novel interferometer with two spatially separated, phase-correlated X-ray sources for measuring the real part of the complex refractive index of thin, free-standing foils. The X-ray sources are two foils in which relativistic electrons with an energy of 855 MeV generate transition radiation. The interferometer, realized at the Mainz Microtron MAMI, consists of a beryllium foil 10 micrometers thick and a nickel sample foil 2.1 micrometers thick. The spatial interference structures are measured as a function of the foil separation in a position-resolving pn-CCD, after Fourier analysis of the radiation pulse by means of a silicon single-crystal spectrometer. The phase of the intensity oscillations contains information about the dispersion that the wave generated in the upstream foil experiences in the downstream sample foil. As a case study, the dispersion of nickel was measured in the region around the K absorption edge at 8333 eV, as well as at photon energies around 9930 eV. Clear interference structures were detected at both energies, although the coherence decreases with increasing foil separation and observation angle because of angle mixing. Simulations accounting for the coherence-reducing effects were fitted to the measured data. From these fits, the dispersion of the nickel sample was determined at both energies with a relative accuracy of 1.5% or better, in good agreement with the literature.

Relevance: 90.00%

Abstract:

The world's population is constantly growing, so the concept of smart and cognitive cities is becoming more important. Developed countries are aware of and working towards needed changes in city management, but emerging countries also require the optimization of their own city management. This chapter illustrates, based on a use case, how a city in an emerging country can progress quickly using the concept of smart and cognitive cities. Nairobi, the capital of Kenya, is chosen as the test case. More than half of the population of Nairobi lives in slums with poor sanitation, and many slum inhabitants often share a single toilet, so the proper functioning and reliable maintenance of toilets are crucial. For this purpose, an approach for processing text messages based on cognitive computing (using soft computing methods) is introduced. Slum inhabitants can inform the responsible center via text message when toilets are not functioning properly. Through cognitive computer systems, the center can fix the problem quickly and efficiently by sending repair workers to the area. Focusing on the slum of Kibera, an approach that is easy for slum inhabitants to use is presented, which can make the city more efficient, sustainable and resilient (i.e., cognitive).

Relevance: 90.00%

Abstract:

Data mining projects that are based on decision trees for classifying test cases will usually rank the classified cases by the probabilities these decision trees provide. A better method is needed for ranking test cases that have already been classified by a binary decision tree, because these probabilities are not always accurate and reliable enough. One reason is that the probability estimates computed by existing decision tree algorithms are the same for all the different cases in a particular leaf of the decision tree; this alone already makes them an inaccurate means of deciding whether a test case has been correctly classified. Isabelle Alvarez has proposed a new method for ranking the test cases classified by a binary decision tree [Alvarez, 2004]. In this paper we give the results of a comparison of different ranking methods that are based on the probability estimate, the sensitivity of a particular case, or both.
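A hand-built toy tree makes the limitation concrete: every case in a leaf gets the same probability estimate, so within-leaf ranking needs extra information. The boundary-distance tie-break below is loosely in the spirit of a sensitivity-based method, not Alvarez's exact algorithm:

```python
# A hand-built one-split "binary decision tree": x < 5 -> leaf A, else leaf B.
THRESHOLD = 5.0
LEAF_PROB = {"A": 0.9, "B": 0.7}   # P(predicted class) per leaf, from training

def leaf_of(x):
    return "A" if x < THRESHOLD else "B"

def rank_cases(cases):
    """Sort classified cases from most to least reliably classified:
    leaf probability first, distance to the split threshold as the
    within-leaf tie-break (cases far from the decision boundary are
    treated as more reliably classified)."""
    return sorted(cases,
                  key=lambda x: (LEAF_PROB[leaf_of(x)], abs(x - THRESHOLD)),
                  reverse=True)

cases = [1.0, 4.9, 6.0, 9.0]
ranking = rank_cases(cases)
```

With leaf probability alone, 1.0 and 4.9 would be tied; the distance term separates them even though both sit in the same leaf.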

Relevance: 90.00%

Abstract:

With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The increasing number of software configurations, input parameters, usage scenarios, supporting platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, this is not the case with software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging which leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing already existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and then presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared between a number of test cases failing for the same reason resemble the faulty execution path, and hence the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers.
Optimization techniques are devised to generate shorter and more logical sequence covers, and to select subsequences with high likelihood of containing the root cause among the set of all possible common subsequences. A hybrid static/dynamic analysis approach is designed to trace back the common subsequences from the end to the root cause. A debugging tool is created to enable developers to use the approach, and integrate it with an existing Integrated Development Environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool suggestions, and their source code counterparts. Finally, a comparison between the developed approach and the state-of-the-art techniques shows that developers need only to inspect a small number of lines in order to find the root cause of the fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both the algorithm running time and the output subsequence length.
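The core step, extracting subsequences common to all failing traces, can be sketched by brute force (the thesis implements an efficient algorithm; the trace contents below are invented):

```python
def _contains(seq, sub):
    """True if the contiguous subsequence `sub` occurs in `seq`."""
    n, m = len(seq), len(sub)
    return any(tuple(seq[k:k + m]) == sub for k in range(n - m + 1))

def common_subsequences(traces, min_len=2):
    """Return the maximal contiguous subsequences (length >= min_len)
    that occur in *every* failing trace -- candidates for the shared
    faulty execution path.  Brute force over the shortest trace, so
    only suitable as a sketch, not as the efficient algorithm."""
    base = min(traces, key=len)
    found = set()
    for i in range(len(base)):
        for j in range(i + min_len, len(base) + 1):
            sub = tuple(base[i:j])
            if all(_contains(t, sub) for t in traces):
                found.add(sub)
    # keep only maximal subsequences (not contained in a longer one)
    return {s for s in found
            if not any(s != o and _contains(o, s) for o in found)}

failing = [
    ["init", "parse", "validate", "write", "close"],
    ["init", "read", "parse", "validate", "write"],
    ["parse", "validate", "write", "log"],
]
shared = common_subsequences(failing)
```

Here the three failing traces differ at their edges but share one segment, which is exactly the kind of narrowed-down search space the hypothesis describes.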

Relevance: 90.00%

Abstract:

The constant need to improve helicopter performance requires the optimization of existing and future rotor designs. A crucial indicator of rotor capability is hover performance, which depends on the near-body flow as well as the structure and strength of the tip vortices formed at the trailing edge of the blades. Computational Fluid Dynamics (CFD) solvers must balance computational expense with preservation of the flow, and to limit computational expense the mesh is often coarsened in the outer regions of the computational domain. This can lead to degradation of the vortex structures which compose the rotor wake. The current work conducts three-dimensional simulations using OVERTURNS, a three-dimensional structured grid solver that models the flow field using the Reynolds-Averaged Navier-Stokes equations. The S-76 rotor in hover was chosen as the test case for evaluating the OVERTURNS solver, focusing on methods to better preserve the rotor wake. Using the hover condition, various computational domains, spatial schemes, and boundary conditions were tested. Furthermore, a mesh adaption routine was implemented, allowing for increased refinement of the mesh in areas of turbulent flow without the need to add points to the mesh. The adapted mesh was employed to conduct a sweep of collective pitch angles, comparing the resolved wake and integrated forces to existing computational and experimental results. The integrated thrust values showed very close agreement across all tested pitch angles, while the power was slightly overpredicted, resulting in an underprediction of the Figure of Merit. Meanwhile, the tip vortices were preserved for multiple blade passages, indicating an improvement in vortex preservation when compared with previous work. Finally, further results from a single collective pitch case were presented to provide a more complete picture of the solver results.

Relevance: 90.00%

Abstract:

Facility location concerns the placement of facilities, for various objectives, by use of mathematical models and solution procedures. Almost all facility location models in the literature are based on minimizing costs or maximizing cover, i.e., covering as much demand as possible. These models are quite efficient at finding an optimal location for a new facility for a particular data set, which is assumed to be constant and known in advance. In a real-world situation, input data such as demand and travel costs are neither fixed nor known in advance. This uncertainty and uncontrollability can lead to unacceptable losses or even bankruptcy. One way of dealing with these factors is robustness modelling. A robust facility location model aims to locate a facility that stays within predefined limits, as well as possible, under all expected circumstances. The deviation robustness concept is used as the basis for a new competitive deviation robustness model. Competition is modelled with a Huff-based model, which calculates the market share of the new facility. Robustness in this model is defined as the ability of a facility location to capture a minimum market share despite variations in demand. A test case is developed with which algorithms can be tested on their ability to solve robust facility location models. Four stochastic optimization algorithms are considered, of which Simulated Annealing turned out to be the most appropriate. The test case is slightly modified for a competitive market situation. With the Simulated Annealing algorithm, the developed competitive deviation model is solved for three considered norms of deviation. Finally, a grid search is performed to illustrate the landscape of the objective function of the competitive deviation model. The model appears to be multimodal and seems to be challenging for further research.
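A minimal sketch of Simulated Annealing maximizing a Huff-style market share in one dimension (the instance, the 1/distance² attractiveness, and all parameters are invented; the thesis works with a full competitive deviation model, not this toy):

```python
import math
import random

# Hypothetical instance: demand points with weights, one existing
# competitor, and a new facility to be located on the segment [0, 10].
DEMAND = [(1.0, 30), (4.0, 50), (8.0, 40)]   # (position, demand weight)
COMPETITOR = 6.0

def huff_share(x):
    """Huff-style market share of a new facility at x: each demand
    point splits its weight between the two facilities in proportion
    to attractiveness 1 / distance**2."""
    share = 0.0
    for pos, w in DEMAND:
        a = 1.0 / max(1e-9, (pos - x) ** 2)
        b = 1.0 / max(1e-9, (pos - COMPETITOR) ** 2)
        share += w * a / (a + b)
    return share

def simulated_annealing(seed=7, steps=2000, temp=10.0, cooling=0.995):
    """Maximize huff_share: always accept improvements, accept worse
    moves with Boltzmann probability, and cool the temperature."""
    rng = random.Random(seed)
    x = rng.uniform(0.0, 10.0)
    val = huff_share(x)
    best, best_val = x, val
    for _ in range(steps):
        cand = min(10.0, max(0.0, x + rng.gauss(0.0, 0.5)))
        cand_val = huff_share(cand)
        if cand_val > val or rng.random() < math.exp((cand_val - val) / temp):
            x, val = cand, cand_val
        if val > best_val:
            best, best_val = x, val
        temp *= cooling
    return best, best_val

loc, share = simulated_annealing()
```

The occasional acceptance of worse moves is what lets the search escape local optima, which matters here because, as the abstract notes, the objective landscape is multimodal.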

Relevance: 80.00%

Abstract:

Carrying information about the microstructure and stress behaviour of ferromagnetic steels, magnetic Barkhausen noise (MBN) has been used as a basis for effective non-destructive testing methods, opening new areas in industrial applications. One of the factors that determines the quality and reliability of MBN analysis is the way information is extracted from the signal. Commonly, simple scalar parameters such as the amplitude maximum and the signal root mean square are used to characterize the information content. This paper presents a new approach based on time-frequency analysis. The experimental test case uses MBN signals to characterize hardness gradients in an AISI 4140 steel. To that purpose, different time-frequency (TFR) and time-scale (TSR) representations are assessed: the spectrogram, the Wigner-Ville distribution, the Capongram, the ARgram obtained from an autoregressive model, the scalogram, and the Mellingram obtained from a Mellin transform. It is shown that, due to the nonstationary characteristics of MBN, TFRs can provide a rich and new panorama of these signals. Techniques for extracting time-frequency parameters are then used to support a diagnostic process. Comparison with results obtained by the classical method highlights the improvement in diagnosis provided by the proposed method.
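The simplest of the assessed representations, the spectrogram, can be sketched as short-time Fourier magnitudes (a pure-Python DFT with no window function; the synthetic signal below merely mimics a frequency shift of the kind a nonstationary MBN signal exhibits):

```python
import cmath
import math

def stft_magnitudes(signal, win=16, hop=8):
    """Short-time Fourier magnitudes of a 1-D signal: a minimal
    spectrogram.  Each frame is a plain DFT of `win` samples; only the
    non-negative frequency bins are kept."""
    frames = []
    for start in range(0, len(signal) - win + 1, hop):
        seg = signal[start:start + win]
        mags = []
        for k in range(win // 2):
            acc = sum(seg[n] * cmath.exp(-2j * math.pi * k * n / win)
                      for n in range(win))
            mags.append(abs(acc))
        frames.append(mags)
    return frames

# Synthetic nonstationary signal: low frequency first, higher later,
# so the dominant bin moves across the time-frequency plane.
sig = [math.sin(2 * math.pi * 1 * n / 16) for n in range(64)] + \
      [math.sin(2 * math.pi * 4 * n / 16) for n in range(64)]
spec = stft_magnitudes(sig)
```

A scalar parameter such as the RMS would be identical for both halves of this signal; the spectrogram resolves the change over time, which is the motivation for time-frequency analysis of nonstationary signals.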