941 results for test case optimization


Relevance: 30.00%

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 30.00%

Abstract:

Sustainable forest restoration and management practices require a thorough understanding of the influence that habitat fragmentation has on the processes shaping genetic variation and its distribution in tree populations. We quantified genetic variation at isozyme markers and chloroplast DNA (cpDNA), analysed by polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP), in severely fragmented populations of Sorbus aucuparia (Rosaceae) in a single catchment (Moffat) in southern Scotland. Remnants maintain surprisingly high levels of gene diversity (H_E) for isozymes (H_E = 0.195) and cpDNA markers (H_E = 0.490). Estimates are very similar to those from non-fragmented populations in continental Europe, even though the latter were sampled over a much larger spatial scale. Overall, no genetic bottleneck or departures from random mating were detected in the Moffat fragments. However, genetic differentiation among remnants was detected for both types of marker (isozymes θ_n = 0.043, cpDNA θ_c = 0.131; G-test, P < 0.001). In this self-incompatible, insect-pollinated, bird-dispersed tree species, the estimated ratio of pollen flow to seed flow between fragments is close to 1 (r = 1.36). Reduced pollen-mediated gene flow is a likely consequence of habitat fragmentation, but effective seed dispersal by birds is probably helping to maintain high levels of genetic diversity within remnants and to reduce genetic differentiation between them.

Relevance: 30.00%

Abstract:

Assessments of the conservation status of threatened species that are based purely on subjective judgement are problematic because they can be influenced by hidden assumptions, personal biases and perceptions of risk, making the assessment process difficult to repeat. This can result in inconsistent assessments and misclassifications, which in turn erode confidence in species assessments. It is almost impossible to understand an expert's logic or visualise the underlying reasoning behind the many hidden assumptions used throughout the assessment process. In this paper, we formalise the decision-making process of experts by capturing their logical ordering of information, their assumptions and their reasoning, and transferring them into a set of decision rules. We illustrate this through the process used to evaluate the conservation status of species under the NatureServe system (Master, 1991). NatureServe status assessments have been used for over two decades to set conservation priorities for threatened species throughout North America. We develop a conditional point-scoring method to reflect the current subjective process. In two test comparisons, 77% of species' assessments using the explicit NatureServe method matched the qualitative assessments done subjectively by NatureServe staff. Of those that differed, no rank varied by more than one rank level under the two methods. In general, the explicit NatureServe method tended to be more precautionary than the subjective assessments. The rank differences that emerged from the comparisons may be due, at least in part, to the flexibility of the qualitative system, which allows different factors to be weighted on a species-by-species basis according to expert judgement. The method outlined in this study is the first documented attempt to explicitly define a transparent process for weighting and combining factors under the NatureServe system. The process of eliciting expert knowledge identifies how information is combined and highlights any inconsistent logic that may not be obvious in subjective decisions. The method provides a repeatable, transparent, and explicit benchmark for feedback, further development, and improvement. © 2004 Elsevier SAS. All rights reserved.

Relevance: 30.00%

Abstract:

Evolutionary algorithms perform optimization using a population of sample solution points. An interesting development has been to view population-based optimization as the process of evolving an explicit, probabilistic model of the search space. This paper investigates a formal basis for continuous, population-based optimization in terms of a stochastic gradient descent on the Kullback-Leibler divergence between the model probability density and the objective function, represented as an unknown density of assumed form. This leads to an update rule that is related to, and compared with, previous theoretical work, a continuous version of the population-based incremental learning algorithm, and the generalized mean shift clustering framework. Experimental results are presented that demonstrate the dynamics of the new algorithm on a set of simple test problems.
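
To make the flavour of such an update concrete, here is a minimal sketch, not the paper's derived rule: a one-dimensional Gaussian search model whose mean and variance are pulled toward fitness-weighted sample statistics, in the spirit of continuous population-based incremental learning. The function names, learning rate and weighting scheme are all illustrative assumptions.

```python
import numpy as np

def gaussian_model_step(mu, sigma, objective, n_samples=100, lr=0.1, rng=None):
    """One update of a 1-D Gaussian search model: sample the model, weight
    samples by (shifted) fitness, and pull the model's mean and variance
    toward the fitness-weighted statistics.  A sketch of the kind of update
    the paper derives more formally from a KL-divergence gradient."""
    rng = rng or np.random.default_rng()
    x = rng.normal(mu, sigma, size=n_samples)
    f = np.array([objective(xi) for xi in x])
    w = f - f.min()                                   # non-negative weights
    w = w / w.sum() if w.sum() > 0 else np.full(n_samples, 1 / n_samples)
    mu_new = (1 - lr) * mu + lr * np.sum(w * x)       # move toward weighted mean
    var_new = (1 - lr) * sigma**2 + lr * np.sum(w * (x - np.sum(w * x))**2)
    return mu_new, np.sqrt(var_new)

# Maximise a simple test objective with its peak at x = 2.
mu, sigma = 0.0, 3.0
for _ in range(300):
    mu, sigma = gaussian_model_step(mu, sigma, lambda x: np.exp(-(x - 2.0)**2))
print(round(mu, 2))   # drifts toward 2.0
```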

Relevance: 30.00%

Abstract:

In this paper, numerical simulations are used in an attempt to find optimal source profiles for high-frequency radiofrequency (RF) volume coils. Biologically loaded, shielded/unshielded circular and elliptical birdcage coils operating at 170 MHz, 300 MHz and 470 MHz are modelled using the FDTD method for both 2D and 3D cases. Taking advantage of the fact that some aspects of the electromagnetic system are linear, two approaches have been proposed for determining the drives for individual elements in the RF resonator. The first method is an iterative optimization technique with a kernel for the evaluation of RF fields inside an imaging plane of a human head model using pre-characterized sensitivity profiles of the individual rungs of a resonator; the second method is a regularization-based technique. In the second approach, a sensitivity matrix is explicitly constructed and a regularization procedure is employed to solve the ill-posed problem. Test simulations show that both methods can improve the B_1-field homogeneity in both focused and non-focused scenarios. While the regularization-based method is more efficient, the first optimization method is more flexible as it can take into account other issues such as controlling SAR or reshaping the resonator structures. It is hoped that these schemes and their extensions will be useful for the determination of multi-element RF drives in a variety of applications.
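
Once the sensitivity matrix is in hand, the regularization-based approach reduces to a linear-algebra step. Below is a hedged sketch assuming a pre-computed complex sensitivity matrix S (one row per field sample point, one column per rung) and a uniform target field; Tikhonov regularization is used here as one standard way to stabilise the ill-posed inversion, and all names and dimensions are invented for illustration.

```python
import numpy as np

def regularized_drives(S, b_target, lam=1e-2):
    """Solve the ill-posed linear problem S @ w ~= b_target for complex
    drive weights w of the coil rungs via Tikhonov regularization:
        minimize ||S w - b||^2 + lam * ||w||^2."""
    A = S.conj().T @ S + lam * np.eye(S.shape[1])
    return np.linalg.solve(A, S.conj().T @ b_target)

# Toy example: 200 field points, 16 rungs, random complex sensitivities.
rng = np.random.default_rng(0)
S = rng.standard_normal((200, 16)) + 1j * rng.standard_normal((200, 16))
b = np.ones(200, dtype=complex)           # uniform target B_1 field
w = regularized_drives(S, b)
print(np.abs(S @ w - b).max())            # residual field inhomogeneity
```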

Relevance: 30.00%

Abstract:

Achieving consistency between a specification and its implementation is an important part of software development. In previous work, we have presented a method and tool support for testing a formal specification using animation and then verifying an implementation of that specification. The method is based on a testgraph, which provides a partial model of the application under test. The testgraph is used in combination with an animator to generate test sequences for testing the formal specification. The same testgraph is used during testing to execute those same sequences on the implementation and to ensure that the implementation conforms to the specification. So far, the method and its tool support have been applied to software components that can be accessed through an application programming interface (API). In this paper, we use an industrially based case study to discuss the problems associated with applying the method to a software system with a graphical user interface (GUI). In particular, the lack of a standardised interface, as well as controllability and observability problems, makes it difficult to automate the testing of the implementation. The method can still be applied, but the amount of testing that can be carried out on the implementation is limited by the manual effort involved.
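
A testgraph can be as simple as a labelled directed graph whose walks become operation sequences. The sketch below, using an invented two-state stack example, illustrates only the sequence-generation step; the animator and the conformance check against the implementation are not shown, and nothing here should be read as the authors' tooling.

```python
# A testgraph: a partial model of the application under test.  Nodes are
# abstract states, arcs carry operation names; walks over the arcs become
# test sequences replayable against both the animated specification and
# the implementation.
testgraph = {
    "empty":    [("push", "nonempty")],
    "nonempty": [("push", "nonempty"), ("pop", "empty")],
}

def sequences(graph, state, depth):
    """Enumerate every operation sequence of length `depth` from `state`."""
    if depth == 0:
        yield []
        return
    for op, nxt in graph[state]:
        for tail in sequences(graph, nxt, depth - 1):
            yield [op] + tail

for seq in sequences(testgraph, "empty", 3):
    print(seq)   # ['push', 'push', 'push'], ['push', 'push', 'pop'], ...
```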

Relevance: 30.00%

Abstract:

Fuzzy signal detection analysis can be a useful complementary technique to traditional signal detection theory analysis methods, particularly in applied settings. For example, traffic situations are better conceived as being on a continuum from no potential for hazard to high potential, rather than either having potential or not having potential. This study examined the relative contribution of sensitivity and response bias to explaining differences in the hazard perception performance of novices and experienced drivers, and the effect of a training manipulation. Novice drivers and experienced drivers were compared (N = 64). Half the novices received training, while the experienced drivers and half the novices remained untrained. Participants completed a hazard perception test and rated potential for hazard in occluded scenes. The response latency of participants to the hazard perception test replicated previous findings of experienced/novice differences and trained/untrained differences. Fuzzy signal detection analysis of both the hazard perception task and the occluded rating task suggested that response bias may be more central to hazard perception test performance than sensitivity, with trained and experienced drivers responding faster and with a more liberal bias than untrained novices. Implications for driver training and the hazard perception test are discussed.
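
For readers unfamiliar with the technique, the following sketch implements the standard fuzzy signal detection mappings, in which hits, misses, false alarms and correct rejections are degrees of membership rather than binary counts. The data values are invented for illustration, and this is not the study's analysis code.

```python
from statistics import NormalDist

def fuzzy_sdt(signal, response):
    """Fuzzy signal detection analysis: the state of the world and the
    response are both degrees of membership in [0, 1].  Returns fuzzy hit
    and false-alarm rates plus sensitivity d' and response bias c."""
    hits = sum(min(s, r) for s, r in zip(signal, response))
    fas = sum(max(r - s, 0) for s, r in zip(signal, response))
    hr = hits / sum(signal)                       # fuzzy hit rate
    far = fas / sum(1 - s for s in signal)        # fuzzy false-alarm rate
    z = NormalDist().inv_cdf
    return hr, far, z(hr) - z(far), -0.5 * (z(hr) + z(far))

# `signal` = rated potential for hazard of each scene, `response` = the
# driver's graded response; both on a 0-1 scale (values invented).
signal = [0.9, 0.7, 0.2, 0.1, 0.8, 0.3]
response = [0.8, 0.9, 0.4, 0.0, 0.7, 0.2]
print(fuzzy_sdt(signal, response))
```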

Relevance: 30.00%

Abstract:

A formalism for modelling the dynamics of genetic algorithms (GAs) using methods from statistical mechanics, originally due to Prügel-Bennett and Shapiro, is reviewed, generalized and improved upon. This formalism can be used to predict the averaged trajectory of macroscopic statistics describing the GA's population. These macroscopics are chosen to average well between runs, so that fluctuations from mean behaviour can often be neglected. Where necessary, non-trivial terms are determined by assuming maximum entropy with constraints on known macroscopics. Problems of realistic size are described in compact form and finite population effects are included, often proving to be of fundamental importance. The macroscopics used here are cumulants of an appropriate quantity within the population and the mean correlation (Hamming distance) within the population. Including the correlation as an explicit macroscopic provides a significant improvement over the original formulation. The formalism is applied to a number of simple optimization problems in order to determine its predictive power and to gain insight into GA dynamics. Problems which are most amenable to analysis come from the class where alleles within the genotype contribute additively to the phenotype. This class can be treated with some generality, including problems with inhomogeneous contributions from each site, non-linear or noisy fitness measures, simple diploid representations and temporally varying fitness. The results can also be applied to a simple learning problem, generalization in a binary perceptron, and a limit is identified for which the optimal training batch size can be determined for this problem. The theory is compared to averaged results from a real GA in each case, showing excellent agreement if the maximum entropy principle holds. Some situations where this approximation breaks down are identified. In order to fully test the formalism, an attempt is made on the strongly NP-hard problem of storing random patterns in a binary perceptron. Here, the relationship between the genotype and the phenotype (training error) is strongly non-linear. Mutation is modelled under the assumption that perceptron configurations are typical of perceptrons with a given training error. Unfortunately, this assumption does not provide a good approximation in general. It is conjectured that perceptron configurations would have to be constrained by other statistics in order to accurately model mutation for this problem. Issues arising from this study are discussed in conclusion and some possible areas of further research are outlined.
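
The macroscopics the formalism tracks are straightforward to measure on a real population. The sketch below computes leading fitness cumulants and the mean pairwise Hamming distance for a binary-genotype population; it is a sketch of the measured quantities only, not of the formalism's predictive equations, and the onemax fitness in the example is an assumption for illustration.

```python
import numpy as np

def macroscopics(pop, fitness):
    """Macroscopic statistics of a GA population of binary genotypes:
    leading cumulants of the fitness distribution and the mean pairwise
    Hamming distance (the 'correlation' macroscopic)."""
    f = np.array([fitness(g) for g in pop])
    k1, k2 = f.mean(), f.var()
    k3 = ((f - k1) ** 3).mean()                 # third cumulant
    N = len(pop)
    p = pop.mean(axis=0)                        # per-site allele frequencies
    hamming = 2 * N / (N - 1) * np.sum(p * (1 - p))
    return k1, k2, k3, hamming

rng = np.random.default_rng(1)
pop = rng.integers(0, 2, size=(50, 20))         # 50 genotypes, 20 bits
print(macroscopics(pop, fitness=lambda g: g.sum()))   # onemax, for illustration
```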

Relevance: 30.00%

Abstract:

This paper explores how the transaction attributes of technology affect differences in the relationship between technology buyers and suppliers. It also examines the impact on performance of different patterns of relationship between technology buyers and suppliers. Data obtained from 147 manufacturing firms in Malaysia are used to test several hypotheses derived from a review of the literature on technology, transaction cost theory and buyer–supplier relationships (BSR). The results indicate that the higher the level of technological complexity, specificity and uncertainty, the more likely firms are to engage in a closer relationship with technology suppliers. Even though the majority of firms reported improvements in their performance, the results indicate that firms with a closer relationship with technology suppliers are more likely to achieve higher levels of performance than those without. It is also shown that when transaction attributes are high, implementation performance suffers more from weak relationships with technology suppliers than it does when transaction attributes are moderate or low.

Relevance: 30.00%

Abstract:

On 20 October 1997 the London Stock Exchange introduced a new trading system called SETS. This system was to replace the dealer system SEAQ, which had been in operation since 1986. Using the iterated cumulative sums of squares test introduced by Inclan and Tiao (1994), we investigate whether there was a change in the unconditional variance of opening and closing returns at the time SETS was introduced. We show that for the FTSE-100 stocks traded on SETS, on the days following its introduction, there was a widespread increase in the volatility of both opening and closing returns. However, no synchronous volatility changes were found for the FTSE-100 index or FTSE-250 stocks. We therefore conclude that the introduction of the SETS trading mechanism caused an increase in noise at the time the system was introduced.
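
At the core of the iterated procedure is a single centred cumulative-sum-of-squares statistic for detecting one variance break. A minimal sketch follows, using the commonly quoted asymptotic 5% critical value of about 1.358; the simulated returns are illustrative, not the paper's data, and the full iterated algorithm that searches for multiple breakpoints is not shown.

```python
import numpy as np

def inclan_tiao(returns):
    """Inclan-Tiao centred cumulative sum of squares statistic.  With
    C_k = sum of squared returns up to k and D_k = C_k/C_T - k/T, a value
    of max_k sqrt(T/2)|D_k| above ~1.358 signals a change in unconditional
    variance at the maximising k."""
    a = np.asarray(returns)
    T = len(a)
    C = np.cumsum(a ** 2)
    D = C / C[-1] - np.arange(1, T + 1) / T
    k = np.argmax(np.abs(D))
    return np.sqrt(T / 2) * np.abs(D[k]), k + 1   # statistic, breakpoint

# Simulated returns whose volatility doubles halfway through the sample.
rng = np.random.default_rng(0)
r = np.concatenate([rng.normal(0, 1, 500), rng.normal(0, 2, 500)])
stat, k = inclan_tiao(r)
print(stat > 1.358, k)    # expect a detected break near observation 500
```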

Relevance: 30.00%

Abstract:

This paper develops and applies an integrated multiple criteria decision-making approach to optimize the facility location-allocation problem in the contemporary customer-driven supply chain. Unlike traditional optimization techniques, the proposed approach, combining the analytic hierarchy process (AHP) and the goal programming (GP) model, considers both quantitative and qualitative factors, and also aims at maximizing the benefits to deliverer and customers. In the integrated approach, the AHP is used first to determine the relative importance weightings, or priorities, of alternative locations with respect to both deliverer-oriented and customer-oriented criteria. Then the GP model, incorporating the constraints of system, resource, and AHP priority, is formulated to select the best locations for setting up warehouses without exceeding the limited available resources. A real case study is used to demonstrate how the integrated approach can be applied to the facility location-allocation problem, and it is shown that the integrated approach outperforms the traditional cost-based approach.
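
The AHP step of such an approach reduces to extracting priority weights from a pairwise-comparison matrix, conventionally via its principal eigenvector. The sketch below shows that step only, with an invented three-location comparison matrix; the GP model that would consume the weights as priorities is not shown.

```python
import numpy as np

def ahp_weights(M, iters=100):
    """Priority weights from an AHP pairwise-comparison matrix M
    (M[i, j] = how strongly alternative i is preferred over j), taken as
    the principal eigenvector found by power iteration."""
    w = np.ones(M.shape[0])
    for _ in range(iters):
        w = M @ w
        w = w / w.sum()                 # normalise so weights sum to 1
    return w

# Hypothetical comparison of three candidate warehouse locations on one
# criterion: A moderately preferred to B (3), strongly preferred to C (5).
M = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
print(ahp_weights(M).round(3))   # these priorities would feed the GP model
```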

Relevance: 30.00%

Abstract:

Purpose - This article examines the internationalisation of Tesco and extracts the salient lessons learned from this process. Design/methodology/approach - This research draws on a dataset of 62 in-depth interviews with key executives, sell- and buy-side analysts and corporate advisers at the leading investment banks in the City of London to detail the experiences of Tesco's European expansion. Findings - The case study of Tesco illuminates a number of different dimensions of the company's international experience. It offers some new insights into learning in international distribution environments, such as the idea that learning is facilitated by uncertainty or "shocks" in the international retail marketplace; the size of the domestic market may inhibit change and so disable international learning; and learning is not necessarily facilitated by step-by-step incremental approaches to expansion. Research limitations/implications - The paper explores learning from a rather broad perspective, although it is hoped that these parameters can be used to raise a new set of more detailed priorities for future research on international retail learning. It is also recognised that the data gathered for this case study focus on Tesco's European operations. Practical implications - This paper raises a number of interesting issues, such as whether the extremities of the business may be a more appropriate place for management to experiment and test new retail innovations, and the extent to which retailers take self-reflection seriously. Originality/value - The paper applies a new theoretical learning perspective to capture the variety of experiences during the internationalisation process, thus addressing a major gap in our understanding of the whole internationalisation process. © Emerald Group Publishing Limited.

Relevance: 30.00%

Abstract:

BACKGROUND: There is limited research concerning how small companies in particular respond to health and safety messages. AIMS: To understand individuals' knowledge and beliefs about chemical risks and to compare these with those of experts. METHODS: The use of chromic acid, and of other chemicals associated with chromium plating, was studied. All chromium plating firms were based in the West Midlands. The methodology involved initial face-to-face interviews (n = 21) with chromium platers, structured questionnaires (n = 84) to test the prevalence of beliefs identified in the interviews, an expert questionnaire, and a workshop to discuss findings. The responses of platers were compared with those of occupational health and safety experts. RESULTS: Although chromium platers appeared to understand the short-term adverse effects of the chemicals to which they are exposed, their understanding of long-term, or chronic, effects appeared to be incomplete. They had good knowledge of acute effects, based primarily on experience. Platers were aware of the hazardous nature of the chemicals with which they work, but did not draw a distinction between the terms "hazards" and "risks". They had difficulty articulating the effects of the chemicals and how exposure might occur, although it is inappropriate to equate this with a lack of knowledge. A significant minority of platers displayed deficiencies in understanding key technical terms used in Safety Data Sheets. CONCLUSIONS: This study provides a method which can be used to gain some understanding of workers' knowledge and beliefs about the risks to which they are exposed in the workplace. The study also identifies gaps between the platers' knowledge and beliefs and those of experts. New risk information needs to be designed that addresses the information needs of platers, using language that they understand.