993 results for Basis path testing


Relevance: 100.00%

Abstract:

Basis path testing is a very powerful structural testing criterion. The number of test paths equals the cyclomatic complexity of the program, as defined by McCabe. Traditional test generation methods select paths either without considering variable constraints or interactively. In this note, an efficient method is presented for generating a set of feasible basis paths. The experiments show that this method can generate feasible basis paths for real-world C programs automatically in acceptable time.
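The cyclomatic complexity this abstract refers to can be computed directly from a control-flow graph. Below is a minimal illustrative sketch (the CFG and names are invented; this is not the paper's feasible-path generation method):

```python
# McCabe's cyclomatic complexity V(G) = E - N + 2 for a control-flow
# graph: V(G) equals the number of linearly independent (basis) paths.

def cyclomatic_complexity(cfg):
    """cfg: dict mapping each node to its list of successor nodes."""
    nodes = len(cfg)
    edges = sum(len(succs) for succs in cfg.values())
    return edges - nodes + 2

# Invented CFG: one if/else branch plus one loop back-edge
cfg = {
    "entry": ["cond"],
    "cond": ["then", "else"],
    "then": ["join"],
    "else": ["join"],
    "join": ["cond", "exit"],   # back-edge to the loop condition
    "exit": [],
}

print(cyclomatic_complexity(cfg))  # 3 basis paths
```

With six nodes and seven edges, V(G) = 7 − 6 + 2 = 3, so three basis paths suffice to cover this graph.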

Relevance: 90.00%

Abstract:

Software product line (SPL) engineering offers several advantages in the development of families of software products, such as reduced costs, high quality and a short time to market. A software product line is a set of software-intensive systems, each of which shares a common core set of functionalities, but also differs from the other products through customization tailored to fit the needs of individual groups of customers. The differences between products within the family are well-understood and organized into a feature model that represents the variability of the SPL. Products can then be built by generating and composing features described in the feature model. Testing of software product lines has become a bottleneck in the SPL development lifecycle, since many of the techniques used in their testing have been borrowed from traditional software testing and do not directly take advantage of the similarities between products. This limits the overall gains that can be achieved in SPL engineering. Recent work proposed by both industry and the research community for improving SPL testing has begun to consider this problem, but there is still a need for better testing techniques that are tailored to SPL development. In this thesis, I make two primary contributions to software product line testing. First, I propose a new definition for testability of SPLs that is based on the ability to re-use test cases between products without a loss of fault detection effectiveness. I build on this idea to identify elements of the feature model that contribute positively and/or negatively towards SPL testability. Second, I provide a graph-based testing approach called the FIG Basis Path method that selects products and features for testing based on a feature dependency graph. This method should increase our ability to re-use results of test cases across successive products in the family and reduce testing effort.
I report the results of a case study involving several non-trivial SPLs and show that for these objects, the FIG Basis Path method is as effective as testing all products, but requires us to test no more than 24% of the products in the SPL.
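The core idea of covering paths through a feature dependency graph can be sketched in a few lines (the graph, feature names, and traversal below are a hypothetical illustration, not the thesis's actual FIG Basis Path algorithm):

```python
# Toy illustration: enumerate root-to-leaf paths in a feature dependency
# graph, so that each path can be covered by one product containing those
# features. The graph and feature names are invented.

def feature_paths(graph, node, prefix=None):
    """Yield every root-to-leaf path starting at `node` as a list."""
    path = (prefix or []) + [node]
    successors = graph.get(node, [])
    if not successors:
        yield path
        return
    for succ in successors:
        yield from feature_paths(graph, succ, path)

fig = {"base": ["net", "ui"], "net": ["ssl"], "ui": [], "ssl": []}
for p in feature_paths(fig, "base"):
    print(" -> ".join(p))
# base -> net -> ssl
# base -> ui
```

Testing one product per path, rather than one per feature combination, is what allows test effort to grow with the graph rather than with the (exponential) number of products.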

Relevance: 80.00%

Abstract:

1. Weed eradication efforts often must be sustained for long periods owing to the existence of persistent seed banks, among other factors. Decision makers need to consider both the amount of investment required and the period over which investment must be maintained when determining whether to commit to (or continue) an eradication programme. However, a basis for estimating eradication programme duration based on simple data has been lacking. Here, we present a stochastic dynamic model that can provide such estimates. 2. The model is based upon the rates of progression of infestations from the active to the monitoring state (i.e. no plants detected for at least 12 months), rates of reversion of infestations from monitoring to the active state and the frequency distribution of time since last detection for all infestations. Isoquants that illustrate the combinations of progression and reversion parameters corresponding to eradication within different time frames are generated. 3. The model is applied to ongoing eradication programmes targeting branched broomrape Orobanche ramosa and chromolaena Chromolaena odorata. The minimum periods in which eradication could potentially be achieved were 22 and 23 years, respectively. On the basis of programme performance until 2008, however, eradication is predicted to take considerably longer for both species (on average, 62 and 248 years, respectively). Performance of the branched broomrape programme could be best improved through reducing rates of reversion to the active state; for chromolaena, boosting rates of progression to the monitoring state is more important. 4. Synthesis and applications. Our model for estimating weed eradication programme duration, which captures critical transitions between a limited number of states, is readily applicable to any weed. A particular strength of the method lies in its minimal data requirements.
These comprise estimates of maximum seed persistence and infested area, plus consistent annual records of the detection (or otherwise) of the weed in each infestation. This work provides a framework for identifying where improvements in management are needed and a basis for testing the effectiveness of alternative tactics. If adopted, our approach should help improve decision making with regard to eradication as a management strategy.
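The state transitions the model captures can be mimicked with a toy Monte Carlo simulation (all rates, counts, and the seed-persistence threshold below are invented for illustration; the paper's stochastic dynamic model is not this simulation):

```python
import random

# Toy Monte Carlo sketch: each infestation is "active" (counter 0) or has
# been in "monitoring" for `counter` years. Active infestations progress
# with probability p_prog per year; monitored ones revert with p_rev.
# Eradication is declared once every infestation has been monitored for
# longer than the seed-persistence period. All numbers are invented.

def years_to_eradication(n_infest=3, p_prog=0.5, p_rev=0.05,
                         seed_persistence=5, rng=random.Random(1)):
    monitored_years = [0] * n_infest
    year = 0
    while min(monitored_years) <= seed_persistence:
        year += 1
        for i, y in enumerate(monitored_years):
            if y == 0:                      # active infestation
                if rng.random() < p_prog:
                    monitored_years[i] = 1  # progresses to monitoring
            elif rng.random() < p_rev:
                monitored_years[i] = 0      # reverts to active
            else:
                monitored_years[i] += 1
    return year

print(years_to_eradication())
```

Raising p_rev or lowering p_prog stretches the predicted duration sharply, which mirrors the paper's finding that the two programmes need different management levers.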

Relevance: 80.00%

Abstract:

The influence of particle shape on the stress-strain response of fine silica sand is investigated experimentally. Two sands from the same source and with the same particle size distribution were examined using Fourier descriptor analysis for particle shape. Their grains were, on average, found to have similar angularity but different elongation. During triaxial stress path testing, the stress-strain behavior of the sands for both loading and creep stages was found to be influenced by particle elongation. In particular, the behavior of the sand with less elongated grains was more like that of rounded glass beads during creep. The results highlight the role of particle shape in stress transmission in granular packings and suggest that shape should be taken more rigorously into consideration in characterizing geomaterials. © 2005 Taylor & Francis Group.

Relevance: 80.00%

Abstract:

Isolation basin records from the Seymour-Belize Inlet Complex, a remote area of central mainland British Columbia, Canada, are used to constrain post-glacial sea-level changes and provide a preliminary basis for testing geophysical model predictions of relative sea-level (RSL) change. Sedimentological and diatom data from three low-lying (<4 m elevation) basins record falling RSLs in late-glacial times and isolation from the sea by ~11,800–11,200 14C BP. A subsequent RSL rise during the early Holocene (~8000 14C BP) breached the 2.13 m sill of the lowest basin (Woods Lake), but the two more elevated basins (sill elevations of ~3.6 m) remained isolated. At ~2400 14C BP, RSL stood at 1.49 ± 0.34 m above present MTL. Falling RSLs in the late Holocene led to the final emergence of the Woods Lake basin by 1604 ± 36 14C BP. Model predictions generated using the ICE-5G model partnered with a small number of different Earth viscosity models generally show poor agreement with the observational data, indicating that the ice model and/or Earth models considered can be improved upon. The best data-model fits were achieved with relatively low values of upper mantle viscosity (5 × 10^19 Pa s), which is consistent with previous modelling results from the region. The RSL data align more closely with observational records from the southeast of the region (eastern Vancouver Island, central Strait of Georgia) than the immediate north (Bella Bella–Bella Coola and Prince Rupert-Kitimat) and areas to the north-west (Queen Charlotte Sound, Hecate Strait), underlining the complexity of the regional response to glacio-isostatic recovery.

Relevance: 80.00%

Abstract:

Compacted clay fills are generally placed at the optimum value of water content and, immediately after placement, they are unsaturated. Wetting might subsequently occur due, for example, to rainfall infiltration, which can cause volumetric deformation of the fill (either swell or collapse) with associated loss of shear strength and structural integrity. If swelling takes place under partially restrained deformation, due for example to the presence of a buried rigid structure or a retaining wall, additional stresses will develop in the soil and these can be detrimental to the stability of walling elements and other building assets. Factors such as dry density, overburden pressure, compaction water content and type of clay are known to influence the development of stresses. This paper investigates these factors by means of an advanced stress path testing programme performed on four different clays with different mineralogy, index properties and geological histories. Specimens of kaolin clay, London Clay, Belfast Clay and Ampthill Clay were prepared at different initial states and subjected to ‘controlled’ wetting, whereby the suction was reduced gradually to zero under laterally restrained conditions (i.e. K0 conditions). The results showed that the magnitude of the increase in horizontal stresses (and therefore the increase of K0) is influenced by the overburden pressure, compaction water content, dry density at the time of compaction and mineralogy.

Relevance: 80.00%

Abstract:

The Birnbaum-Saunders regression model is becoming increasingly popular in lifetime analyses and reliability studies. In this model, the signed likelihood ratio statistic provides a basis for hypothesis testing and the construction of confidence limits for a single parameter of interest. We focus on the small sample case, where the standard normal distribution gives a poor approximation to the true distribution of the statistic. We derive three adjusted signed likelihood ratio statistics that lead to very accurate inference even for very small samples. Two empirical applications are presented. (C) 2010 Elsevier B.V. All rights reserved.
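For a scalar parameter θ with log-likelihood ℓ, the unadjusted signed likelihood ratio statistic is r = sign(θ̂ − θ)·√(2(ℓ(θ̂) − ℓ(θ))). The sketch below illustrates it on exponential lifetime data as a simple stand-in (the data and null rate are invented; this is neither the Birnbaum-Saunders model nor the paper's small-sample adjustments):

```python
import math

# Signed likelihood ratio statistic r for a scalar parameter, illustrated
# on exponential lifetimes: log-likelihood n*log(lam) - lam*sum(x), with
# MLE lam_hat = n / sum(x). Data and null rate are invented.

def signed_lr_exponential(data, lam0):
    n, s = len(data), sum(data)
    lam_hat = n / s
    def loglik(lam):
        return n * math.log(lam) - lam * s
    w = 2.0 * (loglik(lam_hat) - loglik(lam0))   # likelihood ratio statistic
    return math.copysign(math.sqrt(w), lam_hat - lam0)

# In large samples r is approximately standard normal under lam0
print(signed_lr_exponential([0.5, 1.2, 0.8, 2.0, 0.3], lam0=1.0))
```

For small n the normal approximation to r degrades, which is exactly the regime the paper's adjusted statistics target.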

Relevance: 80.00%

Abstract:

This dissertation analyzes how marketers define markets in technology-based industries. One of the most important strategic decisions marketers face is determining the optimal market for their products. Market definition is critical in dynamic high technology markets characterized by high levels of market and technological uncertainty. Building on literature from marketing and related disciplines, this research is the first in-depth study of market definition in industrial markets. Using a national, probability sample stratified by firm size, 1,000 marketing executives in nine industries (automation, biotechnology, computers, medical equipment and instrumentation, pharmaceuticals, photonics, software, subassemblies and components, and telecommunications) were surveyed via a mail questionnaire. A 20.8% net response rate yielding 203 surveys was achieved. The market structure-conduct-performance (SCP) paradigm from industrial organization provided a conceptual basis for testing a causal market definition model via LISREL. A latent exogenous variable (competitive intensity) and four latent endogenous variables (marketing orientation, technological orientation, market definition criteria, and market definition success) were used to develop and test hypothesized relationships among constructs. Research questions relating to market redefinition, market definition characteristics, and internal (within the firm) and external (competitive) market definition were also investigated. Market definition success was found to be positively associated with a marketing orientation and the use of market definition criteria. Technological orientation was not significantly related to market definition success. Customer needs were the key market definition characteristic to high-tech firms (technology, competition, customer groups, and products were also important). Market redefinition based on changing customer needs was the most effective of seven strategies tested. 
A majority of firms regularly defined their market at the corporate and product-line level within the firm. From a competitive perspective, industry, industry sector, and product-market definitions were used most frequently.

Relevance: 80.00%

Abstract:

The search-experience-credence framework from economics of information, the human-environment relations models from environmental psychology, and the consumer evaluation process from services marketing provide a conceptual basis for testing the model of "Pre-purchase Information Utilization in Service Physical Environments." The model addresses the effects of informational signs, as a dimension of the service physical environment, on consumers' perceptions (perceived veracity and perceived performance risk), emotions (pleasure) and behavior (willingness to buy). The informational signs provide attribute quality information (search and experience) through non-personal sources of information (simulated word-of-mouth and non-personal advocate sources). This dissertation examines: (1) the hypothesized relationships addressed in the model of "Pre-purchase Information Utilization in Service Physical Environments" among informational signs, perceived veracity, perceived performance risk, pleasure, and willingness to buy, and (2) the effects of attribute quality information and sources of information on consumers' perceived veracity and perceived performance risk. This research is the first in-depth study about the role and effects of information in service physical environments. Using a 2 × 2 between-subjects experimental research procedure, undergraduate students were exposed to the informational signs in a simulated service physical environment.
The service physical environments were simulated through color photographic slides. The results of the study suggest that: (1) the relationship between informational signs and willingness to buy is mediated by perceived veracity, perceived performance risk and pleasure, (2) experience attribute information shows higher perceived veracity and lower perceived performance risk when compared to search attribute information, and (3) information provided through simulated word-of-mouth shows higher perceived veracity and lower perceived performance risk when compared to information provided through non-personal advocate sources.

Relevance: 80.00%

Abstract:

This research explores Bayesian updating as a tool for estimating parameters probabilistically by dynamic analysis of data sequences. Two distinct Bayesian updating methodologies are assessed. The first approach focuses on Bayesian updating of failure rates for primary events in fault trees. A Poisson Exponentially Weighted Moving Average (PEWMA) model is implemented to carry out Bayesian updating of failure rates for individual primary events in the fault tree. To provide a basis for testing of the PEWMA model, a fault tree is developed based on the Texas City Refinery incident which occurred in 2005. A qualitative fault tree analysis is then carried out to obtain a logical expression for the top event. A dynamic fault tree analysis is carried out by evaluating the top event probability at each Bayesian updating step by Monte Carlo sampling from posterior failure rate distributions. It is demonstrated that PEWMA modeling is advantageous over conventional conjugate Poisson-Gamma updating techniques when failure data is collected over long time spans. The second approach focuses on Bayesian updating of parameters in non-linear forward models. Specifically, the technique is applied to the hydrocarbon material balance equation. In order to test the accuracy of the implemented Bayesian updating models, a synthetic data set is developed using the Eclipse reservoir simulator. Both structured grid and MCMC sampling-based solution techniques are implemented and are shown to model the synthetic data set with good accuracy. Furthermore, a graphical analysis shows that the implemented MCMC model displays good convergence properties. A case study demonstrates that likelihood variance affects the rate at which the posterior assimilates information from the measured data sequence. Error in the measured data significantly affects the accuracy of the posterior parameter distributions.
Increasing the likelihood variance mitigates random measurement errors, but causes the overall variance of the posterior to increase. Bayesian updating is shown to be advantageous over deterministic regression techniques, as it allows for incorporation of prior belief and full modeling of uncertainty over the parameter ranges. As such, the Bayesian approach to estimation of parameters in the material balance equation shows utility for incorporation into reservoir engineering workflows.
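For reference, the conventional conjugate Poisson-Gamma update that PEWMA is compared against reduces to one line: a Gamma(α, β) prior on the failure rate combined with k failures over exposure t gives a Gamma(α + k, β + t) posterior (the prior and failure records below are invented):

```python
# Conjugate Poisson-Gamma Bayesian updating of a failure rate.
# Prior Gamma(alpha, beta); observing k failures over exposure time t
# yields the posterior Gamma(alpha + k, beta + t).

def gamma_poisson_update(alpha, beta, failures, exposure):
    """Return the posterior Gamma parameters after one observation period."""
    return alpha + failures, beta + exposure

alpha, beta = 1.0, 10.0                        # prior mean rate: 0.1 / year
for k, t in [(0, 5.0), (2, 5.0), (1, 5.0)]:    # invented failure records
    alpha, beta = gamma_poisson_update(alpha, beta, k, t)

print(alpha / beta)  # posterior mean failure rate: 4/25 = 0.16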

Relevance: 40.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 40.00%

Abstract:

Flow features inside centrifugal compressor stages are very complicated to simulate with numerical tools due to the highly complex geometry and varying gas conditions all across the machine. For this reason, a significant effort is currently being made to increase the fidelity of the numerical models during the design and validation phases. Computational Fluid Dynamics (CFD) plays an increasing role in the assessment of the performance prediction of centrifugal compressor stages. Historically, CFD was considered reliable for performance prediction at a qualitative level, whereas tests were necessary to predict compressor performance on a quantitative basis. In fact, "standard" CFD, with only the flow-path and blades included in the computational domain, is known to be weak in capturing efficiency level and operating range accurately, due to the under-estimation of losses and the lack of secondary flow modeling. This research project aims to fill the gap in accuracy between "standard" CFD and test data by including a high-fidelity reproduction of the gas domain and by using advanced numerical models and tools introduced in the author's OEM in-house CFD code. In other words, this thesis describes a methodology by which virtual tests can be conducted on single stages and multistage centrifugal compressors in a fashion similar to a typical rig test, giving end users the confidence to operate machines at a level not achievable before. Furthermore, the new "high fidelity" approach allowed understanding of flow phenomena not fully captured before, increasing aerodynamicists' capability and confidence in designing high-efficiency and highly reliable centrifugal compressor stages.

Relevance: 40.00%

Abstract:

OBJECTIVES Primary care physicians (PCPs) should prescribe faecal immunochemical testing (FIT) or colonoscopy for colorectal cancer screening based on their patient's values and preferences. However, there are wide variations between PCPs in the screening method prescribed. The objective was to assess the impact of an educational intervention on PCPs' intent to offer FIT or colonoscopy on an equal basis. DESIGN Survey before and after training seminars, with a parallel comparison through a mailed survey to PCPs not attending the training seminars. SETTING All PCPs in the canton of Vaud, Switzerland. PARTICIPANTS Of 592 eligible PCPs, 133 (22%) attended a seminar and 106 (80%) completed both surveys. 109 (24%) PCPs who did not attend the seminars returned the mailed survey. INTERVENTION A 2 h-long interactive seminar targeting PCP knowledge, skills and attitudes regarding offering a choice of colorectal cancer (CRC) screening options. OUTCOME MEASURES The primary outcome was PCP intention of having their patients screened with FIT and colonoscopy in equal proportions (between 40% and 60% each). Secondary outcomes were the perceived role of PCPs in screening decisions (from paternalistic to informed decision-making) and correct answer to a clinical vignette. RESULTS Before the seminars, 8% of PCPs reported that they had equal proportions of their patients screened for CRC by FIT and colonoscopy; after the seminar, 33% foresaw having their patients screened in equal proportions (p<0.001). Among those not attending, there was no change (13% vs 14%, p=0.8). Of those attending, there was no change in their perceived role in screening decisions, while the proportion responding correctly to a clinical vignette increased (88-99%, p<0.001). CONCLUSIONS An interactive training seminar increased the proportion of physicians with the intention to prescribe FIT and colonoscopy in equal proportions.

Relevance: 40.00%

Abstract:

Microvariant alleles, defined as alleles that contain an incomplete repeat unit, often complicate the process of DNA analysis. Understanding the molecular basis of microvariants would help to catalogue results and improve upon the analytical process involved in DNA testing. The first step is to determine the sequence/cause of a microvariant. This was done by sequencing samples that were determined to have a microvariant at the FGA or D21S11 loci. The results indicate that a .2 microvariant at the D21S11 locus is caused by a -TA- dinucleotide partial repeat before the last full TCTA repeat. The .2 microvariant at the FGA locus is caused by a -TT- dinucleotide partial repeat after the fifth full repeat and before the variable CTTT repeat motif. There are several possibilities for why the .2 microvariants are all the same at a locus, each of which carries implications for the forensic community. The first possibility is that the microvariants are identical by descent, which means that the microvariant is an old allele that has been passed down through the generations. The second possibility is that the microvariants are identical by state, which would mean that there is a mechanism selecting for these microvariants. Future research studying the flanking regions of these microvariants is proposed to determine which of these possibilities is the actual cause and to learn more about the molecular basis of microvariants.