995 results for Basis path testing


Relevance:

90.00%

Abstract:

Software product line (SPL) engineering offers several advantages in the development of families of software products such as reduced costs, high quality and a short time to market. A software product line is a set of software intensive systems, each of which shares a common core set of functionalities, but also differs from the other products through customization tailored to fit the needs of individual groups of customers. The differences between products within the family are well-understood and organized into a feature model that represents the variability of the SPL. Products can then be built by generating and composing features described in the feature model. Testing of software product lines has become a bottleneck in the SPL development lifecycle, since many of the techniques used in their testing have been borrowed from traditional software testing and do not directly take advantage of the similarities between products. This limits the overall gains that can be achieved in SPL engineering. Recent work proposed by both industry and the research community for improving SPL testing has begun to consider this problem, but there is still a need for better testing techniques that are tailored to SPL development. In this thesis, I make two primary contributions to software product line testing. First I propose a new definition for testability of SPLs that is based on the ability to re-use test cases between products without a loss of fault detection effectiveness. I build on this idea to identify elements of the feature model that contribute positively and/or negatively towards SPL testability. Second, I provide a graph based testing approach called the FIG Basis Path method that selects products and features for testing based on a feature dependency graph. This method should increase our ability to re-use results of test cases across successive products in the family and reduce testing effort. I report the results of a case study involving several non-trivial SPLs and show that for these objects, the FIG Basis Path method is as effective as testing all products, but requires us to test no more than 24% of the products in the SPL.
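To make the graph-based approach concrete, the sketch below enumerates a McCabe-style basis of paths over a toy feature dependency graph: the number of linearly independent paths is bounded by the cyclomatic complexity V(G) = E - N + 2, and a greedy pass keeps only paths that cover a new edge. The feature names, graph shape and selection heuristic are illustrative assumptions, not the thesis's actual FIG Basis Path algorithm.

```python
# Illustrative sketch only: a toy feature dependency graph and a McCabe-style
# basis-path enumeration. Feature names, graph shape and the greedy selection
# are assumptions for illustration; they do not reproduce the FIG Basis Path
# method described in the thesis.

from collections import defaultdict

# Directed feature dependency graph: an edge A -> B means feature A depends on B.
edges = [
    ("root", "gui"), ("root", "engine"),
    ("gui", "themes"), ("gui", "engine"),
    ("engine", "storage"), ("themes", "storage"),
]

graph = defaultdict(list)
for src, dst in edges:
    graph[src].append(dst)

nodes = {n for edge in edges for n in edge}

# Cyclomatic complexity V(G) = E - N + 2 bounds the size of a basis path set
# for a connected graph with a single entry and a single exit.
v_g = len(edges) - len(nodes) + 2
print("upper bound on independent paths:", v_g)

def simple_paths(node, sink, path=()):
    """Enumerate simple paths from node to sink by depth-first search."""
    path = path + (node,)
    if node == sink:
        yield path
        return
    for nxt in graph[node]:
        if nxt not in path:          # guard against cycles
            yield from simple_paths(nxt, sink, path)

# Greedy basis selection: keep a path only if it covers a not-yet-covered edge.
covered, basis = set(), []
for p in simple_paths("root", "storage"):
    p_edges = set(zip(p, p[1:]))
    if p_edges - covered:
        basis.append(p)
        covered |= p_edges

for p in basis:
    print(" -> ".join(p))
```

Each selected path can be read as a set of features to exercise together, which is one way the re-use of test results across successive products in the family could be realized.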

Relevance:

80.00%

Abstract:

This paper describes a practical application of MDA and reverse engineering based on a domain-specific modelling language. A well defined metamodel of a domain-specific language is useful for verification and validation of associated tools. We apply this approach to SIFA, a security analysis tool. SIFA has evolved as requirements have changed, and it has no metamodel. Hence, testing SIFA’s correctness is difficult. We introduce a formal metamodelling approach to develop a well-defined metamodel of the domain. Initially, we develop a domain model in EMF by reverse engineering the SIFA implementation. Then we transform EMF to Object-Z using model transformation. Finally, we complete the Object-Z model by specifying system behavior. The outcome is a well-defined metamodel that precisely describes the domain and the security properties that it analyses. It also provides a reliable basis for testing the current SIFA implementation and forward engineering its successor.
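As a rough illustration of the reverse-engineering step, the sketch below captures a tiny recovered domain model in plain Python and renders it as a simplified Object-Z-style class skeleton. The metamodel classes, attribute names and the rendering are hypothetical; the paper's actual EMF domain model and its EMF-to-Object-Z transformation rules are not reproduced here.

```python
# Illustrative sketch only: a hypothetical fragment of a reverse-engineered
# domain model and a naive rendering into an Object-Z-like class skeleton.
# Nothing here reproduces the paper's EMF metamodel or transformation.

from dataclasses import dataclass, field
from typing import List

@dataclass
class MetaAttribute:
    name: str
    type_name: str

@dataclass
class MetaClass:
    name: str
    attributes: List[MetaAttribute] = field(default_factory=list)

# A hypothetical fragment of the recovered domain model.
component = MetaClass("Component", [MetaAttribute("id", "ID"),
                                    MetaAttribute("ports", "P Port")])

def to_object_z(mc: MetaClass) -> str:
    """Emit a simplified Object-Z class skeleton (state schema only)."""
    decls = "\n".join(f"  {a.name} : {a.type_name}" for a in mc.attributes)
    return f"class {mc.name}\n[state]\n{decls}\n[init and operations omitted]"

print(to_object_z(component))
```

The abstract notes that the Object-Z model is then completed by specifying system behavior, a step such a mechanical skeleton cannot provide.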

Relevance:

80.00%

Abstract:

Changes in gene expression have been measured 24 h after injury to mammalian spinal cords that can and cannot regenerate. In opossums there is a critical period of development when regeneration stops being possible: at 9 days postnatal, cervical spinal cords regenerate; at 12 days they do not. By the use of marsupial cDNA microarrays we detected 158 genes that respond differentially to injury at the two ages critical for regeneration. For selected candidates, additional measurements were made by real-time PCR and sites of their expression were shown by immunostaining. Candidate genes have been classified so as to select those that promote or prevent regeneration. Up-regulated by injury at 8 days and/or down-regulated by injury at 13 days were genes known to promote growth, such as mitogen-activated protein kinase kinase 1 or transcription factor TCF7L2. By contrast, at 13 days up-regulation of inhibitory molecules occurred, including annexins, ephrins and genes related to apoptosis and neurodegenerative diseases. Certain genes, such as calmodulin 1 and NOGO, changed expression similarly in animals that could and could not regenerate, without any additional changes in response to injury. These findings confirmed and extended changes of gene expression found in earlier screens on 9 and 12 day preparations without lesions, and provide a comprehensive list of genes that serve as a basis for testing how identified molecules, singly or in combination, promote and prevent central nervous system regeneration. (C) 2010 Elsevier B.V. All rights reserved.

Relevance:

80.00%

Abstract:

The Birnbaum-Saunders regression model is becoming increasingly popular in lifetime analyses and reliability studies. In this model, the signed likelihood ratio statistic provides the basis for hypothesis testing and the construction of confidence limits for a single parameter of interest. We focus on the small-sample case, where the standard normal distribution gives a poor approximation to the true distribution of the statistic. We derive three adjusted signed likelihood ratio statistics that lead to very accurate inference even for very small samples. Two empirical applications are presented. (C) 2010 Elsevier B.V. All rights reserved.
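For reference, the generic signed likelihood ratio statistic and the usual form of its higher-order adjustments are shown below in standard (assumed) notation; the three adjustments derived in the paper are specific to the Birnbaum-Saunders regression model and are not reproduced here.

```latex
% Scalar parameter of interest \psi, nuisance parameters \lambda (notation assumed):
r(\psi) \;=\; \operatorname{sign}(\hat{\psi} - \psi)\,
    \sqrt{2\bigl[\ell(\hat{\psi}, \hat{\lambda}) - \ell(\psi, \hat{\lambda}_{\psi})\bigr]},
\qquad r(\psi) \xrightarrow{d} N(0,1).

% Adjusted statistics of this kind typically take the Barndorff-Nielsen form
r^{*}(\psi) \;=\; r(\psi) + \frac{1}{r(\psi)} \log \frac{q(\psi)}{r(\psi)},
% where q(\psi) is a model-specific correction term.
```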

Relevance:

80.00%

Abstract:

This dissertation analyzes how marketers define markets in technology-based industries. One of the most important strategic decisions marketers face is determining the optimal market for their products. Market definition is critical in dynamic high technology markets characterized by high levels of market and technological uncertainty. Building on literature from marketing and related disciplines, this research is the first in-depth study of market definition in industrial markets. Using a national, probability sample stratified by firm size, 1,000 marketing executives in nine industries (automation, biotechnology, computers, medical equipment and instrumentation, pharmaceuticals, photonics, software, subassemblies and components, and telecommunications) were surveyed via a mail questionnaire. A 20.8% net response rate yielding 203 surveys was achieved. The market structure-conduct-performance (SCP) paradigm from industrial organization provided a conceptual basis for testing a causal market definition model via LISREL. A latent exogenous variable (competitive intensity) and four latent endogenous variables (marketing orientation, technological orientation, market definition criteria, and market definition success) were used to develop and test hypothesized relationships among constructs. Research questions relating to market redefinition, market definition characteristics, and internal (within the firm) and external (competitive) market definition were also investigated. Market definition success was found to be positively associated with a marketing orientation and the use of market definition criteria. Technological orientation was not significantly related to market definition success. Customer needs were the key market definition characteristic to high-tech firms (technology, competition, customer groups, and products were also important). Market redefinition based on changing customer needs was the most effective of seven strategies tested. A majority of firms regularly defined their market at the corporate and product-line level within the firm. From a competitive perspective, industry, industry sector, and product-market definitions were used most frequently.
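For readers unfamiliar with LISREL notation, the structural part of such a model has the standard form shown below; the mapping of the dissertation's constructs onto the exogenous and endogenous vectors is an assumed illustration, not the fitted model itself.

```latex
% Standard LISREL structural equation (notation assumed):
\eta \;=\; B\,\eta + \Gamma\,\xi + \zeta,
\qquad
\xi = \bigl(\text{competitive intensity}\bigr),
\quad
\eta = \begin{pmatrix}
  \text{marketing orientation}\\
  \text{technological orientation}\\
  \text{use of market definition criteria}\\
  \text{market definition success}
\end{pmatrix}.
```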

Relevance:

80.00%

Abstract:

The search-experience-credence framework from economics of information, the human-environment relations models from environmental psychology, and the consumer evaluation process from services marketing provide a conceptual basis for testing the model of "Pre-purchase Information Utilization in Service Physical Environments." The model addresses the effects of informational signs, as a dimension of the service physical environment, on consumers' perceptions (perceived veracity and perceived performance risk), emotions (pleasure) and behavior (willingness to buy). The informational signs provide attribute quality information (search and experience) through non-personal sources of information (simulated word-of-mouth and non-personal advocate sources). This dissertation examines: (1) the hypothesized relationships addressed in the model of "Pre-purchase Information Utilization in Service Physical Environments" among informational signs, perceived veracity, perceived performance risk, pleasure, and willingness to buy, and (2) the effects of attribute quality information and sources of information on consumers' perceived veracity and perceived performance risk. This research is the first in-depth study about the role and effects of information in service physical environments. Using a 2 x 2 between-subjects experimental research procedure, undergraduate students were exposed to the informational signs in a simulated service physical environment. The service physical environments were simulated through color photographic slides. The results of the study suggest that: (1) the relationship between informational signs and willingness to buy is mediated by perceived veracity, perceived performance risk and pleasure, (2) experience attribute information shows higher perceived veracity and lower perceived performance risk when compared to search attribute information, and (3) information provided through simulated word-of-mouth shows higher perceived veracity and lower perceived performance risk when compared to information provided through non-personal advocate sources.

Relevance:

80.00%

Abstract:

This research explores Bayesian updating as a tool for estimating parameters probabilistically by dynamic analysis of data sequences. Two distinct Bayesian updating methodologies are assessed. The first approach focuses on Bayesian updating of failure rates for primary events in fault trees. A Poisson Exponentially Weighted Moving Average (PEWMA) model is implemented to carry out Bayesian updating of failure rates for individual primary events in the fault tree. To provide a basis for testing of the PEWMA model, a fault tree is developed based on the Texas City Refinery incident which occurred in 2005. A qualitative fault tree analysis is then carried out to obtain a logical expression for the top event. A dynamic fault tree analysis is carried out by evaluating the top event probability at each Bayesian updating step by Monte Carlo sampling from posterior failure rate distributions. It is demonstrated that PEWMA modeling is advantageous over conventional conjugate Poisson-Gamma updating techniques when failure data is collected over long time spans. The second approach focuses on Bayesian updating of parameters in non-linear forward models. Specifically, the technique is applied to the hydrocarbon material balance equation. In order to test the accuracy of the implemented Bayesian updating models, a synthetic data set is developed using the Eclipse reservoir simulator. Both structured-grid and MCMC-sampling-based solution techniques are implemented and are shown to model the synthetic data set with good accuracy. Furthermore, a graphical analysis shows that the implemented MCMC model displays good convergence properties. A case study demonstrates that likelihood variance affects the rate at which the posterior assimilates information from the measured data sequence. Error in the measured data significantly affects the accuracy of the posterior parameter distributions. Increasing the likelihood variance mitigates random measurement errors, but causes the overall variance of the posterior to increase. Bayesian updating is shown to be advantageous over deterministic regression techniques as it allows for incorporation of prior belief and full modeling uncertainty over the parameter ranges. As such, the Bayesian approach to estimation of parameters in the material balance equation shows utility for incorporation into reservoir engineering workflows.
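As a point of reference for the first approach, the sketch below shows the conventional conjugate Poisson-Gamma update of a primary-event failure rate, the baseline against which PEWMA is compared. The prior parameters and observation sequence are invented for illustration; the PEWMA model itself is not reproduced.

```python
# Illustrative sketch only: conventional conjugate Poisson-Gamma updating of a
# primary-event failure rate. Prior hyperparameters and the observed data are
# invented; this is the baseline scheme, not the PEWMA model from the thesis.

# Gamma(alpha, beta) prior on the failure rate lambda (events per unit time).
alpha, beta = 1.0, 100.0          # assumed weakly informative prior

# Observed (event_count, exposure_time) pairs from successive collection periods.
observations = [(0, 50.0), (1, 50.0), (0, 50.0), (2, 50.0)]

for step, (events, exposure) in enumerate(observations, start=1):
    # With a Poisson likelihood the posterior stays Gamma:
    #   alpha <- alpha + events,  beta <- beta + exposure
    alpha += events
    beta += exposure
    posterior_mean = alpha / beta
    print(f"step {step}: posterior mean failure rate = {posterior_mean:.4f} per unit time")
```

Because each update is a closed-form increment of two hyperparameters, the conjugate scheme weights all past data equally, which is consistent with the thesis's observation that PEWMA becomes advantageous when failure data are collected over long time spans.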

Relevance:

40.00%

Abstract:

Over the past decade a series of trials of the EORTC Brain Tumor Group (BTG) has substantially influenced and shaped the standard of care for primary brain tumors. All these trials were coupled with biological research that has allowed for better understanding of the biology of these tumors. In glioblastoma, EORTC trial 26981/22981, conducted jointly with the National Cancer Institute of Canada Clinical Trials Group, showed superiority of concomitant radiochemotherapy with temozolomide over radiotherapy alone. It also identified the first predictive marker for benefit from alkylating agent chemotherapy in glioblastoma, the methylation of the O6-methylguanine-DNA methyltransferase (MGMT) gene promoter. In another large randomized trial, EORTC 26951, adjuvant chemotherapy in anaplastic oligodendroglial tumors was investigated. Despite an improvement in progression-free survival, this did not translate into a survival benefit. The third example of a landmark trial is the EORTC 22845 trial. This trial, led by the EORTC Radiation Oncology Group, forms the basis for an expectant (wait-and-see) approach to patients with low-grade glioma, as early radiotherapy indeed prolongs time to tumor progression but confers no benefit in overall survival. This trial is the key reference in deciding at what time in their disease adult patients with low-grade glioma should be irradiated. Future initiatives will continue to focus on the conduct of controlled trials, rational academic drug development, as well as systematic evaluation of tumor tissue, including biomarker development for personalized therapy. Important lessons learned in neurooncology are the need to dare to ask real questions rather than merely test new compounds rapidly, and the value of well-designed trials, including the presence of controls, central pathology review, strict radiology protocols and biobanking. Structurally, the EORTC BTG has evolved into a multidisciplinary group with strong transatlantic alliances. It has contributed to the maturation of neurooncology within the oncological sciences.

Relevance:

40.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

40.00%

Abstract:

Flow features inside centrifugal compressor stages are very complicated to simulate with numerical tools due to the highly complex geometry and varying gas conditions all across the machine. For this reason, a big effort is currently being made to increase the fidelity of the numerical models during the design and validation phases. Computational Fluid Dynamics (CFD) plays an increasing role in the assessment of the performance prediction of centrifugal compressor stages. Historically, CFD was considered reliable for performance prediction on a qualitative level, whereas tests were necessary to predict compressor performance on a quantitative basis. In fact, "standard" CFD with only the flow path and blades included in the computational domain is known to be weak in capturing efficiency level and operating range accurately, due to the under-estimation of losses and the lack of secondary flow modeling. This research project aims to fill the gap in accuracy between "standard" CFD and test data by including a high-fidelity reproduction of the gas domain and by using advanced numerical models and tools introduced into the author's OEM in-house CFD code. In other words, this thesis describes a methodology by which virtual tests can be conducted on single stages and multistage centrifugal compressors in a similar fashion to a typical rig test, allowing end users to operate machines with a confidence level not achievable before. Furthermore, the new "high fidelity" approach made it possible to understand flow phenomena not fully captured before, increasing aerodynamicists' capability and confidence in designing high-efficiency and highly reliable centrifugal compressor stages.

Relevance:

40.00%

Abstract:

OBJECTIVES Primary care physicians (PCPs) should prescribe faecal immunochemical testing (FIT) or colonoscopy for colorectal cancer screening based on their patient's values and preferences. However, there are wide variations between PCPs in the screening method prescribed. The objective was to assess the impact of an educational intervention on PCPs' intent to offer FIT or colonoscopy on an equal basis. DESIGN Survey before and after training seminars, with a parallel comparison through a mailed survey to PCPs not attending the training seminars. SETTING All PCPs in the canton of Vaud, Switzerland. PARTICIPANTS Of 592 eligible PCPs, 133 (22%) attended a seminar and 106 (80%) completed both surveys. 109 (24%) PCPs who did not attend the seminars returned the mailed survey. INTERVENTION A 2-hour interactive seminar targeting PCP knowledge, skills and attitudes regarding offering a choice of colorectal cancer (CRC) screening options. OUTCOME MEASURES The primary outcome was PCP intention of having their patients screened with FIT and colonoscopy in equal proportions (between 40% and 60% each). Secondary outcomes were the perceived role of PCPs in screening decisions (from paternalistic to informed decision-making) and the rate of correct answers to a clinical vignette. RESULTS Before the seminars, 8% of PCPs reported that they had equal proportions of their patients screened for CRC by FIT and colonoscopy; after the seminar, 33% foresaw having their patients screened in equal proportions (p<0.001). Among those not attending, there was no change (13% vs 14%, p=0.8). Of those attending, there was no change in their perceived role in screening decisions, while the proportion responding correctly to a clinical vignette increased (from 88% to 99%, p<0.001). CONCLUSIONS An interactive training seminar increased the proportion of physicians with the intention to prescribe FIT and colonoscopy in equal proportions.

Relevance:

40.00%

Abstract:

Microvariant alleles, defined as alleles that contain an incomplete repeat unit, often complicate the process of DNA analysis. Understanding the molecular basis of microvariants would help to catalogue results and improve upon the analytical process involved in DNA testing. The first step is to determine the sequence/cause of a microvariant. This was done by sequencing samples that were determined to have a microvariant at the FGA or D21S11 loci. The results indicate that a .2 microvariant at the D21S11 locus is caused by a -TA- dinucleotide partial repeat before the last full TCTA repeat. The .2 microvariant at the FGA locus is caused by a -TT- dinucleotide partial repeat after the fifth full repeat and before the variable CTTT repeat motif. There are several possibilities for why the .2 microvariants are all the same at a locus, each of which carries implications for the forensic community. The first possibility is that the microvariants are identical by descent, which means that the microvariant is an old allele that has been passed down through the generations. The second possibility is that the microvariants are identical by state, which would mean that there is a mechanism selecting for these microvariants. Future research studying the flanking regions of these microvariants is proposed to determine which of these possibilities is the actual cause and to learn more about the molecular basis of microvariants.

Relevance:

40.00%

Abstract:

A revision of a similar publication, AMS-16.

Relevance:

40.00%

Abstract:

The power of computer game technology is currently being harnessed to produce “serious games”. These “games” are targeted at the education and training marketplace, and employ various key game-engine components such as the graphics and physics engines to produce realistic “digital-world” simulations of the real “physical world”. Many approaches are driven by the technology and often lack a consideration of a firm pedagogical underpinning. The authors believe that an analysis and deployment of both the technological and pedagogical dimensions should occur together, with the pedagogical dimension providing the lead. This chapter explores the relationship between these two dimensions and examines how “pedagogy may inform the use of technology”, and how various learning theories may be mapped onto the use of the affordances of computer game engines. Autonomous and collaborative learning approaches are discussed. The design of a serious game is broken down into spatial and temporal elements. The spatial dimension is related to the theories of knowledge structures, especially “concept maps”. The temporal dimension is related to “experiential learning”, especially the approach of Kolb. The multi-player aspect of serious games is related to theories of “collaborative learning”, which is broken down into a discussion of “discourse” versus “dialogue”. Several general guiding principles are explored, such as the use of “metaphor” (including metaphors of space, embodiment, systems thinking, the internet and emergence). The topological design of a serious game is also highlighted. The discussion of pedagogy is related to various serious games we have recently produced and researched, and is presented in the hope of informing the “serious game community”.